How to download websites for offline reading in Linux

Sometimes you want to save websites for offline reading. It is fast, and since no connection is required, Facebook will never distract you while you read. But after struggling several times to download a site with WebHTTrack, the result was always the same: some pages were not downloaded.

Fortunately, Linux provides a command-line tool, wget, that can download a whole website, and in my experience it works better than WebHTTrack.

Just run this in a terminal:

$ wget \
    --recursive \
    --no-clobber \
    --page-requisites \
    --html-extension \
    --convert-links \
    --restrict-file-names=windows \
    www.sitetobedownload.com/

This command will download everything under http://www.sitetobedownload.com, whether it is a zip file, a CSS file, or anything else on the site, and it will convert all the links to local links so the pages work offline. It also forces the extension of every page to be .html. And if the download is interrupted, you can run the same command again and it will pick up roughly where it left off, because --no-clobber tells wget not to re-download files it already has.
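If the site is large, you may also want to keep wget from wandering too far or from hammering the server. Here is a hedged variation (the depth, wait time, rate limit, and URL path are just example values, not part of the original recipe):

# mirror only one section, two levels deep, politely
$ wget \
    --recursive \
    --level=2 \
    --no-parent \
    --page-requisites \
    --html-extension \
    --convert-links \
    --wait=1 \
    --limit-rate=200k \
    www.sitetobedownload.com/docs/

--level=2 stops wget after following links two levels deep, --no-parent keeps it inside the starting directory, and --wait plus --limit-rate space out the requests so you do not overload the site.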

Now, you can enjoy your favorite websites offline.

P.S. The downloaded site ends up in a folder named after the site, inside the directory where you ran the command (your home folder, if that is where you opened the terminal).
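To open the result straight from the terminal, point your browser at the downloaded index page. A small example, assuming the command above was run from your home directory and the folder name matches the site:

$ xdg-open ~/www.sitetobedownload.com/index.html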
