Sometimes you want to save a website for offline reading. It is fast, and since offline reading needs no connection, Facebook will never distract you while you read. Hahaha. But after struggling several times to download sites with WebHTTrack, the result was always the same: some pages were never downloaded.
Never mind: Linux ships with a command-line tool, wget, that does the job, and in my experience it works better than WebHTTrack.
Just run this:
$ wget \
    --recursive \
    --no-clobber \
    --page-requisites \
    --html-extension \
    --convert-links \
    --restrict-file-names=windows \
    www.sitetobedownload.com/
This command downloads everything under http://www.sitetobedownload.com: zip files, CSS files, images, all of it. The --convert-links option rewrites every link to point at your local copies, and --html-extension forces every page to end in .html so it opens cleanly in a browser. And if the download is interrupted, just run the same command again: thanks to --no-clobber, wget skips the files it has already saved and carries on from where it stopped.
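If the site is large, it is also worth being polite to the server. Here is a sketch of the same command with two standard GNU Wget rate-limiting options added; www.sitetobedownload.com is just a placeholder, so substitute the real site:

```shell
# Same mirror command as above, plus two politeness options:
#   --wait=1       pause one second between requests
#   --random-wait  vary that pause so the crawl looks less robotic
wget \
    --recursive \
    --no-clobber \
    --page-requisites \
    --html-extension \
    --convert-links \
    --restrict-file-names=windows \
    --wait=1 \
    --random-wait \
    www.sitetobedownload.com/
```

The download takes longer this way, but you are far less likely to get your IP blocked halfway through.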
Now, you can enjoy your favorite websites offline.
ps. wget saves the site into a folder named after the domain (here, www.sitetobedownload.com) inside whatever directory you run the command from.