Linux: Download a website with wget

The wget command is an excellent tool for grabbing the entire contents of a website and converting it to flat HTML files.

This is useful when your site lives in a CMS and you'd like to flatten it, or when you need to archive a site that isn't on your own server(s).

The options in the command below instruct wget to recurse into subdirectories from the root, convert all links so they reference your downloaded files, download all images and other page assets, and add an .html extension to HTML files.

Take note of your wget version, though. Releases before 1.12 do not parse CSS files for background images, so you'll need to download and restore those images manually.
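As a quick sanity check, the version can be parsed out of wget's banner line and compared against 1.12 in plain shell. This is a sketch: the banner string below is a hardcoded sample, and in practice you would feed in the first line of `wget --version` instead.

```shell
# Extract the version from wget's banner line and compare it with 1.12,
# since older releases do not fetch background images referenced in CSS.
# The banner below is a sample; in practice use: wget --version | head -n 1
banner="GNU Wget 1.21.3 built on linux-gnu."
version=$(printf '%s\n' "$banner" | sed -n 's/[^0-9]*\([0-9][0-9.]*\).*/\1/p')

# sort -V orders version strings numerically; if 1.12 sorts first (or ties),
# the installed version is new enough to parse CSS.
oldest=$(printf '%s\n1.12\n' "$version" | sort -V | head -n 1)
if [ "$oldest" = "1.12" ]; then
    echo "CSS parsing supported ($version)"
else
    echo "upgrade wget: $version predates 1.12"
fi
```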

This is arguably the most commonly used wget configuration for downloading a complete website.
wget -rpkE
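For readability, the same short flags can be spelled out in their long forms. A sketch, where `https://example.com/` is just a placeholder for the site you want to mirror:

```shell
# Long-form equivalents of -r, -p, -k, -E:
#   --recursive        recurse into linked pages
#   --page-requisites  fetch images, CSS, and other assets needed to render
#   --convert-links    rewrite links to reference the local copies
#   --adjust-extension save HTML files with an .html extension
WGET_FLAGS="--recursive --page-requisites --convert-links --adjust-extension"

# Mirror the site into ./example.com/ (placeholder URL; needs network access):
# wget $WGET_FLAGS https://example.com/
echo "wget $WGET_FLAGS https://example.com/"
```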