Using the Wget Linux command, it is possible to download an entire website, including all assets and scripts. It is occasionally necessary to download and archive a large site for local viewing, and Wget makes this an easy process. This is an example of the options I use to download a complete copy of a site.
There are some additional options like --level or --exclude-directories that I also use for testing and debugging to limit the pages retrieved. The Wget manual is extremely thorough and discusses all of the available options. The options I commonly use are described below.
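Pulling the options described below together, a complete invocation might look like the following sketch. The URL, wait time, and log file name are all placeholders; the command is held in a variable here rather than executed:

```shell
# A complete archive command assembled from the options described below.
# https://example.com/ and the --wait/--output-file values are placeholders.
CMD='wget --mirror --convert-links --adjust-extension --page-requisites --wait=1 --output-file=wget.log https://example.com/'
echo "$CMD"
```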
The --mirror option allows you to completely mirror a site. It is actually just a shortcut for a handful of other options.
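According to the Wget manual, --mirror turns on recursion and time-stamping, sets infinite recursion depth, and keeps FTP directory listings, which works out to four options:

```shell
# Per the Wget manual, --mirror is currently equivalent to:
MIRROR_EXPANSION='-r -N -l inf --no-remove-listing'
# so "wget --mirror URL" behaves the same as
# "wget -r -N -l inf --no-remove-listing URL"
echo "$MIRROR_EXPANSION"
```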
This option sets Wget to convert links to allow for local viewing of a site. Links to files that have been downloaded are converted to relative links to the new location. Relative links to files that have not been downloaded will be converted to absolute links.
This option sets Wget to append the .html extension to any file of type “application/xhtml+xml” or “text/html”, allowing the files to be viewed without a web server, e.g. about.php becomes about.php.html.
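The rename can be simulated locally; the file and its contents below are purely illustrative:

```shell
# Simulate the rename --adjust-extension performs on a page served as
# text/html from about.php (file created locally for illustration only).
printf '<html></html>' > about.php
mv about.php about.php.html   # wget would save it under this name
ls about.php.html
```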
This option sets Wget to download all assets needed to properly display the page, such as CSS, JavaScript, and images.
This option executes a command just as if it appeared in the Wget startup file (.wgetrc).
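A common use of -e is setting a .wgetrc variable for a single run; the wget line is shown commented with a placeholder URL:

```shell
# --execute (-e) runs a .wgetrc-style command for this invocation only.
# A frequent example is disabling robots.txt handling for an archive run:
# wget -e robots=off --mirror https://example.com/
WGETRC_CMD='robots=off'
echo "Startup command: $WGETRC_CMD"
```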
This option sets the maximum recursion depth. This is great for testing and allows you to not download the internet.
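A sketch of how the depth cap changes the command; the URL is a placeholder, and the commands are only printed, not run:

```shell
# --level=1 fetches the start page plus the pages it links to directly;
# --mirror implies --level=inf (no cap).
for depth in 1 2 inf; do
  echo "wget --recursive --level=$depth https://example.com/"
done
```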
This option controls which characters found in remote URLs are escaped when Wget generates local file names. It will mostly not be required and defaults to the correct mode for the operating system being used. Available modes are “unix”, “windows”, “nocontrol”, “ascii”, “lowercase”, and “uppercase.”
This option sets the interval, in seconds, between retrievals. It can be used to throttle the requests being made and is useful when downloading a large site.
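A back-of-the-envelope way to see the cost of throttling: a crawl takes at least pages × wait seconds of wall-clock time. The page count and wait value below are illustrative:

```shell
# Estimate the minimum crawl time a --wait setting imposes.
# wget --mirror --wait=2 https://example.com/   # placeholder URL
pages=500
wait_seconds=2
echo "Minimum crawl time: $((pages * wait_seconds)) seconds"
```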
This option causes Wget to write all messages to the specified log file instead of the console.
This option specifies a file from which Wget reads the seed URLs, letting you queue multiple URLs for download in a single run.
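The seed file is just one URL per line; the URLs below are placeholders and the wget line is shown commented:

```shell
# Build a seed file with one URL per line, then point wget at it.
printf '%s\n' 'https://example.com/' 'https://example.com/blog/' > urls.txt
# wget --mirror --input-file=urls.txt
wc -l < urls.txt
```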
Specify a comma-separated list of directories that should not be downloaded. I find this useful during testing to limit the number of files retrieved.
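The directory paths take no wildcard trickery in the simple case, just a comma-separated list; the paths and URL below are placeholders:

```shell
# Skip noisy sections of a site while testing.
# wget --mirror --exclude-directories=/tag,/search https://example.com/
EXCLUDES='/tag,/search'
echo "Excluding: $EXCLUDES"
```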
This option sets Wget to span multiple hosts, which is only necessary when a site's assets are served from different domains.
This option specifies a whitelist of domains to retrieve files from. It is necessary when using --span-hosts to prevent Wget from downloading the whole internet.
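The two options travel together; a sketch with placeholder domains, with the wget line shown commented:

```shell
# Follow asset links onto other hosts, but only the hosts listed in
# --domains, so the crawl cannot wander off across the web.
# wget --mirror --page-requisites --span-hosts \
#      --domains=example.com,cdn.example.com https://example.com/
ALLOWED='example.com,cdn.example.com'
echo "Allowed hosts: $ALLOWED"
```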
There are many more options that can be used; these are just the ones I have found useful for archiving a site locally.