Hi, is it possible to download and store a whole website for viewing when offline? Thanks in advance.
I think you need something called a spider. Free Download Manager comes with one; it's primarily a download manager, and I've used it many times in the past, but according to this page it includes the function you need (Ctrl-F "spider" and you'll see it).
http://www.freedownloadmanager.org/features.htm
How exactly you browse the saved copy, I don't know; I've never used the spider feature myself.
Give this a whirl: http://www.httrack.com/
Be careful though: some websites don't like being swamped with multiple simultaneous connections (and I guess there may be potential copyright issues as well, although that's unlikely to be a problem in practice).
j.o.s.h.1408 (30-01-2010)
Wget will do the job; there's a Windows version of it too.
Something like `wget -mk http://example.org` would do the trick.
`wget -mk -w 1 http://example.org` would wait a second between each fetch.
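If you want the saved copy to be fully browsable offline, a few more flags help: `-p` fetches the images and stylesheets each page needs, and `-E` renames dynamic pages to end in `.html` so they open in a browser. A sketch along those lines (example.org is just a placeholder; adjust the wait to be polite to the server):

```shell
wget --mirror \
     --convert-links \
     --page-requisites \
     --adjust-extension \
     --wait=1 \
     http://example.org
```

`--mirror`, `--convert-links`, `--page-requisites`, and `--adjust-extension` are the long forms of `-m`, `-k`, `-p`, and `-E` respectively, so this is the same idea as the commands above, just with the extras for offline viewing.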