Let's say I wanted to copy the bbc.co.uk/news site and whack it on a CD to send to someone, with the CD auto-running when inserted.
What application would I need to rip the entire site to do this?
Something like this?
http://www.webstripper.net/
I've not used offline web-crawling tools like this in years. They used to have major problems with JavaScript links, but for static content sites they are great.
Basically you point it at the page to start ripping from and tell it how many levels deep to follow the links and copy the content.
Bear in mind that the site has links to news, weather, sport, etc. across the top, so they would be trawled too unless you explicitly exclude them.
Is that the kind of thing you are after?
don't wanna pay for it!
wget, supplied with the right options, would do the ripping
wget is a command-line GNU tool for downloading things
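As a sketch of the kind of invocation that would mirror the site for offline browsing (flags taken from the wget manual; the depth and URL are just examples you'd adjust):

```shell
# -m  (--mirror)           recursive download with timestamping
# -k  (--convert-links)    rewrite links so pages work from local disk / CD
# -E  (--adjust-extension) save HTML pages with an .html extension
# -p  (--page-requisites)  also fetch CSS, images, etc. needed to render pages
# -np (--no-parent)        don't climb above /news/ into the rest of bbc.co.uk
# -l 3 (--level)           limit recursion depth, e.g. three links deep
wget -m -k -E -p -np -l 3 https://www.bbc.co.uk/news/
```

Without -np it will happily trawl the news, weather and sport links across the top of every page, which is the exclusion point made above.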
once you have a mirror of the bbc news site saved, look at http://www.ashzfall.com/products/aut...owsercall.html
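For the auto-run side, a CD's root directory would normally carry an autorun.inf along these lines (a sketch only: launcher.exe stands in for whatever browser-launching tool you use, and index.html for the mirror's entry page — both names are assumptions):

```
[autorun]
open=launcher.exe index.html
icon=news.ico
```

Note that AutoRun only fires on Windows, and modern versions prompt or ignore it for data CDs, so the recipient may still need to open the disc manually.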