OT - need some help/advice with grabbing data from a website

There's a website I'd like to get some data from. The data is all there, free and publicly available, but they make you go through a series of drop-down, pick-from-a-list menus to reach each piece, and there are enough pieces that doing it manually would take WAY too long.

Is there some software or language that could automate this?

Reply to
jtaylor

This is really not the best newsgroup to ask this in. I'm not admonishing you at all; I just want to say that you may get better help elsewhere.

That said, I do this all the time -- it's sometimes called "scraping". My website scrapes data from another website, and I make about $7 per day doing that.

I use Perl and the LWP::UserAgent module to do it. I use Perl for all my other website purposes as well. There is quite a learning curve involved, but Perl is amazingly powerful.
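For what it's worth, here is a rough sketch of the same idea using only Python's standard library, since not everyone wants to pick up Perl: fetch the form page, pull the option values out of each drop-down, and enumerate every combination to build the request URLs you'd then fetch one by one. The form action and field names ("state", "year") are made up for illustration; substitute whatever the real site uses. A canned page stands in for the network fetch so the sketch runs offline.

```python
from html.parser import HTMLParser
from itertools import product
from urllib.parse import urlencode

class SelectParser(HTMLParser):
    """Collect the option values of every <select> on a form page."""
    def __init__(self):
        super().__init__()
        self.selects = {}        # select name -> list of option values
        self._current = None     # name of the <select> being read

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "select":
            self._current = attrs.get("name")
            self.selects[self._current] = []
        elif tag == "option" and self._current is not None:
            self.selects[self._current].append(attrs.get("value"))

    def handle_endtag(self, tag):
        if tag == "select":
            self._current = None

# In practice you'd get this with urllib.request.urlopen(url).read();
# a hard-coded page is used here so the example needs no network.
page = """
<form action="/lookup">
  <select name="state"><option value="OH">Ohio</option>
                       <option value="PA">Pennsylvania</option></select>
  <select name="year"><option value="2003">2003</option>
                      <option value="2004">2004</option></select>
</form>
"""

parser = SelectParser()
parser.feed(page)

# Enumerate every drop-down combination and build the query string
# for each request you would then fetch in turn.
names = list(parser.selects)
queries = [urlencode(dict(zip(names, combo)))
           for combo in product(*parser.selects.values())]
for q in queries:
    print("/lookup?" + q)
```

Each printed URL is one "pick from every list" combination, which is exactly the part that takes forever by hand.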

i
Reply to
Ignoramus5104

wget ... if you have a decent OS

Nick

Reply to
Nick Müller

... and it's been ported to Windows, just in case you don't.

Reply to
nick

Google for WinHTTrack; it is a web site grabber program. Works fine. Dave F.

Reply to
David L. Foreman


The full version of Adobe Acrobat will let you download a web site and its associated links to create a PDF file. Not sure if that helps or not...

Reply to
Joe AutoDrill

PolyTech Forum website is not affiliated with any of the manufacturers or service providers discussed here. All logos and trade names are the property of their respective owners.