Semi-OT: "Mining" Web Sites
It was thus said that the Great healyzh_at_aracnet.com once stated:
>
> I'm looking for a software package that can take a snapshot of a website for
> archival purposes and go 'x' levels deep. I want it to be able to snag
> stuff such as PDF documents.
Look for a program called ``wget''.  It comes with Linux, but it should run
fine under Unix in general.  Lynx might have similar functionality, but
don't quote me on that 8-)
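  Something along these lines usually does the trick (the URL below is just a
placeholder, and the exact option names may vary by wget version, so check
the man page first):

    # grab a site up to 3 levels deep, keep the PDFs and the HTML pages,
    # and rewrite links so the local copy can be browsed offline
    wget --recursive --level=3 --no-parent \
         --accept pdf,html --convert-links \
         http://www.example.com/

  The --accept list is what lets it snag the PDFs; without it wget keeps
everything it finds at that depth.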
-spc (Has used wget several times before. Cool program)
Received on Sat Oct 21 2000 - 13:57:48 BST