[TriLUG] wget, curl ???
Chess Griffin
chess at chessgriffin.com
Mon Jul 21 10:09:24 EDT 2008
James Jones wrote:
> All,
>
> I want to capture a list of files on a website. I don't want to
> download the files. I thought at first that wget would be the best for
> this, but it appears that it will download the files.
>
> What would be the simplest way to achieve my goal?
>
> jcj
A list of the files or a list of the links? If the latter, check out
lynx, w3m, or one of the other text browsers. Maybe something like this
would work:
$ lynx -dump http://www.example.com > log.txt
or, if you know what types of files you want listed:
$ lynx -dump http://www.example.com | grep ogg > log.txt
You can probably fine-tune the grep to eliminate the junk and capture
just what you want.
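If you only want the links themselves, lynx's -listonly option (assuming
your lynx build supports it) should trim the dump down to just the
References list, one numbered URL per line:

$ lynx -dump -listonly http://www.example.com > log.txt

That output is easy to grep for .ogg or whatever else you're after.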
There are probably much better ways to do this, but lynx was the first
thing that came to mind.
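Since you mentioned wget and curl: curl only fetches the page you give
it and prints it to stdout, so it won't pull down the linked files
either. A rough sketch, assuming GNU grep for the -o option:

$ curl -s http://www.example.com | grep -oE 'href="[^"]*"' > log.txt

That just scrapes the raw href attributes out of the HTML rather than
letting a browser resolve them, so you may need to clean up relative
links yourself.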
--
Chess Griffin
GPG Key: 0x0C7558C3
http://www.chessgriffin.com