[TriLUG] Website Directory Listing via HTTP?
Tanner Lovelace
clubjuggler at gmail.com
Thu Aug 25 14:27:01 EDT 2005
On 8/25/05, Shane O'Donnell <shaneodonnell at gmail.com> wrote:
> One would suppose that if wget/curl were able to walk directories
> recursively on a web server, downloading as they go, the information
> I'm looking (listing of filenames in directory, hierarchical format)
> for is merely a subset of that, no?
>
> So perhaps the web servers I'm working with allow this remote
> directory listing capability. And if that's the case, I'm just
> looking for a quick/dirty utility to exploit it.
>
No, because recursive retrieval only means that wget/curl
follows every link it finds. So, only if you don't have an
index.html (or whatever other indices are configured) in
the directories you want *and* the server is set up to automatically
generate an index listing the files, then, and only then,
will it work. But really, this isn't something HTTP was
designed for, and however you do it, it will be like trying to wax
your car with butter -- it might work, but there are *much* better
ways of doing it.
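(If the server *does* auto-generate indexes, the "listing" is just an
HTML page full of links, so you can scrape the names out of it. Here's
a rough Python sketch of that idea -- the sample page and the filtering
rules are made up to resemble a typical Apache autoindex page, not any
particular server's output:)

```python
# Sketch: pull entry names out of a server-generated directory index.
# Assumes the server has directory indexing enabled (e.g. Apache's
# "Options +Indexes"); the sample HTML below is invented for illustration.
from html.parser import HTMLParser

class IndexParser(HTMLParser):
    """Collect href targets from an auto-generated directory index."""
    def __init__(self):
        super().__init__()
        self.entries = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                # Skip sort-order links ("?C=N;O=D") and absolute links
                # like the "Parent Directory" entry.
                if name == "href" and value and not value.startswith(("?", "/")):
                    self.entries.append(value)

def list_entries(index_html):
    parser = IndexParser()
    parser.feed(index_html)
    return parser.entries

# A typical autoindex page, trimmed down:
sample = """
<html><body><h1>Index of /files</h1>
<a href="?C=N;O=D">Name</a>
<a href="/">Parent Directory</a>
<a href="docs/">docs/</a>
<a href="readme.txt">readme.txt</a>
</body></html>
"""

print(list_entries(sample))  # entries ending in "/" are subdirectories
```

To walk a whole tree you'd fetch each entry ending in "/" and parse it
the same way -- which is exactly the crawl wget does, just without
downloading the files.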
Cheers,
Tanner
--
Tanner Lovelace
clubjuggler at gmail dot com
http://wtl.wayfarer.org/
(fieldless) In fess two roundels in pale, a billet fesswise and an
increscent, all sable.