[TriLUG] Website Directory Listing via HTTP?

Shane O'Donnell shaneodonnell at gmail.com
Thu Aug 25 13:50:27 EDT 2005


One would suppose that if wget/curl are able to walk directories
recursively on a web server, downloading as they go, then the
information I'm looking for (a listing of the filenames in each
directory, in hierarchical format) is merely a subset of that, no?
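
Something along these lines might do it with wget's spider mode; this is
just a sketch on my part (untested here, and example.com and the path
are placeholders):

  $ wget -r -np -l 5 --spider -nv -o spider.log http://example.com/files/
  $ grep -oE 'http://[^ ]+' spider.log | sort -u

Recursive spidering still fetches the HTML pages so it can find links,
but it doesn't keep the content, and the log should end up with the URL
of everything it visited, which is close to a remote "ls -R".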

So perhaps the web servers I'm working with do allow remote directory
listings.  If that's the case, I'm just looking for a quick-and-dirty
utility to exploit it.
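
For the servers that do generate an index page, even something as crude
as this gets me the filenames (again just a sketch; example.com is a
placeholder, and the grep/sed would need tweaking to drop the sort-order
links and parent-directory entry that Apache's autoindex adds):

  $ curl -s http://example.com/files/ | \
      grep -oE 'href="[^"]*"' | \
      sed -e 's/^href="//' -e 's/"$//'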

Shane O.

On 8/25/05, Christopher L Merrill <chris at webperformanceinc.com> wrote:
> Shane O'Donnell wrote:
> > I'm trying to figure out a way to use wget/curl to grab a simple
> > listing of files without grabbing the files.  Think of it as a remote
> > "ls" via HTTP.
> 
> I think this is only possible if the server is configured to allow
> directory listings.  Most are not.  And many of those that are would
> return the default page for the directory (index.html, etc.) if it
> exists.
> 
> If you don't have control of the server in question, you may not have
> any choices beyond FTP/SSH.
> 
> C
> 
> 
> 
> --
> -------------------------------------------------------------------------
> Chris Merrill                  |  http://www.webperformanceinc.com
> Web Performance Inc.
> 
> Website Load Testing and Stress Testing Software
> -------------------------------------------------------------------------


-- 
Shane O.
========
Shane O'Donnell
shaneodonnell at gmail.com
====================


