[TriLUG] marginally OT: Some bandwidth and DoS questions

Michael Tharp gxti at partiallystapled.com
Tue May 29 13:29:45 EDT 2007


Matt Nash wrote:
> Hello all,
>
> I am involved in evaluating a lawsuit brought against a web hosting 
> provider by a former client, and some questions have arisen that I don't 
> know the answer to.  I hope some of the networking gurus here can help 
> me out:
>
> Bandwidth
> -For a DSL connection with 768 down and 384 up, if you are consuming all 
> of the upstream bandwidth, does that mean you have no available 
> downstream bandwidth?
>   
No; those two ratings apply to separate channels. If you are consuming all 
of the upstream, the TCP acknowledgements for that traffic will take up some 
downstream bandwidth, but for the most part this is negligible.
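
As a rough back-of-the-envelope sketch (the 1460-byte segment payload, 
40-byte ACK, and one-ACK-per-two-segments figures below are typical 
assumptions, not measurements), a saturated 384 kbps upstream only 
generates a few kbps of acknowledgement traffic on the downstream side:

    # Rough estimate of downstream ACK traffic generated by a saturated
    # 384 kbps upstream. The segment size, ACK size, and delayed-ACK
    # ratio are typical values, not measurements.
    upstream_bps = 384_000          # upstream line rate in bits per second
    segment_payload_bytes = 1460    # typical TCP payload per full segment
    ack_bytes = 40                  # bare ACK: 20-byte IP + 20-byte TCP header
    segments_per_ack = 2            # delayed ACKs: one ACK per two segments

    segments_per_sec = upstream_bps / (segment_payload_bytes * 8)
    ack_bps = (segments_per_sec / segments_per_ack) * ack_bytes * 8
    print(f"~{ack_bps / 1000:.1f} kbps of downstream ACK traffic")
    # prints roughly 5 kbps -- small next to the 768 kbps downstream
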
> -For a T-1 with a nominal speed of 1.5 Mbps, is 1.5 the total available 
> bandwidth for upstream and downstream, with the entire pipe 
> theoretically available for unidirectional traffic (i.e. 1.5 Mbps down, 
> 0 Mbps up)?
>   
Again, the rating applies to two independent channels: a T-1 is full 
duplex, with roughly 1.5 Mbps available in each direction. If one 
direction is fully consumed, that will not affect the other (assuming 
no overhead).
> DoS
> -Have you ever heard of anyone (ab)using wget to accomplish a deliberate 
> non-distributed DoS attack?
>   
Yes, although it's a rather poor way to do it if the intent is to cause harm.
> -How much traffic/how many requests per second would be required to 
> effectively DoS a 768/384 DSL connection? 
>   
The number of requests per second needed would depend on the server's 
capabilities. In terms of raw traffic, it would take about 384 kbps of 
sustained traffic (minus overhead), enough to fill the connection's 
upstream. The more connections used to accomplish this, the merrier.
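
As a rough illustration (the average response sizes below are purely 
assumed for the sake of the arithmetic), the request rate needed to fill 
a 384 kbps upstream falls out of a one-line calculation:

    # How many requests per second it takes to keep a 384 kbps upstream
    # full, for a few assumed average response sizes. The sizes are
    # illustrative assumptions only.
    upstream_bps = 384_000

    for avg_response_kb in (5, 50, 500):
        response_bits = avg_response_kb * 1024 * 8
        req_per_sec = upstream_bps / response_bits
        print(f"{avg_response_kb:>4} KB responses: "
              f"~{req_per_sec:.2f} requests/second to fill the upstream")
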
> -Could this be done with wget? 
>   
Yes.
> -Is there any resource on the internet which outlines how much data a 
> computer with a given processor is capable of putting on a network 
> interface?
>   
Just about any hardware can saturate a connection like that with a single 
normal TCP stream. A proper DoS would instead use hundreds or thousands of 
simultaneous connections (whether from the same origin or not), which would 
bring the hardware to its knees as well.
> wget
> -Have you ever heard of anyone using wget to back up entire websites?
>   
Yes.
> -Would using this method on a PHP-based website actually accomplish a 
> backup?  It seems to me that it would only gather the generated HTML and 
> Javascript.
>   
You are correct that it would only back up the processed output, not 
the original PHP scripts. It's not a half-bad way to mirror the actual 
content (images, media files), though. If you want to deter this, wget 
(and many other standalone downloaders/spiders) will honor robots.txt, 
and Google would be more than happy to tell you how to set up such a file.
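
For what it's worth, a mirroring run along those lines could look 
something like the sketch below (driven from Python here; 
http://www.example.com/ is a placeholder, and the flags shown are 
standard wget options). A robots.txt at the site root containing 
"User-agent: Wget" and "Disallow: /" is enough to make a stock wget 
pass the site by.

    # A minimal sketch of a wget site mirror driven from Python; the URL
    # is a placeholder. Stock wget honors robots.txt during recursive
    # fetches, so a site can opt out of this entirely.
    import subprocess

    subprocess.run([
        "wget",
        "--mirror",           # recursive download, re-fetching only changed files
        "--convert-links",    # rewrite links so the copy browses locally
        "--page-requisites",  # also fetch images, CSS, and other embedded files
        "--wait=1",           # pause between requests to go easy on the server
        "http://www.example.com/",
    ], check=True)
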
> Thanks for any help you might be able to provide.
>
> Matt Nash
>   