On Tue, 8 Feb 2005, Cameron Kaiser wrote:
> > Can't one put monitoring software in place that cuts off any one IP
> > address after, say, more than 100MB (or whatever amount makes the most
> > sense) has been downloaded?
>
> I don't know if ipfilters is that smart. Plus, you run the risk of cutting
> off proxies, which might have legitimate reasons to download a lot of data
> (like if a lot of people are using them). Not a big deal for FTP since there
> aren't many FTP proxies, but it's a possible problem for HTTP.
I'm sure there are legitimate reasons why someone would access the
site through a proxy (from work, or from China perhaps), but I don't see
this as a major problem.
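A per-IP quota like the one described above is easy enough to sketch in a
few lines. This is just an illustration (the class name and numbers are
made up, not anyone's actual setup); real enforcement would push the
blocked addresses into firewall rules (ipf/ipfw/iptables) rather than
just remembering them in a set:

```python
from collections import defaultdict

QUOTA_BYTES = 100 * 1024 * 1024  # hypothetical 100 MB cap per IP


class BandwidthMeter:
    """Track bytes served per client IP; flag IPs that exceed a quota."""

    def __init__(self, quota=QUOTA_BYTES):
        self.quota = quota
        self.usage = defaultdict(int)  # ip -> total bytes served
        self.blocked = set()

    def record(self, ip, nbytes):
        """Account nbytes to ip; return False once ip is over quota."""
        if ip in self.blocked:
            return False
        self.usage[ip] += nbytes
        if self.usage[ip] > self.quota:
            # In a real deployment, this is where you would add a
            # firewall rule to drop further traffic from this address.
            self.blocked.add(ip)
            return False
        return True


meter = BandwidthMeter()
meter.record("10.0.0.1", 60 * 1024 * 1024)            # under quota
allowed = meter.record("10.0.0.1", 50 * 1024 * 1024)  # now past 100 MB
print(allowed)  # False: 10.0.0.1 is cut off
```

The proxy concern raised above is exactly the weakness of this scheme: a
busy proxy looks like one greedy IP, so the quota may need to be generous.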
> Since I implemented this (admittedly draconian) policy, I haven't lost many
> users except the ones I wanted to, and my site availability is much better.
I'll have to keep this in mind once I start offering content. What is
your system developed in? Perl?
--
Sellam Ismail Vintage Computer Festival
------------------------------------------------------------------------------
International Man of Intrigue and Danger http://www.vintage.org
[ Old computing resources for business || Buy/Sell/Trade Vintage Computers ]
[ and academia at www.VintageTech.com || at http://marketplace.vintage.org ]
Received on Tue Feb 08 2005 - 19:43:18 GMT