Talk:Request throttling


There will be problems if there are several users behind a single router (NAT) sharing one IP address. --17:06, 8 October 2005 81.225.66.192
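A minimal sketch of why that breaks, assuming the throttle is a simple per-IP token bucket (the names and numbers here are illustrative, not MediaWiki's actual code): every client behind the NAT presents the same source address, so they all drain one shared bucket.

 import time
 from collections import defaultdict
 
 RATE = 1.0    # tokens replenished per second (illustrative values)
 BURST = 10.0  # bucket capacity
 
 _buckets = defaultdict(lambda: {"tokens": BURST, "stamp": time.monotonic()})
 
 def allow_request(ip):
     # All users behind one NAT router share this bucket, since the
     # server only ever sees the router's public IP address.
     b = _buckets[ip]
     now = time.monotonic()
     b["tokens"] = min(BURST, b["tokens"] + (now - b["stamp"]) * RATE)
     b["stamp"] = now
     if b["tokens"] >= 1.0:
         b["tokens"] -= 1.0
         return True
     return False  # one heavy user throttles the whole office or school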

Actually, there are more severe problems. This page is written in terms of installing modules for Apache, but large MediaWiki sites tend to place their Apaches behind reverse-proxy caches such as Squid or Varnish. Any error message or output that an Apache module sends to a malicious 'bot risks being cached and served to legitimate users attempting to access the same page, since non-logged-in pageviews are served from the proxy cache where possible.
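One way around that, sketched here as a hypothetical WSGI-style middleware rather than anything Apache or Squid actually ships: mark the throttle's error responses as uncacheable before they reach the proxy.

 def uncacheable_errors(app):
     # Hypothetical WSGI middleware: ensure error pages produced by a
     # throttling module carry Cache-Control: no-store, so Squid/Varnish
     # never serves a 'bot's error page to a legitimate anonymous reader.
     def middleware(environ, start_response):
         def sr(status, headers, exc_info=None):
             if status.startswith(("403", "429", "503")):
                 headers = [(k, v) for k, v in headers
                            if k.lower() != "cache-control"]
                 headers.append(("Cache-Control", "no-store"))
             return start_response(status, headers, exc_info)
         return app(environ, sr)
     return middleware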

It also appears to focus on DDoS-like activity and misuse of wget-style tools, but ignores other issues such as bulk creation of accounts, repeated spam 'bot posting, repeated failed captcha attempts and automated vandalism to wiki content. What's needed is the ability to auto-block users/IPs for repeated 'bot abuse. Instead of telling the 'bot that tries to create ten accounts in a row and post one page of gibberish "you have exceeded the number of new accounts per IP today", let it try to create a hundred accounts and then block them all, along with all IPs used, reverting all "contributions" (a rough sketch of this follows below).

The use of 'hotlinked' images by other sites (of which forums and blogs tend to be the worst) also needs to be addressed. On a standard Apache site these are blocked using .htaccess rules that inspect the HTTP 'Referer' header, but if large wikis are running Squid or another reverse cache in front, the countermeasures need to be moved there (see the second sketch below).
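Roughly what that auto-block could look like. This is a sketch only; block_ip, block_account and revert_revisions are hypothetical stand-ins for whatever admin actions the wiki exposes, not real MediaWiki functions, and the threshold is arbitrary.

 from collections import defaultdict
 
 ABUSE_THRESHOLD = 100  # let the 'bot dig in before the trap springs
 
 _abuse = defaultdict(list)  # ip -> list of (account, revision_id)
 
 def block_ip(ip): print("blocking IP", ip)              # stand-ins for the
 def block_account(a): print("blocking account", a)      # wiki's real admin
 def revert_revisions(r): print("reverting", r)          # API calls
 
 def record_abuse(ip, account, revision_id=None):
     # Count each abusive action (account creation, spam post, failed
     # captcha, vandal edit) instead of refusing them one at a time.
     _abuse[ip].append((account, revision_id))
     if len(_abuse[ip]) >= ABUSE_THRESHOLD:
         for acct in {a for a, _ in _abuse[ip]}:
             block_account(acct)   # block every account used...
         block_ip(ip)              # ...and the IP behind them,
         revert_revisions([r for _, r in _abuse[ip] if r is not None])
         del _abuse[ip]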
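And the hotlinking countermeasure, moved from .htaccess into whatever sits at the cache layer. Again a sketch, with an assumed hostname and image path, mirroring the usual Referer-checking recipe:

 from urllib.parse import urlparse
 
 ALLOWED_HOSTS = {"example-wiki.org"}  # assumed; the wiki's own hostnames
 
 def is_hotlink(path, referrer):
     # Refuse image requests whose Referer points at a foreign site,
     # but allow an empty Referer, as the .htaccess version does.
     if not path.startswith("/images/"):
         return False
     if not referrer:
         return False
     host = urlparse(referrer).hostname or ""
     return host not in ALLOWED_HOSTS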

As for the Apache modules mentioned, they are still extant and available from third-party sites - I'm not sure, though, whether they're the solution. --Carlb 00:14, 27 August 2009 (UTC)

DDoS-Deflate

mod_evasive

mod_throttle