Yay, I implemented a way to keep #AI #Crawlers from hammering our puny community server without relying on JavaScript/Anubis yet:
* hashlimit HTTPS to a few new connections per minute per source IP. (Browsers reuse connections via keep-alive anyway and only open one or a few.)
* Anything above that hashlimit gets hard-dropped.
* Configure the web server to drop the connection when the reply code is >= 400.
* Make a hidden page with thousands of nonsense links that all end in 404.
* Link to that hidden page via hidden links from almost every other page.
* Put that hidden page (and some non-existent links) into robots.txt as 'Disallow'.
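The hashlimit step could look roughly like this, assuming iptables with the `hashlimit` and `conntrack` match modules; the table name, rate, and burst below are illustrative, not the actual values used:

```shell
# Rate-limit new HTTPS connections per source IP; anything above the
# limit is hard-dropped (no RST, the crawler just hangs and times out).
# Values here are made-up examples, not the server's real config.
iptables -A INPUT -p tcp --dport 443 -m conntrack --ctstate NEW \
  -m hashlimit --hashlimit-name https-limit --hashlimit-mode srcip \
  --hashlimit-above 6/minute --hashlimit-burst 15 -j DROP
```

Since browsers hold one connection open with keep-alive, normal visitors never come near the limit, while a crawler opening fresh connections per request trips it quickly.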
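For the "drop on >= 400" step, one way to do it, assuming nginx (the post doesn't name the web server): nginx's non-standard status 444 closes the connection without sending any response, so every error the crawler triggers also burns a connection against its hashlimit budget:

```nginx
# Hypothetical nginx fragment: route error codes to a named location
# that returns 444, nginx's "close connection without reply" code.
server {
    listen 443 ssl;
    error_page 400 401 403 404 405 410 = @drop;
    location @drop {
        return 444;
    }
}
```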
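And the robots.txt step, with the same illustrative `/trap/` path plus a made-up non-existent one:

```
# Polite crawlers obey the Disallow and skip the trap;
# the hammering ones ignore it and walk into the 404 maze.
User-agent: *
Disallow: /trap/
Disallow: /definitely-not-here/
```

Listing the trap in robots.txt doubles as a filter: anything that fetches a disallowed path has identified itself as a rule-ignoring bot.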
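The hidden trap page itself is easy to generate. A minimal bash sketch (path `/trap/` and the count are assumptions for illustration):

```shell
#!/usr/bin/env bash
# Build a hidden page full of random links under /trap/ — none of the
# targets exist, so every fetch ends in a 404 (and a dropped connection).
{
  echo '<!doctype html><html><body><ul>'
  for i in $(seq 1 2000); do
    p="/trap/${RANDOM}${RANDOM}${RANDOM}.html"
    echo "<li><a href=\"$p\">$p</a></li>"
  done
  echo '</ul></body></html>'
} > trap.html
```

The other pages then link to it invisibly, e.g. with an anchor styled `display:none`, so humans never click it but link-following crawlers do.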
🎉