I've had it with the aggressive #AI #crawlers now. Some bot has been hitting #MacPorts with a user agent that looks legitimate enough that I can't block it without also blocking real users.

Yesterday it sent 377k requests (62% of the total), 369k of them to URLs disallowed by robots.txt, from 274k unique IPs. Most of it was for content that could be analyzed much more quickly with an svn checkout or a git clone.

Dynamic content on the #web is broken. There's just no sane way to serve it anymore. What a waste of energy.

I'd love to spend my time on something more productive. Instead, I'm dealing with this shit on a daily basis now.

And I've been reading that the bots have now learned to solve Anubis challenges, too. So the only defense left is to make the challenges so expensive to solve that it drains the proxies' batteries, but that also makes for a very bad user experience for everybody else.
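
For anyone wondering what "more expensive challenges" means in practice: Anubis-style challenges are hashcash-style proof of work, roughly like the sketch below (illustrative Python, not Anubis's actual code; the function names and the difficulty parameter are mine). Every extra bit of difficulty doubles the expected work on the client, for bots and real visitors alike.

```python
# Rough sketch of a hashcash-style proof-of-work challenge, the kind of
# scheme Anubis-like tools use. Purely illustrative, not Anubis's API.
import hashlib
import os

def leading_zero_bits(digest: bytes) -> int:
    """Count the number of leading zero bits in a hash digest."""
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
            continue
        bits += 8 - byte.bit_length()  # zero bits before the first set bit
        break
    return bits

def solve(challenge: bytes, difficulty: int) -> int:
    """Client side: brute-force a nonce; expected cost is ~2**difficulty hashes."""
    nonce = 0
    while True:
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if leading_zero_bits(digest) >= difficulty:
            return nonce
        nonce += 1

def verify(challenge: bytes, nonce: int, difficulty: int) -> bool:
    """Server side: a single hash to check, no matter how high the difficulty."""
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return leading_zero_bits(digest) >= difficulty

if __name__ == "__main__":
    challenge = os.urandom(16)   # issued per visitor by the server
    difficulty = 20              # each +1 doubles the client's expected work
    nonce = solve(challenge, difficulty)
    print("solved with nonce", nonce, "-", verify(challenge, nonce, difficulty))
```

At difficulty 20 that's already around a million hashes per page load. Push it high enough to actually hurt a botnet's proxies and every legitimate visitor's phone pays the same price.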