I've had it with the aggressive #AI #crawlers now. Some bot has been hitting #MacPorts with a user agent that looks legitimate enough that I can't block it without also blocking real users.

Yesterday it sent 377k requests (62% of our total traffic) from 274k unique IPs; 369k of those went to URLs forbidden by robots.txt. Most of it was for content that could be analyzed far more quickly with an svn checkout or git clone.

Dynamic content on the #web is broken. There's just no sustainable way to serve it anymore. What a waste of energy.
