I'm not sure why I didn't think of it in these terms earlier, but one aspect of how generative AI is ruining the web just occurred to me - my plan with the Surfhosting site is to embargo content behind a paywall, so basically you pay a (very reasonable, and by that I mean like $3 a year or somewhere around there) fee and get to read articles 180 days before they go open access ... and depending on how bad things get, I might also put portions of the documentation hub (the wiki-like portion of the site, rather than the blog-like portion) behind a similar wall.
I thought of this just now because of the giant pain in the ass it was for me to work around the Dell CPU throttling behavior in Debian 13.
previously I wouldn't have *dreamed* of doing things this way - I'd have just posted the solution somewhere as a public benefit, and not seen the search engines that led people to it as violating the social contract, even if they profited by putting ads next to the results.
but with scraper-fed #generativeAI, it feels very much like they're violating the social contract at my expense in order to transfer more wealth from my broke ass living an irl cyberpunk existence, to scheming billionaire scumbags like Sam Altman, and fuck every single aspect of that. so the content gets embargoed and everyone loses out to some degree. #genAI