I have been searching for a while for the origin post of some meme that I can't find, so out of desperation I was like "ok, if there is one thing that LLMs that have stolen the entirety of culture should be able to do, it's find imprecise matches to a copypasta format."
Of course it couldn't do that. One feature of this generation of ChatGPT is that, unless you have some prior chatty relationship with it in the context window or user prompt, the tone it heads for first when you say it's wrong is a sort of condescending exasperation. So I stuck around for another message to see if I could get it to try to soothe our relationship, and it did this...
It didn't even do a web search, it just, um... whatever this is. It's like when a toddler with no theory of mind tries to lie to you about something: they can't compute who is supposed to know or not know what, so sometimes it comes out backwards, as the lie you would have told them.