She Has an A.I. Boyfriend. Her Son Has Questions. | NYT Opinion https://youtu.be/TIe3ovHjZ8k
Twice divorced and lonely, Celeste is a smart and beautiful woman who just wants a partner she can summon at will...
It is good to see that her son cares about her so much and isn't dragging her off to speak to a mental health professional.
There's a lot happening in this piece. As in so many fields, the AI component seems to drive wedges into old assumptions and forces us to re-examine the cracks: things that were always there but that we've mostly papered over.
People regularly "fall in love" with their care providers; this isn't some new pattern of human behaviour. It's complicated by the non-existence of the AI entity, but then the personas of pig butchering scammers also don't "really exist", and they succeed at making "human connections" all the time.
There's no simple resolution to this, because it's more complicated than "AI bad". And it's nice that they didn't take that easy route.
Celeste is clearly "going through some stuff". Part of that is masked by the real utility of having the Assistant. Part of it may just be a real need for healing time. Part of it may need a professional, but that's not my call.
I expect we'll be having many more of these conversations as "Assistants" start really spreading.
@gatesvp Thanks for the thoughtful reply! :)
Agreed there is no simple resolution to this because this is complicated.
I do like that she is still very aware it is an AI (I have seen other interviews where people in love with their AI partners are convinced they are sentient or conscious).
The key is connection: she surely has family and friends, but they are not able to fill the need in her that feels validated by this relationship.
I too expect these relationships will become more widespread, so the conversation around what defines a partner is a big part of this.