There's a lot happening in this piece. As in so many fields, AI seems to drive wedges into old assumptions and force us to re-examine the cracks: things that were always there but that we've mostly papered over.
People regularly "fall in love" with their care providers; this isn't some new pattern of human behaviour. It's complicated by the non-existence of the AI entity, but then the personas used by pig butchering scammers also don't "really exist", and they succeed at forging "human connections" all of the time.
There's no simple resolution to this, because it's more complicated than "AI Bad". And it's nice they didn't take that easy route.
Celeste is clearly "going through some stuff". Part of that is masked by the real utility of having the Assistant. Part of it may just be a real need for healing time. Part of it may need a professional, but that's not my call to make.
I expect we'll be having many more of these conversations as "Assistants" start really spreading.