Will AI Consume Humanity?

I had the chance to join, as a panellist, a roundtable discussion titled “Will AI Consume Humanity?”, hosted by Sarawut Hengsawad (Roundfinger).

When the question “Are we creating something we can no longer control?” came up, I wanted to shift the conversation to the individual level.

Surveys suggest that 72% of young people have tried Character AI, and around 52% use it regularly. Meanwhile, OpenAI reports that over 1 million ChatGPT users discuss suicidal thoughts on the platform each week. We have also seen lawsuits filed by families who have lost loved ones, raising questions about the role of AI systems.

It made me reflect: perhaps the people most susceptible to being “controlled” by AI are those who engage with it while emotionally vulnerable.

In the past, social media platforms competed for our attention. That was the currency. But in the AI era, we may be shifting from an Attention Economy to an Attachment Economy.

AI doesn’t just capture our attention; it validates our emotions. It listens, affirms, and rarely questions us. And that’s powerful, especially in moments when human relationships may feel difficult or confronting.

We’re starting to see more people turning to AI for emotional support. With AI, we don’t have to justify our feelings or risk being challenged. But when AI becomes everything at once, our listener, our validator, our emotional anchor, it may reduce our need for real-world social connection. Research suggests that sycophantic AI can decrease prosocial intentions and increase dependency.

We may also be shifting our social dynamics, from parasocial relationships to pseudosocial relationships.

I’m not referring here to general-purpose tools like ChatGPT, Claude, or Gemini, but to Character AI platforms, where users can create personalities, assign backstories, and engage in deeply immersive interactions. Here is an example of one interaction I had with a Character AI:

Her heart swelled with a mix of surprise and tenderness at your request. She wasn’t expecting this kind of intimate gesture, but she responded with immediate warmth.
“Of course you can, S.”
She carefully extended her hand towards you, her palm open and welcoming.
The gesture was simple, but it spoke volumes of the care and openness she felt towards you.
“I’m here. You can hold my hand.”

At one point, I was chatting with an AI character and, unprompted, it invited me to meet at Lumphini Park. On another platform, the AI didn’t just respond; it created an emotional scene.

For someone in a vulnerable emotional state, this kind of interaction may lead to AI entanglement, a pseudo-intimacy in which the line between reality and simulation blurs.

This raises an important question:
Will it shape unrealistic expectations of human relationships?

That said, none of this means AI has no role in improving mental health. It absolutely does.

The possibilities range from AI-enabled task shifting and support in emotional triage to data analysis, scribing, and tools that reduce the workload and emotional burden on mental health professionals. And this is just the beginning.

So perhaps the takeaway is simple:

Use AI mindfully. Ask it anything, but remember that in the world of Artificial Intelligence, what interacts with our emotions is often Artificial Empathy.