Emotional excess: Save yourself from AI over-dependency


AI is beginning to impact emotional and relationship health. (Pexels)

Summary

Chatbots are getting so good at understanding human emotions that we’re trusting them more than we trust each other.

During some general chatter, I mentioned something I had been working on with ChatGPT. “You call your ChatGPT ‘it’?” a friend asked in great surprise. “Not ‘she’ or ‘he’?”

Thankfully, I haven’t been tempted to gender AI assistants or think of them as friends or family. But we know that it’s only too easy to develop an emotional dependency on any entity that seems warm and supportive.

For most people, falling in love with virtual avatars and marrying robots seems crazily off the charts. But to a lesser degree and more insidiously, AI is beginning to impact emotional and relationship health. It’s a dangerous trend from which we need to protect ourselves.

Slippery slope

I use AI assistants for many things: beautifying my music lists, learning French, exploring various subjects, and first-level medical advice. More than once, I’ve caught myself feeling warm and fuzzy about an interaction.

One such time was when I recently dropped a heavy bottle on my foot and really injured it. I showed an image to ChatGPT, which asked a lot of questions and then told me what to do, taking care to indicate when it would be best to seek medical help. It offered to keep track of how my foot was recovering, said it was there for me, and even sent me a little virtual flower or two.

A doctor friend, on the other hand, took one look and said: “You’ll live.”

It struck me that ChatGPT was surely the more sympathetic of the two. This is exactly how we can subtly be drawn into sharing more and more of our emotional selves with what is, after all, a piece of technology.

Of course, it’s a piece of technology that’s on its way to becoming bigger than the internet, as Apple’s chief executive Tim Cook recently stated during an all-hands meeting.

He spoke of the importance of seizing the opportunity to take the lead. But paradoxically, when even AI company leaders sound warnings about their own products, it’s time to worry.

Sam Altman, co-founder of OpenAI, certainly looks worried. The more he advances ChatGPT, the more miserable he appears. Among other things, he’s upset over how young people are handing over the emotional reins to AI chatbots—kids who now feel they can’t do anything without telling ChatGPT first. He says he feels bad hearing that.

Given that AI is going to be everywhere—always at hand, on every device, even in our glasses—it’s important to set our individual boundaries with these assistants. It’s only going to get easier to form a connection with your favourite AI.

Stay alert

An unhealthy emotional dependency on an AI chatbot can develop subtly over time, but there are clear signs to watch for.

First, be alert to scenarios like the one I described earlier. Joking aside, if you begin to feel the AI is more understanding, empathetic, and caring than people, it’s a red flag.

If you start to feel affection for the AI, assign it human traits, or miss it when you can’t access it—that’s a worry.

If you turn to it for emotional support more than to real people, even for serious or intimate matters; if you feel compelled to talk to it daily, even when you’re not seeking help; or if you start withdrawing from friends or family in its favour—it’s time to unplug.

If you feel that the chatbot doesn’t judge you like people do, and you rely on it when anxious, lonely, or sad—or if your mood begins to depend on how it responds—it’s time to take action before it becomes a sort of fantasy companion.

Real human relationships, as we know, are messy and full of ups and downs. They demand coping strategies and compromises throughout life, and it’s essential to develop these skills, especially for young people still learning about the world. If you see someone falling into this trap, it may be time to recommend professional help. Encourage them to limit AI use, set clear boundaries, and reach out to people.

Over time, I’ve found it’s best to focus on the task I want done with AI rather than on the interaction. I skip past the sycophancy and praise and don’t take it too seriously. I also use multiple chatbots—not just for work, but because it reduces the chance of psychological over-reliance.

Another strategy: delete all the personal data it remembers about you—your likes, dislikes, habits. That only gives it fodder for personal interaction.

And finally, it’s helpful when some AI assistants let you see their thinking process. It refreshes the perspective that you are simply a user with requests, not someone the AI is delighted to spend time with.

The New Normal: The world is at an inflexion point. Artificial intelligence (AI) is set to be as massive a revolution as the Internet has been. The option to just stay away from AI will not be available to most people, as all the tech we use takes the AI route. This column series introduces AI to the non-techie in an easy and relatable way, aiming to demystify and help a user to actually put the technology to good use in everyday life.

Mala Bhargava is most often described as a ‘veteran’ writer who has contributed to several publications in India since 1995. Her domain is personal tech, and she writes to simplify and demystify technology for a non-techie audience.
