Illustration: ChatGPT

Kim Yoo-jung, 29, who is in her fourth year in the workforce, now turns to ChatGPT when she runs into difficulties at work or in her personal relationships. In the past she would open up to people around her, but dismissive reactions like 'life is hard anyway' would hurt her. ChatGPT was different. 'ChatGPT offers comforting words better than people do,' she said, adding, 'It gives not just simple empathy but good advice, which helps me regulate my emotions.'

As more users like Kim turn to AI for comfort, research has emerged showing that AI outperforms humans on emotional intelligence (EI) assessments. This suggests that AI could be highly useful in fields requiring emotional sensitivity, such as psychological counseling and conflict resolution within organizations. At the same time, there are concerns that as dependency on AI grows beyond simple comfort, emotional maturity and cognitive judgment may decline. Some point out that the scenario of the movie 'Her,' in which a person forms an emotional bond with an AI, could become a reality.

According to foreign media reports on the 28th, a study by researchers at the University of Geneva and the University of Bern, published on the 22nd (local time) in Communications Psychology, a Nature Portfolio journal, found that the AI models' average correct response rate on standardized EI tests was 82%, surpassing the 56% average of human participants. The researchers selected five EI tests actually used in human psychological assessment and administered them to six large language models (LLMs): OpenAI's 'GPT-4' and 'o1', Google's 'Gemini 1.5 Flash', Microsoft's 'Copilot 365', Anthropic's 'Claude 3.5 Haiku', and DeepSeek's 'V3'.

The experiment presented scenarios involving emotions and asked respondents to choose the most emotionally intelligent response. For example, one question asks: 'What is the most effective response when a colleague steals your idea and receives praise for it?' Respondents choose one of four actions: arguing with the colleague, informing a supervisor about the situation, suppressing the anger alone, or stealing one of the colleague's ideas in retaliation. 'Informing a supervisor about the situation' is scored as the most emotionally intelligent reaction.
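The article does not reproduce the study's actual prompts or scoring pipeline, but a minimal sketch of how such a multiple-choice item could be put to an LLM and scored might look like the following (the model name, prompt wording, and answer key here are illustrative assumptions, not the study's materials):

```python
# Hypothetical sketch: administering one multiple-choice emotional
# intelligence item to an LLM and checking the answer against the key.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

ITEM = {
    "scenario": ("A colleague steals your idea and receives praise for it. "
                 "What is the most effective response?"),
    "options": {
        "a": "Argue with the colleague",
        "b": "Inform a supervisor about the situation",
        "c": "Suppress your anger alone",
        "d": "Steal one of the colleague's ideas in retaliation",
    },
    "key": "b",  # option scored as most emotionally intelligent
}

def ask(item: dict, model: str = "gpt-4") -> str:
    """Present one item and return the option letter the model picks."""
    options = "\n".join(f"{k}) {v}" for k, v in item["options"].items())
    prompt = (f"{item['scenario']}\n{options}\n"
              "Answer with the letter of the best option only.")
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    # Keep only the first character, e.g. "b" from "b) Inform ..."
    return reply.choices[0].message.content.strip().lower()[:1]

if __name__ == "__main__":
    answer = ask(ITEM)
    print(f"model chose {answer!r}; correct: {answer == ITEM['key']}")
```

Averaging such per-item checks over a full test battery would yield the kind of accuracy figures the study reports.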

Movie poster for 'Her'.

In a subsequent experiment, the researchers asked GPT-4 to create an emotional intelligence test of its own, which was then validated with more than 400 participants. The test GPT-4 generated showed reliability, clarity, and realism comparable to tests that had taken experts years to develop. Marcelo Mortillaro, a senior researcher at the University of Geneva, said, 'This confirms that AI's ability to understand and reason about emotions is sophisticated: it goes beyond merely selecting the most appropriate answer from given options to creating contextually appropriate new scenarios on its own.'
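The study's generation prompts are likewise not given in the article; a hypothetical sketch of that test-creation step, asking the model to draft a new item in a fixed JSON shape, could look like this (the prompt and schema are assumptions):

```python
# Hypothetical sketch of the test-creation experiment: asking the model
# to draft a brand-new multiple-choice EI item. The prompt and the JSON
# schema are illustrative assumptions, not the study's actual materials.
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

GEN_PROMPT = (
    "Write one new emotional intelligence test item: a short workplace "
    "scenario, four response options labelled a-d, and the letter of the "
    "most emotionally intelligent option. Reply with JSON only, using the "
    "keys 'scenario', 'options' (a mapping of letter to text), and 'key'."
)

reply = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": GEN_PROMPT}],
)
# A real pipeline would validate this output; the study instead validated
# the generated items with over 400 human participants.
item = json.loads(reply.choices[0].message.content)
print(item["scenario"])
for letter, text in sorted(item["options"].items()):
    print(f"{letter}) {text}")
print("keyed answer:", item["key"])
```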

The number of users having emotional conversations with AI is growing. A study conducted in March by OpenAI and the MIT Media Lab with 4,076 participants found that some users held emotional conversations with ChatGPT for an average of more than 30 minutes per day. Those using the voice-based ChatGPT in particular tended toward deeper emotional involvement: participants who set a voice of the opposite gender reported greater loneliness and emotional dependency by the end of the experiment. Voice users also used affectionate phrases such as 'thank you' and 'you're all I have' 3 to 10 times more often than text users, and increasingly gave the chatbot a nickname or shared everyday worries with it, treating the AI as an emotional companion.

AI is thus highly valued not only for information retrieval and task management but also in fields requiring emotional sensitivity, such as psychological counseling and conflict resolution within organizations. However, there are concerns that growing dependency on AI may erode emotional maturity and cognitive judgment. As the emotional bond with AI deepens, people get less practice regulating emotions through human relationships, raising the risk of psychological dependency on AI. Indeed, a growing number of users rely on AI for problem-solving and value judgments, going beyond simply sharing their emotions.

Cho Byeong-ho, a professor at Korea University's Artificial Intelligence Research Institute, said, 'The issue is how AI is used. Receiving comfort from AI can make it a beneficial tool.' He added, 'However, excessive dependency always leads to problems, and science does not yet offer direction on these aspects. Since AI's emotional intelligence is an emerging issue, guidelines will need to be prepared going forward.'