March 5, 2025
AI chatbots struggle with empathy: Over-empathizing and gender bias uncovered

You can talk to an AI chatbot about virtually anything, from help with daily tasks to the problems you may want to solve. Its answers reflect the human knowledge that taught it how to act like a person; but how human-like are the latest chatbots, really?
As people turn to AI chatbots for more of their internet needs, and the bots are incorporated into more applications, from shopping to health care, a team of researchers sought to understand how AI bots replicate human empathy: the ability to understand and share another person's feelings.
A study posted to the arXiv preprint server, led by UC Santa Cruz Professor of Computational Media Magy Seif El-Nasr and Stanford University researcher and UCSC visiting scholar Mahnaz Roshanaei, explores how GPT-4o, the latest model from OpenAI, evaluates and performs empathy. In investigating the main differences between humans and AI, they find that major gaps exist.
They found that ChatGPT overall tends to be overly empathetic compared to humans; however, it fails to empathize during happy moments, a pattern that exaggerates human tendencies. They also found the bot empathized more when told the person it was responding to was female.
"This finding is very interesting and warrants more study and exploration, as it uncovers some of the biases of LLMs," Seif El-Nasr said. "It would be interesting to test whether such bias exists in later GPT models or other AI models."
Homing in on empathy
The researchers on this project are largely interested in the interplay between AI chatbots and mental health. Because empathy has been studied by psychologists for decades, they brought methods and lessons from that field into their study of human-computer interaction.
"When people are interacting directly with AI agents, it's important to understand the gap between humans and AI in terms of empathy: how it can understand and later express empathy, and what the main differences between humans and AI are," Roshanaei said.
To do so, the researchers asked both a group of humans and GPT-4o to read short stories of positive and negative human experiences and to rate their empathy toward each story on a scale of 1 to 5, then compared the responses. The stories came from real human experiences, collected from students when Roshanaei was a postdoc and made fully anonymous.
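To make the protocol concrete, here is a minimal sketch, in Python with OpenAI's SDK, of how such a 1-to-5 rating prompt could be posed to GPT-4o. The prompt wording, the `rate_empathy` helper, and the example story are assumptions for illustration, not the study's actual materials.

```python
# Minimal sketch of the 1-to-5 empathy-rating protocol described above.
# Prompt wording and the example story are illustrative assumptions,
# not the study's actual materials.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

RATING_PROMPT = (
    "Read the following short personal story and rate how much empathy "
    "you feel toward the narrator on a scale of 1 (none) to 5 (a great deal). "
    "Reply with the number only.\n\nStory: {story}"
)

def rate_empathy(story: str) -> int:
    """Ask GPT-4o for a single 1-to-5 empathy rating of one story."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": RATING_PROMPT.format(story=story)}],
        temperature=0,  # keep ratings stable so they can be compared with humans'
    )
    return int(response.choices[0].message.content.strip())

# Hypothetical story, not from the study's dataset.
print(rate_empathy("I finally passed my driving test today after failing twice."))
```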
They also had the AI bot perform the same rating task after being assigned a "persona": being prompted with the story along with a set of traits, such as a gender, a perspective, or a similarity of experiences. Finally, they had the bots perform the rating task after being "fine-tuned," the process of re-training an already-trained model like ChatGPT on a specific dataset to help it perform a task.
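Both steps rest on standard API machinery. As a hedged sketch under assumed details, the persona can be prepended as a system message and the fine-tuning data supplied as chat-format JSONL; the persona text, file name, training examples, and model snapshot below are all hypothetical choices, not the authors'.

```python
# Sketch of persona conditioning and fine-tuning with OpenAI's Python SDK.
# All persona text, stories, ratings, and file names are hypothetical.
import json

from openai import OpenAI

client = OpenAI()

# 1) Persona conditioning: assumed traits go in a system message.
persona = "You are a 30-year-old woman who has been through similar experiences."
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": persona},
        {"role": "user", "content": "Rate your empathy toward this story on a scale of 1 to 5: I lost my job last week."},
    ],
)
print(response.choices[0].message.content)

# 2) Fine-tuning: story/rating pairs written as chat-format JSONL, uploaded,
# and used to re-train the base model. A real job needs many more examples.
examples = [
    {"messages": [
        {"role": "user", "content": "Rate your empathy toward this story on a scale of 1 to 5: I finally reconciled with my sister."},
        {"role": "assistant", "content": "4"},
    ]},
]
with open("empathy_train.jsonl", "w") as f:
    for ex in examples:
        f.write(json.dumps(ex) + "\n")

training_file = client.files.create(
    file=open("empathy_train.jsonl", "rb"), purpose="fine-tune"
)
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-4o-2024-08-06",  # a GPT-4o snapshot that supports fine-tuning
)
print(job.id)
```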
Biases and over-empathizing
Overall, the researchers found that GPT-4o lacks depth in offering solutions, suggestions, or reasoning, the capacity known as cognitive empathy.
However, when it comes to offering an emotional response, GPT-4o is overly empathetic, particularly in response to sad stories.
"It's very emotional in terms of negative feelings; it tries to be very nice," Roshanaei said. "But when a person talks about very positive events happening to them, it doesn't seem to care."
The researchers noticed that this over-empathizing was present when the model was told that the person it was chatting with was female, and that its responses were more similar to a typical human's when it was told the person was male. The researchers think this is because AI mimics and exaggerates the gender biases that exist in the human-made materials from which it learns.
"Again, it's one of these interesting results that require further exploration across commercial AI models," Seif El-Nasr said. "If such bias is consistent, it would be important for companies to know, especially companies that are using such models for emotional support, mental health and emotion regulation."
"There are a lot of papers that show gender biases and race biases in GPT," Roshanaei said. "It's happening because the data comes from humans, and humans have biases toward other humans."
However, the researchers found that GPT-4o became more human-like in evaluating empathy after the fine-tuning process. The researchers believe this is because feeding GPT-4o a range of stories enabled the AI to do something innately human: to compare personal experiences to those of others, drawing on its own layers of experience to mimic how humans behave toward one another.
"The biggest lesson I got from this experience is that GPT needs to be fine-tuned to learn how to be more human," Roshanaei said. "Even with all this big data, it's not human."
Improving AI
These results could affect how AI is further integrated into areas of life such as mental health care. The researchers firmly believe that AI should never replace humans in health care, but that it may be able to serve as a mediator in instances where a person is not available to respond right away because of factors like time of day and physical location.
However, this serves as a warning that the technology is not quite ready to be used with sensitive populations such as children, or those with clinically diagnosed mental health conditions.
For those working in AI, this study shows that there is still much work ahead in improving chatbots.
"This is an evaluation that shows that even though AI is amazing, it still has a lot of big gaps in comparison to humans," Roshanaei said. "It has a lot of room for improvement, so we need to work toward that."
More information: Mahnaz Roshanaei et al, Talk, Listen, Connect: Navigating Empathy in Human-AI Interactions, arXiv (2024). DOI: 10.48550/arxiv.2409.15550
Journal information: arXiv
Provided by University of California – Santa Cruz
