Artificial intelligence (AI) is rapidly gaining abilities once deemed fundamentally human. Not long ago, University of Oxford researchers argued that empathy—understanding people’s reactions and why they have them—is a bottleneck that prevents machines from taking over certain human jobs. But as AI develops qualities like humor, creativity, and negotiation skills, it’s only natural to ask: could AI soon replace humans in roles that require more advanced social and emotional abilities, like empathy in leadership?
Machines can now even outperform humans on measures of empathy. In a 2025 study conducted by scientists at the University of Bern, the Czech Academy of Sciences, and the University of Geneva, large language models like ChatGPT and Gemini were tested on five standard emotional intelligence tests. These tests include identifying which emotion a person most likely feels in a given situation and how individuals can respond most adaptively in emotional situations. The AIs scored 81% accuracy versus the human average of 56%, showing that AI can give answers that reflect a good understanding of human emotions and how they are managed.
Still, AI empathy is not the same as human empathy. And for leadership, it’s unlikely to be a full substitute—at least, so far. Here are two reasons why.
Reason #1: People value human empathy over AI empathy
People feel more cared for when empathy comes from a human. In a 2025 series of studies of over 6,000 adults from the U.K. and the U.S., researchers from the Hebrew University of Jerusalem, Harvard University, and the University of Texas had participants write about an emotional experience. Then, participants read an AI-generated written response but were told that either a human or an AI had written it. Participants rated the same responses as more empathetic when they believed a human had written them. Even if participants thought a person had merely used AI to help craft a response, the response was seen as less empathetic and supportive.
Why? Because people believe that humans are uniquely able to feel and care. AI doesn’t feel emotions—it simulates what empathy looks like. Even if AI becomes more capable at mimicking empathy, it means more to us when we feel understood by a human than a machine.
The main message: empathy matters more to us when a human expresses it. The irony is that a University of Toronto study of 556 adults from the U.S. and the U.K. found that people rated human responses to an emotional story as less empathetic than responses generated by AI—even those written by trained crisis responders. This gap was smaller when participants knew the source was AI, but a difference favoring AI remained. Humans could benefit from investing more in empathy skills.
Reason #2: AI leaders are seen as less caring than human leaders
Trust in leaders and their decisions is tied to seeing leaders as genuinely caring—an area where human managers have an advantage over AI managers.
In a 2024 field study of 400 Chinese delivery riders and several experiments involving 2,350 U.S. adults, researchers from the Hong Kong University of Science and Technology found that AI managers were seen as less kind, warm, and caring than human managers. As a result, workers trusted AI managers less than human managers, especially in emotionally charged situations where workers wanted management to empathize with the emotions they were feeling, like asking for time off for their mother’s funeral versus for a vacation.
Dr. Katja Schlegel, a researcher at the University of Bern who authored the study on AI emotional intelligence, told me in an interview: “Emotion-focused AI can help leaders (and employees) to make sense of an emotional issue, suggest responses, and propose next steps. Examples include tools that brief managers on employee morale and well-being to boost productivity, or coach employees through tough customer situations. In these settings, it’s fine that AI doesn’t ‘feel’ empathy like humans do (and likely never will). For many uses, like providing advice or coming up with an action plan, cognitive empathy (recognizing and reasoning about emotions) is enough to be useful. This still leaves plenty of room for leaders’ human empathy in everyday face-to-face interactions, where time, care, and personal presence are the driving forces to build trust, commitment, cohesion, and identity within their teams.”
But perceptions can shift. A 2023 study of 310 English-speaking adults found that participants viewed a mental health chatbot as more empathetic and trustworthy when they believed the AI was trained to have caring motives with the best intentions to improve mental health (versus having no motives and only following text completion, or having manipulative motives with the intention of encouraging a purchase of services). These effects were stronger with more sophisticated AI. As AI becomes more advanced and societal beliefs about AI evolve, perceptions of AI caring may grow more positive—posing a challenge to human leaders.
Will AI empathy overtake leader empathy?
For now, empathy is still tied closely to being human. But this could change. Leaders risk losing ground if they don’t practice empathy, as people may start to turn to AI empathy simply because it is more available. A 2023 Workplace Intelligence survey of 800 U.S. employees illustrates this trend: 47% of younger employees said they get better career advice from AI like ChatGPT than from their managers. These statistics offer a warning. If leaders don’t make time for empathy, and as society grows more accustomed to interacting with AI while AI’s abilities continue to advance, it may not be long before machine empathy gains an edge over the human kind.
As Dr. Fabiola Gerpott, a professor of leadership at the WHU – Otto Beisheim School of Management, said to me in an interview: “We tend to underestimate how quickly AI moves from being a tool to becoming a partner and eventually a substitute in leadership. What seems irreplaceably human today may soon become algorithmically reproducible.” If leaders neglect empathy, AI can fill the void.