Remember when we thought emotional intelligence was uniquely human? That empathy was something AI could never grasp? Think again. New research from the University of Bern reveals that six leading AI models, including ChatGPT-4, outperformed humans on standardized emotional intelligence tests. On average, they achieved 81% accuracy compared to humans’ 56%.
Here’s an even more startling data point: ChatGPT-4 successfully generated entirely new emotional intelligence tests that performed as well as versions that took researchers years to develop.
I, for one, wasn’t surprised to see this result. A year ago, I suggested that AI might be smarter than your CMO, only partly tongue-in-cheek.
The Royal Caribbean Affair
As I detailed in my earlier Forbes article, Royal Caribbean Cruise Line committed a spectacular empathy failure when it rerouted Silversea’s luxury cruise ship Silver Nova mid-cruise for a marketing photoshoot. The move forced 700 of the line’s highest-value customers to scramble for new flights during Spring Break, and the company’s tone-deaf communication referred to a four-hour delay as “slight” and encouraged passengers to “celebrate” the inconvenience.
When I fed this scenario to Anthropic’s Claude 3 (an earlier version than the one in the new research), the AI immediately flagged multiple empathy failures that Silversea’s executives missed:
- Impersonal tone: The AI noted the communication “fails to acknowledge the significant disruption and potential stress caused by a 4-hour delay, especially for guests on a luxury cruise expecting a seamless experience.”
- Tone-deaf framing: Claude called the request for guests to “toast together” during the photoshoot exactly what it was: “tone-deaf.”
- Missing apology: The AI pointed out that the letter contained no apology and failed to acknowledge any guest inconvenience.
Claude also correctly predicted how the guests would react: frustrated, inconvenienced, annoyed, and pressured. I tracked comments on cruise forums and Facebook groups, and if anything, the real reactions were harsher. “Shocking,” “Absurd,” “Lost their minds,” “Appalled,” “Stupidest,” and “Clinches my decision to go elsewhere” were just a few.
The AI then drafted a far more empathetic replacement letter that addressed every guest pain point the original communication ignored.
The New Emotional Intelligence Reality
The latest research confirms what my Silversea experiment suggested: AI systems “not only understand emotions, but also grasp what it means to behave with emotional intelligence.” The implications for customer-facing leaders are profound.
AI excels at emotional pattern recognition. Unlike humans, who are influenced by mood, fatigue, and personal biases, AI processes emotional scenarios consistently. The gap between the AI models’ 81% accuracy and humans’ 56% surprised even me.
AI spots empathy blind spots. The cruise line executives were too close to their marketing objectives to see the customer experience clearly. AI provides an objective emotional audit that cuts through internal rationalization.
AI generates better alternatives. In my experiment, Claude didn’t just critique Silversea’s approach. It articulated a comprehensive damage-control strategy with specific actions the company should have taken. And the letter it drafted to the inconvenienced guests was spot-on for empathy. (That’s my opinion, but I’m human. Maybe I should have had ChatGPT or Gemini rate it!)
Your AI Emotional Intelligence Audit
Every major business decision should now include an AI empathy check. Here’s how leading CMOs are implementing this:
Pre-launch communications review: Before sending customer communications, ask your AI: “How will customers react to this message? What emotions will it trigger? What’s missing?” (A minimal scripted version of this check appears after this list.)
Crisis communication drafting: When problems arise, use AI to draft multiple response approaches. The AI may identify emotional nuances that stressed executives miss.
Stakeholder impact analysis: Before major announcements like layoffs, price changes, policy shifts, get AI predictions on emotional reactions across different stakeholder groups.
Customer journey empathy mapping: Use AI to identify emotional friction points in your customer experience that your team has become blind to.
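If you want to make this a repeatable step rather than an ad-hoc chat session, the check can be scripted. The sketch below is a minimal illustration of the pre-launch review, not the setup used in the research or in my Silversea experiment. It assumes the Anthropic Python SDK, an API key in your environment, and a stand-in model name, and it simply sends a draft message plus audience context along with the audit questions; the draft text itself is an invented example.

```python
# Minimal pre-launch empathy check (illustrative sketch, not a prescribed workflow).
# Assumes the Anthropic Python SDK is installed and ANTHROPIC_API_KEY is set.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Hypothetical draft communication and audience context.
draft = """Dear guests, due to a slight schedule adjustment, arrival will be
delayed by four hours. We invite you to toast together on deck as we capture
some exciting new imagery of the ship."""

audience = "700 luxury cruise guests disembarking during Spring Break"

response = client.messages.create(
    model="claude-3-5-sonnet-latest",  # assumption: any capable model will do
    max_tokens=800,
    messages=[{
        "role": "user",
        "content": (
            "You are reviewing a customer communication before it is sent.\n"
            f"Audience: {audience}\n"
            f"Draft message:\n{draft}\n\n"
            "1. How are these customers likely to react emotionally?\n"
            "2. What empathy failures or missing elements do you see?\n"
            "3. Rewrite the message to address every pain point."
        ),
    }],
)

# Print the critique and the rewritten message for human review.
print(response.content[0].text)
```

The same pattern works with any major model provider; the value is in making the empathy check a routine gate before anything customer-facing goes out, not a one-off experiment.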
The Meta-Lesson About Human Limitations
The most unsettling aspect of the new research isn’t that AI beats humans at emotional intelligence tests; it’s how consistently it does so. The strong correlation between human and AI responses suggests both are leveraging similar emotional cues, but AI processes them more reliably.
This reveals something uncomfortable about human decision-making: we’re often not as emotionally intelligent as we think we are, especially when we’re under pressure, focused on objectives, or operating within a group striving for consensus.
The Royal Caribbean decision-makers weren’t bad people. They simply fell victim to the cognitive biases that affect all leaders: tunnel vision on business goals, groupthink in decision-making, and distance from customer reality. I have to believe at least one person in the room thought that disrupting a ship full of well-heeled guests to create a marketing asset of questionable value was a bad idea. An AI wingman might have helped that dissenter make a more persuasive case, or at least improve the communication that went out.
Three Implementation Rules
1. Use AI as a red team, not a replacement. Don’t outsource emotional decisions to AI. Use it to challenge your assumptions and spot blind spots.
2. Ask specific questions. Generic prompts get generic responses. Provide context about your customers, situation, and objectives for more targeted insights.
3. Test before you trust. Start with low-stakes communications before relying on AI for help with crisis management or sensitive announcements.
The Competitive Advantage
While 82% of employees believe workers will crave more human connection as AI advances, only 65% of managers recognize this need. This gap represents both a risk and an opportunity.
Companies that use AI to enhance their emotional intelligence rather than replace it will build stronger stakeholder relationships. Those that ignore AI’s emotional capabilities will continue making avoidable empathy failures.