The pressure to “do something with AI” is mounting. Boards are asking for automation. Investors want efficiency. Competitors are moving fast. But when CEOs rush to deploy AI and innovate without first aligning their people, culture and strategic goals, the result is often accelerated chaos.
Sadly, that’s the case for many business leaders today. In the scramble to adopt large language models, predictive systems and intelligent tools, most executives are skipping the real work of understanding what their organization is solving for, who their systems serve and whether the company is even culturally ready for AI. That’s not a problem with technology; it’s one with leadership.
“The biggest mistake I see is companies jumping into AI hoping for plug-and-play magic,” said Sagi Shahar, cofounder and CEO of Journey, in an interview. “They start automating before understanding what needs to be fixed and why. When you skip the deeper human work — understanding your team, your culture, your blind spots — AI just creates faster chaos.”
Shahar would know. His company embeds AI into executive workflows, but only after deeply mapping the strategic, cultural and emotional terrain of the organization. “We once worked with a CEO who rolled out an AI task manager to improve execution,” he told me. “But the leadership team wasn’t aligned on what success even looked like. The AI ended up optimizing tasks no one really needed.”
Many business leaders are discovering this firsthand: AI is not a magic wand that automatically solves everything.
Rather, they are learning that AI amplifies whatever already exists within a company’s business architecture, whether order or dysfunction, equity or bias, strategic clarity or chaos. They are also learning that if they don’t set the direction for how they want to use AI, AI will choose one for them.
AI Without Context Is Just Expensive Guesswork
It’s easy to get seduced by dashboards, metrics and automation flows, but raw data isn’t insight; insight comes from what you do with it. And AI trained on surface-level inputs can’t account for the things that actually make organizations work: trust, emotional labor, context and culture.
“Data can tell you what happened,” noted Shahar. “But lived experience tells you why it mattered.”
That distinction is critical. Many CEOs chase clean KPIs — conversion rates, churn, task completion — without asking deeper questions. “We’ve worked with companies where retention issues weren’t about compensation,” Shahar continued. “They were about burnout, unclear priorities, or voices being ignored. No dashboard captured that.”
This is where AI becomes dangerous. When used without human input, it risks reinforcing and consolidating the very inequities that it’s supposed to fix.
“We’ve seen AI tools that summarize meetings beautifully — but consistently overlook who’s being interrupted or left unheard,” says Shahar. “Equity-centered AI starts by asking: whose voice is missing?”
Why Ethics And Inclusion Are Business Priorities
There’s still a mindset among some executives that ethics and inclusion are add-ons, secondary to speed, scale and productivity. That thinking is outdated and, quite frankly, risky.
“If ethics and inclusion aren’t part of your core business priorities, you’re probably building something fragile,” Shahar warned. “AI systems shape who gets heard, how decisions are made and what behaviors get rewarded. That is your culture.”
The human cost of poor AI deployment is real: burnout, distrust, turnover and missed insights, to name only a few. The financial cost isn’t far behind.
“One of the most dangerous things about AI is that it’s faster than human intuition,” noted Shahar. “That speed can feel like efficiency, but without reflection, it leads to blind spots.”
Solomon Eko, CEO and founder of Retink, agrees.
“AI should be your co-pilot, not your autopilot,” said Eko. “When leaders chase automation without understanding what their teams actually need, they often scale inefficiency instead of solving it.”
He added that “before companies deploy any AI system, they must ask: Who does it benefit? Who might be unintentionally harmed? It’s about building trust at every layer.”
How To Know If You’re Ready
So how can CEOs know if they’re truly ready to integrate AI responsibly? According to Shahar, the question isn’t whether your AI is compliant; it’s whether it’s fair, inclusive and sustainable.
“Compliance is a floor, not a ceiling,” he said. “The real test is whether your AI reinforces the kind of company you want to build.”
His company, Journey, uses a three-part lens to help leaders self-audit:
- Fairness: Who benefits from AI-driven decisions — and who doesn’t?
- Inclusion: Are the inputs reflective of your real-world stakeholders?
- Sustainability: Is AI helping your team build capacity, or just working them faster?
“You need both metrics and meaning,” he explained. “Dashboards matter. But so do conversations — with your team, your skeptics, even your customers. If AI is creating more distance between your strategy and your people, it’s not working.”
A Note On Intersectionality
Business leaders often treat diversity as a checkbox, an ideal already being erased from Silicon Valley in the wake of President Donald Trump’s executive order to scrap DEI programs. But as they integrate AI into their decision-making, they must ask harder questions.
Whose reality is reflected in our data? Who gets listened to? Who gets overlooked? These are questions business leaders must answer as they adopt AI. As one study on “Intersectionality in Artificial Intelligence” emphasizes, AI systems can perpetuate biases against marginalized groups if not carefully designed. True inclusion means thinking beyond quotas and centering the lived experiences of those most often left out, especially women of color, LGBTQ+ individuals and frontline workers.
“Your AI is only as inclusive as your leadership mindset,” said Eko. “If you aren’t building systems that hear every voice, you’re building systems that silence some.”
Final Word: AI Starts With Leadership
Most companies already understand AI’s potential to transform operations and make employees more productive. More importantly, they must realize that its success depends on leadership.
“AI doesn’t replace strategy,” said Shahar. “It scales it. So if your strategy is unclear, your AI will only make things worse. But if your leadership is aligned, intentional and inclusive — AI becomes a multiplier of everything that’s already working.”