As entry-level jobs vanish, universities scramble to close the GenAI skills gap
When Arjun Jagjivan graduated from Princeton in 2024 with a degree in Operations Research & Financial Engineering, he thought his background in machine learning and ethics would give him a clear edge in the job market. Instead, he discovered that employers had already moved the goalposts—favoring graduates who could not just understand AI, but apply it immediately to solve business problems.
“As a Value Engineer at Celonis, I’ve been especially leaning into leveraging AI solutions for customer deployments, but also relying on internal AI tools for knowledge management, quickly stress-testing new ideas, and more,” Jagjivan says.
His relatively seamless transition to the workplace is unusual. A sweeping report from Stanford’s Digital Economy Lab, “Canaries in the Coal Mine?,” shows that early-career workers—especially those aged 22 to 25 in AI-exposed fields such as software development, customer service, and marketing—have seen employment drop about 13% since late 2022. By contrast, older professionals in those same jobs have largely held steady.
The reason, the report concludes, is that generative AI is eroding the value of the codified “book knowledge” that fresh graduates bring to the market, while the tacit judgment and experience of seasoned workers remains far harder to automate.
For Willie Sine, who graduated with a data science degree from Chapman University in 2024, those findings hit home. Despite sending out applications daily, he couldn’t land interviews for jobs labeled “entry-level”—many of which now demanded five to eight years of experience. After months of frustration, Sine pivoted into sales at a tech company, where his technical training gives him an edge. But the detour underscores how automation has hollowed out the first rungs of the career ladder.
“I think the biggest frustration was not seeing results, even though I wasn’t applying to anything unreasonable,” he recalls. “It really felt like those entry-level jobs just weren’t there anymore.”
Students Drive the Urgency
This shift raises unsettling questions for college students and recent grads: how do you stand out in a job market where AI already performs many of the tasks you’re training to do? Universities, in turn, are scrambling to respond—racing to equip undergraduates with tools and skills that will keep them competitive.
At San Diego State University, senior Lindsey Beauchamp is watching this urgency play out on her own campus. Rather than hesitating over AI adoption, SDSU has partnered with OpenAI to provide every student free access to ChatGPT Plus.
For Beauchamp, that decision feels less like a perk than a necessity. She uses AI daily, not just to sharpen her coursework but to prepare for internships and job applications. “It’s clear that employers expect us to be fluent in AI,” she says, “and I don’t want to be left behind.”
The benefits became clear during her summer internship at the San Francisco District Attorney’s Office, where AI tools were already displacing the routine work of interns.
“For legal research, the efficiency gains were dramatic,” she recalls. “Instead of searching through massive legal texts, I could ask AI which penal code covers the felony murder rule and immediately get California Penal Code § 189. But I also learned you need to understand the context and verify everything.”
Beauchamp’s takeaway: “College is starting to help, but there’s still a lot of room for improvement.”
Business Schools Bet Big
Some elite institutions are pivoting aggressively. This fall, the Wharton School at the University of Pennsylvania will launch both an undergraduate concentration and an MBA major in AI for Business—the first top-tier business school to make AI a standalone field.
“It’s no longer a question of if, but how artificial intelligence will fundamentally alter every aspect of business and society,” says Erika James, Dean of the Wharton School.
The program goes far beyond ChatGPT, with courses in applied machine learning, data engineering, neuroscience, and a required ethics course, Big Data, Big Responsibilities: Toward Accountable Artificial Intelligence.
“We are at a critical turning point where practical AI knowledge is urgently needed,” adds Eric Bradlow, Wharton’s Vice Dean of AI & Analytics. “Companies are struggling to recruit talent with the necessary skills, and our students are eager to meet that demand.”
At Vanderbilt, adjunct professor Leonora Williamson has gone even further: “I tore up everything I have ever done and re-did it to make AI the center of everything. On the first day of class, I told students that one of my jobs is to prepare them for human-AI teams. If they’re not using AI for every assignment, I want them in office hours so I can show them how.”
California’s System-Wide Gamble
Nowhere is the scale of change clearer than in the California State University (CSU) system, which serves more than 460,000 students. Rather than leaving AI adoption to individual campuses, CSU launched a system-wide initiative to put advanced tools—including ChatGPT, Google Gemini, and Microsoft Copilot—directly in the hands of students, faculty, and staff. Access comes through a secure AI Commons portal designed to protect intellectual property and reinforce responsible use.
“We have made ChatGPT Edu available to everyone in a protected environment,” says Lesley Kennedy, CSU’s Assistant Vice Chancellor for Academic Technology. “For students, the focus is not only on introducing them to AI, but also on emphasizing ethical and responsible use—skills we know will impact their future careers and make them more attractive to employers.”
CSU has also pulled in some of the biggest players in the industry—Intel, AWS, Google, Microsoft, and IBM—through its AI Workforce Acceleration Board, which helps shape curricula around actual market needs. Kennedy says employers are asking for more than just technical fluency.
“What we’re hearing is that employers value not only the ability to use AI responsibly but also the liberal-arts foundation—critical thinking, collaboration, problem-solving, analytical skills,” she explains. “We call them durable skills. Beyond the technology, those human skills are what employers say are most important.”
While Anthropic CEO Dario Amodei warns of a looming, AI-fueled “white-collar bloodbath,” Kennedy strikes a more measured tone. For her, the priority is preparing graduates to adapt as roles change.
“The key is making sure our students are adaptable,” Kennedy explains. “With critical thinking, reasoning, empathy, and self-awareness, paired with AI literacy, they’ll be prepared to grow, train, and be effective workers in whatever entry-level jobs emerge.”
Beyond Tool Training: Strategic AI Literacy
While many universities focus on teaching students how to use AI tools, Lehigh University is taking a different approach. Juan Zheng, Assistant Professor in Lehigh’s Teaching, Learning, and Technology program, argues that proficiency with chatbots isn’t enough. Instead, her National Science Foundation–funded Meta-Partner project emphasizes what she calls “strategic AI literacy”—the ability to know when to trust AI outputs, when to question them, and how to combine machine insights with human judgment.
Zheng’s research involves more than 300 students from rural high schools and universities, many of whom come from non-technical backgrounds. The goal is to ensure AI education doesn’t just benefit computer science majors, but equips a broader range of students with the judgment skills employers increasingly demand.
“The workforce of tomorrow will not just use AI—they will shape it, question it, and innovate with it,” Zheng explains in an Op-Ed for The Morning Call. “Strategic AI literacy also helps bridge equity gaps by ensuring the benefits of AI education are accessible to all, not just a privileged few.”
Her work reflects a growing shift in higher education: colleges can no longer stop at training students to operate the latest tools. In a job market where entry-level opportunities are shrinking, employers are looking for graduates who can think critically, apply AI responsibly, and adapt as technologies evolve.
Learning While the Ground Shifts
With 46% of U.S. workers already using AI on the job, the question is no longer whether entry-level jobs will disappear, but whether higher education can evolve quickly enough to keep pace.
At Princeton, Jagjivan says his final years of coursework in machine learning, optimization, tech policy, and ethics felt increasingly urgent as AI tools entered the mainstream. Yet even with that preparation, he admits, “I’m still looking for guidance on how to approach AI.”
At San Diego State, Beauchamp finds herself leaning on AI for research but recognizes its limits: “I feel like college is starting to prepare me for an AI-driven workforce, but it’s still largely self-taught.”
The tension is clear: colleges are racing to integrate AI, but students are entering a workforce that is evolving even faster. For today’s graduates, survival in the job market will depend less on technical credentials alone and more on pairing AI literacy with judgment, adaptability, and the human skills that machines cannot yet replicate.