These days, I’m often invited onto news programs to comment about the U.S. government’s involvement — or potential involvement — with tech issues, whether that means taking a stake in Intel, providing incentives for onshoring manufacturing or creating a more secure supply chain for rare earth elements. The recent Google Public Sector Summit I attended in Washington, D.C. highlighted the other side of the coin: not the actions the government might take with tech companies, but how one big tech company is taking action to support government entities.
I thought the Google team did a great job, and I walked away from this event impressed. Gemini for Government is one of the best examples of AI agents I’ve ever seen related to the public sector, and I liked it even more when I was able to create an agent of my own in about seven minutes. Google Cloud is also leaning into on-prem solutions that meet the needs of government agencies and government contractors (think Lockheed Martin and the like) that want to minimize attack surfaces for their data. In these and many other ways, Google is showing that it understands the particular needs of the public sector.
(Note: Google is an advisory client of my firm, Moor Insights & Strategy.)
How The Public Sector Is Leading In AI Adoption
Karen Dahut, CEO of Google Public Sector, launched the event and immediately acknowledged the difficulties created by the current government shutdown. She pointed out that Google has been partnering with government and educational entities for 20-plus years, but also emphasized that “There is no more status quo” within the sector — which she takes as an opportunity to cultivate a healthy disregard for the “impossible.” She gave credit to Google cofounder Larry Page for that line, and I think she’s definitely on the right track to emphasize the newness and possibilities that go along with the upheaval we’re seeing today.
AI adoption among Google’s public-sector customers also seems to be on the right track. Dahut cited a couple of key stats that stuck with me. According to Google’s internal survey numbers, 46% of its public-sector respondents say that productivity has at least doubled where they’ve introduced AI, and 42% say they have deployed at least 10 AI agents. Dahut used figures like these to support her contention that, far from trailing the private sector, the public sector is actually leading when it comes to AI innovation.
Optimizing The Tech Stack For AI
Google Cloud CEO Thomas Kurian took the stage to talk about AI infrastructure, and as you would expect he had plenty of details about high-performance, cost-effective Google offerings paired with Nvidia chips, Google’s own TPU chips and high-end storage options. Like Dahut, he had plenty of stats for things like how many tokens Google’s AI infrastructure processes per month (more than 1 quadrillion), or the year-over-year growth in Google BigQuery data processed by Gemini AI models (26x). In the government-specific context, Google functionality enables thousands of researchers from the NIH and other agencies to access more than 115 petabytes of research data, and it helps developers from the U.S. Postal Service, the National Cancer Institute and other bodies to build AI models using Google Vertex.
Amid all the big, flashy numbers, I try to concentrate on the more functional elements, and I think Google is delivering on these as well. Kurian talked about how Google is building two different kinds of AI stack — one that’s optimized for large-scale models, and another that’s optimized for efficient, low-latency inference. To my mind, this is highly practical and much more likely to meet the needs of organizations where they are. The company is also focused on practicality with its Dynamic Workload Scheduler, which can cut some AI costs by as much as half. Plus there was plenty of talk about breaking down silos, which might sound repetitive until you’ve lived inside the reality of being stymied at every turn by siloed data and functionality. So I’d rather hear Google talk about it over and over — and deliver what’s being promised — than for it to overlook the central importance of connected functionality.
Gemini For Government — And The Importance Of Security
I have appreciated the quality Google has achieved with its Gemini AI models in many contexts, not only the public sector. But Google has built Gemini for Government specifically for the public sector. That means delivering an intuitive and unified offering in a secure and open ecosystem. A product like this has to deliver the best protection for sensitive data at a global scale — and we’re talking about clients such as the U.S. Air Force that have the highest possible confidentiality requirements. This means that Google has done the work to make AI function even in air-gapped settings, and to give administrators the ability to control who has access to what and which models are plugged into which data.
Gemini for Government also has to deliver results across many different operational contexts, whether it’s figuring out a needed repair for a bridge (the topic of an onstage demo), summarizing the contents of a meeting or searching data within an organization. As you’d hope, you can build your own agents or draw from pre-built agents made by Google or third parties. And not only does the system include no-code agents, but even no-prompting agents; you just tell it your goal, and the system takes it from there. Kurian also emphasized that the Gemini models for customers are the same ones that Google itself uses for in-house functions across search, YouTube, Google Maps and so on.
At one point, Kurian had an onstage conversation with Ian Buck, who’s the head of Nvidia’s hyperscale and high-performance computing business. The two men talked about the history of collaboration between the two companies. During that conversation, Kurian said that Gemini for Government arose when Nvidia CEO Jensen Huang asked Google if it could make frontier models like Gemini available for high-security customers like government agencies. That request came less than two years ago, which is yet another reminder of how fast things keep moving in AI.
Pushing The AI Envelope For Government Agencies
I tend to be skeptical about just how efficient any government entity is likely to be. But it definitely reframes things to think of the U.S. federal government as a hyperscaler — a comparison that Buck made. And Google’s presenters made it clear throughout the event that agencies should not wait for the “perfect” moment to implement AI, because there are ripe opportunities now. In practical terms, many implementations can be achieved in months or even weeks, not a decade as is sometimes the case with public-sector IT.
This sense of urgency was reinforced in a video appearance by General Dan Caine, who serves as the chairman of the Joint Chiefs of Staff. He talked about the growing complexity of the “global risk algorithm” of the United States from a defense perspective, as well as “what is possible when we get it right” with AI across areas such as clinical research, mass transit and health care. “The public sector is already leading on AI,” he said. “Now we need to do it at scale.”
I appreciated General Caine’s call to action — not least because it was the type of inspirational message that Google often doesn’t emphasize. But it fit in very well with the themes of the event, starting with Dahut’s emphasis on disregarding what’s considered impossible. While I also want to see more tangible, real-world examples of how Google is helping government agencies overcome the inertia that’s all too common in the public sector, I came away from this event heartened about what’s possible.
