- Google is accelerating its cloud computing push, using artificial intelligence to try to catch up to Amazon and Microsoft.
- The company’s new strategy hinges on its own custom chips, innovations from DeepMind, and a full-stack AI approach rolled out with its latest products.
- All this is happening as competition in the cloud market heats up and the entire AI industry faces increasing financial strains.
Google’s betting that years of building its internal ecosystem will finally pay off, changing the game in cloud computing. The company arrived in the cloud market later than Amazon and Microsoft, but it’s now positioning AI as the core of its offering, not just an add-on.
At the core of this effort is a strategy focused on owning the whole stack: its chips, models, and infrastructure. Thomas Kurian, Google’s cloud chief, says this vertical integration lets Google operate more efficiently without needing third-party providers. Google claims this leads to bigger margins and more room for reinvestment, which is especially important as AI computing costs increase.
How Google’s AI-First Cloud Strategy Is Catching Up to Rivals
According to sources, one of the major pillars of Google’s approach is its longtime investment in DeepMind, which has been crucial for advancing the company’s AI abilities. Alongside its proprietary Tensor Processing Units (TPUs) and Gemini models, Google is trying to stand apart from competitors who rely more on outside chipmakers like NVIDIA.
The recent launch of Google’s eighth-generation TPUs is a big step forward. One chip is built specifically for training AI models, and another is designed for inference, offering more memory for faster performance. This approach responds to the growing need to build and run AI systems efficiently and at scale.
Google says its integrated system gives it both a performance boost and a cost edge over competitors like Amazon’s Trainium chips and Microsoft’s Maia processors. Reducing dependence on NVIDIA’s GPUs, which are widely seen as the industry standard, could be a game-changer, especially as AI computing demand keeps rising.
Google Cloud’s revenue jumped 48% in the last quarter of 2025, and it’s projected to bring in over $70 billion this year, up from $43 billion in 2024. The company’s market share has climbed from 7% to 14% over eight years, pointing to steady progress, but it still hasn’t caught up to its larger rivals.
What Is Driving This Intensifying Competition
Despite the gains, Google is still a distant third behind Amazon Web Services and Microsoft Azure. It has also drawn criticism for letting Anthropic and OpenAI pull ahead in AI products such as coding tools and chatbots.
Shifting alliances and growing tensions make things even more complex. Google’s rise as a potential NVIDIA rival has strained their relationship, though Alphabet remains one of NVIDIA’s biggest customers. NVIDIA’s CEO Jensen Huang has publicly questioned the performance of Google’s chips, and the stakes are high as both sides race to push technological boundaries.
Partnerships still matter. Anthropic recently agreed to an expanded deal to buy more of Google’s chips, one that includes up to $40 billion in investment and 5 GW of computing capacity over five years, valued at more than $200 billion. Deals like this show just how much investment it takes to stay competitive in AI, and the risks involved.
OpenAI and Anthropic are reportedly losing tens of billions each year as they struggle for computing power. Even though they’ve raised more than $150 billion this year, mostly to prepare for potential IPOs, Kurian warns that this pace of spending can’t last forever and private capital markets could soon be tapped out.
According to Epoch AI, Google now controls about a quarter of the world’s AI computing power, running roughly 3.8 million TPUs and 1.3 million GPUs. Microsoft operates about 3.2 million NVIDIA GPUs. These numbers show how much infrastructure is needed and how tough it is for smaller firms to break into the space.
Conclusion
Google’s latest push into cloud computing signals a bigger shift in the industry: owning AI infrastructure is now just as crucial as the services built on top of it. With heavy investments in its own chips, models, and data centers, Google aims to secure a sustainable, competitive spot in the cloud market.
While recent growth suggests the plan is working, there are still serious hurdles. Amazon and Microsoft aren’t backing off, and tensions with NVIDIA only complicate matters. Financial strains among AI startups could force the industry to consolidate soon. The next year or two won’t just be about innovation. Economics will decide who can actually afford to build the future of AI.