Alphabet and Nvidia Invest in AI Startup SSI as AI Chip Competition Grows

Alphabet and Nvidia have joined major venture capital firms in backing Safe Superintelligence (SSI), the artificial intelligence startup co-founded by former OpenAI chief scientist Ilya Sutskever, as reported by Reuters. Just months after its inception, SSI has emerged as one of the most highly valued AI startups in the market.
The backing underscores the growing appetite among tech powerhouses and cloud service providers to invest in next-generation AI ventures, which depend heavily on high-performance computing infrastructure. Earlier this week, Alphabet’s cloud unit announced it would supply SSI with access to its proprietary Tensor Processing Units (TPUs), specialized AI chips developed in-house.
SSI’s Skyrocketing Valuation and Demand for Chips
SSI was reportedly valued at $32 billion in a funding round led by Greenoaks, according to sources, reflecting its strong position in the AI research space. Co-founder Sutskever’s reputation for pioneering AI breakthroughs has been a major draw for investors.
Like many advanced AI labs, SSI requires vast computing power to build its models. Although Reuters could not confirm the precise terms of Alphabet’s and Nvidia’s investments, it is clear the startup is attracting high-profile support.
Reuters reports that Alphabet’s dual involvement through both corporate investment and its cloud division highlights a shift in its AI hardware approach. While Google previously reserved TPUs largely for internal projects, it now makes these chips available to external innovators like SSI.
Darren Mowry, Google’s managing director for startup partnerships, told Reuters that this move represents a broader strategy to attract leading AI developers. He said, “With these foundational model builders, the gravity is increasing dramatically over to us.”
While Nvidia dominates the AI chip market with its graphics processing units (GPUs), which command over 80% market share, SSI is primarily using Google’s TPUs for its development needs, according to insiders.
Competition Heats Up Among AI Chip Providers
According to Reuters, Google Cloud offers both Nvidia GPUs and its own TPUs, which are optimized for AI workloads and widely used for training large-scale models. These chips play a crucial role in AI development for companies like Apple and Anthropic.
Anthropic, which is backed by both Google and Amazon, uses Google’s TPUs for its AI models. Meanwhile, Amazon has developed its own AI processors, Trainium and Inferentia, and is gaining traction in the AI chip market.
Since 2023, Anthropic has used Amazon’s custom chips and supercomputing resources, yet it continues to spend heavily on Google’s cloud services.