Nvidia Launches Lepton: A Marketplace to Simplify AI GPU Access for Developers

Nvidia has unveiled Lepton, a cloud-based software platform designed to streamline how developers access AI-ready GPUs. The tool enables cloud providers to list their Nvidia chips in a single marketplace, making it easier for developers to find and rent GPU capacity for training and deploying AI models.
The launch addresses a growing challenge in the AI industry—scattered access to computing power. Despite massive demand, developers still rely on a patchwork of manual outreach and one-off agreements to secure the processing power they need.
Nvidia’s vice president of cloud, Alexis Bjorlin, described the current system as chaotic. “It’s almost like everyone’s calling everyone for what compute capacity is available,” she said in an interview with Reuters. Lepton is built to simplify this process and support rapid ecosystem growth.
New Cloud Providers Join Nvidia’s Ecosystem
The Lepton platform already features several fast-growing “neocloud” providers, including CoreWeave, Nebius Group, Crusoe, Foxconn, GMI Cloud, and Yotta Data Services. These companies specialize in renting Nvidia GPUs, helping to meet the spike in demand driven by AI development.
Notably, major players like Amazon Web Services, Microsoft Azure, and Google Cloud are not part of Lepton at launch. However, Nvidia says the platform is open to all and built to accommodate their participation if they choose.
Built With Developers in Mind
Lepton offers regional filtering, allowing users to find GPUs in specific countries to comply with local data residency laws. Developers can continue working directly with their chosen cloud providers; Lepton does not replace those relationships, it simply makes available capacity easier to discover.
Nvidia has not yet disclosed whether Lepton will charge fees or commissions. But with over 5 million developers already working within its ecosystem, Nvidia aims to increase access to GPU resources at scale.
According to IDC’s Mario Morales, “It’s a good move. Nvidia is creating a bridge between supply and demand.”