Panmnesia Raises $30 Million to Develop Next-Generation AI Infrastructure

By Neelima N M
2025-05-09
Panmnesia Raises $30 Million to Develop Next-Generation AI Infrastructure. Image Credit: LinkedIn

Panmnesia, a leader in AI infrastructure development, has announced that it is set to receive $30 million in funding for a project aimed at reinventing the architecture of AI data centers.

The program will focus on designing chiplet-based modular accelerators to improve the performance of demanding AI services such as large language models (LLMs), vector search, and recommendation systems.

Meeting the Challenges of Contemporary AI Workloads

As AI becomes an integral part of everyday life, demand for more accurate models and larger data sets is growing rapidly. This explosion in AI model complexity is putting unprecedented stress on the underlying infrastructure, increasing both cost and energy consumption.

Traditional AI infrastructure, which relies mostly on GPUs with fixed, dedicated computing and memory resources, tends to suffer from poor resource utilization. Panmnesia aims to address these problems by building more flexible, scalable, and efficient systems that can meet the expanding demands of AI workloads.

Project Overview and Technological Innovation

With the $30 million in funding, Panmnesia plans to develop a next-generation chiplet-based AI accelerator and an integrated infrastructure system to support large-scale AI workloads efficiently.

The new accelerator will feature in-memory processing technology to minimize data movement, significantly reducing power consumption. This system will be part of Panmnesia’s proprietary CXL full-system solution, leveraging advanced memory and interconnect technologies.

Key Technological Features

Panmnesia’s AI accelerator uses a modular chiplet-based architecture for scalable, flexible design and faster development. It combines manycore and vector processor chiplets to optimize performance per watt, and integrates in-memory processing to cut energy use by reducing data movement during AI workloads.
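The energy argument for in-memory processing can be pictured with a simple first-order model: total energy is roughly compute energy plus the energy spent moving bytes between memory and the processor. The sketch below is purely illustrative; the constants and the 90% reduction figure are assumptions for the example, not Panmnesia figures. The point is only that in memory-bound AI workloads, data movement dominates, so keeping operands inside the memory device is where the savings come from.

```python
# Illustrative first-order energy model for an accelerator workload.
# All constants are hypothetical, chosen only to show the shape of the
# trade-off the article describes.

def workload_energy_joules(bytes_moved: float,
                           flops: float,
                           pj_per_byte: float = 20.0,  # assumed off-chip transfer cost
                           pj_per_flop: float = 1.0) -> float:
    """Energy = data-movement energy + compute energy (picojoules -> joules)."""
    return (bytes_moved * pj_per_byte + flops * pj_per_flop) * 1e-12

# A memory-bound step, e.g. streaming embeddings for a recommender:
flops = 1e12           # 1 TFLOP of arithmetic
bytes_baseline = 1e12  # 1 TB crossing the memory bus in the conventional design

baseline = workload_energy_joules(bytes_baseline, flops)
# In-memory processing: assume 90% of operands never leave the memory device.
near_memory = workload_energy_joules(bytes_baseline * 0.1, flops)

print(f"baseline:    {baseline:.2f} J")
print(f"in-memory:   {near_memory:.2f} J")
print(f"savings:     {1 - near_memory / baseline:.0%}")
```

Under these toy numbers, cutting data movement by 90% cuts total energy by well over half, even though the arithmetic itself is unchanged.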

Panmnesia’s infrastructure will incorporate CXL (Compute Express Link) technology to enable resource pooling and on-demand expansion of computing and memory resources. This flexibility allows customers to optimize costs while maintaining the scalability required for AI workloads.
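Resource pooling of this kind can be pictured as a shared pool of memory that accelerators borrow from and return to on demand, rather than each device being capped at its fixed local capacity. The toy model below illustrates that idea conceptually; the class, device names, and capacities are invented for the example, and real CXL pooling is managed by hardware and system software, not application code.

```python
# Conceptual toy model of CXL-style memory pooling: devices draw capacity
# from a shared pool on demand instead of stranding fixed local memory.

class MemoryPool:
    def __init__(self, capacity_gb: int):
        self.capacity_gb = capacity_gb
        self.allocations: dict[str, int] = {}

    @property
    def free_gb(self) -> int:
        return self.capacity_gb - sum(self.allocations.values())

    def allocate(self, device: str, gb: int) -> bool:
        """Grant `gb` to `device` if the pool has room."""
        if gb > self.free_gb:
            return False
        self.allocations[device] = self.allocations.get(device, 0) + gb
        return True

    def release(self, device: str) -> None:
        """Return a device's capacity to the pool for others to use."""
        self.allocations.pop(device, None)

pool = MemoryPool(capacity_gb=1024)
pool.allocate("accel-0", 600)        # large LLM job expands beyond local memory
pool.allocate("accel-1", 300)        # a second job shares the remainder
ok = pool.allocate("accel-2", 200)   # denied: only 124 GB currently free
pool.release("accel-0")              # capacity is reclaimed, not stranded
```

The contrast with the fixed-GPU model is that a denied request here is temporary: once another job releases capacity, it returns to the pool instead of sitting idle on one device.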


A Game-Changing Approach to AI Infrastructure

Panmnesia’s CEO expressed optimism about the project, citing the company’s expertise in memory and interconnect technologies as key to securing the funding.

The project is expected to reshape the AI infrastructure landscape, driving the evolution of data center architecture and enabling more efficient, cost-effective scaling of AI workloads.

