Cerebras and Ranovus Secure $45M DARPA Contract to Advance AI Chip Connectivity

Silicon Valley-based artificial intelligence chip manufacturer Cerebras Systems and Canadian optical networking firm Ranovus have secured a $45 million contract from the US Defense Advanced Research Projects Agency (DARPA) to create ultra-fast, low-energy interconnects between advanced computing chips. The project is intended to support high-performance computing systems that can simulate complex battlefield scenarios in real time.
Innovative Chip Design Meets Optical Connectivity
While most AI chips are roughly the size of a postage stamp, Cerebras' Wafer-Scale Engine spans an entire silicon wafer and is designed to outperform clusters of smaller GPUs by eliminating the communication bottlenecks between chips.
In this new DARPA-backed project, Cerebras will integrate its massive chips with Ranovus' optical networking technology. Unlike conventional electrical interconnects, Ranovus' co-packaged optics use light to transmit data, enabling higher speeds and lower power consumption, both key requirements for scalable and sustainable high-performance computing.
Unveiling New Tech for Defense Applications
Cerebras CEO Andrew Feldman said, “By solving these fundamental problems of compute bandwidth, communication IO and power per unit compute through Cerebras’ wafer scale technology plus optical integration with Ranovus co-packaged optics, we will unlock solutions to some of the most complex problems in the realm of real-time AI and physical simulations -- solutions that are today utterly unattainable.”
Fueling the Race for Optical-Electronic Integration
As demand surges for more powerful and efficient AI computing infrastructure, particularly in defense, scientific, and enterprise applications, optical interconnects are gaining attention as a critical answer to bandwidth and energy challenges. This has triggered a wave of investment and competition among startups and established chipmakers alike.