- Odinn Infinity Cube combines multiple Omnia supercomputers into a single glass enclosure
- Memory capacity reaches 86TB of DDR5 ECC registered RAM
- NVMe storage in the cube totals a whopping 27.5PB
Odinn, a California-based startup, has introduced the Infinity Cube as an attempt to compress data center-class computing into a visually appealing structure.
At CES 2026, the company unveiled the Odinn Omnia, a portable AI supercomputer. A single system of that kind, however, eventually runs into throughput limits, which the Cube aims to address by clustering many of them.
The Infinity Cube is a 14ft x 14ft AI cluster capable of housing multiple Omnia AI supercomputers within a single glass enclosure.
Scaling AI with Modular Clustering
This device prioritizes extreme component density over incremental efficiency improvements.
According to Odinn, a fully customizable core specification allows the Cube to scale up to 56 AMD EPYC 9845 processors, aggregating 8960 CPU cores.
Its GPU complement extends to 224 Nvidia B200 GPUs on HGX B200 baseboards, with 43TB of combined VRAM.
On the memory side, the system supports up to 86TB of DDR5 ECC registered RAM, while NVMe storage capacity reaches an impressive 27.5PB.
These specifications suggest significant internal interconnect and power distribution demands, details of which have not been publicly disclosed.
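Odinn has not published a per-part breakdown, but the headline figures line up with the public specs of the parts named. The sketch below is a quick sanity check, assuming 160 cores per EPYC 9845 and 192GB of HBM3e per B200 GPU; those per-unit numbers come from vendor spec sheets, not from Odinn.

```python
# Rough sanity check of the Infinity Cube's headline figures.
# Per-unit values are assumptions based on public part specs,
# not a breakdown disclosed by Odinn.

CPUS = 56
CORES_PER_EPYC_9845 = 160            # AMD EPYC 9845 (Zen 5c) core count
GPUS = 224
HBM_PER_B200_GB = 192                # Nvidia B200 HBM3e capacity (announced spec)

total_cores = CPUS * CORES_PER_EPYC_9845
total_vram_tb = GPUS * HBM_PER_B200_GB / 1000   # decimal TB, as vendors quote

print(f"CPU cores: {total_cores}")               # 8960, matching Odinn's figure
print(f"Combined VRAM: {total_vram_tb:.1f} TB")  # ~43 TB, matching Odinn's figure
```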
The device employs liquid cooling, with each Omnia unit managing its own thermal requirements independently.
This design theoretically eliminates the need for raised floors or centralized cooling systems.
The Infinity Cube utilizes a proprietary software layer called NeuroEdge to manage workloads across the cluster.
This software integrates with Nvidia’s AI ecosystem and common frameworks, automating scheduling and deployment.
This abstraction aims to minimize manual tuning, although it also makes day-to-day operations dependent on Odinn's continued software development.
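NeuroEdge's interfaces are not public, so any concrete example is necessarily hypothetical. As an illustration of the kind of per-node bookkeeping such an orchestration layer is meant to hide, the sketch below is a standard multi-node PyTorch DDP entry point that, without a scheduler, an operator would have to configure and launch on every node themselves (for example with torchrun).

```python
# Illustrative only: this is ordinary PyTorch distributed code, not NeuroEdge's API.
# An orchestration layer would typically generate, schedule, and launch something
# like this across cluster nodes (e.g. torchrun --nnodes=N --nproc_per_node=8 train.py).
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP


def main():
    # Rank and world-size environment variables are normally injected by the launcher.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    model = torch.nn.Linear(1024, 1024).cuda(local_rank)   # stand-in model
    model = DDP(model, device_ids=[local_rank])
    opt = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for _ in range(10):                                     # stand-in training loop
        x = torch.randn(32, 1024, device=local_rank)
        loss = model(x).square().mean()
        opt.zero_grad()
        loss.backward()
        opt.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

The pitch, in effect, is that this launch-and-placement detail disappears behind NeuroEdge's scheduler rather than being scripted by the operator.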
Organizations that depend on cloud infrastructure for AI workloads may question whether local orchestration simplifies administration in practical scenarios.
The company claims that the Infinity Cube is ideal for organizations with stringent privacy, safety, or latency requirements that discourage cloud reliance.
Bringing infrastructure closer to workloads can reduce network delays, but it also shifts the responsibility for uptime, maintenance, and lifecycle management back to the owner.
The concept of showcasing data center hardware within compact glass enclosures may have aesthetic appeal.
However, the practical trade-offs between density, accessibility, and resilience remain unaddressed without real-world deployment evidence.