
Micron 192GB SOCAMM2: Compact, Low-Power DRAM for AI Data Centers

by Lyle Smith

Micron’s 192GB SOCAMM2 LPDDR5X module boosts AI data center performance with higher bandwidth, lower power use, and a compact design.

Micron has begun shipping customer samples of its new 192GB SOCAMM2 memory module, marking a significant advancement in low-power DRAM for AI data centers. Built on Micron’s most advanced 1-gamma process node and based on LPDDR5X technology, the module is set to deliver high bandwidth and improved energy efficiency in a remarkably compact form factor.

As AI models grow larger and more complex, memory has become one of the biggest bottlenecks in system design. Micron’s SOCAMM2 modules aim to address this by offering large-capacity, low-power memory that can scale more flexibly than traditional server DIMMs.

Micron 192GB SOCAMM2 Highlights

This latest iteration of Micron’s SOCAMM (small outline compression attached memory module) design increases capacity by 50% over the previous generation’s 128GB modules without expanding the physical footprint. The new 192GB module is expected to dramatically improve real-time AI inference performance: according to Micron, systems using SOCAMM2 can reduce time to first token (TTFT) in inference workloads by more than 80% compared to traditional server memory.
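For context, TTFT measures the delay between submitting a prompt and receiving the first generated token, a phase that is heavily memory-bandwidth bound. Below is a minimal, purely illustrative sketch of how the metric is typically measured; the generate_stream iterator and dummy_stream generator stand in for a real streaming inference API and are not part of Micron’s announcement.

```python
import time

def measure_ttft(generate_stream):
    """Return seconds elapsed from request start until the first token arrives.

    `generate_stream` is any iterator that yields tokens from a model;
    it is a hypothetical stand-in for a real streaming inference API.
    """
    start = time.perf_counter()
    next(generate_stream)  # block until the first token is produced
    return time.perf_counter() - start

def dummy_stream():
    """Toy generator simulating prefill latency before the first token."""
    time.sleep(0.25)  # simulated prefill phase (memory-bandwidth bound)
    yield "Hello"

print(f"TTFT: {measure_ttft(dummy_stream()):.3f} s")
```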

Designed for next-generation AI workloads that demand high throughput and energy-conscious infrastructure, SOCAMM2 modules combine LPDDR5X’s inherent advantages (low power consumption and high data transfer rates) with server-grade reliability and serviceability. This makes them particularly well suited to dense, liquid-cooled systems and environments that require significant memory capacity within tight power budgets.

SOCAMM2 also delivers a power-efficiency improvement of more than 20% over its predecessor and cuts power consumption by more than two-thirds compared to equivalent RDIMMs. This adds up to substantial savings at scale, especially in full-rack AI server deployments, which may use over 40 terabytes of low-power DRAM. The module’s physical size is just one-third that of a comparable RDIMM, allowing data centers to increase memory density while minimizing system footprint.
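To put those figures in perspective, here is a rough back-of-the-envelope sizing sketch based on the numbers above. The 40TB rack capacity, 192GB module size, and two-thirds power reduction come from Micron’s claims; the per-RDIMM wattage is a hypothetical placeholder, not a published specification.

```python
# Back-of-the-envelope sizing for a 40TB low-power DRAM rack,
# using the figures Micron cites for SOCAMM2.

RACK_CAPACITY_GB = 40 * 1024   # ~40TB of DRAM in a full AI rack
MODULE_CAPACITY_GB = 192       # one SOCAMM2 module

modules = -(-RACK_CAPACITY_GB // MODULE_CAPACITY_GB)  # ceiling division
print(f"Modules per rack: {modules}")  # 214

# Micron claims a >2/3 power reduction vs. equivalent RDIMMs.
# The RDIMM wattage below is an assumed placeholder for illustration only.
rdimm_power_w = 12.0
socamm2_power_w = rdimm_power_w / 3  # roughly one-third the power

print(f"Estimated rack DRAM power: {modules * rdimm_power_w:,.0f} W (RDIMM) "
      f"vs. {modules * socamm2_power_w:,.0f} W (SOCAMM2)")
```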

Micron Working With NVIDIA

Micron’s collaboration with NVIDIA over the past five years played a key role in developing the SOCAMM form factor for data center use. Initially based on mobile memory architecture, SOCAMM has since evolved to meet the durability, bandwidth, and thermal demands of server environments. SOCAMM2 extends this vision with stacking innovations and modular construction that support easier serviceability and future scalability.

The company has also contributed to JEDEC’s development of SOCAMM2 standards and is working with industry partners to drive adoption across the AI data center ecosystem.

Availability

SOCAMM2 modules are now sampling in configurations up to 192GB and data rates up to 9.6 Gbps, with volume production planned in line with customer deployment timelines.
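For a sense of what that data rate implies, a simple peak-bandwidth calculation follows. Note that the 128-bit module interface width is an assumption for illustration; Micron’s announcement specifies the 9.6 Gbps data rate but not the bus width.

```python
# Peak per-module bandwidth implied by a 9.6 Gb/s per-pin data rate,
# assuming a 128-bit module interface (an assumption, not a Micron spec).

data_rate_gbps = 9.6    # per-pin data rate, Gb/s
bus_width_bits = 128    # assumed SOCAMM2 interface width

bandwidth_gb_per_s = data_rate_gbps * bus_width_bits / 8
print(f"Peak bandwidth per module: {bandwidth_gb_per_s:.1f} GB/s")  # 153.6 GB/s
```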

Micron SOCAMM2 Technology Enablement Program webpage
