Micron Technology introduced its 192GB SOCAMM2 (second-generation Small Outline Compression Attached Memory Module), marking a new milestone in energy-efficient DRAM for AI data centers. Built with LPDDR5X technology on Micron’s advanced 1-gamma (1γ) DRAM process node, SOCAMM2 extends the company’s low-power DRAM leadership by delivering 50% higher capacity in the same compact footprint as its predecessor. Early testing shows SOCAMM2 can cut time-to-first-token (TTFT) by more than 80% for real-time inference workloads while improving power efficiency by over 20%.
SOCAMM2 continues Micron’s five-year collaboration with NVIDIA to expand low-power memory use in AI infrastructure. The 192GB module brings LPDDR5X’s hallmark energy savings and bandwidth to CPU-attached main memory, consuming roughly two-thirds less power than comparable RDIMMs in a module one-third the size. This compact, modular form factor improves system serviceability and supports dense, liquid-cooled server designs targeting large-scale AI training and inference workloads.
Micron said its SOCAMM2 design incorporates advanced stacking and rigorous data-center-grade testing to ensure reliability and scalability. The company is also contributing to the JEDEC SOCAMM2 standard to help accelerate industry-wide adoption of low-power DRAM. Customer samples of the 192GB SOCAMM2, running at speeds up to 9.6 Gbps, are shipping now, with mass production timed to align with major AI platform launches.
• 192GB SOCAMM2 increases capacity by 50% over previous generation
• Built on Micron’s 1-gamma DRAM process with >20% power-efficiency improvement
• Reduces time-to-first-token by >80% in real-time inference workloads
• Uses roughly two-thirds less power than RDIMMs in a module one-third the size
• Designed for liquid-cooled, high-density AI data center systems
“As AI workloads become more complex and demanding, data center servers must achieve increased efficiency, delivering more tokens for every watt of power,” said Raj Narasimhan, senior vice president and general manager of Micron’s Cloud Memory Business Unit. “Micron’s proven leadership in low-power DRAM ensures our SOCAMM2 modules provide the data throughput, energy efficiency, capacity, and data center-class quality essential to powering the next generation of AI data center servers.”
🌐 Analysis: Micron’s SOCAMM2 marks a major step in redefining DRAM for AI infrastructure, where bandwidth and power efficiency are critical bottlenecks. By pairing LPDDR5X performance with a modular, serviceable form factor, Micron is positioning itself against traditional RDIMM approaches from Samsung and SK hynix, both of which are pursuing low-power and HBM solutions for AI compute. The company’s alignment with JEDEC and NVIDIA underscores its strategy to shape standards for sustainable AI data center memory.