Micron’s 192GB SOCAMM2 LPDDR5X module boosts AI data center performance with higher bandwidth, lower power use, and a compact design.
NetApp launches its AFX AI portfolio, delivering disaggregated storage for high-performance environments, bundled with the NetApp AI Data Engine.
Lenovo announces new AMD AI desktops: all-in-one, SFF, and tiny builds.
QNAP launches an affordable, Lite-Managed 10GbE switch designed for creators.
Cisco introduces the 8223, a high-performance fixed router based on the Silicon One P200 chip that is said to deliver 51.2 Tb/s of throughput.
ZutaCore and ASRock Rack collaborate to redefine waterless cooling for AI, pairing a turnkey server with cold-plate technology for B300 clusters.
Graid licenses Intel VROC to unify CPU and GPU RAID, delivering high-performance protection for NVMe-heavy AI, HPC, and enterprise workloads.
With an emphasis on AI inference, Intel unveils a new data center GPU and a rack-scale reference design for Gaudi 3.
At OCP 2025, HPE upgrades iLO 7 and Gen12 servers with stronger security, modular designs, NVIDIA support, and MLPerf-leading AI results.
Broadcom’s Thor Ultra 800G AI Ethernet NIC is designed for clusters with more than 100,000 XPUs.