NVIDIA has announced the Magnum IO software suite, designed to help data scientists, AI researchers, and high-performance computing (HPC) researchers process massive amounts of data quickly. Magnum IO is optimized to eliminate storage and input/output bottlenecks, and NVIDIA claims data processing speeds up to 20 times faster for multi-server, multi-GPU computing nodes working with massive datasets. This will allow organizations to run complex financial analysis, climate modeling, and other HPC workloads.
NVIDIA developed Magnum IO in partnership with industry leaders in networking and storage, including DataDirect Networks, Excelero, IBM, Mellanox, and WekaIO. A highlight of the release is GPUDirect Storage, which lets researchers bypass the CPU when accessing storage, providing fast, direct access to data files in applications such as simulation, analysis, and visualization.
Availability
NVIDIA Magnum IO software is available now. GPUDirect Storage, however, is currently limited to a select number of early-access customers, with a broader release slated for the first half of 2020.