NetApp AIPod Blends Storage, Lenovo and NVIDIA for Easy AI Onramp

by Harold Fritts

NetApp has announced the AIPod with Lenovo ThinkSystem servers for NVIDIA OVX, a new converged infrastructure optimized for the generative AI era. It marks a significant advancement in NetApp's partnership with Lenovo.

This solution is designed to be simple, cost-effective, and powerful, supporting NVIDIA NIM inference microservices, the NVIDIA AI Enterprise software platform for generative AI development and deployment, and retrieval-augmented generation (RAG). It enables customers to utilize their private, proprietary data for AI applications without extensive model training.
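To make the RAG workflow concrete, here is a minimal, illustrative sketch in Python. It assumes a NIM-style inference endpoint that exposes an OpenAI-compatible chat API at a hypothetical URL, and it stands in a toy keyword "retriever" for a real vector index built over proprietary data on NetApp storage; none of these names are part of the AIPod product itself.

```python
# Minimal RAG sketch (illustrative only). Assumes a NIM-style endpoint
# exposing an OpenAI-compatible chat API; the URL, model name, and the
# in-memory "retriever" below are placeholders, not AIPod components.
from openai import OpenAI

client = OpenAI(base_url="http://aipod-nim.example.local/v1", api_key="not-used")

# Stand-in for proprietary documents stored on NetApp systems; a real
# deployment would query a vector index built from that data.
documents = [
    "Q3 field report: unit A12 requires firmware 2.4 before GPU passthrough.",
    "Support policy: chatbot answers must cite the internal knowledge base.",
]

def retrieve(query: str, k: int = 1) -> list[str]:
    # Naive keyword-overlap scoring, purely a placeholder for vector search.
    words = set(query.lower().split())
    scored = sorted(documents, key=lambda d: -len(words & set(d.lower().split())))
    return scored[:k]

def answer(query: str) -> str:
    # Augment the prompt with retrieved context, then call the LLM endpoint.
    context = "\n".join(retrieve(query))
    response = client.chat.completions.create(
        model="example-llm",  # placeholder model name
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {query}"},
        ],
    )
    return response.choices[0].message.content

print(answer("Which firmware does unit A12 need?"))
```

The point of the sketch is the shape of the pipeline: retrieve relevant private data, inject it into the prompt, and let a pre-trained model answer, so no model retraining is required.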

NetApp AIPod

Generative AI presents vast opportunities for organizations, though challenges such as cost, integration complexity, and deployment times can make AI seem inaccessible. Additionally, pre-built large language models often lack training on the latest or most relevant data organizations need to generate valuable insights or automate routine tasks effectively.

The new NetApp AIPod with Lenovo puts this data to work beyond basic model training by supporting inferencing and RAG, making it easier and faster for organizations to combine up-to-date, relevant large language models with the data stored on NetApp systems.

Sandeep Singh, Senior Vice President & General Manager, Enterprise Storage at NetApp, highlighted the transformational impact of the NetApp AIPod with Lenovo ThinkSystem servers for NVIDIA OVX on enterprise AI. This solution provides a pre-integrated, high-performance platform that simplifies the deployment and scaling of generative AI workloads. It empowers organizations to focus on deriving value and insights from their data without the complexities of building and managing AI infrastructure from scratch.

The NetApp AIPod integrates NetApp's robust storage systems with Lenovo's ThinkSystem SR675 V3 servers, which support the NVIDIA-certified OVX architecture with NVIDIA L40S GPUs and NVIDIA Spectrum-X networking. This comprehensive infrastructure solution is designed to streamline AI adoption, enabling use cases like chatbots, knowledge management, and object recognition. It simplifies AI deployment by integrating seamlessly into existing ecosystems, streamlining operations, speeding up implementation, and securing AI infrastructure.

Kirk Skaugen, President of Lenovo Infrastructure Solutions Group, emphasized Lenovo's commitment to expanding its partnership with NetApp and NVIDIA to make AI more accessible and manageable for businesses of all sizes, focusing on critical availability, ease of management, and infrastructure efficiency.

Bob Pette, Vice President of Enterprise Platforms at NVIDIA, also noted the importance of making AI accessible and scalable for every industry, describing the NetApp AIPod with Lenovo for NVIDIA OVX as a powerful, user-friendly platform that enables businesses to drive innovation.

The NetApp AIPod with Lenovo for NVIDIA OVX is expected to be available by the summer of 2024.
