StorageReview.com

Wasabi Expands with New High-Speed Storage Class and Enters Silicon Valley to Support AI Infrastructure


Wasabi Technologies has announced Wasabi Fire, a high-performance storage class designed specifically for AI workloads. Alongside it, the company has opened a new storage region in San Jose, California, through a partnership with IBM Cloud. Together, the moves are part of Wasabi’s effort to strengthen its position in the growing AI infrastructure market, where organizations are seeking storage that is fast, scalable, and cost-efficient.

Wasabi Fire is expected to roll out in early 2026, bringing SSD-level performance through NVMe technology. It’s built with heavy-duty AI workloads in mind, including machine learning training, real-time inference, high-frequency data capture, and media processing. Pricing is about $20 per terabyte per month, with no additional egress fees or hidden charges. The aim is to give organizations the speed and reliability needed for AI development without the high costs often associated with similar high-performance storage options.
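The flat per-terabyte rate with no egress fees makes cost estimation simple. The sketch below, based only on the approximate $20/TB/month figure quoted above (actual billing terms may differ), shows how a monthly bill would work out for a given capacity; the function name and example capacity are illustrative.

```python
# Rough cost sketch for the announced Wasabi Fire pricing:
# ~$20 per TB per month, flat, with no separate egress or API charges.
# The rate is the approximate figure from the announcement, not a quote.

FIRE_RATE_PER_TB_MONTH = 20.00  # USD, approximate announced price

def monthly_cost(tb_stored: float, rate: float = FIRE_RATE_PER_TB_MONTH) -> float:
    """Estimate a flat monthly bill: capacity times rate, nothing added for egress."""
    return tb_stored * rate

# Example: a 500 TB training dataset held for one month
print(monthly_cost(500))  # 10000.0
```

Because there are no egress fees, the estimate does not change with how often the data is read, which is the predictability Wasabi is emphasizing for AI training workloads.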

The company framed object storage as a core part of AI infrastructure and identified a market gap in which customers often have to choose between affordability and performance. Wasabi’s goal with Fire is to avoid that tradeoff by offering a combination of high-speed access and predictable pricing. The company already provides HDD-based Hot Cloud Storage for active workloads, and Fire is set to extend that platform into higher-performance territory.

New AI-ready storage region

As part of the launch, Wasabi also announced the activation of a new AI-ready storage region located in San Jose, its 16th worldwide. The San Jose region is co-located with IBM infrastructure and aims to support organizations with ultra-fast storage to drive more complex, time-sensitive workloads. This setup is designed to address latency issues and reduce bottlenecks during AI training and inference. With this new offering, organizations can potentially improve GPU utilization, accelerate model development, scale workloads more efficiently, and better manage associated costs.

IBM Cloud acknowledged the expansion as an opportunity for customers to leverage secure, enterprise-grade infrastructure alongside the high-performance features of Wasabi’s new Fire storage class. The collaboration is based at IBM’s San Jose data center, supporting Wasabi’s entry into Silicon Valley.


Lyle Smith

Lyle is a long-time staff writer for StorageReview, covering a broad set of end user and enterprise IT topics.