Hitachi Data Systems Delivers Next-Generation Hyper-Converged, Scale-Out Platform

by Adam Armstrong

Today Hitachi Data Systems (HDS) announced its next-generation Hyper Scale-Out Platform (HSP), the HSP 400 series. The new HSP 400 series comes with native integration with the Pentaho Enterprise Platform, which will help the HSP deliver a sophisticated, software-defined, hyper-converged platform for big data deployments. Supporting big data blending, embedded business analytics, and simplified data management, the HSP 400 combines compute, storage, and virtualization to deliver a seamless infrastructure.


Massive amounts of data are now being generated almost constantly, driven largely by information technology (IT), operational technology (OT), and the Internet of Things (IoT). More and more companies are realizing that value can be derived from this data, but they need the technology to extract it. This is where the HSP comes in, with a software-defined architecture that centralizes and simplifies the storing and processing of large datasets, offering high availability, simplified management, and a pay-as-you-grow model. The HSP is offered as a turnkey appliance that can be installed in a matter of hours as opposed to months. Once installed, the HSP supports production workloads and simplifies the creation of an elastic data lake, helping customers easily integrate disparate datasets and run advanced analytic workloads.

The scale-out architecture of the HSP gives customers a simplified, scalable, and enterprise-ready infrastructure for big data. HDS also designed the HSP with a centralized, easy-to-use interface that automates the deployment and management of virtualized environments for leading open source big data frameworks, including Apache Hadoop, Apache Spark, and commercial open source stacks such as the Hortonworks Data Platform. The HSP is initially focused on big data analytics, but HDS's long-term goal is for the platform to deliver best-in-class total cost of ownership (TCO) across a variety of IT workloads.

As mentioned above, the HSP 400 series is natively integrated with the Pentaho Enterprise Platform. This integration gives customers enterprise-grade features such as big data lineage, lifecycle management, and enhanced information security, along with better control of their analytic data pipeline. HDS also states that this will accelerate time to business insight and deliver rapid return on investment (ROI), while simplifying the integration of IT and OT.

Availability

The Hitachi Data Systems Hyper Scale-Out Platform is available now with SAS disk drives; an all-flash configuration is expected later this year.

HDS Big Data
