GRAID SupremeRAID SR-1010 Review

by Brian Beeler

GRAID™ made waves last year by introducing an NVMe “RAID card.” Of course, it wasn’t exactly a RAID card at all, but an NVIDIA GPU paired with GRAID’s software. The net results were astonishing: the first-gen card was able to hit nearly 9 million 4K random read IOPS with minuscule latency from just eight SSDs.

Now the company is back with the GRAID SupremeRAID SR-1010 which ups the ante with a Gen4 GPU and updated software that can deliver up to 19 million 4K random read IOPS and 110GB/s large block sequential read.

GRAID SupremeRAID SR-1010 in riser

GRAID SupremeRAID Primer

It was just a few months ago that we reviewed the SR-1000, so we’ll not belabor the point too much about what GRAID’s mission is. But for those who want the brief version, GRAID essentially saw a flaw in the way NVMe drives were being managed. Users were either using software RAID solutions to group SSDs together, or they were using NVMe RAID cards that come with challenges of their own.

Software RAID is great in that it is low cost and easy to implement. But software RAID carries penalties, specifically CPU and system resource overhead. Software RAID has no hardware offload component, so parity calculation and management tasks fall to the system CPU and DRAM.

Physical RAID cards, which have been the standard for decades, haven’t kept up with the pace of innovation in flash. A server with a Gen4 PCIe slot can only hope to attain roughly 16GB/s from an x8 slot or 32GB/s from an x16 slot. The trouble is that most RAID cards are x8, so a host system with even just 24 SSDs would need multiple RAID cards to extract full performance from the flash.
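As a rough sanity check on those slot numbers, usable PCIe Gen4 bandwidth works out to about 2GB/s per lane per direction after encoding and protocol overhead; the per-lane figure here is a common rule of thumb, not a GRAID-published value:

```shell
# Approximate usable PCIe Gen4 bandwidth per direction.
# ~2 GB/s per lane is an assumed rule of thumb after protocol overhead.
gen4_gbps_per_lane=2
echo "x8 slot:  $(( 8  * gen4_gbps_per_lane )) GB/s"
echo "x16 slot: $(( 16 * gen4_gbps_per_lane )) GB/s"
```

Which lines up with the 16GB/s and 32GB/s ceilings above, and shows why a single x8 RAID card becomes the bottleneck long before two dozen Gen4 SSDs do.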

GRAID SupremeRAID SR-1010 Specifications

This gets us to the latest iteration of GPU-enabled RAID, the GRAID SupremeRAID SR-1010. The SR-1010 supports up to 32 Gen4 SSDs in a single box. The updated hardware, software, and interface let the SR-1010 post solid generational gains over the first card, most notably when it comes to writes. The SR-1010 card itself is an NVIDIA RTX A2000, designed to support RAID 0, 1, 5, 6, and 10 with GRAID’s software.

The newer Gen4 card offers increased computation power and PCIe bandwidth. The PCIe bandwidth contributes to write performance since write data flows through the GPU card. This is where sequential write and random write performance double because of the bandwidth increase from 10GB/s to 20GB/s (PCIe Gen3 vs Gen4).

GRAID SupremeRAID SR-1010 with Memblaze SSD

A more complete spec overview is below, and the detailed sheet from GRAID is here.

  • Supported RAID Levels – RAID 0, 1, 5, 6, 10
  • Max Physical Drives – 32
  • Max Drive Groups – 4
  • Max Virtual Drives per Drive Group – 8
  • Max Drive Group Size – Defined by physical drive size
  • OS Support
    • AlmaLinux 8.5
    • Rocky Linux 8.5
    • CentOS 7.9, 8.4, 8.5
    • openSUSE Leap 15.2, 15.3
    • RHEL 7.9, 8.4, 8.5
    • SLES 15 SP2, SP3
    • Ubuntu 20.04
    • Windows Server 2019 x86-64
    • Windows Server 2022 x86-64
  • Host Interface – x16 PCIe Gen 4.0
  • Max Power Consumption – 70W
  • Form Factor – 2.713″ H x 6.6″ L, Single Slot
  • Product Weight – 306 g

GRAID SupremeRAID SR-1010 Performance

To measure the performance of the SupremeRAID SR-1010, we leveraged our Dell PowerEdge R750 running Ubuntu 20.04 with eight Gen4 NVMe bays up front. We used eight Memblaze PBlaze6 6926 12.8TB SSDs, giving us a large footprint of high-performance NAND to use in a RAID5 configuration.

Since these are different SSDs than in our last review, we completed a full round of new tests comparing RAID5 configurations using software RAID versus the SR-1000 and SR-1010. For software RAID, we used mdadm with a 64K chunk size. Tests were performed by running FIO against the RAID volume.
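For reference, a comparable software RAID baseline can be reproduced with commands along these lines. The device names are placeholders for the eight NVMe drives, and the FIO parameters mirror the 4K random read test (32 threads, queue depth 64) — a sketch of the methodology, not our exact test scripts:

```shell
# Build an 8-drive RAID5 md array with a 64K chunk size
# (device names are placeholders for your NVMe drives)
mdadm --create /dev/md0 --level=5 --raid-devices=8 --chunk=64K \
    /dev/nvme0n1 /dev/nvme1n1 /dev/nvme2n1 /dev/nvme3n1 \
    /dev/nvme4n1 /dev/nvme5n1 /dev/nvme6n1 /dev/nvme7n1

# 4K random read against the RAID volume: 32 jobs at queue depth 64
fio --name=randread --filename=/dev/md0 --direct=1 --rw=randread \
    --bs=4k --ioengine=libaio --numjobs=32 --iodepth=64 \
    --time_based --runtime=60 --group_reporting
```

Swapping `--rw` and `--bs` (e.g. `--rw=write --bs=1m` with 16 jobs at queue depth 32) covers the sequential cases in the table below.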

Dell PowerEdge R750 Specifications:

  • 2 x Intel Xeon Platinum 8380 (3rd Gen Intel Xeon Scalable) CPUs
  • 32 x 32GB DDR4 3200MHz
  • 8 x 12.8TB Memblaze PBlaze6 6926
RAID 5 FIO Performance

Test | SW RAID5 | SR-1000 (Gen3) | SR-1010 (Gen4)
1MB sequential write (16T/32Q) | 1.3GB/s | 11.1GB/s | 17.7GB/s
1MB sequential read (16T/32Q) | 56.2GB/s | 49.4GB/s | 49.4GB/s
8K random 70/30 (32T/64Q) | 160.2k IOPS | 1.51M IOPS | 1.95M IOPS
4K random write (32T/64Q) | 73.9k IOPS | 838k IOPS | 1.56M IOPS
4K random read (32T/64Q) | 2.24M IOPS | 10.3M IOPS | 11.0M IOPS

Starting with sequential read bandwidth, the software RAID configuration had a slight edge, measuring 56.2GB/s versus 49.4GB/s on the SR-1000 and SR-1010.

Moving to sequential write, though, the lead switched to GRAID by a huge margin. We measured 1.3GB/s from software RAID compared to 11.1GB/s from the Gen3 SR-1000 and 17.7GB/s from the Gen4 SR-1010.

In our random 8K workload with a 70/30 R/W mix, we saw 160.2k IOPS from software RAID compared to 1.51M IOPS with the SR-1000 and 1.95M IOPS on the SR-1010.

Moving down to the 4K random write workload, we continued to see huge gains. The software RAID configuration measured just 73.9k IOPS compared to 838k IOPS on the SR-1000 and a much-improved 1.56M IOPS on the SR-1010.

In 4K random read, both GRAID cards delivered similar performance, with 10.3M IOPS from the SR-1000 and 11.0M IOPS on the SR-1010, both far higher than software RAID, which measured 2.24M IOPS.

Conclusion

When looking at the GRAID SupremeRAID SR-1010, we got exactly what we expected. While we were pleasantly surprised with the SR-1000 last year, with the SR-1010 we went in with higher expectations. To give the SR-1010 the best shot we could, we paired the GRAID card with a high-spec server and eight of our best enterprise SSDs in the Memblaze 6926 drives.

It should be noted that our performance results, while very good, won’t match the GRAID spec simply due to the number of drives under test. While we’d love to revisit the GRAID card with a fully-loaded 32 SSD configuration, we just don’t have that many matching SSDs in the lab.

In terms of the largest performance gains compared to the prior-gen card, the GRAID SupremeRAID SR-1010 had the biggest measured improvements in write workloads. We saw sequential write speeds pop from 11.1GB/s with the SR-1000 to 17.7GB/s with the SR-1010. Random write performance also improved substantially, increasing from 838k IOPS with the SR-1000 to 1.56M IOPS with the SR-1010. Overall we were very impressed by these numbers, which were many times higher than software RAID performance.

Organizations that want the most out of their investment in NVMe SSDs have a lot of options. What most don’t realize, though, is that software RAID and traditional hardware RAID cards both have serious limitations that effectively kneecap the most expensive components in a server: the SSDs. GRAID opens up the flow of data while also freeing up system resources and/or I/O slots that legacy models would consume. Setup is easy, and the performance gains are easily measurable. It’s well worth a PoC to see if these cards are the right fit.

Product Page
