Independent Storage Reviews Are All The Rage, Glad We’re Already There

by Brian Beeler

What a fun few days it's been for those who enjoy watching the storage industry turn on itself. Historically, enterprise IT gets talked about when a company pays an analyst or media outlet to publish a report of one kind or another. In the storage world, this often means a vendor pays an analyst to write a "validation report" showing that the vendor's claims are true, fair, and accurate. These reports are then used by said vendor's sales team to extol the virtues of their SAN or whatnot, pointing to the implicitly independent analyst report as proof. Storage buyers, who in most cases have neither the time nor the expertise to properly test products themselves (we recently saw a company testing a hybrid array with CrystalDiskMark, for crying out loud), take the report at face value, not knowing it's a fraud. Fraud may be a broad brush to paint the industry with, but when reports are touted as independent and usually aren't, that's a deceptive practice.

Interestingly, Chris Mellor did a piece on this very problem last week. While we weren't listed among his favorite independent testers, his points are valid. One bit of advice rings especially true:

Any vendor who will not submit their product to review by a third party of your choosing must have damn good reasons for not doing so. If you come across a storage product with no independent reviews, using the criteria above, then I would suggest you walk away.

Mellor is absolutely right; this is exactly what StorageReview has been preaching for years. We do not charge for our reviews; everything you see on the site is independent and free of vendor bias.

"But wait," the naysayer protests, "How can you make any money if you don't charge for your reviews?" It's funny; every time we engage a new storage vendor on a potential review, this is one of the first questions that comes up, because the industry has accepted pay-to-play. While our business model isn't a secret, we rarely disclose it in content like this because it's usually irrelevant. But here, as we draw distinctions between reviews a buyer can trust and reviews they cannot, it's critical. Like most media outlets, we monetize our traffic with advertisements. The other thing we do is what we broadly call "lab services." This could be anything from testing one firmware revision against another to putting alpha or beta systems through our workloads to see if we can shake out issues a vendor's engineers may have missed. None of this data is presented on the site, however; it's for internal consumption only.

Let's compare our approach of testing in our lab to the way analyst reports get done. Typically, a storage vendor will engage an analyst to write a report. But here's the twist: the analyst is essentially told what to write and test. In fact, the truth is often much worse. In many instances the analyst does no testing at all; they watch the vendor test, at the vendor's location, and write a report based on those activities, billing mightily for their "service." There are many examples where it's clear the output has been influenced by the vendor.

Storage Swiss is a frequent target when looking at the storage analyst market. They've come under fire again this week from EMC for posting anti-VMware VSAN content at the behest of GridStore. Previously, their sponsored videos have led to some fun claims, like this gem delivered while looking at a Nexsan array propped up on a dining table: "It has very clean lines, you can just see the reliability." Really? I highlight this particular instance because even after the sale of Nexsan to Imation, the company refuses to let our lab do independent testing of their hybrid array. Let's echo Mellor again: "Any vendor who will not submit their product to review by a third party of your choosing must have damn good reasons for not doing so."

Nexsan's relevance in the wider storage market is debatable; we can't argue either side since we've never seen a unit. But this isn't just a small-company problem. Take Pure Storage, for instance. They win the award for the most vocal Twitter account, but they also refuse to participate in our testing. This is odd, of course, because their list of claims is long and varied, yet they won't let us look. In full disclosure, Pure did once offer us a system to review, two years ago. Once we had built out a proper Fibre Channel network to test their unit, they reversed field and pulled their support. Wildly active Twitter + only paid analyst reports = buyer beware. Even the big boys are guilty: IBM refuses to even send us their press releases, forget about engaging in any meaningful way. Despite their bold claim of investing $1 billion in flash to catch up to the rest of the market leaders, it's hard to tell if any of their efforts are working, given their refusal to allow an independent look.

It's not just the report writers that should be questioned; the list of poor storage recommendations goes on. SearchStorage, for instance, gives out awards at the end of the year, which enterprise IT loves for the marketing value. The thing is, these awards aren't based on repeatable, comparative, quantitative factors, as the site doesn't actually do substantive testing on the products it issues awards to. Instead, awards are based on vendor submissions that are rated on these criteria:

Products were judged by a panel of users, analysts, consultants, and Storage magazine and SearchStorage.com editors. Products were rated based on innovation, performance, ease of integration into environment, ease of use and manageability, functionality and value.

If you can't trust a vendor who submits their products to SearchStorage and wins on self-proclaimed "innovation value," who can you trust?

This piece doesn't necessarily add much value to the conversation, but with The Register vocally calling BS on the industry last week and EMC doing the same this week, I felt it was worthwhile to sound the horn again. The SMB/SME audience is especially vulnerable to bad reporting and to the marketing of dubious awards that mean nothing in almost every case. Without the wherewithal to do extensive proof-of-concept testing internally, where do they turn? The answer, of course, is independent testing. The next time a storage sales guy hands over an analyst report, ask who paid for it, then take it for what it's worth and try to find another take.
