Same Drives = Same Performance?
  January 21, 2001 Author: Terry Baranski  


WinBench Results

The following WinBench tests were run on each drive: Disk/Read Transfer Rate, Disk CPU Utilization, Disk Access Time, Business Disk WinMark 99, and High-End Disk WinMark 99. A single run of each of these tests constituted a "trial"; five trials were conducted per drive, with the machine rebooted between trials. Each test's final score is the average of the five runs.
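(The aggregation itself is trivial; for those curious, the Python sketch below mirrors the protocol. The trial values shown are hypothetical placeholders, not our measured scores.)

```python
# Hypothetical trial scores (KB/sec); real values come from the WinBench logs.
trials = {
    "Business Disk WinMark 99": [6540, 6570, 6550, 6565, 6555],
    "High-End Disk WinMark 99": [16400, 16440, 16410, 16430, 16420],
}

for test, scores in trials.items():
    mean = sum(scores) / len(scores)                   # final reported score
    spread = (max(scores) - min(scores)) / mean * 100  # run-to-run spread, %
    print(f"{test}: mean {mean:.0f} KB/sec, spread {spread:.1f}%")
```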


Sustained Transfer Rate Graphs:

[STR graphs of each sample: Drive A, Drive B, Drive C, Drive D]

One can see that all four drives have a maximum transfer rate of just under 30 MB/sec and a minimum transfer rate of around 17.5 MB/sec. This was expected: different samples of the same drive should, of course, have the same number of sectors per track in any given zone; therefore, sequential transfer rates should be identical.
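To see why identical zone layouts force identical sequential rates, note that a zone's media rate is simply sectors per track × bytes per sector × revolutions per second. The Python sketch below works through illustrative numbers; the 5400 rpm spindle speed and per-zone sector counts are assumptions chosen to land near the observed 30 MB/sec and 17.5 MB/sec endpoints, not published specifications:

```python
BYTES_PER_SECTOR = 512

def media_rate_kb_s(sectors_per_track: int, rpm: int) -> float:
    """Sequential media rate for one zone: sectors/track x bytes/sector x rev/s."""
    return sectors_per_track * BYTES_PER_SECTOR * (rpm / 60) / 1000

# Illustrative zone table for a 5400 rpm drive (sector counts are assumptions).
for zone, spt in [(1, 660), (8, 520), (15, 390)]:
    print(f"Zone {zone:2d}: {media_rate_kb_s(spt, 5400):8.0f} KB/sec")
```

Since every sample of a given model shares the same zone table, defect-free samples should trace essentially the same STR curve.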

Two graphs stand out. Note that drive C's chart is quite jagged in the first four zones - zone 1 in particular. This behavior was repeatable and consistent, and we're somewhat at a loss for an explanation. Drive D's graph is also interesting: there appears to be at least one remapped sector about halfway through the first zone.


The rest of the WinBench scores...

Ziff Davis WinBench 99 under Windows 2000 Professional using NTFS

Benchmark                            Drive A   Drive B   Drive C   Drive D
Business Disk WinMark 99 (KB/sec)      6556      6588      6586      6324
High-End Disk WinMark 99 (KB/sec)     16420     16360     16240     16320
  AVS/Express 3.4 (KB/sec)            16520     15820     15680     16000
  FrontPage 98 (KB/sec)               63640     63680     63000     64100
  MicroStation SE (KB/sec)            21200     20960     20880     20900
  Photoshop 4.0 (KB/sec)               8616      8708      8566      8616
  Premiere 4.2 (KB/sec)               14240     14040     14220     13940
  Sound Forge 4.0 (KB/sec)            17980     17960     18860     18220
  Visual C++ (KB/sec)                 16500     16920     16640     16620
Disk/Read Transfer Rate
  Beginning (KB/sec)                  29767     29800     29443     28900
  End (KB/sec)                        17500     17500     17500     17467
Disk Access Time (ms)                 15.04     15.16     15.26     15.18
Disk CPU Utilization (%)               2.72      2.74      2.73      2.74

One can see that drives A, B, and C turn in virtually identical Business Disk WinMark 99 scores. Drive D, on the other hand, lags the others by roughly 4%. The High-End Disk WinMark 99 scores exhibit little deviation: all four drives land within about 1% of each other. There are, however, some significant differences between drives in the individual application tests that make up the High-End score. For example, drive C's Sound Forge score is about 5% higher than drive A's and drive B's. These individual application tests run very quickly, though, so some variation in the scores is inevitable even across multiple runs on the same drive. Case in point: the difference between the lowest and highest of the five Sound Forge trials approaches 10%. As a result, we believe that the differences in application scores between drives are a product of the benchmark itself rather than of inter-sample performance differences.
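A quick way to judge whether a between-drive gap is meaningful is to compare it against the within-drive, run-to-run spread. The sketch below does exactly that using the published Sound Forge means; the five single-drive trial values are hypothetical placeholders, since we only published the overall spread, not the raw trials:

```python
# Published Sound Forge means (KB/sec) from the table above.
means = {"A": 17980, "B": 17960, "C": 18860, "D": 18220}

gap = (means["C"] - means["A"]) / means["A"] * 100
print(f"Drive C vs drive A: {gap:.1f}% apart")  # ~4.9%

# Hypothetical five trials for a single drive (values are assumptions);
# the article reports a lowest-to-highest spread approaching 10%.
one_drive = [17300, 17900, 18100, 18300, 18950]
spread = (max(one_drive) - min(one_drive)) / min(one_drive) * 100
print(f"Within one drive: {spread:.1f}% lowest-to-highest")
```

When the within-drive spread rivals or exceeds the between-drive gap, the gap can't reasonably be attributed to the hardware.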

There is a 0.22 ms difference between the lowest and highest access time scores (15.04 ms for drive A, 15.26 ms for drive C). Does this mean that drive A seeks faster than the other three drives? While that is certainly possible, we have some reservations about drawing such a conclusion. One concern we have with WinBench 99's access time test is its relatively short run length. When measuring something such as average access time, accuracy increases with the number of seeks measured: the more samples that go into an average, the less it is distorted by outliers and other undesired effects.
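This point is easy to demonstrate with a simulation: averages computed from more seeks cluster far more tightly across repeated runs. In the Python sketch below, the seek-time distribution (a 15 ms Gaussian with occasional retry outliers) is purely an illustrative assumption:

```python
import random

random.seed(1)

def measure_access_time(n_seeks: int) -> float:
    """Average n simulated seeks; the distribution here is purely illustrative."""
    samples = []
    for _ in range(n_seeks):
        t = random.gauss(15.0, 2.0)   # nominal ~15 ms access, with noise
        if random.random() < 0.02:    # rare recalibration/retry outlier
            t += 30.0
        samples.append(t)
    return sum(samples) / n_seeks

# Five repeated "tests" at each run length: short runs scatter, long runs converge.
for n in (100, 1000, 10000):
    runs = [round(measure_access_time(n), 2) for _ in range(5)]
    print(f"{n:6d} seeks: five runs -> {runs}")
```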

Another concern we have about WinBench's access time test is a phenomenon we observed while testing these units. The table below lists the five access time scores for each drive, one per trial:

Access Time using Ziff Davis WinBench 99 under Windows 2000 Professional using NTFS

          Drive A   Drive B   Drive C   Drive D
Trial 1   15.2 ms   15.4 ms   15.4 ms   15.3 ms
Trial 2   15.0 ms   15.1 ms   15.3 ms   15.2 ms
Trial 3   15.0 ms   15.1 ms   15.2 ms   15.2 ms
Trial 4   15.0 ms   15.1 ms   15.2 ms   15.1 ms
Trial 5   15.0 ms   15.1 ms   15.2 ms   15.1 ms

Note that, for all four drives, WinBench's reported access time decreases as more trials are run (the access time reported in trial 5 is always lower than that reported in trial 1). This behavior is very consistent. One explanation that occurred to us is that the drives weren't reformatted between trials. But why should that matter? An access time test reads random sectors, so reformatting shouldn't affect the scores. And since the machine was rebooted between trials, caching should have been eliminated as well. We're at a loss to explain this behavior, which is the main reason why it concerns us...
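We can't say what WinBench does internally, but an access time measurement that sidesteps both the filesystem and the OS cache entirely can be built from unbuffered raw-device reads. A minimal sketch (in Python, for Linux, run as root; /dev/sdb is a placeholder for the drive under test, and the 10 GB seek span is an arbitrary assumption):

```python
import mmap, os, random, time

DEVICE = "/dev/sdb"    # placeholder: raw device of the drive under test
SECTOR = 512
SPAN = 10 * 2**30      # seek across the first 10 GB (assumption)
N_SEEKS = 5000

# O_DIRECT bypasses the OS page cache, so no reboot between runs is needed.
fd = os.open(DEVICE, os.O_RDONLY | os.O_DIRECT)
# O_DIRECT requires a sector-aligned buffer; anonymous mmap pages are page-aligned.
buf = mmap.mmap(-1, SECTOR)

start = time.perf_counter()
for _ in range(N_SEEKS):
    # Read one sector at a random sector-aligned offset, uncached.
    offset = random.randrange(0, SPAN // SECTOR) * SECTOR
    os.lseek(fd, offset, os.SEEK_SET)
    os.readv(fd, [buf])
elapsed = time.perf_counter() - start

print(f"average access time: {elapsed / N_SEEKS * 1000:.2f} ms over {N_SEEKS} seeks")
os.close(fd)
```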

Finally, CPU utilization scores are virtually the same for all four drives, as was expected.

 IOMeter Results...

