
StorageReview.com's Optical Testbed Methodology Detailed

April 30, 2000 | Author: Tim Zakharov

Problems and Solutions:

CD WinBench 99

As chronicled in StorageReview.com's Headlines for April 8th, problems were encountered obtaining consistent results across four CD WinBench 99 discs. Specifically, the access time test would sometimes error out on the higher-RPM drives, often forcing the drive down to a slower, error-corrected RPM, as programmed into the firmware of most optical drives. Certain drive/disc combinations errored very frequently, while other combinations produced very consistent, error-free runs. The same problem would sometimes manifest itself in the transfer rate test (in the form of spikes and reduced transfer rates in the TR graph) as well as in the CD-ROM WinMark test, where certain drives would yield a wide range of CD-ROM WinMark scores across the four discs, while others held tightly-knit results.

Ziff-Davis Benchmark Operations responded quickly to an email detailing the problem. They reported encountering problems similar to those detailed in this article, suggested that the range of scores I was encountering was "not terribly out of line" with what they'd seen in the past, and even offered to test and/or replace my benchmark CDs. Finally, they stated that any CD could be used with the low-level tests in CD WinBench 99.

In speaking with engineers in the industry, I learned that this is a common problem with high-RPM CD drives. CD-ROMs, no matter how carefully pressed, may nonetheless contain minute imperfections (whether in the microscopic recorded pits, in the aluminum or plastic coating, or in how perfectly the pressing is centered on the physical disc). In addition, it is nearly impossible to keep CDs in a pristine state if they are used regularly. Scratches, dust, oily fingerprints and smudges are all a fact of life when handling CDs.

Unlike hard drives, CD drives can only be benchmarked by inserting media from an external source. This, by definition, introduces variables that are difficult, if not impossible, to control completely. Considering the exacting standards StorageReview.com adheres to in testing hard drives, I needed to take a step back and re-examine how, or even whether, to continue with this project.

It was at this point that we began differentiating between low-level and application-level performance and their corresponding relationships to the problems encountered. Low-level tests, we decided, should confirm or dispel manufacturers' specifications, while the CD-ROM WinMark test should in effect mimic how the drive would typically perform in daily applications across a variety of discs.

Thus, we began experimenting with different discs for the low-level tests, as suggested by Ziff-Davis Benchmark Operations. If a replacement CD-ROM was to be used for CD WinBench 99's low-level tests, it would need to be of sufficient size (at least 600 MB) to test across the majority of the disc surface; it would need to be as pristine as possible; and it would need to provide accurate results. In my research and experimentation, I came across one such candidate: the High Heat Baseball 2000 CD-ROM. It is 619 MB in size and brand new (I'd never gotten around to playing it), and in testing in a multitude of CD drives, it yielded low-level results closest to drive manufacturers' specifications, with few to no errors. This was about as perfect as it was going to get, considering the imperfect science of measuring CD drive performance.

It was decided that for the low-level tests, where perfect runs are desired in order to verify manufacturer specifications, a read error would be legitimate grounds for a retest. Read errors are very easy to recognize: there is usually a relatively loud click from the drive when the read error occurs, followed by an audible spin-down of the drive motor and a short pause before the motor spins back up. This may happen more than once during a single test run, and depending on the drive, the motor may eventually reduce RPM for the duration of the test in an effort to eliminate further read errors. Thus, when measuring access time and transfer rates, drives are tested until they produce three error-free runs of each test. In the event that a drive cannot complete either test without repeatedly erroring out, it receives no score for that test. The offending drive is then tested informally with other CD-ROMs to determine whether the issue lies with the test CD-ROM or with the drive itself.
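To make the retest policy concrete, here is a minimal sketch of the scoring logic in Python. The run_test hook and the attempt cap are hypothetical illustrations; in practice each run is a manual benchmark pass, and read errors are identified by ear as described above.

```python
# Sketch of the low-level retest policy: keep running the test until
# three error-free runs are collected, then report their average.
# A drive that cannot stop erroring receives no score at all.

def score_low_level_test(run_test, max_attempts=10):
    """run_test() is a hypothetical hook that performs one benchmark
    run and returns (score, had_read_error). max_attempts is an
    illustrative cap; SR does not state a hard retry limit."""
    clean_scores = []
    for _ in range(max_attempts):
        score, had_read_error = run_test()
        if not had_read_error:
            clean_scores.append(score)
            if len(clean_scores) == 3:
                return sum(clean_scores) / 3.0  # average of 3 clean runs
    return None  # repeated read errors: no score for this test
```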

In the CD-ROM WinMark test, however, we are measuring application-level performance. The WinMark test runs through scripted versions of eight applications (hence requiring ZD's CD WinBench 99 disc), in much the same way one would actually run these programs at home or in the office. For the end-user, read errors must be lived with; if they occur, they affect the performance of the drive. And some discs interact differently with different drives. Thus, we run the CD-ROM WinMark test on each drive with four different CD WinBench 99 v1.1 discs and average the scores. In this way, each drive is tested with a variety of media, with a corresponding average score that takes this variety into account.

Interestingly, some reliable sources (as well as some readers emailing in) recommended making a CD-R copy of the benchmark CD and running from that. Indeed, this eliminates many read errors and generally produces "cleaner" results, but it introduces a new variable: not all optical drives are designed to read CD-R (and especially CD-RW) media at full speed. This has much to do with the relatively poor reflectivity of CD-R and CD-RW media compared to CD-ROMs. Because of this, we settled on CD-ROM media for our CD-ROM tests. In fact, we purposely made separate CD-R and CD-RW copies of our High Heat Baseball 2000 CD-ROM to measure each drive's ability to read such media during low-level testing.

CDTach98

CDTach98 likewise gave inconsistent results during testing. The results varied widely between runs, from the transfer rate tests to both the random and full-stroke access time tests. The transfer rate test, in particular, completes so quickly that it is obviously not measuring as thoroughly as CD WinBench 99 does. Mainly for this reason, we chose to pass up all but the Interface Burst Speed test in this benchmark. Many readers may be aware of the hard disk version of this benchmark, HDTach, and its relative uniqueness in measuring interface burst speeds.

Also, in testing two different CDTach98 CDs, I found they gave widely varying results when tested in the same drive. One disc was much "slower" than the other, despite no apparent read errors. I can only attribute this to some variation in the pressing of the two discs. One was much older than the other, so they were obviously pressed in different batches. For this reason, I standardized on the "faster" disc for any tests conducted with this CD-ROM; its results most closely mirrored expected results.

What was most fascinating about this phenomenon was that it was the older, less pristine disc that yielded the faster results. Perhaps it was pressed when the benchmark (in its most current revision) was brand new, which, as the name suggests, was a couple of years ago. I hypothesize that the master stamp (from which all CD-ROM copies are pressed) may have worn over the two or so years it has been used, and that today's pressings may have deviated somewhat from the initial ones as a result. Other factors to consider are the quality of the media used during the pressing process as well as how rigorously pressing procedures are followed.

Maintenance of Test CDs:

Because of the aforementioned problems, efforts are made to keep all test CDs in the best condition possible. The following steps are taken to do this:

  • If a test CD is not being used in a drive, it is always kept in a protective jewel case or sleeve.
  • Test CDs are regularly examined for dust, scratches and fingerprints, and cleaned carefully with a radial CD cleaner when necessary. The radial CD cleaner is itself examined regularly and replaced when the cleaning pad becomes dirty.
  • Care is taken to handle test CDs only from the edges and central hub.

Media Disclosure:

  • 4 CD WinBench 99 v1.1 CDs for CD-ROM WinMark tests.
  • High Heat Baseball 2000 CD for CD WinBench 99 low-level tests and disc copy test (619 MB; 3063 files; ~205 KB/file).
  • CD-R copy of High Heat Baseball 2000 for CD WinBench 99 low-level CD-R tests.
  • CD-RW copy of High Heat Baseball 2000 for CD WinBench 99 low-level CD-RW tests.
  • CDTach98 disc for interface burst tests and file copy test (file "cdtach2.dat" is 635 MB in size).
  • Audio CD: Yes, "Union" for DAE/burning tests (65:23.28 in length).
  • "Twister" DVD for DVDSpeed99 tests and DVD visual inspection tests
  • "The Matrix" DVD for DVD visual inspection tests
  • blank CD-R discs (TDK Certified Plus) for testing CD-R drive performance. 8X-rated discs are used for CD-R drives rated at 8X and slower. For faster drives, the appropriate speed-rated discs of the same brand will be used.
  • blank CD-RW discs (Verbatim DataLifePlus) for testing CD-RW drive performance. 4X-rated discs are used for CD-RW drives rated at 4X and slower. For faster drives, the appropriate speed-rated discs of the same brand will be used (pending technological advances).

Methodology:

Prior to installation in the testbed, each drive's jumper settings are checked. IDE drives are jumpered as Master. For SCSI drives, parity and termination are enabled if they aren't already, and the drive is set to SCSI ID #3.

Once the drive is in the testbed and the OS is loaded, I immediately go to System Properties (WIN+Pause/Break keyboard shortcut), click the Device Manager tab, double-click the CDROM entry, double-click the installed CD drive, click the Settings tab, and make sure the Sync Data Transfer box is checked and the Auto Insert Notification box is unchecked. Additionally, for IDE units, I make sure the DMA box is checked. Once this is done, the system is restarted.

Before any testing begins, I check each drive manufacturer's website for firmware updates. While there are some excellent firmware sites around the 'net, the authenticity of their firmware is often in question: files may be hacked or leaked in a beta state. We circumvent this issue by downloading only from the manufacturer's website. In my experience so far, some manufacturers have dedicated, regularly updated firmware sections detailing the changes in each revision, while others either have no firmware update section or provide updates only on request, as a means to resolve technical support issues. We will not request firmware updates for these drives unless there is a technical issue that prevents the drive from otherwise being accurately tested; in this event, the firmware must be publicly available. In the event that different firmware versions exist at other server locations around the world, the most recent US-based firmware revision will be used. Once the drive is flashed to the latest firmware revision, it is ready to be tested.

We conduct up to three sets of tests: all drives are run through the CD-ROM tests; CD-R and CD-RW drives are additionally run through burning tests; and DVD-ROM drives are run through the DVD tests.

CD-ROM Tests

CD-ROM Low-Level Tests: We use CD WinBench 99 v1.1 to measure access time, CPU utilization, and transfer rates. The High Heat Baseball 2000 CD-ROM is used in all drives to obtain data. As mentioned earlier in this article, we report the average of the first three error-free runs for each test. The system is rebooted between each run of each test, with the tests run immediately after the system finishes booting into the OS.

CDTach98 is used to measure interface burst speed, with the CDTach98 CD-ROM as the test disc. Per benchmark requirements, we minimize the drive's supplemental cache size and set the access pattern to "No read-ahead" within File System Properties. We report the average of three runs of the 16 KB portion of the test, with the system rebooted between each run. After the burst speed tests are completed, the drive is set back to "Large" supplemental cache size and the "Quad speed or higher" access pattern for all other tests.

CD-ROM Application-Level Tests: Using 4 different CD WinBench 99 v1.1 CD-ROMs, we run the CD-ROM WinMark test once with each disc and report the average of these four scores. In the event of questionable results, we re-run the tests to check for validity. The system is rebooted between each test.

To measure disc and file copy performance, we use a standard stopwatch, as well as the High Heat Baseball 2000 CD-ROM (for multiple-file disc copies) and the CDTach98 CD-ROM (for single, large file copies). Using Windows Explorer, we select the appropriate file(s), Copy, then Paste to the Test partition on the testbed. The stopwatch is started at the same time the Enter key is pressed (with the Paste function highlighted), and stopped when the Copy Dialog box disappears from the screen. The Test partition is reformatted between each run of each test.
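For readers who prefer scripted timing over a stopwatch, the measurement could be approximated in Python as below. This is only a rough analogue with hypothetical paths; it won't match the stopwatch procedure exactly, since it bypasses Windows Explorer and the Copy dialog endpoints described above.

```python
import shutil
import time

def time_disc_copy(src, dst):
    """Time a recursive copy from the disc (src) to the Test
    partition (dst). Returns elapsed seconds."""
    start = time.perf_counter()
    shutil.copytree(src, dst)  # dst must not already exist
    return time.perf_counter() - start

# Hypothetical example: E: is the optical drive, D:\test is a
# not-yet-existing target directory on the Test partition.
# elapsed = time_disc_copy("E:\\", "D:\\test")
# print(f"Copy took {elapsed:.1f} seconds")
```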

CD-R Read Tests: We use a CD-R copy of the High Heat Baseball 2000 CD-ROM, along with CD WinBench 99 v1.1 to measure access time and transfer rates in an effort to determine how each drive performs with CD-R media. Reported results are the average of the first three error-free runs. The system is rebooted between each test run.

CD-RW Read Tests: We use a CD-RW copy of the High Heat Baseball 2000 CD-ROM, along with CD WinBench 99 v1.1 to measure access time and transfer rates in an effort to determine how each drive performs with CD-RW media. Reported results are the average of the first three error-free runs. The system is rebooted between each test run.

Digital Audio Extraction Tests: CDSpeed99 v0.66 is used along with the audio CD Yes, "Union" to measure DAE speed and quality across the entire audio CD. Results are displayed in a transfer rate graph as well as in "X" ratings. Results reported are the average of 3 runs, with a system reboot between each run.
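For reference, "X" ratings for DAE are conventionally measured against the Red Book audio rate: 1X is 176,400 bytes/s (75 sectors/s × 2,352 bytes). Assuming CDSpeed99 follows that convention, a measured extraction rate converts to an X rating as sketched below.

```python
# 1X CD audio = 75 sectors/s * 2,352 bytes/sector = 176,400 bytes/s
CD_AUDIO_1X_BPS = 75 * 2352

def dae_x_rating(measured_bytes_per_sec):
    """Convert a measured DAE rate (bytes/s) to an 'X' rating."""
    return measured_bytes_per_sec / CD_AUDIO_1X_BPS

# Example: extracting at 1,411,200 bytes/s is an 8.0X rip.
print(round(dae_x_rating(1_411_200), 1))  # -> 8.0
```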

CDDAE99 is also used to audibly verify extraction quality. All audio tracks are ripped from the Yes CD to the Test partition on the testbed, both in a single session and individually. We listen to the first and last tracks (in .wav format) through our Grado SR60 headphones for any noticeable defects. Additionally, we spot-check any suspect areas by cross-referencing with the drive's DAE extraction graph; any areas of the graph with unexpected dips are listened to for quality issues. Finally, CDDAE99 itself reports any differences between the original and the extraction, and we focus on these areas if the benchmark flags them. These tests are done once per drive. After completion, the Test partition is reformatted to a clean state.
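Conceptually, CDDAE99's original-versus-extraction check is a sample-for-sample comparison of the ripped audio. The sketch below illustrates the idea with Python's wave module; the file names are hypothetical, and CDDAE99's actual comparison is internal to the tool.

```python
import wave

def count_mismatched_bytes(rip_a, rip_b, chunk_frames=44_100):
    """Compare two .wav rips of the same track and count bytes
    that differ (a coarse proxy for extraction errors)."""
    mismatched = 0
    with wave.open(rip_a, "rb") as a, wave.open(rip_b, "rb") as b:
        # channels, sample width, and rate must match for a fair diff
        assert a.getparams()[:3] == b.getparams()[:3]
        while True:
            fa = a.readframes(chunk_frames)
            fb = b.readframes(chunk_frames)
            if not fa and not fb:
                break
            mismatched += sum(x != y for x, y in zip(fa, fb))
            mismatched += abs(len(fa) - len(fb))  # length mismatch
    return mismatched

# Example with hypothetical file names:
# print(count_mismatched_bytes("track01_single.wav", "track01_indiv.wav"))
```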

CD-R Tests

Audio CD Burn Test - Easy CD Creator: We use Adaptec's Easy CD Creator v4.0x for all burning tests. We use TDK 12X media with 12X or slower writers, and Verbatim 16X media with 16X writers. We use a stopwatch to time how long it takes to image the Yes audio CD to the Test partition of the testbed, then to burn the image back to a blank CD-R. The test is run 3 times and average times are reported. The Test partition is reformatted and the system rebooted between each test.

Audio CD Burn Test - Nero: We use Ahead's Nero 5.5x for all burning tests. We use TDK 12X media with 12X or slower writers, and Verbatim 16X media with 16X writers. "Read audio data with sub channel" is disabled. We use a stopwatch to time how long it takes to image the Yes audio CD to the Test partition of the testbed, then to burn the image back to a blank CD-R. The test is run 3 times and average times are reported. The Test partition is reformatted and the system rebooted between each test.

Data CD Burn Test - Easy CD Creator: We use Adaptec's Easy CD Creator v4.0x for all burning tests. We use TDK 12X media with 12X or slower writers, and Verbatim 16X media with 16X writers. We use a stopwatch to time how long it takes to image the High Heat Baseball 2000 CD-ROM to the Test partition of the testbed, then to burn the image back to a blank CD-R. The test is run 3 times and average times are reported. The Test partition is reformatted and the system rebooted between each test.

Data CD Burn Test - Nero: We use Ahead's Nero 5.5x for all burning tests. We use TDK 12X media with 12X or slower writers, and Verbatim 16X media with 16X writers. We use a stopwatch to time how long it takes to image the High Heat Baseball 2000 CD-ROM to the Test partition of the testbed, then to burn the image back to a blank CD-R. The test is run 3 times and average times are reported. The Test partition is reformatted and the system rebooted between each test.
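As a sanity check on these stopwatch figures, the theoretical floor for the burn phase follows directly from the image size and the write speed: 1X CD writing moves 150 KB of user data per second, so a 12X burn of the 619 MB HHBB2K image cannot finish in less than about 352 seconds, before lead-in/lead-out overhead. A quick sketch of the arithmetic:

```python
# Theoretical minimum burn time, ignoring lead-in/lead-out overhead
# and any speed ramp-up (CAV/P-CAV writers start slower on inner tracks).
CD_DATA_1X_KBPS = 150  # 1X CD data rate in KB/s (Mode 1)

def min_burn_seconds(image_mb, write_speed_x):
    return image_mb * 1024 / (CD_DATA_1X_KBPS * write_speed_x)

# 619 MB image at 12X: about 352 s, just under six minutes.
print(round(min_burn_seconds(619, 12)))  # -> 352
```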

Burning Stress Tests:

  1. Unreal Tournament Intro Fly-by: we load up UT (640x480x32bpp) and allow the intro fly-by to run through twice to make sure it is loaded into RAM. Next we ALT-TAB to the desktop and set up the burning software to burn a 195 MB folder (the "Sounds" folder from our HHBB2K disc) from our hard drive's Test partition to a CD-R at the drive's maximum write speed. Immediately after the burn LED lights on the test unit, we maximize UT. When the burning software either completes the burn or errors out, it automatically minimizes UT to display its results. In the event of a failed burn, we lower the burner's write speed until we either get a successful burn or cannot lower the speed any further.
  2. WinBench 99 CPUmark99: we load the burning software and WinBench 99 into their own windows so that both applications are simultaneously accessible. We set up Easy CD Creator to burn the 195 MB "Sounds" folder from our hdd's Test partition to a CD-R at the drive's maximum speed. We then set up WB99 to run the CPUmark99 test. The burn test is started, and as soon as the burn LED lights up on the test unit, we start the CPUmark99 test (a CPU-load sketch follows this list). If the burn fails, we drop down to the next burn speed, until we either burn successfully or cannot lower the speed any further.
  3. Update: With the introduction of Nero Burning ROM 5.5x, our previous stress tests became obsolete. Nero's use of a RAM buffer and hard drive cache makes it virtually impossible to force a drive into a buffer underrun situation using the above two methods. Thus, to test any anti-coaster technology on reviewed drives, we simply hit CTRL-ALT-DEL and freeze the system until the drive's buffer is forced dry. After a significant period of time, we unfreeze the system and allow the burning to continue. We then check the completed burn to see if the data is fully readable.
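For reference, the CPU-load half of stress test #2 can be mimicked in software. The sketch below is a hypothetical stand-in for CPUmark99 (a real ZD benchmark that we run manually): it simply saturates every core with busy-loops while a burn is in progress. On the single-CPU Windows 98 testbed, one such process would suffice.

```python
import multiprocessing
import os
import time

def spin(seconds):
    """Busy-loop that keeps one CPU core saturated."""
    deadline = time.perf_counter() + seconds
    while time.perf_counter() < deadline:
        pass

def stress_cpu(seconds=60):
    """Saturate all cores for the given duration, standing in for
    running CPUmark99 while the burner's LED is lit."""
    workers = [multiprocessing.Process(target=spin, args=(seconds,))
               for _ in range(os.cpu_count() or 1)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()

if __name__ == "__main__":
    stress_cpu(60)  # run alongside a burn at maximum write speed
```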

CD-RW Tests

Data CD Burn Test - Easy CD Creator: We use Adaptec's Easy CD Creator v4.0x and blank CD-RW discs for all burning tests. For drives rated at 4X or slower, we use Verbatim DataLifePlus media. For drives rated at higher speeds, we use TDK high-speed CD-RW media. We use a stopwatch to time how long it takes to image the High Heat Baseball 2000 CD-ROM to the Test partition of the testbed, then to burn the image back to a blank CD-RW at the drive's maximum speed. The test is run 3 times and average times are reported. The Test partition is reformatted and the system rebooted between each test.

Data CD Burn Test - Ahead Nero: We use Ahead's Nero 5.5x and blank CD-RW discs for all burning tests. For drives rated at 4X or slower, we use Verbatim DataLifePlus media. For drives rated at higher speeds, we use TDK high-speed CD-RW media. We use a stopwatch to time how long it takes to image the High Heat Baseball 2000 CD-ROM to the Test partition of the testbed, then to burn the image back to a blank CD-RW at the drive's maximum speed. The test is run 3 times and average times are reported. The Test partition is reformatted and the system rebooted between each test.

DirectCD Packet-writing Tests:

  1. Format: using the fastest-rated CD-RW media the drive supports, we time how long a full format and a quick format each take in Adaptec's DirectCD 3.0x. All reported results are the average of three trials.
  2. Erase: we use DirectCD's CD-RW Eraser and time how long our test unit takes to blank a DirectCD-formatted CD-RW. We use the fastest-rated CD-RW media the drive supports. All reported results are the average of three trials.
  3. Folder Copy: using Windows Explorer, we time how long it takes to copy a 195 MB folder from our hdd's Test partition to an empty DirectCD-formatted CD-RW. This folder is the "Sounds" folder from our HHBB2K disc. We use the fastest-rated CD-RW media the drive supports. All reported results are the average of three trials.

InCD Packet-writing Tests:

  1. Format: using the fastest-rated CD-RW media the drive supports, we time how long a complete format takes in Ahead's InCD 2.x. All reported results are the average of three trials.
  2. Erase: we use Nero's CD-RW Eraser and time how long the test unit takes to blank an InCD-formatted CD-RW using the "Erase Entire CD-ReWritable" selection at the drive's maximum speed. We use the fastest-rated CD-RW media the drive supports. All reported results are the average of three trials.
  3. Folder Copy: using Windows Explorer, we time how long it takes to copy a 195 MB folder from our hdd's Test partition to an empty InCD-formatted CD-RW. This folder is the "Sounds" folder from our HHBB2K disc. We use the fastest-rated CD-RW media the drive supports. All reported results are the average of three trials.

DVD Tests

DVD Low-Level Tests: We use DVDSpeed99, along with the "Twister" DVD, to measure access time, transfer rate, and CPU utilization with DVD-Video. We conduct identical tests with the DVD Tach 98 2.51 disc for data results. All three tests are conducted in one run, with a reboot between runs, and the average of three runs is reported.
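Note that DVD "X" ratings use a different base than CD ratings: 1X DVD is 1,385 KB/s, roughly the throughput of a 9X CD-ROM. Assuming DVDSpeed99 reports against that base, a measured rate converts as sketched below.

```python
DVD_1X_KBPS = 1385  # 1X DVD data rate in KB/s
CD_1X_KBPS = 150    # 1X CD data rate in KB/s, for comparison

def dvd_x_rating(measured_kbps):
    """Convert a measured DVD transfer rate (KB/s) to an 'X' rating."""
    return measured_kbps / DVD_1X_KBPS

# Example: 8,310 KB/s is a 6X DVD read, moving data about as fast
# as a ~55X CD-ROM drive would.
print(round(dvd_x_rating(8_310), 1))  # -> 6.0
```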

DVD Playback: Using WinDVD, we visually examine playback for smoothness. For the "Twister" DVD, we watch scene 3 (the "wheat field flyby") and check for any pauses, skipping, or jerkiness. For "The Matrix" DVD, we watch scene 15 (the "Morpheus/Neo match"), checking for smooth and fluid playback. Each scene is watched once with subjective reactions reported.

This concludes our Optical Storage Testbed and Methodology Disclosure! Stay tuned for our first drive roundup, where we examine our first batch of CD-ROM drives in detail!

