About Meter Accuracy
In the area of video calibration, more nonsense has been written about color analyzers and what you need to get accurate readings than about any other subject I can think of. This is probably because some ethically challenged calibrators and software vendors have wanted those who lacked the ability to test their claims to purchase expensive equipment and services. I have dealt with some of these issues in the Video Calibration Myths, but this subject is important enough to warrant its own article.
The Binary Fallacy
The most widely repeated claim about what you need to obtain accuracy is what I will refer to as "The Binary Fallacy". This usually takes the form of simple, categorical statements about whether a particular meter is "accurate" and hence worth purchasing or relying on. Such statements are fallacious because accuracy is not a binary concept—simply accurate or inaccurate. Rather, accuracy is measured by the degree to which an instrument approaches some purely theoretical standard of perfection. Thus, an instrument's accuracy should always be understood incrementally. It is always a matter of degree.
Let's take one very popular front projector as an example to illustrate this point. The JVC RS1 was released in 2007, and in at least one way it was a revolutionary product. It was the first 1080p projector that offered a very high native contrast ratio and high brightness at the same time. Some DLP projectors could claim similarly high native contrast ratios, but only by employing a dual-iris system in the light path that substantially lowered light output. However, despite all of the initial enthusiasm for the projector, it was something of a one-trick pony. Yes, it offered a great black level, but that was about it. The gamma was too low, it lacked adequate custom grayscale controls, the LCoS panel design and optics resulted in a visibly less sharp image than many competing products, and, worst of all, it was designed with the most inaccurate color gamut of just about any display on the market. All of the primary and secondary colors were wildly oversaturated, and the RS1 provided no tools to fix it.
The most serious reaction to this was from those RS1 owners who just wanted to solve the problem. These folks were willing to try any number of creative approaches to mitigate the oversaturated colors. However, until the arrival of the very expensive Lumagen Radiance external video processor a few months later, there was no effective means to remedy the JVC's extremely exaggerated palette. Since then, much has changed. JVC subsequently released updated versions of the original RS1 that include a very effective color management system, and an affordable external color management solution is now within the means of anyone, or at least of anyone who can afford a $6,000 projector.
What does all of this have to do with assessing the accuracy of color analyzers? Well, the errors of the RS1 have been extensively documented. I have calibrated many of them myself. A common question among RS1 owners has been whether their meter was "accurate enough" to successfully fix the RS1's color problems. As perhaps you can see, this question falls prey to "The Binary Fallacy": it admits only two answers, "Yes" or "No", and neither is informative. What one should be asking instead is this: how much of the color error exhibited by my display can be remediated by a given color analyzer? Once this question has been answered, then it is up to the consumer to decide whether to use that device or to invest in a more expensive, and presumably more accurate, instrument. What is in question is a price/performance ratio, and not everyone will come to the same conclusion.
Using the RS1 as an example, here's what we know. The RS1's most offensive color error was in the green primary. It measured in the neighborhood of x0.296, y0.680. Obviously, there was some unit-to-unit variation, but this figure is quite representative of what one was likely to find. This is an error of 0.080 in the y axis relative to the Rec. 709 HD standard, or a CIE94 error of 8. I have had a lot of experience with affordable colorimeters, and I also own a reference 5nm spectroradiometer, so I have a means to test the colorimeters' accuracy against a known reference. Measurements rarely deviate from the reference by as much as 0.010. This translates into a CIE94 error of about 2. What this means is that even with the inherent inaccuracies of a tristimulus colorimeter, a calibrator can remove approximately 80% of the RS1's color error.
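The arithmetic behind that figure can be sketched in a few lines. This is my own back-of-the-envelope framing using the numbers quoted above (Rec. 709's green primary sits at y = 0.600); the simple ratio of meter uncertainty to display error is an illustration, not a formal colorimetric metric:

```python
# Numbers from the RS1 example above; the ratio framing is illustrative only.
REC709_GREEN_Y = 0.600     # Rec. 709 green primary, CIE y coordinate
RS1_GREEN_Y = 0.680        # typical stock RS1 measurement
METER_UNCERTAINTY = 0.010  # typical worst-case deviation of an affordable colorimeter

display_error = RS1_GREEN_Y - REC709_GREEN_Y   # the 0.080 error in the text
residual = METER_UNCERTAINTY                   # error the meter cannot resolve
fraction_removed = 1 - residual / display_error

print(f"display error (y): {display_error:.3f}")
print(f"fraction of error a calibrator can remove: {fraction_removed:.0%}")
```

By this crude ratio a bit under 90% of the error is correctable; judged by the CIE94 figures instead (1 − 2/8), about 75%. Either way the article's "approximately 80%" is a reasonable middle estimate.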
Could you do better? Sure. But be prepared to spend more money. So the question is not so much one of accuracy, but rather "What are you willing to pay for?" Are you satisfied paying $269 for a meter that removes 80-90% of the errors, or would you pay more to eliminate the remaining error? Of course, if you invest in a professional-quality tool you can remove 95%-99% of the visible errors. Notice, however, that the price/performance curve is not particularly consumer friendly. You have to pay thousands of dollars more for what is really just an incremental improvement in accuracy. For amateurs, the Display 3 is fine. For prosumers and dedicated amateurs, or even the professional just starting out, the Display 3 PRO is probably the best choice.
Spectroradiometers vs. Colorimeters
Another area where a lot of misinformation has been disseminated concerns the relative value of spectroradiometers and colorimeters. Because of their design, spectroradiometers can be more accurate than colorimeters. They are certainly less subject to degradation over time. However, there are several practical reasons to prefer a tristimulus colorimeter for most work.
But, the argument goes, spectroradiometers are significantly more accurate over a wider range of displays than colorimeters. This certainly can be true, but it is not always so. For example, the i1Pro, which is a true spectroradiometer, is generally no more accurate than a Display 3 PRO colorimeter. The reason for this is that to keep costs down the i1Pro is a 10nm design. This means that it samples the spectrum of light at half the frequency of reference-quality spectroradiometers. This clearly affects its accuracy.
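A toy illustration makes the sampling point concrete. Display spectra often contain narrow emission lines (lamp spikes a few nanometers wide), and a coarse wavelength grid can land its samples on the shoulders of such a line rather than its peak. The sketch below is my own simplified model, not X-Rite's analysis: it integrates a 4nm-wide Gaussian "spike" on 5nm and 10nm grids and compares each against a near-exact fine-grid reference. (Real instruments have an optical bandwidth rather than point samples, so this overstates the effect, but the direction is right.)

```python
import math

def spike(wl, center=546.0, fwhm=4.0):
    """A narrow spectral emission line modeled as a Gaussian (hypothetical)."""
    sigma = fwhm / 2.355  # FWHM -> standard deviation
    return math.exp(-((wl - center) ** 2) / (2 * sigma ** 2))

def integrate(step):
    """Rectangle-rule integral of the spike over 380-780 nm at the given step."""
    n = int((780 - 380) / step) + 1
    return sum(spike(380 + i * step) for i in range(n)) * step

fine = integrate(0.5)  # near-exact reference integral
print(f"5 nm grid error:  {abs(integrate(5) - fine) / fine:.1%}")
print(f"10 nm grid error: {abs(integrate(10) - fine) / fine:.1%}")
```

With the spike's peak falling between the 10nm grid points, the coarse grid captures only a fraction of the line's energy, while the 5nm grid stays within a few percent.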
And what about those reference 5nm spectroradiometers? First, they are very, very expensive—between 7 and 9 times as expensive as an i1Pro. Second, they can be relatively slow. Individual readings take anywhere from several seconds to a couple of minutes, depending on the instrument and the luminance of the target. So for all their accuracy, many are not very practical as field instruments for day-to-day use. Their most valuable use is as a laboratory reference instrument used to calibrate other less-accurate, but more practical, instruments.
Furthermore, the two biggest problems with colorimeters—inaccuracy and degradation over time as you expose the filters to variations in heat and humidity—can be remedied fairly simply. All that is required is that the colorimeter has been individually calibrated for a variety of different displays and that it is periodically recalibrated to maintain its initial accuracy. With this in mind, we offer the Display 3 PRO, a tristimulus colorimeter that offers much of the accuracy of a reference instrument without sacrificing affordability or practical utility.
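One common way such a per-display calibration works is a 3x3 matrix correction: measure the display's red, green, and blue primaries with both a reference spectroradiometer and the colorimeter, then solve for the matrix that maps the colorimeter's XYZ readings onto the reference's. The sketch below uses pure Python and hypothetical readings; the Display 3 PRO's actual factory procedure is not documented here, so treat this as an illustration of the general technique:

```python
def mat_inv3(m):
    """Invert a 3x3 matrix via the adjugate formula."""
    (a, b, c), (d, e, f), (g, h, i) = m
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    adj = [
        [e * i - f * h, c * h - b * i, b * f - c * e],
        [f * g - d * i, a * i - c * g, c * d - a * f],
        [d * h - e * g, b * g - a * h, a * e - b * d],
    ]
    return [[x / det for x in row] for row in adj]

def mat_mul3(p, q):
    """Multiply two 3x3 matrices."""
    return [[sum(p[r][k] * q[k][c] for k in range(3)) for c in range(3)]
            for r in range(3)]

# Columns are XYZ readings of the R, G, B primaries (hypothetical numbers).
reference = [[42.1, 21.5,  1.9],   # X row
             [35.2, 71.8, 11.4],   # Y row
             [18.9,  7.1, 98.2]]   # Z row
colorimeter = [[43.0, 22.4,  2.1],
               [35.0, 70.9, 11.9],
               [19.5,  7.6, 96.8]]

# Solve M such that M @ colorimeter = reference for this display.
M = mat_mul3(reference, mat_inv3(colorimeter))

def correct(xyz):
    """Apply the per-display correction to a future colorimeter XYZ reading."""
    return [sum(M[r][k] * xyz[k] for k in range(3)) for r in range(3)]
```

Once derived, the matrix is applied to every subsequent reading on that display type; periodic recalibration simply re-derives the matrix to compensate for filter drift.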
For those who want to read more on this subject, see this white paper from X-Rite.
So How Accurate are These Meters, Anyway?
I now have enough data on the stock Display 3 meter to provide a fairly clear picture of its level of accuracy relative to a known reference. I am seeing errors of about 0.006, sometimes a little lower and sometimes a little higher. Of course, errors may increase over time as the filters are exposed to changes in heat and moisture, which is why annual recalibration is recommended.