Contrast ratio indicates the range between the brightest point and the darkest point that a display can produce. For example, a ratio of 500:1 means that the brightest white is 500 times brighter than the darkest black. This measurement appears often in marketing language for both playback and recording devices, and like many of the “number wars” in the consumer electronics industry, it is complicated, and consumers should approach it with care.
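The arithmetic behind the ratio is simple division of two brightness measurements. A minimal sketch, using hypothetical luminance readings (the numbers below are illustrative, not from any real display):

```python
# Hypothetical luminance readings in cd/m^2 (nits); in practice these
# come from a light meter aimed at the screen.
white_luminance = 250.0   # brightest white the display produces
black_luminance = 0.5     # darkest black the display produces

# The contrast ratio is simply the quotient of the two readings.
contrast_ratio = white_luminance / black_luminance
print(f"{contrast_ratio:.0f}:1")  # -> 500:1
```

Note that the ratio says nothing about absolute brightness: a dim display and a bright one can report the same 500:1 figure.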
Ideally, contrast ratios would be tested under identical conditions, using identical procedures, with carefully calibrated equipment and neutral professionals administering the test. Unfortunately, this is not the case: companies measure contrast under varying conditions and with varying methods, so the figures can be extremely variable, difficult to reproduce, and not always reliable.
The most popular method is full on/full off, in which a display shows an all-white and then an all-black image, usually in total darkness. This produces the biggest number, because the contrast is at its most extreme, but these conditions are rarely seen in the real world. Some companies instead use methods such as a checkerboard of white and black squares, which provide more realistic conditions and therefore a more representative contrast figure. The method used is not always disclosed by the company, however, which can make it difficult to judge the reliability of a stated measurement.
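The difference between the two methods can be sketched in a few lines. In the checkerboard approach, the white and black patches are measured on the same mixed image, so light bleeding from the white squares raises the black readings and lowers the ratio; the readings below are hypothetical:

```python
def full_on_off_contrast(white_nits, black_nits):
    """Full on/full off: one all-white reading divided by one all-black reading."""
    return white_nits / black_nits

def checkerboard_contrast(white_readings, black_readings):
    """Checkerboard-style: average the white squares and the black squares
    measured on the same mixed image, then divide. Stray light from the
    white squares raises the blacks, so the result is usually much lower."""
    avg_white = sum(white_readings) / len(white_readings)
    avg_black = sum(black_readings) / len(black_readings)
    return avg_white / avg_black

# Hypothetical readings in cd/m^2 for the same display:
print(full_on_off_contrast(250.0, 0.5))            # -> 500.0
print(checkerboard_contrast([240.0] * 8, [1.5] * 8))  # -> 160.0
```

The same display yields 500:1 by one method and 160:1 by the other, which is why an undisclosed test method makes the headline number hard to interpret.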
Theoretically, the higher the contrast ratio, the better the quality of the display, but displays are also affected by the conditions in which they are used. A television, for example, will show greater contrast in a dark room than in a bright one. The quality of the material being displayed can also have an impact, as a poor recording will look bad even on the best display. Furthermore, the human eye's ability to discern contrast and detail is limited, which means that two displays with different ratios can look very similar to the average consumer.
In addition to being important for displays, this ratio also affects the quality of recording devices like cameras. If the ratio is high, the device can reproduce high levels of contrast, creating cleaner, crisper, better-quality images. With a low ratio, quality will also be lower, and it is usually impossible to clean up or improve the image, because the necessary data was never captured in the first place.
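Why the missing data can't be restored in editing can be shown with a toy sensor model (the function and its limits are hypothetical, chosen only to illustrate clipping):

```python
def capture(scene_nits, sensor_max):
    """Toy sensor: any brightness beyond sensor_max is clipped to that
    ceiling, and the original value can never be recovered afterward."""
    return [min(level, sensor_max) for level in scene_nits]

scene = [10, 200, 1500, 9000]              # real-world brightness levels
recorded = capture(scene, sensor_max=1000)
print(recorded)  # -> [10, 200, 1000, 1000]
# The two brightest levels are now identical in the recording; no amount
# of post-processing can tell them apart again.
```

The same happens at the dark end: shadows below the sensor's floor all record as the same black, so detail there is lost as well.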
@nony - I don’t know about your monitor contrast ratio, but given your explanation I think a higher contrast ratio would have been best, and made the image comparable to what you were seeing in the viewfinder of your camcorder or in your television set.
I always go for the highest contrast ratios I can get with my computer monitors, and in that sense I’ve stuck with a lot of the big name brands. They are more expensive, but they’re worth it in my opinion.
That being said, there are devices you can buy that will help you calibrate your monitor so that it’s as close as possible to your television. The device is aimed at the monitor (it’s kind of like a scanner) and it comes with software too so that you can make the adjustments easily.
I did some video editing for a friend once. He came over and we looked at the finished footage. It looked a little dark so we enhanced it in the video editing software until we thought it looked right.
I didn’t realize that the contrast ratio in my monitor could be drastically different from that of a camcorder or even a television screen. When we exported the video back out to tape and he brought it to work to show his boss (it was for a work project), instead of being too dark it was too white – nearly washing out the colors.
So my word of advice is: if you’re doing video editing, be sure to compare the footage with what’s actually on the TV screen (or whatever the target device will be) to make sure it looks the way you want it to.