Hardware vs Software Monitor Calibration

Monitor calibration produces a set of curves, one for each of the three color channels. These curves are responsible for bending and twisting the device’s native color to reach our calibration goal. Where these curves are stored is the main differentiator between regular and so-called “hardware-calibrated” monitors.

Regular monitors depend on the computer’s video chip to store the curves. Hardware-calibrated monitors store the curves inside the monitor’s look-up table (LUT).
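Conceptually, applying such curves is nothing more than a per-channel table lookup. Here’s a minimal sketch of the idea in Python (the LUTs below are hypothetical placeholders, not real calibration data):

```python
import numpy as np

# Hypothetical 8-bit calibration curves, one per channel. An identity
# LUT (np.arange(256)) leaves the channel untouched; a real curve
# bends the values toward the calibration target.
lut_r = np.arange(256, dtype=np.uint8)
lut_g = np.arange(256, dtype=np.uint8)
lut_b = (np.arange(256) * 0.95).astype(np.uint8)  # e.g. a slightly tamed blue

def apply_calibration(image):
    """Run an (H, W, 3) uint8 image through the per-channel 1D LUTs."""
    out = image.copy()
    out[..., 0] = lut_r[image[..., 0]]
    out[..., 1] = lut_g[image[..., 1]]
    out[..., 2] = lut_b[image[..., 2]]
    return out
```

Whether this lookup happens in the video card or inside the monitor is exactly the difference discussed here.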

[Image: video LUT vs. monitor LUT calibration curves]

Above are the calibration curves for my current setup as shown by the ColorEyes Display Pro calibration software. On the left are the curves for my Retina MacBook Pro’s internal display, while on the right are the curves for the EIZO CG241W monitor. Note that this software puts the curves either in the video LUT or in the monitor LUT – but not both. Other packages, such as basICColor Display, tend to utilize both for hardware-calibrated monitors.

While the video card stores these curves at 8-bit, my EIZO’s internal curves are stored at 12-bit. At a higher bit depth calibration is more precise, which virtually eliminates color banding and seepage. Hardware-calibrated monitors also store the curves permanently (until the next calibration, of course).
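A quick back-of-the-envelope sketch shows why the extra bits matter: pushing a correction curve through an 8-bit LUT collapses neighboring levels (visible as banding), while a 12-bit LUT keeps all 256 input levels distinct. The curve below is a made-up example, not a real calibration curve:

```python
import numpy as np

# A hypothetical correction curve over 256 input levels:
x = np.linspace(0.0, 1.0, 256)
curve = x ** (2.2 / 2.4)

# Quantize the output to 8 and 12 bits and count surviving levels:
unique_8 = len(np.unique(np.round(curve * (2 ** 8 - 1))))
unique_12 = len(np.unique(np.round(curve * (2 ** 12 - 1))))
print(unique_8, "distinct levels after an 8-bit LUT")   # fewer than 256
print(unique_12, "distinct levels after a 12-bit LUT")  # all 256 remain
```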

Calibration software loads the video LUT as part of the calibration process. But what happens when the computer is rebooted, or turned off and on again the next day? Unfortunately, video card hardware does not store calibration curves and re-apply them automatically on startup. So the question remains: where to store them, and who will reload them?

Apple invented a fairly obvious solution to answer this question: embed the calibration curves into the display’s ICC profile, so that the two can be handled together as a single entity. Because the ICC profile specification did not provide any storage space for calibration data, Apple created a new profile tag, the infamous video card gamma table (VCGT). To complete the solution, ColorSync loads these curves when needed. Calibration packages support this by embedding the newly computed calibration curves into the profiles they create.
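For the curious, the vcgt tag can be spotted with a few lines of code. This is a minimal sketch based on the ICC container layout (128-byte header, 4-byte big-endian tag count, then 12-byte tag entries); the profile path is only an example:

```python
import struct

def has_vcgt(profile_path):
    """Return True if the ICC profile contains Apple's 'vcgt' tag."""
    with open(profile_path, "rb") as f:
        data = f.read()
    (tag_count,) = struct.unpack_from(">I", data, 128)   # after the header
    for i in range(tag_count):
        sig, offset, size = struct.unpack_from(">4sII", data, 132 + 12 * i)
        if sig == b"vcgt":
            return True
    return False

# Hypothetical usage:
# print(has_vcgt("/Library/ColorSync/Profiles/MyDisplay.icc"))
```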

Windows 7 and above also sport a video card LUT auto-loading feature, but it isn’t as obvious as on a Mac. I would recommend reading my old post about the topic.

In the next installment of my monitor calibration series I’ll talk about what one can reasonably expect from proper calibration and profiling.


Keeping OS X Display Brightness Unchanged

In the previous installment in my monitor calibration series, I mentioned the need to keep monitor brightness unchanged after calibration – as any change to it invalidates the profile.

But what if I press the brightness control buttons on the Retina MacBook Pro (or on any other MacBook)? Should I immediately re-calibrate and re-profile? Well, there’s a slick trick.

[Image: MacBook brightness control keys]

By default, MacBook display brightness changes in whole-unit steps when using the brightness keys. But holding down Shift + Option while pressing them switches to 1/4-unit steps – the same increment ColorEyes Display Pro (and other software) uses when controlling the display.

What I usually do after calibration and profiling is increase the brightness by 1/4 unit, take note of the (previous) value, and immediately decrease it back to where the software set it. This way I can return the display to the calibrated state even if I have to change its brightness – or change it accidentally.

Another enemy of keeping the calibration intact is the display dimming preference of OS X, which tells the machine to slightly dim the display while running on battery power. It might be useful for users not requiring color accuracy and consistency, but turn it off for calibrated displays (by default it’s on).

[Image: the OS X display dimming preference]

In the next installment I’ll examine the differences between hardware and software calibration.

Monitor Calibration vs Profiling

The photographic industry uses the terms “calibration” and “profiling” somewhat interchangeably, but they denote two completely different concepts. So in this post I’d like to shed some light on what is what. This is the first in a series of posts on the subject of monitor calibration.

Calibration is the process of bringing a device into a known working condition. In the case of monitors it consists of setting the device’s black point, white point and tone reproduction curve (sometimes incorrectly called gamma). The exact values for the calibration parameters depend on the actual application, but let me explain what they really mean, as well as what their recommended values or value ranges are.

The white point’s luminance controls how bright the display will be. One should choose a value according to the output’s intended viewing environment. 80 cd/m2 is widely used in the printing industry, but not every monitor is capable of reaching this luminance, or of working there without ill effects. Some recommend 100-120 cd/m2 for brighter working environments. Output is the king, so one should set the working environment to match the output’s needs. Note that values over 120 cd/m2 will cause eye fatigue pretty fast. Top end EIZO displays, and even the displays in Retina MacBook Pros, have no problem working at 80 cd/m2. Actually, the MacBook’s display sits at a bit over 50% of its brightness range at 80 cd/m2.

The white point’s color temperature controls how blue or yellow whites will be. The printing standard is D50 (5003K), where all wavelengths are roughly equally present. This is a good match for natural white papers, but brighter papers, loaded with optical brighteners, might ask for a “bluer” white with a higher color temperature, such as D65 (6504K). Ideally one should measure the color temperature off the papers. Regular monitors, especially those with bluish LED backlights, have a hard time reaching D50: the blue channel is lowered too much and posterization kicks in. The Retina MacBook Pro, for example, can be used at D50.
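For reference, what D50 and D65 mean in measurable terms can be derived from the CIE daylight locus. A small sketch (the polynomial below is the published CIE approximation, valid for 4000-7000K):

```python
def daylight_chromaticity(cct):
    """CIE daylight locus chromaticity for 4000 K <= CCT <= 7000 K."""
    x = (-4.6070e9 / cct ** 3 + 2.9678e6 / cct ** 2
         + 0.09911e3 / cct + 0.244063)
    y = -3.000 * x * x + 2.870 * x - 0.275
    return x, y

print(daylight_chromaticity(5003))  # D50 -> approx. (0.3457, 0.3586)
print(daylight_chromaticity(6504))  # D65 -> approx. (0.3127, 0.3291)
```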

The black point controls how blacks will be handled. Its luminance should be set to the minimum, to utilize the monitor’s black separation capability as much as possible. ColorEyes Display Pro – the monitor calibration software I use – has a unique setting controlling how blacks are rendered. Every output device, be it a display or a printer, plugs the shadows to some degree. Relative black rendering tells ColorEyes Display Pro to find the lowest value where there’s still detail in the blacks and create the profile to map 0/0/0 there. Actually, most calibration/profiling apps do this. But this not only makes matching multiple monitors almost impossible, it also causes problems with print preparation, because printers also tend to plug deep shadow detail.

The other choice in ColorEyes Display Pro is absolute black, which maps 0/0/0 to absolute black (you know, the proverbial black cat in an unlit coal mine, in the middle of a moonless night). It results in more precise profiles, not to mention that it makes multi-monitor matching possible.
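To make the two rendering choices concrete, here is a toy sketch of the mappings involved (the 0.031 black level is hypothetical, roughly corresponding to 8/8/8 on an 8-bit scale):

```python
def relative_black(v, black=0.031):
    """Scale the tonal range so 0.0 lands on the display's lowest
    level that still shows detail (what relative rendering aims for)."""
    return black + (1.0 - black) * v

def absolute_black(v):
    """Map 0.0 straight to absolute black; values below the display's
    native black simply plug, just like deep shadows do on paper."""
    return v
```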

It’s a fortunate coincidence that both my EIZO and my MacBook Pro’s Retina display start to plug shadows where my favorite Hahnemühle papers do (around 8/8/8, the EIZO being a bit better and the Retina a bit worse), so using absolute black rendering I can see on the monitor how the paper will behave.

Digital files represent image data quite differently than humans see it. For a digital file (and also for sensors) the brightest stop contains half of the numeric values usable at a given bit depth (e.g. 128 for an 8-bit file, 32768 for a 16-bit file). The second stop contains a quarter of the values, and so on. This presents two problems: 1) darker parts of the image get only a few distinct levels, resulting in posterization; 2) it would be quite difficult to work on these files due to the lack of “perceptual uniformity” – that is, 128/128/128 is not twice as bright as 64/64/64. The role of the tone reproduction curve (TRC) is to map file values into humanly “processable” values. Human perception follows a power law, so a single exponent can attain a surprisingly good approximation of the “ideal TRC” of human vision. This exponent is known as the gamma, and the mapping using the TRC is called gamma correction.
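The arithmetic is easy to verify. Counting the values that fall into each stop of a linearly encoded 8-bit file shows how quickly the shadows starve for levels:

```python
bit_depth = 8
total = 2 ** bit_depth  # 256 values

# Each stop down halves the available range:
for stop in range(1, 6):
    hi = total // 2 ** (stop - 1)
    lo = total // 2 ** stop
    print(f"stop {stop}: values {lo}..{hi - 1} -> {hi - lo} levels")
# stop 1: values 128..255 -> 128 levels
# stop 2: values  64..127 ->  64 levels, and so on
```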

A gamma value of 2.2 gives a pretty good approximation of the ideal TRC, but plugs the shadows. To compensate for this, the sRGB TRC was invented, which uses a 2.4 exponent combined with a linear section in the deep shadows (it serves as the TRC of Lightroom’s internal Melissa RGB working space). You can safely forget about gamma 1.8, a relic from the ages when Apple computers and printers used to have an internal gamma greater than 1.0. Well, the simple gamma TRC itself is a relic from times when computers were slow and storing and working with a few kilobytes of curve data was infeasible.
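The sRGB TRC is fully specified, so it can be written down exactly; here it is next to a plain gamma 2.2 for comparison:

```python
def srgb_encode(linear):
    """sRGB OETF: a short linear toe joined to a 2.4-exponent power
    curve; overall it behaves like roughly gamma 2.2."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

def gamma22_encode(linear):
    """Plain gamma 2.2 for comparison; note the difference near black."""
    return linear ** (1 / 2.2)

print(srgb_encode(0.001), gamma22_encode(0.001))  # shadows treated differently
```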

Nowadays we have a markedly better solution, called the L* curve. It is the most precise approximation of the ideal TRC, and doesn’t present a problem for today’s (or even yesterday’s) machines. So I highly recommend using the L* TRC. I moved to a completely L*-based workflow (including RGB and gray working spaces) five years ago, and never looked back. But that is another story.
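The L* curve is likewise a published formula (it is simply the lightness component of CIE Lab); a sketch, scaled to the 0..1 range:

```python
def lstar_encode(linear):
    """CIE L* transfer curve, scaled to 0..1 (L* itself runs 0..100):
    a cube root above the cutoff, a linear segment below it."""
    if linear > 216 / 24389:               # cutoff: (6/29)**3
        lightness = 116 * linear ** (1 / 3) - 16
    else:
        lightness = (24389 / 27) * linear  # ~903.3 * Y
    return lightness / 100.0
```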

Profiling is the process of measuring the device’s color reproduction capabilities and creating an ICC profile file. In the case of monitors it simply measures and stores the red/green/blue primaries and the color temperature of white. Well, the profile might also contain calibration curves (my ages-old post explains this Apple invention in detail – it’s about Windows, but you’ll find useful information in the “Issue” section even if you happen to use a Mac).
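As a sketch of what a matrix-based display profile boils down to: from the measured chromaticities of the primaries and the white point, one can construct the RGB-to-XYZ matrix the profile stores. The values below are rough sRGB-like placeholders, not measurements:

```python
import numpy as np

def rgb_to_xyz_matrix(xy_r, xy_g, xy_b, xy_w):
    """Build an RGB->XYZ matrix from primary and white chromaticities."""
    def xyz(xy):  # chromaticity (x, y) -> XYZ column with Y = 1
        x, y = xy
        return np.array([x / y, 1.0, (1 - x - y) / y])
    primaries = np.column_stack([xyz(xy_r), xyz(xy_g), xyz(xy_b)])
    # Weight the primaries so that R=G=B=1 lands on the white point:
    scale = np.linalg.solve(primaries, xyz(xy_w))
    return primaries * scale

M = rgb_to_xyz_matrix((0.64, 0.33), (0.30, 0.60), (0.15, 0.06),
                      (0.3127, 0.3290))
print(M @ np.ones(3))  # full RGB white -> the white point's XYZ
```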

Calibration and profiling walk hand in hand. There’s no point in calibrating a monitor without profiling it, and vice versa: profiling without prior calibration is an exercise in futility. Why big-name measurement device manufacturers market low-end devices that are incapable of doing proper calibration is beyond me. Avoid these at all costs.

There’s an important consequence: one can’t freely change the monitor’s brightness and/or contrast after calibration, as it would invalidate the calibration and the profile built on top of it.

In the next installment I’ll discuss how OS X can be made a cooperating partner in keeping these constant.

Why ColorBase?

After my recent post about the new ColorBase version, a friend asked the question: “why is it better than factory calibration?” I thought this could be interesting to other people, so here’s my (longish) answer.

Some background first. In the grand scheme of things, building a color profile for a device is a two-step process. The first step is calibration, which sets the basic operating parameters of the device to a well-known (sometimes standardized) default. In the case of monitors, calibration sets the black level, white luminance, color temperature and tone reproduction curve. In the case of printers, it makes the relationship between color values and the actual amount of ink laid down linear – which is why this step is called linearization. The second step is the actual profiling. Here the software determines the color reproduction characteristics of the device and creates the profile.

On the low end, manufacturers tend to skip the calibration step, doing only the profiling. This is a nasty trick, and the reason why I think that cheap colorimeter packages that can’t do the calibration step are downright dangerous and actually worth nothing. On the high end, profile making is always preceded by calibration.

Speaking of printers: the lack of calibration (linearization) is less noticeable here, because profiling packages do a linearization step under the hood before starting to build the profile. This is not as accurate as a separate step, however (“true” linearization controls parameters in the rasterization process, whereas the “simulated” kind merely plays with the color values). So for printer profiles, linearization is more or less taken care of.
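The essence of linearization can be sketched in a few lines: print a step wedge, measure what actually comes back, and pre-distort future inputs by inverse interpolation. The 1.35 exponent below is a made-up printer response:

```python
import numpy as np

# Hypothetical measurement of a step wedge (0 = paper white, 1 = max ink):
inputs = np.linspace(0.0, 1.0, 11)
measured = inputs ** 1.35  # a printer that darkens too fast

def linearize(value):
    """Find the input that produces the desired measured tone,
    by inverse interpolation of the wedge measurements."""
    return np.interp(value, measured, inputs)

corrected = linearize(0.5)
print(corrected, corrected ** 1.35)  # ~0.60 in, ~0.5 measured out
```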

My favorite example for showing color reproduction differences across devices is the TV department of your favorite electronics store. Almost every single set displays the same content differently. Consumer printers are the same: take two Epson 2880s, and they will print different colors. In the case of professional Epsons, all the devices are “factory calibrated” to be as identical as possible when they leave the factory. But this does not mean that they will not drift over time! And because of this drift (and the inherent differences between consumer models) you’ll have to re-create all the profiles from time to time – which can be a daunting task.

To be able to decide whether your device has drifted out of tolerance, high-end profiling packages provide a validation tool that measures the color reproduction accuracy of the calibrated/profiled device. This way you can check the status periodically and recalibrate/re-profile as needed – instead of doing it blindly every month or so.
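What such a validation tool reports is, at its core, a set of color differences between reference and re-measured values. The simplest metric is CIE76; the patch values below are invented for illustration:

```python
import math

def delta_e76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in Lab space."""
    return math.dist(lab1, lab2)

reference = (50.0, 10.0, -20.0)   # hypothetical reference patch
measured = (50.8, 9.1, -18.9)     # hypothetical re-measurement
print(f"dE76 = {delta_e76(reference, measured):.2f}")  # ~1.63
```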

Epson’s ColorBase is both a tool for linearizing the printer driver and a validation tool for checking the linearization accuracy. A welcome extra is that it can do this for higher-end consumer printers. So one can utilize ColorBase in two different ways:

  • Use it to measure accuracy, and redo the complete linearization/profiling for each of the papers when the accuracy has drifted. This can still be daunting with several papers, but it provides the utmost precision.
  • Use it to measure accuracy, but only redo the linearization if the printer has gone out of spec. Because ColorBase returns the printer to the state it was in before the profiles were created, there are pretty good chances that the profiles will remain accurate.

I have been using the second method for five years with great success. The longest period the printer stayed in spec reached two years with my late 4800. This demonstrates that we are talking more about peace of mind and process control here than about visible results. This stuff is about catching problems before they ruin several prints.

And what’s the difference between factory calibration and ColorBase? They are actually two different things. Factory calibration is for making sure that pro printers are identical when they leave the assembly line, whereas ColorBase is a tool for employing process control.

I must mention two glaring omissions in the package, however: ink limiting and support for third-party papers. You can control ink within the printer driver to some extent, but this should really be done in the linearization step. Over-inking can be a serious problem when using the driver with some papers. The lack of third-party paper support can be worked around by linearizing the printer to the Epson paper selected as the media type for the third-party paper (for example, Velvet Fine Art in the case of Hahnemühle Photo Rag). You will not have a linearization for Photo Rag itself (which would be the desirable outcome), but at least you’ll be able to build its profile on a solid and consistent base.

If you need ink limiting and linearization for custom papers then moving to a RIP is the only solution these days.