Notes on Gamma
Gamma is a blight, a curse, and utterly annoying. Ever since somebody told me that RGB colours need to be Gamma corrected, RGB colours were spoilt for me. Gamma does to digital colour what kerning does to typography. This post is an attempt to get even with Gamma.

Lost Innocence

Because, you see, only once I became fully aware of Gamma did things really start to fall apart. In my pre-gamma-aware innocence, I must have done some things right. Let me show what I mean:

Here, I generate a linear gradient using a GLSL fragment shader, say, by drawing a full-screen quad using this shader code snippet:

```glsl
vec3 color = vec3(uv.x);        // increase brightness linearly with x-axis
outFragColor = vec4(color, 1);  // output to RGB swapchain image
```

And voilà – a perceptually linear gradient.

A perceptually linear gradient, innocently created

But now, let's get clever about this: We notice that we're actually drawing to an sRGB monitor (most desktop monitors are, nowadays), so we should probably use an sRGB image for the swapchain (these are the most common 8 bpc (read: "bits per channel") swapchain image formats in Vulkan). Thus we somehow need to convert our RGB colours to sRGB. The most elegant way to do this is to use an image attachment with an sRGB format, which (at least in Vulkan) does the conversion automatically and precisely on texel store (a sketch of that conversion follows at the end of this section).

If we draw the same shader as before, but now into an sRGB swapchain, the swapchain image's non-linear sRGB encoding should correct for the sRGB monitor's gamma. The gradient's brightness values (as measured by a brightness meter) should now increase linearly on-screen. They do. That should look like a linear gradient, right? Wrong. Instead, we get this:

A perfectly 'linear' gradient

This doesn't look linear: shades don't seem to claim equal space. Instead, it looks as if dark shades are too bright, while bright shades wash out. Here's what I'd expected to see:

A perceptually linear gradient

The difference is subtle, but look at both gradients and ask yourself: which seems to have more detail? Which is more balanced? Even though the first gradient is more linear in terms of physical brightness, the second one looks more linear. I find this counter-intuitive. But where intuition fails, ratio may help; and there is indeed a rational explanation: visual perception is non-linear.
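As promised above, here is roughly the conversion an sRGB-format attachment applies on texel store: the standard sRGB encoding curve. This is a minimal GLSL sketch of my own (the helper name linearToSrgb is made up for illustration, it is not something the post or the Vulkan API defines):

```glsl
// Sketch of the piecewise sRGB encoding (per IEC 61966-2-1) that an
// sRGB-format attachment applies automatically when a texel is stored.
// 'linearToSrgb' is a made-up helper name, purely for illustration.
float linearToSrgb(float c) {
    return (c <= 0.0031308)
        ? 12.92 * c                           // linear toe near black
        : 1.055 * pow(c, 1.0 / 2.4) - 0.055;  // power segment elsewhere
}

vec3 linearToSrgb(vec3 c) {
    return vec3(linearToSrgb(c.r), linearToSrgb(c.g), linearToSrgb(c.b));
}
```

Writing this manually before storing into a plain UNORM target should give (almost exactly) the same on-screen result as letting the sRGB-format attachment do the conversion in hardware.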
And You See The Ones In Darkness/Those In Brightness Drop From Sight

Sorry, Bertolt! The human eye can distinguish more contrast detail in darker shades. You don't have to take my word for it; instead take that of Dr. Charles Poynton – the person who gave HDTV square pixels and the number 1080. Here is a diagram of how our perception tends to respond to changes in lightness, which I found in his dissertation:

CIE Lightness (Poynton 2018, p. 17). Note how perception biases upwards around darker shades.
CIE Lightness, when defined as a (relative) curve of just noticeable differences, fits a power function with an exponent of 0.42 very nicely, with a small linear segment below a relative luminance of about 1%. This is fine. Nature. It probably helped our ancestors to survive or something. And it does explain why our physically linear gradient looked too bright in its dark areas.
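For reference (my addition, not from the post itself): the CIE 1976 lightness curve that this power law approximates is defined piecewise, with exactly such a small linear segment near black:

$$
L^{*}(Y_r) \;=\;
\begin{cases}
\dfrac{24389}{27}\, Y_r & \text{if } Y_r \le \dfrac{216}{24389} \approx 0.9\% \\[1ex]
116\, Y_r^{1/3} - 16 & \text{otherwise}
\end{cases}
\qquad \text{with } Y_r = \frac{Y}{Y_n}, \qquad
L^{*} \approx 100\, Y_r^{0.42}
$$

Over most of the range the cube-root curve and the 0.42 power fit stay close to each other, which is why the single exponent is a handy shorthand.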
Let's draw a diagram:

A linear gradient getting biased
in black: the biases
in red: the signal at this point in the chain

Check the Bias

Whenever we want to draw a perceptually linear gradient, we must remember to pay our dues to evolution and factor in this perceptual bias. If you want the appearance of a linear gradient, you must display a non-linear gradient, one that tones down the darker parts. Effectively, you want to apply the inverse of the non-linearity that perception introduces (a shader sketch of this follows at the end of the next section). Confusingly, this may be done automatically for you if you forget to do any gamma correction:

Two-Penny Gamma Correction

If you have an sRGB monitor and you innocently don't do any sRGB correction (by rendering into a linear RGB framebuffer such as FORMAT_R8G8B8A8_UNORM, for example), linear gradient values will get biased by the monitor's gamma response alone – and the result will be a gradient that looks "about linear".

Our perceptually linear gradient, innocently created

It looks "about linear" because, while the monitor will "gamma" the gradient, your eye will "de-gamma" it again – and since these two non-linear effects on the signal (monitor, eye) are almost inverses of each other, we end up with a linearly perceived signal.
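To make the "Check the Bias" idea concrete, here is the sketch promised above (mine, not code from the post, and the exponent is only a rough compensation): if you render into an sRGB swapchain image, so that on-screen brightness tracks the written value roughly linearly, you can pre-distort the ramp with the inverse of the eye's ~0.42 power response to make it look perceptually linear:

```glsl
// Sketch: rendering into an sRGB-format swapchain image, where the written
// value ends up roughly proportional to physical brightness on screen.
// To make the ramp *look* linear, counteract the eye's ~x^0.42 response by
// raising the linear ramp to roughly 1/0.42 (about 2.4) before writing it.
float ramp   = pow(uv.x, 1.0 / 0.42);  // tone down the dark shades
vec3  color  = vec3(ramp);
outFragColor = vec4(color, 1.0);       // automatic sRGB encoding on store
```

Writing uv.x unmodified into a UNORM target (the "two-penny" path) gets you most of the way to the same result, since the monitor's gamma is already close to this inverse.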
And here's a cool thing: This was by design!

"The nonlinearity of a CRT is very nearly the inverse of the lightness sensitivity of human vision. The nonlinearity causes a CRT's response to be roughly perceptually uniform. Far from being a defect, this feature is highly desirable."

– Poynton 2018 (p. 25), Colour Appearance Issues in Digital Video, HD/UHD, and D-cinema
('CRT' here means 'Cathode Ray Tube', but the same gamma was kept for LED and OLED screens because it was so useful…)

Why is this desirable? A big reason for encoding images in sRGB is that, because of sRGB's perceptual nature, we get much better perceptual luminance contrast resolution out of 8 bits per channel. Instead of wasting bits on high brightnesses, where our eye has trouble noticing change, we spend most of the bit budget where it counts: on the darker shades.
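A rough back-of-the-envelope illustration of that bit-budget argument (my numbers, not the post's): in a linearly coded 8-bit channel, the relative luminance step between adjacent code values $k$ and $k+1$ is $1/k$, so

$$
\frac{\Delta Y}{Y} \;=\; \frac{(k+1) - k}{k} \;=\; \frac{1}{k}
\qquad\Rightarrow\qquad
k = 20:\ 5\,\%, \qquad k = 250:\ 0.4\,\%
$$

With a just-noticeable difference of roughly 1%, a linear encoding bands visibly in the darks while wasting precision near white; an approximately perceptual encoding such as sRGB spreads the 256 codes much more evenly over what we can actually see.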