Human Color Perception
The visible spectrum of light is a minute portion of the full electromagnetic spectrum, spanning wavelengths of roughly 380-740 nanometers (nm). Rods and cones within the human eye respond to light waves; it is the cones that determine how color is perceived.
The spectrum of light visible to humans, from roughly 380-740 nm, is just a small sliver of the entire electromagnetic spectrum, which extends from very short-wavelength gamma rays (0.0001 nm) to extremely long radio waves (up to 100 meters). Image Credit: Radiant Vision Systems
Human beings have three types of cones, short, medium, and long (S, M, and L), each of which is sensitive to a specific wavelength range. Different wavelengths are perceived at different strengths, and the perceived color is a blend of the cone responses represented by the area under each of the curves in the graph below.
Spectral response of human vision within the fovea (the central 2° of our field of view, where our eyes have the highest density of cones). The graph is a rough illustration of the normalized response of the cones; i.e., how strongly we perceive each wavelength. The peak of each curve is where our perception of that color is strongest. Image Credit: Radiant Vision Systems
Spectral Power Distribution (SPD)
A solid understanding of the human visual response (also known as human spectral sensitivity) helps in interpreting how we perceive different light sources, which are characterized by their spectral power distribution (SPD). An SPD plots power as a function of wavelength, much like the graph above. The difference is that, instead of what our eyes perceive, an SPD plots the total output power of a light source: for each wavelength band, it shows how much optical power the source emits. Wavelength (in nm) is shown on the X-axis; spectral power (measured in Watts per nanometer, W/nm) is shown on the Y-axis, often normalized to an arbitrary unit of power or intensity.
SPDs of common light sources. Illuminant A, for example, is a broadband light source that emits at every wavelength and with increasing intensity at longer wavelengths. In contrast, the red LED has a very narrow spectrum, emitting light primarily between 620 and 650 nm. Image Credit: Radiant Vision Systems
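The weighting described above can be sketched numerically: the perceived brightness of a source is its SPD integrated against the eye's spectral sensitivity. The Gaussian shape used for V(λ) below and the two toy LED spectra are illustrative assumptions, not measured data.

```python
import math

def photopic_sensitivity(wavelength_nm):
    """Rough Gaussian approximation of the CIE photopic luminosity
    function V(lambda): peak at 555 nm, full width at half maximum
    of ~100 nm. (The real V(lambda) is tabulated; this shape is an
    assumption for illustration only.)"""
    sigma = 42.5  # nm, chosen so the FWHM is ~100 nm
    return math.exp(-((wavelength_nm - 555.0) ** 2) / (2.0 * sigma ** 2))

def luminous_response(spd, step_nm=5):
    """Integrate an SPD (dict of wavelength -> relative power, W/nm)
    against the eye's sensitivity to get a relative luminous response."""
    return sum(power * photopic_sensitivity(wl) * step_nm
               for wl, power in spd.items())

# Two toy narrowband sources with equal radiant power:
green_led = {wl: 1.0 for wl in range(530, 545, 5)}  # near the V(lambda) peak
red_led   = {wl: 1.0 for wl in range(620, 635, 5)}  # a narrow red band

# The green source appears much brighter despite equal radiant power,
# because its wavelengths fall where cone sensitivity is highest.
```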
The CIE Color Space: Quantifying Color
The CIE (Commission Internationale de L’Eclairage, or International Commission on Illumination) defined a standard in 1931 for the scientific quantification of the physical properties of colors. The CIE color space encompasses the total range of color perceivable by the human eye, represented as a two-dimensional diagram. Every color we see is a blend of three values (red, green, and blue), which are depicted at the three outermost points of the roughly triangular color space. Pure, monochromatic hues of a single wavelength lie along the curved edge of the color space (the spectral locus).
To quantify color as a coordinate on this color space, the CIE specifies mathematical “color matching functions,” from which any color can be described by the relative values of three primaries, termed the tristimulus values: X, Y, and Z.
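The projection from tristimulus values to a two-dimensional chromaticity coordinate is a simple normalization; a minimal sketch (the D65 tristimulus values used here are the standard published ones for the 2° observer):

```python
def xyz_to_xy(X, Y, Z):
    """Project CIE tristimulus values onto the 2-D chromaticity diagram.
    x and y are the fractions of the total stimulus carried by X and Y;
    absolute luminance information is discarded in the projection."""
    total = X + Y + Z
    return X / total, Y / total

# Tristimulus values of the D65 white point (2-degree standard observer):
x, y = xyz_to_xy(95.047, 100.0, 108.883)
# x ~ 0.3127, y ~ 0.3290 -- the familiar D65 chromaticity coordinates
```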
A graphical representation of the CIE 1931 color space, charting all colors visible to the human eye. Numbers around the edge of the diagram are wavelengths of monochromatic light defining the boundaries of the color space. Inside the color space, each perceptible color has a coordinate (CIEx, CIEy). The black body (Planckian) locus coordinates are shown on the curved line near the center; the black-body temperature is commonly called Correlated Color Temperature, or CCT, and measured in Kelvin (K). Image Credit: Radiant Vision Systems
The black-body locus (also known as the Planckian locus) shown above plots the chromaticity coordinates of light sources known as black-body radiators (idealized objects that absorb all incident light and are also perfect emitters). The black-body curve is shown within the CIE diagram to indicate the points where the light appears white. Since its inception, the CIE color space has facilitated accurate representation, measurement, and replication of colors across a significant range of applications, including light source characterization, printing, optical design, and measurement of illuminated display screens. The CIE color space was rescaled first in 1960 and again in 1976.
The 1976 color space was designed to be more perceptually uniform; i.e., the distance between two points on the diagram more accurately represents how different the two colors appear to the human eye. The CIE 1960 color space became the foundation for quantifying another color concept: color temperature.
Correlated Color Temperature
White light is produced by combining colors from across the visible spectrum. Correlated color temperature (CCT) characterizes the color appearance (to a human observer) of any white light source with a single number. Different amounts of the various wavelengths make the light appear “warmer” (if it contains more yellow/orange wavelengths) or “cooler” (if it contains more blue/cyan wavelengths). Light sources with very different SPDs can all appear white to our eyes.
Comparing the SPD of various light sources, all of which appear “white” to our eyes but have different spectral distributions. For example, a “warm” white LED is most intense at approximately 625 nm (the orange-red region), while a bright “daylight” white LED is most intense at approximately 450 nm (the cyan/blue region). (Image Source)
The CCT of a light source is the temperature of the point on the Planckian locus that most closely matches (perceptually) the chromaticity coordinates of the light, measured in kelvin (K).
This can seem counter-intuitive compared to thermal temperature scales such as Fahrenheit, where higher values mean warmer temperatures. With CCT, however, lower values (e.g., 1800-2700 K) are considered warm tones, neutral white sits around 4000 K, and higher values (5000 K and above) are the coolest CCTs.
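One common way to estimate CCT from chromaticity coordinates is McCamy's cubic approximation. This is a published approximation (not the CIE's exact locus-matching method), sketched here for illustration:

```python
def cct_mccamy(x, y):
    """Estimate correlated color temperature (K) from CIE 1931 (x, y)
    chromaticity using McCamy's cubic approximation. Reasonable only
    for chromaticities near the Planckian locus, roughly
    2000 K - 12500 K."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# D65 daylight white:
cct = cct_mccamy(0.3127, 0.3290)  # roughly 6500 K, a "cool" daylight CCT
```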
Comparing the appearance of various white LEDs that have different CCT values. (Image Source)
It is important to recognize that CCT is not equivalent to SPD. An SPD captures the entire spectrum of wavelengths output by a light source, including some invisible to the human eye, while CCT characterizes only how humans perceive the color tonality of the source. CCT should therefore be understood as a coarse summary, not a measure of spectral content.
Assessing Color Displays
A display device’s “color gamut” describes the total range of color the display can produce. To date, no man-made display device has come close to replicating the complete range of color and light perceptible to the human eye (the total CIE color space).
Comparison of the CIE 1931 color space to the gamut of a typical HDTV display. While the television can produce vivid and exciting images to the viewer, the range of color is still limited compared to the full capabilities of human vision. (Image Source)
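Whether a given color falls inside a display's gamut reduces to a point-in-triangle test on the chromaticity diagram. A minimal sketch, assuming the standard Rec. 709 / sRGB primaries used by typical HDTVs:

```python
# Rec. 709 / sRGB primaries as CIE 1931 (x, y) chromaticities
REC709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B

def _edge_sign(p, a, b):
    """Signed-area test: which side of the edge a->b the point p falls on."""
    return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

def in_gamut(p, primaries=REC709):
    """True if chromaticity p lies inside the triangle spanned by the
    display's three primaries, i.e., the display can reproduce it."""
    r, g, b = primaries
    signs = [_edge_sign(p, r, g), _edge_sign(p, g, b), _edge_sign(p, b, r)]
    return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

in_gamut((0.3127, 0.3290))  # D65 white: True
in_gamut((0.735, 0.265))    # spectral red near 700 nm: False (outside gamut)
```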
Display manufacturers continue to work to expand the gamut and to produce ever more varied and vivid on-screen colors. How, then, can display makers ensure, both during initial development and on the production line, that their displays deliver accurate colors and the highest performance and visual quality for their users? The answer is simple: by measuring their products.
Measuring Color in Displays and Light Sources
How is the color output of a light source or display device measured? There are a range of different metrology instruments available, which are suited to different measurement types for both quality assessment and visual performance.
The science of measuring wavelengths across the whole electromagnetic spectrum is called radiometry: a radiometer can measure ultraviolet, visible, and infrared light waves. A photometer is a device used to measure the brightness (luminance) of visible light as it is perceived by the human eye.
Colorimetry is the science of measuring color (chromaticity) as it is perceived by the human eye. Some instruments, for example, spot meters, can measure both luminance and chromaticity. As the name suggests, spot meters measure one small area (spot) at a time. They are typically low-cost yet highly accurate, which can make them particularly useful in the display R&D process. A spectrometer is a specific type of spot meter that offers both spectral data and highly accurate chromaticity measurements.
Another type of equipment used to measure color performance is an imaging system, such as Radiant’s ProMetric® Imaging Colorimeter. Unlike a spot meter, an imaging system captures an entire light source distribution or complete display screen in a single image, measuring multiple points of light simultaneously in a 2D spatial context using a 2D detector (image sensor). Within display metrology, the role of imaging is to simulate the human visual system: just as our eyes take in a whole phone or TV screen in one glance, imaging evaluates the entire display at once.
Source: Radiant Vision Systems
| Instrument | Typical display characteristics measured |
|---|---|
| Spot meter | Luminance, Chromaticity, Contrast, Gamut, Gamma, Flicker... |
| Imaging system | Luminance, Chromaticity, Uniformity, Contrast, Mura, Defects, Pixel-level analysis, Distortion... |
| Spectrometer | Radiance, Luminance, Chromaticity, Spectra... |
| Time-resolved and goniometric meters | Response time, Viewing direction... |
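As a toy illustration of the spatial analysis that imaging enables, the classic min/max uniformity metric can be computed over sampled points of a captured image. The 3x3 grid of luminance values below is hypothetical:

```python
def luminance_uniformity(grid):
    """Min/max uniformity over sampled points of a display image
    (1.0 = perfectly uniform). A 9-point grid is a common sampling
    pattern; an imaging colorimeter measures millions of points at once."""
    values = [v for row in grid for v in row]
    return min(values) / max(values)

# Hypothetical 3x3 luminance samples (cd/m^2) from a captured display image:
samples = [[480.0, 500.0, 490.0],
           [495.0, 510.0, 500.0],
           [470.0, 490.0, 485.0]]
luminance_uniformity(samples)  # 470/510, about 0.92
```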
Imaging systems offer a compromise: they trade some of the single-point accuracy of a spot meter for the ability to capture all data points in a spatial context. An imaging colorimeter is the best tool to assess overall visual performance in many of today’s applications, encompassing:
- Location, identification, and severity of defects (pixels, lines, blobs)
- Contextual evaluation: uniformity, gradient, contrast, mura (blemishes), distortion
- Determining dimensions, distortion and focus quality (projection displays)
- Simultaneous measurement of multiple regions of interest (LED arrays, pixels, subpixels)
- Advanced analysis, with multiple analyses possible per captured image
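A crude sketch of the defect-location idea in the list above: flag pixels that deviate from the image mean by more than a threshold. Real imaging-colorimeter analyses are far more sophisticated; the threshold and data here are illustrative assumptions.

```python
def find_bright_defects(luminance, threshold_ratio=1.5):
    """Flag pixels whose luminance exceeds the image mean by a fixed
    ratio: a stand-in for the statistical defect detection an imaging
    colorimeter performs. Returns (row, col, severity) tuples, where
    severity is the pixel's luminance relative to the mean."""
    flat = [v for row in luminance for v in row]
    mean = sum(flat) / len(flat)
    defects = []
    for r, row in enumerate(luminance):
        for c, v in enumerate(row):
            if v > threshold_ratio * mean:
                defects.append((r, c, v / mean))
    return defects

# A mostly uniform patch with one stuck-bright pixel at row 1, col 2:
patch = [[100, 102,  99],
         [101, 100, 240],
         [ 98, 100, 101]]
find_bright_defects(patch)  # flags the single outlier at (1, 2)
```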
Importance of Display Metrology
Display metrology provides manufacturers with an objective understanding of their product quality. Differences in fabrication processes, technology, and other factors can cause variations in the visual quality of a display, including defects (e.g., dead pixels) or simply a non-uniform appearance.
Data captured by metrology instruments helps display makers evaluate the visual qualities of a display screen, ensuring that it meets product specifications and satisfies customer expectations, thereby protecting both brand reputation and manufacturing investments.
Jens Jensen, Radiant’s Director of Product Development, explains in more detail: “Display metrology incorporates scientific methods and equipment to capture, quantify, and assess these qualities as values of brightness, color, uniformity, contrast, and more. Using this data, manufacturers can set objective limits on these qualities and determine whether variations fall in or outside of required performance parameters.”
In other words, display metrology applies universal measurement principles across devices and industries. This standardization has enabled highly efficient measurement by machines, supporting automated inspection processes from design through production.
To provide an accurate quality assessment, systems that measure the color and brightness of displays must be able to capture sufficient data. Measurement systems are therefore evolving faster than ever to address new integrations, the needs of new display types, and technology advancements like OLED, microLED, and quantum dots.
ProMetric® Imaging Colorimeters provide high-resolution, high-volume, accurate automated optical inspection of displays, backlit components, light sources, and assemblies. Image Credit: Radiant Vision Systems
The Science of Measuring Display Quality: Dig Deeper
For professionals in the display industry, it is important to understand the building blocks of metrology: to know the tools that are available and to ascertain which provide the greatest benefit for specific applications.
Watch the Short Course “Fundamentals of Display Metrology” if you’d like to learn more about display characterization and measurement. This 3.5-hour course was originally delivered on May 20 at the virtual SID (Society for Information Display) 2021 Display Week, and was co-presented by Jensen and colleagues from Radiant’s sister companies: Dr. Reto Häring from Instrument Systems and Yutaka Maeda of Konica Minolta.
The course covers the fundamental and standard principles of display metrology, how they were developed, and how these principles are applied to measure visual display qualities. Beginning with an introduction to the science of light and color, it covers a range of topics, units of measurement, and international measurement standards. It also describes technologies that apply these principles for automated display testing, including spot meters, imaging equipment, time-resolved meters, and spectroradiometers.
Produced from materials originally authored by Anne Corning from Radiant Vision Systems.
This information has been sourced, reviewed and adapted from materials provided by Radiant Vision Systems.
For more information on this source, please visit Radiant Vision Systems.