Why is that Forest Red and that Cloud Blue?
How to Interpret a False-Color Satellite Image

By Holli Riebeek · Design by Robert Simmon · March 4, 2014

Chances are, you have a camera near you as you read this—in the smart phone in your pocket or on the tablet or computer you’re using to view this page. Some of you might have a 35 mm film or digital camera nearby. And at some point this week, you probably looked through photos posted by friends or even strangers on the Internet. In our photo-saturated world, it’s natural to think of the images on the Earth Observatory as snapshots from space. But most aren’t.

Though they may look similar, photographs and satellite images are fundamentally different. A photograph is made when light is focused and captured on a light-sensitive surface (such as film or a CCD). A satellite image is created by combining measurements of the intensity of certain wavelengths of light, both visible and invisible to human eyes.
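For readers who want to see that difference made concrete, here is a minimal sketch, in Python, of how separate wavelength measurements become one picture. The arrays are synthetic stand-ins, not data from any particular instrument; the point is only that a natural-color satellite image is assembled by placing three measured brightness maps into the red, green, and blue channels of a single image.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical example: each "band" is a 2-D array of measured brightness,
# one value per ground location, as recorded by a passive sensor.
height, width = 400, 600
red_band   = np.random.rand(height, width)   # stand-in for measured red light
green_band = np.random.rand(height, width)   # stand-in for measured green light
blue_band  = np.random.rand(height, width)   # stand-in for measured blue light

# A natural-color image is simply the three visible-light measurements
# stacked into the red, green, and blue channels of one picture.
natural_color = np.dstack([red_band, green_band, blue_band])

plt.imshow(natural_color)
plt.title("Natural-color composite (illustrative data)")
plt.show()
```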


From the Amazon rainforest to North American forests, plant-covered land is red in this view of Earth from the Messenger spacecraft. The image incorporates both visible and infrared light. (NASA image based on data from the Mercury Dual Imaging System (MDIS) on Messenger.)

Why does the difference matter? When we see a photo where the colors are brightened or altered, we think of it as artful (at best) or manipulated (at worst). We also have that bias when we look at satellite images that don’t represent the Earth’s surface as we see it. “That forest is red,” we think, “so the image can’t possibly be real.”

In reality, a red forest is just as real as a dark green one. Satellites collect information beyond what human eyes can see, so images made from other wavelengths of light look unnatural to us. We call these images “false-color,” and to understand what they mean, it’s necessary to understand exactly what a satellite image is.


Infrared light renders the familiar unfamiliar. This infrared photograph shows the forests of Yellowstone National Park from Mount Sheridan. (Photograph courtesy National Park Service.)

Satellite instruments gather an array of information about the Earth. Some of it is visual; some of it is chemical (such as gases in the atmosphere); some of it is physical (sensing topography). In fact, remote sensing scientists and engineers are endlessly creative about what they can measure from space, developing satellites with a wide variety of tools to tease information out of our planet. Some methods are active, bouncing light or radio waves off the Earth and measuring the energy returned; lidar and radar are good examples. The majority of instruments are passive; that is, they record light reflected or emitted by Earth’s surface.

These observations can be turned into data-based maps that measure everything from plant growth to cloudiness. But data can also become photo-like natural-color images or false-color images. This article describes the process used to transform satellite measurements into images.
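To preview why a forest can come out red, the sketch below shows one common false-color assignment, often called a color-infrared composite: near-infrared light is displayed as red, red light as green, and green light as blue. The band values are made-up placeholders, but the channel mapping is standard, and because healthy vegetation reflects near-infrared light strongly while absorbing most red light, plant-covered land appears bright red under this scheme.

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic stand-ins for measured bands; real values would come from a sensor.
height, width = 400, 600
nir_band   = np.full((height, width), 0.8)   # vegetation reflects near-infrared strongly
red_band   = np.full((height, width), 0.1)   # chlorophyll absorbs most red light
green_band = np.full((height, width), 0.2)   # a little green light is reflected

# Color-infrared assignment:
# near-infrared -> red channel, red -> green channel, green -> blue channel.
false_color = np.dstack([nir_band, red_band, green_band])

plt.imshow(np.clip(false_color, 0, 1))
plt.title("False-color composite: vegetation appears red (illustrative data)")
plt.show()
```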
