Using radio waves, data from Earth-orbiting satellites are transmitted on a regular basis to properly equipped ground stations. As the data are received, they are translated into a digital image that can be displayed on a computer screen. Just like the pictures on your television set, satellite imagery is made up of tiny squares, each of a different gray shade or color. These squares are called pixels (short for picture elements) and represent the relative reflected light energy recorded for that part of the image.
This weather satellite image of Hurricane Floyd from September 15, 1999, has been magnified to show the individual picture elements (pixels) that form most remote sensing images. (Image derived from NOAA GOES data)
Each pixel represents a square area on an image that is a measure of the sensor's ability to resolve (see) objects of different sizes. For example, the Enhanced Thematic Mapper Plus (ETM+) on the Landsat 7 satellite has a maximum resolution of 15 meters; therefore, each pixel represents an area 15 m x 15 m, or 225 m². Higher resolution (smaller pixel area) means that the sensor is able to discern smaller objects. By adding up the number of pixels in an image, you can calculate the area of a scene. For example, if you count the number of green pixels in a false-color image, you can calculate the total area covered with vegetation.
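The area calculation described above can be sketched as follows; the pixel count is a hypothetical value standing in for the number of green pixels you would count in a classified image:

```python
# Sketch: estimating vegetated area from a pixel count, assuming a
# 15 m x 15 m pixel (Landsat 7 ETM+ maximum resolution).
pixel_size_m = 15
pixel_area_m2 = pixel_size_m ** 2        # 225 m^2 per pixel

green_pixel_count = 1_000_000            # hypothetical count of "vegetation" pixels
vegetated_area_m2 = green_pixel_count * pixel_area_m2
vegetated_area_km2 = vegetated_area_m2 / 1_000_000

print(f"Vegetated area: {vegetated_area_km2} km^2")  # Vegetated area: 225.0 km^2
```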
How does the computer know which parts of the image should be dark and which ones should be bright? Computers understand the numeric language of binary numbers, which are sets of numbers consisting of 0s and 1s that act as an "on-off" switch. Converting from our decimal system to binary numbers, 00 = 0, 01 = 1, 10 = 2, 11 = 3. Note that we cannot use decimal numbers since all computers are fussy: they only like "on" and "off."
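The two-digit binary-to-decimal conversions above can be checked directly, since most languages can parse a string of 0s and 1s as a base-2 number:

```python
# Sketch: the four possible 2-bit patterns and their decimal values.
for bits in ["00", "01", "10", "11"]:
    print(bits, "=", int(bits, 2))  # int(s, 2) parses s as a base-2 number
# 00 = 0
# 01 = 1
# 10 = 2
# 11 = 3
```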
For example, consider an image that is made up of 8 columns by 5 rows of pixels. In this figure, four shades are present: black, dark gray, light gray, and white. The darkest points are assigned the binary number 00, dark gray 01, light gray 10, and the brightest parts the binary number 11. We therefore have four pixels (B5, C4, D7 and E2) that the spacecraft says are 00. There are four dark gray pixels (B3, C2, C6 and E6) assigned the binary number 01, three light gray pixels (D3, D6 and E5) that are binary number 10, and 29 white pixels assigned the binary number 11.
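Tallying pixels by shade, as in the figure, is a simple counting exercise. The grid below is a small hypothetical 2-bit image, not the figure's actual pixel layout:

```python
from collections import Counter

# Hypothetical 2-bit image as rows of pixel values
# (0 = black, 1 = dark gray, 2 = light gray, 3 = white).
image = [
    [3, 3, 0, 3],
    [3, 1, 3, 3],
    [2, 3, 3, 3],
]

# Count how many pixels carry each 2-bit value.
counts = Counter(value for row in image for value in row)
print(counts)  # Counter({3: 9, 0: 1, 1: 1, 2: 1})
```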
Four shades between white and black would produce images with too much contrast, so instead of using binary numbers between 00 and 11, spacecraft use a string of 8 binary digits (called "8-bit data"), which can range from 00000000 to 11111111. These numbers correspond to 0 through 255 in the decimal system. With 8-bit data, we can assign the darkest point in an image the number 00000000 and the brightest point in the image 11111111. This produces 256 shades of gray between black and white. It is these binary numbers between 0 and 255 that the spacecraft sends back for each pixel in every row and column, and it takes a computer to keep track of every number for every pixel!
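The 8-bit range described above is easy to verify: the smallest and largest 8-digit binary strings convert to 0 and 255, giving 256 distinct gray levels, and any decimal brightness value in between has an 8-bit binary form:

```python
# Sketch: 8-bit brightness values span 0 through 255.
darkest = int("00000000", 2)    # all bits off
brightest = int("11111111", 2)  # all bits on

print(darkest, brightest)            # 0 255
print(brightest - darkest + 1)       # 256 shades of gray

# A mid-to-bright gray value and its 8-bit binary form:
print(format(200, "08b"))            # 11001000
```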
next: Color Images
back: Absorption Bands and Atmospheric Windows