GIS and Remote Sensing Definitions


This post provides several basic definitions about GIS and remote sensing, as a quick reference for future tutorials. For further information there are several free online sources, such as the Landsat 7 Science Data User's Handbook or Wikipedia.


There are several definitions of GIS (Geographic Information Systems), which is not simply a program. In general, GIS are systems that allow for the use of geographic information (i.e. data with spatial coordinates). In particular, GIS allow for the viewing, querying, calculation and analysis of spatial data, which are mainly distinguished into raster and vector data structures. Vector data are made of objects that can be points, lines or polygons, and each object can have one or more attribute values; a raster is a grid (or image) where each cell has an attribute value (Fisher and Unwin, 2005).

Several GIS applications use raster images that are derived from remote sensing.
Remote sensing is the measurement of the energy that emanates from the Earth's surface. If the source of the measured energy is the Sun, then it is called passive remote sensing, and the result of this measurement can be a digital image (Richards and Jia, 2006).

The electromagnetic spectrum is "the system that classifies, according to wavelength, all energy (from short cosmic to long radio) that moves, harmonically, at the constant velocity of light" (NASA, 2013). Passive sensors measure energy from the optical regions of the electromagnetic spectrum: visible, near infrared (NIR), short-wave infrared, and thermal infrared.
It is also worth mentioning active remote sensing, which works in the microwave range using radar sensors; the measured energy is emitted not by the Sun but by the sensor platform itself (Richards and Jia, 2006).
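
As a quick reference, the following minimal Python sketch lists approximate wavelength ranges for these spectral regions; the boundary values are indicative, as they vary slightly between sources.

    # Approximate wavelength ranges (in micrometers) of the spectral regions
    # mentioned above; boundaries are indicative and vary between sources.
    spectral_regions_um = {
        "visible": (0.4, 0.7),
        "near infrared": (0.7, 1.3),
        "short-wave infrared": (1.3, 3.0),
        "thermal infrared": (3.0, 15.0),
        "microwave (radar)": (1000.0, 1000000.0),  # about 1 mm to 1 m
    }
    for region, (short, long) in spectral_regions_um.items():
        print(f"{region}: {short} - {long} micrometers")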

The interaction between solar energy and materials depends on the wavelength; solar energy travels from the Sun to the Earth and then to the sensor. Along this path, solar energy is:

  • "Transmitted - The energy passes through with a change in velocity as determined by the index of refraction for the two media in question.
  • Absorbed - The energy is given up to the object through electron or molecular reactions.
  • Reflected - The energy is returned unchanged with the angle of incidence equal to the angle of reflection. Reflectance is the ratio of reflected energy to that incident on a body. The wavelength reflected (not absorbed) determines the color of an object.
  • Scattered - The direction of energy propagation is randomly changed. Rayleigh and Mie scatter are the two most important types of scatter in the atmosphere.
  • Emitted - Actually, the energy is first absorbed, then re-emitted, usually at longer wavelengths. The object heats up." (NASA, 2013).

Sensors can be on board airplanes or satellites, measuring the electromagnetic radiation at specific ranges (usually called bands). As a result, the measurements are quantized and converted into a digital image, where each picture element (i.e. pixel) has a discrete value in units of Digital Number (DN) (NASA, 2013). The resulting images have different characteristics (resolutions) depending on the sensor.
There are several kinds of resolutions:

  • Spatial resolution, usually measured in pixel size, "is the resolving power of an instrument needed for the discrimination of features and is based on detector size, focal length, and sensor altitude" (NASA, 2013); spatial resolution is also referred to as geometric resolution or IFOV;
  • Spectral resolution is the number and location in the electromagnetic spectrum (defined by two wavelengths) of the spectral bands (NASA, 2013); in multispectral sensors, each band corresponds to an image;
  • Radiometric resolution, usually measured in bits (binary digits), is the range of available brightness values, which in the image corresponds to the maximum range of DNs; for example, an image with 8 bit resolution has 256 levels of brightness (Richards and Jia, 2006), as shown in the sketch after this list;
  • For satellite sensors, there is also the temporal resolution, which is the time required for revisiting the same area of the Earth (NASA, 2013).
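
The following is a minimal Python sketch of radiometric resolution: it prints the number of brightness levels for some common bit depths, and quantizes a few continuous radiance values into 8-bit DNs; the radiance values and the sensor range are hypothetical.

    import numpy as np

    # An n-bit sensor can record 2**n discrete brightness levels (DNs).
    for bits in (6, 8, 12, 16):
        print(f"{bits}-bit: {2 ** bits} levels (DN 0-{2 ** bits - 1})")

    # Quantization sketch: continuous radiance values are rescaled and
    # rounded into 8-bit DNs; the minimum and maximum radiance values
    # of the sensor are hypothetical.
    radiance = np.array([10.5, 80.3, 151.9, 243.6])
    l_min, l_max = 0.0, 250.0
    dn = np.round((radiance - l_min) / (l_max - l_min) * 255.0)
    print(dn.astype(np.uint8))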

For example, Landsat is a family of multispectral satellites developed by NASA (National Aeronautics and Space Administration of the USA), widely used for environmental research. The resolutions of the Landsat 7 sensor are reported in the following figure; also, Landsat temporal resolution is 16 days (NASA, 2013).

Often, a combination of three individual monochrome images is created, in which each image is assigned a given color; this is called a color composite and is useful for photo interpretation (NASA, 2013). Color composites are usually expressed as "R G B = Br Bg Bb", where: R stands for Red; G stands for Green; B stands for Blue; Br is the band number associated with the Red color; Bg is the band number associated with the Green color; and Bb is the band number associated with the Blue color.
The following example shows a color composite "R G B = 4 3 2" of a Landsat 8 image (for Landsat 7 it is 3 2 1) and a color composite "R G B = 5 4 3" (for Landsat 7 it is 4 3 2). The composite "R G B = 5 4 3" is useful for identifying vegetation, because vegetation is clearly highlighted in red; a minimal code sketch of how such a composite can be built follows the figure.

Example of color composite of a Landsat 8 image.
Data available from the U.S. Geological Survey.
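
The following is a minimal Python sketch of how a color composite can be built: three bands are stacked in R, G, B order and stretched for display. Here the bands are simulated with random arrays; in practice they would be read from the Landsat band files (for instance with a library such as rasterio or GDAL).

    import numpy as np
    import matplotlib.pyplot as plt

    # Simulated bands; in practice, read band 5 (NIR), band 4 (red) and
    # band 3 (green) of a Landsat 8 image from the corresponding files.
    band5 = np.random.rand(100, 100)
    band4 = np.random.rand(100, 100)
    band3 = np.random.rand(100, 100)

    def stretch(band):
        # Linear stretch of the band values to the 0-1 display range.
        return (band - band.min()) / (band.max() - band.min())

    # Stack the bands in R, G, B order: "R G B = 5 4 3".
    composite = np.dstack([stretch(band5), stretch(band4), stretch(band3)])
    plt.imshow(composite)
    plt.title("R G B = 5 4 3")
    plt.show()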

Sensors measure radiance, which corresponds to the brightness in a given direction toward the sensor; it is also useful to define reflectance as the ratio of reflected energy to incident energy. The spectral signature is the reflectance as a function of wavelength; each material has a unique signature, therefore it can be used for material classification (NASA, 2013).
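
As a minimal sketch, the following Python function converts at-sensor spectral radiance into top-of-atmosphere reflectance using the formula given in the Landsat 7 Science Data User's Handbook; the input values in the example are only illustrative.

    import math

    def toa_reflectance(radiance, esun, earth_sun_distance, sun_elevation):
        # Top-of-atmosphere reflectance (Landsat 7 handbook):
        # reflectance = (pi * L * d**2) / (ESUN * cos(solar zenith angle)),
        # where the solar zenith angle is 90 degrees minus the sun elevation.
        solar_zenith = math.radians(90.0 - sun_elevation)
        return (math.pi * radiance * earth_sun_distance ** 2) / (
            esun * math.cos(solar_zenith))

    # Illustrative values: radiance in W / (m^2 * sr * um), ESUN is the mean
    # solar exoatmospheric irradiance of the band, distance in astronomical
    # units, sun elevation in degrees.
    print(toa_reflectance(80.0, 1547.0, 1.0, 45.0))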

A supervised classification is an image processing technique that allows for the identification of materials in an image, according to their spectral signatures. There are several kinds of classification algorithms, but the general purpose is to produce a thematic map of the land cover.
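
The following is a minimal Python sketch of one simple classification algorithm, minimum distance: each pixel is assigned to the class whose mean spectral signature is closest in spectral space. The signatures and pixel values here are hypothetical.

    import numpy as np

    # Hypothetical mean spectral signatures (reflectance per band).
    signatures = {
        "water": np.array([0.05, 0.03, 0.02]),
        "vegetation": np.array([0.04, 0.08, 0.45]),
        "soil": np.array([0.15, 0.20, 0.30]),
    }

    def classify(pixels, signatures):
        # Assign each pixel (a row of band values) to the class with the
        # nearest mean signature (Euclidean distance in spectral space).
        names = list(signatures)
        means = np.stack([signatures[name] for name in names])
        distances = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
        return [names[i] for i in distances.argmin(axis=1)]

    pixels = np.array([[0.05, 0.07, 0.40],   # vegetation-like spectrum
                       [0.06, 0.04, 0.03]])  # water-like spectrum
    print(classify(pixels, signatures))      # ['vegetation', 'water']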

Land cover is the material on the ground, such as soil, vegetation, water, asphalt, etc. (Fisher and Unwin, 2005). Depending on the sensor resolutions, the number and kind of land cover classes that can be identified in the image can vary significantly.

Image processing and GIS spatial analyses require specific software. However, there are several open source alternatives to commercial software. For a list of free open source software see my previous post Open Source Software for GIS and Image Processing.

After the classification process, it is useful to assess the accuracy of the land cover classification, in order to identify and measure map errors. Usually, accuracy assessment is performed through the calculation of an error matrix, which is a table that compares map information with reference data (i.e. ground truth data) for a number of sample areas (Congalton and Green, 2009).
The following is a scheme of an error matrix, where k is the number of classes identified in the land cover classification, and n is the total number of collected sample units.


               Ground truth 1   Ground truth 2   ...   Ground truth k   Total
    Class 1    a11              a12              ...   a1k              a1+
    Class 2    a21              a22              ...   a2k              a2+
    ...        ...              ...              ...   ...              ...
    Class k    ak1              ak2              ...   akk              ak+
    Total      a+1              a+2              ...   a+k              n


The items on the major diagonal (aii) are the number of samples correctly identified, while the other items are classification errors. Therefore, it is possible to calculate the overall accuracy as the ratio between the number of correctly classified samples (the sum of the major diagonal) and the total number of sample units n (Congalton and Green, 2009).
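
The following is a minimal Python sketch of the error matrix and overall accuracy calculation, assuming two aligned arrays of class labels for the sample units (classified map values versus ground truth); the labels are hypothetical.

    import numpy as np

    # Hypothetical sample units: map class versus ground truth class.
    classified = np.array([1, 1, 2, 2, 3, 3, 1, 2])
    reference  = np.array([1, 1, 2, 3, 3, 3, 2, 2])

    k = 3  # number of land cover classes
    matrix = np.zeros((k, k), dtype=int)
    for c, r in zip(classified, reference):
        matrix[c - 1, r - 1] += 1  # rows: map classes; columns: ground truth
    print(matrix)

    # Overall accuracy: sum of the major diagonal (correctly classified
    # samples) divided by the total number of sample units n.
    print(np.trace(matrix) / matrix.sum())  # 6 / 8 = 0.75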





References:
  • Congalton, R. and Green, K., 2009. Assessing the Accuracy of Remotely Sensed Data: Principles and Practices. Boca Raton, FL: CRC Press.
  • Fisher, P. F. and Unwin, D. J., eds. 2005. Representing GIS. Chichester, England: John Wiley & Sons.
  • NASA, 2013. Landsat 7 Science Data User's Handbook. Available at http://landsathandbook.gsfc.nasa.gov
  • Richards, J. A. and Jia, X., 2006. Remote Sensing Digital Image Analysis: An Introduction. Berlin, Germany: Springer.