Detectors
Passive Detectors
There are two ways in which remote sensing is done: by passive detectors, which simply receive information from objects through their normal emissions and reflections, and by active detectors, which have both a receiver and a transmitter. Detectors can be ground based, flown on aircraft (both traditional and UAV), or carried on earth-orbiting satellites. The different types of detectors include thermal IR, visible and near-visible light, radar, and ultraviolet.
The most common type of detector used at visible and near-visible wavelengths in remote sensing is a solid-state chip known as a Charge-Coupled Device (CCD), the same type of device used in a digital camera or smart device. The CCD may be a single chip that separates colors by filters or by wavelength sensitivity on the chip itself. Other designs use a device such as a prism or diffraction grating to split the light into discrete bands and use multiple CCD chips, with each chip receiving only a part of the total spectrum. A CCD detector can be flown on an Unmanned Aerial Vehicle, a traditional aircraft, or an orbiting platform such as a Landsat satellite.
The chip is actually a collection of many small light-sensing cells. The more pixels on the chip, the greater the resolution and the finer the details that can be imaged by the detector; the megapixel count of a digital camera is an example of this. Each cell on the chip is light sensitive (the more sensitive, the greater the contrast), and when a photon strikes a cell it causes one or more electrons to be created. The number of electrons in each cell is counted on a periodic basis, such as every 1/60th of a second; the cell is then cleared of electrons and the process begins again. The more electrons that can be collected before saturation, the greater the contrast. A charge-coupled device on its own records only intensity (black and white), so the device needs to separate the wavelengths of light reaching each cell. By knowing the intensity of the light in each cell, the device can then produce a pixel composed of red, green, and blue values.
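To make that readout cycle concrete, the short Python sketch below models a single cell being counted and clipped at saturation; the full-well capacity and the read_cell helper are illustrative assumptions, not the behavior of any specific CCD.

```python
# Toy model of one CCD cell: photons create electrons, which are counted
# each readout period and then cleared.  All values are illustrative only.
FULL_WELL = 30000          # assumed electron capacity before the cell saturates
READ_PERIOD_S = 1 / 60     # counted every 1/60th of a second, as in the text

def read_cell(photo_electrons: int) -> int:
    """Return the electron count for one readout, clipped at saturation."""
    return min(photo_electrons, FULL_WELL)

print(read_cell(12000))    # below saturation: the count tracks scene brightness
print(read_cell(45000))    # saturated: detail brighter than full well is lost
```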
The data received from the detector is a time-stamped stream of RGB values and x,y cell positions. Software is then used to assemble the data into an image. When a digital picture is recorded, an image of the object is shown on the device; that image is assembled through software and hardware for display. While the methodology is similar when sensing other wavelengths, there are additional specifications for those detectors.
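As a rough illustration of that assembly step, the following sketch places hypothetical (timestamp, x, y, R, G, B) records into an image array; the record layout and the tiny 2 x 2 grid are assumptions made for the example, not the output format of any particular detector.

```python
import numpy as np

# Hypothetical detector records: (timestamp, x, y, r, g, b).
records = [
    (0.016, 0, 0, 120, 200,  90),
    (0.016, 1, 0, 118, 198,  88),
    (0.016, 0, 1,  60,  61, 200),
    (0.016, 1, 1,  58,  60, 197),
]

width, height = 2, 2
image = np.zeros((height, width, 3), dtype=np.uint8)

# Place each cell's RGB value at its x,y position to assemble the image.
for t, x, y, r, g, b in records:
    image[y, x] = (r, g, b)

print(image.shape)   # (2, 2, 3): a tiny 2 x 2 RGB image
```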
Thermal IR remote sensing looks at the heat emitted by an object. Thermal IR imaging can be done either monochromatically, as with night vision goggles, or in false color. These sensors can be used on UAVs, aircraft, or satellites, although the earth's atmosphere does absorb some thermal IR because of the water vapor it contains. Aircraft-based detectors are usually flown at night when the sky is cloudless, and often in the colder months of the year when looking for energy losses in a home.
Optics
With the rapid increase in the number of small UAS being used in the remote sensing field, there is a need for a better understanding of the field of optics, specifically as it relates to cameras. In the past, when manned aircraft or satellites were used, this was not a consideration for the remote sensor operator; it was a function of those configuring the aircraft or the satellite, but this role has changed. To successfully take an image from a moving vehicle it is important to understand that the vehicle moves while the image is being taken, which can blur the image depending on numerous factors. For this discussion it is assumed that the aircraft remains a fixed distance above the surface, holds the same heading, and maintains the same horizontal velocity. The velocity of an aircraft is traditionally given in miles per hour but will need to be converted into smaller units such as feet per second.
Therefore, an aircraft traveling at 50 mph covers a little more than 73 feet each second. A point directly under the aircraft will have the same amount of displacement as the aircraft. Since many of these aircraft use commercial off-the-shelf cameras, there are several parameters that must be set properly to obtain good image quality: shutter speed, lens f-stop, lens focal length, ISO of the chip, and focus of the lens.
The speed of the shutter, the f-stop of the lens, and the ISO of the chip work together to produce a true color image. The first parameter to consider is the shutter speed, which in general is a fraction of a second. The faster the shutter, the less the aircraft will have moved while the shutter is open; exposures can be on the order of a thousandth of a second. A faster shutter requires a larger opening of the lens (a smaller f-stop number), but the larger the opening, the shallower the depth of field[1] in the image, which requires a more precise focus of the lens. What is occurring is that spherical distortion appears at the edges of the lens, which does not occur when only the center of the lens is used. The ISO parameter deals with how low the light level can be while still producing a high-quality image, but a larger ISO number increases the noise in the image and makes it appear more distorted[2]. These parameters are usually set prior to flight and cannot be changed during the flight, so if the sky conditions change from overcast to sunny, the camera settings can produce washed-out imagery once it becomes sunny, because too much light is entering the camera. To determine an appropriate shutter speed it is important to determine how far the aircraft will move while the shutter is open; if the assumption is that the shutter will be open for 1/1000 of a second, that distance can be calculated from the aircraft speed listed above, as sketched below.
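A minimal sketch of that calculation, using only the 50 mph example speed and the 1/1000-second exposure given above:

```python
# Distance the aircraft moves while the shutter is open,
# using the 50 mph example speed from the text.
speed_mph = 50
shutter_s = 1 / 1000                      # shutter open for 1/1000 of a second

speed_fps = speed_mph * 5280 / 3600       # 1 mile = 5280 ft, 1 hour = 3600 s
moved_in = speed_fps * shutter_s * 12     # convert feet to inches

print(f"{speed_fps:.1f} ft/s")            # ~73.3 ft/s
print(f"{moved_in:.2f} in")               # ~0.88 in moved during the exposure
```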
So the aircraft moves less than an inch while the shutter is open, which doesn't sound like much. The next fact to be known is the ground size of an individual pixel. There are multiple ways this could be determined; for this example, a single image from the camera was georeferenced in mapping software, the ground extent of the image was measured, and the number of pixels in both the x and y directions was known for the camera. The image represented 4980.4 x 3228.6 inches, and the camera chip had 7360 x 4912 pixels. A simple division determines that a pixel is about 0.66 of an inch. Thus the aircraft moves more than one pixel while the shutter is open. A theoretical value can also be determined by knowing the height of the aircraft, the focal length of the lens, and the size of the detector.[3]
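The same bookkeeping can be sketched in Python. The georeferenced image dimensions and pixel counts are the ones given above; the pixel pitch, focal length, and flying height in the last few lines are assumed values used only to illustrate the theoretical ground-sample-distance relationship (pixel pitch times height divided by focal length) referenced in [3].

```python
# Ground size of a pixel from a georeferenced image, and the resulting motion blur.
image_ground_in = (4980.4, 3228.6)     # ground footprint of one image, in inches
chip_pixels = (7360, 4912)             # camera chip resolution

pixel_in_x = image_ground_in[0] / chip_pixels[0]   # ~0.68 in per pixel
pixel_in_y = image_ground_in[1] / chip_pixels[1]   # ~0.66 in per pixel

moved_in = 0.88                        # movement during a 1/1000 s exposure (from above)
print(f"~{moved_in / pixel_in_y:.1f} pixels of motion blur")   # a bit more than 1 pixel

# Theoretical ground sample distance: GSD = pixel_pitch * height / focal_length.
pixel_pitch_mm = 0.00487   # assumed ~4.9 micron pixel pitch, for illustration only
focal_mm = 35.0            # assumed lens focal length, for illustration only
height_mm = 400 * 304.8    # assumed 400 ft flying height, converted to mm
gsd_in = pixel_pitch_mm * height_mm / focal_mm / 25.4
print(f"theoretical GSD ~{gsd_in:.2f} in")                     # ~0.67 in, in line with above
```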
Active Detectors
LiDAR uses a laser, which may be in the visible, ultraviolet, or near infrared, to illuminate an object and measure the reflected radiation. Through the collection of the reflected radiation, a range (distance) from the platform to the object can be determined. Multiple beams are sent out, and an image can thus be composited from the reflectance. The platform can be either airborne or terrestrial. When airborne, LiDAR is used to measure elevations with a high degree of accuracy. Ground-based units are used to measure dimensions such as heights of overpasses, building features, trees, and other ground-based objects. The data sets produced by these methods are extremely large and require processing to fully understand the results.
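A minimal sketch of the ranging idea only (a real LiDAR workflow also accounts for scan angle, platform position, and attitude): the range follows from the round-trip travel time of each pulse.

```python
# Range to a target from a single LiDAR pulse, using round-trip time of flight.
C = 299_792_458.0            # speed of light, m/s

def range_from_time_of_flight(round_trip_s: float) -> float:
    """The pulse travels out and back, so halve the round-trip distance."""
    return C * round_trip_s / 2.0

# Example: a return received 1.0 microsecond after the pulse was emitted
print(f"{range_from_time_of_flight(1.0e-6):.1f} m")   # ~149.9 m
```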
An older active detection technology is the use of radar to create digital elevation models (DEMs), which can be done both from space and from aircraft[8]. When radar is used to create a DEM, it requires either multiple passes over the same area or multiple antennas on the same platform.
[1] Depth of field is the distance in front of and behind the specified focus of a camera that will be in focus.
[2] http://imaging.nikon.com/lineup/dslr/basics/13/
[3] http://www.edmundoptics.com/resources/application-notes/imaging/understanding-focal-length-and-field-of-view/
[4] http://en.wikipedia.org/wiki/Digital_elevation_model
[5] http://www.ehow.com/way_5855231_ground-penetrating-radar-techniques.html
[6] http://landsat.usgs.gov/landsat8.php
[7] http://landsat.usgs.gov/band_designations_landsat_satellites.php
[8] http://en.wikipedia.org/wiki/Digital_elevation_model
[9] http://www.ehow.com/way_5855231_ground-penetrating-radar-techniques.html