After reading this article you will learn about: 1. Principles of Remote Sensing 2. Types of Remote Sensing 3. System Overview.
Principles of Remote Sensing:
Remote Sensing (RS) is generally defined as the science of collecting and interpreting information about a target without being in physical contact with the object under study. The various categories of remote sensing range from observation with the naked eye and photography by camera, through photography from aircraft, to sensing by sensors aboard space satellites.
Depending on their physical features and chemical properties, different objects on the earth’s surface reflect, reradiate or emit different amounts of electromagnetic energy at various wavelengths. The measurement of this reflected, reradiated or emitted electromagnetic radiation forms the basis for understanding the characteristics of the earth’s surface features.
Three typical responses are used to distinguish the objects from one another.
Only selected portions of the electromagnetic spectrum which can pass through the earth’s atmosphere with relatively little attenuation are used for remote sensing purposes.
The selected regions of the electromagnetic spectrum employed in remote sensing include:
0.4 to 0.7 µm (visible), 0.7 to 3.0 µm (IR band), 3 to 5 µm and 8 to 14 µm (thermal IR), and 0.1 to 30 cm (microwave).
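As a quick illustration of how these windows are used in practice, the short Python sketch below maps a given wavelength to the spectral region it falls in, using the band edges listed above; the region labels and example wavelengths are convenient illustrative choices, not an official standard.

```python
# Classify a wavelength (in micrometres) into the spectral regions
# listed above as usable for remote sensing.
BANDS = [
    (0.4, 0.7, "visible"),
    (0.7, 3.0, "IR band"),
    (3.0, 5.0, "3-5 um window"),
    (8.0, 14.0, "thermal IR (TIR)"),
    (1_000.0, 300_000.0, "microwave (0.1-30 cm)"),
]

def spectral_region(wavelength_um: float) -> str:
    for low, high, name in BANDS:
        if low <= wavelength_um <= high:
            return name
    return "outside the commonly used atmospheric windows"

print(spectral_region(0.55))    # visible (green light)
print(spectral_region(10.0))    # thermal IR (TIR)
print(spectral_region(56_000))  # microwave (a 5.6 cm radar wavelength)
```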
Types of Remote Sensing:
Depending on the source of energy which illuminates the object under study, remote sensing techniques are classified into two types, viz.:
(i) Passive remote sensing, and
(ii) Active remote sensing.
In a passive remote sensing system, the naturally radiated or reflected energy from the earth’s surface features is measured by sensors operating in different selected spectral bands on board air-borne/space-borne platforms (similar to photography in daytime without flash).
An active remote sensing system supplies its own source of energy to illuminate the objects and measures the reflected energy returned to the system (similar to photography at night with flash).
System Overview in Remote Sensing:
Two major steps are involved in this process: the first is data acquisition, and the second is data processing and interpretation. Data acquisition is carried out by sensors mounted on platforms. This information is used to generate products which are finally used for interpretation purposes by comparison with ground truth information.
There are five distinct components in the remote sensing process.
The details are depicted in Table 26.1:
i. Satellite Remote Sensing:
(a) Satellite and Sensors:
Earth Resources Technology Satellite (ERTS-1), later renamed LANDSAT-1, was the first remote sensing satellite, launched in 1972 by NASA for surveying, mapping, and monitoring of earth resources. Realising the potential of this emerging technology, many other countries and agencies, such as France (SPOT), India (IRS), Japan (JERS), and Europe (ERS), have entered this venture.
So far, six satellites in the LANDSAT series have been launched. The first three are first-generation satellites that carried Return Beam Vidicon (RBV) and Multispectral Scanner (MSS) imaging sensors, while the second-generation satellites carry, apart from the MSS, an advanced imaging sensor called the Thematic Mapper (TM).
Three satellites in the SPOT series launched by France provide data in multispectral and panchromatic bands in normal and/or stereoscopic mode. The Indian remote sensing satellites IRS-1A, IRS-1B, and IRS-1C, indigenously developed, were put into orbit in 1988, 1991, and 1995, respectively, with sensors such as LISS-I, LISS-II, LISS-III, WiFS, and PAN (Figs. 26.2, 26.3, 26.4 & 26.5).
Unlike geostationary satellites, these remote sensing satellites are sun-synchronous (crossing a given latitude at the same local time every day on the descending node, which enables the study of natural resources in various regions under the same illumination conditions) and polar orbiting, with a repeat cycle of 16 to 26 days that allows repeated collection of data over the same place at the same local time for continuous monitoring of the earth’s resources.
The imaging payloads of these satellites operate in different spectral bands, spatial resolutions (the smallest area on the ground sensed by the sensors), and radiometric resolutions (the number of distinguishable grey levels).
IRS—an Overview:
The image sensing characteristics of the various sensors of the IRS system are:
The IRS satellites were launched into a polar sun-synchronous orbit, in which the orbital plane rotates at the same rate as the mean motion of the earth around the sun (i.e., 0.9856 deg/day).
Thus, the satellite passes over a particular latitude at approximately the same local time, which keeps the ground illumination conditions in the sub-satellite regions nearly constant throughout the mission. The equatorial crossing time of the descending node for IRS-1A is around 10:25 AM.
As the orbital period of IRS-1A is about 103 minutes, with the satellite completing about 14 orbits per day, each successive orbit is shifted westward over the earth’s surface by 25.798 degrees of longitude, corresponding to 2,872 km at the equator (Fig. 26.6). The satellite’s path is shifted westward by 1.17 degrees of longitude every day, corresponding to 130.54 km at the equator.
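As a rough cross-check of these orbit figures, the short Python sketch below rederives the per-orbit and per-day westward shifts from an assumed orbital period of about 103 minutes and an equatorial circumference of 40,075 km; small discrepancies with the quoted values simply reflect rounding of these assumed inputs.

```python
# Rough cross-check of the IRS-1A orbit figures quoted above.  The orbital
# period (~103.2 min) and the equatorial circumference (40,075 km) are
# assumed round values, so the results differ slightly from the quoted numbers.
ORBIT_PERIOD_MIN = 103.2
MINUTES_PER_DAY = 24 * 60
KM_PER_DEG = 40075.0 / 360.0     # ~111.3 km per degree of longitude at the equator

# Westward shift of the equator crossing from one orbit to the next:
# the earth turns under the orbit plane for one orbital period.
shift_per_orbit = 360.0 * ORBIT_PERIOD_MIN / MINUTES_PER_DAY
print(f"shift per orbit : {shift_per_orbit:.2f} deg (~{shift_per_orbit * KM_PER_DEG:.0f} km)")

# After 14 orbits the track has moved a little more than one full circle;
# the excess is the day-to-day westward drift of the whole track pattern.
daily_drift = 14 * shift_per_orbit - 360.0
print(f"daily drift     : {daily_drift:.2f} deg (~{daily_drift * KM_PER_DEG:.0f} km)")

# The track pattern repeats once the accumulated daily drift spans the gap
# between adjacent tracks, i.e. after roughly three weeks.
print(f"approx. repeat  : {shift_per_orbit / daily_drift:.1f} days")
```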
The satellite completes one coverage cycle of the Indian subcontinent in 22 days (i.e., 307 orbits) (Fig. 26.6). In order to facilitate convenient and unique identification of any geographical region of interest and the cataloguing of data products, an image referencing scheme designated by path and row numbers has been developed for the Indian subcontinent.
The paths are sequentially numbered from east to west for all the 307 orbits in a coverage cycle. A row is the line joining the corresponding scene centres of different paths, forming a contour parallel to the equator. Row number 1 falls around 81 degrees North latitude. The Indian region is covered by rows 41 to 63 and paths 9 to 35 (Fig. 26.9).
IRS-1A image sensors operate in push-broom scanning mode using Linear Imaging Self-scanning Sensors (LISS). In this mode of operation, each line of the image is electronically scanned by a linear array of detectors located in the focal plane of the optical system, and successive lines of the image are produced by the satellite’s movement (Fig. 26.7).
IRS-1A has two types of imaging sensors: one with a spatial resolution of 72.5 meters, designated LISS-I, and the other comprising two separate imaging sensors, designated LISS-IIA and LISS-IIB, with a spatial resolution of 36.25 meters each. LISS-I provides a swath of 148 km, while a composite swath of 145 km is attained by the two LISS-II sensors through suitable mounting of the detectors in the focal plane of the system (Fig. 26.8).
In the IRS system, the dual-resolution data available from LISS-I and LISS-II offer an overview of larger areas as well as an in-depth look at finer detail. The selection of spatial and spectral characteristics of the IRS image sensors enables the complementary and supplementary use of data from contemporary remote sensing satellites such as LANDSAT-5 and SPOT.
A synoptic account of the band utility for resource information assessment is shown in Fig. 26.10.
ii. Microwave Remote Sensing:
The microwave portion of the spectrum includes wavelengths within the approximate range of 1 mm to 1 m. Two distinctive features characterise microwave energy from a remote sensing standpoint. First, microwaves are capable of penetrating the atmosphere under virtually all conditions.
Depending on the wavelengths involved, microwave energy can ‘see through’ haze, light rain and snow, clouds, smoke and so on. Second, microwave reflections or emissions from earth materials bear no direct relationship to their counterparts in the visible or thermal portions of the spectrum.
For example, surfaces that appear rough in the visible portion of the spectrum may be ‘smooth’ as seen by microwaves. Such responses afford us a markedly different ‘view’ of the environment, one far removed from the views experienced by sensing light or heat.
The word radar is an acronym for Radio Detection and Ranging. As its name implies, radar was developed as a means of using radio waves to detect the presence of objects and determine their range (position); its origins trace back to Heinrich Hertz’s demonstrations in the late 1880s that radio waves are reflected by solid objects.
The process entails transmitting short bursts, or pulses, of microwave energy in the direction of interest and recording the strength and origin of the echoes or ‘reflections’ received from objects within the system’s field of view.
Radar systems may or may not produce images and they may be ground-based or mounted in aircraft or spacecraft. A common form of non-imaging radar is the type used to measure vehicle speeds.
These systems are termed Doppler radar systems because they use Doppler-effect frequency shifts between the transmitted and return signals to determine an object’s velocity. Doppler frequency shifts are a function of the relative velocities of a wave transmitter and a reflector. For example, we perceive Doppler shifts in sound waves as changes in pitch, as in the case of a passing car horn or a train whistle.
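For a sense of the magnitudes involved, the sketch below applies the standard two-way Doppler relation (shift ≈ 2 × radial speed × carrier frequency / speed of light) with an assumed X-band carrier of 10.5 GHz; the numbers are illustrative, not the specification of any particular device.

```python
C = 3.0e8           # speed of light, m/s
F_CARRIER = 10.5e9  # assumed X-band carrier frequency, Hz

def doppler_shift_hz(radial_speed_m_s: float) -> float:
    """Two-way Doppler shift for a reflector moving radially at the given speed."""
    return 2.0 * radial_speed_m_s * F_CARRIER / C

def speed_from_shift(shift_hz: float) -> float:
    """Invert the relation to recover the reflector's radial speed (m/s)."""
    return shift_hz * C / (2.0 * F_CARRIER)

print(f"{doppler_shift_hz(30.0):.0f} Hz")    # a vehicle at 30 m/s -> ~2100 Hz
print(f"{speed_from_shift(2100.0):.1f} m/s") # and back again -> 30.0 m/s
```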
The spatial resolution of a radar system is determined, among other things, by the size of its antenna. For any given wavelength, the larger the antenna, the better the spatial resolution. On an aircraft, however, it is very difficult to mount a rotating antenna that is very large.
To circumvent this problem, most airborne radar remote sensing systems use an antenna fixed on the aircraft and pointed to the side; hence the terms Side-Looking Radar (SLR) and Side-Looking Airborne Radar (SLAR). SLAR systems produce continuous strips of imagery covering ground areas adjacent to the aircraft’s flight line.
2. SLAR System Operation:
The basic operating principle of an SLAR system is shown in Figs. 26.11a and 26.11b. Microwave energy is transmitted from an antenna in very short bursts or pulses.
These high-energy pulses are emitted over a time period of the order of microseconds (10⁻⁶ sec). In Fig. 26.11a the propagation of one pulse is shown by indicating the wave front locations at successive intervals. Because the tree is less reflective of radio waves than the house, a weaker response is recorded in Fig. 26.11b.
By electronically measuring the return time of signal echoes, the range, or distance, between the transmitter and reflecting objects may be determined. Since the energy propagates in air at approximately the velocity of light, c, the slant range, SR, to any given object is given by:
SR = ct/2 ……… (1)
where SR = Slant Range (Direct distance between transmitter and object)
c = speed of light (3 × 10⁸ m/sec)
t = time between pulse transmission and echo reception.
(Note that the factor of 2 enters into the equation because the time is measured for the pulse to travel both the distance to and from the target, or twice the range.)
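A minimal worked example of equation (1), with an assumed echo delay of 100 microseconds:

```python
C = 3.0e8   # speed of light, m/s

def slant_range_m(echo_delay_s: float) -> float:
    """Equation (1): SR = c*t/2; the factor 2 removes the round trip."""
    return C * echo_delay_s / 2.0

# An echo arriving 100 microseconds after the pulse was transmitted
# (an assumed value) places the reflector ~15 km away along the slant direction.
print(slant_range_m(100e-6))   # 15000.0 m
```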
This principle of determining distance by electronically measuring the transmission-echo time is central to all imaging radar systems. The manner in which SLAR images are created is illustrated in Fig. 26.12. As the aircraft advances, the antenna is continuously repositioned in the flight direction at the aircraft velocity V.
The antenna is switched from a transmitter to a receiver mode by a synchronizer switch. Each transmitted pulse returns echoes from terrain features occurring along a single antenna beam width.
The echoes are received by the airborne antenna and processed to produce an amplitude/time video signal. This signal is used to generate an image product in a film recorder. The signal modulates the intensity of a single-line cathode ray tube, exposing an image line on the film.
3. Spatial Resolution of SLAR Systems:
The ground resolution cell size of an SLAR system is determined by two independent sensing system parameters:
Pulse length and antenna beam width. The pulse length of the radar signal is determined by the length of time that the antenna emits its burst of energy.
Range Resolution:
For an SLAR system to image separately two ground features that are close to each other in the range direction, it is necessary for all parts of the two objects’ reflected signals to be received separately by the antenna. Any time overlap between the signals from two objects will cause their images to be blurred together.
This concept is illustrated in Fig. 26.13. If the slant-range distance between A and B were anything greater than half the transmitted pulse length, the two signals would be received separately, resulting in two separate image responses. Thus, the slant-range resolution of an SLAR system is independent of the distance from the aircraft and is equal to half the transmitted pulse length.
Although the slant-range resolution of an SLAR system does not change with distance from the aircraft, the corresponding ground distance does. As shown in Fig. 26.14, the ground resolution in the range direction varies inversely with the cosine of the angle between the horizontal ground plane and the line from the antenna to the object, called the depression angle, θd.
Accounting for the depression angle effect, the ground resolution in the range direction, Rr, is found from:
Rr = cτ / (2 cos θd) ……… (2)
where τ is the pulse duration and θd is the depression angle.
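The following sketch evaluates equation (2) for an assumed 0.1-microsecond pulse at a few depression angles, showing how the ground-range cell grows as the depression angle increases (i.e. towards near range); the pulse duration and angles are illustrative values only.

```python
import math

C = 3.0e8   # speed of light, m/s

def ground_range_resolution_m(pulse_s: float, depression_deg: float) -> float:
    """Equation (2): ground resolution cell size in the range direction."""
    return C * pulse_s / (2.0 * math.cos(math.radians(depression_deg)))

# An assumed 0.1-microsecond pulse gives a 15 m slant-range resolution;
# projected onto the ground it degrades as the depression angle increases.
for angle in (20.0, 45.0, 70.0):
    print(f"depression {angle:4.1f} deg -> {ground_range_resolution_m(1e-7, angle):5.1f} m")
```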
Azimuth Resolution:
As shown in Fig. 26.15, the resolution of an SLAR system in the azimuth direction, Ra, is determined by the angular beam width, β, of the antenna and the ground range, GR. As the antenna beam ‘fans out’ with increasing distance from the aircraft, the azimuth resolution deteriorates.
Objects at points A and B would be resolved (imaged separately) at the nearer ground range, GR1, but not at the farther ground range, GR2. That is, at distance GR1, A and B result in separate return signals; at distance GR2, A and B would be in the beam simultaneously and would not be resolved.
The beam width of the antenna of an SLAR system is directly proportional to the wavelength of the transmitted pulses and inversely proportional to the length of the antenna, L.
That is, for any given wavelength, antenna beam width can be controlled by one of two different means:
(1) By controlling the physical length of the antenna and
(2) By synthesizing an effective length of the antenna.
Those systems wherein beam width is controlled by the physical antenna length are called brute force, real aperture or non-coherent radars. The antenna in a brute force system must be many wavelengths long for the antenna beam width to be narrow.
For example, to achieve even a 10-milliradian beam width with a 50 mm wavelength radar, a 5 m antenna is required [(50 × 10⁻³ m)/(10 × 10⁻³) = 5 m]. To obtain a beam width of 2 milliradians, we would need an antenna 25 m long! Obviously, the antenna length requirements of brute force systems present considerable logistical problems when fine resolutions are sought.
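The brute-force trade-off can be sketched numerically as follows; the 10 km ground range is an assumed value used only to convert beam width into an azimuth cell size (azimuth resolution ≈ beam width × ground range).

```python
# Real-aperture (brute force) trade-off: beam width ~ wavelength / antenna length,
# and azimuth cell size ~ beam width x ground range.  Values are illustrative.
WAVELENGTH_M = 0.05   # 50 mm transmitted wavelength, as in the example above

def beam_width_rad(antenna_length_m: float) -> float:
    return WAVELENGTH_M / antenna_length_m

def azimuth_cell_m(antenna_length_m: float, ground_range_m: float) -> float:
    return beam_width_rad(antenna_length_m) * ground_range_m

print(f"beam width, 5 m antenna  : {beam_width_rad(5.0) * 1000:.0f} mrad")   # 10 mrad
print(f"azimuth cell at 10 km    : {azimuth_cell_m(5.0, 10_000.0):.0f} m")   # ~100 m
print(f"azimuth cell, 25 m ant.  : {azimuth_cell_m(25.0, 10_000.0):.0f} m")  # ~20 m
```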
4. SAR System:
Brute force systems enjoy relative simplicity of design and data processing. Because of resolution problems, however, their operation is often restricted to relatively short-range, low-altitude operation and the use of relatively short wavelengths.
These restrictions are unfortunate because short-range, low-altitude operation limits the area of coverage obtained by the system, and short wavelengths experience the most atmospheric attenuation and dispersion.
The deficiencies of brute force operation are overcome in synthetic aperture (or coherent) radar systems. These systems employ a short physical antenna, but through modified data recording and processing techniques they synthesise the effect of a very long antenna.
The result of this mode of operation is a very narrow effective antenna beam width, even at far ranges, without requiring a physically long antenna or a short operating wavelength. For example, in a synthetic aperture system, a 2 meter antenna can be made, effectively, 600 m long!
The basic principle of synthetic aperture SLAR operation is illustrated in Fig. 26.16. In essence, return signals from the center portion of the beam-width are discriminated by detecting Doppler frequency shifts. Recall that a Doppler shift is a change in wave frequency as a function of relative velocities of a transmitter and a reflector.
Within the wide antenna beam, returns from features in the area ahead of the aircraft (Fig. 26.16) will have upshifted (higher) frequencies resulting from the Doppler effect. Conversely, returns from the area behind the aircraft will have downshifted (lower) frequencies.
Returns from features near the centerline of the beam width will experience little or no frequency shift. By processing the return signals according to their Doppler shifts, a very small effective beam width can be generated.
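As an illustration of this Doppler discrimination, the sketch below evaluates the two-way Doppler shift f_d ≈ (2V/λ) cos θ, where θ is the angle between the flight direction and the line of sight to a scatterer; the aircraft speed and wavelength are assumed values, not those of any particular SAR.

```python
import math

V = 200.0           # assumed aircraft speed, m/s
WAVELENGTH = 0.03   # assumed 3 cm (X-band) wavelength, m

def doppler_hz(angle_from_flight_deg: float) -> float:
    """Two-way Doppler shift of a ground scatterer at the given look angle."""
    return (2.0 * V / WAVELENGTH) * math.cos(math.radians(angle_from_flight_deg))

for angle in (80.0, 90.0, 100.0):
    print(f"{angle:5.1f} deg from flight line : {doppler_hz(angle):+8.1f} Hz")
# ahead of broadside -> up-shifted; broadside -> ~0 Hz; behind -> down-shifted
```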
The details of synthetic aperture signal recording and processing are very complex and, therefore, beyond the scope of this discussion. The basic concept is to record both the amplitude and the frequency of signals returned from objects throughout the time period in which they are within the beam of the moving antenna.
The frequency information is obtained by comparing the reflected return signals with a controlled frequency ‘reference signal’ generated internally by the system.
The comparison is made by having the ground-reflected signal interfere with the reference signal. These signals interfere in various patterns, depending on their comparative frequencies. The patterns are generally recorded photographically, resulting in a ‘signal film’ (magnetic tape recording may also be used).
Because the signals received by a synthetic aperture system are recorded over a long time period, the aircraft translates the real antenna over a correspondingly long distance. This distance becomes the length of the ‘synthetic’ antenna, and with this long effective antenna the azimuth resolution is greatly improved.
Of interest is that this resolution is essentially independent of range because at long range an object is in the beam longer, hence returns from it are recorded over a longer distance.
5. Transmission Characteristics of Radar Signals:
The two primary factors influencing the transmission characteristics of the signals from any given radar system are the wavelength and the polarization of the energy pulse used. Table 26.4 lists the common wavelength bands used in pulse transmission.
The letter codes for the various bands (K, X, L etc.) were originally selected arbitrarily to ensure military security during the early stages of development of radar. They have continued in use as a matter of convenience and various authorities designate the various bands in slightly different wavelength ranges.
Naturally, the wavelength of a radar signal determines the extent to which it is attenuated and/or dispersed by the atmosphere. Serious atmospheric effects on radar signals are confined to the shorter operating wavelengths (less than 3 cm). Even at these wavelengths, under most operating conditions the atmosphere only slightly attenuates the signals.
As one would anticipate, attenuation generally increases as the operating wavelength decreases, and the influence of clouds and rain is variable. Whereas radar signals are relatively unaffected by clouds, echoes from heavy precipitation can be considerable.
Precipitation echoes are proportional, for a single drop, to the quantity D⁶/λ⁴, where D is the drop diameter and λ the wavelength. At the same time, the effect of rain is negligible at operating wavelengths greater than 3 cm.
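To see why the 3 cm figure matters, the sketch below compares the relative echo from a single large raindrop at several wavelengths, assuming the D⁶/λ⁴ dependence noted above; the drop size and wavelengths are arbitrary example values.

```python
def relative_rain_echo(drop_diameter_mm: float, wavelength_cm: float) -> float:
    """Echo from one drop, arbitrary units, assuming the D**6 / wavelength**4 dependence."""
    return drop_diameter_mm ** 6 / wavelength_cm ** 4

baseline = relative_rain_echo(2.0, 1.0)     # a 2 mm drop seen at a 1 cm wavelength
for wavelength_cm in (1.0, 3.0, 10.0, 30.0):
    ratio = relative_rain_echo(2.0, wavelength_cm) / baseline
    print(f"{wavelength_cm:5.1f} cm : {ratio:.6f}")
# Relative to 1 cm, the echo drops to ~1/81 at 3 cm and ~1/10,000 at 10 cm.
```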
So, short of a very heavy rainstorm, radar can be used through clouds, smoke, or fog. This feature, combined with day/night operation capability, makes radar a particularly valuable tool when time-dependent operations are undertaken.
Irrespective of wavelength, radar signals can be transmitted and/or received in different modes of polarisation, that is, the signal can be filtered in such a way that its electrical wave vibrations are restricted to a single plane perpendicular to the direction of wave propagation. (Un-polarized energy vibrates in all directions perpendicular to that of propagation).
A SLAR signal can be transmitted in either a horizontal (H) or a vertical (V) plane. Likewise, it can be received in either a horizontal or vertical plane.
Thus we have the possibility of dealing with four different combinations of signal transmission and reception: H send, H receive; H send, V receive; V send, H receive; and V send, V receive. Like-polarized imagery is obtained from the HH or VV combinations, and cross-polarized imagery from the HV or VH combinations.
Since various objects modify the polarization of the energy they reflect to varying degrees, the mode of signal polarization influences how the objects look on the resulting imagery.
6. Terrain Characteristics Influencing Radar Returns:
There is a host of terrain characteristics that work hand-in-hand with the wavelength and polarisation of radar signals to determine the intensity of radar returns from various objects. These factors are many, varied, and complex.
Although many theoretical models have been developed to describe how various objects reflect radar energy, most of our practical knowledge of the subject has been derived from empirical observation. It has been found that the primary factors influencing an object’s return signal intensity are its geometrical and electrical characteristics.
Geometrical Characteristics:
One of the most readily apparent features of radar imagery is its side-lighted character when terrain of varying relief is imaged. This arises through variation in the relative sensor terrain geometry for various terrain orientations as illustrated in Fig. 26.17.
Local variations in terrain slope result in varying angles of signal incidence. This, in turn, results in relatively high returns from slopes facing the sensor and relatively low returns, or no returns, from slopes facing away from the sensor.
In Fig. 26.17 the return strength versus time graph has been positioned over the terrain such that the signals can be correlated with the feature that produced them. Above the graph is the corresponding image line, in which the signal strength has been converted schematically to brightness values. The response from this radar pulse initially shows a high return from the slope facing the sensor.
This is followed by a period of no return signal from areas blocked from illumination by the radar wave. This radar shadow is completely black and sharply defined, unlike shadows in photography, which are weakly illuminated by energy scattered by the atmosphere. Following the shadow, a relatively weak response is recorded from the terrain that is not oriented towards the sensor.
The effect of the relative sensor/object geometry on the intensity of radar return signals is compounded by the effect of surface roughness. The roughness of an object’s surface is a function of its relief variations in relation to the wavelength of the reflected energy.
Surfaces with roughness essentially equal to or greater than the transmitted wavelength appear ‘rough’. As shown in Fig. 26.18, rough surfaces tend to act as diffuse reflectors and scatter the incident energy in all directions, returning only a small portion of it to the antenna.
Targets with surface roughness much less than the wavelength of the radar energy are ‘smooth’ specular reflectors of the energy. As shown in Fig. 26.18a & b, a smooth surface generally reflects most of the energy away from the sensor resulting in a low return signal.
However, the sensor- object orientation must be considered as well, since a smooth surface oriented towards the sensor would result in a very intense return signal. A particularly bright response results from a corner reflector as illustrated in Fig. 26.18c.
In this case, adjacent smooth surfaces cause a double reflection that yields a very high return. Because the corner reflectors generally cover only small areas of the scene, they often appear as bright ‘sparkles’ on the image.
It is worth noting that some features, such as corn fields, might appear rough in both the visible and the microwave portions of the spectrum. Other surfaces, such as roadways, may be diffuse reflectors in the visible region but specular reflectors of microwave energy. In general, SLAR images manifest many more specular surfaces than photographs do.
Electrical Characteristics:
The electrical characteristics of terrain features work closely with their geometrical characteristics to determine the intensity of radar returns. One measure of an object’s electrical character is the complex dielectric constant. This parameter is an indication of the reflectivity and conductivity of various materials.
In the microwave region of the spectrum, most natural materials have a dielectric constant in the range of 3 to 8 when dry. On the other hand, water has a dielectric constant of approximately 80. Thus, the presence of moisture in either soil or vegetation can significantly increase radar reflectivity.
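A highly simplified way to see the effect of this contrast is the normal-incidence Fresnel reflectance, sketched below; it ignores surface roughness, incidence angle, and losses, so it only indicates the trend, and the dielectric values used are the representative figures quoted above.

```python
import math

def fresnel_reflectance(eps: float) -> float:
    """Normal-incidence power reflectance of a smooth surface with relative
    dielectric constant eps (losses and roughness ignored)."""
    n = math.sqrt(eps)
    return ((n - 1.0) / (n + 1.0)) ** 2

for label, eps in (("dry natural material, eps ~  4", 4.0),
                   ("dry natural material, eps ~  8", 8.0),
                   ("water,                eps ~ 80", 80.0)):
    print(f"{label} : reflectance ~ {fresnel_reflectance(eps):.2f}")
# Roughly 0.11 and 0.23 for the dry materials, but ~0.64 for water.
```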
In fact, changes in radar signal strength from one material to another are often linked to changes in moisture content much more closely than they are to changes in the materials themselves. Because plants have large surface areas and often have a high moisture content they are particularly good reflectors of radar energy.
Plant canopies with their varying complex dielectric constants and their micro relief often dominate the texture of SLAR image tones. Metal objects also give high returns and metal bridges, silos, railroad tracks and poles appear as bright spots on SLAR images.
Relief Displacement:
As in line scanner imagery, relief displacement in SLAR images is one-dimensional and perpendicular to the flight line. However, unlike scanner imagery and photography, the direction of relief displacement is reversed. This is because radar images display ranges, or distances, from terrain features to the antenna.
When a vertical feature is encountered by a radar pulse, the top of the feature is reached before its base. This causes a vertical feature to ‘lay over’ closer features, making it appear to lean toward the nadir. This radar layover effect, most severe at near range, is compared to photographic relief displacement in Fig. 26.19.
Terrain slopes facing the antenna at near range are often displayed with a dramatic layover effect. This occurs whenever the terrain slope is steeper than a line perpendicular to the direction of the radar pulse, expressed by its depression angle.
This condition is met by the left sides of features A and B in Fig. 26.20. As such, the tops of these slopes will be imaged before their bases, causing layover. It can be seen in the image representations that the amount of layover displacement is greatest at short range, where the depression angle is greater.
When the slope facing the antenna is less steep than the line perpendicular to the depression direction, as in feature D in Fig. 26.20, no layover occurs. That is, the radar pulse reaches the base of the feature before the top. The slopes of the surfaces will not be presented in true size, however. As shown in feature D, the size of the sloped surface is compressed on the image.
This foreshortening effect gets more severe as the slope’s steepness approaches perpendicularity to the depression direction. In feature C, the front slope is precisely perpendicular to the depression direction and it can be seen that the image of the front slope has been foreshortened to zero length.
Foreshortening and layover are obviously interrelated with the previously described phenomenon of radar shadow. Slopes facing away from the radar may or may not be shadowed, depending on their steepness. The right side of feature A faces away from the aircraft, but it is less steep than the depression angle and will, therefore, be illuminated by the radar pulse.
This illumination will be weak, causing a fairly dark image area. The right side of feature B is parallel to the depression direction and will, therefore, not be illuminated. As a result, the antenna will receive no return signals for a period of time and the image area will be black.
When a slope faces away from the aircraft and is steeper than the depression angle, as in features C and D, the area of non-illumination will extend beyond the sloped area, masking downrange features in a radar shadow. As shown in Fig. 26.20, the shadow length increases with range because of the decrease in depression angle.
Thus, a feature that casts an extensive shadow at far range (D) can be completely illuminated at close range (A).
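The slope conditions described above can be summarised in a small geometric test, sketched below; it classifies fore- and back-slopes from their slope angle and the local depression angle, and is a simplification rather than a radar simulator (the angles used are illustrative).

```python
def classify_fore_slope(slope_deg: float, depression_deg: float) -> str:
    """Slope facing the antenna."""
    perpendicular = 90.0 - depression_deg   # slope exactly normal to the radar pulse
    if slope_deg > perpendicular:
        return "layover (top imaged before base)"
    if slope_deg == perpendicular:
        return "foreshortened to zero length"
    return "foreshortened"

def classify_back_slope(slope_deg: float, depression_deg: float) -> str:
    """Slope facing away from the antenna."""
    if slope_deg > depression_deg:
        return "radar shadow (no return)"
    return "weakly illuminated (dark, but visible)"

# Near range (large depression angle) versus far range (small depression angle):
print(classify_fore_slope(50.0, 60.0))   # layover, as at near range
print(classify_fore_slope(50.0, 20.0))   # only foreshortened at far range
print(classify_back_slope(30.0, 60.0))   # weakly illuminated at near range
print(classify_back_slope(30.0, 20.0))   # shadowed at far range
```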
Parallax:
When an object is imaged twice from two different flight lines, differential relief displacements cause image parallax on SLAR imagery. This allows images to be viewed stereoscopically. Stereo SLAR imagery can be obtained by flying on parallel flight lines over the same area and viewing terrain features from opposite sides (Fig. 26.21).
However, because the radar side lighting effect will be reversed on the two images in the stereo-pair, stereoscopic viewing is somewhat difficult using this technique. Accordingly, stereo radar imagery is often flown using the same flight line but different altitudes. The resulting effect is called altitude parallax. In this case, the direction of illumination and the side lighting effects will be similar on both images (Fig. 26.21).