Lecture 4
LAWS OF RADIATION 
The propagation of EMR follows certain physical laws. All objects with temperatures above absolute zero emit energy. The amount of energy and the wavelengths at which it is emitted depend on the temperature of the object. As the temperature of the object increases, the total amount of energy emitted also increases, and the wavelength of maximum emission becomes shorter. 
Stefan-Boltzmann Law 
The law states that the total radiation emitted from a blackbody is proportional to the fourth power of its absolute temperature. It defines the relationship between total emitted radiation and temperature. 
M = σT⁴
Where, M is the total energy emitted per unit area of the body; σ is the Stefan-Boltzmann constant, 5.67 × 10⁻⁸ W m⁻² K⁻⁴; T is the absolute temperature of the body.
This law shows that hot bodies emit more energy per unit area than cool bodies.
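As a quick illustration (not from the original text), the short sketch below evaluates M = σT⁴ for two assumed round temperatures, roughly the Sun's surface (6000 K) and the Earth's surface (300 K).

```python
# Minimal sketch of the Stefan-Boltzmann law: M = sigma * T**4.
# The two temperatures are assumed round values for the Sun and the Earth.

SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4


def total_exitance(temperature_k):
    """Total energy emitted per unit area (W m^-2) by a blackbody at T kelvin."""
    return SIGMA * temperature_k ** 4


for name, t in [("Sun (~6000 K)", 6000.0), ("Earth (~300 K)", 300.0)]:
    print(f"{name}: M = {total_exitance(t):.3e} W m^-2")
```

The hotter body emits about 160,000 times more energy per unit area, which is simply (6000/300)⁴.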
Wien's Law 
Wien's law states that the dominant wavelength, i.e. the wavelength at which blackbody radiation reaches its maximum (λmax), is related to the temperature of the body. 
λmax = a/T
Where, 'a' is a constant with a value of 2898 µm K; T is the absolute temperature of the blackbody in K.
Thus, for a blackbody, the wavelength at which the maximum spectral radiant exitance occurs varies inversely with its absolute temperature: as the temperature rises, the wavelength of maximum emittance shifts to shorter wavelengths. 
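A minimal numerical check of Wien's law, using a = 2898 µm K from above and the same assumed temperatures of 6000 K and 300 K:

```python
# Wien's displacement law: lambda_max = a / T, with a = 2898 micrometre-kelvin.

A_WIEN = 2898.0  # micrometre * K


def wavelength_of_max_emission_um(temperature_k):
    """Wavelength of maximum emission (micrometres) for a blackbody at T kelvin."""
    return A_WIEN / temperature_k


print(wavelength_of_max_emission_um(6000.0))  # ~0.48 um: visible light (the Sun)
print(wavelength_of_max_emission_um(300.0))   # ~9.7 um: thermal infrared (the Earth)
```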


Kirchhoff's Law
Kirchhoff's law states that the ratio of emitted radiation to absorbed radiation is the same for all bodies at the same temperature. This law forms the basis for the definition of emissivity (ε), the ratio between the emittance of a given object (M) and that of a blackbody (Mb) at the same temperature: 
Emissivity (ε) = Emittance of the body / Emittance of a blackbody = M / Mb
The emissivity of a true blackbody is one and that of a perfect reflector (white body) is zero.
Blackbodies and white bodies are laboratory concepts realized only under ideal conditions. In nature, all objects have emissivities that fall between zero and one and are called grey bodies. For these bodies, emissivity is a measure of their effectiveness as radiators of electromagnetic energy.
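The two previous laws are commonly combined for grey bodies as M = εσT⁴. The sketch below is only an illustration of that relation; the emissivity value of 0.95 is an assumed figure, not one quoted in the text.

```python
# Grey-body exitance: M = emissivity * sigma * T**4, with 0 < emissivity < 1.

SIGMA = 5.67e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
EMISSIVITY = 0.95  # assumed emissivity of a natural (grey-body) surface
T = 300.0          # assumed surface temperature, K

blackbody_m = SIGMA * T ** 4           # what a perfect blackbody would emit
greybody_m = EMISSIVITY * blackbody_m  # what the grey body actually emits
print(f"blackbody: {blackbody_m:.1f} W m^-2, grey body: {greybody_m:.1f} W m^-2")
```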
Planck’s Law 
The spectral exitance, i.e. the total energy radiated in all directions by a unit area in unit time within a spectral band, of a blackbody is given by Planck's law. 
Q = hν
Where, h is Planck's constant (6.6256 × 10⁻³⁴ J s); ν is the frequency of the radiation.
The spectral exitance of a blackbody is not the same at all wavelengths; it is low at very short and very long wavelengths. The law also indicates that a blackbody at a higher temperature emits more radiation than a blackbody at a lower temperature at all wavelengths.
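Because the text quotes the quantum relation Q = hν, a small worked example (with assumed wavelengths and ν = c/λ) shows why shorter wavelengths carry more energy per quantum:

```python
# Energy of a single quantum: Q = h * nu, with nu = c / lambda.

H = 6.626e-34  # Planck's constant, J s
C = 3.0e8      # speed of light, m s^-1


def photon_energy_joules(wavelength_um):
    """Energy (J) of one quantum of radiation at the given wavelength in micrometres."""
    wavelength_m = wavelength_um * 1e-6
    return H * C / wavelength_m


print(photon_energy_joules(0.5))   # visible (blue-green) photon, ~4.0e-19 J
print(photon_energy_joules(10.0))  # thermal-infrared photon,    ~2.0e-20 J
```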
In remote sensing we are interested in the following wavelength ranges.
1. Visible                                    0.4 - 0.7 µm
   a) Blue                                    0.4 - 0.5 µm
   b) Green                                   0.5 - 0.6 µm
   c) Red                                     0.6 - 0.7 µm
2. Infrared                                   0.7 - 30.0 µm
   a) Near infrared (NIR)                     0.7 - 1.3 µm
   b) Middle infrared (MIR)                   1.3 - 3.0 µm
   c) Far infrared (FIR, thermal infrared)    3.0 - 30 µm
3. Microwaves                                 1 mm - 1 m
Most common sensing systems operate in one or several of the visible, IR or microwave portions of the spectrum. 
In the infrared region, only the thermal infrared is directly related to the sensation of heat; near- and middle-infrared energy is not. 
REMOTE SENSING MEASUREMENTS 
In remote sensing we measure the intensities of the reflected and emitted radiation from target surfaces or objects. Because different wavelengths of radiation interact characteristically with different materials, we get characteristic variations in the measurements. Three important types of variation, which form the basis of information about the object, are: 
1. Spectral variation: changes in the intensity of reflected or emitted radiation with wavelength. 
2. Spatial variation: changes in the intensity of reflected or emitted radiation with location, due to variations in the material composition or surface topography of the target. 
3. Temporal variation: changes in the intensity of reflected or emitted radiation with time, due to the dynamic characteristics of the target surface, e.g. vegetation cover. 
In order to derive information from the objects we have to measure these variations and relate them to the processes of known objects or phenomena. 


INTERACTION OF EMR WITH ATMOSPHERE
Electromagnetic radiation (EMR), while travelling from the source to the surface of the earth and then from there to the sensors on board the satellite, comes in contact with the atmospheric constituents and interacts with them. Atmospheric constituents such as dust particles, smoke and gases affect the incoming radiation. The main interactions in the atmosphere are scattering (Fig. 3.1), absorption and refraction. 
Scattering 
Scattering is the redirection of EMR in different directions. It occurs in the presence of dust particles and gas molecules in the atmosphere. The effect of scattering is to redirect the incoming radiation back to space as well as towards the earth's surface (Fig. 3.2). There are three types of scattering, depending on the size of the particles in relation to the wavelength: 
1. Rayleigh scattering
2. Mie scattering 
3. Non-selective scattering 
1. Rayleigh Scattering 
It occurs when the particles are very small compared to the wavelength of the radiation, such as fine dust particles and nitrogen and oxygen molecules. Rayleigh scattering causes shorter-wavelength energy to be scattered more than longer wavelengths. It is the dominant scattering in the upper atmosphere. The blue colour of the sky and the red and orange colours at sunrise and sunset are due to Rayleigh scattering. 
2. Mie Scattering
Mie scattering occurs when the atmospheric particles are about the same size as the wavelength of the radiation. These particles include dust, pollen, smoke and water droplets. Mie scattering occurs mostly in the lower atmosphere (0-5 km), where larger particles are more abundant. It influences a broad range of wavelengths in and near the visible region. 
3. Non-selective Scattering 
Non-selective scattering occurs when the particle sizes are larger than the wavelength of the radiation. The particles may be dust and water droplets. This scattering does not depend on the wavelength of the radiation, and it is what causes fog and clouds to appear white. 
Absorption 
This phenomenon occurs when the atmospheric constituents absorb the energy passing through the atmosphere. Gases such as ozone (O3), carbon dioxide (CO2) and water vapour (H2O) absorb radiation in the atmosphere. Ozone absorbs UV radiation, CO2 absorbs radiation in the FIR portion of the spectrum, and water vapour absorbs incoming IR and microwave radiation. 
Transmission or Atmospheric Window 
Some radiation that is neither absorbed nor scattered is transmitted through the atmosphere. The transparency of the atmosphere to such radiation is known as an atmospheric window. Atmospheric windows are the regions of the EMS for which the atmosphere is transparent, i.e. these wavelengths are easily transmitted through the atmosphere. These are the useful regions for remote sensing purposes. The major atmospheric windows available for remote sensing are given in Table 3.1. 
Refraction 
Refraction is the bending of light rays at the interface between two media. When light enters a medium of different density, it changes its direction. In the atmosphere, light bends as it passes through atmospheric layers of varying clarity, humidity and temperature. These variations influence the density of the atmospheric layers, and the light is bent as it passes from a layer of one density into a layer of another density. 
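The bending described here follows Snell's law, n₁ sin θ₁ = n₂ sin θ₂. The sketch below is a generic illustration; the refractive indices are assumed values, not figures from the text.

```python
import math

# Snell's law: n1 * sin(theta1) = n2 * sin(theta2).


def refracted_angle_deg(n1, n2, incidence_deg):
    """Direction of the refracted ray (degrees from the normal)."""
    sin_theta2 = n1 * math.sin(math.radians(incidence_deg)) / n2
    return math.degrees(math.asin(sin_theta2))


# Light entering a slightly denser atmospheric layer bends towards the normal.
print(refracted_angle_deg(1.00020, 1.00030, 45.0))  # a little less than 45 degrees
```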
INTERACTION OF EMR WITH EARTH SURFACE 
The wavelengths of EMR that are useful in environmental remote sensing are:
1. Reflected radiation in the visible, NIR, MIR and microwave bands. 
2. Emitted radiation in the MIR and thermal IR wavebands. 
There are three main components of remotely sensed scenes: vegetation, soil and water. The processes involved in the interaction of EMR with the earth's surface are reflection, absorption and transmission. 
Radiation that is not absorbed or scattered in the atmosphere reaches and interacts with the earth's surface. According to the law of conservation of energy, energy can neither be created nor destroyed; it can only be converted or transformed into another form. Thus, a portion of the incident radiation is reflected by the surface, a portion is transmitted into the surface and a portion is absorbed by the surface (Fig. 3.3). Different features on the earth's surface have different values of spectral reflectance, absorptance and transmittance, on the basis of which they can be identified. The sum of the three proportions is unity, but the magnitude of each component depends on the nature of the surface and hence differs from feature to feature. 

                                      Iλ = Rλ + Tλ + Aλ
Where, Iλ is the incident radiation; Rλ is the reflected radiation; Tλ is the transmitted radiation; Aλ is the absorbed radiation. 
If the magnitudes of the reflected, absorbed or transmitted spectral radiance are very different for different surfaces on the earth, then we can identify those features on the basis of their spectral properties. 
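Dividing the balance Iλ = Rλ + Tλ + Aλ by Iλ gives the reflectance, transmittance and absorptance as fractions that sum to one. The sketch below, with purely assumed numbers, simply checks this bookkeeping:

```python
# Energy balance at one wavelength: I = R + T + A (values assumed for illustration).

incident = 100.0     # incident radiation, arbitrary units
reflected = 45.0
transmitted = 5.0

absorbed = incident - reflected - transmitted  # A = I - R - T
reflectance = reflected / incident             # fraction of incident energy reflected
print(absorbed, reflectance)                   # 50.0 and 0.45
```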
Of all the interactions in the reflective regions “surface reflections” are the most useful in remote sensing applications. 
It can be seen from Table 3.1 that the major atmospheric windows lie in the visible, infrared and microwave regions. 
Reflection             
Reflection occurs when radiation is redirected from a non-transparent surface. Reflection depends on the roughness or smoothness of the surface in relation to the wavelength of the radiation. According to the Rayleigh criterion, if the surface height variations are less than λ/8 the surface is considered smooth, otherwise it is rough (a simple check based on this criterion is sketched after the two reflection types below). Accordingly there are two types of reflection: 
(i) Specular and (ii) Diffuse reflection 
·        Specular reflection: If the surface is smooth relative to the wavelength, specular reflection occurs, which follows the law of reflection. It occurs with surfaces such as a mirror, polished metal and a calm water body. This type of reflection is undesirable in remote sensing. 
·        Diffuse reflection: Diffuse reflection occurs when the surface is rough in relation to the wavelength. In this case energy is reflected almost uniformly in all directions. Diffuse reflections are useful in remote sensing. In nature, mixed reflections occur most frequently. 
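A simple check of the Rayleigh criterion as stated above (height variation less than λ/8 means the surface behaves as a smooth, specular reflector); the height variation and wavelengths used are assumed examples.

```python
# Rayleigh criterion (simplified form used in the text): smooth if h < lambda / 8.


def is_smooth(height_variation_m, wavelength_m):
    """True if the surface acts as a specular (smooth) reflector at this wavelength."""
    return height_variation_m < wavelength_m / 8.0


# A 5 mm height variation: rough for visible light, smooth for ~23.5 cm L-band radar.
print(is_smooth(0.005, 0.5e-6))  # False
print(is_smooth(0.005, 0.235))   # True
```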
Absorption and Transmission 
Absorption occurs when the object absorbs the radiation. Transmission occurs when the radiation passes through an object or target. 
RESOLUTIONS 
The resolution of a system refers to its ability to record and display fine detail. Images are described in terms of their scale as well as their resolution. In remote sensing we need three different types of information: spatial, spectral and radiometric (intensity) information. Accordingly, sensor systems vary in their principles of detection and construction, and different types of sensor systems are used to acquire these different kinds of information. 
Types of Resolution
In remote sensing there are four types of resolution 
1. Spatial resolution
2. Spectral resolution
3. Radiometric resolution and
4. Temporal resolution
Spatial Resolution 
Spatial resolution refers to the size of the smallest feature that can be detected. It depends on the IFOV of the sensor. In many remote sensors, a small elemental area is observed at a time, and this field of view of the sensor is called the Instantaneous Field of View (IFOV). However, although the spatial resolution has a bearing on the IFOV, it does not depend only on the IFOV. Various other factors, such as the satellite altitude, the relative motion between the IFOV and the ground during the 'dwell time' (the time for which the sensor looks at the elemental area), the sampling frequency of the measurement, and the characteristics of all the subsystems of the sensing system, contribute significantly to the overall spatial resolution of the system. Spatial resolution decides the smallest size of the observable picture element, or pixel (under a given state of the art of detector technology). The spatial resolution of remote sensing sensors is given in terms of the pixel dimension. A pixel can be square or rectangular in shape.
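As a rough illustration of how the IFOV and the platform altitude together fix the ground footprint, the sketch below uses the small-angle relation (ground cell ≈ altitude × IFOV in radians); the altitude and IFOV values are assumed, not taken from any particular sensor.

```python
# Ground-projected IFOV (small-angle approximation): D = H * beta.


def ground_resolution_m(altitude_m, ifov_rad):
    """Approximate side length (m) of the ground resolution cell."""
    return altitude_m * ifov_rad


# Assumed example: a sensor at ~900 km altitude with an IFOV of 0.1 milliradian.
print(ground_resolution_m(900e3, 0.1e-3))  # 90.0 m
```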
Spectral Resolution 
The radiation reaching the remote sensor from the earth's surface covers the entire electromagnetic spectrum. Spectral resolution is the ability of the sensor to resolve the energy received in a given spectral bandwidth so as to characterize different constituents of the earth's surface. Thus the spectral resolution is defined by the spectral bandwidth of the filter and the sensitivity of the detector. For example, the multispectral scanner system onboard the Landsat satellite had the capability to resolve the earth's surface features at 80 m spatial resolution using four spectral bands, viz. 0.5-0.6 µm, 0.6-0.7 µm, 0.7-0.8 µm and 0.8-1.1 µm. The last band has a bandwidth of 0.3 µm as opposed to the 0.1 µm of the other bands. As the incoming solar radiation in this near-infrared spectral region is small compared to the other bands, the bandwidth had to be increased three times in order to maintain the spatial resolution at 80 m as well as the required signal-to-noise ratio. Had the spectral bandwidth of the fourth band been kept at 0.1 µm, the spatial resolution would have had to be much coarser than 80 m to give the same signal-to-noise ratio.
The Thematic Mapper (TM) of the Landsat satellite has seven spectral bands, viz. 0.45-0.52 µm, 0.52-0.60 µm, 0.63-0.69 µm, 0.76-0.90 µm, 1.55-1.75 µm, 10.4-12.5 µm and 2.08-2.35 µm. With the upgradation of the technology used in the multispectral scanner design of the earlier satellites, the spatial resolution could be improved to 30 m even with a reduction in the spectral bandwidth in the visible and reflected infrared regions of the electromagnetic spectrum, i.e. TM bands 1 to 5 and band 7. In the case of the TM thermal band (10.4 to 12.5 µm), the energy had to be integrated over a 16 times larger area (i.e. 120 m) as well as over a bandwidth of 2.1 µm to provide an acceptable signal-to-noise ratio. This is because the energy emitted by the earth is small, the average surface temperature of the earth being only about 300 K in comparison with the sun's surface temperature of 6000 K, even when the differences in distance are accounted for. 
The Linear Imaging Self-Scanning Sensor (LISS) onboard the Indian Remote Sensing satellites (IRS-1A, 1B) has four spectral bands, viz. 0.45-0.52 µm, 0.52-0.59 µm, 0.62-0.68 µm and 0.77-0.86 µm. 
In the LISS system an array of 2048 charge-coupled device (CCD) elements is provided for each spectral band, so that a separate detector collects the signal from each pixel, instead of a scanning mirror being used to view the different pixels along a scan line as in Landsat. Due to the satellite velocity, the time available to scan a line of, say, about 185 km (the swath width of Landsat) is fixed, and with a scanning mirror this time is shared among the pixels in the scan line. In LISS, a whole line is swept like a push broom on a railway platform during a cleaning operation, wherein each detector corresponding to a pixel gets the whole of the time available for the scan line as its dwell time (the time during which the signal is integrated). This improves the quality of the signal from each pixel and also minimizes the geometric distortions caused by the non-uniform motion of the mirror in Landsat. A larger number of narrow spectral bands gives a greater ability to discriminate the various features of the earth's surface. Table 5.1 gives the sensor details and utility characteristics of the various sensing systems onboard the Landsat, IRS and French SPOT satellites. Table 5.2 gives a description of the IRS-1C and SPOT-4 satellites.
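The dwell-time advantage described above can be illustrated with rough, assumed numbers: a scanning-mirror (whisk-broom) system must share the line time among all the pixels across the swath, whereas each element of a push-broom CCD array integrates for the whole line time.

```python
# Illustrative dwell-time comparison (all numbers assumed).

line_time_s = 0.01      # time available to image one ground line
pixels_per_line = 2048  # detectors / pixels across the swath (as in the LISS array)

whisk_broom_dwell = line_time_s / pixels_per_line  # mirror visits each pixel in turn
push_broom_dwell = line_time_s                     # every CCD element sees its pixel the whole time

print(f"whisk-broom: {whisk_broom_dwell:.2e} s, push-broom: {push_broom_dwell:.2e} s")
```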
Radiometric Resolution 
The ability to distinguish fine variations in the radiance values of different objects is characterized by the radiometric resolution.
In remote sensing, the reflected radiation from different objects generates an electrical signal (say, a voltage) as the output of the detector. This analogue voltage is digitized, resulting in a digital number corresponding to the elemental area of the ground scene, or pixel. The number of levels into which the output signal can be divided is dictated by the available data bandwidth and the signal-to-noise ratio. This is similar to the number of grey shades that can be seen in a black-and-white photograph. For example, the multispectral scanner onboard the Landsat satellites has a radiometric resolution of 1/64 in all four of the spectral bands it uses. This means that 64 different values of radiance can be detected in the imagery obtained through the Landsat multispectral scanner. On the other hand, the Thematic Mapper flown on the Landsat 4 and 5 satellites had a radiometric resolution of 1/256 for all seven of the bands in which it works. For comparison, the LISS I and II (Linear Imaging Self Scanner) onboard the Indian Remote Sensing satellite IRS-1 had a radiometric resolution of 1/128. 
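The resolutions quoted above correspond to the number of bits used to digitize the detector output: 2⁶ = 64, 2⁷ = 128 and 2⁸ = 256 levels. A minimal sketch of turning an analogue voltage into a digital number (DN) at a chosen bit depth:

```python
# Quantizing a detector voltage into a digital number (DN) at a given bit depth.


def to_digital_number(voltage, full_scale_voltage, bits):
    """Map a voltage in [0, full_scale_voltage] to an integer DN in [0, 2**bits - 1]."""
    levels = 2 ** bits
    dn = int(voltage / full_scale_voltage * (levels - 1))
    return max(0, min(levels - 1, dn))


print(2 ** 6, 2 ** 7, 2 ** 8)          # 64, 128, 256 levels (MSS, LISS, TM)
print(to_digital_number(2.5, 5.0, 8))  # a mid-range signal maps to a DN of about 127
```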
For radars operating in the microwave range, the radiometric resolution is given in terms of decibels, representing the minimum signal level that can be detected with an acceptable signal-to-noise ratio. A decibel is one tenth of a bel, the logarithm of the ratio of the signal strength to a reference value. As an example, a typical synthetic aperture radar has a radiometric resolution of about 1 to 2 decibels. 
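Since a decibel is one tenth of a bel, the value in dB is 10·log₁₀ of the power ratio; the snippet below just evaluates that definition for a couple of assumed ratios.

```python
import math

# Decibels: dB = 10 * log10(signal_power / reference_power).


def to_decibels(power_ratio):
    return 10.0 * math.log10(power_ratio)


print(to_decibels(2.0))    # ~3 dB: a doubling of power
print(to_decibels(100.0))  # 20 dB
```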
Temporal Resolution 
Temporal resolution is specific to spaceborne sensors, particularly to sun-synchronous satellites. These are polar-orbiting satellites with orbital periods of roughly 100 minutes that cross the equator at the same local (solar) time in each orbit. Such an orbit offers similar sun-illumination conditions for all observations taken over different geographical locations along a given latitude in the sun-lit areas. By suitable selection of the spacecraft altitude and the inclination angle of the orbit, the spacecraft can be made to cover the same area on the earth at regular intervals. For example, Landsat 1, 2 and 3 had an orbital altitude of 918 km, an inclination of 99.114° and a repetition cycle of 18 days. For Landsat 4 and 5, with an altitude of 705 km and an inclination of 98.2°, the repetition cycle is 16 days. The Indian Remote Sensing satellites (IRS-1A and 1B) orbit at an altitude of 904 km, with an inclination of 99.02° and a repeat cycle of 22 days. With proper placement of two satellites in orbit, the repetition cycle can be reduced to half, say 11 days in the case of the IRS observation system. With such repetitive coverage, a given area on earth can be observed at regular intervals, and dynamic features such as vegetation and water resources can be very effectively studied and analyzed. This ability of a remote sensor to revisit any given area at regular intervals is defined as the temporal resolution. 


SCALE
Images can be described in terms of scale which is determined by the effective focal length of the lens of the remote sensing device, altitude of the platform and the magnification factor employed in reproducing the image. 
Generally there are three types of scale: small scale, intermediate scale and large scale. The quantitative ranges of these scales are given below: 
Small scale                > 1:500,000                  1 cm : > 5 km
Intermediate scale         1:50,000 to 1:500,000        1 cm : 0.5 to 5 km
Large scale                < 1:50,000                   1 cm : < 0.5 km
The large scale images provide more detailed information than the small scale images. 
The following scales are used at different levels: 
1 : 1,00,000                             Intermediate scale                    National level
1 : 2,50,000                             Intermediate scale                    State level 
1 : 50,000                      Intermediate scale                    District level
<1 : 8,000                      Large scale                     Village level 
Photo Scale
The photo scale of aerial or satellite imagery is computed as the ratio of the distance on the photo or map (d) to the actual distance on the ground (D) between any two known locations.
S = d/D
For photographs taken in the vertical (nadir) view, the photo scale is a function of the focal length of the camera (f), the flying height of the platform (H) and the magnification factor (M), i.e.
Image scale = Mf/H 
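A worked example with assumed values: a camera of 152 mm focal length flown at 3,040 m with unit magnification gives a photo scale of about 1:20,000.

```python
# Photo scale S = M * f / H, with f and H in the same units.


def photo_scale_denominator(focal_length_m, flying_height_m, magnification=1.0):
    """Return the denominator of the representative fraction (scale = 1 : value)."""
    scale = magnification * focal_length_m / flying_height_m
    return round(1.0 / scale)


print(photo_scale_denominator(0.152, 3040.0))  # 20000, i.e. a scale of 1:20,000
```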
In an along-track scanner, the scan direction is along the track (the direction of flight), hence the name along-track scanner (Fig. 5.7). It is also called a push-broom scanner because the detectors are analogous to the bristles of a push broom sweeping a path on the floor. 
The development of the charge-coupled device (CCD) has contributed to the successful design of the along-track scanner. In this scanner the sensor elements consist of an array of silicon photodiodes arranged in a line. There are as many silicon photodiodes as there are ground resolution cells (corresponding to the IFOV) accommodated within the restricted FOV of the sensor optics. Each silicon photodiode, in turn, is coupled to a tiny charge-storage cell in an array of integrated-circuit MOS (metal oxide semiconductor) devices, forming a charge-coupled device (CCD) (Fig. 5.8). When light from a ground resolution cell strikes a photodiode, it generates a small current proportional to the intensity of the light falling on it, and this current charges the storage cell placed behind the diode. The charged cells form part of an electronic shift register which can be activated to read out the charges stored in the cells in a sequential fashion. The output signals are correlated with the shift pulses and digitized to reconstitute the image. 
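A highly simplified toy model (not an actual sensor design) of the readout idea just described: each detector accumulates charge in proportion to the light from its ground cell, and the shift register then reads the line out sequentially.

```python
# Toy push-broom CCD line: integrate light into charge cells, then read them out
# sequentially as one image line. All values are assumed for illustration.

light_on_line = [0.2, 0.5, 0.9, 0.4]  # relative brightness of four ground cells
dwell_time = 1.0                      # integration time for the whole line
sensitivity = 100.0                   # charge generated per unit light per unit time

# Each photodiode charges its storage cell in proportion to the light it receives.
charge_cells = [light * sensitivity * dwell_time for light in light_on_line]

# The shift register reads the stored charges out one after another.
image_line = [round(charge) for charge in charge_cells]
print(image_line)  # [20, 50, 90, 40]
```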
