
International Journal of Remote Sensing, 2000, Vol. 21, No. 15, 2901–2913

Tutorial review

Stop, look and listen: auditory perception analogies for radar remote sensing

I. H. WOODHOUSE†

Department of Water Resources, Wageningen Agricultural University, Nieuwe Kanaal 11, 6709 PA Wageningen, The Netherlands; e-mail: [email protected]

†Currently at: Department of Geography, The University of Edinburgh, Edinburgh EH8 9XP, Scotland, UK.

(Received 16 October 1997; in final form 6 January 1999)

Abstract. As active microwave instruments become more advanced and the application of radar images more prolific, there is an ever-present need for a wider audience to appreciate the workings and subtleties of radar remote sensing systems. Many problems associated with the use and interpretation of such data are a consequence of users failing to properly conceptualise the radar system and the manner in which it gathers information about the target. This paper attempts to provide a conceptual framework for radar systems based on an auditory perception analogy that should allow for a clearer appreciation of the subtle differences between passive optical and active microwave systems. Through a comparison between how eyes and ears function, and by looking at their very different means of gathering information, a few of the difficult concepts that are important in gaining a deeper understanding of radar image production are discussed.

1. Introduction

Radar systems, such as altimeters, scatterometers and imaging radar systems, are now widely recognised as highly successful tools for Earth observation from aircraft and satellite. The analysis and interpretation of the measured data, however, is usually difficult, especially for advanced imaging systems such as synthetic aperture radar (SAR). The efficient and widespread application of such data therefore requires researchers to have a good grounding in the theory of radar remote sensing. Unfortunately, those new to radar often encounter initial conceptual difficulties that can lead to lasting misconceptions about the nature of the measurement process. Such misconceptions may ultimately result in later problems when attempting to extract useful information from the data.

It is this author's opinion that many of these difficulties are due to the dominance of visual metaphors within the wider remote sensing literature. The purpose of this paper therefore is to explore an alternative conceptual framework for radar systems based on auditory perception analogies. Such a framework can allow students and newcomers to form a clearer appreciation of some of the differences between passive optical and active microwave systems. Through comparisons between the different functions of eyes and ears, and their distinct means of gathering information, it is possible to clarify a few of the concepts that are important in gaining a firm understanding of radar imaging.

Those readers already familiar with radar theory will be aware that making comparisons between radar and acoustic systems is not new (see Griffin 1958 for a good example) and that much of the technical overlap between SAR and Synthetic Aperture Sonar (SAS) is still evident in the literature (e.g. Gough and Hawkins 1997). However, such comparisons are rarely, if ever, to be found in the introductory remote sensing literature. This paper therefore aims to reintroduce some of these analogies within a remote sensing context, as well as introducing some new ones that are of particular relevance to remote sensing applications.

1.1. A word of warning

This paper explores audio perception analogies to aid in the comprehension of radar imaging. It should be noted that the intention is not to make comparisons between electromagnetic waves and sound waves, and it is important to bear in mind that the audio analogy is limited. It cannot, for instance, be used to explain polarisation effects, since sound waves are longitudinal pressure waves and so carry no transverse polarisation.

1.2. Background: visual versus audio perception

Both visual and auditory perception in animals can be considered as 'remote sensing' in that they provide information about objects at a distance. In humans, sight dominates the cognitive model of our local environment, so it is not surprising that we predominantly choose a visual medium for representing a wide range of data. Unfortunately, our familiarity with such methods (graphs, maps, photographs, television, and so forth) naturally leads us to conceptualise any 2-D information within the framework of visualisation. Within remote sensing, visual metaphors such as brightness, (false) colour and shadow are common, even when referring to non-optical systems such as radar or sonar (e.g. multiple 'looks').

However, conceptual differences are apparent in some of the comparative phraseology between optical and microwave remote sensing. For instance, those working with optical systems usually refer to wavelengths, while microwave researchers commonly use frequencies. Other common radar terms, such as chirp, echo, amplification and decibel, also imply audio metaphors. Indeed, the use of the 'deci-Bel' (dB) (named after Alexander Graham Bell, the Scottish-American inventor of the telephone) originates from early research into auditory perception.

Many introductory courses and textbooks on remote sensing reinforce the visual context by introducing the eye (or a lens camera) as a simple example of a remote sensing instrument (e.g. Sabins 1987), even though such systems bear little resemblance to the workings of a radar system. An equivalent simple example rarely, if ever, precedes the description of radar systems.

There are two principal differences between visual and auditory sensing. The first is that vision inherently measures intensity (energy) as a function of direction, or look angle, whereas hearing maps intensity as a function of time or frequency.

The second, and perhaps more important, is that sound is a coherent wave phenomenon. Eyes and cameras do not utilise coherent processes, while certain features of auditory perception require inference of phase or phase differences (even if the ear does not explicitly measure the absolute phase). Echolocating bats, for instance, can use the amplitude and phase of echoes as cues for distinguishing insects from other objects.¹ These two fundamental differences are also the key distinctions between optical and radar remote sensing.

This paper is divided into two parts: the first describes the eye and the ear and examines their functional elements as analogous to certain aspects of optical and microwave radar systems. The second part introduces the echolocating bat as an example of a natural active remote sensing system, and takes a look at how an auditory framework can help to explain some of the properties that are unique to imaging radar systems.

2. Sensing systems

For Earth observation, remote sensing instruments may be described as being either an 'optical' system (λ < 1 mm: UV, visible, IR and NIR) or a microwave system (λ > 1 mm). Such a distinction is quite arbitrary, of course, and many techniques that are used in one region may also feature in the other, particularly at the cross-over between far-infrared and high-frequency (sub-millimetre) microwave systems. However, the purpose of this paper is to compare two common methodologies used in modern Earth observation: optical/IR imagers and microwave radar systems. The important distinction between the nature of these systems is that the former uses a series of lenses and/or reflectors to collect the radiation and focus it onto a detector, while the latter, because of the longer wavelengths involved, requires the use of an antenna to collect the radiation and direct the signal to a receiver system.

This distinction also characterises the different ways in which humans, and other animals, detect light and sound.

2.1. Optical systems and the eye

The schematic diagram in figure 1 represents the major elements of a typical optical sensor (the details of which may vary between different instruments).

The collector defines the maximum power that can be made available to the sensor and can be a lens or a reflecting surface (e.g. a plane or curved mirror). The focusing optics, which usually consist of numerous optical elements (lenses and/or reflectors), then focus the collected optical energy onto the detecting element or elements. In the case of the eye, it is the cornea which collects the light. The cornea, rather than the lens, also provides most of the focusing power of the eye, by virtue of its curved surface at the boundary between the air and the watery humour that fills the eye. The lens is used for making the small corrections necessary to bring objects into sharp focus.

A scanning element is used to allow wide coverage when there are few detecting elements, but it is not necessary when the detecting medium is an array or two-dimensional sensor, as is the case in photographic cameras and the eye.

Images at different spectral bands are acquired by splitting the incident wave into different spectral components using beam-splitters, filters or dispersive optics, such as a prism or grating. The detecting element is where the collected energy is finally transformed into either a chemical imprint (films) or a modulated electrical current (an array of detectors or a return beam vidicon).

¹Eptesicus (the big brown bat), for example, can use the phase of echoes to help perceive the shape of a target. From Simmons et al. (1996): 'Until recently it has seemed physiologically impossible for bats to represent the phase of echoes, but in fact it is computationally feasible even with the known limitations inherent in neural responses recorded from the bat's auditory system'.

Figure 1. A block diagram illustrating the features common to optical remote sensing systems and the eye. (In this case it is a human eye, but all mammalian eyes exhibit the same general features.)

In the eye, the light is focused by the cornea-lens system onto the retina, the detecting medium. Rather than splitting the incident light into spectral components, spectral information is obtained by having the retina composed of four different types of light-sensitive cell: the rod cells, which are panchromatic in their sensitivity and provide information only on light and dark, and three different cone cells, each sensitive to a different range of wavelengths.

An in-depth description of the human eye can be found in Feynman et al. (1989).

2.2. Microwave systems and the ear

Most microwave systems consist of three basic elements: (1) an antenna (and, when necessary, an associated scanning mechanism²), which collects incoming radiation primarily from the antenna beam direction; (2) a receiver, which detects and amplifies the collected radiation within a specified range of frequencies; and (3) a data handling system, which performs digitising and formatting functions on the received data, as well as handling calibration and other housekeeping data (Elachi 1987). In addition, active (radar) systems incorporate a pulse generator and transmitter. Figure 2 compares a schematic of a simple microwave radar system with a description of the human ear.

²The term 'scanning' is here used as a generic description for mechanical pointing.

Figure 2. A comparative block diagram illustrating the features common to both radar systems and the mechanisms of the ear. (In this case it is a human ear, but all mammalian ears exhibit the same general features.) The transmission component of the radar system is also shown for completeness, but does not relate to functions of the ear.

In a general way, the individual components of the ear perform analogous functions to each of the steps within the passive part of a microwave system (i.e. a passive radiometer, or the receiving part of an active system).

Perhaps the most obvious similarity is that the outer ear (the pinna, or auricle) is essentially an acoustic antenna, collecting the sound energy and providing some degree of directivity.³ The wavelength of sound is six orders of magnitude greater than that of light, so that for hearing an acoustic focusing mechanism analogous to the optics of the eye is not feasible, and a different mechanism must be employed. Similarly, in remote sensing systems, the wavelengths commonly used for radar imaging range from 10 mm (Ka-band) to 1 m (P-band), much larger than the wavelengths employed for visible and IR imaging (0.3–10 μm). In both cases size limitations require the use of an antenna system. The most important difference from optical systems is that an antenna is not as directional: both microwave systems and ears will collect energy from a wide range of directions, even though they may be most sensitive to signals within a small range of angles.

After the incident energy is collected by the antenna system, it is transferred via a waveguide to a low-noise amplifier. In the ear, the auditory canal acts both as a waveguide, transmitting the sound energy to the tympanic membrane (the ear drum), and as a resonant amplifier, with optimum efficiency at frequencies corresponding to the peak sensitivity of the whole ear (2–5.5 kHz).

At this stage it is usual for microwave instruments to employ a heterodyne receiver system, whereby the received signal (the radio-frequency, or RF, signal) is not directly detected but is converted to a different, and usually lower, frequency (the intermediate-frequency, or IF, signal), where it is further amplified before being detected. The reason for this approach is that signals at microwave frequencies are often difficult to deal with directly, whereas a down-converted IF signal can be handled with a variety of techniques (since amplifiers and spectrometers are easier to build for lower frequencies). The down-conversion is carried out by the mixer, in which the RF signal is combined with a constant-frequency signal generated by the local oscillator (LO). The output can then be detected and recorded as both amplitude and phase, before undergoing spectral analysis to extract the information required to construct an image.
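To make the down-conversion step concrete, the following is a minimal numerical sketch in Python. The frequencies are illustrative round numbers, not taken from any real instrument: an 'RF' tone is multiplied by a local-oscillator tone, producing sum and difference frequencies, and a crude low-pass filter keeps only the difference (IF) component.

```python
import numpy as np

# Hypothetical toy numbers: a 100 kHz 'RF' tone down-converted
# with a 90 kHz local oscillator to a 10 kHz IF.
fs = 1_000_000                      # sample rate (Hz)
t = np.arange(0, 0.01, 1 / fs)      # 10 ms of signal
f_rf, f_lo = 100_000, 90_000

rf = np.cos(2 * np.pi * f_rf * t)   # received RF signal
lo = np.cos(2 * np.pi * f_lo * t)   # local oscillator
mixed = rf * lo                     # mixer output: sum and difference tones

# A crude low-pass filter (moving average) keeps the difference
# frequency f_rf - f_lo = 10 kHz and suppresses the 190 kHz sum.
kernel = np.ones(50) / 50
if_signal = np.convolve(mixed, kernel, mode="same")

spectrum = np.abs(np.fft.rfft(if_signal))
freqs = np.fft.rfftfreq(len(if_signal), 1 / fs)
print(f"dominant IF component: {freqs[np.argmax(spectrum)] / 1e3:.1f} kHz")
```

Running this prints a dominant component at 10.0 kHz: the information carried by the RF tone survives, but at a frequency that is far easier to amplify and analyse.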

An analogous function to the mixer is performed by the ossicles of the middle ear which, in addition to providing mechanical amplification, also convert the low-intensity airborne sound waves into higher-intensity waves carried in the cochlea fluid. The resulting pressure waves in the cochlea duct exert energy along the vibrating basilar membrane, which is narrow and taut at one end and wider and more pliant at the other. The hydraulic pressure waves thus induce a wave-like ripple in the basilar membrane which travels from the taut end towards the loose end, with the high frequencies resonating most where the membrane is tight, and the low frequencies where it is slack. The position of the greatest resonance determines which nerve fibres send signals to the brain. The end result is effectively a highly efficient bank of filters which allows for the separation, with a good signal-to-noise ratio (SNR), of the various frequency components of a signal.

³The pinnae are significantly more complex than most radio antennas: the various folds, cavities and ridges in our outer ears provide important reflections at high frequencies, and result in a frequency response that is directionally dependent. Such a frequency dependence on the direction of the sound source (and additional frequency-dependent diffraction by the head and torso) provides the brain with cues as to the location of the sound relative to the head (Middlebrooks and Green 1991).


It is this impressive ability of the ear to perform spectral analysis that allows us to carry out a variety of impressive feats: a conductor identifying the one instrument within a whole orchestra that is out of tune, or a listener distinguishing a voice over background music. However, while the cochlea does behave more or less like an acoustic spectrum analyser, it also processes information in the time domain by integrating information over a period of time. In this way, the ear processes data in much the same manner as a microwave radar system.
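The filter-bank behaviour of the basilar membrane can be mimicked in a few lines of code. This is a loose sketch, not a cochlear model: the band edges and tone frequencies below are arbitrary illustrative choices, and scipy's Butterworth filters merely stand in for the membrane's mechanical tuning.

```python
import numpy as np
from scipy.signal import butter, lfilter

# A synthetic signal containing two tones is passed through a bank of
# bandpass filters; the output energy peaks in the channels containing
# each tone, separating the frequency components as the cochlea does.
fs = 16_000
t = np.arange(0, 0.1, 1 / fs)
signal = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 2200 * t)

bands = [(100, 300), (300, 900), (900, 2700), (2700, 7000)]  # Hz
for lo, hi in bands:
    b, a = butter(4, [lo, hi], btype="band", fs=fs)
    energy = np.mean(lfilter(b, a, signal) ** 2)
    print(f"{lo:>5}-{hi:<5} Hz channel energy: {energy:.4f}")
```

The 300–900 Hz and 900–2700 Hz channels report almost all of the energy, one per tone, while the other channels stay near zero.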

Readers are directed to von Békésy (1957) for a general description of the workings of the human ear, or to Allen and Neely (1992) for an in-depth description of the mechanics of the cochlea.

2.3. Noise and sensitivity

The noise level, and consequent sensitivity, of a microwave system also has analogous processes in the ear. In eyes and optical remote sensing systems, it is primarily the detecting medium that introduces noise or random errors into the final measurements (although systematic errors may arise due to imperfections in other parts of the instrument). In both microwave systems and ears, however, noise may arise from all parts of the system.

The antenna or pinna, for instance, may collect 'background' noise originating from sources far removed from the direction of interest, while the inner ear or microwave receiver will also detect signals originating from a number of other sources. The basilar membrane will pick up vibrations in the skull from such sources as teeth clicking or stomach gurgling, for example. (Fortunately, the low-frequency cut-off of the cochlea's sensitivity is just high enough not to hear the continuous sounds generated by the various other organs of the body.) In a similar way, the net IF power reaching the detector of a microwave system will include internally generated noise power.

As in any measurement system, the signal-to-noise ratio is the important measure of the instrument's performance. In this respect, for an active system, the transmitted power will also have an influence on the final performance of the system.
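As a rough illustration of how transmitted power enters this performance budget, the sketch below evaluates the standard point-target radar equation against thermal noise. Every parameter value is invented for illustration, and real systems also gain substantially from pulse compression and aperture synthesis, which are not included here.

```python
import numpy as np

k = 1.38e-23  # Boltzmann constant (J/K)

def snr(p_t, gain, wavelength, sigma, r, t_sys=300.0, bandwidth=15e6):
    """Single-pulse SNR for a point target of radar cross-section sigma."""
    p_rx = (p_t * gain**2 * wavelength**2 * sigma) / ((4 * np.pi)**3 * r**4)
    p_noise = k * t_sys * bandwidth  # thermal noise in the receiver band
    return p_rx / p_noise

# Invented airborne-style numbers: 5 kW peak power, 35 dBi antenna gain,
# C-band wavelength, 10 m^2 target at 10 km range.
ratio = snr(p_t=5e3, gain=10**3.5, wavelength=0.056, sigma=10.0, r=10e3)
print(f"single-pulse SNR: {10 * np.log10(ratio):.1f} dB")
```

Note the fourth-power dependence on range: doubling the distance to the target costs 12 dB of SNR, which must be recovered through transmitted power, antenna gain or processing gain.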

3. Radar image construction

The human dependence upon sight (when we are lucky enough not to have lost it) to construct an image of our local environment means that we are often quite unaware of the role hearing plays in forming our perception of the world around us. We often fail to consciously notice the acoustics of a room, for instance, even though we obtain from it various clues as to the shape, size and material of a space.

A better example of the use of hearing for scene analysis can be found in other mammals, most notably in the use of echolocation by bats: a natural active remote sensing system. Bats use their extremely well-developed audition to listen to the echoes of their own vocalisations that are reflected by surrounding objects or by their prey. In fact, bats incorporate a number of features in their echolocation techniques that are similar to those found in radar systems (Griffin 1958). Besides bats, some insectivores, whales, dolphins, seals and a few birds also echolocate.

The transmission of a signal to illuminate the target scene is, of course, what makes radar and audio echolocation an active form of remote sensing. A flash camera is often used as an equivalent example of an active optical system, yet this comparison distracts from the key features that make optical and radar images different. Some of these are discussed in the following sections.

3.1. Radar geometry

The ear and radar are essentially one-dimensional remote sensing systems, measuring received energy as a function of time. The term 'one-dimensional' is used here to mean the equivalent of a one-dimensional array of data points. By employing a strategy of active sensing, and utilising the principle of echolocation, objects may then be mapped in one direction as a function of their range from the observing system.

For mapping surfaces such as the Earth, airborne and satellite radar imagers direct their transmitted signal sideways, so that the time delay between a transmitted pulse and a return echo from a target is used to estimate its slant range, which can then be related to an approximate equivalent ground distance by applying some basic assumptions about the surface. The subsequent construction of a 2-D radar image results from utilising the motion of the instrument to scan the ground along the flight path.

The important point to stress is that a radar imager, like a bat, principally obtains information as a function of distance from the instrument, rather than relative look direction.
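The echo-timing arithmetic is short enough to write out. The sketch below applies the two assumptions just described (a flat surface at zero height, and an instrument altitude known exactly); the altitude and delay values are invented for illustration.

```python
import numpy as np

C = 3.0e8  # speed of light (m/s)

def slant_range(delay_s: float) -> float:
    """Two-way time delay to slant range: the echo travels out and back."""
    return C * delay_s / 2

def ground_range(slant_m: float, altitude_m: float) -> float:
    """Flat-Earth assumption: project the slant range onto the ground."""
    return np.sqrt(slant_m**2 - altitude_m**2)

# Illustrative numbers only: a satellite at 800 km altitude receives an
# echo 5.9 ms after transmission.
r = slant_range(5.9e-3)
print(f"slant range  : {r / 1e3:.1f} km")
print(f"ground range : {ground_range(r, 800e3) / 1e3:.1f} km")
```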

3.2. Image resolution

The different viewing geometries applied by optical and radar systems also have a significant influence on the resolving power of the two types of instrument.

The resolving power of an optical system is limited by a combination of the diffraction limit of the apertures of the collecting system, and the resolution of the measuring medium and scanning system.

In imaging radars, the range (across-track) and azimuth (along-track) ground resolutions are defined by quite distinct qualities of the system.

3.3. Range resolution

The ability of bats to distinguish two objects in range depends upon their being able to detect distinctly the returned echoes from each object. Equivalently, the limit of the range resolution of a radar system is its ability to distinguish (in time) the return pulses from two objects. Once the objects have range distances close enough together that their return pulses overlap, they are no longer resolvable.

The shorter the pulse, therefore, the better the spatial resolution in the range direction. However, it is rather impractical to generate short rectangular pulses with the high peak power required to provide adequate SNR, so it is usually the case that a linear frequency modulated (FM), or chirped, pulse is used. (The same process applied to an audio signal would sound like a 'chirp'.) By sweeping the signal over a small range (bandwidth) of frequencies, the transmitted signal is essentially encoded, so that even though a whole collection of overlapping return signals may be received, they can be distinguished in time to an accuracy much shorter than the length of the pulse. This technique makes use of the spectral filtering capabilities of the radar system and is analogous to similar frequency modulation techniques which are a common feature among echolocating bats. It is not hard to imagine being able to distinguish overlapping pulses if we consider the ability of the human ear to isolate sounds with specific temporal frequency patterns, such as being able to identify different musical instruments that are played simultaneously.

This technique allows the use of microwave pulses of the order of 40 μs but with a range resolution comparable with that of a 60 ns rectangular pulse. Some echolocating bats, by using brief broadband (FM) signals, can measure object distance to a staggering degree of accuracy, discriminating between echoes that return with a difference in their arrival time of less than 100 ns, which corresponds to a difference in the range distance of the two objects of less than 17 mm (Metzner 1991).
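The pulse-compression idea can be demonstrated with a short matched-filter sketch. The pulse length, bandwidth and echo spacing below are illustrative choices, not the figures quoted above; the point is simply that the recovered resolution follows the bandwidth, not the pulse duration.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 200e6                                   # sample rate (Hz)
T, B = 20e-6, 15e6                           # pulse length (s), chirp bandwidth (Hz)
t = np.arange(0, T, 1 / fs)
chirp = np.exp(1j * np.pi * (B / T) * t**2)  # linear FM ('chirped') pulse

# Two overlapping echoes 0.5 us apart, far closer than the 20 us pulse.
rx = np.zeros(8000, dtype=complex)
rx[1000:1000 + len(t)] += chirp
rx[1100:1100 + len(t)] += chirp              # 100 samples = 0.5 us later

# Correlating with the transmitted chirp compresses each echo to ~1/B.
compressed = np.abs(np.correlate(rx, chirp, mode="same"))
peaks, _ = find_peaks(compressed, height=0.8 * compressed.max())
print(f"recovered echo separation: {(peaks[1] - peaks[0]) / fs * 1e6:.2f} us")
```

With these numbers the compressed resolution is roughly 1/B ≈ 67 ns, i.e. about c/(2B) ≈ 10 m in range, even though the uncompressed 20 μs pulse spans 3 km of range.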

3.4. Azimuth resolution

For a real aperture system, such as a side-looking airborne radar, the resolving power in azimuth is governed by the diffraction limit of the antenna, in much the same way as an optical system is limited by its collecting aperture. A long antenna can produce a relatively narrow beamwidth in azimuth, providing a narrow strip of illumination. However, to produce good ground resolution at the large distances associated with spaceborne instruments, an unrealistically large antenna would be required. To overcome this problem, it is possible to synthesise a much larger antenna using a processing technique that can differentiate the echoes from different regions within the radar beam. This is the principle used in synthetic aperture radar systems, and it is similar to the pulse compression technique described in the previous section, since it utilises the frequency analysing capability of the radar system.

Since the SAR platform is continuously moving, the echoes returning from objects in the front half of the beam are Doppler shifted to higher frequencies, and likewise to lower frequencies for objects in the aft part of the beam. The common audio analogy is the change in pitch that is heard when a police car or ambulance with a siren drives past a listener at high speed.

For any given target, the return signals will change in frequency as they pass through the radar beam. The resulting frequency 'history' of a signal from a target has the same form as the linear FM chirp used in range compression, although on a different scale (Elachi 1987).
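A short numerical sketch of this 'azimuth chirp' follows. The platform speed, wavelength and range are invented round numbers; it prints the Doppler shift of a fixed point target as the platform flies past: positive (higher pitch) while approaching, zero at closest approach, and negative while receding, varying almost linearly with time just like the range chirp.

```python
import numpy as np

wavelength = 0.056   # C-band wavelength (m) -- illustrative
v = 7000.0           # platform speed (m/s)  -- illustrative
r0 = 850e3           # closest-approach slant range (m)

t = np.linspace(-1.0, 1.0, 9)                 # seconds about closest approach
x = v * t                                      # along-track offset of platform
r = np.sqrt(r0**2 + x**2)                      # instantaneous slant range
f_doppler = -2 * v**2 * t / (wavelength * r)   # two-way Doppler: -(2/lambda) dr/dt

for ti, fd in zip(t, f_doppler):
    print(f"t = {ti:+.2f} s  Doppler = {fd:+8.1f} Hz")
```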

4. Unique features of radar images

As a consequence of the singular manner in which radar systems generate an image, a number of unusual image features result which are readily apparent when observing a radar image. Consideration of such features is fundamental to designing a practical SAR system, as well as to the proper interpretation of SAR data, and such features are somewhat easier to conceptualise within the framework of an auditory analogy.

4.1. Layover and foreshortening

The most noticeable feature of radar imagery arises when the ground surface diverges significantly from the assumption of a smooth Earth. The side-looking geometry of a radar system then means that the top of an object, say a mountain, may be closer to the instrument than the bottom of the object, even though the bottom is nearer to nadir. Considering the audio analogy, this would mean that one would hear the echo from the mountain peak before the echo from the foot of the mountain. In the final image (which maps the return echoes as a function of time) the top of the mountain may then be mapped nearer to nadir than the base of the mountain, so that the mountain appears as if it is leaning over. This effect is known as layover, and it is an extreme case of the foreshortening that results from the side-looking geometry of imaging radar: even in cases where there is not complete layover, the near side of the mountain in the final image will look considerably shorter than the far side.
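The timing argument can be checked with a few lines of arithmetic. The geometry below is invented for illustration: an airborne radar at 10 km altitude images a 2.5 km mountain whose base is 6 km from nadir and whose peak is displaced 1 km further out.

```python
import numpy as np

altitude = 10e3             # platform altitude (m)   -- illustrative
base_x, peak_x = 6e3, 7e3   # ground distance from nadir of base and peak
peak_h = 2.5e3              # mountain height (m)

r_base = np.hypot(base_x, altitude)           # slant range to the base
r_peak = np.hypot(peak_x, altitude - peak_h)  # slant range to the peak

print(f"slant range to base: {r_base:.0f} m")
print(f"slant range to peak: {r_peak:.0f} m")
# If r_peak < r_base, the peak's echo arrives first, so the peak is
# mapped nearer to nadir than the base: layover.
print("layover:", r_peak < r_base)
```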

4.2. Radar shadow

Unlike shadows in optical images, which are imaged regions that are weakly illuminated, radar shadows are areas of a radar image that are in principle completely black and sharply defined, since they correspond to areas where there is a complete lack of received information. They correspond to the region that lies behind objects in the imaged scene and from which there is no return echo. This hidden region is mapped as a discrete area since the final image is a function of time; i.e. there exists a time delay between the echo arriving from the top of the obstructing object, a mountain for instance, and the echo from the far edge of the shadow region. Rather than considering it as radar 'shadow', it is perhaps more useful to think of it as radar 'silence': a region of no measured signal.

4.3. Speckle

The characteristic speckle effect found in radar images is a result of interference among the coherent echoes of the individual scatterers within a resolution cell. This effect can be illustrated visually using a laser, but may also be illustrated using a combination of loudspeakers generating a correlated audio signal (e.g. from a signal generator). As the speakers are moved, or the listener moves their head, the pattern of constructive and destructive interference is apparent as a variation in the signal volume. (This is a standard high-school or undergraduate physics demonstration described in most introductory physics texts.)
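The random-phasor origin of speckle is also easy to simulate. In the sketch below (the scatterer and cell counts are arbitrary), each resolution cell sums fifty unit-amplitude echoes with random phases; the resulting intensity fluctuates with a standard deviation equal to its mean, the hallmark of fully developed speckle.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cells, n_scatterers = 100_000, 50

# Each cell: a coherent sum of unit phasors with random phases.
phases = rng.uniform(0, 2 * np.pi, size=(n_cells, n_scatterers))
field = np.exp(1j * phases).sum(axis=1)
intensity = np.abs(field) ** 2

# Fully developed speckle gives an exponential intensity distribution,
# i.e. a standard deviation equal to the mean ('multiplicative' noise).
print(f"mean intensity : {intensity.mean():.1f}")
print(f"std deviation  : {intensity.std():.1f}")
```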

4.4. Ambiguities

Ambiguities are features in an image which do not belong at their imaged position, resulting when an echo from one location on the ground arrives at the same time as an echo from a quite different location. This is generally the result of spurious echoes from outside the main antenna beamwidth being detected and processed as signals originating from the main swath. The antenna pattern of a radar, like the ear, is sensitive to signals from all directions, even though it may have a particular direction of increased sensitivity. In general, such signals are minimised by designing antenna patterns with low side-lobes, and by timing the pulse transmission so that the large specular reflections from near nadir arrive just as a new pulse is being transmitted. Signals from beyond the far edge of the imaged swath may also contribute, and if such a signal arrives within an unexpected echo-reception window, it may be wrongly mapped within the image swath.

Imagine, for instance, a succession of identical cries being shouted into the Grand Canyon, with intervals of a few seconds between each. Clearly, a collection of loud echoes from the nearest walls will be heard arriving back soon after each shout. But echoes from further away may arrive back after the subsequent cries have been made. Such echoes are ambiguous, since it is not clear which echo corresponds to which shout.

Such ambiguities may place some restrictions on the design of a SAR system, particularly on the choice of pulse repetition frequency (PRF) and the shape of the antenna pattern.
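The Grand Canyon analogy maps directly onto the radar timing budget. A minimal sketch, with an illustrative PRF rather than one from any particular system:

```python
C = 3.0e8       # speed of light (m/s)
prf = 1700.0    # pulses per second -- illustrative value

# An echo must return before the next pulse is sent, or it becomes
# indistinguishable from a close-in echo of that later pulse.
r_unambiguous = C / (2 * prf)
print(f"unambiguous range at PRF {prf:.0f} Hz: {r_unambiguous / 1e3:.0f} km")

# An echo from 50 km beyond that limit maps to an apparent 50 km:
r_true = r_unambiguous + 50e3
print(f"echo from {r_true / 1e3:.0f} km appears at "
      f"{(r_true % r_unambiguous) / 1e3:.0f} km")
```

Raising the PRF improves azimuth sampling but shrinks the unambiguous range, which is one reason PRF selection is such a delicate part of SAR design.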


5. Other comparisons between radars and ears

5.1. Application of dual systems

Single imaging systems generally provide information on two dimensions of a three-dimensional object: the surface of the Earth, in the case of Earth observation. The result is that certain properties of the target remain unknown: the distance in optical images, and the direction in radar images (illustrated by the layover effect).

This lack of information can be resolved by the use of two systems taking measurements from slightly different locations. The human visual system adopts this technique, using two eyes in stereo to detect the small changes in look angle (parallax) that can be used to infer the distance to an object. This is also the principle used in stereo photogrammetry.

The use of two ears, on the other hand, is required to determine angles. One ear of an echolocating bat, for instance, will allow discrimination of distance to a target with high accuracy,⁴ but to resolve azimuthal direction, bats and humans need to use binaural sensing.

One of the vital indicators of sound source direction comes from the Interaural Phase Difference (IPD), proposed by Lord Rayleigh, one of the pioneers of spatial hearing research (Middlebrooks and Green 1991). The IPD is the delay between a waveform arriving at the ear nearest the sound source and its arrival at the ear furthest away. This delay results in a shift of the waveform of a fraction of a cycle, which is easily detected by the inner ear. Such a phase shift between the sound arriving at each ear can provide the information needed to find the corresponding azimuth angle of the sound source. This process can provide an azimuthal resolution of 5–10° for humans, and as little as 1.5° for echolocating bats (Masters et al. 1985).
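To see how a fractional-cycle delay pins down direction, the sketch below inverts the far-field path-difference relation d·sin(θ) = c·Δt. The effective head width and the delay values are illustrative numbers, and real localisation also draws on level and spectral cues that are ignored here.

```python
import numpy as np

C_SOUND = 343.0   # speed of sound in air (m/s)
d = 0.18          # effective ear separation (m) -- illustrative

def azimuth_from_delay(itd_s: float) -> float:
    """Azimuth angle (degrees) from an interaural time difference."""
    return np.degrees(np.arcsin(np.clip(C_SOUND * itd_s / d, -1.0, 1.0)))

for itd in (0.0, 100e-6, 300e-6, 520e-6):
    print(f"ITD {itd * 1e6:5.0f} us -> azimuth {azimuth_from_delay(itd):5.1f} deg")
```

Delays of only tens to hundreds of microseconds, a small fraction of a cycle at speech frequencies, are enough to place a source to within a few degrees.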

Radar interferometry performs a similar function to the IPD by measuring the phase difference between the signals of two images made from slightly different locations, and removes some of the directional ambiguity in the across-track plane. Foreshortening is then no longer a problem, and height information can be inferred from the relative phase between the two signals.
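The following is a heavily simplified, flat-geometry illustration of that idea, with invented numbers: two receivers separated by a 100 m vertical baseline observe the same target, and the wrapped phase of the path-length difference changes as the target height changes. Real interferometric processing additionally requires phase unwrapping and precise baseline knowledge.

```python
import numpy as np

wavelength = 0.056                       # m -- illustrative
ant1 = np.array([0.0, 10_000.0])         # (ground range, height) of antenna 1
ant2 = np.array([0.0, 10_100.0])         # antenna 2: 100 m baseline above it

def interferometric_phase(target: np.ndarray) -> float:
    """Wrapped phase of the one-way path difference between the antennas."""
    r1 = np.linalg.norm(target - ant1)
    r2 = np.linalg.norm(target - ant2)
    return float(np.angle(np.exp(1j * 2 * np.pi * (r1 - r2) / wavelength)))

for h in (0.0, 5.0, 10.0):
    phi = interferometric_phase(np.array([8000.0, h]))
    print(f"target height {h:4.1f} m -> phase {phi:+.3f} rad")
```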

5.2. Wave-object interaction

Our intuitive understanding of optics, learnt through many years of using our eyes, is a misleading context for comprehending the nature of how microwaves interact with features on the Earth. We are far more familiar with observing, for instance, brightness and colour than we are with diffraction and interference. Our experience of sound, however, provides a more useful analogy for the interaction of microwaves with objects. We are familiar with hearing sounds diffracted around corners, multiple reflections in a bathroom, or the frequency-dependent transmission of our neighbour's hi-fi through a brick wall (the bass always sounds louder).

The interaction of microwave radiation with an object essentially depends on:

1. The dielectric properties of the object.
2. The roughness of the object.
3. The size and shape of the object.
4. The relative direction of the microwave source.

⁴A single ear can also provide some degree of information on the elevation angle of a sound source, as a result of the elevation-dependent frequency response of the pinna, which provides cues to the elevation of the sound's location.


If we replace 'dielectric' with 'acoustic', and 'microwave source' with 'sound source', then the same list applies for sound waves. The size and shape are important for both microwaves and sound waves because their wavelengths are on the scale of the target objects (centimetres to metres), so that frequency-dependent diffraction effects are important.

The description of microwave interaction with an extended target is also similar to that used in acoustic modelling. These similarities are illustrated in figure 3(a) and (b). The acoustic response within a room, for instance, typically comprises a short group of sparse echoes, called the early response (comprising first- and second-order reflections from nearby surfaces), immediately followed by a very diffuse, dense set of echoes that decay with time, called the late response (Anderson and Casey 1997). These characteristics correspond to the description of the typical microwave scattering components that occur at, say, a forest canopy. These are: single scattering from branches or the ground; double scattering between elements in the forest canopy, or between the canopy and the ground; and higher-order scattering, which represents the multiple scattering between the canopy elements.

Figure 3 (a) and (b). The characteristics of room acoustics are often described in the same manner as the interaction of microwaves with surface features.

6. Summary and discussion

Radar remote sensing instruments continue to be employed on both airborne and spaceborne platforms, and the use of radar images steadily increases for all manner of applications, due mainly to radar's singular ability to penetrate cloud cover and to the unique information it provides about surface properties. Yet even for the expert, understanding data from microwave radar instruments may be taxing and sometimes difficult. Anything, therefore, that can aid in the general appreciation and conceptual understanding of such data is surely welcome. This paper attempts to provide such an aid by summarising auditory-based analogies for radar that highlight the main differences between passive optical and active microwave systems.

This analogy is intended primarily for use in teaching about radar imaging, but it may also help some established radar researchers to think about their subject in a novel way, or to consider new approaches to the analysis of their data. Indeed, given the growing use and popularity of multimedia, both locally and via the internet, it is perhaps time to consider the use of audio cues in the analysis of radar data. The similarities between hearing and radar imaging may mean that audio interpretation of radar images will provide new insights into the information available in such data.

Acknowledgments

This paper has developed over a long period of time and its evolution has been greatly enhanced by the contribution of Chris Varekamp, who provided invaluable comments and encouragement during the writing of this manuscript. Additional useful suggestions on the text were provided by Dirk Hoekman, Peter van Oevelen, Gordon Peckham and the anonymous reviewer of this manuscript, although any remaining errors and omissions are entirely the fault of the author. Line drawings of the ear, the eye and the bat originate from Corel Draw Clipart.

References

Allen, J. B., and Neely, S. T., 1992, Micromechanical models of the cochlea. Physics Today, July, 40–47.
Anderson, D. B., and Casey, M. A., 1997, The sound dimension. IEEE Spectrum, 34, 46–50.
Elachi, C., 1987, Introduction to the Physics and Techniques of Remote Sensing (Chichester: John Wiley).
Feynman, R. P., Leighton, R. B., and Sands, M. L., 1989, The Feynman Lectures on Physics (Reading, Mass.: Addison-Wesley).
Gough, P. T., and Hawkins, D. W., 1997, Unified framework for modern synthetic aperture imaging algorithms. International Journal of Imaging Systems and Technology, 8, 343–358.
Griffin, D. R., 1958, More about bat 'radar'. Scientific American, 199, 40–44.
Masters, W. M., Moffat, A. J. M., and Simmons, J. A., 1985, Sonar tracking of horizontally moving targets by the big brown bat Eptesicus fuscus. Science, 228, 1331–1333.
Metzner, W., 1991, Echolocation behaviour in bats. Science Progress, 75, 453–465.
Middlebrooks, J. C., and Green, D. M., 1991, Sound localization by human listeners. Annual Review of Psychology, 42, 135–159.
Sabins, F. F., 1987, Remote Sensing: Principles and Interpretation (London: W. H. Freeman).
Simmons, J. A., Dear, S. T., Ferragamo, M. J., Haresign, T., and Fritz, J., 1996, Representation of perceptual dimensions of insect prey during terminal pursuit by echolocating bats. Biological Bulletin, 191, 109–121.
von Békésy, G., 1957, The ear. Scientific American, 197, 66–78.
