What is Photogrammetry

Upload: natasha-jacobs

Post on 04-Jun-2018


8/13/2019 What is Photogrammetry

What is photogrammetry?

Photogrammetry (Greek: phot (light) + gramma (something drawn) + metrein (measure)) is the science of making measurements from photographs.

Basic example: the distance between two points that lie on a plane parallel to the photographic image plane can be determined by measuring their distance on the image and then multiplying the measured distance by the scale parameter.
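The scale relation in this basic example can be sketched in a few lines; the 1:10000 scale and the measured distance below are invented for illustration:

```python
# Hypothetical numbers illustrating the relation described above:
# real distance = distance measured on the image * scale number.

def real_distance(image_distance_m: float, scale_number: float) -> float:
    """Ground distance for a distance measured on the image, assuming the
    two points lie on a plane parallel to the image plane."""
    return image_distance_m * scale_number

# 8 mm (0.008 m) measured on a 1:10000 image corresponds to 80 m on the ground.
print(real_distance(0.008, 10000))  # 80.0
```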

Typical outputs: a map, a drawing or a 3D model of some real-world object or scene.

Related fields: Remote Sensing, GIS.

Main task of photogrammetry

If one wants to measure the size of an object, let's say the length, width and height of a house, then normally one will carry this out directly at the object. However, the house may not exist anymore, e.g. it was destroyed, but some historic photos exist. Then, if one can determine the scale of the photos, it must be possible to get the desired data.

Of course one can use photos to get information about objects. This kind of information is different. So, for example, one may receive qualitative data (the house seems to be old, the walls are colored light green) from photo interpretation, or quantitative data like mentioned before (the house has a base size of … by … meters) from photo measurement, or information in addition to one's background knowledge (the house has elements of classic style), and so on.

Photogrammetry provides methods to get information of the second type: quantitative data.

As the term already indicates, photogrammetry can be defined as "the science of measuring in photos", and is traditionally a part of geodesy, belonging to the field of remote sensing (RS).

If one would like to determine distances, areas or anything else, the basic task is to get object (terrain) coordinates of any point in the photo, from which one can then calculate geometric data or create maps.

Obviously, from a single photo (a two-dimensional plane) one can only get two-dimensional coordinates. Therefore, if one needs three-dimensional coordinates, a way to get the third dimension has to be found. This is a good moment to remember the properties of human vision.

Humans are able to see objects in a spatial manner, and with this they are able to estimate the distance between an object and themselves. But how does it work? In fact, the human brain at any moment gets two slightly different images, resulting from the different positions of the left and the right eye and from the central perspective of each eye.

Exactly this principle, the so-called stereoscopic viewing, is used to get three-dimensional information in photogrammetry: if there are two (or more) photos of the same object taken from different positions, one may easily calculate the three-dimensional coordinates of any point which is represented in both photos (by setting up the equations of the rays originating in the image projections of the object point on the photos mentioned and passing through the object point itself, and then calculating their intersection). Therefore, the main task of photogrammetry can be defined in the following way: for any object point represented in at least two photos, one has to calculate the three-dimensional object coordinates. If this task


is fulfilled, it is possible to digitize points, lines and areas for map production, or to calculate distances, areas, volumes, slopes and much more.
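The ray-intersection idea described above can be sketched as follows: each ray starts at a camera's projection center and passes through the image projection of the object point; the object point is recovered as the point closest to both rays (their exact intersection in the noise-free case). All numbers are invented for illustration.

```python
# Closest point to two 3D rays o + t*d (standard closest-point derivation
# between two skew lines); with perfect data the rays intersect exactly.

def closest_point_to_two_rays(o1, d1, o2, d2):
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def sub(a, b): return [x - y for x, y in zip(a, b)]
    w = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b            # zero only for parallel rays
    t1 = (b * e - c * d) / denom     # parameter on ray 1
    t2 = (a * e - b * d) / denom     # parameter on ray 2
    p1 = [o + t1 * v for o, v in zip(o1, d1)]   # closest point on ray 1
    p2 = [o + t2 * v for o, v in zip(o2, d2)]   # closest point on ray 2
    return [(x + y) / 2 for x, y in zip(p1, p2)]  # midpoint of the two

# Two cameras at invented positions, both "seeing" the object point (2, 3, 10):
p = closest_point_to_two_rays([0, 0, 0], [2, 3, 10], [5, 0, 0], [-3, 3, 10])
print(p)  # [2.0, 3.0, 10.0]
```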

When do we need photogrammetry?

The first idea that comes to one's mind in association with measuring distances, areas and volumes is most likely a ruler or a foot rule. However, there are situations when it doesn't work, like one of the following: the object itself doesn't exist any more (but photos of the object are preserved), or the object cannot be reached (for example, areas far away or in countries without adequate infrastructure, which still can be photographed).

Furthermore, photogrammetry is a perfect option for measuring "easily transformed" objects like liquids, sand or clouds, as it avoids physical contact.

In addition, photogrammetry enables one to measure fast moving objects. For instance, these may be running or flying animals, or waves. In industry, high-speed cameras with simultaneous activation are used to get data about deformation processes (like crash tests with cars).

Comparing photogrammetry with laser scanning techniques, which are widely used today both for generating terrain models and in the close-range case to get large amounts of 3D point data (dense point clouds), one could note the following. The advantage of laser scanning is that the object can be low-textured, a situation where photogrammetric matching techniques often fail. On the other hand, laser scanning cannot be used for fast moving objects. Moreover, laser scanning is time consuming and still very expensive compared with photogrammetric methods. Therefore, these methods may be considered complementary to each other.

Types of photogrammetry

Photogrammetry can be classified in a number of ways, but one standard method is to split the field based on camera location during photography. On this basis we have Aerial Photogrammetry and Close-Range Photogrammetry.

In Aerial Photogrammetry the camera is mounted on an aircraft and is usually pointed vertically towards the ground. Multiple overlapping photos of the ground are taken as the aircraft flies along a flight path.

In Close-Range Photogrammetry the camera is close to the subject and is typically hand held or on a tripod.


1. The invention of photography by L. Daguerre and N. Niepce in 1839 laid the grounds for photogrammetry to originate. The first phase of development (till the end of the XIXth century) was a period for pioneers to study an absolutely new field and formulate the first methods and principles. The greatest achievements were made in terrestrial and balloon photogrammetry.

2. The second turning point was the invention of stereophotogrammetry (based on stereoscopic viewing, see the Main task of photogrammetry section) by C. Pulfrich (1901). During the First World War airplanes and cameras became operational, and just several years later the main principles of aerial survey were formulated. In fact, analog rectification and stereoplotting instruments, based on mechanical theory, were already known in those days, yet the amount of computation was prohibitive for numerical solutions. It is not surprising that von Gruber called photogrammetry of the period "the art of avoiding computations".

3. The third phase started with the advent of the computer. The 1950s saw the birth of analytical photogrammetry, with matrix algebra forming the basis. For the first time a serious attempt was made to employ adjustment theory to photogrammetric measurements, yet the first operational computer programs became available only several years later. Brown developed the first block adjustment program based on bundles in the late sixties. As a result, the accuracy performance of aerial triangulation improved by a factor of ten. Apart from aerial triangulation, the analytical plotter is another major invention of the third generation.

4. The fourth generation, digital photogrammetry, emerged due to the invention of digital photography and the availability of storage devices which permit rapid access to digital imagery. Hardware supported by special …


Digital cameras have been used for special photogrammetric applications since the early seventies. However, the vidicon-tube cameras available at that time were not very accurate because the imaging tubes were not stable. This disadvantage was eliminated with the appearance of solid-state cameras in the early eighties. The charge-coupled device provides high stability and is therefore the preferred sensing device in today's digital cameras.

Metric and digital consumer cameras

Metric cameras. In principle, specific photogrammetric cameras (also simply called metric cameras) work the same way as amateur cameras. The differences result from the high quality demands which the former must fulfill. First of all, this refers to high-precision optics and mechanics.

Metric cameras are usually grouped into aerial cameras and terrestrial cameras. Aerial cameras are also called cartographic cameras. Panoramic cameras are examples of non-metric aerial cameras.

The lens system of aerial cameras is constructed as a unit with the camera body. No lens change or "zoom" is possible, to provide high stability and a good lens correction. The focal length is fixed, and the cameras have a central shutter. Furthermore, aerial cameras use a large film format. While a size of 24 by 36 mm is typical for amateur cameras, aerial cameras normally use a size of 23 by 23 cm. As a result, the values of "wide angle", "normal" and "telephoto" focal lengths differ from those widely known: for example, a wide angle aerial camera has a focal length of about 153 mm, a normal one a focal length of about 305 mm.

Similar to this, for close-range applications special cameras were developed with a medium or large film format and fixed lenses.

Digital consumer cameras. Nowadays digital consumer cameras have reached a high technical standard and good geometric resolution. Due to this fact, these cameras can be successfully used for numerous photogrammetric tasks.

The differences in construction principles between metric and consumer cameras can be seen, in general, in the quality and stability of the camera body and the lens. Further, consumer cameras usually have a zoom ("vario") lens with larger distortions, which are not constant but vary, for instance, with the focal length, so it is difficult to correct them with the help of a calibration.

Having decided to purchase a digital camera to use for photogrammetry, it would be useful to take the following remarks into account:

1. General: It should be possible to set the parameters (focal length, focus, exposure time and f-number) manually, at least as an option.

2. Resolution (number of pixels): Decisive is the real (physical), not an interpolated, resolution. Generally, the higher the number of pixels the better, but not at any price. Small chips with a large number of pixels of course have a very small pixel size and are not very light sensitive; furthermore, the signal-to-noise ratio is worse. This one will encounter especially with higher ISO values (… and more) and in dark parts of the image.


3. Focal length range (zoom): Decisive is the optical, not the digital (interpolated), range.

4. Distance setting (focus): It should be possible to deactivate the auto focus. If the camera has a macro option, you can use it also for small objects.

5. Exposure time / f-number: The maximum f-number (lens opening) should not be less than 1:2.8; the exposure time should have a range of at least … seconds.

6. Image formats: The digital images are stored in a customary format like JPEG or TIFF. Important: the image compression rate must be selectable or, even better, the compression can be switched off, to minimize the loss of quality.

7. Others: Sometimes a tripod thread, a remote release and an adapter for an external flash are useful.

Camera calibration

During the process of camera calibration, the interior orientation of the camera is determined. The interior orientation data describe the metric characteristics of the camera needed for photogrammetric processes.

There are several ways to calibrate the camera. After assembling the camera, the manufacturer performs the calibration under laboratory conditions. Cameras should be calibrated once in a while, because the stress caused by the temperature and pressure differences experienced by an airborne camera may change some of the interior orientation elements. Laboratory calibrations are also performed by specialized government agencies.

In in-flight calibration, a test field with targets of known positions is photographed. The photo coordinates of the targets are then precisely measured and compared with the control points. The interior orientation is found by a least-squares adjustment.

The main purpose of interior orientation is to define the position of the perspective center and the radial distortion curve. Modern aerial cameras are virtually distortion free. Thus, a good approximation for the interior orientation is to assume that the perspective center is at a certain distance c (calculated during camera calibration) from the fiducial center.

Classification of aerial photographs

Aerial photography is the basic data source for making maps by photogrammetric means. Many factors determine the quality of aerial photography; first of all they are the design and quality of the lens system, and the weather conditions and sun angle during the photo flight.

Aerial photographs are usually classified according to the orientation of the camera axis, the focal length of the camera, and the spectral sensitivity.

Orientation of the camera axis

True vertical photograph: a photograph with the camera axis perfectly vertical (identical to the plumb line through the exposure center). Such photographs hardly exist in reality.

Near vertical photograph: a photograph with the camera axis nearly vertical. The deviation from the vertical is called tilt. Gyroscopically controlled mounts provide stability of the camera so that the tilt is usually less than two to three degrees.


Oblique photograph: a photograph with the camera axis tilted between the vertical and the horizontal. A high oblique photograph is tilted so much that the horizon is visible on the photograph. A low oblique does not show the horizon. The total area photographed with obliques is much larger than that of vertical photographs.

Angular coverage

The angular coverage is a function of focal length and format size. Standard focal lengths and associated angular coverages are summarized in Table 1.

                         superwide   wide angle   intermediate   normal angle   narrow angle
  focal length [mm]          85         153           210            305            610
  angular coverage [°]      119          94            75             56             29

Table 1: Summary of photography with different angular coverage (for 23 cm × 23 cm format size).
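Since the angular coverage is a function of focal length and format size, it can be computed directly: for a frame camera, the full coverage angle across the format diagonal is 2·atan(d / (2·f)). A sketch assuming the 23 cm by 23 cm aerial film format and the standard aerial focal lengths:

```python
import math

def angular_coverage_deg(focal_length_mm: float,
                         format_side_mm: float = 230.0) -> float:
    """Full angle of coverage across the format diagonal, in degrees."""
    diagonal = format_side_mm * math.sqrt(2)
    return math.degrees(2 * math.atan(diagonal / (2 * focal_length_mm)))

for f in (85, 153, 210, 305, 610):      # standard aerial focal lengths [mm]
    print(f, round(angular_coverage_deg(f), 1))
```

For f = 305 mm this gives about 56°, matching the tabulated normal-angle coverage; the exact tabulated values can differ slightly depending on the convention (diagonal vs. side coverage) used.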

Spectral sensitivity

panchromatic black and white;

color (originally, color photography was mainly used for interpretation purposes; however, recently color is increasingly being used for mapping applications as well);

infrared black and white (since infrared is less affected by haze, it is used in applications where weather conditions may not be as favorable as for mapping missions);

false color (this is particularly useful for interpretation, mainly for analyzing vegetation (e.g. crop disease) and water pollution; e.g. a Green, Red, NIR single-sensor camera (see multispectral cameras)).

Fiducial marks

Fiducial marks are fixed points in the image plane that serve as reference positions visible in the image. They are useful for setting up the image coordinate system in the case of analog photography. Generally they are several fixed points on the sides of an image that define the fiducial center as the intersection of the lines joining opposite fiducial marks. The fiducial center is used as the origin of the image coordinate system.
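The fiducial-center construction described above is a plain line-line intersection; a small sketch with invented mark coordinates:

```python
# Fiducial center = intersection of the two lines joining opposite marks.

def line_intersection(p1, p2, p3, p4):
    """Intersection of line p1-p2 with line p3-p4 (2D, non-parallel lines)."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    det12 = x1 * y2 - y1 * x2          # determinant of the first line's points
    det34 = x3 * y4 - y3 * x4          # determinant of the second line's points
    x = (det12 * (x3 - x4) - (x1 - x2) * det34) / denom
    y = (det12 * (y3 - y4) - (y1 - y2) * det34) / denom
    return (x, y)

# Four side fiducial marks (left, right, bottom, top), slightly asymmetric:
left, right = (-112.9, 0.1), (113.1, -0.1)
bottom, top = (0.2, -113.0), (-0.2, 113.0)
print(line_intersection(left, right, bottom, top))  # the fiducial center
```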

Image geometry modeling

Note: The information below is relevant to frame photography (photographs exposed in one instant) with the central projection assumption.

Object, camera and image spaces

To fulfill the task of geometric reconstruction it is necessary to represent points in the object coordinate system, i.e. a 3D local coordinate system related to the targeted object, or a geographical coordinate system.


At the same time, the input data (points on the photos) are referenced in the image coordinate system, i.e. a 2D sensor-related coordinate system with the origin at the position of pixel (0, 0) (for digital frame cameras) or at the fiducial center (for analog images).

Finally, the third space is determined by the camera itself. The camera coordinate system has its origin at the projection center (the center of the lens).

Thus, certain relations have to be defined between these three spaces to allow photogrammetric procedures. Camera modeling, with intrinsic and extrinsic parameters being introduced, solves the problem.

Camera modeling

As the position of the camera in space varies much more quickly than the geometry and physics of the camera, it is logical to distinguish between two sets of parameters in modeling:

1) extrinsic parameters describe the position of the camera in space. They are the six parameters of the exterior orientation: the three coordinates of the projection center and the three rotation angles around the three camera axes. The parameters of the exterior orientation may be directly measured (with GPS and IMU systems); however, they are usually also estimated during photogrammetric procedures.

2) intrinsic parameters are all parameters necessary to model the geometry and physics of the camera. They make it possible to determine the direction of the projection ray to an object point, given an image point and exterior orientation data. The intrinsic parameters describe the interior orientation of the camera, which is determined by camera calibration.

For a pin-hole camera, the projective mapping from 3D real world coordinates (x, y, z) (object space) to 2D pixel coordinates (u, v) (image space) is simulated by the following linear model:

    (u, v, 1)^T = A [R T] (x, y, z, 1)^T,

where homogeneous coordinates notation is used.

            | fx   s  u0 |
        A = |  0  fy  v0 |
            |  0   0   1 |

A is the intrinsic matrix containing the 5 intrinsic parameters: fx, fy are the focal length in terms of pixels; u0, v0 are the principal point coordinates; s is the skew coefficient between the x and y axes.

Other intrinsic camera parameters, such as lens distortion, are also important, but cannot be covered by the linear camera model.

R and T are the extrinsic camera parameters: the rotation matrix and translation vector respectively, which define the transformation from 3D object coordinates to 3D camera coordinates.
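The linear pin-hole model above can be exercised directly; the sketch below (invented parameter values, identity rotation, zero translation) applies [R T] and then the intrinsic matrix A, dividing by the third homogeneous coordinate:

```python
def project(point_xyz, fx, fy, s, u0, v0, R, T):
    """Map object coordinates to pixel coordinates (u, v)."""
    # Camera coordinates: Xc = R * X + T
    xc = [sum(R[i][j] * point_xyz[j] for j in range(3)) + T[i]
          for i in range(3)]
    x, y, z = xc
    # Apply A = [[fx, s, u0], [0, fy, v0], [0, 0, 1]], then divide by z
    # (homogeneous -> pixel coordinates).
    u = (fx * x + s * y + u0 * z) / z
    v = (fy * y + v0 * z) / z
    return (u, v)

I3x3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
# Object point 10 units in front of the camera, principal point at (500, 400):
print(project([1.0, 2.0, 10.0], fx=1000, fy=1000, s=0, u0=500, v0=400,
              R=I3x3, T=[0, 0, 0]))  # (600.0, 600.0)
```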

Orientation Angles


To denote camera orientation two different sets of angles are used: the ω, φ, κ triplet and the yaw, pitch, roll triplet. Both sets define the transformation between real world and camera coordinates. The difference comes from how the georeferenced system is defined: if the reference system is


Fig. 2: Relief displacement.

The effect of relief does not only cause a change in the scale but can also be considered as a component of image displacement (see Fig. 2). Suppose point T is on top of a building and point B at the bottom. On a map, both points have identical X, Y coordinates; however, on the photograph they are imaged at different positions, namely at T' and B'. The distance d between the two photo points is called relief displacement because it is caused by the elevation difference dh between T and B.

The magnitude of relief displacement for a true vertical photograph can be determined by the following equation:

    dr = rT · dh / H = rB · dh / (H - dh)

where dh is the elevation difference of the two points on a vertical, H is the flying height, and rT, rB are the radial distances of T' and B' from the nadir point. Then the elevation h of a vertical object is

    h = dr · H / r

with r the radial distance of the image of the object's top point.

The direction of relief displacement is radial with respect to the nadir point, independent of camera tilt.
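The standard true-vertical relation dr = r·dh/H, and its inversion for object height, can be checked numerically; the numbers below are invented (flying height 1000 m, a point 50 m above its base, imaged 100 mm from the nadir point):

```python
def relief_displacement(r_top_mm: float, dh_m: float, H_m: float) -> float:
    """dr = r * dh / H for a true vertical photograph (r: radial distance
    of the top point's image from the nadir point)."""
    return r_top_mm * dh_m / H_m

def object_height(dr_mm: float, H_m: float, r_top_mm: float) -> float:
    """Invert the relation: h = dr * H / r."""
    return dr_mm * H_m / r_top_mm

dr = relief_displacement(100.0, 50.0, 1000.0)
print(dr)                                 # 5.0 (mm of displacement)
print(object_height(dr, 1000.0, 100.0))   # 50.0 (m), recovering dh
```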

How do flight height and camera focal length influence displacement?

Let the goal be to take a photo of a house, filling the complete image area. There are several possibilities to do that: take the photo from a short distance with a wide-angle lens (like camera position 1 in the figure), or from a far distance with a small-angle lens (telephoto, like camera position 2), or from any position in between or outside. The results will differ in the following ways:

The smaller the distance camera-object and the wider the lens angle, the greater are the displacements due to the central perspective, or, vice versa:

The greater the distance camera-object and the smaller the lens angle, the smaller are the displacements.


In an extreme (theoretical) case, if the camera could be as far as possible away from the object and if the angle were as small as possible ("super telephoto"), the projection rays would be nearly parallel, and the displacements near to zero. This is similar to the situation of images taken by a satellite orbiting some hundreds of kilometres above ground, where we have nearly parallel projection rays, yet influences come from the earth curvature.

So, at first glance, it seems that if one would like to transform a single aerial image to a given map projection, it would be best to take the image from as high as possible with a small-angle camera, to have the lowest displacements. Yet the radial-symmetric displacements are a prerequisite for viewing and measuring image pairs stereoscopically; that is why in photogrammetric practice most of the aerial as well as terrestrial photos are taken with a wide-angle camera, showing relatively high relief-dependent displacements.

Relative camera positions

Fig. 3: Camera positions parallel (left) and convergent (right).

To get three-dimensional coordinates of object points one needs at least two images of the object, taken from different positions. The point P (x, y, z) will be calculated as the intersection of the two rays [P'P] and [P''P]. One can easily imagine that the accuracy of the result depends, among other things, on the angle between both rays. The smaller this angle, the less will be the accuracy. It is reasonable to take into account that every measurement of the image points P' and P'' will have more or less small errors, and even very small errors here will lead to a large error, especially in z, when the angle is very small. This is one more reason why wide-angle cameras are preferred in photogrammetry (see Fig. 3).

Let A be the distance between the cameras and the object and B be the distance between both cameras (or camera positions when only a single camera is used); then the angle between both projection rays (continuous lines) depends on the ratio A/B, in the aerial case called the height-base ratio. Obviously it is possible to improve the accuracy of the calculated coordinates P(x, y, z) by increasing the distance B (also called the base). If the overlap area then becomes too small, you may use convergent camera positions ("squinting", in contrast to human vision, which is parallel). The disadvantage of this case is that you will get additional perspective distortions in the images. Note: the parallel (aerial) case is good for human stereo viewing and automatic surface reconstruction; the convergent case often leads to a higher precision, especially in the z direction.
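The height-base reasoning can be made concrete with a tiny sketch: for a symmetric configuration, the angle between the two projection rays at the object point follows from the object distance A and the base B. The expression 2·atan((B/2)/A) below is a geometric simplification for this symmetric case, not a formula taken from the text; the numbers are invented.

```python
import math

def intersection_angle_deg(A: float, B: float) -> float:
    """Angle between the two projection rays meeting at the object point,
    for a symmetric camera arrangement (degrees)."""
    return math.degrees(2 * math.atan((B / 2) / A))

# Shrinking the base at a fixed object distance shrinks the ray angle,
# which degrades the precision of the intersected point, especially in z:
for B in (600.0, 300.0, 100.0):
    print(round(intersection_angle_deg(1000.0, B), 1))
```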

Main photogrammetric procedures

Orientation of a stereo pair

The application of single photographs in photogrammetry is limited because they cannot be used for object space reconstruction, since the depth information is lost when taking an image.


Even though the exterior orientation elements may be known, it will not be possible to determine ground points unless the scale factor of every bundle ray is known.

This problem is solved by exploiting stereopsis, that is, by using a second photograph of the same scene, taken from a different position. If the scene is static, the same camera may be used to obtain the two images, one after the other. Otherwise, it is necessary to take the two images simultaneously, and thus to synchronize the two different cameras. Two photographs with different camera positions that show the same area, at least in part, are called a stereo pair.

The images in general have different interior orientations and different exterior orientations. And even if corresponding points (images of the same object point) are measured on both images, their coordinates will be known in different systems, thus preventing determination of the 3D coordinates of the object point. Consequently, a mathematical model of the stereo pair, and a uniform coordinate system for the image pair (the model coordinate system), is needed.

To define a stereo pair model, supposing that the camera(s) is (are) calibrated and the interior orientation parameters are known, one needs to determine:

the relative orientation of the two cameras;

the absolute orientation of the image model.

Relative orientation

The relative orientation of the two cameras is fixed by the following parameters:

the rotation of the second camera relative to the first (these are three parameters, the three relative orientation angles);

the direction of the base line connecting the two projection centers (these are two additional parameters; no constraint exists against a shift of the second camera in the direction toward or away from the first camera).

Therefore, the relative orientation of two calibrated cameras is characterized by five independent parameters. They can be determined if 5 corresponding image points are given. An object can be reconstructed from images of calibrated cameras only up to a spatial similarity transformation. The result is a photogrammetric model.

Absolute orientation

The orientation of the photogrammetric model in space is called absolute orientation. This is actually an application of a 7-parameter transformation. The transformation can only be solved if prior information about some of the parameters is introduced. This is most likely to be done with control points.

Control points: a control point is an object point with known real-world coordinates. A point with all three coordinates known is called a full control point. If only X and Y are known, then we have a planimetric control point. Obviously, with an elevation control point we know only the Z coordinate.

  • 8/13/2019 What is Photogrammetry

    12/17

How many control points are needed? In order to calculate the 7 parameters, at least seven equations must be available. For example, 2 full control points and one elevation control point would render a solution. If more equations (that is, more control points) are available, then the problem of determining the parameters can be solved as a least-squares adjustment. The idea is to minimize the discrepancies between the transformed and the available control points.
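The counting argument above can be sketched as: each full control point contributes 3 equations (X, Y, Z), a planimetric point 2, an elevation point 1, and the 7-parameter absolute orientation needs at least 7 in total.

```python
def equations(full: int, planimetric: int, elevation: int) -> int:
    """Number of observation equations contributed by the control points."""
    return 3 * full + 2 * planimetric + 1 * elevation

def solvable(full: int, planimetric: int, elevation: int) -> bool:
    """True if the 7-parameter transformation is determinable."""
    return equations(full, planimetric, elevation) >= 7

# The example from the text: 2 full points + 1 elevation point -> 7 equations.
print(equations(2, 0, 1), solvable(2, 0, 1))  # 7 True
```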

Aerial triangulation

Aerial triangulation (AT) (aerotriangulation) is a complex photogrammetric production line. The main tasks to be carried out are the identification of tie points and ground control points, the transfer of these points into homologous image segments, and the measurement of their image coordinates. Lastly, the image-to-object space transform is performed by bundle block adjustment.

The transition to digital imagery led to the appearance of the term digital aerial triangulation. The task implies the selection, transfer and measurement of image tie points by digital image matching. Digital aerial triangulation is generally associated with automated aerial triangulation, thanks to the potential of the digital approach to be automated.

Bundle adjustment (bundle block adjustment) is the problem of refining a visual reconstruction to produce jointly optimal 3D structure and viewing parameter (camera pose and/or calibration) estimates. "Optimal" means that the parameter estimates are found by minimizing some cost function that quantifies the model fitting error, and "jointly" that the solution is simultaneously optimal with respect to both structure and camera variations. The name refers to the "bundles" of light rays leaving each 3D feature and converging on each camera centre, which are "adjusted" optimally with respect to both feature and camera positions. Equivalently (unlike independent model methods, which merge partial reconstructions without updating their internal structure), all structure and camera parameters are adjusted together "in one bundle".

Bundle adjustment is really just a large sparse geometric parameter estimation problem, the parameters being the combined 3D feature coordinates, camera poses and calibrations.

Advantages of bundle block adjustment over other adjustment methods:

Flexibility: Bundle adjustment gracefully handles a very wide variety of different 3D feature and camera types (points, lines, curves, surfaces, exotic cameras), scene types (including dynamic and articulated models, scene constraints), information sources (2D features, intensities, 3D information, priors) and error models (including robust ones). It has no problems with missing data.

Accuracy: Bundle adjustment gives precise and easily interpreted results because it uses accurate statistical error models and supports a sound, well-developed quality control methodology.

Efficiency: Mature bundle algorithms are comparatively efficient even on very large problems. They use economical and rapidly convergent numerical methods and make near-optimal use of problem sparseness.

Systematic error corrections

Correction for lens distortion


Fig. 4: Barrel-shaped (left) and pincushion-shaped (right) distortions.

Lens irregularities and aberrations result in some image displacement.

A typical effect with wide-angle lenses are the barrel-shaped distortions: straight lines near the image borders are shown bent towards the borders. This effect usually becomes less or zero at medium focal lengths and may turn into the opposite form (pincushion-shaped) with telephoto lenses (see Fig. 4).

Besides these so-called radial-symmetric distortions, which have their maximum at the image borders, there are further systematic effects (affine, shrinking) and also non-systematic displacements. The distortions depend, among other things, on the focal length and the focus. To minimize the resulting geometric errors, efforts have been undertaken to find suitable mathematical models (one of the most widely used is the Brown model).

In most cases the radial-symmetric part has the largest effect of all; consequently, it is the main target of the correction. Distortion values are determined during the process of camera calibration. They are usually listed in tabular form, either as a function of the radius or of the angle at the perspective center. For aerial cameras the distortion values are very small. Hence, it is sufficient to interpolate the distortion linearly. Suppose one wants to determine the distortion for an image point x_p, y_p. The radius is r_p = (x_p^2 + y_p^2)^(1/2). From the table we obtain the distortion dr_i for r_i < r_p and dr_j for r_j > r_p. The distortion for r_p is interpolated linearly:

dr_p = dr_i + (dr_j - dr_i) (r_p - r_i) / (r_j - r_i)

The corrections in x- and y-direction are

dr_x = (x_p / r_p) dr_p
dr_y = (y_p / r_p) dr_p

Finally, the photo coordinates must be corrected as follows:

x_p' = x_p - dr_x = x_p (1 - dr_p / r_p)
y_p' = y_p - dr_y = y_p (1 - dr_p / r_p)
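The table-lookup correction can be sketched as follows. The calibration table below (radii and distortion values in mm) is invented for illustration; real calibration reports may instead tabulate the distortion against the angle at the perspective center.

```python
import bisect

# Invented calibration table: radius r_i [mm] -> radial distortion dr_i [mm].
radii = [0.0, 20.0, 40.0, 60.0, 80.0, 100.0]
distortions = [0.0, 0.002, 0.005, 0.004, -0.003, -0.010]

def correct_point(x, y):
    """Return photo coordinates corrected for radial distortion."""
    r = (x**2 + y**2) ** 0.5
    if r == 0.0:
        return x, y
    # Find the table interval r_i <= r <= r_j and interpolate linearly.
    j = min(bisect.bisect_left(radii, r), len(radii) - 1)
    i = max(j - 1, 0)
    if i == j:
        dr = distortions[j]
    else:
        t = (r - radii[i]) / (radii[j] - radii[i])
        dr = distortions[i] + t * (distortions[j] - distortions[i])
    # dr_x = (x/r) dr, dr_y = (y/r) dr; corrected = measured - correction,
    # which simplifies to scaling both coordinates by (1 - dr/r).
    return x * (1 - dr / r), y * (1 - dr / r)

print(correct_point(30.0, 40.0))  # r = 50 mm, halfway between table rows
```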

The radial distortion can also be represented by an odd-power polynomial of the form

dr = p_1 r + p_2 r^3 + p_3 r^5 + ...

The coefficients p_i are found by fitting the polynomial curve to the distortion values. The equation is linear in the coefficients, so every distortion value yields one linear observation equation. In order to avoid numerical problems (an ill-conditioned normal equation system), the degree of the polynomial should not exceed nine.
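Since dr is linear in the coefficients p_i, the fit is an ordinary linear least-squares problem: one tabulated (r, dr) pair gives one observation equation. A minimal sketch with an invented distortion table:

```python
import numpy as np

# Fit the odd-power radial distortion polynomial
#     dr = p1*r + p2*r**3 + p3*r**5
# to tabulated distortion values (invented, in mm) by linear least squares.
r = np.array([10.0, 20.0, 40.0, 60.0, 80.0, 100.0])
dr = np.array([0.001, 0.002, 0.005, 0.004, -0.003, -0.010])

# Design matrix: one row per observation equation, columns r, r^3, r^5.
# The degree is kept well below nine, as recommended, to avoid an
# ill-conditioned normal equation system.
A = np.column_stack([r, r**3, r**5])
p, *_ = np.linalg.lstsq(A, dr, rcond=None)

fitted = A @ p
print("coefficients:", p)
print("max residual [mm]:", np.abs(fitted - dr).max())
```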

  • 8/13/2019 What is Photogrammetry

    14/17

Correction for refraction

Fig. 9: Correction for refraction.

Fig. 9 shows how an oblique light ray is refracted by the atmosphere. According to Snell's law, a light ray is refracted at the interface of two different media, and the density layers of the atmosphere act as such different media. The refraction causes the image point to be displaced outward, quite similar to a positive radial distortion. The radial displacement caused by refraction can be computed by

dr = K (r + r^3 / c^2)

K = {2410 H / (H^2 - 6 H + 250) - 2410 h^2 / [(h^2 - 6 h + 250) H]} * 10^-6

where c is the calibrated focal length and H and h are the flying altitude and the ground elevation, in kilometers. These equations are based on a model atmosphere defined by the
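The refraction displacement can be evaluated directly. Following the formula above, H and h are taken in kilometres and r and c in millimetres; the sample flight parameters below are invented.

```python
# Radial image displacement caused by atmospheric refraction:
#     dr = K * (r + r**3 / c**2)
# H = flying altitude [km], h = ground elevation [km],
# r = radial image distance [mm], c = calibrated focal length [mm].

def refraction_k(H, h):
    """Refraction constant K for flying altitude H and elevation h [km]."""
    return (2410.0 * H / (H**2 - 6.0 * H + 250.0)
            - 2410.0 * h**2 / ((h**2 - 6.0 * h + 250.0) * H)) * 1e-6

def refraction_dr(r, c, H, h):
    """Radial displacement dr [mm] for image radius r and focal length c."""
    return refraction_k(H, h) * (r + r**3 / c**2)

# Invented example: flying at 4 km over terrain at 0.5 km elevation,
# a wide-angle aerial camera (c = 153 mm), point near the image edge.
K = refraction_k(H=4.0, h=0.5)
print("K =", K)
print("dr [mm] =", refraction_dr(r=100.0, c=153.0, H=4.0, h=0.5))
```

The displacement comes out on the order of a few micrometres, which is why the correction matters mainly for precise aerial triangulation.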

  • 8/13/2019 What is Photogrammetry

    15/17

transformed, say from a State Plane coordinate system to a Cartesian system. The X and Y coordinates of a State Plane system are Cartesian, but the elevations are not. Fig. 10 shows the relationship between elevations above a datum (h) and elevations in the 3D Cartesian system. If we approximate the datum by a sphere with radius R = 6372.2 km, then the radial displacement can be computed by

dr = r^3 (H - h) / (2 c^2 R)

Strictly speaking, the correction of photo coordinates due to earth curvature is not a refinement of the mathematical model. It is much better to eliminate the influence of earth curvature by transforming the object space into a 3D Cartesian system before establishing relationships with the ground system. This is always possible, except when compiling a map. A map generated on an analytical plotter, for example, is most likely plotted in a State Plane coordinate system; that is, the elevations refer to the datum and not to the XY plane of the Cartesian coordinate system. It would be quite awkward to produce the map in the Cartesian system and then transform it to the target system. Therefore, during map compilation, the photo coordinates are "corrected" so that conjugate bundle rays intersect in object space at positions related to the reference sphere.
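The earth-curvature displacement dr = r^3 (H - h) / (2 c^2 R) can be evaluated directly. The unit choices and sample values below are assumptions made for illustration: H, h and R share one unit (metres), while r and c are in millimetres so dr also comes out in millimetres.

```python
# Radial displacement due to earth curvature:
#     dr = r**3 * (H - h) / (2 * c**2 * R)
# with the reference-sphere radius R = 6372.2 km.
R_EARTH = 6372.2e3  # m

def earth_curvature_dr(r, c, H, h):
    """Radial displacement [mm]; r, c in mm, H and h in metres."""
    return r**3 * (H - h) / (2.0 * c**2 * R_EARTH)

# Invented example: flying height 4000 m, ground elevation 500 m,
# c = 153 mm, point 100 mm from the principal point.
print("dr [mm] =", earth_curvature_dr(r=100.0, c=153.0, H=4000.0, h=500.0))
```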

Length and angle units

Normally, coordinates and distances in photogrammetry are given in metric units, according to the international standard. But in several cases non-metric units can also be found, such as:

Foot ( ' ): Sometimes used to give the terrain height above mean sea level, for example in North American or British topographic maps, or the flying height above ground.

Inch ( " ): For instance used to define the resolution of printers and scanners (dots per inch).

1' = 12" = 30.48 cm; 1" = 2.54 cm; 1 m = 3.281'; 1 cm = 0.394"

Angles are normally given in degrees. In mathematics, radians are also common. In geodesy and photogrammetry, grads are used; in the military, the so-called mils. A full circle is:

360 degrees = 400 grads = 2π radians = 6400 mils
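Because each unit is defined by its count per full circle, converting between them is a single ratio. A minimal helper:

```python
import math

# Angular units, each defined by how many of them make a full circle:
# 360 degrees = 400 grads = 2*pi radians = 6400 mils.
FULL_CIRCLE = {"deg": 360.0, "grad": 400.0, "rad": 2.0 * math.pi, "mil": 6400.0}

def convert_angle(value, src, dst):
    """Convert an angle between deg, grad, rad and mil."""
    return value * FULL_CIRCLE[dst] / FULL_CIRCLE[src]

print(convert_angle(90.0, "deg", "grad"))   # right angle -> 100.0 grads
print(convert_angle(100.0, "grad", "mil"))  # -> 1600.0 mils
```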

Glossary

--B--

Base

Distance between the projection centers of neighboring photos.

Block

All images of all strips.

--D--

  • 8/13/2019 What is Photogrammetry

    16/17

Datum

A set of parameters and control points used to accurately define the three-dimensional shape of the Earth. The corresponding datum is the basis for a planar coordinate system.

--F--

Flight altitude

Flight height above the datum.

Flight height

Flight height above mean ground elevation.

Fiducial marks

Any marker built into an aerial camera that registers its image on an aerial photograph as a fixed reference mark. There are usually four fiducial marks on a photograph, which are used to define the principal point of the photograph.

--I--

Image

The photo in digital representation: the scanned film, or a photo taken directly by a digital camera.

Image coordinates / pixel coordinates

In digital image processing, the expression "image coordinates" refers to pixel positions (row / column), while in classical photogrammetry it indicates the coordinates transformed to the fiducial-mark nominal values. For differentiation, the expression "pixel coordinates" is sometimes used in the context of digital image processing.

Image refinement

The process of correcting photos for systematic errors, such as radial distortion, refraction and earth curvature.

--M--

Model (stereo model, image pair)

Two neighboring images within a strip.

Model area

The area covered by stereo images (an image pair).

--O--

Overlaps

An image flight is normally carried out such that the area of interest is photographed strip by strip, the aircraft turning around after every strip, so that the strips are taken in a meander-like sequence. The two images of each model have a longitudinal overlap of approximately 60 to 80% (also called end lap); neighboring strips have a lateral overlap of normally about 30% (also called side lap). This is necessary not only for stereoscopic viewing but also for connecting all images of a block within an aerial triangulation.

--P--

  • 8/13/2019 What is Photogrammetry

    17/17

Photo

The original photo as recorded on film or on the sensor.

Plumb line

Vertical.

--R--

Relief

The topographic variations of the surface.

Resolution

The minimum distance between two adjacent features, or the minimum size of a feature, that can be detected by photogrammetric data acquisition systems.

--S--

Skew

A transformation of coordinates in which one coordinate is displaced in one direction in proportion to its distance from a coordinate plane or axis.

Stereoplotter

An instrument that lets an operator see two photos at once in a stereo view.

Strip

All overlapping images taken one after another within one flight line.

--T--

Tilt

Deviation of the camera axis from the vertical.

References

Multilingual Dictionary of Remote Sensing and Photogrammetry, ASPRS, 1983.

Manual of Photogrammetry, ASPRS, 5th Ed., 2004.

Moffitt, F.H. and E.M. Mikhail, 1980. Photogrammetry, 3rd Ed., Harper & Row Publishers, NY.