Automated LiDAR Data Quality Control
DESCRIPTION
Conference presentation for the 2013 International LiDAR Mapping Forum (ILMF), which was held at the Hyatt Regency Denver at Colorado Convention Center in Denver, Colorado, from February 11-13, 2013.
TRANSCRIPT
Engineering | Architecture | Design-Build | Surveying | GeoSpatial Solutions
Automated LiDAR Data Quality Control
February 12, 2013
Presenter
Matt Bethel, GISP
Director of Technology for Merrick & Company
Development Manager for Merrick’s Advanced Remote Sensing (MARS) software
Merrick & Company Office Locations
500 employees at 13 national and 4 international offices
Merrick’s International Project Experience
Presentation Objective
This presentation will review an automated approach to airborne LiDAR quality analysis and quality control (QA/QC) that is based on the USGS' National Geospatial Program Lidar Base Specification Version 1.0. It will showcase a fully automated process for analyzing LiDAR data in its entirety to verify and report compliance with a project's acceptance criteria.
USGS NGP Lidar Base Specification Version 1.0
http://pubs.usgs.gov/tm/11b4/TM11-B4.pdf
Intended to create consistency across all of USGS' National Geospatial Program (NGP) funded LiDAR data collections, in particular those undertaken in support of the National Elevation Dataset (NED)
Unlike most other “LiDAR specs”, which focus on the derived bare earth digital elevation model (DEM) product, this specification places unprecedented emphasis on the handling of the source LiDAR point cloud data
Who should have LiDAR QA/QC concerns?
Data providers: to ensure that data meets project specifications prior to delivery
Client/End users (commercial entities, local/state/federal organizations): to ensure that they are receiving the products that they purchased and require for their specific needs
Any purchaser of LiDAR data that requires a reliable process to determine if final payment should be authorized
The Problem – Client Side
RFPs and project scopes of work state accuracy requirements but…
rarely say anything about how they will test these requirements
usually address absolute accuracy but not always relative accuracy
sometimes contradict themselves (“+/-15cm RMSEz at the 95% C.I.”; see the arithmetic sketch below)
are often copied from other documents, leaving the client not really knowing what they are asking for or understanding what they are getting
nearly everyone is asking for something slightly different
USGS Lidar Base Specification Version 1.0
“We want that”
“We want pieces of that”
“We want to refer to that but ask for this”
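Why that “+/-15cm RMSEz at the 95% C.I.” example is self-contradictory: under the NSSDA convention (assuming normally distributed, bias-free vertical errors), accuracy at the 95% confidence level is 1.9600 × RMSEz, so a single 15 cm figure cannot describe both statistics at once. A minimal arithmetic sketch:

```python
# Why "+/-15 cm RMSEz at the 95% C.I." is self-contradictory (NSSDA convention,
# assuming normally distributed, bias-free vertical errors):
#   vertical accuracy at 95% confidence = 1.9600 * RMSEz
rmse_z_cm = 15.0
accuracy_95_cm = 1.9600 * rmse_z_cm      # ~29.4 cm at 95% confidence
rmse_if_15_is_95 = 15.0 / 1.9600         # ~7.7 cm RMSEz if 15 cm is the 95% value
print(f"15 cm RMSEz       -> {accuracy_95_cm:.1f} cm at 95% confidence")
print(f"15 cm at 95% C.I. -> {rmse_if_15_is_95:.1f} cm RMSEz")
```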
The Problem – Vendor Side
When contracted to QA LiDAR projects, we have seen a rise in poor-quality data as a trade-off to push the bidding price down
Data providers vary the procedure, frequency, and extent of their LiDAR calibration
Many vendors use automated boresight tools, which can have negative outcomes:
Lower skill level required
Effective enough to be dangerous
Most do not consider all aspects of an error budget
They do not always find and flag flight planning or acquisition issues, sensor malfunctions, or human mistakes
Oftentimes, little to no QA/QC procedures are in place
Some ‘cheat’ to get around proper calibration and other QC tasks
Clipping off or reclassifying edge lap to avoid dealing with LiDAR boresight
Shifting tiles to a custom geoid derived from the vertical error to ground control
Some vendors can hide error through other creative techniques, especially if they discover problems after the plane has left the jobsite
These practices can be caught and/or avoided
The Problem – Quality
QA/QC methodologies ranged from…
None
Checking a representative sample (what happens everywhere else?)
Checking some things but not others (e.g., absolute accuracy but not relative calibration)
Throwing many people and a lot of time at projects to manually check as much data as possible (or as much as the budget will allow)
Contracting it out; typically it is done right, but with added cost and delay
Clients rarely know how to properly review LiDAR data, nor do they have the tools to do so
We needed more automated tools to get quality answers quickly and accurately about our LiDAR data
Our Goals
To check all airborne LiDAR data in an automated fashion
Make it work across sensor platforms
Make it accurate
Make it usable
Make it customizable
Make it fast
Provide quantitative and qualitative results, whenever possible
When this is impossible, create derivative products during the automated process that will help the user QC the data as quickly and thoroughly as possible
Create tools that catch problems before it is too late
Create links to supplemental data that can assist with the QC process
Create reports that the end user can understand
Deliver these reports to the client or empower them to perform automated QA/QC analysis on their own data
Provide this tool to end users that have these challenges
MARS Tool Development
We developed many stand-alone tools in MARS to analyze and report on many aspects of LiDAR QA/QC
Control reporting tools (absolute accuracy)
Flight line vertical separation rasters (relative accuracy)
Point density reporting
Spatial distribution verification
Hillshade to check LiDAR filter
LAS statistics
Intensity/range analysis
Void detection
Others
These tools run on the entire dataset and often produce a report or a single, manageable output raster compressed to JPEG2000 format for fast display and small file size
Excluding control point reporting, the products of these tools report on all of the data, not a representative sample (a simplified sketch of two such checks follows)
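As an illustration of two of the stand-alone checks listed above, here is a minimal sketch (not the MARS implementation) of a per-cell point density report and a control-point vertical accuracy check. It assumes the laspy library for LAS I/O plus numpy/scipy; the file names, the 1 m cell size, and the density threshold are hypothetical.

```python
# Minimal QC sketch (illustrative only, not MARS code).
import numpy as np
import laspy                               # assumed available for LAS I/O
from scipy.spatial import cKDTree

las = laspy.read("tile.las")               # hypothetical delivery tile
x, y, z = np.asarray(las.x), np.asarray(las.y), np.asarray(las.z)

# --- Point density report: returns per square meter on a 1 m grid ---
CELL = 1.0                                 # example cell size (m)
MIN_DENSITY = 2.0                          # example project acceptance threshold
x_edges = np.arange(x.min(), x.max() + CELL, CELL)
y_edges = np.arange(y.min(), y.max() + CELL, CELL)
density, _, _ = np.histogram2d(x, y, bins=[x_edges, y_edges])
print(f"mean density: {density.mean():.2f} pts/m^2, "
      f"cells under threshold: {(density < MIN_DENSITY).sum()}")

# --- Absolute accuracy: vertical error at surveyed checkpoints ---
# control_points.csv (hypothetical): easting, northing, elevation per row
control = np.loadtxt("control_points.csv", delimiter=",")
tree = cKDTree(np.column_stack([x, y]))
_, nearest = tree.query(control[:, :2])    # nearest LiDAR return per checkpoint
dz = z[nearest] - control[:, 2]
rmse_z = np.sqrt(np.mean(dz ** 2))
print(f"RMSEz: {rmse_z:.3f} m, 95% vertical accuracy (1.96*RMSEz): {1.96 * rmse_z:.3f} m")
```

A production absolute-accuracy check would interpolate checkpoint elevations from a TIN of ground-classified returns rather than take the nearest return, but the reporting structure is the same.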
Modularization and Automation
We built a module in MARS that combines our stand-alone tools into an automated process that tests for the 29 USGS Lidar Base Specification Version 1.0 items
This creates two PDF reports (detailed and summary) plus subsequent derivative products
It is batched, and performance has been optimized to run on large data sets (see the sketch below)
Multi-threaded
Effective RAM utilization
Temporary local disc caching for slower network processing needs
It is customizable so that some or all of the tests can be processed, depending on the need or available input data
Output report and derivative data are both thematically rendered and statistically reported
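The batching and customizable test selection described above can be pictured with a short structural sketch. This is an assumed design, not MARS internals; the test names, placeholder metrics, and directory layout are hypothetical.

```python
# Structural sketch of a batched, customizable QC run (illustrative only).
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def point_density_test(tile: Path) -> dict:
    return {"mean_density_pts_m2": 8.1}        # placeholder metric

def void_detection_test(tile: Path) -> dict:
    return {"void_cells": 0}                   # placeholder metric

QC_TESTS = {
    "point_density": point_density_test,
    "void_detection": void_detection_test,
}

def run_tile(tile: Path, selected: list[str]) -> dict:
    """Run only the user-selected tests on one delivery tile."""
    return {name: QC_TESTS[name](tile) for name in selected}

def qc_project(tile_dir: str, selected: list[str], workers: int = 8) -> dict:
    tiles = sorted(Path(tile_dir).glob("*.las"))
    # Threads shown for simplicity; CPU-bound tests would use worker processes.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(lambda t: run_tile(t, selected), tiles))
    return dict(zip((t.name for t in tiles), results))  # feeds the PDF report step

print(qc_project("./tiles", ["point_density", "void_detection"]))
```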
Results
A comprehensive and automated approach to checking the quality of all LiDAR point file deliverables in their ENTIRETY – no representative sample testing
A tool that saves an enormous amount of manual QC labor (hours and dollars)
A workflow addition that eliminates costly rework and project delays
A process for data providers to deliver better products (first time delivery acceptance) and invoice the customer sooner
A tool for end users to understand what level of data quality they are receiving and be able to provide proof of required rework. This also educates the client about their data investment.
A mechanism for clients to decrease the delivery acceptance period and start using the data sooner
Performance Benchmarks
[Chart: MARS QC Module Benchmark Results; Runtime (hours) vs. LiDAR Data Size (GB)]
Run times depend on:
Data
LiDAR flightline distribution
Flightline overlap
Project boundary complexity
Number of project boundaries
Number of delivery tiles
LiDAR density
Land cover
Processing computer hardware
Number of CPUs
Amount of available RAM
Disc / network speed
Settings
All tests run versus selected tests
Optional derivative data produced
Very rough processing speed (ratio of data size to processing time) is ~3 GB per hour on a high-end processing computer (8-16 CPUs and 12-48 GB of RAM); see the estimate sketch below
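As a back-of-the-envelope use of that ~3 GB/hour figure, run time can be roughly estimated as data size divided by throughput; the helper below is only illustrative, and real runs vary with the factors listed above.

```python
# Rough runtime estimate from the ~3 GB/hour benchmark figure (illustrative only).
def estimate_runtime_hours(data_size_gb: float, gb_per_hour: float = 3.0) -> float:
    return data_size_gb / gb_per_hour

for size_gb in (30, 60, 120):
    print(f"{size_gb:>4} GB -> ~{estimate_runtime_hours(size_gb):.0f} hours")
```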
Report Demo
Future Developments
Workflow staged processing
Coverage check
Boresight
Filter
Delivery
Distributed processing
More user definable LiDAR QA/QC tests
Additional LiDAR specifications
Horizontal accuracy measurement and reporting capabilities
Thank you
Matt Bethel
Director of Technology
Merrick & Company - Booth #45
303-353-3662
http://www.merrick.com/Geospatial
http://www.merrick.com/Geospatial/Services/MARS-Software