By Sanchit Agarwal, director of mapping operations; Brad Arshat, director of strategic accounts; and Shawn Benham, senior project manager, Sanborn (www.sanborn.com), Colorado Springs, Colo.
From the glorious past of plane table surveying and analytical plotters to the mesmerizing future of augmented reality and holographic models, there has been a seismic shift in the way maps are created, distributed and consumed. Gigapixel sensors; multicore, multi-thread processors; and incredibly detailed, automated algorithms are revolutionizing aerial mapping. With such powerful choices available for geospatial professionals comes the huge responsibility of leveraging technology wisely to produce more precise, crisp and user-oriented mapping products and applications.
As technology has improved, automated processes, such as parcel extraction and noise reduction, have increased workflow throughput dramatically. Although detailed, specific instructions can be programmed into machines to carry out such routine tasks, a completely unsupervised algorithm implementation can lead to quality control issues. Thus, more than ever, employing best practices throughout all stages of aerial mapping projects is
essential. Here we describe several critical aspects of project planning, data acquisition, data production and quality control.
Few project design activities are as important as planning imagery acquisition. As procurements grow in size and complexity—often involving large numbers of stakeholders who bring money and increased expectations to the table—it’s important to ensure the acquisition process takes all potential needs into account beforehand. This is critical, because schedules and budgets usually preclude re-flying a project if something is overlooked. Some accuracy and resolution factors are nearly universally understood—as is the need for a high sun angle and clear, calm air—but other important criteria may be less evident.
For example, seasonal requirements are an especially important issue. Although certain applications require leaf-on conditions in which vegetation is in its most vigorous state, many (if not most) mapping programs call for maximum exposure of terrain and man-made features, i.e., leaf-off/minimal vegetation, no snow, and streams and rivers within their natural banks. When weather is factored in, this combination of conditions may exist for only a few weeks annually in certain parts of the world.
Temporal requirements may get down to the day of the week or even the hour of the day. Imagery for urban mapping programs that require photogrammetric extraction of features such as maintenance holes, hand holes and parking stripes won’t be as useful if it’s acquired when numerous vehicles obscure required features. Increasingly compelling coastal issues often call for tide-coordinated imagery acquisition. Dense urban environments with tall buildings may offer only a short window of opportunity each day when sunlight banishes shadows that hide the streetscape. Flights often must be coordinated with civilian or military air traffic control facilities. Permanent or temporary restrictions may make access to certain airspaces difficult, if not outright impossible, during what otherwise would be optimal acquisition times.
Such situations may require users to select a sensor with a longer or shorter focal length lens, reschedule or reconfigure flights, or even ask clients to compromise standards and specifications for some deliverables. These choices are better explained in a proposal or project initiation meeting than apologized for after airborne operations.
Successful aerial data acquisition is the foundation of any mapping program. Missteps in collecting aerial data can increase a program’s cost due to re-collection efforts, production delays and issues that may cause the data produced to fall outside required specifications. Consider the following best practices to help ensure data are captured accurately and completely:
• During a busy flying season, a flight crew may have multiple projects within the same region, each with a unique set of requirements. Therefore, it’s critical to define and document detailed project specifications, such as allowable sun angle, target resolution, allowable crab/tilt, photo overlap, etc. Flight plan reviews by the project principal and end client help to ensure the area of interest (AOI) has full coverage prior to flight plan distribution.
• To aid in tracking and reporting, unique names or IDs should be assigned to each line/exposure station within the project. As projects grow in size and complexity, ID nomenclatures can help define specific imagery resolutions, flying heights and flight blocks.
• With constant vigilance by local, state and federal agencies against the threat of terrorism, advance notice and flight coordination are becoming more critical every day. Reviewing published charts or online materials to identify restrictions within the AOI is no longer sufficient to ensure access to the desired airspace. Provide specific flight plans and flight expectations to air traffic control well in advance of flights to ensure access is available when conditions become favorable.
• As the size and complexity of aerial programs continue to increase, multiple aircraft often are required within a localized area. When collecting with multiple aircraft, a planning meeting every morning allows crews to plan for the day ahead by reviewing assignments and weather expectations. At the end of each day, complete and circulate detailed flight logs that identify any issues noted during flight that impact data quality, such as turbulence, cloud shadows or sensor anomalies, as well as suggestions for improvement.
• Many imagery programs have parameters (leaf-free, snow-free, cloud-free, etc.) that create narrow flying windows. Given these constraints, the real-time review and identification of issues within the collected imagery is critical, so any required re-collection can be performed while the conditions permit.
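The specification checks described above can be captured in a simple pre-flight/post-exposure validation routine. The sketch below is illustrative only; the class name, parameter names and all threshold values are hypothetical and would come from the actual project specification:

```python
from dataclasses import dataclass

@dataclass
class FlightSpec:
    """Hypothetical project flight specification; values are illustrative."""
    min_sun_angle_deg: float = 30.0    # allowable sun angle
    max_crab_deg: float = 3.0          # allowable crab/tilt
    min_forward_overlap: float = 0.60  # photo overlap along track
    min_side_overlap: float = 0.30     # overlap between adjacent lines

def check_exposure(spec, sun_angle_deg, crab_deg, forward_overlap, side_overlap):
    """Return the list of violated parameters for one exposure (empty = pass)."""
    issues = []
    if sun_angle_deg < spec.min_sun_angle_deg:
        issues.append("sun angle")
    if abs(crab_deg) > spec.max_crab_deg:
        issues.append("crab/tilt")
    if forward_overlap < spec.min_forward_overlap:
        issues.append("forward overlap")
    if side_overlap < spec.min_side_overlap:
        issues.append("side overlap")
    return issues
```

Running such a check against each exposure station's logged values, as part of the real-time review noted above, flags candidate re-collection lines while conditions still permit a re-flight.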
In contrast with the era of film photography, when the processing of photos used to take days or even weeks, the latest advances in digital camera technology and mobile computing have reduced the amount of time required for such tasks. Today the initial review of the collected data happens immediately after landing or even during flights. More detailed quality-control processes are performed within 24-48 hours following data collection—usually in office environments—so any identified re-flight routes can be communicated in a timely manner to the aerial team.
Airborne Global Positioning System (GPS) and Inertial Measurement Unit (IMU) technologies enable direct georeferencing processes and provide a boon to mapping contractors and clients alike. Eliminating much of the ground control often required with traditional methods has reduced costs and accelerated schedules. Although establishing a minimal network of accurate and well-distributed ground control points is a preferred practice for high-accuracy mapping programs, even this methodology has benefited from GPS/IMU techniques through greater flexibility in where necessary ground control can be set.
The GPS/IMU approach forms the foundation of accuracy for imagery-derived mapping products in programs where it’s applied. As a result, it’s critical to verify the integrity of GPS/IMU results through aero-triangulation and ground checkpoints—still considered to be the gold standard for accuracy testing. In cases where the use of contractor-blind checkpoints isn’t feasible, an aero-triangulation solution must be run on the GPS/IMU control at minimum.
Such a solution can generate a root-mean-square error (RMSE) for ground control against the GPS/IMU control coordinates while withholding some or all of the project’s established ground-control points. This also provides a validity test, prior to running the final aero-triangulation adjustments that incorporate all of the ground-control points, and supports the extraction of the best possible coordinates for subsequent photogrammetric operations. Post-flight ground control also is an option (see “Consider Post-Flight Ground Control,” below).
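At its core, the checkpoint comparison reduces to a per-axis root-mean-square error between surveyed coordinates and their aero-triangulated counterparts. A minimal sketch, assuming points are simple (x, y, z) tuples in the same projected units:

```python
import math

def rmse_per_axis(known_pts, derived_pts):
    """RMSE of derived (e.g., aero-triangulated) coordinates against
    withheld checkpoint coordinates, reported per axis (x, y, z)."""
    n = len(known_pts)
    sums = [0.0, 0.0, 0.0]
    for known, derived in zip(known_pts, derived_pts):
        for i in range(3):
            sums[i] += (derived[i] - known[i]) ** 2
    return tuple(math.sqrt(s / n) for s in sums)
```

Withheld points are excluded from the adjustment and used only in this comparison, so the resulting RMSE is an independent validity test rather than a measure of how well the adjustment fit its own control.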
Digital Elevation Models
Digital Elevation Model (DEM) is a generic term for elevation models. As shown in Figure 1, there can be different types of elevation data: Digital Surface Model (DSM) and Digital Terrain Model (DTM). A DSM depicts the elevation of the reflective surface with all the visible cultural and man-made features, whereas a DTM provides the elevation of the bare-earth surface.
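The relationship between the two models is simple arithmetic: subtracting the DTM from the DSM yields the heights of above-ground features (often called a normalized DSM). A minimal sketch, assuming the two grids are co-registered lists of rows with a shared nodata value:

```python
def ndsm(dsm, dtm, nodata=-9999.0):
    """Per-cell above-ground feature height (nDSM = DSM - DTM)
    from matching DSM/DTM grids, propagating nodata cells."""
    out = []
    for dsm_row, dtm_row in zip(dsm, dtm):
        out.append([s - t if s != nodata and t != nodata else nodata
                    for s, t in zip(dsm_row, dtm_row)])
    return out
```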
The temporal characteristics, geometric accuracy and surface quality of the DEM surface play a significant role in an aerial mapping program’s success, as shown in Figure 2. Many project proposals suggest using an existing DEM source. More often than not, the existing DEM comes from an old source that doesn’t depict an area’s current state.
In addition, the level of detail of an existing DEM can be insufficient to generate products at the desired resolution. This can cause artifacts—smearing, warping, shifting, etc.—in the final product and can significantly degrade overall product quality.
Correcting such errors in the downstream production processes using tools such as Photoshop can be labor intensive, inefficient and expensive. It’s worth spending the time, resources and energy to thoroughly check, supplement, enhance and produce the appropriate DEM before the rectification process.
In cases where an existing DEM isn’t available, a DEM can be compiled manually using 3-D stereo images. Although the mathematical formula for automatically extracting elevation with auto-correlation techniques has been known for years, the computational horsepower needed to mass produce the surface didn’t exist until recently. Hardware advancements have facilitated the generation of massive elevation models with minimal human intervention. Although these automated surfaces need to be checked, enhanced (using breaklines) and manually formatted before consumption, the time and effort required to produce a DEM has been reduced significantly.
Today’s auto-correlation techniques also eliminate two fundamental problems. As a DEM is produced from the latest imagery, there’s no temporal shift between the imagery and the DEM. In addition, the DEM is derived from the same aerial triangulation solution, so there’s no horizontal or vertical shift between the images and the DEM’s surface. This methodology is gaining momentum and soon will become the default process in an orthoimagery aerial mapping program’s workflow.
Orthoimagery is used as the primary base-map dataset in many enterprise and local geographic information systems (GISs). Recent technological advances in digital cameras, GPS/IMU systems and distributed processing environments have lowered production costs, expanded the amount of imagery collected annually and increased product resolution. These advancements have made it more important than ever to follow best practices to maintain quality and accuracy in this foundational dataset.
Preferred imagery color and contrast are subjective criteria and can vary widely from user to user. As shown in Figure 4, precisely defining a target set of radiometry parameters early in the imagery processing stage is required to achieve a consistent overall appearance prior to rectification. Radiometric color balancing is required across images within a single flight and across multiple mission flights.
Special attention should be given to very dark (shadowed) and very bright (tin rooftop) locations to ensure that no information is lost. Following rectification, further balancing and adjustment may be performed to ensure tonal balance across the entire project area meets the predefined target sample as closely as possible.
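One simple form of tonal balancing is a linear rescale of each band toward the predefined target’s mean and standard deviation, with output clamped to the 8-bit range so shadow and highlight detail isn’t clipped further than necessary. The function below is an illustrative sketch, not a production radiometric workflow:

```python
def match_mean_std(band, target_mean, target_std):
    """Linearly rescale a band's values so its mean and standard deviation
    approximate a predefined radiometric target (8-bit output)."""
    n = len(band)
    mean = sum(band) / n
    var = sum((v - mean) ** 2 for v in band) / n
    std = var ** 0.5 or 1.0  # guard against a flat (zero-variance) band
    gain = target_std / std
    return [max(0, min(255, round(target_mean + gain * (v - mean))))
            for v in band]
```

In practice the target parameters would be derived from an approved sample image, and the same targets applied across all images in all flights to achieve the consistent overall appearance described above.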
With recent advances in computing power and GIS software, seaming and mosaicking orthoimagery has become an increasingly automated process. Regardless of the sophistication of such algorithms, human interaction is needed to create the aesthetically pleasing look of a high-quality orthoimage. As shown in Figure 5, seam lines should be placed to avoid buildings and strategically routed around above-ground features as much as practical. The image center, or “sweet spot,” should be maximized to reduce radial distortion and building lean.
Note an image’s undesirable portions—flares, hot spots, minor cloud shadows—and adjust seam lines to use alternative adjacent images. Images should be reviewed at all seam lines to check for positional offsets among ground features. If offsets are noted, review process inputs such as aerial triangulation and DEM. Mosaic locations should be feathered where necessary to ensure seamless and homogenous image appearance.
Editing and finishing activities typically require most of an orthoimagery process’s man-hours and resources. Macro (projectwide) and micro (at appropriate zoom levels for the imagery scale) reviews should be performed to ensure the product is free from defects. Given the inherent distortion of the aforementioned ground features, identify bridges and correct displacements within the project area, as shown in Figure 6.
Review all areas for obvious signs of DTM-related issues—e.g., smears and wavy linear features. Cut or “tile” products to a grid with consistent image sizes and a pre-defined standard naming convention. It’s important for the tile size to be evenly divisible by the pixel size of the orthoimagery to avoid any partial offsets or overlaps between tiles. Finally, perform a horizontal accuracy assessment by comparing ground control coordinates (paneled or photo-identifiable) to the location in the orthoimagery. This will ensure the product meets the required accuracy.
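The divisibility rule for tiling can be verified programmatically: the tile’s ground dimension divided by the pixel size must come out to a whole number of pixels. A small sketch (units and tolerance are illustrative):

```python
def tile_is_clean(tile_size_ground_units, pixel_size):
    """True if the tile dimension is an exact whole number of pixels,
    so adjacent tiles neither overlap nor leave partial-pixel slivers."""
    pixels = tile_size_ground_units / pixel_size
    return abs(pixels - round(pixels)) < 1e-9
```

For example, a 1,500-unit tile at a 0.5-unit pixel size yields exactly 3,000 pixels and tiles cleanly, whereas the same tile at a 0.3-unit pixel size does not.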
Technology improvements, such as softcopy photogrammetric workstations with stereo superimposition, allow technicians to see acquired planimetric features traced in vector form directly over the top of imagery. Although this leads to more accurate and complete data collection, it’s a time-consuming, labor-intensive and expensive process. Some promising work is being done with light detection and ranging (LiDAR) data and other technologies. However, the “holy grail” of accurate, fully automated feature extraction has yet to be achieved. Offshoring to locations where low-wage labor is available remains the primary way to reduce costs and accelerate production schedules.
Moreover, planimetric mapping remains a less-than-perfect process from the standpoint of the success rate at which features within a particular class can be captured. Feature capture can be affected by the source imagery’s scale or resolution limits, shadows, vegetation cover, roof overhangs, temporary structures such as mobile canopies, vehicles driving or parked-over features, paved-over or dirt-covered features, or features that are close in color to surrounding terrain features.
It’s becoming increasingly common in major procurements for mapping contractors to be asked what their estimated capture success rate is for each feature class. Where this hasn’t been the case, a candid discussion of this issue before or during the procurement can help to avoid a disappointed or angry client down the road. The same is true for situations that can’t be mitigated by using sensors with a higher dynamic range or flying during a season, day or time that maximizes the chances of a successful capture.
An extra challenge presents itself on projects that entail updating a planimetric database rather than creating a new one. A mapping contractor is called upon to review hundreds, thousands or even tens of thousands of square miles of imagery; identify the changes; and make the necessary edits to the planimetric database. This is a difficult, time-consuming process.
Accurately estimating the cost and schedule are equally complex tasks. Sometimes a helpful client can ease the process by providing parcel polygons from a GIS for properties where construction permits were pulled since the database was updated or by providing general guidance about the percentage of change and where it occurred. Nevertheless, in the end, the responsibility and liability for an accurate and complete database remains with the mapping contractor.
Automated change detection is a valuable new tool that identifies spectral, textural and linear feature changes between imagery of an area acquired on two different dates and highlights areas where change has occurred, as shown in Figure 7. Another form of change detection can be performed by rapidly generating a new, low-resolution DEM using auto-correlation techniques. This new DEM can be compared with the client’s old DEM using surface subtraction techniques, and 3-D change areas can be flagged for updating. Although neither of these techniques is perfect, and such analysis can generate “false positives,” they apply some degree of automation and efficiency to what has been a time-consuming and highly labor-intensive process.
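The surface-subtraction approach reduces to differencing two co-registered grids and flagging cells whose vertical change exceeds a tolerance. The sketch below is a minimal illustration; the threshold value is hypothetical and would be tuned per project to balance misses against the “false positives” noted above:

```python
def flag_3d_change(new_dem, old_dem, threshold=2.0, nodata=-9999.0):
    """Surface subtraction: flag cells where |new - old| exceeds a
    vertical-change threshold (same units as the DEMs).
    Cells with nodata in either surface are never flagged."""
    flags = []
    for new_row, old_row in zip(new_dem, old_dem):
        flags.append([
            (n != nodata and o != nodata and abs(n - o) > threshold)
            for n, o in zip(new_row, old_row)
        ])
    return flags
```

Flagged cells can then be clustered into candidate change areas and routed to a technician for review, confining manual effort to the places where change actually occurred.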
With increasing reliance on automated processes in aerial mapping projects, there’s a growing need to implement proper checks and balances at key stages of the production workflow. Some of the fundamental tenets of the quality assurance/quality control workflow are as follows:
• Consider the concept of the internal customer, i.e., the upstream department should treat the downstream department as an internal customer. Enlist a formal sign-off process at key stages as the data/information moves down through the production workflow.
• Perform a final review of a representative, well-distributed sample of data by an independent internal/external resource before data are shipped to the customer.
• Ensure there are clear and objective pass/fail criteria at each stage of production to validate product quality.
• Submit a small subset of data to the customer for approval before proceeding with full-blown production in cases where quality factors are more subjective, such as a product’s radiometric characteristics. This removes some of the subjectivity and creates a baseline for client expectations of the final product.
• Ensure there’s traceability at each step of the production workflow. If a problem is detected downstream, the workflow should allow a team to trace the problem back to the root cause. This promotes accountability and helps to identify training/process improvement needs in a production workflow.
• Treat customers as part of the quality control process. Get proper sign-off from a customer at critical stages of projects to reduce the risk of major fallout. Also, understand the ways in which the customer is going to use the data to ensure the data’s ultimate usability.
• Use automated scripts to carry out routine tasks, such as checking the file format, metadata, projection/datum, units, file nomenclature, etc., for all the final products.
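A nomenclature check is the simplest of these scripts to sketch. The naming convention below (project name plus four-digit easting/northing tile indices) is purely hypothetical; a real script would encode the project’s actual convention and add similar checks for format, projection/datum and metadata:

```python
import re

# Hypothetical tile-naming convention: <project>_<easting>_<northing>.tif
# e.g. "denver2024_0512_0384.tif" -- the pattern is illustrative only.
TILE_NAME = re.compile(r"^[a-z0-9]+_\d{4}_\d{4}\.tif$")

def check_deliverables(filenames):
    """Return the filenames that violate the naming convention."""
    return [f for f in filenames if not TILE_NAME.match(f)]
```

Run as a final gate over the deliverable directory, such a script catches mechanical errors that are tedious for a human reviewer to spot and embarrassing to ship.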
On a closing note, experience has shown that outsourcing labor-intensive parts of the production workflow to off-shore subcontractors often creates a significant quality-control challenge. A mapping firm hired by the customer has little insight into, or control over, a subcontractor’s resources and processes, which makes it difficult to control the quality of the data received. This can result in a recursive, nonproductive feedback loop with the subcontractor in an attempt to sort through all the quality issues. To avoid such situations, perform proper quality checks on a subset of the dataset before ingesting and packaging it into the customer’s final product.
Although it’s not feasible in all environments, consider using post-flight, photo-identifiable ground control points, especially in more urban mapping programs, in place of traditional, monumented and targeted survey control processes. Collecting post-flight, photo-identified ground control points can provide the following benefits:
• Cost savings through the easy recovery and use of points on future projects;
• More robust, risk-averse planning because no targets can be destroyed or obscured before/during imagery acquisition flights;
• Time savings because aircrews don’t need to wait on survey teams to set targets; and
• Better aesthetics and resource savings because crews don’t have to paint unsightly targets on pavements or have to remove painted or paneled targets post-flight.