By Matteo Luccio, founder and president of Pale Blue Dot (www.palebluedotllc.com).
For geospatial professionals, the most exciting aspect of the current explosion in unmanned aircraft systems (UASs) is using them for mapping, a task for which they complement manned aircraft and Earth imaging satellites. The trade-offs among these three platforms include cost, endurance, coverage, resolution, processing time and refresh rates.
Additionally, there are significant differences between fixed- and rotary-wing aircraft. In general, rotary-wing aircraft are more adept at capturing vertical surfaces, while fixed-wing aircraft are more efficient at covering large distances and areas: a rotary-wing drone spends most of its battery power keeping its rotors turning just to stay aloft, which reduces its time in the air.
However, fixed- and rotary-wing systems can complement each other. For example, a fixed-wing drone can map an entire open pit mine, then a rotary aircraft can better map sharply inclined surfaces. Other differences among UASs include their endurance, the quality of their autopilot system, the sensors they carry, and the area they need to take off and land. Some of these differences are discussed in the accompanying UAS company profiles (see “UAS Company Profiles Highlight Drone Differences”). Drone capabilities also can be distinguished by workflow and customization options.
The workflows for different UAS data acquisition missions are essentially the same, except for the choice of sensor. For example, most agricultural applications require near-infrared (NIR) sensors to produce normalized difference vegetation index (NDVI) maps or multispectral cameras to produce reflectance maps of heat stress on plants; mining applications typically require natural-color, red, green and blue (RGB) cameras; and solar farms require thermal sensors.
“Most of the differences in mapping missions depend on whether you’re creating a 2D orthoimage or an auto-correlated 3D point cloud,” says Brian Murphy, vice president of Business Development with Altavian (www.altavian.com), which develops UAS technology specifically for photogrammetry. “This is simply adjusted in the flight planning by increasing side lap and forward lap.”
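Murphy's overlap adjustment translates directly into flight-plan geometry. A minimal sketch of the relationship (the footprint and overlap numbers are illustrative, not Altavian's):

```python
def photo_spacing(footprint_along_m, forward_lap_pct):
    """Distance flown between successive camera triggers along a flight line."""
    return footprint_along_m * (1 - forward_lap_pct / 100.0)

def line_spacing(footprint_across_m, side_lap_pct):
    """Distance between adjacent parallel flight lines."""
    return footprint_across_m * (1 - side_lap_pct / 100.0)

# Hypothetical 120 m x 90 m ground footprint.
# A 2D ortho mission might fly 60% forward lap and 30% side lap;
# a 3D point-cloud mission tightens both, e.g. to 80% and 60%.
print(photo_spacing(90, 60), line_spacing(120, 30))
print(photo_spacing(90, 80), line_spacing(120, 60))
```

Raising the overlaps shrinks both spacings, so the same area costs more images and more flight time, which is the trade-off behind Murphy's 2D-versus-3D distinction.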
To support end data products such as NDVI, agricultural customers typically want color-infrared imagery, which requires installing an NIR filter on the camera.
“Although agricultural and natural resource applications don’t typically require surveyed [Global Positioning System] ground control, it wouldn’t hurt to utilize some ground control to help ensure that when you are flying recurring seasonal missions over the same project area your data will align most closely with other datasets,” relates Murphy. “However, you can typically achieve this by doing second-order orthos, essentially collecting similar points from previously flown datasets.”
Surveying missions require laying out ground control points or using surveyed photo-identifiable points in the imagery.
“Even when using our metric mapping payload, we still recommend [incorporating] some amount of ground control into your solution, and withholding points to use as checkpoints after you’ve finished post-processing the imagery,” advises Murphy. “The amount of ground control required depends on the payload you’re flying.”
The eBee UAS, developed by senseFly (www.sensefly.com), carries an RGB camera integrated with, and controlled by, the autopilot. The ground control software works with the autopilot and permits the operator to program a flight.
“The main thing the operator has to do is to overlay a polygon over the area of interest—for example, a mining site—and specify the resolution required for the results—for example, 5 centimeters—and then the UAS flies automatically,” explains Andrea Halter, the company’s co-founder and head of marketing. “The operator will see on the screen all of the flight parameters, such as when and where the flight started. The software will automatically set all the waypoints.”
The software contains a data management system that stores each flight’s path and parameters. This enables the operator to geotag the images taken during each flight and pass this information to processing software that senseFly provides its customers. To land, the eBee measures the wind’s speed and direction, turns to land heading into the wind, then uses reverse engine thrust and a ground proximity sensor to minimize its landing area.
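The geotagging step amounts to matching each image's capture time against the stored flight log and interpolating a position between logged fixes. A minimal sketch of that idea (the function signature and log layout are hypothetical, not senseFly's):

```python
from bisect import bisect_left

def geotag(image_time, log):
    """Interpolate (lat, lon, alt) for an image captured at image_time.

    log: list of (t, lat, lon, alt) fixes sorted by time t.
    Images outside the logged window snap to the nearest end fix.
    """
    times = [fix[0] for fix in log]
    i = bisect_left(times, image_time)
    if i == 0:
        return log[0][1:]
    if i == len(log):
        return log[-1][1:]
    t0, *p0 = log[i - 1]
    t1, *p1 = log[i]
    f = (image_time - t0) / (t1 - t0)  # fraction of the way between fixes
    return tuple(a + f * (b - a) for a, b in zip(p0, p1))
```

The interpolated coordinates would then be written into each image's EXIF GPS tags before being handed to the processing software.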
“The takeoff and landing are critical,” stresses Halter, “because there can be many obstacles around, such as trees, houses and power lines.”
For agricultural mapping, eBee carries an NIR camera. After a flight, the images are processed into a reflectance map, which allows users to calculate an NDVI and create a prescription map. Users can then turn the map into shapefiles for geographic information system (GIS) software, load them into tractor and sprayer guidance systems, and optimize the amount of fertilizer and pesticide sprayed on crops.
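The NDVI computed from such a reflectance map is a standard band ratio, (NIR − Red) / (NIR + Red). A minimal pixel-wise sketch:

```python
def ndvi(nir, red):
    """Normalized difference vegetation index for one pixel pair."""
    total = nir + red
    return (nir - red) / total if total else 0.0

def ndvi_map(nir_band, red_band):
    """Apply NDVI pixel-wise to two equally sized reflectance rasters
    (lists of rows of floats); healthy vegetation scores near +1."""
    return [[ndvi(n, r) for n, r in zip(nir_row, red_row)]
            for nir_row, red_row in zip(nir_band, red_band)]
```

A production pipeline would run this over georeferenced rasters and then vectorize management zones into the shapefiles the tractor guidance systems consume.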
“The data are processed straight through, from UAS to tractor,” says Halter. “You can use the same UAS-to-output workflow for mining sites, creating business-specific outputs such as 3D point clouds and contour maps, taking stockpile measurements or measuring extraction volumes.”
Skyward (www.skyward.io) develops software and provides professional services that assist UAS operators with scheduling, aircraft maintenance programs and other back-office functions as well as aerial imaging. According to Dana Maher, the company’s geospatial software lead, Skyward’s clients author operations through a Web application in which they outline the geographic extent of their operation and identify the relevant points of interest. Visualizing the airspace in 3D, with different vehicles moving through it, poses interesting challenges.
“On the front end, we use Leaflet JS, a mapping library,” explains Maher. “On the back end, we use a database called DB to store a lot of our business-type data about organizations, airframes, personnel, etc. Our actual spatial services are being served by PostGIS on top of PostgreSQL. Then we have a fairly standard Java mill to express our [application programming interface].”
Topcon Positioning Systems (www.topcon.com) offers flight-planning software for its Sirius UAS, which contains the company’s own survey-grade global navigation satellite system (GNSS) real-time kinematic (RTK) receiver.
“The advantage of that solution is that we can provide survey accuracy without the need for ground control,” says Sander Jongeleen, the company’s product manager for mobile mapping. “Of course, you need to either put your base antenna somewhere in your project area or use network RTK.”
Operators use Topcon’s flight planning software to plan a mission, including specifying the required ground sampling distance and image overlap, then wirelessly upload the flight plan to the UAS. The plane is launched without a catapult and flies autonomously; however, the operator can manually intervene at any point. When the UAS lands, the operator downloads the imagery, which consists of JPGs from a camera and their photo locations, then imports it into photogrammetry software for post-processing.
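The ground sampling distance an operator specifies is a simple function of the camera's pixel pitch, its focal length and the flight altitude. A minimal sketch (the camera numbers are hypothetical, not Topcon's):

```python
def gsd_cm(pixel_pitch_um, focal_length_mm, altitude_m):
    """Ground sample distance in centimeters:
    GSD = pixel pitch * flight altitude / focal length."""
    return (pixel_pitch_um * 1e-6) * altitude_m / (focal_length_mm * 1e-3) * 100.0

# Hypothetical camera: 4.8 um pixel pitch, 16 mm lens, flown at 100 m.
print(gsd_cm(4.8, 16, 100))
```

Halving the altitude halves the GSD; a longer lens achieves the same at the cost of a narrower ground footprint per image.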
In addition to selecting the most appropriate sensor and setting up the flight plan, resolution and other parameters for each mission, a UAS operator also needs to assess a site’s safety, including obstacles. According to Todd Steiner, Trimble’s product marketing director for the company’s Geospatial Imaging Business Area, users of Trimble’s UX5 UAS can do that using Trimble Access Aerial Imaging software on the controller.
“You follow a safety preflight checklist, upload the flight plan to the aircraft and then launch it,” explains Steiner. “The aircraft then flies and captures the data as you told it to do. It’s an autonomous flight, so the pilot, assuming everything is going fine, isn’t really interacting with the aircraft at that point.”
Once the aircraft lands, the pilot downloads and processes the data using either Trimble Business Center (TBC) or UASMaster, a product that allows more professional photogrammetrists to work with data from a small UAS.
“The application will determine what processing needs to be done on the collected data,” says Steiner. “For instance, if clients are surveying a construction project, they may also be combining total station and GNSS data from the ground or 3D scanning or other types of terrestrial imaging data. They might be doing all of that in TBC software and creating an adjusted dataset. They can then provide the line work and the attributions. That may then go on to a GIS application, into a GIS database, or maybe they’re doing some modeling in a [computer-aided design] package.”
The application determines how you set up the site. “For instance,” says Steiner, “if you want high accuracy because you are looking at the water runoff for an agriculture application, you may want to set up ground control points. If you are doing an asset management or a construction site status flight, you just want to see the assets, so the images alone may be enough to satisfy the customer. We have customers who do wildlife management and are out counting cows. They don’t need to know whether the cow is 1 meter wide or 1.5 meters wide, they just need to know it’s a cow.”
HoneyComb (www.honeycombcorp.com), which specializes in aerial imaging solutions for precision agriculture and forestry, transfers data from a UAS to the cloud, according to Ryan Jenson, the company’s chief executive and senior engineer. In the cloud, the data are mosaicked, orthorectified and delivered via a Web browser. After the UAS lands, the operator pulls out a memory card to recover the imagery, and the telemetry is transmitted wirelessly.
“There’s a real-time feed and an on-board feed,” says Jenson. “Obviously, the on-board feed has a higher fidelity, so we actually pull both of those—the downlink as the UAS is operating and the pure data files from the autopilot.”
Most people in the UAS industry agree that currently the single largest bottleneck to its development is legislation and regulation. According to Steiner, that’s the case worldwide, not just in the United States.
The Federal Aviation Administration (FAA) recently issued a Notice of Proposed Rulemaking that would restrict UASs to flying below 500 feet and within line of sight. If implemented, the rule would severely limit the efficiency of aerial imaging of large areas.
“Under 500 feet, you’re typically collecting anywhere from 5 millimeter [ground sample distance] (GSD) to 3 centimeter GSD, depending on what camera payload and focal length you fly,” says Murphy. “However, not all jobs are going to call for 3 centimeter GSD; it might be much more efficient to fly a job at an [above ground level] that would support 5 centimeters or 6 centimeters. Thus, being able to fly higher and ‘beyond line of sight’ would help our customers acquire data more efficiently for projects larger than 20,000 acres or more.”
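Murphy's point can be made concrete by inverting the standard GSD relation to find the altitude a target GSD implies. A sketch with a hypothetical camera (4.8 µm pixel pitch, 16 mm lens; the numbers are illustrative):

```python
def altitude_for_gsd(target_gsd_cm, pixel_pitch_um, focal_length_mm):
    """Flight altitude (m above ground) that yields the target GSD."""
    return (target_gsd_cm / 100.0) * (focal_length_mm * 1e-3) / (pixel_pitch_um * 1e-6)

# Hypothetical camera: 4.8 um pixel pitch, 16 mm lens.
for gsd in (3, 5, 6):
    print(gsd, "cm GSD ->", round(altitude_for_gsd(gsd, 4.8, 16)), "m AGL")
```

With this camera, 5 to 6 centimeter GSD implies roughly 167 to 200 meters above ground, well over the proposed 500-foot (about 152 meter) ceiling, which is why the rule would force fine-GSD flights that are inefficient over large projects.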
According to Murphy, in the aerial imagery business, depending on the customer, a company has a limited amount of time each day to acquire data: typically within two hours either side of solar noon, when the sun is highest and shadows are shortest. Restricted airspace near airports and in densely populated areas further limits possible UAS uses.
“As a result, the U.S. UAS industry is really restricted to operations for natural resources, agriculture and mining,” claims Murphy.
Another bottleneck is the time it takes to download, process and transfer data.
“Today, many people post-process their data with software that takes an enormous amount of time,” says Jongeleen. “So the data collection might require 45 minutes for a flight, and your post-processing might require 20 to 24 hours. You’ll see the first offerings of cloud-based computing solutions appear, where you can hide the complexity and speed up the processing work.”
Furthermore, not every business has a fiber-optic Internet connection, so many companies still have to transfer data via FTP or by shipping or hand-delivering USB drives.
Sensor size and weight, though dropping steadily, still limit small UASs—vehicles weighing less than 55 lbs.
According to Murphy, the size and weight of the sensor dictate the power requirement, and, of course, the power dictates the size of the wing.
“In the aviation world, we call it SWAP—for size, weight and power,” says Murphy. “It’s all interrelated. The key is the development of smaller and lighter imaging devices and sensors. The other thing is battery power.”
Currently, light detection and ranging (LiDAR) sensors are too heavy and expensive to fly on a UAS, according to Murphy.
“For the price of a LiDAR sensor for a UAS, you might as well buy a second-hand large-format LiDAR sensor and fly it on a Cessna,” he claims.
In agriculture, especially in the United States, a common challenge is the size of the fields. “With a relatively lightweight camera and a fixed-wing UAS, we can currently cover such large areas,” says Jongeleen. “However, flying hyperspectral cameras is a little bit difficult on large fields.”
Despite the challenges, everybody in the UAS industry agrees it’s poised for phenomenal growth in the near future. According to Matthew Wade, senseFly’s marketing and communications manager, such growth will be driven primarily by two factors: the evolution of pragmatic legislation in different countries and a gradual increase in awareness of the business benefits of commercial drone technology. Consolidation is likely as well.
“Today, you see many, many players in the market,” says Jongeleen. “That will change, partly due to regulations. In the end, there will be a much smaller number of professional players.”
According to Jenson, 2016 will be a strong growth year.
“The technology has several years to go—mostly on the algorithms and the software and the sensor-side integration,” he says. “I see regulation being a big thing in the next year. I see big technology developments in the next two to three years. Then I see growth and diffusion in the types of different commercial applications in the next five to 10 years.”
“Our current mapping business addresses mapping for humans,” says Maher. “In the future, we’re going to be supporting mapping for robots at least as much as we are for humans—making airspace understandable and manageable, helping them with deconfliction, helping autonomous vehicles understand the airspace in which they are traveling with little to no human intervention. One of the really interesting aspects of mapping for drones is that eventually the drones themselves will be, in a sense, consumers of the mapping technology.”
“It will be interesting to see where public acceptance will go and the trade-off between the additional cost to buy or lease a system vs. the long-term cost and intangible cost of poor reliability, lost aircraft, lost data, crashes and things like that,” says Greg Davis, director of product management and business development at Cloud Cap Technology (www.cloudcaptech.com).
Claims Steve Edgar, founder and owner of Advanced Aviation Solutions (www.adavso.com), “[UAS technology] is going to explode. Essentially, drones are going to replace so many things that are a waste of time for humans to do. Some drones can stay up for 32 hours. If we had a natural disaster, such as an earthquake or a tsunami that wiped out everything, including communications, I could put a UAS overhead with a communications platform on board and restore communications. We could fly around at night looking for lost hunters, campers, etc., in the wilderness. [UAS technology] is the future of aviation.”