One of the latest buzzwords in the business world is Big Data. There are many different ways to define Big Data, but it typically refers to very large, complicated datasets that are too complex to be handled or processed by traditional data applications.
Remotely sensed data, such as aerial imagery, occupy a unique position in that they're both a component and a source of Big Data. Proper georegistration generates spatially accurate derivative data and metrics; in its raw form, imagery gives Big Data users the context they need to make decisions with confidence. Although many providers capture and deliver aerial imagery, it's the data storage, handling, speed of access, quality and security that must be evaluated when incorporating imagery into a workflow.
Governments have faced challenges presented by Big Data as far back as the 1800s, with data from the 1880 census taking more than seven years to tabulate. In 1888, the Census Bureau held a competition to find a more-efficient method to process and calculate data from a sampling area in St. Louis. The winner received the contract for the 1890 census.
A former Census Bureau employee, Herman Hollerith, won the contract, and modified versions of his electromechanical tabulator were used by the bureau until computers replaced the machines in the 1950s.
By the 1960s, the term “information explosion” was used to describe the rapid increase in data such as social-security numbers, birth records, property records, publications and photographs. Massive amounts of space were needed to house the information, and the risk of loss also was a concern. With most of the data existing on paper, information could be lost in an instant via fire or flood.
Aerial photography has been used and valued by the government and military dating back more than 100 years, with early capture being obtained by attaching cameras to carrier pigeons. As technology led to advancements in camera systems and the use of rolled film, it became easier and less costly to implement aerial imagery for managing and planning on a broader scale.
Despite such advances, the challenges of managing Big Data (e.g., storage, distribution, quality and security) remained. Indeed, these challenges persist to this day, as the technologies involved in acquiring and managing data advance in fits and starts independent of each other.
Big Data imagery challenges seemed like they would become more manageable with the introduction of computers and their mainstream adoption for more-productive workflows. Scanning printed photos onto hard drives appeared to be an ideal solution, but it soon became evident that computers were restrained by limited processing power and maxed-out storage.
Digital image capture and improvements in resolution created storage and access challenges for the growing computer industry. The higher cost of digital technologies in the early years limited their use by many local and state government entities and agencies as well as smaller commercial businesses.
As technology advanced in the late 1990s, incorporating aerial imagery and its related data into everyday use became a reality. In those early years, imagery was captured, processed and delivered on a hard drive, and the county or municipality hosted it on a server provided by the imagery company.
Although this solution allowed access to imagery, it put the burden of maintenance, security and backup on the end user. This thick-client solution also required a knowledgeable information-technology professional onsite to install software on each user's machine. One early adopter of aerial imagery remarked that they loved receiving the high-resolution images, but hated having the data dumped on them with limited resources to maintain and manage them.
Client-hosted, Web-based deployment broadened access, but came with its own drawbacks: it was more complicated to set up and often required additional training in system administration. In both scenarios, with data stored onsite, there's another local point of failure that could compromise access during an emergency.
Counties, provinces, government agencies and commercial businesses considering the implementation of aerial imagery and analytics may think it’s important to have physical possession of imagery, but having data onsite can present storage and security issues.
Even if organizations have the infrastructure to store data, an additional challenge may be presented by the need to run analytics on the data. Infrastructure that supports storage and security may not be high-powered enough to handle rapid analytics. What good is having the data if they can’t be used for analysis?
In the government space, information gleaned from imagery analysis is essential for property inspections, change detection, visualization and mapping for economic development, assisting emergency responders, and planning for catastrophe response or other events. It’s imperative to be able to quickly access imagery and data, particularly in an emergency.
As Pictometry International Corp. gathered aerial imagery of the United States and Canada during the last 15 years, it recognized early on the need to address these issues and developed its CONNECT platform, offering cloud storage and Web-based access and analytics through any Internet browser as well as integration into widely used GIS programs and Web-mapping services.
The software-as-a-service tool allows users to rapidly analyze and interact with data, including the ability to upload GIS layers for additional understanding. In addition to the CONNECT environment, Pictometry offers imagery as a data service through Image Service and Gateway offerings, providing orthogonal and oblique views in a format ready for reports or projects.
Some counties, jurisdictions and commercial businesses hesitate to access data via the cloud because of the perception that purchased images should be tangible and held onsite, or that they're not really owned by the organization. Although Web-based deployment is intangible and can't be owned per se, Pictometry offers a protective perpetual license for a data copy to be used via a thick-client solution. This allows customers to access imagery and data indefinitely.
A protective perpetual license virtually eliminates the challenges of managing Big Data. The data are stored securely with a redundant, non-local backup solution, so if the hosting server goes offline, a backup immediately takes over with no loss of access. The license allows staff to focus on serving their communities and constituents rather than building the infrastructure required to manage Big Data resulting from image acquisition and analytics.
The pace at which remote-sensing technology is progressing indicates that Big Data derived from it will continue to expand at an explosive rate. More government entities and commercial businesses will use and rely on such data as they become less expensive via new capture platforms such as Unmanned Aerial Systems (UASs).
UAS technology will allow for more-frequent imagery capture for property inspections, construction monitoring, utility inspections, damage assessment following storms or other disasters, and a host of other applications.
In the case of catastrophic events, a UAS, carrying no crew, could enter the airspace over a disaster-stricken area sooner than the Civil Air Patrol. Emergency responders could efficiently allocate resources, determine whether additional resources need to be recruited, and identify access points and areas in need of immediate response.
Information provided by a UAS can help utility companies visualize damage extent and prioritize crews to restore power. Disaster-relief organizations such as the Red Cross will have improved response times when they have a full understanding of the community’s damage and most-immediate needs.
Such uses are exciting and may seem easy to implement, but it’s important to remember that the amount of data generated from more-frequent use of up-to-date and, in some cases, real-time imagery will be immense. And the more data generated, the more digital storage is needed.
Most government entities and commercial businesses aren't equipped with the infrastructure required to effectively support and deploy such massive amounts of data, not to mention the cyber-fortification required to secure them. More data means more vulnerability and exposure to potential hacking. Any organization relying on remotely sensed data needs to be certain that its imagery will be available, delivered rapidly and kept secure.
Robert Locke is president, Pictometry Government Solutions; e-mail: bob.locke at pictometry.com.