What Are the Most Significant Forces Shaping the Remote Sensing Industry?
How would you respond to this question? Perhaps even more important, what are the key industry challenges and their possible solutions? Earth Imaging Journal’s staff solicited the professional views of our Editorial Advisory Board members on these questions for our annual State of the Industry Report.
John Copple, CEO, Sanborn
In an industry still driven by government investment, a multitude of forces is shaping remote sensing, including massive amounts of geospatial data from new sources and sensors, advances in visual analytics, predictive intelligence, new automated signature development and discovery techniques, cloud storage, Web processing, free Web-based public mapping data and new unmanned collection vehicles. The last 10 years have seen a significant amount of change. For example, technologies for atmospheric sensing have benefited greatly from continuing improvements in lasers, detectors and signal-processing technology. With the coming change in government budgets, the U.S. industry will see a slowdown in the rate of technology development for sensors and analytics, as the majority of development funding in these areas is government based.
Internationally, satellite investment dominates spending in remote sensing, as all countries with significant defense budgets continue to invest to achieve their own satellite imaging capability. Investments in processing and analytics will follow. While national mapping policies are evolving, the presence of dozens of satellites provides a new level of practical openness, leaving restrictions effective only within the borders of the countries implementing them.
Commercially there will be continued investment and advances in Web and mobile platforms consuming and processing geospatial and related data. Integration of static data, mobile real-time data, and analytics in mobile devices to provide real-time location-based information and advertising will move forward with significant advancement in capability. Driven by investment sources other than government, such as advertising and gaming, this area will potentially see the largest growth in the next few years.
John Delay, CSO/Architect MI, Harris Government Communications Solutions
As imagery, full-motion video and wide-area persistent surveillance content have become more available, traditional processing, exploitation and dissemination (PED) solutions no longer meet the needs of the intelligence community. Today’s PED solutions provide limited access to data, are difficult to scale and provide few opportunities to take advantage of new technologies such as analytics or workflow automation. Although these systems have served the community well for managing PED for dedicated sensors, a different approach is required to meet the future demands of the intelligence community.
John Hornsby, Senior Director, Geospatial Strategy, MDA
In recent years, Earth observation developments have been occurring at an accelerated pace, both in terms of capability and the ability to use the data.
A positive force shaping the industry is the rapid development of technical and operational capabilities, which greatly enhance the value Earth observation can bring to our daily lives. This includes the convergence of complementary technologies to provide complete solutions to customers. For example, automatic identification systems and broadband communications greatly increase the utility of synthetic aperture radar imagery for maritime surveillance. Combined with other technologies, such as unmanned aerial vehicles, this creates a more complete and valuable solution for maritime situational awareness.
The current global financial climate is clearly an opposing force for the industry today. There has been considerable progress in developing lower-cost mission solutions, but highly capable satellites and associated ground segments still represent significant financial investments. Despite remote sensing’s greater visibility through imagery in social media, and greater acceptance of its value in providing the public with good information and supporting commercial business, space is still often seen in political circles as a luxury item when times get tough.
Utilization, although a fundamental challenge for the industry, also represents an extremely positive force. After decades of development, there are now proven and accepted operational uses for Earth observation, from assisting maritime navigation through ice to monitoring movement in oil fields to preventing well damage from extraction methods. Clear economic benefits are being derived from this utilization, which strengthens the industry’s long-term sustainability and growth day by day.
Shawana Johnson, President, Global Marketing Insights
It can’t be denied that global governments are still the underlying key economic supporters of the remote sensing industry, and what impacts them impacts the industry as a whole. With that said, global governments are undergoing major changes that significantly impact the remote sensing industry.
Government remote sensing clients are insisting on:
• Single point of access/data discovery in integrated and federated clouds
• Elimination of data duplication and greater sharing among global government agencies
Organizations such as Esri, Microsoft, Google and Amazon are changing market expectations and demands globally. Simply put, users want data and their technology to be better, faster and less expensive, and these organizations are answering that demand.
Geospatial and location-based data are an expected component in products and services used in environmental and land use projects for climate change; renewable energy analysis; traditional energy analysis; extractive materials analysis; and for protecting natural resources such as water, forests and wildlife. This creates an even greater demand for enterprisewide information infrastructures that can maximize and optimize the utility of large datasets.
Acquisitions in the remote sensing and geospatial industry by global conglomerates focused on energy, marine, water and defense applications are mounting daily. The three largest conglomerates at this time are Pasco, Hexagon and Fugro. Another acquisition giant that isn’t specifically focused on remote sensing but affects the industry nonetheless is EADS Astrium. EADS is a global leader in space transportation, satellite systems and services.
Reduced global government and consumer budgets are driving increased government-to-government collaboration, which eventually will reduce duplicate purchases by government agencies. They also will support increased government-to-business collaboration with organizations that provide on-demand services under less restrictive licensing than traditional remote sensing and geospatial organizations offer.
Fred Limp, Director Emeritus, Center for Advanced Spatial Technologies, University of Arkansas
My particular expertise and experience are in the university setting, dealing with research and education. From that perspective, perhaps the most significant force shaping the remote sensing industry is that it’s no longer an industry alone. It’s essentially impossible today to deal effectively with remotely sensed data products without also being fully engaged with the broad computational infrastructure, the cloud and all of the emerging elements that define modern computing. In other words, it’s becoming increasingly difficult to isolate a specific remote sensing “industry” from the larger information technology industry.
The types of research problems faced today range from figuring out how to move traditional image processing algorithms to effectively operate across thousands of cores on a supercomputer to developing algorithms that will extract information objects out of raw data. The integration of remote sensing technologies with information technology infrastructures, combined with the growing ability to accommodate higher-resolution scientific data, is escalating demand for geospatial information at all scales, from individual buildings to global phenomena, as well as driving the development of new collection technologies and techniques.
Modern instruments produce data at astonishing rates, but their raw data must be efficiently processed to extract the feature information required for maximizing geospatial applications and building decision-support systems, to name only two of the many known and even more unknown consumers of spatial information. Traditional “image processing” or “remote sensing” skills are essential in all of these areas, but they’re only part of the picture. The only effective approach is to identify the complex skills matrix needed in any particular research and development effort and partner with researchers across multiple disciplines. No longer can any single researcher master enough of the technology and methods.
I should note that we’re seeing researchers in other information technology areas, especially machine/computer vision, reinventing much that has already been learned within the traditional remote sensing industry, especially photogrammetry. The focus in these new research directions tends toward speed and efficiency, but accuracy and reliability, with their reliance on stochastic models, for example, will continue to be necessary as data streams pour in from an ever-increasing variety of sources.
Alex Philp, Founder and CEO, TerraEchos
The remote sensing industry continues to experience significant expansion, redefinition and growth. Clearly, much depends on conventional vs. nonconventional definitions of remote sensing. However, if one allows “remote sensing” to include all sensor systems and types, then the growth is explosive. Limiting ourselves to airborne and space-based platforms for collecting or sensing data remotely makes little logical sense and seems to be an artifact of normative assumptions.
Definitions aside, and assuming continued growth in the amount of data collected, the forces of storage, bandwidth, analytics and access are at work. One way to frame the issues is to describe these combined forces in terms of “big data in motion” analytics. Increasing demand for near-real-time use of remotely sensed data products is pushing the industry from an “at rest” workflow to an “in motion” workflow. Thus, we’re seeing increasing emphasis placed on successfully managing the nonlinear interplay of data volume, variety and velocity. These forces will shape the industry for the next three to five years, embedding analytics directly into the remote sensor and shortening the distance between the phenomenology and the consumer.
Brian R. Raber, CMS, GLS, GISP, Vice President, Geospatial Solutions, Merrick & Company
During the last 50 years, the creation of remotely sensed planimetric and topographic maps has evolved through a series of transitions: film to digital, analog to analytical, hard copy to electronic media and, finally, CAD to GIS technologies. However, yet another positive force is shaping the remote sensing industry, one that will have a greater impact than the aforementioned evolutionary changes. As our profession transforms itself from maps to spatial solutions, exciting methodologies now combine traditional 2-D and 3-D map data with temporal characteristics, subject matter information and human geographic intelligence.
Countless examples of data fusion draw on subsurface, terrestrial, air- and space-borne technologies that offer user-defined, “fit-for-use” content and accuracy, enabling the virtual modeling of almost every man-made and natural phenomenon. An example of this type of data synthesis can be found in new geospatial analytical workflows, which combine ground-penetrating radar for subsurface object detection, airborne LiDAR for terrain and point cloud exploitation, multispectral satellite imagery for understanding vegetation and soil conditions, and historical human geography anomalies and cultural information that predict potential migration patterns for more effective border security. As the world around us continues to become more complex, future advancements in data collection platforms, structured workflows, and software and modeling algorithms that enable the synthesis of disparate data sources and types will launch professional remote sensing services into the forefront of the social, physical, engineering and geospatial sciences.
Bill Wilt, Vice President, North American Sales, GeoEye
On the positive side, we see greater demand for more whole Earth coverage, geolocation accuracy and pixel resolution. These attributes are accompanied by demand for greater services or ease of access to the industry’s products.
Not surprisingly, the positive forces don’t stand alone. Budget pressures around the world also will continue to shape our industry. Of course, there’s a danger of downward pressure on the industry because of the current budget environment. But I’d rather look at the upside. Because we provide governments around the world with high-resolution, map-ready data at cost-effective pricing, we can anticipate a greater need for our services in this cost-constrained environment.
Said another way, the existing industry offers products and services in a far more cost-effective way than the alternative build-your-own path that we have seen in the past. Thus, we’ll see an increasing focus on getting more information from the increasing amounts of imagery and other data our users are collecting. Customers are asking for help to “do more with more.”
The net result of these forces is making it incumbent on us, the existing industry, to create greater value for our customers. From government agencies to the military to a range of commercial entities, we’ll see an increasing need to develop customized solutions that will do more than respond to change and actually anticipate it. In the case of our government customers, such solutions will provide users with the tools they need to better protect lives, manage risk and optimize resource allocations. Further, we need to provide better value propositions to our commercial customers so our products and services enhance their business opportunities and improve their business margins.
To meet these needs, the industry will have to provide greater and easier access to data, better distillation of data into easy-to-understand information, and an ability to understand the information and provide geospatial predictive analytics. GeoEye has anticipated these shaping forces and fundamentally restructured the company to respond. We’re enhancing our collection-through-production timelines and developing more automation in our higher-level product production lines. We acquired and are enhancing GeoEye Analytics to provide predictive analysis. As a result, we’ve developed the ability to combine imagery with other information, including social media, history, cultural norms and demographics, cross-referenced to a location so we can analyze past events there and try to predict future ones. The “shaping forces” you asked about are working to mature our industry and facilitate greater value propositions to our customers.
What are the key industry challenges and their possible solutions?
The largest challenge in the United States will be uncertain levels of government investment in geospatial technology and the uncertain rate of its implementation and adoption. As public Web mapping sites have proven, geospatial technology offers significant advances in organizing and presenting data. With the appropriate analytical tools, the power to unlock extremely valuable information sources is unlimited.
However, there’s no cohesive strategic or investment plan across federal agencies and levels of government to achieve the best return on investment. Each agency and/or level of government continues to operate autonomously despite significant efforts by groups such as the Federal Geographic Data Committee and the National State Geographic Information Council to encourage federal focus and investment in national data sources to be leveraged across multiple levels of government. In conjunction with industry, chief information officers, along with geographic information officers, should develop a cohesive strategy and investment plan for geospatial data and service investments.
Another large challenge remains federal regulation. U.S. companies continue to remain at a significant disadvantage when competing with foreign counterparts. A comprehensive overhaul of remote sensing legislation and policy needs to be undertaken to enable U.S. firms to compete internationally.
To improve the PED process, the intelligence community can leverage commercial technologies for managing content within their enterprises and take advantage of data sharing across network and security domains. One such technology is Enterprise Motion Imagery Content Management (EMICM), a mature content management technology within commercial markets. EMICM systems combine capture, management, search, production, publishing and networking of full-motion video and wide-area motion imagery by leveraging mature digital media asset management (DMAM) workflow systems.
DMAM systems differ from document management systems in that document management systems allow users to find video files, while DMAM systems allow users to drill down to the video frame of interest within those files. Such technologies provide the metadata framework required to correlate sources of intelligence information within an enterprise.
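The file-level vs. frame-level distinction can be illustrated with a minimal metadata index. This is only a sketch; the class and field names (`FrameRecord`, `FrameIndex`, `tags`) are hypothetical, chosen to show the idea of indexing individual frames rather than whole files, not any particular DMAM product’s schema.

```python
from dataclasses import dataclass, field

@dataclass
class FrameRecord:
    """Metadata for one video frame (hypothetical schema)."""
    file_id: str
    frame_no: int
    timestamp: float              # seconds from start of clip
    tags: set = field(default_factory=set)

class FrameIndex:
    """Toy DMAM-style index: searches return frames, not just files."""
    def __init__(self):
        self._frames = []

    def ingest(self, record: FrameRecord):
        # Metadata is federated into the index as content is ingested.
        self._frames.append(record)

    def find_files(self, tag):
        # Document-management behavior: which files mention the tag?
        return sorted({r.file_id for r in self._frames if tag in r.tags})

    def find_frames(self, tag):
        # DMAM behavior: drill down to the exact frames of interest.
        return [(r.file_id, r.frame_no) for r in self._frames if tag in r.tags]

idx = FrameIndex()
idx.ingest(FrameRecord("clip_001", 10, 0.33, {"vehicle"}))
idx.ingest(FrameRecord("clip_001", 11, 0.37, {"vehicle", "convoy"}))
idx.ingest(FrameRecord("clip_002", 5, 0.17, {"convoy"}))

print(idx.find_files("convoy"))   # ['clip_001', 'clip_002']
print(idx.find_frames("convoy"))  # [('clip_001', 11), ('clip_002', 5)]
```

The same tag query returns whole files in the first case but jumps an analyst straight to the relevant frames in the second, which is the capability that makes frame-accurate correlation across sources possible.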
Implementing such technologies within an enterprise provides several key capabilities:
By federating metadata as content is ingested into the enterprise, users can search, discover, collaborate, build products and correlate intelligence information. Enabling search and discovery requires that the enterprise adopt and adhere to open standards, such as those of the Open Geospatial Consortium. By providing open APIs, current and future systems can be integrated into the enterprise and enable new application plug-ins, user interfaces and workflow processes, many of which can be performed automatically across the enterprise.
By tracking user consumption and content use across the enterprise, users can devise new workflows that apply enhanced filtering, routing and search methods, as well as eliminate duplications and automate the archiving process.
With the increased volumes of motion imagery content, the ability to automate processes becomes critical because automation offsets the need to add more staff to review and analyze the content. The result is that fewer people are needed to solve complex problems, and intelligence can be derived more quickly. Ultimately, applying automation across the enterprise is the key to future advances in predicting threats within the intelligence community.
EMICM systems provide functions that allow more sophisticated user access controls for sensitive data. Security and access to specific information can be set at the user level and at the staff-function level, as well as be record- and/or frame-specific. In fact, even information contained in a specific video or imagery file can be processed using watermarking or fingerprinting technologies to ensure the integrity of the intelligence information hasn’t been compromised.
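A record- and frame-specific access check of the kind described can be sketched as follows. The roles, grant structure and helper name are illustrative assumptions, not any real EMICM product’s API; the point is that a grant can cover a whole record or be narrowed to a frame range within it.

```python
# Toy access-control table: permissions scoped to a whole record
# (frames=None) or narrowed to a range of frames within it.
ACL = {
    ("analyst", "clip_001"): {"frames": range(0, 100)},   # partial access
    ("supervisor", "clip_001"): {"frames": None},         # full record
}

def can_view(role: str, file_id: str, frame_no: int) -> bool:
    grant = ACL.get((role, file_id))
    if grant is None:
        return False                      # no grant for this record at all
    frames = grant["frames"]
    return frames is None or frame_no in frames

assert can_view("supervisor", "clip_001", 5000)           # full-record grant
assert can_view("analyst", "clip_001", 42)                # inside granted range
assert not can_view("analyst", "clip_001", 500)           # outside granted range
assert not can_view("analyst", "clip_002", 0)             # no grant for record
```

A production system would layer this under user identity, security domains and audit logging, but the lookup shape (subject, record, frame) stays the same.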
Every action taken within the enterprise can be tracked and reported for auditing purposes for a wide variety of needs.
Enterprise content management is an enabler to eliminate duplicate content storage, resulting in better data integrity and reduced overall operational cost.
In short, there are several factors driving the intelligence community to adopt an enterprise approach to managing imagery and motion imagery, such as the need to increase efficiency, improve information security and lower the overall cost of information management for the enterprise.
The core challenges facing the industry today are largely the same ones that have existed since the 1970s, in the days of ERTS-1. The major exceptions are technical capability and solutions: with the proliferation of missions and the development of complementary technologies to meet customer needs, these are less relevant issues today. Providing persistent surveillance from satellite platforms alone is still a challenge, although better combined use of existing missions can go a long way toward solving this problem.
Today’s challenges typically are associated with Earth observation users or customers.
Value to Customers
This is the most fundamental challenge the industry has always had and still has today. Historically, value was demonstrated by pointing out, for example, “You can see a ship in this image.” This has progressed to “This ship is suspicious because it can’t be identified.” The most value, however, lies in “Out of all the ships, this one is a threat because …” Great progress has been made, to the point where Earth observation is used for routine operational applications, partly through improvements in capabilities but also through greater integration into existing systems and operations. This leads to the next challenge:
Operational Integration
Crossing the bridge from demonstrating a capability to making it a fully integrated component of an organization’s operations is another key challenge. There are many examples where the science has demonstrated a capability, but that’s as far as it has gone. Barriers include technical complexity for end users, resistance to new methods (the old ways work) and finding the resources to try something new. However, progress is being made. In the case of synthetic aperture radar imaging, there are more examples where the data are providing critical information for maritime surveillance operations for military and civilian agencies.
Science vs. Industry
Other key challenges that date back to the early Landsat days are mission control and the policies associated with data access. These challenges, unfortunately, are fueled by emotion, misinformation and the resulting erroneous perceptions of reality. Clearly, Earth observation still relies heavily on government funding, but so do many other industries. The scientific community successfully built Earth observation capability to where it is today, including nurturing the growth of a viable industry. In turn, the industry has driven forward to take Earth observation beyond science and into the operational world, generating commerce by addressing the first two challenges. There’s a concerted effort from various agencies and organizations to pull Earth observation back so it serves only science and the public good. The solution lies where both needs are addressed, not at the expense of one or the other. Then Earth observation will flourish and become more sustainable financially and politically.
Challenges for the remote sensing industry are global and consist of the following:
- Global priorities shifting toward Earth science and away from defense
- Downward-trending defense budgets globally, causing a shift toward commercial Earth observation data usage
- Demand for data that are easy to use and can be accessed quickly at low cost
- Cloud access
- Federated data systems
- Hundreds of commercial aerial and satellite sensors (manned and unmanned) planned for the next two decades, leading to greater data availability
- Water resource management and agricultural security
- Disaster management
- Increased licensing of data and intellectual property as data are distributed at the push of a button
The remote sensing industry must embrace these challenges and adapt. This entails:
- Changing business and revenue models to allow users to pay only for the data they want, when they want the data
- Examining the potential of cloud computing to lower upfront costs and minimize capital expenditures
- Determining what products and services can be offered in simpler solution formats and even as “service” offerings
- Evaluating mobile app-based capabilities for remote sensing products and services
There’s great opportunity in the remote sensing industry globally. The simplest business transactions now include some type of geo- or location-based intelligence. That means the remote sensing industry must make its data easy to access and affordable. This can be done, because at last there’s synchronization between the remote sensing industry’s technology capabilities and user needs and expectations.
My answer to this question follows from my earlier comment. The most significant challenge for a university is developing a sustainable workforce of practitioners and educated consumers. Combined majors/minors focused on both remote sensing and computational techniques are one way to achieve this. But the objective must be kept in mind: Are we preparing computer scientists with a basic knowledge of remote sensing or remote sensing specialists with strong computational skills? In fact, both are needed, but the structure and educational requirements of each are different.
As mentioned previously, one of the key industry challenges is decreasing the distance/time between data collection or acquisition and the use of these data by an increasingly diverse user-customer base. With the explosion of mobile computing devices and demand for imagery products, the challenge is to optimize “data in motion” workflows, distribute automated analytical functions across the enterprise and redesign how we add value to the raw binary data across optimized workflows. Currently, and finally, we’re seeing the adoption of standard service-oriented architectures for large-scale U.S. government and commercial imagery data workflows—imagery as Web services as part of a cloud model.
At the same time, we see a trend leap-frogging traditional Web services and cloud models toward data analytics on board the sensor platform. For example, the general business process currently is: 1) collect the data, 2) download the data one way or another, 3) store the data, 4) process the data, 5) repackage or format the data into a predefined menagerie of file formats and 6) transmit the data products for analysis by analysts, who use a redundant variety of commercial or government technologies to conduct analysis and produce products. Transmitting data via cloud Web services to analytical desktops isn’t going to solve the problem in terms of revolutionizing workflow and consumption of derived data products.
What if we were to push certain analytical functions onto the platforms themselves using next-generation algorithms combined with hybrid computing architectures? How can we shorten the distance or time between collection and analysis, and move from “database-centric” thinking to “in motion” analytics embedded in the sensor engineering layer itself? What if next-generation, programmable and affordable chip architectures supported entire analytical packages customized directly for a particular workflow? Overcoming the distance-time problem and moving toward revolutionary architectures for big data in motion analytics will dominate innovative solutions for the next three to five years. Real-time analytical processing of streaming sensor data from multisource feeds, along with valid fusion products, represents an ideal industry goal during this period.
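The contrast between the six-step “at rest” pipeline and on-platform “in motion” analytics can be sketched schematically. This is a minimal illustration, not a real sensor workflow: the analytic here is a placeholder threshold detector, and the function names are invented for the sketch.

```python
# "At rest": collect and store everything, then process after the fact.
def at_rest_pipeline(samples):
    stored = list(samples)                 # steps 1-3: collect, download, store
    return [s for s in stored if s > 0.9]  # steps 4-6: process, format, transmit

# "In motion": the analytic runs at the sensor, and only detections
# (derived products) ever leave the platform.
def in_motion_stream(samples, threshold=0.9):
    for s in samples:                      # analytic embedded in the sensor layer
        if s > threshold:
            yield s

readings = [0.2, 0.95, 0.1, 0.99, 0.5]
assert at_rest_pipeline(readings) == list(in_motion_stream(readings)) == [0.95, 0.99]
```

Both paths yield the same detections, but the in-motion version never stores or moves the non-detections, which is where the bandwidth, storage and latency savings come from as data volumes grow.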
An abundance of technical advances is being made daily in acquisition platforms, spatial and positional accuracy, analytical software, faster hardware and a host of complementary peripherals. However, with our agencies, organizations and businesses still feeling the effects of the recent global economic downturn, the remote sensing profession must work harder to demonstrate its professional nature and the cost-effective viability of its services.
With many remotely sensed data types and analyses already viewed as commodities, industry lobbying and branding efforts are a key challenge facing our industry. To support the procurement of our expert services, we must work harder than ever to make the case that remote sensing should be considered under “professional services” procurement processes, such as the Brooks Act, rather than falling under low price-based procurement options.
I’m a firm believer in the old adage that “you get what you pay for.” As a consumer and citizen, it amazes me that geospatial data and the analytical results from our profession are used in numerous phases of the decision-making process—ultimately providing viable solutions to real-world problems—but are still acquired using a low-bid procurement process. There are many industry organizations in the middle of this battle that need our support. Therefore, now is the time to get involved with these efforts and not succumb to the low-bid procurement traps that lower the value of our professional services and expensive technologies.
Global economic turmoil is the most significant issue facing our industry. Many of our partners, customers and users are reaching the same conclusion: Commercial satellite imagery is cost effective, even in these difficult times, and perhaps especially so.
Few governments need to use their nationally owned satellites 100 percent of the time. By sharing time among different customers, commercial satellite companies reduce the burden on any one nation. By selling excess capacity beyond what the U.S. government has ordered, we can effectively offset the costs to our government, and thus to the taxpayer. In addition, private companies like GeoEye are more cost conscious, and certainly more nimble, than most government agencies. We can construct and operate satellites faster and for less money.
We’re fortunate that awareness of the value our industry delivers has been growing. Our customers know the advantages of doing business with us:
- Our industry delivers imagery and value-added products that provide cost-effective geospatial information and insight to decision-makers.
- Our imagery provides precise situational awareness and leads to more efficient decision-making, which, in turn, mitigates risk and saves time and money.
- Demand for imagery is increasing, not decreasing, and so is the demand for speed, accuracy and customer-friendly interfaces, all of which we provide.
- The amount of data is also increasing, not decreasing, and we can help manage that data and make the data accessible when it counts.