There was a time when all data wanted to be free. That was followed by all data wanted to be open. Then we moved to all data made accessible via cloud-based, software-integrated marketplaces. Each of these data trends still exists, and we now seem to be moving toward a new paradigm where the data that we need finds us.
The fact remains that a lack of data (or bad data) is the number one impediment to effective and efficient GIS insight. The old maxims of “we can’t manage what we can’t measure” and “garbage in, garbage out” apply to this issue. Often the resolution or timeliness of the data isn’t good enough to address the problem, or analysis is performed on sub-par data only to yield dubious insight.
We’ve certainly come a long way in data availability, with the growing number of sensors and data repositories. How we’ve progressed and where we’re headed are interesting questions, as there are a lot of moving parts and a great deal of momentum behind a mind-boggling explosion of inputs. The convergence of the Internet of Things, drones and smallsat providers will certainly change the game, helping us to move from real-time to predictive analytics.
Not long ago, federal, regional and local governments embraced a strategy of open data, making their geospatial and other data more available. The premise was that developers would code applications to make use of the data and to provide value back to citizens with information and services. While this certainly has occurred, the movement seems to have lost some momentum.
Part of the problem is a lack of measurable means to assess data quality. The maxim that local data is best often holds, but that depends upon how actively the data is being used and how often it’s being updated. Other issues with open data include gaps in completeness, consistency, currency, timeliness (when tracking change), and validity. There are locations where data are of exceptional quality, but that’s the exception rather than the rule. These issues all compound, and rigor in data validation is the reason that commercial data providers are continuing to thrive.
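The quality dimensions above can be made measurable. As a minimal sketch, the following Python scores two of them, completeness and currency, over hypothetical open-data records; the field names and thresholds are illustrative assumptions, not drawn from any real dataset.

```python
from datetime import date

# Hypothetical records from an open-data parcel layer; field names
# and values are purely illustrative.
records = [
    {"parcel_id": "A-001", "owner": "Smith", "last_updated": date(2015, 3, 1)},
    {"parcel_id": "A-002", "owner": None,    "last_updated": date(2009, 7, 15)},
    {"parcel_id": "A-003", "owner": "Jones", "last_updated": date(2014, 11, 20)},
]

def completeness(records, field):
    """Share of records with a non-null value for the given field."""
    filled = sum(1 for r in records if r.get(field) is not None)
    return filled / len(records)

def currency(records, field, as_of, max_age_days):
    """Share of records updated within max_age_days of the as_of date."""
    fresh = sum(1 for r in records if (as_of - r[field]).days <= max_age_days)
    return fresh / len(records)

owner_completeness = completeness(records, "owner")
update_currency = currency(records, "last_updated",
                           as_of=date(2015, 6, 1), max_age_days=365)
```

Scoring each dimension separately, rather than as a single grade, makes it clear which shortcoming (missing attributes versus stale updates) is limiting a dataset’s fitness for purpose.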
The data marketplace provides a new means of improved access by aggregating a wide array of offerings from data providers for discovery within our GIS platforms. Esri has succeeded in setting up a clearinghouse of data providers that can be easily accessed and incorporated via the cloud. This continues to evolve, with Airbus Defence and Space going a step further by even allowing the tasking of its Earth observation imaging satellites through the marketplace interface. Such data offerings continue to grow, and with each additional data type the collective repository grows in value.
One of the more interesting developments in the data marketplace progression is that sensor builders are harnessing their own hardware to collect and provide data. This is notable with the Hexagon Imagery Program, where the company uses its high-resolution Leica digital imaging sensors to capture and catalog orthorectified aerial imagery, making it available through the cloud via partners (Esri and Valtus) and through its own Geospatial Power Portfolio. There’s also the SpyMeSat app from Orbit Logic, which not only provides details about imaging satellites passing overhead, but also offers the ability to purchase imagery of that spot in the future. This illustrates the shift of value away from hardware (where you have to fly, process and catalog) to on-demand data services.
There are a lot of interesting new business models in the data aggregation and delivery market. nearmap is a company that offers imagery subscriptions in metro areas, with targeted offerings for verticals such as trucking companies and tools tailored to the insight they’re after. This accessibility and customization of data is leading to more sophisticated, fit-for-purpose geospatial analysis.
Data scientists at up-and-coming imagery providers like Planet Labs and Google Skybox are concocting a wide variety of insights derived from imagery. The aggregation and algorithm building are on a path to take things to the next level, parsing imagery pixel by pixel and pattern by pattern to dramatically improve the insight from our eyes in the sky.
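One classic example of this kind of per-pixel derivation is the Normalized Difference Vegetation Index (NDVI), computed from red and near-infrared bands. The sketch below uses NumPy over tiny synthetic band arrays; it is an illustration of the general technique, not the proprietary algorithms of any particular provider.

```python
import numpy as np

# Tiny synthetic red and near-infrared reflectance bands (values are
# illustrative); real imagery would be read from provider files.
red = np.array([[0.10, 0.40],
                [0.30, 0.05]])
nir = np.array([[0.60, 0.45],
                [0.35, 0.50]])

# NDVI = (NIR - Red) / (NIR + Red), evaluated for every pixel at once.
ndvi = (nir - red) / (nir + red)

# A simple pattern-level summary: the fraction of pixels whose NDVI
# exceeds a (hypothetical) vegetation threshold of 0.3.
vegetated_fraction = float((ndvi > 0.3).mean())
```

The same array-wide operation scales from this 2×2 toy example to full satellite scenes, which is what makes pixel-by-pixel derivation practical at the volumes these constellations produce.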
Certainly we now have a much larger set of imagery and geospatial data options to choose from for any given geography, and our access to what’s available continues to increase. It’s not a far leap from instant access across a choice of platforms to data that finds us. This shift to answering questions before they’ve been asked is right around the corner, as the value of timely information is only increasing.