Sensors and Systems

October 14th, 2014
Is it time for another spatial analysis revolution?

Earlier this month, noted geospatial author and spatial analysis pioneer Joe Berry gave a series of talks marking his retirement. His 40-year perspective on the evolution of GIS technology takes in a remarkable progression from punch cards to digitizers and on to personal and portable computers. Much of the change in GIS was driven by the availability of computing power, with the software always rooted in decision making.

As Joe pointed out, we were distracted by what could be done on the Web in the 2000s, with new and different interfaces, slippy maps, and access anywhere. The enterprise GIS server came alive during the decade as well, but much of the emphasis has been on integration rather than geospatial data insight. Now that computing power is remarkably cheap, connectivity is ubiquitous, and data is expanding exponentially, is it time to turn back to inquiry and the unique insights gained from spatial analysis?

Into the Infinite

The geospatial software evolution has been largely rooted in processing speed and computer memory limitations. Taking a trip back in time to see what we spent on computers with less computing power than today’s phones is instructive only if you also consider what we were able to accomplish with those tools. We spent millions on these room-sized devices for their decision-making power. Perhaps, because computing is now ubiquitous, we’re squandering today’s tools by not asking such hard questions or going through the same painful cost-justification exercises.

We live in a time of infinite computing, where we can simply rent multiple computing cores on distant servers for a fraction of the cost of actual ownership, and get answers in far less time than we may have even dreamed. Coupled with this capacity is the amount of data that we’re now amassing thanks to our personal devices, our use of mapping applications to navigate our world, and our propensity to share what we see through social media messaging. The tools and the data combine to give us the means for nearly limitless discovery, provided we attach our curiosity to the rigors of spatial analytic inquiry.

Lasting Legacy

The Wikipedia entry for spatial analysis does a great service in stating that the techniques are still in their early development, that complex issues arise in spatial analysis, and that this complexity is neither clearly defined nor completely resolved. While we’ve spent decades looking at spatial correlation, interpolation, regression and interaction, there is much we don’t yet grasp. Simulation and modeling have come a long way, but there’s a long way yet to go.
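To make one of those techniques concrete, here is a minimal sketch of global Moran's I, the classic measure of spatial autocorrelation, written in Python with NumPy; the zone values and the adjacency weights matrix are hypothetical, chosen only to show the mechanics rather than any real dataset.

import numpy as np

def morans_i(values, weights):
    # Global Moran's I: (n / sum of weights) * sum_ij w_ij * z_i * z_j / sum_i z_i^2,
    # where z is each value's deviation from the mean.
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    z = x - x.mean()
    return (x.size / w.sum()) * (w * np.outer(z, z)).sum() / (z ** 2).sum()

# Hypothetical example: four neighbouring zones and a simple adjacency (rook) weights matrix.
vals = [10, 12, 30, 33]
wts = [[0, 1, 0, 0],
       [1, 0, 1, 0],
       [0, 1, 0, 1],
       [0, 0, 1, 0]]
print(round(morans_i(vals, wts), 3))  # positive result: similar values sit next to each other

A result near +1 indicates that similar values cluster together, a result near -1 indicates a checkerboard pattern, and a result near zero suggests spatial randomness.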

Joe Berry has been a master at communicating ideas about ‘map-ematical’ processing, making spatial reasoning, operations and analysis accessible to all practitioners. The mathematical world of spatial analysis can be daunting, which is why Joe’s work is so popular and important: it ties theory to examples and walks the reader through the reasoning. Joe asserts that maps are numbers first and foremost, and pictures later. With that perspective, we have the opportunity to gain a much better understanding of our world, from the molecular to the cosmic.
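As a small illustration of the ‘maps are numbers’ idea, the sketch below treats two co-registered raster layers as plain numeric grids and combines them with simple map algebra; the layers, cell values and weights are hypothetical, invented only to show that every cell of the output map is just arithmetic on the corresponding cells of the input maps.

import numpy as np

# Hypothetical 4 x 4 rasters covering the same area: terrain slope (degrees)
# and distance to the nearest road (km).
slope = np.array([[ 2.,  5., 12., 20.],
                  [ 3.,  8., 15., 25.],
                  [ 1.,  4., 10., 18.],
                  [ 2.,  6., 11., 22.]])
road_dist = np.array([[0.1, 0.5, 1.2, 2.0],
                      [0.2, 0.6, 1.5, 2.5],
                      [0.1, 0.4, 1.0, 1.8],
                      [0.3, 0.7, 1.3, 2.2]])

def rescale(layer):
    # Normalize a layer to 0..1 so layers with different units can be compared cell by cell.
    return (layer - layer.min()) / (layer.max() - layer.min())

# Weighted overlay: flatter cells and cells closer to a road score higher.
# The output is a new "map" of numbers; rendering it as a picture comes later.
suitability = 0.6 * (1 - rescale(slope)) + 0.4 * (1 - rescale(road_dist))
print(np.round(suitability, 2))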

The Big Wave

Joe closed his recent talk with the image of a surfer waiting for the perfect wave, wistfully saying that it’s coming. Today’s big data reality is that coming wave, and if the wave is data, then the board is our computing capacity, and to surf you’ll need to know how to make sense of where the wave can take you.

Just today, the European Commission announced a 2.5 billion euro investment, alongside computing industry partners, in a public-private partnership around big data. The press release outlined the need for new ideas, tools and infrastructure to process and gain insight from this data. Interestingly, these are the sources of big data that it lists: climate information, satellite imagery, digital pictures and videos, transaction records and GPS signals. Three are absolutely geospatial, and it can be argued that photos are of places and transactions are conducted at a place. Location is decidedly the common denominator, as in most queries.

Today’s server farms and their seemingly infinite computing capacity are surely spoiling us, even enabling companies with access to our personal records to experiment with emotional manipulation on a grand scale. This capacity lies in wait for those who dare to take on the significant spatial analysis problems of our day. We have the means, and we’re amassing the data needed to make real breakthroughs that can help us all, such as taking real stock of global change, limited resources and population problems.
