Perspectives

About three years ago, spatial data quality was the topic of the day. Everyone seemed to be talking about it: how to assess quality factors, how to improve them, and where to take them next. That period of spatial enlightenment now seems to have passed. Did it get lost in ‘The Cloud’, left to the whims of cloud providers whom we automatically trust, purchase from and ask no questions of? Where did all the talk about spatial data quality go?

Spatial data quality is a critical factor in the development of successful spatial data applications. Its presence can be found through all phases of these workflows, from data capture through data management and data processing to representation.

With higher-quality applications, users, organisations and businesses experience greater reliability, assurance and adaptability, leading toward the reuse of spatial information and the creation of new applications.

A few years ago spatial data quality was a big issue, and everyone seemed to be talking about it. At the time that made sense: a new wave of technologies with higher-resolution capabilities was on the horizon, many of them gradually becoming available, and new applications were being developed based upon this information.

At that time, various organisations and individuals were talking about the need to track data quality from one end of the spatial spectrum to the other, from data capture through to representation. Discussion often centred on the notion that quality indicators needed to be included in the accompanying information (metadata), providing users with clear, understandable and easy-to-use guidelines for selecting the appropriate data for their solutions.
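
As a rough illustration of the idea, here is a minimal sketch in Python of quality indicators travelling with a dataset's metadata, so that a user can test fitness for purpose before building on it. The field names, thresholds and example values are entirely hypothetical; standards such as ISO 19157 define these quality elements far more formally.

from dataclasses import dataclass

@dataclass
class QualityIndicators:
    # Illustrative quality attributes carried alongside a dataset's metadata
    positional_accuracy_m: float  # e.g. horizontal RMSE, in metres
    completeness_pct: float       # share of expected features actually present
    capture_method: str           # how the data was collected
    last_assessed: str            # date of the most recent quality review

def suitable_for(q: QualityIndicators,
                 max_error_m: float,
                 min_completeness_pct: float) -> bool:
    # A user-facing check: does this dataset meet the application's needs?
    return (q.positional_accuracy_m <= max_error_m
            and q.completeness_pct >= min_completeness_pct)

# Example: a cadastral application demanding sub-metre accuracy
parcel_layer = QualityIndicators(0.4, 98.5, "RTK GNSS survey", "2013-06-01")
print(suitable_for(parcel_layer, max_error_m=1.0, min_completeness_pct=95.0))  # True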

It appears that those days passed quickly. The spatial data quality discussions have abated for the most part, with the exception of surveyors, who have continued to express the need for quality information in land-survey related tasks. In fairness, Trimble seems to have been one of the main companies to continue pursuing a high-quality spatial data objective and to promote its benefits.

Did all the discussion about spatial data quality get lost in the ramp-up toward ‘The Cloud’? Have people turned their focus toward accessing data rather than asking what is in the data? There is little doubt that the cloud is enabling more people to use geographic information system (GIS) data, satellite and airborne imagery, computer-aided design (CAD) data, and geodata from a host of sensors, including lidar.

As many of those involved in creating and handling geospatial data backed into the cloud garage, the urge to take the car out for a spatial data quality spin seems to have faded. I don’t think this has been intentional; rather, it has been the result of a cloud architecture that decidedly puts the geospatial user on the access and use side, no questions asked.

Would I be out of line to suggest that, today, anyone could supply any kind of data to anyone through cloud services, provided it matched the kind of solution the application developer wants to deliver?

Don’t shoot the messenger, but I feel it is important to get back on the spatial data quality track. It is the surest route to guaranteeing that professional services and quality are on offer, to providing buyers with geodata they can trust, and to understanding more fully how and where new data directions need to go, ensuring a greater number of applications that solve real problems.

It is important to identify and state how instruments perform and can be used. It is important to explain how data is stored, where it can be used most appropriately and which data needs more work. It is important to explain how data is to be analyzed and represented, and to indicate shortcomings and potential problems. It is also important to state unequivocally the benefits of following these approaches and how they lead to successful business.

The cloud is powerful, valuable and has many advantages. But we must not lose focus on the data going into the system, and on all the value a spatial data quality workflow can bring.

The goal remains for each of us to educate, to help others understand spatial data processes and technologies, and to show why and how their investments in geospatial networks make sense, provide jobs and lead toward a higher quality of life.

We need to get back on to data quality.
