Sensors and Systems

The Open Geospatial Consortium has been hard at work on many fronts to realize the full promise of geospatial technology. David Schell, the organization's founder and chairman of the board, spoke with V1 editor Matt Ball about the organization's inroads with the AECOO community and the promise of greater software interoperability.

V1: There seems to be growing momentum for the adoption of GIS within the AEC industry as a central operational platform for project management. Where do you think we stand in terms of realizing the full power of GIS in this space?

Schell: This question gets at a fundamental issue of interoperability, and it’s really not about GIS. We’re really talking about the larger domain of the extended enterprise, in which there is a gradual melting of the information boundaries around disciplines, driven by a more rapid melting of the boundaries around digital applications and technologies. In the AEC and building owner and operator world, there is a need for sophisticated decision support frameworks that employ new workflow processes and complex modeling approaches for dealing with difficult issues. Many branches of applications, including GIS and AEC CAD, have come a long way in terms of meeting discrete sets of user needs, and now it’s quite possible, within the limits of proprietary clusters of products, to integrate functions such as geospatial analysis and engineering design. But the objective is not just design. It’s workflow and decision support. Geospatial information needs to be an integral part of this, but I don’t see GIS evolving as the central operational platform.

If there’s a central operational platform, it’s Web services. The larger Information and Communication Technology (ICT) world is rapidly evolving a Web services framework that provides the infrastructure that’s necessary for connecting the digital activities of many separate players. The players can communicate in an ad hoc way. They can have different interests and rules, different data models, and different business models supported by different information systems. The AEC, building owner and operator, and service provider world doesn’t need a central operational platform; it needs freer flow of information and services. That awaits a set of technical capabilities for managing semantics and implementing rules that involve security, privacy, intellectual property, liability and so forth.

When you talk about enterprise use of geospatial information today, you’re talking about cloud-based computing that involves geospatial, and that means Web services-based interoperable geoprocessing. We shouldn’t have to say anymore that it’s not GIS. In this new context, the use of spatial information is taken for granted. The whole conception is the integration not of multiple sources of geospatial information, but multiple heterogeneous domains of information. It also includes what OGC’s Sensor Web Enablement standards enable: real-time inputs for process modeling and monitoring. On the back end, so to speak, all of this draws from various application disciplines engaged in by information technology specialists, but it’s all increasingly invisible to most users. And that’s the way it’s supposed to be. Technology advances so that we can think about things other than nuts and bolts.
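The Sensor Web Enablement standards mentioned here include the Sensor Observation Service (SOS), a Web service interface for retrieving observations. As an illustration of how such a service is invoked, here is a minimal Python sketch that builds an SOS 1.0 key-value-pair GetObservation request; the endpoint, offering name, and phenomenon URN are hypothetical placeholders, not a real service.

```python
from urllib.parse import urlencode

# Hypothetical service address -- substitute a real SOS endpoint.
SOS_ENDPOINT = "https://example.org/sos"

def get_observation_url(offering, observed_property, begin, end):
    """Build an OGC SOS 1.0 KVP GetObservation request URL.

    The eventTime parameter takes an ISO 8601 period (begin/end),
    which is how SWE clients ask for real-time or historical windows.
    """
    params = {
        "service": "SOS",
        "version": "1.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
        "eventTime": f"{begin}/{end}",
        "responseFormat": 'text/xml;subtype="om/1.0.0"',
    }
    return SOS_ENDPOINT + "?" + urlencode(params)

# Example: one day of (hypothetical) PM10 air-quality observations.
url = get_observation_url(
    "AIR_QUALITY",
    "urn:ogc:def:phenomenon:OGC:pm10",
    "2009-06-01T00:00:00Z",
    "2009-06-02T00:00:00Z",
)
print(url)
```

The point of the sketch is that any SWE-conformant client can form the same request against any conformant service, which is what makes sensor webs composable across communities.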

V1: You mention sensor webs. The idea of sensor webs appears to be growing, particularly at the larger scales involved in the Global Earth Observation System of Systems. What role does the sensor web play in sustainability? Is there growing interest in this concept in the urban environment?

Schell: Sensor webs are growing in importance, and the importance of interoperability among sensor webs is becoming increasingly apparent to the people in the various communities that are building sensor webs. Sustainability is becoming a more and more important driver. You can’t manage what you can’t observe and measure, and sustainability is about environmental management. GEOSS is the most visible effort, but the ocean observing community, the hydrology community, and the meteorology and climate communities are also getting involved. These domains have active OGC Technical Committee working groups and those groups are engaged in successful and ongoing interoperability experiments. In Taiwan there’s a sensor web based system that monitors Earth tremors to anticipate sudden rockslides that are a recurring problem in steep river valleys. OGC Sensor Web Enablement (SWE) standards are also used in a Tsunami warning system and in a number of European environmental sensor programs. Defense and intelligence and homeland security programs have provided much of the funding for the development of SWE standards, but those standards are directly applicable in environmental applications.

In the urban environment, people are using sensor webs to study flows of airborne pollutants and toxins. There’s certainly potential for monitoring storm water runoff and anything involving pipe networks or pipes that release wastes. The National Institute of Standards and Technology (NIST) has listed OGC as one of six critical standards organizations that need to be involved in the US Smart Grid Standards Roadmap, and we believe SWE standards will play a role as this unfolds. Sun and wind sensors as well as electric current sensors of various kinds will be necessary in systems that balance local variable renewable energy sources with storage units and energy sources that can be powered up and powered down on demand. We also envision smart grid technology eventually extending from electric power grids to gas, water and sewage pipe grids. Every woodstove vent in a city could have a sensor for particulates and dioxin. Sustainability is all about what happens locally and what people do at a local level.

V1: In the past we’ve talked about the importance of GI Science to address global change, and you and some of your board members have written about geospatial interoperability science. Is GI Science being adopted to the degree it needs to be, and with the urgency that is needed? If not, what can be done to accelerate that effort?

Schell: Well, once again, we need to expand our terms. “GI Science” as it’s usually used is still too narrow a term. “Interoperability science” encompasses what we are talking about. One of the first interoperability science issues we need to address is the sharing of information between various data centers, and making research data more discoverable and accessible. Data centers can’t be stovepipes anymore; they have to be “loosely coupled,” so any data center can be accessed by any data provider or user, with appropriate permissions, of course.
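The discoverability piece of this is what the OGC Catalogue Service for the Web (CSW) addresses: a standard interface for searching metadata across data centers. As a rough sketch only, the following Python snippet builds a CSW 2.0.2 KVP GetRecords request using a CQL text constraint; the catalogue endpoint is a hypothetical placeholder.

```python
from urllib.parse import urlencode

# Hypothetical catalogue address -- substitute a real CSW endpoint.
CATALOG_ENDPOINT = "https://example.org/csw"

def discovery_url(keyword, max_records=10):
    """Build an OGC CSW 2.0.2 KVP GetRecords request that searches
    catalogue metadata by keyword via a CQL text constraint."""
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecords",
        "typeNames": "csw:Record",
        "elementSetName": "summary",
        "resultType": "results",
        "constraintLanguage": "CQL_TEXT",
        "constraint_language_version": "1.1.0",
        # Search all metadata text fields for the keyword.
        "constraint": f"AnyText LIKE '%{keyword}%'",
        "maxRecords": str(max_records),
    }
    return CATALOG_ENDPOINT + "?" + urlencode(params)

# Example: discover datasets mentioning hydrology.
url = discovery_url("hydrology")
print(url)
```

Because the request shape is standardized, the same client code can query any conformant catalogue, which is precisely the "loosely coupled" access between data centers described above.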

Ralph Cicerone, president of the National Academy of Sciences, wrote about the importance of this in general terms in the February 8 issue of the journal Science. We’ve been working toward this for years in terms of technical interoperability issues, and we have working groups in hydrology, Earth system science, etc. who are developing application schemas that meet their intra-community and inter-community data sharing needs. We see individual scientists and small groups in various projects moving in this direction, but what’s really needed is a cultural dialog, which is where a public forum like V1 can make a significant difference. There is tremendous importance in defining the vocabulary and making sure it is understood by policy makers, because their decisions will ultimately enable or disable data sharing. NOAA is moving forward on this issue.

OGC standards are an absolute requirement, but they are not the only requirement. The critical requirement for technical interoperability has been met by our members, but the critical requirement for institutional change goes largely unanswered. Climate science is inherently interdisciplinary, and it is inherently longitudinal, that is, historical. Data from other studies, current and past, need to be put to maximum use if we are to fully understand and cope with our climate problem. Unfortunately, most scientists and science funders don’t understand that their view of data is dramatically and tragically a relic of nineteenth century scientific specialization and twentieth century GIS usage. It’s not about files anymore, it’s about Web services.

Researchers need to properly document and publish their data and methodologies using available Web technologies, standards and best practices. Improved discoverability and availability of data has many ramifications, and not just for cross-disciplinary and longitudinal studies. Cicerone was referring to verifiability, rigor and transparency, which are critical not only for truth but also for the public’s confidence in science. Other issues include re-use, data exploration, data fusion, use in models and more. Overall, we’re talking about improved societal and institutional return on investment of research dollars, and improved ability of research funding institutions to do due diligence and policy development. At a higher level, we’re talking about more efficient scientific debate and accelerated pace of scientific discovery. It’s about freeing researchers’ time for more creative work and more communication with other scientists.

V1: One area of growing application for geospatial technologies to address climate change is in carbon accounting. Are the current tools adequate to address this issue? How do you think GIS will benefit from such applications that require detailed spatial analysis?

Schell: There’s certainly a lot of money that’s going to change hands, and the whole business is full of abstraction and complexity, a lot like mortgage financing. In the mortgage world, we’ve seen what happens when greed takes over in an environment where there’s too little accountability and too much complexity. I fear that’s what we will see with carbon trading unless the system is built on radical transparency and careful regulation. The transparency will often depend on geospatial data, such as monitoring of forests that are ostensibly capturing carbon, and the monitoring of improved power plants that are ostensibly lowering society’s overall production of CO2. Again, the tools are available, but the institutional will and policy preparation may not be up to the task.

V1: The monitoring of global change will require more detailed models and the ability to monitor change over time. How much of a technical hurdle is this, and when do you think we’ll achieve it?

Schell: It’s really just a matter of geospatial information being thoroughly a part of the larger information environment, and it’s a matter of service-based permanent data hosting and good data curation. As I just mentioned, monitoring change over time and feeding diverse data streams into models require interoperability because you need historical data, and historical data often comes from diverse sources. The OGC interface and encoding standards provide quite good handling of temporal parameters and they provide a foundation for automating the legitimate fusion of data from diverse sources. The many software applications in which the standards are implemented are becoming increasingly capable and intuitive. The real issue is, are the funders of science ready to force change on the extraordinarily important but inherently conservative institutions of science? And will they fund interoperability research that builds on what the OGC membership has accomplished? These are key cyberinfrastructure questions. Some of the hurdles are behind us, but many are in plain view ahead of us. The OGC working groups are hard at work on things like geosemantics, data quality and uncertainty, geospatial rights management, “table joining,” and many other challenges, and application domains are using the OGC to facilitate both technical and semantic interoperability. But, as you know, I think research is needed that gets out in front of the commercially-driven standards setting process, providing a stronger theoretical basis for the advancement of interoperability.
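One concrete example of the temporal handling described above is the optional TIME dimension in the OGC Web Map Service standard, which lets a client request a map for an ISO 8601 instant or period. The sketch below builds such a WMS 1.3.0 GetMap request in Python; the server address and layer name are hypothetical placeholders.

```python
from urllib.parse import urlencode

# Hypothetical map server -- substitute a real WMS endpoint.
WMS_ENDPOINT = "https://example.org/wms"

def getmap_url(layer, time_value):
    """Build a WMS 1.3.0 GetMap request with the optional TIME
    dimension, which accepts an ISO 8601 instant or period."""
    params = {
        "service": "WMS",
        "version": "1.3.0",
        "request": "GetMap",
        "layers": layer,
        "styles": "",
        "crs": "EPSG:4326",
        "bbox": "-90,-180,90,180",   # WMS 1.3.0 axis order for EPSG:4326
        "width": "512",
        "height": "256",
        "format": "image/png",
        "time": time_value,
    }
    return WMS_ENDPOINT + "?" + urlencode(params)

# Example: a decade-long period for a (hypothetical) historical layer.
url = getmap_url("sea_surface_temp", "2000-01-01/2009-12-31")
print(url)
```

Standardized temporal parameters like this are what allow the same historical data stream to feed many different change-monitoring models without custom integration work.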
