
The Open Geospatial Consortium (OGC) has been working diligently over the years to build capacity within the geospatial community through the adoption of standards. There have been many successes during that time, some of which have led to even more interesting changes within the community, prompting further speculation, interest and questions. Vector1 Media editor Jeff Thurston recently interviewed Steven Ramage, executive director, marketing and communications of the OGC, and raised some of these questions.

V1 Magazine: I’ve recently heard people debating the issue of ‘authoritative’ sources of geodata and geospatial services. This seems to surround initiatives connected with neogeography and social media type applications, with the notion that they cannot be professional. How does the Open Geospatial Consortium (OGC) approach the whole notion of ‘authoritative’ geospatial data and services?

SR: I recently gave a paper at the GSDI 12 World Conference in Singapore. The topic was User Generated Content and SDI Standards. From an OGC perspective I don’t think we should be concerning ourselves with authoritative or gold standard data or services, but we do want to ensure that data quality information can be conveyed. The OGC Technical Committee’s Data Quality Domain Working Group, for example, has recently been looking into mechanisms for expressing uncertainty, using UnCertML.
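To make the idea of conveying quality information concrete, here is an illustrative sketch of how a data producer might attach a quantified uncertainty to an observed value, in the spirit of UncertML's probability-distribution encodings. The element names below are simplified stand-ins invented for this example, not the actual UncertML schema:

```python
# Illustrative only: conveying an observation together with its uncertainty
# as a Gaussian distribution. Element names are simplified stand-ins for
# the real UncertML vocabulary, using only the Python standard library.
import xml.etree.ElementTree as ET

def observation_with_uncertainty(value: float, variance: float) -> str:
    """Serialize an observed value plus a Gaussian uncertainty estimate."""
    obs = ET.Element("Observation")
    ET.SubElement(obs, "result").text = str(value)
    gaussian = ET.SubElement(obs, "GaussianDistribution")  # hypothetical element
    ET.SubElement(gaussian, "mean").text = str(value)
    ET.SubElement(gaussian, "variance").text = str(variance)
    return ET.tostring(obs, encoding="unicode")

doc = observation_with_uncertainty(21.4, 0.25)
```

The point is simply that uncertainty travels with the data rather than being lost when it is shared.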

Many organisations involved in defence or national security are very strict about the sources of their data, for obvious reasons. Commercial organisations and public sector bodies can be more flexible, in many cases, but attribution and liability are big issues. Many inputs of User Generated Content, such as feeds from crowdsourcing and sensors, are just different types of data that need to be integrated, and this is one place where OGC standards play a role. From the OGC’s perspective the biggest consideration is the increasingly large volumes of geospatial data, and growing number of disparate sensors, sources and feeds providing this data. All this data has to be quality controlled, made interoperable for access and shared/reused to be of any value.

V1 Magazine: With respect to the first point, it seems to me that quality ties rather closely to the issue of standards. That is, I think most people think of higher quality when they think of standards. Is this an accurate observation? Can you explain the relationship using a simple geospatial example?

SR: You might say this issue requires raising the quality of our thinking about standards. It’s important to distinguish between three kinds of geospatial standards: data standards that relate to data’s fitness for use; metadata standards that enable data to be described and compared (and that includes descriptions and comparisons of quality and fitness for use); and technical standards for communication between geospatial processing systems. The OGC is concerned mainly with the last of these, but OGC cares about data and metadata standards, too, because OGC standards must enable systems to communicate information about data. Some OGC standards involve conveying and using metadata and some involve capturing and preserving information about the various processes that created the data.

Prior to joining the OGC I spent the last 8 or 9 years of my career at 1Spatial working on this topic. It’s massive. The thing about quality is how you define and measure it. It doesn’t mean the same thing to all people. You obviously have data quality where things like business rules, geometric and topological data structuring, semantic and syntactic data translation and transformation are key to understanding meaning and context and being able to share data in a meaningful manner. This is an area that I see emerging in the OGC work around cross-domain modeling, where understanding the language of different domains or industry sectors is extremely important.

Consider GeoSciML. Geological information is certainly important, and the quality of geological information varies. The Commission for the Management and Application of Geoscience Information (CGI), a commission of the International Union of Geological Sciences, as their website explains, “seeks to enable the global exchange of knowledge about geoscience information and systems. It is increasingly becoming important to query and exchange geological information between geological data providers for legal, social, environmental and geoscientific reasons.”

So CGI developed a conceptual model of geoscientific information drawing on existing data models; they implemented an agreed subset of this model in an agreed schema language; and they have implemented an XML/GML encoding of the model. GML is the OGC Geography Markup Language Encoding Standard. In other words, a community of interest organizes to establish common naming schemas, including ways of describing quality. The community of interest may work with the OGC to be sure their XML encoding is consistent with GML, the industry standard XML encoding for geospatial information.
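As a rough illustration of what "an XML/GML encoding of the model" means in practice, the sketch below builds a minimal GML-encoded point feature using only the Python standard library. The `Borehole` element and its contents are hypothetical stand-ins for a community-defined schema, not taken from GeoSciML itself; only the `gml:` namespace and geometry elements follow the GML standard:

```python
# A minimal sketch of encoding a feature in GML. The "Borehole" element is a
# hypothetical community-defined feature type; gml:Point/gml:pos follow the
# OGC Geography Markup Language encoding.
import xml.etree.ElementTree as ET

GML_NS = "http://www.opengis.net/gml"
ET.register_namespace("gml", GML_NS)

def encode_point_feature(name: str, lon: float, lat: float) -> str:
    """Encode a named point feature with a GML geometry."""
    feature = ET.Element("Borehole")          # hypothetical feature type
    ET.SubElement(feature, "name").text = name
    point = ET.SubElement(feature, f"{{{GML_NS}}}Point", srsName="EPSG:4326")
    ET.SubElement(point, f"{{{GML_NS}}}pos").text = f"{lon} {lat}"
    return ET.tostring(feature, encoding="unicode")

xml_doc = encode_point_feature("BH-001", 103.85, 1.29)
```

Because the geometry is expressed in standard GML, any GML-aware system can interpret it, while the community controls the domain-specific wrapper.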

The geology community, the meteorology community, the hydrology community, the aviation community, and the ocean observing community have been working together in the OGC to establish such foundations for their work. These are ongoing activities because there’s a lot of legacy to weave together and because work remains on some technical standards, such as those that capture and preserve information about data lineage, that is, what processes have shaped the data. This relates to workflow, and ultimately it ties into technical standards that enable data providers to put restrictions on the use of their data.

This is not a simple example because it’s not a simple problem. Data quality is iterative and it’s fundamental to get it right at the outset from the data model level where possible. 

V1 Magazine: The idea that one can have standards and still maintain performance crosses the minds of many people. The argument goes something along the lines that to achieve a common standard among products, functional performance must be sacrificed. Yet, we often hear people say standards will result in more, not fewer, applications. Could you explain how these arguments align?

SR: Well, that’s two questions: what does performance mean, and what stimulates innovation? 
Consider the performance of a vector update operation: The definition of vector update can be defined publicly, and two vendors can have different proprietary algorithms that produce identical vector update results, but one vendor’s software runs faster. OGC standards can enable a third vendor’s client application to send a vector update task to either vendor’s server (to be compared and integrated). The vendors can keep their algorithms secret, but they implement standards so their “black boxes” can communicate with a larger world of different applications. 

Suppose there were only one vendor of vector update software, and that vendor charged an exorbitant fee for application developers to use the application programming interface (API) for that software. That would inhibit application development. But with open standard APIs – in this case the OGC Web Feature Service Interface Standard – many developers can sell applications whose vector update capability can be provided by any of a variety of vector update servers. To see how important this is, we just need to look at the innovation that HTML and HTTP have enabled as IT standards!
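To show what that vendor neutrality looks like at the wire level, here is a sketch of a client forming a standard WFS GetFeature request. The host and feature type name are placeholders, and no network call is made; the query parameter names (`service`, `version`, `request`, `typeName`, `maxFeatures`) follow the WFS 1.1.0 interface standard:

```python
# Sketch of a client building a WFS 1.1.0 GetFeature request URL.
# The base URL and type name are placeholders; any conformant WFS
# server could answer the same request.
from urllib.parse import urlencode

def wfs_getfeature_url(base_url: str, type_name: str, max_features: int = 10) -> str:
    """Build a GetFeature URL using standard WFS parameter names."""
    params = {
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typeName": type_name,
        "maxFeatures": max_features,
    }
    return f"{base_url}?{urlencode(params)}"

url = wfs_getfeature_url("https://example.org/wfs", "topo:roads")
```

Because the request is standard, a client written against one server works unchanged against a competitor's server, which is exactly the "black boxes communicating" point above.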

A standard in itself is not slow or fast or anything in between. Like any processing algorithm, performance is dependent on how the standard is implemented and what infrastructure (bandwidth, servers, etc) is available to run the application. It all depends!

V1 Magazine: I’ve heard people talk about ‘INSPIRE compliant’ and ‘INSPIRE standards’ in both general media and even product releases. Why is that since INSPIRE only exists as a Directive of the European Union? What is the current state of INSPIRE vis-à-vis standards and what is OGC doing about this to clarify communication?

SR: As far as I know, no products or applications are actually ‘INSPIRE compliant’ yet, since the INSPIRE technical architecture and services are still being developed. Further, there are no official compliance tests to actually test if a product or application is INSPIRE compliant. So a better way of communicating these solutions may actually be “prepared or suitable for INSPIRE”. 

A number of the issues that INSPIRE is addressing are topics that the OGC has been covering for many years, such as interoperability and harmonisation. Additionally, communities around the world, for example in Oceania, North America and South America, have also been working on Spatial Data Infrastructures or large information infrastructures to enable data sharing for at least 10 years. All these communities were using OGC standards before INSPIRE was adopted as European legislation.

What INSPIRE does well is to provide a coordinated framework for the use of standards, and this includes de jure standards like those from ISO. In my opinion the greatest value of INSPIRE is bringing together communities of practice so they are able to share good practice and ultimately help society. This includes data modeling and issues around data management, notably harmonisation, schema transformation and edge matching, as well as network services and monitoring and reporting. Every conference I attend around the world has almost as many presentations on INSPIRE as on OGC standards.

So in terms of communicating the links between standards and INSPIRE there are a number of activity threads. The JRC works with us to provide information relating to INSPIRE developments, and my colleague, Athina Trakas, has been involved with the Initial Operating Capability Task Force (IOC TF). The goal of this group is to help and support the implementation of INSPIRE in the Member States. I could ramble on, but the easiest thing is to view her presentation from the INSPIRE Conference earlier this year.

I would like to create an overview of all OGC (and ISO) standards relevant for INSPIRE, but it’s a major effort and will require input from the OGC and INSPIRE communities. This would essentially be a short overview document that pulls from a multitude of INSPIRE documents, something like a quick guide. The initial list includes the OGC Web Service standards (such as the OGC Catalog Services, Web Map Service, and Web Feature Service interface standards) for aspects like Discovery, View and Download. Then you start to get into areas like OGC Symbology Encoding and Web Processing Services.

The international standards for metadata for datasets and services are ISO 19115 and ISO 19119, respectively. The application schema for both is ISO 19139, and these schemas can be found at two different locations: the ISO repository for official standards and the OGC Schema Repository. The table below is from the Draft Guidelines – INSPIRE metadata implementing rules based on ISO 19115 and ISO 19119 (this Implementing Rule has since been updated). CityGML (see the work of SIG3D) is another OGC standard that the INSPIRE community is working on.

OGC’s GML, described above, is useful and necessary for INSPIRE. GML is an open, XML grammar defined to express geographical features. GML is based on a number of abstract models codified in a number of ISO standards. GML serves as a modeling language for geographic systems as well as an open interchange format for geographic transactions on the Internet. As such, if there is a content model, you can encode the content using GML and view the result as a general geospatial data interchange format for that content. In fact, geographic data in GML can be sent to any device with an XML interface. So, for instance, you could use GML to send geographic data from one GIS to another. GML can also be displayed on XML-enabled devices like the new-generation PDAs and cell phones. The benefit for the provider is that one format suits all uses. Some work is planned around complexity analysis of the GML specifications to make it easier for people to get started with GML.

Another example: service chaining. Service chaining refers to automated invocation of a series of services that may all be hosted on different servers. Perhaps a request on a portal extracts some geospatial data from a database, sends it to a server to convert it from one reference system to another, sends it to another server to convert from geographic coordinates to UTM, sends it to another site to add administration boundaries and demographic data, and finally passes it along to another site for display or storage. Each server is providing a discrete service. This concept is supported well by GML because (a) GML is an encoding standard, so sites don’t need to support lots of proprietary data formats, and (b) GML is extensible and XML-based, which makes it easy to manipulate, change, and add to its contents. The range of domains covered by GML – which is essentially what INSPIRE is all about with its 34 themes – is also important, because GML provides a foundation for cross-domain data modelling.
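The service-chaining idea above can be sketched as a toy pipeline. Each "service" is modeled as a plain Python function and a chain runner feeds one service's output to the next; in a real OGC deployment each step would be a separate web endpoint exchanging GML documents, and the service names and data here are invented purely for illustration:

```python
# Toy illustration of service chaining: each stand-in "service" is a function,
# and the chain runner passes each result forward, just as a portal would
# route data through a series of distributed OGC web services.
from functools import reduce

def extract(coords):          # stand-in for a database extraction service
    return {"coords": coords, "crs": "EPSG:4326"}

def reproject(feature):       # stand-in for a coordinate transformation service
    return dict(feature, crs="EPSG:32633")

def annotate(feature):        # stand-in for a boundary/demographics service
    return dict(feature, admin="Region-A")

def run_chain(data, services):
    """Invoke each service in order, feeding results forward."""
    return reduce(lambda acc, svc: svc(acc), services, data)

result = run_chain([10.0, 59.9], [extract, reproject, annotate])
```

The design point is that each step only needs to understand the common encoding, never the internals of the other services.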

INSPIRE needs to deal with a wide variety of geospatial standards. OGC, ISO/TC 211, and CEN/TC 287 are exploring the adoption of a common “Change Request / Requirements Registry” and examining ways in which XML schema for adopted standards can be managed more effectively and efficiently across these organizations. Gathering all requirements and change requests in an open forum visible to the public will also make the standards process more transparent and more responsive to community needs. 

Since 1998, when the OGC became a Class A Liaison organization with ISO/TC 211 (geomatics), six OGC standards have been submitted to TC 211 and gone through the formal ISO process to become joint OGC and ISO standards. These include the OGC Simple Features (SF), Web Map Service (WMS), and Web Feature Service (WFS) interface standards and the OGC Geography Markup Language (GML), Filter (FE), and Observations and Measurements (O&M) encoding standards.

Having a close correspondence between OGC standards and ISO international standards supports market development and policy development in Europe, Asia, and other world regions. The relationship also adds a vetting process that “puts more eyes on” the standards, ensuring that they are as good as they can be, addressing all relevant requirements. ISO has a “Publicly Available Specification” (PAS) process that is faster and less demanding on resources, i.e. peoples’ time, and the OGC may make more use of this process in the future. Similarly, the OGC Members recently approved an OGC “Fast Track” process to allow rapid adoption of de-facto industry standards and community of interest standard encodings that have been broadly implemented. 

The OGC has had a formal liaison with CEN since September 2005. CEN and the OGC are currently engaged in discussions regarding cooperative activities that will help both organizations contribute more effectively to the development of a European Spatial Data Infrastructure through the EU INSPIRE initiative.
Other standards development organizations (SDOs) – such as ISO, OASIS, W3C, the Open Mobile Alliance, the buildingSMART International Alliance for Interoperability, IETF, NENA (9-1-1), and OSGeo (see full names below) – have requirements for encoding and/or using location. Over the last 5 years, location has become an increasingly important part of their standards work. Thus the OGC plays a key role in ensuring the common and consistent use of location/geography across the spectrum of information and communication technology (ICT) standards, many of which are important in reaching the goals of INSPIRE.

Additionally there is the OGC Web Services (OWS) Shibboleth Interoperability Experiment (IE). This advances best practice for implementing standards for federated security in transactions involving geospatial data and services. It was part of the ESDIN project – the European Spatial Data Infrastructure best practice Network – and has relevance for INSPIRE.

Probably the most important point regarding OGC and INSPIRE is that the OGC exists because of the members. The Consortium is there to support their interoperability needs and try to promote political, cultural, legal, organisational, scientific and technological interoperability worldwide.  So the members are helping each other and the wider community by communicating around OGC and INSPIRE issues and opportunities. A good example can be found here.

V1 Magazine: With respect to the point above, there seems to be a growing sense that spatial data infrastructures (SDI) are also standardised. Yet, most of these projects and initiatives often come together for specific and unique reasons with differing resource needs and goals supported through local policies. Can you discuss the standards concept in terms of SDI and what the current thinking is about this type of work?

SR: You do realise that there are entire conferences dedicated to this topic, such as the recently-ended GSDI 12, which covered SDIs and standards. In my mind, work on standards and work on SDIs are synonymous. Both areas are about providing better ways to access and share information to produce some sort of societal benefit. The OGC works with a variety of domains tackling issues like national security, weather forecasting, emergency and disaster management, and many SDIs are taking similar issues into consideration as they are being developed.

In less grandiose words they are tackling the interoperability aspects associated with geospatial information; this obviously covers the technical issues, but also incorporates spatial law and policy. 

Whether it’s closed source, proprietary software that is adding value to an organisation through its features and functionality or whether its free and open source software and services, OGC standards work across the spectrum of tools available in the market place today. It’s these same tools and services that are now underpinning local, national and regional, or even global SDIs.

As more and more communities engage with the OGC, the consortium, its members and processes, we have an opportunity to educate and inform. The education part is to avoid duplication of effort and to help communities take advantage of all that the existing standards offer. With the rapid rollout of smart phones, social networking, and cloud computing, it’s important for developers and users to know about some of the newer, lightweight standards that exist in this area and how they fit with the well-proven and adopted standards. For example, the candidate Open GeoSMS and Geosynchronisation standards are designed to work with the other OGC Web Services standards for more complicated encodings and services.

The rapid advance of location applications of all kinds just raises the level of concern about data quality, rights management, privacy, currency and access. With over 400 members we have a dynamic network offering a wealth of experience, knowledge, know-how and ideas to draw from and this is probably as important as the standards development process itself.

V1 Magazine: Some people see standardisation as also including non-monetary benefits – not everything is based on return on investment (ROI). These alternate forms of benefit are not often discussed, yet they are important. Can you explain why, using a few examples?

SR: My opinion is that it is very hard to measure a return on investment when you are alleviating poverty, relocating internally displaced persons or saving lives. We don’t really know how much OGC standards contribute to all these areas because of the depth and breadth of the standards, and also the large number of organisations implementing the standards. However, we do have some great examples from Haiti and other places, see here: http://www.ogcnetwork.net/networks/haiti.

As part of the marketing and communications effort the team has been working hard to set up a Business Value Committee. The original pitch to anyone who would listen was around tangible and intangible benefits, what you call non-monetary benefits. This issue is much wider than just standards in the geospatial sector and it appears that in just the last 2-3 years a number of studies and reports have been written to tackle this perception that it’s hard to measure the benefits, particularly for open standards. One of the goals of the Business Value Committee (BVC) is to address precisely this area.

The BVC is chaired by Marge Cole from MobiLaps (currently working at NASA), with an EMEA Vice Chair from Erdas, Emmanuel Mondon, and Kylie Armstrong, who works for both Landgate and the CRC-SI as the Australasia Vice Chair. 


V1 Magazine: I think some people in the geospatial community view standards and see them as involving a select group of people who are technologically aware and capable of participating within the initiatives. Describe how anyone could become involved, why that is important and what the OGC is doing to make that happen – beyond simply saying people are welcome. Are there events and ways that these people can participate and become more fully engaged?

SR: I couldn’t agree more and would really like to change this perception. Sure there’s a certain amount of technology understanding and technical rigor in the process, but this doesn’t mean that everyone who participates has to be technically inclined. In fact quite the contrary. The Business Value Committee I just mentioned has no restrictions on membership and the documents have all been made publicly available for review. My desire, and hopefully this interview will help, is to engage many more sales, marketing, strategy and policy makers in the OGC. There is a reason that standards development work takes place and that’s what we mustn’t lose sight of – the benefits and value of open standards. 

Further, much of the work in the OGC is about understanding requirements for enhanced interoperability that may be domain specific. As such, we are seeing more non-programmers participate in a number of the OGC Domain Working Groups mentioned above. These professionals, scientists and researchers understand the requirements for sharing data and services in their domain. They know semantics, domain models, and so forth. This is extremely valuable information that is required to ensure that OGC standards are relevant for use on a global basis in specific domains or communities.

Another practical step we are about to take relates to membership levels where organisations are less interested in standards development per se, but they are very interested in how and where standards can help them drive areas like organisational efficiency and effectiveness. Later this month at Map Africa 2010 we will announce a revised membership structure called GovFuture. I cannot say more at the moment, but please look out for the announcement – it is targeting local and subnational government worldwide and will hopefully help engage more public sector bodies.

Perhaps we haven’t publicised it enough, but there are many ways in which non-members can provide input to the OGC standards process. The OGC website has a page titled “OGC Invites Input” with links to our Public Forum, our online Change Request/New Requirements Form, Domain Working Groups (which feature outside speakers), public Engineering Reports, Discussion Papers and Requests for Comment. We have many Alliance Partners, and we have a new OGC Global Advisory Council, many of whose members do not represent OGC member organizations.

In terms of people participating, they can look at some of the areas I mentioned, and I would also really encourage all geospatial professionals to look at the OGC Interoperability Program. This is probably one of the greatest areas of value within the OGC, if you invest in it wholeheartedly. If you use it as a technology learning opportunity as well as an opportunity to work with vendors and end users alike, then the rewards reaped can be substantial.


V1 Magazine: Most products today can consume and work with data from other products, and if they cannot, there are often resources to translate or transform geodata. Is the interoperability message still as important as it was previously, given that ‘open’ software seems so prevalent?

SR: “Open” can mean many things. In the old days, a vendor’s interfaces and encodings were called “open” if the vendor licensed their use to other companies, but we’ve moved beyond that very limited definition. Now the vendor might “open” their interfaces and encodings in such a way that anyone can implement or use them. But the vendor can still change them at will, without consulting all those whose business depends on interoperability that in turn depends on continued use of the vendor’s old interfaces and formats. Batch conversion is also much slower than interactive Web services. Interoperability in our networked age is more important than ever, and it is NOT usually a result of multiple vendors updating their translation software every time their competitors and partners update old interfaces and encodings. Rather, interoperability today, throughout the Information and Communication Technology world, depends on open standards developed by inclusive-membership open standards organizations that bring vendor communities together with user communities. That’s the OGC in a nutshell.

We seem to be moving now to the next level of interoperability, which is all about data context, meaning, understanding and classification. This means things like semantics, syntactics, ontologies, shared vocabularies, etc. So part of the answer to your question is that interoperability is even more important now since technology standards have provided the platform and the capability that we need to use and carefully interpret data. And not just data, but also online services. 


V1 Magazine: Web mapping is only a small part of the entire geospatial pipeline; however, as cloud services gain traction, the implementation of web mapping services seems poised to expand much further. How will HTML5 impact OGC WMS?

SR: You’re right, as cloud services gain traction there will be increasing demand for web mapping services. Standards are important now, when most cloud applications are closed within enterprise information systems, but they will become even more important as the market grows for public cloud services like those from Amazon, Microsoft, Oracle, IBM, etc. that have received so much publicity.

HTML5 will make it easier to write good browser-based WMS clients that don’t use Flash. There will also likely be a positive impact on WFS and GML clients in the browser due to better/faster 2D drawing support and a native API for doing the drawing. Greater browser support for “maps” using OGC standards such as WFS, GML and the OGC Sensor Observation Service Interface Standard will likely mean that a larger population of web developers will be interested in OGC standards. Developers can, of course, implement our standards in the browser using Javascript, but many of these developers will no doubt want simpler, more “web-friendly” interfaces that are URL-based and that use JSON (JavaScript Object Notation) or other lightweight data-interchange formats.
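The "URL-based" style of access mentioned above is easy to picture with a WMS GetMap request. The sketch below builds one; the server endpoint and layer name are placeholders, while the parameter names (`service`, `version`, `request`, `layers`, `bbox`, `crs`, `width`, `height`, `format`) follow the WMS 1.3.0 interface standard:

```python
# Sketch of building a WMS 1.3.0 GetMap request URL, the kind of simple
# URL-based access that browser clients use. The base URL is a placeholder
# and no request is actually sent.
from urllib.parse import urlencode

def wms_getmap_url(base_url, layers, bbox, width=256, height=256):
    """Build a GetMap URL using standard WMS 1.3.0 parameter names."""
    params = {
        "service": "WMS",
        "version": "1.3.0",
        "request": "GetMap",
        "layers": ",".join(layers),
        "bbox": ",".join(str(v) for v in bbox),
        "crs": "EPSG:4326",
        "width": width,
        "height": height,
        "format": "image/png",
    }
    return f"{base_url}?{urlencode(params)}"

url = wms_getmap_url("https://example.org/wms", ["roads"], (-90, -180, 90, 180))
```

A browser (or an HTML5 canvas application) can drop a URL like this straight into an image element, which is why such interfaces appeal to web developers.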

V1 Magazine: Using a Smart Grid example, could you explain where interoperability, standards and specifications independently fit? What part of the Smart Grid example is the OGC involved in?

SR: Every Smart Grid component — transformer, meter, air conditioner, power plant, electric car, solar panel, etc. — has a location, and location matters. Every grid event or phenomenon — brown-out, demand variability, power surge, regulation, transmission loss, etc. — occurs within some time interval and at some location in space along the grid’s physical network. 

Utilities have used GIS and SCADA for decades, but the smart grid is something different, because it’s a huge network of distributed nodes that produce, collect and use data. Most of the data has a location component. A GIS in the back room is just one of millions of nodes, and it’s of little use without the open interface and encoding standards that enable communication of queries, responses and alerts between countless systems and devices. The standards also enable hundreds or thousands of vendors to participate in the international smart grid market. OGC participated in early US National Institute of Standards and Technology (NIST) smart grid standards workshops to help the key players understand the importance of geospatial standards. As a result, OGC standards are now written into the US smart grid standards framework. However, much work remains.

One key area that needs to be addressed is indoor location standards, and these need to be developed in such a way that they work well with the OGC’s outdoor location standards. The OGC CityGML standard provides an important beginning, but it is primarily an open standard for 3D modeling of the built environment. The indoor/outdoor location issue and Building Information Models are important for the smart grid, emergency response, location-based advertising and every company or agency that buys, sells, owns, occupies, insures, inspects, appraises, manages, designs, builds and protects real estate or property.

V1 Magazine: Some companies are moving toward ‘open’ approaches for plant related infrastructure and design. Traditionally these have been highly proprietary applications. These changes impact the way people work, the organisational culture in some cases and raise other issues related to safety, investment, ownership and lifecycle approaches. How does OGC work among all these directions and challenges?

SR: Your key point here is spot on, that open approaches based on open standards impact the way people work and live. Just as important, or more important, is the fact that standards impact business models. This often makes progress difficult, even though the net result for society and the economy is positive. As I just mentioned, there is a huge need for open standards in the industries and professions involved in the built environment. However, meeting this need has proven to be extremely difficult because it would bring change in many arenas of economic activity. 

The OGC has been working for several years with the buildingSMART alliance, the main organization involved in Building Information Models (BIM). The OGC also has other Alliance Partners that care about BIM, such as the Open Standards Consortium for Real Estate (OSCRE). Both of these organizations have recently become more international, and this internationalization, along with greater cooperation among standards development organisations, may provide the critical mass necessary to make real progress.

Even though open standards help markets expand, the leading vendors, quite understandably, usually wait for demand from the user community before getting involved with open standards. This is not happening very fast in the US, with respect to BIM. But it could happen with push from governments, insurers, real estate investors, lenders and others in other countries. The US would likely follow suit. So we are hopeful that collaboration will increase and progress will follow.

V1 Magazine: OGC recently announced the 2011 OWS-8 Testbed. Can you outline what you expect it to include next year? What challenges do you see for the 2011 edition?

SR: OWS (OGC Web Services) testbeds are part of OGC’s Interoperability Program, a global, hands-on and collaborative prototyping program. The goal is to rapidly develop, test and deliver proven candidate standards specifications into OGC’s Specification Program, where they are formalized for public release. In OGC’s Interoperability Initiatives, international teams of technology providers work together to solve specific geoprocessing interoperability problems posed by the Initiative’s sponsoring organizations. OGC Interoperability Initiatives include testbeds, pilot projects, interoperability experiments and interoperability support services – all designed to encourage rapid development, testing, validation and adoption of OGC standards.
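For readers unfamiliar with the “OGC Web Services” in the testbed name: services such as WMS, WFS and WCS share a common key-value-pair (KVP) request pattern. As a minimal sketch (the endpoint URL below is hypothetical), a GetCapabilities request can be assembled like this:

```python
# Sketch: building a key-value-pair (KVP) GetCapabilities request for an
# OGC Web Service. The endpoint is a hypothetical placeholder; SERVICE,
# REQUEST and VERSION are the standard KVP parameter names.
from urllib.parse import urlencode

def getcapabilities_url(endpoint: str, service: str = "WMS",
                        version: str = "1.3.0") -> str:
    """Return a GetCapabilities URL for the given OGC service endpoint."""
    params = {"SERVICE": service, "REQUEST": "GetCapabilities",
              "VERSION": version}
    return f"{endpoint}?{urlencode(params)}"

url = getcapabilities_url("https://example.org/ows")
print(url)
# https://example.org/ows?SERVICE=WMS&REQUEST=GetCapabilities&VERSION=1.3.0
```

The service replies with an XML capabilities document describing its layers or feature types; the same request shape works across OGC service interfaces, which is much of what “interoperability” means in practice here.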

Before responding on OWS-8, I would like to talk about OWS-7. Since the OGC Web Services (OWS) activities have been happening for many years now, it’s important to provide some context and explain how OWS-8 thinking was developed. OWS-7 had three activity threads: Sensor Fusion Enablement (SFE), Motion Video Fusion, and Aviation: 

  • The SFE Thread focused on integrating the OGC Sensor Web Enablement (SWE) interfaces and encodings with workflow and web processing services to perform sensor fusion. This activity also increased the interoperability between SWE and the US Common Chemical, Biological, Radiological and Nuclear Sensor Interface (CCSI). 
  • Motion Video Fusion is about geo-location of motion video for display and processing. This effort involved change detection in motion video using the OGC Web Processing Service Interface Standard. It also involved dynamic sensor tracking and notification based on a geographic Area of Interest (AOI). 
  • The Aviation thread further developed and demonstrated the use of the Aeronautical Information Exchange Model (AIXM) and the Weather Information Exchange Model (WXXM) in an OGC Web Services environment. These are standards that the US Federal Aviation Administration (FAA) and Eurocontrol developed as global standards for the representation and exchange of aeronautical information. AIXM uses the OGC Geography Markup Language (GML), tailored for the representation of aeronautical objects. 
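The way AIXM builds on GML can be illustrated with a toy fragment. The element names and values below are simplified, illustrative assumptions (real AIXM 5.1 schemas are far richer); the point is that the geometry inside an aeronautical feature is plain GML, so generic GML-aware tooling can read it:

```python
# Sketch: a toy AIXM-style feature whose location is encoded as GML
# (gml:pos). Element names and coordinates are illustrative assumptions;
# real AIXM 5.1 documents carry much more structure and temporality.
import xml.etree.ElementTree as ET

AIXM_FRAGMENT = """<aixm:AirportHeliport
    xmlns:aixm="http://www.aixm.aero/schema/5.1"
    xmlns:gml="http://www.opengis.net/gml/3.2"
    gml:id="AHP_EXAMPLE">
  <aixm:designator>EGLL</aixm:designator>
  <aixm:ARP>
    <aixm:ElevatedPoint gml:id="P_EXAMPLE">
      <gml:pos>51.4775 -0.4614</gml:pos>
    </aixm:ElevatedPoint>
  </aixm:ARP>
</aixm:AirportHeliport>"""

NS = {"aixm": "http://www.aixm.aero/schema/5.1",
      "gml": "http://www.opengis.net/gml/3.2"}

def reference_point(xml_text):
    """Return (designator, (lat, lon)) for an airport reference point."""
    root = ET.fromstring(xml_text)
    designator = root.findtext("aixm:designator", namespaces=NS)
    pos = root.findtext(".//gml:pos", namespaces=NS)
    lat, lon = (float(v) for v in pos.split())
    return designator, (lat, lon)

print(reference_point(AIXM_FRAGMENT))  # ('EGLL', (51.4775, -0.4614))
```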

After analyzing the OWS-8 sponsors’ requirements, the OGC Interoperability Team recommended to the sponsors that the content of the OWS-8 initiative be organized around a set of activity threads similar to those in OWS-7:

  • Sensor Fusion (observations, gridded data sources and services, motion video change detection, WCS 2.0 compliance tests, CCSI toxic sensor interfaces)
  • Feature and Decision Fusion (information models, database synchronization, information catalogues and sharing, integrated clients enabling inter-agency schema harmonization, feature portrayal, decision support)
  • Aviation (WFS/FE support for AIXM, WXXM, portrayal, event architecture, dispatch and general aviation)

Your timing is good with this question because the OGC’s Request for Quotation / Call for Participation (RFQ/CFP) is just about to come out (mid-November 2010). There will be a Bidders Conference (Q&A) on 6 December 2010; proposals will be due 7 January 2011; and the OWS-8 Kickoff, a 3-day workshop in the Washington, DC area, will be held 9-11 March 2011. OWS-8 final reports are due in September 2011 and there will be live demonstrations of OWS-8 results at the September 2011 OGC Technical Committee meeting in Boulder, Colorado, USA.

As described above, the OWS-8 testbed will include the same activity threads as in OWS-7, but with different content. Details will be announced in the RFQ/CFP. The challenge, as always, is spreading the message regarding sponsorship and participation in these highly valuable exercises!

V1 Magazine: I’d like to touch on the issue of flooding, as that has been a major issue this past year in a lot of the media information coming across our desk. How is OGC involved in water-related issues, and how does that contribute toward improving the situation for dealing with flood-related events? On the flip side of this lies the issue of water quality, that is, water suitable for drinking. Are you involved in efforts to conserve water and ensure a quality water supply?

SR: The OGC has always had a number of members focused on disaster management, and some testbeds and pilots have focused on flooding. On the flip side, a Hydrology Domain Working Group was recently formed to advance standards that support efforts to ensure water availability and water quality.

The OGC has played a leadership role in the Global Earth Observation System of Systems (GEOSS) Architecture Implementation Pilot, developing an interoperability infrastructure for that multinational, multi-year program. OGC has worked with partners on a number of demonstrations, some of which involved flood management and water resources.

For example, part of Japan’s contribution to GEOSS is the GEO Grid project at Japan’s National Institute of Advanced Industrial Science and Technology. GEO Grid applies grid technology to integrate satellite imagery, geological data, and terrestrial sensor data to monitor and respond to disasters such as earthquakes, landslides, and flooding. A Taiwan landslide monitoring effort has heavily used OGC standards in systems that alert populations to impending flows of debris down valleys after heavy rainfalls. A German-built tsunami warning system in the Indian Ocean also uses OGC standards.

As highlighted earlier, the Haiti earthquake relief effort benefited from applications that were quickly made available thanks to OGC standards. And there’s a UN-Habitat project that will make use of the “Human Sensor Web” idea for building a monitoring system to improve the water supply in Zanzibar. There are many such examples.

The OGC Technical Committee has a new Emergency and Disaster Management Domain Working Group that will focus on floods as well as other disasters.

V1 Magazine: What are two things that you think the geospatial community does not quite understand about OGC or some of the work you are doing? How can this be changed?


SR: To a certain extent I think you have covered the first point. You talked about the perception that the OGC may be seen as some sort of exclusive club or just the technology cognoscenti. It’s not. The biggest eye-opener for me when I joined from 1Spatial, a long-standing OGC member, was just how much happens within the consortium. I never realised the level of activity worldwide, even as an OGC member. From the outside it’s not so obvious what is actually happening, but once you get involved and invest some time and energy in the process you see the broad scope of things, and you can get a lot back! Many people have been doing exactly that, successfully, for many years; too many to mention.

The second point is related to education and awareness at multiple levels. Whether or not you agree with how the OGC operates or even with the technical architecture decisions, there’s no denying that the OGC and the work done by many individuals involved in OGC over the years has helped to shape the geospatial industry and will continue to do so. The goal has always been to provide interoperability that enables a range of benefits to a global community. We are providing open and freely available standards to anyone and everyone working with geospatial technologies in the IT sector. I believe that’s a positive message that many more people are willing to support and endorse.

V1 Magazine: When I talk to the engineering community they often speak about a shortage of skilled labor. The same is true in the utility community. To meet the challenge they point to a need to design more software and products that are smarter (more intelligent). This suggests a greater need for geospatial software and services to communicate. Where does OGC fit into this and evolving initiatives involving robotics, augmented reality and high-end modeling?

SR: OGC has provided the open standards that are breaking down the barriers between geospatial technologies and other information and communication technologies.

Standards enable the kind of convergence you mention, things like location-smart robotics and augmented reality. OGC standards “hide” a lot of complexity, a lot of details that had to be addressed before geospatial interoperability could become a reality. Today’s new software developers and entrepreneurs have all this to build on. Growth in our market has also helped create the impression, an accurate impression, that there is opportunity for developers in this space. There are also applications that appeal to a wide variety of interests, from games to sustainability to location-based marketing. 

We have a university working group and we attend academic conferences. Sometimes academic institutions are very conservative, and much of what passes for geospatial education and geography is still stuck in the age of old media, that is, print media and desktop GIS. But that is starting to change, and younger people are starting to drive this change, with their interest in things like open data, user-generated content, smart device applications and location-based social networks. It’s a very exciting time.
