
[Photo of Derek Gliddon © Hamish Jordan]

It’s a daunting task to synthesize and analyze global knowledge regarding the health of the environment. The United Nations Environment Programme’s World Conservation Monitoring Centre (UNEP-WCMC) in Cambridge, United Kingdom, has a dedicated staff of more than 60 people who work to bring together data on global biodiversity and conservation to guide policy. A primary data product of UNEP-WCMC is the World Database on Protected Areas. V1 editor Matt Ball spoke with Derek Gliddon, head of the Informatics Programme at UNEP-WCMC, about the goals and challenges of the organization.

V1: One of your core missions is to provide biodiversity data across the globe. How does the World Database on Protected Areas help fulfil that mission?

Gliddon: “Protected Areas” is a broad-brush term that encompasses many different types of management regime and management objective. Protected areas may be managed for their conservation, recreational, landscape, cultural or ecosystem-service (e.g., rainfall capture) value. Many protected areas include a habitat conservation function.

Clearly protected areas provide an important function, but to evaluate their efficacy we must consider their geographic distribution and spatial connectivity, the status of the species and ecosystems they contain, and the threats they are exposed to, at national, regional and global scales.

From the perspective of biodiversity conservation, protected areas have multiple functions. They may protect the habitat and ecosystems necessary to sustain viable populations of species found only in that location, they often provide staging posts for migratory species, and they may be managed within the framework of a protected area network that ensures healthy genetic variability.

The World Database on Protected Areas, or WDPA, is a global collation of GIS datasets that locate and delineate areas managed for some type of conservation purpose. The WDPA is referred to as a “foundation dataset” because it provides the basis for a vast array of different types of conservation assessments. It represents the global status of human effort to manage our planet for conservation purposes. The WDPA is used not only for assessments of wildlife species per se but increasingly of the “ecosystem services” they provide. For example, our Centre has recently completed a global assessment of the carbon stored in protected forests, and thus of the value of protected areas in storing carbon. Other studies look at the efficacy of protected area networks in providing corridors for species to relocate as an adaptation mechanism against a changing global climate.

The WDPA is an essential tool in conserving global biodiversity. However, to tell the full story we must also understand threats, the status of species and the location of areas of biodiversity importance that are not necessarily protected. To address this challenge we are involved in the Integrated Biodiversity Assessment Tool (IBAT). IBAT brings together, for the first time in an online environment, the WDPA and other foundation datasets, such as Key Biodiversity Areas (KBA) from BirdLife International and Conservation International, overlaying these global datasets in a single web map. Together these datasets provide the most authoritative and comprehensive view of the state of global biodiversity conservation efforts.
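
To make the overlay idea concrete, here is a minimal sketch in Python using the geopandas library of the kind of intersection such a tool performs between the WDPA and Key Biodiversity Areas. The file names, the kba_id field and the choice of projection are illustrative assumptions, not IBAT's actual implementation:

    # Illustrative only: overlay KBAs with WDPA polygons to estimate how much
    # of each Key Biodiversity Area falls inside a protected area.
    # File names and the "kba_id" field are hypothetical.
    import geopandas as gpd

    wdpa = gpd.read_file("wdpa_polygons.shp")           # protected area boundaries
    kba = gpd.read_file("key_biodiversity_areas.shp")   # KBA boundaries

    # Use an equal-area projection so area calculations are meaningful.
    wdpa = wdpa.to_crs("ESRI:54009")                    # Mollweide, for example
    kba = kba.to_crs("ESRI:54009")

    # Intersect the layers, then total the protected portion of each KBA.
    overlap = gpd.overlay(kba, wdpa, how="intersection")
    overlap["prot_area"] = overlap.geometry.area
    protected_by_kba = overlap.groupby("kba_id")["prot_area"].sum()

    kba["pct_protected"] = (
        100 * kba["kba_id"].map(protected_by_kba).fillna(0) / kba.geometry.area
    )
    print(kba[["kba_id", "pct_protected"]].head())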

V1: How do you go about collecting and synthesizing data at a global scale?

Gliddon: We’re looking at global trends and the status of ecosystems globally. We don’t have people going out into the field; we’re operating at a higher level of abstraction. We standardize and then aggregate data from a global network of governmental, regional and NGO partners. Developing this global network and facilitating the supply of protected area data has taken over two decades of effort and investment.

The WDPA was developed in conjunction with the International Union for Conservation of Nature (IUCN), specifically their World Commission on Protected Areas (WCPA). The WDPA is operated as a joint effort between the Protected Areas (PA) Programme and the Informatics Programme: Informatics builds the system and performs the back-end GIS processes, whilst the PA Programme undertakes the outreach activities with data providers and generates various reports to inform global policy decisions. The WDPA has a dedicated content manager who is engaged full time in developing and supporting the network of data providers.

The Internet and online data-sharing technologies are profoundly important to us; they allow us to bring data in from around the world, add value to it, and then republish it online in the form of maps or syntheses. Web mashups are extremely important to us and to our community, and that’s primarily why we’ve developed the Informatics Programme, which is now the biggest team within the Centre.

V1: How important is geospatial technology in the informatics efforts?

Gliddon: Profoundly. The spatial framework is the basis for global data integration. We’ve been involved with GI technologies from the outset. I remember first visiting the Centre almost twenty years ago, and even then it was already an established leader in GIS-based conservation data integration.

In recent years the whole GI world has been shaken up by Google Earth, ArcGIS Explorer, Virtual Earth and other exploration systems. They’ve had a tremendous effect in mainstreaming GI data. The virtual globe is such a beautiful interface, the user experience is really intuitive, and the increasing ease with which data and analytical mashups can be undertaken is really extraordinary. I believe the real power of web services and mashups has yet to be fully appreciated.

V1: You’re increasingly becoming a platform enabler in a sense, not necessarily delivering all the data, but acting as a hub for a lot of these mashup activities?

Gliddon: That’s right. It’s an interesting area because it’s potentially quite competitive. It’s becoming ever easier for people to create mashups, but you need authoritative organizations, such as the UN Environment Programme, to provide a credible focal point for these mashups and for delivering the output information products to global policy decision makers.

V1: Who submits data to you?

Gliddon: The UN Convention on Biological Diversity (CBD) has set a target to establish ten percent of territory as protected. Signatory countries are obliged under the convention to periodically report their progress toward achieving that goal. Consequently, almost all countries periodically submit lists of their protected areas to us. In parallel, however, we also have information provided by local, regional and international conservation NGOs. In many countries the NGO community is able to provide more detailed information.

V1: How do you bring all the disparate data together?

Gliddon: We have recently reengineered the WDPA Management System. The new tool was formally launched in early October. The WDPA Management System has a web interface for contributors to submit their data and for online viewing, searching and data download. Once data is submitted, it is standardized, quality-checked and published.

We use ESRI technologies. ESRI have been tremendous, long-term supporters of our work; they have donated their GIS software to the Centre for many years.

Collating WDPA data has been a challenge for many years, and until recently it was a manual task because the data came in numerous formats, both analog and digital. Digital submissions came in different schemas and used different coding standards. It was a huge task.

In a utopian world everybody would be collecting biodiversity data according to globally accepted standards, and it would be easy to collate. But in the real world, most of the data out there was not collected for global integration; it was collected for in-house or local government purposes, using different methodologies and different storage schemas.

To standardize and quality-assure WDPA data submissions we rely on Safe Software’s FME, which is at the core of the WDPA. Safe Software donated their software to us, and it has allowed us to dramatically increase our throughput and the quality of the WDPA data. Conceptually, what FME does is quite straightforward: you define a translation between an incoming format/schema and an output format/schema. Typically, individual organizations submit their data in the same format each time, so we save the transformation parameters for each translation and reapply them in subsequent passes.

There’s a bit of an overhead the first time through, but subsequently it’s a relatively straightforward process that can be largely automated. This means that people can now upload their data directly through the WDPA website; the data goes into a repository, the upload is registered, and it is then passed through the FME translator.
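
Conceptually, the translate-once-and-reapply workflow looks something like the following Python sketch. This is not FME itself, and the field names and recoding rules are hypothetical; it simply illustrates how saved transformation parameters can be reapplied to subsequent submissions from the same provider:

    # Conceptual sketch of the translate-once, reapply-often idea described above.
    # This is not FME; the mapping format and field names are hypothetical.
    import json

    # A saved "translation": source-field -> WDPA-field mapping plus value recodes.
    translation = {
        "field_map": {"SITE_NAME": "name", "DESIG_TYPE": "designation", "IUCN": "iucn_cat"},
        "recode": {"iucn_cat": {"1": "Ia", "2": "II"}},
    }

    def apply_translation(record: dict, t: dict) -> dict:
        """Map one submitted record into the standard output schema."""
        out = {dst: record.get(src) for src, dst in t["field_map"].items()}
        for fld, codes in t["recode"].items():
            out[fld] = codes.get(str(out.get(fld)), out.get(fld))
        return out

    # Save the parameters the first time a provider submits ...
    with open("provider_xyz_translation.json", "w") as f:
        json.dump(translation, f)

    # ... and reapply them automatically on subsequent submissions.
    submission = {"SITE_NAME": "Cairngorms", "DESIG_TYPE": "National Park", "IUCN": "2"}
    print(apply_translation(submission, translation))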

V1: How customized is your translation process?

Gliddon: We’ve developed sophisticated quality-control mechanisms using FME and ESRI’s geoprocessing tools. One of the challenges we have to deal with is determining whether a PA submission is already known to the database. We also have to deal with alternate boundary definitions for a single protected area; often an NGO’s definition of a PA boundary differs from that submitted by the national government.

We have developed twelve automated quality checks, and some of these are complex. After automated checking, we review what the validation rules have identified and deal with any issues. Some errors can be corrected automatically, but some are subjective and require human expertise. We refer any problems back to the data provider so that they can improve their datasets.
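
As a rough illustration of how such a rule set can separate automatically correctable errors from those needing human review, here is a hedged Python sketch; the rule names and record fields are invented, not the WDPA's actual twelve checks:

    # Illustrative validation pass: each rule flags an issue and says whether it
    # can be fixed automatically or must be referred back to the data provider.
    from dataclasses import dataclass
    from typing import Callable, Optional

    @dataclass
    class Issue:
        rule: str
        message: str
        auto_fix: Optional[Callable[[dict], dict]] = None  # None => needs a human

    def check_missing_iucn(rec: dict) -> Optional[Issue]:
        if not rec.get("iucn_cat"):
            return Issue("missing_iucn", "IUCN category absent",
                         auto_fix=lambda r: {**r, "iucn_cat": "Not Reported"})
        return None

    def check_possible_duplicate(rec: dict, known_names: set) -> Optional[Issue]:
        if rec.get("name") in known_names:
            return Issue("possible_duplicate",
                         f"{rec['name']} may already exist; boundary review needed")
        return None

    def run_checks(rec: dict, known_names: set) -> tuple:
        referred = []
        for issue in filter(None, [check_missing_iucn(rec),
                                   check_possible_duplicate(rec, known_names)]):
            if issue.auto_fix:
                rec = issue.auto_fix(rec)   # correctable automatically
            else:
                referred.append(issue)      # refer back to the provider
        return rec, referred

    record, problems = run_checks({"name": "Serengeti"}, {"Serengeti"})
    print(record, [p.message for p in problems])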

V1: That’s impressive. Do you foresee people using an online tool for interaction and manipulation down the road that might resolve some of these data integration issues?

Gliddon: We are already exploring these avenues. We are involved in some pilot work with the government of Brazil, which wants to use web services to link its own PA management systems with the WDPA. Some of our corporate partners are keen to use the WDPA as a web service so it can be integrated directly into their workflows.
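
In the simplest case, such a link might look like a national system POSTing an update to a central endpoint. Everything in the sketch below, including the URL, the payload schema and the absence of authentication, is a hypothetical assumption rather than an actual WDPA API:

    # Hypothetical sketch: a national PA management system pushing a boundary
    # update over HTTP. The endpoint and payload schema are invented.
    import json
    import urllib.request

    payload = {
        "site_id": "BRA-0001",                            # invented identifier
        "name": "Parque Nacional Hipotetico",
        "geometry": {                                     # GeoJSON polygon
            "type": "Polygon",
            "coordinates": [[[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 0.0]]],
        },
    }

    req = urllib.request.Request(
        "https://wdpa.example.org/api/submissions",       # hypothetical URL
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.status, resp.read())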

V1: There was some talk on your website regarding moving toward real-time data, and real-time remotely sensed imagery. Are there efforts to make the data much more current?

Gliddon: In terms of imagery and other data, we’re working with the Global Earth Observation System of Systems (GEOSS) effort. There is a global movement to develop an Internet architecture for sharing remotely sensed imagery and data from ground-sensing devices among a whole raft of environmental institutes. The architecture addresses nine subject areas, one of which is biodiversity; this is referred to as the Group on Earth Observations Biodiversity Observation Network (GEO-BON), and that’s something we’re involved in.

We are also linking with the Global Biodiversity Information Facility (GBIF). GBIF has developed an impressive distributed biodiversity data architecture; they aim to provide online access to over a billion specimen records.

Another dimension that we’re looking at with imagery is assessing the management effectiveness of protected areas, i.e., looking at an area to see whether the management regime in place is effectively protecting it. Around the world there are many areas that are designated as protected but in fact receive little or inadequate active protection. We need to monitor land cover and land use over time to assess the efficacy of management actions. Real-time imagery also provides access to important PA management information, such as the location of forest fires.

V1: There are some limitations with GIS in terms of tracking change over time. Is that something that you’re pushing, to be able to add a time element to your data set?

Gliddon: We are using ESRI’s enterprise ArcSDE and have built versioning and history into our architecture, so we can track changes in protected areas over time. We track changes at the level of individual records and capture metadata for every submission and every change made to the data.
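
The effect of that versioning, independent of the specific ArcSDE mechanics, can be sketched as an append-only history with per-change metadata; the class and field names below are illustrative assumptions:

    # Illustrative only: an append-only version history with metadata captured
    # for every change, sketching the behaviour described above.
    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import Optional

    @dataclass
    class Version:
        geometry_wkt: str      # the boundary as submitted at this point in time
        submitted_by: str      # provider name, captured as metadata
        submitted_at: datetime = field(
            default_factory=lambda: datetime.now(timezone.utc)
        )

    class ProtectedAreaHistory:
        """Changes are appended as new versions; nothing is overwritten."""

        def __init__(self, site_id: str):
            self.site_id = site_id
            self.versions: list = []

        def submit(self, geometry_wkt: str, provider: str) -> None:
            self.versions.append(Version(geometry_wkt, provider))

        def as_of(self, when: datetime) -> Optional[Version]:
            """Reconstruct the boundary as it stood at a given moment."""
            past = [v for v in self.versions if v.submitted_at <= when]
            return past[-1] if past else None

    pa = ProtectedAreaHistory("WDPA-12345")   # invented identifier
    pa.submit("POLYGON((0 0, 1 0, 1 1, 0 0))", "National Government")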

V1: I’m fascinated by the level of integration and partnership that you have, and your role as the authoritative data source.

Gliddon: The key word is partnership; that’s something we are very keen on. The international environmental NGO community is not awash with money, and there’s a strong association between datasets and organisational branding. With data mashups there is a risk of losing brand identity.

Our own protected areas data is a good example. It’s an expensive process to maintain these datasets, and it’s becoming increasingly easy for third parties to mash up the data and get credit for the value-added end products. But the back-end processes have to take place to bring these global datasets together and keep them actively maintained. That consumes a lot of resources, and organizations are understandably keen to make sure their contribution is acknowledged.

In our push to ensure that ever more environmental data is put into the public domain, where it can be put to better or more widespread use, we have recently taken on the secretariat of the Conservation Commons. The Conservation Commons is an organization based on three principles, which organizations endorse when they join: that conservation data held by members will be freely and openly shared; that any data used is properly credited; and that there is fair and equitable use of the data.

Data sharing is a perpetual challenge for organizations such as ourselves. We have limited funding, yet our objective is to facilitate free and open data sharing. There is a cost associated with doing the work, but there’s little revenue arriving. We have to form partnerships with industries that value these datasets and wish to contribute to their maintenance.

V1: All the rigorous background data handling that you’re doing seems to justify investments in the service that you provide. Are there other missions outside of being the most trusted source for this data?

Gliddon: We are involved in many initiatives; for example, we operate, under contract, databases on the trade in endangered species. A major element of the Centre’s work is providing scientifically robust consultancy services to support conservation policy. On the Informatics side we are focusing ever more on the development of standards to facilitate the exchange of biodiversity data. Biodiversity is a complex topic, and data standards are not generally well developed.

GBIF have led some excellent work in developing standards in biological taxonomy. Species information is very complex because the multiple biological classification schemes in use are inconsistent and individual species are constantly being reclassified. A species can be reclassified in one scheme and not in another, and making species databases interoperable is a major challenge. GBIF and the Taxonomic Databases Working Group (TDWG) have done exceptional work in that area.

V1: You’ve opened my eyes to some of the many problems of a global data set. And, a lot of these revolve around the data aggregation work that needs to take place.

Gliddon: In many sectors, but in the biodiversity sector especially, data is not collected against international data standards because they don’t exist. A lot of this data comes out of the “publish-or-perish” academic community, and there are some pretty major challenges to data sharing still to be overcome there.

In the United States, the Federal Geographic Data Committee did a good job in mandating the capture of metadata for federally funded data collection activities. Something similar needs to be done in the biodiversity community; this is really what the GEO-BON concept is about: getting governments to mandate the cataloguing of biodiversity metadata. Environmental pressures require these data to be in the public domain, not locked away on some researcher’s “C-drive” or in a corporate EIA library, which is still too often what happens.
