August 27th, 2008
An INSPIRE Agenda for Lisbon


Motivational speaker Daniel Burrus spoke at the Intergraph Conference a few years ago. He said: “Time is the currency of the 90s”. If that was the case then, it is certainly the case now. So how do you make time? The answer is simple. It comes from an old quality mantra: ‘do it right the first time’. With INSPIRE we have an opportunity to do it right. But do what? I will argue that we have to define a supply chain for the public sector and the private sector to identify where open source and chargeable value-added activities can exist side by side to create a geospatial knowledge community in Europe. For the industry this places geospatial at the centre of an informatics discipline based on the use of geometrical mathematics. I’ll explain using some examples, primarily from the UK.

 

The End Users
To start with, I will take up the challenge laid down in Lars Brodersen’s column on identifying the end users for INSPIRE. I can do no better than to begin by quoting from a strategy document from the UK’s Network Rail organisation. This statement emerges under another quality mantra, ‘opportunities for improvement’:

“We will improve and enforce data standards. This would simplify data analysis, improve data quality and reduce the cost of future IT systems. This is especially important in respect of location data, which is the means of achieving commonality between data sets. Currently, data cannot be readily transferred between systems and data is thus only available to the users of each  individual system. An improved data dictionary is needed, together with a business process, which enforces its use. Technology improvements will also play a key role, providing data validation on entry and automating the updating of our systems using, for example, hand held devices.”

{sidebar id=209 align=left}Let’s traverse further end user concerns to ensure that we have identified a number of business processes that are suffering from a lack of time (because it’s not being done right first time) to achieve interoperability and re-use.

I am a solicitor with the legal right to advise and sign off on a property exchange. As part of the process I have to obtain the deeds and a plan of the property. This is public sector information (PSI). In many countries in the EU this isn’t even available in digital form. I might have to drive to a central records office to verify the property of interest on a plan (which might even be on linen rather than paper). I have to enquire what planning applications have been made which may affect the property, in value or otherwise; similarly, I need to find out what legal covenants exist for the property, for example which utility has the responsibility for supply of services. Some of these data are digital. In fact the UK Government would like to think they are all digital under the electronic and transforming government agendas (these are like Soviet five-year plans, so have to be declared a success). They aren’t, but more of that later.

I am the GIS technician responsible for managing GI data in a municipality. Among my other roles, I interact with the process above. In order to be a Level III municipality (and gain additional central government funding) I need to pass information to other organisations and to the citizen electronically. I can do this Soviet style by putting up a portal.

However, although trained as a geographer rather than an information scientist, I understand intuitively that there are problems with the data flow. I understand this because the data supplied by the National Mapping & Charting Agencies (NMCAs) carry a unique reference number (in Great Britain this is known as a TOID™). I maintain a gazetteer of addresses as part of the municipality’s management of its property asset base according to the BS7666 address standard. This gives a series of unique references to properties, known as BLPUs (basic land & property units).

I know the two have to be matched, so that entities using the information know they are accessing the right information. I have some understanding of time series. I know that the NMCA data comes as a set of changes and that these have to be matched to the BLPU changes. We send these to a central co-ordinating body for the municipalities, which creates another reference within a national land and property gazetteer (NLPG). We carried out this exercise three years ago and for a period of 12 months were able to automate the requests for information for property searches. This was a real Level III application and saved £100,000 per annum.

Some of the changes are geometric (as a result of real-world change, or of error correction, including positional accuracy changes due to GPS). Some of the changes are attributional (change of use, description, correction etc.). However, I don’t have the information systems framework to manage the changes automatically across the reference sets [1]. Over the three years the data have negatively mutated, such that nearly a third of all property search requests have to be checked manually to ensure that the right property is identified. The savings per annum are down to £66,000 and will decline even further.

I am the Director of a State Mapping Agency in Germany. I have responsibility for producing several map series from 1:10,000 upwards. Like all NMCAs I suffer from the time it takes to produce these products. My customers suffer because when I produce a new product at one scale it differs from the other series out there in the market. This is confusing for decision-makers. I want to produce all series from a common, time-stamped detailed scale. I have a new state-of-the-art data model (AAA) and an opportunity to collect data of a quality sufficient to generalise from the detailed scale. I need to get this right from the start to satisfy my end customers.

These are some of Lars’s end users and the problems they wrestle with. They can be turned into use cases, which are missing from the INSPIRE activities. So let’s look at how INSPIRE in general, and data quality approaches in particular, will assist the end users. But before I do this, let’s consider the wider geopolitical dimension.

It seems like a hundred years ago now, but back in 1977 I worked as a scientist in Yorkshire Water. The EC was implementing a set of water directives. It did this because it is easier to harmonise and build consensus in the environmental domain than in the legal and constitutional arena; the difficulty in agreeing the Lisbon Treaty bears this out. It is therefore no accident that the INSPIRE directive is being implemented within the environmental agenda.

However, there is another dimension to the geopolitical context. Europe, again in Lisbon, created an agenda for a knowledge economy, known not unnaturally as the Lisbon Agenda. Now this is really important. OK, it’s not happening as fast as we would like, but it is, or should be, a force for harmonisation. There is no precedent for a hegemony (for that is what the EU is) being based on a knowledge economy, but then Europe is really beyond the smokestack age, and if its citizens are to continue to enjoy a certain lifestyle we have to have an edge. Aside from this, why would we want to continue doing things 27 times over? I hope by now you can see where this is going: it is the justification for INSPIRE. This does not yet satisfy the end users, so let’s return to them and their issues.

Delivering the End User Benefits with Informatics
There is a set of problems for the end user, as I have described. The current approaches on INSPIRE are indeed technological, and the approach is dominated by a set of vested interests, including the NMCA community. Be that as it may, in Europe we have an advanced set of geospatial models that can play a key role in building the knowledge economy. We should not seek to dumb them down or lose them. So INSPIRE does provide an opportunity to put in place an information systems framework, one that will work if the end user gets something for free (it solves problems they can’t readily solve now) and it does not become another burdensome requirement. What does this framework look like?

The problems described above are about process improvements, lifecycle events and workflow. Many organisations are now focusing on and investing in data management solutions that leverage existing investments to ensure high-quality data for their businesses. This started in mainstream IT when Enterprise Resource Planning applications came into vogue in the mid-90s. In building these and adding on Customer Relationship Management (CRM) modules, data had to be migrated, and evidence started to build that the costs of poor data quality run into billions of dollars.

Furthermore, the migration activity fixes data quality at that point in time, but within a few months the data degrades unless the business processes are addressed. In the geospatial world, data quality around geometry can be fixed during the migration activity by using mathematics, or topology. Here our industry has a big advantage over the business intelligence space! We can use mathematics to automate the data quality position by taking advantage of that often-quoted statistic that 80% of all data has a location element. Since this can be subjected to mathematical treatment, it gives the basis for ensuring persistent data quality. So having fixed the data on the basis of geometry, persistent topology and associated rules act as a gatekeeper to maintain the quality automatically.
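To make the gatekeeper idea concrete, here is a minimal sketch (not 1Spatial’s implementation, and not any particular product’s API) of geometry acting as an automatic quality check: every incoming parcel update must satisfy a small set of topological rules before it is allowed into the reference data set. It assumes the Shapely library, and the threshold is illustrative only.

```python
# A minimal sketch: geometry as a quality "gatekeeper" on every update.
# Assumes the Shapely library; MIN_AREA is a hypothetical threshold.
from shapely.geometry import Polygon

MIN_AREA = 1.0  # hypothetical minimum plausible parcel area (square metres)

def passes_gatekeeper(candidate, neighbours):
    """Accept an update only if its geometry is clean and topologically consistent."""
    if not candidate.is_valid:        # rejects self-intersections, bad rings, etc.
        return False
    if candidate.area < MIN_AREA:     # sliver polygons usually indicate an error
        return False
    # Parcels may share boundaries but must not overlap their neighbours.
    return all(not candidate.overlaps(n) for n in neighbours)

# Example: an update overlapping an existing parcel is rejected automatically.
existing = [Polygon([(0, 0), (10, 0), (10, 10), (0, 10)])]
update = Polygon([(5, 5), (15, 5), (15, 15), (5, 15)])
print(passes_gatekeeper(update, existing))  # False
```

Because the check is mathematical rather than manual, it can run on every transaction, which is what keeps the quality persistent after the initial migration fix.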

Current approaches to transformational government are based on portals, which give access to information but do nothing to identify its quality or allow for maintenance. To achieve this, transactions need to be driven by what is termed master data (sometimes also referred to as master reference data): information about business objects such as products, customers, suppliers, regions and business rules, including associated structures and reporting hierarchies.

In the 90s it was believed that the difficulty of, and inability to, effectively integrate this data was the direct result of a lack of consistent metadata (definitions of field lengths, formats, types etc.). However, despite heavy investment in tools and significant efforts by many companies to improve their metadata management processes and the quality of their data, these efforts have not yet provided a complete solution. This is because having good metadata is only a small part of the solution. It is now recognised that the secret to better performance management and improved business insight lies in improving the quality of the master data on which those applications and processes rely. INSPIRE must not fall into the same trap.

Master data is the language of doing business: the business objects, definitions, classifications and terminology that describe business information, as well as the context for recording transaction data. In essence, optimum value, speed, certainty, safety, security and low total cost can only be achieved via agreed objectives, standard processes, communications, messages, data, identities, automatic identifiers, master data and dynamic data. A key element in this equation is the automatic assignment of meaningless reference numbers (automatic ids). We already have these, as they were identified in the problem statement above: TOIDs, BLPUs, the NLPG. These are owned by bodies that compete, and therefore a UK-wide solution for the automatic update of changes across the supply chain is missing. To avoid this situation arising in the EU, the JRC, using INSPIRE, needs to define an overall framework: a supply chain embodying master data management principles within which business benefits get delivered and the end user can see where they benefit.
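As a rough illustration of what a master record built on “meaningless” reference numbers looks like, the sketch below keeps a surrogate key of its own and simply cross-references the identifiers that each body already assigns (TOID, BLPU, NLPG reference), recording every change against it. The field names and change format are hypothetical, not any body’s actual schema.

```python
# A minimal sketch of master data held against a meaningless surrogate key.
# Field names and the change-set format are hypothetical.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class MasterPropertyRecord:
    master_id: int                       # meaningless surrogate key, never reused
    toid: Optional[str] = None           # NMCA topographic object reference
    blpu: Optional[str] = None           # municipal gazetteer reference
    nlpg: Optional[str] = None           # national gazetteer reference
    history: list = field(default_factory=list)

    def apply_change(self, change):
        """Apply a supplier change set while keeping every linked id in step."""
        self.history.append(change)      # keep the full lifecycle, not just the latest state
        for key in ("toid", "blpu", "nlpg"):
            if key in change:
                setattr(self, key, change[key])

record = MasterPropertyRecord(master_id=1, toid="osgb1000012345678")
record.apply_change({"blpu": "BLPU-000042", "reason": "gazetteer match"})
print(record.toid, record.blpu, len(record.history))
```

The point is not the code but the discipline: changes flow through one agreed record, so the municipality, the NMCA and the national gazetteer stay in step instead of drifting apart as in the property search example above.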

{sidebar id=204}

The geospatial community has been aggregating data for a long time without making advances, because there is a lack of orchestration of events when data changes. These business processes can now be implemented because of advances in orchestration and workflow technologies, which have been standardised in the Web Services Business Process Execution Language (BPEL). BPEL is an XML grammar for ‘programming in the large’ and provides central orchestration of distributed components. Overall business process and workflow is implemented through a two-tier system, as indicated in the diagram above:

• Large-scale process implementation. This level uses BPEL to control high-level transitions in abstract processes.
• Small-scale business logic implementation. This level is implemented by rules-processing engines.
BPEL decouples service implementation from processes using WSDL/SOAP.
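The separation of the two tiers can be sketched conceptually as follows. BPEL itself is an XML grammar executed by an orchestration engine; the Python stand-in below is only an illustration of the split, with all function names hypothetical: the top tier sequences abstract process steps, while the bottom tier is a rules engine applied to each object.

```python
# Conceptual sketch only: the two-tier split, not BPEL itself.

def rules_engine(feature, rules):
    """Small-scale business logic: evaluate every rule against one feature."""
    return all(rule(feature) for rule in rules)

def large_scale_process(change_set, rules):
    """Large-scale orchestration: route each change to conform/repair queues."""
    outcome = {"conformant": [], "non_conformant": []}
    for feature in change_set:
        bucket = "conformant" if rules_engine(feature, rules) else "non_conformant"
        outcome[bucket].append(feature)
    return outcome

rules = [lambda f: f.get("use_class") is not None,   # attribute must be present
         lambda f: f.get("area", 0) > 0]             # derived attribute must be positive
print(large_scale_process([{"use_class": "residential", "area": 120.0},
                           {"area": -1}], rules))
```

In a real deployment the orchestration tier would be the BPEL process and the rules tier would sit behind a WSDL/SOAP service boundary, which is exactly the decoupling referred to above.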

Rules-based processing is by no means new in the geospatial world. Indeed, it is not new in the EC. RAISE (Rigorous Approach to Industrial Software Engineering) was developed as part of the European ESPRIT II LaCoS project in the 1990s. It consists of a set of tools based around a specification language (RSL) for software development. It has been adopted by UNU-IIST in Macau, so our rules-based community already stretches outside the EU. Bjørner [2] describes its application to the rail network domain. An example showing both the common-language description and the mathematical statement of the rule is shown below:

{sidebar id=208}

Radius Studio from 1Spatial is an example of a modern rules-based processing environment, implemented both as middleware and as a service, and used for business logic workflows as well as for domain discovery and conformance checking.

{sidebar id=205 align=right}
Radius Studio can be used to measure the quality of data in the sense of measuring the degree of conformance of the data to a rules base. A document or data store can be processed and a report generated with the results of the conformance test at both a summary and an individual feature level. The summary reports the proportion of compliant objects.
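The shape of such a conformance report can be sketched generically; the code below is not Radius Studio’s API, just an illustration of a summary giving the proportion of compliant objects together with a per-object list of the rules each non-compliant object infringed.

```python
# A generic sketch of a conformance report: summary plus per-object detail.
# Rule names, attributes and the report layout are illustrative only.

def conformance_report(objects, rules):
    """rules maps a rule name to a predicate taking one object."""
    non_conformant = []
    for obj in objects:
        broken = [name for name, rule in rules.items() if not rule(obj)]
        if broken:
            non_conformant.append({"id": obj["id"],
                                   "infringed_rules": broken,
                                   "envelope": obj.get("envelope")})
    compliant = len(objects) - len(non_conformant)
    return {"proportion_compliant": compliant / len(objects) if objects else 1.0,
            "non_conformant": non_conformant}

rules = {"has_address": lambda o: bool(o.get("address")),
         "positive_area": lambda o: o.get("area", 0) > 0}
data = [{"id": "A", "address": "1 High St", "area": 55.0},
        {"id": "B", "area": 0, "envelope": (0, 0, 10, 10)}]
print(conformance_report(data, rules))
```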

Additionally, a list of non-compliant objects is generated (as HTML or as an XML file, with object identifiers). The following output depicts the power of using geometry to correct alphanumeric data and allow fit-for-purpose data to be generated.

{sidebar id=206 align=right}

This consists of per-object metadata detailing which object infringed which rule. The rule identifier and text are included in each Object element, with any unique key attributes and the bounding box (envelope) of the checked feature’s geometry attribute. The detailed feature-level metadata contained within the Object elements can be used in manual reconciliation processes to locate and correct data. Of more interest is the deployment of action maps. An action map is an ordered list of (rule, action) pairs. When an action map is invoked, objects are checked against the rules. If an object does not conform to a rule, then the associated action is invoked. Action maps are often used to associate fixes with non-conforming objects. The new role is to engineer these rules and fixes as part of an INSPIRE-based knowledge community. It is time for the JRC to take advantage of the lead we have in the EU in this area, as a result of the ESPRIT funding, and create an ISO standard for a semantic rules language. This can form the basis of an informatics discipline driving the knowledge economy.
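The action-map idea itself is simple enough to show in a few lines. The sketch below is illustrative only (it is not 1Spatial’s rule syntax): an ordered list of (rule, action) pairs is walked for each object, and whenever a rule fails the associated action, typically an automatic fix, is applied.

```python
# A minimal sketch of an action map: ordered (rule, action) pairs.
# Rules and fixes here are illustrative, not a product's rule language.

def apply_action_map(objects, action_map):
    for obj in objects:
        for rule, action in action_map:    # order matters: an earlier fix may
            if not rule(obj):              # bring later rules into conformance
                action(obj)                # automatically

def uppercase_postcode(obj):
    obj["postcode"] = obj["postcode"].upper()

def clamp_negative_area(obj):
    obj["area"] = abs(obj["area"])

action_map = [
    (lambda o: o["postcode"].isupper(), uppercase_postcode),
    (lambda o: o["area"] >= 0, clamp_negative_area),
]

parcels = [{"postcode": "sw1a 1aa", "area": -42.0}]
apply_action_map(parcels, action_map)
print(parcels)  # [{'postcode': 'SW1A 1AA', 'area': 42.0}]
```

Engineering and sharing rules and fixes of this kind, rather than the code that executes them, is the activity a semantic rules standard would make portable across member states.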

Practical Steps to Define the Community
There is a need to define a framework for a community of users. To develop a community of users committed to knowledge management in Europe, components in the supply chain need to be open source. Why? Because the public and private sectors need to participate and identify where in the supply chain they can provide open source components, and where they can provide value-added components that support a business model to keep their own activities going. By its very nature this model will be different from today’s proprietary models. A candidate example is shown in Figure 4.

INSPIRE is at an early stage in developing harnesses to allow data specification testing. If this results in a bazaar rather than a cathedral of open source activity, then quality rules can be written once and exchanged amongst member states. Let’s hope it does, because the benefits are inestimable.

{sidebar id=207}

Conclusion
In conclusion, I present the benefits that can be obtained from geospatial data quality through persistent topology for one of the end user examples I cited at the start of the article. 1Spatial started this process in 2001, using a combination of topology to maintain the geometric data and business rules to maintain attribute information in a cadastral application. As a result of building the rules and implementing topology for the map base, the following types of benefits become evident once the gains begin to outstrip the overheads involved in the project:

1. Solicitors no longer have to visit the Registry offices to confirm the property they are conveyancing is correctly identified.
2. Where registry offices are dispersed geographically, work can now be switched to wherever the workload exists, increasing the flexibility of the organisation’s response.
3. The organisation becomes more dynamic with focus on improving the business processes and rules rather than on mundane administrative tasks.

Michael Sanderson is CEO at 1Spatial. For more information: www.1spatial.com

 

References

[1] Burrus, D., TechnoTrends (1993). ISBN 0-88730-700-0

[2] Bjørner, D. (2005). A CloverLeaf of Software Engineering. Proceedings of the Third IEEE International Conference on Software Engineering and Formal Methods (SEFM’05). ISBN 0-7695-2435-4
