Requirements on Geographic Information Systems (GIS) were originally focused on data capture and mapping functionality. Current analyses state clearly that about 80% of all technical business processes are related to geospatial questions. An intelligent Pipeline Integrity Management System (PIMS) is therefore more than reasonable for pipeline operators, and this article introduces a GIS- and geodata-based PIMS that makes it possible to establish such a system.
Introduction
Technical integrity can be demonstrated by collecting all essential influences on the system for each pipe segment. This information is evaluated using a standardised concept: an analysis of reliability is performed applying the criterion of failure probabilities in the technical condition analysis (TCA). In this way pipeline operators are able to detect weaknesses of a line or loop and to take action to ensure its proper condition. Existing GIS systems with their existing objects, as well as objects from a newly developed database, can be used for the assessment.
The data model is available “out of the box” and can be configured and extended if necessary. The collected data can be transferred to the PIMS via standard interfaces such as CSV or XML, or via other formats. Within the PIMS the data can be analysed and presented; failure probabilities and renovation measures can be managed on the basis of these analyses and of parameter studies.
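As an illustration of such a standard interface, the following minimal Python sketch reads pipe segment attributes from a hypothetical CSV export. The file name and column names are assumptions made for this example, not part of the described product.

```python
import csv

def read_pipeline_csv(path):
    """Read pipe segment attributes from a CSV export (illustrative only).

    Assumed columns: segment_id, from_km, to_km, outside_diameter_mm,
    wall_thickness_mm, year_of_construction.
    """
    segments = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            segments.append({
                "segment_id": row["segment_id"],
                "from_km": float(row["from_km"]),
                "to_km": float(row["to_km"]),
                "outside_diameter_mm": float(row["outside_diameter_mm"]),
                "wall_thickness_mm": float(row["wall_thickness_mm"]),
                "year_of_construction": int(row["year_of_construction"]),
            })
    return segments

# Example usage: segments = read_pipeline_csv("pipeline_export.csv")
```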
Finally, as a state-of-the-art PIMS, it is not bound to one particular Geographic Information System such as ESRI’s ArcGIS, GE Smallworld GIS, Bentley MicroStation or Intergraph. Built on IT standards such as Oracle, Java and XML, it is platform neutral. All in all, the software is an elaborate solution which focuses on pipeline integrity using data that is already available.
Gas companies are obliged to keep their pipelines and facilities in correct and sound condition according to safety regulations. This guarantees the necessary standard of stability and security of supply at all times. In general, it can be expected that the pipeline achieves its highest quality level after construction and performance of the pressure test.
Integrity of a pipeline
This quality level is essentially anchored in the body of regulations and in the quality requirements for the construction of the pipeline. Internal and external influences during the operating period can reduce the quality level of the pipeline. To maintain the holistic integrity of a pipeline, all elements of safe operation are to be considered in addition to the technical aspects. The integrity of a pipeline is defined as follows: assurance of comprehensive structural integrity and operational reliability according to specifications, while achieving optimal cost effectiveness.
The pipeline has to comply with the requirements of technical integrity, and its operation, including maintenance, has to be performed in an economical, quality-assured and environmentally sound way while complying with all obligations. Process integrity guarantees that the operational environment meets the corresponding quality requirements, e.g. definition of tasks and competencies, training, risk awareness, information management, documentation and communication within the company as well as with other companies and the scientific community, evaluation of success, and descriptions of and links to operational processes.
If significant findings indicate that the integrity required by the specifications is in question, this has to be pursued. Such findings can occur in the technical field (e.g. metal loss) or in procedures (e.g. communication deficiencies).
The reaction to this information can be as follows:
1 – Need for action – The experience of the personnel or the body of regulations necessitates immediate action (e.g. third-party impact with significant damage to the pipeline, or unclear communication in certain special cases).
2 – No need for action – An impairment of the pipeline is identifiable, but the experience of the operating personnel allows it to be safely assessed (e.g. slight displacement of a pipeline under a surface load).
If the situation is at first unclear and cannot be evaluated according to points 1 or 2, further steps are necessary to identify and assess the condition:
3 – Application of assessment procedures – If, on the one hand, the findings cannot be classified as non-critical from the experience of the operating personnel, and, on the other hand, taking measures would imply economic disadvantages, it is recommended to apply recognised assessment procedures, which either confirm the technical or operational safety or call it into question.
In all three cases, the integrity of the pipeline system is ultimately restored. This is also the case when the results of the assessment indicate a medium-term need for action – checking and securing the integrity.
Pipeline Integrity Management Systems & Technical Condition Analysis: A General Overview
The technical integrity of a pipeline, meaning its correct and technically proper state in terms of safety, requires maintenance and repair, because its condition changes over time through third-party damage, corrosion etc.
The driving goal is that all safety-relevant processes are examined holistically and organised in a PIMS.
Technical integrity can be demonstrated by collecting all essential influences on the system for each pipe segment. This information is evaluated using a standardised concept: an analysis of reliability is performed applying the criterion of failure probabilities in the TCA. In this way pipeline operators are able to detect weaknesses of a line or loop and to take action to ensure its proper condition.
A PIMS enables pipeline operators to design and integrate specific business processes. Integrity management allows different departments and people to work together on the safety of a pipeline, and costs and efforts can be shared throughout the cycle. The basis for the cycle is data gathering: existing GIS and expert data are used and delivered to the evaluation software, and new data, e.g. survey or specialist data, can be added to the system at any time. A wide range of data and influencing factors can be used for the assessment.
A consistently modular architecture was designed to make the Eclipse-based software open and adjustable. The flexible data model makes it easy to work with increments and objects. The main goal of the whole framework is improved flexibility based on proven standards. As a result, all data models can be configured to fit the application for the assessment, and entirely new objects, functions and modules can be added.
Data gathering for the assessment
Data gathering is a must before performing a technical condition analysis or calculating the failure probability. It can be governed by a data-gathering directive to ensure that the data is correct. To improve orientation and display, it is also possible to integrate land register data or historical data. The software is an elaborate solution which focuses on pipeline integrity using both existing and newly collected data.
{sidebar id=188}
The process starts by collecting the relevant data about the pipeline to run an initial assessment. This includes, for example, pipeline attributes, operating data or inspection data. The leading system in this case is the one in which the pipeline operator keeps the data; this might be a GIS, but could also be a different database. The design focuses on a platform-neutral product, because data maintenance naturally differs from one company to another.
The advantage of using a GIS is that it is cost effective and that additional data from other systems, such as cathodic protection data or pressure changes, can be connected to the pipeline. The geographic background, such as cadastral data, is also available. Thanks to the introduction of CAD and GIS in the mid-1990s, most companies own detailed as-built documentation of their network. Operating data such as pressure or pressure changes can be handled via SCADA integration, and external influences such as land use can easily be captured in any existing GIS if not yet available.
The integration of inline inspection (ILI) and cathodic protection (CP) data is an essential aspect of such a PIMS framework. Important modules in state-of-the-art Pipeline Integrity Management Systems are spatially enabled add-ons for handling such data. The big advantage of this approach is that the data is stored in one central database, which makes the information accessible company-wide via GIS, the intranet or the internet, and enables several departments to work together collaboratively. Moved into the geographic context, this daily business data carries much more information value in a GIS than in a row-and-column application like Excel. Another benefit is the instant chart view, which shows the CP information along a selected pipeline.
{sidebar id=189}
When all required or estimated data needed for an assessment has been gathered, it is passed to the PIMS. In the next step, dynamic segments are generated. These may differ in length: the software derives the segment boundaries from specific pipeline properties. This is realised with an XML output, which essentially connects the database or GIS to the PIMS.
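The following Python sketch illustrates one possible way such dynamic segmentation could work, starting a new segment wherever the value of a relevant property changes. The attribute names and the data structure are assumptions for the illustration, not the product’s actual algorithm.

```python
def dynamic_segments(records, keys=("outside_diameter_mm", "wall_thickness_mm", "coating")):
    """Merge consecutive pipe records into dynamic segments.

    A new segment starts whenever one of the chosen properties changes,
    so segment lengths follow the pipeline data rather than a fixed grid.
    `records` is a list of dicts sorted by chainage with 'from_km' and 'to_km'.
    """
    segments = []
    for rec in records:
        props = tuple(rec.get(k) for k in keys)
        if segments and segments[-1]["props"] == props and segments[-1]["to_km"] == rec["from_km"]:
            segments[-1]["to_km"] = rec["to_km"]      # extend the current segment
        else:
            segments.append({"from_km": rec["from_km"], "to_km": rec["to_km"], "props": props})
    return segments
```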
Standard workflow of a PIMS
The standard workflow of the PIMS is modularly structured into the following steps:
{sidebar id=190}
Import of the geodata from GIS to PIMS
The PIMS manages all relevant pipeline and network data in a common structure. The basis is one (original) pipeline, plus versions of this pipeline with multiple alternatives as copies of the original. Each copy can in turn be copied, edited or deleted.
This is implemented especially for calculating assessments with different parameters for the same pipeline. The default data import is based on the XML standard, but specific imports from other platforms or systems can be implemented quickly thanks to the open and modular architecture of the PIMS. If needed, an import wizard guides the operator through the data import.
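A minimal sketch of such version handling, assuming a simple in-memory model; the class and field names are illustrative and not the PIMS data model:

```python
import copy
import datetime

class PipelineVersion:
    """Illustrative container for one pipeline alternative."""

    def __init__(self, name, segments, parameters):
        self.name = name
        self.segments = segments          # list of segment dicts
        self.parameters = parameters      # assessment parameters for this alternative
        self.created = datetime.datetime.now()

    def copy_as(self, new_name, **parameter_changes):
        """Create an editable copy of this version, e.g. for a parameter study."""
        clone = copy.deepcopy(self)
        clone.name = new_name
        clone.parameters.update(parameter_changes)
        clone.created = datetime.datetime.now()
        return clone

# Example: study = original.copy_as("original +10% pressure", design_pressure_bar=77)
```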
Data controlling
After choosing the version of the pipeline to work with, all pipeline-related data is displayed in a segmented view. This provides a unique way to present the data, making it possible to view all pipeline data on a single screen. It is also possible to edit individual values or all data in this part of the application, e.g. for one specific date or for a range of the pipeline.
{sidebar id=191}
Selecting and running an Assessment
There are several ways to run an assessment on the calculated segments. One option involves a ranking. The principle behind the ranking approach is to grade the pipeline according to its condition. Attributes which influence the grading can be determined beforehand.
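As an illustration of such a ranking based on a weighted grading, the following sketch uses hypothetical weights and attribute grades that are not taken from the described system:

```python
# Hypothetical weights per attribute; in practice they would be configured beforehand.
WEIGHTS = {"coating_condition": 0.4, "cp_effectiveness": 0.35, "third_party_exposure": 0.25}

def segment_score(scores):
    """Weighted grade for one segment; `scores` maps attribute -> grade (1 = best, 5 = worst)."""
    return sum(WEIGHTS[attr] * grade for attr, grade in scores.items())

def rank_segments(segments):
    """Return segments ordered from worst to best condition."""
    return sorted(segments, key=lambda s: segment_score(s["scores"]), reverse=True)
```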
Another option for the assessment is the use of formulas defined by the responsible department or the operating company. The software was designed so that companies can specify their own requirements.
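One simple way such a company-defined formula could be evaluated per segment is sketched below; the formula, variable names and numbers are invented for the example:

```python
def utilisation(segment, corrosion_allowance_mm=1.0):
    """Made-up example formula: required vs. remaining wall thickness."""
    remaining = segment["wall_thickness_mm"] - segment["metal_loss_mm"] - corrosion_allowance_mm
    return segment["required_wall_thickness_mm"] / remaining

# A segment with a utilisation above 1.0 would be flagged for closer inspection.
sample = {"wall_thickness_mm": 8.8, "metal_loss_mm": 1.5, "required_wall_thickness_mm": 5.6}
print(utilisation(sample))   # ~0.89 -> acceptable in this fictitious example
```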
The third possibility is to use probabilistic procedures. Additional statistics are used for this type of assessment, which ultimately leads to a statement on whether the integrity of the pipeline is given or not.
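A heavily simplified sketch of such a probabilistic check compares a per-segment failure probability against an acceptance threshold. The Monte Carlo corrosion model and the threshold value are purely illustrative assumptions, not the assessment modules of the described PIMS:

```python
import random

def failure_probability(segment, trials=100_000, corrosion_rate_mm_per_yr=(0.05, 0.02), horizon_yr=10):
    """Monte Carlo estimate: probability that metal loss exceeds the allowable wall loss."""
    mean, stddev = corrosion_rate_mm_per_yr
    allowable_loss = segment["wall_thickness_mm"] - segment["required_wall_thickness_mm"]
    failures = 0
    for _ in range(trials):
        loss = segment["metal_loss_mm"] + max(0.0, random.gauss(mean, stddev)) * horizon_yr
        if loss > allowable_loss:
            failures += 1
    return failures / trials

# Illustrative acceptance criterion: integrity is "given" if the probability stays below 1e-3.
seg = {"wall_thickness_mm": 8.8, "metal_loss_mm": 1.5, "required_wall_thickness_mm": 5.6}
print(failure_probability(seg) < 1e-3)
```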
All tools for analysis and presentation are based on the result file, which is created automatically by the PIMS after each assessment. The result files are not linked to the database or to the original data, so each result file reflects the condition at a specific time under specific conditions. This is very important in the context of internal or external audit requirements. It also means that parameters can be changed or the pipeline can even be deleted: the result of the calculation stored in the PIMS will still be available at any time until it is deleted itself.
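A minimal sketch of such a self-contained, time-stamped result file, assuming a simple JSON layout (the actual PIMS file format is not specified here):

```python
import json
import datetime

def write_result_file(path, pipeline_version, parameters, segments, results):
    """Store inputs and results together so the file stands on its own for audits."""
    snapshot = {
        "created": datetime.datetime.now().isoformat(),
        "pipeline_version": pipeline_version,
        "parameters": parameters,       # everything needed to reproduce the run
        "segments": segments,           # the input data as used for the assessment
        "results": results,             # per-segment assessment results
    }
    with open(path, "w", encoding="utf-8") as f:
        json.dump(snapshot, f, indent=2)
```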
Presentation of the assessment results
The simplest way to present the results of an analysis is a table view. In this view the PIMS places each kilometre of the pipeline on the x-axis (columns), while the specific pipeline data is presented on the y-axis (rows). The system provides the operator with all relevant information, e.g. error messages and the reason for missing results. The next figure shows a typical situation after an assessment in which two segments were not calculated.
The reason here is a missing outside diameter. The PIMS also provides the operator with a compact overview of all identified errors, which is very important for ensuring data quality. The last row of figure 5 presents the total result of the assessment, in this example based on a scoring model.
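The following sketch illustrates that kind of completeness check before tabulating the per-kilometre results; the required fields and messages are assumed for the example:

```python
REQUIRED_FIELDS = ("outside_diameter_mm", "wall_thickness_mm")

def check_segments(segments):
    """Split segments into calculable ones and those with missing mandatory data."""
    ok, errors = [], []
    for seg in segments:
        missing = [f for f in REQUIRED_FIELDS if seg.get(f) in (None, "")]
        if missing:
            errors.append((seg["segment_id"], "missing: " + ", ".join(missing)))
        else:
            ok.append(seg)
    return ok, errors

# The error list can then be shown as a compact overview next to the table view.
```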
{sidebar id=192}
The characteristic of the band view: each kilometre of the pipeline is plotted on the x-axis and the linked values on the y-axis. The band view is coupled with the table-based view, so the operator of the Pipeline Integrity Management System can jump between the two views in both directions with one mouse click.
{sidebar id=193}
Geographic view
So far there has been no connection between the results or diagrams and their geographic context. The approach introduced here makes this possible: the result diagrams can be displayed together with the geography, e.g. cadastral data, which can be included to support the view if available. The segments can be coloured thematically, so that the user can quickly identify the relevance of the data shown.
The PIMS operator can use thematic mapping functions to highlight critical segments on the pipeline, e.g. all segments with a depth of cover of less than 1.2 metres. All input data as well as the result data can be used for this thematic mapping. The geographic view and the band view are linked, so the spatial context in both views always corresponds. In addition, the geographic context makes it easier to compare results, and the diagrams make it easy to identify very high or very low values within the assessment.
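A small sketch of such a thematic classification, here colouring segments by depth of cover; the threshold matches the example above, everything else is assumed:

```python
def cover_colour(depth_of_cover_m, threshold_m=1.2):
    """Assign a display colour per segment for thematic mapping."""
    if depth_of_cover_m is None:
        return "grey"      # no data available
    return "red" if depth_of_cover_m < threshold_m else "green"

# Each segment would then be rendered in the geographic view with its colour:
# colours = {s["segment_id"]: cover_colour(s.get("depth_of_cover_m")) for s in segments}
```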
{sidebar id=194}
All result files can be converted into a format that can be used by MS Excel. For each pipe segment the PIMS creates its own row, and each input or result value is represented by a column. The exported data can be used for other business processes, for example the creation of expert reports.
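A minimal sketch of that kind of export, written here as CSV so that Excel can open it directly; the column layout is assumed for illustration:

```python
import csv

def export_for_excel(path, segments, results):
    """One row per segment, one column per input or result value."""
    fieldnames = sorted({k for row in segments for k in row} | {k for row in results for k in row})
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        for seg, res in zip(segments, results):
            writer.writerow({**seg, **res})
```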
Tracking the measures
The software enables companies to plan and control measures for renewals and construction work. This knowledge can be used to interpret the assessment results: the results are displayed together with the measures, so that very high failure probabilities can be put into perspective. However, the data must be kept as up to date as possible.
It is important to know which measures were planned but not yet carried out, as well as which measures have already been implemented but not yet updated in the documentation. This knowledge helps to interpret the assessment results.
There are several possibilities for keeping the data up to date and for sharing the work between departments. A mobile system, for example, can be used to reduce the effort because the data is entered directly in the field. The PIMS is able to save all results in a file format that is used in the same way by all implemented assessment modules.
In this context the result file contains the results of the assessment as well as all data used for the assessment. With this functionality a utility company can meet every requirement linked to internal or external audit demands. The result files are separate from the core database and can be used as evidence.
Conclusion
Each pipeline has to comply with the requirements of technical integrity, and its operation, including maintenance, has to be performed in an economical, quality-assured and environmentally sound way while complying with all obligations. Process integrity guarantees that the operational environment meets the corresponding quality requirements, e.g. definition of tasks and competencies, training, risk awareness, information management, documentation and communication within the company as well as with other companies and the scientific community, evaluation of success, and descriptions of and links to operational processes.
The technical integrity of a pipeline, meaning its proper and technically safe condition, requires knowledge about the pipeline and its maintenance. The condition changes continuously due to corrosion or third-party impact. To organise the processes that support safety, it is necessary to implement a Pipeline Integrity Management System. Technical integrity can be demonstrated by collecting, for each pipe segment, all essential influences on the system, data which is available at every pipeline operator.
All this information is evaluated using a standardised concept: an analysis of reliability is performed applying the criterion of failure probabilities in the technical condition analysis. Using a GIS- and geodata-powered PIMS, the integrity of the pipeline system is ensured. In this way pipeline operators are able to detect weaknesses in a line or loop and to take action to ensure safety.
————————————————————-
Authors
Sebastian Pache, GEOMAGIC GmbH, Germany ([email protected])
Sebastian Pache majored in geography and geoinformatics at the Westfälische Wilhelms-Universität in Münster, Germany. He gained his diploma with the thesis “Geoinformationen und Immobilienmarketing” at a real estate company in Berlin. After a short excursion into the world of automated, geodata-based real estate valuation, Sebastian returned to geographic information systems as Business Development Manager at PDV-Systeme GmbH in Erfurt, where he worked on possible applications of GIS and web-based information systems in the context of eGovernment solutions. Since November 2006 Sebastian has been working for GEOMAGIC in the marketing and sales department. As an international IT service provider for pipeline operators and distribution companies, GEOMAGIC is an active member of the Pipeline Open Data Standard Organisation (PODS) as well as an Oracle and GE Energy partner.
Dr. Stephan Knoblauch, GEOMAGIC GmbH, Germany ([email protected])
Stephan Knoblauch majored in chemistry at the University of Leipzig, Germany, and in Amsterdam, Netherlands. He gained his PhD with the thesis “Investigations on structural, electronic, and photochemical properties of tetrahedrally distorted copper(II) complexes with thiopyrazolone ligands”. From 1995 to 1999 Stephan taught in several positions, e.g. as a postdoctoral research associate at the Texas University, USA. In 1999 he became an IT consultant for Brokat Technologies, where he specialised in internet banking and payment applications. After three years as consultant and product manager at Encorus Technologies, a First Data company, Stephan joined GEOMAGIC GmbH as Team Leader Product Development.