Sensors and Systems


Science and technology advance together: scientific studies produce information that drives technological progress, while technology in turn provides better conditions for research. Today the data deluge is a growing concern in the Earth sciences, and analyzing these incoming data is a major task for computer science. Many different types of data are needed to understand earthquake processes.


Each group works on one aspect of the problem and creates its own tools. An integrated system, however, can be built by combining these datasets and tools on a common platform and sharing them over the Internet. This effort couples information technology with the science of studying earthquakes, and service-oriented technology can strongly support the Earth sciences in this context. A PhD thesis study was carried out in Turkey to create an information technology infrastructure that enables multidisciplinary Earth science research by integrating existing tools.

The resulting application enables efficient computation of strain and velocity for research in geodesy. Architecturally, the system uses Web service technology. GEON (Geosciences Network) is one of the early exemplars of such cyberinfrastructure projects, alongside BIRN, GriPhyN, and TeraGrid, and has played a pioneering role in the development of cyberinfrastructures. Recognition of the importance of cyberinfrastructure has increased in recent years, and e-science has become a truly global phenomenon.

Turkey is earthquake country. Just as Turkey lies where two continents, Europe and Asia, meet, the North Anatolian Fault Zone (NAFZ) lies where two of the Earth's tectonic plates, the Eurasian and the Anatolian, meet. Ninety-six percent of the country's land, containing 66 percent of its active faults, is exposed to earthquake hazards, and 98 percent of the population lives in these regions. The NAFZ is a natural laboratory for Earth scientists, with a wide variety of tectonic landforms.

On its western part in particular, many geodetic projects have monitored crustal movements over the past three decades. These tectonic studies have produced a growing body of data and a wide range of activities: data collection from field observations and sensors, database creation, software development, data integration, and data management. Moreover, each of these activities has its own problems.

What is needed is access to all existing resources, with interoperability among them supported by information technology. Building mechanisms capable of sharing these data and tools is the key to the next generation of Earth science research. Information technology can facilitate many of the complex tasks geoscientists face today and solve problems in less time. The approach can be applied to many branches of geoscience, but this study focuses on earthquake geodesy.

{sidebar id=392}

While the data- and compute-intensive nature of the Earth sciences makes results hard to reach, duplicative efforts in data collection, conversion, reformatting, and tool development still waste labor and time. To avoid repeated effort and to analyze growing data volumes rapidly, information technology is essential. It allows scientists to create solutions that meet the requirements of interdisciplinary Earth science projects with multiple goals.

Web services provide interoperability between software applications running on different platforms, and they can clearly accelerate the scientific discovery process in the Earth sciences. Web services are the implementation of Service-Oriented Architecture (SOA) in the Web environment; SOA is, at its core, a collection of services that communicate with each other.

A Web service can be defined as a programmable application that is accessible using standard Internet protocols. A Web service can be any piece of code available over the Internet, written in any language. Reuse of existing tools, lower maintenance costs, and reduced impact of change are its most important benefits. In this study, the system was built in a service-oriented architecture so that each component is reusable and interoperable.
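The idea can be illustrated with a minimal sketch. The original system exposed its tools as SOAP services through Apache Axis; the toy service below instead uses plain HTTP and JSON from the Python standard library, purely to show the pattern of a computation made reachable over standard Internet protocols. The endpoint name, the fixed input velocities, and the "mean velocity" computation are all hypothetical stand-ins.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

def mean_velocity(velocities):
    # Placeholder for a real geodetic computation (e.g., strain rates).
    return sum(velocities) / len(velocities)

class StrainHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # A real request would carry region parameters (lat/long extent,
        # grid size); here the input is fixed for illustration.
        result = {"mean_velocity": mean_velocity([20.1, 22.3, 19.8])}
        body = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep the example quiet

# Bind to an ephemeral port, serve exactly one request, and call it.
server = HTTPServer(("127.0.0.1", 0), StrainHandler)
port = server.server_address[1]
threading.Thread(target=server.handle_request, daemon=True).start()
with urlopen(f"http://127.0.0.1:{port}/strain") as resp:
    response = json.loads(resp.read())
server.server_close()
```

Any client that speaks HTTP can consume such a service, which is exactly the interoperability property the study relies on.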

{sidebar id=393 align=right}

This system brings the complex strain analysis procedure developed by Haines and Holt (1993) to a level where anyone can use it efficiently and effectively, via a Web services approach.

Strain analysis is used to determine deformation and to identify areas of high seismic hazard, making it very important to geoscientists. But the calculation of strain is complex, and it takes time to obtain results.

The procedure consists of roughly thirty Fortran programs that must be run in succession. First, these programs were modified for the purposes of this study, primarily to increase portability and I/O performance. In a number of places the codes required user interaction.

That interactive code was removed, and new programs were written to create the input files. In the algorithm, the extent of the rigid blocks in the study area is arranged according to seismicity, and the Fortran programs assign numbers to the deforming points in a specific order. Furthermore, the geometry file, the main input to these programs, consists of 3,163 lines for a 0.5×0.5 degree grid over the study area, so a program had to be written to create this geometry file automatically.
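Generating such a grid geometry automatically is straightforward to sketch. The snippet below builds the knot points of a regular 0.5×0.5 degree grid over a given extent; the line format and the example extent are assumptions, since the actual geometry file layout of the Fortran codes is not reproduced in the article.

```python
def grid_points(lat_min, lat_max, lon_min, lon_max, step=0.5):
    """Knot points of a regular lat/long grid, row by row."""
    nlat = round((lat_max - lat_min) / step) + 1
    nlon = round((lon_max - lon_min) / step) + 1
    return [(lat_min + i * step, lon_min + j * step)
            for i in range(nlat) for j in range(nlon)]

def geometry_lines(points):
    # One "index latitude longitude" line per knot; this column layout
    # is a hypothetical stand-in for the real geometry file format.
    return [f"{k:4d} {lat:7.2f} {lon:7.2f}"
            for k, (lat, lon) in enumerate(points, start=1)]

pts = grid_points(40.0, 41.0, 26.0, 30.0)   # a Marmara-like extent
lines = geometry_lines(pts)
```

For a real study area the point count grows quickly (the article's 3,163-line file), which is why hand-editing the file is impractical.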

To invert GPS data with seismic constraints and obtain strain rates and a velocity field, two files, one from the seismic inversion and the other from the GPS inversion, had to be combined; a Java program was written for this. The Fortran programs also produce their output as ASCII files. For instance, to draw velocity error ellipses, the diagonal values of the variance-covariance matrices produced by the Fortran programs must be used, after some mathematical manipulation.

The first and second values on the diagonal of each matrix are the standard errors of the east and north velocities, and the third is the correlation coefficient; this has to be extracted for each inversion, so another program was created to do it. The next step was to wrap all of these tools as Web services accessible over the Internet, so that researchers who want to use this algorithm do not have to develop new tools or recode existing ones.
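Turning those three values into a drawable ellipse is the standard eigen-decomposition of the 2×2 velocity covariance matrix. The sketch below is that textbook calculation, not the original Java utility, and variable names are illustrative.

```python
import math

def error_ellipse(sig_e, sig_n, rho):
    """Semi-major and semi-minor axes and orientation (radians,
    counter-clockwise from east) of the velocity error ellipse,
    from the east/north standard errors and their correlation."""
    cov = rho * sig_e * sig_n          # off-diagonal covariance term
    e2, n2 = sig_e ** 2, sig_n ** 2
    mean = 0.5 * (e2 + n2)
    root = math.hypot(0.5 * (e2 - n2), cov)
    semi_major = math.sqrt(mean + root)
    semi_minor = math.sqrt(max(mean - root, 0.0))  # clamp rounding error
    theta = 0.5 * math.atan2(2.0 * cov, e2 - n2)
    return semi_major, semi_minor, theta
```

With zero correlation the axes reduce to the two standard errors themselves, aligned east and north, which is a quick sanity check on the formula.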

{sidebar id=394}

The study uses two types of geospatial data: geodetic velocities provided by GPS and earthquake focal mechanisms. The GPS velocity data cover the Marmara region of the country, which is dominated by strike-slip faulting; because of the region's tectonics, the area has high seismic hazard and risk. Sixty-five GPS points were included, each with longitude and latitude, eastward and northward velocities, the uncertainties of those velocities, and the correlation between the eastward and northward components.
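A station record with those seven fields might be modeled as below. The column order mirrors the fields listed in the text, but the actual layout of the campaign files is an assumption, as are the sample values.

```python
from dataclasses import dataclass

@dataclass
class GpsVelocity:
    lon: float
    lat: float
    ve: float      # eastward velocity
    vn: float      # northward velocity
    sig_e: float   # uncertainty of ve
    sig_n: float   # uncertainty of vn
    corr: float    # correlation between east and north components

def parse_record(line):
    """Parse one whitespace-separated station record."""
    lon, lat, ve, vn, se, sn, c = (float(tok) for tok in line.split())
    return GpsVelocity(lon, lat, ve, vn, se, sn, c)

rec = parse_record("29.06 40.19 22.4 -3.1 0.8 0.9 0.02")
```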

These data were produced by GPS campaigns carried out between 2003 and 2005 in the Marmara region by the Geodesy Department of Kandilli Observatory and Earthquake Research Institute (KOERI) of Bogazici University (BU). The campaigns were performed within a collaborative project among BU, MIT (Massachusetts Institute of Technology), TUBITAK-MRC (Turkish Scientific and Technological Research Council – Marmara Research Center), GCM (General Command of Mapping), and ITU (Istanbul Technical University).

GPS data from other sources, which contribute valuable information, can also be used. The Geodesy Department of KOERI has been monitoring crustal movements since 1990; data from the western and eastern NAFZ in particular have been collected by different techniques in various projects, and the department holds an important geodetic archive dating back to 1994. The system remains open to additional data.

{sidebar id=395}

The focal mechanisms file includes all records since 1976 for the whole of Turkey. A focal mechanism solution is the result of an analysis of the waveforms generated by an earthquake; it provides important information, including the origin time, epicenter location, focal depth, seismic moment, and the magnitude and spatial orientation of the moment tensor.

The earthquake focal mechanism data are provided by the Global Centroid Moment Tensor (CMT) catalog, which contains solutions for events with magnitudes of roughly 5.5 and greater. There are 275 such earthquakes (M0 < 1×10²⁰) for the study area; the August 17, 1999 earthquake was excluded since it decreases the strain rates. The earthquake input file therefore includes 274 earthquakes over a 30-year interval.

{sidebar id=396}A typical user request to the system for a given region (i.e., a lat/long extent and grid size) returns strain and velocity maps. Each request is assigned a unique id based on the time it arrives at the web server; the input files are then created and stored in the workspace.
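The per-request workspace scheme can be sketched as follows. The exact id format used by the system is not given in the article, so the timestamp-plus-counter naming here is an assumption; the counter simply guards against two requests arriving in the same second.

```python
import os
import tempfile
import time

def new_request_workspace(root):
    """Create a per-request workspace directory whose name encodes
    the arrival time, echoing the unique-id scheme described above."""
    stamp = time.strftime("%Y%m%d-%H%M%S")
    n = 0
    while True:
        req_id = f"{stamp}-{n:03d}"
        path = os.path.join(root, req_id)
        try:
            os.mkdir(path)   # atomic: fails if the id is already taken
            return req_id, path
        except FileExistsError:
            n += 1

root = tempfile.mkdtemp()
rid1, ws1 = new_request_workspace(root)
rid2, ws2 = new_request_workspace(root)
```

Input files, intermediate Fortran output, and the final maps for one request can then all live under that request's directory, keeping concurrent requests isolated.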

The application runs the programs; the output files are created and converted to the GMT input format, then GMT is executed and the maps are drawn. Apache Axis was used as the SOAP engine and Apache Tomcat as the servlet container, on a Linux platform.

The figure displays the communication among the system components. The client makes a request to the Web server, sending the parameters; the input files are created and a SOAP request is generated. The Web services receive the request, parse it, and send the response back to the client.

As geoscientists we have data and tools, but sharing of data and tools remains unsatisfactory, and we still create redundant ones. This study explores how to bring data, calculations, and analysis to researchers' desktops, and how to make optimal use of the data and tools that already exist, freeing time to do more science.

There is a data overflow in every branch of science today, and especially in the Earth sciences. It causes problems in data storage and processing, as well as in accessing and analyzing these large datasets.

Earth-related data are collected every day with present-day technologies. Satellite and computer technologies now make it possible to study the Earth as a global system, and scientific instruments such as satellites generate terabytes to petabytes of data every day.

As a result, there is a rapidly widening gap between data collection capabilities and the ability to analyze the data. With solutions in place for collecting, storing, and accessing data, the challenge now is to effectively share data, applications, and processing resources across many locations.

Web services can support the integration and interoperability of different applications and enable online science. In this study, Web service concepts were applied to the development of a strain computation system, a prototype for a web-based geodetic information system.

This study implements an infrastructure for delineating high seismic hazard areas by providing a strain computing service and a mapping service. As future work, we would like to develop a database service to facilitate the sharing of datasets.

 

——————————————————————————

Asli Dogru is a Research Assistant at Bogazici University, Kandilli Observatory and Earthquake Research Institute, Geodesy Department, Cengelkoy, Istanbul, Turkey.

 

 
