Sensors and Systems

Jeff Thurston — "Building upon the primary functions of geographic information systems (GIS) is the key to unlocking the door toward greater spatial process modeling and geo-processing. We have barely entered this phase of development and there is immense opportunity ahead. Immediate needs that come to mind for furthering this goal include: building on the integrative nature of GIS, building capacity to understand processes in a spatial context, evolving GIS output toward new forms of communication and expanding automation into tool-process interactions."

Matt Ball — "It’s becoming ever more clear that man has placed a great burden on the planet and atmosphere, and that these manmade pressures are only increasing. If we’re going to find expedient solutions to global issues, GIS will need to evolve to accept larger datasets, incorporate multiple and in-depth Earth system process models and address dynamic processes across large space and long time."

Perspectives

Building upon the primary functions of geographic information systems (GIS) is the key to unlocking the door toward greater spatial process modeling and geo-processing. We have barely entered this phase of development and there is immense opportunity ahead. Immediate needs that come to mind for furthering this goal include: building on the integrative nature of GIS, building capacity to understand processes in a spatial context, evolving GIS output toward new forms of communication and expanding automation into tool-process interactions.

But we also need to become hungrier about using technology to help us understand spatial processes and their direct connection to the things we need – like producing good food, water and clean air. And we need to recognize the individuals and teams who make this happen in new and real ways – ways that make a difference – and to celebrate more often the successes that improve quality of life and living. Many of these people are not stars; they are not popularly known and are not in the public eye.


Integrative Nature of GIS

Since GIS are well known for their ability to integrate spatial information from a wide variety of sources in a federated way, an easy next step is to integrate CAD information as well. This is often called GIS/CAD integration, and today we think about it in terms of being able to work with GIS and/or CAD within one workspace. But I think remote sensing and positioning technologies also have a big role to play in process modeling.

The integrative nature of GIS/CAD implies process modeling at its root: design tools are being coupled with tools that enable spatial analysis. The terms work flow, processes, process modeling, spatial integration, geospatial analysis, geo-processing and spatial modeling can all, more or less, be considered terms that encapsulate a common core of functions (illustrated in the sketch after this list), namely:

* Spatial data management
* Spatial data analysis
* Spatial modeling
* Operation and design
* Visualization
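
To make that common core concrete, here is a minimal sketch of the five functions run in sequence, assuming a GeoPandas-style Python toolchain and a projected coordinate system in metres; the dataset, column names and thresholds are hypothetical, and any comparable GIS stack exposes equivalents.

    import geopandas as gpd

    # 1. Spatial data management: load a source into one managed frame
    parcels = gpd.read_file("parcels.shp")        # hypothetical dataset

    # 2. Spatial data analysis: measure and query geometry
    parcels["area_m2"] = parcels.geometry.area
    large = parcels[parcels["area_m2"] > 1000]

    # 3. Spatial modeling: derive a new layer of 100 m influence zones
    zones = gpd.GeoDataFrame(geometry=large.geometry.buffer(100), crs=large.crs)

    # 4. Operation and design: combine layers into a design constraint
    constrained = gpd.overlay(parcels, zones, how="intersection")

    # 5. Visualization: communicate the result
    constrained.plot(column="area_m2", legend=True)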


Build Capacity for Understanding Processes in a Spatial Context

The above points are more or less focused on the technical aspects. To get to the heart of process modeling, it is important to understand not only the nature of what is being modeled and its underlying processes, but also scale. Is the process a local one, such as a room in a house, or is it a larger one, perhaps all the rooms in a building? Even larger projects might involve all the buildings on a street. The larger things become, the more complex they become, and the levels of integration rise. Generally, as we move to larger scales, more factors and features enter into the work flow, adding more processes. The total number of processes can sometimes exist as a group and be modeled individually or collectively. The model of a bedroom in a house (using CAD) can quickly become a bigger project with many rooms, streets and so on, all requiring more management in space and through time (GIS).

We sometimes assume the whole GIS/CAD relationship will exist in one form that can be boxed up, controlled rigidly and negotiated on finite terms. I doubt that is the answer; reality is not so clean.

Instead, process modeling will require adaptive tools, a flexible mind-set and a willingness to design and analyze based on unique situations – think of it as a continuum of sustainable design – adaptive, integrative and circulating.
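
As a toy illustration of processes existing as a group and being modeled individually or collectively, consider a composite structure in which a room, a house or a whole street answers the same modeling call; the class, names and numbers below are invented for illustration, not drawn from any real toolkit.

    from dataclasses import dataclass, field

    @dataclass
    class Space:
        name: str
        energy_kwh: float = 0.0                 # load modeled at this node alone
        parts: list["Space"] = field(default_factory=list)

        def model(self) -> float:
            # model this space individually or, with parts, collectively
            return self.energy_kwh + sum(p.model() for p in self.parts)

    bedroom = Space("bedroom", energy_kwh=3.0)
    kitchen = Space("kitchen", energy_kwh=7.5)
    house = Space("house", parts=[bedroom, kitchen])
    street = Space("street", parts=[house])     # scaling up adds parts, not new tools

    print(bedroom.model())   # individual: 3.0
    print(street.model())    # collective: 10.5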

It is this recognition, albeit in a simple example, that stimulates discussions around eco-sustainable design for communities or neighborhoods. At another level – say, the refrigerator level – one could be designing a new refrigerator that is more energy efficient and also contributes to the neighborhood. The point is: sustainable communities are landscapes, infrastructure and people. These are the raw ingredients to which toolsets need to be applied.

We do not have enough people who think in terms of scale in a spatial sense, in a process sense. Today the media is filled with messages like “the Earth is warming, save it,” “green your neighborhood, consume less” and “build better houses.” But these messages do not help people understand the underlying processes driving high energy use, or how those processes are visibly represented in a walk down their own street.

A need exists to explain how the tools of CAD and GIS contribute to and can be used for understanding processes. We need to articulate, in more digestible chunks, how we are going to go about saving the world and improving the situation, and where each of us can create impact.

Example: You grow the hay, it goes into a cow, a farmer comes along and milks the cow, the milk goes into a machine and is put into bottles, a truck delivers it to the store, someone puts a price on it, and you start your car, which runs on gas, drive down the road to the store and buy the milk. Do you see what I am getting at?

In that short description a number of processes were enacted, all contributing to a work flow. It is no different for spatial information and for designing cities with CAD and GIS. But we need to build more capacity to understand the spaces we live in and the processes that surround and support them, and then to describe them in pieces we can understand, get passionate about and act on.
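
As a sketch of that point, the milk chain can be encoded as a tiny workflow graph and walked in dependency order, which is the same shape a geoprocessing chain takes; the step names are the informal ones from the example and the encoding is purely illustrative.

    from graphlib import TopologicalSorter

    # each step maps to the steps that must be finished before it
    steps = {
        "cow eats hay": {"grow hay"},
        "milk the cow": {"cow eats hay"},
        "bottle the milk": {"milk the cow"},
        "truck to store": {"bottle the milk"},
        "price the milk": {"truck to store"},
        "drive to store": set(),
        "buy the milk": {"price the milk", "drive to store"},
    }

    for step in TopologicalSorter(steps).static_order():
        print(step)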


Support GIS Output with Communication Tools

The desktop monitor is nice, but three monitors together are nicer. Touch-sensitive hand-held devices, touch tables, overhead viewers, IMAX, 3D sound, holography, virtual reality and a host of other technologies and tools are a big part of the geospatial future.

This recognizes that we are creating work flows that cross boundaries, such as the pan-European data infrastructures and the geospatial initiatives of North America. It also recognizes that many Asian countries are well advanced in developing these technologies.

But the integral piece to understand, I think, is that the analysis and results of all these wonderful tools and technologies in GIS, CAD, GPS and so on can do real things, have real impacts and make a difference. We are beyond maps in some ways, but not in others. What matters more is that our results are used and understood best by other people on terms, and in ways, that are meaningful to them. It’s nice to pat each other on the back, but the real point is to have the guy or gal in front of you understand. Applying spatial results, whether in tabular form, map form, visualization or schematic, demands that the most effective communication tools be used to speed up understanding.

Today we are likely to see software and applications designed as much for displays and user devices as for performance and features – ideally both. How do we justify having all this great data, information and results if no one is seeing them or building on them? Data created for one purpose can often be re-purposed and used in different ways. Communication tools make that more likely to happen.


Expanding Automation into Tool-Process Interactions

If you look closely at some of the geospatial software being developed today, you will see that linkages are being made between functions. Manufacturers are incorporating shortcuts, or automation, into the processing of functions. Feature extraction from aerial imagery is an obvious example; others include automatic adjustment for geo-referencing when adding data. That is, ETL functions may be hidden, but they operate, on guard, throughout the process. In other cases we see automation across devices: for example, a one-click function that downloads GPS data, makes a map, then exports the data into GIS and virtual globe formats. Data quality software is another good example of automated functions.
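
A rough sketch of such a one-click chain, assuming the gpxpy and GeoPandas libraries; the file names are hypothetical, and a vendor tool would hide all of these steps behind a single button.

    import gpxpy
    import geopandas as gpd
    from shapely.geometry import Point

    # step 1: "download" the GPS data (here, read a local GPX file)
    with open("survey.gpx") as f:
        gpx = gpxpy.parse(f)

    points = [
        Point(p.longitude, p.latitude)
        for track in gpx.tracks
        for segment in track.segments
        for p in segment.points
    ]

    # step 2: make a quick map
    gdf = gpd.GeoDataFrame(geometry=points, crs="EPSG:4326")
    gdf.plot()

    # step 3: export to a GIS format; a virtual-globe (KML) export is the
    # same call, given a GDAL build with KML write support
    gdf.to_file("survey.geojson", driver="GeoJSON")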

The net effect of automation is an increase in productivity, but also increased opportunity. With huge amounts of data now available at higher resolutions, automation allows all of it to be processed quickly, ensuring that value and return on investment are maintained.

The future is likely to bring more automation, allowing more people to process data because the heavy lifting is hidden behind the scenes. Ever present and necessary as this processing is, the fact that anyone can begin to use high-level tools means the gap between GIS and process modeling will rapidly narrow. The GIS people and the process design and modeling people will begin to eat lunch together and share thoughts more often. Why? Because their relationships will no longer be oriented around technology silos, but will shift to conceptual frameworks for accomplishing tasks. This is a big deal and a direct benefit of more highly automated geospatial toolsets. It also means that changes in organizational structures are on the horizon as these people mingle – again leading toward more opportunities and interesting possibilities.

Overall, the greatest impact of linking GIS to process models will come through empowering people to do more, take risks and feel more passionate about owning problems and solving them. Whatever we do to harness that will set the course. GIS and process modeling tools are on the verge of a truly dynamic future that will contribute significantly to a better, more sustainable tomorrow.

It is imperative in this time of global challenges that GIS evolve to analyze phenomena over time and space. This week’s question is paraphrased from a paper written by Carl Steinitz of Harvard University’s Graduate School of Design. The paper asks a number of compelling questions and provides a framework for GIS modeling. The fact that it was written in 1993 makes it no less relevant today, as the problem of linkages still exists.

No other tool can play as critical a role as GIS in understanding the complex interactions between Earth systems and human impacts. It’s becoming ever more clear that man has placed a great burden on the planet and atmosphere, and that these manmade pressures are only increasing. If we’re going to find expedient solutions to these issues, GIS will need to evolve to accept larger datasets, incorporate multiple and in-depth Earth system process models and address dynamic processes across large space and long time.


A Matter of Scale

The issue of scale is a very important part of this discussion, as the tools and approaches should vary widely based on the scale of the problem we’re trying to address.

Global challenges represent issues of massive scale that are very difficult to comprehend and rather hard to model. When we’re talking about problems such as coastline response to global warming, pollution in a watershed or the atmospheric effects of carbon emissions, a large-scale model is needed. In problems of this size, it also becomes necessary to involve a much broader set of data from multiple disciplines, and to incorporate multiple analysis frameworks that trade on discipline expertise.

When the issue is modeling smaller geographies, such as analyzing the effect of different planning outcomes on a neighborhood or a small city, the modeling burden isn’t so great and the problem isn’t quite as complex. Still, there’s the issue of drawing together multiple datasets and involving a multidisciplinary approach.

When getting down to still smaller geographies, such as buildings or intersections, a smaller model and individual inputs can be adequate. Here a desktop tool can suffice, and the need for collaboration and communication might not be so great.


Tracking Movement on Machines

The idea of modeling and analyzing dynamic processes over space and time has been around since the early days of GIS. There have been incremental steps along this route to address these issues, but limitations on computing power have typically been the choke point when dealing with issues on a large scale.

There simply isn’t a workstation around, and there are very few server installations, that can handle the enormous data loads of problems involving both time and space over large geographies. Adding three-dimensional geographic space further increases the computing load, but it is increasingly necessary as models improve.

Interestingly, gaming platforms have evolved nicely to handle 3D visualization and have driven down costs while providing increasingly powerful graphics processors. The move toward 64-bit computing and clustered computing has begun to make supercomputing capabilities affordable. There’s also the advent of grid computing with internetworked machines harnessed together to spread the processing loads.
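
As a small illustration of spreading a processing load, the sketch below fans a set of hypothetical raster tiles out across local CPU cores with Python’s multiprocessing; clusters and grids apply the same divide-and-recombine idea across many machines.

    from multiprocessing import Pool

    def process_tile(tile_id: int) -> float:
        # stand-in for an expensive per-tile analysis step
        return sum(i * i for i in range(tile_id * 1000)) / 1e9

    if __name__ == "__main__":
        tiles = range(1, 65)             # 64 hypothetical raster tiles
        with Pool() as pool:             # one worker per CPU core by default
            results = pool.map(process_tile, tiles)
        print(f"processed {len(results)} tiles")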

By combining these advances, there’s great promise for taking the choke point of processing out of the equation, and making the processing problem a less-expensive obstacle.


Offline Analysis

Dynamic process modeling on a medium scale requires in-depth and complex geoprocessing and analysis. The deep thinking and complex programming involved put these problems outside the realm of most users, yet the outputs of this analysis are needed to drive the policy changes that would improve our balance with nature.

The break between those who can build dynamic processing engines for analysis and those who need their outputs requires a new approach. We’ve seen interesting business models emerge for offline analysis in the Building Information Modeling space, where architects submit a model online to a Web-based software-as-a-service (SaaS) outfit and in return receive detailed reports.

Green Building Studio, which was just purchased by Autodesk, is one such outfit: it takes an architectural plan and returns an in-depth energy analysis after running the model through a complex analysis engine, aligning the design with the LEED standard for energy efficiency. This SaaS approach takes the burden of the computing load, and of the expertise needed for the model, away from the end user. It also ensures that the analysis engine will constantly improve as more models are passed through the system and a better grasp of the parameters emerges.
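
The shape of that exchange can be sketched as a simple submit-and-report call; the endpoint, model file and response fields below are entirely hypothetical and stand in for whatever a real service would define.

    import requests

    with open("building_model.xml", "rb") as f:          # hypothetical model file
        resp = requests.post(
            "https://analysis.example.com/api/energy",   # hypothetical endpoint
            files={"model": f},
            timeout=300,           # server-side analysis can take a while
        )
    resp.raise_for_status()

    report = resp.json()           # assumed JSON report body
    print(report["annual_energy_kwh"], report["leed_points"])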

This approach is a compelling and reasonable alternative for geospatial modeling of dynamic processes at a neighborhood or small city scale. It concentrates the expertise in one organization for consistent improvement of the model, while taking away the unrealistic expectation that non-experts will be able to use a desktop tool for trusted analysis in areas that are outside of their professional expertise.

This approach can also work well for large-scale geospatial problem solving in areas such as land-use planning, hydrological modeling, transportation planning and environmental impact analysis. It’s at this large scale that problems of processing power and data analysis become very complex.


Better Simulation

What about tools to offer insight into dynamic issues on a local scale? When we’re talking about a very local issue – the interaction of one valve on a water network, one intersection in a transportation network or one transformer on a grid – these problems need to be modeled in time and space, but they don’t present an overwhelming processing burden. These local problems call more for simulation than for in-depth analysis.

Here’s where a better simulation model that incorporates 3D reality above and below ground could come into play. I viewed such a prototype simulation window at this week’s Autodesk event. The simulation was used to model the water load of a network both before and after a building came online. When the valve was shut off, the simulation displayed the effect on the network, showing how water loads along that node scaled back and eventually dissipated.

This local simulation across both time and space was an incredibly compelling visual: it imparted knowledge of how water reacts in the network and gave clear confirmation that the planned action would return the desired result.
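
A toy recreation gives the flavor: flow along a chain of nodes decays toward zero once the upstream valve closes. The node names and the decay rule are invented for illustration; a real hydraulic model solves the actual physics.

    nodes = ["valve", "main", "branch", "service"]
    flow = {n: 10.0 for n in nodes}      # steady-state flow, litres/second

    flow["valve"] = 0.0                  # the planned action: shut the valve
    for t in range(6):                   # watch loads scale back and dissipate
        for up, down in zip(nodes, nodes[1:]):
            flow[down] += 0.5 * (flow[up] - flow[down])   # relax toward upstream
        print(t, {n: round(v, 2) for n, v in flow.items()})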

Simulation at the local scale is a means to quickly put actions into context. Advances along these lines will give GIS practitioners a much better understanding of the time and space realities of their networks and assets. Simulation will also make apparent any deficiencies in the model, and will help push the need for better data quality right down to the individual user who can see the effect of poor data when looking at a model or simulation view.

The issues of time and space and process models are certainly not trivial at any of these scales. The local simulation issue seems closest to resolution from a software tool perspective, as it’s the easiest to coordinate and implement in a technical sense. Issues of medium scale are predominantly a matter of combining disparate domain expertise and modeling, which I believe could be most easily addressed from a service perspective. When you get to the large scale of climate and watersheds, it becomes a matter of great scientific import that will be driven by policy decisions and investment from government. The expertise to solve these big-picture problems certainly exists; what’s lacking is the mandate and funding to push us toward the goal.
