Sensors and Systems

Intergraph has been developing industry-specific solutions on top of its core geospatial platform for some time now, and the benefits of that focus are translating into strong company performance. V1 Magazine editor Matt Ball sat down with Mark Doherty, Executive Director, Technology Architecture and Strategy, for Intergraph's Security, Government & Infrastructure (SG&I) division, at the Intergraph Users' Conference in Las Vegas to discuss the solutions approach and the plans for the underlying platform.

V1: Intergraph has switched from strictly platform development to a solutions orientation. What does that mean for your technology approach and the geospatial culture within Intergraph?

Doherty: Let’s start with the cultural aspect first. We’ve always had industry managers focused on understanding an industry and its issues and requirements. In the past, we’ve made certain attempts at industry-focused products; however, I think with the maturation of the geospatial industry we’ve reached the point where there’s real benefit in going a lot deeper into solution areas.

We’ve organized around key verticals and around ways of doing things that are specific for individual industries. We start out with industry managers that focus on a specific industry and collect that industry’s requirements. Then we get into organizing or segregating our technology development so that we have people focused on the platform part of the business and people who are focused on the solution part of the business.

There’s no perfect way to do that, but certainly it’s necessary to have that segregation because the platform moves at one pace and has a set of issues and the solution stack on top of that needs to move at a different pace. Solutions react to different pressures and different market issues.

You can’t just flip a switch and make this reorganization happen. It’s certainly an evolutionary thing, where we’ve had to grow into the solution space. We’ve done some things right, we’ve done some things wrong. And, quite frankly, we’re still learning about the best way to organize and structure both organizationally and technically to support being both a platform provider and solution provider.

V1: I imagine that it can be hard to respond to technology-specific business drivers and industry-specific business drivers simultaneously. Certain sectors can be hot and cold depending on the economy and geopolitical pressures.

Doherty: It helps to be a bit decoupled, so that we can be a little more nimble in a certain solution area and respond to some things that are happening right away. At the same time, maybe our platform is rumbling along at a slightly different cadence or pace.

It’s important to have flexibility on the platform, and we’re also focusing on six or seven different vertical areas. We don’t attempt, nor will we attempt, to go as far up the stack or as deep in a solution set in every one of those industries. Some industries are more repeatable than others across the globe.

For example, there are some differences in electrical utilities, but globally you can find repeatable practices and consequently develop global applications to manage different aspects of electrical utilities. In other areas, such as a government’s land management, the global solution bar isn’t quite as high, because government regulations, the legal framework, the policies and the cultural aspects of how land is managed can be very different as you go from one part of the world to another.

So, how far up that stack or how deep you can go in that solution does vary from industry to industry. That’s another reason to have those two very distinct parts of the organization.

V1: I’m impressed with some of the intelligent tools that you’ve developed that have business cases and rules built into them, such as the SmartPlant product. Are you able to replicate that kind of approach for other verticals that are really begging for machine-based intelligence and much more intelligent systems?

Doherty: Yes, and a good example in the SG&I division would be in our public safety space in the area of interoperability. If you look at the computer-aided dispatch and call-taking side of things, you can envision an event or incident that happens and is then captured on the network or in the system.

Public safety agencies increasingly face decisions about what information to send to which first responders and other public safety agencies, and how, and that’s very much a rules-driven, business-process-driven situation. An event goes into BizTalk or an application system and asks, “I’m in this type of an event, it’s this time of day, I have this severity code, what do I do with myself?”

An event with a specific set of conditions needs to be responded to in a very specific way. For example, a call comes in that says a suspicious person has been spotted with a gun. That event gets captured, goes into a business-rules-driven system that parses “person with gun” and knows to ask, “Are there schools within a 5-mile radius of this event?” If yes, then the system knows to automatically notify the schools that they should go into some kind of lockdown.
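To make that concrete, the rule Doherty describes could be sketched as a tiny rules engine like the one below. This is a hypothetical illustration, not Intergraph's actual dispatch logic: the school list, coordinates, and function names are all invented, and a real system would query a GIS layer rather than a hard-coded list.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Incident:
    kind: str
    lat: float
    lon: float

# Hypothetical school locations; a real system would query a GIS layer.
SCHOOLS = [("Lincoln Elementary", 36.12, -115.17),
           ("Valley High", 36.18, -115.14)]

def miles_between(lat1, lon1, lat2, lon2):
    # Haversine great-circle distance in statute miles.
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 3956 * 2 * asin(sqrt(a))

def evaluate_rules(incident):
    """Apply the 'person with gun' rule: notify schools within 5 miles."""
    actions = []
    if incident.kind == "person with gun":
        for name, lat, lon in SCHOOLS:
            if miles_between(incident.lat, incident.lon, lat, lon) <= 5:
                actions.append(f"notify {name}: initiate lockdown")
    return actions
```

A call parsed as a “person with gun” incident near both hypothetical schools would trigger two lockdown notifications; any other event type falls through with no action.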

We can’t expect humans to be able to do all of that analysis and checking, particularly when they’re already being asked to do more and more. To do this all manually or with a manual process is not going to be an option.

V1: Going back to a checklist must be really tough when responding to an incident.

Doherty: They just get overwhelmed. There are too many things to handle, and at the same time they’re answering more calls. It becomes overwhelming, so we look for opportunities to apply clear business rules that can be automated.

V1: Another type of intelligence is model-based intelligence, with the building information modeling approach gaining ground. Are you working to create 3D intelligent models in your solution sets?

Doherty: Our Process, Power and Marine (PPM) division would have a comprehensive answer to that. PPM is in the business of building 3D models to let people operate plants, make decisions on construction, and run them through fifty-year life cycles.

On the SG&I side, we’re seeing more requirements to have that kind of building information modeling that integrates into the kinds of systems that we deliver. For example, a security system for transit or airports has a good view of perimeters where a map or plan view does a pretty good job of letting people see where events are and what’s going on. But the minute that event transitions inside a structure, then what you really want to be able to do is flip over to some kind of a Building Information Modeling (BIM) view. Seeing exactly where the alarm is going off helps you understand what other things exist in relation to it. Model-based intelligence is definitely coming into the spaces that we occupy.

V1: It’s sometimes hard to reconcile the benefit of 3D with the business case.

Doherty: Well, the fact is that any map or flat display on a screen is an approximation of what exists in the real world, right? So we’re starting to get to a point where the tools are ubiquitous enough and easy enough to use that we can look at modeling and work with the world in a more natural way, more in the way that we should see it.

There are still lots of challenges because we still do have a two-dimensional model that’s on a two-dimensional screen even though it appears to be three-dimensional.

Visualization tools like Google Earth and Microsoft’s Virtual Earth obviously appeal to people. I think they have really driven that forward in the last few years and have opened up people’s minds to how valuable 3D visualization can be. 3D visualization has existed for quite a while, but in niche systems that have been very expensive.

The cost to build the models is coming down. The standards surrounding 3D representation like CityGML are starting to emerge, and that lets you exchange models between different systems. That’s going to stimulate the uptake and have a lot more people using 3D tools and systems like ours.

V1: All the sensor integration that you’re doing is primarily security-oriented, but you’re ingesting and working with a lot of sensors within that 3-D space. Does the sensor web hold good prospects for your solutions in the future?

Doherty: Yes, I would say so. We’ve gone into that vertical area of public safety and security, and we’ve integrated sensors into some solutions, whether it’s border security or transportation security at an airport or a transit facility. Moving into the security space has taught us a lot about sensors and how to integrate them, sometimes painfully.

One of the challenges there is that standards aren’t widely adopted yet. So if you go into an existing building that has door alarms that have different types of sensors or video feeds, they’re all proprietary systems with their own APIs and their own ways of doing things. In some cases, they’re very archaic or old APIs. So to just bring those feeds into a more modern system and integrate them is not always a simple thing to do.
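A common way to cope with those proprietary APIs is a thin adapter layer that normalizes each feed into one shape before it reaches the modern system. The sketch below is a hypothetical illustration; the feed formats and class names are invented, not drawn from any vendor's actual API.

```python
from abc import ABC, abstractmethod

class SensorAdapter(ABC):
    """Wrap one proprietary API so downstream code sees a uniform reading."""
    @abstractmethod
    def read(self) -> dict: ...

class LegacyDoorAlarm(SensorAdapter):
    # Hypothetical archaic feed: a pipe-delimited string.
    def __init__(self, raw_feed: str):
        self.raw = raw_feed
    def read(self) -> dict:
        sensor_id, state, ts = self.raw.split("|")
        return {"id": sensor_id, "alarmed": state == "TRIP", "time": ts}

class ModernCameraFeed(SensorAdapter):
    # Hypothetical modern feed: already structured, but different field names.
    def __init__(self, payload: dict):
        self.payload = payload
    def read(self) -> dict:
        return {"id": self.payload["cam"],
                "alarmed": self.payload["motion"], "time": None}

feeds = [LegacyDoorAlarm("DOOR7|TRIP|0930"),
         ModernCameraFeed({"cam": "C3", "motion": True})]
readings = [f.read() for f in feeds]
alerts = [r for r in readings if r["alarmed"]]
```

Widely adopted sensor-web standards would make such adapters unnecessary; until then, each vendor API tends to get its own wrapper.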

One of the things that would certainly make that a much simpler process is if we started to see some adoption of sensor-web standards by those providers. We’re not quite there yet. That’s still more of a vision than a reality. But that’s the way that I think it ultimately has to go.

In typical standards fashion, we need vendors supporting it and purchasers demanding it. That’s ultimately how it moves forward — similar to what we’ve seen with the adoption of standards developed by the Open Geospatial Consortium (OGC), such as the Web Map Service (WMS) and Web Feature Service (WFS) standards, where it took quite a while for those standards to really penetrate, gain traction and open up a whole new market. We’re in the early days of the sensor space.

V1: Sensors are going to feed a whole dynamic side to the business, with many real-time inputs. The need for real-time systems seems to be a focus of yours in many markets.

Doherty: Absolutely. We started out in the public safety space, where we gained a lot of real-time experience dealing with call taking and dispatching and the 9-1-1 world. Then the progression was to take some of that technology over to the utility space with workforce management and outage management, and now the most recent progression has been to take that incident capability back into the security domain.

That integration of real-time information with geospatial data has been a differentiator for us. Combining geospatial technology with incident management, work management and records management is really something that sets us apart from pure geospatial vendors.

Simple maps came into public safety applications some time ago and stayed at a steady state. But now we’re really seeing the map take off as a means to integrate information. People, partly driven by the Google and Microsoft viewers, have become aware of what you can do with mapping. They’re asking much more sophisticated questions of us. We’re seeing a resurgence in demand for new and more advanced geospatial capabilities inside those real-time environments, and I think it’s just going to keep increasing for the foreseeable future.

V1: One of the challenges to implementing all these new systems is the large number of different software distribution channels today. We’re not just operating on the desktop and server anymore.

Doherty: I think what you have to do is to position yourself to have as much flexibility in terms of how you architect and deploy a solution. The best way to do that today is through service-oriented architecture.

That’s a big part of the evolution we’re in — taking a rich or thick client application and decoupling some of those business rules from the GUI logic so that they become services that can be deployed in different applications or as different components of an architecture. Some parts of our technology stack were further along in that respect than others.

GeoMedia was built on a componentized model that is fairly easy to pull apart. That is evidenced by the fact that the same components have run on the desktop in GeoMedia and on the server side in GeoMedia WebMap for many years. For instance, we have only one buffer zone object, deployed on both the desktop and the server. We did a very good job of that many years ago.

We’ve written a lot of business logic in our public safety tools that lives in client-side pieces. Part of what we’re looking at is places where we can separate some of that business logic and put it into a web service or a service component that we can redeploy and use in a lot of places.

Once you do enough of that, you’ve got enough critical mass in that middle tier to give you a lot more flexibility in deciding what kind of client you want to exploit it with. Or you may want to exploit that functionality without any user interface at all, and just ask a question of it through a simple interface. You may simply want to use business rules inside that business-process-automation server to answer a specific question and create a result that gets sent off to a requesting agency, without it ever being manifested in a graphical interface in your enterprise. You want the business logic structured so that you can use it any way that makes sense.
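As a minimal sketch of that decoupling (all names here are hypothetical, not Intergraph APIs), the same rule function can back a desktop GUI, a web service, or a fully headless request that never touches a screen:

```python
import json

def routing_rule(event_type: str, hour: int) -> str:
    """Pure business logic: no GUI or transport assumptions baked in."""
    if event_type == "fire" or (event_type == "alarm" and hour >= 22):
        return "priority-dispatch"
    return "standard-queue"

def handle_headless_request(payload: str) -> str:
    """Service wrapper: a requesting agency asks a question and gets an
    answer back; nothing is ever rendered in a graphical interface."""
    req = json.loads(payload)
    return json.dumps({"queue": routing_rule(req["type"], req["hour"])})
```

A thick client would call `routing_rule` directly from its GUI code; once the logic is pulled out like this, the web-service and headless paths reuse it unchanged.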

V1: I see the need for intelligence within our systems as a strong driver for industry expansion, but I don’t know if that’s now or ten years from now. How quickly can an evolution toward machine intelligence happen?

Doherty: The whole area of business intelligence may be the best place to address that. There are a couple of things converging there — there’s traditional business intelligence of slicing and dicing and reporting and dashboards on massive amounts of data. A logical extension to that is predictive analysis that looks at past patterns and helps assess what might happen in the future.

One big gap we see is that there hasn’t been much work to integrate geospatial components into that kind of business intelligence (BI) work. We see an opportunity there to do a better job of integrating geospatial technology with BI and analytical tools that also extend into the predictive part.
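One crude sketch of what geospatially aware BI might look like, assuming invented incident data and an arbitrary threshold: aggregate past incidents by grid cell, then flag clustered cells as candidate future hotspots, which is the simplest form of the predictive step Doherty mentions.

```python
from collections import Counter

# Hypothetical past incidents as (lat, lon) points.
incidents = [(36.151, -115.151), (36.152, -115.149), (36.150, -115.150),
             (36.149, -115.152), (36.180, -115.140)]

def grid_cell(lat, lon, size=0.01):
    # Snap a coordinate to a coarse grid cell for spatial aggregation.
    return (round(lat / size) * size, round(lon / size) * size)

counts = Counter(grid_cell(lat, lon) for lat, lon in incidents)

# Naive "predictive" step: cells with repeated past incidents are
# flagged as likely future hotspots.
hotspots = [cell for cell, n in counts.items() if n >= 3]
```

Real geospatial BI would replace the hard-coded points with a warehouse query and the threshold with a statistical model, but the shape of the computation is the same: spatial binning, then pattern-based flagging.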
