Sensors and Systems

For many years, major governmental agencies, like the Federal Emergency Management Agency (FEMA) and the National Geospatial-Intelligence Agency (NGA), have been embracing new and innovative ways of accessing and sharing crowdsourced data through social networks. While this is a natural progression for organizations seeking data for decision making, it raises the question: is all of this data truly valid? 

Since social networking is driven by individual human experiences, there are bound to be inherent flaws in the data. Organizations need to treat this type of data as qualitative and augment it with more quantitative data for effective decision making. 

Rise of Crowdsourced Geospatial Data

The rise of crowdsourced data is no surprise to anyone in business and government. Facebook, Twitter and LinkedIn have become core social networks for consumers, executives and government decision makers to share and access information. The growing importance of social networks to support business initiatives has been documented many times over in the news media. 

While most organizations use social media as a communications tool, certain government agencies are implementing new data mining techniques that leverage this open source information.

At GEOINT 2010, NGA Director Letitia A. Long announced the concept of GEOINT 2.0, which focuses on putting the power of geospatial solutions directly in the hands of the warfighter or end user. A core component of this effort is providing access to raw data, including open source data. As part of her vision, new mobile apps would tap into social networks to glean and share this information – ultimately allowing users to “empower” themselves. 

In addition, FEMA has transformed itself into a nimble agency when it comes to domestic disaster response. In the seven years since Hurricane Katrina, FEMA has come to rely on data from social networks, as well as geospatial and mobile technologies, to respond quickly and effectively to natural disasters. 

FEMA’s response to Hurricane Irene last year, which devastated parts of Vermont, has become a case study for how government can tap into social networks for quick and effective decision making. In addition to garnering data from citizens on the ground during the hurricane, FEMA implemented new ways for disseminating alerts and warnings to people via mobile devices and social media, like Twitter and Facebook.

Challenges of Crowdsourced Data

While leveraging crowdsourced data seems like the next step in gathering and managing geospatial information, much of this data can be faulty – online sources may be sharing misinformation, or the information may simply be outdated. Even organizations such as Google recognize this and give users an option in Google Earth to report misplaced or inappropriate images uploaded from crowdsourced devices.

Much like the kindergarten game of “telephone” – where children pass a message verbally around a circle until it becomes garbled – information shared among many people tends to drift from the original. This gossip-circle phenomenon is certainly not new to the human condition and should be weighed carefully when making decisions. With this in mind, understanding the quality, currency and overall accuracy of the data, and its initial source, is critical.

Simply put, crowdsourced data is valuable but cannot meet all the needs of larger and more complex organizations and time-critical situations. 

Hybrid Approach

However, it is an unfair assumption that all crowdsourced data is faulty. There certainly are benefits to having access to on-the-ground information quickly and easily. Major businesses looking to tap into this type of intelligence should approach crowdsourced data as an augmentation to more traditional data collections. By taking a hybrid approach to gathering, processing and disseminating geospatial data, larger enterprises can use crowdsourced data in concert with enterprise-level systems. 

For example, a major utility company can tap into data from social networks and quickly learn about power outages or areas where fallen trees have taken down power lines. 

Conversely, utility companies could develop mobile apps that allow customers to quickly report outages, which would be aggregated in real time for a faster response. This is certainly a welcome change from having customers clogging up the utilities’ customer service phone lines with convoluted descriptions of a given location. 
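An outage-reporting pipeline like the one described above hinges on one step: aggregating many noisy citizen reports into a signal a dispatcher can act on. The sketch below is a minimal, hypothetical illustration of that idea – binning reported (lat, lon) positions into coarse grid cells so that clusters of reports stand out from isolated (possibly erroneous) ones. The function name and the 0.01-degree cell size are assumptions for illustration, not any utility's actual system.

```python
from collections import Counter

def aggregate_outage_reports(reports, cell_deg=0.01):
    """Bin crowdsourced outage reports (lat, lon) into coarse grid cells.

    cell_deg=0.01 (~1 km of latitude) is an assumed resolution. The cell
    with the most reports is the strongest candidate for a real outage;
    isolated single reports may be noise or GPS error.
    Returns (cell_center, count) pairs, most-reported first.
    """
    cells = Counter(
        (round(lat / cell_deg) * cell_deg, round(lon / cell_deg) * cell_deg)
        for lat, lon in reports
    )
    return cells.most_common()
```

For example, three reports scattered within a block of each other fall into the same cell and outrank a lone report elsewhere, which is exactly the prioritization a real-time response map needs.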

While directly accessing data through apps can be done quickly, it should be taken with a grain of salt, or at minimum be used to augment enterprise-level geospatial data processing efforts and systems. 

One thing to take into consideration is the accuracy of mobile GPS. Many consumers use apps like FourSquare, which allow them to check in to locations to receive points and/or rewards from proprietors. But because of the limited accuracy of smartphone GPS, these individuals don’t really need to visit the location – they just have to be in the vicinity to get credit. It’s an all-too-common and easy way to cheat the system. The producers of the Shopkick app conducted their own study of its use, and the results show that the location-based data they collected from users can often be off by 1,000 feet or more from the establishment’s location. 
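A check-in service can at least bound this kind of gaming with a sanity check: compute the great-circle distance between the reported position and the venue, and reject reports outside a tolerance radius. Below is a minimal sketch under stated assumptions – the function names and the 300-meter tolerance are hypothetical, chosen to absorb routine smartphone GPS error while still catching reports from blocks away (the 1,000-foot errors mentioned above are roughly 305 meters, so a real policy would tune this value).

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS84 points."""
    r = 6371000.0  # mean Earth radius, meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def plausible_checkin(reported, venue, tolerance_m=300.0):
    """Accept a check-in only if the reported position is within
    tolerance_m of the venue. tolerance_m is an assumed policy value:
    generous enough for 10-15 m GPS error, tight enough to reject
    check-ins from well outside the establishment."""
    return haversine_m(*reported, *venue) <= tolerance_m
```

A report 0.005 degrees of latitude away (about 556 meters) would fail this check, while one a few meters off would pass – a crude but useful first filter before any crowdsourced point enters an enterprise system.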

In addition, smartphones and their GPS/LBS capabilities are similar in accuracy to field-grade (e.g., hiking) handheld GPS devices – often within 10 to 15 meters of the actual location. However, unlike dedicated units, smartphone GPS usually lacks GLONASS tracking and the ability to deal with multipath problems in urban environments. 

In addition, smartphones typically have no dedicated GPS antenna, since positioning is not their primary function, and battery drain is very high when the GPS is in use. Another challenge is the software behind the app: it is mobile-grade, not enterprise-grade. Does the mobile software manipulate the data in any way, or just pass the information on? All of these compounded factors can leave users with degraded data accuracy. 

However, crowdsourced data is great for municipalities using mobile apps like “Give-a-Hint” to alert departments of potholes, graffiti or parks with overflowing trash. These apps allow the public to provide local governments with the general vicinity of incidents – allowing for a more efficient response. In addition, these mobile solutions are ideal for finding earthquake survivors in collapsed buildings, enhancing rescue efforts tremendously.

Enterprise-level solutions are clearly more robust and meet the complex needs of institutions that require more accurate location information. For example, utilities work with many kinds of geospatial data, including LiDAR and imagery, which require efficient management and continual updating of asset information. In addition, these companies face environmental and safety mandates, which generate volumes of land-use inspections, photos, field analyses and surveys that must be managed alongside the geospatial data. Crowdsourced data from customers on the ground could never truly meet this need. 

Many utility providers also have highly complex business models with a variety of field assets and large workforces, requiring a suite of industry specific solutions to address work design, network asset management, outage management, and integrated mobile work force management. This is where robust remote sensing, GIS and data management tools are needed.

Enterprise-level solutions are vital for creating GIS and remote sensing applications for utility asset management, which extends beyond any data available through social networks. These applications enable utilities to detect changes by comparing geospatial imagery and LiDAR data to monitor transmission corridors, power lines and poles, as well as analyzing an area for vegetation encroachment. 
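The change-detection workflow described here – comparing two epochs of LiDAR or imagery over a transmission corridor – reduces at its core to a per-cell difference against a clearance threshold. The toy sketch below illustrates only that core idea on plain height grids; real pipelines operate on georeferenced rasters with far more sophistication, and the function name and 2-meter growth threshold are assumptions for illustration.

```python
def vegetation_encroachment(earlier, later, growth_threshold_m=2.0):
    """Compare two canopy-height grids (e.g., LiDAR-derived, same extent)
    and flag cells where growth exceeds the threshold.

    earlier/later: rows of heights in meters for the same corridor at
    two points in time. growth_threshold_m is an assumed clearance
    policy value. Returns (row, col, growth_m) for flagged cells.
    """
    flagged = []
    for i, (row_a, row_b) in enumerate(zip(earlier, later)):
        for j, (a, b) in enumerate(zip(row_a, row_b)):
            if b - a >= growth_threshold_m:
                flagged.append((i, j, b - a))
    return flagged
```

Crowdsourced reports can hint that "trees look close to the line," but only this kind of systematic, measured comparison across an entire corridor supports a defensible maintenance program.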

Many countries also have organizations responsible for collecting, surveying, processing, analyzing and publishing geospatial information for specified areas. These organizations are often driven by mandates or the critical need to make their geospatial data available to citizens. When this occurs, accuracy is paramount and dissemination of this data through social networks is ideal – not the other way around.

Need for Lifecycle Solutions

From population expansion to unprecedented levels of manmade and natural disasters, our world is changing at a blistering pace. We are now confronted with global uncertainties that need to be addressed swiftly and precisely.

To address these issues, manage change and ensure a safe future for our planet, we simply need the right geospatial solutions, which have become the foundation for effective decision-making. 

Larger organizations need solutions from providers who offer capabilities across the whole lifecycle of geospatial information management. Extending well beyond any data available in a crowdsourced environment, this spans content capture, on-demand geoprocessing, and delivery of actionable data to users in the field. 

Crowdsourced solutions are also unable to combine all the key assets – GPS, airborne and laser-scanning sensors providing the data – and then feed that data through photogrammetric, GIS and remote-sensing mapping systems. Larger organizations require this type of end-to-end solution for decision-making – especially in a highly tumultuous world. 

Chasing the Crowdsourced ‘Siren Song’

Increasingly, we are seeing many small- and mid-sized geospatial software and solutions providers branching into crowdsourcing. Many providers are integrating these solutions into their core offerings through acquisition, or building them out organically. 

When combined with legacy offerings, crowdsourced data solutions make sense. However, any organization that bases all of its decisions on this type of data runs the risk of relying on faulty information, with bad decisions to follow.

We live in an increasingly dynamic world where data is at everyone’s fingertips. In many ways, this makes us smarter and more nimble. With crowdsourced data, we can react to situations more swiftly, which is needed now more than ever. However, just because information is becoming more instantaneous does not mean our decision-making should be. There are times when acting too swiftly can complicate matters further. 

So, is crowdsourced data valid for our geospatial needs? Yes and no – it is still a young technology that continues to evolve. For now, we should tread cautiously, or at least put an asterisk next to crowdsourced data, because no organization – whether small or large – can afford to make bad decisions in this day and age.
