Sensors and Systems

August 25th, 2015
Tackling 60TB of Data Per Day from Space #EAS2015


Shay Har-Noy, senior director of Geospatial Big Data at DigitalGlobe, spoke this morning at the ENVI Analytics Symposium in Boulder. He addressed the big data considerations of volume, variety, velocity, and veracity, and the fact that our Earth is changing constantly. DigitalGlobe images the entire globe every six months and captures 60 terabytes (TB) of imagery per day.
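That collection rate can be put in perspective with quick back-of-the-envelope arithmetic; the sketch below simply extrapolates the 60 TB/day figure quoted in the talk (decimal units assumed, not a figure DigitalGlobe stated):

```python
# Back-of-the-envelope growth of the imagery archive,
# extrapolating the 60 TB/day collection rate from the talk.
TB_PER_DAY = 60
DAYS_PER_YEAR = 365

tb_per_year = TB_PER_DAY * DAYS_PER_YEAR   # 21,900 TB
pb_per_year = tb_per_year / 1000           # ~21.9 PB, using decimal (1000x) units

print(f"{tb_per_year} TB/year is roughly {pb_per_year:.1f} PB/year")
```

At that pace the archive grows by roughly 22 petabytes a year, which frames why the talk treats storage and access as first-order problems.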

The challenge is to convert the heavy data of this large imagery archive into accessible Big Data that can be parsed and “crunched on.” We’ve done a good job of exploiting information on a local scale, but ramping this to a larger scale is still a frontier. Rather than thinking about pixels, it’s a matter of making sense of these pixels in their largest extent.

The pace of data collection, with 90 percent of all data ever created having been produced in the past ten years, means opportunity for the geospatial community. Omnipresent data streams on the ground, such as the 500 million tweets posted every day, are leading to new insights.

Solving Big Data problems is an ongoing challenge. Processing terabytes of high-quality multiband imagery requires substantial storage and computing, and this work is now moving to the cloud for easier access. Selling data per square kilometer is expensive at large scale, so DigitalGlobe now offers tiered pricing to address analytical questions at larger scales.
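One common pattern for this kind of cloud-scale imagery work is to split a large scene into tiles and fan them out to a pool of workers. The sketch below is purely illustrative, not DigitalGlobe's actual pipeline: the tiles are toy lists of pixel values and the per-tile statistic is a simple mean.

```python
from concurrent.futures import ThreadPoolExecutor

def tile_mean(tile):
    """Toy per-tile statistic: mean pixel value of one tile."""
    return sum(tile) / len(tile)

def process_scene(tiles, workers=4):
    """Fan tiles out to a worker pool; results come back in input order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(tile_mean, tiles))

# Two hypothetical tiles of pixel values.
tiles = [[10, 20, 30], [40, 50, 60]]
print(process_scene(tiles))  # [20.0, 50.0]
```

The same fan-out shape applies whether the workers are threads on one machine or nodes in a cloud cluster; only the executor and the tile source change.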

One recent analytical challenge dealt with water usage in California. The opportunity is to combine the full five-year imagery record with data from water boards to understand usage across the state.

Geospatial problem solving involves:

  • Organizing data sets for standard access
  • Optimizing for velocity and change, not for perfection
  • Thinking differently and thinking big
