Every year, several hundred million hectares of forest, grasslands, and other vegetation burn throughout the world, and this amount is set to increase due to climate change. Wildfires pose a challenge for ecosystem management because they can be both harmful and beneficial: harmful in threatening human life, property, and economic activity, and in contributing to climate change; beneficial in regulating plant succession and fuel accumulation, affecting insect and disease populations, influencing nutrient cycles, and in many other ways we still struggle to understand.
Organizations responsible for environmental monitoring, especially when it comes to ‘smart development’ in environmentally sensitive areas, increasingly rely on geospatial data, tools, and processes to ensure that development has minimal environmental impact.
The sciences, technologies, and practices of remote sensing and of geographic information systems (GIS) arose separately, developed in parallel, intersected, and are now inextricably linked. Nearly all the features in most GIS are collected by means of satellite imagery or aerial photogrammetry, and GIS is the application where this imagery is most commonly visualized. “All the foundation elements of GIS come from remote sensing: cultural features, roads, buildings, water features, topography, terrain, soils, slopes, geology, and many more,” points out Lawrie Jordan, Director of Imagery at Esri.
Good management decisions require quality information. For forest resource managers, combining airborne light detection and ranging (lidar) data with Esri’s ArcGIS and the U.S. Department of Agriculture Forest Service’s FUSION software creates a powerful 3D environment capable of modeling a forest’s canopy structure.
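The core idea behind canopy-structure modeling of this kind can be sketched in a few lines: grid the lidar returns into a terrain surface (lowest return per cell) and a canopy surface (highest return per cell), then difference the two. The synthetic point cloud, the 10 m grid, and the variable names below are illustrative assumptions, not FUSION's or ArcGIS's actual processing.

```python
import numpy as np

# Hypothetical lidar returns: x, y coordinates and elevation z, in meters.
# A real workflow would read these from a classified .las/.laz point cloud.
rng = np.random.default_rng(0)
n = 5000
x = rng.uniform(0, 100, n)
y = rng.uniform(0, 100, n)
ground = 0.05 * x                      # gently sloping terrain
z = ground + rng.uniform(0, 30, n)     # returns from ground up through canopy

cell = 10.0                            # grid resolution in meters
cols = (x / cell).astype(int)
rows = (y / cell).astype(int)

dtm = np.full((10, 10), np.inf)        # digital terrain model: min z per cell
dsm = np.full((10, 10), -np.inf)       # digital surface model: max z per cell
for r, c, elev in zip(rows, cols, z):
    dtm[r, c] = min(dtm[r, c], elev)
    dsm[r, c] = max(dsm[r, c], elev)

chm = dsm - dtm                        # canopy height model, meters above ground
```

The resulting canopy height model is the kind of surface a forest manager would visualize in 3D to compare stand heights across a management unit.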
If Old McDonald had a farm today, he could manage it from his laptop and map it with an application on his handheld device. Out in the field, his tractor’s guidance system would know its position to within an inch, turning his planters and sprayers on and off accordingly. A boom height control system would keep his sprayer from hitting the ground, and a yield monitor on his combine would measure the exact volume of his harvest in real time. Soil moisture sensors networked via cellular modems, soil density sensors on his planters, and infrared crop health sensors on his tractor would gather a wealth of data that his agronomist would use to prepare a prescription map for the next season. In a few years, that data stream would also include aerial imagery collected by his unmanned aerial vehicle (UAV), and his tractor would run unmanned, as a robot in the field. If a chick, duck, turkey, pig, cow, cat, mule, dog, turtle, or farm hand got in its way, the tractor’s radar collision avoidance system would recognize it and stop.
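The agronomist’s prescription map mentioned above amounts to a per-zone lookup: sensor summaries in, a variable application rate out. A minimal sketch follows; the zone records, thresholds, and rate adjustments are hypothetical placeholders, not any vendor’s actual prescription logic.

```python
# Hypothetical per-zone sensor summaries gathered over the season.
zones = [
    {"id": "A1", "soil_moisture": 0.31, "last_yield_t_ha": 9.8},
    {"id": "A2", "soil_moisture": 0.18, "last_yield_t_ha": 6.2},
    {"id": "B1", "soil_moisture": 0.25, "last_yield_t_ha": 8.1},
]

BASE_RATE = 75_000  # baseline seeding rate, seeds per hectare (illustrative)

def prescribe(zone):
    """Scale the seeding rate down in dry zones, up in high-yield zones."""
    rate = BASE_RATE
    if zone["soil_moisture"] < 0.20:
        rate *= 0.85          # dry zone: reduce plant population
    if zone["last_yield_t_ha"] > 9.0:
        rate *= 1.10          # historically productive zone: push population
    return round(rate)

rx_map = {z["id"]: prescribe(z) for z in zones}
```

Loaded into the planter’s controller alongside zone geometries, a map like `rx_map` is what lets the equipment vary its rate automatically as it crosses zone boundaries.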