Common UAV Software May Not (Yet) Be Reliable for Building Safety or Damage Assessment


Posted by admin on Nov 10, 2015 at 5:51 pm America/Chicago

[Image: Sample reconstruction exhibiting all four types of anomalies]
(Note: there is lots of great work going on worldwide, and we look forward to working with all companies and researchers to help improve this vital technology.)

Researchers at Texas A&M University and the University of Nebraska-Lincoln have found that popular software packages for creating photo mosaics of disaster scenes from imagery taken by small unmanned aerial systems (UAS) may contain anomalies that prevent their use for reliably determining whether a building is safe to enter or for estimating the cost of damage. Small unmanned aerial systems can enable responders to collect imagery faster, cheaper, and at higher resolutions than satellites or manned aircraft. While the software packages are continuously improving, users need to be aware that current versions may not produce reliable results for all situations. The report is a step toward understanding the value of small unmanned aerial systems during the time- and resource-critical initial response phase.

“In general, responders and agencies are using a wide variety of general-purpose small UAS, such as fixed-wings or quadrotors, and then running the images through the software to get high-resolution mosaics of the area. But the current state of the software suggests that they may not always get the reliability that they expect or need,” said Dr. Robin Murphy, director of the Texas A&M Engineering Experiment Station Center for Robot-Assisted Search and Rescue and the research supervisor of the study. “The alternative is to purchase small UAVs explicitly designed for photogrammetric data collection, which means agencies might have to buy a second general-purpose UAV to handle the other missions. We’d like to encourage photogrammetric software development to continue to make advances in working with any set of geo-tagged images and being easier to tune and configure.”

In a late-breaking report (see SSRR2015 LBR sarmiento duncan murphy) released at the 13th annual IEEE International Symposium on Safety, Security and Rescue Robotics, held at Purdue University, the researchers presented results showing that two different photogrammetric packages produced an average of 36 anomalies, or errors, per flight. The researchers identified four types of anomalies impacting damage assessment and structural inspection in general. Until this study, such glitches or anomalies do not appear to have been systematically characterized or discussed in terms of their impact on decision-making for disasters. The team of researchers consisted of Traci Sarmiento, a PhD student at Texas A&M; Dr. Brittany Duncan, an assistant professor at the University of Nebraska-Lincoln who participated while a PhD student at Texas A&M; and Dr. Murphy.

The team applied two packages, Agisoft Photoscan, a standard industrial system, and Microsoft ICE, a popular free software package, to the same set of imagery. Both packages combine hundreds of images into a single high-resolution image. They have been used for precision agriculture, pipeline inspection, and amateur photography, and are now beginning to be used for structural inspection and disaster damage assessment. The dimensions of, and distances between, objects in the image can be measured accurately to within 4 cm. However, the objects themselves may have glitches or anomalies created through the reconstruction process, making it difficult to tell whether an object is seriously damaged.
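To make the mosaicking step concrete, the sketch below shows how a set of overlapping aerial frames might be stitched into a single image using OpenCV's high-level Stitcher. This is only a minimal illustration of the general technique, not the Agisoft Photoscan or Microsoft ICE pipelines evaluated in the study; the file paths and the stitching mode are assumptions.

    # Minimal sketch (illustrative only): stitch overlapping UAS frames into one mosaic.
    # Assumes frames from one flight sit in a folder named "flight_01" (hypothetical path).
    import glob
    import cv2

    frames = [cv2.imread(p) for p in sorted(glob.glob("flight_01/*.jpg"))]
    frames = [f for f in frames if f is not None]  # skip files that failed to load

    # SCANS mode is intended for roughly planar, downward-looking survey imagery.
    stitcher = cv2.Stitcher.create(cv2.Stitcher_SCANS)
    status, mosaic = stitcher.stitch(frames)

    if status == cv2.Stitcher_OK:
        cv2.imwrite("flight_01_mosaic.jpg", mosaic)
    else:
        # Low overlap, motion blur, or repetitive rubble texture can make stitching
        # fail or warp objects, which is the kind of anomaly the study examines.
        print("Stitching failed with status", status)

Commercial photogrammetry suites add much more (bundle adjustment, dense 3D reconstruction, orthorectification), but even this simple example shows how reconstruction errors, rather than camera errors, can enter the final product.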
The researchers collected images using an AirRobot 180, a quadrotor used by the U.S. and German militaries, flying over seven disaster props representing different types of building collapses, a train derailment, and rubble at the Texas A&M Engineering Extension Service’s Disaster City®. The team flew five flights over six months. The resulting images from each of the five flights were processed with both packages, then inspected for anomalies using the four categories.
