Archive for the ‘Research’ Category

Missions, Choice of sUAS Platforms, and Manpower for Flying Floods: Lessons learned from our deployment with Fort Bend County May 30-31, 2016

CRASAR was in the field on Monday and Tuesday (May 30-31, 2016) at the request of the Fort Bend County Office of Emergency Management, with Roboticists Without Borders member CartoFusion providing off-site data support. We flew a small UAS (DJI Phantom 3 Pro) throughout the county to help them validate their flood inundation models, conduct hydrological forensics, and educate the public on why evacuations were necessary. This is our third response to flooding this year (Louisiana in March, Fort Bend County in April, and this event) and we continue to learn about missions, selection of sUAS, and crew organization and CONOPS, building on my previous recommendations for flooding. I am working on a formal cognitive work analysis using the shared roles model, but some preliminary lessons that may be of use to other teams are discussed below. We’d like to especially thank Jeff Braun, Lach Mullen, Adam Wright, and Juling Bao from Fort Bend County. Traci Sarmiento served as VSO both days, Xiaosu Xiao and Grant Wilde took turns being data manager, and I was the pilot.


Check out some of the YouTube clips (some of which were shown on the CBS news):
RWB member Hydronalix also offered Texas Task Force 1 the use of five EMILY swiftwater rescue robots, the kind we used in Greece (see NPR story here), but they weren’t needed.

Missions

Six distinct missions have emerged from working with emergency managers:
  • property damage assessment: can sUAS help document the number of houses damaged and the amount of damage, in terms of the height of water into the houses, while there is still flooding, in order to qualify the area for disaster assistance? Our experience in Louisiana suggests that UAS of any size are not a good fit because the UAS cannot see an 18-inch-high water line on houses in a subdivision crowded with trees or cover much area within line of sight. This mission appears to be a better match for a robot boat, which can zoom down the flooded streets.
  • flood mapping and projection of impact: can sUAS help document the extent of the flood and the impact on residents, roads, levees, etc.? sUAS appear to have advantages over manned aircraft for forested regions, where the platform can operate safely at lower altitudes and hover and stare to detect flowing water in between trees. An expert can use the sUAS to identify possible causes of unexpected flooding and options for mitigation.
  • verification of flood inundation models
  • flood monitoring over time: This is related to flood inundation modeling and one of the reasons why Fort Bend County had us fly multiple days.
  • justification for publicly accountable decisions: The documentation of flooding is useful for future land use planning. Fort Bend County was particularly interested in capturing compelling video of the severe flooding in the western part of the county to show to residents in the eastern part of the county, which had not yet been flooded, so that they could see why evacuations were mandated.
  • public information: Fort Bend County immediately posted the video to YouTube and began pointing citizens to the video to answer questions about their neighborhood. One of the neighborhoods filmed had an assisted living facility and relatives calling into the OEM were directed to look at the video and see that the flooding wasn’t going to impact their family member.
Flood mapping, verification of flood models, and flood monitoring are not new missions and have been known for some time. Property damage assessment has always been a projected use of sUAS for floods, though we did not expect to encounter the practical problems introduced by trees, trees, and more trees. People like trees in their yards.

Choosing the type of sUAS

The majority of missions for sUAS (versus larger UAS that can fly higher and longer) are expert-in-the-loop missions (more formally called remote presence), where the expert wants to be able to view the video and then direct the sUAS to a better view. It is not clear that orthomosaics and digital elevation maps (DEM) are a priority. The emergency managers generally have DEMs already (though they may be outdated), and the area that they want to look at is so large that a sUAS team is unlikely to reach all of it if restricted to line-of-sight operations; this is where larger UAS or the Civil Air Patrol can be of great value. Another problem is that the file size of orthomosaics is unwieldy for OEMs to handle, share, and post among themselves. Not that many managers have laptops that can handle a 55GB file, and the upload times are slow.

We have converged on quadcopters as the default platform because of expert-in-the-loop flying and because of the physical constraints of landing zones, though we don’t rule out fixed wing. In the field there is limited access to the area because of the flooding. Access points such as raised roads or levees also have high-tension power lines, which can induce interference; indeed, we had one “whoa!” takeoff next to power lines. (We won’t discuss the creepy horde of swarming insects making it unsafe to stand in one potential site, or the fire ant bites I am sporting from a misstep at another site.) Flying near rivers or from residential areas is hard because of trees and power lines. Empty lots without trees are rare, especially in older and urban parts of town, though suburban areas may have soccer fields suitable for launching and recovering fixed-wing sUAS.

Manpower: Crew Organization and CONOPS

This is my area of research, so it is always of interest to me! As noted in previous blogs, my TED talk, and papers, it’s the data that is the barrier to adoption. We’ve converged on a 4-person field team plus a dedicated data management team back at the base to handle the data. Gee, that sounds like a lot, doesn’t it? Well, it is better than accidentally writing over data or taking extra hours to get data products to the OEM.


Let me explain about the field team. I see four major roles on the field team, which leads to four people:
  • pilot, who is in charge of the sUAS
  • visual safety officer (VSO), who is not allowed to look at the pilot’s display or do anything but keep eyes on the sUAS and the sky (and given the number of manned aircraft zipping by at low altitudes, that is an important and full-time job)
  • agency expert, who actually knows what to look for and can opportunistically direct the flight
  • data manager, who immediately backs up the data (a hard lesson learned at the 9/11 World Trade Center robot deployment) and makes sure all data is logged and stored for immediate hand-off to the OEM (you want to give them that thumb drive as soon as sneaker net permits); a minimal sketch of scripting that hand-off follows this list
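
To make the data manager’s hand-off concrete, here is a minimal sketch, assuming Python 3 and hypothetical paths and folder names: copy the flight’s files off the SD card, verify each copy with a checksum, and file everything under a descriptive folder name instead of the camera’s default.

```python
# Sketch of the field data manager's backup step: copy a flight's imagery off
# the SD card, verify each copy with an MD5 checksum, and file it under a
# descriptive folder name instead of "DCIM". Paths and naming are assumptions.
import hashlib
import shutil
from datetime import datetime
from pathlib import Path

def md5(path: Path) -> str:
    """Checksum a file so we can confirm the copy is not corrupted."""
    h = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def backup_flight(sd_card: Path, backup_root: Path, site: str, flight_no: int) -> Path:
    """Copy everything from the SD card into a folder like
    2016-05-30_oyster-creek_flight02/ and verify every file."""
    stamp = datetime.now().strftime("%Y-%m-%d")
    dest = backup_root / f"{stamp}_{site}_flight{flight_no:02d}"
    dest.mkdir(parents=True, exist_ok=True)
    for src in sorted(sd_card.rglob("*")):
        if not src.is_file():
            continue
        copy = dest / src.name
        shutil.copy2(src, copy)
        if md5(src) != md5(copy):
            raise IOError(f"Corrupted copy: {src.name}")
    return dest

# Example (hypothetical paths and site name):
# backup_flight(Path("/Volumes/DJI_SD/DCIM"), Path("/Volumes/FieldBackup"),
#               site="oyster-creek-levee", flight_no=2)
```

The naming convention (date, site, flight number) is only an example; the point is that the folder is self-describing by the time the thumb drive reaches the OEM.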


The pilot and VSO have to be dedicated roles, held by different people. In the future, one person could share the pilot and agency expert roles, but for now it seems unreasonable to expect a flood inundation engineer to also be proficient with sUAS. One person can’t share the agency expert and VSO roles because then that person is sometimes looking at the display and sometimes at the sUAS, which regulations do not permit and a bazillion safety studies advise against.


There’s room for debate about having an agency expert: RWB member David Kovar points out that requiring an agency expert can hold up getting to the field. But in my experience, if the flights are exploratory, opportunistic, and require expertise, it is more effective in the long run to have them there. Not all missions for all disasters are remote presence, but for the missions that have emerged for the floods so far, this appears to be reasonable.
By the way, having an agency expert is valuable to the agency: citizens come up and ask questions that we can’t answer, and they also like seeing that their officials are out there doing proactive things. It also helps with legitimacy. As a woman with a woman safety officer, we look more like a team of news people trying to sneak in and get footage than engineers operating under the official request of the county.


Back to roles: we’ve typically tried to keep it to a 3-person team by having the person serving as the VSO also serve as the data manager. Doubling up on roles seems like a great idea: the VSO takes notes, then when the platform lands can take the data, make sure the files aren’t corrupted, and, as the team drives to the next site, make backups, put files in folders with names more helpful than, say, “DCIM”, and so on.


Except it never happens that way- which could be me. The VSO starts the process and then, before everything is done, has to stop in the middle because of a) carsickness, b) it being too bumpy to type, c) arriving at the next site, where the VSO has to do a safety check and help set up, or d) all of the above. The pace just exceeds capacity. In Louisiana, we wound up sorting through a wad of video and imagery, requiring about 3 hours of extra effort at the end of a very long day. You can’t dump this on the agency person (“look, you’re now the data manager, follow this to-do list! And remember, we’re here to help you!”) because they are busy making their own notes, talking to the OEM about what they are seeing, and talking to residents.


Bringing a fourth person along to do nothing but sit in the truck with the air conditioning on and manage the data was a huge win on this last deployment. But again, as David points out, a well-honed team can do without the extra person. I suspect that teams that do not regularly fly together or fly these types of rapid-fire missions (called ad hoc teams in industrial psychology) will need the fourth person, while truly high-performing teams will be able to combine roles.


But if you are handling data in the field, why do you need a data management team back at the base? Well, someone needs to create visualizations such as SituMap and summaries of where the data is from, edit high-value snippets, upload data over a faster internet connection, etc. The field data manager doesn’t have access to the faster internet, can barely keep up with the pace as it is (so doesn’t have time to do snippets), and needs to head right back out to the field. Noooo, the OEM doesn’t do this. They are too busy to handle the data themselves. Their solution to data management is to shout, “pause it there!” and then whip out their cellphone to video the video that is playing on the screen so that they can mail it and post it internally (an Ewok approach to snippeting and reducing resolution). They are trying to USE the information, not process it themselves. So while having Xiaosu or Grant come with us in the field was great, they couldn’t get it all done, and we did not even try to work in the visualization software or photogrammetrics.
Again, in theory, agencies have people dealing with ESRI and Big Files, but I have encountered this capacity once, maybe twice, in my deployments. And in that case, the need to use these products caused a huge problem between the tactical responders and engineers and the people back in the emergency operations center. So I still see a need for “immediate and intermediate” data processing.
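
One example of the kind of “immediate and intermediate” product the base team can generate quickly is a coverage summary built from the images’ EXIF headers: one line per image with the time and GPS position it was taken, so the OEM can see what was covered without opening gigabytes of files. This is a hedged sketch that assumes a recent version of the Pillow library and geotagged JPEGs; it is not the SituMap workflow.

```python
# Walk a folder of geotagged UAS images and write a one-line-per-image CSV of
# when and where each was taken. Assumes Pillow is installed; the tag numbers
# are the standard EXIF GPSInfo and DateTimeOriginal tags.
import csv
from pathlib import Path
from PIL import Image

GPS_IFD, DATETIME_ORIG = 34853, 36867  # standard EXIF tag numbers

def to_degrees(dms, ref):
    """Convert EXIF degrees/minutes/seconds to signed decimal degrees."""
    deg = float(dms[0]) + float(dms[1]) / 60.0 + float(dms[2]) / 3600.0
    return -deg if ref in ("S", "W") else deg

def summarize(image_dir: Path, out_csv: Path) -> None:
    with out_csv.open("w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["file", "taken", "lat", "lon"])
        for p in sorted(image_dir.glob("*.JPG")):
            exif = Image.open(p)._getexif() or {}
            gps = exif.get(GPS_IFD, {})
            lat = lon = ""
            if gps:
                lat = to_degrees(gps[2], gps[1])
                lon = to_degrees(gps[4], gps[3])
            writer.writerow([p.name, exif.get(DATETIME_ORIG, ""), lat, lon])

# Example (hypothetical folder from the backup step above):
# summarize(Path("2016-05-30_oyster-creek_flight02"), Path("coverage.csv"))
```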

More flooding– recommendations for small UAVs

Flooding continues through the southeast and we are getting some preliminary requests– here’s a quick rundown of  previous blogs:

suggestions from our work at the Texas floods where we flew with Lone Star UASC

a history of use of robots at floods

why the flood of data may be the biggest problem in floods

and some suggestions on flying for floods

plus best practices:

Let’s hope the flooding is not too bad- a bit of the luck of the Irish in time for St. Patrick’s Day.

Common UAV Software May Not (Yet) Be Reliable for Building Safety or Damage Assessment


Sample reconstruction exhibiting all four types of anomalies

(Note: there is a lot of great work going on worldwide and we look forward to working with all companies and researchers to help improve this vital technology.)

Researchers at Texas A&M and the University of Nebraska-Lincoln have found that popular software packages for creating photo mosaics of disasters from imagery taken by small unmanned aerial systems (UAS) may contain anomalies that prevent their use for reliably determining whether a building is safe to enter or estimating the cost of damage. Small unmanned aerial systems can enable responders to collect imagery faster, cheaper, and at higher resolutions than satellites or manned aircraft. While software packages are continuously improving, users need to be aware that current versions may not produce reliable results for all situations. The report is a step towards understanding the value of small unmanned aerial systems during the time- and resource-critical initial response phase.

“In general, responders and agencies are using a wide variety of general purpose small UAS such as fixed-wings or quadrotors and then running the images through the software to get high resolution mosaics of the area. But the current state of the software suggests that they may not always get the reliability that they expect or need,” said Dr. Robin Murphy, director of the Texas A&M Engineering Experiment Station Center for Robot-Assisted Search and Rescue and the research supervisor of the study. “The alternative is to purchase small UAVs explicitly designed for photogrammetric data collection, which means agencies might have to buy a second general purpose UAV to handle the other missions. We’d like to encourage photogrammetric software development to continue to make advances in working with any set of geo-tagged images and being easier to tune and configure.”

In a late-breaking report (see SSRR2015 LBR sarmiento duncan murphy) released at the 13th annual IEEE International Symposium on Safety, Security and Rescue Robotics held at Purdue University, researchers presented results showing that two different photogrammetric packages produced an average of 36 anomalies, or errors, per flight. The researchers identified four types of anomalies impacting damage assessment and structural inspection in general. Until this study, it does not appear that glitches or anomalies had been systematically characterized or discussed in terms of their impact on decision-making for disasters. The team of researchers consisted of Traci Sarmiento, a PhD student at Texas A&M; Dr. Brittany Duncan, an assistant professor at the University of Nebraska-Lincoln who participated while a PhD student at Texas A&M; and Dr. Murphy.

The team applied two packages, Agisoft Photoscan, a standard industrial system, and Microsoft ICE, a popular free software package, to the same set of imagery. Both software packages combine hundreds of images into a single high resolution image. They have been used for precision agriculture, pipeline inspection, and amateur photography and are now beginning to be used for structural inspection and disaster damage assessment. The dimensions and distances between objects in the image can be accurately measured to within 4cm. However, the objects themselves may have glitches or anomalies created through the reconstruction process, making it difficult to tell if the object is seriously damaged.

The researchers collected images using an AirRobot 180, a quadrotor used by the US and German militaries, flying over seven disaster props representing different types of building collapses, a train derailment, and rubble at the Texas A&M Engineering Extension Service’s Disaster City®. The team flew five flights over 6 months. The resulting images for each of the five flights were processed with both packages, then inspected for anomalies using the four categories.

Robots for earthquakes- history of use of ground, aerial, and marine systems plus best practices

Our hearts go out to the victims, families, and responders in Afghanistan and Pakistan. Here are links to

And from our home page, here are helpful 1 page guides and best practices for small unmanned aerial systems that have been incorporated into United Nations humanitarian standards and are continuing to evolve:

Texas Floods: How small UAVs have been used at 9 floods in 6 countries. Don’t forget about UMVs!

Note: this is a long blog with sections on best practices, where SUAS have been used (and for what missions), the flood of data that interferes with making the most of UAS data and how computer vision can help, and unmanned marine vehicles.

CRASAR is standing by to assist with the flooding in Texas with small unmanned aerial systems (UAS/UAV) and unmanned marine vehicles. Johnny Cash’s song “How High The Water Momma” comes to mind. We’ve been working with floods since 2005 and in July offered a class on flying for floods.

The rain is still too heavy to fly in most affected parts. Coitt Kessler of the Austin Fire Department is coordinating the use of small UAS with the State Operations Center and has been working tirelessly since Thursday. CRASAR is offering the Texas A&M team and the UAVRG team at no cost through the Roboticists Without Borders program. We also hope to try out an app for coordinating small UAS from the newest member of Roboticists Without Borders, Akum.

Hey- if you want to volunteer to fly, please do not fly without explicitly coordinating with your local fire department and confirming that they in turn have followed standard procedures and coordinated with state air operations (this is a standard ICS practice and should only take them a few minutes). Otherwise there may be a repeat of the dangerous situations where a) low-flying helicopters and sUAS were working too close to each other and b) the data collected was either the wrong data or never made it to a decision maker. Dangerous situations happened at the Boulder floods and several times in the Texas Memorial Day floods- it shuts down the helicopter operations. And remember, it’s hard to become the fire rescue equivalent of a deputy without having met and worked with the fire rescue department- so it may not be realistic to expect to help with this disaster.

Best Practices

Here are links to our best practices for picking UAVs and payloads for disasters:

Where Small UAVs Have Been Used

Small UAVs or UAS have been used in at least 9 disasters that involved flooding or had flooding associated with them: Hurricane Katrina 2005 (the first ever use of a small UAS for a disaster, which was by CRASAR), Typhoon Morakot, Taiwan 2009, the Thailand floods 2011, Typhoon Haiyan, Philippines 2013, the Boulder, Colorado floods 2013, the Oso, Washington mudslides 2014, the Balkans flooding, Serbia 2014, Cyclone Pam, Vanuatu 2015, and the Texas Memorial Day floods 2015. CRASAR participated in 3 of the 9 events.

SUAS missions at these floods have been:

  • situation awareness of the flood, affected transportation, and persons in distress
  • hydrological assessment- where’s the flooding, what is the state of the levees, etc.? Texas has levees that impact people (think New Orleans and Katrina) but also livestock. Another use of small UAS is to determine why areas are flooding where they are. In the Balkans flooding, the ICARUS team used their UAS and found an illegal dike that was preventing public works engineers from draining the area.
  • searching for missing persons presumably swept away- that was the major use of small UAS at the Texas Memorial Day floods
  • deliver a small line to persons in distress so that they can pull up a heavier line for help- this was also done at the Texas Memorial Day floods
  • debris estimation in order to speed recovery

Missions proposed for sUAS but, to the best of my knowledge, never flown at an actual disaster (remember, a patch to anyone who can help me keep the list of deployments up to date!) have been:

  • home owner and business insurance claims- many insurance carriers are actively exploring this, and it was a big topic at our 2015 Summer Institute on Flooding
  • carry wireless repeaters- this was actually done with manned aircraft from the Civil Air Patrol during the Memorial Day floods. The greater persistence and range of manned aircraft may keep that in the CAP list of responsibilities.

The Flood of Data and the Promise of Computer Vision

The biggest challenge in using UAS is not flying (or regulations) but rather the flood of data. As I noted in my blog on our Summer Institute on flooding, one of our 20-minute UAS flights for the Texas Memorial Day floods produced over 800 images totaling 1.7GB. There were over a dozen platforms flying daily for two weeks during the floods, as well as Civil Air Patrol and satellite imagery. Most of the imagery was being used to search for missing persons, which means each image has to be inspected manually by at least one person (preferably more). Signs of missing persons are hard to see, as there may be only a few pixels of clothing (victims may be covered in mud or obscured by vegetation and debris) or urban debris (as in, if you see parts of a house, there may be the occupant of the house somewhere in the image). Given the multiple agencies and tools, it was hard to pinpoint what data had been collected when and where (i.e., spatial and temporal complexity) and then access the data by area or time. Essentially no one knew what they had. Agencies and insurance companies had to manually sort through news feeds and public postings, both text and images, to find nuggets of relevant information.
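
A back-of-the-envelope calculation using the numbers above shows why manual inspection does not scale; the flights-per-platform-per-day figure is an assumption for illustration only.

```python
# Rough scale of the data problem, using the figures cited in this post.
# flights_per_platform_per_day is an assumption, not a reported number.
images_per_flight = 800
gb_per_flight = 1.7
platforms = 12
days = 14
flights_per_platform_per_day = 4   # assumption

flights = platforms * days * flights_per_platform_per_day
print(f"{flights} flights, ~{flights * images_per_flight:,} images, "
      f"~{flights * gb_per_flight:,.0f} GB to inspect manually")
# -> 672 flights, ~537,600 images, ~1,142 GB
```

Even at only ten seconds per image, half a million images works out to roughly 1,500 person-hours of inspection.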

Students from our NSF Research Experiences for Undergraduates (REU) site on Computing for Disasters and our partners at the University of Maryland and Berkeley, led by Prof. Larry Davis, created computer vision and machine learning apps during the Texas floods. The apps searched the imagery for signs of missing persons, including debris that might have been washed away with them and piles of debris large enough to contain a victim. The students also created visualization packages to show where the UAS and other assets had been and what data they had collected.

Don’t Forget About Unmanned Marine Vehicles

As I described in a previous blog on Hurricane Patricia, unmanned marine vehicles have been used for hurricane storm surges but not for flooding. They would be of great benefit for inspecting underwater portions of critical infrastructure such as bridges and pipelines. There’s even EMILY, a robotic super flotation device that can zoom out to where people are trapped.


Four Surprises about the Use of Unmanned Ground, Aerial, and Marine Vehicles for Hurricanes, Typhoons, and Cyclones

Hurricanes, typhoons, and cyclones form a category of meteorological events referred to as cyclonic activity. They damage large areas and destroy the transportation infrastructure, interfering with the ability of agencies to find and assist people in distress and to restore power, water, and communications, and preventing the delivery of supplies. As I describe in my TED talk, it can take years for a community to recover- the rule of thumb developed by disaster experts Haas, Kates, and Bowden in 1982 is that each successive phase of disaster response and reconstruction takes roughly 10 times longer than the phase before it. Thus, reducing the initial response phase by just 1 day reduces the overall time through the three reconstruction phases to complete recovery by up to 1,000 days (1 day x 10 x 10 x 10). The sooner emergency response agencies can use unmanned systems, the faster they can respond and we can recover from a disaster.

There are three modes or types of small unmanned vehicles or robots: ground, aerial, and marine systems. Small vehicles have the advantage that they are easy to carry in an SUV or a backpack and deploy on demand when the field teams need them; the military would call these tactical assets. Larger unmanned systems, such as the National Guard flying a Predator to help get situation awareness of several counties or provinces, require much more coordination and planning (and expense); these are strategic assets.

Here are four surprises about small unmanned vehicles for cyclonic events (I’ll be adding links to videos throughout the day):

1. Small unmanned ground, aerial, and marine systems have been reported at 7 hurricanes since the first use at Hurricane Charley in 2004.

These events are Hurricane Charley (USA, 2004), Hurricane Katrina (USA, 2005), Hurricane Wilma (USA, 2005), Hurricane Ike (USA, 2008), Typhoon Morakot (Taiwan, 2009), Typhoon Haiyan (Philippines, 2013), and Cyclone Pam (Vanuatu, 2015).

2. Ground robots are generally not useful.

Ground robots have only been used at 2 of the 7 events: Charley and Katrina. Cyclonic activity tends to damage or destroy swaths of single-story family dwellings, not multi-story commercial buildings. If houses are flattened, the debris is not more than 20 feet deep, so traditional techniques work. If houses or apartments are damaged but standing and there is a concern that people are hurt inside, canines can determine in seconds if a person is inside. A door or window would have to be breached to insert a robot (or a person), which means the apartment would then be open to robbers. We learned that while helping Florida Task Force 3 search the retirement communities in Florida affected by Hurricane Charley in 2004. Florida Task Force 3 did use a robot to enter two apartment buildings that were too dangerously damaged to enter during Hurricane Katrina, but they didn’t have a canine team, which is now generally considered the preferred method.

3. Marine vehicles may be the most useful kind of robot for both response and recovery.


AEOS-1 with acoustic imager inspecting the underwater portion of a bridge

Marine vehicles have been used for only 2 of the events, Hurricane Wilma and Hurricane Ike, but could have been effective for all 7. Hurricanes and typhoons are a double whammy for marine infrastructure- the underwater portions of bridges, seawalls, pipelines, the power grid, and ports. First the event creates storm surges along the coast, then flooding occurs inland and hits the coast again. Bridges and ports can appear to be safe, but the surge and flooding can have scoured the ground from under the pilings, leaving them resting on nothing. Debris can have broken off a piling underwater, creating a hanging pile. This means that transportation routes can be cut off during the response, hampering the movement of responders but also hampering bringing in enough food and supplies to feed a country (as at the Haiti earthquake), which is normally done with ships. The economy can’t recover until the infrastructure is back in place.

Checking for these conditions is typically done with manual divers, but the conditions are dangerous- the current is still high, the water is cloudy, debris is floating everywhere, and divers often have to resort to feeling for damage. There are few divers and it can take months to schedule them, as we saw at the Tohoku tsunami. Marine vehicles, both underwater and on the surface, can be outfitted with acoustic imagers that act as a combination of ultrasound and a camera to check for these conditions. In Japan, we re-opened a port in 4 hours versus the weeks it would have taken a dive team, and dive teams would not have been able to start work until six months after the disaster. That six-month delay would have caused the city to miss the salmon fishing season, which is the big economic driver for the region.

See UMV and UAV at Hurricane Wilma here.

4. Small unmanned aerial systems have been used the most frequently of the three types of robots.

SUAS have been used in all but two of the 7 events, Hurricane Charley and Hurricane Ike. Small UAS were still experimental in 2004 when Hurricane Charley occurred, but the day after our experiences as part of Florida Task Force 3, I called Mike Tamilow at FEMA and offered to make introductions to facilitate their use for the next hurricane. Unfortunately it wasn’t until the next year and several hurricanes later that SUAS were used, at Katrina, by us and other teams from the Department of Defense. Despite the success of these deployments, SUAS didn’t really take off (pun intended) until 2011, when the technology had matured and come down in price.

Small UAVs for flooding: history, recommendations, missions, and the future

Our thoughts go out to South Carolina and their extreme flooding. We’ve participated in 3 floods, numerous flood exercises, and two summer institutes on flooding.

This blog is divided into 4 sections with some information that we hope may be of use:

  • History of Use of Small UAVs at floods worldwide
  • Recommendations for hobbyists/volunteers who want to fly
  • Missions that have been flown in past floods and the payloads used
  • Other applications of small UAVs

See our previous blogs on small UAVs and flooding (with videos and photos): general flooding and small UAVs, an update on flooding disasters and the challenges to response, swift water rescue with UAVs and UMVs, how to fly at floods, why data may be the biggest problem at floods, and apps for handling data.

History of Use of Small UAVs at Floods Worldwide

Small UAVs have been used in at least 9 disasters that involved flooding or had flooding associated with them: Hurricane Katrina 2005* (first reported use of small UAVs), Typhoon Morakot, Taiwan 2009, the Thailand floods 2011, Typhoon Haiyan, Philippines 2013, the Boulder, Colorado floods 2013, the Oso, Washington mudslides 2014*, the Balkans flooding, Serbia 2014, Cyclone Pam, Vanuatu 2015, and the 2015 Texas floods*. * means that CRASAR participated.

If you are a hobbyist or volunteer and want to fly, some recommendations:

Contact your local fire department and volunteer. Don’t be upset if they decline- it is an extremely busy time for them and hard to add anything new and relatively unknown to their effort. It is actually illegal to self-deploy UAVs- just like showing up to a police incident with a gun. Even if you have a carry permit, you can’t just show up- you need to have been trained and deputized in advance.

With your local fire department’s permission, contact the local or state air operations. Note that some fire departments or sheriff’s offices may not be aware that during many large-scale operations, an agency is responsible for coordinating manned aircraft, especially helicopters working at low altitudes and the Civil Air Patrol. Even if you have a 333 exemption, you still need to coordinate with air operations so that you don’t accidentally interfere with manned helicopters.

Check http://tfr.faa.gov/tfr2/list.html to see if the area is under a Temporary Flight Restriction, which is the aerial version of a highway closure. This is one of those things that you learn about when earning a pilot’s license and a partial motivation for the FAA’s insistence on at least passing the written private pilot exam.

If you are flying, check out the best practices on the crasar.org home page to see what types of payloads to use for what missions.

Check out the UAViators code of conduct as well for humanitarian use of drones.

Missions Small UAVs have been used for and payloads:

Surveillance/Reconnaissance/Situation awareness for both search and rescue and public works. This is about: where’s the flooding? How bad is it? Are people in distress? What is the state of the transportation infrastructure- roads? bridges? Typically this is done with video payloads. Rotorcraft offer the advantage of being able to hover and thus give a sense of how fast the water is flowing.

Examination of levees for signs of water coming over the top or of seepage indicating incipient collapse. This can be done with visual inspection using video payloads or with a camera payload for photogrammetrics. If you are going to try to create a 2D or 3D photogrammetric reconstruction, you will want GPS-stamped, high-resolution imagery.

Missing persons, both living and presumed drowned and tangled in debris. This is done with high-resolution still imagery that is geotagged (if you don’t have the GPS stamp, then it’s hard to direct a team to the right spot). Note that CRASAR has software developed by the NSF REU Computing for Disasters program that uses computer vision to help identify victims in flood debris. It’s not yet released for general use, but we can run it internally.

Delivery to trapped people. Keep in mind three concerns with the use of small rotorcraft and we recommend extreme care when flying near people. The first concern is that hanging things off of a small UAV changes the dynamics of the vehicle and how well it can be controlled, so it may behave and move unpredictably. Hoisting a fishing line tied to a heavier line tied to the object may be a good way to go. The second is that operators tend to lose depth perception and may get far too close to objects and people. The third is that work by Dr. Brittany Duncan shows that people aren’t naturally afraid of rotorcraft and will let them get dangerously close, so a person may be likely to be injured by a sudden move of a too-close UAV.

Other applications that have been discussed but not reported at an actual disaster:

Swift water rescue: UAVs providing oversight on floating debris that might jeopardize crews in boats working to rescue trapped people

Restoration and recovery assessment: such as identifying easement and standing water conditions that prevent power utility crews from restoring electricity

Carrying wireless repeater nodes: this has been done by Civil Air Patrol manned aircraft, so the advantage of small UAVs is unclear

Debris estimation: both the debris directly from the flood and the indirect debris a few days or weeks later from people having to rip out sheet rock and carpets. The advances in photogrammetrics make it possible to estimate the volume of debris, if you have the “before” survey of the area; we flew with PrecisionHawk at the Bennett Landfill superfund site in February in order to estimate the volume of toxic trash (which was on fire) that needed to be safely removed. The next step is to estimate the content, because vegetation and construction materials have to be handled and processed differently.
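
For the volume part of that estimate, the underlying computation is simple once photogrammetry has produced “before” and “after” digital elevation models of the same area. Here is a minimal sketch that assumes the two rasters are already co-registered on a common grid; the cell size and toy values are illustrative only.

```python
# Debris volume by differencing a "before" and "after" DEM of the same area.
# Assumes both rasters are co-registered on the same grid (same shape, same
# cell size); only positive elevation change is counted as debris.
import numpy as np

def debris_volume(dem_before: np.ndarray, dem_after: np.ndarray,
                  cell_size_m: float) -> float:
    """Sum the positive elevation change times the cell footprint (m^3)."""
    rise = np.clip(dem_after - dem_before, 0.0, None)  # ignore erosion/subsidence
    return float(rise.sum() * cell_size_m ** 2)

# Toy example: a 3x3 grid with 10 cm cells where one cell gained 0.5 m of debris
before = np.zeros((3, 3))
after = before.copy()
after[1, 1] = 0.5
print(debris_volume(before, after, cell_size_m=0.1))  # 0.005 m^3
```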


ICARUS: European Union Moves Robot-Assisted Search and Rescue Forward!

I had the pleasure of attending the ICARUS project’s final demonstration in Brussels, Belgium as an advisor. ICARUS is the European Union funded project “Integrated Components for Assisted Rescue and Unmanned Search operations”, which you can read about here. The demonstration was quite the success and the entire project has my greatest admiration!

Just a note to anyone wondering why the US is not doing more of this: the European Union funded the project at 17.5 million euros, far more than any funding for robotics projects available through the National Science Foundation or the Department of Homeland Security. The great ICARUS team and the funding really helped move the EU ahead of the US and Asia in robotics and in robotics for disasters. This is not the only project being funded at this level in the EU. NIFTY just finished up, and TIRAMISU, CADDY, and SHERPA are all major projects focusing on fundamental research in robotics through applications to disasters. Each project has a strong partnership with an actual response agency or national US&R team, following the model that we use at CRASAR- and indeed that’s why I’m on the advisory board for most of these projects. This is a very different model than the DARPA Robotics Challenge in the US.

There were four aspects of the project that resonated with me:

  1. Engagement of the end-users, in this case, Belgium’s US&R team B-FAST, and emphasis on physical and operational fidelity. This is the major thrust of CRASAR. The engagement of end users led to them deploying their rotorcraft UAV for the Serbia-Bosnia floods, with an excellent set of lessons learned reported at IEEE Safety Security Rescue Robotics at http://tinyurl.com/ppr6c7b.
  2. Focus on heterogeneity of robots. The project demonstrated land, aerial, and marine robots complementing each other to provide responders with more capabilities to see and act at a distance. The July demo showed Aerial-Marine cooperation and this, the September demo, focused on Aerial-Ground cooperation. Heterogeneous robots are not a new topic, nor a new topic for disasters (see our work at the Japanese tsunami http://onlinelibrary.wiley.com/doi/10.1002/rob.21435/abstract) but ICARUS advanced the field by showing interoperability of control of the robots. Arguably, interoperability is not new and something the US Department of Defense is pursuing but it was nice to see, especially combined with heterogeneity of missions.
  3. Heterogeneity of missions. Perhaps the most compelling part of the demo was how robots could be repurposed for different missions and how the interoperability framework supported this. A large robot for removing rubble could change its end effector, carry a smaller robot, and lift it to the roof of a compromised building. The displays showed the payloads and types of functions each robot could do- this visualization was a nice advance.
  4. One size does not fit all. It was music to my ears to hear Geert DeCubber say that there is not a single robot that will work for all missions. I’ve been working on categorizing missions and the environmental constraints (e.g., how small does a robot need to be?), with the initial taxonomy in Disaster Robotics (https://mitpress.mit.edu/books/disaster-robotics).

The project focused on interoperability between the assets, which was interesting technologically, but I wonder if it will be of practical importance beyond what would be used by a single US&R team- assuming that a single US&R team would own a complete set of ground, aerial, and marine vehicles.

Our experience has been that a single agency or ESF is unlikely to own all the robotic assets. For example, at the Fukushima Daiichi nuclear accident, several different types of ground robots and an aerial robot were simultaneously deployed. It didn’t make sense for a single operator to control all the devices: with a UGV outside the building clearing rubble, a UGV inside inserting a sensor, and a UAV outside conducting a radiological survey, these seem to be delegated functions better kept as separate modules. Furthermore, many of the devices were brought in for the disaster, so that the best available technology was deployed rather than what JAEA already had; there is always the issue of how to incorporate the latest tool.

Even in a relatively small disaster, such as the Prospect Towers parking garage collapse, New Jersey Task Force 1 borrowed ground robots from a law enforcement agency. The point is that for the next decade, teams may be using ad hoc assemblies of robots, not owning a dedicated set of assets.

For CRASAR, the challenge is how the different end-users get the right information from the ad hoc assembly of robotics fast enough to make better decisions.

The project had a host of commendable technical innovations, such as a small solar-powered fixed-wing that demonstrated 81 hours of endurance and provided a wireless network for the responders, a novel stereo sensor for the tiny Astec Firefly (which they showed flying in through a window), and an exoskeleton controller for a robot arm which is being commercialized.

I particularly liked the ICARUS focus on establishing useful mission protocols. They experimented with launching a fixed wing immediately to do recon, provide a wireless network, and provide overwatch of the camp, and with using a quadrotor to fly ahead of a convoy to try to ascertain the best route to the destination when roads might be blocked by rubble or trees.

Word from Responders: “Small UAVs are Available, Now Help Us Use The Data They Generate!” REU Students Provide Apps

Virtual reality reconstruction of Blanco River flood by Dr. Russ Taylor from CRASAR flight

TEES just concluded a three-day Summer Institute on Floods (July 26-28, 2015), which was hosted by our “mother” center, the Center for Emergency Informatics. The Summer Institute focuses on the data-to-decision problems for a particular type of disaster. This year’s Summer Institute was the second in a two-part series on floods and brought together representatives from 12 State agencies, 15 universities, and 5 companies for two days of “this is what we did during the Texas floods” and one day of “this is what we could do or do better” experimentation with small unmanned aerial vehicles, crowd sourcing, computer vision, map-based visualization packages, and mobile phone apps for common operating pictures and data collection.

A portion of the Summer Institute focused strictly on UAVs. The session was organized by the Lone Star UAS Center and the TEES Center for Robot-Assisted Search and Rescue (CRASAR), both of whom flew during the Texas floods.

The UAV field exercises  spanned four scenarios witnessed during the floods:

  • Search of cars and other vehicles swept away by a storm surge for trapped victims;
  • Search for missing persons who may have fled campsites or been swept down river;
  • Assessment of damage to power lines and transformers and presence of standing water which prevents restoration of power infrastructure; and
  • Assessment of damage to houses and estimates of household debris, which is critical to insurance companies estimating damage and agencies such as the American Red Cross in projecting social impacts.

The exercises were staged at the 1,900-acre TAMU Riverside Campus with one fixed wing and three types of rotorcraft flown by CRASAR and David Kovar, a member of CRASAR’s Roboticists Without Borders program.

A major finding out of the institute was the realization that while State agencies are adopting UAVs, the agencies can’t process all the imagery coming in. For example, a single 20-minute UAV flight by CRASAR at the floods produced over 800 images totaling 1.7GB. There were over a dozen platforms flying daily for two weeks as well as Civil Air Patrol and satellite imagery. Each image has to be viewed manually for signs of survivors, clothing, or debris that indicate a house or car (and thus a person too) was swept to this location.

Given the huge demand for automating image processing, it was no surprise that a panel of four judges from the agencies awarded $900 in prizes to three students from the National Science Foundation Research Experiences for Undergraduates (REU) program. The prizes were for software that classified imagery and displayed where it was taken.

First place went to Julia Proft for an application that used computer vision and machine learning to find images from the floods which contained a spectral anomaly, such as an unusual color. Below is a UAV image flagged with an anomaly, with the anomalies indicated on a parallel image to make it easier to find where in the image the anomalies occurred without marking up the image and blocking the view.

UAV image flagged with an anomaly and the anomalies indicated on a parallel image
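
The prize-winning code is not public, but the underlying idea of a spectral (color) anomaly detector can be sketched in a few lines: flag pixels whose color is far from the image’s overall color statistics and mark them on a separate copy, the “parallel image,” so the original stays unmarked. Thresholds and file names here are illustrative, not the student’s implementation.

```python
# Simple color-anomaly sketch: flag pixels whose color deviates strongly from
# the image's mean color (in units of per-channel standard deviation) and paint
# them red on a copy, leaving the original image untouched.
import numpy as np
from PIL import Image

def color_anomaly_image(path: str, sigma: float = 4.0) -> Image.Image:
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32)
    mean = rgb.reshape(-1, 3).mean(axis=0)
    std = rgb.reshape(-1, 3).std(axis=0) + 1e-6
    # per-pixel distance from the mean color, in standard deviations
    dist = np.sqrt((((rgb - mean) / std) ** 2).sum(axis=2))
    marked = rgb.copy()
    marked[dist > sigma] = [255, 0, 0]   # paint anomalous pixels red
    return Image.fromarray(marked.astype(np.uint8))

# Hypothetical usage:
# color_anomaly_image("flood_frame_0123.jpg").save("flood_frame_0123_anomalies.jpg")
```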

Second place went to Matt Hegarty for an application that used computer vision and machine learning to find images from the floods which contained urban debris having straight lines or corners. In this image, the program found instances of trash, PVC poles, and other indications of where houses (and possibly victims) had been swept down river.
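
Again, only a sketch of the idea rather than the student’s implementation: man-made debris tends to have straight edges, so images containing many long straight line segments are worth a second look. This version uses OpenCV’s Canny edge detector and probabilistic Hough transform; all thresholds are assumptions.

```python
# Count long straight line segments in an image as a crude cue for man-made
# (urban) debris. Thresholds are illustrative, not tuned values.
import cv2
import numpy as np

def straight_line_count(path: str) -> int:
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                            minLineLength=40, maxLineGap=5)
    return 0 if lines is None else len(lines)

# Hypothetical usage: flag images likely to contain urban debris.
# import glob
# for f in sorted(glob.glob("flight02/*.JPG")):
#     if straight_line_count(f) > 25:   # threshold is an assumption
#         print("possible debris:", f)
```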


Third place went to Abygail McMillian for a program that visually displayed where and when UAV data, or any geotagged data, was taken. The spatial aspect is important for responders to see where assets have sensed; in Nepal, UAV teams often surveyed the same areas but missed important regions. The temporal aspect (a timeline scale on the map) is important because, with the changes in the river flood stage, hydrological and missing person images become “stale” and flights should be reflown to get new data.

Data display with timeline

The 10 students (nine from under-represented groups) were from Colorado, Georgia, Puerto Rico, South Carolina, Texas, Vermont, and Wyoming and spent the summer conducting research through the Computing for Disasters NSF REU site grant under the direction of Dr. Robin Murphy. Multiple State agencies requested that those student-created apps be hardened and released. In previous years, the panels have awarded prizes primarily to hardware- UAVs, battery systems, communications nodes, etc. This year, the focus was on software and programming to help analyze the data the hardware generates.


How to Fly at Floods: Summer Institute PLUS Roboticists Without Borders UAV training

One word about the floods and about UAVs: informatics. Read on to see what I mean ;-)

As I blogged earlier, on July 26-28 the Center for Emergency Informatics’ 2015 Summer Institute is on flooding and will bring together state agencies and municipalities who were part of the Texas floods with researchers and industry for a two-day workshop and a one-day exercise. The exercise will include UAVs flying the missing persons missions and the recovery and restoration missions.

Notice that it’s the Center for Emergency Informatics hosting the event because it’s about the right data getting to the right agencies or stakeholders at the right time, displayed in the right way, so that they can make the right decisions. UAVs (and marine vehicles such as EMILY and the small airboats being developed at Carnegie Mellon, Texas A&M, and the University of Illinois) have a big role to play. But UAVs are useful only if the entire data-to-decision process works, aka informatics.

The Summer Institute, July 26-28, will also host a training session for Roboticists Without Borders members specifically on UAVs and the best practices of how to fly at floods and upcoming hurricanes and collect useful data– what do the decision makers need? Again, this is the informatics, the science of data, not the aeronautics. The training is independent of platform- because what the decision makers need is what they need ;-) The current (and evolving) best practices are derived from three sources:

  1. CRASAR RWB deployments going back to 2005 Hurricane Katrina Pearl River cresting and including the Oso Mudslides and our deployment with Lone Star UAS Center to the Texas floods,
  2. the reports and analyses of what has worked at typhoons and other flooding events worldwide, and
  3. what researchers throughout the world, especially the IEEE Safety, Security, and Rescue Robotics technical community, are doing.

For example, video is not as useful as high-resolution imagery for searching for missing persons. Infrared isn’t helpful except in the early morning. Some missions and terrains require a remote presence style of control; others can use preprogrammed flight paths. Complex terrains such as river bluffs may require flight paths that are vertical slices, not horizontal slices (see the sketch below). There are many more, and I’m sure we will learn more from each other.
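
To make the “vertical slices” idea concrete, here is a hedged sketch of what such a survey pattern looks like as waypoints: instead of a lawnmower pattern flown over the top of the terrain, the aircraft holds a standoff distance from the bluff face and rasters up and down while stepping along it. The geometry and parameter values are illustrative, not a tested flight plan.

```python
# Generate (along_track_m, altitude_m) waypoints for a vertical-slice survey of
# a bluff face: step along the face, rastering up then down in alternating
# columns. All parameter values are illustrative.
def vertical_slice_waypoints(face_length_m: float, max_alt_m: float,
                             column_spacing_m: float, alt_step_m: float):
    waypoints = []
    ascending = True
    x = 0.0
    while x <= face_length_m:
        alts = [a * alt_step_m for a in range(int(max_alt_m / alt_step_m) + 1)]
        if not ascending:
            alts = alts[::-1]
        waypoints.extend((x, alt) for alt in alts)
        ascending = not ascending
        x += column_spacing_m
    return waypoints

# e.g. a 100 m bluff face, 30 m tall, columns every 10 m, 5 m vertical steps
print(len(vertical_slice_waypoints(100, 30, 10, 5)))  # 77 waypoints
```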

The training session will consist of evening classes on July 26 and 27, with field work on July 28 at the 1,900-acre Riverside Campus. We will fly fixed-wing and rotorcraft for response missions (recon, missing persons, flood mitigation) and for recovery/restoration (damage assessment, debris estimation, household debris estimation, power utility assessment, transportation assessment). The scenarios will be designed by experts from Texas Task Force 1 and representatives from the agencies that would use the information (including fire rescue, law enforcement, the Texas insurance commission, SETRAC, etc.).

It’s not too late to join Roboticists Without Borders and attend! It’s free.

Hope to see you there!