On Friday morning, Jan. 15, 2016, team member Chief Fernando Boiteux (on vacation from his position as head of lifeguards for the LA County Fire Department) deployed EMILY along the northern shore of Lesvos, finding a unique ecological niche for her: the 100-meter “gap” between the beach and water deep enough for lifeguard boats.
“The Gap” is a type of no-man’s land for lifeguards. It’s the area that deeper-draft patrol boats (the Hellenic Coast Guard cutters working the channel between Turkey and Greece and the smaller rigid-hull inflatable boats used by NGOs) cannot enter due to draft restrictions, yet it is too far out for lifeguards on shore to wade and has to be approached by a swimming lifeguard. If a boat capsizes or runs aground, or people fall or misjudge the depth and jump off, the lifeguards in patrol boats are not in position to help. The lifeguards on land have to swim flotation devices out, taking valuable time and risking panicking people trying to climb on their heads.
Another challenge posed by “The Gap” is what happens when multiple boats arrive. Lifeguards on shore have to split their attention and may lose situation awareness of what is going on, especially behind boats or on sides that are blocked from view.
EMILY was able to fill the gap on Thursday by working in the shallow water and providing situation awareness with her cameras for the Hellenic Red Cross and PROACTIVA lifeguard teams on land, who worked tirelessly as nearly a dozen boats arrived at first light along the rocky shore. Once the refugees are on shore, other NGOs get them to shelter.
This video shows EMILY in The Gap and how she gives the lifeguards the ability to keep an eye on multiple boats. Note that one EMILY enabled one lifeguard to watch multiple boats and maintain general situation awareness.
CRASAR and Roboticists Without Borders members are on Lesvos on day 2 of a 10-day deployment to Greece to assist the local Coast Guard and lifeguard organizations in rescuing refugees from drowning. As you may know, over 300 refugees have drowned, with 34 bodies found on Jan. 5, seven of them children. We are deploying two types of robots- the EMILY marine vehicle that is used worldwide to assist lifeguards and a Fotokite tethered aerial camera- plus ruggedized Sonim phones from our Texas A&M sister center, the Internet2 Technology Evaluation Center. This is my 21st disaster and I’ve never seen such a diversity of NGOs working so well together and such a compassionate local population. It is an honor to think that we could provide them with useful tools to do their amazing and heartbreaking work. As you see in the video below, think of EMILY as a combination of a large life preserver and a battery-powered miniature jet ski that a lifeguard can radio control. Based on prior use and talking with the PROEM-AID and PROACTIVA lifeguard teams here, we have identified 4 possible uses for her- 2 of which are standard operating procedures and 2 of which are new challenges posed by the unique situation here. The lessons learned here would be applicable to other marine catastrophes such as cruise ship or ferry sinkings.
Possible Use 1: Getting floatation to victims then pulling them to the rescue boat or shore
This is a standard use of EMILY. As illustrated in the above video, we demonstrated the EMILY robot to the PROEM-AID lifeguard team from Spain- two of whom are shown here as victims hanging on. 5 or more people can hang on. In rescues from a lifeboat (versus the shore), EMILY zooms out with a line because lifeguards can pull her back loaded with people faster than she can jet-ski back. Plus it is much less scary for victims than the wallowing sound, wake, and vibration of the motor.
In this video, Chief Fernando Boiteux of the CRASAR team (in blue) demonstrates using a smartphone to view EMILY’s onboard camera, which can switch between visible light and infrared. A member of the PROACTIVA lifeguard team (red and black) is shown driving EMILY. The lifeguard can direct EMILY to victims beyond easy sight range by using EMILY’s onboard camera.
One area that we hope to collect data on is the use of thermal imagery to help the lifeguard see the victims at night and in high waves. (And our students will be working on algorithms to exploit this new sensor to make EMILY smarter.)
Possible Use 2: Bringing a line to a boat in trouble
It’s straightforward to send EMILY out to a boat in trouble, tell the people to unclip the line (yes, EMILY has two-way audio), and have them tie it to their boat. EMILY does this a lot in the Pacific Northwest, where kayakers get pummeled by waves on the rocky shore and the rescue boat can’t get close enough. Once the line is on the kayaker, the rescue boat hauls it off the rocks while EMILY zooms back out of the way. This may be very useful at Lesvos because parts of the shore are treacherous.
Possible Use 3: “Follow Me”
Given that EMILY has two-way audio, goes 20 MPH, and has a long radio-control range (plus the spiffy flag for visibility), PROACTIVA lifeguards envisioned that when a boat is heading to a bad location, they could use EMILY to guide whoever is piloting the boat to a better beach.
Possible Use 4: Divide and Conquer
The refugee crossings present a new scenario- how to handle a large number of people in the water. Some may be in different levels of distress, elderly or children, or unconscious. One solution is to use EMILY for the people who are still able to grab on while the lifeguards swim to aid the people who need special professional attention. Chief John Sims of Rural/Metro Fire Department, Pima (our 4th team member), anticipates situations where rescuers concentrate on saving children and unconscious victims while sending EMILY to the conscious and responsive people. We are also going to experiment with the Fotokite, which is NOT considered a UAV by aviation agencies. It is a tethered aerial camera originally developed for safe and easy photojournalism- specifically because tethered aircraft like balloons and kites under certain altitudes are not regulated. I was immediately impressed when I saw it at DroneApps last year. It’s a solid technology, and it can be used where small unmanned aerial systems cannot since the flying camera is tethered. One challenge that the lifeguards have is seeing exactly what the situation is and who is in what kind of distress. That could be magnified in the chaos of a capsized boat. Even a 10- or 20-foot view could help rescuers see over the waves and better prioritize their lifesaving actions. I am delighted that Sergei Lupashin and his team scrambled over the holidays to get us one.
Other Notes
We expect to primarily deploy from lifeguard boats that go out to the refugee boats, but perhaps from the beach as well. Note that EMILY doesn’t replace a responder; it is one more tool that they can use. It is a mature technology that has helped responders save lives since 2012 (see https://lifesaving.com/node/2815). I first saw a prototype in 2010, and Tony Mulligan, creator and CEO of Hydronalix, and Chief Fernando Boiteux of the LA County Fire Department brought EMILY to our Summer Institute on Flooding in 2015- and now they are deploying with us. Chief Boiteux is using his vacation days to come and serve as an expert operator. This is another case of proven, mature robot technology that exists but is not getting the attention and adoption it deserves. I hate to make too big a deal about our deployment as we haven’t done anything yet. And it’s not about us, but about helping the selfless work that the Hellenic Coast Guard and NGOs are doing and have been doing through sun and storm, hot and cold. However, this deployment is being funded out of pocket. Even with Roushan Zenooz and Hydronalix donating partial travel costs and Fotokite donating a platform, we are still short. So please consider donating at https://www.gofundme.com/Friends-of-CRASAR to cover the remaining costs (we couldn’t wait any longer). Once we establish the utility of EMILY, we also hope to raise enough additional money to leave an EMILY (or multiple ones) behind.
I was a panelist on The Robot Revolution panel held by the Chicago Council on Global Affairs in conjunction with the Chicago Museum of Science and Industry (and the fantastic exhibit Robot Revolution!) You can see the panel at the YouTube clip here
What struck me was the focus on the ethics of robotics: Are they safe? What about weaponization? Are they going to take away jobs? Elon Musk and Stephen Hawking say they are dangerous. And so on.
I believe that it is unethical not to use existing (and rapidly improving) unmanned systems for emergency preparedness, response, and recovery. The fear of some futuristic version of artificial intelligence is like having discovered penicillin but saying “wait, what if it mutates and eventually leads to penicillin-resistant bacteria 100 years from now” and then starting an unending set of committees and standards processes. Unmanned systems are tools, like fire trucks, ambulances, and safety gear.
Here are just three of the reasons why I believe it is unethical not to use unmanned systems for all aspects of emergency management. These came up when I was being interviewed by TIME Magazine as an Agent of Change for Hyundai (but don’t blame them)!
1. Robots speed up disaster response. The logarithmic heuristic developed by Haas, Kates, and Bowden in 1977 posits that reducing the duration of each phase of disaster response reduces the duration of the next phase by a factor of 10. Thus, reducing the initial response phase by just 1 day reduces the overall time through the three reconstruction phases to complete recovery by up to 1,000 days (or 3 years). Think of what that means in terms of lives saved, people not incurring additional health problems, and the resilience of the economy. While social scientists argue over the exact numbers and phases in the Haas, Kates, and Bowden work, the heuristic still generally holds. If emergency responders finish life-saving, rescues, and mitigation faster, then recovery groups can enter the affected region sooner and restore utilities, repair roads, etc.
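The compounding in that heuristic is easy to make concrete with a toy calculation. The function below is an illustrative sketch only (the three-phase, factor-of-10 structure comes from the heuristic; the function name and specific day counts are my own):

```python
# Illustrative sketch of the Haas, Kates, and Bowden factor-of-10 heuristic:
# each of the three reconstruction phases is roughly 10x longer than the phase
# before it, so time saved in the initial response phase compounds.

def reconstruction_savings(days_saved, phases=3, factor=10):
    """Days saved in the final reconstruction phase when the initial
    response phase is shortened by `days_saved` days."""
    return days_saved * factor ** phases

# Shortening the response phase by just 1 day shortens the third (final)
# reconstruction phase by 1 * 10^3 = 1,000 days, roughly 3 years:
print(reconstruction_savings(1))  # 1000
```

The point of the sketch is simply that the savings multiply through each phase rather than add, which is why a single day of faster response matters so much.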
At the 2011 Tohoku tsunami, we used unmanned marine vehicles to reopen, in 4 hours, a fishing port that was central to the economy of the prefecture. It would have taken divers 2 weeks to manually cover the same area- and more than 6 months before divers could even be scheduled to do the inspection. That delay would have caused the fishing cooperatives to miss the salmon season, the center of their economy.
At the 2014 Oso Mudslides, we used UAVs to provide ESF3 Public Works with geological and hydrological data in 7 hours that would normally take 2-4 days to get from satellites- data at higher resolution than satellites and from better angles than manned helicopters could have provided, if the helicopters could have safely flown those missions. The data was critical in determining how to protect the responders from additional slides and flooding, protect residents and property that had not (yet) been affected, and mitigate damage to salmon fishing.
2. Land, sea, and aerial robots have already been in use since 2001. By my count, unmanned systems have been used in at least 47 disasters in 15 countries. It’s not like these are new technologies. Sure, they are imperfect (I’ve never met a robot that I couldn’t think of something to improve), but they are not going to get any better for emergency response until professionals have them and can use them- there has to be a market, and then market pressure and competition can drive useful advances!
(Note: there is lots of great work going on worldwide and we look forward to working with all companies and researchers to help improve this vital technology)
Researchers at Texas A&M and the University of Nebraska-Lincoln have found that popular software packages for creating photo mosaics of disasters from imagery taken by small unmanned aerial systems (UAS) may contain anomalies that prevent their use for reliably determining if a building is safe to enter or estimating the cost of damage. Small unmanned aerial systems can enable responders to collect imagery faster, cheaper, and at higher resolutions than from satellites or manned aircraft. While software packages are continuously improving, users need to be aware that current versions may not produce reliable results for all situations. The report is a step towards understanding the value of small unmanned aerial systems during the time- and resource-critical initial response phase.
“In general, responders and agencies are using a wide variety of general purpose small UAS, such as fixed-wings or quadrotors, and then running the images through the software to get high resolution mosaics of the area. But the current state of the software suggests that they may not always get the reliability that they expect or need,” said Dr. Robin Murphy, director of the Texas A&M Engineering Experiment Station Center for Robot-Assisted Search and Rescue and the research supervisor of the study. “The alternative is to purchase small UAVs explicitly designed for photogrammetric data collection, which means agencies might have to buy a second general purpose UAV to handle the other missions. We’d like to encourage photogrammetric software development to continue to make advances in working with any set of geo-tagged images and being easier to tune and configure.”
In a late breaking report (see SSRR2015 LBR sarmiento duncan murphy) released at the 13th annual IEEE International Symposium on Safety, Security and Rescue Robotics held at Purdue University, researchers presented results showing that two different photogrammetric packages produced an average of 36 anomalies, or errors, per flight. The researchers identified four types of anomalies impacting damage assessment and structural inspection in general. Until this study, it does not appear that glitches or anomalies had been systematically characterized or discussed in terms of their impact on decision-making for disasters. The team consisted of Traci Sarmiento, a PhD student at Texas A&M; Dr. Brittany Duncan, an assistant professor at the University of Nebraska-Lincoln who participated while a PhD student at Texas A&M; and Dr. Murphy.
The team applied two packages, Agisoft Photoscan, a standard industrial system, and Microsoft ICE, a popular free software package, to the same set of imagery. Both software packages combine hundreds of images into a single high resolution image. They have been used for precision agriculture, pipeline inspection, and amateur photography and are now beginning to be used for structural inspection and disaster damage assessment. The dimensions of, and distances between, objects in the image can be accurately measured to within 4 cm. However, the objects themselves may have glitches, or anomalies, created through the reconstruction process, making it difficult to tell if an object is seriously damaged.
The researchers collected images using an AirRobot 180, a quadrotor used by the US and German militaries, flying over seven disaster props representing different types of building collapses, a train derailment, and rubble at the Texas A&M Engineering Extension Service’s Disaster City®. The team flew five flights over 6 months. The resulting images from each of the five flights were processed with both packages, then inspected for anomalies using the four categories.
Note: this is a long blog with sections on best practices, where SUAS have been used (and for what missions), the flood of data that interferes with making the most of UAS data and how computer vision can help, and unmanned marine vehicles.
CRASAR is standing by to assist with the flooding in Texas with small unmanned aerial systems (UAS/UAV) and unmanned marine vehicles. Johnny Cash’s song “How High The Water Momma” comes to mind. We’ve been working with floods since 2005 and in July offered a class on flying for floods.
The rain is still too heavy to fly in most affected parts. Coitt Kessler of the Austin Fire Department is coordinating the use of small UAS with the State Operations Center and has been working tirelessly since Thursday. CRASAR is offering the Texas A&M team and the UAVRG team at no cost through the Roboticists Without Borders program. We also hope to try out an app for coordinating small UAS from the newest member of Roboticists Without Borders, Akum.
Hey- if you want to volunteer to fly, please do not fly without explicitly coordinating with your local fire department and confirming that they in turn have followed standard procedures and coordinated with the state air operations (this is a standard ICS practice and should only take them a few minutes). Otherwise there may be a repeat of the dangerous situations where a) low-flying helicopters and SUAS work too close to each other and b) the data collected is either the wrong data or never makes it to a decision maker. Dangerous situations happened at the Boulder floods and several times in the Texas Memorial Day floods- they shut down helicopter operations. And remember, it’s hard to become the fire rescue equivalent of a deputy without having met and worked with the fire rescue department- so it may not be realistic to expect to help with this disaster.
Best Practices
Here are links to our best practices for picking UAVs and payloads for disasters:
Small UAVs or UAS have been used at at least 9 disasters that were caused by flooding or had flooding associated with them: Hurricane Katrina 2005 (the first ever use of a small UAS at a disaster, which was by CRASAR), Typhoon Morakot, Taiwan 2009, the Thailand Floods 2011, Typhoon Haiyan, Philippines 2013, the Boulder, Colorado floods 2013, the Oso, Washington Mudslides 2014, the Balkans flooding, Serbia 2014, Cyclone Pam, Vanuatu 2015, and the Texas Memorial Day Floods 2015. CRASAR participated in 3 of the 9 events.
SUAS missions at these floods have been:
situation awareness of the flood, affected transportation, and persons in distress
hydrological assessment- where’s the flooding, what’s the state of the levees, etc.? Texas has levees that impact people (think New Orleans and Katrina) but also livestock. Another use of small UAS is to determine why areas are flooding where they are. In the Balkans flooding, the ICARUS team used their UAS and found an illegal dike that was preventing public works engineers from draining the area.
searching for missing persons presumably swept away- that was the major use of small UAS at the Texas Memorial Day floods
deliver a small line to persons in distress so that they can pull up a heavier line for help- this was also done at the Texas Memorial Day floods
debris estimation in order to speed recovery
SUAS missions that have been proposed but, to the best of my knowledge, never flown at an actual disaster (remember, a patch to anyone who can help me keep the list of deployments up to date!) have been:
home owner and business insurance claims- many insurance carriers are actively exploring this, and it was a big topic at our 2015 Summer Institute on Flooding
carry wireless repeaters—this was actually done with manned aircraft from the Civil Air Patrol during the Memorial Day floods. The greater persistence and distance may keep that in the CAP list of responsibilities
The Flood of Data and the Promise of Computer Vision
The biggest challenge in using UAS is not flying (or regulations) but rather the flood of data. As I noted in my blog on our Summer Institute on Flooding, one of our 20-minute UAS flights for the Texas Memorial Day floods produced over 800 images totaling 1.7GB. There were over a dozen platforms flying daily for two weeks during the floods, as well as Civil Air Patrol and satellite imagery. Most of the imagery was being used to search for missing persons, which means each image has to be inspected manually by at least one person (preferably more). Signs of missing persons are hard to see, as there may be only a few pixels of clothing (victims may be covered in mud or obscured by vegetation and debris) or urban debris (as in, if you see parts of a house, the occupant of the house may be somewhere in the image). Given the multiple agencies and tools, it was hard to pinpoint what data had been collected when (i.e., spatial and temporal complexity) and then access the data by area or time. Essentially no one knew what they had. Agencies and insurance companies had to manually sort through news feeds and public postings, both text and images, to find nuggets of relevant information.
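To make that data volume concrete, here is a rough back-of-envelope sketch. The per-flight figures (roughly 800 images and 1.7 GB per 20-minute flight) come from the post; the number of sorties per platform and the seconds-per-image inspection time are assumptions for illustration only:

```python
# Back-of-envelope sketch of the manual imagery-triage workload described
# above. Per-flight figures are from the post; SECONDS_PER_IMAGE and the
# sortie schedule below are hypothetical assumptions.

IMAGES_PER_FLIGHT = 800
GB_PER_FLIGHT = 1.7
SECONDS_PER_IMAGE = 30   # assumed time for one careful manual inspection

def triage_hours(flights, inspectors=1):
    """Person-hours to manually inspect every image once per inspector."""
    images = flights * IMAGES_PER_FLIGHT
    return images * SECONDS_PER_IMAGE * inspectors / 3600

# A dozen platforms each flying one 20-minute sortie a day for two weeks:
flights = 12 * 14
print(f"{flights * GB_PER_FLIGHT:.0f} GB of imagery")  # ~286 GB
print(f"{triage_hours(flights):.0f} person-hours")     # ~1120 hours per pass
```

Even under these conservative assumptions, a single inspection pass is more than a person-month of work, which is why the computer vision and visualization tools described below matter.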
Students from our NSF Research Experience for Undergraduates on Computing for Disasters and our partners at the University of Maryland and Berkeley led by Prof. Larry Davis created computer vision and machine learning apps during the Texas floods. The apps searched the imagery for signs of missing persons, including debris that might have been washed away with them and piles of debris large enough to contain a victim. The students also created visualization packages to show where the UAS and other assets had been and what data they had collected.
Don’t Forget About Unmanned Marine Vehicles
As I described in a previous blog on Hurricane Patricia, unmanned marine vehicles have been used for hurricane storm surges but not for flooding. They would be of great benefit for inspecting underwater portions of critical infrastructure such as bridges and pipelines. There’s even EMILY, a robotic super-flotation device that can zoom out to where people are trapped.
Hurricanes, typhoons, and cyclones form a category of meteorological events referred to as cyclonic activity. They damage large areas and destroy transportation infrastructure, interfering with the ability of agencies to find and assist people in distress and to restore power, water, and communications, and preventing the delivery of supplies. As I describe in my TED talk, it can take years for a community to recover- the rule of thumb developed by disaster experts Haas, Kates, and Bowden in 1977 is that reducing the duration of each phase of disaster response reduces the duration of the next phase by a factor of 10. Thus, reducing the initial response phase by just 1 day reduces the overall time through the three reconstruction phases to complete recovery by up to 1,000 days. The sooner emergency response agencies can use unmanned systems, the faster they can respond and the faster we can recover from a disaster.
There are three modes or types of small unmanned vehicles, or robots: ground, aerial, and marine systems. Small vehicles have the advantage that they are easy to carry in an SUV or a backpack and deploy on demand when the field teams need them- what the military would call tactical assets. Larger unmanned systems, such as the National Guard flying a Predator to get situation awareness of several counties or provinces, require much more coordination and planning (and expense); these are strategic assets.
Here are four surprises about small unmanned vehicles for cyclonic events (I’ll be adding links to videos throughout the day):
1. Small unmanned ground, aerial, and marine systems have been reported at 7 cyclonic events since the first use at Hurricane Charley in 2004.
These events are Hurricane Charley (USA, 2004), Hurricane Katrina (USA, 2005), Hurricane Wilma (USA, 2005), Hurricane Ike (USA, 2008), Typhoon Morakot (Taiwan, 2009), Typhoon Haiyan (Philippines, 2013), and Cyclone Pam (Vanuatu, 2015).
2. Ground robots are generally not useful.
Ground robots have been used at only 2 of the 7 events: Charley and Katrina. Cyclonic activity tends to damage or destroy swaths of single-story family dwellings, not multi-story commercial buildings. If houses are flattened, the debris is not more than 20 feet deep, so traditional techniques work. If houses or apartments are damaged but standing and there is a concern that people are hurt inside, canines can determine in seconds if a person is inside. A door or window would have to be breached to insert a robot (or a person), which means the apartment would then be open to robbers. We learned that while helping Florida Task Force 3 search the retirement communities in Florida affected by Hurricane Charley in 2004. Florida Task Force 3 did use a robot to enter two apartment buildings that were too dangerously damaged to enter during Hurricane Katrina, but they didn’t have a canine team, which is now generally considered the preferred method.
3. Marine vehicles may be the most useful kind of robot for both response and recovery.
Marine vehicles have been used at only 2 of the events, Hurricane Wilma and Hurricane Ike, but could have been effective for all 7. Hurricanes and typhoons are a double whammy for marine infrastructure- the underwater portions of bridges, seawalls, pipelines, the power grid, and ports. First the event creates storm surges along the coast, then flooding occurs inland and hits the coast again. Bridges and ports can appear to be safe, but the surge and flooding can have scoured the ground from under the pilings, leaving them resting on nothing. Debris can have broken off a piling underwater, creating a hanging pile. This means that transportation routes can be cut off during the response, hampering the movement of responders and the delivery of enough food and supplies to feed a country- which, as at the Haiti earthquake, is normally done with ships. The economy can’t recover until the infrastructure is back in place.
Checking for these conditions is typically done with manual divers, but the conditions are dangerous- the current is still high, the water is cloudy, debris is floating everywhere, and divers often have to resort to feeling for damage. There are few such divers, and it can take months to schedule them, as we saw at the Tohoku tsunami. Marine vehicles, both underwater and on the surface, can be outfitted with acoustic imagers that act as a combination of ultrasound and a camera to check for these conditions. In Japan, we re-opened a port in 4 hours versus the weeks a dive team would have needed- and dive teams would not have been able to start work until six months after the disaster. That six-month delay would have caused the city to miss the salmon fishing season, the big economic driver for the region.
4. Small unmanned aerial systems have been used the most frequently of the three types of robots.
SUAS have been used in all but two of the 7 events: Hurricane Charley and Hurricane Ike. Small UAS were still experimental in 2004 when Hurricane Charley occurred, but the day after our experiences as part of Florida Task Force 3, I called Mike Tamilow at FEMA and offered to make introductions to facilitate use for the next hurricane. Unfortunately it wasn’t until the next year, and several hurricanes later, that SUAS were used at Katrina by us and other teams from the Department of Defense. Despite the success of those deployments, SUAS didn’t really take off (pun intended) until 2011, when the technology had matured and come down in price.
Our thoughts go out to South Carolina and their extreme flooding. We’ve participated in 3 floods, numerous flood exercises, and two summer institutes on flooding.
This blog is divided into sections with some information that we hope may be of use:
History of Use of Small UAVs at floods worldwide
Recommendations for hobbyists/volunteers who want to fly
Missions that have been flown in past floods and the payloads used
Small UAVs have been used at at least 9 disasters that were caused by flooding or had flooding associated with them: Hurricane Katrina 2005* (first reported use of small UAVs), Typhoon Morakot, Taiwan 2009, the Thailand Floods 2011, Typhoon Haiyan, Philippines 2013, the Boulder, Colorado floods 2013, the Oso, Washington Mudslides 2014*, the Balkans flooding, Serbia 2014, Cyclone Pam, Vanuatu 2015, and the 2015 Texas floods*. (* means that CRASAR participated.)
If you are a hobbyist or volunteer and want to fly, some recommendations:
Contact your local fire department and volunteer. Don’t be upset if they decline- it is an extremely busy time for them and hard to add anything new and relatively unknown to their effort. It is actually illegal to self-deploy UAVs- just like showing up at a police incident with a gun. Even if you have a carry permit, you can’t just show up- you need to be trained and deputized in advance.
With your local fire department's permission, contact the local or state air operations. Note that some fire departments or sheriff's offices may not be aware that during many large scale operations, an agency is responsible for coordinating manned aircraft, especially helicopters working at low altitudes and Civil Air Patrol. Even if you have a 333 exemption, you still need to coordinate with air operations so that you don't accidentally interfere with manned helicopters.
Check http://tfr.faa.gov/tfr2/list.html to see if the area is under a Temporary Flight Restriction, which is the aerial version of a highway closure. This is one of those things you learn about when training for a pilot's license, and part of the motivation for the FAA's insistence on at least passing the written private pilot exam.
If you are flying, check out the best practices on the crasar.org home page to see what types of payloads to use for which missions.
Check out the UAViators code of conduct as well for humanitarian use of drones.
Missions small UAVs have been used for, and the payloads used:
Surveillance/reconnaissance/situation awareness for both search and rescue and public works. This is about: Where is the flooding? How bad is it? Are people in distress? What is the state of the transportation infrastructure, such as roads and bridges? Typically this is done with video payloads. Rotorcraft offer the advantage of being able to hover and thus give a sense of how fast the water is flowing.
Examination of levees for signs of overtopping or of seepage indicating incipient collapse. This can be done by visual inspection using video payloads or with a camera payload for photogrammetrics. If you are going to try to create a 2D or 3D photogrammetric reconstruction, you will want GPS-stamped high-resolution imagery.
Missing persons, both living and presumed drowned and tangled in debris. This is done with high-resolution still imagery that is geotagged (if you don't have the GPS stamp, it's hard to direct a team to the right spot). Note that CRASAR has software developed by the NSF REU Computing for Disasters program that uses computer vision to help identify victims in flood debris. It's not yet released for general use, but we can run it internally.
Delivery to trapped people. Keep in mind three concerns with the use of small rotorcraft, and we recommend extreme care when flying near people. The first concern is that hanging things off a small UAV changes the dynamics of the vehicle and how well it can be controlled, so it may behave and move unpredictably; hoisting a fishing line tied to a heavier line tied to the object may be a good way to go. The second is that operators tend to lose depth perception and may get far too close to objects and people. The third is that work by Dr. Brittany Duncan shows that people aren't naturally afraid of rotorcraft and will let them get dangerously close, so a person may be injured by a sudden move of a too-close UAV.
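The geotagging point above can be made concrete: given an image's GPS stamp and a ground team's position, the distance and bearing needed to direct the team follow from standard great-circle formulas. A minimal sketch in Python (the coordinates are made up for illustration):

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Haversine distance (meters) and initial bearing (degrees)
    from point 1 (e.g., the ground team's position) to point 2
    (e.g., a geotagged image's GPS stamp)."""
    R = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return dist, bearing

# Hypothetical coordinates along a flooded river
team = (30.6280, -96.3344)
image = (30.6301, -96.3320)
d, b = distance_and_bearing(*team, *image)
print(f"target is {d:.0f} m away at bearing {b:.0f} degrees")
```

Without the GPS stamp there is nothing to plug into the second argument, which is why a promising frame from an untagged camera is so hard to act on.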
Other applications that have been discussed but not reported at an actual disaster:
Swift water rescue: UAVs providing oversight on floating debris that might jeopardize crews in boats working to rescue trapped people
Restoration and recovery assessment: such as identifying easement and standing water conditions that prevent power utility crews from restoring electricity
Carrying wireless repeater nodes: this has been done by Civil Air Patrol manned aircraft, so the advantage of small UAVs is unclear
Debris estimation: both the debris directly from the flood and the indirect debris a few days or weeks later from people having to rip out sheet rock and carpets. The advances in photogrammetrics make it possible to estimate the volume of debris, provided you have a "before" survey of the area; we flew with PrecisionHawk at the Bennett Landfill superfund site in February in order to estimate the volume of toxic trash (which was on fire) that needed to be safely removed. The next step is to estimate the content, because vegetation and construction materials have to be handled and processed differently.
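The volume estimate itself is simple once photogrammetry has produced "before" and "after" elevation grids: sum the positive elevation change and multiply by the cell area. A hedged sketch with made-up numbers (real pipelines would also co-register the grids and filter noise):

```python
def debris_volume(before, after, cell_size_m):
    """Estimate debris volume (cubic meters) as the summed positive
    elevation change between two co-registered grids, times cell area.
    Grids are nested lists of elevations in meters."""
    cell_area = cell_size_m ** 2
    volume = 0.0
    for row_b, row_a in zip(before, after):
        for zb, za in zip(row_b, row_a):
            rise = za - zb
            if rise > 0:  # only count material added, not erosion
                volume += rise * cell_area
    return volume

# Toy 3x3 grids: a 0.5 m pile appeared on one cell and another cell
# eroded (the erosion is ignored by the positive-change rule)
before = [[10.0, 10.0, 10.0],
          [10.0, 10.0, 10.0],
          [10.0, 10.0, 10.0]]
after  = [[10.0, 10.5, 10.0],
          [10.0, 10.0, 10.0],
          [ 9.8, 10.0, 10.0]]
print(debris_volume(before, after, cell_size_m=2.0))  # 0.5 m * 4 m^2 = 2.0 m^3
```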
In 2011, President Obama remarked “even the smallest act of service, the simplest act of kindness, is a way to honor those we lost, a way to reclaim that spirit of unity that followed 9/11.”
The 9/11 World Trade Center response was the first use of robots at a disaster and paved the way for over 47 deployments of land, sea, and aerial robots in 15 countries. It was a small act of service led by John Blitch, the director and founder of CRASAR, with robots helping to comb the rubble in places people and dogs could not go and places still on fire from the jet fuel. It is in honor of those victims, and of the over 1 million people killed and 2.5 million displaced by disasters each year, that all of us at CRASAR and Roboticists Without Borders continue to promote the use of robots for disaster prevention, response, and recovery.
TEES just concluded a three-day Summer Institute on Floods (July 26-28, 2015), which was hosted by our "mother" center, the Center for Emergency Informatics. The Summer Institute focuses on the data-to-decision problems for a particular type of disaster. This year's Summer Institute was the second in a two-part series on floods and brought together representatives from 12 State agencies, 15 universities, and 5 companies for two days of "this is what we did during the Texas floods" and one day of "this is what we could do or do better" experimentation with small unmanned aerial vehicles, crowd sourcing, computer vision, map-based visualization packages, and mobile phone apps for common operating pictures and data collection.
A portion of the Summer Institute focused strictly on UAVs. The session was organized by the Lone Star UAS Center and the TEES Center for Robot-Assisted Search and Rescue (CRASAR), both of whom flew during the Texas floods.
The UAV field exercises spanned four scenarios witnessed during the floods:
Search of cars and other vehicles swept away by a storm surge for trapped victims;
Search for missing persons who may have fled campsites or been swept down river;
Assessment of damage to power lines and transformers and presence of standing water which prevents restoration of power infrastructure; and
Assessment of damage to houses and estimates of household debris, which is critical to insurance companies estimating damage and agencies such as the American Red Cross in projecting social impacts.
The exercises were staged at the 1,900 acre TAMU Riverside Campus with one fixed wing and three types of rotorcraft flown by CRASAR and David Kovak, a member of CRASAR’s Roboticists Without Borders program.
A major finding of the institute was the realization that while State agencies are adopting UAVs, the agencies can't process all the imagery coming in. For example, a single 20-minute UAV flight by CRASAR at the floods produced over 800 images totaling 1.7GB. There were over a dozen platforms flying daily for two weeks, as well as Civil Air Patrol and satellite imagery. Each image has to be viewed manually for signs of survivors, clothing, or debris that indicate a house or car (and thus possibly a person) was swept to that location.
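Back-of-the-envelope arithmetic makes the backlog vivid. The 800-images-per-flight figure is from the text; the 10-seconds-per-image analyst rate and one-flight-per-platform-per-day are assumptions for illustration, so the campaign total is a rough lower bound:

```python
# Back-of-the-envelope analyst workload for flood imagery review.
images_per_flight = 800   # from a single 20-minute CRASAR flight
seconds_per_image = 10    # assumed time for an analyst to scan one image
platforms = 12            # "over a dozen platforms flying daily"
days = 14                 # "for two weeks"

hours_one_flight = images_per_flight * seconds_per_image / 3600
hours_campaign = hours_one_flight * platforms * days  # one flight per platform per day

print(f"{hours_one_flight:.1f} analyst-hours per flight")
print(f"{hours_campaign:.0f} analyst-hours over the campaign")
```

Even under these conservative assumptions a single flight costs over two analyst-hours, which is why the agencies asked for automation.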
Given the huge demand for automating image processing, it was no surprise that a panel of four judges from the agencies awarded $900 in prizes to three students from the National Science Foundation Research Experiences for Undergraduates (REU) program. The prizes were for software that classified imagery and displayed where it was taken.
First place went to Julia Proft for an application that used computer vision and machine learning to find images from the floods which contained a spectral anomaly such as color. Below is a UAV image flagged with an anomaly, with the anomalies indicated on a parallel image to make it easier to find where in the image they occurred without marking up the original and blocking the view.
Second place went to Matt Hegarty for an application that used computer vision and machine learning to find images from the floods which contained urban debris having straight lines or corners. In this image, the program found instances of trash, PVC poles, and other indications of where houses (and possibly victims) had been swept down river.
Third place went to Abygail McMillian for a program that visually displayed where and when UAV data, or any geotagged data, was taken. The spatial aspect is important for responders to see where assets have already sensed; in Nepal, UAV teams often surveyed the same areas but missed important regions. The temporal aspect (a timeline scale on the map) is important because, with changes in the river flood stage, hydrological and missing-person images become "stale" and flights should be reflown to get new data.
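To give a feel for the spectral-anomaly idea, here is a toy version (not Proft's actual code, which is unreleased): treat each pixel's color as a vector and flag pixels many standard deviations from the image's mean color, the way an orange life jacket stands out against brown floodwater. The per-channel z-score is a crude stand-in for the Mahalanobis distance a real detector might use:

```python
def flag_color_anomalies(pixels, threshold=3.0):
    """Return indices of pixels whose RGB color is far from the
    image's mean color, measured in per-channel standard deviations."""
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    stds = [max((sum((p[c] - means[c]) ** 2 for p in pixels) / n) ** 0.5, 1e-9)
            for c in range(3)]
    flagged = []
    for i, p in enumerate(pixels):
        score = sum(((p[c] - means[c]) / stds[c]) ** 2 for c in range(3)) ** 0.5
        if score > threshold:
            flagged.append(i)
    return flagged

# Fifty muddy-brown "water" pixels plus one bright orange outlier
water = [(120, 100, 80)] * 50
pixels = water + [(255, 120, 0)]
print(flag_color_anomalies(pixels))  # [50] -- only the orange pixel is flagged
```

A real pipeline would run this per region rather than per pixel and would have to cope with sun glint and foam, but the principle is the same: rare colors are worth a human look.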
The 10 students (nine from under-represented groups) were from Colorado, Georgia, Puerto Rico, South Carolina, Texas, Vermont, and Wyoming and spent the summer conducting research through the Computing for Disasters NSF REU site grant under the direction of Dr. Robin Murphy. Multiple State agencies requested that those student-created apps be hardened and released. In previous years, the panels have awarded prizes primarily to hardware: UAVs, battery systems, communications nodes, etc. This year, the focus was on software and programming to help analyze the data the hardware generates.