For Hurricane Matthew: Quick Guide For Agencies Flying Small Unmanned Aerial Systems (SUAS) for Emergencies

The illustrated version in pdf is here.

This quick guide is aimed at helping emergency managers quickly determine how they can exploit small unmanned aerial systems (like quadcopters). The guide covers our best understanding of who can fly, where they can fly, and any additional considerations in planning. Our best practices series has other documents on what kind of data you can expect to get, flight duration, etc., but this guide is about how the new regulations impact emergency managers. It is based on our SUAS deployments since 2005 and lessons learned from deployments by our colleagues.

 

WHO CAN FLY?

 

If members of your agency own a small UAS or have friends with a small UAS, they cannot fly at the disaster- even if they aren’t asking for money. The FAA has repeatedly ruled that a) disasters are a business or government activity and  b) if the UAV flight is a donation to a business or government, it is the same thing as if the business or government agency flew directly.

 

Therefore, the only people/companies who can fly are those with a:

  • Part 107 license. The license is new and many people/companies don’t have these yet.
  • 333 exemption. Essentially a business version of the COA. Many hobbyists declared themselves a company to get a 333.
  • COA. Essentially a government or academic license.

 

Your agency does not have to have the 107, 333, or COA– just formally invite the group to fly on your behalf. If the group has one of the above, there are three important caveats.

 

1. Controlled airspace. They can fly at a disaster in uncontrolled airspace, but will need special permissions for controlled airspace. Keep in mind, many densely populated areas will be in controlled airspace.

 

2.  They have to obey all the flight restrictions for their license, including Temporary Flight Restrictions. Getting permission to fly under a Temporary Flight Restriction does not give them permission to change the rules; it only means that they are now coordinated with the rest of the air traffic, who will expect them to obey the same rules as in normal flights.

 

3. 24-hour notifications before flights may be required. If the group is flying under a 333 or COA, they have to post an online notice of intention to fly in a specific area, called a NOTAM, 24 hours in advance. So if you think you are going to have a group fly, have them declare as soon as you know. There is no downside to filing a NOTAM and then not flying.

 

 

WHERE CAN THEY FLY?

 

For planning purposes there are three types of airspace: uncontrolled, controlled, and TFR. Uncontrolled means they can fly anywhere that is not controlled according to their license. TFR was covered above. That leaves controlled airspace.

 

You can quickly determine if an area you want a group to fly in is in controlled airspace by going to:

 

https://app.airmap.io/

 

and entering the nearest town, then clicking the appropriate boxes. What “controlled airspace” is, and what you have to do to get permission to fly in it, will depend on whether the group has a) a Part 107 license or b) a 333 exemption or COA.

 

a. Determining Part 107 controlled airspace.  If the group has a 107, click on the menu on the left that says Controlled Airspace and “all”. You will get something like this:

 

 

 

 

Anything shaded is controlled airspace. This means that they can fly only IF they have an airspace authorization, which they must have applied for online in advance and gotten approved. Note: the FAA system is backlogged by weeks, so for Matthew it may not be possible to get approval fast enough.

 

b. Determining 333 or COA airspace.

 

Clear airmap and instead click on “blanket COA”. You should get something like this:

 

 

 

Any area in orange means that the airspace is off limits without additional permissions- no matter what altitude you are flying at. The controlled airspace is due to airports. A local group may already have permission to fly in those areas, but may not. If not, permission to fly in controlled airspace on short notice is handled through an Emergency COA (ECOA) process. The process takes about 1 hour to get through the FAA- assuming you have the GPS coordinates of where you want to fly, the COA number, etc.

 

The key is that the tower has to approve the flights (actually, they approve the process of letting them know where you’re flying, when you take off and land, etc.) and the FAA has to agree to the temporary extension of the current license.

 

  • Note about the 333 exemption. ECOAs are granted only to businesses or agencies, not to individuals operating under a “doing business as.” Too many quasi-hobbyists were trying to fly at disasters without working with a response agency.

 

 

ARE THERE ANY OTHER CONSIDERATIONS?

 

There are three considerations:

 

  • Data. The data (images, video) really belongs to your agency and needs to be handled as such. It may contain personal identifying information. Some groups may routinely post videos and images to the web or tweet them, which might not be appropriate. Therefore, you may want to make clear what data management policies apply to flights on your behalf.

 

  • Privacy, state laws, or other regulations, plus public perception. There may be state or local rules that impact the use of SUAS. Regardless, if you have a group flying SUAS for disasters, residents will need to be aware that they are legitimate- plus the teams will be magnets for residents asking for help or assistance. So you will probably want to plan to have an agency representative in uniform or vest with the team.

 

  • Some SUAS may be software-disabled from flying in TFR areas. DJI Phantom 3s and Inspires, which are very common, are now disabled by the manufacturer when a TFR is in place. So that may be something to discuss with your SUAS team. DJI does have a procedure that allows agencies to override the software and fly up to 1.5 nautical miles from an airport, trusting the group to have obtained permissions.

Unmanned Systems and Hurricane Matthew: Lessons from 2010 Haiti Earthquake

As Hurricane Matthew approaches Haiti, it is hard not to think of the terrible devastation from the 2010 earthquake. The Haiti earthquake taught us some valuable lessons about the use of unmanned systems for the initial response to a disaster- that 0-24 hour period where emergency managers are trying to get an accurate assessment of the scope of the disaster and how to allocate resources to save lives immediately and mitigate any dangers, and to set in motion the plans and resources needed to protect lives and quality of life for the longer term. One key lesson is that bigger is better, at least for the initial aerial assessment. Another is to not forget about unmanned marine systems. These two lessons show up in other events such as other hurricanes and tsunamis. A lesson that did not come out of Haiti was that the effective use of unmanned systems in the 0-24 hour time period depends on communications. UAVs generate terabytes of imagery that are difficult to upload to the Cloud or file transfer/email to others.

 

In terms of unmanned aerial vehicles, Haiti makes an interesting case study. The Haitian government quickly put out an aviation notice that UAVs were prohibited. Period. That actually made sense given that there would be a lot of helicopters working at low altitudes, general air traffic control was complex enough, and UAV coordination with air traffic control was still being worked out (and as of 2016 it’s still not 100% resolved). What was interesting was that the US Government put up a Global Hawk (see Peterson, Handbook of Surveillance Technologies, 3rd Edition) which provided aerial assessments of the extent of the damage without entering Haitian airspace, and two weeks later Predators were being used and coordinated with manned air traffic (see http://northshorejournal.org/high-tech-warbird-aids-haiti-relief-efforts). While on one Snowden-we-are-being-watched level it may be disturbing to have drones able to see into other countries without violating their airspace, on another level it is wonderful. Emergency workers can get data without having to totally rework how multiple government agencies coordinate. The most important aspect of the use of military drones is that it illustrates that agencies need higher-altitude, longer-persistence UAVs for geographically distributed disasters in order to get rapid coverage of the damage (area X needs help) and the state of the infrastructure (what is the best route to get resources there?). As we have seen with flooding in the US (we have a paper about to come out on this), small hobbyist-style UAVs are like flashlights illuminating small patches, while military drones are stadium lighting. Of course, big drones or Civil Air Patrol assets may not be available. This leads to questions of whether small hobbyist quadcopters can contribute, how to aggregate the data from hobbyists and send it (especially under low-bandwidth conditions), and how agencies can handle the volumes of data and trust the data they are getting. These are some of the issues raised in my article at https://www.computer.org/csdl/mags/co/2016/05/mco2016050019-abs.html

 

The second lesson from Haiti in terms of unmanned systems is to not forget the value of unmanned marine vehicles. If the hurricane brings intensive flooding or high storm surges, then the underwater portions of the critical infrastructure are at risk. This means bridges (I’ll never forget crossing the bridge into Punta Gorda for Hurricane Charley and the team being told not to stop on the bridge because there was no way to know how safe the bridge was). Bridges are important, but so are ports and shipping channels. It also means pipelines, which can be leaking and affecting the environment, and telecommunications (the 2015 Texas Memorial Day floods washed away the bridge and the telephone lines to Wimberley). In Haiti, the state of the ship channel was unknown (had any depths changed?), as was the state of the port (could it take the weight of cargo being unloaded onto the docks?). The traditional approach has been to use divers, but in Haiti, the Navy and Army MDSU 2 team used SeaBotix ROVs to speed up the assessment as noted in Disaster Robotics (https://mitpress.mit.edu/books/disaster-robotics).

 

Disaster Robotics has more information about unmanned systems at the 2010 Haitian earthquake.

The Legacy of 9/11 for Disaster Robotics

[youtube]https://youtu.be/1DYi7_lE6cA[/youtube]

I’ll be speaking at the Smithsonian Museum of American History today as part of the 15th Anniversary of 9/11 event. 9/11 was the first reported use of robots for search and rescue and created a legacy that continues to grow for both disaster response and for science and technology. The robots were successful by any standard for rating search and rescue tools- improved performance over existing tools, frequency of use, and acceptance by professionals. They didn’t find any survivors, but neither did anyone else as there were sadly no survivors to find.
I will never forget my time at the World Trade Center as a responder, a scientist, or as a person. The infinite sadness of such an event still haunts me. The Seamus Heaney poem I read in the NYT and quoted in my IAAI talk:
And we all knew one thing by being there.
The space we stood around had been emptied
Into us to keep, it penetrated
Clearances that suddenly stood open.
High cries were felled and a pure change happened.
I believe the many members of the CRASAR team at the World Trade Center and since have kept the memories and have enabled a pure change- as witnessed by the use of robots at at least 50 disasters worldwide. While the robots are a very small story among the amazing stories of loss and triumph, I am proud to tell the story and add to the history of how 9/11  has made waves and ripples in history.

Legacy for Disaster Robotics

9/11 was an existence proof that small robots could be of significant use searching in rubble, reaching places that people and dogs could not, and penetrating two to three times farther than cameras on poles, which were the nearest similar tool. Large heavy robots had been developed for bomb squads but they were too big and heavy to be used in rubble, as seen at the Oklahoma City bombing. Red Whittaker at CMU had built even larger and heavier robots for the Chernobyl and Three Mile Island nuclear accidents for the recovery operations, not for immediate search and rescue.

 

Small robots, ranging in size from a shoe box to a carry-on suitcase, that could be carried in one or two backpacks, had been under development by the pipeline and sewer inspection industry and the DARPA Tactical Mobile Robots program directed by John Blitch, the founding director of CRASAR.  If you look at the DARPA TMR logo you see that there is the “urban terrain” of cities but also a rubble pile, because John was thinking of dual use. He had been at the Oklahoma City bombing and had changed his MS thesis topic to robots for disasters (I was his co-advisor).

 

The robots were used starting shortly after midnight on 9/12 through 9/21 for search and rescue by FDNY, INTF1, OHTF1, PATF1,  and VATF1 and then again from 9/23 through 10/2 for recovery operations (structural inspection of the slurry wall) by the NY Department of Design and Construction when the last robot on site wore out.

 

By my count, robots have been used in 49 disasters since then in 17 countries. 24 of those disasters used UGVs- with the majority using the robot models from 9/11: Inuktuns (ex. mine disasters, building collapses), Packbots (ex. Christchurch for searching the cathedral, Fukushima Daiichi), and Talons (ex. Fukushima Daiichi). See Disaster Robotics for more details.

Legacy for Robotics

9/11 created a legacy for robotics in two ways. Search and rescue is often cited as a motivation for new advances in robotics; if you're a doctor, you often say you want to cure cancer, and if you're a roboticist, you often say you want to help with search and rescue.
One is that it created a new subfield of robotics. The IEEE Robotics and Automation Society, the largest and most prestigious professional organization devoted to robotics, has a technical committee on Safety, Security, and Rescue Robotics with an annual international symposium that started in 2002 (I was a co-founder of the TC and symposium). Both the European Union and Japan are investing heavily in small disaster robots- including sensors and user interfaces- with multiple projects being funded in the $20M to $35M range (the US doesn't have dedicated programs for funding robotics projects at that level). The SSRR field now includes UAVs, UMVs, and many innovations in ground robots that can crawl and burrow into rubble.

 

9/11 also uncovered technical challenges that the R&D community is still struggling with. Probably the most significant discovery was that remote presence, or teleoperation, is actually the preferred mode of control for almost every response task- even with UAVs and UMVs.  Because the time pressure is so great and because disasters always have a surprise, the responders want to see in real-time what the robot is seeing and be able to opportunistically change the plan (“wait— what’s that? Let’s look over there..”). Up until 9/11, researchers and developers had assumed that all robots should be taskable agents- you would tell it what to do, it would go off and do it, and then come back- and that remote presence was used only because we hadn’t created autonomous programs. Now there is the realization that many applications, not just search and rescue, require the human and robot to work together in a joint cognitive system to get the job done.

 

The second most significant discovery is what Jenny Burke would later describe in her PhD thesis: 2 heads are 9 times better than 1. Up until 9/11, researchers and developers had assumed that 1 person could operate a robot successfully and thus the real challenge was for 1 person to drive 2 or more robots. We had had signs prior to 9/11 that 1 person couldn’t drive a robot in rubble and look at the same time- as one of my grad students who later went with us to 9/11, Jenn Casper, documented, they could do it, but they could literally roll past a victim in front of the robot (we started seeing this in exercises with FLTF3 in buildings that were being demolished). The cognitive challenges of thinking like a ferret or meerkat (the size of the robot) were bigger than anyone had expected, and rubble is deconstructed and hard to mentally sort out. Two heads make sense in a way- if you are in a new town driving around in traffic and looking for a particular address you’ve never been to, it helps to have a passenger in the car who is looking too.

 

Italian Earthquake: Recommendations for using ground and aerial robots for immediate lifesaving

Our thoughts and prayers go out to the Italian people impacted by the earthquake. We’ve reached out to colleagues in Italy in case any of us here can be of assistance. Below is a general overview of what might be useful and why.

From the scanty news reports in the US, my guess is that this event will favor the use of small tethered ground robots for locating survivors in rubble, based on the case studies from the 9/11 World Trade Center, Cologne Archives collapse, Berkman Plaza collapse, Prospect Towers collapse, L’Aquila earthquake, Mirandola earthquake, and multiple mine disasters worldwide (see Disaster Robotics, MIT Press, 2014 for those case studies). UAVs may be of value in estimating extent, ascertaining whether roads are open or can be easily cleared to allow responders rapid access, and general damage assessment and recovery operations (as per Nepal and Chile), but probably not for direct life saving- though I could be wrong.

“Small”  as in pipe inspection robots- not a bomb squad robot like the Packbots used at Fukushima- because if a person or dog could get into a void to reach a trapped person, they probably would despite the personal risk. A tether is useful because it solves wireless and power problems- but more importantly any entry would likely be from the top of the structure or the upper parts, so the robot has to rappel down.

A video camera, color, is essential. Thermal cameras may be of use initially but are very hard to use for navigation in confined spaces. So I wouldn’t recommend thermal by itself, rather as a second camera. The value of a thermal camera goes away after a few days because decomposing bodies also present a heat signature. Navigation gets harder as small protrusions become the same temperature as the surroundings.

A robot with 2-way audio will be valuable because the operator can call out and listen for sounds of survivors, then medical experts can talk with the victims. But even just a speaker or a microphone by itself can be useful.

Should someone find a survivor, a small tube can be attached to the robot to provide water to a trapped victim- hook up the end to an aquarium or koi pond pump. (This is a great solution worked out by Eric Rasmussen and tested with the USMC CBIRF unit.) The robot can probably maneuver and bring small payloads- a radio, a space blanket, power bars (assuming they aren’t severely injured). LACoFD does so many confined space rescues that they have a kit the size of a Pringles can to give to people trapped in caves and culverts.

Missions, Choice of sUAS Platforms, and Manpower for Flying Floods: Lessons learned from our deployment with Fort Bend County May 30-31, 2016

[youtube]https://youtu.be/Mt_dJN3rmSc[/youtube]CRASAR was in the field on Monday and Tuesday (May 30-31, 2016) at the request of the Fort Bend County Office of Emergency Management, with Roboticists Without Borders member CartoFusion providing off-site data support.  We flew a small UAS (DJI Phantom 3 Pro) throughout the county to help them validate their flood inundation models, conduct hydrological forensics, and educate the public on why evacuations were necessary. This is our third response to flooding this year (Louisiana in March, Fort Bend County in April, and this event) and we continue to learn about missions, selection of sUAS, and crew organization and CONOPS, building on my previous recommendations for flooding.  I am working on a formal cognitive work analysis using the shared roles model, but some preliminary lessons that may be of use to other teams are discussed below. We’d like to especially thank Jeff Braun, Lach Mullen, Adam Wright, and Juling Bao from Fort Bend County. Traci Sarmiento served as VSO both days, Xiaosu Xiao and Grant Wilde took turns being data manager, and I was the pilot.

 

Check out some of the YouTube clips (some of which were on the CBS news):
RWB member Hydronalix also offered Texas Task Force 1 the use of five EMILY swiftwater rescue robots that we used in Greece (see the NPR story here), but they weren’t needed.

Missions

Six distinct missions have emerged from working with emergency managers:
  • property damage assessment: can sUAS help document the number of houses damaged and the amount of damage in terms of height of water into the houses while there is still flooding in order to qualify the area for disaster assistance? Our experience in Louisiana suggests that UAS of any size are not a good fit because the UAS cannot see an 18 inch high water line on houses in a subdivision crowded with trees or cover much within LOS. This mission appears to be a better match for a robot boat which can zoom down the flooded streets.
  • flood mapping and projection of impact: can sUAS help document the extent of the flood, the impact on residents, roads, levees, etc.? sUAS appear to have advantages over manned aircraft for forested regions where the platform can operate safely at lower altitudes and hover and stare to detect flowing water in between trees. An expert can use the sUAS to identify possible causes of unexpected flooding and possible mitigations.
  • verification of flood inundation models
  • flood monitoring over time: This is related to flood inundation modeling and one of the reasons why Fort Bend County had us fly multiple days.
  • justification for publicly accountable decisions: The documentation of flooding is useful for future land use planning. Fort Bend County was particularly interested in capturing compelling video of the severely flooded western part of the county to show residents in the eastern part of the county, which had not yet been flooded, so that they could see why evacuations were mandated.
  • public information: Fort Bend County immediately posted the video to YouTube and began pointing citizens to the video to answer questions about their neighborhood. One of the neighborhoods filmed had an assisted living facility and relatives calling into the OEM were directed to look at the video and see that the flooding wasn’t going to impact their family member.
Flood mapping, verification of flood models, and flood monitoring are not new missions and have been known for some time. Property damage assessment has always been a projected use of sUAS for floods, though we did not expect to encounter the practical problems introduced by trees, trees, and more trees. People like trees in their yards.

Choosing the type of sUAS

The majority of missions for sUAS (versus larger UAS that can fly higher and longer) are expert-in-the-loop missions (more formally called remote presence), where the expert wants to be able to view the video and then direct the sUAS to a better view. It is not clear that orthomosaics and digital elevation maps (DEM) are a priority. The emergency managers generally have DEMs already (though they may be outdated) and the area that they want to look at is so large that a sUAS team is unlikely to reach all of the areas if restricted to line of sight operations— this is where larger UAS or Civil Air Patrol can be of great value. Another problem is that the file size of orthomosaics is unwieldy for OEMs to handle, share, and post among themselves. Not that many managers have laptops that can handle a 55GB file, and the upload times are slow.

We have converged on quadcopters being the default platform because of expert-in-the-loop flying and because of the physical constraints of landing zones, though we don’t rule out fixed wing. In the field there is limited access to the area because of the flooding. Access points such as raised roads or levees also had high tension powerlines which can induce interference- indeed, we had one “whoa!” takeoff next to powerlines. (We won’t discuss the creepy horde of swarming insects making it unsafe to stand in one potential site, or the fire ant bites I am sporting from a misstep at another site.)  Flying near rivers or from residential areas is hard because of trees and power lines. Empty lots without trees are rare, especially in older and urban parts of town, though suburban areas may have soccer fields suitable for launching and recovering fixed-wing sUAS.

Manpower: Crew Organization and CONOPS

This is my area of research, so it is always of interest to me! As noted in previous blogs, my TED talk, and papers, it’s the data that is the barrier to adoption. We’ve converged on a 4-person field team plus a dedicated data management team back at the base to handle the data. Gee, that sounds like a lot, doesn’t it? Well, it is better than accidentally writing over data or taking extra hours to get data products to the OEM.

 

Let me explain about the field team. I see four major roles on the field team, which leads to 4 people:
  • pilot, who is in charge of the sUAS
  • visual safety officer, who is not allowed to look at the pilot’s display or do anything but keep eyes on the sUAS and sky (and given the number of manned aircraft zipping by at low altitudes, that is an important and full-time job)
  • agency expert, who actually knows what to look for and to opportunistically direct the flight
  • data manager, who immediately backs up the data (hard lesson learned at the 9/11 World Trade Center robot deployment) and makes sure all data is logged and stored for immediate hand-off to the OEM (want to give them that thumb drive as soon as sneaker net permits)

 

The pilot and VSO have to be dedicated roles, held by different people. In the future, one person could share the pilot and agency expert roles, but for now it seems unreasonable to expect a flood inundation engineer to also be proficient with sUAS. One person can’t share the agency expert and VSO roles because then the person is sometimes looking at the display and sometimes at the sUAS, which is not permitted by regulations and a bazillion safety studies.

 

There’s room for debate about having an agency expert- RWB member David Kovar points out that requiring an agency expert can hold up getting to the field. But in my experience, if the flights are exploratory, opportunistic, and require expertise, it is more effective in the long run to have them there. Not all missions for all disasters are remote presence, but for the missions that have emerged for the floods so far, this appears to be reasonable.
By the way, having an agency expert is valuable to the agency— citizens come up and ask questions that we can’t answer, and they also like seeing that their officials are out there doing proactive things. It also helps with legitimacy. As a woman with a woman safety officer, we look more like a team of news people trying to sneak in and get footage than engineers operating under the official request of the county.

 

Back to roles, we’ve typically tried to keep it to a 3-person team by having the person serving as the VSO also serve as the data manager. Doubling up on roles seems like a great idea: the VSO takes notes, then when the platform lands can take the data, make sure the files aren’t corrupted, and, as the team drives to the next site, make backups, put files in folders with names more helpful than, say, “DCIM”, and so on.
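
To make the data manager's job concrete, here is a minimal sketch of the kind of backup step we have in mind, written in Python; the folder naming convention and the paths are hypothetical illustrations, not the actual procedure we use in the field.

```python
import hashlib
import shutil
from datetime import datetime
from pathlib import Path

def sha256(path: Path) -> str:
    """Checksum a file so the copy can be verified before the card is reused."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def backup_flight(card_dir: str, backup_root: str, site: str, flight: int) -> Path:
    """Copy everything from the sUAS memory card into a descriptively named
    folder (more helpful than 'DCIM') and verify every file made it intact."""
    stamp = datetime.now().strftime("%Y%m%d_%H%M")
    dest = Path(backup_root) / f"{stamp}_{site}_flight{flight:02d}"
    dest.mkdir(parents=True, exist_ok=False)
    for src in sorted(Path(card_dir).rglob("*")):
        if src.is_file():
            copy = dest / src.name
            shutil.copy2(src, copy)
            if sha256(src) != sha256(copy):
                raise IOError(f"Checksum mismatch on {src.name}; do not wipe the card")
    return dest

# Hypothetical usage: back up flight 3 from a levee site before driving on.
# backup_flight("/Volumes/PHANTOM3/DCIM", "/Volumes/FIELD_BACKUP", "levee_site", 3)
```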

 

Except it never happens that way- which could be me. The VSO starts the process and then, before everything is done, has to stop in the middle because of a) carsickness, b) it’s too bumpy to type, c) we’re at the next site and the VSO has to do a safety check and help set up, or d) all of the above. The pace just exceeds capacity. In Louisiana, we wound up sorting through a wad of video and imagery, requiring about 3 hours of extra effort at the end of a very long day. You can’t dump this on the agency person (“look, you’re now the data manager, follow this to-do list! And remember, we’re here to help you!”) because they are busy making their own notes, talking to the OEM about what they are seeing, and talking to residents.

 

Bringing a fourth person along to do nothing but sit in the truck with the air conditioning on and manage the data was a huge win on this last deployment. But again, as David points out, a well-honed team can do this. I suspect that teams that do not fly together or do not fly these types of rapid-fire missions (called ad hoc teams in industrial psychology) will need the fourth person, and truly high-performing teams will be able to combine roles.

 

But if you are handling data in the field, why do you need a data management team back at the base? Well, someone needs to create visualizations such as SituMap and summaries of where the data is from, edit high-value snippets, upload data over a faster internet connection, etc. The field data manager doesn’t have access to the faster internet, can barely keep up with the pace as it is so doesn’t have time to do snippets, and needs to head right back out to the field. Noooo, the OEM doesn’t do this. They are too busy to handle the data themselves. Their solution to data management is to shout, “pause it there!” and then whip out their cellphone to video the video that is playing on the screen so that they can email it and post it internally (an Ewok approach to snippeting and reducing resolution). They are trying to USE the information, not process it themselves. So while having Xiaosu or Grant come with us in the field was great, they couldn’t get it all done and we did not even try to work in the visualization software or photogrammetrics.
Again, in theory, agencies have people dealing with ESRI and Big Files, but I have encountered this capacity once, maybe twice, in my deployments. And in that case, the need to use these products caused a huge problem between the tactical responders and engineers and the people back in the emergency operations center.  So I still see a need for “immediate and intermediate” data processing.

CRASAR small UAS Assisted Fort Bend OEM with Determining Flooding

[youtube]https://youtu.be/m3Eahio__mI[/youtube]Texas A&M, US Datawing, USAA, Donan, and CartoFusion Technologies donated manned and unmanned aerial system flights and advanced visualization for the Fort Bend Office of Emergency Management on April 20 and 24, 2016. The flights and expertise were donated to the county as part of the Texas A&M Engineering Experiment Station Center for Robot-Assisted Search and Rescue’s Roboticists Without Borders program. The program facilitates companies and researchers collaborating with emergency professionals. This was the third flooding event that CRASAR has flown small UAS at in the past year and the 23rd response since the 2001 World Trade Center disaster.

The team flew two different small UAS for a total of 10 flights covering approximately 1,000 acres in six different areas of Fort Bend County that were inaccessible. “We had some areas that had never flooded before and we needed to see why they were flooding,” said Adam Wright, project coordinator for Fort Bend County drainage. “There were other areas that have flooded in the past that we needed a better visual on to determine the cause or extent.”

In addition, US Datawing, a San Antonio aerial analytics company, USAA, the insurance and financial services company, and Donan, a national forensic engineering consulting firm based in Kentucky, shared the costs of manned aircraft to fly Bessie Creek and Barker Reservoir. “We think of small UAS as one tier on a pyramid of aerial imagery assets that go from UAS to manned aircraft to satellites,” said Justin Adams of US Datawing, CRASAR’s lead pilot. “Fort Bend and the surrounding counties needed this bigger picture of the flood.” The manned flights also provide a baseline of high-resolution imagery to compare with the data from small UAS, which effectively can cover only about 0.5 miles under current FAA rules. The 10 flights were completed in just over 2 hours of flight time.

“The CRASAR partnership offered us the ability to access tools that were beyond our capabilities in-house at this point, utilizing the advanced image processing, equipment, and technology,” said Lach Mullen, a planner with Fort Bend County OEM.

“The point isn’t to use small UAS for everything but rather to get the right information to the right people,” added Dr. Robin Murphy, director of CRASAR. “We continue to learn about when to use small UAS versus manned aircraft and how to quickly prioritize and process the imagery- which is essential because the flights generated nearly half a terabyte of data.”

As part of the push to get the right data to the county, CartoFusion supplied SituMap, a software package that allowed easy overlaying of UAS imagery onto maps so that officials could quickly identify the location and extent of flood damage. It also showed where each image was taken. County drainage experts could scroll through the over 5,000 images collected by the UAS and manned aircraft, click on an image, and the location where it was taken would appear on the map. CartoFusion is a start-up company spun out of Texas A&M University Corpus Christi. The development of SituMap has been shaped in part by the experiences of previous deployments with CRASAR, including the 2011 Japanese tsunami.
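
SituMap is CartoFusion's product and its internals are theirs, but the underlying link between an image and a point on the map comes from the GPS tags the aircraft writes into each photo's EXIF header. A minimal sketch of reading those tags with the Pillow library (the file name is hypothetical):

```python
from PIL import Image
from PIL.ExifTags import GPSTAGS

def image_location(path):
    """Return (lat, lon) in decimal degrees from a geotagged JPEG, or None."""
    exif = Image.open(path)._getexif() or {}
    gps_raw = exif.get(34853)  # 34853 is the EXIF GPSInfo tag
    if not gps_raw:
        return None
    gps = {GPSTAGS.get(k, k): v for k, v in gps_raw.items()}

    def to_degrees(dms, ref):
        # Each of degrees/minutes/seconds may be a rational (num, den) pair
        # or a Pillow IFDRational, depending on the Pillow version.
        vals = [float(v[0]) / float(v[1]) if isinstance(v, tuple) else float(v)
                for v in dms]
        deg = vals[0] + vals[1] / 60.0 + vals[2] / 3600.0
        return -deg if ref in ("S", "W", b"S", b"W") else deg

    lat = to_degrees(gps["GPSLatitude"], gps["GPSLatitudeRef"])
    lon = to_degrees(gps["GPSLongitude"], gps["GPSLongitudeRef"])
    return lat, lon

# Hypothetical usage: report where a single frame was taken.
# print(image_location("DJI_0042.JPG"))
```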

CRASAR also fielded the EMILY robot boat used in swift water rescue and evaluated whether it would be useful for reaching flooded areas in dense tree cover that would block the view from a UAS or manned aircraft. EMILY was developed by Hydronalix, another Roboticists Without Borders member, and deployed to assist with lifeguarding the influx of Syrian refugees on rickety boats into Greece.

Japanese earthquake: how ground, aerial, and marine robots could be used for response

We have been watching with distress the earthquake in Japan and offered any assistance we could provide- however, the first 48 hours are critical for life saving. The International Rescue System Institute (Tohoku University) and the Center for Robot-Assisted Search and Rescue (Texas A&M University) are the only two centers devoted to disaster robotics, and we work together, so there is considerable expertise available in Japan.

See below for how ground, aerial, and marine robots can be used and best practices are on the home page. Disaster Robotics has 34 case studies worldwide of how these robots have been used at previous earthquakes and disasters through 2013.

I’ll be adding photos and video as I get a chance– this weekend is Aggies Invent: First Responders, which we are sponsoring, and there are two exciting projects based on needs CRASAR identified (there are 12 others submitted by other response agencies).

Ground robots for locating survivors inside the rubble and speeding up extrication.

Canines typically find survivors but can’t precisely locate where the survivors are. Plus, dogs can’t provide the “inside view” of the pile of pixie sticks that the extrication team has to be careful not to disturb.  People and canines often can’t get into the rubble because there is often not even a person- or dog-sized hole that goes all the way from the surface to the interior. Existing borescopes and cameras on wands can reach about 18 feet or 6 meters into the pile, which means standard US&R equipment is sufficient for single-family homes but not apartment buildings or multi-story commercial buildings, which are bigger and deeper.

In those cases, small robots, the size of a lunchbox or smaller, have been used since 2001 (CRASAR at the 9/11 World Trade Center) to go farther inside the rubble to where survivors might be, providing the “two for one” of letting the structural specialist visualize how best to remove the rubble for extrication. Dr. Tadokoro’s group has one of my favorite small robots, the Active Scope Camera, that we used together at the Jacksonville Berkman Plaza II collapse. It’s a 6 meter long “caterpillar” robot that can fit in 5cm voids.
Big robots like those used at Fukushima are less valuable because the voids are smaller and the robots can’t move rubble without risking triggering a secondary collapse that will kill the survivors.

UAVs for general reconnaissance and structural inspection.

UAVs have been used since 2005 for disaster response (yes, starting with CRASAR at Hurricane Katrina). The most common uses have been small UAVs for general reconnaissance and for structural inspection. With photogrammetrics, small UAVs are providing geospatial data that are of value to the geologists and public works groups trying to prevent floods, slides, and further collapses. In general, small UAVs are used more frequently because formal responders like the police or fire rescue have access to helicopters and planes. In more remote areas there may be less coverage, so local assets are important. See best practices for UAVs.
One important lesson from the 3/11 earthquake was that the number one place to check to see if it is OK and functioning is a hospital!

UMVs for critical underwater infrastructure inspection and reopening ports.

Unmanned marine vehicles, especially ROVs and miniature boats, have been used since 2005 to inspect bridges and reopen ports immediately after an earthquake so that responders can gain access to the affected areas AND get supplies to the hard-hit areas. The value of UMVs extends well into the recovery period, both for inspection and to help remap fishing and shipping channels.

Robot Assistant Lifeguard: Update

Exciting things continue to happen with EMILY- there’s an improved EMILY, a team of computer science, aerospace, and industrial engineering students is working on smartEMILY, and 37 undergraduates in senior capstone design are working on Computing for Disasters topics! Tony Mulligan, CEO of Hydronalix, creator of EMILY, and Roboticists Without Borders member, is heading back to Greece this weekend to check in with the teams and we look forward to his updates.

Everything is going great– except that 410 refugees  have died so far this year and the resort-based tourism economy of Lesvos has been wrecked. Our thoughts and prayers go out to the refugees, the generous and kind citizens of Lesvos, and to the NGOs who continue to do the best they can.

Improved EMILY

Improved EMILY with camera and lights inside the float cover instead of on top

EMILY has been improved. Notice that her video and thermal cameras are now mounted flush so that if a large number of refugees need to hang on to her, they won’t try to grab and break the camera.

The Hellenic Coast Guard loves their EMILY so much, she’s on their Wikipedia page! Check out https://en.wikipedia.org/wiki/Hellenic_Coast_Guard

smartEMILY

 

Back here in Texas,  we are continuing the theme of participatory research, engaging graduate and undergraduate students in generating new concepts for lifeguard assistant robots:

smartEMILY. The students in my CSCE 635 Introduction to AI Robotics class are working on making EMILY easier to use. As I wrote in my 1/12/2016 blog, “The refugee crossings present a new scenario- how to handle a large number of people in the water. Some may be in different levels of distress, elderly or children, or unconscious. One solution is to use EMILY to go to the people who are still able to grab on, while the lifeguards swim to aid the people who need special professional attention. Chief John Sims from Rural/Metro Fire Department, Pima, (our 4th team member) is anticipating situations where rescuers can concentrate on saving children and unconscious victims while sending EMILY to the conscious and responsive people.” We’re calling this idea “smartEMILY” and the students from computer science, aerospace engineering, and industrial engineering are designing the artificial intelligence needed for robust operation. I can’t wait to test it on EMILY in April.

 

Computing for Disasters

Two of the projects by undergraduate students in our CSCE 482 Senior Capstone design class on “computing for disasters” are also related to EMILY, and two others are on other aspects of humanitarian work.

Dr. Zoi and her colleague trying out EMILY’s two way audio during trials with the Hellenic Coast Guard

One project was inspired by our meeting with Dr. Zoi Livaditou https://m.facebook.com/zoi.livaditou who is working with the Hellenic Coast Guard. Dr. Livaditou, a medical doctor, has a cassette tape of directions to play over a megaphone to the refugees in their language—yes, a cassette tape. She was so excited at the idea of using EMILY’s two-way radio to play her taped phrases. Three groups of students (EMILYlingo, Fast Phrase, and Team Dragon) are working on a smartphone app with which she can get different speakers to record phrases in different languages and then easily call them up. It should be faster to find the right phrase, easier to add phrases, and far more convenient.
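
I don't have the students' designs, but the core of such an app is just a catalog of pre-recorded clips keyed by language and phrase so the right one can be called up in a second or two. A minimal sketch (the phrase labels, file names, and use of the ffplay command-line player are all assumptions for illustration, not the students' implementation):

```python
import subprocess
from pathlib import Path

# Hypothetical catalog: language -> phrase label -> pre-recorded audio clip.
PHRASES = {
    "arabic": {
        "hold_on": Path("recordings/ar_hold_on.mp3"),
        "help_is_coming": Path("recordings/ar_help_is_coming.mp3"),
    },
    "farsi": {
        "hold_on": Path("recordings/fa_hold_on.mp3"),
        "help_is_coming": Path("recordings/fa_help_is_coming.mp3"),
    },
}

def play(language: str, label: str) -> None:
    """Look up a phrase and play it out loud (here via ffplay)."""
    clip = PHRASES[language][label]
    subprocess.run(["ffplay", "-nodisp", "-autoexit", str(clip)], check=True)

# Hypothetical usage: tell people holding on to EMILY to keep holding on, in Arabic.
# play("arabic", "hold_on")
```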

A more futuristic variant that would be perfect for a large flexible display mounted on EMILY (the stuff of my dreams!) is to display what you are trying to tell the refugees to do.  For example, how to tie a cleat hitch so their boat can be towed. Even just to reinforce how to steer the boat right or left, so the person hears and sees what the directions are. Two teams, Team Tanks and Team TBD, are working on this.

A very promising non-robotic project is the Refugee Predictor. A student team is writing an inductive machine learning program to predict the number of boats, approximate time of arrival, and location for the next day. They are hoping that there is a pattern in the weather, water, time of sunrise/sunset, and any other relevant data for the past year that explains why some days there are 20 boats hitting Skala, and other days 8 boats going to Mytelini. What a great use of machine learning!
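
I don't have the students' code, but as a rough sketch of the kind of inductive learner being described, assume a year of daily records with columns such as wind speed, wave height, and daylight hours (the file name and feature names are hypothetical, and scikit-learn stands in for whatever the team actually uses):

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Hypothetical daily records for the past year: weather, water, daylight, arrivals.
df = pd.read_csv("daily_arrivals_2015.csv")
features = ["wind_kts", "wave_m", "daylight_hrs", "water_temp_c"]

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["boats_arrived"], test_size=0.2, random_state=0
)

# Fit a simple regressor to see whether the conditions explain arrival counts.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out days:", model.score(X_test, y_test))

# Predict tomorrow from a forecast (values made up for illustration).
tomorrow = pd.DataFrame(
    [{"wind_kts": 8, "wave_m": 0.4, "daylight_hrs": 10.5, "water_temp_c": 14}]
)
print("Expected boats tomorrow:", model.predict(tomorrow)[0])
```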

The other Computing for Disasters project helps with data management for us and other NGOs. In particular, if EMILY is on the water for a morning, the “action” may only be a few minutes.  In order to generate a report, someone has to edit the video clip. The students on Team Snips are working to create a website where any of the NGOs can upload a file plus one or more timestamps, and it will then cut out a snippet of a specified length.
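
The website itself is the students' project, but the underlying cut-a-snippet step can be done with a single ffmpeg call. A minimal sketch (the function and file names are hypothetical):

```python
import subprocess

def cut_snippet(source: str, start: str, length_s: int, output: str) -> None:
    """Cut a clip of length_s seconds starting at 'start' (HH:MM:SS), copying
    the streams without re-encoding so the cut is nearly instantaneous."""
    subprocess.run(
        ["ffmpeg", "-ss", start, "-i", source, "-t", str(length_s),
         "-c", "copy", output],
        check=True,
    )

# Hypothetical usage: pull 90 seconds of action starting 22 minutes into the video.
# cut_snippet("morning_patrol.mp4", "00:22:00", 90, "rescue_snippet.mp4")
```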

We are seeking funding to buy our own EMILY and Fotokite, then return to Greece to continue to learn and to partner with Prof. Milt Statheropoulos’ group at the National Technical University of Athens.

I am still hoping to raise another $2,504 to cover the unpaid expenses from the January trip so please donate at https://www.gofundme.com/Friends-of-CRASAR