ICARUS: European Union Moves Robot-Assisted Search and Rescue Forward!

I had the pleasure of attending the ICARUS project’s final demonstration in Brussels, Belgium, as an advisor. ICARUS is the European Union-funded project “Integrated Components for Assisted Rescue and Unmanned Search operations,” which you can read about here. The demonstration was quite the success and the entire project has my greatest admiration!

Just a note to anyone wondering why the US is not doing more of this: the European Union funded the project at 17.5 million euros, far more than any funding for robotics projects available through the National Science Foundation or the Department of Homeland Security. The great ICARUS team and the funding really helped move the EU ahead of the US and Asia in robotics and in robotics for disasters. This is not the only project being funded at this level in the EU. NIFTY just finished up, and TIRAMISU, CADDY, and SHERPA are all major projects focusing on fundamental research in robotics through applications to disasters. Each project has a strong partnership with an actual response agency or national US&R team, following the model that we use at CRASAR, and indeed that’s why I’m on the advisory board for most of these projects. This is a very different model than the DARPA Robotics Challenge in the US.

There were four aspects of the project that resonated with me:

  1. Engagement of the end-users, in this case Belgium’s US&R team B-FAST, and emphasis on physical and operational fidelity. This is the major thrust of CRASAR. The engagement of end users led to B-FAST deploying their rotorcraft UAV for the Serbia-Bosnia floods, with an excellent set of lessons learned reported at IEEE Safety Security Rescue Robotics at http://tinyurl.com/ppr6c7b.
  2. Focus on heterogeneity of robots. The project demonstrated land, aerial, and marine robots complementing each other to provide responders with more capabilities to see and act at a distance. The July demo showed aerial-marine cooperation and this, the September demo, focused on aerial-ground cooperation. Heterogeneous robots are not a new topic, nor a new topic for disasters (see our work at the Japanese tsunami: http://onlinelibrary.wiley.com/doi/10.1002/rob.21435/abstract), but ICARUS advanced the field by showing interoperability of control of the robots. Arguably, interoperability is not new either, and is something the US Department of Defense is pursuing, but it was nice to see, especially combined with heterogeneity of missions.
  3. Heterogeneity of missions. Perhaps the most compelling part of the demo was how robots could be repurposed for different missions and how the interoperability framework supported this. A large robot for removing rubble could change its end effector, carry a smaller robot, and lift it to the roof of a compromised building. The displays showed the payloads and types of functions each robot could perform; this visualization was a nice advance.
  4. One size does not fit all. It was music to my ears to hear Geert De Cubber say that there is not a single robot that will work for all missions. I’ve been working on categorizing missions and the environmental constraints (e.g., how small does a robot need to be?), with the initial taxonomy in Disaster Robotics (https://mitpress.mit.edu/books/disaster-robotics).

The project focused on interoperability between the assets, which was interesting technologically, but I wonder if it will be of practical importance beyond what would be used by a single US&R team, assuming that a single US&R team would own a complete set of ground, aerial, and marine vehicles.

Our experience has been that a single agency or ESF is unlikely to own all the robotic assets. For example, at the Fukushima Daiichi nuclear accident, several different types of ground robots and an aerial robot were simultaneously deployed. It didn’t make sense for a single operator to control all of the devices: with a UGV outside the building clearing rubble, a UGV inside inserting a sensor, and a UAV outside conducting a radiological survey, these seem to be delegated functions better kept as separate modules. Furthermore, many of the devices were brought in for the disaster; the best available robots were deployed rather than existing JAEA assets, so there is always the issue of how to incorporate the latest tool.

Even in a relatively small disaster, such as the Prospect Towers parking garage collapse, New Jersey Task Force 1 borrowed ground robots from a law enforcement agency. The point is that for the next decade, teams may be using ad hoc assemblies of robots, not owning a dedicated set of assets.

For CRASAR, the challenge is how the different end-users get the right information from the ad hoc assembly of robotics fast enough to make better decisions.

The project had a host of commendable technical innovations, such as a small solar-powered fixed-wing that demonstrated 81 hours of endurance and provided a wireless network for the responders, a novel stereo sensor for the tiny Astec Firefly, which they showed flying in through a window, and an exoskeleton controller for a robot arm that is being commercialized.

I particularly liked the ICARUS focus on establishing useful mission protocols. They experimented with launching a fixed wing immediately to do reconnaissance, provide a wireless network, and give overwatch of the camp, and with using a quadrotor to fly ahead of a convoy to ascertain the best route to the destination when roads might be blocked with rubble or trees.

Katrina: Aug. 31, 2005, 10th Anniversary of First Small UAS Flight

CRASAR was the first to fly small UAS at a disaster: Hurricane Katrina, on Aug. 31 and Sept. 1, for the state of Florida, which was assisting with the Mississippi response. Other groups flew too, but CRASAR was first, with iSENSYS and WinTEC as our Roboticists Without Borders partners. WIRED has a nice piece on Katrina with some of the footage from our birds.

We were originally tasked to go to New Orleans to help our colleagues at the LSU Fire and Emergency Training Institute. It’s a long story, but basically we needed a police escort to get into NOLA, so we fell back to Mississippi to help there. Later we returned to the Gulf Coast and flew 32 structural inspection missions over 8 days, establishing crew organization and operational protocols that are now standard in the European Union and Japan. We also discovered that experts from FEMA, Thornton Tomasetti, and universities could not readily comprehend the imagery; as we all suspected, getting photos IS different than being there, and more so when safety is involved. That motivated a significant amount of research, including the Skywriter project to help remote experts direct the robot through the real-time video feed.

Robots, drones and heart-detectors: How disaster technology is saving lives

Robots with cameras, microphones and sensors searched for victims stranded in flooded homes and on rooftops. They assessed damage and sent back images from places rescuers couldn’t get. It was August 31, 2005, two days after Hurricane Katrina hit the Gulf Coast. These robots were a crucial connection between emergency responders and survivors. Ten years later, new technology is changing the way we handle whatever life throws at us. In the case of disaster relief and recovery, this means more effective ways to save lives and begin the arduous process of rebuilding after catastrophe.

“You’ve got a golden 72 hours of the initial response that’s very critical,” said Dr. Robin Murphy, a robotics professor and director of the Center for Robot-Assisted Search and Rescue (CRASAR) at Texas A&M University, who has also worked with robots after the September 11, 2001, attacks, in natural disasters such as Hurricane Katrina, and at the Fukushima nuclear accident. “Then you have the restoration of services. After the emergency teams have got everything under control, you got to get your power back on, your sewage, you know, your roads and that.”

UAVs such as the PrecisionHawk Lancaster, a fixed-wing drone, not only aid human disaster responders by providing photos of where to look for victims, but also provide a valuable resource for determining how to approach the relief efforts. “It acts like a plane. It’s smarter than a plane because it’s got all sorts of onboard electronics to let it do preprogrammed surveys. It takes pictures like on a satellite or a Mars explorer and then pulls those back together into a hyper-accurate map — a 3-D reconstruction,” Murphy said. Murphy also said it’s not only very accurate, but also easy to pick up and maneuver.

Check out the rest of the article here

Word from Responders: “Small UAVs are Available, Now Help Us Use The Data They Generate!” REU Students Provide Apps

Virtual reality reconstruction of Blanco River flood by Dr. Russ Taylor from CRASAR flight

TEES just concluded a three-day Summer Institute on Floods (July 26-28, 2015), hosted by our “mother” center, the Center for Emergency Informatics. The Summer Institute focuses on the data-to-decision problems for a particular type of disaster. This year’s Summer Institute was the second in a two-part series on floods and brought together representatives from 12 State agencies, 15 universities, and 5 companies for two days of “this is what we did during the Texas floods” and one day of “this is what we could do or do better” experimentation with small unmanned aerial vehicles, crowdsourcing, computer vision, map-based visualization packages, and mobile phone apps for common operating pictures and data collection.

A portion of the Summer Institute focused strictly on UAVs. The session was organized by the Lone Star UAS Center and the TEES Center for Robot-Assisted Search and Rescue (CRASAR), both of whom flew during the Texas floods.

The UAV field exercises spanned four scenarios witnessed during the floods:

  • Search for trapped victims in cars and other vehicles swept away by a storm surge;
  • Search for missing persons who may have fled campsites or been swept down river;
  • Assessment of damage to power lines and transformers and presence of standing water which prevents restoration of power infrastructure; and
  • Assessment of damage to houses and estimates of household debris, which is critical to insurance companies estimating damage and agencies such as the American Red Cross in projecting social impacts.

The exercises were staged at the 1,900-acre TAMU Riverside Campus with one fixed wing and three types of rotorcraft flown by CRASAR and David Kovak, a member of CRASAR’s Roboticists Without Borders program.

A major finding of the institute was that while State agencies are adopting UAVs, the agencies can’t process all the imagery coming in. For example, a single 20-minute UAV flight by CRASAR at the floods produced over 800 images totaling 1.7GB. There were over a dozen platforms flying daily for two weeks, as well as Civil Air Patrol and satellite imagery. Each image has to be viewed manually for signs of survivors, clothing, or debris that indicate a house or car (and thus a person too) was swept to this location.
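To put that data volume in perspective, here is a back-of-envelope calculation of the manual triage burden. The images-per-flight and gigabytes-per-flight figures come from the paragraph above; the flights per day and seconds of inspection per image are illustrative assumptions, not numbers from the exercise.

```python
# Back-of-envelope estimate of the manual image-triage burden.
IMAGES_PER_FLIGHT = 800           # reported: a single 20-minute flight
GB_PER_FLIGHT = 1.7               # reported: data volume of that flight
PLATFORMS = 12                    # "over a dozen platforms"
FLIGHTS_PER_PLATFORM_PER_DAY = 4  # assumption, not from the article
DAYS = 14                         # "daily for two weeks"

flights = PLATFORMS * FLIGHTS_PER_PLATFORM_PER_DAY * DAYS
images = flights * IMAGES_PER_FLIGHT
data_gb = flights * GB_PER_FLIGHT

# Assume 10 seconds of manual inspection per image, one viewing each:
person_hours = images * 10 / 3600

print(f"{flights} flights, {images:,} images, {data_gb:.0f} GB")
print(f"~{person_hours:,.0f} person-hours to look at each image once")
```

Even under these modest assumptions, a two-week response generates on the order of half a million images and a terabyte of data, which is exactly why the agencies asked for software help.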

Given the huge demand for automating image processing, it was no surprise that a panel of four judges from the agencies awarded $900 in prizes to three students from the National Science Foundation Research Experiences for Undergraduates (REU) program. The prizes were for software that classified imagery and displayed where it was taken.

First place went to Julia Proft for an application that used computer vision and machine learning to find images from the floods that contained a spectral anomaly, such as an unusual color. Below is a UAV image flagged with an anomaly; the anomalies are indicated on a parallel image to make it easier to find where in the image they occurred without marking up the image and blocking the view.

UAV image flagged with an anomaly, with anomalies indicated on a parallel image
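The core idea behind spectral anomaly detection can be illustrated with a classic RX-style detector: flag pixels whose color is a statistical outlier relative to the whole image. This is a generic sketch of the technique, not the prize-winning code; the function name and threshold are my own.

```python
import numpy as np

def color_anomaly_mask(img, threshold=4.0):
    """Flag pixels whose color is a spectral outlier (RX-style detector).

    img: HxWx3 array of pixel values. Returns a boolean HxW mask of
    anomalous pixels. Generic illustration, not the award-winning app.
    """
    h, w, c = img.shape
    pixels = img.reshape(-1, c).astype(float)
    mean = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False) + 1e-6 * np.eye(c)  # regularize
    inv_cov = np.linalg.inv(cov)
    diff = pixels - mean
    # Squared Mahalanobis distance of each pixel's color from the scene mean
    d2 = np.einsum('ij,jk,ik->i', diff, inv_cov, diff)
    return (d2 > threshold**2).reshape(h, w)
```

On flood imagery, a mostly brown-and-green scene with a few pixels of, say, bright clothing would light up exactly those pixels.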

Second place went to Matt Hegarty for an application that used computer vision and machine learning to find images from the floods that contained urban debris having straight lines or corners. In this image, the program found instances of trash, PVC poles, and other indications of where houses (and possibly victims) had been swept down river.
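A crude version of the straight-lines cue can be sketched by checking whether strong edges cluster around a few dominant orientations, as man-made objects (boards, pipes, siding) tend to do. This is an illustration of the general idea, not Hegarty's classifier; the threshold and bin count are assumptions.

```python
import numpy as np

def straight_edge_score(gray, edge_thresh=0.1, bins=18):
    """Crude man-made-debris cue: strong edges concentrated in a few
    dominant orientations suggest straight lines and corners.

    gray: 2-D grayscale array. Returns the fraction of strong-edge
    pixels falling in the two most common orientation bins (0..1).
    """
    gy, gx = np.gradient(gray.astype(float))
    mag = np.hypot(gx, gy)
    strong = mag > edge_thresh
    if strong.sum() == 0:
        return 0.0
    theta = np.arctan2(gy[strong], gx[strong]) % np.pi  # orientation mod 180 deg
    hist, _ = np.histogram(theta, bins=bins, range=(0, np.pi))
    hist = hist / hist.sum()
    # Share of edge pixels in the two most common orientations:
    return float(np.sort(hist)[-2:].sum())
```

An image dominated by a rectangular object scores near 1 (edges are all horizontal or vertical), while natural texture spreads its edges across many orientations and scores much lower.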


Third place went to Abygail McMillian for a program that visually displayed where and when UAV data, or any geotagged data, was taken. The spatial aspect is important for responders to see where assets have sensed; in Nepal, UAV teams often surveyed the same areas but missed important regions. The temporal aspect (a timeline scale on the map) is important because, with changes in the river flood stage, hydrological and missing-person images become “stale” and flights should be reflown to get new data.

Data display with timeline
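The where-and-when bookkeeping behind such a display can be sketched as a simple grid index over geotagged records that reports which cells have gone stale. The class name, cell size, and age cutoff here are illustrative assumptions, not details of the third-place app.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class GeoImage:
    lat: float
    lon: float
    taken: datetime

def stale_cells(images, now, max_age=timedelta(hours=12), cell_deg=0.01):
    """Group geotagged images into ~1 km grid cells and report cells
    whose newest image is older than max_age, i.e. areas to refly.
    """
    latest = {}
    for img in images:
        cell = (round(img.lat / cell_deg), round(img.lon / cell_deg))
        if cell not in latest or img.taken > latest[cell]:
            latest[cell] = img.taken
    return {cell for cell, t in latest.items() if now - t > max_age}
```

A display would then color fresh cells green and stale cells red along the timeline, so teams can see at a glance both what has never been flown and what needs reflying as the flood stage changes.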

The 10 students (nine from under-represented groups) were from Colorado, Georgia, Puerto Rico, South Carolina, Texas, Vermont, and Wyoming and spent the summer conducting research through the Computing for Disasters NSF REU site grant under the direction of Dr. Robin Murphy. Multiple State agencies requested that those student-created apps be hardened and released. In previous years, the panels have awarded prizes primarily to hardware: UAVs, battery systems, communications nodes, etc. This year, the focus was on software and programming to help analyze the data the hardware generates.


Flood of data may be the biggest problem in dealing with floods

We just concluded a three-day Summer Institute on Floods (July 26-28, 2015), hosted by our “mother” center, the Center for Emergency Informatics. The Summer Institute focuses on the data-to-decision problems for a particular type of disaster. This year’s Summer Institute was the second in a two-part series on floods: even though there was a drought last year, our TEEX partners were very worried about flooding for Texas, and flooding is the number one disaster in the world. It was originally scheduled for early June but had to be moved due to the floods, which allowed us the opportunity to discuss topics while they were still fresh on everyone’s minds.


The Summer Institute brought together representatives from 12 agencies, 15 universities, and 5 companies for two days of “this is what we did during the Texas floods” and one day of “this is what we could do or do better” experimentation with small unmanned aerial vehicles, crowd sourcing, computer vision, map-based visualization packages, and mobile phone apps for common operating pictures and data collection. The field exercise was designed to track the resource allocation of field teams, how they can acquire data from the field from UAVs and smartphones, then how the data can be effectively organized, analyzed, and visualized.

Here are my preliminary findings:

  • Agencies are adopting small UAVs, social media, and smartphone apps to acquire new types of data, faster and over larger geospatial and temporal scales than ever. Small UAVs were used during the Texas floods in all of the affected regions (big shout out to the Lone Star UAS Center; we flew under their direction during the floods). Agencies were monitoring social media, in one case to uncover and combat a rumor that a levee had broken. Texas Task Force 1 and other agencies used General Dynamics’ GeoSuite mobile app to give tactical responders a common operating picture (big shout out: GeoSuite was introduced and refined starting with the 2011 Summer Institute).
  • Wireless communications is no longer the most visible barrier to acquiring data from the field in order to make better decisions. While responders in the field may not have as much bandwidth as they did before a disaster, cell towers on wheels are being rapidly deployed and the Civil Air Patrol flew repeater nodes. That said, wireless communications isn’t solved, as hilly geography can prevent teams from connecting to sparse temporary nodes. Plus, keep in mind that large parts of rural Texas have limited or no connectivity under normal conditions.
  • The new barrier is what to do with the data coming in from unmanned systems, news feeds, social media feeds, and smartphones. Consider that a single 20-minute UAV flight produced over 800 images totaling 1.7GB. There were over a dozen platforms flying daily for two weeks, as well as Civil Air Patrol and satellite imagery. Most of the imagery was being used to search for missing persons, which means each image has to be inspected manually by at least one person (preferably more). Signs of missing persons are hard to see, as there may be only a few pixels of clothing (victims may be covered in mud or obscured by vegetation and debris) or urban debris (as in, if you see parts of a house, there may be the occupant of the house somewhere in the image). Given the multiple agencies and tools, it was hard to pinpoint what data had been collected when (i.e., spatial and temporal complexity) and then access the data by area or time. Essentially no one knew what they had. Agencies and insurance companies had to manually sort through news feeds and public postings, both text and images, to find nuggets of relevant information.
  • The future of disasters was clearly in organizing, analyzing, and visualizing the data. Responders flocked to SituMap, an interactive map-based visualization tool that Dr. Rick Smith at TAMU Corpus Christi started developing after the 2010 Summer Institute and that is now being purchased by TEEX and other responders. A panel from the agencies awarded $900 in 1st, 2nd, and 3rd place prizes to NSF Research Experiences for Undergraduates students for software that classified imagery and displayed where it was taken. Multiple agencies requested that those apps be hardened and released, as well as the Skywriter sketch-and-point interface (CRASAR developed it for UAVs and it is being commercialized) and the wide-area search planning app developed over the last two summers by other students from the NSF REU program. In previous years, the panels have awarded prizes primarily to hardware: UAVs, battery systems, communications nodes, etc. This year, the attitude was “those technologies are available, now help us use the data they generate!”

Each year we hear the cry from emergency management “we’re drowning in data, help” and this year it was more than a bad pun.


How to Fly at Floods: Summer Institute PLUS Roboticists Without Borders UAV training

One word about the floods and about UAVs: informatics. Read on to see what I mean 😉

As I blogged earlier, on July 26-28 the Center for Emergency Informatics’ 2015 Summer Institute is on flooding and will bring together state agencies and municipalities who were part of the Texas floods with researchers and industry for a two-day workshop and one-day exercise. The exercise will include UAVs flying the missing persons missions and the recovery and restoration missions.

Notice that it’s the Center for Emergency Informatics hosting the event, because it’s about the right data getting to the right agencies or stakeholders at the right time, displayed in the right way, so that they can make the right decisions. UAVs (and marine vehicles such as EMILY and the small airboats being developed at Carnegie Mellon, Texas A&M, and the University of Illinois) have a big role to play. But UAVs are useful only if the entire data-to-decision process works, aka informatics.

The Summer Institute July 26-28 will also host a training session for Roboticists Without Borders members, specifically on UAVs and the best practices of how to fly at floods and upcoming hurricanes and collect useful data: what do the decision makers need? Again, this is the informatics, the science of data, not the aeronautics. The training is independent of platform, because what the decision makers need is what they need 😉 The current (and evolving) best practices are derived from three sources:

  1. CRASAR RWB deployments going back to the Pearl River cresting at Hurricane Katrina in 2005, and including the Oso mudslides and our deployment with the Lone Star UAS Center to the Texas floods,
  2. the reports and analyses of what has worked at typhoons and other flooding events worldwide, and
  3. what researchers throughout the world, especially the IEEE Safety, Security, and Rescue Robotics technical community, are doing.

For example, video is not as useful as high-resolution still imagery for searching for missing persons. Infrared isn’t helpful except in the early morning. Some missions and terrains require remote-presence style control; others can use preprogrammed flight paths. Complex terrains such as river bluffs may require flight paths that are vertical slices, not horizontal slices. There are many more, and I’m sure we will learn more from each other.
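As one concrete illustration of the vertical-slice idea, a flight plan for a bluff face can be generated as columns of climb-and-descend waypoints instead of horizontal transects. This is a hypothetical sketch, not an actual CRASAR flight planner; the function and its parameters are my own.

```python
def vertical_slice_waypoints(width_m, height_m, spacing_m):
    """Waypoints for a vertical 'lawnmower' scan of a bluff face:
    the aircraft climbs and descends along vertical columns rather
    than flying horizontal transects.

    Returns a list of (horizontal offset, altitude) pairs in meters,
    alternating up/down so consecutive columns connect smoothly.
    """
    waypoints = []
    x = 0.0
    going_up = True
    while x <= width_m:  # note: assumes spacing divides width exactly
        column = [(x, 0.0), (x, height_m)]
        if not going_up:
            column.reverse()  # descend on every other column
        waypoints.extend(column)
        going_up = not going_up
        x += spacing_m
    return waypoints
```

For a 20 m wide, 30 m tall face scanned every 10 m, this yields three columns flown up, down, up, keeping the camera pointed at the face the whole time.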

The training session will consist of evening classes on July 26 and 27, with field work on July 28 at the 1,900-acre Riverside Campus. We will fly fixed-wing and rotorcraft for response missions (recon, missing persons, flood mitigation) and for recovery/restoration (damage assessment, debris and household debris estimation, power utility assessment, transportation assessment). The scenarios will be designed by experts from Texas Task Force 1 and representatives from the agencies that would use the information, including fire rescue, law enforcement, the Texas insurance commission, SETRAC, etc.

It’s not too late to join Roboticists Without Borders and attend! It’s free.

Hope to see you there!


Update: Flooding disasters continue to mount casualties and challenge response efforts

We are working with our partner, the Lone Star Unmanned Aerial System Center, one of the six FAA test sites, to help support the flooding response and recovery efforts for the Texas floods. The floods are a sad example of how flooding costs the U.S. more than 80 lives and $8 billion in damages each year, and that excludes storm surge from hurricanes.

Technology has advanced to where robot-assisted solutions can aid in crisis assessments by federal, state, and local officials and emergency workers. Yet more data adds another layer of complexity and requires extra coordination of robot-assisted efforts.

Accurate ground and aerial surveys can help decision makers choose where to deploy limited resources in the best way. Quickly identifying survivors and any follow-on threats from the natural disaster, and collecting measurements of debris fields, water flow volumes, and similar data, can all be done without endangering the emergency responders.

Water gone wild and what the Center for Robot-Assisted Search and Rescue (CRASAR) is working to do:

Texas A&M’s Center for Emergency Informatics (CEI) continues its two-year study of information technologies, including unmanned systems, in response to floods. The CEI brings together practitioners, academics, and industry experts to converge and test their knowledge, hardware, and software in response to an array of waterborne disasters.

The CEI, as the host of the Summer Institute on Flooding slated for June 16-18, will set the stage for testing ideas, marrying technologies, and coordinating decision makers in real-time, flood-based scenarios.

At the 2014 Summer Institute, 42 representatives from 12 states, drawn from 14 agencies, 14 universities, and eight businesses, judged what information technology was mature enough, and what needs more development, to assist with flooding response.

This year’s summer institute will include three exercises, each representing the key problem missions identified by the participating agencies:

  • Swift water rescue: a scenario has been developed in which suddenly rising waters cut people off and responders must rescue them. Responders from Texas Task Force 1 will be offered realistic challenges based on the characteristics of a campground flood. The TEEX-designed exercises focus on real missions rather than on technologies.
  • Life-saving response and immediate mitigation: identify the best ways to assess where the flood is and who is at risk.
  • Restoration and recovery: exercise decision making for resource allocation to restore roads, electricity, sewage, etc., and engage insurance companies to conduct property damage assessments and projections of debris generation in order to manage post-flood problems with decay, vermin, and disease.

The growth of small UAVs and flooding response:

Small UAVs have been tasked at at least eight disasters that involved flooding or had flooding associated with them:

  • Hurricane Katrina 2005
  • Typhoon Morakot, Taiwan 2009
  • Thailand Floods 2011
  • Typhoon Haiyan, Philippines 2013
  • Boulder Colorado floods 2013
  • Oso Washington Mudslides 2014
  • Balkans flooding Serbia 2014
  • Cyclone Pam, Vanuatu 2015

Each of the several UAV platforms that went airborne provided rapid and often timely information to officials and responders, both for search and rescue missions and for the difficulties of recovery.

General reconnaissance: addressing where’s the flooding, where are the people cut off by the flooding, and what roads are still passable.

Hydrological situation awareness: both real-time and post-processed. The flooding caused by the Oso mudslide was a real problem, and rotorcraft could hover and stare at the river, letting the hydrologists estimate the flow rate in different areas. The biggest need remains surveying the extent of flooding. With the addition of photogrammetric image software, UAVs provided details of the terrain and the potential for additional flooding, or the best place to put a dyke, channel, or other mitigation. Our partners at FIT took it further and printed a 3D model of the terrain to help everyone visualize it.

The power of unmanned aerial vehicles:

  • Overwatch for swift water rescue teams: our friends at South Carolina Task Force 1 have been pushing us to help create the protocols for using small UAVs to spot something such as a logjam coming down river that might pose a threat to life and limb as they work to rescue people. They’ll get their wish, as this is one of the scenarios to be unfolded in our 2015 Summer Institute on Flooding in June, hosted by CEI.
  • Debris estimation: both the debris directly from the flood and the indirect debris that follows from people having to rip out sheet rock and toss waterlogged carpets. The advances in photogrammetrics make it possible to estimate the volume of debris, provided there is a “before” survey of the area. To test this, we flew with PrecisionHawk at the Bennett Landfill superfund site in February 2015 to estimate the volume of toxic trash, which was on fire and needed to be safely removed. Part of the data analysis included estimating the content type, because vegetation and construction materials have to be handled and processed differently.
  • Delivery of supplies to isolated regions: we learned during last year’s Summer Institute that if the locals can hold their own for 72 hours, that is usually sufficient. In Texas, where breeding stock can represent hundreds of thousands of dollars of investment, it may make sense for someone to stay behind, and determining that they are not a victim is vital. Disasters can also take people by surprise, with a bridge washed out or a vital need for medicines and other perishables. A group of Texas A&M aerospace students won 2nd place for their small but hefty fixed-wing UAV that could be used to drop off heavier/bigger bundles of supplies from farther distances. Groups like Matternet, who like CRASAR are members of FIT, are looking at delivering medicine with rotorcraft.
  • Delivering lifelines, life jackets, and small things to people trapped on roofs: a note about delivering things with a rotorcraft. Using a rotorcraft to carry a line or bottle to someone is complicated by the weight and distribution of the payload, which usually makes the UAV very sensitive to wind and control errors. As such, if an open-rotor system is used, more distance than normal from the person is needed to account for such errors. One senior design project from last year’s Summer Institute created a two-way audio system for rotorcraft. Since rotors can generate about 85 dB of noise, simply hanging a microphone and speaker on the UAV is ineffective. The Computer Engineering team used noise reduction algorithms from National Instruments LabVIEW to prototype a lightweight two-way audio system impervious to noise.
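The before/after photogrammetric approach to debris volume estimation boils down to differencing two elevation grids (DEMs) of the same area and summing the added material. Here is a minimal sketch; the function name, cell size, and noise threshold are my assumptions, not details of the PrecisionHawk analysis.

```python
import numpy as np

def debris_volume_m3(before_dem, after_dem, cell_m=0.5, min_change_m=0.1):
    """Estimate debris volume by differencing 'before' and 'after'
    photogrammetric elevation grids (DEMs) of the same area.

    before_dem, after_dem: 2-D arrays of elevations (m) on the same grid.
    cell_m: ground size of one grid cell (m).
    min_change_m: ignore elevation changes below this, as DEM noise.
    """
    dz = after_dem - before_dem
    piled = np.where(dz > min_change_m, dz, 0.0)  # count only added material
    return float(piled.sum() * cell_m * cell_m)   # depth * cell area
```

The same differencing, run with the sign flipped, would estimate material removed, which is how progress on clearing a pile can be tracked flight over flight.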

Look for more to come from the 2015 Summer Institute for Flooding, June 16-18.


Flooding and small UAVs

At this time, we aren’t involved in the horrible Texas and Oklahoma flooding, but we’ve been studying flooding since our deployment at Hurricane Charley in 2004. This blog will give a short background on what we do here at A&M and the upcoming 2015 Summer Institute on Flooding, then the history of small UAVs for flooding, and the potential uses generated by experts at our 2014 Summer Institute on Flooding.

About CRASAR and flooding

The Center for Emergency Informatics (CEI) here at Texas A&M is in the middle of a two-year exploration of information technologies, including unmanned systems, for floods. The CEI is a “center of centers” that brings together practitioners, academics, and industry to fuse and apply their knowledge to disasters. CRASAR is the “response” arm for CEI (actually we do participatory research versus response, similar to what anthropologists like Margaret Mead do: sometimes the only way to learn is by doing, by embedding with the responders. We don’t do disaster tourism).

The CEI is hosting a Summer Institute on Flooding June 16-18, the second on flooding. The choice of flooding was motivated by the fact that flooding costs the U.S. over 80 lives and $8 billion in damages each year, excluding storm surge from hurricanes. Last year’s Summer Institute brought together 42 representatives from 12 states, drawn from 14 agencies, 14 universities, and 8 companies, to consider what information technology is mature enough, or needs a bit of encouragement, to assist with flooding response.

This year’s Summer Institute consists of three exercises, each representing the key problem missions identified by the agencies:

  1. Swift water rescue (June 16): helping responders rescue people suddenly cut off by rising waters. The plan for the swift water rescue reads like the campground flooding, showing the advantage of having responders from Texas Task Force 1 and TEEX design exercises around real missions rather than around technologies.
  2. Life-saving response and immediate mitigation (June 17): where’s the flood, who is at risk?
  3. Restoration and recovery (June 18): restoring roads, electricity, sewage, etc., as well as insurance companies conducting property damage assessments and cities generating debris estimates so that they can keep the rats out.

About small UAVs and flooding

Small UAVs have been used at at least 8 disasters that involved flooding or had flooding associated with them: Hurricane Katrina 2005; Typhoon Morakot, Taiwan 2009; Thailand floods 2011; Typhoon Haiyan, Philippines 2013; Boulder, Colorado floods 2013; Oso, Washington mudslides 2014; Balkans flooding, Serbia 2014; and Cyclone Pam, Vanuatu 2015.

Historically they have been used for:

  • General reconnaissance: where’s the flooding, where are the people cut off by the flooding, and what roads are still passable.
  • Hydrological situation awareness, both real-time and post-processed. The flooding caused by the Oso mudslide was a real problem, and rotorcraft were considered because they could hover and stare at the river, letting the hydrologists estimate the flow rate in different areas. The biggest push was for surveying the extent of flooding and, with the addition of photogrammetric image software, the terrain and potential for additional flooding (or the best place to put a dyke, channel, or other mitigation). Our partners at FIT took it further and printed a 3D model of the terrain to help everyone visualize it.

Potential uses for UAVs

  • Overwatch for swift water rescue teams: our friends at South Carolina Task Force 1 have been pushing us to help create the protocols for using small UAVs to help them see that a logjam is coming down river and going to wipe them and the people they are trying to rescue out. This is one of the scenarios we will be playing in our Summer Institute on Flooding in June.
  • Debris estimation: both the debris directly from the flood and the indirect debris a few days or weeks later from people having to rip out sheet rock and carpets. The advances in photogrammetrics make it possible to estimate the volume of debris, if you have the “before” survey of the area; we flew with PrecisionHawk at the Bennett Landfill superfund site in February in order to estimate the volume of toxic trash (which was on fire) that needed to be safely removed. The next step is to estimate the content, because vegetation and construction materials have to be handled and processed differently.
  • Deliver supplies to cut-off regions. We learned during last year’s Summer Institute that if the locals can hold out for 72 hours, that’s usually sufficient. Indeed, in Texas, where breeding stock can represent hundreds of thousands of dollars of investment, it may make sense for someone to stay behind. But sometimes people get taken by surprise, or a bridge washes out unexpectedly, and they need diabetes medicine and other perishables. Groups like Matternet, who like CRASAR are members of FIT, are looking at delivering medicine with rotorcraft. A group of Texas A&M aerospace students won 2nd place for their small but hefty fixed-wing UAV that could be used to drop off heavier/bigger bundles of supplies from farther distances.
  • Delivering lifelines, life jackets, and small things to people trapped on roofs. A note about delivering things with a rotorcraft: you can use a rotorcraft to carry a line or bottle to someone, but the weight and distribution of the payload usually make the UAV very sensitive to wind and control errors. If you are using an open-rotor system, just be aware and maintain more distance than normal from the person if at all possible. One senior design project that resulted from last year’s Summer Institute was to create a two-way audio system for rotorcraft. The rotors generate about 85 dB of noise, so if you’re a responder and want to try to talk with a person by hanging a microphone and speaker on the UAV, it probably won’t work. The Computer Engineering design team used noise reduction algorithms from National Instruments LabVIEW to prototype a lightweight two-way audio system impervious to noise.