Archive for the ‘Research’ Category

ICARUS: European Union Moves Robot-Assisted Search and Rescue Forward!

I had the pleasure of attending the ICARUS project’s final demonstration in Brussels, Belgium, as an advisor. ICARUS is the European Union-funded project “Integrated Components for Assisted Rescue and Unmanned Search operations,” which you can read about here.  The demonstration was quite a success and the entire project has my greatest admiration!

Just a note to anyone wondering why the US is not doing more of this: the European Union funded the project at €17.5M, far more than any funding for robotics projects available through the National Science Foundation or the Department of Homeland Security. The great ICARUS team and the funding really helped move the EU ahead of the US and Asia in robotics and in robotics for disasters. Nor is this the only project being funded at this level in the EU: NIFTi just finished up, and TIRAMISU, CADDY, and SHERPA are all major projects focusing on fundamental research in robotics through applications to disasters. Each project has a strong partnership with an actual response agency or national US&R team, following the model that we use at CRASAR- and indeed that’s why I’m on the advisory board for most of these projects. This is a very different model than the DARPA Robotics Challenge in the US.

There were four aspects of the project that resonated with me:

  1. Engagement of the end-users, in this case Belgium’s US&R team B-FAST, and emphasis on physical and operational fidelity. This is the major thrust of CRASAR. The engagement of end users led to B-FAST deploying their rotorcraft UAV for the Serbia-Bosnia floods, with an excellent set of lessons learned reported at IEEE Safety, Security, and Rescue Robotics.
  2. Focus on heterogeneity of robots. The project demonstrated land, aerial, and marine robots complementing each other to provide responders with more capabilities to see and act at a distance. The July demo showed aerial-marine cooperation and this, the September demo, focused on aerial-ground cooperation. Heterogeneous robots are not a new topic, nor a new topic for disasters (see our work at the Japanese tsunami), but ICARUS advanced the field by showing interoperability of control of the robots. Arguably, interoperability is not new either, and something the US Department of Defense is pursuing, but it was nice to see, especially combined with heterogeneity of missions.
  3. Heterogeneity of missions. Perhaps the most compelling part of the demo was how robots could be repurposed for different missions and how the interoperability framework supported this. A large robot for removing rubble could change its end effector, carry a smaller robot, and lift it to the roof of a compromised building. The displays showed the payloads and types of functions each robot could perform- this visualization was a nice advance.
  4. One size does not fit all. It was music to my ears to hear Geert De Cubber say that there is not a single robot that will work for all missions. I’ve been working on categorizing missions and the environmental constraints (e.g., how small does a robot need to be?), with the initial taxonomy in Disaster Robotics.

The project focused on interoperability between the assets, which was interesting technologically, but I wonder if it will be of practical importance beyond what would be used by a single US&R team- assuming that a single US&R team would own a complete set of ground, aerial, and marine vehicles.

Our experience has been that a single agency or ESF is unlikely to own all the robotic assets. For example, at the Fukushima Daiichi nuclear accident, several different types of ground robots and an aerial robot were deployed simultaneously. It didn’t make sense for a single operator to control all the devices- with a UGV outside the building clearing rubble, a UGV inside inserting a sensor, and a UAV outside conducting a radiological survey, these seem to be delegated functions better kept as separate modules. Furthermore, many of the devices were brought in specifically for the disaster- the best available robots were deployed rather than those JAEA already owned- so there is always the issue of how to incorporate the latest tool.

Even in a relatively small disaster, such as the Prospect Towers parking garage collapse, New Jersey Task Force 1 borrowed ground robots from a law enforcement agency. The point is that for the next decade, teams may be using ad hoc assemblies of robots, not owning a dedicated set of assets.

For CRASAR, the challenge is how the different end-users get the right information from the ad hoc assembly of robotics fast enough to make better decisions.

The project had a host of commendable technical innovations, such as a small solar-powered fixed-wing UAV that demonstrated 81 hours of endurance and provided a wireless network for the responders, a novel stereo sensor for the tiny AscTec Firefly, which they showed flying in through a window, and an exoskeleton controller for a robot arm that is being commercialized.

I particularly liked the ICARUS focus on establishing useful mission protocols. They experimented with launching a fixed-wing immediately to do reconnaissance, provide a wireless network, and give overwatch of the camp, and with using a quadrotor to fly ahead of a convoy to ascertain the best route to the destination when roads might be blocked by rubble or trees.

Word from Responders: “Small UAVs are Available, Now Help Us Use The Data They Generate!” REU Students Provide Apps

Virtual reality reconstruction of the Blanco River flood by Dr. Russ Taylor from a CRASAR flight

TEES just concluded a three-day Summer Institute on Floods (July 26-28, 2015), hosted by our “mother” center, the Center for Emergency Informatics. The Summer Institute focuses on the data-to-decision problems for a particular type of disaster. This year’s Summer Institute was the second in a two-part series on floods and brought together representatives from 12 state agencies, 15 universities, and 5 companies for two days of “this is what we did during the Texas floods” and one day of “this is what we could do or do better” experimentation with small unmanned aerial vehicles, crowdsourcing, computer vision, map-based visualization packages, and mobile phone apps for common operating pictures and data collection.

A portion of the Summer Institute focused strictly on UAVs. The session was organized by the Lone Star UAS Center and the TEES Center for Robot-Assisted Search and Rescue (CRASAR), both of whom flew during the Texas floods.

The UAV field exercises spanned four scenarios witnessed during the floods:

  • Search of cars and other vehicles swept away by a storm surge for trapped victims;
  • Search for missing persons who may have fled campsites or been swept down river;
  • Assessment of damage to power lines and transformers and presence of standing water which prevents restoration of power infrastructure; and
  • Assessment of damage to houses and estimates of household debris, which is critical to insurance companies estimating damage and agencies such as the American Red Cross in projecting social impacts.

The exercises were staged at the 1,900-acre TAMU Riverside Campus with one fixed-wing and three types of rotorcraft flown by CRASAR and David Kovak, a member of CRASAR’s Roboticists Without Borders program.

A major finding of the institute was the realization that while state agencies are adopting UAVs, the agencies can’t process all the imagery coming in. For example, a single 20-minute UAV flight by CRASAR at the floods produced over 800 images totaling 1.7GB. There were over a dozen platforms flying daily for two weeks, as well as Civil Air Patrol and satellite imagery. Each image has to be viewed manually for signs of survivors, clothing, or debris indicating that a house or car (and thus possibly a person) was swept to that location.
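To make the scale concrete, the figures above turn into a sobering back-of-envelope workload estimate (the 30-second per-image review time is my assumption for illustration, not a measured number):

```python
# Back-of-envelope estimate of the manual image-triage workload.
# Figures from the text: one 20-minute flight -> ~800 images (~1.7 GB),
# over a dozen platforms flying daily for two weeks.
IMAGES_PER_FLIGHT = 800
FLIGHTS_PER_DAY = 12       # "over a dozen platforms flying daily"
DAYS = 14                  # "for two weeks"
SECONDS_PER_IMAGE = 30     # assumed time for a careful manual look

total_images = IMAGES_PER_FLIGHT * FLIGHTS_PER_DAY * DAYS
analyst_hours = total_images * SECONDS_PER_IMAGE / 3600

print(f"{total_images:,} images, ~{analyst_hours:,.0f} analyst-hours")
# -> 134,400 images, ~1,120 analyst-hours
```

Even ignoring the Civil Air Patrol and satellite imagery, that is dozens of analysts working around the clock, which is why the agencies asked for automation.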

Given the huge demand for automating image processing, it was no surprise that a panel of four judges from the agencies awarded $900 in prizes to three students from the National Science Foundation Research Experiences for Undergraduates (REU) program. The prizes were for software that classified imagery and displayed where it was taken.

First place went to Julia Proft for an application that used computer vision and machine learning to find images from the floods containing a spectral anomaly such as an unusual color. Below is a UAV image flagged with an anomaly; the anomalies are indicated on a parallel image to make it easier to find where in the image they occurred without marking up the image and blocking the view.

UAV image flagged with an anomaly, with the anomalies indicated on a parallel image
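The spectral-anomaly idea can be sketched in a few lines. This is not Julia's actual code, just a minimal illustration of one standard approach: score each pixel's color by its Mahalanobis distance from the image's mean color (the classic RX anomaly detector), so an orange life jacket in a field of green and brown stands out.

```python
import numpy as np

def color_anomaly_mask(img, z_thresh=4.0):
    """Flag pixels whose color is a statistical outlier for this image.

    img: H x W x 3 float array (RGB). Returns a boolean H x W mask.
    Computes the Mahalanobis distance of each pixel's color from the
    image's mean color -- a simple stand-in for finding "a spectral
    anomaly such as color" in flood imagery.
    """
    pixels = img.reshape(-1, 3).astype(float)
    mean = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False) + 1e-6 * np.eye(3)  # regularize
    inv_cov = np.linalg.inv(cov)
    diff = pixels - mean
    # Squared Mahalanobis distance per pixel, vectorized.
    d2 = np.einsum("ij,jk,ik->i", diff, inv_cov, diff)
    return (np.sqrt(d2) > z_thresh).reshape(img.shape[:2])
```

For example, a mostly uniform scene with one brightly colored patch yields a mask that is true only over the patch; the flagged pixels could then be painted onto a parallel copy of the image, as the winning app did.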

Second place went to Matt Hegarty for an application that used computer vision and machine learning to find images from the floods containing urban debris having straight lines or corners. In this image, the program found instances of trash, PVC poles, and other indications of where houses (and possibly victims) had been swept down river.
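One simple cue behind the straight-lines-and-corners idea is worth sketching (again, an illustration, not Matt's actual code): natural scenes spread edge directions fairly evenly, while boards, pipes, and roofing concentrate them into a few dominant orientations, so the peakedness of a gradient-orientation histogram is a crude man-made-debris score.

```python
import numpy as np

def straight_edge_score(gray, mag_thresh=0.1, bins=18):
    """Crude man-made-debris cue: how concentrated edge directions are.

    gray: 2-D float array in [0, 1]. Returns the fraction of strong-edge
    pixels falling in the single most popular orientation bin; higher
    values suggest straight-line structure (boards, PVC pipe, roofing)
    rather than natural texture.
    """
    gy, gx = np.gradient(gray)          # image gradients (rows, cols)
    mag = np.hypot(gx, gy)
    strong = mag > mag_thresh
    if not strong.any():
        return 0.0
    # Orientation modulo pi: both sides of a line share one direction.
    theta = np.arctan2(gy, gx)[strong] % np.pi
    hist, _ = np.histogram(theta, bins=bins, range=(0.0, np.pi))
    return hist.max() / hist.sum()
```

An image of parallel planks scores near 1.0, while random texture scores near the uniform baseline of 1/bins; a real classifier would combine several such features with learned weights, as the prize-winning app did with machine learning.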


Third place went to Abygail McMillian for a program that visually displayed where and when UAV data, or any geotagged data, was taken. The spatial aspect is important for responders to see where assets have already sensed- in Nepal, UAV teams often surveyed the same areas but missed important regions. The temporal aspect (a timeline scale on the map) is important because with changes in the river flood stage, hydrological and missing-person images become “stale” and flights should be reflown to get new data.

Data display with timeline
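The staleness logic behind such a display is small enough to sketch (the function and area names here are hypothetical, for illustration only): given geotagged, timestamped records and a freshness window, report which surveyed areas need to be reflown.

```python
from datetime import datetime, timedelta

def stale_areas(records, now, max_age=timedelta(hours=6)):
    """Return survey areas whose newest imagery is older than max_age.

    records: iterable of (area_name, capture_time) pairs -- the kind of
    geotag/timestamp data the mapping app extracts from UAV imagery.
    During a flood, the river stage changes fast enough that imagery
    older than a few hours should trigger a re-fly of that area.
    """
    newest = {}
    for area, t in records:
        if area not in newest or t > newest[area]:
            newest[area] = t
    return sorted(a for a, t in newest.items() if now - t > max_age)

# Example: two areas flown in the morning, one re-flown at noon.
flights = [
    ("river-bend", datetime(2015, 5, 26, 8, 0)),
    ("campground", datetime(2015, 5, 26, 9, 30)),
    ("river-bend", datetime(2015, 5, 26, 12, 0)),
]
print(stale_areas(flights, now=datetime(2015, 5, 26, 17, 0)))
# -> ['campground']  (last imaged 7.5 h ago; river-bend was re-flown)
```

Plotting each record's location on a map and its capture time on a timeline, with stale areas highlighted, gives the display the responders asked for.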

The 10 students (nine from under-represented groups) were from Colorado, Georgia, Puerto Rico, South Carolina, Texas, Vermont, and Wyoming and spent the summer conducting research through the Computing for Disasters NSF REU site grant under the direction of Dr. Robin Murphy. Multiple state agencies requested that the student-created apps be hardened and released. In previous years, the panels have awarded prizes primarily to hardware- UAVs, battery systems, communications nodes, etc. This year, the focus was on software and programming to help analyze the data the hardware generates.



How to Fly at Floods: Summer Institute PLUS Roboticists Without Borders UAV training

One word about the floods and about UAVs: informatics. Read on to see what I mean ;-)

As I blogged earlier, on July 26-28, the Center for Emergency Informatics’ 2015 Summer Institute is on flooding and will bring together state agencies and municipalities who were part of the Texas floods with researchers and industry for a two-day workshop and a one-day exercise. The exercise will include UAVs flying missing-person missions and recovery and restoration missions.

Notice that it’s the Center for Emergency Informatics hosting the event, because it’s about the right data getting to the right agencies or stakeholders at the right time, displayed in the right way, so that they can make the right decisions. UAVs (and marine vehicles such as EMILY and the small airboats being developed at Carnegie Mellon, Texas A&M, and the University of Illinois) have a big role to play. But UAVs are useful only if the entire data-to-decision process works, aka informatics.

The Summer Institute July 26-28 will also host a training session for Roboticists Without Borders members specifically on UAVs and the best practices for how to fly at floods and upcoming hurricanes and collect useful data– what do the decision makers need? Again, this is the informatics, the science of data, not the aeronautics. The training is independent of platform- because what the decision makers need is what they need ;-)  The current (and evolving) best practices are derived from three sources:

  1. CRASAR RWB deployments going back to 2005 Hurricane Katrina Pearl River cresting and including the Oso Mudslides and our deployment with Lone Star UAS Center to the Texas floods,
  2. the reports and analyses of what has worked at typhoons and other flooding events worldwide, and
  3. what researchers throughout the world, especially the IEEE Safety, Security, and Rescue Robotics technical community, are doing.

For example, video is not as useful as high-resolution imagery for searching for missing persons. Infrared isn’t helpful except in the early morning. Some missions and terrains require a remote-presence style of control; others can use preprogrammed flight paths. Complex terrains such as river bluffs may require flight paths that are vertical slices, not horizontal slices. There are many more, and I’m sure we will learn more from each other.

The training session will consist of evening classes on July 26 and 27, with field work on July 28 at the 1,900-acre Riverside Campus. We will fly fixed-wing and rotorcraft UAVs for response missions (recon, missing persons, flood mitigation) and for recovery/restoration (damage assessment, debris estimation, household debris estimation, power utility assessment, transportation assessment). The scenarios will be designed by experts from Texas Task Force 1 and representatives from the agencies that would use the information, including fire rescue, law enforcement, the Texas insurance commission, SETRAC, etc.

It’s not too late to join Roboticists Without Borders and attend! It’s free.

Hope to see you there!


Summer Institute Dates Announced (finally!) July 26-28

The 2015 Summer Institute on Flooding will be held on July 26-28 at the Riverside Campus, College Station, TX.  Check out the information.


Robots and Ebola

I’ve been working since Sept 17 on robots for the Ebola epidemic– both in terms of what can be used now and what can be used for future epidemics. Dr. Taskin Padir at WPI deserves a big shout out for calling the robotics community’s attention to this, along with Gill Pratt, head of the DARPA Robotics Challenge at DARPA, and Richard Voyles, Associate Dean at Purdue.
I am pleased to announce that CRASAR will be co-hosting a White House Office of Science and Technology Policy workshop on Safety Robotics for Ebola Workers on Nov. 7. Texas A&M was already planning a medical response workshop on the 7th for disasters in general, so we are expanding that into a virtual event over the internet with sessions at the White House (OSTP and DARPA), Boston (Taskin), and Berkeley (Ken Goldberg). CRASAR is also planning to host another workshop to share the results of our current research into specific use cases with the robotics community in the Jan 3-15, 2015, timeframe.
Here on campus, students will be creating prototypes as part of the Aggies Invent “What Would You Build for a First Responder?” event on Oct. 24-27, and the students in my graduate AI Robotics class this semester will be designing and simulating intelligent robots.
The real issue to me is: what are the real roles that robots can play in such a complex event? Here are some possibilities that have emerged in discussions, and I am sure that there are many more (let me know what you think!):
  • Mortuary robots to respectfully transport the deceased, as Ebola is most virulent at the time of death and immediately following death
  • Reducing the number of health professionals within the biosafety labs and field hospitals (e.g., automated materials handling, telerobotic patient care)
  • Detection of contamination (e.g., does this hospital room, ambulance, or house have Ebola?)
  • Disinfection (e.g., robots that can open the drawers and doors for the commercially available “little Moe” disinfectant robot)
  • Telepresence robots for experts to consult/advise on medical issues, train and supervise worker decontamination to catch accidental self-contamination, and serve as “rolling interpreters” for the different languages and dialects
  • Physical security for the workers (e.g., the food riots in Sierra Leone)
  • Waste handling (e.g., where is all the biowaste from patients and worker suits going, and how is it getting there?)
  • Humanitarian relief (e.g., autonomous food trucks, UAVs that can drop off food, water, medicine, but also “regular” medicine for diabetes, etc., for people who are healthy but cut off)
  • Reconnaissance (e.g., what’s happening in this village? Any signs of illness? Are people fleeing?)
In order to be successful at any one of these tasks, robots have to meet a lot of hidden requirements, and sometimes the least exciting or glamorous job can be of the most help to the workers. Example hidden requirements: Can an isolated field hospital handle a heavy robot in the muddy rainy season? How will the robots be transported there? Is the robot easy enough for locals to use so that they can be engaged and earn a living wage? What kind of network communication is available? What if it needs repairs? That’s what I am working on, applying the lessons learned in robotics for meteorological and geological disasters.
I am certainly not working alone and am reaching out to experts all over the world. In particular, four groups have immediately risen to the challenge and are helping. Matt Minson, MD, head of Texas Task Force 1’s medical team, and Eric Rasmussen, MD, FACP (a retired Navy doctor), who has served as the medical director for the Center for Robot-Assisted Search and Rescue since 9/11, have offered their unique insights. There are two DoD groups: the USMC Chemical Biological Incident Response Force (the team that cleaned up the anthrax in DC), on whose technical advisory board I’ve served, and the Army Telemedicine & Advanced Technology Research Center (TATRC), where Gary Gilbert, MD, has led highly innovative work in telemedicine and in casualty evacuation (Matt and I had a grant evaluating robotic concepts).

Preventing disasters: small unmanned aerial vehicles for evacuation and crowd control

This month’s issue of Smithsonian Magazine has an article, “Why are people so comfortable with drones?”, on Brittany Duncan’s preliminary study for her NSF graduate fellowship research on using small UAS for evacuation and crowd control. Brittany is my PhD student who recently flew the AirRobot at the SR530 mudslide response. It’s nice to see that robots are being considered for more than the immediate life-saving aspects of search and rescue. Brittany sees a near future where aerial vehicles can act as “headers” and “heelers” to guide and block people into following the right exits during an evacuation.

China earthquake and Bangladesh collapse… the challenges of remote disasters

The Chinese earthquake and the Bangladesh collapse, coming on the heels of the Tanzania building collapse, illustrate the need for rapidly deployable regional teams of disaster robots that can quickly get there. The Bangladesh collapse might have been aided by the use of small robots to penetrate the rubble. Ground robots are less useful over a wide area of residential buildings, though UAVs are very helpful for assessing the extent of damage. But for now, the best we can do in the rescue robot community is to send our thoughts and prayers to the victims, their families, and the responders.

Reuters video: Fukushima disaster tests mettle of local robot makers

Check out this 3-minute video on Japanese robots being used, or developed, for Fukushima. Big shout out to Prof. Eiji Koyanagi at the Chiba Institute of Technology- he’s been a real pioneer in rescue robotics.

UAV used with Chemical Train Derailment- just like IEEE SSRR Paper Predicted

The Unmanned Systems Technology website reports that a Datron Scout was used to assist with a chemical train derailment last week. This is a great use of small UAVs, and one that CRASAR has been exploring with TEEX through funding from the National Science Foundation. Josh Peschel (now a research professor at the University of Illinois), Clint Arnett (TEEX), Chief David Martin (TEEX), and I presented a paper two weeks ago at the IEEE International Symposium on Safety, Security, and Rescue Robotics on “Projected Needs for Robot-Assisted Chemical, Biological, Radiological, or Nuclear (CBRN) Incidents,” based on Josh’s PhD work with 20 domain experts using a small unmanned aerial vehicle (UAV) to investigate a simulated chemical train derailment at Disaster City®. The paper was a finalist for Best Paper. Good to see the Scout used!

A Decade of Rescue Robots Video out!

Check out our new video presented at IROS 2012 for the Jubilee video competition: it shows the past ten years of rescue robots and CRASAR’s deployments.