CRASAR News / Blog
Robots with cameras, microphones and sensors searched for victims stranded in flooded homes and on rooftops. They assessed damage and sent back images from places rescuers couldn’t get. It was August 31, 2005, two days after Hurricane Katrina hit the Gulf Coast. These robots were a crucial connection between emergency responders and survivors. Ten years later, new technology is changing the way we handle whatever life throws at us. In the case of disaster relief and recovery, this means more effective ways to save lives and begin the arduous process of rebuilding after catastrophe.
“You’ve got a golden 72 hours of the initial response that’s very critical,” said Dr. Robin Murphy, a robotics professor and director of the Center for Robot-Assisted Search and Rescue (CRASAR) at Texas A&M University, who has also worked with robots after the September 11, 2001, attacks, in natural disasters such as Hurricane Katrina, and at the Fukushima nuclear accident. “Then you have the restoration of services. After the emergency teams have got everything under control, you got to get your power back on, your sewage, you know, your roads and that.”
UAVs such as the PrecisionHawk Lancaster, a fixed-wing drone, not only aid human disaster responders by providing photos of where to look for victims, but also provide a valuable resource for determining how to approach relief efforts. “It acts like a plane. It’s smarter than a plane because it’s got all sorts of onboard electronics to let it do preprogrammed surveys. It takes pictures like on a satellite or a Mars explorer and then pulls those back together into a hyper-accurate map — a 3-D reconstruction,” Murphy said. Murphy also said it’s not only very accurate, but also easy to pick up and maneuver.
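To illustrate what a preprogrammed survey looks like in practice, here is a minimal sketch of generating the back-and-forth ("lawnmower") waypoint grid such flights typically follow. The field dimensions, swath width, and overlap values are illustrative assumptions, not PrecisionHawk specifications:

```python
# Hypothetical sketch of a "lawnmower" survey grid like the preprogrammed
# surveys described above. All parameter values are illustrative.

def survey_waypoints(width_m, height_m, swath_m, overlap=0.3):
    """Back-and-forth waypoints covering a width x height area.

    swath_m is the ground footprint of one camera pass; consecutive
    passes are spaced so they overlap by the given fraction, which is
    what lets stitching software reconstruct a seamless map.
    """
    spacing = swath_m * (1.0 - overlap)   # distance between parallel passes
    waypoints = []
    y = 0.0
    leg = 0
    while y <= height_m:
        # Alternate direction each leg so the aircraft sweeps back and forth.
        if leg % 2 == 0:
            waypoints.append((0.0, y))
            waypoints.append((width_m, y))
        else:
            waypoints.append((width_m, y))
            waypoints.append((0.0, y))
        y += spacing
        leg += 1
    return waypoints

wps = survey_waypoints(width_m=500, height_m=300, swath_m=60, overlap=0.3)
print(len(wps), "waypoints, first leg:", wps[:2])
```

The overlap between passes is what makes the later 3-D reconstruction possible: stitching software needs each ground feature to appear in multiple photos.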
Check out the rest of the article here
Scientists in the Department of Mechanical and Aerospace Engineering at the Indian Institute of Technology-Hyderabad (IIT-H) have designed two prototypes of a snake robot for search and rescue missions, called SARP (Snake-like Articulated Robot Platform). Built from fire-proof ABS plastic, the metre-long prototypes use snake-like motion to navigate rough terrain, the researchers said. The robots can also communicate with each other.
“In a disaster site, like a collapsed building in an earthquake, a building on fire, or a dangerous environment, like a nuclear power plant in an accident, a snake robot can be used to access difficult-to-reach spaces and look for survivors under the debris,” R. Prasanth Kumar, associate professor at the department, told IANS. “It can then relay valuable information about the environment and help rescue workers in planning their missions,” Kumar said.
Check out more information here
A group of researchers will start development next month of a rescue robot that can detect human scents at disaster sites where people may be trapped under debris or earth and sand. The group will consist of researchers from the University of Tokyo, major chemical company Sumitomo Chemical Co. and the Kanagawa Academy of Science and Technology.
The researchers will draw on mosquitoes’ ability to distinguish the faintest smell of animal or human perspiration to create a small sensor that can be attached to an unmanned drone or other device. They aim to put these robots to practical use by 2020.
Check out more information here
Word from Responders: “Small UAVs are Available, Now Help Us Use The Data They Generate!” REU Students Provide Apps
TEES just concluded a three-day Summer Institute on Floods (July 26-28, 2015), which was hosted by our “mother” center, the Center for Emergency Informatics. The Summer Institute focuses on the data-to-decision problems for a particular type of disaster. This year’s Summer Institute was the second in a two-part series on floods and brought together representatives from 12 State agencies, 15 universities, and 5 companies for two days of “this is what we did during the Texas floods” and one day of “this is what we could do or do better” experimentation with small unmanned aerial vehicles, crowd sourcing, computer vision, map-based visualization packages, and mobile phone apps for common operating pictures and data collection.
A portion of the Summer Institute focused strictly on UAVs. The session was organized by the Lone Star UAS Center and the TEES Center for Robot-Assisted Search and Rescue (CRASAR), both of whom flew during the Texas floods.
The UAV field exercises spanned four scenarios witnessed during the floods:
- Search of cars and other vehicles swept away by a storm surge for trapped victims;
- Search for missing persons who may have fled campsites or been swept down river;
- Assessment of damage to power lines and transformers and presence of standing water which prevents restoration of power infrastructure; and
- Assessment of damage to houses and estimates of household debris, which is critical to insurance companies estimating damage and agencies such as the American Red Cross in projecting social impacts.
The exercises were staged at the 1,900 acre TAMU Riverside Campus with one fixed wing and three types of rotorcraft flown by CRASAR and David Kovak, a member of CRASAR’s Roboticists Without Borders program.
A major finding of the institute was that while State agencies are adopting UAVs, the agencies can’t process all the imagery coming in. For example, a single 20-minute UAV flight by CRASAR at the floods produced over 800 images totaling 1.7GB. There were over a dozen platforms flying daily for two weeks, as well as Civil Air Patrol and satellite imagery. Each image has to be viewed manually for signs of survivors, clothing, or debris that indicate a house or car (and thus a person too) was swept to this location.
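To put those figures in perspective, a quick back-of-the-envelope calculation shows why manual review doesn't scale. The 30-seconds-per-image review time is my assumption, not a measured value:

```python
# Rough triage load from the figures in the post; the per-image review
# time is an assumed illustrative value.
images_per_flight = 800   # one 20-minute CRASAR flight
flights_per_day = 12      # "over a dozen platforms flying daily"
days = 14                 # two weeks of flights
seconds_per_image = 30    # assumed manual inspection time

total_images = images_per_flight * flights_per_day * days
analyst_hours = total_images * seconds_per_image / 3600
print(f"{total_images:,} images = roughly {analyst_hours:,.0f} analyst-hours")
# 134,400 images = roughly 1,120 analyst-hours
```

Even under these conservative assumptions, the imagery from one event represents months of full-time analyst work, which is exactly the bottleneck the student software below targets.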
Given the huge demand for automating image processing, it was no surprise that a panel of four judges from the agencies awarded $900 in prizes to three students from the National Science Foundation Research Experiences for Undergraduates (REU) program. The prizes were for software that classified imagery and displayed where it was taken.
First place went to Julia Proft for an application that used computer vision and machine learning to find images from the floods containing a spectral anomaly, such as an unusual color. Below is a UAV image flagged with an anomaly; the anomalies are indicated on a parallel image to make it easier to see where in the image they occurred without marking up the original and blocking the view.
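As an illustration of the general idea (not the actual REU software), a spectral-anomaly pass can flag pixels whose color is statistically unusual for the scene, such as a bright jacket against mud and vegetation. The Mahalanobis-distance test and the threshold here are my assumptions:

```python
import numpy as np

# Hypothetical sketch of spectral anomaly flagging: pixels far from the
# scene's mean color (in Mahalanobis distance) are marked on a separate
# mask, leaving the original image unobstructed.

def anomaly_mask(image, threshold=5.0):
    """Return a boolean H x W mask of spectrally anomalous pixels.

    image: H x W x 3 array of RGB values. A pixel is flagged when its
    Mahalanobis distance from the scene's mean color exceeds threshold.
    """
    pixels = image.reshape(-1, 3).astype(float)
    mean = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False)
    inv_cov = np.linalg.inv(cov + 1e-6 * np.eye(3))  # regularize
    diff = pixels - mean
    # Squared Mahalanobis distance for every pixel at once.
    d2 = np.einsum("ij,jk,ik->i", diff, inv_cov, diff)
    return (np.sqrt(d2) > threshold).reshape(image.shape[:2])

# Mostly-brown synthetic scene with a small bright-blue patch standing in
# for, say, a victim's clothing:
scene = np.random.default_rng(0).normal([120, 100, 80], 5, (100, 100, 3))
scene[40:43, 50:53] = [30, 60, 220]
mask = anomaly_mask(scene)
print("anomalous pixels flagged:", int(mask.sum()))
```

Displaying the mask alongside the image, rather than drawn on top of it, matches the design choice described above.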
Second place went to Matt Hegarty for an application that used computer vision and machine learning to find images from the floods that contained urban debris having straight lines or corners. In this image, the program found instances of trash, PVC poles, and other indications of where houses (and possibly victims) had been swept down river.
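Again as an illustration rather than the actual REU code, one simple cue for man-made straight edges is that boards, pipes, and roof panels concentrate edge gradients in a few orientations, while vegetation and water produce diffuse orientations. The histogram test and thresholds below are my assumptions:

```python
import numpy as np

# Hypothetical sketch of a straight-line cue for urban debris: score an
# image by how concentrated its strong edge orientations are.

def straight_edge_score(gray, bins=18, grad_thresh=30.0):
    """Fraction of strong-edge pixels in the two dominant orientation
    bins; high scores suggest man-made straight edges."""
    gy, gx = np.gradient(gray.astype(float))
    mag = np.hypot(gx, gy)
    strong = mag > grad_thresh
    if strong.sum() == 0:
        return 0.0
    # Fold orientation into [0, pi) so opposite edge directions match.
    theta = np.arctan2(gy, gx)[strong] % np.pi
    hist, _ = np.histogram(theta, bins=bins, range=(0, np.pi))
    top_two = np.sort(hist)[-2:].sum()
    return float(top_two / hist.sum())

rng = np.random.default_rng(1)
natural = rng.normal(128, 40, (64, 64))   # textured noise, no structure
# Vertical stripes stand in for planks/pipes: one dominant edge orientation.
debris = np.tile(np.where(np.arange(64) % 8 < 4, 50.0, 200.0), (64, 1))
print("natural scene score:", round(straight_edge_score(natural), 2))
print("debris scene score:", round(straight_edge_score(debris), 2))
```

A triage tool could sort images by this score so responders look first at frames most likely to contain house debris.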
Third place went to Abygail McMillian for a program that visually displayed where and when UAV data, or any geotagged data, was taken. The spatial aspect is important for responders to see where assets have already sensed; in Nepal, UAV teams often surveyed the same areas but missed important regions. The temporal aspect (a timeline scale on the map) is important because, as the river flood stage changes, hydrological and missing-person images become “stale” and flights should be reflown to get new data.
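The spatiotemporal bookkeeping behind such a display can be sketched as follows; this is my illustration of the idea, not the actual app, and the records, grid-cell size, and staleness cutoff are invented for the example:

```python
from datetime import datetime

# Hypothetical sketch: bin geotagged, timestamped images into grid cells
# and split coverage into fresh vs stale relative to a cutoff (e.g., the
# time the river stage last changed).

CELL = 0.01  # grid cell size in degrees (roughly 1 km at this latitude)

def coverage(records, stale_after):
    """Map each grid cell to its most recent capture time, returning
    (fresh, stale) dicts keyed by cell."""
    latest = {}
    for lat, lon, when in records:
        cell = (int(lat // CELL), int(lon // CELL))
        if cell not in latest or when > latest[cell]:
            latest[cell] = when
    fresh = {c: t for c, t in latest.items() if t >= stale_after}
    stale = {c: t for c, t in latest.items() if t < stale_after}
    return fresh, stale

records = [
    (30.628, -96.334, datetime(2015, 5, 26, 9, 0)),
    (30.628, -96.334, datetime(2015, 5, 27, 14, 0)),  # cell re-flown later
    (30.651, -96.301, datetime(2015, 5, 25, 16, 0)),  # never re-flown
]
cutoff = datetime(2015, 5, 27, 0, 0)  # stage changed; older imagery is stale
fresh, stale = coverage(records, cutoff)
print(len(fresh), "fresh cells,", len(stale), "stale cells to re-fly")
```

Plotting the two dictionaries in different colors on a map, with the cutoff driven by a timeline slider, gives exactly the "where have we sensed, and is it still current?" view described above.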
The 10 students (nine from under-represented groups) were from Colorado, Georgia, Puerto Rico, South Carolina, Texas, Vermont, and Wyoming, and spent the summer conducting research through the Computing for Disasters NSF REU site grant under the direction of Dr. Robin Murphy. Multiple State agencies requested those student-created apps be hardened and released. In previous years, the panels have awarded prizes primarily to hardware: UAVs, battery systems, communications nodes, and the like. This year, the focus was on software and programming to help analyze the data the hardware generates.
We just concluded a three-day Summer Institute on Floods (July 26-28, 2015), which was hosted by our “mother” center, the Center for Emergency Informatics. The Summer Institute focuses on the data-to-decision problems for a particular type of disaster. This year’s Summer Institute was the second in a two-part series on floods: even though there was a drought last year, our TEEX partners were very worried about flooding in Texas, and flooding is the number one disaster in the world. The institute was originally scheduled for early June but had to be moved due to the floods, which gave us the opportunity to discuss topics while they were still fresh on everyone’s minds.
The Summer Institute brought together representatives from 12 agencies, 15 universities, and 5 companies for two days of “this is what we did during the Texas floods” and one day of “this is what we could do or do better” experimentation with small unmanned aerial vehicles, crowd sourcing, computer vision, map-based visualization packages, and mobile phone apps for common operating pictures and data collection. The field exercise was designed to track the resource allocation of field teams, how they can acquire data from the field from UAVs and smartphones, then how the data can be effectively organized, analyzed, and visualized.
Here are my preliminary findings:
- Agencies are adopting small UAVs, social media, and smartphone apps to acquire new types of data, faster and over larger geospatial and temporal scales than ever. Small UAVs were used during the Texas floods in all of the affected regions (big shout out to the Lone Star UAS Center; we flew under their direction during the floods). Agencies were monitoring social media, in one case to uncover and combat a rumor that a levee had broken. Texas Task Force 1 and other agencies used General Dynamics’ GeoSuite mobile app for giving tactical responders a common operating picture (big shout out: GeoSuite was introduced and refined starting with the 2011 Summer Institute).
- Wireless communications is no longer the most visible barrier to acquiring data from the field in order to make better decisions. While responders in the field may not have as much bandwidth as they did before a disaster, cell towers on wheels are being rapidly deployed, and the Civil Air Patrol flew repeater nodes. That said, wireless communications is not a solved problem: hilly geography can prevent teams from connecting to sparse temporary nodes, and keep in mind that large parts of rural Texas have limited or no connectivity under normal conditions.
- The new barrier is what to do with the data coming in from unmanned systems, news feeds, social media feeds, and smartphones. Consider that a single 20-minute UAV flight produced over 800 images totaling 1.7GB. There were over a dozen platforms flying daily for two weeks, as well as Civil Air Patrol and satellite imagery. Most of the imagery was being used to search for missing persons, which means each image has to be inspected manually by at least one person (preferably more). Signs of missing persons are hard to see, as there may be only a few pixels of clothing (victims may be covered in mud or obscured by vegetation and debris) or urban debris (as in, if you see parts of a house, there may be the occupant of the house somewhere in the image). Given the multiple agencies and tools, it was hard to pinpoint what data had been collected where and when (i.e., spatial and temporal complexity) and then access the data by area or time. Essentially no one knew what they had. Agencies and insurance companies had to manually sort through news feeds and public postings, both text and images, to find nuggets of relevant information.
- The future of disasters was clearly in organizing, analyzing, and visualizing the data. Responders flocked to SituMap, an interactive map-based visualization tool that Dr. Rick Smith at TAMU Corpus Christi started developing after the 2010 Summer Institute and that is now being purchased by TEEX and other responders. The agencies awarded $900 in first-, second-, and third-place prizes to NSF Research Experiences for Undergraduates students for software that classified imagery and displayed where it was taken. Multiple agencies requested those apps be hardened and released, along with the Skywriter sketch-and-point interface (CRASAR developed that for UAVs, and it is being commercialized) and the wide-area search planning app developed over the last two summers by other students from the NSF REU program. In previous years, the panels have awarded prizes primarily to hardware: UAVs, battery systems, communications nodes, and the like. This year, the attitude was “those technologies are available, now help us use the data they generate!”
Each year we hear the cry from emergency management “we’re drowning in data, help” and this year it was more than a bad pun.
One word about the floods and about UAVs: informatics. Read on to see what I mean.
As I blogged earlier, on July 26-28 the Center for Emergency Informatics’ 2015 Summer Institute is on flooding and will bring together state agencies and municipalities that were part of the Texas floods with researchers and industry for a two-day workshop and a one-day exercise. The exercise will include UAVs flying the missing persons missions and the recovery and restoration missions.
Notice that it’s the Center for Emergency Informatics hosting the event because it’s about the right data getting to the right agencies or stakeholders at the right time and displayed in the right way that will enable them to make the right decisions. UAVs (and marine vehicles such as Emily and the small airboats being developed at Carnegie Mellon, Texas A&M, and University of Illinois) have a big role to play. But UAVs are useful only if the entire data-to-decision process works, aka informatics.
The Summer Institute July 26-28 will also host a training session for Roboticists Without Borders members specifically on UAVs and the best practices of how to fly at floods and upcoming hurricanes and collect useful data: what do the decision makers need? Again, this is the informatics, the science of data, not the aeronautics. The training is independent of platform, because what the decision makers need is what they need. The current (and evolving) best practices are derived from three sources:
- CRASAR RWB deployments going back to the 2005 Hurricane Katrina Pearl River cresting and including the Oso Mudslides and our deployment with the Lone Star UAS Center to the Texas floods,
- the reports and analyses of what has worked at typhoons and other flooding events worldwide, and
- what researchers throughout the world, especially the IEEE Safety, Security, and Rescue Robotics technical community, are doing.
For example, video is not as useful as high-resolution imagery for searching for missing persons. Infrared isn’t helpful except in the early morning. Some missions and terrains require a remote-presence style of control; others can use preprogrammed flight paths. Complex terrains such as river bluffs may require flight paths that are vertical slices, not horizontal slices. There are many more, and I’m sure we will learn more from each other.
The training session will consist of evening classes on July 26 and 27, with field work on July 28 at the 1,900-acre Riverside Campus. We will fly fixed-wing and rotorcraft platforms for response missions (recon, missing persons, flood mitigation) and for recovery/restoration missions (damage assessment, debris estimation, household debris estimation, power utility assessment, transportation assessment). The scenarios will be designed by experts from Texas Task Force 1 and representatives from the agencies that would use the information, including fire rescue, law enforcement, the Texas insurance commission, SETRAC, etc.
It’s not too late to join Roboticists Without Borders and attend! It’s free.
Hope to see you there!
The 2015 Summer Institute on Flooding will be held on July 26-28 at the Riverside Campus, College Station, TX. Check out the information at http://crasar.org/?p=1834
Two boys needed to be rescued from the raging Little Androscoggin River in Maine after their tube overturned Tuesday. Only one of them was wearing a life jacket. Frank Roma, chief of the Auburn fire department, wanted to get the other boy a life jacket before attempting the rescue. The water was rough, and rescuers had a hard time getting to the boys, so they used Roma’s personal drone to deliver a life jacket and a safety line.
“We wanted to make sure we got a life jacket on that second child so that if they did fall in the water we could catch them downstream,” said Roma. “We used the drone to fly a tag line out to the young man that was on the rock, we instructed him to untie and to pull life jacket over to him.”
Check out more information here
Robot boats as swift water rescuers, not just for critical infrastructure and restoration/recovery operations anymore!
Grant Wilde and Gino Chacon observed the EMILY rescue boat, a new concept in disaster robotics. I have followed Tony Mulligan and his work with EMILY since 2012, and EMILY is really gaining acceptance. He demoed the boat this week during a swift water exercise (in which our partners at the Austin Fire Department participated; thanks, Coitt, for the directions!). The robot acts as a barrel-shaped life ring: an operator teleoperates EMILY to a victim in the water, the victim grabs on, and the operator then uses the tether to pull EMILY and her cargo to safety.