Katrina: Aug. 31, 2005, 10th Anniversary of First Small UAS Flight

CRASAR was the first to fly small UAS (SUAS) at a disaster: Hurricane Katrina, on Aug. 31 and Sept. 1, 2005, for the State of Florida, which was assisting with the Mississippi response. Other groups flew too, but CRASAR was first, with iSENSYS and WinTEC as our Roboticists Without Borders partners. WIRED has a nice piece on Katrina with some of the footage from our birds.

We were originally tasked to go to New Orleans to help our colleagues at the LSU Fire and Emergency Training Institute. It’s a long story, but basically we needed a police escort to get into NOLA, so we fell back to Mississippi to help there. Later we returned to the Gulf Coast and flew 32 structural inspection missions over 8 days, establishing the crew organization and operational protocols that are now standard in the European Union and Japan. We also discovered that experts from FEMA, Thornton Tomasetti, and universities could not readily comprehend the imagery. As we all suspected, getting photos IS different from being there, and more so when safety is involved. That finding motivated a significant amount of research, including the Skywriter project to help remote experts direct the robot through the real-time video feed.

Robots, drones and heart-detectors: How disaster technology is saving lives

Robots with cameras, microphones and sensors searched for victims stranded in flooded homes and on rooftops. They assessed damage and sent back images from places rescuers couldn’t reach. It was August 31, 2005, two days after Hurricane Katrina hit the Gulf Coast. These robots were a crucial connection between emergency responders and survivors. Ten years later, new technology is changing the way we handle whatever life throws at us. In the case of disaster relief and recovery, this means more effective ways to save lives and begin the arduous process of rebuilding after catastrophe.

“You’ve got a golden 72 hours of the initial response that’s very critical,” said Dr. Robin Murphy, a robotics professor and director of the Center for Robot-Assisted Search and Rescue (CRASAR) at Texas A&M University, who also worked with robots after the September 11, 2001, attacks, in natural disasters such as Hurricane Katrina, and at the Fukushima nuclear accident. “Then you have the restoration of services. After the emergency teams have got everything under control, you got to get your power back on, your sewage, you know, your roads and that.”

UAVs such as the PrecisionHawk Lancaster, a fixed-wing drone, not only aid human disaster responders by providing photos of where to look for victims, but also provide a valuable resource for determining how to approach the relief efforts. “It acts like a plane. It’s smarter than a plane because it’s got all sorts of onboard electronics to let it do preprogrammed surveys. It takes pictures like on a satellite or a Mars explorer and then pulls those back together into a hyper-accurate map — a 3-D reconstruction,” Murphy said. Murphy also said it’s not only very accurate, but it’s also easy to pick up and maneuver.

Check out the rest of the article here
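The article doesn’t detail how the Lancaster “pulls those pictures back together,” but the usual first step in that kind of photogrammetry is matching features between overlapping frames and warping them into a common frame. Below is a minimal sketch of that step using OpenCV; the file names are hypothetical, and a real mapping pipeline adds georeferencing, bundle adjustment, and hundreds more images.

```python
# Minimal two-image mosaic sketch: match features shared by overlapping
# aerial photos and warp one into the other's frame. "frame_a.jpg" and
# "frame_b.jpg" are hypothetical overlapping survey images.
import cv2
import numpy as np

img_a = cv2.imread("frame_a.jpg")
img_b = cv2.imread("frame_b.jpg")

# Detect and match local features in both photos.
orb = cv2.ORB_create(5000)
kp_a, des_a = orb.detectAndCompute(img_a, None)
kp_b, des_b = orb.detectAndCompute(img_b, None)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)[:200]

# Estimate the homography that maps image B into image A's frame.
src = np.float32([kp_b[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp_a[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

# Warp B onto a canvas containing A, yielding a simple mosaic.
h, w = img_a.shape[:2]
mosaic = cv2.warpPerspective(img_b, H, (w * 2, h))
mosaic[0:h, 0:w] = img_a
cv2.imwrite("mosaic.jpg", mosaic)
```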

Indian Scientists Making Snake Robot for Search and Rescue Missions

Scientists in the department of mechanical and aerospace engineering at the Indian Institute of Technology-Hyderabad (IIT-H) have designed two prototypes of a snake robot for search and rescue missions, called SARP (Snake-like Articulated Robot Platform). Built from fire-proof ABS plastic, the metre-long prototypes use snake-like motion to navigate rough terrain, and the robots can also communicate with each other.

“In a disaster site, like a collapsed building in an earthquake, a building on fire, or a dangerous environment, like a nuclear power plant in an accident, a snake robot can be used to access difficult-to-reach spaces and look for survivors under the debris,” R. Prasanth Kumar, an associate professor in the department, told IANS. “It can then relay valuable information about the environment and help rescue workers in planning their missions,” Kumar said.

Check out more information here

Researchers to tap mosquitoes’ sense of smell to develop rescue bot

A group of researchers will start development next month of a rescue robot that can detect human scents at disaster sites where people may be trapped under debris or earth and sand. The group will consist of researchers from the University of Tokyo, major chemical company Sumitomo Chemical Co. and the Kanagawa Academy of Science and Technology.

The researchers will draw on mosquitoes’ ability to distinguish the faintest smell of animal or human perspiration to create a small sensor that can be attached to an unmanned drone or other device. They aim to put these robots to practical use by 2020.

Check out more information here

Word from Responders: “Small UAVs are Available, Now Help Us Use The Data They Generate!” REU Students Provide Apps

Virtual reality reconstruction of the Blanco River flood by Dr. Russ Taylor from a CRASAR flight

TEES just concluded a three-day Summer Institute on Floods (July 26-28, 2015), which was hosted by our “mother” center, the Center for Emergency Informatics. Each Summer Institute focuses on the data-to-decision problems for a particular type of disaster. This year’s was the second in a two-part series on floods and brought together representatives from 12 State agencies, 15 universities, and 5 companies for two days of “this is what we did during the Texas floods” and one day of “this is what we could do, or do better” experimentation with small unmanned aerial vehicles, crowdsourcing, computer vision, map-based visualization packages, and mobile phone apps for common operating pictures and data collection.

A portion of the Summer Institute focused strictly on UAVs. The session was organized by the Lone Star UAS Center and the TEES Center for Robot-Assisted Search and Rescue (CRASAR), both of which flew during the Texas floods.

The UAV field exercises spanned four scenarios witnessed during the floods:

  • Search for victims trapped in cars and other vehicles swept away by the storm surge;
  • Search for missing persons who may have fled campsites or been swept down river;
  • Assessment of damage to power lines and transformers, and of standing water that prevents restoration of the power infrastructure; and
  • Assessment of damage to houses and estimates of household debris, which is critical for insurance companies estimating damage and for agencies such as the American Red Cross projecting social impacts.

The exercises were staged at the 1,900-acre TAMU Riverside Campus, with one fixed-wing aircraft and three types of rotorcraft flown by CRASAR and David Kovak, a member of CRASAR’s Roboticists Without Borders program.

A major finding of the institute was that while State agencies are adopting UAVs, they cannot process all the imagery coming in. For example, a single 20-minute UAV flight by CRASAR at the floods produced over 800 images totaling 1.7GB. Over a dozen platforms were flying daily for two weeks, on top of Civil Air Patrol and satellite imagery. Each image has to be viewed manually for signs of survivors, clothing, or debris indicating that a house or car (and thus possibly a person) was swept to that location.

Given the huge demand for automated image processing, it was no surprise that a panel of four judges from the agencies awarded $900 in prizes to three students from the National Science Foundation Research Experiences for Undergraduates (REU) program. The prizes were for software that classified imagery and displayed where it was taken.

First place went to Julia Proft for an application that used computer vision and machine learning to find images from the floods containing a spectral anomaly, such as an unusual color. Below is a UAV image flagged with an anomaly; the anomalies are indicated on a parallel image to make it easier to see where in the image they occurred without marking up the original and blocking the view.

UAV image flagged with an anomaly, with the anomalies indicated on a parallel image
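Proft’s code isn’t published here, but a common baseline for this kind of spectral anomaly detection is to model an image’s dominant color distribution and flag pixels that fall statistically far from it, drawing the flags on a copy so the original stays unmarked. A minimal sketch of that idea, with a hypothetical input file name:

```python
# Flag color outliers in a UAV still and mark them on a parallel copy.
# This illustrates the general idea, not Proft's actual program.
import cv2
import numpy as np

img = cv2.imread("flood_frame.jpg")
pixels = img.reshape(-1, 3).astype(np.float64)

# Model the background color distribution with its mean and covariance.
mean = pixels.mean(axis=0)
inv_cov = np.linalg.pinv(np.cov(pixels, rowvar=False))

# Per-pixel Mahalanobis distance from the dominant colors; far = anomalous.
diff = pixels - mean
dist = np.sqrt(np.einsum("ij,jk,ik->i", diff, inv_cov, diff))
mask = (dist > dist.mean() + 4 * dist.std()).reshape(img.shape[:2])

# Box the anomalous regions on a parallel image, leaving the original clean.
parallel = img.copy()
contours, _ = cv2.findContours(mask.astype(np.uint8), cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
for c in contours:
    x, y, w, h = cv2.boundingRect(c)
    cv2.rectangle(parallel, (x, y), (x + w, y + h), (0, 0, 255), 2)
cv2.imwrite("flood_frame_flagged.jpg", parallel)
```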

Second place went to Matt Hegarty for an application that used computer vision and machine learning to find images from the floods that contained urban debris having straight lines or corners. In this image, the program found instances of trash, PVC poles, and other indications of where houses (and possibly victims) had been swept down river.
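Straight line segments are a classic cue for man-made debris, since they rarely occur in water or vegetation. As an illustration of that cue (not Hegarty’s actual application), the sketch below ranks a hypothetical folder of UAV stills by how many line segments a Hough transform finds in each:

```python
# Rank UAV stills by the number of straight line segments they contain;
# debris fields full of boards and pipes score high, natural scenes low.
import glob
import math
import cv2

def count_line_segments(path, min_len=40):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    edges = cv2.Canny(gray, 50, 150)                      # edge map
    lines = cv2.HoughLinesP(edges, 1, math.pi / 180, 80,  # probabilistic Hough
                            minLineLength=min_len, maxLineGap=5)
    return 0 if lines is None else len(lines)

scores = {p: count_line_segments(p) for p in glob.glob("image_dir/*.jpg")}
for path, n in sorted(scores.items(), key=lambda kv: -kv[1])[:20]:
    print(f"{n:4d} line segments: {path}")
```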


Third place went to Abygail McMillian for a program that visually displayed where and when UAV data, or any geotagged data, was taken. The spatial aspect is important for responders to see where assets have sensed: in Nepal, UAV teams often surveyed the same areas but missed important regions. The temporal aspect (a timeline scale on the map) is important because, as the river flood stage changes, hydrological and missing person images become “stale” and flights should be reflown to get new data.

Data display with timeline
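The core of such a display is pulling “where and when” out of each image’s EXIF metadata and handing it to a map viewer. Below is a minimal sketch (not McMillian’s program) that writes the geotags and timestamps of a hypothetical folder of JPEGs to GeoJSON, a format most map-based visualization packages can load; it assumes a recent version of Pillow:

```python
# Extract GPS position and capture time from geotagged JPEGs into GeoJSON.
import glob
import json
from PIL import Image

def to_degrees(dms, ref):
    # EXIF stores latitude/longitude as (degrees, minutes, seconds) rationals.
    deg = float(dms[0]) + float(dms[1]) / 60.0 + float(dms[2]) / 3600.0
    return -deg if ref in ("S", "W") else deg

features = []
for path in glob.glob("image_dir/*.jpg"):
    exif = Image.open(path).getexif()
    gps = exif.get_ifd(0x8825)            # GPS IFD
    if not gps:
        continue                          # skip images without a geotag
    lat = to_degrees(gps[2], gps[1])      # GPSLatitude, GPSLatitudeRef
    lon = to_degrees(gps[4], gps[3])      # GPSLongitude, GPSLongitudeRef
    features.append({
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "properties": {"file": path, "taken": str(exif.get(0x0132))},
    })

with open("coverage.geojson", "w") as f:
    json.dump({"type": "FeatureCollection", "features": features}, f, indent=2)
```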

The 10 students (nine from under-represented groups) were from Colorado, Georgia, Puerto Rico, South Carolina, Texas, Vermont, and Wyoming and spent the summer conducting research through the Computing for Disasters NSF REU site grant under the direction of Dr. Robin Murphy. Multiple State agencies requested that those student-created apps be hardened and released. In previous years, the panels have awarded prizes primarily to hardware: UAVs, battery systems, communications nodes, and the like. This year, the focus was on software and programming to help analyze the data the hardware generates.