Leaping Lizards: Bob Full’s work and US&R

Several groups are reporting on Prof. Bob Full’s lab work on tails for robots, based on the way lizards use their tails to counterbalance and even steer when they jump! I was interviewed for comments. I’m a big fan of Bob’s. It would have been fantastic to have a small robot that could steer itself as it was lowered (or jumped) from ledge to ledge at the Midas Gold Mine disaster back in 2007. Many robots already use some sort of shifting weight, like a flipper or a manipulator arm, or change their shape to try to get over obstacles or down stairs without tipping. Check it out!

Researchers and Responders to Jointly Develop UAV Visual Common Ground

Researchers and responders from The Texas A&M University System have received a grant from the National Science Foundation (NSF) to create a visual “common ground” between operators and responders who use micro and small unmanned aerial vehicles (UAVs) for search and rescue.

Following principles of how people establish common ground in conversation, the visual common ground will allow responders to easily express where they want the UAV to fly and from what angle to examine collapsed structures, using an iPad or other tablet. The responders would also be able to review imagery and video while the UAV continues its mission rather than wait for the UAV to land.
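As a rough illustration of the kind of tasking such an interface might exchange, the sketch below shows a hypothetical “view request” record: a point of interest plus the viewing angle from which to examine it, serialized so the responder’s tablet and the operator’s station render the same marker. The class and function names are invented for illustration; the project’s actual interface is still to be developed.

```python
# A minimal sketch (not the project's actual interface) of a tasking message
# a responder's tablet might send to the UAV operator's station. All names
# here (ViewRequest, submit_request) are hypothetical illustrations.
from dataclasses import dataclass, asdict
import json

@dataclass
class ViewRequest:
    lat: float          # where the responder wants the UAV to look (WGS84)
    lon: float
    altitude_m: float   # requested standoff altitude above the structure
    heading_deg: float  # compass bearing the camera should face
    pitch_deg: float    # camera tilt, e.g. -40 to look down a collapsed facade
    note: str = ""      # free-text annotation shown on the shared display

def submit_request(req: ViewRequest) -> str:
    """Serialize the request so both displays draw the same marker,
    giving the team a shared visual 'common ground'."""
    return json.dumps(asdict(req))

if __name__ == "__main__":
    req = ViewRequest(lat=30.576, lon=-96.363, altitude_m=15.0,
                      heading_deg=270.0, pitch_deg=-40.0,
                      note="check second-floor void on west wall")
    print(submit_request(req))
```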

Response professionals from the Texas Engineering Extension Service (TEEX) Disaster Preparedness and Response Division (DPR) will fly weekly at Disaster City® with researchers from the Texas Engineering Experiment Station’s (TEES) Center for Robot-Assisted Search and Rescue (CRASAR), speeding the development and refinement of the natural user interface.

Disaster City® is a 52-acre training facility featuring full-scale collapsible structures that replicate community infrastructure. The site includes simulations of a strip mall, office building, industrial complex, assembly hall/theater, single-family dwelling, train derailments, three active rubble piles and a small lake.

The grant is the first direct partnering of emergency responders with university professors/researchers for UAV research. Bob McKee, DPR director and agency chief for Texas Task Force 1, serves as a principal investigator with Dr. Robin Murphy, Texas A&M University professor and CRASAR director. The partnership leverages the capabilities of top academic researchers and the preparedness and response expertise of TEEX, all existing within the A&M System.

“Being able to work directly and routinely with responders under conditions as near to a real disaster as one can get will allow the research to progress faster. This could only happen at Texas A&M,” Murphy said. “Normally we’d have to try to condense a year of work into one week of trials, and if something went wrong we’d have to wait months for another opportunity for responders or a demolished building to become available.”

McKee said, “TEEX has been actively involved in efforts to develop and adapt robots for search and rescue applications. From working with the National Institute of Standards and Technology project to develop standard test methods for emergency response robots, to collaborating with scientific researchers and commercial developers at our unique Disaster City® facility, we’re hoping to someday use small UAVs and other unmanned systems to help save lives.”
The grant will help enable emergency responders to take advantage of small “personal” UAVs being developed for the U.S. Department of Defense. Urban search and rescue operations can be more challenging than military peacekeeping operations as they can require assessment and analysis of damaged structures, hazardous areas, and other unique situations.

 

The idea for creating shared displays is a result of over a decade of research on rescue robotics by Murphy, who was recently named one of the most influential women in technology by Fast Company magazine. She has led UAV deployments at numerous disasters starting with Hurricane Katrina. Her work with Dr. Jenny Burke (a former graduate student currently with Boeing), based on CRASAR experiences with ground robots at the World Trade Center, showed that search and rescue specialists were nine times more effective if two responders—not one—worked together using a shared visual display.

The team expects to have an open source tablet interface for AirRobot and Dragan UAVs within 24 months that leads to a significant, measurable improvement in team performance as well as high user acceptance.

 

Contact for TEEX: Brian Blake   Brian.blake@tamu.edu (O) 979-458-6837 (C) 979-324-8995

Contact for TEES: Pam Green  p-green@tamu.edu (O) 979-845-5510 (C) 979-574-4138

IRS-CRASAR team finalist for Best Paper SSRR 2011

The IRS-CRASAR paper on our April deployment to Japan was a finalist for best paper at the IEEE Safety, Security, and Rescue Robotics (SSRR) conference, which met this week in Kyoto. The work by the Japanese team that produced the QUINCE robot used at Fukushima deservedly won- but it was a great honor to be a finalist!  The paper is Use of Remotely Operated Marine Vehicles at Minamisanriku and Rikuzentakata Japan for Disaster Recovery by R. Murphy, K. Dreger, S. Newsome, J. Rodocker, E. Steimle, T. Kimura, K. Makabe, F. Matsuno, S. Tadokoro, and K. Kon. Congratulations all! The paper should be available for download from IEEE Xplore shortly.

1 robot: 80,000 m2 covered and 104 objects found at 32 locations in 4 days

We are at Narita, getting ready to head home! In the four days the team was in the field in Minami Sanriku with our IRS colleagues, the SeaBotix SARbot surveyed 32 locations and covered 80,000 m2 of Shizugawa Bay in just over 6 hours of time in the water, finding 104 objects such as cars, a lighthouse, and nets. And it wasn’t just the robot: we got to work through the data-to-decision process with GeoSuites and GIS systems… I’ll post video and lessons learned as soon as I can.
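For a rough sense of scale, the back-of-the-envelope rates below follow directly from those numbers (treating “just over 6 hours” as exactly 6 hours, so the per-hour figure is an upper bound):

```python
# Simple arithmetic from the totals reported above; nothing here is new data.
area_m2 = 80_000          # bay area surveyed
hours_in_water = 6        # "just over 6 hours", rounded down
objects_found = 104
locations = 32
days = 4

print(area_m2 / hours_in_water)    # ~13,333 m2 surveyed per hour in the water
print(locations / days)            # 8 locations per day
print(objects_found / locations)   # ~3.25 objects found per location
```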

Minami Sanriku Cho: Day 1 summary

While the Turkey earthquake response forges on, the team in Japan continues to work. We just finished Day 2, but here is a video summary.  A lot of the shots are from Richard Smith, our GIS expert (great job!). The SeaBotix SARbot and the LYYN image enhancement software are performing wonderfully! [youtube]http://www.youtube.com/watch?v=-4KbhpCeh-0[/youtube]

In Japan but reaching out to Turkey…

We got word about the Turkey earthquake from our medical lead, Eric Rasmussen, while we were on the water deploying ROVs and an AUV in Minami Sanriku Cho today. The CNN site is sketchy, but it looks like very challenging conditions- besides getting help to the site, the types of houses and the weather are tough on searching, on victims in the cold, and on rescuers.

The International Rescue Systems Institute is looking into the availability of their caterpillar-like Active Scope Camera, the best robot I know of for penetrating extremely narrow voids. Small UAVs may be of use in understanding the situation and the civil engineering. I can’t tell from the news whether marine vehicles will be needed– as one of the technologists here in Japan with me said: “I had no idea so much infrastructure is related to water!”

Today, the city of Minami Sanriku is celebrating the opening of their port after the tsunami– a great day and great progress in recovery. But a sad day for Turkey. Our hearts and prayers go out to the victims, their families, and the responders.

In Minamisanriku-Cho, gearing up for first mission

It is dawn here at Minamisanriku and from my hotel room, I can see streaks of orange over the New Port, the site we first searched and cleared in April. Tendrils of fog drift past the small islands dotting Shizugawa Bay. We start checking our gear in another hour, before breakfast, and then depart for an inlet on the northern border of the bay. There we will meet with city officials and fishermen who have asked us to find underwater debris, map it, and attach a float to it so that it can be removed.

 

We have brought two robots for the mission. The first is an OceanServer autonomous underwater vehicle (AUV) that looks like a small yellow torpedo. It carries side scan sonar and will map out the area of the bay rapidly, probably within an hour. That will let us know where debris or possible debris is; we will then use the SARbot from SeaBotix, deployed from a boat, to investigate. With its gripper, we should be able to attach a float to the underwater debris to mark it for easy identification for later removal. If not, IRS has two divers with us who will do it manually. Meanwhile the AUV will move to a new location and we will work in sequence: get the “big picture” with the AUV, then perform “little” actions with the tethered ROV.
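For readers who think in code, here is a rough sketch of that survey-then-inspect sequence. Every function is a hypothetical placeholder rather than a real vehicle API; the point is only the ordering, with the ROV (or divers) working the contacts the AUV has already found while the AUV moves on.

```python
def auv_sidescan_survey(area):
    """Run the AUV over an area; return sonar contacts (possible debris)."""
    ...

def rov_inspect_and_tag(contact):
    """Drive the SARbot to a contact; return True if a float was attached."""
    ...

def divers_tag(contact):
    """Fallback: IRS divers attach the float by hand."""
    ...

def clear_bay(areas):
    pending = []                              # contacts from the previous survey
    for area in areas:
        contacts_here = auv_sidescan_survey(area) or []
        # In the field the AUV surveys this area while the ROV works the
        # previous backlog; the overlap is written sequentially here.
        for contact in pending:
            if not rov_inspect_and_tag(contact):
                divers_tag(contact)
        pending = contacts_here
    for contact in pending:                   # finish the final area's contacts
        if not rov_inspect_and_tag(contact):
            divers_tag(contact)
```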

 

But just having robots isn’t the same as getting the information to the right people at the right time- which is also called the “data-to-decision” problem. We will be integrating the data from the two robots using the General Dynamics GeoSuites software package- a civilian version of the command and control software used by the military. This will help us, the robot team, keep up with the two groups deploying the AUV and ROV in different areas at the same time AND allow tactical responders, such as Kenichi Makabe and his team of fire fighters who will be joining us, to see the fused and geolocated incoming data. GeoSuites gives multiple users a global visual “common ground” in which each group gets the view of the enterprise that it needs: I can see what everyone is doing, officials can start planning for removal, and our GIS experts can learn and start generating models of where more debris will be found.
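As a minimal sketch of that idea (and only a sketch: this is not how GeoSuites itself works), imagine every sighting from either vehicle reduced to a geolocated record in one shared store, with each group filtering that store for the view it needs:

```python
# Illustrative only: invented record and view functions, not GeoSuites code.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Observation:
    source: str        # "AUV" or "ROV"
    lat: float
    lon: float
    kind: str          # e.g. "car", "net", "sonar contact"
    tagged: bool       # True once a float has been attached
    time: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

shared_picture: list[Observation] = []   # the common ground all groups read

def view_for_removal_planners():
    """Officials planning removal only care about debris already marked."""
    return [o for o in shared_picture if o.tagged]

def view_for_rov_team():
    """The ROV/dive team needs unmarked sonar contacts to investigate next."""
    return [o for o in shared_picture if o.source == "AUV" and not o.tagged]

if __name__ == "__main__":
    shared_picture.append(Observation("AUV", 38.678, 141.446, "sonar contact", False))
    shared_picture.append(Observation("ROV", 38.676, 141.444, "car", True))
    print(len(view_for_rov_team()), "contacts left to inspect")
    print(len(view_for_removal_planners()), "items ready for removal")
```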

On Way to Japan: IRS-CRASAR return to Minamisanriku-cho

We are flying out today, bringing back the SeaBotix SARbot and adding an OceanServer autonomous underwater vehicle. The EDGE network is also sending GeoSuites so we can explore distributing the data, and we have a GIS expert from TAMU-CC.

We’ll be returning to Minamisanriku-cho, one of the two cities we visited in April. The mission with our colleagues at the International Rescue Systems Institute is quite different- we are tasked to find and clear debris in the surrounding bay. We plan to use the OceanServer AUV to quickly map portions of the bay, then use the ROV to attach a float to each piece of debris, post it on the map, and distribute it through GeoSuites. However, as Rick Smith, our GIS expert, says, there’s more to cartography than putting red pins on a map- we are hoping to predict where other debris will be. GeoSuites is the civilian version of TIGR, the common operating picture software used by the US military– Brian Slaughter will be managing that. Eric Steimle (AEOS) is our lead again and Jesse is on the SARbot again.  YSI/Nanotech, the OceanServer distributor in Japan, will meet the IRS-CRASAR team there.
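The post doesn’t say how that prediction will be made; purely as an illustration of the idea, the toy sketch below scores candidate spots by their distance-weighted proximity to debris already found (a crude kernel-density-style estimate), standing in for a real model that would also consider currents, bathymetry, and pre-tsunami land use. All coordinates are made up.

```python
# Toy example only: invented coordinates and a naive density score.
import math

found_debris = [(38.678, 141.446), (38.679, 141.447), (38.672, 141.441)]

def debris_score(lat, lon, bandwidth=0.002):
    """Higher score = closer to more known debris (crude Gaussian kernel sum)."""
    total = 0.0
    for dlat, dlon in found_debris:
        d2 = (lat - dlat) ** 2 + (lon - dlon) ** 2
        total += math.exp(-d2 / (2 * bandwidth ** 2))
    return total

# Rank a few candidate search locations by predicted debris density.
candidates = [(38.677, 141.445), (38.690, 141.460), (38.673, 141.442)]
for lat, lon in sorted(candidates, key=lambda p: -debris_score(*p)):
    print(f"{lat:.3f}, {lon:.3f}: score {debris_score(lat, lon):.2f}")
```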

Wish us well– the slow pace of recovery in Minamisanriku was reported on by the New York Times here.  We are honored to join our Japanese colleagues at IRS in participatory research: learning while doing.