PFI:BIC: Enhanced Situational Awareness Using Unmanned Autonomous Systems for Disaster Remediation


Abstract: This Partnerships for Innovation: Building Innovation Capacity (PFI:BIC) project from the University of Nevada-Reno has as its goal the enhancement of the situational awareness capabilities of law enforcement agencies and first responders by employing unmanned autonomous systems (UAS) with high-resolution sensing and imaging capabilities for disaster remediation. Law enforcement agencies and first responders face significant challenges during an emergency event, such as a natural or anthropogenic disaster (earthquake, tsunami, fire, hurricane, tornado, flood, power or nuclear accident, act of war, or terror). One of the major challenges is acting decisively based on available information while accounting for human factors, making high-quality, real-time situational awareness critical to effectively managing and safeguarding civilians and in-field personnel. This project focuses on creating a smart emergency-response service system using UAS, both air- and ground-based, equipped with state-of-the-art imaging, sensing, and communication systems to provide first-response teams with high-quality, real-time information so they can act decisively and effectively via human-machine interaction. Such a smart service system will guide and escort humans to safety, direct rescue crews to trapped humans, and provide in-situ communication, medication, water and food, and power. The program will develop the requisite human infrastructure of graduate students in mechanical engineering, computer science and engineering, electrical engineering, and social psychology. Working together in an interdisciplinary team, students will be exposed to research outside their respective disciplines and to innovative opportunities for entrepreneurship. A successful smart service system will impact the operations of the public safety sector, and it could be adopted by similar organizations.

The project's objectives include the following: (1) develop and integrate UAS platforms, sensors, imaging and communication systems, and control and path-planning algorithms to create a UAS-based smart service system for first response; (2) model the state of humans and infrastructure during a disaster, identify the scene, and create access paths to safety; (3) test prototypes and pursue commercialization opportunities; and (4) educate the public and train first responders on the technology. The translational research to create the UAS-based smart service system will focus on sensor data fusion, scene identification, and modeling of the state of humans and infrastructure and their interactions, as well as on a platform for testing and evaluating remediation strategies, communication schemes, and access-path planning. Understanding these system aspects will enable first responders and public safety command personnel to analyze and understand on-scene, active emergency situations through interactive, integrated data analysis and visualization, and will give them the ability to sense, predict, and act across a variety of disaster scenes and human socio-psychological conditions.
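The access-path planning mentioned in objective (2) can be illustrated with a minimal sketch: a breadth-first search over an occupancy grid, where hazard cells (fire, debris, flooding) are blocked and the planner returns a shortest path of passable cells to a safe location. The function name, grid encoding, and scenario are illustrative assumptions only, not part of the project's actual algorithms, which target far richer sensor-derived models.

```python
from collections import deque

def find_access_path(grid, start, goal):
    """Breadth-first search over an occupancy grid.

    grid: 2D list, 0 = passable, 1 = hazard/blocked.
    start, goal: (row, col) tuples.
    Returns a shortest list of (row, col) cells from start to goal, or None.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}  # parent links for path reconstruction
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Walk the parent links back to start, then reverse.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None  # goal unreachable through passable cells
```

In practice the grid would be built from fused UAV imagery and sensor data, and a cost-aware planner (e.g., A* with hazard-proximity costs) would replace plain BFS; the sketch only shows the core idea of routing around blocked cells.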

Partners at the inception of this project include the University of Nevada, Reno (with faculty across four departments: mechanical engineering, computer science and engineering, electrical engineering, and social psychology); two small-business industry partners, Drone America and SpecTIR, both based in Reno, Nevada; academic partners at the University of Nevada, Las Vegas and the University of Utah; experts from the UNR Seismology Lab and Washoe County as system users; and broader-context partners, including the newly established Nevada Advanced Autonomous Systems Innovation Center (NAASIC) at UNR, the state-supported UAS program management office, the Nevada Institute for Autonomous Systems (NIAS), and Nevada Industry Excellence (NVIE).

Details

  • Organization: National Science Foundation
  • Award #: IIP-1430328
  • Amount: $800,000
  • Date: Aug. 1, 2014 - July 31, 2017
  • PI: Kam Leang

Supported Publications

  • Ahmed Siddiqui, K., Feil-Seifer, D., Yang, T., Jose, S., Liu, S., & Louis, S. Development of a Swarm UAV Simulator Integrating Realistic Motion Control Models For Disaster Operations. In Proceedings of the ASME Dynamic Systems and Controls Conference (DSCC), page V003T39A003, Tysons Corner, Virginia, Oct 2017.
  • Pham, X. H., La, H., Feil-Seifer, D., & Deans, M. A Distributed Control Framework for a Team of Unmanned Aerial Vehicles for Dynamic Wildfire Tracking. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 6648-6653, Vancouver, BC, Canada, Sep 2017.

Supported Projects

FRIDI: First Responder Interface for Disaster Information Aug. 1, 2014 - Dec. 31, 2018

In recent years, robots have been deployed on numerous occasions to support disaster mitigation missions by exploring areas that are unreachable or potentially dangerous for human rescuers. The UNR Robotics Research Lab has teamed up with a number of academic, industry, and public entities with the goal of developing an operator interface for controlling unmanned autonomous systems, including UAV platforms, to enhance the situational awareness, response time, and other operational capabilities of first responders during a disaster remediation mission. The First Responder Interface for Disaster Information (FRIDI) will include a computer-based interface for the ground control station (GCS) as well as a companion interface for portable devices such as tablets and cellular phones.

Our user interface (UI) is designed to address the human-robot interaction challenges specific to law enforcement and emergency response operations, such as maintaining situational awareness and limiting the cognitive load on human operators. Situational awareness (SA), the understanding that the human operator has of the location, activities, surroundings, and status of the unmanned aerial vehicle (UAV), is a key factor in the success of a robot-assisted disaster mitigation operation. With that in mind, our goal is to design an interface that uses pre-loaded terrain data augmented with real-time data from the UAV sensors to provide better SA during the disaster mitigation mission. The UI displays a map of near-live images of the scene recorded by the UAVs, the position and orientation of each vehicle on the map, and video and other sensor readings that are crucial to the efficiency of the emergency response operation. This layout enables human operators to view and task multiple individual robots while maintaining full situational awareness over the disaster area as a whole.
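The interface's core bookkeeping, fusing per-vehicle telemetry into markers for the map layer, might be sketched as follows. The `UAVTracker` class, its field names, and the staleness threshold are hypothetical illustrations, not FRIDI's actual implementation.

```python
import time

class UAVTracker:
    """Keeps the latest pose and sensor reading per vehicle for map display."""

    def __init__(self, stale_after=5.0):
        self.stale_after = stale_after  # seconds before a vehicle is flagged stale
        self._states = {}               # uav_id -> (timestamp, lat, lon, heading, sensors)

    def update(self, uav_id, lat, lon, heading, sensors, timestamp=None):
        """Record the newest telemetry message for one vehicle."""
        ts = time.time() if timestamp is None else timestamp
        self._states[uav_id] = (ts, lat, lon, heading, dict(sensors))

    def overlay(self, now=None):
        """Return marker dicts for the map layer, flagging lapsed telemetry."""
        now = time.time() if now is None else now
        markers = []
        for uav_id, (ts, lat, lon, heading, sensors) in self._states.items():
            markers.append({
                "id": uav_id,
                "lat": lat,
                "lon": lon,
                "heading": heading,
                "sensors": sensors,
                "stale": (now - ts) > self.stale_after,
            })
        return markers
```

Flagging stale vehicles on the map, rather than silently showing their last known position, is one simple way an interface like this can support operator SA: the display makes the loss of telemetry itself visible.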