{"id":608385,"date":"2019-09-16T09:00:10","date_gmt":"2019-09-16T16:00:10","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?p=608385"},"modified":"2019-09-16T09:50:53","modified_gmt":"2019-09-16T16:50:53","slug":"helping-first-responders-achieve-more-with-autonomous-systems-and-airsim","status":"publish","type":"post","link":"https:\/\/www.microsoft.com\/en-us\/research\/blog\/helping-first-responders-achieve-more-with-autonomous-systems-and-airsim\/","title":{"rendered":"Helping first responders achieve more with autonomous systems and AirSim"},"content":{"rendered":"

\"\" (opens in new tab)<\/span><\/a><\/p>\n

With inputs from: Elizabeth Bondi (Harvard University), Bob DeBortoli (Oregon State University), Balinder Malhi (Microsoft), and Jim Piavis (Microsoft)

Autonomous systems have the potential to improve safety for people in dangerous jobs, particularly first responders. However, deploying these systems is a difficult task that requires extensive research and testing.

In April, we explored the complexities and challenges present in the development of autonomous systems and how technologies such as AirSim provide a pragmatic way to address them. Microsoft believes that the key to building robust and safe autonomous systems is giving a system a wide range of training experiences that expose it to many scenarios before it is deployed in the real world. This ensures training is done in a meaningful way, similar to how a student might be trained to tackle complex tasks through a curriculum curated by a teacher.

With autonomous systems, first responders gain sight into the unknown

One way Microsoft trains autonomous systems is by participating in unique research opportunities focused on solving real-world challenges, like aiding first responders in hazardous scenarios. Recently, our collaborators at Carnegie Mellon University and Oregon State University, collectively named Team Explorer, demonstrated technological breakthroughs in this area with their first-place win at the first round of the DARPA Subterranean (SubT) Challenge.

\"Snapshots

(opens in new tab)<\/span><\/a> Snapshots from the AirSim simulation showing the effects of different conditions such as water vapor, dust and heavy smoke. Such variations in conditions can provide useful data when building robust autonomous systems.<\/p><\/div>\n

The DARPA SubT Challenge aspires to further the technologies that would augment difficult operations underground. Specifically, the challenge focuses on methods to map, navigate, and search complex underground environments, including human-made tunnel systems, the urban underground, and natural cave networks. Imagine constrained environments that are several kilometers long and structured in unique ways, with regular or irregular geological topologies and patterns. Weather and other hazardous conditions, such as poor ventilation or poisonous gases, often make first responders' work even more dangerous.

Team Explorer engaged in autonomous search and detection of several artifacts within a man-made system of tunnels. The end-to-end solution the team created required many complex components, spanning mobility, mapping, navigation, and detection, to work together across the challenging circuit.

Microsoft's Autonomous Systems team worked closely with Team Explorer to provide a high-definition simulation environment for the challenge. The team used AirSim to create an intricate maze of man-made tunnels in a virtual world, representative of real-world tunnels in both complexity and size. The virtual world was a hybrid synthesis: a team of artists used reference material from real-world mines to modularly generate a network of interconnected tunnels spanning two kilometers and spread over a large area.

Additionally, the simulation included robotic vehicles, both wheeled robots and unmanned aerial vehicles (UAVs), along with the suite of sensors mounted on those autonomous agents. AirSim provided a rich platform that Team Explorer could use to test their methods and to generate training experiences for building the agents' various decision-making components.
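As a minimal sketch of how such a simulated agent is exercised, the snippet below connects to a running AirSim instance through its Python client and flies a UAV to a waypoint. The waypoint coordinates and velocity are illustrative values, not settings from Team Explorer's system.

```python
import airsim

# Connect to a running AirSim simulation (the simulator must already be up).
client = airsim.MultirotorClient()
client.confirmConnection()
client.enableApiControl(True)
client.armDisarm(True)

# Take off, then fly to an illustrative waypoint at 2 m/s.
# AirSim uses NED coordinates, so z = -3 means 3 meters above the start point.
client.takeoffAsync().join()
client.moveToPositionAsync(10.0, 0.0, -3.0, velocity=2.0).join()
client.landAsync().join()
```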

At the center of the challenge was the robots' ability to perceive the underground terrain and discover objects (such as human survivors, backpacks, cellular phones, fire extinguishers, and power drills) while adjusting to different weather and lighting conditions. Multimodal perception is important in such challenging environments, and AirSim's ability to simulate a wide variety of sensors, and to fuse their outputs, can provide a competitive edge. One of the most important sensors is LIDAR, and in AirSim the physical process of generating point clouds is carefully reconstructed in software, so the sensor used on the robot in simulation uses the same configuration parameters (such as number of channels, range, points per second, rotations per second, horizontal and vertical FOVs, and more) as those found on the real vehicle.
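As an illustration, the short sketch below reads simulated LIDAR returns through AirSim's Python client. The sensor name "LidarSensor1" is an assumed entry from the simulation's settings.json, where parameters like NumberOfChannels, Range, PointsPerSecond, and RotationsPerSecond are configured to mirror the physical unit.

```python
import numpy as np
import airsim

# Connect to a simulated ground vehicle. "LidarSensor1" is an assumed sensor
# name defined in AirSim's settings.json, where channel count, range,
# points-per-second, rotation rate, and FOV are set to match the real sensor.
client = airsim.CarClient()
client.confirmConnection()

lidar_data = client.getLidarData(lidar_name="LidarSensor1")

# The point cloud arrives as a flat list of floats; regroup into (x, y, z) rows.
points = np.array(lidar_data.point_cloud, dtype=np.float32)
if points.size >= 3:
    points = points.reshape(-1, 3)
    print(f"Received {points.shape[0]} LIDAR points")
```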

It is challenging to train deep learning-based perception modules to detect the target objects using LIDAR point clouds and RGB cameras. While curated datasets such as ScanNet and MS COCO exist for more canonical applications, none exist for underground exploration. Creating a real dataset for underground environments is expensive because a dedicated team is needed to first deploy the robot, then gather the data, and finally label what was captured. Microsoft's ability to create near-realistic autonomy pipelines in AirSim means that we can rapidly generate labeled training data for a subterranean environment.
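For example, one common way to harvest labels from a simulator, sketched below under assumed names, is to pair each RGB frame with a pixel-aligned segmentation mask. Here "backpack_mesh" and the "front_center" camera are hypothetical names for the loaded environment, not details of the SubT world itself.

```python
import airsim

client = airsim.MultirotorClient()
client.confirmConnection()

# Give a target mesh a known segmentation ID so its pixels can be labeled
# automatically; "backpack_mesh" is a hypothetical object name.
client.simSetSegmentationObjectID("backpack_mesh", 42)

# Capture an RGB frame and its pixel-aligned segmentation mask in one call.
# Both come back PNG-compressed by default.
responses = client.simGetImages([
    airsim.ImageRequest("front_center", airsim.ImageType.Scene),
    airsim.ImageRequest("front_center", airsim.ImageType.Segmentation),
])
airsim.write_file("scene.png", responses[0].image_data_uint8)
airsim.write_file("mask.png", responses[1].image_data_uint8)
```

Pixels in the mask whose color corresponds to the assigned ID can then be converted into bounding boxes or per-pixel labels for training, with no manual annotation required.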