Reliable Object Detection and Identification in Adverse Conditions

Abstract

Perception is one of the most critical aspects of any autonomous driving or navigation system. Currently, environment perception relies heavily on light-based sensors (cameras and lidars). While these sensors provide high-resolution data, they are known to fail under adverse weather conditions. There is a growing trend towards using radars for autonomous sensing applications, since they operate in all weather conditions and provide accurate depth and velocity measurements. Our research focuses on developing systems that use radars as their primary sensors to enable perception in these adverse conditions. We propose a multi-radar system and a radar-camera fusion system that can reliably detect objects in all environmental conditions. We also develop smart infrastructure augmentation tags that enable object identification using radar alone, supporting a myriad of applications both indoors and outdoors. Finally, we investigate how a reliable object detection system improves the performance of the downstream tasks of SLAM, path planning, and navigation.

Kshitiz Bansal, University of California, San Diego

Kshitiz Bansal is a Computer Science Ph.D. candidate at the University of California, San Diego. He is part of the Wireless Communication Sensing and Networking Group, headed by Prof. Dinesh Bharadia. His primary research interests lie in automotive imaging radar, computer vision, and signal processing algorithms, with applications in both indoor and outdoor domains, such as perception in adverse weather conditions and infrastructure sensing. He has worked on building real-time, safety-critical, and reliable perception systems that use multi-modal sensing from cameras, radars, and lidars. His work has been published at top conferences such as SenSys and has been featured in several press outlets, including The Wall Street Journal.