
Through-Wall Object Recognition and Pose Estimation

Ruoyu Wang, Siyuan Xiang, Chen Feng, Pu Wang, Semiha Ergan and Yi Fang
Pages 1176-1183 (2019 Proceedings of the 36th ISARC, Banff, Canada, ISBN 978-952-69524-0-6, ISSN 2413-5844)
Abstract:

Robots need to perceive beyond the line of sight, e.g., to avoid cutting water pipes or electric wires when drilling holes in a wall. Recent off-the-shelf radio frequency (RF) imaging sensors ease the process of 3D sensing inside or through walls. Yet unlike optical images, RF images are difficult for humans to interpret. Moreover, in practice, RF components are often subject to hardware imperfections, resulting in distorted RF images whose quality can fall far short of the claimed specifications. We therefore introduce several challenging geometric and semantic perception tasks on such signals, including object and material recognition, fine-grained property classification, and pose estimation. Because detailed forward modeling of such sensors is often difficult, owing to hidden or inaccessible system parameters, onboard processing procedures, and limited access to raw RF waveforms, we tackle these tasks with supervised machine learning. We collected a large dataset of RF images of utility objects captured through a mock wall as the input to our algorithms, with the corresponding optical images taken simultaneously from the other side of the wall as ground truth. We designed three learning algorithms based on nearest neighbors or neural networks and report their performance on the dataset. Our experiments show reasonable results for the semantic perception tasks but unsatisfactory results for the geometric ones, calling for more effort in this research direction.
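The abstract does not give implementation details, but as an illustration of the nearest-neighbor family of algorithms it mentions, the minimal sketch below classifies flattened RF magnitude images with a k-NN classifier. The array shapes, class labels, and hyperparameters are placeholder assumptions for illustration, not the authors' dataset or method.

# Minimal k-NN baseline for object recognition from RF images (illustrative only).
# Data shapes and label meanings are hypothetical, not the paper's dataset.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Hypothetical RF magnitude images: N samples of a 32x32 through-wall scan.
X_train = rng.random((200, 32, 32))
y_train = rng.integers(0, 3, size=200)   # e.g., 0=pipe, 1=wire, 2=stud (made-up labels)
X_test = rng.random((20, 32, 32))

# Flatten each RF image into a feature vector and classify by its nearest neighbors.
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train.reshape(len(X_train), -1), y_train)
pred = knn.predict(X_test.reshape(len(X_test), -1))
print(pred)

With real data, the random arrays would be replaced by the collected RF images and labels derived from the optical ground-truth views; a learned feature extractor (e.g., a neural network) could substitute for the raw flattening step.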

Keywords: Through-Wall Imaging; Object Recognition; Pose Estimation; Deep Learning