Tesla-Rapture: A Lightweight Gesture Recognition System from mmWave Radar Sparse Point Clouds
Dariush Salami, Ramin Hasibi, Sameera Palipana, Petar Popovski, Tom Michoel, Stephan Sigg
Abstract
We present Tesla-Rapture, a gesture recognition interface for point clouds generated by mmWave radars. State-of-the-art gesture recognition models are either too resource-intensive or not sufficiently accurate for integration into real-life scenarios on wearable or constrained equipment such as IoT devices (e.g. Raspberry Pi), XR hardware (e.g. HoloLens), or smartphones. To tackle this issue, we developed Tesla, a Message Passing Neural Network (MPNN) graph convolution approach for mmWave radar point clouds. The model outperforms the state of the art on two datasets in terms of accuracy while reducing the computational complexity and, hence, the execution time. In particular, the approach is able to predict a gesture almost 8 times faster than the most accurate competitor. Our performance evaluation in different scenarios (environments, angles, distances) shows that Tesla generalizes well and improves the accuracy by up to 20% in challenging scenarios such as a through-wall setting and sensing at extreme angles. Utilizing Tesla, we develop Tesla-Rapture, a real-time implementation using a mmWave radar on a Raspberry Pi 4, and evaluate its accuracy and time complexity. We also publish the source code, the trained models, and the implementation of the model for embedded devices.
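To make the MPNN idea concrete, the sketch below shows one generic message-passing step over a k-nearest-neighbour graph built from a sparse 3-D point cloud. This is an illustrative NumPy toy, not the paper's Tesla architecture: the function names, the linear message/update maps, and mean aggregation are all assumptions chosen for brevity.

```python
import numpy as np

def knn_graph(points, k=3):
    """Build a k-nearest-neighbour edge list over a sparse 3-D point cloud.

    points: (n, 3) array of radar point coordinates.
    Returns (n, k) array where row i holds the indices of i's k nearest points.
    """
    d = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d, np.inf)               # exclude self-loops
    return np.argsort(d, axis=1)[:, :k]

def mpnn_layer(features, points, neighbours, w_msg, w_upd):
    """One message-passing step: aggregate neighbour messages, then update.

    features:   (n, f) per-point feature vectors.
    points:     (n, 3) coordinates, used for relative-position edge features.
    neighbours: (n, k) k-NN index array from knn_graph.
    w_msg:      (f + 3, m) linear message weights (illustrative choice).
    w_upd:      (f + m, out) linear update weights (illustrative choice).
    """
    n = features.shape[0]
    msgs = np.zeros((n, w_msg.shape[1]))
    for i in range(n):
        for j in neighbours[i]:
            # Message from j to i: neighbour feature plus relative position.
            edge_in = np.concatenate([features[j], points[j] - points[i]])
            msgs[i] += edge_in @ w_msg
        msgs[i] /= len(neighbours[i])         # mean aggregation
    # Update: combine each node's state with its aggregated message, then ReLU.
    upd_in = np.concatenate([features, msgs], axis=1)
    return np.maximum(upd_in @ w_upd, 0)
```

Stacking a few such layers and pooling the resulting node features into a single vector for a classifier head is the usual pattern for graph-based point-cloud recognition; the actual layer definitions used by Tesla are given in the paper and its published source code.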
BibTeX
@article{salami2022tesla,
  title     = {Tesla-rapture: A lightweight gesture recognition system from mmwave radar sparse point clouds},
  author    = {Salami, Dariush and Hasibi, Ramin and Palipana, Sameera and Popovski, Petar and Michoel, Tom and Sigg, Stephan},
  journal   = {IEEE Transactions on Mobile Computing},
  year      = {2022},
  publisher = {IEEE},
  link      = {https://arxiv.org/abs/2109.06448}
}