This master's thesis project was carried out by Brent De Decker during the academic year 2020-2021.
This thesis explores the possibilities of combining localization and object detection using smart glasses and microcomputers. The question posed is whether these low-powered devices are capable of performing such demanding tasks in parallel, and if so, whether the resulting system can operate autonomously. Because it relies on neither GPS nor WiFi signals, this application has the potential to improve the Situational Awareness (SA) of military operators in unfamiliar and often dynamic conditions.
First, the problem of localization is analyzed using two different approaches. On the one hand, IMU (Inertial Measurement Unit) measurements are used to perform dead reckoning. On the other hand, a SLAM (Simultaneous Localization And Mapping) system is introduced to calculate positions based on camera frames coming from the smart glasses. To handle these calculations, a microcomputer (NVIDIA Jetson Nano) is added to the system. The SLAM results outperform the IMU dead-reckoning results in all tests, and the system is able to work autonomously.
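To illustrate the dead-reckoning idea, the sketch below double-integrates accelerometer readings to obtain a position estimate. It is a minimal sketch, assuming the IMU delivers gravity-compensated acceleration in a fixed world frame at a constant sample rate; the variable names, sample rate, and noise level are illustrative and not taken from the thesis.

```python
import numpy as np

def dead_reckon(accel_samples, dt):
    """Minimal IMU dead reckoning: double-integrate world-frame,
    gravity-compensated acceleration into velocity and position.

    accel_samples: (N, 3) array of accelerations in m/s^2
    dt: sample period in seconds
    Returns an (N, 3) array of position estimates in metres.
    """
    velocity = np.zeros(3)
    position = np.zeros(3)
    trajectory = []
    for a in accel_samples:
        velocity += a * dt            # first integration: acceleration -> velocity
        position += velocity * dt     # second integration: velocity -> position
        trajectory.append(position.copy())
    return np.array(trajectory)

# Example: 5 s of zero-mean accelerometer noise at 100 Hz already produces
# noticeable drift in the end position, which is why a pure IMU approach
# tends to lose out to SLAM over longer trajectories.
rng = np.random.default_rng(0)
samples = rng.normal(0.0, 0.05, size=(500, 3))   # noise only, in m/s^2
print(dead_reckon(samples, dt=0.01)[-1])
```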
The second part then studies the performance of different object detectors. Depending on whether accuracy or speed is preferred, this results in the selection of YOLOv3 (You Only Look Once) or YOLO-tiny, respectively.
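As a rough illustration of how such a detector can be run on Jetson-class hardware, the sketch below loads a Darknet YOLO model with OpenCV's DNN module and runs it on a single frame. The config and weight file names, the input size, and the confidence threshold are assumptions chosen for the example, not the settings used in the thesis.

```python
import cv2
import numpy as np

# File names are assumptions; use the cfg/weights of the chosen model
# (the full YOLOv3 for accuracy, or the tiny variant for speed).
net = cv2.dnn.readNetFromDarknet("yolov3-tiny.cfg", "yolov3-tiny.weights")
layer_names = net.getUnconnectedOutLayersNames()

frame = cv2.imread("frame.jpg")                     # a single camera frame
blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416),
                             swapRB=True, crop=False)
net.setInput(blob)
outputs = net.forward(layer_names)

# Each output row: [cx, cy, w, h, objectness, class scores...]
for out in outputs:
    for det in out:
        scores = det[5:]
        class_id = int(np.argmax(scores))
        confidence = float(scores[class_id])
        if confidence > 0.5:                        # illustrative threshold
            print(f"class {class_id} at ({det[0]:.2f}, {det[1]:.2f}) "
                  f"conf {confidence:.2f}")
```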
Finally, the SLAM and object detection systems are merged by a scheduler, which can be configured as dynamic or detailed, referring to the object detector used and the speed of execution. Tests show that the system is capable of performing these tasks with satisfactory results.
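One simple way to realize such a scheduler is to let a single loop interleave SLAM tracking on every frame with a detection pass every few frames, trading detection latency against tracking rate. The sketch below is a hypothetical illustration of that idea only; the `track` and `detect` callables, the mode names, and the intervals stand in for the actual SLAM and YOLO components and are not the thesis' design.

```python
class Scheduler:
    """Interleaves per-frame SLAM tracking with periodic object detection.

    mode="dynamic": detect often with the fast (tiny) model.
    mode="detailed": detect less often with the full, more accurate model.
    Modes, intervals, and callables are illustrative assumptions.
    """

    def __init__(self, track, detect_fast, detect_full, mode="dynamic"):
        self.track = track
        self.detect = detect_fast if mode == "dynamic" else detect_full
        self.interval = 3 if mode == "dynamic" else 10   # frames between detections
        self.frame_count = 0

    def step(self, frame):
        pose = self.track(frame)                 # SLAM runs on every frame
        detections = None
        if self.frame_count % self.interval == 0:
            detections = self.detect(frame)      # detector runs periodically
        self.frame_count += 1
        return pose, detections

# Dummy stand-ins so the sketch runs on its own.
sched = Scheduler(track=lambda f: (0.0, 0.0, 0.0),
                  detect_fast=lambda f: ["person"],
                  detect_full=lambda f: ["person", "vehicle"],
                  mode="dynamic")
for i in range(5):
    print(sched.step(frame=i))
```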
Featured image: the Vuzix Blade smart glasses that were used in this thesis.