Deep Learning Based Formation Control of Drones
Abstract
Robot swarms can accomplish demanding missions quickly, efficiently, and accurately. For robust operation, robot swarms need to be equipped with reliable localization algorithms. Usually, the global positioning system (GPS) and motion capture cameras are employed to provide robot swarms with high-precision absolute position data. However, such infrastructure ties the robots to instrumented areas and hence reduces robustness. Thus, robots should have onboard localization capabilities to exhibit swarm behavior in challenging scenarios such as GPS-denied environments. Motivated by the need for a reliable onboard localization framework for robot swarms, we present a distance- and vision-based localization algorithm integrated into a distributed formation control framework for three-drone systems. The proposed approach builds upon the bearing angles and the relative distances between pairs of drones in a cyclic formation where each drone follows its coleader. We equip each drone with a monocular camera and derive the bearing angle between a drone and its coleader with recently developed deep learning algorithms. The onboard measurements are then fed back to the formation control algorithm, in which every drone computes its control action in its own frame based on its neighbors only, forming a completely distributed architecture. The proposed approach enables three-drone systems to operate in coordination independently of any external infrastructure. We validate the performance of our approach in a realistic simulation environment.
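To make the described control architecture concrete, the sketch below shows a minimal, hypothetical bearing-and-distance proportional controller that a drone could run in its own body frame, given the measured range and the camera-derived bearing to its coleader. The function name, gain values, and measurement interface are assumptions made for illustration only and are not the authors' exact formulation.

```python
"""Illustrative sketch of a distance- and bearing-based formation control step.

Assumptions (not from the paper): the drone obtains a range `distance` to its
coleader and a bearing angle `bearing` (e.g. from a deep-learning detector on
the monocular camera image), and commands a planar velocity in its body frame.
"""
import numpy as np


def formation_control_step(distance, bearing, desired_distance, desired_bearing,
                           k_d=0.5, k_b=1.0):
    """Return a 2D velocity command in the drone's own body frame.

    distance:          measured range to the coleader [m]
    bearing:           measured bearing angle to the coleader [rad]
    desired_distance:  range that maintains the cyclic (triangular) formation [m]
    desired_bearing:   bearing that maintains the cyclic formation [rad]
    k_d, k_b:          proportional gains (hypothetical values)
    """
    # Unit vector pointing toward the coleader, expressed in the body frame.
    toward_coleader = np.array([np.cos(bearing), np.sin(bearing)])

    # Radial term: close the gap between the measured and desired range.
    v_radial = k_d * (distance - desired_distance) * toward_coleader

    # Tangential term: moving along the tangent direction (90 deg left of the
    # line of sight) decreases the bearing, so this drives the bearing error
    # to zero while keeping the range roughly constant.
    tangent = np.array([-np.sin(bearing), np.cos(bearing)])
    v_tangential = k_b * (bearing - desired_bearing) * tangent

    return v_radial + v_tangential


if __name__ == "__main__":
    # Example: the coleader is seen 3.0 m away at a 0.2 rad bearing, while the
    # formation calls for 2.0 m at 0.0 rad.
    v_cmd = formation_control_step(distance=3.0, bearing=0.2,
                                   desired_distance=2.0, desired_bearing=0.0)
    print("body-frame velocity command:", v_cmd)
```

Because each drone evaluates this rule using only its own onboard measurements of its coleader, no global position information or central coordinator is required, which is the sense in which the architecture is completely distributed.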