Human Drone Natural Interaction
DOI: https://doi.org/10.37591/joarb.v10i2.697

Keywords: Unmanned Aerial Vehicle (UAV), Gesture Piloting, Long-range Drone Control, Ground Control Unit (GCU), MAVLink Protocol

Abstract
The growing use of drones across industries and among the general public has highlighted the need for user-friendly, intuitive control systems. Existing drone piloting techniques, however, frequently demand intricate manual manoeuvres and technical expertise, creating difficulties for novice and inexperienced pilots. The main goal of this work is to create a ground control unit (GCU) that allows users to operate a quadcopter using hand gestures and motions instead of a traditional radio remote controller. Using computer vision techniques and gesture recognition algorithms, the GCU interprets the pilot's motions and converts them into the corresponding flight commands for the drone. This study focuses on the difficulties that novice and general drone pilots must overcome, such as the steep learning curve, challenging manual controls, and the risk of accidents in the early phases of flying. By utilising natural interaction modalities such as gestures, the proposed approach aims to streamline the learning process and make drone piloting accessible to a larger audience. This is especially helpful for rookie pilots and amateurs who may not have prior experience or technical knowledge. The significance of this work lies in its ability to enhance both the usability and safety of drone operations. By replacing complicated manual controls with intuitive gestures, the system reduces the cognitive strain on the pilot and improves their overall control of the drone. Additionally, the proposed system's long-range capabilities broaden the scope of possible drone applications by permitting operations beyond the line of sight and simplifying operations in remote or difficult areas. By offering a simple and intuitive control method, this work aims to support the wider development and adoption of drones.
By addressing the difficulties that novice and general-public pilots encounter, drone piloting will become usable by a wider spectrum of users. As a result, drones may be adopted in more industries, drone operations may become safer, and new and creative applications for drones may be explored.
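The gesture-to-command translation described above can be sketched as a small mapping layer. The gesture names, the finger-extension heuristic, and the command mapping below are illustrative assumptions, not the paper's actual design; a real GCU of this kind would derive finger states from hand landmarks (e.g. MediaPipe Hands) and dispatch the resulting commands to the flight controller over the MAVLink protocol.

```python
# Illustrative sketch of a gesture-to-flight-command layer.
# Gesture names and the command mapping are assumptions for illustration;
# a real GCU would compute finger states from hand-landmark angles
# (e.g. MediaPipe Hands) and send commands over MAVLink.

GESTURE_TO_COMMAND = {
    "open_palm": "HOVER",
    "fist": "LAND",
    "thumb_up": "TAKEOFF",
    "index_point": "YAW",
}

def classify_gesture(fingers_extended):
    """Classify a gesture from a 5-tuple of per-finger extension flags
    (thumb, index, middle, ring, pinky)."""
    if all(fingers_extended):
        return "open_palm"
    if not any(fingers_extended):
        return "fist"
    if fingers_extended == (True, False, False, False, False):
        return "thumb_up"
    if fingers_extended == (False, True, False, False, False):
        return "index_point"
    return "unknown"

def gesture_to_command(fingers_extended):
    """Translate finger states into a flight command; unrecognised
    gestures fall back to a safe HOLD."""
    return GESTURE_TO_COMMAND.get(classify_gesture(fingers_extended), "HOLD")

print(gesture_to_command((True, True, True, True, True)))  # open palm -> HOVER
```

Falling back to a hold command on unrecognised gestures is one way such a system could fail safe when the recogniser's output is ambiguous.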
References
Tezza D, Andujar M. The state-of-the-art of human–drone interaction: A survey. IEEE Access. 2019 Nov 18; 7: 167438–54.
Cauchard JR, E JL, Zhai KY, Landay JA. Drone & me: an exploration into natural human-drone interaction. In Proceedings of the 2015 ACM international joint conference on pervasive and ubiquitous computing. 2015 Sep 7; 361–365.
Walter I, Khadr M. Gesture Controlled Drone. 2019. Semantic Scholar Corpus ID: 250444369.
Natarajan K, Nguyen TH, Mete M. Hand gesture-controlled drones: An open source library. In 2018 IEEE 1st International Conference on Data Intelligence and Security (ICDIS). 2018 Apr 8; 168–175.
Gio N, Brisco R, Vuletic T. Control of a drone with body gestures. Proceedings of the Design Society. 2021 Aug; 1: 761–70.
Menshchikov A, Ermilov D, Dranitsky I, Kupchenko L, Panov M, Fedorov M, Somov A. Data-driven body-machine interface for drone intuitive control through voice and gestures. In IECON 2019-45th Annual Conference of the IEEE Industrial Electronics Society. 2019 Oct 14; 1: 5602–5609.
Cauchard JR, Tamkin A, Wang CY, Vink L, Park M, Fang T, Landay JA. Drone.io: A gestural and visual interface for human-drone interaction. In 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI). 2019 Mar 11; 153–162.
Sun T, Nie S, Yeung DY, Shen S. Gesture-based piloting of an aerial robot using monocular vision. In 2017 IEEE International Conference on Robotics and Automation (ICRA). 2017 May 29; 5913–5920.
Bandala AA, Maningo JM, Sybingco E, Vicerra RR, Dadios EP, Guillarte JD, Salting JO, Santos MJ, Sarmiento BA. Development of Leap Motion Capture Based-Hand Gesture Controlled Interactive Quadrotor Drone Game. In 2019 IEEE 7th International Conference on Robot Intelligence Technology and Applications (RiTA). 2019 Nov 1; 174–179.
Fernandez RA, Sanchez-Lopez JL, Sampedro C, Bavle H, Molina M, Campoy P. Natural user interfaces for human-drone multi-modal interaction. In 2016 IEEE International Conference on Unmanned Aircraft Systems (ICUAS). 2016 Jun 7; 1013–1022.
Haratiannejadi K, Fard NE, Selmic RR. Smart glove and hand gesture-based control interface for multi-rotor aerial vehicles. In 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC). 2019 Oct 6; 1956–1962.
Zhang F, Bazarevsky V, Vakunov A, Tkachenka A, Sung G, Chang CL, Grundmann M. Mediapipe hands: On-device real-time hand tracking. arXiv preprint arXiv:2006.10214. 2020 Jun 18.
ZeroMQ: An Open-source Universal Messaging Library. [Online]. Available: https://zeromq.org.