Gesture control of a drone using a motion controller

Abstract: In this study, we present an implementation that uses a motion controller to steer a drone through simple hand gestures. We use the Leap Motion controller as the gesture sensor and the Parrot AR.Drone 2.0 as the aerial platform. The AR.Drone is an off-the-shelf quadrotor with an on-board Wi-Fi system; it connects to the ground station over Wi-Fi, while the Leap Motion controller connects to the ground station over USB. The Leap Motion controller recognizes hand gestures and relays them to the ground station, which runs ROS (Robot Operating System) on Linux as the platform for this implementation. Python is the programming language used to interact with the AR.Drone and convey the gesture commands. In our implementation, we wrote Python code that interprets the hand gestures captured by the Leap and transmits the corresponding commands to control the motion of the AR.Drone.
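To make the gesture-to-command pipeline concrete, the following is a minimal sketch of a ROS node, written with rospy, that maps hand-pose values onto a geometry_msgs/Twist published on the drone's cmd_vel topic. The mapping gains are illustrative, the /cmd_vel topic name assumes the standard ardrone_autonomy driver, and the hand-pose inputs stand in for values that would be read from the Leap SDK in the real system.

#!/usr/bin/env python
# Minimal sketch: map hand-pose values to AR.Drone velocity commands.
import rospy
from geometry_msgs.msg import Twist

# Illustrative scaling gains (tuned by hand in practice).
PITCH_GAIN = 0.5   # hand pitch  -> forward/backward
ROLL_GAIN  = 0.5   # hand roll   -> left/right
YAW_GAIN   = 1.0   # hand yaw    -> rotation
LIFT_GAIN  = 0.01  # palm height -> climb/descend

def gesture_to_twist(pitch, roll, yaw, palm_height, hover_height=150.0):
    """Convert a hand pose (radians / millimetres) into a Twist command."""
    cmd = Twist()
    cmd.linear.x  = -PITCH_GAIN * pitch                        # tilt forward -> fly forward
    cmd.linear.y  = -ROLL_GAIN * roll                          # tilt left    -> fly left
    cmd.linear.z  =  LIFT_GAIN * (palm_height - hover_height)  # raise palm   -> climb
    cmd.angular.z = -YAW_GAIN * yaw                            # rotate hand  -> rotate drone
    return cmd

if __name__ == '__main__':
    rospy.init_node('leap_gesture_teleop')
    pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
    rate = rospy.Rate(20)  # publish at 20 Hz
    while not rospy.is_shutdown():
        # In the real system these values come from the current Leap frame;
        # fixed placeholders are used here so the sketch runs standalone.
        pitch, roll, yaw, palm_height = 0.0, 0.0, 0.0, 150.0
        pub.publish(gesture_to_twist(pitch, roll, yaw, palm_height))
        rate.sleep()

In a complete node, the placeholder values would be replaced by the pitch, roll, yaw and palm height of the tracked hand reported by the Leap, so that tilting or raising the hand translates directly into drone motion.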
EXISTING SYSTEM:
• In existing work, hand recognition by conventional methods using 2-D cameras suffers from instability due to lighting and skin color variations.
• The extracted fingertip positions of each gesture are stored in a database, and the distances between them are used as the feature vector (a small sketch of such a distance feature follows this list).
• The system tracks hand and finger movements in digital form and yields a few key points associated with each gesture.
• With this technique, one operator can manage two robots with both hands in a contactless, markerless environment.
• Dynamic hand gesture recognition is a crucial but challenging task in the pattern recognition and computer vision communities.
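To illustrate the distance-based feature mentioned above, here is a small sketch that turns a set of fingertip positions into a feature vector of pairwise Euclidean distances; the array shapes and the choice of pairwise distances are assumptions made for illustration, not details taken from the cited work.

import numpy as np

def fingertip_feature_vector(fingertips):
    """Build a feature vector from pairwise distances between fingertips.

    fingertips: an (N, 3) array of fingertip positions (e.g. 5 tips in mm).
    Returns a 1-D vector of the N*(N-1)/2 pairwise Euclidean distances.
    """
    tips = np.asarray(fingertips, dtype=float)
    n = len(tips)
    dists = [np.linalg.norm(tips[i] - tips[j])
             for i in range(n) for j in range(i + 1, n)]
    return np.array(dists)

# Example: five fingertip positions captured for one gesture.
features = fingertip_feature_vector(np.random.rand(5, 3) * 100)
print(features.shape)  # (10,) distances for 5 fingertips

Because relative distances change little when the whole hand moves, a vector like this is reasonably invariant to hand translation, which is one reason such features can be stored and compared against database entries.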
DISADVANTAGES:
• A positive throttle raises the drone and a negative throttle lowers it.
• Many of these issues could have been minimized or avoided altogether by using another sensor, such as an optical-flow sensor similar to the one the drone already uses for stabilization.
• Noise is handled by averaging the samples collected between drone actuations, which minimizes its impact but also lowers the spatial resolution, as can be seen throughout the results (a short averaging sketch follows this list).
• With no dynamic calibration, even a slight change in the environment affects the sensor's output, and the sensor also struggles with continued distortion of the surrounding electromagnetic emissions.
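The averaging between actuations can be sketched briefly; the buffer handling below is an assumption made for illustration rather than the original implementation.

import numpy as np

class ActuationAverager:
    """Average the raw sensor samples collected between two drone actuations.

    Averaging suppresses noise, but each actuation then sees only one
    smoothed value, which lowers the effective spatial resolution.
    """
    def __init__(self):
        self._samples = []

    def add_sample(self, value):
        self._samples.append(float(value))

    def pop_average(self):
        """Return the mean of the samples gathered since the last actuation."""
        if not self._samples:
            return None
        mean = float(np.mean(self._samples))
        self._samples = []
        return mean

# Example: noisy readings collected between two commands.
averager = ActuationAverager()
for reading in [1.02, 0.97, 1.05, 0.99]:
    averager.add_sample(reading)
print(averager.pop_average())  # about 1.008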
PROPOSED SYSTEM:
• The goal is to fuse all sensor information arriving at variable rates; the challenge of combining measurements that arrive from different sensors at different rates is addressed with an extended Kalman filter (EKF), as sketched after this list.
• The systematic framework of the proposed technique consists of two main steps: feature extraction and classification.
• The proposed method is evaluated on two dynamic hand gesture datasets whose frames were acquired with a Leap Motion controller (LMC).
• In the proposed system, an input gesture is acquired using a Leap Motion sensor.
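The variable-rate fusion can be illustrated with a minimal Kalman filter over a 1-D constant-velocity state; the state model, noise values, and position-only measurement are assumptions chosen to keep the sketch short (with this linear model the EKF's Jacobians reduce to constant matrices, whereas a real system would linearize its nonlinear models at each step).

import numpy as np

class SimpleEKF:
    """Minimal constant-velocity filter fusing measurements at variable rates."""

    def __init__(self, q=0.01, r=0.5):
        self.x = np.zeros(2)             # state estimate [position, velocity]
        self.P = np.eye(2)               # state covariance
        self.q = q                       # process noise intensity
        self.R = np.array([[r]])         # measurement noise covariance
        self.H = np.array([[1.0, 0.0]])  # we observe position only

    def predict(self, dt):
        """Propagate the state forward by dt, which varies between sensor arrivals."""
        F = np.array([[1.0, dt], [0.0, 1.0]])
        Q = self.q * np.array([[dt**3 / 3, dt**2 / 2],
                               [dt**2 / 2, dt]])
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q

    def update(self, z):
        """Fuse one position measurement z."""
        y = np.array([z]) - self.H @ self.x        # innovation
        S = self.H @ self.P @ self.H.T + self.R    # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)   # Kalman gain
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P

# Measurements arriving at irregular intervals: (dt since last update, value).
ekf = SimpleEKF()
for dt, z in [(0.05, 0.10), (0.20, 0.30), (0.02, 0.31), (0.10, 0.40)]:
    ekf.predict(dt)
    ekf.update(z)
print(ekf.x)  # fused position and velocity estimate

Each sensor simply calls predict() with the elapsed time since the previous update and then update() with its own measurement, which is how readings arriving at different rates are folded into a single state estimate.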
ADVANTAGES:
• It should be noted that the 'rospy' package in ROS is designed to favour implementation speed, i.e. developer time, over runtime performance, which enables quick prototyping of algorithms in Python and testing them directly on ROS (see the short example after this list).
• Regarding performance, the results indicate that the implementation fulfils all of the target requirements.
• Because the sensor is relatively cheap, no calibration can be performed to adjust its measurements for varying environments, and because the user cannot access the raw electrode measurements, little can be done to improve its performance and sensitivity.
• The relatively comparable price and the potential increase in accuracy and performance make this kind of device a more suitable option for further studies.
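As an example of the quick prototyping that rospy allows, the short script below commands the drone to take off, hover briefly, and land. The /ardrone/takeoff and /ardrone/land topic names assume the common ardrone_autonomy ROS driver for the AR.Drone 2.0; the timings are arbitrary.

#!/usr/bin/env python
# Quick rospy prototype: take off, hover briefly, then land.
import rospy
from std_msgs.msg import Empty

if __name__ == '__main__':
    rospy.init_node('ardrone_takeoff_land_demo')
    takeoff_pub = rospy.Publisher('/ardrone/takeoff', Empty, queue_size=1)
    land_pub = rospy.Publisher('/ardrone/land', Empty, queue_size=1)

    rospy.sleep(1.0)   # give the publishers time to connect to the driver
    takeoff_pub.publish(Empty())
    rospy.loginfo('Takeoff command sent')

    rospy.sleep(5.0)   # hover for a few seconds
    land_pub.publish(Empty())
    rospy.loginfo('Land command sent')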
