Daniel Whitelegge (2108775)

Adaptive Gesture Controlled Drone


Project Abstract

Drones are becoming an increasingly popular technology for both industrial and commercial use. Despite this growing popularity, drone control remains difficult, requiring highly skilled and experienced operators. To address this challenge, this project develops a gesture-controlled drone that users operate by performing gesture commands. The project focuses on creating a natural user interface, prioritising the controls users themselves prefer. The system uses a gesture recognition algorithm to identify the user's intended command from a laptop webcam feed. Once a gesture is identified and categorised, the corresponding command is sent to the drone for execution. The project successfully produced a gesture-controlled drone that carries out the user's intended commands with a very low error rate. This work is a step towards reducing the skill and expertise required to operate a drone, and it supports further development of natural user interfaces within human-computer interaction.
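The following is a minimal sketch of the webcam-to-drone pipeline described above: capture a frame, recognise a gesture, and send the matching command. The library choices (OpenCV, MediaPipe Hands, djitellopy/Tello) and the toy finger-counting classifier are illustrative assumptions, not the project's actual stack, which would use its trained gesture recognition model.

```python
# Sketch of the pipeline: webcam frame -> gesture recognition -> drone command.
# OpenCV, MediaPipe Hands, and djitellopy (DJI Tello) are assumed here for
# illustration; the project's real implementation may use different libraries.

import cv2
import mediapipe as mp
from djitellopy import Tello


def classify_gesture(landmarks) -> str:
    """Toy rule-based classifier: counts raised fingers.
    Stands in for the project's trained gesture recognition algorithm."""
    # Indices follow MediaPipe's 21-point hand model; a fingertip (tip)
    # above its middle joint (pip) in image coordinates counts as raised.
    tips, pips = [8, 12, 16, 20], [6, 10, 14, 18]
    raised = sum(landmarks[t].y < landmarks[p].y for t, p in zip(tips, pips))
    return {0: "land", 1: "up", 2: "down"}.get(raised, "hover")


def main():
    hands = mp.solutions.hands.Hands(max_num_hands=1,
                                     min_detection_confidence=0.7)
    drone = Tello()
    drone.connect()
    drone.takeoff()
    cap = cv2.VideoCapture(0)  # laptop webcam
    try:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB; OpenCV captures BGR.
            result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.multi_hand_landmarks:
                gesture = classify_gesture(
                    result.multi_hand_landmarks[0].landmark)
                if gesture == "up":
                    drone.move_up(30)    # distances in cm
                elif gesture == "down":
                    drone.move_down(30)
                elif gesture == "land":
                    break
            cv2.imshow("gesture control", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
    finally:
        drone.land()
        cap.release()
        cv2.destroyAllWindows()


if __name__ == "__main__":
    main()
```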

Keywords: Human-Drone Interaction, Gesture Recognition

Conference Details

Session: A

Location: Sir Stanley Clarke Auditorium, 11:00-13:00

Markers: Deepak Sahoo, Benjamin Mora

Course: BSc Computer Science with a Year in Industry 4yr FI

Future Plans: I have a job lined up