Omaid Nasary (2213545)
Rock-Paper-Scissors Gesture Recognition Application Using Depth- and RGB-Based Cameras

Project Abstract
Increasing screen time has made simple games like Rock-Paper-Scissors (RPS) feel less interactive to younger audiences, yet such games help develop important social skills such as communication and peer interaction. This project aims to make RPS more immersive by bringing it into an interactive digital format, complete with score-tracking to reduce disputes. On a personal level, I also aimed to explore unfamiliar technologies such as depth cameras and libraries like MediaPipe and PyRealSense to build a unique, accessible application. While most hand gesture recognition research focuses on sign language, I chose a less-explored but novel direction by applying it to RPS. The goal was to create a GUI-based application that uses both RGB and depth cameras to detect hand gestures and facilitate games of RPS with automatic scoring, and to evaluate gesture detection performance across the two camera types. The project uses a deep learning-based hand gesture recognition approach, leveraging MediaPipe and OpenCV for detection and classification and PyRealSense for interacting with the depth camera. Initial findings indicate that the depth camera is less accurate than the RGB camera, likely due to both technical and environmental factors explored further in my research. Despite this, the project's main objectives were met, and the application will be demonstrated at the conference. While the project does not assess RPS's impact on engagement due to time constraints, it offers a novel use of gesture recognition and demonstrates how classic games can be reimagined through emerging technologies.
Keywords: Gesture Recognition, Software Engineering, Software-Hardware Interaction
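The abstract names the detection pipeline but not its mechanics, so below is a minimal sketch of one common way to recognise RPS moves with MediaPipe and OpenCV: detect hand landmarks per frame, count extended fingers, and map the count to a move. The classify_rps helper and its finger-counting rule are illustrative assumptions rather than the project's actual classifier, and the plain webcam capture stands in for either camera's colour stream (a RealSense colour stream fetched via pyrealsense2 would slot in the same way).

import cv2
import mediapipe as mp

# Landmark indices for the tips and PIP joints of the index, middle,
# ring and little fingers (the thumb is ignored for simplicity).
FINGER_TIPS = [8, 12, 16, 20]
FINGER_PIPS = [6, 10, 14, 18]

def classify_rps(landmarks):
    # Hypothetical rule: a finger counts as "extended" when its tip sits
    # above its PIP joint (image y grows downwards, so smaller y = higher).
    extended = sum(
        landmarks[tip].y < landmarks[pip].y
        for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
    )
    if extended == 0:
        return "rock"
    if extended == 2:
        return "scissors"
    if extended == 4:
        return "paper"
    return "unknown"

cap = cv2.VideoCapture(0)  # stand-in for the RGB or RealSense colour stream
with mp.solutions.hands.Hands(max_num_hands=1,
                              min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB frames; OpenCV captures in BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            move = classify_rps(results.multi_hand_landmarks[0].landmark)
            cv2.putText(frame, move, (10, 30),
                        cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
        cv2.imshow("RPS demo", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()

This landmark-counting rule is deliberately simple; a learned classifier over the full 21-landmark vector, as suggested by the deep learning approach in the abstract, would be more robust to rotated or partially occluded hands.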
Conference Details
Session: A
Location: Sir Stanley Clarke Auditorium, 11:00–13:00
Markers: Daniele Cafolla, Matt Roach
Course: BSc Computer Science 3yr FT
Future Plans: I’m looking for work