Long Hands Gesture Recognition: Real-Time Hand Gesture Recognition and Hand Tracking

Pavel Popov, Robert Laganiere
University of Ottawa
Ottawa, ON, Canada

Long Hands Gesture Recognition and Hand Tracking: Hand Tracking Dataset


A dataset used to test the hand tracking capabilities of the Long Hands Gesture Recognition and Hand Tracking system is available here. The dataset contains videos of 5 different users, along with hand location annotations and finger count and specification information. A demonstration of the Long Hands Gesture Recognition and Hand Tracking system, which is capable of automatic user registration and hand tracking in a 2D-video-only context, is available here.
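As a rough illustration of how such annotations can be used to score a tracker, the sketch below compares predicted and ground-truth hand boxes with an intersection-over-union criterion. The annotation layout assumed here (one (x, y, w, h) box per frame) is hypothetical and is not the dataset's documented format.

```python
# Hypothetical evaluation helper: assumes one (x, y, w, h) ground-truth
# hand box per frame, which may differ from the dataset's actual layout.

def iou(box_a, box_b):
    """Intersection-over-union of two (x, y, w, h) boxes."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    x1, y1 = max(ax, bx), max(ay, by)
    x2, y2 = min(ax + aw, bx + bw), min(ay + ah, by + bh)
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

def tracking_accuracy(predicted, ground_truth, threshold=0.5):
    """Fraction of frames where the tracked box overlaps ground truth enough."""
    hits = sum(iou(p, g) >= threshold for p, g in zip(predicted, ground_truth))
    return hits / len(ground_truth) if ground_truth else 0.0
```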

Long Hands Gesture Recognition: Gesture Recognition Dataset


A dataset of 5 gestures and negative background samples used for machine learning in this project can be found here. The negative background samples were enriched with additional images from publicly available datasets [1],[2].

References
1. Faces 1999, Caltech Computational Vision. http://www.vision.caltech.edu/archive.html
2. Quattoni, A., & Torralba, A. Indoor Scene Recognition. http://web.mit.edu/torralba/www/indoor.html

Overview


Real-time hand gesture recognition is an interesting and powerful way for users to interface with computers. It is a multi-faceted problem with many competing approaches. Our approach focuses on:

  • Automatic colour image processing and segmentation techniques
  • Real-time performance
  • Robust hand gesture recognition using machine learning and hand contour shape analysis
  • Robust hand tracking using hand contour shape analysis and template matching
This project aims to create powerful hand gesture recognition and hand tracking approaches and to combine them into robust real-time user hand gesture interfaces. A minimal sketch of the colour segmentation step is given below.
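The sketch below illustrates the kind of colour segmentation and contour extraction the list above refers to: skin-like pixels are thresholded in HSV space and the largest resulting contour is kept as a hand candidate. The HSV range is an illustrative assumption and would need automatic, per-user calibration in practice; it is not the project's actual procedure.

```python
import cv2
import numpy as np

def largest_skin_contour(frame_bgr):
    """Return the largest skin-coloured contour in a BGR frame, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Rough, illustrative skin range; real systems calibrate this per user.
    lower, upper = np.array([0, 40, 60]), np.array([25, 255, 255])
    mask = cv2.inRange(hsv, lower, upper)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    # OpenCV 4.x return signature: (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    return max(contours, key=cv2.contourArea)  # candidate hand contour
```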


Projects that have contributed to this research direction



Several projects have contributed to this research direction:
  • Hand Gesture Recognition for Car Driver Interactions: This project achieved position-independent hand gesture recognition using machine learning.
  • Long Hands Controller for Halo: This controller proved the viability of my gesture recognition system for video game control applications.
  • Long Hands Gesture Recognition System: This project achieved robust fixed-position hand gesture recognition and hand tracking.
  • Gesture Recognition Paint: This project successfully ported an earlier version of my gesture recognition work to an Android tablet, allowing a user to draw pictures using gesture recognition.


Hand Gesture Recognition for Car Driver Interactions

MITACS Accelerate Project 2017

Pavel Popov, Robert Laganiere
University of Ottawa
Ottawa, ON, Canada
Klashwerks Inc.,
Collaborating Partner
Ottawa, ON, Canada


This project was the result of a MITACS Accelerate research internship with a partnering company called Klashwerks. The company wanted a gesture recognition interface for the mobile device it was building for the connected car market. During this internship I improved my existing Hand Tracking algorithm by adding hand pose recognition capabilities using HOG Cascade machine learning. Using what I learned with HOG Cascades, I then created a robust real-time Hand Gesture Recognition system, also based on HOG Cascades, capable of recognizing specific static hand gestures anywhere in a video frame. The system was trained with a small sample dataset and performed well on PC. I also added an option to use HOG SVMs instead of HOG Cascades in order to compare the speed and performance of the two machine learning methods.

An attempt was made to port the Hand Gesture Recognition system to Klashwerks' mobile device for the connected car market. However, the system performed poorly on the mobile device and was replaced with a simpler proximity sensor. This research internship also produced a very large hand gesture image dataset that will be crucial to the machine learning research in my PhD. MITACS Accelerate is a research funding program run by MITACS.
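As a hedged sketch of how a trained HOG cascade can be applied to locate a static hand gesture anywhere in a frame, the snippet below runs OpenCV's multi-scale cascade detector on each frame. The model file name hand_gesture_cascade.xml is a placeholder for a trained cascade (the cascades trained during the internship are not published here), and the detection parameters are illustrative.

```python
import cv2

# Placeholder model path; a cascade trained for one static hand gesture
# would be loaded here.
detector = cv2.CascadeClassifier("hand_gesture_cascade.xml")

def detect_gesture(frame_bgr):
    """Return (x, y, w, h) boxes around candidate gesture locations."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.equalizeHist(gray)  # normalise lighting before detection
    return detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5,
                                     minSize=(48, 48))
```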

Results

Robust real-time Hand Gesture Recognition system capable of recognizing 3 different hand gestures anywhere in a video frame.



Static hand pose recognition add-on for my previous Hand Tracking algorithm, using HOG AdaBoost Cascades.





Long Hands Gesture Recognition controller for Halo

Controller for the first-person video game Halo. 2015

Pavel Popov, Robert Laganiere
University of Ottawa
Ottawa, ON, Canada


I made a controller for the first-person shooter video game Halo using my Long Hands Gesture Recognition system. Using two web cameras, the system tracks the user's hands: the left hand controls character movement, while the right hand controls aiming and shooting. This served as an important proof of concept, showing that user interfaces capable of rapid response can be built with Long Hands Gesture Recognition.
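The sketch below illustrates the kind of mapping from tracked hand positions to game input that such a controller needs. The project's actual input-injection mechanism is not documented here, so the pynput library, the key bindings, and the thresholds are all stand-in assumptions.

```python
from pynput.keyboard import Controller as Keyboard
from pynput.mouse import Controller as Mouse, Button

keyboard, mouse = Keyboard(), Mouse()

def _hold(key, active):
    """Press the key while its gesture condition holds, release otherwise."""
    (keyboard.press if active else keyboard.release)(key)

def apply_controls(left_hand, right_hand, fire):
    """left_hand / right_hand: (dx, dy) offsets from each camera's centre, in [-1, 1]."""
    lx, ly = left_hand
    _hold('w', ly < -0.2)   # move forward
    _hold('s', ly > 0.2)    # move back
    _hold('a', lx < -0.2)   # strafe left
    _hold('d', lx > 0.2)    # strafe right
    rx, ry = right_hand
    mouse.move(int(rx * 30), int(ry * 30))  # aim with the right hand
    if fire:
        mouse.press(Button.left)            # shoot
    else:
        mouse.release(Button.left)
```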

Results

Long Hands Gesture Recognition controller for Halo video demonstration




Long Hands Gesture Recognition System

Contour Shape Analysis and Pixel-Colour-Based Template Matching for Hybrid Real-Time Gesture Recognition and Tracking. 2015

Pavel Popov, Robert Laganiere
University of Ottawa
Ottawa, ON, Canada


I integrated my previously developed hand recognition algorithm and my contour-based and template-based hand tracking algorithms into one system. Unimodal histogram filtering and contour shape analysis rapidly detect a "hand 5" gesture (all five fingers extended) displayed in the middle of a video frame, and the two-part hand tracking algorithm then tracks the hand in subsequent frames. The original contour-based tracking algorithm was complemented by the template-based tracking algorithm: the contour-based tracker partially guides the template-based tracker, which improves the robustness of the template tracking, while the template tracker provides stable tracking of the fingertips, improving the system's usability as a user interface input mechanism. A description of the project was drafted as a journal article and a video demonstration was made. Before publishing, my thesis supervisor and I decided to add machine learning to improve the hand gesture recognition and make it position independent, in order to present a more complete system.
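A simplified sketch of the contour shape analysis idea is shown below: an open hand tends to produce deep convexity defects between extended fingers, so counting them gives a quick "hand 5" test. The thresholds are illustrative and are not the system's tuned values.

```python
import cv2

def looks_like_hand_five(contour, depth_thresh=10000):
    """Heuristic test: does this contour resemble an open hand with five fingers?"""
    hull = cv2.convexHull(contour, returnPoints=False)
    if hull is None or len(hull) < 4:
        return False
    defects = cv2.convexityDefects(contour, hull)
    if defects is None:
        return False
    # Each defect row is (start, end, farthest point, fixed-point depth).
    deep = sum(1 for d in defects[:, 0] if d[3] > depth_thresh)
    return deep >= 4  # four deep valleys between five extended fingers
```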

Results

Long Hands Gesture Recognition System video demonstration



Gesture Recognition Paint

Android application allowing a user to draw pictures with hand gestures in front of an Android device camera. 2015

Pavel Popov, Robert Laganiere
University of Ottawa
Ottawa, ON, Canada


I successfully ported my gesture recognition work to an Android device. Using my hand recognition and contour-based hand tracking algorithms, I created an app called Gesture Recognition Paint that lets a user draw on the screen of a tablet by making hand gestures in front of the front-facing device camera. The app offers 6 different colours to choose from when drawing. This important milestone for my thesis research showed that my gesture recognition methods can be ported to different platforms.
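The drawing logic behind the app can be sketched as follows: once a fingertip is tracked per frame, its positions are stamped onto a persistent canvas that is blended over the camera image. This Python/OpenCV sketch only illustrates the idea; the actual app runs on Android and relies on the hand recognition and tracking algorithms described above to supply the fingertip position.

```python
import cv2
import numpy as np

canvas = None
colours = [(255, 0, 0), (0, 255, 0), (0, 0, 255),
           (0, 255, 255), (255, 0, 255), (255, 255, 0)]  # 6 selectable colours

def draw_step(frame_bgr, fingertip, colour_index=0):
    """fingertip: (x, y) pixel position of the tracked fingertip, or None."""
    global canvas
    if canvas is None:
        canvas = np.zeros_like(frame_bgr)        # persistent drawing layer
    if fingertip is not None:
        cv2.circle(canvas, fingertip, 6, colours[colour_index], -1)
    # Overlay the drawing on the live camera frame.
    return cv2.addWeighted(frame_bgr, 1.0, canvas, 1.0, 0)
```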

Results

Gesture Recognition Paint App created for Android Tablet Devices