Implementing gesture control on Raspberry Pi

In this project I’m going to show how hand tracking can be used to control a Raspberry Pi. We'll use the simplest computer vision algorithms, so anyone can implement and run this on their own device.

Project State

Public Project, Participation wanted

Licences

Software Licence: GPL v.3
Hardware Licence: Project has no hardware

Admins

jackersson


Hardware

    1 laptop, 1 Raspberry Pi 2, 1 webcam

Dependencies

     1. Install conda (see the official downloads page)

     2. Get the code from the repository

git clone https://github.com/jackersson/hand_mouse_control.git
cd hand_mouse_control

     3. Create the conda environment (a prepared environment.yml is included in the repository folder)

conda env create -f environment.yml
source activate tutor # environment name - ‘tutor’

 Required libraries: python3, OpenCV, numpy, pyautogui

 Additional: Learn how to manage conda environments.
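
 Once the environment is active, you can sanity-check that the required libraries import correctly (a hypothetical one-liner, not part of the repository):

python -c "import cv2, numpy, pyautogui; print(cv2.__version__)"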

      4. Install Visual Studio Code for code editing.

Launch Guide

      1. Start the program

python main.py -c 0
# where ‘-c 0’ sets the camera index (if only one camera is plugged into the Raspberry Pi, use 0)

Note: to list connected cameras use

ls -ltrh /dev/video*
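
If you prefer to probe for cameras from Python, here is a minimal sketch (illustrative only; it simply tries the first few device indices):

import cv2

# Try the first few device indices and report which ones open;
# a working index is the value to pass with '-c'.
for index in range(4):
    cap = cv2.VideoCapture(index)
    if cap.isOpened():
        print('camera found at index', index)
    cap.release()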

     2. Hotkeys:

  •   press ‘a’ - start/stop mouse control by hand
  •   press ‘w’ - draw blind zones (transparent, green-colored zones) where the hand can’t be placed
  •   press ‘h’ - show/hide the green rectangles used for hand extraction
  •   press ‘s’ - update the intensity threshold for color-based hand extraction
  •   press ‘q’ - quit the program

  After you launch the application, you will see two windows: ‘Hand mouse control’, with live video from the webcam, and ‘Colored mask’, with the skin segments extracted from the camera stream.

  •     Place your hand over the green rectangles and press ‘s’ until the ‘Hand mouse control’ window shows only your extracted hand. The better the extraction, the smoother the cursor tracking.
  •     Press ‘h’ to hide the green rectangles.
  •     Press ‘a’ to start moving the mouse with your hand.
  •     Move your left hand and watch the mouse cursor. The cursor position should update as your hand moves. Memorize the gestures used to control the application. (A condensed sketch of this pipeline follows this list.)

  •    Press ‘w’ to see the blind zones (a sketch of how they are drawn follows as well).
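
To make the moving parts concrete, here is a condensed sketch of the color-threshold-plus-centroid idea described above. It is not the repository's main.py: the sampled skin color and the threshold are placeholder values (in the real program, pressing ‘s’ retunes the threshold interactively).

import cv2
import numpy as np
import pyautogui

# Condensed sketch of the pipeline above, NOT the repository's main.py.
# SKIN_BGR and THRESHOLD are assumed placeholder values.
SKIN_BGR = np.array([120, 130, 180], dtype=np.int16)
THRESHOLD = 40

screen_w, screen_h = pyautogui.size()
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Keep pixels whose every channel is within THRESHOLD of the sample.
    diff = np.abs(frame.astype(np.int16) - SKIN_BGR)
    mask = (diff.max(axis=2) < THRESHOLD).astype(np.uint8) * 255
    # Track the largest skin-colored blob; [-2] works on OpenCV 3 and 4.
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                cv2.CHAIN_APPROX_SIMPLE)[-2]
    if contours:
        hand = max(contours, key=cv2.contourArea)
        m = cv2.moments(hand)
        if m['m00'] > 0:
            cx, cy = m['m10'] / m['m00'], m['m01'] / m['m00']
            h, w = mask.shape
            # Map the hand centroid to screen coordinates.
            pyautogui.moveTo(cx / w * screen_w, cy / h * screen_h)
    cv2.imshow('Colored mask', mask)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()

One practical note: pyautogui's fail-safe is enabled by default, so moving the physical mouse into a screen corner aborts the script if the tracking ever runs away.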

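The blind zones toggled with ‘w’ are just translucent green rectangles blended over the camera frame. A minimal way to draw one with OpenCV (the coordinates and opacity here are hypothetical):

import cv2

def draw_blind_zone(frame, top_left, bottom_right, alpha=0.35):
    # Blend a translucent green rectangle over a region the hand should avoid.
    overlay = frame.copy()
    cv2.rectangle(overlay, top_left, bottom_right, (0, 255, 0), thickness=-1)
    return cv2.addWeighted(overlay, alpha, frame, 1 - alpha, 0)
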
Here are some demos

     Simple mouse movement with gesture recognition

     Painting, Drag&Drop with gesture recognition
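
Drag & drop maps naturally onto pyautogui's separate press and release primitives. A hypothetical glue layer (gesture detection itself omitted; these hook names are made up for illustration):

import pyautogui

# Hypothetical hooks to be called from the gesture recognizer.
def on_grab_gesture():
    pyautogui.mouseDown()  # hold the button while the 'grab' gesture persists

def on_release_gesture():
    pyautogui.mouseUp()    # release to drop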

Conclusion     

    I hope this project inspires you to modify the code and build a more advanced system. The approach could also be extremely useful in augmented reality applications. I'm looking forward to your suggestions and possible collaboration.

GitHub Repository

https://github.com/jackersson/hand_mouse_control

Commits

Taras pushed 08b893d648017d25987c9063cf622f4438ded3b9 (Update README.md) - 2017-11-19 15:25:48 UTC
taras pushed eeded90c72983c06dd1320dd4e000aaea0dee191 (added environment) - 2017-11-19 14:46:47 UTC
taras pushed 3e2133ae55f2c93f1981f3eabd9b0ec01af0a551 (initial) - 2017-11-19 14:41:41 UTC
Taras pushed df60873a42d3bb9f0db0dcb5aafcd38b1a83ce94 (Initial commit) - 2017-10-15 11:01:43 UTC

Comments

heshanfer: Are you using a Leap Motion for the project?
jackersson: Hi, no :) These are simple hand-crafted CV algorithms.