In this project I'm going to show how hand tracking can be used to control a Raspberry Pi. We'll use the simplest Computer Vision algorithms, so anyone can implement and run this on their own device.
1 laptop, 1 Raspberry Pi 2, 1 webcam
1. Install conda (from the official downloads page)
2. Get the code from the repository
git clone https://github.com/jackersson/hand_mouse_control.git
cd hand_mouse_control
3. Create the conda environment (a prepared environment file, environment.yml, is in the repository folder)
conda env create -f environment.yml
source activate tutor # environment name - ‘tutor’
Required libraries: python3, OpenCV, numpy, pyautogui (a quick import check is sketched right after this list)
Additional: Learn how to manage conda environments.
4. Install Visual Studio Code for code editing.
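To verify the setup, you can check that the required libraries import inside the activated environment. This one-liner is just a sanity check, not part of the repository:
python -c "import cv2, numpy, pyautogui; print(cv2.__version__, numpy.__version__)"
# prints the OpenCV and numpy versions if the environment is set up correctly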
1. Start the program
python main.py -c 0
# where ‘-c 0’ defines the camera index (if only one camera is plugged into the Raspberry Pi, use 0)
Note: to list connected cameras, use
ls -ltrh /dev/video*
2. Hotkeys:
After you launch the application, you will see two windows: ‘Hand mouse control’, with live video from the webcam, and ‘Colored mask’, with the skin segments extracted from the camera stream.
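Both windows come from a standard OpenCV capture-and-threshold loop. Here is a minimal sketch of one way such a pipeline can look; the HSV skin-tone bounds and the exact window handling are my assumptions, and the actual main.py in the repository may differ:

import argparse
import cv2
import numpy as np

parser = argparse.ArgumentParser()
parser.add_argument("-c", "--camera", type=int, default=0)  # camera index, as above
args = parser.parse_args()

cap = cv2.VideoCapture(args.camera)  # opens /dev/video<index>
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # assumed skin-tone range in HSV; tune it for your lighting and skin color
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([0, 48, 80], np.uint8),
                       np.array([20, 255, 255], np.uint8))
    # remove small noise blobs from the binary mask
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    cv2.imshow("Hand mouse control", frame)  # live video
    cv2.imshow("Colored mask", cv2.bitwise_and(frame, frame, mask=mask))
    if cv2.waitKey(1) & 0xFF == ord("q"):  # quit on 'q'
        break
cap.release()
cv2.destroyAllWindows()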
Simple mouse movement with gesture recognition
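Mouse movement can be driven by the centroid of the largest skin blob. Below is a hedged sketch using pyautogui; ‘mask’ is the binary skin mask from the sketch above, and the repository's actual gesture logic may be more involved:

import cv2
import pyautogui

def move_cursor(mask, frame_w, frame_h):
    # find skin contours; handle both OpenCV 3 and 4 return signatures
    res = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    contours = res[0] if len(res) == 2 else res[1]
    if not contours:
        return
    hand = max(contours, key=cv2.contourArea)  # assume the largest blob is the hand
    m = cv2.moments(hand)
    if m["m00"] == 0:
        return
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]  # contour centroid
    screen_w, screen_h = pyautogui.size()
    # map camera coordinates onto screen coordinates
    pyautogui.moveTo(cx * screen_w / frame_w, cy * screen_h / frame_h)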
Painting, Drag&Drop with gesture recognition
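Painting and drag &amp; drop both reduce to holding the mouse button down while a gesture is active. A minimal sketch, assuming some gesture detector (not shown here) reports whether a ‘grab’ gesture is currently held:

import pyautogui

dragging = False

def update_drag(grab_active):
    # hold the left button down while the grab gesture persists
    global dragging
    if grab_active and not dragging:
        pyautogui.mouseDown()  # start a drag or begin a paint stroke
        dragging = True
    elif not grab_active and dragging:
        pyautogui.mouseUp()  # drop the item / finish the stroke
        dragging = False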
I hope this project inspires you to modify the code and build a more advanced system. The same approach could also be extremely useful in Augmented Reality applications. I'm looking forward to your suggestions and possible collaboration.