Udacity Nanodegree 131 - OpenVINO computer vision on the edge Project 3 - Gaze Tracking
Updated May 13, 2020 - Python
Kormochari is an automated keyboard and mouse control system.
Control the mouse with the keyboard, using the arrow keys and the spacebar.
Virtual Controller implements hand gesture recognition to control mouse movements and adjust system volume. Using MediaPipe and PyCaw libraries, it tracks hand landmarks to simulate mouse cursor movements and recognize gestures for volume control. The project offers intuitive interaction for users through simple hand gestures.
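The landmark-to-cursor step described above usually needs two small pieces of logic: mapping a normalized landmark coordinate (MediaPipe reports landmarks in the 0..1 range) to screen pixels, and smoothing the result so the cursor does not jitter. A minimal sketch of both, assuming you feed it the fingertip landmark from MediaPipe Hands and pass the output to a library such as pyautogui (the margin and alpha values here are illustrative, not taken from the project):

```python
def landmark_to_screen(lm_x, lm_y, screen_w, screen_h, margin=0.1):
    """Map a normalized hand landmark (0..1, as MediaPipe reports it)
    to screen pixel coordinates. A margin crops the frame edges so the
    whole screen is reachable without moving the hand to the very edge
    of the camera's field of view."""
    # Rescale the usable [margin, 1 - margin] band back to [0, 1], clamped.
    nx = min(max((lm_x - margin) / (1 - 2 * margin), 0.0), 1.0)
    ny = min(max((lm_y - margin) / (1 - 2 * margin), 0.0), 1.0)
    return int(nx * (screen_w - 1)), int(ny * (screen_h - 1))


class Smoother:
    """Exponential smoothing to damp frame-to-frame cursor jitter."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha  # higher alpha = snappier, lower = smoother
        self.x = self.y = None

    def update(self, x, y):
        if self.x is None:
            self.x, self.y = float(x), float(y)
        else:
            self.x = self.alpha * x + (1 - self.alpha) * self.x
            self.y = self.alpha * y + (1 - self.alpha) * self.y
        return int(self.x), int(self.y)
```

In a real loop you would read a frame with OpenCV, run MediaPipe Hands on it, pick one landmark (for example the index fingertip), and call `landmark_to_screen` followed by `Smoother.update` before moving the OS cursor.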
Deep learning based Gaze Detection model to control the mouse pointer of your computer.
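The OpenVINO gaze-estimation model used in this kind of project outputs a 3D gaze direction vector; a common way to drive the pointer is to scale the vector's x and y components into a relative cursor move, with a deadzone so the cursor stays put when the user looks roughly at the center. A minimal sketch of that mapping (the `speed` and `deadzone` values are illustrative assumptions, not the project's actual parameters):

```python
def gaze_to_pointer_delta(gaze_vec, speed=300, deadzone=0.05):
    """Convert a gaze direction vector (gx, gy, gz) into a relative
    pointer move (dx, dy). Components inside the deadzone are ignored
    so that looking near the center keeps the cursor still."""
    gx, gy = gaze_vec[0], gaze_vec[1]
    dx = 0 if abs(gx) < deadzone else int(gx * speed)
    # Screen y grows downward, while the gaze vector's y grows upward,
    # so the vertical component is negated.
    dy = 0 if abs(gy) < deadzone else int(-gy * speed)
    return dx, dy
```

The resulting delta would typically be applied with a call such as pyautogui's `moveRel(dx, dy)` inside the per-frame inference loop.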
A gesture-based mouse controller using OpenCV and PyMouse
This HCI (Human-Computer Interaction) application in Python (3.6) lets you control your mouse cursor with your facial movements, using just a regular webcam. It's hands-free: no wearable hardware or sensors are needed.
Trackpoint-like mouse control via webcam-recorded hand gestures
Real-time mouse control via webcam-recorded hand gestures.