Converting a depth-map from the Kinect to haptic feedback motors to allow blind individuals to gain a physical, intuitive feel for the world around them

tusing/blind_navigation

Blind Navigation

Mapping real-space depth to haptic feedback so that blind individuals can navigate, granting, in a sense, a kind of depth-based "sight."

Tested on a vest with 32 vibration motors arranged in a grid. Depth is obtained by fusing the Microsoft Kinect's depth map with ultrasonic sensor readings. Each motor is assigned a segment of the depth image and vibrates more strongly the closer the wearer gets to an obstacle in that segment.
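The segment-to-motor mapping described above can be sketched as follows. This is a minimal illustration, not the repository's actual code: it assumes a 4×8 motor grid (32 motors), a Kinect-style depth image in millimeters where 0 means "no reading", and a hypothetical linear near/far range (`min_mm`/`max_mm`) chosen here for demonstration.

```python
import numpy as np

def depth_to_motor_intensities(depth_mm, rows=4, cols=8,
                               min_mm=500, max_mm=4000):
    """Map a depth image (mm) to a rows x cols grid of vibration
    intensities in [0, 1], one value per motor.

    Each motor covers one rectangular segment of the image; the nearest
    valid depth in the segment drives the motor, so closer obstacles
    produce stronger vibration.
    """
    h, w = depth_mm.shape
    out = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            # Slice out this motor's segment of the depth image.
            cell = depth_mm[r * h // rows:(r + 1) * h // rows,
                            c * w // cols:(c + 1) * w // cols]
            valid = cell[cell > 0]  # Kinect reports 0 where depth is unknown
            if valid.size == 0:
                continue  # no data: leave the motor off
            d = np.clip(valid.min(), min_mm, max_mm)
            # Linear map: min_mm -> 1.0 (strongest), max_mm -> 0.0 (off).
            out[r, c] = (max_mm - d) / (max_mm - min_mm)
    return out
```

Taking the minimum depth per segment (rather than the mean) biases the vest toward warning about the nearest obstacle in each direction, which seems to match the project's safety-oriented intent; the real implementation may weight or filter segments differently.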

Over time, the user adapts to the haptic feedback and learns to interpret it as intuitively as natural vision.

Blind Navigation Video

See it in action by clicking the image above, or any of the following links: 1, 2, 3

This branch is a refactor-in-progress. Please check out "master" for a less buggy version.
