Allow input by gesture with any body part (hand, face, posture, etc.)
For some users who are blind, deaf, or hard of hearing, gesture-based text input methods are much easier and more natural to use. This feature may allow text input using sign language, handwriting, or swiping through letters (or swiping to form a pattern) on a virtual keyboard (onscreen or even in the air), among other methods.
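As a rough illustration of the swipe-keyboard approach, a decoder can match the sequence of keys a swipe passes over against a word list. The sketch below is a hypothetical, simplified example (the `decode_swipe` function and its word list are invented for illustration and do not come from any particular product): a word is a candidate if it starts and ends on the first and last key touched and its letters appear, in order, within the swipe path.

```python
def decode_swipe(path, dictionary):
    """Return dictionary words compatible with a swipe path.

    `path` is the sequence of letter keys the swipe passed over.
    A word matches if it shares the path's first and last key and
    its letters occur as a subsequence of the path.
    """
    def is_subsequence(word, keys):
        it = iter(keys)  # 'in' on an iterator consumes it, enforcing order
        return all(ch in it for ch in word)

    return [w for w in dictionary
            if w and w[0] == path[0] and w[-1] == path[-1]
            and is_subsequence(w, path)]

# Example: a swipe from 'h' that brushes past 'g' on its way to 'o'
print(decode_swipe("hgello", ["hello", "halo", "help"]))  # → ['hello']
```

Real swipe keyboards rank candidates with spatial models and language models rather than exact subsequence matching, but the same path-to-dictionary matching idea is at the core.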
Discussion by Disabilities
Gesture-based input that does not require accurate pointing can greatly simplify input for people who are blind or have low vision.
Sign language recognition allows users who are deaf or hard of hearing to input text more easily and naturally.
Please note that these products are not necessarily endorsed by RtF, but represent the range of available options.
Free, not necessarily open source
These products are free to use, but may restrict viewing or modifying their source code.
Related Research and Papers
- Interface for electronic devices providing improved access for people with disabilities - Gregg C. Vanderheiden, Christopher M. Law and David P. Kelso
- A Gesture Controlled User Interface for Inclusive Design and Evaluative Study of Its Usability - Moniruzzaman Bhuiyan, Rich Picking
- Gesture inputs for a portable display device - Hideyuki Hashimoto and Shigetoshi Kitayama
- Analysis of intentional head gestures to assist computer access by physically disabled people - W. S. Harwin and R. D. Jackson
- Interaction and recognition challenges in interpreting children's touch and gesture input on mobile devices - Lisa Anthony, Quincy Brown, Jaye Nias, Berthel Tate, Shreya Mohan
- Wearables and chairables: inclusive design of mobile input and output techniques for power wheelchair users - Patrick Carrington, Amy Hurst, Shaun K. Kane
- Access Interface Strategies - Susan Fager, David R. Beukelman, Melanie Fried-Oken, Tom Jakobs PE, John Baker
- Enabling a gesture-based numeric input on mobile phones - Jiho Choi, Kyohyun Song, Seongil Lee
- A Kinect-based system for physical rehabilitation: A pilot study for young adults with motor disabilities - Yao-Jen Chang, Shu-Fang Chen and Jun-Da Huang
- The gesture pendant: a self-illuminating, wearable, infrared computer vision system for home automation control and medical monitoring - T. Starner, J. Auxier, D. Ashbrook and M. Gandy
- Head gesture recognition for hands-free control of an intelligent wheelchair - Pei Jia, Huosheng H. Hu, Tao Lu, Kui Yuan
- Gesture-controlled user interfaces, what have we done and what's next? - Moniruzzaman Bhuiyan and Rich Picking
- User-defined gestures for surface computing - Jacob O. Wobbrock, Meredith Ringel Morris and Andrew D. Wilson
Related content in the DeveloperSpace
- What are Learning Disabilities?
- AsTeRICS WebACS Tutorial
- What is Deaf and Hard of Hearing?
- What is Cognitive Disability?
- What is Blindness?
- AsTeRICS Plugin Development - Step by Step
- AsTeRICS User Manual
- AsTeRICS Nexus Connector