Allow input by gesture with any body part(s) (hand, facial expression, posture, etc.)
For some users who are blind, deaf, or hard of hearing, gesture-based text input can be easier and more natural than conventional typing. This feature may allow text entry using sign language, handwriting, or swiping through letters (tracing a pattern) on a virtual keyboard, whether onscreen or even in the air, among other methods.
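To make the swipe/pattern idea concrete, here is a minimal sketch of template-based stroke recognition in Python, loosely following the resample-normalize-match approach popularized by Wobbrock et al.'s $1 Unistroke Recognizer. The letter templates, the point count N, and the omission of rotation invariance are illustrative assumptions, not part of any product listed below.

```python
import math

N = 32  # every stroke is resampled to this many points (illustrative choice)

def path_length(pts):
    """Total length of the polyline through pts."""
    return sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))

def resample(pts, n=N):
    """Resample a stroke to n evenly spaced points along its path."""
    pts = list(pts)
    interval = path_length(pts) / (n - 1)
    out, acc, i = [pts[0]], 0.0, 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if acc + d >= interval and d > 0:
            t = (interval - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # q becomes the start of the next segment
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:  # floating-point rounding can leave one point short
        out.append(pts[-1])
    return out[:n]

def normalize(pts):
    """Resample, translate to the centroid, and scale into a unit box,
    so position and size do not affect matching."""
    pts = resample(pts)
    cx = sum(x for x, _ in pts) / len(pts)
    cy = sum(y for _, y in pts) / len(pts)
    pts = [(x - cx, y - cy) for x, y in pts]
    span = max(max(x for x, _ in pts) - min(x for x, _ in pts),
               max(y for _, y in pts) - min(y for _, y in pts)) or 1.0
    return [(x / span, y / span) for x, y in pts]

def recognize(stroke, templates):
    """Return the (name, score) of the template whose normalized points
    are closest to the stroke by mean point-to-point distance."""
    cand = normalize(stroke)
    best, best_d = None, float("inf")
    for name, tmpl in templates.items():
        d = sum(math.dist(a, b) for a, b in zip(cand, normalize(tmpl))) / N
        if d < best_d:
            best, best_d = name, d
    return best, best_d

# Hypothetical letter templates: a plain downstroke and an L-shaped stroke.
templates = {"I": [(0, 0), (0, 10)],
             "L": [(0, 0), (0, 10), (6, 10)]}
print(recognize([(1, 0), (1, 9), (7, 10)], templates))  # matches "L"
```

A production recognizer would add rotation invariance, several templates per letter, and a rejection threshold for unrecognized strokes; the same resample-and-match idea extends to strokes captured by camera or motion sensor rather than touch.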
Discussion by Disabilities
Gesture-based input that does not depend on accurate pointing can greatly simplify input for people who are blind or have low vision.
Sign language recognition would allow users who are deaf or hard of hearing to input text more easily and naturally.
Existing Products
Please note that these products are not necessarily endorsed by RtF, but represent the range of available options.
Free, not necessarily open source
These products are free to use, but may restrict viewing and modifying their source code.
- Swype for Android – Nuance
- SpeeG (prototype) – AEGIS
Related Research and Papers
- Interface for electronic devices providing improved access for people with disabilities – Gregg C. Vanderheiden, Christopher M. Law and David P. Kelso
- A Gesture Controlled User Interface for Inclusive Design and Evaluative Study of Its Usability – Moniruzzaman Bhuiyan, Rich Picking
- Gesture inputs for a portable display device – Hideyuki Hashimoto and Shigetoshi Kitayama
- Analysis of intentional head gestures to assist computer access by physically disabled people – W. S. Harwin and R. D. Jackson
- Interaction and recognition challenges in interpreting children's touch and gesture input on mobile devices – Lisa Anthony, Quincy Brown, Jaye Nias, Berthel Tate, Shreya Mohan
- Wearables and chairables: inclusive design of mobile input and output techniques for power wheelchair users – Patrick Carrington, Amy Hurst, Shaun K. Kane
- Access Interface Strategies – Susan Fager, David R. Beukelman, Melanie Fried-Oken, Tom Jakobs PE, John Baker
- Enabling a gesture-based numeric input on mobile phones – Jiho Choi, Kyohyun Song, Seongil Lee
- A Kinect-based system for physical rehabilitation: A pilot study for young adults with motor disabilities – Yao-Jen Chang, Shu-Fang Chen and Jun-Da Huang
- The gesture pendant: a self-illuminating, wearable, infrared computer vision system for home automation control and medical monitoring – T. Starner, J. Auxier, D. Ashbrook and M. Gandy
- Head gesture recognition for hands-free control of an intelligent wheelchair – Pei Jia, Huosheng H. Hu, Tao Lu, Kui Yuan
- Gesture-controlled user interfaces, what have we done and what's next? – Moniruzzaman Bhuiyan and Rich Picking
- User-defined gestures for surface computing – Jacob O. Wobbrock, Meredith Ringel Morris and Andrew D. Wilson
Related content in the DeveloperSpace
- What are Learning Disabilities?
- AsTeRICS WebACS Tutorial
- What is Deaf and Hard of Hearing?
- What is Cognitive Disability?
- What is Blindness?
- Arduino-Head-Mouse-Project
- AsTeRICS Plugin Development - Step by Step
- AsTeRICS User Manual
- OpenFaceIOS
- AsTeRICS Nexus Connector
- wedjat
- blink_based_aural_scanning_keyboard_with_Morse_code_option
- touchegg
- openbr