Allow control by gesture
This feature will allow users to control a system by gesture with any body part (hand, face, posture, etc.). By recognizing gestures that originate from any bodily motion or state, the feature enables a user to interact with the system naturally, without manipulating any mechanical device.
Discussion by Disability
Blind users may find control by gesture easier than conventional input because it does not require accurate pointing or locating targets on screen.
Control by gesture can simplify complex actions and make interactions easier for people with cognitive impairments.
People with physical limitations may not be able to use a conventional mouse, keyboard, joystick, or other mechanical control. Gesture control will allow them to interact using any body part they can move.
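The mapping described above — recognizing a bodily motion and translating it into a command, with no mechanical device in the loop — can be illustrated with a minimal sketch. The function below classifies a directional "swipe" from a trail of tracked (x, y) positions of any body part (hand, head, etc.) and maps it to an abstract command. The function names, thresholds, and command table are illustrative assumptions, not part of any product listed on this page; a real system would take its input from a camera-based tracker.

```python
def classify_swipe(points, min_distance=0.2):
    """Classify a point trail as 'left', 'right', 'up', 'down', or None.

    points: list of (x, y) tuples in normalized [0, 1] coordinates,
    e.g. positions of a tracked hand or head over time.
    min_distance: illustrative threshold; motion smaller than this
    is treated as noise rather than an intentional gesture.
    """
    if len(points) < 2:
        return None
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if max(abs(dx), abs(dy)) < min_distance:
        return None  # movement too small to count as a gesture
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"  # screen y grows downward


# Hypothetical gesture-to-command table: recognized gestures stand in
# for the mouse/keyboard actions a user cannot perform.
COMMANDS = {"left": "previous item", "right": "next item",
            "up": "scroll up", "down": "scroll down"}

trail = [(0.1, 0.5), (0.3, 0.52), (0.6, 0.5), (0.8, 0.49)]
gesture = classify_swipe(trail)
print(gesture, "->", COMMANDS.get(gesture))  # right -> next item
```

The key design point, shared by the products below, is the separation between recognition (turning raw motion into a named gesture) and action (mapping that gesture to a command), so the same command set can be driven by whichever body part the user can control.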
Existing Products
This listing includes a range of products that support control by gesture, from camera-based pointing tools to gesture-driven text entry. Please note that these products are not necessarily endorsed by RtF, but represent the range of available options.
Commercial, with free trial
These products are free to try for a limited period of time or with limited functionality. They must be purchased for full functionality.
- FaceMOUSE – Claro
- Swype for Android – Nuance
- SpeeG (prototype) – AEGIS
Commercial, no free trial
These products must be purchased to be used, and did not offer free trials at the time of posting.
- Leap Motion Controller – LEAP
- PointGrab – PointGrab
Related Research and Papers
- A Kinect-based system for physical rehabilitation: A pilot study for young adults with motor disabilities – Yao-Jen Chang, Shu-Fang Chen, and Jun-Da Huang
- The gesture pendant: a self-illuminating, wearable, infrared computer vision system for home automation control and medical monitoring – T. Starner, J. Auxier, D. Ashbrook, and M. Gandy
- Head gesture recognition for hands-free control of an intelligent wheelchair – Pei Jia, Huosheng H. Hu, Tao Lu, and Kui Yuan
- Gesture-controlled user interfaces, what have we done and what's next? – Moniruzzaman Bhuiyan and Rich Picking
- User-defined gestures for surface computing – Jacob O. Wobbrock, Meredith Ringel Morris, and Andrew D. Wilson
- Interface for electronic devices providing improved access for people with disabilities – Gregg C. Vanderheiden, Christopher M. Law, and David P. Kelso
- A Gesture Controlled User Interface for Inclusive Design and Evaluative Study of Its Usability – Moniruzzaman Bhuiyan and Rich Picking
- Gesture inputs for a portable display device – Hideyuki Hashimoto and Shigetoshi Kitayama
- Analysis of intentional head gestures to assist computer access by physically disabled people – WS Harwin and RD Jackson
- Interaction and recognition challenges in interpreting children's touch and gesture input on mobile devices – Lisa Anthony, Quincy Brown, Jaye Nias, Berthel Tate, and Shreya Mohan
- Wearables and chairables: inclusive design of mobile input and output techniques for power wheelchair users – Patrick Carrington, Amy Hurst, and Shaun K. Kane
- Access Interface Strategies – Susan Fager, David R. Beukelman, Melanie Fried-Oken, Tom Jakobs PE, and John Baker
- Enabling a gesture-based numeric input on mobile phones – Jiho Choi, Kyohyun Song, and Seongil Lee