
ASLSpeak

by nelson-liu

:microphone: DubHacks 2015 project. Decode sign language using the Leap Motion, and speak it!

espeak

by nvaccess

A fork of the official eSpeak speech synthesizer project, specific to NVDA. Contains missing data.

owlcarousel2-a11ylayer

by rtrvrtg

Accessibility layer for Owl Carousel v2

ASL-LEX: A lexical database of American Sign Language

Naomi K. Caselli, Z. Sehyr, Ariel M. Cohen-Goldberg, K. Emmorey

A dataset of video recordings of American Sign Language signs is collected. The dataset was rated for various metrics such as iconicity (from non-signers), subjective frequency, video duration, grammatical class, and more. Multiple RGB cameras captured the signs as signers produced them, and ratings were collected via MTurk.
Tags: 
Sign language

VizWiz-Priv

D. Gurari, Q. Li, C. Lin, Y. Zhao, A. Guo, A. Stangl, Jeffrey P. Bigham

Images from the original VizWiz dataset are tagged for the presence of private information, and regions containing that information are labeled and masked. This dataset can help train algorithms that prevent disclosure of private information shared, accidentally or otherwise, by blind people. The images were taken by blind people and were labeled and masked using crowdsourcing.
Tags: 
blind
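
As an illustration of the masking step described above, here is a minimal sketch that zeroes out the pixels inside a labeled bounding box. It is not the dataset's actual tooling: the plain nested-list image and the (x0, y0, x1, y1) box format are assumptions chosen so the example runs without any imaging library.

```python
# A minimal sketch of region masking: pixels inside a labeled private
# region are overwritten with a fill value. The box format (x0, y0, x1, y1)
# is an assumption, not the dataset's actual annotation schema.

def mask_region(image, box, fill=0):
    x0, y0, x1, y1 = box
    for y in range(y0, y1):
        for x in range(x0, x1):
            image[y][x] = fill
    return image

img = [[255] * 8 for _ in range(8)]             # 8x8 all-white "image"
mask_region(img, (2, 2, 6, 6))                  # black out a private region
print(sum(v == 0 for row in img for v in row))  # 16 masked pixels
```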

A dataset of eye movements for the children with autism spectrum disorder

H. Duan, G. Zhai, X. Min, Z. Che, Y. Fang, X. Yang, J. Gutiérrez, P. Callet

This dataset is released to support analysis of the visual traits of children with ASD and the design of specialized visual attention models, promoting research in related fields, as well as the design of specialized models to identify individuals with ASD. A Tobii T120 eye tracker was used to display the images and record eye movements.

Web users with autism: eye tracking evidence for differences

S. Eraslan, V. Yaneva, Y. Yesilada, S. Harper

Eye tracking study with 18 participants with high-functioning autism and 18 neurotypical participants to investigate the similarities and differences between these two groups in terms of how they search for information within web pages. Participants were shown screenshots from webpages and were given some search tasks. Their eye movements were then tracked.
Tags: 
Eye tracking, Web usage
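
A hedged sketch of one metric studies like this commonly compare between groups: the fraction of gaze samples landing inside an area of interest (AOI), such as the page region containing a search target. The (x, y) sample format and the AOI rectangle below are illustrative assumptions, not the study's actual data schema.

```python
# A minimal sketch: share of gaze samples that fall inside an AOI rectangle.
# Sample format (x, y) and the AOI bounds are assumptions for illustration.

def aoi_dwell_fraction(samples, aoi):
    x0, y0, x1, y1 = aoi
    inside = sum(1 for x, y in samples if x0 <= x < x1 and y0 <= y < y1)
    return inside / len(samples) if samples else 0.0

gaze = [(100, 80), (410, 305), (420, 310), (900, 600)]
print(aoi_dwell_fraction(gaze, (400, 300, 500, 400)))  # 0.5
```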

English-ASL Gloss Parallel Corpus 2012: ASLG-PC12

A. Othman, M. Jemni

This work aims to build a large parallel corpus of written English texts and American Sign Language glosses, useful for developing an automatic translator. Experts in ASL contributed to the collection and correction of the bilingual corpus by applying grammatical dependency rules.
Tags: 
Sign language, American Sign Language, English

RWTH German Fingerspelling

P. Dreuw, T. Deselaers, D. Keysers, H. Ney

This work collects images of fingerspelling letters of German Sign Language for appearance-based gesture recognition. The database consists of 1400 image sequences that contain gestures of 20 different signers recorded by two different cameras, one webcam and one camcorder under non-uniform daylight lighting conditions.
Tags: 
Sign language, German sign language, German

RWTH-BOSTON-104

P. Dreuw, J. Forster, T. Deselaers, H. Ney

A dataset of video streams of American Sign Language sentences is collected. Multiple cameras record ASL signers, and the video is then manually annotated. 201 ASL sentences are signed by 3 signers to generate 201 annotated videos.
Tags: 
American Sign Language, Sign language, English

LSFB-CONT and LSFB-ISOL: Two New Datasets for Vision-Based Sign Language Recognition

J. Fink, B. Frénay, L. Meurant, A. Cleve

Two large-scale datasets are released for continuous and isolated sign language recognition, consisting of French Belgian Sign Language (LSFB) conversation videos. Since 2012, 90 hours of video conversations between native and non-native signers have been gathered with an RGB camera at 50 FPS in a studio with a controlled environment. 25 hours are fully annotated, and the process is ongoing.
Tags: 
Sign language, French Belgian Sign Language

MS-ASL American Sign Language Dataset

H. Joze, O. Koller

This dataset covers over 200 signers, signer-independent sets, and challenging, unconstrained recording conditions, with the aim of advancing the sign language recognition community. Publicly accessible ASL vocabulary videos were collected, and 222 distinct signers were identified using face recognition and clustering; signers do not overlap across the train, validation, and test sets.
Tags: 
Sign language, American Sign Language
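
The signer-independent aspect can be made concrete with a small sketch: once each clip carries a signer ID (obtained in the paper via face recognition and clustering), whole signers rather than individual clips are assigned to splits, so no signer appears in more than one set. The IDs, split ratios, and dictionary format below are illustrative assumptions, not the dataset's actual layout.

```python
# A minimal sketch of a signer-disjoint train/val/test split: signers, not
# clips, are partitioned, so every clip of a given signer lands in one set.

import random

def signer_disjoint_split(clips_by_signer, ratios=(0.8, 0.1, 0.1), seed=0):
    signers = sorted(clips_by_signer)
    random.Random(seed).shuffle(signers)
    n = len(signers)
    cut1 = int(ratios[0] * n)
    cut2 = cut1 + int(ratios[1] * n)
    groups = (signers[:cut1], signers[cut1:cut2], signers[cut2:])
    return [
        [clip for s in group for clip in clips_by_signer[s]]
        for group in groups
    ]

train, val, test = signer_disjoint_split(
    {f"signer_{i}": [f"clip_{i}_{j}" for j in range(3)] for i in range(10)}
)
print(len(train), len(val), len(test))  # 24 3 3 clips, signer-disjoint
```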

Australian Sign Language signs (High Quality) Data Set

M. Kadous

The study aims at recognition of Australian Sign Language (Auslan) using instrumented gloves. Samples from a single signer (a native Auslan signer) were collected over a period of nine weeks. In total, 27 samples per sign were collected, for a total of 2,565 signs. The average length of each sign was approximately 57 frames.
Tags: 
Sign language

RWTH-BOSTON-50

M. Zahedi, D. Keysers, T. Deselaers, H. Ney

A dataset of video streams of American Sign Language words is collected. Multiple cameras record ASL signers, and the video is then manually annotated. 50 sign language words are signed by 3 signers to generate 201 annotated videos.
Tags: 
Sign language, English, American Sign Language

ASL-100-RGBD: An Isolated-Signing RGBD Dataset of 100 American Sign Language Signs Produced by Fluent ASL Signers

S. Hassan, L. Berke, E. Vahdani, L. Jing, Y. Tian, M. Huenerfauth

An RGBD dataset of body movements and HD face data for American Sign Language is collected to aid the development of a sign-recognition system. A Kinect 2.0 RGBD camera captured the 100 signs produced by 22 fluent ASL signers, yielding a total collection of 42 videos; the recordings were separated into skeleton, face markings, and depth maps and annotated together.
Tags: 
Sign language, American Sign Language

National Center for Sign Language and Gesture Resources (NCSLGR) Corpus

C. Neidle

This project makes available several different types of experimental resources and analyzed data to facilitate linguistic and computational research on signed languages and the gestural components of spoken languages. ASL videos were collected and linguistically annotated.
Tags: 
Sign language

PSYKOSE: A Motor Activity Database of Patients with Schizophrenia

P. Jakobsen, E. Garcia-Ceja, L. Stabell, K. Oedegaard, J. Berle, V. Thambawita, S. Hicks, P. Halvorsen, O. Fasmer, M. Riegler

This dataset contains motor activity data collected from body sensors to explore whether machine-learning-based analysis can support the diagnosis of schizophrenia. Motor activity data were collected with a wrist-worn actigraph device, which records the integration of the intensity, amount, and duration of movement in all directions.
Tags: 
Schizophrenia, Mental health
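
A minimal sketch of the kind of summary features commonly derived from actigraph count series for machine-learning analysis, such as the mean, standard deviation, and proportion of zero-activity epochs. The per-minute counts below are invented for illustration and are not taken from the dataset.

```python
# A minimal sketch: summary features over a series of per-minute activity
# counts. Feature choice follows common actigraphy practice, not
# necessarily the exact feature set used by the PSYKOSE authors.

import statistics

def activity_summary(counts):
    return {
        "mean": statistics.mean(counts),
        "sd": statistics.stdev(counts),
        "zero_fraction": sum(c == 0 for c in counts) / len(counts),
    }

minute_counts = [0, 0, 12, 340, 125, 0, 87, 15, 0, 203]  # toy data
print(activity_summary(minute_counts))
```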

Tappy Keystroke Data

Warwick R. Adams

A keystroke logging app, Tappy, is used to build a dataset for detecting changes in the characteristics of Parkinson's disease. The app was installed on participants' personal computers and recorded key press and release timings as the participants worked on their computers at home.
Tags: 
Parkinsons, Finger movement
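
A minimal sketch of how press/release timings like these can be turned into a simple feature: key hold time (release minus press), whose average is one characteristic examined in keystroke analyses of Parkinson's disease. The (key, press_ms, release_ms) record format is an assumption, not Tappy's actual log format.

```python
# A minimal sketch, not the Tappy authors' code: derive key hold times
# (release minus press) from per-keystroke timing records.

def hold_times(events):
    """Hold time in ms for each (key, press_ms, release_ms) record."""
    return [release - press for _key, press, release in events]

def mean(values):
    return sum(values) / len(values) if values else 0.0

# Example: three keystrokes with press/release times in milliseconds.
events = [("a", 0, 95), ("b", 180, 270), ("c", 400, 520)]
print(mean(hold_times(events)))  # average hold time: (95 + 90 + 120) / 3
```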

Gait Dynamics in Neurodegenerative Disease Database

Jeffrey M. Hausdorff

A dataset of gait dynamics is collected to quantify the effect of neurodegenerative diseases on mobility. Patients and control participants wore force-resistive sensors as they walked along a predefined path; features for each stride are computed and provided.
Tags: 
Parkinsons, Huntingtons
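
A hedged sketch of how per-stride features can be derived from a force-sensor series: heel strikes are detected as rising threshold crossings, and stride time is the interval between successive strikes. The sample rate, threshold, and toy signal below are illustrative, not values from the database.

```python
# A minimal sketch: stride-to-stride intervals from a 1-D force signal
# sampled at a fixed rate. A "heel strike" is approximated as the sample
# where the signal first rises above a threshold.

def stride_intervals(force, fs_hz=300.0, threshold=0.5):
    strikes = [
        i for i in range(1, len(force))
        if force[i - 1] < threshold <= force[i]  # rising edge = heel strike
    ]
    return [(b - a) / fs_hz for a, b in zip(strikes, strikes[1:])]

# Example: a toy signal with three "heel strikes".
signal = ([0.0] * 10 + [1.0] * 5 + [0.0] * 300 + [1.0] * 5
          + [0.0] * 280 + [1.0] * 5)
print(stride_intervals(signal))  # intervals in seconds between strikes
```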

RWTH-PHOENIX-Weather: A Large Vocabulary Sign Language Recognition and Translation Corpus

J. Forster, C. Schmidt, T. Hoyoux, O. Koller, U. Zelle, J. Piater, H. Ney

German Sign Language transcriptions of weather announcements are collected. 190 weather forecasts, each about one and a half minutes long, are recorded and manually annotated using glosses that distinguish sign variants. Time boundaries have been marked at the sentence and gloss levels.
Tags: 
Sign language, German, German sign language