
Attention-based Autism Spectrum Disorder Screening with Privileged Modality

S. Chen, Q. Zhao

Behavioral data from individuals with autism spectrum disorder and healthy controls in photo-taking and image-viewing tasks were collected to build a classifier for automatic and quantitative screening of ASD. Participants were asked to take photos that reveal their attentional preferences, which were used to train a model. The model also incorporated temporal information from eye-tracking data collected while participants viewed 700 images.
Tags: 
Neurodevelopmental Disorders, Autism Spectrum Disorders

Eyelid Gestures on Mobile Devices for People with Motor Impairments

M. Fan, Z. Li, F. Li

Data were collected to design, detect, and evaluate a set of eyelid gestures for people with motor impairments on mobile devices. A Samsung S7 Android phone running Android OS 8.0 was used as the testing device to collect eyelid states and evaluate the eyelid gesture recognition app in real time.
Tags: 
Motor impairment
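
As a minimal illustrative sketch of the eyelid-gesture entry above (the state labels, gesture vocabulary, and frame threshold are assumptions, not the paper's recognizer), per-frame eyelid states can be run-length collapsed and matched against gesture templates:

```python
# Illustrative sketch: collapse per-frame eyelid states into a sequence and
# match it against a small gesture vocabulary. States and gestures are
# hypothetical examples, not the gesture set defined in the paper.
from itertools import groupby

GESTURES = {
    "double-blink": ["open", "closed", "open", "closed", "open"],
    "wink-left":    ["open", "left-closed", "open"],
    "long-close":   ["open", "closed-long", "open"],
}

def collapse(frame_states, long_close_frames=15):
    """Run-length collapse per-frame states, relabelling long 'closed' runs."""
    seq = []
    for state, run in groupby(frame_states):
        if state == "closed" and len(list(run)) >= long_close_frames:
            state = "closed-long"
        seq.append(state)
    return seq

def recognize(frame_states):
    seq = collapse(frame_states)
    for name, template in GESTURES.items():
        if seq == template:
            return name
    return None

# Two short blinks separated by open frames -> "double-blink".
frames = ["open"] * 5 + ["closed"] * 4 + ["open"] * 5 + ["closed"] * 4 + ["open"] * 5
print(recognize(frames))
```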

ATIS sign language corpus

J. Bungeroth, D. Stein, P. Dreuw, H. Ney, S. Morrissey, A. Way, L. Zijl

This work presents a sign language corpus sourced from the Air Travel Information System (ATIS) dataset, which aims to support machine translation and automatic sign language recognition. 595 sentences were chosen from the ATIS dataset as a base and translated into Irish Sign Language (ISL), German Sign Language (DGS) and South African Sign Language (SASL).
Tags: 
Sign language, Irish Sign Language, German sign language, South African Sign Language

PDMove

H. Zhang, C. Xu, H. Li, A. Rathore, C. Song, Z. Yan, D. Li, F. Lin, K. Wang, W. Xu

PDMove, a smartphone-based passive sensing system, compares different gait patterns before and after medication intake to facilitate medication-adherence monitoring. PDMove passively collects gait data in a nonclinical daily-life environment using the built-in inertial sensors of a smartphone; a gait preprocessor then extracts gait cycles containing Parkinsonism-related biomarkers (a minimal sketch of this step follows this entry).
Tags: 
Parkinson's disease
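
A minimal sketch of the gait-cycle segmentation step mentioned above, assuming a single vertical accelerometer channel and peak-based step detection with SciPy; the filter band and peak parameters are illustrative, not PDMove's actual preprocessor.

```python
# Illustrative gait-cycle segmentation from a vertical accelerometer channel;
# the band-pass range and peak parameters are assumptions, not PDMove's values.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def extract_gait_cycles(acc_z, fs=100.0):
    """acc_z: vertical acceleration (m/s^2); fs: sampling rate (Hz)."""
    b, a = butter(2, [0.5, 3.0], btype="band", fs=fs)   # around walking cadence
    filtered = filtfilt(b, a, acc_z)
    # Treat prominent peaks as heel strikes, at least 0.4 s apart.
    peaks, _ = find_peaks(filtered, distance=int(0.4 * fs), prominence=0.5)
    # Each pair of consecutive peaks bounds one candidate gait cycle.
    return [acc_z[s:e] for s, e in zip(peaks[:-1], peaks[1:])]

# Synthetic walking-like signal for illustration.
fs = 100.0
t = np.arange(0, 10, 1 / fs)
acc = 9.81 + 1.5 * np.sin(2 * np.pi * 1.8 * t) + 0.1 * np.random.randn(t.size)
print(len(extract_gait_cycles(acc, fs)), "candidate gait cycles")
```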

PDVocal

H. Zhang, C. Song, A. Wang

A dataset for passive sensing of body sounds was collected to train a privacy-preserving neural network for detecting Parkinson's disease. Voice samples containing body and speech sounds were collected via a smartphone app; body sounds were recorded first, followed by a sustained vowel sound of the letter A.
Tags: 
Parkinson's disease

Visualizing Gaze Direction to Support Video Coding of Social Attention for Children with Autism Spectrum Disorder

K. Higuchi, S. Matsuda

Recordings of therapeutic activities for assessment of autism spectrum disorder (ASD) were collected to introduce a computer-vision-assisted interface for video-based evaluation of social attention. Expert therapists performed professional therapeutic activities with children with ASD and with typical development, comprising multiple tasks designed to observe the development of social interaction.
Tags: 
Autism Spectrum Disorders, Social Attention

Fluent: An AI Augmented Writing Tool for People who Stutter

B. Ghai, K. Mueller

This work aims to build an AI-augmented writing tool that identifies words an individual who stutters might struggle to pronounce and suggests alternative words so they can speak more fluently. The classifier distinguishing easy from difficult words is refined from the initial model based on user feedback collected via the interface (an online-update sketch follows this entry). Classifier performance is measured using different user profiles with varying degrees of stuttering, created from self-reports in online stuttering communities and the personal experience of a person who stutters.
Tags: 
Stuttering, Stammering
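
A hedged sketch of the refinement idea described in the Fluent entry: an initial easy-vs-difficult word model incrementally updated with user feedback. The features (length, syllable count, leading plosive), toy labels, and use of scikit-learn's SGDClassifier with partial_fit are assumptions for illustration, not Fluent's implementation.

```python
# Hypothetical sketch: word-difficulty classifier refined online from feedback.
import re
import numpy as np
from sklearn.linear_model import SGDClassifier

PLOSIVES = set("pbtdkg")

def word_features(word):
    w = word.lower()
    syllables = max(1, len(re.findall(r"[aeiouy]+", w)))   # crude syllable count
    return [len(w), syllables, 1.0 if w[0] in PLOSIVES else 0.0]

# Toy seed data: 1 = likely difficult for this user, 0 = easy (illustrative labels).
words = ["cat", "particularly", "sun", "basketball", "tea", "population"]
labels = [0, 1, 0, 1, 0, 1]
clf = SGDClassifier(random_state=0)
clf.partial_fit(np.array([word_features(w) for w in words]), labels, classes=[0, 1])

# Later, the user flags "banana" as hard; fold that feedback into the model.
clf.partial_fit(np.array([word_features("banana")]), [1])
print(clf.predict(np.array([word_features("butterfly")])))
```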

CopyCat: An American Sign Language Game for Deaf Children

Z. Zafrulla, H. Brashear, H. Hamilton, T. Starner

A dataset of ASL phrases was collected from five deaf children playing an educational game called CopyCat. Each child, wearing colored gloves with 3-axis accelerometers attached and seated facing a camera, was asked to sign the appropriate phrase for scenarios presented in the game.
Tags: 
Sign language, American Sign Language

KETI (Korea Electronics Technology Institute) sign language dataset

S. Ko, C. Kim, H. Jung, C. Cho

The KETI sign language dataset is constructed to develop a neural network model for Korean sign language translation. The dataset consists of 14,672 full high-definition videos capturing 524 different signs, potentially used in various emergency situations, recorded at 30 frames per second from two camera angles: front and side.
Tags: 
Sign language, Korean Sign Language

Analysis of fundamental frequency, jitter, shimmer and vocal intensity in children with phonological disorders

H. F. Wertzner, S. Schreiber, L. Amaro

Phonology and speech-language tests were conducted with children with phonological disorders and with children without speech or language impairments, to analyze vocal characteristics that support diagnosis of the disorder. All tests were recorded on digital audio tape and digital video, and acoustic analyses of the vowels were performed using the Computer Speech Lab.
Tags: 
Phonological Disorder, Language development, Language Development Disorders
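
The study itself used the Computer Speech Lab for these measurements; the sketch below computes the same kinds of measures (mean fundamental frequency, local jitter, local shimmer, intensity) with the open-source parselmouth/Praat library instead, with an assumed input file name and standard Praat parameter values.

```python
# Sketch using parselmouth (Praat) rather than the Computer Speech Lab used
# in the study; "vowel_a.wav" and the pitch range are illustrative assumptions.
import parselmouth
from parselmouth.praat import call

snd = parselmouth.Sound("vowel_a.wav")
pitch = call(snd, "To Pitch", 0.0, 75, 600)            # time step, floor, ceiling (Hz)
mean_f0 = call(pitch, "Get mean", 0, 0, "Hertz")
point_process = call(snd, "To PointProcess (periodic, cc)", 75, 600)
jitter_local = call(point_process, "Get jitter (local)", 0, 0, 0.0001, 0.02, 1.3)
shimmer_local = call([snd, point_process], "Get shimmer (local)",
                     0, 0, 0.0001, 0.02, 1.3, 1.6)
intensity_db = call(snd, "Get intensity (dB)")

print(f"F0={mean_f0:.1f} Hz, jitter={jitter_local:.4f}, "
      f"shimmer={shimmer_local:.4f}, intensity={intensity_db:.1f} dB")
```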

Predicting risk of dyslexia with an online gamified test

L. Rello, R. Baeza-Yates, A. Ali, J. P. Bigham, M. Serra

People with and without dyslexia played an online game while interaction data was collected and analyzed to find patterns that could predict dyslexia. For every game played, right and wrong answers, number of clicks, overall accuracy and other metrics were recorded (a toy classifier sketch follows this entry).
Tags: 
dyslexia
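
As a hedged illustration only (made-up numbers, not the study's data or model), per-player interaction metrics like those above can be assembled into a feature matrix and fed to a simple classifier:

```python
# Toy illustration: per-player interaction metrics as features for a classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Columns: overall accuracy, number of clicks, mean answer time (s), wrong answers.
X = np.array([
    [0.92, 140, 2.1, 3],
    [0.61, 210, 4.8, 17],
    [0.88, 150, 2.4, 5],
    [0.58, 230, 5.1, 20],
    [0.90, 135, 2.0, 4],
    [0.55, 240, 5.6, 22],
])
y = np.array([0, 1, 0, 1, 0, 1])   # 1 = flagged as at risk of dyslexia (toy labels)

clf = LogisticRegression(max_iter=1000)
print("mean CV accuracy:", cross_val_score(clf, X, y, cv=3).mean())
```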

Predicting Reading Difficulty for Readers with Autism Spectrum Disorder

V. Yaneva, R. Evans, I. Temnikova

This work aims to create a machine-learning readability model for English, developed specifically for the needs of adults with autism, and to evaluate it on the ASD corpus. Participants were asked to read a small paragraph and were later asked 3 multiple-choice questions about it.
Tags: 
Autism Spectrum Disorders, Reading
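
For illustration, a few surface readability features of the kind such models commonly build on; the paper's actual model uses a richer, ASD-specific feature set, and the formula below is simply the standard Flesch reading-ease score.

```python
# Illustrative surface readability features; not the paper's feature set.
import re

def readability_features(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower()))) for w in words)
    asl = len(words) / max(1, len(sentences))        # average sentence length
    asw = syllables / max(1, len(words))             # average syllables per word
    flesch = 206.835 - 1.015 * asl - 84.6 * asw      # Flesch reading ease
    return {"avg_sentence_len": asl, "avg_syllables_per_word": asw, "flesch": flesch}

print(readability_features("The cat sat on the mat. It was a sunny day."))
```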

Detection of Dyslexia using Eye Tracking Measures

M. Modak, K. Ghotane, S. V, N. Kelkar, A. Iyer, P. G

The study explores whether eye tracking techniques combined with machine learning can be used to develop screening models for identifying children with dyslexia. Data were recorded by presenting a set of 3 reading modules to each candidate and tracking their eye movements throughout the activity to detect a learning disability (a fixation-detection sketch follows this entry).
Tags: 
dyslexia, Learning Disability
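
A minimal sketch of how raw gaze samples could be turned into reading measures via dispersion-based fixation detection (I-DT); the algorithm choice, thresholds, and synthetic data are assumptions, not the study's pipeline.

```python
# Illustrative dispersion-based (I-DT) fixation detection over raw gaze samples.
import numpy as np

def idt_fixations(x, y, t, max_dispersion=30.0, min_duration=0.1):
    """x, y in pixels, t in seconds; returns a list of (start, end) fixation times."""
    fixations, i, n = [], 0, len(t)
    while i < n:
        j = i
        while j < n and (t[j] - t[i]) < min_duration:
            j += 1
        if j >= n:
            break
        window = slice(i, j + 1)
        disp = (x[window].max() - x[window].min()) + (y[window].max() - y[window].min())
        if disp <= max_dispersion:
            while j + 1 < n:                      # grow while dispersion stays small
                w = slice(i, j + 2)
                if (x[w].max() - x[w].min()) + (y[w].max() - y[w].min()) > max_dispersion:
                    break
                j += 1
            fixations.append((t[i], t[j]))
            i = j + 1
        else:
            i += 1
    return fixations

# Synthetic 60 Hz gaze: one fixation, a saccade, then a second fixation.
np.random.seed(0)
t = np.arange(0, 2, 1 / 60)
x = np.where(t < 1, 100.0, 400.0) + 2.0 * np.random.randn(t.size)
y = np.full(t.size, 200.0) + 2.0 * np.random.randn(t.size)
fix = idt_fixations(x, y, t)
durations = [e - s for s, e in fix]
print(len(fix), "fixations; mean duration", round(float(np.mean(durations)), 3), "s")
```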

American sign language recognition with the kinect

Z. Zafrulla

A dataset of 3D video streams and hand tracking of ASL signers was collected and annotated. A predefined set of phrases was signed in ASL while a Kinect sensor collected 3D data, alongside accelerometers and a marker-based hand tracking system.
Tags: 
Sign language, American Sign Language, English

Prosodic analysis of neutral, stress-modified and rhymed speech in patients with Parkinson's disease

Z. Galaz, J. Mekyska, Z. Mzourek, Z. Smekal, I. Rektorova, I. Eliasova, M. Kostalova, M. Mrackova, D. Berankova

Speech recordings were collected for quantitative prosodic analysis of neutral, stress-modified and rhymed speech in patients with PD. Participant speech was recorded through a microphone and sampled with an M-AUDIO Fast Track Pro audio interface.
Tags: 
Parkinson's disease

Detection of Nasalized Voiced Stops in Cleft Palate Speech Using Epoch-Synchronous Features

C. M. Vikram, N. Adiga, S. R. Mahadeva Prasanna

Speech data were collected to create an algorithm for the detection of nasalized voiced stops in cleft palate (CP) speech using epoch-synchronous features. Participants were asked to repeat target words spoken by the instructor and were recorded using a Bruel & Kjaer sound level meter in a sound-treated room.
Tags: 
Cleft Lip and Palate, Cleft Palate

A Smartphone-Based Tool for Assessing Parkinsonian Hand Tremor

N. Kostikis

Smartphone sensors are used to classify tremor in Parkinson's patients, correlated with the UPDRS scale. Participants wore smartphone-equipped gloves, data during tremor episodes were collected, and features such as acceleration were computed afterwards (a simple feature sketch follows this entry).
Tags: 
Parkinson's disease
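
A hedged sketch of simple tremor descriptors that can be derived from smartphone accelerometer recordings (signal RMS after DC removal and the dominant frequency in the typical 3-7 Hz Parkinsonian tremor band); the feature choices and synthetic data are illustrative, not the paper's feature set.

```python
# Illustrative tremor descriptors from a smartphone accelerometer; the band
# limits and synthetic example are assumptions, not the paper's features.
import numpy as np

def tremor_features(acc, fs=100.0):
    """acc: (N, 3) accelerometer samples in m/s^2; fs: sampling rate (Hz)."""
    mag = np.linalg.norm(acc, axis=1)
    mag = mag - mag.mean()                      # crude gravity/DC removal
    rms = np.sqrt(np.mean(mag ** 2))
    spectrum = np.abs(np.fft.rfft(mag))
    freqs = np.fft.rfftfreq(mag.size, 1 / fs)
    band = (freqs >= 3.0) & (freqs <= 7.0)      # typical PD rest-tremor band
    dominant = float(freqs[band][np.argmax(spectrum[band])])
    return {"rms": float(rms), "dominant_freq_hz": dominant}

# Synthetic 5 Hz tremor-like signal along the gravity axis, for illustration.
np.random.seed(0)
t = np.arange(0, 10, 1 / 100.0)
acc = np.stack([0.05 * np.random.randn(t.size),
                0.05 * np.random.randn(t.size),
                9.81 + 0.3 * np.sin(2 * np.pi * 5 * t) + 0.05 * np.random.randn(t.size)],
               axis=1)
print(tremor_features(acc))
```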

Erlangen-CLP: A Large Annotated Corpus of Speech from Children with Cleft Lip and Palate

T. Bocklet, A. Maier, K. Riedhammer, U. Eysholdt, E. Noth

A dataset collected to understand which speech processes occur in children with cleft lip and palate. Speech was recorded during recitation of the semi-standardized PLAKSS test, which consists of words covering all German phonemes in different positions.
Tags: 
Cleft Lip and Palate

The TYPALOC Corpus: A Collection of Various Dysarthric Speech Recordings in Read and Spontaneous Styles

C. Meunier, C. Fougeron, C. Fredouille, B. Bigi, L. Crevier-Buchman, E. Delais-Roussarie, L. Georgeton, A. Ghio, I. Laaridh, T. Legou, C. Pillot-Loiseau, G. Pouchoulin

Dysarthric speakers pronounced various sentences, and a corpus was collected to understand phonetic variation between them and healthy controls (without dysarthria). Audio was collected, transcribed and automatically aligned with the phonemes pronounced.
Tags: 
Parkinson's disease, Dysarthria, Amyotrophic Lateral Sclerosis, Cerebellar Alteration

PSL 101

M. Oszust, M. Wysocki

A video dataset was collected to improve Polish sign language recognition. Two signers performed signs for 101 words and 35 sentences, each repeated 20 times, recorded with a video camera.
Tags: 
Sign language, Polish sign language