
Low Vision

Here you will find a number of other interesting summary documents that do not deal with the DeveloperSpace topic (access to ICT), but provide a broader view of other aspects of disability.

Turn Right: Analysis of Rotation Errors in Turn-by-Turn Navigation for Individuals with Visual Impairments

D. Ahmetovic, U. Oh, S. Mascetti, C. Asakawa

This work studies rotation errors and their effect on turn-by-turn guidance for individuals with visual impairments, in order to inform the design of navigation assistance in real-world scenarios. A dataset of indoor trajectories was collected from 11 blind participants using NavCog, a turn-by-turn smartphone navigation assistant (see the rotation-error sketch below).
Tags: 
navigation
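
The rotation error at a turn can be expressed as the signed angular difference between the heading the user ends up facing and the bearing of the next path segment. The sketch below illustrates only that computation; the function name, sign convention, and example values are illustrative and do not come from the NavCog dataset.

```python
import math

def rotation_error(heading_deg: float, target_bearing_deg: float) -> float:
    """Signed angular difference (degrees) between the heading the user faces
    after a turn and the bearing of the next path segment, wrapped to
    (-180, 180]. Positive means over-rotation, negative under-rotation
    (the sign convention is illustrative)."""
    diff = (heading_deg - target_bearing_deg) % 360.0
    if diff > 180.0:
        diff -= 360.0
    return diff

# Example: the route requires facing 90 degrees (east).
print(rotation_error(110.0, 90.0))   # 20.0  -> turned 20 degrees too far
print(rotation_error(350.0, 10.0))   # -20.0 -> stopped 20 degrees short
```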

Easy Return: An App for Indoor Backtracking Assistance

G. Flores, R. Manduchi

A mobile app that lets people with visual impairments retrace their steps when navigating indoors. Participants walked simulated indoor routes of varying lengths while wearing different inertial sensors and an Apple Watch (see the dead-reckoning sketch below).
Tags: 
Indoor Navigation, Backtracking Assistance
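
Backtracking from wearable inertial data can be pictured as pedestrian dead reckoning: accumulate step events and headings into a path, then walk that path in reverse. The sketch below is a minimal illustration of that idea under an assumed fixed step length and input format; it is not the Easy Return algorithm.

```python
import math

def dead_reckon(step_headings_deg, step_length_m=0.7):
    """Turn a sequence of per-step headings (degrees) into 2D positions.
    A constant step length is assumed for simplicity."""
    x, y = 0.0, 0.0
    path = [(x, y)]
    for heading in step_headings_deg:
        rad = math.radians(heading)
        x += step_length_m * math.sin(rad)   # east component
        y += step_length_m * math.cos(rad)   # north component
        path.append((x, y))
    return path

def backtrack_headings(step_headings_deg):
    """Headings for retracing the recorded path: last step first,
    each rotated by 180 degrees."""
    return [(h + 180.0) % 360.0 for h in reversed(step_headings_deg)]

recorded = [0.0, 0.0, 90.0, 90.0]        # two steps north, then two steps east
print(dead_reckon(recorded)[-1])          # roughly (1.4, 1.4)
print(backtrack_headings(recorded))       # [270.0, 270.0, 180.0, 180.0]
```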

WebinSitu: a comparative analysis of blind and sighted browsing behavior

Jeffrey P. Bigham, Anna C. Cavender, Jeremy T. Brudvik, Jacob O. Wobbrock, Richard E. Ladner

This work analyzes and compares the browsing behavior of blind and sighted participants. An HTTP proxy monitored and recorded all of the participants' interactions with webpages over the course of a week (see the logging-proxy sketch below).
Tags: 
blindness
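
To make the instrumentation concrete, the sketch below shows a minimal logging HTTP forward proxy in Python. It handles plain-HTTP GET requests only, and the log format and file name are assumptions; it is not the proxy used in the study, which would also need HTTPS (CONNECT) handling and richer event capture.

```python
"""Minimal logging HTTP forward proxy (plain-HTTP GET only)."""
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

LOG_FILE = "browsing_log.tsv"   # assumed output path

class LoggingProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        # When a browser is configured to use this proxy, self.path holds the
        # absolute URL the user requested.
        with open(LOG_FILE, "a", encoding="utf-8") as log:
            log.write(f"{time.time()}\t{self.client_address[0]}\t{self.path}\n")
        try:
            with urllib.request.urlopen(self.path, timeout=10) as upstream:
                body = upstream.read()
                self.send_response(upstream.status)
                self.send_header("Content-Type",
                                 upstream.headers.get("Content-Type", "application/octet-stream"))
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
        except Exception:
            self.send_error(502)

if __name__ == "__main__":
    ThreadingHTTPServer(("127.0.0.1", 8080), LoggingProxy).serve_forever()
```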

iMove dataset

H. Kacorri, S. Mascetti, A. Gerino, D. Ahmetovic, H. Takagi, C. Asakawa

This work collects mobile app usage data to understand how people with visual impairments use VoiceOver and other settings. Participants with visual impairments ran software on a mobile device that logged their interactions (see the event-logging sketch below).
Tags: 
Visual impairment
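
As a rough illustration of interaction logging, the sketch below appends one JSON object per UI event to a log file. The event names and fields are invented for illustration and do not reproduce the iMove logging format.

```python
import json
import time

def log_event(log_path, event_type, **details):
    """Append one interaction event as a JSON object per line."""
    record = {"timestamp": time.time(), "event": event_type, **details}
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

# Invented example events -- not the iMove schema.
log_event("interaction_log.jsonl", "screen_opened", screen="map")
log_event("interaction_log.jsonl", "voiceover_gesture", gesture="swipe_right")
```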

Low Vision Stroke-Gesture Dataset

R. Vatavu, B. Gheran, M. Schipor

A dataset of stroke gestures collected to understand how people with low vision articulate gestures. The stroke gestures were recorded on a touchscreen tablet (see the resampling sketch below).
Tags: 
Low vision, stroke
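
Two standard measures used when analyzing stroke gestures are a stroke's path length and its resampling to a fixed number of equidistant points (as in the $1 recognizer family). The sketch below implements both under the assumption that a stroke is a list of (x, y) points; it is not the analysis pipeline used for this dataset.

```python
import math

def path_length(points):
    """Total Euclidean length of a stroke given as (x, y) tuples."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def resample(points, n=64):
    """Resample a stroke to n points spaced evenly along its path."""
    interval = path_length(points) / (n - 1)
    pts = list(points)
    resampled, acc = [pts[0]], 0.0
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and acc + d >= interval:
            t = (interval - acc) / d
            qx = pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0])
            qy = pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1])
            resampled.append((qx, qy))
            pts.insert(i, (qx, qy))   # continue measuring from the new point
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(resampled) < n:          # guard against floating-point rounding
        resampled.append(pts[-1])
    return resampled[:n]

stroke = [(0, 0), (10, 0), (10, 10)]   # an L-shaped stroke
print(path_length(stroke))              # 20.0
print(len(resample(stroke)))            # 64
```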

ORBIT: A Real-World Few-Shot Dataset for Teachable Object Recognition

D. Massiceti, L. Zintgraf, J. Bronskill, L. Theodorou, M. Harris, E. Cutrell, C. Morrison, K. Hofmann, S. Stumpf

This dataset is a collection of videos of objects recorded by people who are blind or have low vision on their mobile phones, intended to drive research on Teachable Object Recognisers (TORs) under few-shot, high-variation conditions. Collectors recorded and submitted videos to the ORBIT benchmark dataset via an accessible iOS app (see the episode-sampling sketch below).
Tags: 
blindness, blind, Low vision
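
The few-shot setting can be illustrated by sampling, per user, a handful of "support" frames to teach a recogniser and held-out "query" frames to test it. The sketch below does this under an assumed directory layout; it is not the official ORBIT data loader.

```python
import random
from pathlib import Path

def sample_episode(user_dir, k_support=5, k_query=10, seed=None):
    """For each object a user recorded, pick k_support frames to 'teach'
    the recogniser and k_query held-out frames to test it."""
    rng = random.Random(seed)
    support, query = [], []
    for obj_dir in sorted(Path(user_dir).iterdir()):
        if not obj_dir.is_dir():
            continue
        frames = sorted(obj_dir.glob("*.jpg"))
        rng.shuffle(frames)
        support += [(f, obj_dir.name) for f in frames[:k_support]]
        query += [(f, obj_dir.name) for f in frames[k_support:k_support + k_query]]
    return support, query

# Assumed layout: orbit_frames/<user>/<object>/<frame>.jpg
# support, query = sample_episode("orbit_frames/user_01", seed=0)
```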

Structured Analysis of the Retina

Michael H. Goldbaum

Retinal images are collected to support the diagnosis of various retinal pathologies. Images of healthy patients and of patients with a variety of eye conditions are included; arteries and veins are labeled, and blood vessel segmentation is also provided (see the baseline segmentation sketch below).
Tags: 
Diabetes, Ageing, Retinopathy
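
As a rough baseline for the vessel-segmentation task the dataset labels, the sketch below applies the Frangi vesselness filter from scikit-image to the green channel of a retinal image and thresholds the result with Otsu's method. The file names are placeholders, and this is not the method used to produce the STARE annotations.

```python
from skimage import filters, io

image = io.imread("retina.png")                          # placeholder file name
green = image[..., 1] / 255.0 if image.ndim == 3 else image / 255.0
vesselness = filters.frangi(green, black_ridges=True)    # vessels are darker than background
mask = vesselness > filters.threshold_otsu(vesselness)
io.imsave("vessel_mask.png", (mask * 255).astype("uint8"))
```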

Hands Holding Clues for Object Recognition in Teachable Machines

K. Lee, H. Kacorri

A dataset of handheld objects collected to build an object recognizer that could be used by people with visual impairments. The dataset contains photographs taken by people with and without visual impairments: a sighted and a blind individual each collected images of objects using a smartphone camera (see the teachable-recogniser sketch below).
Tags: 
blindness, Low vision, Visual impairment
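
A minimal "teachable" recogniser in the spirit of this work can be built from features of a pretrained CNN, with one centroid per object and nearest-centroid classification at test time. The sketch below uses torchvision's ResNet-18 for the features; the model choice, helper functions, and file names are illustrative assumptions, not the authors' pipeline.

```python
import torch
from PIL import Image
from torchvision.models import resnet18, ResNet18_Weights

weights = ResNet18_Weights.DEFAULT
model = resnet18(weights=weights).eval()
model.fc = torch.nn.Identity()          # keep 512-d features, drop the classifier
preprocess = weights.transforms()

@torch.no_grad()
def embed(image_path):
    """Embed one photo as a 512-d feature vector."""
    tensor = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    return model(tensor)[0]

def train_centroids(examples):
    """examples: dict mapping object label -> list of training image paths."""
    return {label: torch.stack([embed(p) for p in paths]).mean(0)
            for label, paths in examples.items()}

def predict(image_path, centroids):
    """Return the label whose centroid is closest to the query photo."""
    query = embed(image_path)
    return min(centroids, key=lambda label: torch.dist(query, centroids[label]).item())

# Hypothetical usage with placeholder file names:
# centroids = train_centroids({"keys": ["keys_1.jpg"], "mug": ["mug_1.jpg"]})
# print(predict("unknown_photo.jpg", centroids))
```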