Goal
As part of the Accessibility program at Microsoft, the goal was to understand the visual search patterns of users with severe speech and motor impairment and create eye-controlled interfaces
Role: User Experience Researcher & Designer
Duration: 7 weeks
Team: Junior Researchers, Stakeholders (Vidyasagar, Spastic Society of India & Microsoft Research)
Methods & Artifacts:
USER CENTERED DESIGN
IN-DEPTH INTERVIEWS
EYE TRACKING
FIELD STUDIES
A/B TESTING
STATISTICAL DATA ANALYSIS
ACCESSIBLE APPLICATIONS
The Problem
People with physical and motor disabilities have trouble interacting with computers the way the general public does, leaving them fatigued and disconnected from the digital space
Current accessibility systems do not adapt on-screen elements to user preferences, causing frustration and increased caregiver intervention as users navigate applications during daily activities
Users are unable to use existing input modalities such as switches and adaptive buttons due to their disabilities
The Approach
Conducted in-depth interviews to understand the pain points of caregivers and parents as they help individuals with disabilities interact with computers
Conducted eye tracking studies to investigate the interaction behaviors of users with severe speech and motor impairment
Created eye-controlled accessible edutainment and communication systems
Developed an AI algorithm to control an on-screen cursor using eye movements as an alternate input modality; a rough sketch of this idea is shown below
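The case study does not detail the cursor-control algorithm itself. As a rough illustration of the general idea, raw gaze samples can be smoothed and mapped to a cursor position, with dwell time used as the selection trigger. The sketch below is a minimal, hypothetical Python example: the class name, smoothing factor, and dwell thresholds are assumptions, not the algorithm actually used in the project.

```python
# Minimal, hypothetical sketch (not the project's actual algorithm): smooth raw
# gaze samples with an exponential moving average and treat a sustained dwell
# within a small radius as a selection, replacing a physical click.

import math
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class GazeCursor:
    alpha: float = 0.2           # smoothing factor (assumed value)
    dwell_radius: float = 40.0   # pixels the smoothed gaze may wander during a dwell
    dwell_time: float = 1.0      # seconds of dwell required to trigger a selection
    x: float = 0.0
    y: float = 0.0
    _anchor: Optional[Tuple[float, float]] = None
    _dwell_start: float = 0.0

    def update(self, gaze_x: float, gaze_y: float, t: float):
        """Consume one gaze sample at time t; return (cursor_x, cursor_y, selected)."""
        # Exponential moving average damps eye-tracker jitter.
        self.x = self.alpha * gaze_x + (1 - self.alpha) * self.x
        self.y = self.alpha * gaze_y + (1 - self.alpha) * self.y

        if self._anchor is None or math.dist((self.x, self.y), self._anchor) > self.dwell_radius:
            # Gaze moved to a new region: restart the dwell timer there.
            self._anchor = (self.x, self.y)
            self._dwell_start = t
            return self.x, self.y, False

        selected = (t - self._dwell_start) >= self.dwell_time
        if selected:
            self._dwell_start = t  # reset so one dwell yields one selection
        return self.x, self.y, selected
```

In practice the smoothed position would drive the on-screen cursor, and the dwell event would stand in for a mouse click when selecting interface elements.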
The Project
Conducted a series of eye tracking studies to understand the existing situation and current pain points of users; the data was supplemented with in-depth interviews with caregivers and parents
Performed quantitative and qualitative analysis of the study data to understand users' on-screen preferences and enable better interaction
Created eye-controlled applications, presented them to the stakeholders, and conducted user studies to assess the usage and effectiveness of the applications
The Outcome
Formulated 4 design guidelines that can be used to design any eye-controlled application
Created two versions of an eye-controlled word-construction game and conducted A/B testing to understand user preferences for screen elements (see the analysis sketch after this list)
Designed an eye-controlled assistive communication and chatting platform that was easy to use and reduced caregiver intervention.
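For the A/B comparison above, one straightforward analysis is to compare a per-trial performance metric (for example, time to construct a word) between the two game versions with a significance test. The snippet below is a hypothetical Python sketch; the metric, the placeholder numbers, and the choice of a Welch t-test are assumptions rather than the study's actual analysis.

```python
# Hypothetical A/B analysis sketch: compare word-construction times (seconds)
# between the two game versions with a Welch t-test. The numbers are
# placeholders, not data from the study.

from scipy import stats

version_a = [12.4, 10.8, 15.2, 11.9, 13.6]  # placeholder trial times, version A
version_b = [9.7, 11.1, 8.9, 10.4, 9.2]     # placeholder trial times, version B

t_stat, p_value = stats.ttest_ind(version_a, version_b, equal_var=False)
print(f"Welch t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```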
Publications