Research Projects


Man wearing a Google Glass with a sign language video

Improving Classroom Accessibility

How can we improve the classroom experience of deaf and hard-of-hearing students? This project’s goal is to investigate the effectiveness of eyewear computers to display ASL for managing multiple visual sources of information.

Screenshot of a website for requesting accessibility accommodations

Requesting Accessibility Services

RIT’s Department of Access Services enables students to request services for classroom accessibility. This project has redesigned the service-request website to improve the user experience.

Image of powerpoint slides for teaching accessibility

Effective Methods of Teaching Accessibility

This project examines the effectiveness of a variety of methods for teaching computing students about concepts related to computer accessibility for people with disabilities. This multi-year project will include longitudinal testing of students two years after the instruction to search for lasting impacts.


This project is joint work among Stephanie Ludi, Vicki Hanson, and Matt Huenerfauth.

Diagram of a white-cane user standing and holding a cane, with a smart-phone device strapped to their upper arm.

Developing an Objective Method to Facilitate the Situation Awareness of Blind Travelers

Current methods for evaluating the Orientation Assistive Technology (OAT) that aids blind travelers indoors rely on performance metrics. When enhancing such systems, evaluators conduct qualitative studies to learn where to focus their efforts.

Screenshot of an animation of a virtual human signer

Facial Expression for Animations of ASL

We are investigating techniques for producing linguistically accurate facial expressions for animations of American Sign Language; this would make these animations easier to understand and more effective at conveying information, thereby improving the accessibility of online information for people who are deaf.


This project is joint work with researchers at Boston University and Rutgers University.

An ASL signer wearing motion-capture equipment

Generating ASL Animation from Motion-Capture Data

This project is investigating techniques for making use of motion-capture data collected from native ASL signers to produce linguistically accurate animations of American Sign Language. In particular, this project is focused on the use of space for pronominal reference and verb inflection/agreement.

This project also supported a summer research internship program for ASL-signing high school students, and REU supplements from the NSF have supported research experiences for visiting undergraduate students.


Data & Corpora

The motion-capture corpus of American Sign Language collected during this project is available for non-commercial use by the research community.

Depth image of two people standing, as taken by a Kinect camera

Learning ASL through Real-Time Practice

We are investigating new video and motion-capture technologies to enable students learning American Sign Language (ASL) to practice their signing independently through a tool that provides feedback automatically.
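The project description above does not specify the feedback algorithm; as a rough, hypothetical sketch of one way such a tool might compare a learner's signing to a reference recording, the example below aligns two wrist trajectories using dynamic time warping (the joint coordinates are invented, standing in for depth-camera skeleton data):

```python
import math

def dtw_distance(seq_a, seq_b):
    """Dynamic-time-warping distance between two 3-D joint trajectories."""
    n, m = len(seq_a), len(seq_b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(seq_a[i - 1], seq_b[j - 1])  # Euclidean distance between frames
            cost[i][j] = d + min(cost[i - 1][j],      # skip a reference frame
                                 cost[i][j - 1],      # skip a student frame
                                 cost[i - 1][j - 1])  # match the two frames
    return cost[n][m]

# Hypothetical wrist positions (x, y, z) per frame; the student signs the
# same movement slightly more slowly, so the sequences differ in length.
reference = [(0.0, 0.0, 1.0), (0.1, 0.2, 1.0), (0.2, 0.4, 1.0)]
student   = [(0.0, 0.0, 1.0), (0.1, 0.1, 1.0), (0.1, 0.3, 1.0), (0.2, 0.4, 1.0)]

score = dtw_distance(reference, student)
print(f"alignment cost: {score:.2f}")  # lower cost = closer to the reference
```

A real system would of course track many joints, handle hand shape as well as movement, and learn thresholds for feedback rather than report a raw alignment cost.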

Comprehension Questions for a Text Readability Detection Test.

Predicting English Text Readability for Users

This project has investigated the use of computational linguistic technologies to identify whether textual information would meet the special needs of users with specific literacy impairments.

In research conducted prior to 2012, we investigated text-analysis tools for adults with intellectual disabilities. A state-of-the-art predictive model of readability was developed that was based on discourse, syntactic, semantic, and other linguistic features.
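The project's model drew on discourse, syntactic, and semantic features that are beyond a short sketch; as a simple point of contrast, the classic Flesch Reading Ease formula estimates readability from surface features alone (the helper functions here are illustrative, not the project's code):

```python
import re

def count_syllables(word):
    """Crude syllable estimate: count groups of consecutive vowels."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text):
    """Classic surface-level readability score (higher = easier to read)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))       # avg sentence length
            - 84.6 * (syllables / len(words)))            # avg word complexity

easy = "The cat sat. The dog ran."
hard = "Computational linguistic methodologies facilitate sophisticated readability estimation."
print(flesch_reading_ease(easy), flesch_reading_ease(hard))
```

Richer models like the one described above add features such as discourse coherence and syntactic depth precisely because surface formulas like this miss much of what makes text difficult for readers with specific literacy impairments.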

In current work, we are investigating technologies for a wider variety of users.

Screenshot of a video of a human signer and an animation of a virtual human.

Eye-Tracking to Predict User Performance

Computer users may benefit from user-interfaces that can predict whether the user is struggling with a task based on an analysis of the user's eye-movement behaviors. This project is investigating how to conduct precise experiments for measuring eye-tracking movements and user task performance; relationships between these variables can be examined using machine-learning techniques in order to produce predictive models for adaptive user-interfaces.
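As a toy illustration of the feature-then-classifier pipeline described above, the sketch below summarizes a fixation log into two common eye-movement features and applies a simple rule in place of a trained model (the fixation data, feature set, and cutoff values are all invented):

```python
def struggle_features(fixations):
    """Summarize a fixation log of (duration_ms, x, y) tuples into features."""
    durations = [d for d, _, _ in fixations]
    # Regressions: consecutive fixations that jump backwards (leftwards).
    regressions = sum(1 for (_, x0, _), (_, x1, _) in zip(fixations, fixations[1:])
                      if x1 < x0)
    return {
        "mean_fixation_ms": sum(durations) / len(durations),
        "regression_rate": regressions / max(1, len(fixations) - 1),
    }

def predict_struggling(features, mean_ms_cutoff=280, regression_cutoff=0.3):
    """Toy rule standing in for a trained classifier (cutoffs are invented)."""
    return (features["mean_fixation_ms"] > mean_ms_cutoff
            or features["regression_rate"] > regression_cutoff)

# Hypothetical fixation logs: (duration in ms, x, y) per fixation.
fluent = [(210, 10, 5), (220, 40, 5), (200, 70, 5), (230, 100, 5)]
struggling = [(320, 10, 5), (300, 40, 5), (340, 20, 5), (310, 50, 5)]

print(predict_struggling(struggle_features(fluent)))      # expected False
print(predict_struggling(struggle_features(struggling)))  # expected True
```

In practice the cutoffs would be replaced by a model trained on labeled task-performance data, which is what the experiments described above are designed to collect.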

An important branch of this research has investigated whether eye-tracking technology can be used as a complementary or alternative method of evaluation for animations of sign language, by examining the eye-movements of native signers who view these animations to detect when they may be more difficult to understand.

An image of a human face with gridwork overlaid, and an image of a virtual human face, showing a grid-like mesh of its structure.

ASL Animation Tools & Technologies

The goal of this research is to develop technologies to generate animations of a virtual human character performing American Sign Language.

The funding sources have supported various animation programming platforms that underlie research systems being developed and evaluated at the laboratory.

In current work, we are investigating how to create tools that enable researchers to build dictionaries of animations of individual signs and to efficiently assemble them to produce sentences and longer passages.
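As a hypothetical sketch of the assembly step described above, the example below concatenates short "clips" from a toy sign dictionary, inserting interpolated transition frames between consecutive signs (the glosses, one-value "poses", and blending scheme are all invented for illustration):

```python
def blend(pose_a, pose_b, steps=2):
    """Linearly interpolate transition poses between two signs."""
    return [tuple(a + (b - a) * t / (steps + 1) for a, b in zip(pose_a, pose_b))
            for t in range(1, steps + 1)]

def assemble(glosses, dictionary):
    """Concatenate per-sign clips into one frame sequence for a sentence."""
    frames = []
    for gloss in glosses:
        clip = dictionary[gloss]
        if frames:  # insert transition frames between consecutive signs
            frames.extend(blend(frames[-1], clip[0]))
        frames.extend(clip)
    return frames

# Hypothetical one-value "poses" standing in for full skeleton configurations.
dictionary = {"STORE": [(0.0,), (0.2,)], "I": [(0.8,), (0.9,)], "GO": [(0.4,), (0.5,)]}
sentence = assemble(["STORE", "I", "GO"], dictionary)
print(len(sentence))  # 3 two-frame clips plus 2 two-frame transitions
```

A real synthesis system must also handle timing, co-articulation between signs, and the grammatical facial expressions discussed elsewhere on this page, which is why tool support for researchers is the focus of this work.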

Visualization of various speech parameters using scatterplots or graphs.

Improving the Usability of Resources for Speech Language Therapists

This project investigates the usability and utility of resources available to speech language therapists. By understanding the usability of existing resources, we can design tools that give insight into the varied language characteristics of diverse individuals with non-fluent aphasia.

Want to get involved?