Funded Research Projects

There are several funded research projects at the lab:


Learning ASL through Real-Time Practice

[Image: Microsoft Kinect sensor.]

This project is joint work with researchers at CUNY City College and CUNY Hunter College.

We are investigating new video and motion-capture technologies to enable students learning American Sign Language (ASL) to practice their signing independently through a tool that provides feedback automatically.
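
The page does not detail the recognition approach itself; as a minimal, hypothetical sketch of the kind of comparison such a feedback tool might perform, the code below aligns a student's motion-capture joint trajectories against a fluent signer's recording using dynamic time warping. The data shapes and the thresholding idea are illustrative assumptions, not the project's actual method.

```
# Hypothetical sketch: compare a student's signing attempt to a
# reference performance by aligning 3D joint trajectories with
# dynamic time warping (DTW). All names and data are illustrative.
import numpy as np

def dtw_distance(student: np.ndarray, reference: np.ndarray) -> float:
    """Length-normalized DTW cost between two (frames x features) arrays."""
    n, m = len(student), len(reference)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(student[i - 1] - reference[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # skip a student frame
                                 cost[i, j - 1],      # skip a reference frame
                                 cost[i - 1, j - 1])  # match the two frames
    return cost[n, m] / (n + m)

# A tool might flag an attempt whose alignment cost exceeds a threshold
# calibrated on fluent signers' performances of the same sentence.
student_traj = np.random.rand(90, 60)    # e.g., 90 frames, 20 joints x 3
reference_traj = np.random.rand(100, 60)
print(dtw_distance(student_traj, reference_traj))
```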

Funding Source:

Matt Huenerfauth, PI. September 2014 to August 2018.  “CHS: Medium: Collaborative Research: Immediate Feedback to Support Learning American Sign Language through Multisensory Recognition.”   National Science Foundation, CISE Directorate, IIS Division.  Amount: $537,997. (Collaborative research, linked to corresponding NSF research grants to YingLi Tian, P.I., CUNY City College, for $557,918 and to Elaine Gale, P.I., CUNY Hunter College, for $104,000.)

Project Personnel:

Matt Huenerfauth


Facial Expression for Animations of American Sign Language

[Image: Still image from an animation of facial expression during ASL.]

This project is joint work with researchers at Boston University and Rutgers University.

We are investigating techniques for producing linguistically accurate facial expressions for animations of American Sign Language; such expressions would make these animations easier to understand and more effective at conveying information, thereby improving the accessibility of online information for people who are deaf.

Funding Source:

Matt Huenerfauth, PI. July 2011 to June 2015.  “Generating Accurate Understandable Animations of American Sign Language.”   National Science Foundation, CISE Directorate, IIS Division.  Amount: $338,005. (Collaborative research, linked to corresponding NSF research grants to Carol Neidle, P.I., Boston University, for $385,957 and to Dimitris Metaxas, P.I., Rutgers University, for $469,996.)

Sample Publication:

Matt Huenerfauth, Pengfei Lu, Andrew Rosenberg. 2011. “Evaluating the Importance of Facial Expression in American Sign Language and Pidgin Signed English Animations.” Proceedings of the 13th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2011), Dundee, Scotland, UK.

Project Personnel:

Hernisa Kacorri, Matt Huenerfauth


Generating ASL Animation from Motion-Capture Data

[Image: Photo of a signer wearing motion-capture equipment.]

This project is investigating techniques for making use of motion-capture data collected from native ASL signers to produce linguistically accurate animations of American Sign Language. In particular, this project is focused on the use of space for pronominal reference and verb inflection/agreement.
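
As a rough illustration of the spatial-inflection idea (a generic sketch, not the lab's model), the code below retargets a citation-form hand trajectory so that it begins at the 3D locus assigned to the verb's subject and ends at the locus assigned to its object, while preserving the movement's original shape. All names and values are hypothetical.

```
# Hypothetical sketch: spatially inflect a verb by warping its
# citation-form hand path onto the loci assigned to its referents.
import numpy as np

def inflect_verb(base_path: np.ndarray,
                 subject_locus: np.ndarray,
                 object_locus: np.ndarray) -> np.ndarray:
    """Warp a (frames x 3) hand path onto new start/end points."""
    t = np.linspace(0.0, 1.0, len(base_path))[:, None]
    # Straight-line component of the citation form...
    baseline = (1 - t) * base_path[0] + t * base_path[-1]
    # ...and the residual movement "shape" (arc, wiggle) around it.
    shape = base_path - baseline
    # Re-aim the straight line at the referents' loci, keep the shape.
    return (1 - t) * subject_locus + t * object_locus + shape

arc = np.stack([np.linspace(0, 1, 30),
                0.2 * np.sin(np.linspace(0, np.pi, 30)),
                np.zeros(30)], axis=1)
print(inflect_verb(arc, np.array([-0.3, 0.0, 0.0]),
                   np.array([0.4, 0.1, 0.0]))[[0, -1]])
```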

This project also supports a summer research internship program for ASL-signing high school students, and REU supplements from the NSF have supported research experiences for visiting undergraduate students.

Funding Sources:

Matt Huenerfauth, PI. June 2008 to May 2013.  “CAREER: Learning to Generate American Sign Language Animation through Motion-Capture and Participation of Native ASL Signers.”   National Science Foundation, Faculty Early Career Development (CAREER) Award Program, CISE Directorate, IIS Division, HCC Cluster.  Amount: $581,496.

Matt Huenerfauth, PI. June 2011 to May 2012. Research Experiences for Undergraduates (REU) Supplement to “CAREER: Learning to Generate American Sign Language Animation through Motion-Capture and Participation of Native ASL Signers.” National Science Foundation, CISE Directorate, IIS Division. Amount: $12,000.

Matt Huenerfauth, PI. June 2010 to May 2011. Research Experiences for Undergraduates (REU) Supplement to “CAREER: Learning to Generate American Sign Language Animation through Motion-Capture and Participation of Native ASL Signers.” National Science Foundation, CISE Directorate, IIS Division. Amount: $12,000.

Matt Huenerfauth, PI. June 2009 to May 2010. Research Experiences for Undergraduates (REU) Supplement to “CAREER: Learning to Generate American Sign Language Animation through Motion-Capture and Participation of Native ASL Signers.” National Science Foundation, CISE Directorate, IIS Division. Amount: $12,000.

Sample Publications:

Matt Huenerfauth, Pengfei Lu. 2010. “Modeling and Synthesizing Spatially Inflected Verbs for American Sign Language Animations.” In Proceedings of The 12th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2010), Orlando, Florida, USA. New York: ACM Press.

Matt Huenerfauth, Pengfei Lu. 2010. “Accurate and Accessible Motion-Capture Glove Calibration for Sign Language Data Collection.” ACM Transactions on Accessible Computing. New York: ACM Press.

Pengfei Lu, Matt Huenerfauth. 2010. “Collecting a Motion-Capture Corpus of American Sign Language for Data-Driven Generation Research,” Proceedings of the First Workshop on Speech and Language Processing for Assistive Technologies (SLPAT), Human Language Technologies: The 11th Annual Conference of the North American Chapter of the Association for Computational Linguistics (HLT-NAACL 2010), Los Angeles, CA, USA. East Stroudsburg, PA: Association for Computational Linguistics.

Webpage for Project:

http://latlab.cs.qc.cuny.edu/nsf0746556/

Project Personnel:

Pengfei Lu, Matt Huenerfauth, Jonathan Lamberton, Lijun Feng, Meredith Turtletaub, Amanda Krieger, Kelsey Gallagher, Wesley Clarke, Aaron Pagan, Jaime Penzellna, Giovanni Moriarty, Kenya Bryant, Raymond Ramirez


Sign Language Eye-tracking Data Analysis and Distribution

[Image: Eye-tracking screenshot.]

Despite the recent ubiquity of eye-tracking technologies, they have never before been used to analyze where deaf people look when interacting with computer animations of sign language. These experiments may reveal which aspects of an animation contain errors and whether particular portions of the animation successfully convey information.
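
As a minimal sketch of what such an analysis might compute (the AOI names, coordinates, and data are invented for illustration), the code below summarizes fixations as the proportion of looking time spent in rectangular areas of interest, such as the animated signer's face versus hands.

```
# Hypothetical sketch: proportion of fixation time in areas of
# interest (AOIs) while watching a sign language animation.
AOIS = {  # (x_min, y_min, x_max, y_max) in screen pixels; illustrative
    "face":  (500, 100, 780, 380),
    "hands": (400, 380, 880, 720),
}

def fixation_proportions(fixations):
    """fixations: list of (x, y, duration_ms) tuples."""
    totals = {name: 0.0 for name in AOIS}
    overall = sum(d for _, _, d in fixations) or 1.0
    for x, y, dur in fixations:
        for name, (x0, y0, x1, y1) in AOIS.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += dur
    return {name: t / overall for name, t in totals.items()}

print(fixation_proportions([(600, 200, 450), (500, 500, 250), (100, 100, 50)]))
```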

Funding Source:

Matt Huenerfauth.  2012.  "Sign Language Eye-tracking Data Analysis and Distribution."  Graduate Research Technology Initiative 2012/13, Queens College, The City University of New York.  Total Amount: $20,000.

Project Personnel:

Allen Harper, Hernisa Kacorri, Matt Huenerfauth


Sign Language Video Analysis for Generating Realistic ASL Animation

[Image: Screenshot of a sign language video.]

In ongoing NSF-funded research, our lab uses motion-capture equipment to digitize the movements of humans performing ASL sentences. We also collect video recordings of these sign language performances, and this grant supports our analysis of that video data, using software tools to identify key movement information from recordings of human signers.
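
The analysis software used by the lab is not named on this page; purely as an illustration of recovering coarse movement information from video, the sketch below localizes frame-to-frame motion in a signer video with simple frame differencing in OpenCV. The file name is a placeholder.

```
# Hypothetical sketch: track the centroid of frame-to-frame motion in
# a video of a signer using simple frame differencing (OpenCV).
import cv2
import numpy as np

cap = cv2.VideoCapture("signer.mp4")   # placeholder file name
ok, prev = cap.read()
if not ok:
    raise SystemExit("could not read video")
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev_gray)            # per-pixel change
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    ys, xs = np.nonzero(mask)
    if len(xs):                                    # centroid of moving pixels
        print("movement centroid:", xs.mean(), ys.mean())
    prev_gray = gray
cap.release()
```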

Funding Source:

Matt Huenerfauth.  2011.  "Sign Language Video Analysis for Generating Realistic ASL Animation."  Graduate Research Technology Initiative 2011/12, Queens College, The City University of New York.  Total Amount: $20,000.

Project Personnel:

Pengfei Lu, Hernisa Kacorri, Matt Huenerfauth


American Sign Language Animation Generation Technologies

[Image: Screenshot of an animation of a signer.]

The goal of this research is to develop technologies to generate animations of a virtual human character performing American Sign Language.

The funding sources have supported various animation programming platforms that underlie research systems being developed and evaluated at the laboratory.

Funding Sources:

Matt Huenerfauth, PI. June 2007 to June 2011. “Generating Animations of American Sign Language.” Go PLM Grant Program. Siemens A&D UGS PLM Software. Amount: $633,150.

Visage Technologies software: Visage Life and Visage Interactive. This project uses character animation software from Visage Technologies AB (www.visagetechnologies.com) under the free Academic License.

Sample Publications:

Matt Huenerfauth, Pengfei Lu. 2010. “Effect of Spatial Reference and Verb Inflection on the Usability of American Sign Language Animations.” Universal Access in the Information Society. Berlin/Heidelberg: Springer.

Matt Huenerfauth. 2009. "Improving Spatial Reference in American Sign Language Animation through Data Collection from Native ASL Signers." In Proceedings of the International Conference on Universal Access in Human-Computer Interaction. San Diego, CA.

Matt Huenerfauth. 2009. "A Linguistically Motivated Model for Speed and Pausing in Animations of American Sign Language." ACM Transactions on Accessible Computing.

Project Personnel:

Pengfei Lu, Matt Huenerfauth, Jonathan Lamberton


Generating Expressive Speech Animations

[Image: Still image from an animation of facial expression during speech.]

This project investigated the generation of speech animations that contain facial expressions to indicate important prosodic information. Such animations could be used in accessibility applications for people who are deaf or hard-of-hearing. Evaluations of prototype animations were conducted.
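
As a purely illustrative sketch (the project's actual mapping from speech to facial expression is not described on this page), the code below rescales a per-frame pitch contour into an eyebrow-raise intensity, one simple way prosodic prominence could drive a facial-animation parameter.

```
# Hypothetical sketch: map normalized pitch to an eyebrow-raise
# parameter (0..1) for each animation frame. Values are illustrative.
import numpy as np

def brow_raise_curve(pitch_hz: np.ndarray) -> np.ndarray:
    voiced = pitch_hz[pitch_hz > 0]
    if voiced.size == 0:
        return np.zeros_like(pitch_hz, dtype=float)
    lo, hi = voiced.min(), voiced.max()
    raised = np.clip((pitch_hz - lo) / max(hi - lo, 1e-6), 0.0, 1.0)
    raised[pitch_hz <= 0] = 0.0        # unvoiced frames: neutral brow
    return raised

pitch = np.array([0.0, 110.0, 130.0, 180.0, 220.0, 160.0, 0.0])  # Hz
print(brow_raise_curve(pitch).round(2))
```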

Funding Source:

Andrew Rosenberg, PI. Matt Huenerfauth, co-PI. December 2009 to December 2010. “Generating Expressive Cued Speech from Audio Speech Signals.” Research Enhancement Committee, Queens College, The City University of New York. Amount: $12,800.

Sample Publication:

Matt Huenerfauth, Pengfei Lu, Andrew Rosenberg. 2011. “Evaluating the Importance of Facial Expression in American Sign Language and Pidgin Signed English Animations.” Proceedings of the 13th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2011), Dundee, Scotland, UK.

Project Personnel:

Andrew Rosenberg, Matt Huenerfauth, Pengfei Lu


Eye-Tracking to Predict User Performance

[Image: Screenshot of a computer desktop.]

Computer users may benefit from user interfaces that can predict whether the user is struggling with a task, based on an analysis of the user's eye-movement behaviors. This project is investigating how to conduct precise experiments for measuring eye movements and user task performance; relationships between these variables can be examined using machine learning techniques in order to produce predictive models for adaptive user interfaces.
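
As an illustration of the machine-learning step only (the features and data below are synthetic inventions, not the project's experimental setup), this sketch fits a logistic-regression classifier that predicts whether a user is struggling from simple eye-movement features.

```
# Hypothetical sketch: predict "struggling" vs. "not struggling" from
# eye-movement features using synthetic data and scikit-learn.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Invented features: [mean fixation duration, saccade rate, pupil size]
X = rng.normal(size=(200, 3))
# Synthetic labels: struggling depends on the first two features.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

model = LogisticRegression()
print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())
```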

Funding Source:

Award to Computer Science Department (participating faculty: Jinlin Chen, Matt Huenerfauth, Christopher Vickery).  2009.  "Eye-Tracking Analysis for User Interface Design."  Graduate Investment Initiative 2009, Queens College, The City University of New York.  Total Amount: $30,000.

Project Personnel:

Allen Harper, Matt Huenerfauth


Text Readability for Adults with Intellectual Disabilities

[Image: Clip art of people reading books.]

This project investigated the use of computational linguistic technologies to identify whether textual information would meet the special needs of adults with intellectual disabilities. A state-of-the-art predictive model of readability was developed that was based on discourse, syntactic, semantic, and other linguistic features.
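
The model itself combined discourse, syntactic, semantic, and other linguistic features; as a minimal hypothetical sketch of the general approach, the code below computes two surface features and combines them with made-up weights into a difficulty score.

```
# Hypothetical sketch: a toy readability score from surface features.
# The real model used far richer linguistic features; weights here
# are invented for illustration.
import re

def surface_features(text: str) -> dict:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    return {
        "avg_sentence_len": len(words) / max(len(sentences), 1),
        "avg_word_len": sum(map(len, words)) / max(len(words), 1),
    }

WEIGHTS = {"avg_sentence_len": 0.4, "avg_word_len": 1.5}  # illustrative

def difficulty_score(text: str) -> float:
    return sum(WEIGHTS[k] * v for k, v in surface_features(text).items())

print(difficulty_score("The cat sat. It was warm."))
print(difficulty_score("Notwithstanding prior stipulations, remuneration accrues."))
```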

Funding Source:

Matt Huenerfauth, PI.  December 2008. “Text readability software for adults with intellectual disabilities.”  Research Enhancement Committee, Queens College, The City University of New York.  Amount: $10,000.

Sample Publications:

Lijun Feng, Martin Jansche, Matt Huenerfauth, Noémie Elhadad. 2010. “A Comparison of Features for Automatic Readability Assessment.” In Proceedings of The 23rd International Conference on Computational Linguistics (COLING 2010), Beijing, China.

Matt Huenerfauth, Lijun Feng, Noémie Elhadad. 2009. “Comparing Evaluation Techniques for Text Readability Software for Adults with Intellectual Disabilities.” In Proceedings of the 11th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2009), Pittsburgh, Pennsylvania, USA.

Lijun Feng, Noémie Elhadad, Matt Huenerfauth. 2009. “Cognitively Motivated Features for Readability Assessment,” Proceedings of the 12th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2009), Athens, Greece.

Project Personnel:

Lijun Feng, Noémie Elhadad (Columbia University), Matt Huenerfauth


Educational Software for Deaf Users

[Image: Screenshot of an animation of a signer.]

This project investigated the development of educational software including animations of a human character for deaf or hard-of-hearing students to practice their communication skills.

Funding Source:

Matt Huenerfauth, PI.  July 2008 to June 2009. “Educational Software for Deaf Users.”  Professional Staff Congress - City University of New York (PSC-CUNY) Research Award Program, Regular-Cycle Round 39.  Amount: $3,800.

Sample Publication:

Matt Huenerfauth. 2009. “Improving Spatial Reference in American Sign Language Animation through Data Collection from Native ASL Signers.” International Conference on Universal Access in Human-Computer Interaction (UAHCI). San Diego, CA. July 2009. In C. Stephanidis (Ed.), Universal Access in HCI, Part III, HCII 2009, LNCS 5616, pp. 530–539, 2009. Berlin/Heidelberg: Springer-Verlag.

Project Personnel:

Matt Huenerfauth


Timing Parameters for ASL Animations

[Image: Screenshot of a user interface used during our experiments.]

This project determined guidelines for setting the speed and timing parameters needed to generate animations of American Sign Language. The goal of this work was to learn how to make these animations more understandable and natural-looking for ASL signers.
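
As a simplified, hypothetical sketch of how such parameters might be applied (the model's actual parameters and values appear in the publications below, not here), the code assembles a sign timeline from per-sign base durations, a global speed factor, and pauses inserted at clause boundaries.

```
# Hypothetical sketch: build an animation timeline with a global
# speed parameter and pauses at clause boundaries. Values invented.
def build_timeline(signs, speed=1.0, clause_pause_ms=350):
    """signs: list of (gloss, base_duration_ms, ends_clause) tuples."""
    timeline, t = [], 0.0
    for gloss, base_ms, ends_clause in signs:
        dur = base_ms / speed              # higher speed -> shorter signs
        timeline.append((gloss, t, t + dur))
        t += dur
        if ends_clause:                    # pause after a clause boundary
            t += clause_pause_ms / speed
    return timeline

for gloss, start, end in build_timeline(
        [("JOHN", 400, False), ("ARRIVE", 500, True), ("MARY", 420, False)],
        speed=1.2):
    print(f"{gloss}: {start:.0f}-{end:.0f} ms")
```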

Funding Source:

Matt Huenerfauth, PI.  July 2007 to December 2008. “Evaluating Parameters for American Sign Language Animations.”  Professional Staff Congress - City University of New York (PSC-CUNY) Research Award Program, Out-Of-Cycle Round 38.  Amount: $4,095.

Sample Publications:

Matt Huenerfauth. 2009. "A Linguistically Motivated Model for Speed and Pausing in Animations of American Sign Language." ACM Transactions on Accessible Computing.

Matt Huenerfauth. 2008. "Evaluation of a Psycholinguistically Motivated Timing Model for Animations of American Sign Language." The 10th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2008), Halifax, Nova Scotia, Canada.

Project Personnel:

Jonathan Lamberton, Matt Huenerfauth