Paper - HoloAAC: A Mixed Reality AAC Application for People with Expressive Language Difficulties

HCII 2024

Abstract

We present HoloAAC, a novel mixed reality AAC application that helps people with expressive language difficulties communicate in grocery shopping scenarios via a mixed reality device. A user who has difficulty speaking can easily convey their intention by pressing a few buttons. Our application uses computer vision techniques to automatically detect grocery items, helping the user quickly locate items of interest. In addition, it uses natural language processing techniques to categorize sentences, helping the user quickly find the desired one. We evaluate our mixed reality application with AAC users and compare its efficacy with that of traditional AAC applications. HoloAAC contributes to the early exploration of context-aware AR-based AAC applications and provides insights for future research.

Acknowledgments

We are grateful to the participants for their feedback on our application. This project was supported by NSF grants (award numbers: 1942531 and 2128867).

BibTeX

@inproceedings{yu2024holoaac,
  title={HoloAAC: A Mixed Reality AAC Application for People with Expressive Language Difficulties},
  author={Yu, Liuchuan and Feng, Huining and Alghofaili, Rawan and Byun, Boyoung and O'Neal, Tiffany and Rampalli, Swati and Chung, Yoosun and Genaro Motti, Vivian and Yu, Lap-Fai},
  booktitle={International Conference on Human-Computer Interaction},
  pages={304--324},
  year={2024},
  organization={Springer}
}

Last update @ Jul 6, 2024