Using large language models to accelerate communication for eye gaze typing users with ALS

Nat Commun. 2024 Nov 1;15(1):9449. doi: 10.1038/s41467-024-53873-3.

Abstract

Accelerating text input in augmentative and alternative communication (AAC) is a long-standing area of research with a direct bearing on the quality of life of individuals with profound motor impairments. Recent advances in large language models (LLMs) offer opportunities to rethink strategies for enhanced text entry in AAC. In this paper, we present SpeakFaster, an LLM-powered user interface for text entry in a highly-abbreviated form, which saves 57% more motor actions than traditional predictive keyboards in offline simulation. A pilot study on a mobile device with 19 non-AAC participants demonstrated motor savings in line with simulation and relatively small changes in typing speed. Lab and field testing with two eye-gaze AAC users with amyotrophic lateral sclerosis demonstrated text-entry rates 29-60% above baselines, owing to significant savings of expensive keystrokes enabled by LLM predictions. These findings form a foundation for further exploration of LLM-assisted text entry in AAC and other user interfaces.
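
The abstract describes text entry in a "highly-abbreviated form" expanded by an LLM. As a rough illustration only (not the authors' implementation), the sketch below shows the general idea of word-initial abbreviation expansion: the user types one letter per intended word, a few-shot prompt is assembled, and candidate expansions from a text-completion call are filtered so their word initials match the typed letters. The function llm_complete is a hypothetical placeholder for any LLM completion call; the prompt wording and examples are assumptions.

    # Illustrative sketch of word-initial abbreviation expansion with an LLM.
    # Not the published SpeakFaster system; prompt format and helper names
    # are assumptions made for this example.

    from typing import Callable, List


    def build_expansion_prompt(context: str, abbreviation: str) -> str:
        """Assemble a few-shot prompt asking the model to expand an abbreviation.

        Each letter of the abbreviation is the initial of one word in the
        intended phrase; the conversational context helps disambiguate.
        """
        examples = [
            ("How are you feeling today?", "ifgt", "i feel good thanks"),
            ("Do you want the window open?", "ypci", "yes please close it"),
        ]
        lines = ["Expand the abbreviation into a likely phrase, one word per letter."]
        for ctx, abbr, expansion in examples:
            lines.append(f"Context: {ctx}\nAbbreviation: {abbr}\nExpansion: {expansion}")
        lines.append(f"Context: {context}\nAbbreviation: {abbreviation}\nExpansion:")
        return "\n\n".join(lines)


    def expand_abbreviation(
        context: str,
        abbreviation: str,
        llm_complete: Callable[[str], List[str]],
    ) -> List[str]:
        """Return candidate expansions whose word initials match the abbreviation.

        `llm_complete` is a placeholder for any prompt-to-completions call.
        """
        prompt = build_expansion_prompt(context, abbreviation)
        candidates = llm_complete(prompt)
        return [
            c.strip()
            for c in candidates
            if [w[0].lower() for w in c.split()] == list(abbreviation.lower())
        ]


    if __name__ == "__main__":
        # Stand-in for a real LLM call, so the sketch runs end to end.
        def fake_llm(prompt: str) -> List[str]:
            return ["I am doing well", "I ate dinner with friends", "it all depends"]

        # Keeps only candidates whose initials spell "iadw".
        print(expand_abbreviation("How was your evening?", "iadw", fake_llm))

Keystroke savings in such a scheme come from typing roughly one character per word plus a selection action, rather than every character of the phrase.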

MeSH terms

  • Adult
  • Amyotrophic Lateral Sclerosis* / physiopathology
  • Communication
  • Communication Aids for Disabled*
  • Female
  • Fixation, Ocular* / physiology
  • Humans
  • Language
  • Male
  • Middle Aged
  • Pilot Projects
  • User-Computer Interface