Tracking the Costs of Clear and Loud Speech: Interactions Between Speech Motor Control and Concurrent Visuomotor Tracking

J Speech Lang Hear Res. 2021 Jun 18;64(6S):2182-2195. doi: 10.1044/2020_JSLHR-20-00264. Epub 2021 Mar 9.

Abstract

Purpose: Prior work has demonstrated that competing tasks impact habitual speech production. The purpose of this investigation was to quantify the extent to which clear and loud speech are affected by concurrent performance of an attention-demanding task.

Method: Speech kinematics and acoustics were collected while participants spoke using habitual, loud, and clear speech styles. The styles were performed in isolation and while performing a secondary tracking task.

Results: Compared to the habitual style, speakers exhibited expected increases in lip aperture range of motion and speech intensity for the clear and loud styles. During concurrent visuomotor tracking, there was a decrease in lip aperture range of motion and speech intensity for the habitual style. Tracking performance during habitual speech did not differ from single-task tracking. For loud and clear speech, speakers retained the gains in speech intensity and range of motion, respectively, while concurrently tracking. A reduction in tracking performance was observed during concurrent loud and clear speech, compared to tracking alone.

Conclusions: These data suggest that loud and clear speech may help to mitigate motor interference associated with concurrent performance of an attention-demanding task. Additionally, the reductions in tracking accuracy observed during concurrent loud and clear speech may suggest that these higher effort speaking styles require greater attentional resources than habitual speech.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Acoustics
  • Dysarthria
  • Humans
  • Speech Acoustics*
  • Speech Production Measurement
  • Speech*