More Than Words: The Relative Roles of Prosody and Semantics in the Perception of Emotions in Spoken Language by Postlingual Cochlear Implant Users

Research output: Contribution to journal › Article › peer-review

Abstract

Objectives: The processing of emotional speech calls for the perception and integration of semantic and prosodic cues. Although cochlear implants allow for significant auditory improvements, they are limited in transmitting spectro-temporal fine-structure information, which may compromise the processing of voice pitch cues. The goal of the current study was to compare the performance of postlingual cochlear implant (CI) users and a matched control group on the perception, selective attention, and integration of emotional semantics and prosody.

Design: Fifteen CI users and 15 normal-hearing (NH) peers (age range, 18-65 years) listened to spoken sentences composed of different combinations of four discrete emotions (anger, happiness, sadness, and neutrality) presented in prosodic and semantic channels (T-RES: Test for Rating Emotions in Speech). In three separate tasks, listeners were asked either to attend to the sentence as a whole, thus integrating both speech channels (integration), or to focus on one channel only (rating of the target emotion) and ignore the other (selective attention). Their task was to rate how much they agreed that the sentence conveyed each of the predefined emotions. In addition, all participants performed standard tests of speech perception.

Results: When asked to focus on one channel, semantics or prosody, both groups rated emotions similarly, with comparable levels of selective attention. When the task called for channel integration, group differences were found: CI users appeared to use semantic emotional information more than did their NH peers. CI users also assigned higher ratings than their NH peers to sentences that did not present the target emotion, indicating some degree of confusion. In addition, for CI users, individual differences in speech comprehension over the phone and in identification of intonation were significantly related to emotional semantic and prosodic ratings, respectively.

Conclusions: CI users and NH controls did not differ in the perception of prosodic and semantic emotions or in auditory selective attention. However, when the task called for the integration of prosody and semantics, CI users overused the semantic information (as compared with their NH peers). We suggest that as CI users adopt diverse cue-weighting strategies with device experience, their weighting of prosody and semantics comes to differ from that of NH listeners. Finally, CI users may benefit from rehabilitation strategies that strengthen the perception of prosodic information, to better understand emotional speech.

Original language: English
Pages (from-to): 1378-1389
Number of pages: 12
Journal: Ear and Hearing
Volume: 43
Issue number: 4
DOIs
State: Published - 1 Jul 2022

Keywords

  • Cochlear implant
  • Emotion identification
  • Emotion perception
  • Postlingual deafness
  • Prosody

All Science Journal Classification (ASJC) codes

  • Otorhinolaryngology
  • Speech and Hearing
