TY - JOUR
T1 - Crossmodal phase reset and evoked responses provide complementary mechanisms for the influence of visual speech in auditory cortex
AU - Mégevand, Pierre
AU - Mercier, Manuel R.
AU - Groppe, David M.
AU - Golumbic, Elana Zion
AU - Mesgarani, Nima
AU - Beauchamp, Michael S.
AU - Schroeder, Charles E.
AU - Mehta, Ashesh D.
N1 - Publisher Copyright: © 2020 the authors
PY - 2020/10/28
Y1 - 2020/10/28
N2 - Natural conversation is multisensory: when we can see the speaker's face, visual speech cues improve our comprehension. The neuronal mechanisms underlying this phenomenon remain unclear. The two main alternatives are visually mediated phase modulation of neuronal oscillations (excitability fluctuations) in auditory neurons and visual input-evoked responses in auditory neurons. Investigating this question using naturalistic audiovisual speech with intracranial recordings in humans of both sexes, we find evidence for both mechanisms. Remarkably, auditory cortical neurons track the temporal dynamics of purely visual speech using the phase of their slow oscillations and phase-related modulations in broadband high-frequency activity. Consistent with known perceptual enhancement effects, the visual phase reset amplifies the cortical representation of concomitant auditory speech. In contrast to this, and in line with earlier reports, visual input reduces the amplitude of evoked responses to concomitant auditory input. We interpret the combination of improved phase tracking and reduced response amplitude as evidence for more efficient and reliable stimulus processing in the presence of congruent auditory and visual speech inputs.
KW - Audiovisual speech
KW - Broadband high-frequency activity
KW - Crossmodal stimuli
KW - Intracranial electroencephalography
KW - Neuronal oscillations
KW - Phase-amplitude coupling
UR - http://www.scopus.com/inward/record.url?scp=85094910154&partnerID=8YFLogxK
U2 - 10.1523/JNEUROSCI.0555-20.2020
DO - 10.1523/JNEUROSCI.0555-20.2020
M3 - Article
C2 - 33023923
SN - 0270-6474
VL - 40
SP - 8530
EP - 8542
JO - Journal of Neuroscience
JF - Journal of Neuroscience
IS - 44
ER -