Imagined Visual Stimuli Change Future Auditory Perception
In this study, we examined whether repeatedly imagining a visual stimulus at the same time as, but in a different location than, an auditory stimulus would elicit cross-modal plasticity in one's representation of acoustic space. The results are published in a paper in Psychological Science; a press release is also available from APS.
Paper Title: Mental Imagery Induces Cross-Modal Sensory Plasticity and Changes Future Auditory Perception
Paper Abstract: Can what we imagine in our minds change how we perceive the world in the future? A continuous process of multisensory integration and recalibration is responsible for maintaining a correspondence between the senses (e.g., vision, touch, audition) and, ultimately, a stable and coherent perception of our environment. This process depends on the plasticity of our sensory systems. The so-called ventriloquism aftereffect—a shift in the perceived localization of sounds presented alone after repeated exposure to spatially mismatched auditory and visual stimuli—is a clear example of this type of plasticity in the audiovisual domain. In a series of six studies with 24 participants each, we investigated an imagery-induced ventriloquism aftereffect in which imagining a visual stimulus elicits the same frequency-specific auditory aftereffect as actually seeing one. These results demonstrate that mental imagery can recalibrate the senses and induce the same cross-modal sensory plasticity as real sensory stimuli.
Schematic overview of the sequence of events within an adaptation block. Each adaptation block involved five repetitions of the sequence in the top part of the figure (150 exposure trials; 100 localization trials) with only one of the adaptation conditions (i.e., leftward, same location, or rightward) repeated during the adaptation phases of that block. The order of the adaptation blocks was counterbalanced across participants. The cue phase of the countdown reminded the participants of the frequency, location, and content of the stimulus they were to imagine seeing at the end of the countdown and for the duration of the adaptation phase. The bottom panel outlines the spatial relationship between imagined visual and real auditory stimuli during the adaptation phase of each block. Stimuli and angles are for explanatory purposes and are not drawn to scale.
Visual-imagery and real-visual-stimulus-induced ventriloquism aftereffects. The plots show logistic regression curves fitted to the group data from the visual-imagery-adaptation conditions in Experiment 1 (a) and real-visual-stimulus-adaptation conditions in Experiment 2 (b). The dotted lines denote the points of subjective equality (PSEs) for each condition. The bar graphs display the mean PSEs as a function of each adaptation condition for each experiment. Error bars represent ±1 SE.
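For readers curious how a point of subjective equality (PSE) is obtained from a fitted logistic regression curve like those in the plots, here is a minimal sketch. It is illustrative only: the logistic parameterization, sample data, and variable names are assumptions, not the paper's actual analysis code.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, pse, slope):
    """Psychometric function: P('right' response) as a function of
    stimulus location. The PSE is the location where P = 0.5."""
    return 1.0 / (1.0 + np.exp(-slope * (x - pse)))

# Hypothetical group data: speaker locations (deg) and proportion
# of 'right' responses at each location
locations = np.array([-20, -10, -5, 0, 5, 10, 20], dtype=float)
p_right = np.array([0.02, 0.10, 0.30, 0.55, 0.75, 0.90, 0.98])

(pse, slope), _ = curve_fit(logistic, locations, p_right, p0=[0.0, 0.5])
print(f"PSE = {pse:.2f} deg, slope = {slope:.2f}")
```

An aftereffect then shows up as a shift in the fitted PSE between adaptation conditions (e.g., leftward vs. rightward), which is what the bar graphs summarize.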
Group localization data from a white-noise stimulus following adaptation to a 4 kHz tone. The plots show logistic regression curves fitted to the group white-noise localization data from the visual-imagery-adaptation conditions in Experiment 3 (a) and real-visual-stimulus-adaptation conditions in Experiment 4 (b). The dotted lines denote the points of subjective equality (PSEs) for each condition. The bar graphs display the mean PSEs as a function of each adaptation condition for each experiment. Error bars represent ±1 SE.
Ventriloquism aftereffects for 4 kHz tones. The plots show logistic regression curves fitted to the group 4 kHz tone-localization data following adaptation to a 4 kHz tone from the visual-imagery-adaptation conditions in Experiment 5 (a) and real-visual-stimulus-adaptation conditions in Experiment 6 (b). The dotted lines denote the points of subjective equality (PSEs) for each condition. The bar graphs display the mean PSEs as a function of each adaptation condition for each experiment. Error bars represent ±1 SE.
Exploring the Neural Basis of the Sense of Self
In this study, published in Cerebral Cortex, we explored the neural basis of the sense of self.
Paper Title: Dissociating the Neural Basis of Conceptual Self-Awareness from Perceptual Awareness and Unaware Self-Processing
Abstract: Conceptual self-awareness is a mental state in which the content of one’s consciousness refers to a particular aspect of semantic knowledge about oneself. This form of consciousness plays a crucial role in shaping human behavior; however, little is known about its neural basis. Here, we use functional magnetic resonance imaging (fMRI) and a visual masked priming paradigm to dissociate the neural responses related to the awareness of semantic autobiographical information (one’s own name, surname, etc.) from the awareness of information related to any visual stimulus (perceptual awareness), as well as from the unaware processing of self-relevant stimuli. To detect brain activity that is highly selective for self-relevant information, we used the blood-oxygen-level-dependent (BOLD) adaptation approach, which goes beyond the spatial limitations of conventional fMRI. We found that self-awareness was associated with BOLD adaptation in the medial frontopolar-retrosplenial areas, whereas perceptual awareness and unaware self-processing were associated with BOLD adaptation in the lateral fronto-parietal areas and the inferior temporal cortex, respectively. Thus, using a direct manipulation of conscious awareness, we demonstrate for the first time that the neural basis of conceptual self-awareness is neuroanatomically distinct from the network mediating perceptual awareness of the sensory environment or unaware processing of self-related stimuli.
Experimental design and behavioral results. (A) Self- and other-related stimuli (names, surnames, dates of birth, and nationality codes) were presented
Results of fMRI analysis. Brain regions related to (A) conceptual self-awareness, (B) perceptual awareness, (C) unaware processing of self-related information, and (D) unaware processing of all types of stimuli. Areas showing significant BOLD-adaptation effects are highlighted in red (peaks P < 0.05 corrected; cluster maps thresholded at P < 0.001, uncorrected for display purposes) and superimposed on an "inflated" MNI template brain. The bar plots show the BOLD-adaptation effect size (incongruent > congruent trials) and are added for purely descriptive purposes; error bars denote standard errors. FG, fusiform gyrus; HI, hippocampus; IPS, intraparietal sulcus; IT, inferior temporal cortex; MFP, medial frontal pole; PCS, precentral sulcus; PH, parahippocampal gyrus; RSC, retrosplenial cortex; SMA, supplementary motor area; TP, temporal pole.
Anatomical localization of BOLD-adaptation responses related to conceptual self-awareness. The peak activations from the interaction contrast (also shown in Fig. 2A) are here superimposed on a mean MRI generated from the standardized anatomical MRIs of the 26 participants. As can be seen, the peaks (indicated by the crossing of the blue lines) are located in (A) the medial frontopolar prefrontal cortex, (B) the hippocampus, and (C) the retrosplenial cortex.
Overlap between the "conceptual self-awareness network" and the "DMN". Neural responses selective for conceptual self-awareness (SA; marked in green)
How The Content of Imagined Sound Changes Visual Perception
In this study, published in the Nature Publishing Group journal Scientific Reports, we explored whether the content of a sound we imagine in our mind (i.e., what kind of sound we imagine hearing) changes how we see visual motion.
Paper Title: The Content of Imagined Sounds Changes Visual Motion Perception in the Cross-Bounce Illusion
Paper Abstract: Can what we imagine hearing change what we see? Whether imagined sensory stimuli are integrated with external sensory stimuli to shape our perception of the world has only recently begun to come under scrutiny. Here, we made use of the cross-bounce illusion, in which an auditory stimulus presented at the moment two passing objects meet promotes the perception that the objects bounce off of, rather than cross by, one another, to examine whether the content of imagined sound changes visual motion perception in a manner that is consistent with multisensory integration. The results from this study revealed that auditory imagery of a sound with acoustic properties typical of a collision (i.e., damped sound) promoted the bounce percept, but auditory imagery of the same sound played backwards (i.e., ramped sound) did not. Moreover, the vividness of the participants’ auditory imagery predicted the strength of this imagery-induced illusion. In a separate experiment, we ruled out the possibility that changes in attention (i.e., sensitivity index d′) or response bias (response bias index c) were sufficient to explain this effect. Together, these findings suggest that this imagery-induced multisensory illusion reflects the successful integration of real and imagined cross-modal sensory stimuli, and more generally, that what we imagine hearing can change what we see.
Schematic overview of the experimental setup and the different overlap conditions used in all experiments and the imagined sound conditions used in Experiment 1a and Experiment 2. In Experiment 1b, the sounds were played to the participants, rather than being imagined, at the moment the discs met. In Experiment 2, the Response question asked whether the discs partially overlapped or completely overlapped rather than whether the discs bounced or crossed. For display purposes, the sizes of the discs and the percentage overlap conditions are not drawn exactly to scale.
Mean proportion of perceived bounce as a function of the percent overlap of the moving discs and the kind of imagined sound in Experiment 1a (a) and heard sound in Experiment 1b (b). The error bars represent ± SE.
Regression plot (and 95% confidence bands) showing the relationship between the participants' self-reported vividness of the imagined sounds and the strength of the cross-bounce illusion. The dotted line denotes the divide between the participants who perceived the discs to bounce more when they imagined the damped compared to ramped sound and the participants who did not.
Mean sensitivity index d′ (a) and response bias index c (b) as a function of the percentage by which the moving discs overlapped in the two partial-overlap conditions and the kind of imagined sound in Experiment 2. The error bars represent ± SE.
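The sensitivity index d′ and response-bias index c come from standard signal detection theory. A minimal sketch of how these indices are typically computed from trial counts follows (illustrative only; the counts and the 0.5-count correction for extreme rates are assumptions, not values from the paper):

```python
from scipy.stats import norm

def dprime_and_c(hits, misses, false_alarms, correct_rejections):
    """Compute sensitivity (d') and response bias (c) from trial counts.
    A 0.5-count correction is applied so that hit/false-alarm rates of
    exactly 0 or 1 do not produce infinite z-scores."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_hit - z_fa          # distance between signal and noise distributions
    c = -0.5 * (z_hit + z_fa)       # criterion; 0 = unbiased, >0 = conservative
    return d_prime, c

# Hypothetical counts from a complete- vs. partial-overlap discrimination task
d, c = dprime_and_c(hits=40, misses=10, false_alarms=15, correct_rejections=35)
print(f"d' = {d:.2f}, c = {c:.2f}")
```

The logic of the control experiment is then that if imagery merely changed attention or criterion, d′ or c should differ between imagined-sound conditions; equivalent values suggest a genuinely perceptual effect.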
When Auditory Motion Changes How We See Visual Motion
By making use of the well-known 'waterfall' illusion (i.e., the visual motion aftereffect), whereby staring at the downward motion of a waterfall elicits a strong illusion of visual motion in static objects when the viewer looks away from the waterfall, we investigated whether apparent auditory motion (i.e., the sounds of repeated movement in a given direction) could elicit the same effect. We found that auditory motion can indeed elicit a visual motion illusion, albeit across sensory modalities.
Paper Title: Auditory Motion Elicits a Visual Motion Aftereffect
Paper Abstract: The visual motion aftereffect is a visual illusion in which exposure to continuous motion in one direction leads to a subsequent illusion of visual motion in the opposite direction. Previous findings have been mixed with regard to whether this visual illusion can be induced cross-modally by auditory stimuli. Based on research on multisensory perception demonstrating the profound influence auditory perception can have on the interpretation and perceived motion of visual stimuli, we hypothesized that exposure to auditory stimuli with strong directional motion cues should induce a visual motion aftereffect. Here, we demonstrate that horizontally moving auditory stimuli induced a significant visual motion aftereffect—an effect that was driven primarily by a change in visual motion perception following exposure to leftward moving auditory stimuli. This finding is consistent with the notion that visual and auditory motion perception rely on at least partially overlapping neural substrates.
Following the baseline motion sensitivity assessment, the participants began one of the auditory motion adaptation (leftward or rightward) blocks. Participants were instructed to maintain fixation on the central fixation point throughout the experiment. Each trial began with an auditory motion exposure phase (60 s for the first trial and 10 s for each subsequent trial in that block), followed by one of six possible test motion stimuli lasting 1 s (the coherence level for each of the six motion stimuli was individually tailored to each participant's baseline performance); the participants then indicated whether they saw the test stimulus move leftward or rightward. The test stimulus in the above example trial shows a 50% coherence trial for illustration purposes. The black arrows in the example test stimulus indicate the direction of each dot in this example for display purposes only. Each auditory motion adaptation block was followed by a block with the sound moving in the opposite direction, and this pair of blocks was repeated once, resulting in four blocks in total. Block order was counterbalanced across participants.
Visual motion aftereffect following leftward and rightward auditory motion. Curves represent logistic regression functions fitted to group data. The data points represent the mean frequency of a “rightward” response. Normalized coherence values are represented on the x-axis, with negative values arbitrarily assigned to leftward moving motion displays and positive values assigned to rightward moving displays. The bar plot represents the participants' mean point of subjective equality (PSE) for the leftward and rightward auditory motion adaptation conditions. Asterisks next to bars indicate a significant (p < 0.01) shift in the participants' PSE compared to a normalized coherence test value of zero, and “n.s.” indicates that there was no significant shift (p > 0.05) from zero. Asterisks between bars indicate a significant (p < 0.01) difference between the participants' PSEs for the rightward and leftward auditory motion adaptation conditions. Error bars represent ± SEMs.
Where Imagination Meets Sensory Perception In The Brain
In this study, published in the Journal of Neuroscience, we explored the neural correlates of the influence of imagined visual stimuli on auditory perception using an adaptation of a well-known cross-modal illusion, the 'ventriloquism effect', whereby an auditory stimulus presented at the same time as, but in a different location than, a visual stimulus leads to a mislocalization of the sound toward the visual stimulus. Here we explored whether imagined visual stimuli have the same influence on auditory localization and, in particular, whether the same neural processes involved in the capture of sound by real visual stimuli are involved in the capture of sound by imagined visual stimuli.
Paper Title: The Fusion of Mental Imagery and Sensation in The Temporal Association Cortex
Paper Abstract: It is well understood that the brain integrates information that is provided to our different senses to generate a coherent multisensory percept of the world around us (Stein and Stanford, 2008), but how does the brain handle concurrent sensory information from our mind and the external world? Recent behavioral experiments have found that mental imagery—the internal representation of sensory stimuli in one's mind—can also lead to integrated multisensory perception (Berger and Ehrsson, 2013); however, the neural mechanisms of this process have not yet been explored. Here, using functional magnetic resonance imaging and an adapted version of a well known multisensory illusion (i.e., the ventriloquist illusion; Howard and Templeton, 1966), we investigated the neural basis of mental imagery-induced multisensory perception in humans. We found that simultaneous visual mental imagery and auditory stimulation led to an illusory translocation of auditory stimuli and was associated with increased activity in the left superior temporal sulcus (L. STS), a key site for the integration of real audiovisual stimuli (Beauchamp et al., 2004a, 2010; Driver and Noesselt, 2008; Ghazanfar et al., 2008; Dahl et al., 2009). This imagery-induced ventriloquist illusion was also associated with increased effective connectivity between the L. STS and the auditory cortex. These findings suggest an important role of the temporal association cortex in integrating imagined visual stimuli with real auditory stimuli, and further suggest that connectivity between the STS and auditory cortex plays a modulatory role in spatially localizing auditory stimuli in the presence of imagined visual stimuli.
Task design. Each trial was preceded by 12 s of fixation, followed by instructions and a countdown. The instructions informed the participants that they should imagine the circle on that trial, and the countdown cued the participants to the timing and location (left or right; right in the above example trial) of the to-be-imagined visual stimulus. Following the countdown, the participants imagined the brief appearance (100 ms) of the visual stimulus once per second for 12 s while a brief (100 ms) auditory stimulus was presented in synchrony or asynchrony (i.e., 500 ms following the onset of the first imagined visual stimulus). Participants indicated whether they perceived the sound to come from the left, center, or right of fixation after each 12 s trial.
Imagery-induced ventriloquism. A, Behavioral results obtained in the scanner revealed a stronger ventriloquist effect when auditory stimuli were presented synchronously with imagined visual stimuli (AVi sync.) compared with asynchronously (AVi async.). B, The same effect was found for real visual stimuli presented synchronously with an auditory stimulus (AV sync.) compared with asynchronously (AV async.) during functional localizer scans. Error bars denote ±1 SEM; asterisks between bars indicate significance (*ps ≤ 0.05).
Neural basis of imagery-induced ventriloquism. A, Activity associated with audiovisual synchrony (vs asynchrony) for imagined visual stimuli within the functionally defined multisensory regions of interest (fROIs outlined in white) overlaid on a representative inflated cortical surface (left); coronal and sagittal sections displaying the peak activation in the L. STS are overlaid on the average normalized anatomical image from our participants (right). The activity differences observed in the parietal cortex were the result of deactivations, i.e., less activity for synchronous auditory and imagined visual stimuli compared with the resting baseline (see Results). B, Bar plot shows the parameter estimates from the significant peak of activation in the L. STS; error bars denote ±1 SEM. C, Post hoc multiple-regression analysis demonstrating that the activity in the L. STS in the AVi synchronous [AVi sync.; vs AVi asynchronous (AVi async.)] condition could be predicted by the strength (i.e., the difference of the AVi synchrony and AVi asynchrony ventriloquism indices) of the imagery-induced ventriloquist effect. D, Significantly enhanced connectivity between the right auditory cortex and the L. STS seed region for the AVi synchronous (vs AVi asynchronous) condition overlaid on a representative inflated cortical surface (bottom left). A yellow circle marks the approximate location of the L. STS seed on an inflated left hemisphere cortical surface (top left; fROIs outlined in white). Coronal and axial sections displaying the peak connectivity to the right auditory cortex are overlaid on the average anatomical image from our participants (right). E, Plot of the PPI for one representative subject showing a steeper regression slope relating L. STS activity to the response magnitude of the right (R.) auditory cortex during the AVi synchrony (AVi sync., green) compared with the AVi asynchrony condition (AVi async., black). F, Post hoc multiple-regression analysis demonstrating that effective connectivity from the L. STS to the right (R.) auditory cortex in the AVi synchrony (vs AVi asynchrony) condition could be predicted by the strength of the imagery-induced ventriloquist effect. Activation maps are displayed at p-uncorrected < 0.01 for display purposes only.
Effective connectivity during perceptual functional localizer scans. A, Significantly enhanced connectivity between the left auditory cortex (−51, −27, 6; t(21) = 3.74, p-FWE-corrected < 0.05) and the L. STS seed region for the AV synchrony (vs AV asynchrony) condition overlaid on a representative inflated cortical surface (left) and in coronal (top right) and axial (bottom right) sections of the average anatomical image from our participants. A yellow circle marks the approximate location of the L. STS seed. B, Plot of the PPI for one representative subject showing a steeper regression slope relating L. STS activity with the response magnitude of the left (L.) auditory cortex during the AV synchrony (AV sync., blue) compared with the AV asynchrony condition (AV async., black). C, Significantly (63, −25, 9; t(21) = 3.65, p-FWE-corrected < 0.05) enhanced connectivity between the right (R.) auditory cortex and the L. STS seed region for the AV synchrony (vs AV asynchrony) condition overlaid on a representative inflated cortical surface (left) and in coronal (top right) and axial (bottom right) sections of the average anatomical image from our participants. D, Plot of the PPI for one representative subject showing a steeper regression slope relating L. STS activity with the response magnitude of the right (R.) auditory cortex during the AV synchrony (AV sync., blue) compared with the AV asynchrony condition (AV async., black). Activation maps are displayed at p-uncorrected < 0.005 for display purposes.
When Imagination Changes How We Sense
In this study, published in Current Biology, we explored the ways in which imagining something in one sensory modality (e.g., vision) can change our perception in a different sensory modality.
Paper Title: Mental Imagery Changes Multisensory Perception
Paper Abstract: Multisensory interactions are the norm in perception, and an abundance of research on the interaction and integration of the senses has demonstrated the importance of combining sensory information from different modalities on our perception of the external world [1–9]. However, although research on mental imagery has revealed a great deal of functional and neuroanatomical overlap between imagery and perception, this line of research has primarily focused on similarities within a particular modality [10–16] and has yet to address whether imagery is capable of leading to multisensory integration. Here, we devised novel versions of classic multisensory paradigms to systematically examine whether imagery is capable of integrating with perceptual stimuli to induce multisensory illusions. We found that imagining an auditory stimulus at the moment two moving objects met promoted an illusory bounce percept, as in the classic cross-bounce illusion; an imagined visual stimulus led to the translocation of sound toward the imagined stimulus, as in the classic ventriloquist illusion; and auditory imagery of speech stimuli led to a promotion of an illusory speech percept in a modified version of the McGurk illusion. Our findings provide support for perceptually based theories of imagery and suggest that neuronal signals produced by imagined stimuli can integrate with signals generated by real stimuli of a different sensory modality to create robust multisensory percepts. These findings advance our understanding of the relationship between imagery and perception and provide new opportunities for investigating how the brain distinguishes between endogenous and exogenous sensory events.
Altered Motion Perception from Imagery. (A) Proportion of perceived bounce when no sound was imagined and when sounds were imagined before, at, and after the moment of coincidence. The timing of the imagined sound led to significant alterations in the perception of motion [F(3, 19) = 9.78, p < 0.001]. Planned comparisons revealed a significant promotion of the perception of bouncing when the sound was imagined at the moment of coincidence compared to the passive viewing condition [t(21) = 3.77, p = 0.001], whereas imagination of the sound before [t(21) = 1.31, not significant (n.s.)] or after [t(21) = –0.12, n.s.] coincidence did not. See the Supplemental Results for the results of an additional condition containing tactile imagery (data not shown). (B) Proportion of perceived bounce for both the imagined and real sound at coincidence, finger lift at coincidence, and view-only conditions. Planned comparisons revealed that imagination of a sound at the moment of coincidence significantly promoted the perception of bounce compared to the passive viewing [t(11) = 5.26, p < 0.001] and imagined finger lift [t(11) = 3.35, p = 0.007] conditions, whereas imagination of the finger lift [t(11) = 1.77, n.s.] or a real finger lift [t(11) = 0.54, n.s.] did not. Error bars represent ± SEM. Asterisks between bars indicate significance (**p < 0.01) for planned comparisons.
Ventriloquism Effect from Visual Imagery. (A) Percent visual bias for stimuli presented at disparities of 15° and 30° for both imagined (left) and real (right) visual stimuli. Significant biases of perceived sound location toward the imagined visual stimuli were observed for disparities of 15° [t(20) = 4.11, p = 0.001, single-sample t test (test value = 0)] and 30° [t(20) = 18.39, p < 0.001, single-sample t test (test value = 0)], with a stronger bias for disparities of 30° compared to 15° [t(20) = 2.05, p = 0.05, paired-samples t test]. Significant visual biases were also observed for 15° [t(20) = 10.38, p < 0.001, single-sample t test (test value = 0)] and 30° [t(20) = 27.87, p < 0.001, single-sample t test (test value = 0)] disparities, and a stronger bias for disparities of 30° compared to 15° [t(20) = 2.76, p = 0.012, paired-samples t test] was observed in the perceptual version of the experiment. (B) Multisensory enhancement of congruent, 15° incongruent, and 30° incongruent audiovisual stimuli for imagined (left) and real (right) stimuli. Multisensory enhancement of auditory perception was observed when the participant imagined a visual stimulus in the same location as an auditory stimulus [t(20) = −3.63, p = 0.002, single-sample t test (test value = 0)]. Correspondingly, a significant multisensory enhancement of auditory perception was also observed for a real visual stimulus presented in the same location as an auditory stimulus [t(20) = −8.20, p < 0.001]. (C) Mean location, in degrees, of the first eight response reversals on each staircase for the imagine-circle and no-circle conditions. (D) The average distance (in degrees) between the left and right staircases was significantly larger in the imagine-circle condition than in the no-circle condition [t(17) = 2.39, p = 0.029]. Error bars denote ± SEM. Asterisks above bars indicate significance (**p < 0.01) for single-sample tests. Asterisks between bars indicate significance (*p < 0.05) for paired-sample tests.
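Panel C refers to estimating a localization threshold as the mean of the first eight response reversals on a staircase. A minimal sketch of that computation follows (illustrative only; the function, the reversal convention, and the sample track are assumptions, not the paper's analysis code):

```python
def reversal_mean(locations, n_reversals=8):
    """Estimate a staircase threshold as the mean of the first
    n_reversals turning points (locations where the track changed
    direction). `locations` is the stimulus value on each trial."""
    reversals = []
    direction = 0  # +1 ascending, -1 descending, 0 not yet determined
    for i in range(1, len(locations)):
        step = locations[i] - locations[i - 1]
        if step == 0:
            continue
        new_dir = 1 if step > 0 else -1
        if direction != 0 and new_dir != direction:
            reversals.append(locations[i - 1])  # the turning point
        direction = new_dir
    if not reversals:
        raise ValueError("staircase never reversed")
    used = reversals[:n_reversals]
    return sum(used) / len(used)

# Hypothetical staircase track (degrees of azimuth, 2-deg steps)
track = [10, 8, 6, 4, 6, 4, 2, 4, 2, 4, 6, 4, 2, 4, 2, 4, 2, 4]
print(f"threshold estimate = {reversal_mean(track):.2f} deg")
```

Averaging reversal points is a common convergence estimate for adaptive staircases: once the track oscillates around the threshold, the turning points bracket it from above and below.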
Auditory Imagery-Induced McGurk Effect. Proportion of perceived illusory ‘‘da’’ when the participants imagined hearing ‘‘ba,’’ imagined hearing ‘‘ka,’’ or passively viewed silent videos of a person saying ‘‘ga’’ for McGurk illusion perceivers and non-perceivers. A significant 3 (imagine ‘‘ba,’’ imagine ‘‘ka,’’ view only) × 2 (perceivers, non-perceivers) interaction was observed [F(2, 21) = 5.18, p = 0.01]. A significant increase in the illusory ‘‘da’’ percept was observed in the imagine-‘‘ba’’ condition for perceivers compared to non-perceivers [t(21) = 2.73, p = 0.012]. No significant differences were observed between perceivers and non-perceivers for the imagine-‘‘ka’’ [t(21) = −0.93, n.s.] or view-only [t(21) = −0.52, n.s.] conditions. Error bars denote ± SEM. The asterisk indicates significance (*p < 0.016) for independent-sample tests.