
NeuroImage 82 (2013) 295–305


Emotion modulates activity in the ‘what’ but not ‘where’ auditory processing pathway

James H. Kryklywy d,e, Ewan A. Macpherson c,f, Steven G. Greening b,e, Derek G.V. Mitchell a,b,d,e,⁎

a Department of Psychiatry, University of Western Ontario, London, Ontario N6A 5A5, Canada
b Department of Anatomy & Cell Biology, University of Western Ontario, London, Ontario N6A 5A5, Canada
c National Centre for Audiology, University of Western Ontario, London, Ontario N6A 5A5, Canada
d Graduate Program in Neuroscience, University of Western Ontario, London, Ontario N6A 5A5, Canada
e Brain and Mind Institute, University of Western Ontario, London, Ontario N6A 5A5, Canada
f School of Communication Sciences and Disorders, University of Western Ontario, London, Ontario N6A 5A5, Canada

⁎ Corresponding author at: Brain and Mind Institute, University of Western Ontario, London, Ontario N6A 5B7, Canada. Fax: +1 519 663 3935.
E-mail address: [email protected] (D.G.V. Mitchell).
1053-8119/$ – see front matter © 2013 Elsevier Inc. All rights reserved.
http://dx.doi.org/10.1016/j.neuroimage.2013.05.051

Article info

Article history:
Accepted 8 May 2013
Available online 24 May 2013

Keywords:
Auditory localization
Emotion
Auditory processing pathways
Dual processing pathways
Virtual environment
fMRI

Abstract

Auditory cortices can be separated into dissociable processing pathways similar to those observed in the visual domain. Emotional stimuli elicit enhanced neural activation within sensory cortices when compared to neutral stimuli. This effect is particularly notable in the ventral visual stream. Little is known, however, about how emotion interacts with dorsal processing streams, and essentially nothing is known about the impact of emotion on auditory stimulus localization. In the current study, we used fMRI in concert with individualized auditory virtual environments to investigate the effect of emotion during an auditory stimulus localization task. Surprisingly, participants were significantly slower to localize emotional relative to neutral sounds. A separate localizer scan was performed to isolate neural regions sensitive to stimulus location independent of emotion. When applied to the main experimental task, a significant main effect of location, but not emotion, was found in this ROI. A whole-brain analysis of the data revealed that posterior-medial regions of auditory cortex were modulated by sound location; however, additional anterior-lateral areas of auditory cortex demonstrated enhanced neural activity to emotional compared to neutral stimuli. The latter region resembled areas described in dual pathway models of auditory processing as the ‘what’ processing stream, prompting a follow-up task to generate an identity-sensitive ROI (the ‘what’ pathway) independent of location and emotion. Within this region, significant main effects of location and emotion were identified, as well as a significant interaction. These results suggest that emotion modulates activity in the ‘what,’ but not the ‘where,’ auditory processing pathway.

© 2013 Elsevier Inc. All rights reserved.

Introduction

The ability to interact effectively in an environment requires the accurate recognition and localization of surrounding objects and the capacity to prioritize these objects for behavior. One characteristic known to modulate this is the emotional nature of the stimuli (Adolphs, 2008; Lang and Davis, 2006; Pessoa and Ungerleider, 2004; Vuilleumier, 2005). Considerable evidence suggests that emotional visual stimuli gain rapid and often preferential access to the brain's processing resources.

At the behavioral level, emotional visual stimuli are detected faster than neutral stimuli (Graves et al., 1981), are more likely to enter into awareness (Amting et al., 2010; Mitchell and Greening, 2012) and can exert significantly greater influence on task-relevant behaviors (Mitchell et al., 2008; Vuilleumier and Driver, 2007). These effects are thought to be conferred by enhanced sensory processing; thus, in the visual domain, emotional stimuli elicit greater activity than similar neutral stimuli within areas of the visual cortex (Morris et al., 1998; Vuilleumier and Driver, 2007). Just like in the visual domain, studies of auditory processing have demonstrated that the analysis of emotional auditory stimuli occurs rapidly (Goydke et al., 2004; Sauter and Eimer, 2009) and is associated with enhanced activity in sensory (i.e., auditory) cortices (Fecteau et al., 2007; Viinikainen et al., 2012). Despite some emerging work concerning the influence of emotion on the representation of auditory objects, essentially nothing is known about how emotion influences auditory stimulus localization.

There is accumulating evidence that auditory processing occurs within two separate cortical streams (Ahveninen et al., 2006; Alain et al., 2001; Barrett and Hall, 2006; Clarke et al., 2002; Lomber and Malhotra, 2008; Mathiak et al., 2007; Rauschecker, 2012; Rauschecker and Tian, 2000) that may share some similarities with the well-established dorsal and ventral processing streams of the visual system (Haxby et al., 1991; Milner and Goodale, 1993). Spatial cues used for localization are processed primarily in posterior-medial regions of the auditory cortex (Arnott et al., 2004; Bushara et al., 1999; Lomber et al., 2007) including the posterior superior temporal gyrus (STG) and the transverse temporal gyrus. In contrast, sound identity cues, including pitch and language features, are processed in anterior-lateral regions of auditory cortex along the anterior STG (Altmann et al., 2008; Warren and Griffiths, 2003). However, despite continuous advances toward understanding the neural mechanisms underlying both enhanced representation of emotion within sensory cortices and our representations of auditory space, the impact of emotion during auditory localization remains unknown. Specifically, it remains unclear whether evidence of enhanced activity observed in prior studies to emotional relative to neutral, non-spatialized auditory stimuli (Fecteau et al., 2007; Viinikainen et al., 2012) would also translate into enhanced auditory stimulus localization and enhanced activity in areas of auditory cortex sensitive to object location.

The potential of auditory virtual environments (AVEs) as a method to examine neural pathways associated with auditory stimulus localization has been described in previous studies (Bohil et al., 2011; Fujiki et al., 2002; Langendijk and Bronkhorst, 2000; Wightman and Kistler, 1989a,b). Previous neuroimaging studies investigating auditory localization have created AVEs using generic head-related transfer functions (HRTFs) generated from measurements of mannequins or a prototypical head shape (Ahveninen et al., 2006; Bushara et al., 1999; Krumbholz et al., 2009). These, however, fail to accommodate individual differences in head size and pinna structure that alter sound as it enters the ear canals, resulting in imperfect perception of spatialized sounds (Middlebrooks et al., 2000; Wenzel et al., 1993). Such variables have been shown to influence reactions to and ratings of emotional auditory stimuli (Vastfjall, 2003). Despite its potential importance, we are not aware of any neuroimaging studies utilizing unique AVEs created from individualized HRTFs.

In the present study, we investigated whether the emotion-related enhancements observed in the visual domain at the behavioral (Amting et al., 2010; Graves et al., 1981) and neural levels (Morris et al., 1998; Vuilleumier and Driver, 2007) would also be found during auditory stimulus localization. We hypothesized that positive and negative auditory cues would receive prioritized processing relative to neutral stimuli. We predicted that this prioritization would be reflected by increased accuracy, decreased reaction time, and increased neural activity within the posterior-medial ‘where’ pathways of auditory processing during the localization of emotional compared to neutral sounds. However, as will be described below, the data did not fit this prediction, and instead we found slower response times for emotional auditory stimuli compared to neutral ones. Additionally, consistent with previous studies involving non-spatialized emotional auditory cues (Fecteau et al., 2007), we predicted that anterior-lateral areas of auditory cortex (i.e., the putative ‘what’ processing pathway) would also show enhanced activity for emotional compared to neutral sounds. Furthermore, in light of lesion data suggesting that the what/where pathways are doubly dissociable (Lomber and Malhotra, 2008), we predicted that anterior-lateral regions of auditory cortex would not be modulated by sound location.

To test these predictions, we created AVEs by generating sounds based on each individual's unique HRTFs. While undergoing fMRI, participants located or identified a series of auditory stimuli presented in these virtual environments. The current study consisted of three related tasks. Task 1 was designed as a functional localizer, aimed at independently identifying ROIs specifically related to sound localization while controlling for object identity. Task 2 was conducted in the same scanning session as Task 1. In this task, participants were required to identify the source locations of positive, negative and neutral sounds presented within a virtual auditory environment. This task served two purposes. First, the ‘where’ ROI derived from the Task 1 localizer was applied to the data in Task 2 and interrogated to determine potential effects of emotion on location-sensitive areas of auditory cortex. Second, Task 2 allowed us to perform an exploratory whole-brain analysis examining the effects of, and interactions between, emotion and location during auditory stimulus localization. Contrary to expectations, the results showed that emotion did not modulate regions of auditory cortex sensitive to location. However, a distinct region of anterior-lateral temporal cortex identified in this exploratory study was modulated by emotion. This area strongly resembled regions associated with sound-identity processing in previous studies (i.e., the putative ‘what’ pathway; Barrett and Hall, 2006; Warren and Griffiths, 2003). To help determine the degree to which this area could be characterized as part of the ‘what’ auditory pathway, a follow-up localizer was conducted in a subset of participants in a subsequent session. This functional ‘what’ pathway localizer identified ROIs that were modulated by sound identity while location and emotion were held constant. The resulting ROI was extracted and applied to the data generated from Task 2, allowing us to independently test the effects of emotion on the resulting ‘what’ pathway.

Methods

Subjects

Eighteen healthy human subjects (9 male, 9 female) with a mean age of 23.56 (range 19–35, SD 4.51) completed Tasks 1 and 2. All subjects granted informed consent and were in good mental health, as assessed by a Structured Clinical Interview for DSM-IV (Diagnostic and Statistical Manual of Mental Disorders, 4th Edition). All subjects had normal hearing, normal or corrected-to-normal vision and were fluent English speakers. Ten of these subjects (5 male, 5 female), with a mean age of 24.3 (range 19–35, SD 5.42), returned to complete Task 3. All participants were reimbursed for their time at the end of the study. All experiments were approved by the Health Science Research Ethics Board at the University of Western Ontario.

Stimuli and apparatus

Stimuli

Twelve sounds were chosen from the International Affective Digitized Sound (IADS) stimuli set that were of a neutral, negative or positive affective nature as defined by standardized ratings (Bradley and Lang, 1999). Each stimulus category contained two single-source non-verbal human vocalizations, one multi-source non-verbal human vocalization, and one non-human sound. All sounds were presented with a variable duration of 2000–3000 ms (balanced across stimuli; variable durations were used to facilitate deconvolution of the BOLD signal). Importantly, all stimuli were matched for their onset amplitude and root-mean-square amplitude, which ensures that the power and energy were consistent. Positive and negative stimuli were balanced for arousal ratings (mean positive = 6.58, mean negative = 6.92) and valence levels (positive = 7.56, negative = 2.36, absolute neutral = 5). In addition, to create a novel unrecognizable noise for Task 1, the 12 task sounds of Task 2 were merged into a single audio file, segmented into <3 ms fragments, and subsequently scrambled, reconstituted and cropped to a duration of 15,000 ms. This sound maintains the average long-term power spectrum of the stimulus set of Task 2, while remaining unidentifiable.
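For concreteness, the scrambling procedure just described can be sketched as follows. This is a minimal illustration rather than the authors' actual pipeline: the function name, the exact sub-3 ms fragment length, the random seed, and the use of NumPy arrays for the audio are all assumptions.

```python
import numpy as np

def scramble_stimuli(sounds, fs=44100, frag_ms=2.5, out_ms=15000, seed=0):
    """Merge stimuli into one stream, cut it into sub-3 ms fragments,
    shuffle the fragments, and crop the result to 15,000 ms. The output
    approximately preserves the set's long-term power spectrum while
    destroying identifiability."""
    rng = np.random.default_rng(seed)
    merged = np.concatenate(sounds)                 # single audio file
    frag_len = max(1, int(fs * frag_ms / 1000))     # samples per fragment
    n_frag = len(merged) // frag_len
    frags = merged[:n_frag * frag_len].reshape(n_frag, frag_len).copy()
    rng.shuffle(frags)                              # scramble fragment order
    return frags.reshape(-1)[:int(fs * out_ms / 1000)]
```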

In order to localize neural regions that were sensitive to stimulus identity, a novel set of nine neutral sounds was chosen from the IADS (mean valence 5.28, SD 0.98) for use in Task 3. These nine sounds were human, animal, or machine in origin (3 of each). An additional three segments of scrambled noise (identical to that used in Task 1) comprised a fourth sound class. All sounds in this set were 5000 ms in duration, and were matched for onset amplitude and root-mean-square amplitude.

Auditory virtual environment

Throughout the experiment, all sounds were presented within an auditory virtual environment through Sensimetrics MRI-Compatible Insert Earphones. Volume was controlled by a Dayton Audio Class T Digital Amplifier. Initial volume was set to 88–90 dB and adjusted slightly to the comfort level of each individual participant. To induce the perceptual experience of spatialized sounds using insert-style headphones, HRTFs were measured individually for each subject.

To obtain the HRTF measurements, miniature electret microphones (Knowles FG3629) were mounted facing outwards in foam earplugs inserted flush with the ear canal entrances. The participant stood on an adjustable platform which positioned his or her head at the height of a 1.5-m radius array of 16 loudspeakers (Tannoy i5 AW) spaced at 22.5-degree intervals around the listener. The array was located in a large hemi-anechoic chamber, and the floor area within the array was covered with acoustical foam to attenuate reflections. The impulse response from each loudspeaker to each ear microphone was measured using the averaged response to 32 periods of a 2047-point maximum-length sequence signal (Rife and Vanderkooy, 1989) presented at a sampling rate of 48,828 Hz via a Tucker Davis Technologies RX6 real-time processor and QSC CX168 power amplifiers. During the measurement procedure, head motion was minimized by monitoring head position with an electromagnetic tracker (Polhemus FASTRAK) while participants were asked to aim the light from a head-mounted LED accurately at a frontal target position. To correct for individual loudspeaker characteristics, each HRTF measurement was equalized in the frequency domain by dividing by the appropriate loudspeaker transfer function measured with a reference microphone (Bruel & Kjaer 4189) placed at the center of the array in the absence of the listener's head. The impulse responses were windowed in post-processing to remove any residual reflections. The resulting HRTF measurements were resampled to 44.1 kHz and combined with the equalization filters for the headphones supplied by the manufacturer to create a new set of auditory filters limited to the 10-kHz bandwidth of the headphones. These individualized filters were then applied to each sound by time-domain convolution to create the experience of a virtual 3-dimensional auditory space; sounds presented through headphones were perceived to originate from controlled physical locations external to the participants.
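The loudspeaker-equalization step can be made concrete with a short sketch. This is an illustrative reconstruction under stated assumptions: the function name, FFT length, and regularization constant are ours, and the authors' actual chain (including windowing and resampling) is described only in the text above.

```python
import numpy as np

def equalize_hrir(hrir_raw, speaker_ir, n_fft=4096, eps=1e-8):
    """Divide the measured HRTF by the loudspeaker transfer function
    (measured with a reference microphone at the array center) in the
    frequency domain, then return to the time domain."""
    H = np.fft.rfft(hrir_raw, n_fft)     # measured ear response
    S = np.fft.rfft(speaker_ir, n_fft)   # loudspeaker response
    return np.fft.irfft(H / (S + eps), n_fft)  # equalized impulse response
```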

Sounds for Tasks 1 and 2 were spatialized to four locations along the horizontal plane (−90°, −22.5°, 22.5° and 90° from the sagittal midline; negative = left). Sounds for Task 3, along with an additional 500-ms artificially generated white noise, were spatialized to a location directly along the sagittal midline, in front of the listener. The entire set of stimuli (across locations, sound sources, and listeners) was normalized such that each stimulus had the same root-mean-square level computed on the concatenated left- and right-ear signals.
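The rendering and level-normalization steps reduce to a convolution and an RMS rescaling, sketched below. Function names, the stereo layout, and the target level are illustrative assumptions; scipy.signal.fftconvolve is used purely for speed and is mathematically equivalent to the time-domain convolution described above.

```python
import numpy as np
from scipy.signal import fftconvolve

def spatialize(mono, hrir_left, hrir_right):
    """Render a mono stimulus at the location encoded by one
    individually measured HRIR pair."""
    return np.stack([fftconvolve(mono, hrir_left),
                     fftconvolve(mono, hrir_right)], axis=1)  # (samples, 2)

def normalize_rms(binaural, target_rms=0.05):
    """Match RMS computed on the concatenated left- and right-ear signals."""
    both = np.concatenate([binaural[:, 0], binaural[:, 1]])
    return binaural * (target_rms / np.sqrt(np.mean(both ** 2)))
```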

Procedure

Task 1: isolating location-sensitive regions of auditory cortex (the ‘where’ pathway)

Prior to entering the scanner, participants were acclimatized to the auditory virtual environment by completing modified versions of Task 1 and Task 2. For a detailed review of these training tasks, see Supplementary materials. In Task 1, 18 participants (9 female, 9 male) performed a pure auditory localization task to help identify neural regions involved in representing auditory space. This task was designed as a functional localizer allowing us to generate an ROI corresponding to the ‘where’ pathway of auditory processing. Participants were instructed to close their eyes and listen to a series of 15,000-ms scrambled sounds. Each sound was presented in one of the four virtual locations (8 times per location for a total of 32 events broken into 2 runs). Participants were instructed to actively fixate on the sound location for the duration of its presentation, and to indicate with a button press the location of the perceived source during the sound presentation. Between each sound presentation, there was a 15,000-ms period of silence. The order of presented sound locations was randomized for each run.

Task 2: locating emotional sounds within auditory space

In Task 2, the same 18 participants (9 female, 9 male) completed an emotional auditory localization task. Participants were asked to close their eyes for the duration of each scan to reduce possible confounds of visual feedback. Each trial began with a white noise burst (500 ms) spatialized to a location along the sagittal midline, in front of the participant. This acted as an ‘auditory fixation cross’ to reorient the subject's attention to a standard central point and was immediately followed by a spatialized target sound (2000–3000 ms) and a period of silence (2000–3000 ms). The target was randomly presented in one of the four possible locations. The participant was required to locate the target sound with a button press indicating the perceived location as quickly and accurately as possible. Over the course of a single run, participants heard each of the 12 sounds in every location a single time for a total of 48 trials (16 trials per emotion condition). Additionally, there were 16 silent ‘jitter’ trials incorporated into each run, for a total of 64 trials per run. All trials were presented in a random order within each run. The task run was repeated six times for a total of 384 trials.

Task 3: isolating identity-sensitive regions of auditory cortex (the ‘what’ pathway)

Task 3 was conducted during a follow-up scanning session in a subset of the original sample to identify auditory areas associated with auditory object identification. This additional functional localizer scan was initiated in response to the results of Tasks 1 and 2, outlined below, in order to better explore the potential doubly dissociable effects of emotion on the auditory ‘what’ and ‘where’ processing pathways. Specifically, it was designed to independently derive functional ROIs corresponding to the ‘what’ pathway of auditory processing. This type of cross-validation, using separate sources for region identification and signal estimation, avoids problems of circularity in interpreting neuroimaging results (Vul and Pashler, 2012), allowing us to interrogate the ROIs over the time course of Task 2 without fear of statistical bias (Esterman et al., 2010; Vul and Kanwisher, 2010). In the scanner, 10 participants (5 female, 5 male) performed a block-design auditory identification task similar to that used in Task 1. During this task, each 15,000-ms sound presentation consisted of a triad of stimuli belonging to a single object class presented directly in front of the listener along the mid-sagittal plane within the AVE. There were four possible object classes: human, animal, machine and scrambled sounds. Over the course of two runs, each object class was presented 8 times, for a total of 32 events. Participants were instructed to actively attend to the identity of the sound objects for the duration of their presentation, and to indicate the identity of the presented sounds (human, animal, machine or scrambled) via button press. Between the presentations of each triad, there was a 15,000-ms period of silence. Presentation order was randomized.

Behavioral data analysis

To investigate the possible effect of emotional content on sound localization, we recorded reaction times and accuracy levels for the duration of Task 2. A 4 (location) × 3 (emotion) ANOVA was conducted for each of the behavioral measures. Follow-up pair-wise t-tests were performed to delineate the nature of any significant effects.
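The paper does not specify the statistics software used; as one concrete (assumed) implementation, a 4 × 3 repeated-measures ANOVA on per-condition reaction-time means can be run in Python with statsmodels. The data frame layout and function name are ours.

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# df holds one row per subject x location x emotion cell, with columns
# 'subject', 'location' (-90, -22.5, 22.5, 90), 'emotion'
# (positive/negative/neutral), and 'rt' (mean RT over correct trials).
def rt_anova(df: pd.DataFrame):
    res = AnovaRM(data=df, depvar='rt', subject='subject',
                  within=['location', 'emotion']).fit()
    print(res)  # F and p for location, emotion, and their interaction
    return res
```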

Imaging

MRI data acquisition

Subjects were scanned during all task performances using a 3 T Siemens scanner with a 32-channel head coil. fMRI images were taken with a T2*-gradient echo-planar imaging sequence (repetition time [TR] = 2500 ms, echo time [TE] = 36 ms; field of view [FOV] = 18.8 cm, 78 × 78 matrix). Tasks 1 and 2 took place over a single scan session. Task 3 took place in a separate session. For all functional runs, complete brain coverage was obtained with 38 interleaved slices of 2.4 × 2.4 mm in plane with a slice thickness of 2.4 mm, forming 2.4 mm isovoxels. A series of 148 functional images were collected for each run of Task 2 and a series of 202 functional images were collected for each run during Tasks 1 and 3. A high-resolution T1-weighted anatomical scan was obtained covering the whole brain (TR = 2300 ms, TE = 4.25 ms; FOV = 25.6 cm, 192 axial slices; voxel size = 1 mm isovoxels; 256 × 256 matrix) in both scan sessions.

Table 1
Results of Experiment 1; accuracy and time to localization for all regressor conditions.

Emotion^a   Location   Time to localization (ms; correct trials only)   Localization accuracy (%)
Positive    −90°       775.1 (242.2)                                    83.8 (19.1)
            −22.5°     756.2 (199.2)                                    85.9 (16.6)
            22.5°      804.5 (199.4)                                    88.2 (14.1)
            90°        808.5 (251.9)                                    82.2 (20.0)
Negative    −90°       736.6 (210.0)                                    87.0 (15.3)
            −22.5°     776.2 (218.2)                                    84.3 (15.4)
            22.5°      779.9 (201.0)                                    83.1 (18.1)
            90°        733.6 (209.1)                                    84.7 (17.8)
Neutral     −90°       686.5 (153.3)                                    89.4 (12.0)
            −22.5°     736.5 (194.7)                                    85.6 (20.9)
            22.5°      751.7 (218.4)                                    88.2 (17.5)
            90°        727.5 (224.6)                                    85.9 (18.5)

Standard deviations are in parentheses.
^a Significant effects of emotion were identified in the reaction time data. These effects of emotion were driven by significantly slower localization of positive and negative stimuli when compared to neutral stimuli (p < 0.001 and p < 0.05, respectively).

fMRI analysis

An analysis of the fMRI data was conducted using Analysis of Functional NeuroImages (AFNI) software (Cox, 1996) at both the individual and group levels. Motion correction was performed by registering all volumes to the relevant functional volume acquired temporally adjacent to the anatomical scan. The data set for each participant was spatially smoothed (using an isotropic 4 mm Gaussian kernel) to reduce the influence of individual differences. The time series data were normalized by dividing the signal intensity of a voxel at each time point by the mean signal intensity of that voxel for each run and multiplying the result by 100. Thus, resultant regression coefficients represent the percent signal change from the mean activity. Regressor files modeling the presentation time course of relevant stimuli were created for each of the 12 conditions of Task 2 (4 locations × 3 emotions) during correct trials only, and for each of the four conditions of Task 1 (4 locations) and Task 3 (4 classes of objects). The relevant hemodynamic response function was fit to each regressor to perform linear regression modeling. To account for voxel-wise correlated drifting, a baseline plus linear drift and quadratic trend were modeled for each time series. This resulted in a β coefficient and t value for each voxel and regressor. To facilitate group analysis, each individual's data were transformed into the standard space of Talairach and Tournoux. Following this, a 4 (location) × 3 (emotion) ANOVA was performed on the imaging data from Task 2, while two separate one-way ANOVAs were performed on the imaging data from Tasks 1 and 3 to examine the effects of location and sound identity respectively (4 levels of each). ANOVAs at this level were conducted using the AFNI function GroupAna, yielding an F value for each main effect and interaction at each voxel. Percent signal change from the mean activity was extracted from significant clusters of activation for each relevant regressor using the AFNI function 3Dmaskave. To correct for multiple comparisons, a spatial clustering operation was performed using AlphaSim with 1000 Monte Carlo iterations on the whole-brain EPI matrix.

Table 2
Neural regions modulated by location, emotion and identity within the auditory domain.

Effect                               R/L   Location              BA            X     Y    Z    Vol. (mm³)
‘Where’ ROI^a                        R     STG, TTG              41/22/13      44   −22   12   9342
                                     L     STG, TTG              41/22/13     −44   −25   10   3807
Whole brain analysis: location^a     R     STG, TTG, MTG         42/41/39/37   48   −27   12   22,113
                                     L     STG, TTG              42/41        −48   −23   10   10,908
                                     R     IPL, precuneus        7             17   −68   41   5319
                                     R     Precuneus             7             10   −46   47   3186
                                     R     Pre-central gyrus     6             29    −7   52   945
                                     R     Post-central gyrus    4/3           48   −16   52   729
                                     R     IPL                   40            43   −47   50   567
                                     R     Post-central gyrus    3             23   −32   59   513
                                     R     MOG                   19            32   −76   24   459
Whole brain analysis: emotion^b      R     STG                   22/21         55    −8    2   8668
                                     L     STG                   22/21        −54   −11    2   4995
Whole brain analysis: interaction^c  R/L   Precuneus, SPL, CC    31/7          −6   −54   45   4728
                                     R/L   Cuneus, MOG           19/18/17       3   −83    9   1617
                                     R     STG                   42/41/22      60   −16    6   387
‘What’ ROI                           R     STG, TTG              42/41/22      54   −20    8   4266

Significant clusters are thresholded at p < 0.005 (corrected to p < 0.05).
STG = superior temporal gyrus; TTG = transverse temporal gyrus; MTG = middle temporal gyrus; IPL = inferior parietal lobule; MOG = middle occipital gyrus; CC = cingulate cortex.
Note: XYZ are Talairach coordinates and refer to center of mass.
^a Effects of location in both main effects are driven by significantly increased neural activity for sounds presented in the contralateral hemifield. This is consistent in all cortical areas.
^b Effects of emotion are driven by a significantly increased neural response to positive and negative stimuli compared to neutral stimuli.
^c Effects of sound identity were driven by significantly increased neural response to identifiable stimuli (human, animal, machine) rather than scrambled sounds.
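The per-run normalization described above, which lets regression coefficients read directly as percent signal change, amounts to a one-line transform per voxel. A minimal sketch, with the function name our own:

```python
import numpy as np

def percent_signal_change(ts):
    """Scale one voxel's time series for one run so its mean is 100;
    deviations are then in units of % signal change from the run mean."""
    ts = np.asarray(ts, dtype=float)
    return ts / ts.mean() * 100.0
```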

Results

Behavioral results

To delineate the effects of emotion on auditory localization, participants were presented with a series of positive, negative and neutral sounds within a virtual auditory environment while undergoing fMRI. Participants were instructed to localize these sounds as quickly and accurately as possible (Task 2).

A 4 (location) × 3 (emotion) ANOVA was conducted on the reaction time and error data (Table 1) to determine the impact of stimulus emotion on auditory localization at a behavioral level. This yielded a significant main effect of emotion on reaction times (F(2, 34) = 12.617, p < 0.005). We had originally predicted that emotion would enhance the localization of auditory stimuli. However, contrary to these predictions, follow-up t-tests revealed that this effect was driven by a significantly slower localization of emotional sounds (both positive and negative) when compared to neutral sounds (p < 0.001 and p < 0.05, respectively). There was no significant effect of sound location (F(3, 51) = 0.616, p > 0.05) or location × emotion interaction (F(6, 102) = 1.891, p > 0.05) on reaction time. The same analysis applied to the localization error data yielded no significant effects for either sound location (F(3, 51) = 0.158, p > 0.05) or emotion (F(2, 34) = 1.783, p > 0.05), nor did it yield a significant location × emotion interaction (F(6, 102) = 1.320, p > 0.05).

Imaging results

Interrogating location-sensitive regions of auditory cortex (the ‘where’ pathway)

In order to determine the impact of sound location independent of sound identity, location-related activity was assessed during the localization of unrecognizable scrambled sounds (Task 1). A one-way ANOVA (four locations) conducted on the whole-brain EPI data obtained during Task 1 identified a significant effect of sound location (p < 0.005; corrected at p < 0.05) within regions of the temporal cortex (Table 2). This activation included primary auditory cortex (BA 41/42) and extended medially along the transverse temporal gyrus (BA 13; Fig. 1A). This region showed greatest activity to stimuli positioned far in the contralateral hemifield, decreasing activity to midline sounds, and significantly less activity to ipsilaterally positioned sounds (p < 0.005; Fig. 1B). This area was then used as an auditory ‘where’ pathway ROI, and applied to the BOLD data collected in Task 2. The percent signal change to all locations and emotional categories within this ROI was extracted using 3Dmaskave, and subjected to a 4 (location) × 3 (emotion) ANOVA.

As expected, activity within the ‘where’ pathway ROI was significantly modulated by sound location (F(3, 51) = 38.435, p < 0.001), wherein there was increased activity for sounds presented in the contralateral hemifield relative to the ipsilateral hemifield (Fig. 1C). Contrary to our original predictions, however, there was no significant main effect of emotion (F(2, 34) = 0.776, p > 0.05), nor was there a significant location × emotion interaction (F(6, 102) = 1.164, p > 0.05; Fig. 1C). Thus, emotion did not modulate activity in location-sensitive areas of the auditory cortex.

Whole brain analysis: locating emotional sounds in auditory space

An exploratory whole-brain analysis by way of a 4 (location) × 3 (emotion) ANOVA was conducted on the data from Task 2 to identify neural regions that varied as a function of location and emotion. A significant main effect of location (p < 0.005; corrected at p < 0.05) was identified in regions of temporal and parietal cortices (Table 2). This activation extended from the primary auditory cortices (BA 41/42) posteriorly and medially along the transverse temporal gyrus and into the inferior parietal lobule (Fig. 2A). Percent signal change from the mean activity for each significantly activated voxel was extracted from all regions displaying significant main effects or interactions using 3Dmaskave. Contributing to this effect, activity in the posterior superior temporal gyrus (posterior areas of BA 13) varied as a function of location; activity was greatest to stimuli presented in the far contralateral hemifield of the auditory virtual space, and decreased progressively with distance from that location (Fig. 2B). This general pattern of activation emerged bilaterally, with each hemisphere showing an inverse pattern of activation to the other. This result is consistent with studies that describe auditory stimulus location encoding as involving differential activation of units across the two hemispheres, as opposed to local encoding within specific nuclei or regions (Grothe et al., 2010; Stecker et al., 2005). Similar patterns of activity were identified in the areas of precuneus, inferior parietal lobule, pre/post-central gyrus and medial occipital gyrus that exhibited main effects of sound location.

A significant main effect of emotion was identified in bilateral temporal cortex (p < 0.005; corrected at p < 0.05; Table 2). Activation extended from the primary auditory cortices (BA 41/42) anteriorly and inferiorly along the superior temporal gyrus (to BA 22; Fig. 2C). This region showed significantly greater activation bilaterally during the presentation of emotional sounds (positive and negative), when compared to neutral sounds (p < 0.001 and p < 0.005 respectively; Fig. 2D).

A significant interaction was identified within regions of the right auditory cortex (BA 41/42/22; Fig. 3A), bilateral precuneus (BA 31/7; Fig. 3C), and bilateral occipital lobe (BA 17/18/19; Fig. 3E; p < 0.005, corrected at p < 0.05; Table 2; Figs. 3A–F). To identify the nature of these effects, a series of one-way ANOVAs were performed, and where significant (p < 0.05), these were followed up with paired t-tests. First, the impact of location within each emotion was examined. The right auditory cortex showed greater activity to positive stimuli presented in the contralateral relative to ipsilateral hemifield (p < 0.01); however, no such effect was observed for neutral or negative stimuli (p > 0.05). The impact of emotion within each location was also examined. Sounds coming from the contralateral hemifield (i.e., 90° and 22.5° left of the sagittal midline) elicited greater activity in the right auditory cortex for emotional relative to neutral sounds (positive > neutral, p < 0.001; negative > neutral, p < 0.01; positive > negative, p < 0.05). These results are illustrated in Fig. 3B.

Bilateral precuneus and occipital lobe displayed strikingly similar patterns of activation. Within both regions, the impact of location within emotion was significant during the presentation of negative (p < 0.005) but not neutral (p > 0.05) sounds. Within bilateral precuneus, the impact of location during the presentation of positive stimuli was significant (p < 0.05), while in the occipital lobe, it was not (p > 0.05). The impact of emotion within each location for both regions was also examined, yielding similar patterns of activation. Sounds coming from the medial right location (i.e., 22.5° right of the sagittal midline) significantly modulated activity in both precuneus and occipital regions across emotional categories (positive > neutral, p < 0.05; positive > negative, p < 0.005; neutral > negative, p < 0.005). Significant changes in activity as a function of emotion were not found in any of the other locations (p > 0.05). These results are illustrated in Figs. 3D/F. The effects in the visual cortex were unexpected as subjects had their eyes closed during the task; however, it is noteworthy that occipital lobe activity has been associated with mental imagery in the absence of visual input (Borst and Kosslyn, 2008; Kosslyn et al., 1995). The effects may therefore be due to the relative propensity of the different classes of sounds to induce mental imagery.

Interrogating identity-sensitive regions of auditory cortex (the ‘what’ pathway)

The whole-brain analysis (Task 2) identified a separate anterior-lateral area of auditory cortex that was modulated by emotion. This region resembled areas implicated in the putative ‘what’ auditory processing stream (Altmann et al., 2008; Warren and Griffiths, 2003), raising the question of whether emotion has dissociable effects on the ‘what’ versus ‘where’ auditory processing streams. To test this hypothesis, a follow-up localizer scan (Task 3) was performed to extract a sound-identity-sensitive ‘what’ pathway ROI (independent of location and emotion) that could then be applied to the Task 2 data. A one-way ANOVA (with four sound categories) was conducted on the EPI data, revealing object-identity-sensitive areas within the temporal cortex (p < 0.005; corrected at p < 0.05; Table 2). This activation included regions of cortex anterior and lateral to primary auditory cortex along the superior temporal gyrus (BA 22; p < 0.005; corrected at p < 0.05; Fig. 4A). Further exploration of this effect revealed that significantly greater activity was elicited in this area for sounds in the human, animal and machine categories compared to scrambled sounds (p < 0.001; Fig. 4B). In addition, this region showed significantly greater activity to biologically generated sounds (human and animal) relative to machine-generated sounds (p < 0.005 and p < 0.001 respectively).


Fig. 1. Neural regions significantly modulated by sound location (Task 1). (A) Activity in posterior aspects of bilateral superior temporal gyrus is significantly modulated by sound location. (B) These effects of location were parametric, wherein cortical activation increases as a sound is presented more eccentrically in the contralateral hemifield. (C) A location-sensitive ‘where’ ROI generated from Task 1 was applied to the EPI data acquired during Task 2. As expected, a significant main effect of location was identified in this region (F(3, 51) = 38.435, p < 0.001). The nature of this effect was consistent with the parametric location effects reported above. Significant main effects of stimulus emotion were not found within this ROI (F(2, 34) = 0.776, p > 0.05), nor were any location × emotion interaction effects (F(6, 102) = 1.164, p > 0.05). [* p < 0.01; ** p < 0.005; *** p < 0.001].


This identity-related ROI was then applied to the BOLD data collected in Task 2, and the percent signal change was extracted from each mask for all locations and emotional categories using 3Dmaskave, and subjected to a 4 (location) × 3 (emotion) ANOVA. Within the ‘what’ pathway ROI, activity was found to be significantly modulated by both sound location (F(3, 51) = 26.223, p < 0.001) and emotion (F(2, 34) = 8.914, p < 0.005; Fig. 4C). Activity in this region was significantly increased for sounds presented in the contralateral hemifield when compared with sounds presented in the ipsilateral hemifield (p < 0.001). Additionally, this region showed enhanced activation for positive and negative versus neutral stimuli (p < 0.005 and p < 0.05 respectively). Lastly, a significant location × emotion interaction was observed in this region (F(6, 102) = 2.450, p < 0.05).

To identify the nature of the interaction effects in the ‘what’ pathway, a series of one-way ANOVAs were performed. First, the impact of location within each emotion was examined. Within this ROI, there was greater activity for both positive and negative stimuli presented in the contralateral relative to ipsilateral hemifield (p < 0.001 and p < 0.05 respectively); however, no such effect was observed for neutral stimuli (p > 0.05). The interaction was further examined by comparing the impact of emotion within each location; however, no significant effects were identified for this contrast (p > 0.05). This pattern contrasts with that found in the ‘where’ pathway ROI, which featured a significant effect of location, but no significant effect of emotion or location × emotion interaction. Furthermore, the coordinates of the interaction identified during Task 2 are more closely related to the coordinates of the ‘what’ ROI than to those of the ‘where’ ROI, so this effect is not unexpected. Conjunction analyses confirmed that ROIs generated from Task 1 (‘where’ localizer) and Task 3 (‘what’ localizer) overlapped with the main effects of location and emotion respectively in Task 2 (see Supplementary materials). It should be noted that Task 3 involved a subset of the original sample used in Tasks 1 and 2, and therefore had less power to define the ROI. Nevertheless, the fact that this independently derived ROI shows the same functional properties exhibited in the emotion-sensitive regions uncovered in Task 2 (despite reduced power) lends further support to the conclusion that the auditory ‘what’ pathway is modulated by both emotion and location characteristics.

Discussion

Considerable evidence from visual studies suggests that emotional stimuli gain rapid and often preferential access to the brain's processing resources. Although less work has been conducted in the auditory domain, enhanced effects of emotion on auditory cortex have also been observed across multiple investigative techniques, including fMRI (Wiethoff et al., 2008) and NIRS (Plichta et al., 2011). However, essentially nothing is known about how emotion influences auditory stimulus localization. In the current study, we used fMRI in concert with individualized virtual auditory environments to investigate the effect of emotion on sound localization.


Fig. 2. Neural regions demonstrating significant main effects of sound location and emotion in a whole brain analysis (Task 2). (A) Posterior superior temporal gyrus activity was significantly modulated bilaterally as a function of sound location. (B) These effects of location are parametric, wherein cortical activation increases as a sound is presented more eccentrically in the contralateral hemifield. (C) Anterior superior temporal gyrus activity bilaterally was significantly modulated by sound emotion. (D) These effects of emotion are driven by significantly greater activation during the presentation of positive and negative relative to neutral sounds. The whole-brain analysis was conducted at a two-tailed threshold of p < 0.005, and corrected to a family-wise error rate of p < 0.05. [* p < 0.01; ** p < 0.005; *** p < 0.001].


Surprisingly, participants were significantly slower to localize positive and negative sounds relative to neutral ones. Moreover, activity in an independently identified location-sensitive region of auditory cortex was not modulated by emotion. Subsequent whole-brain analyses were conducted on a task involving the localization of emotional and neutral stimuli. This analysis revealed that enhanced activity to positive and negative stimuli was observed in anterior-lateral areas of auditory cortex irrespective of location. In contrast, posterior-medial regions of auditory cortex, as well as the inferior parietal lobule and precuneus, were modulated by location, irrespective of emotion. Both the response of anterior-lateral regions of auditory cortex to sound location and the lack of response of posterior-medial regions of auditory cortex to emotion ran contrary to original predictions. These unexpected results raised the possibility that emotional sounds augment activity in the anterior-lateral ‘what’ pathway, but not the posterior-medial ‘where’ pathway, during auditory localization. To more clearly delineate the functional significance of the divisions identified, we conducted an additional functional localizer scan to independently identify regions of auditory cortex that responded to changes in sound identity (i.e., areas associated with encoding ‘what’ in dual pathway models of auditory processing; Alain et al., 2001; Altmann et al., 2008; Rauschecker and Tian, 2000; Warren and Griffiths, 2003). This functionally derived ROI was then applied to the original study. Collectively, the results support the conclusion that whereas sound location modulates activity within the ‘what’ and ‘where’ functionally derived pathways, emotion modulates neural activity in the ‘what’ but not the ‘where’ pathway.

Regions associated with processing auditory object location

A main effect of location was observed in bilateral temporal cortex (Tasks 1 and 2), precuneus, and inferior parietal cortex (Task 2 only). Of note, the location-related effects were characterized in posterior-medial auditory cortex by greatest activity to stimuli positioned furthest in the contralateral hemifield, decreasing activity to midline sounds, and significantly less activity to ipsilaterally positioned sounds. This general pattern of activation emerged bilaterally, with each hemisphere showing the inverse pattern of activation. ROIs generated from Task 1 confirmed that independently defined location-sensitive regions in STG were not modulated by emotion.

Regions associated with processing auditory object identity

In the current study, areas within STG were implicated in processing both the emotional and object-identity features of auditory stimuli (Tasks 2 and 3). The identity-related effect identified in Task 3 was characterized in anterior-lateral auditory cortex by greater activity to biological relative to non-biological sounds, but did not distinguish between human and non-human sounds. This result appears to contradict previous work that implicates the superior temporal sulcus (STS) specifically in human voice processing (Belin et al., 2000; Ethofer et al., 2012). One critical difference between these studies and the current study is the treatment of non-vocal human sounds (i.e., sounds that do not involve vocal fold vibration).


Fig. 3. Neural regions demonstrating a significant location × emotion interaction in a whole brain analysis (Task 2). (A) Right superior temporal gyrus was significantly modulated as a function of a location × emotion interaction. (B) These effects are driven by an increased response to emotional (both positive and negative) relative to neutral sounds when presented in the contralateral hemifield (p < 0.001 and p < 0.01 respectively). (C/E) A significant location × emotion interaction emerged in bilateral precuneus and occipital lobe. (D/F) Within both the precuneus and occipital cortex, the impact of location within each emotion was significant during the presentation of negative and positive (p < 0.005) but not neutral (p > 0.05) sounds. [* p < 0.01; ** p < 0.005; *** p < 0.001].


Previous work found that contrasting vocal human sounds with non-vocal sounds yielded voice-selective activity in the STS (Belin et al., 2000). Other studies yielding similar effects in STS contrasted human vocal sounds (human non-vocal sounds were excluded) with both animal and environmental sounds (Ethofer et al., 2012). The current study, however, classed all sounds originating from a human source into the ‘human’ category (e.g., clapping and crying were both ‘human’). This regressor was designed to be an easily distinguishable auditory class, and not to represent neural changes associated with vocal processes. As such, it included one vocal and two non-vocal stimuli to eliminate any bias toward the identification of regions displaying voice-specific activation. The neural region observed in the present study may be located in an area slightly superior to that implicated in prior studies of human vocalizations. Interestingly, the use of linguistic utterances may also impact the location of emotion-sensitive regions. For example, Ethofer et al. (2012) demonstrated that an area of STG identified as selective for human voices is also sensitive to the emotional inflection of pseudo-linguistic stimuli. This region, while overlapping with the emotion-sensitive areas in the present study, extends into more posterior-lateral areas of temporal cortex. In future work, it will be interesting to determine whether verbal and non-verbal emotional sounds augment activity in dissociable areas of temporal cortex.

Regions associated with processing both auditory object identity and location

In addition to showing an effect of object identity, areas of right anterior-lateral STG also showed a significant effect of location, and a significant location × emotion interaction. Activity in this region was modulated by location for positive and negative stimuli, but not neutral stimuli. Furthermore, these regions showed increased activity to positive and negative relative to neutral stimuli, but only for those stimuli presented in the contralateral hemifield. Thus, these areas show some evidence of location-related encoding for emotional, but not neutral, stimuli. These findings raise the intriguing possibility that anterior-lateral regions of right auditory cortex may be involved in integrating information about object identity and location for emotionally salient stimuli. They also call into question strict boundaries between spatial and object-identity encoding in human STG. Further work will be required to delineate the underlying neuroanatomy and the stimulus parameters associated with these different processes in humans.

Representation of emotion in a dual pathway model of audition

Considerable evidence supports the suggestion that the visual system comprises separable ventral and dorsal pathways responsible for processing objects for identity and action, respectively (Milner and Goodale, 1998). It has also recently been proposed that auditory cortices can be separated into similar ‘what’ and ‘where’ pathways (Alain et al., 2001; Lomber and Malhotra, 2008; Rauschecker, 2012). In the present study, a functional dissociation between anterior-lateral and posterior-medial divisions of auditory cortices was observed that resembled the boundaries described in some dual-pathway models of auditory processing (Bushara et al., 1999; Lomber


[Fig. 4 image: (A) Task 3 main effect of identity, axial slice z = 10 (R); (B) % signal change in the right-hemisphere identity-sensitive auditory cortex ROI for Animal, Human, Machine, and White Noise sounds; (C) ‘What’ ROI applied to Task 2: % signal change by sound location (-90, -22.5, 22.5, 90 degrees azimuth) and emotion (Positive, Negative, Neutral).]

Fig. 4. Neural regions significantly modulated by sound identity (Task 3). (A) Activity in bilateral superior temporal gyrus was significantly modulated by sound identity. (B) These effects of identity were driven by a significant increase in activation during the presentation of ecologically relevant sounds relative to unrecognizable scrambled sounds, and a significant increase to biological relative to non-biological sounds. The whole-brain analysis was conducted at a two-tailed threshold of p < 0.005 and corrected to a family-wise error rate of p < 0.05. (C) A sound identity-sensitive (‘what’) ROI derived from Task 3 was applied to the EPI data from Task 2. A significant main effect of sound location was identified in this region (F(3, 51) = 26.223, p < 0.001). This region also displayed a significant main effect of emotion (F(2, 34) = 8.914, p < 0.005), whereby positive and negative stimuli elicited significantly greater activation relative to neutral stimuli (p < 0.005 and p < 0.05, respectively). Additionally, the ‘what’ pathway ROI displayed a significant location × emotion interaction (F(6, 102) = 2.450, p < 0.05), wherein there was greater activity for both positive and negative stimuli presented in the contralateral relative to the ipsilateral hemifield (p < 0.001 and p < 0.05, respectively). No significant effect of location was observed for neutral stimuli. [*p < 0.01; **p < 0.005; ***p < 0.001].


and Malhotra, 2008; Warren and Griffiths, 2003). To further explore this possibility, results from Tasks 1 and 3 were used to independently generate functionally defined ROIs corresponding to these pathways. This analysis revealed that whereas sound location modulates neural activation within both the ‘what’ and ‘where’ auditory processing pathways, emotion modulates activity in the ‘what’ but not the ‘where’ processing pathway. Importantly, the location × emotion interaction observed in the anterior-lateral ‘what’ pathway (but not the posterior-medial ‘where’ pathway) showed that spatial processing in the ‘what’ pathway was present specifically for emotional stimuli.

Effects of emotion on localization behavior

It would seem advantageous for organisms to rapidly and accurately localize the source of emotional auditory stimuli in the environment. Contrary to expectations, participants in the present study were slower to localize positive and negative sounds relative to neutral ones. Why might emotional stimuli be associated with slower localization performance in the present study? One possibility is that important differences exist between the dual pathways of the visual and auditory systems. Our finding that participants were significantly slower to localize emotional relative to neutral sounds conflicts with evidence from the visual domain suggesting that dorsal-stream-guided behaviors are less susceptible to processing limitations (Fecteau et al., 2000; Pisella et al., 2000) and emotional interference (Ngan Ta et al., 2010). Another possibility is that, unlike in the visual system, both object perception and localization in the auditory domain may be sensitive to processing load. In the current study, emotion may have augmented the representation of object features at the expense of spatial cues, thereby slowing localization performance. Consistent with this interpretation, we observed evidence that both spatial and emotional characteristics were integrated in anterior-lateral areas of auditory cortex, making this a potential site of competition between stimulus features. This integration of object and spatial features within the putative ‘what’ auditory area raises questions about the extent to which the ‘what’ and ‘where’ streams of the auditory system are functionally segregated.

It is important to note that the precise relationship between emotion and the putative ‘what’/‘where’ pathways of the auditory system was only preliminarily addressed in the current study. First, the nature and anatomical mapping of the putative dual processing stream in the auditory system remain unclear, as does the degree to which its function resembles that of the visual system. In addition, the dissociable effects of emotional versus neutral stimuli observed within regions of auditory cortex were unexpected, and the interpretation is based in part on ROIs generated from follow-up scans involving a smaller sample. Although generated from activity in a sample smaller than the initial population, this ROI displayed functional properties similar to those of the regions identified in the original task, providing



cross-validation for our original interpretation. Nevertheless, in future work it would be beneficial to precisely map the boundaries of the ‘what’/‘where’ auditory processing streams and establish the impact of emotion on each.

Conclusions

Although the emotional content of a sound has been demonstrated to influence processing in auditory cortex, the neural and behavioral effects of emotion on stimulus localization were previously unknown. Our results indicate that during sound localization, the emotional content of a sound modulates activity in anterior-lateral regions of STG (areas corresponding to the putative ‘what’ pathway of auditory processing). In contrast, and contrary to predictions, emotion elicited no significant changes in neural activity in posterior-medial areas of STG (regions corresponding to the putative ‘where’ pathway). An unexpected interaction between emotion and location was also observed in anterior-lateral areas of STG, suggesting that the boundaries between object identity and location decoding may be blurred; emotional sounds were associated with enhanced spatial processing in the ‘what’ pathway despite having no effect on the ‘where’ pathway. It is important to note that, at the behavioral level, a significant delay in the localization of emotional compared to neutral sounds was also observed. We raised the possibility that this effect was driven by competition between spatial and emotional features for representation and control over behavior. Interestingly, anterior-lateral auditory cortex activity was modulated by both spatial location and emotion, suggesting this area as a potential site for such competitive interactions. Finally, this work demonstrates, for the first time, the feasibility of using individualized virtual auditory environments as a stimulus presentation tool during fMRI. This technique, which involves presenting auditory stimuli through headphones while maintaining spatial integrity and perceptual realism, holds promise for future neuroimaging studies investigating the spatial component of sound.
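The core of such a virtual auditory environment is binaural synthesis: a monaural sound is filtered with a listener's individually measured head-related (or directional) impulse responses so that, over headphones, it is perceived at the measured azimuth. As a minimal illustrative sketch, not the authors' actual pipeline, assuming per-listener impulse-response pairs are already available as NumPy arrays (the function name and placeholder data below are hypothetical), the spatialization step reduces to one convolution per ear:

```python
import numpy as np
from scipy.signal import fftconvolve


def spatialize(mono, hrir_left, hrir_right):
    """Render a mono signal at the azimuth where the HRIR pair was measured.

    mono:       1-D array, the dry sound.
    hrir_left:  1-D array, left-ear head-related impulse response
                (hypothetical input; real HRIRs are measured per listener).
    hrir_right: 1-D array, right-ear head-related impulse response.
    Returns a (2, N) stereo array suitable for headphone playback.
    """
    left = fftconvolve(mono, hrir_left)
    right = fftconvolve(mono, hrir_right)
    stereo = np.stack([left, right])
    # Scale both channels by a common factor so interaural level
    # differences, a key localization cue, are preserved.
    return stereo / np.max(np.abs(stereo))


# Illustrative usage with placeholder data: 1 s of noise and two
# synthetic exponentially decaying "impulse responses".
fs = 44100
mono = np.random.randn(fs)
decay = np.exp(-np.arange(256) / 32.0)
hrir_l = np.random.randn(256) * decay
hrir_r = np.random.randn(256) * decay
stereo = spatialize(mono, hrir_l, hrir_r)
```

Because both channels are filtered with the same listener's measurements, interaural time and level differences and the listener's own spectral cues survive headphone presentation, which is what allows spatial integrity to be maintained inside the scanner.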

Supplementary data to this article can be found online at http://dx.doi.org/10.1016/j.neuroimage.2013.05.051.

Acknowledgments

The authors would like to thank Dr. Marc Joanisse and Dr. Stefan Everling for helpful discussion on experimental design and analysis, and David Grainger, of the National Centre for Audiology, for technical support. This work was funded by a Natural Sciences and Engineering Research Council Discovery Grant to Dr. Derek Mitchell.

Conflict of interest

The authors have no conflict of interest regarding the current experiment.

References

Adolphs, R., 2008. Fear, faces, and the human amygdala. Curr. Opin. Neurobiol. 18, 166–172.

Ahveninen, J., Jaaskelainen, I.P., Raij, T., Bonmassar, G., Devore, S., Hamalainen, M., Levanen, S., Lin, F.H., Sams, M., Shinn-Cunningham, B.G., Witzel, T., Belliveau, J.W., 2006. Task-modulated “what” and “where” pathways in human auditory cortex. Proc. Natl. Acad. Sci. U. S. A. 103, 14608–14613.

Alain, C., Arnott, S.R., Hevenor, S., Graham, S., Grady, C.L., 2001. “What” and “where” in the human auditory system. Proc. Natl. Acad. Sci. U. S. A. 98, 12301–12306.

Altmann, C.F., Henning, M., Doring, M.K., Kaiser, J., 2008. Effects of feature-selective attention on auditory pattern and location processing. Neuroimage 41, 69–79.

Amting, J.M., Greening, S.G., Mitchell, D.G., 2010. Multiple mechanisms of consciousness: the neural correlates of emotional awareness. J. Neurosci. 30, 10039–10047.

Arnott, S.R., Binns, M.A., Grady, C.L., Alain, C., 2004. Assessing the auditory dual-pathway model in humans. Neuroimage 22, 401–408.

Barrett, D.J., Hall, D.A., 2006. Response preferences for “what” and “where” in human non-primary auditory cortex. Neuroimage 32, 968–977.

Belin, P., Zatorre, R.J., Lafaille, P., Ahad, P., Pike, B., 2000. Voice-selective areas in human auditory cortex. Nature 403, 309–312.

Bohil, C.J., Alicea, B., Biocca, F.A., 2011. Virtual reality in neuroscience research and therapy. Nat. Rev. Neurosci. 12, 752–762.

Borst, G., Kosslyn, S.M., 2008. Visual mental imagery and visual perception: structural equivalence revealed by scanning processes. Mem. Cognit. 36, 849–862.

Bradley, M.M., Lang, P.J., 1999. International Affective Digitized Sounds (IADS): Stimuli, Instruction Manual and Affective Ratings (Tech. Rep. No. B-2). Center for Research in Psychophysiology, University of Florida, Gainesville, Florida.

Bushara, K.O., Weeks, R.A., Ishii, K., Catalan, M.J., Tian, B., Rauschecker, J.P., Hallett, M., 1999. Modality-specific frontal and parietal areas for auditory and visual spatial localization in humans. Nat. Neurosci. 2, 759–766.

Clarke, S., Bellmann Thiran, A., Maeder, P., Adriani, M., Vernet, O., Regli, L., Cuisenaire, O., Thiran, J.P., 2002. What and where in human audition: selective deficits following focal hemispheric lesions. Exp. Brain Res. 147, 8–15.

Cox, R.W., 1996. AFNI: software for analysis and visualization of functional magnetic resonance neuroimages. Comput. Biomed. Res. 29, 162–173.

Esterman, M., Tamber-Rosenau, B.J., Chiu, Y.C., Yantis, S., 2010. Avoiding non-independence in fMRI data analysis: leave one subject out. Neuroimage 50, 572–576.

Ethofer, T., Bretscher, J., Gschwind, M., Kreifelts, B., Wildgruber, D., Vuilleumier, P., 2012. Emotional voice areas: anatomic location, functional properties, and structural connections revealed by combined fMRI/DTI. Cereb. Cortex 22, 191–200.

Fecteau, S., Belin, P., Joanette, Y., Armony, J.L., 2007. Amygdala responses to nonlinguistic emotional vocalizations. Neuroimage 36, 480–487.

Fecteau, J.H., Enns, J.T., Kingstone, A., 2000. Competition-induced visual field differences in search. Psychol. Sci. 11, 386–393.

Fujiki, N., Riederer, K.A., Jousmaki, V., Makela, J.P., Hari, R., 2002. Human cortical representation of virtual auditory space: differences between sound azimuth and elevation. Eur. J. Neurosci. 16, 2207–2213.

Goydke, K.N., Altenmuller, E., Moller, J., Munte, T.F., 2004. Changes in emotional tone and instrumental timbre are reflected by the mismatch negativity. Brain Res. Cogn. Brain Res. 21, 351–359.

Graves, R., Landis, T., Goodglass, H., 1981. Laterality and sex differences for visual recognition of emotional and non-emotional words. Neuropsychologia 19, 95–102.

Grothe, B., Pecka, M., McAlpine, D., 2010. Mechanisms of sound localization in mammals. Physiol. Rev. 90, 983–1012.

Haxby, J.V., Grady, C.L., Horwitz, B., Ungerleider, L.G., Mishkin, M., Carson, R.E., Herscovitch, P., Schapiro, M.B., Rapoport, S.I., 1991. Dissociation of object and spatial visual processing pathways in human extrastriate cortex. Proc. Natl. Acad. Sci. U. S. A. 88, 1621–1625.

Kosslyn, S.M., Thompson, W.L., Kim, I.J., Alpert, N.M., 1995. Topographical representations of mental images in primary visual cortex. Nature 378, 496–498.

Krumbholz, K., Nobis, E.A., Weatheritt, R.J., Fink, G.R., 2009. Executive control of spatial attention shifts in the auditory compared to the visual modality. Hum. Brain Mapp. 30, 1457–1469.

Lang, P.J., Davis, M., 2006. Emotion, motivation, and the brain: reflex foundations in animal and human research. Prog. Brain Res. 156, 3–29.

Langendijk, E.H., Bronkhorst, A.W., 2000. Fidelity of three-dimensional-sound reproduction using a virtual auditory display. J. Acoust. Soc. Am. 107, 528–537.

Lomber, S.G., Malhotra, S., 2008. Double dissociation of ‘what’ and ‘where’ processing in auditory cortex. Nat. Neurosci. 11, 609–616.

Lomber, S.G., Malhotra, S., Hall, A.J., 2007. Functional specialization in non-primary auditory cortex of the cat: areal and laminar contributions to sound localization. Hear. Res. 229, 31–45.

Mathiak, K., Menning, H., Hertrich, I., Mathiak, K.A., Zvyagintsev, M., Ackermann, H., 2007. Who is telling what from where? A functional magnetic resonance imaging study. Neuroreport 18, 405–409.

Middlebrooks, J.C., Macpherson, E.A., Onsan, Z.A., 2000. Psychophysical customization of directional transfer functions for virtual sound localization. J. Acoust. Soc. Am. 108, 3088–3091.

Milner, A.D., Goodale, M.A., 1993. Visual pathways to perception and action. Prog. Brain Res. 95, 317–337.

Milner, A.D., Goodale, M.A., 1998. The Visual Brain in Action. Oxford University Press.

Mitchell, D.G., Greening, S.G., 2012. Conscious perception of emotional stimuli: brain mechanisms. Neuroscientist 18, 386–398.

Mitchell, D.G., Luo, Q., Mondillo, K., Vythilingam, M., Finger, E.C., Blair, R.J., 2008. The interference of operant task performance by emotional distracters: an antagonistic relationship between the amygdala and frontoparietal cortices. Neuroimage 40, 859–868.

Morris, J.S., Friston, K.J., Buchel, C., Frith, C.D., Young, A.W., Calder, A.J., Dolan, R.J., 1998. A neuromodulatory role for the human amygdala in processing emotional facial expressions. Brain 121 (Pt 1), 47–57.

Ngan Ta, K.L., Liu, G., Brennan, A.A., Enns, J.T., 2010. Spider-phobia influences conscious, but not unconscious, control of visually guided action. J. Vis. 10, 1069.

Pessoa, L., Ungerleider, L.G., 2004. Neuroimaging studies of attention and the processing of emotion-laden stimuli. Prog. Brain Res. 144, 171–182.

Pisella, L., Grea, H., Tilikete, C., Vighetto, A., Desmurget, M., Rode, G., Boisson, D., Rossetti, Y., 2000. An ‘automatic pilot’ for the hand in human posterior parietal cortex: toward reinterpreting optic ataxia. Nat. Neurosci. 3, 729–736.

Plichta, M.M., Gerdes, A.B., Alpers, G.W., Harnisch, W., Brill, S., Wieser, M.J., Fallgatter, A.J., 2011. Auditory cortex activation is modulated by emotion: a functional near-infrared spectroscopy (fNIRS) study. Neuroimage 55, 1200–1207.



Rauschecker, J.P., 2012. Ventral and dorsal streams in the evolution of speech and language. Front. Evol. Neurosci. 4, 7.

Rauschecker, J.P., Tian, B., 2000. Mechanisms and streams for processing of “what” and “where” in auditory cortex. Proc. Natl. Acad. Sci. U. S. A. 97, 11800–11806.

Rife, D.D., Vanderkooy, J., 1989. Transfer-function measurement with maximum-length sequences. J. Audio Eng. Soc. 37, 419–444.

Sauter, D.A., Eimer, M., 2009. Rapid detection of emotion from human vocalizations. J. Cogn. Neurosci. 22, 474–481.

Stecker, G.C., Harrington, I.A., Middlebrooks, J.C., 2005. Location coding by opponent neural populations in the auditory cortex. PLoS Biol. 3, e78.

Vastfjall, D., 2003. The subjective sense of presence, emotion recognition, and experienced emotions in auditory virtual environments. Cyberpsychol. Behav. 6, 181–188.

Viinikainen, M., Katsyri, J., Sams, M., 2012. Representation of perceived sound valence in the human brain. Hum. Brain Mapp. 33, 2295–2305.

Vuilleumier, P., 2005. How brains beware: neural mechanisms of emotional attention. Trends Cogn. Sci. 9, 585–594.

Vuilleumier, P., Driver, J., 2007. Modulation of visual processing by attention and emotion: windows on causal interactions between human brain regions. Philos. Trans. R. Soc. Lond. B Biol. Sci. 362, 837–855.

Vul, E., Kanwisher, N., 2010. Begging the question: the non-independence error in fMRI data analysis.

Vul, E., Pashler, H., 2012. Voodoo and circularity errors. Neuroimage 62, 945–948.

Warren, J.D., Griffiths, T.D., 2003. Distinct mechanisms for processing spatial sequences and pitch sequences in the human auditory brain. J. Neurosci. 23, 5799–5804.

Wenzel, E., Arruda, M., Kistler, D.J., Wightman, F.L., 1993. Localization using non-individualized head-related transfer functions. J. Acoust. Soc. Am. 94, 111–123.

Wiethoff, S., Wildgruber, D., Kreifelts, B., Becker, H., Herbert, C., Grodd, W., Ethofer, T., 2008. Cerebral processing of emotional prosody – influence of acoustic parameters and arousal. Neuroimage 39, 885–893.

Wightman, F.L., Kistler, D.J., 1989a. Headphone simulation of free-field listening I: stimulus synthesis. J. Acoust. Soc. Am. 85, 858–867.

Wightman, F.L., Kistler, D.J., 1989b. Headphone simulation of free-field listening II: psychophysical validation. J. Acoust. Soc. Am. 85, 868–878.