Alexithymia and the labeling of facial emotions: response slowing and increased motor and somatosensory processing
© Ihme et al.; licensee BioMed Central Ltd. 2014
Received: 4 December 2013
Accepted: 7 March 2014
Published: 14 March 2014
Alexithymia is a personality trait that is characterized by difficulties in identifying and describing feelings. Previous studies have shown that alexithymia is related to problems in recognizing others’ emotional facial expressions when these are presented with temporal constraints. These problems can be less severe when the expressions are visible for a relatively long time. Because the neural correlates of these recognition deficits are still relatively unexplored, we investigated the labeling of facial emotions and brain responses to facial emotions as a function of alexithymia.
Forty-eight healthy participants had to label the emotional expression (angry, fearful, happy, or neutral) of faces presented for 1 or 3 seconds in a forced-choice format while undergoing functional magnetic resonance imaging. The participants’ level of alexithymia was assessed using self-report and interview. In light of the previous findings, we focused our analysis on the alexithymia component of difficulties in describing feelings. Difficulties describing feelings, as assessed by the interview, were associated with increased reaction times for negative (i.e., angry and fearful) faces, but not with labeling accuracy. Moreover, individuals with higher alexithymia showed increased brain activation in the somatosensory cortex and supplementary motor area (SMA) in response to angry and fearful faces. These cortical areas are known to be involved in the simulation of the bodily (motor and somatosensory) components of facial emotions.
The present data indicate that alexithymic individuals may use information related to bodily actions rather than affective states to understand the facial expressions of other persons.
Keywords: Alexithymia, supplementary motor area, somatosensory cortex, facial emotion, labeling, Toronto Structured Interview for Alexithymia
Understanding the emotional expression of another person is thought to require mimicry or simulation of others’ facial expressions [1, 2]. Thus, it is likely that neural assemblies exist that are active both when a person is experiencing and expressing an emotion and when the same person is seeing and interpreting the facial emotions of somebody else [3, 4]. Recent evidence indicates that interpreting facial expressions is a multi-faceted endeavor that recruits a multitude of cortical and subcortical circuits: the visual system (e.g., occipital gyrus, fusiform gyrus [FFG]) to process the visual information of the face, the motor system (supplementary motor area [SMA], premotor cortex) for the (covert) physical simulation of the facial movement, somatosensory areas (primary somatosensory cortex, insula) for proprioceptive feedback, and limbic or frontal regions (striatum, ventromedial prefrontal cortex [vmPFC], amygdala [AMG]) for reenacting and feeling the corresponding emotion [3–8].
A personality trait that is related to difficulties in recognizing emotional facial expressions is alexithymia (literally, “no words for emotion”). Alexithymia is characterized by deficits in identifying and describing one’s feelings. Alexithymic features can be assessed using the 20-item self-report Toronto Alexithymia Scale (TAS-20) or the Toronto Structured Interview for Alexithymia (TSIA). Both measures include the subscales Difficulties Describing Feelings (DDF), Difficulties Identifying Feelings, and Externally Oriented Thinking (the TSIA additionally includes Imaginal Processing).
It has been repeatedly shown that alexithymia is associated with a decreased ability to identify the facial expressions of others, especially when these expressions are presented under temporal constraints [12–14]. Interestingly, a recent electromyographic (EMG) study demonstrated that highly alexithymic individuals exhibit less facial mimicry when confronted with emotional faces. This could mean that individuals high in alexithymia have difficulties interpreting the emotions of others because they automatically simulate others’ facial expressions to a lesser degree and therefore cannot fully capture the other person’s feelings.
In contrast, when the presentation time is increased, most studies have not revealed a relationship between the degree of alexithymia and recognition accuracy for emotional facial expressions (e.g., [12, 16, 17]). So far, only one study has investigated brain activation related to facial emotion labeling with longer presentation times (3.75 s) as a function of alexithymia, and it found no differences as a function of alexithymia. However, that study included only 23 participants in a correlational approach, whereas Yarkoni and Braver proposed using at least 40 participants for correlational analyses in neuroimaging research. In addition, alexithymic tendencies were assessed only through self-report, although a multi-method approach is recommended [19–21]. Moreover, behavioral evidence suggests that DDF, as opposed to the TAS-20 total score, is most predictive of facial emotion recognition. Thus, the current study investigated the labeling of facial emotions and brain responses to facial emotions as a function of DDF (as measured with the TAS-20 and TSIA) using functional magnetic resonance imaging (fMRI). Because our design included a relatively long response window after the presentation of the facial stimuli, we hypothesized that DDF would have an adverse effect on response latencies but not on recognition accuracy.
Fifty-two healthy young German native speakers (age range: 18 to 29 years) participated in the study. All were right-handed and had normal or corrected-to-normal visual acuity. None of the participants had a history of neurological or psychiatric illness or contraindications for magnetic resonance imaging. All participants gave written consent and received financial compensation for their participation. The study procedure was approved by the ethics committee of the Medical School of the University of Leipzig and was in accordance with the Declaration of Helsinki. Four participants had to be excluded from data analysis: one had a depression score of BDI > 14 at the time of scanning, one displayed excessive head motion in the magnetic resonance imaging (MRI) scanner (>3 mm translation), and two responded before the intended response window. Thus, 48 participants (23 female; age 24 ± 3 years, mean ± SD) entered the final analysis.
Assessment of alexithymia and control variables
Alexithymic tendencies were measured using a questionnaire, the TAS-20 (German version), and an observer-rated measure, the TSIA (German version). The complete TSIA was administered by one trained interviewer and rated during the interview according to the manual. Before the study, the interviewer was trained to conduct and score the TSIA by the translators of the German version of the TSIA (coauthors MR and HG). This training included becoming familiar with the alexithymia construct, the manual outlining administration and scoring procedures for the TSIA, and discussion of the guidelines, the scoring of the items, and the correct use of prompts and probes. Moreover, test interviews were supervised until the interviewer was confident in administering and scoring the interview alone. Our analysis focused on one subscale, DDF, of the TAS-20 and TSIA; this subscale consists of five items in the TAS-20 and six items in the TSIA. To control for depressive symptoms, anxiety, and affectivity, participants also completed the Beck Depression Inventory (German version), the State-Trait Anxiety Inventory (German version), and the trait version of the Positive and Negative Affect Schedule (German version).
Task and design
The participants’ task was to label the facial emotion of a target face. Facial stimuli were color photographs taken from the Karolinska Directed Emotional Faces database depicting four different expressions (happy – HA, angry – AN, fearful – FE, and neutral – NE). Pictures of twenty different individuals (ten female) were shown in each of the four emotional conditions, yielding 80 trials in total. Each trial lasted 9 s and began with the presentation of a fixation cross in the center of the screen for 800 ms. In the first 40 trials, the target was shown for 1 s; in the second half of the experiment, the target presentation time was 3 s. After presentation of the target, participants had 7.2 s (or 5.2 s, respectively) to label the emotion by pressing a button. Participants held one response pad per hand with two buttons each and responded with their index and middle fingers. Each emotion was assigned to one button for the entire experiment, counterbalanced across participants. During the response window, participants saw the four options in the order of button assignment; for example, the label on the left side of the screen matched the leftmost button (i.e., the left middle finger). After a button press, the labels vanished and only a gray screen was visible until the next trial started with the presentation of the fixation cross. Participants were instructed to answer as accurately as possible within the given time frame and were aware that the response window was shorter in the second half of the experiment. Trials were shown in two fixed pseudo-random sequences under the constraints that no two consecutive trials depicted the same person and that no more than two consecutive trials showed the same emotion.
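The two sequence constraints above (no repeated identity on consecutive trials, no emotion on more than two consecutive trials) can be satisfied by rejection sampling over random shuffles. The following Python sketch is illustrative only: the function names and the rejection-sampling strategy are our assumptions, not the authors' actual procedure.

```python
import random

def valid(seq):
    """Check the two sequence constraints described in the text."""
    for a, b in zip(seq, seq[1:]):
        if a[0] == b[0]:          # same person on two consecutive trials
            return False
    for a, b, c in zip(seq, seq[1:], seq[2:]):
        if a[1] == b[1] == c[1]:  # same emotion on three consecutive trials
            return False
    return True

def make_sequence(n_ids=20, emotions=("HA", "AN", "FE", "NE"), seed=0):
    """Rejection-sample one fixed pseudo-random order of (person, emotion) trials."""
    rng = random.Random(seed)
    trials = [(i, e) for i in range(n_ids) for e in emotions]  # 80 trials
    while True:
        rng.shuffle(trials)
        if valid(trials):
            return trials

seq = make_sequence()
```

Fixing the seed makes the sequence reproducible, matching the paper's use of fixed random sequences shared across participants.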
MRI acquisition and preprocessing
Structural and functional MR images were obtained on a 3 T scanner (Magnetom Verio, Siemens, Erlangen, Germany). For each participant, structural images were acquired with a T1-weighted 3D MP-RAGE sequence; magnetization preparation consisted of a non-selective inversion pulse. The imaging parameters were as follows: TI 650 ms, TR 1300 ms, TE 3.5 ms, flip angle 10°, isotropic spatial resolution of 1 mm, two averages. Blood-oxygen-level-dependent contrast-sensitive images were collected using a T2*-weighted echo-planar imaging (EPI) sequence [matrix 64 × 64; resolution 3 mm × 3 mm × 4 mm; gap 0.8 mm; TR 2 s; TE 30 ms; flip angle 90°; interleaved slice acquisition; 385 images]. The slices were oriented parallel to a line through the posterior and anterior commissures.
MRI data were preprocessed and analyzed using SPM8 (http://www.fil.ion.ucl.ac.uk/spm/). The initial five functional volumes were discarded to allow longitudinal magnetization to reach equilibrium. Functional volumes were slice-time corrected (temporal middle slice as reference), realigned to the first image, and corrected for movement-induced image distortions (6-parameter rigid-body realignment). The structural T1 images were coregistered to the mean functional EPI image (SPM default). Anatomical images were segmented, including normalization to standard stereotaxic space using the T1 MNI template within SPM8. The normalization parameters were then applied to the functional EPI series, yielding a voxel size of 3 × 3 × 3 mm³ for the functional images. A temporal high-pass filter (128 s) was applied to remove slow signal drifts. The functional data were spatially smoothed with a three-dimensional Gaussian filter of 6 mm full-width at half-maximum. We chose this rather small smoothing kernel so that potential activation in the subcortical areas involved in facial emotion processing would still be detectable and not washed out.
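SPM implements its high-pass filter by regressing a discrete cosine (DCT) drift basis out of each voxel's time series. As a rough illustration of the idea (a simplified numpy sketch following SPM's DCT convention, not SPM's actual code):

```python
import numpy as np

def dct_highpass(data, tr=2.0, cutoff=128.0):
    """Remove slow drifts by projecting out a DCT basis (SPM-style).
    data: (n_scans, n_voxels) time-series matrix."""
    n = data.shape[0]
    # number of cosine regressors with period above the cutoff (SPM convention)
    k = int(np.floor(2 * n * tr / cutoff)) + 1
    t = np.arange(n)
    basis = np.column_stack(
        [np.cos(np.pi * (t + 0.5) * j / n) for j in range(1, k)]
    )
    # least-squares fit of the drift components, then subtract them
    beta, *_ = np.linalg.lstsq(basis, data, rcond=None)
    return data - basis @ beta

# example: a slow drift well below the cutoff frequency is removed almost entirely
n = 380
drift = np.cos(np.pi * (np.arange(n) + 0.5) / n).reshape(-1, 1)
residual = dct_highpass(drift)
```

With TR = 2 s and 380 retained scans, a 128 s cutoff yields roughly a dozen drift regressors; anything slower than the cutoff is absorbed by the basis.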
Labeling accuracy was evaluated with the Grier sensitivity index, which considers true and false positives. Values of this index range from 0 to 1, where 1 means perfect performance and 0.5 corresponds to chance level. Due to the high accuracy, and thus the lack of sufficient trials to reliably estimate error responses, incorrect trials were discarded prior to the analysis of reaction time and fMRI data. The data were pooled across both presentation-time conditions. Originally, we aimed to differentiate between the two temporal conditions (1 and 3 s), similar to the study of Parker et al. However, accuracy was at ceiling (> .9) with little variance, so we decided to collapse across temporal conditions for the analysis of reaction time and fMRI data. The high recognition rates in the current study compared to those of Parker et al. seem to be related to our long response window: the participants in Parker et al.’s study had to respond while the picture was presented (1 or 3 s), whereas participants in the current study had more time to respond, most likely resulting in higher accuracy. This is in line with a recent review (Grynberg et al.), published when data collection for this study was almost finished, in which Grynberg and colleagues concluded that alexithymic individuals’ difficulties in recognizing facial emotions are most prominent when the pictures are presented for less than 300 ms. To investigate associations between the measures of alexithymia and labeling accuracy as well as RTs, correlational analyses were performed using Spearman’s rho. Spearman’s rho was also used to check for associations between the measures of alexithymia and the affectivity questionnaires (BDI, STAI, and PANAS); we employed Spearman’s rho because the RT and TSIA-DDF scores were not normally distributed. All associations were tested against a significance threshold of p = .05 (two-tailed).
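The two behavioral statistics can be made concrete as follows. The formula is Grier's (1971) nonparametric A'; the DDF and RT values in the usage example are hypothetical numbers for illustration, not data from the study.

```python
import numpy as np
from scipy import stats

def grier_a(hit_rate, fa_rate):
    """Grier's (1971) nonparametric sensitivity index A'.
    0.5 = chance, 1.0 = perfect; assumes hit_rate >= fa_rate > 0."""
    h, f = hit_rate, fa_rate
    return 0.5 + ((h - f) * (1 + h - f)) / (4 * h * (1 - f))

perfect = grier_a(1.0, 0.0)   # → 1.0 (perfect discrimination)
chance = grier_a(0.7, 0.7)    # → 0.5 (hits equal false alarms)

# rank correlation between DDF scores and reaction times (hypothetical data)
ddf = [3, 8, 5, 12, 7, 10]
rt = [620, 700, 640, 760, 690, 720]
rho, p = stats.spearmanr(ddf, rt)
```

Spearman's rho operates on ranks only, which is why it is robust to the non-normal RT and TSIA-DDF distributions mentioned above.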
The fMRI data were analyzed by modeling the onset and duration of the presentation of each facial expression and convolving these regressors with the hemodynamic response function for the different emotions. Incorrect trials were included in the first-level design matrix as a nuisance regressor. First-level t-contrasts were calculated by contrasting each emotional condition with the neutral one (HA > NE, AN > NE, FE > NE). The first-level contrast images were then transferred to second-level models for the main effects (HA > NE, AN > NE, and FE > NE) and to regression models with TAS-20-DDF and TSIA-DDF as regressors; one second-level model was calculated per alexithymia measure (TAS-20-DDF, TSIA-DDF) and experimental condition. For all models, significance was tested at the cluster level against a family-wise-error-corrected significance threshold of p = .05 at an individual voxel threshold of t = 3.5. As advised in the literature, we also report the activations that would survive a more lenient threshold (p = .001, k = 10) in the additional material to allow these data to be used in future meta-analyses.
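Building a first-level regressor of this kind amounts to convolving a stimulus boxcar with a canonical hemodynamic response function. The sketch below uses a common double-gamma approximation of the HRF; the parameter choices and onset times are illustrative assumptions, not the study's actual design files.

```python
import numpy as np
from scipy.stats import gamma

def canonical_hrf(tr=2.0, duration=32.0):
    """Double-gamma hemodynamic response function (SPM-style approximation)."""
    t = np.arange(0, duration, tr)
    peak = gamma.pdf(t, 6)          # positive response peaking around 5 s
    undershoot = gamma.pdf(t, 16)   # late undershoot
    h = peak - undershoot / 6.0
    return h / h.sum()

def make_regressor(onsets, dur, n_scans, tr=2.0):
    """Boxcar at the stimulus onsets (seconds) convolved with the HRF."""
    box = np.zeros(n_scans)
    for o in onsets:
        box[int(o / tr):int((o + dur) / tr) + 1] = 1.0
    return np.convolve(box, canonical_hrf(tr))[:n_scans]

# hypothetical angry-face onsets; the contrast AN > NE then corresponds to a
# [1, -1] weighting of the AN and NE regressors' parameter estimates
reg_an = make_regressor([9, 27, 45], dur=1.0, n_scans=100)
```

The neutral condition gets its own regressor built the same way; contrasting the two beta estimates implements AN > NE at the first level.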
In a recent paper, Yarkoni and colleagues argued that brain activation increases with reaction time because the time required for preparatory motor processes increases. Thus, for contrasts yielding significant clusters, we checked whether adding the difference in RT between the two conditions of that contrast (e.g., AN > NE), or the RT for the emotional condition alone (e.g., AN), as a nuisance covariate changed the results substantially.
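The logic of this robustness check is a partial regression: if the DDF effect on the contrast estimate survives after the RT covariate is added, it cannot be a pure time-on-task effect. A minimal sketch with simulated data (all variables below are simulated assumptions, not the study's measurements):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 48                                       # sample size as in the study
ddf = rng.normal(size=n)                     # simulated alexithymia (DDF) scores
rt_diff = 0.3 * ddf + rng.normal(size=n)     # RT difference, partly correlated with DDF
bold = 0.5 * ddf + rng.normal(scale=0.5, size=n)  # simulated contrast estimate per subject

def fit(X, y):
    """Ordinary least squares; returns the coefficient vector."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

X_without = np.column_stack([np.ones(n), ddf])            # DDF only
X_with = np.column_stack([np.ones(n), ddf, rt_diff])      # DDF + RT nuisance covariate
b_without = fit(X_without, bold)[1]
b_with = fit(X_with, bold)[1]
# a stable DDF coefficient across both models suggests the effect is not an RT artifact
```

In SPM this corresponds to re-estimating the second-level regression with the RT variable entered as an additional covariate of no interest.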
Although an association between behavior and TSIA-DDF was revealed for both angry and fearful faces, it was reflected in significant TSIA-DDF-related brain activation only in the contrast AN > NE, not in FE > NE. For FE > NE, the effects on brain activation may be smaller and thus not detectable with a whole-brain approach. We therefore additionally tested for an association between TSIA-DDF and brain activation in these clusters in an ROI-based approach using small volume correction for FE > NE. For this, the significant clusters from the model testing a positive correlation between TSIA-DDF and brain activation in the contrast AN > NE were saved as a mask, which in turn was employed as an ROI to check for activations positively correlating with TSIA-DDF in these brain areas.
Finally, an exploratory analysis was conducted to check whether our measures of alexithymia (TAS-20, TAS-20-DDF, TSIA, TSIA-DDF) were related to brain activation in ROIs that, based on the previous literature, are associated with facial emotion processing. To estimate activation in these ROIs, the eigenvariates of the activation were extracted for the main contrasts (i.e., HA > NE, AN > NE, FE > NE) using SPM8 and then related to the measures of alexithymia by employing Spearman’s rho. We employed the following ROIs: amygdala (AMG), ventromedial prefrontal cortex (vmPFC), fusiform gyrus (FFG), and striatum. The masks for AMG, FFG, and striatum were defined using the automated anatomical labeling toolbox as implemented in the WFU PickAtlas with SPM8. However, this tool did not include a reasonable mask for the vmPFC, so we defined this region as a sphere of 20 mm around the MNI coordinates xyz = [0 50 -2]. These coordinates were based on the results of a study by Pessoa et al. on facial emotion processing. We also included the clusters (SMA, right S1) positively correlating with TSIA-DDF in the contrast AN > NE as further ROIs.
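A spherical ROI like the vmPFC sphere above is just the set of voxels within a given distance of a millimeter-space coordinate. The numpy sketch below illustrates this; the grid dimensions, voxel origin, and the assumption of an axis-aligned affine are our own simplifications (real masks would be built from the image's full affine, e.g., in the WFU PickAtlas or with a NIfTI library).

```python
import numpy as np

def sphere_mask(shape, origin_mm, voxel_size, center_mm, radius_mm):
    """Boolean ROI mask: voxels within radius_mm of center_mm.
    Assumes an axis-aligned affine: mm = voxel_index * voxel_size + origin_mm."""
    idx = np.indices(shape)  # shape (3, nx, ny, nz): voxel coordinates per axis
    mm = (idx * np.asarray(voxel_size).reshape(3, 1, 1, 1)
          + np.asarray(origin_mm).reshape(3, 1, 1, 1))
    d2 = ((mm - np.asarray(center_mm).reshape(3, 1, 1, 1)) ** 2).sum(axis=0)
    return d2 <= radius_mm ** 2

# 20 mm sphere around the vmPFC coordinates [0, 50, -2] on a hypothetical
# 3 mm MNI-space grid with origin at [-90, -126, -72]
mask = sphere_mask((61, 73, 61), (-90, -126, -72), (3, 3, 3), (0, 50, -2), 20)
```

On a 3 mm grid, a 20 mm sphere comprises roughly (4/3)π·20³ / 27 ≈ 1,240 voxels; the eigenvariate is then extracted from the masked voxels' contrast data.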
Alexithymia measures and control variables
Correlations (Spearman’s rho) between measures of alexithymia
Correlations (Spearman’s rho) between difficulties describing feelings (as assessed by TAS-20-DDF and TSIA-DDF) and reaction times in the four facial expression conditions
Significant brain activations for all fMRI main contrasts (clusters listed per contrast):
HA > NE: middle occipital gyrus, middle temporal gyrus; middle orbital gyrus, superior frontal gyrus, superior medial gyrus, anterior cingulate gyrus; middle frontal gyrus, superior frontal gyrus.
AN > NE: inferior occipital gyrus, middle occipital gyrus, lingual gyrus; fusiform gyrus, inferior temporal gyrus; middle occipital gyrus.
FE > NE: inferior frontal gyrus (pars triangularis); inferior occipital gyrus; fusiform gyrus, inferior occipital gyrus; middle temporal gyrus; cerebellum (lobule VIIb).
Relationships between brain activation and measures of alexithymia
Entering the difference between the reaction times for AN and NE as a nuisance covariate into our second-level model only slightly changed the results (SMA: xyz = [3 2 61], k = 43, p_cluster = .05; somatosensory cortex: xyz = [30 -37 40], k = 47, p_cluster = .038). Similarly, using only the reaction time in the angry condition as a covariate led to small changes in the results (SMA: xyz = [3 2 61], k = 37, p_cluster = .08; somatosensory cortex: xyz = [30 -37 40], k = 45, p_cluster = .046). Thus, our findings most likely reflect differences due to alexithymia (TSIA-DDF) and cannot be attributed to (differences in) reaction time. The activations for the models related to the measures of alexithymia are presented at a more lenient threshold (p = .001, k = 10) in Additional file 2: Table S2.
Post-hoc analysis of activation in SMA and S1 positively correlating with TSIA-DDF for contrast FE > NE
A post-hoc region of interest (ROI) analysis revealed a significant small-volume-corrected (SVC) peak-voxel activation in the SMA (xyz = [-9 -4 61], p_SVC = .019) and a marginally significant peak-voxel activation in S1 (xyz = [30 -37 40], p_SVC = .069) positively correlating with TSIA-DDF in the contrast FE > NE. The activation in the SMA remained marginally significant when controlling for PANAS positive affect (p_SVC = .061) and significant when entering the difference in RT between FE and NE (p_SVC = .032) or the RT in FE alone (p_SVC = .034). Similarly, the significance of the activation in S1 changed only slightly when controlling for PANAS (p_SVC = .084), the difference in RT (p_SVC = .052), or the reaction time for FE (p_SVC = .059).
Exploratory analysis of correlations between brain regions relevant for emotion processing and measures of alexithymia
This study investigated the effects of self-report (TAS-20-DDF) and observer-rated (TSIA-DDF) facets of alexithymia on the labeling and neural processing of facial emotions presented for a rather long time (1 or 3 seconds). Our analysis of the main contrasts revealed significant clusters of brain activation in the fusiform gyrus, inferior and middle occipital gyrus (all conditions), in the middle temporal gyrus (fearful faces), inferior (fearful) and orbital and medial (happy) frontal gyrus as well as the cerebellum. All of these regions have been reported to be implicated in facial emotion processing (e.g.: [7, 8, 40–42]). Thus, we can assume that our experimental design is suitable for eliciting brain activation related to facial emotion recognition. Considering the specific effects of alexithymia, we found that high TSIA-DDF scores were related to increased reaction times when labeling angry and fearful faces and to increased brain activation in SMA and right S1 during the recognition of these negative faces. A post-hoc exploratory analysis suggests that activity in brain areas that are important in the affective components of facial emotion processing (AMG, vmPFC, striatum) does not show a particular relationship with alexithymia in the current task.
The increased reaction times indicate that highly alexithymic individuals needed more time to label negative emotions: they appear to require longer to reach a labeling accuracy similar to that of individuals low in alexithymia. In contrast to previous studies describing a relationship between accuracy and degree of alexithymia [12, 13], we used relatively long stimulus presentation times and response windows and found no relationship between alexithymia and recognition accuracy. Thus, it seems that alexithymic individuals have difficulties in recognizing facial expressions that are reflected in decreased accuracy when presentation times and response windows are short. Prolonging presentation times and response windows may improve recognition accuracy, but at the cost of increased response times.
The SMA is part of a brain network that is involved in processing motor-related information and motor preparation and has been shown to be involved in the production of facial emotions. Moreover, it has been argued that the (especially pre-) SMA contributes to the recognition of facial emotions by playing an important role in the motor components of simulation. Additionally, a cluster in S1 was revealed, which seems to reflect somatosensory aspects of facial emotion processing [3, 7, 45]. According to Adolphs et al., recognizing emotions from facial expressions requires right primary somatosensory areas; the authors argue that recognition of another individual’s emotional state is mediated by internally generated somatosensory representations that simulate how the other individual would feel when displaying a certain facial expression. Taken together, these findings could mean that highly alexithymic individuals have difficulties in automatically reenacting the negative facial emotions of others when these are presented briefly. When the presentation time is increased, highly alexithymic individuals can reach a performance similar to that of less alexithymic individuals, but this seems to require increased activation of motor and somatosensory areas. Interestingly, it has been found that highly (as compared to less) alexithymic individuals also show increased activation in motor-related brain areas when interpreting the directed actions of others in a classical mirror-neuron task, while showing no differences in interpreting these actions. Thus, highly alexithymic individuals may be more inclined than non-alexithymic individuals to imitate the actions of others via (covert) motor simulation. A recent meta-analysis by van der Velde et al. reported that high levels of alexithymia are related to decreased activity in the SMA when participants are confronted with negative stimuli.
However, this meta-analysis included all types of emotional paradigms and tasks (not only facial emotion recognition), so the published results may not necessarily reflect processes related specifically to facial emotion recognition.
There seems to be no particular relationship between alexithymia and activity in the amygdala, vmPFC, and ventral striatum in the task studied here. This finding is noteworthy because earlier studies on brain function [49–52] and structure reported alterations in these regions in highly alexithymic individuals. In particular, functional studies on the automatic processing of emotional faces (affective priming) [49–51] have revealed decreased activations in these brain areas. The lack of involvement in the current task may be due to the fact that the emotional faces were presented for a rather long time. The amygdala and the ventral striatum are thought to operate in a fast and automatic fashion and may be less relevant when participants are fully aware of the emotional nature of the faces (e.g., [54, 55]), as in the current study. Thus, it seems that alexithymic individuals show less automatic activation in brain regions particularly involved in the affective components of face processing (AMG, ventral striatum, vmPFC), which most likely leads to alterations in processing and difficulties in labeling briefly presented faces. However, alexithymic individuals seem to be able to simulate the bodily aspects of facial expressions when the presentation times and response windows are long enough, which makes correct recognition of faces possible in this case.
Our study points to deficits limited to the recognition of negative faces in alexithymia: neither behavioral nor neurobiological differences were revealed for happy faces. This suggests that alexithymic individuals have fewer problems interpreting positive than negative facial expressions. A recent review on alexithymia and the processing of emotional facial expressions concluded that the difficulties of alexithymic individuals in processing facial emotions are not specific to certain emotions. However, the work of Sonnby-Borgström shows that the imitation of facial expressions (measured with facial EMG) in highly alexithymic individuals was decreased only for corrugator activity related to negative emotions, not for zygomaticus activity related to happy faces. Against this background, alexithymic individuals may display fewer deficits in automatically simulating happy than negative faces, which possibly renders the recognition of happy faces easier.
It is important to note that in our study, the objective measure of alexithymia (TSIA), but not the self-report measure (TAS-20), was predictive for recognition performance. Because some alexithymic individuals may not be aware of their own deficits, self-report tests could be less suitable for measuring difficulties in describing feelings compared to objective tests such as the TSIA.
It has been argued that the TAS-20 and the TSIA measure only cognitive aspects of alexithymia and neglect the affective parts of the alexithymia construct. A questionnaire that possibly captures these affective components is the Bermond-Vorst Alexithymia Questionnaire. Additionally applying this measure might reveal relationships between alexithymia and the brain areas involved in the affective components of emotional face processing. Future studies are needed to determine whether the results of the current study are related only to cognitive alexithymia or whether they generalize to affective alexithymia as well.
In summary, alexithymic individuals have difficulties in labeling facial expressions of emotion even when these are presented with little temporal constraint. Such individuals are slower in labeling angry and fearful facial emotions, and they manifest increased activation in the somatosensory and supplementary motor cortices in response to these negative faces. These cortical regions are involved in the simulation of the bodily components of facial emotional expressions. Thus, the present data suggest that alexithymic individuals may recruit cortical processing resources involved in simulating the bodily components, rather than the affective states, of negative (angry and fearful) facial expressions in order to interpret them.
Abbreviations
AN: experimental condition with angry faces
BDI: Beck Depression Inventory
DDF: difficulties describing feelings (subscale of the TAS-20 and TSIA)
EPI: echo-planar imaging
FE: experimental condition with fearful faces
(f)MRI: (functional) magnetic resonance imaging
HA: experimental condition with happy faces
MNI: Montreal Neurological Institute
NE: experimental condition with neutral faces
PANAS: Positive and Negative Affect Schedule
ROI: region of interest
S1: primary somatosensory cortex
SMA: supplementary motor area
SVC: small volume corrected
TAS-20: 20-item Toronto Alexithymia Scale
TSIA: Toronto Structured Interview for Alexithymia
vmPFC: ventromedial prefrontal cortex
This work was supported by a grant from the German Research Foundation DFG to Thomas Suslow and Harald Kugel (SU 222/6-1).
We thank Sophie-Luise Lenk and Marc Rupietta for their help in data collection.
- Niedenthal PM: Embodying emotion. Science. 2007, 316: 1002-1005.View ArticlePubMedGoogle Scholar
- Dimberg U, Andréasson P, Thunberg M: Emotional empathy and facial reactions to facial expressions. J Psychophysiol. 2011, 25: 26-31.View ArticleGoogle Scholar
- Heberlein AS, Adolphs R: Edited by: Harmon-Jones E, Winkielman P. 2007, New York: Guilford Press, 31-55. Neurobiology of Emotion Recognition: Current Evidence for Shared Substrates, 1, Soc Neurosci.
- Heberlein AS, Atkinson AP: Neuroscientific evidence for simulation and shared substrates in emotion recognition: beyond faces. Emot Rev. 2009, 1: 162-177.
- Heberlein AS, Padon AA, Gillihan SJ, Farah MJ, Fellows LK: Ventromedial frontal lobe plays a critical role in facial emotion recognition. J Cogn Neurosci. 2008, 20: 721-733.
- Van der Gaag C, Minderaa RB, Keysers C: Facial expressions: what the mirror neuron system can and cannot tell us. Soc Neurosci. 2007, 2: 179-222.
- Adolphs R: Neural systems for recognizing emotion. Curr Opin Neurobiol. 2002, 12: 169-177.
- Fusar-Poli P, Placentino A, Carletti F, Landi P, Allen P, Surguladze S, Benedetti F, Abbamonte M, Gasparotti R, Barale F, Perez J, McGuire P, Politi P: Functional atlas of emotional faces processing: a voxel-based meta-analysis of 105 functional magnetic resonance imaging studies. J Psychiatry Neurosci. 2009, 34: 418-432.
- Sifneos PE: The prevalence of alexithymic characteristics in psychosomatic patients. Psychother Psychosom. 1973, 22: 255-262.
- Bagby RM, Parker JDA, Taylor GJ: The twenty-item Toronto Alexithymia Scale—I. Item selection and cross-validation of the factor structure. J Psychosom Res. 1994, 38: 23-32.
- Bagby RM, Taylor GJ, Parker JDA, Dickens SE: The development of the Toronto Structured Interview for Alexithymia: item selection, factor structure, reliability and concurrent validity. Psychother Psychosom. 2006, 75: 25-39.
- Parker PD, Prkachin KM, Prkachin GC: Processing of facial expressions of negative emotion in alexithymia: the influence of temporal constraint. J Pers. 2005, 73: 1087-1107.
- Swart M, Kortekaas R, Aleman A: Dealing with feelings: characterization of trait alexithymia on emotion regulation strategies and cognitive-emotional processing. PLoS One. 2009, 4: e5751.
- Grynberg D, Chang B, Corneille O, Maurage P, Vermeulen N, Berthoz S, Luminet O: Alexithymia and the processing of emotional facial expressions (EFEs): systematic review, unanswered questions and further perspectives. PLoS One. 2012, 7: e42429.
- Sonnby-Borgström M: Alexithymia as related to facial imitation, mentalization, empathy, and internal working models-of-self and -others. Neuropsychoanalysis. 2009, 11: 111-128.
- Mériau K, Wartenburger I, Kazzer P, Prehn K, Lammers C-H, van der Meer E, Villringer A, Heekeren HR: A neural network reflecting individual differences in cognitive processing of emotions during perceptual decision making. Neuroimage. 2006, 33: 1016-1027.
- Montebarocci O, Surcinelli P, Rossi N, Baldaro B: Alexithymia, verbal ability and emotion recognition. Psychiatr Q. 2011, 82: 245-252.
- Yarkoni T, Braver TS: Cognitive neuroscience approaches to individual differences in working memory and executive control: conceptual and methodological issues. Handb Individ Differ Cogn. Edited by: Gruszka A, Matthews G, Szymura B. 2010, New York, NY: Springer, 87-107. [The Springer Series on Human Exceptionality]
- Lumley MA, Gustavson BJ, Partridge RT, Labouvie-Vief G: Assessing alexithymia and related emotional ability constructs using multiple methods: interrelationships among measures. Emotion. 2005, 5: 329-342.
- Taylor GJ: Recent developments in alexithymia theory and research. Can J Psychiatry. 2000, 45: 134-142.
- Lichev V, Rufer M, Rosenberg N, Ihme K, Grabe HJ, Kugel H, Donges U-S, Kersting A, Suslow T: Assessing alexithymia and emotional awareness: relations between measures in a German non-clinical sample. Compr Psychiatry. in press.
- Bach M, Bach D, de Zwaan M, Serim J, Böhmer J: Validierung der deutschen Version der 20-Item Toronto Alexithymie Skala bei Normalpersonen und psychiatrischen Patienten. [Validation of the German version of the 20-item Toronto Alexithymia Scale in normal subjects and psychiatric patients.]. Psychother Psychosom Med Psychol. 1996, 46: 23-28.
- Grabe HJ, Löbel S, Dittrich D, Bagby RM, Taylor GJ, Quilty LC, Spitzer C, Barnow S, Mathier F, Jenewein J, Freyberger HJ, Rufer M: The German version of the Toronto Structured Interview for Alexithymia: factor structure, reliability, and concurrent validity in a psychiatric patient sample. Compr Psychiatry. 2009, 50: 424-430.
- Hautzinger M, Keller F, Kühner C: Beck-Depressions-Inventar (BDI). [Beck Depression Inventory (BDI).]. 2006, Frankfurt/Main: Harcourt Test Services.
- Laux L, Glanzmann P, Schaffner P, Spielberger CD: State-Trait Anxiety Inventory (STAI). [German version.]. 1981.
- Krohne HW, Egloff B, Kohlmann C-W, Tausch A: Untersuchungen mit einer deutschen Version der “Positive and Negative Affect Schedule” (PANAS). [Investigations with a German version of the Positive and Negative Affect Schedule (PANAS).]. Diagnostica. 1996, 42: 139-156.
- Lundqvist D, Flykt A, Öhman A: The Karolinska Directed Emotional Faces - KDEF. 1998.
- Mugler JP, Brookeman JR: Three-dimensional magnetization-prepared rapid gradient-echo imaging (3D MP RAGE). Magn Reson Med. 1990, 15: 152-157.
- Grier JB: Nonparametric indexes for sensitivity and bias: computing formulas. Psychol Bull. 1971, 75: 424-429.
- Lieberman MD, Cunningham WA: Type I and Type II error concerns in fMRI research: re-balancing the scale. Soc Cogn Affect Neurosci. 2009, 4: 423-428.
- Yarkoni T, Barch DM, Gray JR, Conturo TE, Braver TS: BOLD correlates of trial-by-trial reaction time variability in gray and white matter: a multi-study fMRI analysis. PLoS One. 2009, 4: e4257.
- Tzourio-Mazoyer N, Landeau B, Papathanassiou D, Crivello F, Etard O, Delcroix N, Mazoyer B, Joliot M: Automated anatomical labeling of activations in SPM using a macroscopic anatomical parcellation of the MNI MRI single-subject brain. Neuroimage. 2002, 15: 273-289.
- Maldjian JA, Laurienti PJ, Kraft RA, Burdette JH: An automated method for neuroanatomic and cytoarchitectonic atlas-based interrogation of fMRI data sets. Neuroimage. 2003, 19: 1233-1239.
- Pessoa L, McKenna M, Gutierrez E, Ungerleider LG: Neural processing of emotional faces requires attention. Proc Natl Acad Sci U S A. 2002, 99: 11458-11463.
- Beck AT, Steer RA, Brown GK: BDI-II Beck Depression Inventory Manual. 2nd edition. 1998, San Antonio, TX: Psychological Corp.
- Spielberger CD, Gorsuch R, Lushene PR, Vagg PR, Jacobs AG: State-Trait Anxiety Inventory. 1970.
- Watson D, Clark LA, Tellegen A: Development and validation of brief measures of positive and negative affect: the PANAS scales. J Pers Soc Psychol. 1988, 54: 1063-1070.
- Vul E, Harris C, Winkielman P, Pashler H: Puzzlingly high correlations in fMRI studies of emotion, personality, and social cognition. Perspect Psychol Sci. 2009, 4: 274-290.
- Kriegeskorte N, Lindquist MA, Nichols TE, Poldrack RA, Vul E: Everything you never wanted to know about circular analysis, but were afraid to ask. J Cereb Blood Flow Metab. 2010, 30: 1551-1557.
- Pizzagalli DA, Lehmann D, Hendrick AM, Regard M, Pascual-Marqui RD, Davidson RJ: Affective judgments of faces modulate early activity (160 ms) within the fusiform gyri. Neuroimage. 2002, 16: 663-677.
- Suslow T, Kugel H, Rauch AV, Dannlowski U, Bauer J, Konrad C, Arolt V, Heindel W, Ohrmann P: Attachment avoidance modulates neural response to masked facial emotion. Hum Brain Mapp. 2009, 30: 3553-3562.
- Dannlowski U, Stuhrmann A, Beutelmann V, Zwanzger P, Lenzen T, Grotegerd D, Domschke K, Hohoff C, Ohrmann P, Bauer J, Lindner C, Postert C, Konrad C, Arolt V, Heindel W, Suslow T, Kugel H: Limbic scars: long-term consequences of childhood maltreatment revealed by functional and structural magnetic resonance imaging. Biol Psychiatry. 2012, 71: 286-293.
- Likowski KU, Mühlberger A, Gerdes ABM, Wieser MJ, Pauli P, Weyers P: Facial mimicry and the mirror neuron system: simultaneous acquisition of facial electromyography and functional magnetic resonance imaging. Front Hum Neurosci. 2012, 6: 214.
- Rochas V, Gelmini L, Krolak-Salmon P, Poulet E, Saoud M, Brunelin J, Bediou B: Disrupting pre-SMA activity impairs facial happiness recognition: an event-related TMS study. Cereb Cortex. 2012, 23: 1517-1525.
- Haxby JV, Hoffman EA, Gobbini MI: The distributed human neural system for face perception. Trends Cogn Sci. 2000, 4: 223-233.
- Adolphs R, Damasio H, Tranel D, Cooper G, Damasio AR: A role for somatosensory cortices in the visual recognition of emotion as revealed by three-dimensional lesion mapping. J Neurosci. 2000, 20: 2683-2690.
- Moriguchi Y, Ohnishi T, Decety J, Hirakata M, Maeda M, Matsuda H, Komaki G: The human mirror neuron system in a population with deficient self-awareness: an fMRI study in alexithymia. Hum Brain Mapp. 2009, 30: 2063-2076.
- Van der Velde J, Servaas MN, Goerlich KS, Bruggeman R, Horton P, Costafreda SG, Aleman A: Neural correlates of alexithymia: a meta-analysis of emotion processing studies. Neurosci Biobehav Rev. 2013, 37: 1774-1785.
- Reker M, Ohrmann P, Rauch AV, Kugel H, Bauer J, Dannlowski U, Arolt V, Heindel W, Suslow T: Individual differences in alexithymia and brain response to masked emotion faces. Cortex. 2010, 46: 658-667.
- Duan X, Dai Q, Gong Q, Chen H: Neural mechanism of unconscious perception of surprised facial expression. Neuroimage. 2010, 52: 401-407.
- Kugel H, Eichmann M, Dannlowski U, Ohrmann P, Bauer J, Arolt V, Heindel W, Suslow T: Alexithymic features and automatic amygdala reactivity to facial emotion. Neurosci Lett. 2008, 435: 40-44.
- Lee B-T, Lee H-Y, Park S-A, Lim J-Y, Tae WS, Lee M-S, Joe S-H, Jung I-K, Ham B-J: Neural substrates of affective face recognition in alexithymia: a functional magnetic resonance imaging study. Neuropsychobiology. 2011, 63: 119-124.
- Ihme K, Dannlowski U, Lichev V, Stuhrmann A, Grotegerd D, Rosenberg N, Kugel H, Heindel W, Arolt V, Kersting A, Suslow T: Alexithymia is related to differences in gray matter volume: a voxel-based morphometry study. Brain Res. 2013, 1491: 60-67.
- Pessoa L: On the relationship between emotion and cognition. Nat Rev Neurosci. 2008, 9: 148-158.
- Pessoa L, Japee S, Sturman D, Ungerleider LG: Target visibility and visual awareness modulate amygdala responses to fearful faces. Cereb Cortex. 2006, 16: 366-375.
- Bermond B, Clayton K, Liberova A, Luminet O, Maruszewski T, Ricci Bitti PE, Rimé B, Vorst HH, Wagner H, Wicherts J: A cognitive and an affective dimension of alexithymia in six languages and seven populations. Cogn Emot. 2007, 21: 1125-1136.
- Vorst H, Bermond B: Validity and reliability of the Bermond–Vorst Alexithymia Questionnaire. Pers Individ Dif. 2001, 30: 413-434.
- Bagby RM, Quilty LC, Taylor GJ, Grabe HJ, Luminet O, Verissimo R, De Grootte I, Vanheule S: Are there subtypes of alexithymia? Pers Individ Dif. 2009, 47: 413-418.
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.