
Inscrutable games? Facial expressions predict economic behavior

Neuroscientific and behavioral evidence shows that when subjects are engaged in simple economic games, they pay attention to the faces of their opponents. Is this a good idea? Does the face of a decision-maker contain information about his or her strategy space? We tested this hypothesis by modeling the facial expressions of subjects playing the Ultimatum Game. We recorded videos of 60 participants and automatically extracted time-series of facial actions (12 action units [1], shown in Fig. 1A, as well as pitch, yaw, and roll of the head) using the real-time facial coding system of [2, 3]. We then trained non-linear support vector machines (SVMs) to predict the decision of the second player from a segment of video acquired after the offer was received and before the decision was entered (n = 376). To separate the dynamics of facial behavior into different temporal scales, the data were preprocessed with a bank of Gabor filters. With this method we achieved a between-subjects cross-validation accuracy of 0.66 (chance = 0.50) in predicting decisions. Because receiving an unfair offer in the Ultimatum Game is known to evoke a distinctive facial expression [4], we also trained a model that can capture non-linear relations between facial expressions, fairness, and decisions. To do so, we labeled each instance as fair (offer > $3) or unfair, and then trained separate classifiers to be ‘experts’ on either fair or unfair offers only. In this case, out-of-sample classification accuracy increased to 0.78. For both cases, we used a forward selection procedure to identify the most predictive features (Fig. 1B).
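The multi-scale preprocessing step can be sketched as follows. This is a minimal illustration of filtering an action-unit intensity trace with a bank of temporal Gabor filters; the kernel scales, frequencies, and the mean-magnitude summary statistic are illustrative assumptions, not the parameters used in the study.

```python
import numpy as np

def gabor_kernel_1d(sigma, freq, length=None):
    """Complex 1-D Gabor kernel: a sinusoid windowed by a Gaussian."""
    if length is None:
        length = int(6 * sigma) | 1          # odd length spanning ~±3 sigma
    t = np.arange(length) - length // 2
    return np.exp(-t**2 / (2 * sigma**2)) * np.exp(2j * np.pi * freq * t)

def gabor_features(signal, scales=((2, 0.25), (4, 0.125), (8, 0.0625))):
    """Filter one AU time-series at several temporal scales and
    summarise each band by its mean response magnitude."""
    feats = []
    for sigma, freq in scales:                # (sigma, freq) pairs are assumed
        k = gabor_kernel_1d(sigma, freq)
        response = np.convolve(signal, k, mode="same")
        feats.append(np.abs(response).mean())
    return np.array(feats)

# toy AU intensity trace: a brief 'expression' burst embedded in noise
rng = np.random.default_rng(0)
trace = rng.normal(0, 0.05, 120)
trace[50:70] += np.sin(np.linspace(0, np.pi, 20))
print(gabor_features(trace).shape)           # one feature per temporal scale
```

Each filtered band captures facial dynamics at a different temporal scale; stacking the band summaries across all action units would yield the feature vector handed to the classifier.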

Figure 1

A. Action units (AUs) used in the analysis (image of the face created with Artnatomy [5]). B. Frequency with which a feature is selected as a covariate in a logistic classifier, using increases in area under the ROC curve as the inclusion criterion.
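The selection procedure behind panel B can be sketched as a greedy loop: at each step, add the feature whose inclusion most increases the AUC of a logistic classifier, stopping when no candidate clears a gain threshold. Everything below (the gradient-descent logistic fit, the rank-sum AUC, the 0.005 threshold, the toy data) is an illustrative assumption, not the authors' implementation.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=500):
    """Plain batch-gradient logistic regression (bias folded into X)."""
    Xb = np.c_[X, np.ones(len(X))]
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def predict_proba(X, w):
    Xb = np.c_[X, np.ones(len(X))]
    return 1 / (1 + np.exp(-Xb @ w))

def auc(y, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) statistic."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = y == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

def forward_select(X, y, min_gain=0.005):
    """Greedily add the feature whose inclusion most increases AUC."""
    selected, best_auc = [], 0.5
    while True:
        gains = {}
        for j in range(X.shape[1]):
            if j in selected:
                continue
            cols = selected + [j]
            w = fit_logistic(X[:, cols], y)
            gains[j] = auc(y, predict_proba(X[:, cols], w))
        if not gains:
            break
        j_best = max(gains, key=gains.get)
        if gains[j_best] - best_auc < min_gain:
            break
        selected.append(j_best)
        best_auc = gains[j_best]
    return selected, best_auc

# toy data: feature 2 is informative, the rest are noise
rng = np.random.default_rng(1)
y = rng.integers(0, 2, 200)
X = rng.normal(size=(200, 5))
X[:, 2] += 1.5 * y
sel, score = forward_select(X, y)
print(sel[0], score > 0.7)
```

Counting how often each feature survives this loop across cross-validation folds would produce a selection-frequency plot of the kind shown in Fig. 1B.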

Abstract approaches to the study of social decision-making usually disregard the fact that choices are made in informationally rich environments. An important goal, therefore, is to model these different sources of information, as well as the ways in which they affect decisions. The current study suggests that one important source of information about strategic decision-making behavior may be the face, since, given sensitive enough instruments, this information can be measured and quantified in real-time by a computer. It also suggests that real-time analysis of facial action codes may serve as a powerful new tool for understanding strategic decision making, one that can complement neuroimaging techniques such as EEG and fMRI.


  1. Ekman P, Friesen WV: Facial Action Coding System: A Technique for the Measurement of Facial Movement. 1978, Palo Alto: Consulting Psychologists Press.

  2. Littlewort G, Whitehill J, Wu T, Fasel I, Frank M, Movellan J, Bartlett M: The Computer Expression Recognition Toolbox (CERT). Face and Gesture Recognition, to appear.

  3. Littlewort G, Bartlett MS, Fasel I, Susskind J, Movellan J: Dynamics of facial expression extracted automatically from video. Image and Vision Computing. 2006, 24: 615-625. 10.1016/j.imavis.2005.09.011.

  4. Chapman HA, Kim DA, Susskind JM, Anderson AK: In Bad Taste: Evidence for the Oral Origins of Moral Disgust. Science. 2009, 328: 1222-1226. 10.1126/science.1165565.



Author information



Corresponding author

Correspondence to Filippo Rossi.

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Rossi, F., Fasel, I. & Sanfey, A.G. Inscrutable games? Facial expressions predict economic behavior. BMC Neurosci 12 (Suppl 1), P281 (2011).

