Citation: Pauline M. Insch, Gillian Slessor, Louise H. Phillips, Anthony Atkinson, Jill Warrington. The Impact of Aging and Alzheimer's Disease on Decoding Emotion Cues from Bodily Motion[J]. AIMS Neuroscience, 2015, 2(3): 139-152. doi: 10.3934/Neuroscience.2015.3.139
Older adults are poorer at perceiving emotional expressions than their younger counterparts [1]. Perception of emotions is even more impaired in older people who have dementia, including Alzheimer's disease (AD) [2]. Understanding the nature of these difficulties is important because emotion perception predicts quality of life in both healthy older adults and those with AD [3]. Although there have been claims that emotion perception problems are specific to certain emotions such as sadness, anger and fear in both healthy aging [1] and AD [4], it is also possible that response biases in choosing emotion labels influence the pattern of results found [5]. The majority of studies examining age or AD effects on emotion perception use photographs of facial expressions. It is therefore important to establish whether the pattern of problems translates to other modalities, such as bodily gestures and motion. In the current study we explore the effects of aging and AD on decoding emotions from bodily motion, examining both the quantity and the nature of errors made.
Most emotion perception studies have focused on facial expressions; however, nonverbal cues to emotion are also conveyed through body movements [6], and decoding cues from the body is thought to aid effective social interaction [7]. The ability to perceive bodily motion is typically investigated using point-light animations, in which the only visible elements are points located at the body joints. Point-light stimuli reduce or eliminate visual information about the static form of the body and the face, leaving motion information intact. Despite this impoverished visual information, observers can recognize human actions such as walking and dancing [8] and the emotional states of others [9,10].
Normal aging impairs the ability to detect emotional information from point-light displays of bodily motion. Ruffman et al. (2009) [11] asked older and younger participants to watch videos depicting bodily expressions of emotions. Older adults were significantly worse than younger adults at decoding anger and sadness (but not disgust, fear or happiness), consistent with age differences in facial emotion perception [1,5,12]. Insch et al. (2012) [13] explored age-related differences in decoding both emotional and non-emotional (action) point-light stimuli. Older adults performed more poorly than the young on both tasks, suggesting more general difficulties in motion perception [14,15]. Because there were only a small number of stimuli for each emotion, Insch et al. did not analyze age differences for the individual emotion labels.
There is substantial evidence that AD impairs facial emotion perception, though there is disagreement on the extent to which AD may differentially affect the ability to label specific facial emotions. Some studies indicate impairments in labeling sad, angry and fearful facial expressions, with relative sparing of the ability to label disgusted faces [4] or happy faces [16]. However, a meta-analysis [2] indicated that AD-related impairments in labeling emotions were general rather than specific. No studies to date have looked specifically at the effects of AD on the ability to decode emotional information from point-light bodily displays.
Processing of visual motion is known to be impaired in AD [17]. People with AD perform worse than age-matched controls at identifying actions from bodily motion, and this may be related to changes in frontal-cingulate networks in AD [18]. There is also evidence that people with AD are impaired at processing affective information from videos of body movements [19], particularly sadness. Henry et al. (2012) [20] explored the ability to decode actions and emotions from point-light figures in a group of participants with dementia and healthy aging controls. Dementia subtyping information was not available for their participants, so the dementia group likely included people with AD, frontotemporal dementia and vascular dementias. Henry et al. found that people with dementia performed worse than controls at decoding both emotions and actions from point-light bodily stimuli. The dementia group performed worse at recognizing all of the emotions included (anger, fear, sadness and happiness); however, disgust was not among the stimuli, and this is the emotion that has often been relatively robust to dementia effects on facial emotion perception [4].
In summary, limited research has investigated the ability to decode basic emotions from point-light depictions of biological motion in healthy aging and AD. Furthermore, none of these studies analyzed the pattern of errors that participants make or corrected for potential response biases toward specific emotions. In emotion perception tasks participants are usually required to choose an emotion label from a list, and there is a risk that a participant favors one label more than others. If a label is given more frequently (e.g., disgust), this may increase the apparent accuracy of identifying that emotion by chance [5]. For example, if a participant said disgust on every trial, they would score 100% accuracy for disgust recognition because of a labeling bias and not because of an ability to recognize disgust better than other emotions. For facial emotions, there is evidence that apparent age-related stability in labeling some emotions may in fact reflect response biases [5].
The aims of the current research were therefore to explore how healthy aging (Study 1), and AD (Study 2) impact on the ability to accurately decode information about sadness, happiness, fear, anger and disgust from point light cues to bodily motion. Few studies have looked at both healthy aging and AD effects on emotion perception using exactly the same methods, and the current study will improve understanding of whether any AD effects are qualitatively different to those seen in normal aging. Of particular interest was the pattern of age and AD effects in labeling specific emotions. A non-emotional action identification task was also used to investigate whether group effects were specific to identifying emotions. We also explored the error patterns and carried out analyses to control for response biases.
Study 1 addresses the effects of healthy aging on the perception of biological motion: older adults are less accurate than younger adults at labeling emotions from point-light figures of body motion [3], particularly anger and sadness [11]. Our first prediction was therefore that age-related impairments would be greatest for sadness and anger, and smaller for disgust and happiness. Following results from facial expression tasks, it was also predicted that older adults might be biased to choose the disgust label, and that correcting for this would result in age differences in identifying all emotions.
Two groups of participants were recruited: 40 young adults (11 male) aged 18 to 40 years (M = 20.90 years, SD = 4.20 years), all of whom were undergraduate psychology students who took part for course credit, and 45 older adults (12 male) aged 63 to 87 years (M = 72.80 years, SD = 5.95 years), who were volunteers from the University of Aberdeen participant panel and were reimbursed for their time. All participants were native English speakers and reported no previous or current neuropsychological disorders. All participants had normal or corrected-to-normal near and distance vision; those who required corrective lenses wore them during the experiment. No significant difference in gender ratio was found between the age groups, χ2 (1) = 0.007, p = 0.93. The age groups differed in years of education, t (84) = 4.37, p < 0.001, d = 0.97 (young M = 14.75 years, SD = 1.17; old M = 12.82 years, SD = 2.56).
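As an illustration only, the gender-ratio comparison reported above can be reproduced from the cell counts alone. The short sketch below is not the authors' analysis script; it simply runs a chi-square test of independence (without Yates' correction) on the male/female counts in each age group.

```python
# Minimal sketch: reproduce the reported gender-ratio check from cell counts.
# 11 of 40 young adults and 12 of 45 older adults were male.
from scipy.stats import chi2_contingency

#          male  female
counts = [[11, 40 - 11],   # young adults
          [12, 45 - 12]]   # older adults

chi2, p, dof, expected = chi2_contingency(counts, correction=False)
print(f"chi2({dof}) = {chi2:.3f}, p = {p:.2f}")   # approx. chi2(1) = 0.007, p = 0.93
```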
The biological motion films [9] (see http://community.dur.ac.uk/a.p.atkinson/Welcome.html for examples) were presented at a viewing distance of approximately 45 cm on a 17" screen. Each clip lasted 3 seconds. The actor started from a neutral standing position; after portraying the action or emotion, the point-light figure returned to a neutral standing position and remained on screen until the participant had made their choice. In the action block participants viewed 30 point-light videos depicting one of five actions (bend, dig, hop, jump and walk); six different versions of each action were presented. There were also 30 trials in the emotion block; five emotions were included (anger, disgust, fear, happiness, sadness), with six exemplars of each. The presentation order of the action and emotion blocks was counterbalanced. The trials within each block were presented in the same randomized order to each participant. At the end of each video participants said aloud which action or emotion they thought was being depicted, choosing from a list of five possible actions or emotions.
The dependent variable was the number of actions or emotions reported correctly (max = 30); these scores were converted to percentage accuracy for each block. Error frequencies were calculated for the emotion block by logging each incorrect label given. Response bias was controlled for using kappa scores as a measure of accuracy adjusted for response biases [5,21]. This kappa analysis involves counting the number of times a participant correctly used an emotion label (correct responses) and the number of times the participant correctly refrained from using that label when it did not apply (correct rejections). An adjustment is then made for the number of correct responses and correct rejections that would be expected by chance alone. A kappa (K) score was calculated for each emotion for each participant: K = (total number of correct responses and correct rejections − number expected by chance) / (total number of stimuli − number expected by chance). The resulting kappa scores range from 0 (performance at chance level) to 1 (all responses were either correct responses or correct rejections).
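A minimal sketch of how a per-emotion kappa of this form could be computed is given below. It reflects one plausible reading of the formula above (each emotion label treated as a 2 × 2 "label used vs. not used" classification, i.e., a per-label Cohen's kappa); it is not the authors' own analysis code, and the function and variable names are illustrative.

```python
# Sketch only: per-emotion kappa for one participant, assuming `targets` and
# `responses` are equal-length lists of emotion labels for the 30 emotion trials.
def emotion_kappa(targets, responses, label):
    n = len(targets)                                   # 30 emotion trials
    hits = sum(t == label and r == label
               for t, r in zip(targets, responses))    # correct responses
    correct_rejections = sum(t != label and r != label
                             for t, r in zip(targets, responses))
    # agreements expected by chance, from the marginal counts
    n_target = sum(t == label for t in targets)        # 6 exemplars per emotion
    n_used = sum(r == label for r in responses)        # how often the label was chosen
    expected = (n_target * n_used + (n - n_target) * (n - n_used)) / n
    observed = hits + correct_rejections
    return (observed - expected) / (n - expected)

# e.g. one kappa per emotion for a single participant:
# kappas = {emo: emotion_kappa(targets, responses, emo)
#           for emo in ("anger", "disgust", "fear", "happiness", "sadness")}
```

On this reading, a participant who answered "disgust" on every trial would obtain a disgust kappa of 0 rather than a spuriously perfect accuracy score, which is precisely the labeling bias the correction is intended to remove.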
A 2 (task: action and emotion) × 2 (age group: young and older) mixed design ANOVA was conducted on percentage accuracy. There was a main effect of task, F (1, 83) = 742.95, p < 0.001, ηp2 = 0.90, with the action task (M = 97.02) showing higher accuracy than the emotion task (M = 59.01). There was also a main effect of age, F (1, 83) = 44.14, p < 0.001, ηp2 = 0.35, with younger adults (M = 83.15) scoring higher than older adults (M = 73.45). Finally, the interaction between age and task was also significant, F (1, 84) = 30.49, p < 0.001, ηp2 = 0.27. Independent samples t-tests revealed no significant age difference on the action block, t (83) = 1.96, p = 0.09, with younger (M = 98.13) and older adults (M = 96.03) scoring comparably. There was a significant age difference on the emotion block, t (83) = 6.54, p < 0.001, with older adults (M = 50.87) less accurate than younger adults (M = 68.17).
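The sketch below illustrates how an analysis of this form (a 2 × 2 mixed ANOVA with follow-up independent samples t-tests) could be run with the open-source pingouin library. It is not the authors' original analysis script: the column names are assumptions, and the data frame is filled with placeholder values purely so the example runs.

```python
# Hedged sketch of a 2 (task, within) x 2 (age group, between) mixed ANOVA.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(0)
subjects = [f"p{i:02d}" for i in range(85)]
groups = ["young"] * 40 + ["older"] * 45
rows = []
for subj, grp in zip(subjects, groups):
    for task in ("action", "emotion"):
        rows.append({"subject": subj, "age_group": grp, "task": task,
                     "accuracy": rng.uniform(40, 100)})   # placeholder percentages
df = pd.DataFrame(rows)

# 2 x 2 mixed ANOVA: task is within-subjects, age group is between-subjects
aov = pg.mixed_anova(data=df, dv="accuracy", within="task",
                     subject="subject", between="age_group")
print(aov[["Source", "F", "p-unc", "np2"]])

# follow-up independent samples t-tests on each block separately
for block in ("action", "emotion"):
    sub = df[df["task"] == block]
    res = pg.ttest(sub.loc[sub["age_group"] == "young", "accuracy"],
                   sub.loc[sub["age_group"] == "older", "accuracy"])
    print(block, res[["T", "dof", "p-val", "cohen-d"]])
```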
A 2 (age group: young and old) by 5 (emotion label: anger, disgust, fear, happiness and sadness) mixed design ANOVA was conducted on the emotion trials to determine whether age differences were apparent for specific emotions. There was a main effect of emotion label, F (4, 332) = 109.23, p < 0.001, ηp2 = 0.57, with highest scores for happiness (M = 5.01) and lowest for disgust (M = 1.62). There was also a main effect of age, F (1, 83) = 43.71, p < 0.001, ηp2 = 0.35, with younger adults (M = 4.01) achieving higher accuracy than the older group (M = 3.07). The interaction between emotion label and age group was also significant, F (4, 332) = 7.71, p < 0.001, ηp2 = 0.09. Independent samples t-tests (see Table 1) revealed that older adults were significantly worse than younger participants when decoding anger, sadness and fear from the point-light stimuli. There were no significant group differences for disgust or happiness.
Table 1. Accuracy (out of 6) for each emotion by age group in Study 1.

| Emotion | Young (n = 40) M | Young SD | Old (n = 45) M | Old SD | t(83) | p | Cohen's d |
|---|---|---|---|---|---|---|---|
| Anger | 4.75 | 1.19 | 3.44 | 1.25 | 4.90 | < 0.001 | 1.08 |
| Disgust | 1.85 | 1.17 | 1.42 | 1.14 | 1.71 | ns | 0.37 |
| Fear | 4.23 | 1.27 | 3.20 | 1.47 | 3.41 | < 0.001 | 0.80 |
| Happiness | 5.23 | 0.83 | 4.82 | 1.09 | 1.89 | ns | 0.42 |
| Sadness | 4.45 | 1.08 | 2.47 | 1.44 | 7.10 | < 0.001 | 1.56 |
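As a transparency check, the anger row of Table 1 can be approximately reproduced from its summary statistics alone. The sketch below assumes an equal-variance t-test and a pooled-SD Cohen's d; it is not taken from the paper's analysis code.

```python
# Reproduce the Table 1 anger row from summary statistics (means, SDs, ns only).
from math import sqrt
from scipy.stats import ttest_ind_from_stats

m_young, sd_young, n_young = 4.75, 1.19, 40
m_old, sd_old, n_old = 3.44, 1.25, 45

t, p = ttest_ind_from_stats(m_young, sd_young, n_young,
                            m_old, sd_old, n_old)        # equal-variance t-test
pooled_sd = sqrt(((n_young - 1) * sd_young**2 + (n_old - 1) * sd_old**2)
                 / (n_young + n_old - 2))
d = (m_young - m_old) / pooled_sd                        # pooled-SD Cohen's d

print(f"t({n_young + n_old - 2}) = {t:.2f}, p = {p:.1e}, d = {d:.2f}")
# ~ t(83) = 4.93, p < 0.001, d = 1.07 -- in line with the reported 4.90 and 1.08
```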
To determine whether the labels given in error on the emotion task varied by age, a 2 (age group: young and old) by 5 (emotion label: anger, disgust, fear, happiness and sadness) mixed design ANOVA was performed (see Table 2 for descriptive statistics). There was a main effect of emotion label, F (4, 332) = 28.40, p < 0.001, ηp2 = 0.25, with sadness being the most likely (M = 3.47) and happiness the least likely (M = 1.11) label to be used in error. There was also a main effect of age, F (1, 83) = 45.73, p < 0.001, ηp2 = 0.35, with older adults making more errors (M = 2.92) than the younger group (M = 1.89). The interaction between emotion label and age group was also significant, F (4, 332) = 4.99, p < 0.001, ηp2 = 0.06. Independent samples t-tests were used to explore the interaction (see Table 2); older adults used the labels anger, disgust, fear and happiness in error significantly more often than the younger group. There was no significant age difference for the sadness label.
Table 2. Frequency with which each emotion label was used in error, by age group in Study 1.

| Emotion label | Young (n = 40) M | Young SD | Old (n = 45) M | Old SD | t(83) | p | Cohen's d |
|---|---|---|---|---|---|---|---|
| Anger | 1.18 | 1.08 | 3.27 | 1.92 | -6.07 | < 0.001 | 1.34 |
| Disgust | 2.38 | 1.87 | 3.38 | 1.74 | -2.55 | < 0.05 | 0.55 |
| Fear | 1.73 | 1.19 | 3.07 | 1.69 | -4.16 | < 0.001 | 0.91 |
| Happiness | 0.78 | 0.83 | 1.40 | 1.27 | -2.65 | < 0.05 | 0.58 |
| Sadness | 3.40 | 1.53 | 3.53 | 1.71 | -0.38 | ns | 0.08 |
Controlling for Response Biases in Healthy Aging
Kappa scores were calculated (see section 2.12) to control for potential response biases. Age differences in these bias-corrected indicators of emotion perception were analyzed in a 2 (age group: young and old) by 5 (emotion: anger, disgust, fear, happiness and sadness) mixed design ANOVA. There was a main effect of emotion label, F (4, 332) = 101.23, p < 0.001, ηp2 = 0.55, with best performance for happiness (M = 0.95) and poorest for disgust (M = 0.74). There was a main effect of age, F (1, 83) = 43.79, p < 0.001, ηp2 = 0.35, with younger adults achieving higher accuracy (M = 0.89) than older adults (M = 0.81). The interaction between emotion label and age group was also significant, F (4, 332) = 5.78, p < 0.001, ηp2 = 0.06. Independent samples t-tests revealed that older adults were now significantly less accurate for all emotions, with the largest age effects for anger, fear and sadness (see Table 3).
Table 3. Kappa scores (accuracy corrected for response bias) for each emotion by age group in Study 1.

| Emotion | Young (n = 40) M | Young SD | Old (n = 45) M | Old SD | t(83) | p | Cohen's d |
|---|---|---|---|---|---|---|---|
| Anger | 0.94 | 0.07 | 0.81 | 0.10 | 6.76 | < 0.001 | 1.52 |
| Disgust | 0.78 | 0.09 | 0.72 | 0.09 | 2.78 | < 0.01 | 0.60 |
| Fear | 0.90 | 0.08 | 0.81 | 0.10 | 4.94 | < 0.001 | 1.10 |
| Happiness | 0.98 | 0.05 | 0.93 | 0.07 | 2.89 | < 0.01 | 0.60 |
| Sadness | 0.84 | 0.09 | 0.75 | 0.08 | 4.52 | < 0.001 | 1.00 |
Results of Study 1 showed that older adults performed comparably to the younger group in decoding actions from biological motion but were less accurate when decoding emotions. However, there were ceiling effects in the action task data, so these results should be interpreted with care. Older adults were significantly worse than younger adults at labeling anger, sadness and fear from the emotional point-light displays, in common with previous studies of both facial and bodily expressions of emotion [1,11]. The conclusion that would generally be drawn from these data is therefore that older adults show a specific difficulty in labeling some emotions (anger, sadness, fear) but preserved ability to label others (happiness, disgust). However, when the data were corrected for potential response biases in emotion labeling (kappa scores), older adults were less accurate for all five emotions. This indicates that previous findings of greater age effects for specific emotions, such as anger and sadness, might partly reflect response biases. Caution should therefore be used when interpreting emotion labeling tasks, as performance may reflect biases in the choice of emotion labels. Even when response biases were controlled for, the largest age effects were still seen for sadness, anger and fear.
Previous research suggests that those with AD have even greater difficulty recognizing emotions from facial expressions than healthy older adults [2]. As noted above, only one study to date has investigated the effects of dementia on the ability to decode emotion from point-light displays, finding that patients with mixed dementia etiology had greater difficulties with this task than healthy controls [20]. However, the dementia group in that study was not separated by subtype and therefore likely included people with vascular or frontotemporal dementia as well as people with AD. Given that frontotemporal dementia is known to have particularly large effects on social and emotional processing [22], it is important to look specifically at the effects of AD on the ability to distinguish emotions from point-light displays.
The first aim of the current study was to explore how AD affects the ability to decode emotional and action cues from biological motion. Second, the study aimed to investigate whether AD causes specific problems in labeling individual emotions (particularly sadness and fear) relative to others (particularly disgust and happiness). Finally, this study is the first to explore possible biases in choosing emotion labels for biological motion stimuli in AD, and to investigate how correcting for them influences the pattern of performance.
The healthy aging group were volunteers recruited from the University of Aberdeen participant panel and reimbursed for their time. The AD participants were recruited from three sources: 1) the Department of Old Age Psychiatry at Royal Cornhill Hospital, Aberdeen, 2) local Alzheimer Scotland groups, and 3) registered volunteers from the Scottish Dementia Clinical Research Network. All met the diagnostic criteria for "probable" AD as established by the National Institute of Neurological and Communicative Disorders and Stroke and the AD and Related Disorders Association working group [23], as diagnosed by a psychiatrist. AD participants had Mini-Mental State Examination (MMSE) [24] scores of 16-26, falling in the mild to moderate stage of the disease, and had capacity to consent. MMSE scores for the healthy aging group ranged from 28 to 30. Exclusion criteria (based on self-report) for both groups included severe sensory impairment, epilepsy, learning disability as classified by ICD-10, and alcohol/drug dependency. Those in the healthy aging group were excluded if their MMSE score was below 26 or if they had a self-reported history of psychiatric conditions or depression. All participants with AD were taking prescribed medications, including cholinesterase inhibitors and pain relief such as paracetamol. Fifteen healthy controls (8 female) and 15 people with AD (8 female) completed all of the biological motion tasks. The groups did not differ significantly in gender, χ2 (1) = 0.13, p = 0.72, age, t (28) = -1.61, p = 0.12 (control M = 73.13, SD = 5.21; AD M = 75.25, SD = 6.46), or years of education, t (28) = 0.29, p = 0.77 (control M = 14.33 years, SD = 2.67; AD M = 13.93 years, SD = 4.54). As expected, MMSE scores were significantly lower in the AD group, t (28) = 7.13, p < 0.01 (control M = 29.47, SD = 0.64; AD M = 23.87, SD = 2.97).
The same biological motion tasks administered in Study 1 were used in Study 2.
A 2 (group: control and AD) by 2 (stimulus type: action and emotion) mixed design ANOVA revealed a main effect of stimulus type, F (1, 28) = 411.31, p < 0.001, ηp2 = 0.94: the action block was performed with higher accuracy (M = 92.63) than the emotion block (M = 50.78). The main effect of group was significant, F (1, 28) = 14.80, p < 0.001, ηp2 = 0.35, with the AD group scoring lower than the control group. The interaction between task and group was also significant, F (1, 28) = 13.98, p < 0.001, ηp2 = 0.33. Independent samples t-tests revealed no group difference on the action block, t (28) = 1.01, p = 0.32 (control M = 94.53; AD M = 90.00). In contrast, the AD group performed significantly less accurately (M = 40.53) than the controls (M = 60.47) on the emotion block, t (28) = 6.88, p < 0.001.
Group differences for specific emotions were also explored (see Table 4 for descriptive statistics) using a 2 (group: control and AD) by 5 (emotion: anger, disgust, fear, happiness and sadness) mixed design ANOVA. The main effect of emotion label was significant, F (4, 112) = 36.75, p < 0.001, ηp2 = 0.57, with highest scores for happiness (M = 5.17) and lowest for disgust (M = 1.33). The main effect of group was significant, F (1, 28) = 35.53, p < 0.001, ηp2 = 0.56, with the AD group significantly less accurate (M = 2.57) than controls (M = 3.64). The interaction between emotion label and group was also significant, F (4, 112) = 2.47, p < 0.05, ηp2 = 0.08. Independent samples t-tests revealed that the AD group was significantly poorer than the control group at recognizing anger, fear and sadness (see Table 4).
Table 4. Accuracy (out of 6) for each emotion by group in Study 2.

| Emotion | Control (n = 15) M | Control SD | AD (n = 15) M | AD SD | t(28) | p | Cohen's d |
|---|---|---|---|---|---|---|---|
| Anger | 3.93 | 1.16 | 2.40 | 1.35 | 3.33 | < 0.01 | 1.21 |
| Disgust | 1.60 | 1.12 | 1.07 | 1.39 | 1.15 | ns | 0.41 |
| Fear | 4.00 | 1.13 | 2.07 | 1.58 | 3.85 | < 0.01 | 1.40 |
| Happiness | 5.27 | 0.70 | 5.07 | 0.88 | 0.69 | ns | 0.25 |
| Sadness | 3.40 | 1.30 | 2.27 | 1.23 | 2.46 | < 0.05 | 0.89 |
The frequency of each type of emotion error was also explored using a 2 (group: control and AD) by 5 (emotion label error: anger, disgust, fear, happiness and sadness) mixed design ANOVA (see Table 5 for descriptive statistics). The analysis revealed a significant main effect of emotion label error, F (4, 112) = 3.24, p < 0.05, ηp2 = 0.10, with fear the label least likely (M = 2.20) and sadness the label most likely (M = 3.83) to be used in error. There was also a main effect of group, F (1, 28) = 35.53, p < 0.01, ηp2 = 0.56, with the AD group making more errors overall (M = 3.43) than the control group (M = 2.36).
Table 5. Frequency with which each emotion label was used in error, by group in Study 2.

| Emotion label | Control (n = 15) M | Control SD | AD (n = 15) M | AD SD |
|---|---|---|---|---|
| Anger | 2.13 | 1.64 | 2.87 | 1.77 |
| Disgust | 2.60 | 1.76 | 3.33 | 1.63 |
| Fear | 1.33 | 1.23 | 3.07 | 1.94 |
| Happiness | 2.47 | 1.45 | 3.47 | 2.13 |
| Sadness | 3.27 | 1.83 | 4.40 | 1.95 |
The group × emotion label interaction was not significant, F (4, 112) = 0.36, p = 0.84, suggesting that the types of label given in error were not influenced by group membership.
Controlling for Response Biases in AD
Descriptive statistics for the kappa scores (see section 2.12) are shown in Table 6. Kappa scores provide a measure of accuracy that corrects for potential response biases and were analyzed using a 2 (group: control and AD) by 5 (emotion: anger, disgust, fear, happiness and sadness) mixed design ANOVA. The main effect of emotion was significant, F (4, 112) = 21.04, p < 0.001, ηp2 = 0.43, with highest scores for happiness (M = 0.85) and lowest for disgust (M = 0.77). There was a main effect of group, F (1, 28) = 35.53, p < 0.001, ηp2 = 0.56, with the control group being more accurate overall. The interaction between emotion label and group was not significant, F (4, 112) = 0.75, p = 0.60, indicating no differential impairment in identifying specific emotions once response biases were taken into account.
Table 6. Kappa scores (accuracy corrected for response bias) for each emotion by group in Study 2.

| Emotion | Control (n = 15) M | Control SD | AD (n = 15) M | AD SD |
|---|---|---|---|---|
| Anger | 0.87 | 0.06 | 0.78 | 0.09 |
| Disgust | 0.76 | 0.09 | 0.71 | 0.08 |
| Fear | 0.86 | 0.08 | 0.76 | 0.10 |
| Happiness | 0.95 | 0.05 | 0.86 | 0.09 |
Compared to controls, those with AD performed worse on the emotion block but not the action block of the point-light task. This finding contrasts with results from other neurodegenerative patient populations, which indicate declines for both instrumental actions and emotion recognition [25]. The group with AD was significantly less accurate than the controls when identifying anger, sadness and fear from point-light displays. This is consistent with reports on identifying facial expressions of emotion, which indicate that AD is most likely to impair labeling of sadness and fear, with relative sparing of disgust and happiness [4,16]. In contrast to the current research, Henry et al. (2012) found no specific effect of dementia on labeling particular emotions from biological motion stimuli. However, the Henry et al. study included participants with a range of different types of dementia, in contrast to the specific AD sample described here. In the current study we also explored labeling errors, and the groups did not differ in the frequency with which specific emotion labels were given in error. When potential response biases were corrected by calculating kappa scores, the group with AD was worse at emotion recognition overall, with deficits generalizing to all emotions rather than just the anger, sadness and fear indicated in the initial accuracy analysis.
Body movements are a rich source of social cues, and the evidence from the current studies indicates that both healthy aging and AD cause problems in interpreting emotion from point-light displays. Compared to young adults, older people had problems labeling angry, fearful and sad expressions from bodily movements (Study 1). Compared to healthy older people, those with AD had even more difficulty labeling bodily expressions of those same emotions (Study 2). This suggests that although AD results in substantial quantitative impairments in recognizing emotions from bodily motion, there is not a qualitative difference between the effects of adult aging and AD. Forced-choice labeling tasks are the most common method of assessing emotion recognition skills. However, these tasks may be subject to response biases, which were also investigated here.
Statistical control of response biases through kappa scores revealed that both healthy older adults (Study 1) and people with AD (Study 2) had difficulties in recognizing all emotions from body movements. The analysis of kappa scores also indicated a difference in response patterns to bodily expressed emotions in aging and AD: in Study 1 there was an interaction between age group and emotion label, reflecting a stronger effect of age on anger, fear and sadness, whereas there was no such interaction in Study 2, suggesting similar effects of AD on identifying all emotions. Interpreting the specificity of impairments in labeling emotions in aging or dementia should therefore be carried out in the context of possible response biases. Without this information, apparent differences in the accuracy of labeling specific emotions may in fact reflect a bias to choose some emotion labels more often than others.
The current results indicate fairly widespread problems in emotion perception in normal aging and AD. These impairments could be caused by changes in key brain networks. Ruffman et al. (2008) [1] propose that age changes in emotion perception, including from body movement, may reflect decline in emotion-related frontal-subcortical brain networks. While AD is predominantly seen as a disorder of memory, related initially to changes in the entorhinal cortex and hippocampus, there is also evidence that AD causes atrophy in the orbitofrontal cortex [26], the cingulate cortex [27] and the amygdala [28]. Therefore, the problems that participants with AD showed in decoding biological motion in the current study might reflect deterioration in the frontal-subcortical networks important in emotion processing. Indeed, previous evidence indicates a link between reduced function in key frontal lobe regions and poorer facial emotion perception in AD [29].
It is also possible that other neurocognitive mechanisms underlie the effects of aging and AD on decoding emotions from biological motion. Henry et al. (2012) [20] showed that, in a group of patients with MCI and mixed forms of dementia, there was a substantial correlation between semantic memory performance and the ability to decode action and emotion from point-light displays. They argue that the ability to access semantic information about motion might be important in correctly classifying bodily motion. There is also likely to be a role for executive functions in the type of decision making involved in emotion labeling tasks: Phillips et al. (2010) [3] report a significant correlation between executive function and facial emotion perception in participants with AD. Detailed measurement of cognitive function was not available for the current participants, but it would be useful in future research to gather more information on key cognitive domains (e.g. semantic memory, executive function, visual perception), as well as neuroimaging data, to better understand the mechanisms underlying the effects of AD on decoding emotional information from bodily motion.
Results showed that aging and AD did not influence the ability to label simple actions from biological motion cues. However, it is important to note that there were ceiling effects in participants' data for the action block. Using a different task, Henry et al. (2012) found dementia-related impairments of similar magnitude in identifying actions and emotions from point-light stimuli. It would be useful for future AD research to use an action identification task that varies more in difficulty, for example by manipulating factors such as inversion and visual noise [15]. Another limitation of the present research is the relatively small sample of participants with AD, although the number is comparable to other studies of AD effects on emotion perception [30,31,32,33].
Problems in perceiving nonverbal cues to emotion may have social implications for older adults and those with AD. The ability to understand dynamic cues from gestures and body movements is an important factor in understanding and reacting appropriately to other people. Understanding facial expressions of emotion relates to quality of life in healthy older adults and those with AD, independent of cognitive decline and mood [3]. We did not assess quality of life or social function in the current sample, but evidence from other neurological samples indicates that problems with multimodal emotion perception are related to poor social interaction skills after brain injury [34] and reduced social participation following stroke [35]. Further research is needed to explore the links between different aspects of emotion perception and social functioning in aging and dementia.
In sum, decoding emotional cues from biological motion was impaired in aging, and further impaired in AD. When biases in labeling were corrected, older adults were significantly less accurate than younger adults for all emotions, with particular difficulty in identifying anger, fear and sadness. People with AD made more errors overall but did not differ significantly from controls in the pattern of errors made. Future research should investigate aging and AD effects on a wider range of depictions of emotion from different modalities, while exploring the issue of response biases.
This work was supported by a grant from the Lily Charlton Trust. The authors wish to acknowledge the support of the Older Adult Mental Health Directorate at Royal Cornhill Hospital, NHS Grampian, Alzheimer's Scotland and the Scottish Dementia Clinical Research Network. We would also like to thank all the participants for their support in taking part. This project was completed as part of a doctoral dissertation by P. M. Insch.
No author has any conflict of interest to report.
[1] Ruffman T, Henry JD, Livingstone V, et al. (2008) A meta-analytic review of emotion recognition and aging: Implications for neuropsychological models of aging. Neurosci Biobehav Rev 32: 863-881.
[2] Klein-Koerkamp Y, Beaudoin M, Baciu M, et al. (2012) Emotional decoding abilities in Alzheimer's disease: A meta-analysis. J Alzheimers Dis 32: 109-125.
[3] Phillips LH, Scott C, Henry JD, et al. (2010) Emotion perception in Alzheimer's disease and mood disorder in old age. Psychol Aging 25: 38-47.
[4] Henry JD, Ruffman T, McDonald S, et al. (2008) Recognition of disgust is selectively preserved in Alzheimer's disease. Neuropsychologia 46: 203-208.
[5] Isaacowitz DM, Löckenhoff CE, Lane RD, et al. (2007) Age differences in recognition of emotion in lexical stimuli and facial expressions. Psychol Aging 22: 147-159. doi: 10.1037/0882-7974.22.1.147
[6] de Gelder B, Van den Stock J (2011) The Bodily Expressive Action Stimulus Test (BEAST). Construction and validation of a stimulus basis for measuring perception of whole body expression of emotions. Front Psychol 2: 181.
[7] de Gelder B, Van den Stock J, Meeren H, et al. (2010) Standing up for the body. Recent progress in uncovering the networks involved in the perception of bodies and bodily expressions. Neurosci Biobehav Rev 34: 513-527.
[8] Johansson G (1973) Visual perception of biological motion and a model for its analysis. Percept Psychophys 14: 201-211.
[9] Atkinson AP, Dittrich WH, Gemmell AJ, et al. (2004) Emotion perception from dynamic and static body expressions in point-light and full-light displays. Perception 33: 717-746.
[10] Heberlein AS, Adolphs R, Tranel D, et al. (2004) Cortical regions for judgments of emotions and personality traits from point-light walkers. J Cogn Neurosci 16: 1143-1158.
[11] Ruffman T, Sullivan S, Dittrich W (2009) Older adults' recognition of bodily and auditory expressions of emotion. Psychol Aging 24: 614-622.
[12] Calder AJ, Keane J, Manly T, et al. (2003) Facial expression recognition across the adult life span. Neuropsychologia 41: 195-202.
[13] Insch PM, Bull R, Phillips LH, et al. (2012) Adult aging, processing style, and the perception of biological motion. Exp Aging Res 38: 169-185. doi: 10.1080/0361073X.2012.660030
[14] Billino J, Bremmer F, Gegenfurtner KR (2008) Differential aging of motion processing mechanisms: Evidence against general perceptual decline. Vision Res 48: 1254-1261.
[15] Pilz KS, Bennett PJ, Sekuler AB (2010) Effects of aging on biological motion discrimination. Vision Res 50: 211-219.
[16] Rosen HJ, Wilson MR, Schauer GF, et al. (2006) Neuroanatomical correlates of impaired recognition of emotion in dementia. Neuropsychologia 44: 365-373.
[17] Gilmore GC, Wenk HE, Naylor LA, et al. (1994) Motion perception and Alzheimer's disease. J Gerontol 49: 52-57.
[18] Sauer J, Ffytche DH, Ballard C, et al. (2006) Differences between Alzheimer's disease and dementia with Lewy bodies: an fMRI study of task-related brain activity. Brain 129: 1780-1788.
[19] Koff E, Zaitchik D, Montepare J, et al. (1999) Processing of emotion through the visual and auditory domains by patients with Alzheimer's disease. J Int Neuropsychol Soc 5: 32-40.
[20] Henry JD, Thompson C, Rendell PG, et al. (2012) Perception of biological motion and emotion in mild cognitive impairment and dementia. J Int Neuropsychol Soc 18: 866-873. doi: 10.1017/S1355617712000665
[21] Sasson NJ, Pinkham AE, Richard J, et al. (2010) Controlling for response biases clarifies sex and age differences in facial affect recognition. J Nonverbal Behav 34: 207-221.
[22] Kumfor F, Piguet O (2012) Disturbance of emotion processing in frontotemporal dementia: a synthesis of cognitive and neuroimaging findings. Neuropsychol Rev 22: 280-297.
[23] McKhann G, Drachman D, Folstein M (1984) Clinical diagnosis of Alzheimer's disease: report of the NINCDS-ADRDA work group under the auspices of Department of Health and Human Services Task Force on Alzheimer's Disease. Neurology 34: 939-944.
[24] Folstein MF, Folstein SE, McHugh PR (1975) 'Mini-mental state'. A practical method for grading the cognitive state of patients for the clinician. J Psychiatr Res 12: 189-198.
[25] de Gelder B, Van den Stock J, Balaguer R, et al. (2008) Huntington's disease impairs recognition of angry and instrumental body language. Neuropsychologia 46: 369-373.
[26] Van Hoesen GW, Parvizi J, Chu C (2000) Orbitofrontal cortex pathology in Alzheimer's disease. Cereb Cortex 10: 243-251.
[27] Jones BF, Barnes J, Uylings HB, et al. (2006) Differential regional atrophy of the cingulate gyrus in Alzheimer disease: a volumetric MRI study. Cereb Cortex 16: 1701-1708.
[28] Poulin SP, Dautoff R, Morris JC, et al. (2011) Amygdala atrophy is prominent in early Alzheimer's disease and relates to symptom severity. Psychiatry Res Neuroimaging 194: 7-13.
[29] Staff RT, Ahearn TS, Phillips LH, et al. (2011) The cerebral blood flow correlates of emotional facial processing in mild Alzheimer's disease. Neurosci Med 2: 6-13.
[30] Bucks RS, Radford SA (2004) Emotion processing in Alzheimer's disease. Aging Ment Health 8: 222-232.
[31] Cadieux NL, Greve KW (1997) Emotion processing in Alzheimer's disease. J Int Neuropsychol Soc 3: 411-419.
[32] Kohler CG, Anselmo-Gallagher G, Bilker W, et al. (2005) Emotion discrimination deficits in mild Alzheimer's disease. Am J Geriatr Psychiatry 13: 926-933. doi: 10.1097/00019442-200511000-00002
[33] Lavenu I, Pasquier F, Lebert F, et al. (1999) Perception of emotion in frontotemporal dementia and Alzheimer's disease. Alzheimer Dis Assoc Disord: 96-101.
[34] McDonald S, Flanagan S, Martin I, et al. (2004) The ecological validity of TASIT: a test of social perception. Neuropsychol Rehabil 14: 285-302.
[35] Cooper CL, Phillips LH, Johnston M, et al. (2014) Links between emotion perception and social participation restriction following stroke. Brain Injury 28: 122-126.