Abstract

Objective

Pain and cognitive impairment are prevalent and often co-occur in older adults. Because pain may negatively affect cognitive test performance, identification of pain in the context of neuropsychological evaluation is important. However, pain detection based on self-report presents challenges, and pain is often under-detected in this population. Alternative methods (e.g., video-based automatic coding of facial biomarkers of pain) may facilitate pain identification and thus enhance interpretation of neuropsychological evaluation results.

Method

The current study examined pain in the context of virtual neuropsychological assessment in 111 community-dwelling older adults, first seeking to validate the use of software developed to automatically code biomarkers of pain. Measures of pain, including self-report of acute and chronic pain and automatic coding of pain, were compared while participants completed neuropsychological testing.

Results

Self-reported pain was associated with poorer performance on a measure of executive function (both acute and chronic pain) and a global cognitive screening measure (acute pain only). However, self-reported acute and chronic pain did not correlate significantly with most neuropsychological tests. Automatic coding of pain did not predict self-report of pain or performance on neuropsychological tests beyond the influence of demographic factors and psychological symptoms.

Conclusions

Though results were largely not significant, the observed correlations warrant further exploration of the influence of pain on neuropsychological test performance in this context, particularly in individuals with higher levels of pain and in other samples.

Introduction

High rates of pain are present in older adults, afflicting an estimated 50% of individuals living in the community (Lukas et al., 2012). Because neuropsychological assessment is commonly employed to identify cognitive impairment in this population (Eshkoor et al., 2015), the co-occurrence of pain and cognitive deficits represents a challenge for accurate evaluation. Research indicates that chronic pain is associated with poorer performance across multiple cognitive domains, including executive functioning, memory, attention, and processing and psychomotor speed (Baker et al., 2018; Moore et al., 2013; Tapscott & Etherton, 2015). Although the extent to which deficits in this context reflect the influence of pain versus confounding factors (e.g., psychological distress; van der Leeuw et al., 2018) is not fully clear, introduction of acute pain has been shown experimentally to impair performance on tasks of processing speed, sustained attention, and working memory (Tapscott & Etherton, 2015). However, findings on associations between pain and cognitive test performance are mixed, with some studies finding that acute pain was not significantly associated with cognitive impairments (Tapscott & Etherton, 2015). As such, it remains necessary to continue to explore these associations in specific populations and contexts.

Although the intention of neuropsychological evaluation is not to induce pain, sitting for long periods is known to exacerbate pain in older adults (Chastin et al., 2014). In addition, testing may induce stress in some individuals, which may in turn exacerbate pain (Abdallah & Geha, 2017). Thus, the context of neuropsychological assessment could introduce greater pain for some individuals, particularly if chronic pain is already present (Choi et al., 2021).

Accurate pain detection in the context of neuropsychological evaluation thus bears important implications for validity of assessment in this population. Given that pain is a subjective experience, and it is difficult to habituate to experienced pain, self-report measures have traditionally been considered the “gold standard” of pain assessment; however, older adults with cognitive deficits become less likely to self-report their pain with increasing cognitive impairment (Stolee et al., 2005). In addition, self-report measures are susceptible to biases in this population because of problems with retrospective recall of chronic pain (Althubaiti, 2016), under-reporting because of viewing pain as a normal aspect of aging (Kang & Demiris, 2018), denial or minimization of pain (Booker & Herr, 2016), and barriers to communication (Craig, 2009). For these reasons, it is important to consider other methods of pain detection during neuropsychological assessment.

Recent recommendations to enhance reliability and validity of pain assessment include efforts to obtain not only self-reported pain ratings but observer ratings as well (e.g., verbalizations and vocalizations, facial expressions, body movements). Unfortunately, a variety of problems are present in observational pain assessment, including individual differences in behavioral markers of pain, observer biases, and a lack of correlation between self-reported and observer-rated pain ratings (Herr, 2010). Moreover, information regarding pain intensity and severity remains difficult to quantify using observational methods (Rojo et al., 2015). Together, prior research suggests other methods of pain detection in the context of neuropsychological assessment are needed. Advances in technology may make it possible to improve upon current standards with greater efficiency and reduced bias. Automatic, video-based pain detection systems could address many of the previously described limitations of commonly used methods of pain assessment. Prior work on video-based pain detection shows promise in identifying and differentiating pain from other states (Kunz et al., 2017).

The current study examined pain in the context of neuropsychological assessment, comparing self-report measures of acute and chronic pain and automatic coding of pain biomarkers from video recordings analyzed using digital phenotyping software (OpenDBM v2.0; AiCure, 2021) while participants completed neuropsychological testing. We hypothesized that self-report and automatic coding of pain would correlate significantly with each other, controlling for demographic variables and psychological symptoms, and that automatic coding of facial biomarkers of pain would have high sensitivity and specificity in predicting acute self-report ratings of pain. We also hypothesized that greater pain would be negatively associated with performance on neuropsychological tests, beyond the influence of demographic variables and psychological symptoms.

Materials and Methods

Participants

Community-dwelling older adults between the ages of 55 and 90 were recruited. A total of 116 took part in the study. Participants were excluded if they did not have sufficient data because of experimenter error or insufficient video quality for analysis (n = 5). The final analyzed sample included 111 older adults.

Measures

Demographic information

Participants self-reported demographic information including age, race, ethnicity, gender, education, and assistance received with instrumental activities of daily life (e.g., shopping, driving, managing finances).

Assessment of pain

Self-reported acute pain

Acute pain experienced during testing was assessed using the Iowa Pain Thermometer—Revised (IPT-R; Ware et al., 2015) at two time points: the beginning and end of testing. The IPT-R is an 11-point scale that combines a verbal descriptor scale with a numeric rating scale (0 = “No Pain” to 10 = “The Most Intense Pain Imaginable”). The IPT-R shows good convergent validity with other measures of acute pain (r = 0.92) and test–retest reliability (r = 0.80) in cognitively intact and cognitively impaired older adults (Ware et al., 2015).

Self-reported pain intensity and interference

The nine-item Brief Pain Inventory—Short Form (BPI-SF; Cleeland, 2009) was used as a self-report measure of chronic pain. Ratings of pain intensity and interference in daily life were summed for a single score. Prior research utilizing the BPI generally suggests pain intensity cut points of 0–4 (mild pain), 5–7 (moderate pain), and 8–10 (severe pain; Kapstad et al., 2008; Woo et al., 2015). The BPI-SF shows good convergent validity with other pain measures and high internal consistency (Cronbach’s α = 0.87) in pain populations with a range of pain etiologies (Ferreira et al., 2023).

Video-based automatic pain detection

OpenDBM v2.0 (AiCure, 2021), a digital phenotyping software package, measured biomarkers of participants’ facial expressions of pain during two video clips of 1-min category fluency tasks at the beginning and end of the session (adjacent to measurements of self-reported acute pain). This software includes a measurement of pain expressivity that is calculated from a combination of Facial Action Coding System units associated with pain in prior literature, including lowered brow, raised cheek, tightened eyelids, wrinkled nose, raised upper lip, pulled lip corner, lips stretched, and dropped jaw (AiCure, 2021). The software combined analyses of facial activity and movement to create an objective measure of putative pain-related behaviors (AiCure, 2021). Automatic coding of pain is measured on a vector scale ranging from 0 to 1, capturing presence and intensity of pain expression (AiCure, 2021). Vector outputs were averaged across these two time points for each participant.

Assessment of cognition

Verbal fluency

Participants completed tests of phonemic fluency (letters C, F, and L) and category fluency (animals and vegetables) in separate, 1-min trials for each. Phonemic and category fluency tasks show high sensitivity and specificity in identifying cognitive impairments in older adults (Lezak et al., 2004).

Immediate and delayed recall

The Craft Story Immediate Recall and Craft Story Delayed Recall (Craft Story 21; Craft et al., 1996) were used to assess participants’ immediate and delayed episodic memory. In this task, the test administrator read the story aloud once to the participant and the participant was then asked to repeat the story in the same words the test administrator read, or in their own words. Participants received points for both verbatim and paraphrased recall, which were summed and entered separately in analyses. Craft Story 21 demonstrates sound psychometric properties (Weintraub et al., 2018), including reliability (intraclass correlation coefficient = 0.77 [verbatim] and 0.81 [paraphrase]; Howard et al., 2023) and convergent validity with other measures of episodic memory (Monsell et al., 2016).

Cognitive estimation

The Cognitive Estimation Task (CET; Axelrod & Millis, 1994) was used to assess participants’ cognitive estimation ability, a measure of executive function. The measure consisted of 10 items that asked participants to provide estimated answers to questions that are not normally learned as part of a formal education (e.g., “How much does a quart of milk weigh in pounds,” “How fast do racehorses gallop in miles per hour”). The 10-item version of the CET shows adequate reliability (Cronbach’s α = 0.62) and validity (Axelrod & Millis, 1994; MacPherson et al., 2014).

Judgment

The Judgment subtest of the Neuropsychological Assessment Battery (NAB-JDG; Stern & White, 2003) assessed decision-making in daily living situations involving health and home safety. The NAB-JDG demonstrates ecological validity (Stern & White, 2003), predictive validity in assessing mild cognitive impairment and dementia, and has been shown to correlate significantly with other measures of executive function (Macdougall & Mansbach, 2013).

Global cognitive screening

The Montreal Cognitive Assessment (MoCA; Nasreddine et al., 2005) was used as a measure of global cognitive functioning. The MoCA is a brief cognitive screening tool that shows high sensitivity in detecting mild cognitive impairment (Nasreddine et al., 2005). A 22-item version of the MoCA eliminating visual items was used to accommodate virtual testing.

Auditory attention and working memory

Digit Span Forward was used to assess participants’ auditory attention. In this task, the test administrator read aloud a series of numbers and participants were asked to repeat the numbers aloud in the same order. Participants’ auditory attention and working memory were assessed using Digit Span Backward, in which participants were asked to repeat a sequence of numbers read aloud by the test administrator in reverse order. Digit Span Sequencing was also used to assess working memory and auditory attention. In this task, the test administrator read aloud a series of numbers and the participant was asked to repeat the numbers back in ascending order. Scores for Digit Span Forward, Digit Span Backward, and Digit Span Sequencing were summed, and sum totals were normed into T-scores based on clinical norms (Wechsler, 2008) and entered separately in analyses. Digit Span is well validated and shows excellent reliability (Lichtenberger & Kaufman, 2013). Reliable Digit Span (Greiffenstein et al., 1994), a well-established measure that is embedded in the Digit Span test, was also calculated to assess performance validity.

Cognitive set shifting

The Oral Trail Making Test B (Ricker & Axelrod, 1994) assessed participants’ ability to shift and maintain cognitive set. Participants were asked to count, switching between number and letter (i.e., 1-A-2-B, etc.) until they reached number 13. The test was discontinued if the participant had not completed the task in 5 min or made five errors. Participant times on Oral Trail Making Test A and Oral Trail Making Test B were normed and entered separately in analyses. Scores were coded such that lower T-scores (slower completion) reflected worse performance.

Assessment of Current Psychological Symptoms

Symptoms of depression

The eight-item Patient Health Questionnaire depression scale (PHQ-8; Kroenke & Spitzer, 2002) was used to control for influence of any present depressive symptoms. The PHQ-8 is a well-established, valid measure of depression in clinical samples and the general population (Kroenke et al., 2009).

Symptoms of anxiety

The seven-item Generalized Anxiety Disorder scale (GAD-7; Spitzer et al., 2006) was used to control for influence of any present symptoms of anxiety. The GAD-7 has been validated in general adult samples and samples of older adults (Wild et al., 2014).

Procedure

All procedures were approved by the Kent State University Institutional Review Board and followed the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) guidelines (von Elm et al., 2014). Data were collected from July 14, 2021 through June 29, 2022. Participants provided informed consent and completed a 60-min Microsoft Teams session composed of self-report questionnaires, measures of acute pain (self-report and automatic coding) at the beginning and end of the session, a measure of chronic pain at the end of the session, and neuropsychological assessment. The order of neuropsychological assessment consisted of category fluency (vegetables), Craft Story Immediate Recall, Cognitive Estimation Task, NAB Judgment, phonemic fluency (CFL), Craft Story Delayed Recall, MoCA, Digit Span, Oral Trail Making Test A and B, and category fluency (animals). Participants were offered a $25 gift card for taking part in the study. Study sessions were recorded for each participant and videos were broken down into clips for subsequent analyses.

Statistical Power

Prior research exploring associations between automatic, video-based coding of pain and self-reported pain in persons with dementia reported small to medium effect sizes (partial eta squared = 0.28–0.55; Hadjistavropoulos et al., 2018). Prior research on the impact of acute pain on neuropsychological test performance reported small effect sizes (partial eta squared = 0.27; Tapscott & Etherton, 2015). Prior research on the impact of chronic pain on neuropsychological test performance reported moderate correlations (r = 0.34–0.45; Baker et al., 2018). A priori power analyses using G*Power (Faul et al., 2009) with an alpha of 0.05 and a power of 0.80 suggested a minimum sample size of 80 for correlation and regression-based analyses detecting effects at the low end of effect size ranges. The final analytic sample of 111 was thus deemed more than sufficient to examine study aims.
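For readers without access to G*Power, the minimum sample size for a two-tailed correlation test can be approximated in a few lines via the Fisher z transformation. This is an illustrative sketch only, not the exact routine G*Power uses (which can differ by a few participants); the target correlations below are examples, not values computed in the study.

```python
from math import atanh, ceil
from statistics import NormalDist

def n_for_correlation(r, alpha=0.05, power=0.80):
    """Approximate minimum N to detect a population correlation r
    in a two-tailed test, via the Fisher z approximation:
    N = ((z_{1-alpha/2} + z_{power}) / atanh(r))**2 + 3."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-tailed critical value
    z_beta = NormalDist().inv_cdf(power)           # quantile for desired power
    return ceil(((z_alpha + z_beta) / atanh(r)) ** 2 + 3)
```

For example, detecting r = 0.30 at alpha = 0.05 and power = 0.80 requires roughly 85 participants under this approximation, and r = 0.34 roughly 66, consistent in magnitude with the minimum of 80 reported above.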

Statistical Analyses

SPSS version 28 was used in all analyses. T-scores were calculated from raw scores for each neuropsychological test based on age-stratified clinical norms to account for cognitive declines based on age alone. Self-report ratings of acute pain at Time 1 and Time 2 were compared to determine if a significant change in pain occurred over time. Self-report of acute pain and automatic coding indicated that pain did not significantly increase from the beginning to the end of the session; thus, Time 1 and Time 2 pain measurements were averaged to obtain average pain scores for subsequent analyses. Descriptive statistics were conducted for primary variables to determine acceptability for parametric analyses. Measures of psychological symptoms (GAD-7 and PHQ-8) were non-normally distributed (skewness > 2.0 and kurtosis > 6.0). Because of non-normality, analyses were confirmed using bootstrapping procedures (Field, 2013). All other variables met normality assumptions. Participant age, gender, education, and psychological symptoms correlated significantly with both test performance and pain measures and were included as control variables in primary analyses.

Area-under-the-curve (AUC) analyses were used to examine validity of automatic coding of pain in predicting presence/absence of self-reported acute and self-reported chronic pain. Then, Pearson correlations bootstrapped with 5,000 samples were used to examine associations among pain measures and neuropsychological test results. Neuropsychological tests that correlated significantly with pain measures based on these Pearson correlations were then entered as dependent variables in separate linear regressions. Demographic variables and psychological symptom measures that correlated significantly with test performance and pain measures were entered in the first step, and self-reported acute and chronic pain and automatic coding of pain were entered in the second step in each linear regression. Familywise α was set at 0.05 with Holm–Bonferroni correction applied to account for multiple analyses and minimize type I error.
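The Holm–Bonferroni step-down correction applied above can be sketched in a few lines (an illustration of the correction logic, not the SPSS implementation used in the study):

```python
def holm_bonferroni(pvals, alpha=0.05):
    """Holm-Bonferroni step-down procedure: test p-values from
    smallest to largest against alpha/(m - rank); stop at the first
    failure, since all larger p-values then also fail."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # indices by ascending p
    reject = [False] * m
    for rank, i in enumerate(order):
        if pvals[i] <= alpha / (m - rank):
            reject[i] = True
        else:
            break  # step-down stops at the first non-significant test
    return reject
```

Compared with a plain Bonferroni correction (every p-value tested against alpha/m), the step-down procedure is uniformly more powerful while still controlling the familywise error rate at alpha.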

Results

Sample Demographics

Participants were 69.7 years of age (SD = 7.4) on average, and well educated, with 100% reporting at least 12 years of education and 78% reporting at least 14 years of education. Participants were largely White (96%) and female (68%). Regarding activities related to independent living, 104 of 111 participants reported completely independent functioning.

Characterization of Pain and Performance on Neuropsychological Tests

Forty-four percent of the sample reported experiencing chronic pain in the past 24 hours. Participants who experienced chronic pain in the past 24 hours reported mild to moderate pain intensity on average (M = 4.86, SD = 6.13). Mean self-reported acute pain was mild (M = 1.10, SD = 1.56). Mean neuropsychological test performances were average to above average, with average T-scores ranging from 46 to 61. See Table 1 for sample demographics and descriptive statistics for primary variables.

Table 1

Descriptive statistics for primary variables (M/SD, range)

Participant Demographics                N = 111
Age (M/SD, range)                       69.7/7.4, 55–90
Sex (%)
  Male                                  32.4%
  Female                                67.6%
Education ≥ 12 years (%)                100%
Race (%)
  White/Caucasian                       95.5%
  Black/African American                3.6%
  Other                                 0.9%
Psychological symptoms (M/SD, range)
  PHQ-8                                 3.07/3.51, 0–20
  GAD-7                                 1.94/2.77, 0–20

Note: M = mean; SD = standard deviation; PHQ-8 = Patient Health Questionnaire depression scale-8; GAD-7 = Generalized Anxiety Disorder scale-7.


Associations Among Pain Measures

Self-reported acute and chronic pain were significantly associated (r = 0.62, p < .001). Automatic coding of pain did not correlate significantly with either self-reported acute pain (r = 0.14) or self-reported chronic pain (r = −.04). AUC analyses examining sensitivity and specificity of automatic coding of pain in predicting presence/absence of self-reported pain indicated AUCs of 0.55 (self-reported acute pain) and 0.49 (self-reported chronic pain). These values did not meet the threshold for high sensitivity and specificity or clinical usefulness, as an AUC of 0.50 indicates no discriminative value (Fan et al., 2006).
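The AUC statistic here has a direct probabilistic reading: it is the probability that a randomly chosen participant who reported pain receives a higher automatic-coding score than a randomly chosen participant who did not (with ties counted as half). A minimal Python sketch of this rank-based formulation (illustrative only; the study's analyses were run in SPSS):

```python
def auc(labels, scores):
    """AUC via its Mann-Whitney U equivalence: the fraction of
    positive/negative pairs in which the positive case scores higher,
    counting ties as 0.5."""
    pos = [s for l, s in zip(labels, scores) if l == 1]  # pain present
    neg = [s for l, s in zip(labels, scores) if l == 0]  # pain absent
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

Under this reading, the observed AUCs of 0.55 and 0.49 mean the automatic coding scores ranked pain-present participants above pain-absent participants barely more often (or slightly less often) than chance.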

Correlations Between Cognitive Test Performance and Pain Measures

Self-reported acute pain was significantly associated with Oral Trail Making Test B performance and MoCA performance. Self-reported chronic pain was significantly associated with Oral Trail Making Test B performance. Automatic coding of pain was significantly negatively correlated with Craft Story delayed verbatim recall and Craft Story delayed paraphrase recall. All other associations were non-significant. See Table 2 for correlations.

Table 2

Bootstrapped Pearson correlations with 95% confidence intervals between pain measures and neuropsychological test performance, controlling for participant age, gender, education, and psychological symptoms

Neuropsychological test (M/SD)                        Acute pain (IPT)     Chronic pain (BPI)   Automatic pain coding (OpenDBM v2.0)
                                                      M = 1.10, SD = 1.56  M = 11.6, SD = 16.9  M = 0.21, SD = 0.08
Semantic Fluency (Vegetables) (48.0/10.5)             -.06                 -.06                 -.07
Craft Story Immediate Verbatim Recall (46.6/10.2)     -.06                 -.11                 -.15
Craft Story Immediate Paraphrase Recall (47.1/10.5)   -.04                 -.18                 -.14
Cognitive Estimation (50.9/10.0)                      -.01                 -.18                 -.09
NAB Judgment (60.8/9.5)                               -.09                 -.11                 .05
Phonemic Fluency (CFL) (51.8/9.9)                     .09                  .04                  .05
Craft Story Delayed Verbatim Recall (45.6/9.5)        -.16                 -.18                 -.23*
Craft Story Delayed Paraphrase Recall (46.8/10.6)     -.15                 -.14                 -.23*
MoCA (50.3/10.7)                                      -.23*                -.18                 -.14
Digit Span Forward (50.1/8.3)                         -.06                 -.08                 -.08
Digit Span Forward—Longest String (51.1/9.5)          -.13                 -.13                 -.07
Digit Span Backward (54.0/8.8)                        -.10                 -.16                 -.05
Digit Span Backward—Longest String (54.4/9.4)         -.08                 -.17                 -.11
Digit Span Sequencing (54.2/8.3)                      -.14                 -.18                 -.08
Digit Span Sequencing—Longest String (53.5/8.4)       -.13                 -.14                 -.09
Oral Trail Making Test A (42.5/11.8)                  .02                  .09                  .11
Oral Trail Making Test B (54.9/9.2)                   -.25**               -.26*                -.02
Semantic Fluency (Animals) (55.3/9.9)                 .05                  -.03                 -.04

Note: M = mean; SD = standard deviation; IPT = Iowa Pain Thermometer; BPI = Brief Pain Inventory; DBM = digital biomarkers; MoCA = Montreal Cognitive Assessment; NAB = Neuropsychological Assessment Battery; * indicates significance at p < .05; ** indicates significance at p < .01.


Predictability of Cognitive Test Performance and Pain Measures

Self-reported acute pain (ΔR2 = 0.06, ΔF(1, 103) = 4.32, p = .01) and self-reported chronic pain (ΔR2 = 0.04, ΔF(1, 103) = 4.32, p < .05) significantly predicted Oral Trail Making Test B performance. Automatic coding of pain significantly predicted Craft Story delayed verbatim recall (ΔR2 = 0.03, ΔF(1, 104) = 4.01, p < .05) and delayed paraphrase recall (ΔR2 = 0.034, ΔF(1, 104) = 4.08, p < .05); however, these findings were not upheld after Holm–Bonferroni correction. All other associations were non-significant. See Supplementary Materials for regression tables.

Discussion

The current study examined pain in the context of neuropsychological assessment of older adults, finding limited support for hypotheses. Self-reported acute and chronic pain correlated significantly with each other and were associated with poorer performance on a measure of executive function (both acute and chronic pain) and a global cognitive screening measure (acute pain only). However, self-reported acute and chronic pain did not correlate significantly with the majority of neuropsychological tests employed in the current study. Automatic coding of pain was not significantly associated with self-reported pain; AUC results did not meet the threshold for high sensitivity and specificity and clinical usefulness, as an AUC of 0.50 indicates no discriminative value (Fan et al., 2006). Automatic coding of pain was weakly associated with poorer test performance, though not robustly enough to withstand correction for multiple comparisons. Overall, links between pain and neuropsychological test performance were limited in the context of the current study.

Although self-reported acute and chronic pain measures were correlated, weaker-than-expected relationships were demonstrated between self-reported and automatic pain measures. Several explanations for this are possible. First, given that mean levels of acute pain reported in this sample were mild, pain levels may have been below the threshold needed for accurate detection using automatic coding; in contrast to the present study, prior research examining video-based pain measures was conducted in settings in which pain was explicitly induced or in situations likely to elicit greater pain (Ammaturo et al., 2017; Atee et al., 2018; Hadjistavropoulos et al., 2018; Lautenbacher et al., 2018). These studies included older adults with a broad range of pain levels. In one study, 66% of older adults self-reported no pain and 29% self-reported mild pain at rest and after walking or transfer events (Atee et al., 2018). In other work, 84% of older adults reported at least moderate pain following movement in physiotherapy (Hadjistavropoulos et al., 2018). Participants in the current study could have inaccurately self-reported their pain because of limited awareness, though this is less likely in light of the largely average cognitive performance observed.

Although automatic coding of pain showed weak relationships with self-reported pain, it is notable that all correlations between automatic coding and neuropsychological test performance were in the expected negative direction. Findings thus lend support to the notion that automatic pain detection may yet be useful in this capacity. It is possible that this method is better suited to detect pain of greater magnitude than was present in the current sample. Indeed, prior research utilizing automatic pain detection methods based on facial expressions has yielded mixed results. For example, studies utilizing video frames from a database of adult individuals with shoulder pain performing range of motion tests (Lucey et al., 2012) yielded 73 to 88% accuracy in identifying pain in individuals with pain levels ranging from no pain to strong pain (Roy et al., 2016). In contrast, a study of automatic pain detection based on facial action units in older adults with chronic pain completing activities of daily living concluded that the automatic coding software they employed, similar to that used in the current study, was able to distinguish between pain and no pain states but was not accurate in classifying pain intensity (Gomutbutra et al., 2022). In their sample, 20% of individuals self-reported mild pain, 55% reported moderate pain, and 24% reported severe pain (Gomutbutra et al., 2022). Some work also shows very poor detectability of pain using automatic coding methods; specifically, a study with healthy older adults undergoing experimentally induced heat pain showed poor to moderate sensitivity and precision in identifying pain (Lautenbacher et al., 2022). Thus, the usefulness of automatic pain detection methods may depend on the type and severity of pain, health of the participants, pain induction method, and activities performed.

The current work has several strengths. Prior research highlights the importance of using objective measures of pain in older adults with comorbid pain and cognitive impairment (Lukas et al., 2012; Stolee et al., 2005). To our knowledge, this study is the first to explore these relationships and compare the predictive power of various pain measures in older adults as they undergo neuropsychological testing. Findings did not indicate that automatic pain detection was particularly useful in this specific setting, and pain was largely not associated with test performance. However, given that greater acute and chronic pain were negatively associated with performance on a measure of executive function (Oral Trail Making Test B), and greater acute pain was also negatively associated with a measure of global cognitive function (MoCA) beyond the influence of demographic variables and psychological symptoms, results warrant further exploration of the influence of pain on neuropsychological test performance to ensure that pain does not influence test performance in individuals with higher levels of pain and in other samples.

Several limitations must be noted. Foremost, the generally mild pain in the current sample may have prevented pain identification through automatic detection methods; these methods may be more appropriate for use in populations with greater pain. It is also possible that participants under-reported their true experienced pain, perhaps because of biases, social desirability, or unwillingness to share their pain experiences. In addition, neuropsychological testing in the current study (60 min) may have been too brief to elicit greater pain from prolonged sitting. A typical neuropsychological assessment can be as short as 1 hour but as long as 8 hours or more, depending on the presenting problems (Schaefer et al., 2022). Greater pain might have been elicited with a longer evaluation period; evaluators are encouraged to attend to acute pain particularly in the context of longer testing. Future work should explore current findings in the context of greater pain during testing, higher levels of chronic pain, and longer test batteries to explore the validity of these measures and the influence of higher levels of acute pain and automatically coded pain on test performance. Future work should also assess acute pain at additional time points throughout the assessment. Acute pain may vary throughout testing, and the current study was limited to two time points of acute pain measurement. Assessing acute pain throughout testing might facilitate better correlation of self-reported pain and automatic coding of pain expressions in this context. In addition, future work should be conducted in individuals with more significant cognitive impairment, as range restriction in neuropsychological tests may have influenced results in the current study.

Several factors could influence generalizability of findings. For example, the current sample was highly educated. Individuals with higher educational attainment often demonstrate better neuropsychological performance because of elevated cognitive reserve (Meng & D’Arcy, 2012). Any cognitive difficulties present in this sample may thus have been masked, influencing generalizability to populations with lower educational attainment. In addition, because the current sample consisted of older adults who were able to access a virtual assessment from their homes, 104 of 111 participants reported completely independent functioning, and mean cognitive performances were overall average, participants likely had less decline than is typical of individuals presenting for neuropsychological assessment. The current sample was also largely white. Racial and ethnic differences in the experience of pain, as well as disparities in assessment, detection, and treatment of pain (Meints et al., 2019), could all contribute to reduced generalizability. Future work should explore questions raised in the current study in samples with more functional impairment and greater diversity.

In addition, several efforts were made to control for confounding factors in the current work. For example, T-scores were first calculated based on age-stratified clinical norms to account for cognitive declines based on age alone, and age was also entered as a control variable in analyses. Future work should similarly aim to incorporate this potential confound in the relationship between pain and neuropsychological test performance, as higher pain and lower neuropsychological test performance are both associated with older age (Herr, 2010; Tripathi et al., 2014). Furthermore, though self-reported depressive and anxiety symptoms were included as control variables in analyses, it cannot be ruled out that psychological symptoms not captured by these brief screening measures were present. Research should consider the impact of such symptoms on results, as psychological symptoms have been shown to influence both neuropsychological test performance and reported pain (van der Leeuw et al., 2018). Future work not only controlling for, but also exploring, the influence of age and psychological symptoms on the relationships among pain and neuropsychological test performance would be of interest.

In summary, findings indicated that pain showed a limited association with neuropsychological test performance in the context of the current study. Automatic pain detection was neither significantly related to self-reported pain nor robustly associated with performance on neuropsychological tests. Though findings were largely non-significant, the small number of significant correlations warrants further exploration. The current work did not rule out the influence of pain in some areas of neuropsychological test performance; further work is needed to ensure that pain does not affect test performance in individuals with higher levels of pain and in other samples. Future work should also explore the utility of video-based automatic coding in samples with greater severity of pain and/or cognitive impairment.

Funding

This work was supported in part by a National Institute on Aging grant awarded to JG (R01AG065432) and by an internal university award granted to the first author.

Conflict of Interest

None declared.

Data Availability

Data are available from the corresponding author upon reasonable request.

Author Contributions

Karlee Patrick (Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Writing—original draft), John Gunstad (Conceptualization, Formal analysis, Funding acquisition, Software, Writing—review and editing), and Mary Beth Spitznagel (Conceptualization, Resources, Supervision, Writing—review and editing).

References

Abdallah, C. G., & Geha, P. (2017). Chronic pain and chronic stress: Two sides of the same coin? Chronic Stress, 1, 247054701770476. https://doi.org/10.1177/2470547017704763.

AiCure. (2021). AiCure makes code open source to advance digital biomarker development. New York, NY: AiCure. https://aicure.com/

Althubaiti, A. (2016). Information bias in health research: Definition, pitfalls, and adjustment methods. Journal of Multidisciplinary Healthcare, 9, 211–217. https://doi.org/10.2147/JMDH.S104807.

Ammaturo, D. A., Hadjistavropoulos, T., & Williams, J. (2017). Pain in dementia: Use of observational pain assessment tools by people who are not health professionals. Pain Medicine, 18(10), 1895–1907. https://doi.org/10.1093/pm/pnw265.

Atee, M., Hoti, K., Parsons, R., & Hughes, J. (2018). A novel pain assessment tool incorporating automated facial analysis: Interrater reliability in advanced dementia. Clinical Interventions in Aging, 13, 1245–1258. https://doi.org/10.2147/CIA.S168024.

Axelrod, B. N., & Millis, S. R. (1994). Preliminary standardization of the cognitive estimation test. Assessment, 1(3), 269–274. https://doi.org/10.1177/107319119400100307.

Baker, K. S., Gibson, S. J., Georgiou-Karistianis, N., & Giummarra, M. J. (2018). Relationship between self-reported cognitive difficulties, objective neuropsychological test performance and psychological distress in chronic pain. European Journal of Pain, 22(3), 601–613. https://doi.org/10.1002/ejp.1151.

Booker, S. Q., & Herr, K. A. (2016). Assessment and measurement of pain in adults in later life. Clinics in Geriatric Medicine, 32(4), 677–692. https://doi.org/10.1016/j.cger.2016.06.012.

Chastin, S. F., Fitzpatrick, N., Andrews, M., & DiCroce, N. (2014). Determinants of sedentary behavior, motivation, barriers and strategies to reduce sitting time in older women: A qualitative investigation. International Journal of Environmental Research and Public Health, 11(1), 773–791. https://doi.org/10.3390/ijerph110100773.

Choi, J. I., Cho, Y. H., Kim, Y. J., Lee, S. Y., Lee, J. G., Yi, Y. H., et al. (2021). The relationship of sitting time and physical activity on the quality of life in elderly people. International Journal of Environmental Research and Public Health, 18(4), 1459. https://doi.org/10.3390/ijerph18041459.

Cleeland, C. S. (2009). The brief pain inventory user guide. Houston, TX: The University of Texas MD Anderson Cancer Center. http://www.mdanderson.org/

Craft, S., Newcomer, J., Kanne, S., Dagogo-Jack, S., Cryer, P., Sheline, Y., et al. (1996). Memory improvement following induced hyperinsulinemia in Alzheimer’s disease. Neurobiology of Aging, 17(1), 123–130. https://doi.org/10.1016/0197-4580(95)02002-0.

Craig, K. D. (2009). The social communication model of pain. Canadian Psychology, 50(1), 22–32. https://doi.org/10.1037/a0014772.

Eshkoor, S. A., Hamid, T. A., Mun, C. Y., & Ng, C. K. (2015). Mild cognitive impairment and its management in older people. Clinical Interventions in Aging, 10, 687–693. https://doi.org/10.2147/CIA.S73922.

Fan, J., Upadhye, S., & Worster, A. (2006). Understanding receiver operating characteristic (ROC) curves. CJEM, 8(1), 19–20. https://doi.org/10.1017/s1481803500013336.

Faul, F., Erdfelder, E., Buchner, A., & Lang, A. G. (2009). Statistical power analyses using G*Power 3.1: Tests for correlation and regression analyses. Behavior Research Methods, 41(4), 1149–1160. https://doi.org/10.3758/BRM.41.4.1149.

Ferreira, A. C. L., Pereira, D. S., da Silva, S. L. A., Carvalho, G. A., & Pereira, L. S. M. (2023). Validity and reliability of the short form brief pain inventory in older adults with nociceptive, neuropathic and nociplastic pain. Geriatric Nursing, 52, 16–23. https://doi.org/10.1016/j.gerinurse.2023.04.011.

Field, A. (2013). Discovering statistics using IBM SPSS statistics (4th ed.). Los Angeles, CA: SAGE Publications.

Gomutbutra, P., Kittisares, A., Sanguansri, A., Choosri, N., Sawaddiruk, P., Fakfum, P., et al. (2022). Classification of elderly pain severity from automated video clip facial action unit analysis: A study from a Thai data repository. Frontiers in Artificial Intelligence, 5, 942248. https://doi.org/10.3389/frai.2022.942248.

Greiffenstein, M. F., Baker, W. J., & Gola, T. (1994). Validation of malingered amnesia measures with a large clinical sample. Psychological Assessment, 6(3), 218–224. https://doi.org/10.1037/1040-3590.6.3.218.

Hadjistavropoulos, T., Browne, M. E., Prkachin, K. M., Taati, B., Ashraf, A., & Mihailidis, A. (2018). Pain in severe dementia: A comparison of a fine-grained assessment approach to an observational checklist designed for clinical settings. European Journal of Pain, 22(5), 915–925. https://doi.org/10.1002/ejp.1177.

Herr, K. (2010). Pain in the older adult: An imperative across all health care settings. Pain Management Nursing, 11(2), S1–S10. https://doi.org/10.1016/j.pmn.2010.03.005.

Howard, R. S., Goldberg, T. E., Luo, J., Munoz, C., & Schneider, L. S. (2023). Reliability of the NACC telephone-administered neuropsychological battery (T-cog) and Montreal cognitive assessment for participants in the USC ADRC. Alzheimer's & Dementia, 15(1), e12406. https://doi.org/10.1002/dad2.12406.

Kang, Y., & Demiris, G. (2018). Self-report pain assessment tools for cognitively intact older adults: Integrative review. International Journal of Older People Nursing, 13(2), e12170. https://doi.org/10.1111/opn.12170.

Kapstad, H., Hanestad, B. R., Langeland, N., Rustøen, T., & Stavem, K. (2008). Cutpoints for mild, moderate and severe pain in patients with osteoarthritis of the hip or knee ready for joint replacement surgery. BMC Musculoskeletal Disorders, 9(1), 55. https://doi.org/10.1186/1471-2474-9-55.

Kroenke, K., & Spitzer, R. L. (2002). The PHQ-9: A new depression diagnostic and severity measure. Psychiatric Annals, 32(9), 509–515. https://doi.org/10.3928/0048-5713-20020901-06.

Kroenke, K., Strine, T. W., Spitzer, R. L., Williams, J. B. W., Berry, J. T., & Mokdad, A. H. (2009). The PHQ-8 as a measure of current depression in the general population. Journal of Affective Disorders, 114(1–3), 163–173. https://doi.org/10.1016/j.jad.2008.06.026.

Kunz, M., Seuss, D., Hassan, T., Garbas, J. U., Siebers, M., Schmid, U., et al. (2017). Problems of video-based pain detection in patients with dementia: A road map to an interdisciplinary solution. BMC Geriatrics, 17(1), 33. https://doi.org/10.1186/s12877-017-0427-2.

Lautenbacher, S., Walz, A. L., & Kunz, M. (2018). Using observational facial descriptors to infer pain in persons with and without dementia. BMC Geriatrics, 18(1), 88. https://doi.org/10.1186/s12877-018-0773-8.

Lautenbacher, S., Hassan, T., Seuss, D., Loy, F. W., Garbas, J. U., Schmid, U., et al. (2022). Automatic coding of facial expressions of pain: Are we there yet? Pain Research & Management, 2022, 6635496. https://doi.org/10.1155/2022/6635496.

van der Leeuw, G., Ayers, E., Leveille, S. G., Blankenstein, A. H., van der Horst, H. E., & Verghese, J. (2018). The effect of pain on major cognitive impairment in older adults. The Journal of Pain, 19(12), 1435–1444. https://doi.org/10.1016/j.jpain.2018.06.009.

Lezak, M. D., Howieson, D. B., Loring, D. W., Hannay, H. J., & Fischer, J. S. (2004). Neuropsychological assessment. New York, NY: Oxford University Press.

Lichtenberger, E., & Kaufman, A. (2013). Essentials of WAIS-IV assessment (2nd ed.). New York, NY: Wiley.

Lucey, P., Cohn, J. F., Prkachin, K. M., Solomon, P. E., Chew, S., & Matthews, I. (2012). Painful monitoring: Automatic pain monitoring using the UNBC-McMaster shoulder pain expression archive database. Image and Vision Computing, 30(3), 197–205. https://doi.org/10.1016/j.imavis.2011.12.003.

Lukas, A., Schuler, M., Fischer, T. W., Gibson, S. J., Savvas, S. M., Nikolaus, T., et al. (2012). Pain and dementia. Zeitschrift für Gerontologie und Geriatrie, 45(1), 45–49. https://doi.org/10.1007/s00391-011-0272-4.

Macdougall, E. E., & Mansbach, W. E. (2013). The judgment test of the neuropsychological assessment battery (NAB): Psychometric considerations in an assisted-living sample. The Clinical Neuropsychologist, 27(5), 827–839. https://doi.org/10.1080/13854046.2013.786759.

MacPherson, S. E., Wagner, G. P., Murphy, P., Bozzali, M., Cipolotti, L., & Shallice, T. (2014). Bringing the cognitive estimation task into the 21st century: Normative data on two new parallel forms. PLoS One, 9(3), e92554. https://doi.org/10.1371/journal.pone.0092554.

Meints, S. M., Cortes, A., Morais, C. A., & Edwards, R. R. (2019). Racial and ethnic differences in the experience and treatment of noncancer pain. Pain Management, 9(3), 317–334. https://doi.org/10.2217/pmt-2018-0030.

Meng, X., & D'Arcy, C. (2012). Education and dementia in the context of the cognitive reserve hypothesis: A systematic review with meta-analyses and qualitative analyses. PLoS One, 7(6), e38268. https://doi.org/10.1371/journal.pone.0038268.

Monsell, S. E., Dodge, H. H., Zhou, X. H., Bu, Y., Besser, L. M., Mock, C., et al. (2016). Results from the NACC Uniform Data Set neuropsychological battery crosswalk study. Alzheimer Disease and Associated Disorders, 30(2), 134–139. https://doi.org/10.1097/WAD.0000000000000111.

Moore, D. J., Keogh, E., & Eccleston, C. (2013). The effect of threat on attentional interruption by pain. Pain, 154(1), 82–88. https://doi.org/10.1016/j.pain.2012.09.009.

Nasreddine, Z. S., Phillips, N. A., Bédirian, V., Charbonneau, S., Whitehead, V., Collin, I., et al. (2005). The Montreal cognitive assessment, MoCA: A brief screening tool for mild cognitive impairment. Journal of the American Geriatrics Society, 53(4), 695–699. https://doi.org/10.1111/j.1532-5415.2005.53221.x.

Ricker, J. H., & Axelrod, B. N. (1994). Analysis of an oral paradigm for the trail making test. Assessment, 1(1), 47–51. https://doi.org/10.1177/1073191194001001007.

Rojo, R., Prados-Frutos, J. C., & López-Valverde, A. (2015). Pain assessment using the facial action coding system: A systematic review. Medicina Clínica, 145(8), 350–355. https://doi.org/10.1016/j.medcli.2014.08.010.

Roy, S. D., Bhowmik, M. K., Saha, P., & Ghosh, A. K. (2016). An approach for automatic pain detection through facial expression. Procedia Computer Science, 84, 99–106. https://doi.org/10.1016/j.procs.2016.04.072.

Schaefer, L. A., Thakur, T., & Meager, M. R. (2022). Neuropsychological assessment. In StatPearls. Treasure Island, FL: StatPearls Publishing. [Online only].

Spitzer, R. L., Kroenke, K., Williams, J. B., & Löwe, B. (2006). A brief measure for assessing generalized anxiety disorder: The GAD-7. Archives of Internal Medicine, 166(10), 1092–1097. https://doi.org/10.1001/archinte.166.10.1092.

Stern, R. A., & White, T. (2003). Neuropsychological assessment battery: Psychometric and technical manual. Lutz, FL: Psychological Assessment Resources.

Stolee, P., Hillier, L. M., Esbaugh, J., Bol, N., McKellar, L., & Gauthier, N. (2005). Instruments for the assessment of pain in older persons with cognitive impairment. Journal of the American Geriatrics Society, 53(2), 319–326. https://doi.org/10.1111/j.1532-5415.2005.53121.x.

Tapscott, B. E., & Etherton, J. (2015). The effects of cold pressor-induced pain on PASAT performance. Applied Neuropsychology, 22(3), 227–232. https://doi.org/10.1080/23279095.2014.910213.

Tripathi, R., Kumar, K., Bharath, S., Marimuthu, P., & Varghese, M. (2014). Age, education and gender effects on neuropsychological functions in healthy Indian older adults. Dementia & Neuropsychologia, 8(2), 148–154. https://doi.org/10.1590/S1980-57642014DN82000010.

von Elm, E., Altman, D. G., Egger, M., Pocock, S. J., Gøtzsche, P. C., & Vandenbroucke, J. P. (2014). The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: Guidelines for reporting observational studies. International Journal of Surgery, 12(12), 1495–1499. https://doi.org/10.1016/j.ijsu.2014.07.013.

Ware, L. J., Herr, K. A., Booker, S. S., Dotson, K., Key, J., Poindexter, N., et al. (2015). Psychometric evaluation of the revised Iowa pain thermometer (IPT-R) in a sample of diverse cognitively intact and impaired older adults: A pilot study. Pain Management Nursing, 16(4), 475–482. https://doi.org/10.1016/j.pmn.2014.09.004.

Wechsler, D. (2008). Wechsler Adult Intelligence Scale—Fourth Edition: Administration and scoring manual. San Antonio, TX: Pearson.

Weintraub, S., Besser, L., Dodge, H. H., Teylan, M., Ferris, S., Goldstein, F. C., et al. (2018). Version 3 of the Alzheimer disease centers’ neuropsychological test battery in the uniform data set (UDS). Alzheimer Disease & Associated Disorders, 32(1), 10–17. https://doi.org/10.1097/WAD.0000000000000223.

Wild, B., Eckl, A., Herzog, W., Niehoff, D., Lechner, S., Maatouk, I., et al. (2014). Assessing generalized anxiety disorder in elderly people using the GAD-7 and GAD-2 scales: Results of a validation study. The American Journal of Geriatric Psychiatry, 22(10), 1029–1038. https://doi.org/10.1016/j.jagp.2013.01.076.

Woo, A., Lechner, B., Fu, T., Wong, C. S., Chiu, N., Lam, H., et al. (2015). Cut points for mild, moderate, and severe pain among cancer and non-cancer patients: A literature review. Annals of Palliative Medicine, 4(4), 176–183. https://doi.org/10.3978/j.issn.2224-5820.2015.09.04.

This article is published and distributed under the terms of the Oxford University Press, Standard Journals Publication Model (https://dbpia.nl.go.kr/pages/standard-publication-reuse-rights)