
Development and application of novel performance validity metrics for computerized neurocognitive batteries

Published online by Cambridge University Press:  12 December 2022

J. Cobb Scott*
Affiliation:
Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA VISN4 Mental Illness Research, Education, and Clinical Center at the Corporal Michael J. Crescenz VA Medical Center, Philadelphia, PA, USA
Tyler M. Moore
Affiliation:
Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
David R. Roalf
Affiliation:
Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
Theodore D. Satterthwaite
Affiliation:
Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
Daniel H. Wolf
Affiliation:
Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
Allison M. Port
Affiliation:
Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
Ellyn R. Butler
Affiliation:
Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
Kosha Ruparel
Affiliation:
Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
Caroline M. Nievergelt
Affiliation:
Center of Excellence in Stress and Mental Health, VA San Diego Healthcare System, San Diego, CA, USA Department of Psychiatry, University of California San Diego (UCSD), San Diego, CA, USA
Victoria B. Risbrough
Affiliation:
Center of Excellence in Stress and Mental Health, VA San Diego Healthcare System, San Diego, CA, USA Department of Psychiatry, University of California San Diego (UCSD), San Diego, CA, USA
Dewleen G. Baker
Affiliation:
Center of Excellence in Stress and Mental Health, VA San Diego Healthcare System, San Diego, CA, USA Department of Psychiatry, University of California San Diego (UCSD), San Diego, CA, USA
Raquel E. Gur
Affiliation:
Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA Lifespan Brain Institute, Department of Child and Adolescent Psychiatry and Behavioral Sciences, Children’s Hospital of Philadelphia, Philadelphia, PA, USA
Ruben C. Gur
Affiliation:
Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA VISN4 Mental Illness Research, Education, and Clinical Center at the Corporal Michael J. Crescenz VA Medical Center, Philadelphia, PA, USA Lifespan Brain Institute, Department of Child and Adolescent Psychiatry and Behavioral Sciences, Children’s Hospital of Philadelphia, Philadelphia, PA, USA
*
Corresponding author: J. Cobb Scott, email: scott1@pennmedicine.upenn.edu

Abstract

Objectives:

Data from neurocognitive assessments may not be accurate in the context of factors impacting validity, such as disengagement, unmotivated responding, or intentional underperformance. Performance validity tests (PVTs) were developed to address these phenomena and assess underperformance on neurocognitive tests. However, PVTs can be burdensome, rely on cutoff scores that reduce information, do not examine potential variations in task engagement across a battery, and are typically not well-suited to acquisition of large cognitive datasets. Here we describe the development of novel performance validity measures that could address some of these limitations by leveraging psychometric concepts using data embedded within the Penn Computerized Neurocognitive Battery (PennCNB).

Methods:

We first developed these validity measures using simulations of invalid response patterns with parameters drawn from real data. Next, we examined their application in two large, independent samples: 1) children and adolescents from the Philadelphia Neurodevelopmental Cohort (n = 9498); and 2) adult servicemembers from the Marine Resiliency Study-II (n = 1444).
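The abstract does not specify how invalid response patterns were simulated; the sketch below is a minimal, hypothetical illustration (not the authors' actual procedure) of the general approach: generate item responses from a Rasch (1-PL) model for engaged examinees, inject fully random ("careless") responders, and score everyone with an lz-style person-fit index, under which misfitting response patterns receive low values. All parameter values here are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_valid, n_careless, n_items = 200, 20, 40

theta = rng.normal(0.0, 1.0, n_valid)   # abilities of engaged examinees (assumed)
b = rng.normal(0.0, 1.0, n_items)       # item difficulties (assumed)

# Engaged examinees respond according to a Rasch (1-PL) model
p_valid = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
valid = (rng.random((n_valid, n_items)) < p_valid).astype(int)

# Careless examinees respond at random (p = .5 on binary items)
careless = (rng.random((n_careless, n_items)) < 0.5).astype(int)
X = np.vstack([valid, careless])

def lz(responses, theta_hat, b):
    """Standardized log-likelihood person-fit index; low values indicate misfit."""
    p = 1.0 / (1.0 + np.exp(-(theta_hat[:, None] - b[None, :])))
    ll = (responses * np.log(p) + (1 - responses) * np.log(1 - p)).sum(axis=1)
    mu = (p * np.log(p) + (1 - p) * np.log(1 - p)).sum(axis=1)
    var = (p * (1 - p) * np.log(p / (1 - p)) ** 2).sum(axis=1)
    return (ll - mu) / np.sqrt(var)

# Crude ability estimates from total scores (logit of proportion correct)
prop = X.mean(axis=1).clip(0.02, 0.98)
theta_hat = np.log(prop / (1 - prop))

scores = lz(X, theta_hat, b)
# Engaged examinees should show better (higher) person fit than careless ones
```

Random responders answer without regard to item difficulty, so their likelihood under the fitted model falls well below its expectation, pushing lz negative even when their total score looks unremarkable.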

Results:

Our performance validity metrics detected patterns of invalid responding in simulated data, even at subtle levels. Furthermore, a combination of these metrics significantly predicted previously established validity rules for these tests in both developmental and adult datasets. Moreover, most clinical diagnostic groups did not show reduced validity estimates.
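The combination rule and features used in the paper are not given in the abstract; as a hypothetical sketch of the general idea, the snippet below combines several per-examinee validity features into a single score and evaluates how well that score separates a small invalid subgroup, computing AUC directly from the Mann-Whitney U statistic (numpy only; feature names, shifts, and the 5% invalid rate are assumptions mirroring the simulation design described for Figure 1).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
invalid = rng.random(n) < 0.05   # assume ~5% of examinees respond invalidly

# Three hypothetical validity features (e.g., person fit, accuracy, RT consistency),
# each shifted downward for invalid records
feats = rng.normal(0.0, 1.0, (n, 3))
feats[invalid] -= 1.0

# Combine into one score (an unweighted sum here; a fitted multivariate
# combination would weight features by their predictive value)
score = -feats.sum(axis=1)       # higher score = more likely invalid

# AUC via the Mann-Whitney U statistic: P(random invalid scores above random valid)
pos, neg = score[invalid], score[~invalid]
auc = (pos[:, None] > neg[None, :]).mean() + 0.5 * (pos[:, None] == neg[None, :]).mean()
```

Because each feature is shifted by a full standard deviation for invalid records, the summed score separates the groups well, and the AUC lands substantially above the 0.5 chance level.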

Conclusions:

These results provide proof-of-concept evidence for multivariate, data-driven performance validity metrics. These metrics offer a novel method for determining performance validity on individual neurocognitive tests that is scalable, applicable across different tests, less burdensome, and dimensional. However, further research on their application is needed.

Information

Type
Research Article
Creative Commons
Creative Commons License - CC BY
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
Copyright © INS. Published by Cambridge University Press, 2022

Figure 1. Prediction accuracy (AUC) for the multivariate CNB performance validity estimate in predicting true (simulated) careless responding in 5% of the simulated examinees, by proportions of valid responses. Note. For visual simplification, “Person-Fit Method” is the average of the person-fit indices. CPF = Penn Face Memory Test; CPW = Penn Word Memory Test; VOLT = Visual Object Learning Test; ER40 = Penn Emotion Recognition Test; PMAT = Penn Matrix Analysis Test; PVRT = Penn Verbal Reasoning Test.


Table 1. In-sample prediction statistics for the six neurocognitive tests predicting human-determined validity, in the Philadelphia Neurodevelopmental Cohort (PNC) and Marine Resiliency Study-II (MRS-II) samples

Supplementary material

Scott et al. supplementary material 1 (File, 1.7 MB)
Scott et al. supplementary material 2 (File, 1.5 KB)
Scott et al. supplementary material 3 (File, 933.9 KB)