
Intraindividual Cognitive Variability: An Examination of ANAM4 TBI-MIL Simple Reaction Time Data from Service Members with and without Mild Traumatic Brain Injury

Published online by Cambridge University Press: 23 November 2017

Wesley R. Cole*
Affiliation:
Defense and Veterans Brain Injury Center, Silver Spring, Maryland and Fort Bragg, North Carolina; Womack Army Medical Center, Fort Bragg, North Carolina; General Dynamics Health Solutions, Fairfax, Virginia
Emma Gregory
Affiliation:
Defense and Veterans Brain Injury Center, Silver Spring, Maryland and Fort Bragg, North Carolina; General Dynamics Health Solutions, Fairfax, Virginia
Jacques P. Arrieux
Affiliation:
Defense and Veterans Brain Injury Center, Silver Spring, Maryland and Fort Bragg, North Carolina; Womack Army Medical Center, Fort Bragg, North Carolina; General Dynamics Health Solutions, Fairfax, Virginia
F. Jay Haran
Affiliation:
Uniformed Services University of the Health Sciences, Bethesda, Maryland
*
Correspondence and reprint requests to: Wesley R. Cole, Intrepid Spirit, Womack Army Medical Center, Fort Bragg, NC 28310. E-mail: wesley.r.cole.ctr@mail.mil

Abstract

Objectives: The Automated Neuropsychological Assessment Metrics 4 TBI-MIL (ANAM4) is a computerized cognitive test often used in post-concussion assessments with U.S. service members (SMs). However, existing evidence remains mixed regarding ANAM4’s ability to identify cognitive issues following mild traumatic brain injury (mTBI). Studies typically examine ANAM4 using standardized scores and/or comparisons to a baseline. A more fine-grained approach involves examining inconsistency within an individual’s performance (i.e., intraindividual variability). Methods: Data from 237 healthy control SMs and 105 SMs within seven days of mTBI who took the ANAM4 were included in analyses. Using each individual’s raw scores on a simple reaction time (RT) subtest (SRT1) that is repeated at the end of the battery (SRT2), we calculated mean raw RT and the intraindividual standard deviation (ISD) of trial-by-trial RT. Analyses investigated differences between groups in mean RT, RT variability (i.e., ISD), and change in ISD from SRT1 to SRT2. Results: Using regression residuals to control for demographic variables, analysis of variance and pairwise comparisons revealed the control group had faster mean RT and smaller ISD compared to the mTBI group. Furthermore, the mTBI group had a significant increase in ISD from SRT1 to SRT2, with effect sizes exceeding the minimum practical effect for comparisons of ISD in SRT2 and change in ISD from SRT1 to SRT2. Conclusions: While inconsistencies in performance are often viewed as test error, the results suggest intraindividual cognitive variability may be more sensitive than traditional metrics in detecting changes in cognitive function after mTBI. Additionally, the findings highlight the utility of the ANAM4’s repetition of an RT subtest at two points in the same session for exploring within-subject differences in performance variability. (JINS, 2018, 24, 156–162)
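The metrics described in the abstract (per-subject mean RT, trial-by-trial ISD, change in ISD from SRT1 to SRT2, and residualization on demographics) can be illustrated with a short analysis sketch. The code below is not the authors' implementation; it assumes a hypothetical trial-level table with columns subject, group, subtest, and rt_ms, plus a separate demographics table, and uses standard pandas/statsmodels calls.

# Minimal sketch (not the authors' code) of the abstract's analysis, under assumed
# column names: `trials` has one row per RT trial with columns subject,
# group ('control'/'mtbi'), subtest ('SRT1'/'SRT2'), and rt_ms (raw reaction time).
import pandas as pd
import statsmodels.api as sm

def summarize_srt(trials: pd.DataFrame) -> pd.DataFrame:
    """Per subject: mean raw RT and intraindividual SD (ISD) of trial RTs for each
    subtest, plus the change in ISD from SRT1 to SRT2."""
    per_subtest = (trials
                   .groupby(["subject", "group", "subtest"])["rt_ms"]
                   .agg(mean_rt="mean", isd="std")
                   .reset_index())
    wide = per_subtest.pivot_table(index=["subject", "group"],
                                   columns="subtest", values="isd").reset_index()
    # Positive values indicate greater variability at the end of the battery.
    wide["isd_change"] = wide["SRT2"] - wide["SRT1"]
    return wide

def residualize(metric: pd.Series, demographics: pd.DataFrame) -> pd.Series:
    """Regress a per-subject metric on demographic covariates (e.g., age, education)
    and return the residuals, mirroring the abstract's control for demographics."""
    X = sm.add_constant(demographics)
    return sm.OLS(metric, X).fit().resid

# The residuals could then be compared between the control and mTBI groups with an
# analysis of variance or pairwise tests (e.g., scipy.stats.ttest_ind).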

Type
Research Articles
Copyright
Copyright © The International Neuropsychological Society 2017 

