
Latent Factor Structure and Measurement Invariance of the NIH Toolbox Cognition Battery in an Alzheimer’s Disease Research Sample

Published online by Cambridge University Press: 05 October 2020

Yue Ma*
Affiliation:
Wisconsin Alzheimer’s Disease Research Center, University of Wisconsin School of Medicine and Public Health, J5/1 Mezzanine, 600 Highland Avenue, Madison, WI 53792, USA
Cynthia M. Carlsson
Affiliation:
Wisconsin Alzheimer’s Disease Research Center, University of Wisconsin School of Medicine and Public Health, J5/1 Mezzanine, 600 Highland Avenue, Madison, WI 53792, USA; Wisconsin Alzheimer’s Institute, University of Wisconsin School of Medicine and Public Health, 610 Walnut Street, 9th Floor, Madison, WI 53726, USA; Geriatric Research Education and Clinical Center, William S. Middleton Memorial Veterans Hospital, 2500 Overlook Terrace, Madison, WI 53705, USA
Michelle L. Wahoske
Affiliation:
Wisconsin Alzheimer’s Disease Research Center, University of Wisconsin School of Medicine and Public Health, J5/1 Mezzanine, 600 Highland Avenue, Madison, WI 53792, USA
Hanna M. Blazel
Affiliation:
Wisconsin Alzheimer’s Disease Research Center, University of Wisconsin School of Medicine and Public Health, J5/1 Mezzanine, 600 Highland Avenue, Madison, WI 53792, USA
Richard J. Chappell
Affiliation:
Wisconsin Alzheimer’s Disease Research Center, University of Wisconsin School of Medicine and Public Health, J5/1 Mezzanine, 600 Highland Avenue, Madison, WI 53792, USA; Department of Biostatistics and Medical Informatics, University of Wisconsin, WARF Room 201, 610 Walnut Street, Madison, WI 53726, USA; Department of Statistics, University of Wisconsin, 1300 University Avenue, Madison, WI 53706, USA
Sterling C. Johnson
Affiliation:
Wisconsin Alzheimer’s Disease Research Center, University of Wisconsin School of Medicine and Public Health, J5/1 Mezzanine, 600 Highland Avenue, Madison, WI 53792, USA; Wisconsin Alzheimer’s Institute, University of Wisconsin School of Medicine and Public Health, 610 Walnut Street, 9th Floor, Madison, WI 53726, USA; Geriatric Research Education and Clinical Center, William S. Middleton Memorial Veterans Hospital, 2500 Overlook Terrace, Madison, WI 53705, USA
Sanjay Asthana
Affiliation:
Wisconsin Alzheimer’s Disease Research Center, University of Wisconsin School of Medicine and Public Health, J5/1 Mezzanine, 600 Highland Avenue, Madison, WI 53792, USA; Geriatric Research Education and Clinical Center, William S. Middleton Memorial Veterans Hospital, 2500 Overlook Terrace, Madison, WI 53705, USA
Carey E. Gleason
Affiliation:
Wisconsin Alzheimer’s Disease Research Center, University of Wisconsin School of Medicine and Public Health, J5/1 Mezzanine, 600 Highland Avenue, Madison, WI 53792, USA; Wisconsin Alzheimer’s Institute, University of Wisconsin School of Medicine and Public Health, 610 Walnut Street, 9th Floor, Madison, WI 53726, USA; Geriatric Research Education and Clinical Center, William S. Middleton Memorial Veterans Hospital, 2500 Overlook Terrace, Madison, WI 53705, USA
*Correspondence and reprint requests to: Yue Ma, Wisconsin Alzheimer’s Disease Research Center, University of Wisconsin School of Medicine and Public Health, J5/1 Mezzanine, 600 Highland Avenue, Madison, WI 53792, USA. E-mail: yma@medicine.wisc.edu

Abstract

Objective:

This study investigated the latent factor structure of the NIH Toolbox Cognition Battery (NIHTB-CB) and its measurement invariance across clinical diagnosis and key demographic variables (sex, race/ethnicity, age, and education) in a typical Alzheimer’s disease (AD) research sample.

Method:

The NIHTB-CB iPad English version, consisting of 7 tests, was administered to 411 participants aged 45–94 with a clinical diagnosis of cognitively unimpaired, dementia, mild cognitive impairment (MCI), or impaired-not-MCI. The factor structure for the whole sample was first examined with exploratory factor analysis (EFA) and then refined using confirmatory factor analysis (CFA). For each variable (clinical diagnosis or each demographic factor), participants were classified into two groups. The confirmed factor model was next tested within each group with CFA; if the factor structure was the same across the two groups, measurement invariance was then tested using a hierarchical series of nested two-group CFA models.
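To make the EFA step concrete, the sketch below extracts an exploratory two-factor solution from the seven NIHTB-CB test scores using Python's factor_analyzer package. It is an illustrative outline only, not the authors' analysis pipeline; the file name nihtb_scores.csv and all column names are hypothetical placeholders.

```python
# Minimal EFA sketch (illustration only; not the authors' pipeline).
# Assumes a CSV of scores for the 7 NIHTB-CB tests; the file path and
# column names below are hypothetical placeholders.
import pandas as pd
from factor_analyzer import FactorAnalyzer

tests = ["flanker", "card_sort", "pattern_comp",
         "picture_seq_mem", "list_sort_wm",
         "picture_vocab", "oral_reading"]

scores = pd.read_csv("nihtb_scores.csv")[tests].dropna()

# Two factors with an oblique rotation, allowing the factors to correlate.
efa = FactorAnalyzer(n_factors=2, rotation="oblimin", method="ml")
efa.fit(scores)

loadings = pd.DataFrame(efa.loadings_, index=tests,
                        columns=["factor1", "factor2"])
print(loadings.round(2))
print(efa.get_factor_variance())  # variance explained per factor
```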

Results:

A two-factor model capturing fluid cognition (executive function, processing speed, and memory) versus crystallized cognition (language) fit well for the whole sample and for each group except participants aged < 65. This model generally showed measurement invariance across sex, race/ethnicity, and education, and partial invariance across diagnosis. For individuals aged < 65, the language factor remained intact, whereas fluid cognition separated into two factors: (1) executive function/processing speed and (2) memory.
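For readers who want the confirmatory step spelled out, the sketch below specifies the reported two-factor (fluid vs. crystallized) structure in Python's semopy package and refits it within two groups as a rough configural check. This is a hedged illustration under assumed variable and group names, not the authors' analysis code; formal metric and scalar invariance testing would additionally constrain loadings and intercepts to equality across groups in nested models.

```python
# Minimal CFA sketch of the two-factor model (illustration only).
# The file path, variable names, and the dx_group column are
# hypothetical placeholders.
import pandas as pd
import semopy

model_desc = """
fluid =~ flanker + card_sort + pattern_comp + picture_seq_mem + list_sort_wm
cryst =~ picture_vocab + oral_reading
fluid ~~ cryst
"""

data = pd.read_csv("nihtb_scores.csv").dropna()

# Whole-sample fit and common fit indices.
cfa = semopy.Model(model_desc)
cfa.fit(data)
print(semopy.calc_stats(cfa)[["chi2", "DoF", "CFI", "RMSEA"]])

# Rough configural check: the same model fit separately in each group.
# Formal invariance testing would add cross-group equality constraints
# on loadings (metric) and then intercepts (scalar).
for label, grp in data.groupby("dx_group"):
    m = semopy.Model(model_desc)
    m.fit(grp)
    print(label, semopy.calc_stats(m)[["CFI", "RMSEA"]])
```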

Conclusions:

The findings largely supported the utility of the battery in AD research, but revealed challenges in measuring memory in participants with AD and in capturing longitudinal change in fluid cognition.

Type: Regular Research
Copyright: © INS. Published by Cambridge University Press, 2020


Supplementary material: Ma et al. supplementary material (File, 145.2 KB)