87 Virtual Driving Relates to Real-World Risky Driving
Kathryn N. Devlin, Molly Split, Jocelyn Ang, Sophia Lopes, Aleksandar Gonevski, Oluwatoniloba Ogunkoya, Tasmia Hasan, Maria Schultheis
Journal of the International Neuropsychological Society, Volume 29, Issue s1, November 2023, pp. 489-490. Published online by Cambridge University Press: 21 December 2023.

Objective:
Driving is a cognitively demanding activity commonly affected by brain injury and illness. Accurate driving assessment is essential for reducing risk, optimizing independence, and informing driving-related interventions. Virtual reality driving simulation (VRDS) enables safe, sensitive, objective, and standardized measurement of driving abilities. VRDS has been validated in relation to self-reports and driver records. However, self-reports are subjective, and driver records include only major events (collisions, violations). Video telematics platforms can measure naturalistic driving in a more objective and sensitive manner. The present study used video telematics to examine relationships between VRDS performance and directly observed naturalistic driving.
Participants and Methods:
Twenty healthy adult drivers (ages 23-61, mean age=36; 75% women) completed a VRDS assessment that included 1) driving on a straight road, 2) following a truck on a highway, and 3) reacting to a child running into a street to retrieve a ball. Primary VRDS measures were 1) speed and lane management on the straight road; 2) speed and following distance management in the truck-following task; and 3) reaction time, stopping, and distance from the child in the child-ball task. Participants also completed 28 days of naturalistic driving with a video telematics platform in their vehicle. Driving events were detected automatically using accelerometer, GPS, and video data, and driving behaviors were coded by driving risk analysts. The primary naturalistic measure was the number of unsafe driving behaviors per hour driven; specific driving behaviors served as exploratory variables. We examined correlations between VRDS and naturalistic driving variables. Given limited statistical power, we reported correlations that were small-to-medium or greater (r>.2) in primary analyses and medium-to-large or greater (r>.4) in exploratory analyses.
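The correlation screen described above can be sketched as follows. This is a hypothetical illustration on synthetic data, not the study's dataset; the variable names (`unsafe_per_hour`, `mean_lane_position`, etc.) are invented for the example, and the thresholds mirror the r>.2 (primary) and r>.4 (exploratory) cutoffs from the abstract.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n = 20  # sample size from the abstract

# Synthetic stand-ins for the outcome (unsafe behaviors per hour driven)
# and a few VRDS measures; names and values are illustrative only
unsafe_per_hour = rng.gamma(shape=1.0, scale=0.9, size=n)
vrds_measures = {
    "mean_lane_position": 0.3 * unsafe_per_hour + rng.normal(size=n),
    "mean_speed": rng.normal(size=n),
    "speed_variability": rng.normal(size=n),
}

def screen_correlations(predictors, outcome, threshold):
    """Return predictors whose Pearson |r| with the outcome exceeds threshold."""
    kept = {}
    for name, values in predictors.items():
        r, p = pearsonr(values, outcome)
        if abs(r) > threshold:
            kept[name] = (round(r, 2), round(p, 3))
    return kept

# Primary analyses reported r > .2; exploratory analyses required r > .4
print(screen_correlations(vrds_measures, unsafe_per_hour, threshold=0.2))
```

With only 20 participants, reporting effect sizes above a magnitude cutoff rather than p-values alone is a reasonable way to flag candidate relationships for replication.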
Results:
On average, drivers exhibited approximately one unsafe driving behavior per hour (M=0.9, SD=0.9, range=0.1-2.7). Common behaviors were failing to stop, unsafe following distance, speeding, and cell phone use. No collisions occurred. Average lane position in VRDS (specifically, leftward deviation from the center of the lane) was correlated with more real-world unsafe driving behaviors per hour (r=.35, p=.13), as were higher average straight road speed (r=.26, p=.27), greater straight road speed variability (r=.28, p=.24), and failing to stop for the child in the child-ball task (r=.22, p=.36). In exploratory analyses, failing to stop for the child was associated with real-world distracted driving (r=.45, p=.047), greater lane position variability in VRDS was associated with real-world unsafe following distance (r=.57, p=.009), and greater speed variability in VRDS was associated with real-world seat belt non-use/misuse (r=.49, p=.03).
Conclusions:
The present findings provide preliminary evidence that VRDS variables are related to directly observed naturalistic driving, supporting the potential utility of VRDS as a sensitive, ecologically valid driving evaluation tool. As the present study used a small sample of healthy drivers, further research will explore this topic in larger samples and in clinical populations, such as people with acquired brain injury. Future work will also investigate whether incorporating VRDS with conventional driving evaluation tools (e.g., neuropsychological tests, behind-the-wheel assessments) can enhance the ability of clinical driving evaluations to predict real-world risky driving.

65 The Best Tests: Optimizing Detection of Cognitive Decline in People Living with HIV
Sajda Adam, Will Dampier, Shinika Tillman, Kim Malone, Vanessa Pirrone, Michael Nonnemacher, Amy Althoff, Zsofia Szep, Brian Wigdahl, Maria Schultheis, Kathryn N. Devlin
Journal of the International Neuropsychological Society, Volume 29, Issue s1, November 2023, pp. 60-61. Published online by Cambridge University Press: 21 December 2023.

Objective:
Approximately half of people living with HIV (PWH) experience HIV-associated neurocognitive disorders (HAND), yet HAND often goes undiagnosed. There is an ongoing need to find efficient, cost-effective ways to screen for HAND and monitor its progression in order to intervene earlier in its course and more effectively treat it. Prior studies that analyzed brief HAND screening tools have demonstrated that certain cognitive test pairs are sensitive to HAND cross-sectionally and outperform other screening tools such as the HIV Dementia Scale (HDS). However, few studies have examined optimal tests for longitudinal screening. This study aims to identify the best cognitive test pairs for detecting cognitive decline longitudinally.
Participants and Methods:
Participants were HIV+ adults (N=132; ages 25-68; 59% men; 92% Black) from the Temple/Drexel Comprehensive NeuroHIV Center cohort. Participants were currently well treated (98% on cART, 92% with undetectable viral load, and mean current CD4 count=686). They completed comprehensive neurocognitive assessments longitudinally (328 total visits, average follow-up time=4.9 years). Eighteen participants (14% of the cohort) demonstrated significant cognitive decline, defined as a decline in global cognitive z-score of 0.5 (SD) or more. In receiver operating characteristic (ROC) analyses, tests with an area under the curve (AUC) of greater than .7 were included in subsequent test pair analyses. Further ROC analyses examined the sensitivity and specificity of each test pair in detecting significant cognitive decline. Results were compared with the predictive ability of the Modified HIV Dementia Scale (MHDS).
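The test-pair ROC step might look like the sketch below. Note that the abstract does not specify how the two tests in a pair are combined into a single screening score; averaging their change scores, as done here, is an assumption for illustration, and the data are synthetic.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 132
declined = rng.random(n) < 0.14  # ~14% decliners, as in the abstract

# Synthetic change z-scores: decliners show worse (more negative) change
gpd_change = rng.normal(loc=np.where(declined, -0.8, 0.0), scale=1.0)
fluency_change = rng.normal(loc=np.where(declined, -0.6, 0.0), scale=1.0)

def pair_auc(test_a, test_b, outcome):
    """AUC for a two-test screen, scoring each person by mean change.

    The sign is flipped so that worse (more negative) change maps to a
    higher risk score, the direction roc_auc_score expects.
    """
    pair_score = -(test_a + test_b) / 2
    return roc_auc_score(outcome, pair_score)

print(f"AUC = {pair_auc(gpd_change, fluency_change, declined):.2f}")
```

A sensitivity/specificity trade-off for a given pair could then be read off the ROC curve (e.g., via `sklearn.metrics.roc_curve`) at a chosen operating threshold.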
Results:
The following test pairs demonstrated the best balance between sensitivity and specificity in detecting global cognitive decline: Grooved Pegboard dominant hand (GPD) and category fluency (sensitivity=.89, specificity=.60, AUC=.75, p<.001), GPD and Coding (sensitivity=.76, specificity=.70, AUC=.73, p<.001), letter fluency and Trail Making Test (TMT) B (sensitivity=.82, specificity=.63, AUC=.73, p<.001), and GPD and TMT B (sensitivity=.81, specificity=.64, AUC=.73, p<.001). Change in MHDS predicted significant decline no better than chance (sensitivity=.61, specificity=.47, AUC=.53, p=.65).
Conclusions:
Several cognitive test pairs, particularly those that include GPD, are sensitive to HIV-associated cognitive change, and far more sensitive and specific than the MHDS. Cognitive test pairs can serve as valid, rapid, cost-effective screening tools for detecting cognitive change in PWH, thereby better enabling early detection and intervention. Future research should validate the present findings in other cohorts and examine the implementation of test pair screenings in HIV care settings. Most of the optimal tests identified are consistent with the well-established impact of HAND on frontal-subcortical motor and executive networks. The utility of category fluency is somewhat unexpected as it places more demands on temporal semantic networks; future research should explore the factors driving this finding, such as the potential interaction of HIV with aging and neurodegenerative disease.

46 Depression and Reward Responsiveness in Multiple Sclerosis
Valerie Humphreys, Fareshte Irani, Darshan Patel, Maria Schultheis, John Medaglia, Kathryn N. Devlin
Journal of the International Neuropsychological Society, Volume 29, Issue s1, November 2023, pp. 559-560. Published online by Cambridge University Press: 21 December 2023.

Objective:
Depression is common in persons with multiple sclerosis (PwMS), substantially contributing to morbidity and mortality. Depression can dually impact PwMS as both a psychosocial reaction to living with the disease and a neurological effect of it. Cardinal features of depression include reduced ability to seek and experience pleasure, often attributed to dysregulation of the brain's reward system. People with depression exhibit atypical reward processing, as do fatigued PwMS. However, it is unclear whether MS itself affects reward processing, and whether it interacts with depression. The current study explored the associations of depression, MS, and their interaction with reward responsiveness. We hypothesized that depression and MS would independently be associated with poorer reward responsiveness and that they would interact synergistically to impair reward responsiveness.
Participants and Methods:
Forty PwMS and 40 age- and education-matched healthy controls (HC) participated in a computerized switching task with high- and low-reward manipulations. The Chicago Multiscale Depression Inventory (CMDI) Mood subscale measured depressive symptoms. The Behavioral Inhibition/Activation Scales (BIS/BAS) measured self-reported reward responsiveness and behavioral inhibition. Switching task performance was measured as response time (RT) and accuracy. Performance differences between the high- and low-reward conditions represented performance-based reward responsiveness. Linear mixed effects models were used to estimate the associations of MS and depression with reward responsiveness, behavioral inhibition, and task performance.
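A linear mixed-effects analysis of this kind could be sketched as follows, with trial-level response time modeled as a function of reward condition, depressive symptoms, and group, plus a random intercept per participant. The abstract does not report the exact model specification; the formula, column names, and simulated data below are illustrative assumptions only.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_subj, n_trials = 80, 40
rows = []
for s in range(n_subj):
    ms = int(s < 40)                 # first half: PwMS, second half: HC
    dep = rng.normal()               # depressive-symptom score (standardized)
    subj_int = rng.normal(scale=50)  # random intercept per participant
    for t in range(n_trials):
        high_reward = int(t % 2 == 0)
        # Simulated effects loosely echoing the abstract: MS and depression
        # slow responding; a reward speed-up appears only at low depression
        rt = (600 + 80 * ms + 30 * dep + subj_int
              - 15 * high_reward * (dep < 0) + rng.normal(scale=60))
        rows.append(dict(subject=s, ms=ms, dep=dep,
                         high_reward=high_reward, rt=rt))
df = pd.DataFrame(rows)

# Random-intercept model: fixed effects of reward, depression, group
model = smf.mixedlm("rt ~ high_reward * dep + ms", df, groups=df["subject"])
fit = model.fit()
print(fit.params["ms"])  # estimated MS-related slowing, in ms
```

The `high_reward:dep` interaction term is where a depression-dependent reward effect on RT, like the one reported above, would surface.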
Results:
Depression, but not MS, was associated with higher BIS scores (p=.007). Neither depression nor MS was associated with BAS subscales. On the switching task, participants who reported lower depression responded to reward such that they were slightly faster in the high-reward condition compared to the low-reward condition (p=.07). By contrast, in participants who reported higher depression, there was no effect of reward on response time. Additionally, MS (p=.009) and depression (p=.018) were each associated with slower response times. Regarding accuracy, no effects of reward were observed; however, there was an interaction between MS and depression. Among HC participants, depression was not related to accuracy. In comparison, PwMS who reported higher depression were more accurate than PwMS who reported less depression (p=.043).
Conclusions:
Consistent with hypotheses, higher depressive symptoms were associated with increased behavioral inhibition. Depression was not associated with self-reported reward responsiveness, but it was associated with reduced reward responsiveness on a cognitive task. Contrary to hypotheses, MS was not associated with reduced reward responsiveness. Additionally, higher depression and an MS diagnosis were related to slower response time, consistent with prior findings that psychomotor slowing is a hallmark feature of both disorders. Interestingly, we observed a unique behavioral trend in PwMS, such that PwMS with higher depressive symptoms were more accurate than PwMS with lower depressive symptoms, whereas this relationship was not present among HCs. Altogether, depression in both PwMS and cognitively healthy individuals may be associated with blunted reward responsiveness, but MS does not exacerbate this relationship. In fact, PwMS with depression may be more conscientious in their functioning and therefore perform better on cognitive task accuracy. Continued work should examine how reward processing and its underlying mechanisms may differ in depressed PwMS.

69 Influence of Cardiovascular Risk Factors on Neuropsychological Trajectories in Black/African American Adults Living with HIV
Valerie Humphreys, Will Dampier, Shinika Tilman, Kim Malone, Vanessa Pirrone, Michael Nonnemacher, Amy Althoff, Zsofia Szep, Brian Wigdahl, Maria Schultheis, Kathryn N. Devlin
Journal of the International Neuropsychological Society, Volume 29, Issue s1, November 2023, pp. 64-65. Published online by Cambridge University Press: 21 December 2023.

Objective:
Human immunodeficiency virus (HIV) type 1 (HIV-1), cardiovascular disease, and HIV-associated neurocognitive disorders (HAND) disproportionately affect Black/African American individuals compared to other racial and ethnic groups. Understanding the mechanisms of cognitive health disparities is essential for developing policy and health interventions to combat such disparities. Cardiovascular risk factors/diseases are common comorbidities that likely contribute to cognitive health disparities among Black/African American people living with HIV (PWH), but their longitudinal impact on cognition in this population is unclear. The current study examines the relationship between cardiovascular risk and cognitive functioning over time in Black/African American adults living with HIV.
Participants and Methods:
A sample of 122 Black/African American adults with HIV (ages 25-68, M=51.8, SD=7.7; 98% on antiretroviral therapy; 91% with undetectable viral load) were selected from the Drexel/Temple Comprehensive NeuroHIV Center, Clinical and Translational Research Support Core (CTRSC; based at Drexel University College of Medicine) Cohort. They completed longitudinal visits (300 total visits, average follow-up time=4.9 years) that included clinical interviews, medical record review, biometric measurements, and comprehensive neuropsychological assessments. Cardiovascular risk factors of interest were body mass index (BMI), waist-to-height ratio (WHtR), and a total vascular risk burden score (VBS) representing five risk factors: obesity, central obesity, diabetes, hyperlipidemia, and hypertension. Based on a prior principal component analysis, three cognitive domains were examined: (1) verbal fluency, (2) visual memory/visuoconstruction, and (3) motor speed/executive functions. Mixed models were used to examine domain-specific cognitive trajectories in relation to baseline cardiovascular risk factors and changes in cardiovascular risk factors.
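The total vascular risk burden score (VBS) described above is, in effect, a count of five dichotomous risk factors, which the results then dichotomize into low (0-1) and high (2-5) groups. A minimal sketch, with illustrative field names:

```python
# The five risk factors named in the abstract
RISK_FACTORS = ("obesity", "central_obesity", "diabetes",
                "hyperlipidemia", "hypertension")

def vascular_burden(record: dict) -> int:
    """Count how many of the five risk factors are present (0-5)."""
    return sum(bool(record.get(f, False)) for f in RISK_FACTORS)

def vbs_group(score: int) -> str:
    """Dichotomize as in the results: low = 0-1 factors, high = 2-5."""
    return "low" if score <= 1 else "high"

patient = {"obesity": True, "hypertension": True, "diabetes": False}
print(vascular_burden(patient), vbs_group(vascular_burden(patient)))
```

Treating absent fields as "risk factor not present" is a simplifying assumption; a real pipeline would distinguish missing data from a negative finding.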
Results:
Overall, cognitive test performance improved over time (p<.003). Baseline VBS was marginally associated with longitudinal change in verbal fluency (p=.06). Participants with low baseline VBS (0-1 risk factors) demonstrated improvement in verbal fluency (p=.002), while those with higher VBS (2-5 risk factors) demonstrated stability in verbal fluency. In contrast, greater increases in BMI and in WHtR predicted more favorable trajectories in motor speed/executive function (both p<.001). Patients with increasing BMI over time improved in this domain (p=.02), while patients with stable or decreasing BMI did not. A similar pattern was observed for WHtR change. No vascular risk factors were associated with trajectories of visual memory/visuoconstruction.
Conclusions:
Higher total vascular risk burden was associated with less favorable verbal fluency trajectories, reflecting the negative cognitive consequences of disorders such as diabetes, hyperlipidemia, and hypertension. Unexpectedly, greater increases in BMI and WHtR were associated with more favorable trajectories in motor speed and executive functioning. In this population, weight gain may be a proxy for other positive health factors, such as immune reconstitution, which will be examined in future analyses. Taken together, cardiovascular risk factors have heterogeneous associations with cognitive trajectories, emphasizing the importance of examining the mechanisms of these varying relationships. Future research will examine how social determinants of health, such as racial/ethnic discrimination, contribute to disparities in cardiovascular risk factors and cognitive outcomes.

Commissions and Omissions Are Dissociable Aspects of Everyday Action Impairment in Schizophrenia
Kathryn N. Devlin, Tania Giovannetti, Rachel K. Kessler, Molly J. Fanning
Journal of the International Neuropsychological Society, Volume 20, Issue 8, September 2014, pp. 812-821. Published online by Cambridge University Press: 30 July 2014.

Prior research using performance-based assessment of functional impairment has informed a novel neuropsychological model of everyday action impairment in dementia in which omission errors (i.e., failure to complete task steps) dissociate from commission errors (i.e., inaccurate performance of task steps) and have unique neuropsychological correlates. However, this model has not been tested in other populations. The present study examined whether this model extends to schizophrenia. Fifty-four individuals with schizophrenia or schizoaffective disorder were administered a neuropsychological protocol and the Naturalistic Action Test (NAT), a performance-based measure of everyday action. A principal component analysis (PCA) was performed to examine the construct(s) comprising everyday action impairment, and correlations between the resultant component(s) and neuropsychological tests were examined. Results showed that omissions and a subset of commissions were distinct components of everyday action. However, results did not support unique associations between these components and specific neuropsychological measures. These findings extend the omission-commission model to schizophrenia and may have important implications for efficient assessment and effective rehabilitation of functional impairment, such as the potential efficacy of targeted interventions for the rehabilitation of omission and commission deficits in everyday functioning. Larger studies with prospective designs are needed to replicate the present preliminary findings. (JINS, 2014, 20, 1–10)
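The PCA step described above asks whether error measures separate into distinct components (here, omissions vs. commissions). The sketch below illustrates the idea on synthetic data; the NAT's actual error taxonomy and scoring are richer than the four made-up measures shown.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
n = 54  # sample size from the abstract

# Two latent factors standing in for omission and commission tendencies
omission_factor = rng.normal(size=n)
commission_factor = rng.normal(size=n)

# Four synthetic error measures, two driven by each latent factor
errors = np.column_stack([
    omission_factor + 0.3 * rng.normal(size=n),
    omission_factor + 0.3 * rng.normal(size=n),
    commission_factor + 0.3 * rng.normal(size=n),
    commission_factor + 0.3 * rng.normal(size=n),
])

pca = PCA(n_components=2)
pca.fit(errors)
# With this structure, two components capture most of the variance,
# and their loadings separate the omission-driven from the
# commission-driven measures
print(pca.explained_variance_ratio_)
print(np.round(pca.components_, 2))
```

In practice such analyses typically apply a rotation (e.g., varimax) to make component loadings interpretable; plain PCA is used here only to keep the sketch short.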

Effects of HIV and Early Life Stress on Amygdala Morphometry and Neurocognitive Function
Uraina S. Clark, Ronald A. Cohen, Lawrence H. Sweet, Assawin Gongvatana, Kathryn N. Devlin, George N. Hana, Michelle L. Westbrook, Richard C. Mulligan, Beth A. Jerskey, Tara L. White, Bradford Navia, Karen T. Tashima
Journal of the International Neuropsychological Society, Volume 18, Issue 4, July 2012, pp. 657-668. Published online by Cambridge University Press: 24 May 2012.

Both HIV infection and high levels of early life stress (ELS) have been related to abnormalities in frontal-subcortical structures, yet the combined effects of HIV and ELS on brain structure and function have not been previously investigated. In this study we assessed 49 non-demented HIV-seropositive (HIV+) and 47 age-matched HIV-seronegative healthy control (HC) adults. Levels of ELS exposure were quantified and used to define four HIV-ELS groups: HC Low-ELS (N = 20); HC High-ELS (N = 27); HIV+ Low-ELS (N = 24); HIV+ High-ELS (N = 25). An automated segmentation tool measured volumes of brain structures known to show HIV-related or ELS-related effects; a brief neurocognitive battery was administered. A significant HIV-ELS interaction was observed for amygdala volumes, which was driven by enlargements in HIV+ High-ELS participants. The HIV+ High-ELS group also demonstrated significant reductions in psychomotor/processing speed compared with HC Low-ELS. Regression analyses in the HIV+ group revealed that amygdala enlargements were associated with higher ELS, lower nadir CD4 counts, and reduced psychomotor/processing speed. Our results suggest that HIV infection and high ELS interact to increase amygdala volume, which is associated with neurocognitive dysfunction in HIV+ patients. These findings highlight the lasting neuropathological influence of ELS and suggest that high ELS may be a significant risk factor for neurocognitive impairment in HIV-infected individuals. (JINS, 2012, 19, 1–12)
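The 2x2 design above (HIV status by ELS level) is the kind of interaction test that can be sketched with an OLS model containing a product term. The group sizes below match the abstract, but the amygdala volumes are synthetic stand-ins built to show an interaction, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
groups = [("HC", "low", 20), ("HC", "high", 27),
          ("HIV+", "low", 24), ("HIV+", "high", 25)]  # Ns from the abstract
rows = []
for hiv, els, n in groups:
    # Build in an interaction: enlargement only in the HIV+ high-ELS cell
    bump = 300 if (hiv == "HIV+" and els == "high") else 0
    for _ in range(n):
        rows.append(dict(hiv=hiv, els=els,
                         amygdala=1700 + bump + rng.normal(scale=120)))
df = pd.DataFrame(rows)

# Two-way model with interaction; C() makes the grouping factors explicit
fit = smf.ols("amygdala ~ C(hiv) * C(els)", df).fit()
interaction = [p for p in fit.params.index if ":" in p][0]
print(interaction, round(fit.params[interaction], 1))
```

A significant interaction coefficient here corresponds to the cell-specific enlargement pattern the abstract reports; follow-up contrasts would localize it to the HIV+ high-ELS group.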

Neurocognitive Effects of HIV, Hepatitis C, and Substance Use History
Kathryn N. Devlin, Assawin Gongvatana, Uraina S. Clark, Jesse D. Chasman, Michelle L. Westbrook, Karen T. Tashima, Bradford Navia, Ronald A. Cohen
Journal of the International Neuropsychological Society, Volume 18, Issue 1, January 2012, pp. 68-78. Published online by Cambridge University Press: 2 December 2011.

HIV-associated neurocognitive dysfunction persists in the highly active antiretroviral therapy (HAART) era and may be exacerbated by comorbidities, including substance use and hepatitis C virus (HCV) infection. However, the neurocognitive impact of HIV, HCV, and substance use in the HAART era is still not well understood. In the current study, 115 HIV-infected and 72 HIV-seronegative individuals with significant rates of lifetime substance dependence and HCV infection received comprehensive neuropsychological assessment. We examined the effects of HIV serostatus, HCV infection, and substance use history on neurocognitive functioning. We also examined relationships between HIV disease measures (current and nadir CD4, HIV RNA, duration of infection) and cognitive functioning. Approximately half of HIV-infected participants exhibited neurocognitive impairment. Detectable HIV RNA but not HIV serostatus was significantly associated with cognitive functioning. HCV was among the factors most consistently associated with poorer neurocognitive performance across domains, while substance use was less strongly associated with cognitive performance. The results suggest that neurocognitive impairment continues to occur in HIV-infected individuals in association with poor virologic control and comorbid conditions, particularly HCV coinfection. (JINS, 2012, 18, 68–78)

Facial emotion recognition impairments in individuals with HIV
Uraina S. Clark, Ronald A. Cohen, Michelle L. Westbrook, Kathryn N. Devlin, Karen T. Tashima
Journal of the International Neuropsychological Society, Volume 16, Issue 6, November 2010, pp. 1127-1137. Published online by Cambridge University Press: 20 October 2010.

Human immunodeficiency virus (HIV) infection, characterized by frontostriatal dysfunction, is associated with cognitive and psychiatric abnormalities. Several studies have noted impaired facial emotion recognition abilities in patient populations that demonstrate frontostriatal dysfunction; however, facial emotion recognition abilities have not been systematically examined in HIV patients. The current study investigated facial emotion recognition in 50 nondemented HIV-seropositive adults and 50 control participants relative to their performance on a nonemotional landscape categorization control task. We examined the relation of HIV-disease factors (nadir and current CD4 levels) to emotion recognition abilities and assessed the psychosocial impact of emotion recognition abnormalities. Compared to control participants, HIV patients performed normally on the control task but demonstrated significant impairments in facial emotion recognition, specifically for fear. HIV patients reported greater psychosocial impairments, which correlated with increased emotion recognition difficulties. Lower current CD4 counts were associated with poorer anger recognition. In summary, our results indicate that chronic HIV infection may contribute to emotion processing problems among HIV patients. We suggest that disruptions of frontostriatal structures and their connections with cortico-limbic networks may contribute to emotion recognition abnormalities in HIV. Our findings also highlight the significant psychosocial impact that emotion recognition abnormalities have on individuals with HIV. (JINS, 2010, 16, 1127–1137.)