TBI and Blast Disrupt Normal Relationships Between Brain Function, Cognitive Performance, and Psychiatric Symptom Severity
- Jared A. Rowland, Jennifer R. Stapleton-Kotloski, Dwayne W. Godwin, Sarah L. Martindale
-
- Journal:
- Journal of the International Neuropsychological Society / Volume 29 / Issue s1 / November 2023
- Published online by Cambridge University Press:
- 21 December 2023, pp. 669-670
-
- Article
-
-
Objective:
Determine how characteristics of deployment mild traumatic brain injury (TBI) and blast exposure influence the relationships between the functional brain connectome and both cognitive outcomes and symptom severity.
Participants and Methods:
N = 181 Iraq and Afghanistan combat veterans completed structured clinical interviews, cognitive testing, self-report questionnaires, and magnetoencephalography (MEG). MEG data were acquired in the resting state with eyes open. MEG data were beamformed to identify brain regions active at rest. Functional brain connectomes representing the unique network present for a given individual were created using the active brain regions identified for each participant. Network metrics describing these connectomes were calculated at the participant level. Cognitive tests included the WAIS-IV, Trail Making Test Parts A&B, and the Controlled Oral Word Association Test. Due to differences in normative data across tests, raw scores were used in analyses. Symptom measures included the PTSD Checklist - 5 (PCL-5), Patient Health Questionnaire (PHQ-9), Neurobehavioral Symptoms Inventory (NSI), Quality of Life After Brain Injury (QOLIBRI), Pittsburgh Sleep Quality Index (PSQI), the Distress Tolerance Scale (DTS), and the PROMIS Pain Interference Scale (PROMIS-PI).
Results:
Hierarchical linear regression analyses revealed that several network metrics were significantly related to both cognitive outcomes and symptom severity after adjusting for demographic covariates and clinical characteristics.
The relationship between Global Efficiency (GE) and cognitive outcomes was moderated by deployment TBI on the WAIS-IV Full Scale Index (FSI), Perceptual Reasoning Index (PRI), and General Ability Index (GAI). In all cases, when deployment TBI was absent, greater GE was associated with poorer cognitive scores. The relationship between GE and symptom severity was moderated by the severity of blast exposure. Greater GE was associated with lower symptom severity at lower blast severities for the PHQ-9 and QOLIBRI scales A (thinking) and E (negative emotions). Moderation effects were also observed for the PSQI. In the absence of deployment TBI, greater GE was associated with better sleep quality; however, in the presence of deployment TBI, greater GE was associated with poorer sleep quality. Other connectome-outcome relationships were not consistently moderated by deployment TBI or blast history.
Conclusions:
Results demonstrated relationships between several aspects of the functional connectome of the brain and both cognitive outcomes and symptom severity, beyond the effects of common demographic and clinical variables. Moderation analyses revealed that the relationship between GE of the connectome and outcomes is frequently disrupted by deployment TBI and blast. GE is a measure of the ease of information transfer through the network. These results identified consistent relationships between GE and outcomes in the absence of deployment TBI or blast, but these relationships disappeared when deployment TBI or blast was present. Participants in this study were on average 11 years post-TBI or blast exposure, suggesting these are chronic rather than acute effects. GE was significantly correlated with most symptom severity measures as well as the WAIS-IV PRI, GAI, VCI, and FSI. Future efforts to normalize the relationship between GE and outcomes following TBI may improve rehabilitation outcomes and directly affect areas of concern commonly reported by service members following TBI or blast exposure.
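For reference, the global efficiency of an unweighted graph is the mean inverse shortest-path length over all ordered node pairs, which is why it indexes the ease of information transfer. A minimal stdlib-only sketch on toy graphs (not the MEG connectomes, which are built per participant from beamformed source activity):

```python
from collections import deque

def shortest_path_lengths(adj, src):
    """BFS hop counts from src in an unweighted graph given as {node: [neighbors]}."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def global_efficiency(adj):
    """Mean of 1/d(i, j) over ordered pairs i != j; unreachable pairs contribute 0."""
    nodes = list(adj)
    n = len(nodes)
    if n < 2:
        return 0.0
    total = 0.0
    for src in nodes:
        dist = shortest_path_lengths(adj, src)
        total += sum(1.0 / d for node, d in dist.items() if node != src)
    return total / (n * (n - 1))

# Toy examples: a 4-node path graph 0-1-2-3 and a fully connected 4-node graph
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
full = {i: [j for j in range(4) if j != i] for i in range(4)}
# global_efficiency(full) is 1.0, the maximum; the path graph scores lower (13/18)
```

A fully connected network achieves the maximum GE of 1.0 because every pair is one hop apart; sparser or more disrupted networks score lower.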
Receiver operating characteristic (ROC) analysis of neurons in the cat's lateral geniculate nucleus during tonic and burst response mode
- W. Guido, S.-M. Lu, J.W. Vaughan, Dwayne W. Godwin, S. Murray Sherman
-
- Journal:
- Visual Neuroscience / Volume 12 / Issue 4 / July 1995
- Published online by Cambridge University Press:
- 02 June 2009, pp. 723-741
-
- Export citation
-
Relay cells of the lateral geniculate nucleus respond to visual stimuli in one of two modes: burst and tonic. The burst mode depends on the activation of a voltage-dependent Ca2+ conductance underlying the low threshold spike. This conductance is inactivated at depolarized membrane potentials, but when activated from hyperpolarized levels, it leads to a large, triangular, nearly all-or-none depolarization. Typically, riding its crest is a high-frequency barrage of action potentials. Low threshold spikes thus provide a nonlinear amplification allowing hyperpolarized relay neurons to respond to depolarizing inputs, including retinal EPSPs. In contrast, the tonic mode is characterized by a steady stream of unitary action potentials that more linearly reflects the visual stimulus. In this study, we tested possible differences in detection between response modes of 103 geniculate neurons by constructing receiver operating characteristic (ROC) curves for responses to visual stimuli (drifting sine-wave gratings and flashing spots). Detectability was determined from the ROC curves by computing the area under each curve, known as the ROC area. Most cells switched between modes during recording, evidently due to small shifts in membrane potential that affected the activation state of the low threshold spike. We found that the more often a cell responded in burst mode, the larger its ROC area. This was true for responses to optimal and nonoptimal visual stimuli, the latter including nonoptimal spatial frequencies and low stimulus contrasts. The larger ROC areas associated with burst mode were due to reduced spontaneous activity and a roughly equivalent level of visually evoked response when compared to tonic mode. We performed a within-cell analysis on a subset of 22 cells that switched modes during recording. Every cell, whether tested with a low-contrast or high-contrast visual stimulus, exhibited a larger ROC area during its burst response mode than during its tonic mode.
We conclude that burst responses better support signal detection than do tonic responses. Thus, burst responses, while less linear and perhaps less useful in providing a detailed analysis of visual stimuli, improve target detection. The tonic mode, with its more linear response, seems better suited for signal analysis rather than signal detection.
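The ROC area used here has a convenient rank-based form: it equals the probability that a randomly chosen stimulus-driven response exceeds a randomly chosen spontaneous one (the Mann-Whitney statistic, with ties counting half). A minimal sketch with made-up spike counts, not the recorded data, showing how lower spontaneous activity alone raises the area:

```python
def roc_area(signal_counts, noise_counts):
    """ROC area as P(signal > noise) over all pairs, counting ties as 0.5."""
    wins = 0.0
    for s in signal_counts:
        for n in noise_counts:
            if s > n:
                wins += 1.0
            elif s == n:
                wins += 0.5
    return wins / (len(signal_counts) * len(noise_counts))

# Hypothetical trial spike counts. Burst-like: low spontaneous activity;
# tonic-like: the same evoked response but a higher spontaneous rate.
burst_noise = [0, 0, 1, 0, 1]
tonic_noise = [2, 3, 4, 2, 5]
evoked = [4, 5, 3, 6, 4]  # equivalent evoked level in both modes
```

With these invented counts the burst-like case reaches an ROC area of 1.0 (perfect detection) while the tonic-like case, sharing the same evoked distribution, scores lower because its spontaneous counts overlap the evoked ones. A chance-level detector has an area of 0.5.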
Transneuronal retrograde transport of attenuated pseudorabies viruses within central visual pathways
- Rodney J. Moore, Sherry Vinsant, Anita K. McCauley, Nuwan C. Kurukulasuriya, Dwayne W. Godwin
-
- Journal:
- Visual Neuroscience / Volume 18 / Issue 4 / July 2001
- Published online by Cambridge University Press:
- 11 January 2002, pp. 633-640
-
- Export citation
-
Pseudorabies virus (PRV) has been shown to be an effective transneuronal tracer within both the peripheral and the central nervous system. The only investigations of this virus in the visual system have examined anterograde transport of PRV from injection sites in the retina. In the present study, we injected attenuated forms of PRV into the primary visual cortex of both rats and cats to determine whether transneuronal retrograde infection would occur back to the retina. In rats, we made small injections into visual cortex of a strain of PRV (Bartha Blu) that contained a β-galactosidase promoter insert. In cats, we injected PRV-M201 into area V1 of visual cortex. After a 2- to 4-day incubation period, we examined tissue from these animals for the presence of the β-galactosidase marker (rats) or the virus itself (cats). Cortical PRV injections resulted in transneuronal retrograde infection of the lateral geniculate nucleus (LGN), thalamic reticular nucleus (TRN), and retina. PRV was retinotopically distributed in the pathway. In addition, double-labeling experiments in cats using an antibody against gamma-aminobutyric acid (GABA) were conducted to reveal PRV-labeled interneurons within the LGN and TRN. All TRN neurons were GABA+, as was a subset of LGN neurons. Only the subset of TRN neurons adjacent to the PRV-labeled sector of LGN was labeled with PRV. In addition, a subset of GABA+ interneurons in LGN was also labeled with PRV. We processed some tissue for electron microscopy to examine the morphology of the virus at various replication stages. No mature virions were detected in terminals from efferent pathways, although forms consistent with retrograde infection were encountered. We conclude that the PRV strains we have used produce a local infection that progresses primarily in the retrograde direction in the central visual pathways. 
The infection is transneuronal and viral replication maintains the intensity of the label throughout the chain of connected neurons, providing a means of examining detailed circuitry within the visual pathway.