Enlist E3® soybean is resistant to 2,4-D, glyphosate, and glufosinate, allowing postemergence applications of these herbicides sequentially or as tank mixes. The objectives of this experiment were to evaluate the effect of postemergence herbicide application timing and sequence, with or without a preemergence application of micro-encapsulated acetochlor, on waterhemp and common lambsquarters control, soybean yield, and economic returns. Field experiments were conducted in Rosemount and Franklin, Minnesota, in 2021 and 2022. Site, herbicide application timing, and sequence influenced weed control, yield, and profitability. In Rosemount, preemergence followed by (fb) two-pass postemergence programs, including 2,4-D + glyphosate applied at mid-postemergence with or without S-metolachlor, resulted in ≥95% waterhemp control at 28 d after the late postemergence application. In Franklin, where weed density was lower, two-pass postemergence programs that included at least one application of 2,4-D + glyphosate (with or without S-metolachlor), regardless of preemergence application, provided ≥97% control of waterhemp and common lambsquarters at 28 d after late postemergence. The level of control was comparable to that of a preemergence herbicide fb a mid-postemergence application of 2,4-D + glyphosate + S-metolachlor at that site. In Rosemount, including acetochlor as the preemergence herbicide in the preemergence fb postemergence programs improved soybean yield by 32% and partial returns by US$384.50 ha−1 compared to postemergence-only herbicide programs. In contrast, the preemergence application did not affect yield or profitability in Franklin. The highest soybean yield in Rosemount (2,925.7 kg ha−1) resulted from glufosinate applied early postemergence fb 2,4-D + glyphosate applied mid-postemergence. This yield was comparable to that of glufosinate applied early postemergence fb 2,4-D + glyphosate + S-metolachlor applied mid-postemergence and to the two-pass glufosinate (early postemergence fb mid-postemergence) program, highlighting the importance of early season weed control. In Franklin, 2,4-D + glyphosate + S-metolachlor (applied mid-postemergence) fb glufosinate (applied late postemergence) provided a yield similar to the aforementioned programs at that site.
Multicenter clinical trials are essential for evaluating interventions but often face significant challenges in study design, site coordination, participant recruitment, and regulatory compliance. To address these issues, the National Institutes of Health’s National Center for Advancing Translational Sciences established the Trial Innovation Network (TIN). The TIN offers a scientific consultation process, providing access to clinical trial and disease experts who provide input and recommendations throughout the trial’s duration, at no cost to investigators. This approach aims to improve trial design, accelerate implementation, foster interdisciplinary teamwork, and spur innovations that enhance multicenter trial quality and efficiency. The TIN leverages resources of the Clinical and Translational Science Awards (CTSA) program, complementing local capabilities at the investigator’s institution. The Initial Consultation process focuses on the study’s scientific premise, design, site development, recruitment and retention strategies, funding feasibility, and other support areas. As of 6/1/2024, the TIN has provided 431 Initial Consultations to increase efficiency and accelerate trial implementation by delivering customized support and tailored recommendations. Across a range of clinical trials, the TIN has developed standardized, streamlined, and adaptable processes. We describe these processes, provide operational metrics, and include a set of lessons learned for consideration by other trial support and innovation networks.
Palmer amaranth, a competitive weed in cotton and soybean, poses challenges due to its rapid growth, high fecundity, and herbicide resistance. Effective management strategies targeting sex ratios could reduce seed production by female plants. Protoporphyrinogen oxidase (PPO)-inhibiting herbicides play a role in the evolving resistance of Amaranthus spp. in the US Midwest. These herbicides may also affect the male-to-female ratio of Palmer amaranth. A 2-yr field experiment (2015 and 2016) was conducted in a soybean field in Collinsville, IL, evaluating various preemergence and postemergence PPO-inhibiting herbicide treatments. Untreated Palmer amaranth populations exhibited a bias toward females. Preemergence application of sulfentrazone and flumioxazin effectively reduced Palmer amaranth density (1.66 plants m–2) throughout the season, whereas postemergence applications of fomesafen and lactofen provided limited control (27 and 31 plants m–2, respectively). Early-season mortality was high (96%) among Palmer amaranth seedlings, especially with the pyroxasulfone + fluthiacet-methyl treatment. Fomesafen increased female biomass (28.8%) while reducing male biomass compared to the nontreated control. In 2015, pyroxasulfone + fluthiacet-methyl and acetochlor altered the male-to-female sex ratio compared to the nontreated control, with pyroxasulfone + fluthiacet-methyl reducing the proportion of females (–0.11 M/F) and acetochlor slightly increasing the proportion of males (0.03 M/F), though neither differed from a 1:1 ratio. In 2016, pendimethalin and flumioxazin (71 g ai ha–1) resulted in a strongly female-biased sex ratio, with an almost exclusively female population. In both years, the nontreated control plots (–0.58 and –0.55 M/F) maintained a naturally female-biased sex ratio, deviating significantly from a 1:1 ratio. These findings suggest that specific herbicide treatments can alter the sex ratio. Understanding sex determination in Palmer amaranth holds promise for developing more effective control strategies in the future.
The global population and status of Snowy Owls Bubo scandiacus are particularly challenging to assess because individuals are irruptive and nomadic, and the breeding range is restricted to the remote circumpolar Arctic tundra. The International Union for Conservation of Nature (IUCN) uplisted the Snowy Owl to “Vulnerable” in 2017 because the suggested population estimates appeared considerably lower than historical estimates, and it recommended actions to clarify the population size, structure, and trends. Here we present a broad review and status assessment, an effort led by the International Snowy Owl Working Group (ISOWG) and researchers from around the world, to estimate population trends and the current global status of the Snowy Owl. We use long-term breeding data, genetic studies, satellite-GPS tracking, and survival estimates to assess current population trends at several monitoring sites in the Arctic and we review the ecology and threats throughout the Snowy Owl range. An assessment of the available data suggests that current estimates of a worldwide population of 14,000–28,000 breeding adults are plausible. Our assessment of population trends at five long-term monitoring sites suggests that breeding populations of Snowy Owls in the Arctic have decreased by more than 30% over the past three generations and the species should continue to be categorised as Vulnerable under the IUCN Red List Criterion A2. We offer research recommendations to improve our understanding of Snowy Owl biology and future population assessments in a changing world.
Daily sodium intake in England is ∼3.3 g/day(1), and government and scientific advice to reduce intake for cardiovascular health has had varying success(2). Eccrine sweat is produced during exercise or exposure to warm environments to maintain body temperature through evaporative cooling. Sweat is primarily water, but it also contains appreciable amounts of electrolytes, particularly sodium, meaning sweat sodium losses could reduce daily sodium balance without the need for dietary manipulation. However, the effects of sweat sodium losses on 24-h sodium balance are unclear.
Fourteen active participants (10 males, 4 females; 23±2 years, 45±9 mL/kg/min) completed a preliminary trial and two 24-h randomised, counterbalanced experimental trials. Participants arrived fasted for baseline (0-h) measures (blood/urine samples, blood pressure, nude body mass) followed by breakfast and either low-intensity intermittent cycling in the heat (∼36°C, ∼50% humidity) to turn over ∼2.5% body mass in sweat (EX) or the same duration of room-temperature seated rest (REST). Further blood samples were collected post-EX/REST (1.5-3 h post-baseline). During EX, sweat was collected from 5 sites and water was consumed to fully replace sweat losses. During REST, participants drank 100 mL/h. Food intake was individually standardised over the 24 h, with bottled water available ad libitum. Participants collected all urine produced over the 24 h and returned the following morning to repeat the baseline measures fasted (24-h). Sodium balance was estimated over the 24 h from sweat/urine losses and dietary intake. Data were checked for normality with Shapiro-Wilk tests and analysed using 2-way ANOVA followed by paired t-tests/Wilcoxon signed-rank tests. Data are mean (standard deviation).
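The sodium balance estimate is simple arithmetic; a minimal sketch follows (the function name is ours, and the example values restate the EX-trial group means reported in the results below, not raw study data):

```python
# Minimal sketch of the 24-h sodium balance estimate described above.
def sodium_balance_g(dietary_intake_g: float,
                     sweat_loss_g: float,
                     urine_loss_g: float) -> float:
    """24-h sodium balance = dietary intake - (sweat + urine) losses, in grams."""
    return dietary_intake_g - (sweat_loss_g + urine_loss_g)

# Example using the reported group means: intake 2.3 g; EX sweat sodium 2.5 g
# and urine sodium 1.8 g; REST sweat 0 g and urine sodium 3.3 g.
print(sodium_balance_g(2.3, sweat_loss_g=2.5, urine_loss_g=1.8))  # -2.0 (EX)
print(sodium_balance_g(2.3, sweat_loss_g=0.0, urine_loss_g=3.3))  # -1.0 (REST)
```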
Dietary sodium intake was 2.3 (0.3) g and participants lost 2.8 (0.3) % body mass in sweat (containing 2.5 (0.9) g sodium). Sodium balance was lower for EX (-2.0 (1.6) g vs -1.0 (1.6) g; P = 0.022), despite lower 24-h urine sodium losses in EX (1.8 (1.2) g vs 3.3 (1.7) g; P = 0.001). Post-EX/REST blood sodium concentration was lower in EX (137.6 (2.3) mmol/L vs 139.9 (1.0) mmol/L; P = 0.002) but did not differ at 0-h (P = 0.906) or 24-h (P = 0.118). There was no difference in plasma volume change (P = 0.423), urine specific gravity (P = 0.495), systolic (P = 0.324) or diastolic (P = 0.274) blood pressure between trials over the 24 h. Body mass change over 24 h was not different between trials (REST +0.25 (1.10) %; EX +0.40 (0.68) %; P = 0.663).
Sweat loss through low-intensity exercise resulted in a lower sodium balance compared to rest. Although urine sodium output was reduced with EX, this reduction was not sufficient to offset exercise-induced sodium losses. Despite this, body mass, plasma volume, and blood sodium concentration did not differ between trials, suggesting sodium may have been lost from non-osmotic sodium stores. This suggests sweat sodium losses could be used to reduce sodium balance, although longer studies are required to confirm this hypothesis.
Information regarding the prevalence and distribution of herbicide-resistant waterhemp [Amaranthus tuberculatus (Moq.) Sauer] in Minnesota is limited. Whole-plant bioassays were conducted in the greenhouse on 90 A. tuberculatus populations collected from 47 counties in Minnesota. Eight postemergence herbicides, 2,4-D, atrazine, dicamba, fomesafen, glufosinate, glyphosate, imazamox, and mesotrione, were applied at 1× and 3× the labeled doses. Based on their responses, populations were classified into highly resistant (≥40% survival at 3× the labeled dose), moderately resistant (<40% survival at 3× the labeled dose but ≥40% survival at 1× the labeled dose), less sensitive (10% to 39% survival at 1× the labeled dose), and susceptible (<10% survival at 1× the labeled dose) categories. All 90 populations were resistant to imazamox, while 89% were resistant to glyphosate. Atrazine, fomesafen, and mesotrione resistance was observed in 47%, 31%, and 22% of all populations, respectively. Ten percent of the populations were resistant to 2,4-D, and 2 of 90 populations exhibited >40% survival following dicamba application at the labeled dose. No population was confirmed to be resistant to glufosinate. However, 22% of all populations were classified as less sensitive to glufosinate. Eighty-two populations were found to be multiple-herbicide resistant. Among these, 15 populations exhibited resistance to four different herbicide sites of action (SOAs); 7 and 4 populations were resistant to five and six SOAs, respectively. All six-way-resistant populations were from southwest Minnesota. Two populations, one from Lincoln County and the other from Lyon County, were resistant to 2,4-D, atrazine, dicamba, fomesafen, glyphosate, imazamox, and mesotrione, leaving only glufosinate as a postemergence control option for these populations in corn (Zea mays L.) and soybean [Glycine max (L.) Merr.]. Diversified management tactics, including nonchemical control measures along with herbicide applications from effective SOAs, should be implemented to slow down the evolution and spread of herbicide-resistant A. tuberculatus populations.
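The four response categories form a simple threshold rule; the sketch below takes its cutoffs directly from the text, while the function name and example values are illustrative:

```python
# Classification rule from the bioassay described above, given percent
# survival at the 1x and 3x labeled doses.
def classify_population(survival_1x: float, survival_3x: float) -> str:
    if survival_3x >= 40:
        return "highly resistant"
    if survival_1x >= 40:
        return "moderately resistant"
    if 10 <= survival_1x < 40:
        return "less sensitive"
    return "susceptible"

print(classify_population(survival_1x=55, survival_3x=10))  # moderately resistant
print(classify_population(survival_1x=25, survival_3x=0))   # less sensitive
```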
Although the link between alcohol involvement and behavioral phenotypes (e.g. impulsivity, negative affect, executive function [EF]) is well-established, the directionality of these associations, specificity to stages of alcohol involvement, and extent of shared genetic liability remain unclear. We estimate longitudinal associations between transitions among alcohol milestones, behavioral phenotypes, and indices of genetic risk.
Methods
Data came from the Collaborative Study on the Genetics of Alcoholism (n = 3681; ages 11–36). Alcohol transitions (first: drink, intoxication, alcohol use disorder [AUD] symptom, AUD diagnosis), internalizing, and externalizing phenotypes came from the Semi-Structured Assessment for the Genetics of Alcoholism. EF was measured with the Tower of London and Visual Span Tasks. Polygenic scores (PGS) were computed for alcohol-related and behavioral phenotypes. Cox models estimated associations among PGS, behavior, and alcohol milestones.
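As a sketch of the survival analysis described (not the study's actual code), a Cox proportional-hazards fit in Python's lifelines package might look like the following; the column names, covariates, and simulated data are all hypothetical stand-ins for the study's milestone definitions, phenotypes, and polygenic scores:

```python
# Illustrative Cox model: time to first drink as a function of an
# externalizing phenotype and a polygenic score. Data are simulated.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
conduct = rng.poisson(1.5, n)              # conduct disorder symptom count
pgs = rng.normal(0, 1, n)                  # drinks-per-week polygenic score
# Higher simulated risk shortens the time to the milestone
age = 12 + rng.exponential(scale=8 / np.exp(0.2 * conduct + 0.1 * pgs))
df = pd.DataFrame({"age_first_drink": np.minimum(age, 30),
                   "observed": (age < 30).astype(int),   # censored at age 30
                   "conduct_symptoms": conduct,
                   "dpw_pgs": pgs})

cph = CoxPHFitter()
cph.fit(df, duration_col="age_first_drink", event_col="observed")
cph.print_summary()   # the exp(coef) column gives hazard ratios
```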
Results
Externalizing phenotypes (e.g. conduct disorder symptoms) were associated with future initiation and drinking problems (hazard ratio (HR)⩾1.16). Internalizing (e.g. social anxiety) was associated with hazards for progression from first drink to severe AUD (HR⩾1.55). Initiation and AUD were associated with increased hazards for later depressive symptoms and suicidal ideation (HR⩾1.38), and initiation was associated with increased hazards for future conduct symptoms (HR = 1.60). EF was not associated with alcohol transitions. Drinks per week PGS was linked with increased hazards for alcohol transitions (HR⩾1.06). Problematic alcohol use PGS increased hazards for suicidal ideation (HR = 1.20).
Conclusions
Behavioral markers of addiction vulnerability precede and follow alcohol transitions, highlighting dynamic, bidirectional relationships between behavior and emerging addiction.
Traumatic brain injury (TBI) and concussion are associated with increased dementia risk, but accurate estimates of TBI/concussion exposure are largely lacking for less common neurodegenerative conditions like frontotemporal dementia (FTD). We evaluated lifetime TBI and concussion frequency in patients diagnosed with a range of FTD spectrum conditions and related prior head trauma to cavum septum pellucidum (CSP) characteristics observable on MRI.
Participants and Methods:
We administered the Ohio State University TBI Identification and Boston University Head Impact Exposure Assessment to 108 patients (age 69.5 ± 8.0, 35% female, 93% white or unknown race) diagnosed at the UCSF Memory and Aging Center with one of the following FTD or related conditions: behavioral variant frontotemporal dementia (N=39), semantic variant primary progressive aphasia (N=16), nonfluent variant PPA (N=23), corticobasal syndrome (N=14), or progressive supranuclear palsy (N=16). Data were also obtained from 217 controls (“HC”; age 76.8 ± 8.0, 53% female, 91% white or unknown race). CSP characteristics were defined based on width or “grade” (0-1 vs. 2+) and length of anterior-posterior separation (millimeters). We first describe frequency of any and multiple (2+) prior TBI based on different but commonly used definitions: TBI with loss of consciousness (LOC), TBI with LOC or posttraumatic amnesia (LOC/PTA), TBI with LOC/PTA or other symptoms like dizziness, nausea, “seeing stars,” etc. (“concussion”). TBI/concussion frequency was then compared between FTD and HC using chi-square. Associations between TBI/concussion and CSP characteristics were analyzed with chi-square (CSP grade) and Mann-Whitney U tests (CSP length). We explored sex differences due to typically higher rates of TBI among males.
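A minimal SciPy sketch of these group comparisons follows; the contingency counts are approximated from the percentages reported in the results below, and the CSP length values are placeholders, not study data:

```python
# Chi-square for group-by-exposure frequencies and Mann-Whitney U for
# CSP length, mirroring the tests named above.
import numpy as np
from scipy.stats import chi2_contingency, mannwhitneyu

# 2x2 table: rows = FTD vs. HC, columns = prior concussion yes / no
# (counts approximated from 50.0% of 108 and 44.3% of 217)
table = np.array([[54, 54],
                  [96, 121]])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")

# CSP anterior-posterior length (mm), exposed vs. unexposed (placeholders)
csp_exposed = [1.2, 0.0, 3.4, 2.1, 0.0]
csp_unexposed = [0.8, 0.0, 1.5, 2.0, 0.5]
stat, p_mw = mannwhitneyu(csp_exposed, csp_unexposed)
print(f"U = {stat}, p = {p_mw:.3f}")
```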
Results:
History of any TBI with LOC (FTD = 20.0%, HC = 19.2%), TBI with LOC/PTA (FTD = 32.2%, HC = 31.5%), and concussion (FTD = 50.0%, HC = 44.3%) was common but did not differ between study groups (p's > .4). In both FTD and HC, prior TBI/concussion was nominally more frequent in males but not significantly greater than in females. Frequency of repeat TBI/concussion (2+) also did not differ significantly between FTD and HC (repeat TBI with LOC: 6.7% vs. 3.3%; TBI with LOC/PTA: 12.2% vs. 10.3%; concussion: 30.2% vs. 28.7%; p's > .2). Prior TBI/concussion was not significantly related to CSP grade or length in the total sample or within the FTD or HC groups.
Conclusions:
TBI/concussion rates depend heavily on the symptom definition used for classifying prior injury. Lifetime symptomatic TBI/concussion is common but has an unclear impact on risk for FTD-related diagnoses. Larger samples are needed to appropriately evaluate sex differences, to evaluate whether TBI/concussion rates differ between specific FTD phenotypes, and to understand the rates and effects of more extensive repetitive head trauma (symptomatic and asymptomatic) in patients with FTD.
Individuals living with HIV may experience cognitive difficulties or marked declines known as HIV-Associated Neurocognitive Disorder (HAND). Cognitive difficulties have been associated with worse outcomes for people living with HIV; therefore, accurate cognitive screening and identification is critical. One potentially sensitive but underutilized marker of cognitive impairment is intra-individual variability (IIV), the dispersion of a person's scores across tasks within a neuropsychological assessment. In individuals living with HIV, greater cognitive IIV has been associated with cortical atrophy, poorer cognitive functioning, more rapid declines, and greater difficulties in daily functioning. Studies examining the use of IIV in clinical neuropsychological testing are limited, and few have examined IIV in the context of a single neuropsychological battery designed for culturally diverse or at-risk populations. To address these gaps, this study aimed to examine IIV profiles of individuals living with HIV and/or who inject drugs, using the Neuropsi, a standardized neuropsychological instrument for Spanish-speaking populations.
Participants and Methods:
Spanish-speaking adults residing in Puerto Rico (n=90) who are HIV positive and who inject drugs (HIV+I), HIV negative and who inject drugs (HIV-I), HIV positive who do not inject drugs (HIV+), or healthy controls (HC) completed the Neuropsi battery as part of a larger research protocol. The Neuropsi produces 3 index scores representing the cognitive domains of memory, attention/memory, and attention/executive functioning. Total-battery and within-index IIV were calculated by dividing the standard deviation of T-scores by mean performance, resulting in a coefficient of variation (CoV). Group differences on overall test battery mean CoV (OTBMCoV) were investigated. To examine unique profiles of index-specific IIV, a cluster analysis was performed for each group.
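The CoV computation reduces to one line; a sketch, assuming T-scores collected for one participant across the battery (the example values are illustrative):

```python
# Coefficient of variation as described above: SD of T-scores / mean T-score.
import numpy as np

def cov(t_scores):
    t = np.asarray(t_scores, dtype=float)
    return t.std(ddof=1) / t.mean()

# One participant's T-scores across the battery (illustrative values).
# OTBMCoV for a group is then the mean of cov(...) across its participants.
print(round(cov([48, 55, 39, 61, 50]), 3))
```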
Results:
Results of a one-way ANOVA indicated significant between group differences on OTBMCoV (F[3,86]=6.54, p<.001). Post-hoc analyses revealed that HIV+I (M=.55, SE=.07, p=.003), HIV-I (M=.50, SE=.03, p=.001), and HIV+ (M=.48, SE=.02, p=.002) had greater OTBMCoV than the HC group (M=.30, SE=.02). To better understand sources of IIV within each group, cluster analysis of index specific IIV was conducted. For the HIV+ group, 3 distinct clusters were extracted: 1. High IIV in attention/memory and attention/executive functioning (n=3, 8%); 2. Elevated memory IIV (n=21, 52%); 3. Low IIV across all indices (n=16, 40%). For the HIV-I group, 2 distinct clusters were extracted: 1. High IIV across all 3 indices (n=7, 24%) and 2. Low IIV across all 3 indices (n=22, 76%). For the HC group, 3 distinct clusters were extracted: 1. Very low IIV across all 3 indices (n=5, 36%); 2. Elevated memory IIV (n=6, 43%); 3. Elevated attention/executive functioning IIV with very low attention/memory and memory IIV (n=3, 21%). Sample size of the HIV+I group was insufficient to extract clusters.
Conclusions:
Current findings support IIV in the Neuropsi test battery as a clinically sensitive marker of cognitive impairment in Spanish-speaking individuals living with HIV or who inject drugs. Furthermore, the distinct IIV cluster types identified between groups can help to better understand specific sources of variability. Implications for clinical assessment in prognosis and etiological considerations are discussed.
Individuals who have experienced traumatic brain injury (TBI) are at an elevated risk for worsened physical and psychological outcomes. Increased rates of anxiety and depression, along with cognitive issues, are common post-TBI. While there is some evidence that anxiety and depression may affect objective cognitive performance, less is known about their effect on other factors associated with the individual's capacity to complete a task, such as the perceived workload of the cognitive task. Workload represents an individual's perception of task difficulty and serves as a proxy for the magnitude of mental demands a given task places on an individual. Preliminary findings in the literature suggest that individuals with TBI commonly report greater workload when completing cognitive tasks compared to neurotypical peers, but the influence of anxiety and depression on survivors' workload remains unclear. Considering the elevated rates of psychological and cognitive problems in individuals with TBI, the present study examined the moderating role of anxiety and depression in TBI survivors' perceived workload during a stress-inducing working memory task.
Participants and Methods:
Ten participants with moderate to severe TBI and eight neurologically healthy controls performed the Paced Auditory Serial Addition Task (PASAT). After completing the PASAT, participants reported their subjective workload using the NASA task load index (NASA-TLX). Participants also completed measures of psychological functioning, including the Chicago Multiscale Depression Inventory (CMDI) and the State-Trait Anxiety Inventory (STAI). Relationships between workload and depression and trait anxiety were examined using linear regression.
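A sketch of this moderation and simple-slopes analysis using statsmodels; the variable names stand in for the NASA-TLX (workload), STAI trait anxiety, and CMDI depression scores, and the data are simulated rather than the study's:

```python
# Interaction (moderation) model plus simple slopes at +/-1 SD of depression.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 50
df = pd.DataFrame({"anxiety": rng.normal(40, 10, n),
                   "depression": rng.normal(15, 5, n)})
df["workload"] = (30 + 0.9 * df["anxiety"]
                  - 0.015 * df["anxiety"] * df["depression"]
                  + rng.normal(0, 5, n))

model = smf.ols("workload ~ anxiety * depression", data=df).fit()
print(model.params["anxiety:depression"])  # interaction term tests moderation

# Simple slopes: re-center depression at +/-1 SD, read off the anxiety slope
for label, level in (("low", -1), ("high", 1)):
    df["dep_c"] = df["depression"] - (df["depression"].mean()
                                      + level * df["depression"].std())
    m = smf.ols("workload ~ anxiety * dep_c", data=df).fit()
    print(label, "depression: anxiety slope =", round(m.params["anxiety"], 3))
```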
Results:
Linear regression was employed for both the TBI and the healthy control groups to assess the influence of trait anxiety and depression on perceived workload. There was no significant difference between the TBI and HC groups in NASA-TLX perceived workload scores. Within the TBI group, there was a significant anxiety by depression interaction (b = -.015, p < .001). Simple slopes analyses revealed that for TBI participants reporting low depression, perceived workload increased with increasing anxiety (b = .093, p < .001). For TBI participants reporting high depression, perceived workload decreased as anxiety increased (b = -.38, p = .03). While there was also a significant anxiety by depression interaction in the healthy control group (b = .033, p = .04), simple slopes analyses revealed no significant associations for healthy controls.
Conclusions:
These results demonstrate that in TBI, level of depression moderates the relationship between anxiety and workload perception. The pattern observed in the TBI group was unique from controls. The present findings suggest that post-TBI, higher depression may temper the influence of anxiety on stressful cognitive task performance and workload rating. The tempering effect of high depression in TBI may represent a biased reporting style or impaired assessment of task difficulty, which may ultimately affect the individual’s capacity to accomplish a task well.
Injection drug use is a significant public health crisis with adverse health outcomes, including increased risk of human immunodeficiency virus (HIV) infection. Comorbidity of HIV and injection drug use is highly prevalent in the United States and disproportionately elevated in surrounding territories such as Puerto Rico. While both HIV status and injection drug use are independently known to be associated with cognitive deficits, the interaction of these effects remains largely unknown. The aim of this study was to determine how HIV status and injection drug use are related to cognitive functioning in a group of Puerto Rican participants. Additionally, we investigated the degree to which type and frequency of substance use predict cognitive abilities.
Participants and Methods:
96 Puerto Rican adults completed the Neuropsi Attention and Memory-3rd Edition battery for Spanish-speaking participants. Injection substance use over the previous 12 months was also obtained via clinical interview. Participants were categorized into four groups based on HIV status and injection substance use in the last 30 days (HIV+/injector, HIV+/non-injector, HIV-/injector, HIV-/non-injector). One-way analysis of variance (ANOVA) was conducted to determine differences between groups on each index of the Neuropsi battery (Attention and Executive Function; Memory; Attention and Memory). Multiple linear regression was used to determine whether type and frequency of substance use predicted performance on these indices while considering HIV status.
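A minimal sketch of the one-way ANOVA comparing the four groups on a Neuropsi index, using SciPy; the group scores below are simulated placeholders, not study data:

```python
# One-way ANOVA across the four HIV/injection groups on one index score.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
hiv_pos_inj = rng.normal(42, 8, 20)   # HIV+/injector
hiv_pos     = rng.normal(44, 8, 25)   # HIV+/non-injector
hiv_neg_inj = rng.normal(45, 8, 25)   # HIV-/injector
controls    = rng.normal(52, 8, 26)   # HIV-/non-injector

f_stat, p = f_oneway(hiv_pos_inj, hiv_pos, hiv_neg_inj, controls)
print(f"F = {f_stat:.2f}, p = {p:.4f}")
```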
Results:
The one-way ANOVAs revealed significant differences (p's < 0.01) between the healthy control group and all other groups across all indices; no significant differences were observed among the other groups. More frequent injection drug use, regardless of the substance, was associated with lower combined attention and memory performance relative to injecting less than monthly (monthly: p = 0.04; 2-3x daily: p < 0.01; 4-7x daily: p = 0.02; 8+ times daily: p < 0.01). Both minimal and heavy daily use predicted poorer memory performance (p = 0.02 and p = 0.01, respectively). Heavy heroin use predicted poorer attention and executive functioning (p = 0.04). Heroin use also predicted lower performance on tests of memory when used monthly (p = 0.049) and daily or almost daily (2-6x weekly: p = 0.04; 4-7x daily: p = 0.04). Finally, moderate injection of heroin predicted lower scores on attention and memory (weekly: p = 0.04; 2-6x weekly: p = 0.048). Heavy combined heroin and cocaine use predicted worse memory performance (p = 0.03) and combined attention and memory (p = 0.046). HIV status was not a moderating factor in any circumstance.
Conclusions:
As predicted, residents of Puerto Rico who do not inject substances and are HIV-negative performed better in domains of memory, attention, and executive function than those living with HIV and/or who inject substances. There was no significant difference among the affected groups in cognitive ability. As expected, daily injection of substances predicted worse performance on tasks of memory. Heavy heroin use predicted worse performance on executive function and memory tasks, while heroin-only and combined heroin and cocaine use predicted worse memory performance. Overall, the type and frequency of substance use were more predictive of cognitive functioning than HIV status.
We present and evaluate the prospects for detecting coherent radio counterparts to gravitational wave (GW) events using Murchison Widefield Array (MWA) triggered observations. The MWA rapid-response system, combined with its buffering mode ($\sim$4 min negative latency), enables us to catch any radio signals produced from seconds prior to hours after a binary neutron star (BNS) merger. The large field of view of the MWA ($\sim$$1\,000\,\textrm{deg}^2$ at 120 MHz) and its location under the high sensitivity sky region of the LIGO-Virgo-KAGRA (LVK) detector network, forecast a high chance of being on-target for a GW event. We consider three observing configurations for the MWA to follow up GW BNS merger events, including a single dipole per tile, the full array, and four sub-arrays. We then perform a population synthesis of BNS systems to predict the radio detectable fraction of GW events using these configurations. We find that the configuration with four sub-arrays is the best compromise between sky coverage and sensitivity as it is capable of placing meaningful constraints on the radio emission from 12.6% of GW BNS detections. Based on the timescales of four BNS merger coherent radio emission models, we propose an observing strategy that involves triggering the buffering mode to target coherent signals emitted prior to, during or shortly following the merger, which is then followed by continued recording for up to three hours to target later time post-merger emission. We expect MWA to trigger on $\sim$$5-22$ BNS merger events during the LVK O4 observing run, which could potentially result in two detections of predicted coherent emission.
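Purely as a toy illustration of how a population synthesis feeds a detectable-fraction estimate, the sketch below draws an assumed fluence distribution and applies an assumed sensitivity limit and beam-coverage probability; none of these numbers or distributions come from the paper:

```python
# Toy detectable-fraction calculation from a synthesized event population.
import numpy as np

rng = np.random.default_rng(1)
n_events = 100_000
fluence_jy_ms = rng.lognormal(mean=2.0, sigma=1.5, size=n_events)  # assumed
p_in_beam = 0.3           # assumed probability the event lands in the FoV
sensitivity_jy_ms = 50.0  # assumed fluence detection threshold

detectable = ((fluence_jy_ms > sensitivity_jy_ms)
              & (rng.random(n_events) < p_in_beam))
print(f"detectable fraction: {detectable.mean():.3f}")
```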
Meta-analyses of functional magnetic resonance imaging (fMRI) studies have been used to elucidate the most reliable neural features associated with various psychiatric disorders. However, it has not been well-established whether each of these neural features is linked to a specific disorder or is transdiagnostic across multiple disorders and disorder categories, including mood, anxiety, and anxiety-related disorders.
Objectives
This project aims to advance our understanding of the disorder-specific and transdiagnostic neural features associated with mood, anxiety, and anxiety-related disorders as well as to refine the methodology used to compare multiple disorders.
Methods
We conducted an exhaustive PubMed literature search followed by double-screening, double-extraction, and cross-checking to identify all whole-brain, case-control fMRI activation studies of mood, anxiety, and anxiety-related disorders, and used them to construct a large-scale meta-analytic database of primary studies of these disorders. We then analyzed these primary studies with multilevel kernel density analysis (MKDA), using Monte Carlo simulations to correct for multiple comparisons and ensemble thresholding to reduce cluster size bias. Finally, we applied triple subtraction techniques and a second-order analysis to elucidate the disorder-specificity of the previously identified neural features.
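As a schematic of the MKDA logic named above (not the authors' implementation), the toy sketch below builds per-study indicator maps by marking voxels within a spherical kernel of each reported peak, averages them into a density map, and derives a family-wise-error height threshold from a Monte Carlo max-density null. The grid size, kernel radius, and peak data are invented; real analyses operate on brain-masked MNI/Talairach coordinates with study weighting:

```python
# Simplified MKDA sketch on a toy voxel grid.
import numpy as np

GRID, RADIUS = 40, 2  # toy grid size and spherical kernel radius (voxels)

def indicator_map(peaks):
    """Binary map: 1 within RADIUS of any reported peak for one study."""
    m = np.zeros((GRID, GRID, GRID), dtype=bool)
    idx = np.indices(m.shape)
    for p in peaks:
        m |= ((idx - np.array(p)[:, None, None, None]) ** 2).sum(0) <= RADIUS ** 2
    return m

def density(studies):
    """Proportion of studies activating each voxel."""
    return np.mean([indicator_map(s) for s in studies], axis=0)

rng = np.random.default_rng(0)
studies = [rng.integers(0, GRID, size=(3, 3)) for _ in range(20)]  # 3 peaks each
observed = density(studies)

# Monte Carlo null: relocate each study's peaks at random, keep max density
null_max = [density([rng.integers(0, GRID, size=s.shape) for s in studies]).max()
            for _ in range(200)]
threshold = np.quantile(null_max, 0.95)  # FWE-corrected height threshold
print((observed >= threshold).sum(), "voxels exceed the corrected threshold")
```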
Results
We found that participants diagnosed with mood, anxiety, and anxiety-related disorders exhibited statistically significant (p < .05 to p < .0001; FWE-corrected) differences in neural activation relative to healthy controls throughout the cerebral cortex, limbic system, and basal ganglia. In addition, each of these psychiatric disorders exhibited a particular profile of neural features that ranged from disorder-specific, to category-specific, to transdiagnostic.
Conclusions
These findings indicate that psychiatric disorders exhibit a complex profile of neural features that vary in their disorder-specificity and can be detected with large-scale fMRI meta-analytic techniques. This approach has potential to fundamentally transform neuroimaging investigations of clinical disorders by providing a novel procedure for establishing disorder-specificity of observed results, which can be then used to advance our understanding of individual disorders as well as broader nosological issues related to diagnosis and classification of psychiatric disorders.
Generalized anxiety disorder (GAD) is a highly prevalent mental illness that is associated with clinically significant distress, functional impairment, and poor emotional regulation. Primary functional magnetic resonance imaging (fMRI) studies of GAD report neural abnormalities in comparison to healthy controls. However, many of these findings in the primary literature are inconsistent, and it is unclear whether they are specific to GAD or shared transdiagnostically across related disorders.
Objectives
This meta-analysis seeks to establish the most reliable neural abnormalities observed in individuals with GAD, as reported in the primary fMRI activation literature.
Methods
We conducted an exhaustive literature search in PubMed to identify primary studies that met our pre-specified inclusion criteria and then extracted relevant data from primary, whole-brain fMRI activation studies of GAD that reported coordinates in Talairach or MNI space. We then used multilevel kernel density analysis (MKDA) with ensemble thresholding to examine the differences between adults with GAD and healthy controls in order to identify brain regions that reached statistical significance across primary studies.
Results
Patients with GAD showed statistically significant (α = 0.05–0.0001; family-wise error rate corrected) neural activation in various regions of the cerebral cortex and basal ganglia across a variety of experimental tasks.
Conclusions
These results inform our understanding of the neural basis of GAD and are interpreted using a frontolimbic model of anxiety as well as specific clinical symptoms of this disorder and its relation to other mood and anxiety disorders. These results also suggest possible novel targets for emerging neurostimulation therapies (e.g., transcranial magnetic stimulation) and may be used to advance our understanding of the effects of current pharmaceutical treatments and ways to improve treatment selection and symptom-targeting for patients diagnosed with GAD.
Major depressive disorder (MDD) is a highly prevalent mental illness that frequently originates in early development and is pervasive during adolescence. Despite its high prevalence and early age of onset, the potentially unique neural basis of MDD in this age group remains poorly understood, and the existing primary literature on the topic includes many new and divergent results. This limited understanding of MDD in youth presents a critical need to further investigate its neural basis and an opportunity to improve clinical treatments that target its neural abnormalities.
Objectives
The present study aims to advance our understanding of the neural basis of MDD in youth by identifying abnormal functional activation in various brain regions compared with healthy controls.
Methods
We conducted a meta-analysis of functional magnetic resonance imaging (fMRI) studies of MDD by using a well-established method, multilevel kernel density analysis (MKDA) with ensemble thresholding, to quantitatively combine all existing whole-brain fMRI studies of MDD in youth compared with healthy controls. This method involves a voxel-wise, whole-brain approach that compares neural activation of patients with MDD to that of age-matched healthy controls across variations of task-based conditions, which we subcategorize into affective processing, executive functioning, positive valence, negative valence, and symptom provocation tasks.
Results
Youth with MDD exhibited statistically significant (p<0.05; FWE-corrected) hyperactivation and hypoactivation in multiple brain regions compared with age-matched healthy controls. These results include significant effects that are stable across various tasks as well as some that appear to depend on task conditions.
Conclusions
This study strengthens our understanding of the neural basis of MDD in youth and may also be used to help identify possible similarities and differences between youth and adults with depression. It may also help inform the development of new treatment interventions and tools for predicting unique treatment responses in youth with depression.
Major depressive disorder (MDD) is a highly prevalent mental illness that often first occurs in, or persists into, adulthood and is considered the leading cause of disability and disease burden worldwide. Unfortunately, individuals diagnosed with MDD who seek treatment often experience limited symptom relief and may not achieve long-term remission, due in part to our limited understanding of its underlying pathophysiology. Many studies using task-based functional magnetic resonance imaging (fMRI) have found abnormal activation in brain regions in adults diagnosed with MDD, but those findings are often inconsistent; in addition, previous meta-analyses that quantitatively integrate this large body of literature have found conflicting results.
Objectives
This meta-analysis aims to advance our understanding of the neural basis of MDD in adults, as measured by fMRI activation studies, and address inconsistencies and discrepancies in the empirical literature.
Methods
We employed multilevel kernel density analysis (MKDA) with ensemble thresholding, a well-established method for voxel-wise, whole-brain meta-analyses, to conduct a quantitative comparison of all relevant primary fMRI activation studies of adult patients with MDD compared to age-matched healthy controls.
Results
We found that adults with MDD exhibited a reliable pattern of statistically significant (p<0.05; FWE-corrected) hyperactivation and hypoactivation in several brain regions compared to age-matched healthy controls across a variety of experimental tasks.
Conclusions
This study supports previous findings that there is a reliable neural basis of MDD that can be detected across heterogeneous fMRI studies. These results can be used to inform the development of promising treatments for MDD, including protocols for personalized interventions. They also provide the opportunity for additional studies to examine the specificity of these effects among various populations of interest, including youth vs. adults with depression as well as other related mood and anxiety disorders.
Several hypotheses may explain the association between substance use, posttraumatic stress disorder (PTSD), and depression. However, few studies have utilized a large multisite dataset to understand this complex relationship. Our study assessed the relationship between alcohol and cannabis use trajectories and PTSD and depression symptoms across 3 months in recently trauma-exposed civilians.
Methods
In total, 1618 (1037 female) participants provided self-report data on past 30-day alcohol and cannabis use and PTSD and depression symptoms during their emergency department (baseline) visit. We reassessed participants' substance use and clinical symptoms 2, 8, and 12 weeks posttrauma. Latent class mixture modeling determined alcohol and cannabis use trajectories in the sample. Changes in PTSD and depression symptoms across alcohol and cannabis use trajectories were assessed via mixed-model repeated-measures analysis of variance.
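As a rough stand-in for latent class mixture modeling (typically fit with specialized packages such as R's lcmm), the sketch below clusters simulated four-wave use trajectories with scikit-learn's GaussianMixture and selects the class count by BIC; the data and class structure are invented, not the study's:

```python
# Toy trajectory clustering: group participants by their repeated-measures
# substance use profile and pick the number of classes by BIC.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Simulated past-30-day use counts at baseline and weeks 2, 8, 12
low  = rng.poisson(1, (100, 4))
high = rng.poisson(12, (40, 4))
incr = rng.poisson([2, 4, 8, 12], (40, 4))
X = np.vstack([low, high, incr]).astype(float)

# Fit 1-4 class solutions and keep the one with the lowest BIC
fits = [GaussianMixture(n_components=k, random_state=0).fit(X)
        for k in range(1, 5)]
best = min(fits, key=lambda m: m.bic(X))
print(best.n_components, "classes; first assignments:", best.predict(X)[:10])
```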
Results
Three trajectory classes (low, high, and increasing use) provided the best model fit for both alcohol and cannabis use. The low alcohol use class exhibited lower PTSD symptoms at baseline than the high use class; the low cannabis use class exhibited lower PTSD and depression symptoms at baseline than the high and increasing use classes; these symptoms greatly increased at week 8 and declined at week 12. Participants already using alcohol and cannabis exhibited greater PTSD and depression symptoms at baseline, which increased at week 8 and decreased at week 12.
Conclusions
Our findings suggest that alcohol and cannabis use trajectories are associated with the intensity of posttrauma psychopathology. These findings could potentially inform the timing of therapeutic strategies.
Children with fragile X syndrome (FXS) often avoid eye contact, a behavior that is potentially related to hyperarousal. Prior studies, however, have focused on between-person associations rather than coupling of within-person changes in gaze behaviors and arousal. In addition, there is debate about whether prompts to maintain eye contact are beneficial for individuals with FXS. In a study of young females (ages 6–16), we used eye tracking to assess gaze behavior and pupil dilation during social interactions in a group with FXS (n = 32) and a developmentally similar comparison group (n = 23). Participants engaged in semi-structured conversations with a female examiner during blocks with and without verbal prompts to maintain eye contact. We identified a social–behavioral and psychophysiological profile that is specific to females with FXS; this group exhibited lower mean levels of eye contact, significantly increased mean pupil dilation during conversations that included prompts to maintain eye contact, and showed stronger positive coupling between eye contact and pupil dilation. Our findings strengthen support for the perspective that gaze aversion in FXS reflects negative reinforcement of social avoidance behavior. We also found that behavioral skills training may improve eye contact, but maintaining eye contact appears to be physiologically taxing for females with FXS.
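A minimal sketch of the within- vs. between-person distinction drawn above: person-mean centering separates the coupling of within-person changes from between-person association. Variable names and values are illustrative, not study data:

```python
# Within-person coupling via person-mean centering, contrasted with the
# between-person correlation of participant means.
import pandas as pd

df = pd.DataFrame({
    "id": [1, 1, 1, 2, 2, 2],
    "eye_contact": [0.2, 0.4, 0.3, 0.6, 0.7, 0.5],  # proportion per block
    "pupil": [5.1, 5.6, 5.3, 4.2, 4.6, 4.0],        # dilation per block
})

# Within-person: deviations from each participant's own mean
within = df.groupby("id")[["eye_contact", "pupil"]].transform(
    lambda x: x - x.mean())
print("within-person coupling:", within["eye_contact"].corr(within["pupil"]))

# Between-person: correlation of person means
means = df.groupby("id")[["eye_contact", "pupil"]].mean()
print("between-person:", means["eye_contact"].corr(means["pupil"]))
```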
OBJECTIVES/GOALS: This study proposes a pragmatic approach for tracking institutional changes in research teamwork and productivity in real time using common institutional electronic databases such as eCV and grant management systems. Dissemination of this approach could provide a standard metric for comparing teamwork productivity across different programs.
METHODS/STUDY POPULATION: This study tracks research teamwork and productivity using commonly available institutional electronic databases such as eCV and grant management systems. We tested several definitions of interdisciplinary collaboration based on the number of collaborations and their fields of discipline. Publication characteristics were compared by faculty seniority and appointment type using the non-parametric Wilcoxon Rank Sum Test (p
RESULTS/ANTICIPATED RESULTS: Interdisciplinary grants constitute 24% of all grants, but the trend has significantly increased over the last five years. Tenure-track faculty collaborated with more organizations (3.5, SD 2.5 vs 2.3, SD 1.1, p
DISCUSSION/SIGNIFICANCE: This study provides empirical evidence of the benefits of interdisciplinary collaboration in research and identifies an important role that senior faculty may play in creating a culture of interdisciplinary teamwork. More research is needed to improve the efficiency of interdisciplinary collaborations.