Functional impairment in daily activities, such as work and socializing, is part of the diagnostic criteria for major depressive disorder and most anxiety disorders. Despite evidence that symptom severity and functional impairment are partially distinct, functional impairment is often overlooked. To assess whether functional impairment captures diagnostically relevant genetic liability beyond that of symptoms, we aimed to estimate the heritability of, and genetic correlations between, key measures of current depression symptoms, anxiety symptoms, and functional impairment.
Methods
In 17,130 individuals with lifetime depression or anxiety from the Genetic Links to Anxiety and Depression (GLAD) Study, we analyzed total scores from the Patient Health Questionnaire-9 (depression symptoms), Generalized Anxiety Disorder-7 (anxiety symptoms), and Work and Social Adjustment Scale (functional impairment). Genome-wide association analyses were performed with REGENIE. Heritability was estimated using GCTA-GREML and genetic correlations with bivariate-GREML.
Results
The phenotypic correlations were moderate across the three measures (Pearson’s r = 0.50–0.69). All three scales were found to be under low but significant genetic influence (single-nucleotide polymorphism-based heritability [h2_SNP] = 0.11–0.19) with high genetic correlations between them (rg = 0.79–0.87).
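The genetic correlations reported above come from bivariate GREML, where rg is defined as the genetic covariance between two traits scaled by the product of their genetic standard deviations. A minimal sketch of that definition follows; the variance components below are invented for illustration and are not estimates from the study:

```python
import math

def genetic_correlation(cov_g: float, vg1: float, vg2: float) -> float:
    """r_g = genetic covariance / sqrt(product of the two genetic variances)."""
    return cov_g / math.sqrt(vg1 * vg2)

# Hypothetical variance components for two traits (illustrative only).
vg_symptoms, vg_impairment = 0.15, 0.12   # genetic variances
cov_g = 0.11                              # genetic covariance between them

rg = genetic_correlation(cov_g, vg_symptoms, vg_impairment)
print(round(rg, 2))  # 0.82 -- a value in the range reported above
```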
Conclusions
Among individuals with lifetime depression or anxiety from the GLAD Study, the genetic variants that underlie symptom severity largely overlap with those influencing functional impairment. This suggests that self-reported functional impairment, while clinically relevant for diagnosis and treatment outcomes, does not reflect substantial additional genetic liability beyond that captured by symptom-based measures of depression or anxiety.
Clinical guidelines for personality disorder emphasise the importance of supporting patients to develop psychological skills that help them manage their symptoms and behaviours. However, when these strategies fail and hospital admission occurs, little is known about how episodes of acutely disturbed behaviour are managed.
Aims
To explore the clinical characteristics and management of episodes of acutely disturbed behaviour requiring medication in in-patients with a diagnosis of personality disorder.
Method
Analysis of clinical audit data collected in 2024 by the Prescribing Observatory for Mental Health, as part of a quality improvement programme addressing the pharmacological management of acutely disturbed behaviour. Data were collected from clinical records using a bespoke proforma.
Results
Sixty-two mental health Trusts submitted data on 951 episodes of acutely disturbed behaviour involving patients with a personality disorder, with this being the sole psychiatric diagnosis in 471 (50%). Of the total, 782 (82%) episodes occurred in female patients. Compared with episodes in males, those in females were three times more likely to involve self-harming behaviour or to be considered to pose such a risk (70% vs. 22%; p < 0.001). Parenteral medication (rapid tranquillisation) was administered twice as often in episodes involving females as in males (64% vs. 34%; p < 0.001).
Conclusions
Our findings suggest that there are a large number of episodes of acutely disturbed behaviour on psychiatric wards in women with a diagnosis of personality disorder. These episodes are characterised by self-harm and regularly prompt the administration of rapid tranquillisation. This has potential implications for service design, staff training, and research.
Patients with posttraumatic stress disorder (PTSD) exhibit smaller brain volumes in commonly reported regions, including the amygdala and hippocampus, which are associated with fear and memory processing. In the current study, we conducted a voxel-based morphometry (VBM) meta-analysis using whole-brain statistical maps with neuroimaging data from the ENIGMA-PGC PTSD working group.
Methods
T1-weighted structural neuroimaging scans from 36 cohorts (PTSD n = 1309; controls n = 2198) were processed using a standardized VBM pipeline (ENIGMA-VBM tool). We meta-analyzed the resulting statistical maps for voxel-wise differences in gray matter (GM) and white matter (WM) volumes between PTSD patients and controls, performed subgroup analyses considering the trauma exposure of the controls, and examined associations between regional brain volumes and clinical variables including PTSD (CAPS-4/5, PCL-5) and depression severity (BDI-II, PHQ-9).
Results
PTSD patients exhibited smaller GM volumes across the frontal and temporal lobes, and cerebellum, with the most significant effect in the left cerebellum (Hedges’ g = 0.22, p_corrected = .001), and smaller cerebellar WM volume (peak Hedges’ g = 0.14, p_corrected = .008). We observed similar regional differences when comparing patients to trauma-exposed controls, suggesting these structural abnormalities may be specific to PTSD. Regression analyses revealed PTSD severity was negatively associated with GM volumes within the cerebellum (p_corrected = .003), while depression severity was negatively associated with GM volumes within the cerebellum and superior frontal gyrus in patients (p_corrected = .001).
Conclusions
PTSD patients exhibited widespread, regional differences in brain volumes where greater regional deficits appeared to reflect more severe symptoms. Our findings add to the growing literature implicating the cerebellum in PTSD psychopathology.
The impact of chronic pain and opioid use on cognitive decline and mild cognitive impairment (MCI) is unclear. We investigated these associations in early older adulthood, considering different definitions of chronic pain.
Methods:
Men in the Vietnam Era Twin Study of Aging (VETSA; n = 1,042) underwent cognitive testing and medical history interviews at average ages 56, 62, and 68. Chronic pain was defined using pain intensity and interference ratings from the SF-36 over 2 or 3 waves (categorized as mild versus moderate-to-severe). Opioid use was determined by self-reported medication use. Amnestic and non-amnestic MCI were assessed using the Jak-Bondi approach. Mixed models and Cox proportional hazards models were used to assess associations of pain and opioid use with cognitive decline and risk for MCI.
Results:
Moderate-to-severe, but not mild, chronic pain intensity (β = −.10) and interference (β = −.23) were associated with greater declines in executive function. Moderate-to-severe chronic pain intensity (HR = 1.75) and interference (HR = 3.31) were associated with a higher risk of non-amnestic MCI. Opioid use was associated with a faster decline in verbal fluency (β = −.18) and a higher risk of amnestic MCI (HR = 1.99). There were no significant interactions between chronic pain and opioid use on cognitive decline or MCI risk (all p-values > .05).
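The hazard ratios above come from Cox proportional hazards models, in which HR = exp(β) for a coefficient β on the log-hazard scale. A minimal sketch of that relationship; the back-calculated coefficient below is illustrative, derived from the reported HR rather than taken from the study's model output:

```python
import math

def hazard_ratio(beta: float) -> float:
    """In a Cox proportional hazards model, HR = exp(beta)."""
    return math.exp(beta)

# Back-calculate the log-hazard coefficient implied by a reported HR
# (illustration only; the study reports HRs, not coefficients).
hr_opioid_amnestic = 1.99
beta = math.log(hr_opioid_amnestic)   # ~0.69 on the log-hazard scale
print(round(hazard_ratio(beta), 2))   # 1.99 -- a near-doubling of the hazard
```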
Discussion:
Moderate-to-severe chronic pain intensity and interference were associated with executive function decline and a greater risk of non-amnestic MCI, while opioid use was associated with verbal fluency decline and a greater risk of amnestic MCI. Lowering chronic pain severity while reducing opioid exposure may help clinicians mitigate later cognitive decline and dementia risk.
To understand participant perspectives on an effective, practical, comprehensive telehealth intervention for persistently poorly controlled diabetes mellitus and examine how its components contributed to improved outcomes, with the goal of informing broader telehealth-based diabetes management strategies.
Methods:
We conducted semi-structured interviews of a purposive sample of patients and staff in the comprehensive telehealth arm of the Practical Telehealth to Improve Control and Engagement for Patients with Clinic-Refractory Diabetes Mellitus study. Using the lens of patient engagement, we applied directed content analysis to categorize themes across the five components of the intervention.
Results:
The purposive sample included 19 patients (79% male, 53% Black, varying levels of intervention engagement) and 8 staff. The telemonitoring component was associated with encouragement and motivation among patients; staff found satisfaction in providing metrics of success for participants. For the self-management component, patients saw staff as helpful with problem-solving; staff felt patients were receptive to education. Medication management supported medication adherence and optimization and was acceptable to patients. Diet/activity support motivated behavioral changes among patients. Staff felt that depression support allowed for responsiveness to medical and behavioral factors influencing self-management. Identified areas for improvement included staff time constraints, patient difficulties with taking and transmitting data, and challenges with patient adherence among those with mental health conditions.
Conclusion:
Findings from this study provide insights that may inform the design, implementation, and scalability of comprehensive telehealth models for diabetes management across diverse healthcare settings.
The First Large Absorption Survey in H i (FLASH) is a large-area radio survey for neutral hydrogen in and around galaxies in the intermediate redshift range $0.4\lt z\lt1.0$, using the 21-cm H i absorption line as a probe of cold neutral gas. The survey uses the ASKAP radio telescope and will cover 24,000 deg$^2$ of sky over the next five years. FLASH breaks new ground in two ways – it is the first large H i absorption survey to be carried out without any optical preselection of targets, and we use an automated Bayesian line-finding tool to search through large datasets and assign a statistical significance to potential line detections. Two Pilot Surveys, covering around 3000 deg$^2$ of sky, were carried out in 2019-22 to test and verify the strategy for the full FLASH survey. The processed data products from these Pilot Surveys (spectral-line cubes, continuum images, and catalogues) are public and available online. In this paper, we describe the FLASH spectral-line and continuum data products and discuss the quality of the H i spectra and the completeness of our automated line search. Finally, we present a set of 30 new H i absorption lines that were robustly detected in the Pilot Surveys, almost doubling the number of known H i absorption systems at $0.4\lt z\lt1$. The detected lines span a wide range in H i optical depth, including three lines with a peak optical depth $\tau\gt1$, and appear to be a mixture of intervening and associated systems. Interestingly, around two-thirds of the lines found in this untargeted sample are detected against sources with a peaked-spectrum radio continuum, which are only a minor (5–20%) fraction of the overall radio-source population. The detection rate for H i absorption lines in the Pilot Surveys (0.3 to 0.5 lines per 40 deg$^2$ ASKAP field) is a factor of two below the expected value. 
One possible reason for this is the presence of a range of spectral-line artefacts in the Pilot Survey data that have now been mitigated and are not expected to recur in the full FLASH survey. A future paper in this series will discuss the host galaxies of the H i absorption systems identified here.
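As background for the optical depths quoted above (this is the standard 21-cm relation from radio astronomy, not a formula stated in the survey description): for absorption of spin temperature $T_{\rm s}$ against a fraction $f$ of the background source, the H i column density follows from

$$N_{\rm HI} = 1.823\times10^{18}\,\frac{T_{\rm s}}{f}\int \tau(v)\,{\rm d}v\ \ {\rm cm^{-2}},$$

with $T_{\rm s}$ in K and the velocity integral in km s$^{-1}$. Lines with peak $\tau\gt1$ therefore imply large columns of neutral gas unless the spin temperature is low.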
Specimens of Tulaneia amabilia Runnegar and Horodyski n. gen. n. sp. (previously Ernietta plateauensis Pflug) discovered by RJH in 1991 at a site in the Montgomery Mountains near Johnnie, Nevada, are described for the first time. All of the material from the original locality was from float, but its stratigraphic position within the lowest siliciclastic to dolostone interval of the lower member of the Wood Canyon Formation (LMWCF) was confirmed by subsequent discoveries. Because the upper part of the LMWCF contains Treptichnus pedum (Seilacher), the Ediacaran–Cambrian boundary has long been drawn at its first appearance. However, in the Esmeralda Member of the Deep Spring Formation in the White-Inyo Mountains, California, and at Mount Dunfee, Nevada, another Cambrian ichnofossil, ‘Plagiogmus’, which is now Psammichnites gigas arcuatus (Roedel), is found just beneath the nadir of the basal Cambrian isotope excursion (BACE). Because the nadir of the BACE excursion is older than ca. 539 Ma in Mexico, the oldest occurrences of Treptichnus pedum in the LMWCF are latest—not earliest—Fortunian in age, and there is no need to reduce the age of the eon boundary from ca. 539 to ca. 533 Ma. Tulaneia resembles Ernietta and other erniettomorphs in being composed of tubular modules with planar common surfaces, but its overall shape was tabular and unidirectional rather than sack or frond shaped. We also illustrate and briefly describe other trace and body fossils from the LMWCF and re-illustrate previously published specimens of Psammichnites gigas arcuatus in order to document its earliest occurrence in the Great Basin.
We examined the association between influenza vaccination policies at acute care hospitals and influenza vaccination coverage among healthcare personnel for the 2021–22 influenza season. Mandatory vaccination and masking for unvaccinated personnel were associated with increased odds of vaccination. Hospital employees had higher vaccination coverage than licensed independent practitioners.
Resilience of the healthcare system has been described as the ability to absorb, adapt, and respond to stress while maintaining the provision of safe patient care. We quantified the impact that stressors associated with the COVID-19 pandemic had on patient safety, as measured by central line-associated bloodstream infections (CLABSIs) reported to the Centers for Disease Control and Prevention’s National Healthcare Safety Network.
Design:
Acute care hospitals were mandated to report markers of resource availability (staffing and hospital occupancy with COVID-19 inpatients) to the federal government between July 2020 and June 2021. These data were used with community levels of COVID-19 to develop a statistical model to assess factors influencing rates of CLABSIs among inpatients during the pandemic.
Results:
After risk adjustment for hospital characteristics, measured stressors were associated with increased CLABSIs. Staff shortages for more than 10% of days per month were associated with a statistically significant increase of 2 CLABSIs per 10,000 central line days versus hospitals reporting staff shortages of less than 10% of days per month. CLABSIs increased with a higher inpatient COVID-19 occupancy rate; when COVID-19 occupancy was 20% or more, there were 5 more CLABSIs per 10,000 central line days versus the referent (less than 5%).
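The infection rates above are expressed per 10,000 central line days, the standard denominator for device-associated infection surveillance. A minimal sketch of that normalization; the counts below are invented for illustration, not drawn from the study:

```python
def clabsi_rate(infections: int, central_line_days: int) -> float:
    """CLABSIs per 10,000 central line days."""
    return infections / central_line_days * 10_000

# Hypothetical reporting period: 12 infections over 48,000 central line days.
print(round(clabsi_rate(12, 48_000), 2))  # 2.5
```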
Conclusions:
Reporting of data pertaining to hospital operations during the COVID-19 pandemic afforded an opportunity to evaluate resilience of US hospitals. We demonstrate how the stressors of staffing shortages and high numbers of patients with COVID-19 negatively impacted patient safety, demonstrating poor resilience. Understanding stress in hospitals may allow for the development of policies that support resilience and drive safe care.
Psychopathology assessed across the lifespan can often be summarized with a few broad dimensions: internalizing, externalizing, and psychosis/thought disorder. Extensive overlap between internalizing and externalizing symptoms has garnered interest in bifactor models comprising a general co-occurring factor and specific internalizing and externalizing factors. We focus on internalizing and externalizing symptoms and compare a bifactor model to a correlated two-factor model of psychopathology at three timepoints in a large adolescent community sample (N = 387; 55% female; 83% Caucasian; M age = 12.1 at wave 1) using self- and parent-reports. Each model was tested within each timepoint with 25–28 validators. The bifactor models demonstrated better fit to the data. Child report had stronger invariance across time. Parent report had stronger reliability over time. Cross-informant correlations between the factors at each wave indicated that the bifactor model had slightly poorer convergent validity but stronger discriminant validity than the two-factor model. With notable exceptions, this pattern of results replicated across informants and waves. The overlap between internalizing and externalizing pathology is systematically and, sometimes, non-linearly related to risk factors and maladaptive outcomes. Strengths and weaknesses of modeling psychopathology as two or three factors and clinical and developmental design implications are discussed.
Recent stressful life events (SLEs) are an established risk factor for a range of psychiatric disorders. Animal studies have shown evidence of gray matter (GM) reductions associated with stress, and previous work has found similar associations in humans. However, longitudinal studies investigating the association between stress and changes in brain structure are limited.
Methods
The current study uses longitudinal data from the UK Biobank and comprises 4,543 participants with structural neuroimaging and recent SLE data (mean age = 61.5 years). We analyzed the association between recent SLEs and changes in brain structure, determined using the longitudinal FreeSurfer pipeline, focusing on total GM volume and five a priori brain regions: the hippocampus, amygdala, anterior cingulate cortex, orbitofrontal cortex, and insula. We also examined if depression and childhood adversity moderated the relationship between SLEs and brain structure.
Results
Individuals who had experienced recent SLEs exhibited a slower rate of hippocampal decrease over time compared to individuals who did not report any SLEs. Individuals with depression exhibited smaller GM volumes when exposed to recent SLEs. There was no effect of childhood adversity on the relationship between SLEs and brain structure.
Conclusions
Our findings suggest recent SLEs are not directly associated with an accelerated decline in brain volumes in a population sample of older adults, but instead may alter brain structure via affective disorder psychopathology. Further work is needed to investigate the effects of stress in younger populations who may be more vulnerable to stress-induced changes, and may yet pinpoint brain regions linked to stress-related disorders.
Numerous annual and perennial weeds infest sugarcane. End-season weed infestations are managed before sugarcane is replanted by fallowing (cultivation and sequential glyphosate applications) or by rotating to glyphosate-tolerant soybean in Louisiana. With the occurrence of perennial grasses and glyphosate-resistant weeds, growers need to utilize alternative late POST (LPOST) herbicide programs in soybean to reduce weed infestations in newly planted sugarcane (soybean-sugarcane rotation). Current rotational restrictions limit the use of acifluorfen, clethodim, fomesafen, and quizalofop to control troublesome weeds before soybean harvest and the subsequent planting of sugarcane. However, there is a lack of information on the carryover effects of these soybean herbicides on newly planted sugarcane. Field experiments were conducted at Schriever, LA, and St. Gabriel, LA, in 2017 to 2018 and in 2020 to 2021 to determine sugarcane injury and yield component response to herbicides labeled for LPOST applications in soybean, including acifluorfen, clethodim, fomesafen, lactofen, and quizalofop, applied at the field-use rates (1X) 45 d prior to or immediately after sugarcane planting. Separate field experiments were conducted at those two locations in Louisiana in 2018 to 2019 and in 2020 to 2021 to determine sugarcane injury and yield component response to five rates of fomesafen applied immediately after sugarcane planting. Results of the herbicide screening experiment showed no reductions in sugarcane shoot and stalk population, stalk height, sugarcane yield, sucrose content, or sucrose yield from the selected herbicides at either application timing. Fomesafen applied at 790 (2X) and 1,580 (4X) g ha−1 resulted in 7% and 13% average visible injury to sugarcane at 27 d after treatment (DAT), respectively; injury symptoms persisted until 62 DAT. 
Transient injury observed at 62 DAT did not correspond to reduced sugarcane stalk population, height, sucrose content, sugarcane yield, or sucrose yield. This research indicates a potentially low risk of carryover and yield loss in newly planted sugarcane from late-season applications of selected soybean herbicides.
This study sought to assess undergraduate students’ knowledge and attitudes surrounding perceived self-efficacy and threats in various common emergencies in communities of higher education.
Methods
Self-reported perceptions of knowledge and skills, as well as attitudes and beliefs regarding education and training, obligation to respond, safety, psychological readiness, efficacy, personal preparedness, and willingness to respond were investigated through 3 representative scenarios via a web-based survey.
Results
Among 970 respondents, approximately 60% reported their university had adequately prepared them for various emergencies while 84% reported the university should provide such training. Respondents with high self-efficacy were significantly more likely than those with low self-efficacy to be willing to respond in whatever capacity needed across all scenarios.
Conclusions
There is a gap between perceived student preparedness for emergencies and training received. Students with high self-efficacy were the most likely to be willing to respond, which may be useful for future training initiatives.
Recent stressful life events (SLEs) are a risk factor for psychosis, but limited research has explored how SLEs affect individuals at clinical high risk (CHR) for psychosis. The current study investigated the longitudinal effects of SLEs on functioning and symptom severity in CHR individuals; we hypothesized that CHR individuals would report more SLEs than healthy controls (HC) and that SLEs would be associated with poorer outcomes.
Methods
The study used longitudinal data from the EU-GEI High Risk study. Data from 331 CHR participants were analyzed to examine the effects of SLEs on changes in functioning, positive and negative symptoms over a 2-year follow-up. We compared the prevalence of SLEs between CHR and HCs, and between CHR who did (CHR-T) and did not (CHR-NT) transition to psychosis.
Results
CHR reported 1.44 more SLEs than HC (p < 0.001), but there was no difference in SLEs between CHR-T and CHR-NT at baseline. Recent SLEs were associated with poorer functioning and more severe positive and negative symptoms in CHR individuals (all p < 0.01), but there was no significant interaction with time.
Conclusions
CHR individuals who had experienced recent SLEs exhibited poorer functioning and more severe symptoms. However, as the interaction between SLEs and time was not significant, this suggests SLEs did not contribute to a worsening of symptoms and functioning over the study period. SLEs could be a key risk factor for becoming CHR for psychosis; however, further work is required to determine when early intervention strategies mitigating the effects of stress are most effective.
This study aimed to understand the current landscape of USA-based disaster medicine (DM) programs through the lens of alumni and program directors (PDs). The data obtained from this study will provide valuable information to future learners as they ponder careers in disaster medicine and allow PDs to refine curricular offerings.
Methods
Two separate surveys were sent to USA-based DM program directors and alumni. The surveys gathered information regarding current training characteristics, career trajectories, and the outlook of DM training.
Results
The study had a 57% response rate among PDs and a 42% response rate among alumni. Most programs are 1 year in length and accept 1–2 fellows per class. More than 60% of the programs offer additional advanced degrees. Half of the respondents accept international medical graduates (IMGs). Only 25% accept applicants without MD/DO/MBBS training. Most of the alumni hold academic and governmental positions post-training. Furthermore, many alumni report that fellowship training offered an advantage in the job market and allowed them to expand their clinical practice.
Conclusions
The field of disaster medicine is continuously evolving owing to the increased recognition of the important roles DM specialists play in healthcare. The fellowship training programs are experiencing a similar evolution with an increasing trend toward standardization. Furthermore, graduates from these programs see their training as a worthwhile investment in career opportunities.
In practice, nondestructive testing (NDT) procedures tend to consider experiments (and their respective models) as distinct, conducted in isolation, and associated with independent data. In contrast, this work looks to capture the interdependencies between acoustic emission (AE) experiments (as meta-models) and then use the resulting functions to predict the model hyperparameters for previously unobserved systems. We utilize a Bayesian multilevel approach (similar to deep Gaussian Processes) where a higher-level meta-model captures the inter-task relationships. Our key contribution is how knowledge of the experimental campaign can be encoded between tasks as well as within tasks. We present an example of AE time-of-arrival mapping for source localization, to illustrate how multilevel models naturally lend themselves to representing aggregate systems in engineering. We constrain the meta-model based on domain knowledge, then use the inter-task functions for transfer learning, predicting hyperparameters for models of previously unobserved experiments (for a specific design).
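The multilevel idea described here, in which task-level hyperparameters are tied together through a higher-level model, can be sketched with simple partial pooling: per-experiment estimates are shrunk toward a group-level mean, and that group-level quantity is the natural prediction for an unseen experiment. The shrinkage weight and data below are invented for illustration; a full treatment would place Gaussian-process priors at both levels rather than a scalar mean:

```python
from statistics import mean

def partial_pool(task_estimates, weight=0.5):
    """Shrink each task's hyperparameter estimate toward the group mean.

    weight=1 keeps the raw per-task estimates (no pooling);
    weight=0 collapses every task onto the group mean (complete pooling).
    """
    group_mean = mean(task_estimates)
    pooled = [weight * est + (1 - weight) * group_mean for est in task_estimates]
    # For a previously unobserved task, the best guess before seeing its own
    # data is the group-level mean learned from the other experiments.
    return pooled, group_mean

# Hypothetical length-scale estimates from three AE experiments.
pooled, new_task_prediction = partial_pool([2.0, 3.0, 4.0], weight=0.5)
print(pooled, new_task_prediction)  # [2.5, 3.0, 3.5] 3.0
```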
Group A streptococcal or Streptococcus pyogenes infections have been increasing post-COVID-19 pandemic. We describe the epidemiology of S. pyogenes pharyngitis and invasive disease in Alberta, Canada, from 2018 to 2023. Positive pharyngitis specimens were identified from throat swabs collected from pharyngitis patients. Invasive S. pyogenes was defined as the isolation of S. pyogenes from a normally sterile site or severe skin infection. S. pyogenes isolates were emm typed. Pharyngitis and invasive disease displayed seasonal trends preceding the COVID-19 pandemic followed by a sharp decrease during COVID-19 intervention measures. After the lifting of interventions, rates of pharyngitis and invasive disease rose. There were 182 983 positive pharyngitis specimens between 2018 and 2023 for a positivity rate of 17.6%. The highest rates occurred in the 0–9 age group in 2023 (41.5%). Invasive disease increased in 2022–2023, driven by emm1 and emm12 types. M1UK strain was the most frequent M1 type associated with invasive disease (59% of M1 isolates sequenced). Notably, out of 182 983 pharyngitis cases, there were 111 cases of invasive S. pyogenes detected for an invasive disease rate of 0.06%. This descriptive epidemiology of S. pyogenes pharyngitis and invasive S. pyogenes disease highlights the rapid increase in cases of S. pyogenes occurring in western Canada and illustrates the critical need for a vaccine.
This trial assessed the effect of preemergence herbicides on newly transplanted blackberries. A 2-yr field trial was initiated in 2021 and conducted at two locations in Fayetteville and Clarksville, AR. Seven treatments consisted of six preemergence herbicides (flumioxazin, mesotrione, napropamide, oryzalin, pendimethalin, and S-metolachlor) and one nontreated check. Preemergence herbicide treatments were applied to field plots of newly transplanted blackberry plugs (‘Ouachita’), using a CO2 backpack sprayer at 187 L ha−1 covering a 1-m swath, ensuring spray pattern overlap over newly planted blackberries in 2021 and reapplied in the same manner to established blackberries of the same plots in 2022. Data were collected on crop injury and plant height of blackberry plants in each plot. Yield data were collected in the second year, and fruit were analyzed for soluble solids content, pH, and average berry weight. In the first year, mesotrione and flumioxazin treatments caused injury to newly transplanted blackberries, and mesotrione-treated blackberries (58% in Fayetteville, 29% in Clarksville) did not fully recover by 84 d after treatment (DAT). Napropamide, S-metolachlor, oryzalin, and pendimethalin did not cause crop injury greater than 6% throughout the 2021 season. In the second year (2022), no crop injury was caused by any herbicide treatments. Results from these trials verify that flumioxazin, napropamide, oryzalin, and pendimethalin at the tested rates would be appropriate options for weed control in newly planted blackberries. These results corroborate regional recommendations against the use of mesotrione in first-year blackberry plantings. The findings from this trial indicate that S-metolachlor would be a safe candidate for registration for use on blackberries, given the limited crop injury and lack of yield reduction observed.
The emergence of vascular plants, such as Cooksonia, had a profound impact on Earth’s Early Paleozoic biogeochemical cycles (e.g. atmospheric oxygen, nitrogen and CO2), potentially triggering global environmental and biological changes. However, the timing of Cooksonia’s terrestrial emergence remains elusive as phylogenetic models, microfossils and macrofossils provide different timings for land colonization by vascular plants. Here, hundreds of zircon grains from three siltstones were dated using Laser Ablation-Inductively Coupled Plasma-Mass Spectrometry (LA-ICP-MS). The study presents detrital zircon U-Pb dates, which refine the current biostratigraphy ages assigned to Cooksonia macrofossils from the three oldest sites globally. Specifically, siltstones hosting Cooksonia macrofossils from Borrisnoe Mountain (Ireland) and Capel Horeb (Wales) yield Gorstian–Homerian maximum depositional ages (MDAs) of 426 ± 2 Ma and 427 ± 2 Ma, respectively. Additionally, Cwm Graig Ddu (Wales) yields a (Pridoli–Ludlow) maximum age of 423 ± 3 Ma. The findings provide the first detrital zircon U-Pb dates for the oldest Cooksonia macrofossils globally and contribute crucial maximum ages. These maximum ages are instrumental in refining future calibrations of molecular clocks and improving phylogenetic models, thus contributing significantly to a better understanding of Cooksonia’s evolutionary history, including its environmental and ecological impacts.
In response to the COVID-19 pandemic, we rapidly implemented a plasma coordination center within two months to support transfusion for two outpatient randomized controlled trials. The center design was based on an investigational drug services model and a Food and Drug Administration-compliant database to manage blood product inventory and trial safety.
Methods:
A core investigational team adapted a cloud-based platform to randomize patient assignments and track inventory distribution of control plasma and high-titer COVID-19 convalescent plasma of different blood groups from 29 donor collection centers directly to blood banks serving 26 transfusion sites.
Results:
We performed 1,351 transfusions in 16 months. The transparency of the digital inventory at each site was critical to facilitate qualification, randomization, and overnight shipments of blood group-compatible plasma for transfusions into trial participants. While inventory challenges were heightened with COVID-19 convalescent plasma, the cloud-based system and the flexible approach of the plasma coordination center staff across the blood bank network enabled decentralized procurement and distribution of investigational products to maintain inventory thresholds and overcome local supply chain restraints at the sites.
Conclusion:
The rapid creation of a plasma coordination center for outpatient transfusions is infrequent in the academic setting. Distributing more than 3,100 plasma units to blood banks charged with managing investigational inventory across the U.S. in a decentralized manner posed operational and regulatory challenges while providing opportunities for the plasma coordination center to contribute to research of global importance. This program can serve as a template in subsequent public health emergencies.