This study evaluated the impact of four cover crop species and their termination timings on cover crop biomass, weed control, and corn yield. A field experiment was arranged in a split-plot design in which cover crop species (wheat, cereal rye, hairy vetch, and rapeseed) were the main-plot factor and termination timing [4, 2, 1, and 0 wk before planting corn (WBP)] was the subplot factor. In both years (2021 and 2022), hairy vetch produced the most biomass (5,021 kg ha–1) among cover crop species, followed by cereal rye (4,387 kg ha–1), wheat (3,876 kg ha–1), and rapeseed (2,575 kg ha–1). Regression analysis of cover crop biomass against accumulated growing degree days (AGDDs) indicated that for every 100 AGDD increase, the biomass of cereal rye, wheat, hairy vetch, and rapeseed increased by 880, 670, 780, and 620 kg ha–1, respectively. The density of grass and small-seeded broadleaf (SSB) weeds at 4 wk after preemergence herbicide (WAPR) application varied significantly across termination timings: grass and SSB weed densities were 56% and 36% lower at 0 WBP than at 2 WBP, and 67% and 61% lower than at 4 WBP. The roller-crimper alone did not terminate rapeseed at 0 WBP and produced the lowest corn yield (3,046 kg ha–1), whereas several other combinations of cover crops and termination timings resulted in greater corn yield. In conclusion, allowing cover crops to grow longer in the spring provides more biomass for weed suppression and affects corn yield.
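As an illustration of the biomass-versus-AGDD regression described above, the following sketch fits a simple linear model and rescales the slope to kg ha–1 per 100 AGDD; the data values are hypothetical placeholders, not the study's measurements.

    # Minimal sketch: linear regression of cover crop biomass on accumulated
    # growing degree days (AGDD). All values below are hypothetical.
    import numpy as np

    agdd = np.array([400.0, 600.0, 800.0, 1000.0, 1200.0])        # AGDD at termination
    biomass = np.array([1500.0, 3000.0, 4600.0, 6200.0, 7800.0])  # kg ha-1 (made up)

    slope, intercept = np.polyfit(agdd, biomass, 1)  # least-squares line
    print(f"Biomass increase per 100 AGDD: {slope * 100:.0f} kg ha-1")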
The Australian SKA Pathfinder (ASKAP) offers powerful new capabilities for studying the polarised and magnetised Universe at radio wavelengths. In this paper, we introduce the Polarisation Sky Survey of the Universe’s Magnetism (POSSUM), a groundbreaking survey with three primary objectives: (1) to create a comprehensive Faraday rotation measure (RM) grid of up to one million compact extragalactic sources across the southern $\sim50$% of the sky (20,630 deg$^2$); (2) to map the intrinsic polarisation and RM properties of a wide range of discrete extragalactic and Galactic objects over the same area; and (3) to contribute interferometric data with excellent surface brightness sensitivity, which can be combined with single-dish data to study the diffuse Galactic interstellar medium. Observations for the full POSSUM survey commenced in May 2023 and are expected to conclude by mid-2028. POSSUM will achieve an RM grid density of around 30–50 RMs per square degree with a median measurement uncertainty of $\sim$1 rad m$^{-2}$. The survey operates primarily over a frequency range of 800–1088 MHz, with an angular resolution of 20” and a typical RMS sensitivity in Stokes Q or U of 18 $\mu$Jy beam$^{-1}$. Additionally, the survey will be supplemented by similar observations covering 1296–1440 MHz over 38% of the sky. POSSUM will enable the discovery and detailed investigation of magnetised phenomena in a wide range of cosmic environments, including the intergalactic medium and cosmic web, galaxy clusters and groups, active galactic nuclei and radio galaxies, the Magellanic System and other nearby galaxies, galaxy halos and the circumgalactic medium, and the magnetic structure of the Milky Way across a very wide range of scales, as well as the interplay between these components. This paper reviews the current science case developed by the POSSUM Collaboration and provides an overview of POSSUM’s observations, data processing, outputs, and its complementarity with other radio and multi-wavelength surveys, including future work with the SKA.
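For readers unfamiliar with rotation measures, the sketch below illustrates the underlying relation, in which the polarisation angle varies linearly with wavelength squared and the slope is the RM; it is a toy example with synthetic values over the quoted 800–1088 MHz band, not part of the POSSUM processing pipeline.

    # Toy Faraday rotation measure (RM) fit: chi = chi0 + RM * lambda^2.
    # Frequencies span the quoted 800-1088 MHz band; the RM value is made up.
    import numpy as np

    c = 2.998e8                                  # speed of light, m/s
    freqs = np.linspace(800e6, 1088e6, 36)       # channel frequencies, Hz
    lam2 = (c / freqs) ** 2                      # wavelength squared, m^2

    true_rm, chi0 = 25.0, 0.3                    # rad m^-2, rad (hypothetical source)
    chi = chi0 + true_rm * lam2                  # noiseless polarisation angles

    rm_fit, chi0_fit = np.polyfit(lam2, chi, 1)  # slope of chi vs lambda^2 recovers RM
    print(f"Recovered RM: {rm_fit:.2f} rad m^-2")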
To minimize loss of life, mass casualty response requires swift identification, efficient triage categorization, and rapid hemorrhage control. Current training methods remain suboptimal. Our objective was to train first responders to triage a mass casualty incident using Virtual Reality (VR) simulation and obtain their impressions of the training’s quality and effectiveness.
Methods
We trained subjects in SALT Triage and then had them respond to a terrorist bombing of a subway station in a fully immersive VR simulation. We used a custom electronic survey to gather learner reactions to the VR experience and the post-encounter debriefing.
Results
Nearly 400 subjects experienced the VR encounter and completed evaluation surveys. Most participants (95%) recommended the experience for other first responders and rated the simulation (95%) and virtual patients (91%) as realistic. Ninety-four percent of participants rated the VR simulator as “excellent” or “good.” We observed no differences between those who owned a personal VR system and those who did not.
Conclusions
Our VR simulator (go.osu.edu/firstresponder) is an automated, customizable, fully immersive virtual reality system for training and assessing personnel in the proper response to a mass casualty incident. Participants perceived the encounter as effective for training, regardless of their prior experience with virtual reality.
Commissioners play a central role in coordinating and planning child and adolescent mental health services (CAMHS). However, there is little research on their experiences and approaches to understanding the needs of their populations. An improved understanding is likely to benefit the translation of research into practice, both by ensuring research outputs meet the needs of key stakeholders and by optimising the sharing and use of data to improve services.
Objectives
To better understand commissioners' experiences of commissioning CAMHS and the challenges they face.
Methods
Between May and June 2023, we conducted twelve individual, semi-structured interviews with Integrated Care Board commissioners of CAMHS across England. We analysed the data using framework analysis, a qualitative method that involves systematically charting and organising data within a framework to generate themes.
Results
We generated five core themes from the data: 1) ‘Reflections on role’ – how commissioners’ roles are informed by their background and ‘positioning’ within the system in which they work; 2) ‘Priorities and tensions’ – the wider context in which commissioners work and how this may present challenges; 3) ‘Insights and evidence’ – how commissioners develop an understanding of child mental health need and the different roles of quantitative and qualitative data; 4) ‘Children’s mental health in the limelight’ – commissioners’ perceptions of changes in child mental health in their populations; and 5) ‘Responding to need’ – how commissioners are addressing the needs of their populations and the challenges they perceive.
Conclusions
CAMHS commissioners are negotiating a complex and changing political, social and economic environment with differing priorities and pressures. Commissioners draw heavily on insights from providers and their role is shifting towards managing relationships and bringing the system together. A key challenge is balancing investment in prevention/early intervention versus specialist services needed by children with more severe and complex problems.
The purpose of this study was to explore overall recovery time and Post-Concussion Symptom Scale (PCSS) scores of pediatric concussion patients who were referred to a specialty concussion clinic after enduring a protracted recovery (>28 days). This included patients who self-deferred care or received management from another provider until recovery became complicated. It was hypothesized that patients with protracted recovery who initiated care within a specialty concussion clinic would have recovery outcomes similar to those of typical acute-injury concussion patients (i.e., recovery within 3 weeks).
Participants and Methods:
Retrospective data were gathered from electronic medical records of concussion patients aged 6-19 years. Demographic data were examined based on age, gender, race, concussion history, and comorbid psychiatric diagnosis. Concussion injury data included days from injury to initial clinic visit, total visits, PCSS scores, days from injury to recovery, and days from initiating care with a specialty clinic to recovery. All participants were provided standard return-to-learn and return-to-play protocols, aerobic exercise recommendations, behavioral health recommendations, personalized vestibular/ocular motor rehabilitation exercises, and psychoeducation on the expected recovery trajectory of concussion.
Results:
Fifty-two patients were included in this exploratory analysis (mean age 14.6 years, SD 2.7; 57.7% female; 55.7% White, 21.2% Black or African American, 21.2% Hispanic). Two percent of our sample did not disclose their race or ethnicity. Prior concussion history was present in 36.5% of patients, and 23.1% had a comorbid psychiatric diagnosis. The patient referral distribution included emergency departments (36%), local pediatricians (26%), neurologists (10%), other concussion clinics (4%), and self-referrals (24%).
Given the nature of our specialty concussion clinic sample, the data were not normally distributed and were prone to skew from outliers; as such, median values and interquartile ranges were used to describe the results. Regarding recovery variables, the median time from initial injury to clinic presentation was 50.0 (IQR=33.5-75.5) days, the median PCSS score at the initial visit was 26.0 (IQR=10.0-53.0), and the median overall recovery time was 81.0 (IQR=57.0-143.3) days.
After initiating care within our specialty concussion clinic, the median additional recovery time was 21.0 (IQR=14.0-58.0) days, the median number of visits was 2.0 (IQR=2.0-3.0), and the median PCSS score at the follow-up visit was 7.0 (IQR=1.0-17.3).
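Because the skewed recovery data are summarised with medians and interquartile ranges, a minimal sketch of that computation is shown below; the day counts are hypothetical, not clinic records.

    # Minimal sketch: median and interquartile range (IQR) for skewed recovery data.
    # The values below are hypothetical placeholders.
    import numpy as np

    days_to_recovery = np.array([60, 72, 81, 95, 110, 143, 310])  # skewed by an outlier

    median = np.median(days_to_recovery)
    q1, q3 = np.percentile(days_to_recovery, [25, 75])
    print(f"Median {median:.1f} days (IQR {q1:.1f}-{q3:.1f})")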
Conclusions:
Research has shown that early referral to specialty concussion clinics may reduce recovery time and the risk of protracted recovery. Our results extend these findings, suggesting that patients with protracted recovery returned to baseline similarly to those with an acute concussion injury once specialty clinic care was initiated. This may be due to the breadth of resources within specialty concussion clinics, including tailored return-to-learn and return-to-play protocols, rehabilitation recommendations consistent with research, and home exercises that supplement recovery. Future studies should compare outcomes of protracted-recovery patients receiving care from a specialty concussion clinic against those who sought other forms of treatment. Further, evaluating the influence of comorbid factors (e.g., psychiatric and/or concussion history) on pediatric concussion recovery trajectories may be useful for future research.
During the late 1930s, the Home Owners’ Loan Corporation (HOLC) developed a series of area descriptions with color-coded maps of cities that summarized mortgage lending risk. We analyze the maps to explain the oft-noted fact that black neighborhoods overwhelmingly received the lowest rating. Our results suggest that racial bias in the construction of the HOLC maps can explain at most 4 to 20 percent of the observed concentration of black households in the lowest-rated zones. We also provide evidence that the Federal Housing Administration had its own mapping strategies when evaluating mortgages and relied relatively little on the HOLC maps.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
People with severe mental illness (SMI) have more physical health conditions than the general population, resulting in higher rates of hospitalisations and mortality. In this study, we aimed to determine the rate of emergency and planned physical health hospitalisations in those with SMI, compared to matched comparators, and to investigate how these rates differ by SMI diagnosis.
Methods
We used the Clinical Practice Research Datalink Gold and Aurum databases to identify 20,668 patients in England diagnosed with SMI between January 2000 and March 2016, with linked hospital records in Hospital Episode Statistics. Patients were matched with up to four patients without SMI. Primary outcomes were emergency and planned physical health admissions. Avoidable (ambulatory care sensitive) admissions and emergency admissions for accidents, injuries and substance misuse were secondary outcomes. We performed negative binomial regression, adjusted for clinical and demographic variables, stratified by SMI diagnosis.
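A minimal sketch of the kind of negative binomial model described above is shown below, assuming the statsmodels library, simulated data, and hypothetical variable names; it is illustrative only and not the authors' analysis code.

    # Sketch: negative binomial regression of admission counts on SMI status,
    # adjusted for age, with follow-up years as an exposure term. Simulated data.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 1000
    df = pd.DataFrame({
        "smi": rng.integers(0, 2, n),      # 1 = severe mental illness, 0 = matched comparator
        "age": rng.normal(45, 12, n),
        "years": rng.uniform(1, 16, n),    # follow-up time
    })
    mu = np.exp(-2.0 + 0.8 * df["smi"] + 0.01 * df["age"]) * df["years"]
    df["admissions"] = rng.poisson(mu)     # simulated emergency admission counts

    model = smf.glm(
        "admissions ~ smi + age", data=df,
        family=sm.families.NegativeBinomial(alpha=1.0),  # dispersion fixed for simplicity
        exposure=df["years"],
    ).fit()
    print("adjusted IRR for SMI:", round(float(np.exp(model.params["smi"])), 2))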
Results
Emergency physical health (aIRR: 2.33; 95% CI 2.22–2.46) and avoidable (aIRR: 2.88; 95% CI 2.60–3.19) admissions were higher in patients with SMI than in comparators. Emergency admission rates did not differ by SMI diagnosis. Planned physical health admissions were lower in schizophrenia (aIRR: 0.80; 95% CI 0.72–0.90) and higher in bipolar disorder (aIRR: 1.33; 95% CI 1.24–1.43). Accident, injury and substance misuse emergency admissions were particularly high in the year after SMI diagnosis (aIRR: 6.18; 95% CI 5.46–6.98).
Conclusion
We found twice the incidence of emergency physical health admissions in patients with SMI compared to those without SMI. Avoidable admissions were particularly elevated, suggesting interventions in community settings could reduce hospitalisations. Importantly, we found underutilisation of planned inpatient care in patients with schizophrenia. Interventions are required to ensure appropriate healthcare use, and optimal diagnosis and treatment of physical health conditions in people with SMI, to reduce the mortality gap due to physical illness.
Iron deficiency is associated with worse outcomes in children and adults with systolic heart failure. While oral iron replacement has been shown to be ineffective in adults with heart failure, its efficacy in children with heart failure is unknown. We hypothesised that oral iron would be ineffective in replenishing iron stores in ≥50% of children with heart failure.
Methods:
We performed a single-centre retrospective cohort study of patients aged ≤21 years with systolic heart failure and iron deficiency who received oral iron between 01/2013 and 04/2019. Iron deficiency was defined as ≥2 of the following: serum iron <50 mcg/dL, serum ferritin <20 ng/mL, transferrin >300 mg/dL, transferrin saturation <15%. Iron studies and haematologic indices pre- and post-iron therapy were compared using the paired-samples Wilcoxon test.
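The deficiency definition and the paired comparison above translate directly into code; the sketch below encodes the ≥2-of-4 criteria and runs a paired-samples Wilcoxon test with scipy, using hypothetical values rather than patient data.

    # Sketch: the stated iron-deficiency definition (>=2 of 4 criteria) and a
    # paired-samples Wilcoxon signed-rank test on pre- vs post-therapy ferritin.
    # Thresholds follow the abstract; the example values are hypothetical.
    from scipy.stats import wilcoxon

    def iron_deficient(serum_iron, ferritin, transferrin, tsat):
        """True if at least two of the four criteria are met."""
        criteria = [serum_iron < 50, ferritin < 20, transferrin > 300, tsat < 15]
        return sum(criteria) >= 2

    print(iron_deficient(serum_iron=40, ferritin=15, transferrin=280, tsat=20))  # True

    ferritin_pre = [12, 15, 9, 18, 14, 11, 16, 13]    # ng/mL before oral iron (made up)
    ferritin_post = [22, 17, 10, 30, 15, 12, 28, 19]  # ng/mL after oral iron (made up)
    stat, p_value = wilcoxon(ferritin_pre, ferritin_post)
    print(f"Wilcoxon statistic={stat:.1f}, p={p_value:.3f}")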
Results:
Fifty-one children with systolic heart failure and iron deficiency (median age 11 years, 49% female) met inclusion criteria. Heart failure aetiologies included cardiomyopathy (51%), congenital heart disease (37%), and history of heart transplantation with graft dysfunction (12%). Median dose of oral iron therapy was 2.9 mg/kg/day of elemental iron, prescribed for a median duration of 96 days. Follow-up iron testing was available for 20 patients, of whom 55% (11/20) remained iron deficient despite oral iron therapy.
Conclusions:
This is the first report on the efficacy of oral iron therapy in children with heart failure. Over half of the children with heart failure did not respond to oral iron and remained iron deficient.
We present the first Southern Hemisphere all-sky imager and radio-transient monitoring system implemented on two prototype stations of the low-frequency component of the Square Kilometre Array (SKA-Low). Since its deployment, the system has been used for real-time monitoring of the recorded commissioning data. Additionally, a transient-searching algorithm has been executed on the resulting all-sky images. It uses a difference imaging technique to enable identification of a wide variety of transient classes, ranging from human-made radio-frequency interference to genuine astrophysical events. Observations were made in a single coarse channel ($\approx$0.926 MHz) at 159.375 MHz and higher frequencies with 2 s time resolution, and multiple nights were analysed, generating thousands of images. Despite the modest sensitivity ($\sim$ few Jy beam$^{-1}$) of single-coarse-channel, 2-s imaging, the system detected multiple bright transients from PSR B0950+08, proving that it can be used to detect bright transients of astrophysical origin. The unusual, extreme activity of the pulsar PSR B0950+08 (maximum flux density $\sim$155 Jy beam$^{-1}$) was initially detected in a ‘blind’ search of the 2020 April 10/11 data and later attributed to this specific pulsar. The limitations of our data, however, prevent us from determining whether this activity was due to a combination of refractive and diffractive scintillation or to intrinsic emission mechanisms. The system can routinely collect data over many days without interruption; the large amount of data recorded at 159.375 and 229.6875 MHz allowed us to determine a preliminary transient surface density upper limit of $1.32 \times 10^{-9} \text{deg}^{-2}$ for a timescale of 2 s and a limiting flux density of 42 Jy. In the future, we plan to extend the observing bandwidth to tens of MHz and improve the time resolution to tens of milliseconds in order to increase the sensitivity and enable detections of fast radio bursts below 300 MHz.
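To illustrate the difference-imaging idea mentioned above, the sketch below subtracts a reference image from a new image and flags pixels exceeding a simple robust noise threshold; it is a toy example with simulated images, not the station monitoring pipeline.

    # Toy difference-imaging transient search: subtract a reference sky image
    # from a new image and flag pixels brighter than 5 sigma of the difference.
    import numpy as np

    rng = np.random.default_rng(1)
    reference = rng.normal(0.0, 1.0, (256, 256))              # simulated sky (arbitrary units)
    new_image = reference + rng.normal(0.0, 0.1, (256, 256))  # same sky, new noise
    new_image[100, 150] += 50.0                               # inject a bright transient

    diff = new_image - reference
    sigma = 1.4826 * np.median(np.abs(diff - np.median(diff)))  # robust (MAD) noise estimate
    candidates = np.argwhere(diff > 5 * sigma)
    print("Transient candidates (y, x):", candidates)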
Introduction: Prehospital field trauma triage (FTT) standards were reviewed and revised in 2014 based on the recommendations of the Centers for Disease Control and Prevention. The FTT standard allows a hospital bypass and direct transport, within 30 min, to a lead trauma hospital (LTH). Our objectives were to assess the impact of the newly introduced prehospital FTT standard and to describe the emergency department (ED) management and outcomes of patients who had bypassed closer hospitals. Methods: We conducted a 12-month multi-centred health record review of paramedic and ED records following the implementation of the 4-step FTT standard (step 1: vital signs and level of consciousness (physiologic), step 2: anatomical injury, step 3: mechanism, and step 4: special considerations) in nine paramedic services across Eastern Ontario. We included adult trauma patients transported as urgent who met the FTT standard, regardless of transport time. We developed and piloted a data collection tool and obtained consensus on all definitions. The primary outcome was the rate of appropriate triage to an LTH, defined as any of: ISS ≥12, admission to an intensive care unit (ICU), non-orthopedic surgery, or death. We report descriptive statistics. Results: 570 patients were included: mean age 48.8, male 68.9%, falls 29.6%, motor vehicle collisions 20.2%, stab wounds 10.5%, transported to an LTH 76.5% (n = 436). Of patients transported to an LTH, 72.2% (n = 315) had bypassed a closer hospital, and 126/306 (41.2%) of those were determined to be an appropriate triage to the LTH (9 patients had missing outcomes). ED management included: CT head/cervical spine 69.9%, ultrasound 53.6%, X-ray 51.6%, intubation 15.0%, sedation 11.1%, tranexamic acid 9.8%, blood transfusion 8.2%, fracture reduction 6.9%, and tube thoracostomy 5.9%. Outcomes included: ISS ≥12 32.7%, admission to ICU 15.0%, non-orthopedic surgery 11.1%, and death 8.8%. Other outcomes included: admission to hospital 57.5%, mean LOS 12.8 days, orthopedic surgery 16.3%, and discharge from the ED 37.3%. Conclusion: Despite a high number of admissions, the majority of trauma patients bypassed to an LTH were considered over-triaged, with low numbers of ED procedures and non-orthopedic surgeries. Continued work is needed to appropriately identify patients requiring transport to an LTH.
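The 'appropriate triage' outcome defined above (ISS ≥12, ICU admission, non-orthopedic surgery, or death) reduces to a simple boolean check; the sketch below is illustrative only, with hypothetical field names.

    # Sketch: the "appropriate triage to LTH" outcome as a boolean check.
    # Field names are hypothetical; the criteria follow the abstract.
    def appropriate_triage(iss, icu_admission, non_ortho_surgery, died):
        return iss >= 12 or icu_admission or non_ortho_surgery or died

    print(appropriate_triage(iss=14, icu_admission=False,
                             non_ortho_surgery=False, died=False))  # True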
Single nucleotide polymorphisms (SNPs) contribute small increases in risk for late-onset Alzheimer's disease (LOAD). LOAD SNPs cluster around genes with similar biological functions (pathways). Polygenic risk scores (PRS) aggregate the effect of SNPs genome-wide. However, this approach has not been widely used for SNPs within specific pathways.
Objectives
We investigated whether pathway-specific PRS were significant predictors of LOAD case/control status.
Methods
We mapped SNPs to genes within 8 pathways implicated in LOAD. For our polygenic analysis, the discovery sample comprised 13,831 LOAD cases and 29,877 controls. LOAD risk alleles for SNPs in our 8 pathways were identified at a P-value threshold of 0.5. Pathway-specific PRS were calculated in a target sample of 3,332 cases and 9,832 controls. The genetic data were pruned at R2 > 0.2 while retaining the SNPs most significantly associated with AD. We tested whether pathway-specific PRS were associated with LOAD using logistic regression, adjusting for age, sex, country, and principal components. We report the proportion of variance in liability explained by each pathway.
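A minimal sketch of a pathway-specific PRS calculation is shown below: risk-allele dosages for SNPs mapped to a pathway are weighted by their discovery-sample effect sizes, summed per individual, and tested against case/control status with logistic regression. All values and names are simulated placeholders, not the study's data or code.

    # Sketch: pathway-specific polygenic risk score (PRS) = sum over pathway SNPs
    # of (discovery effect size * risk-allele dosage), then logistic regression
    # on case/control status. All data below are simulated placeholders.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n_people, n_snps = 500, 200
    dosages = rng.integers(0, 3, size=(n_people, n_snps)).astype(float)  # 0/1/2 risk alleles
    betas = rng.normal(0.0, 0.05, n_snps)        # discovery-sample log-odds effect sizes

    prs = dosages @ betas                        # pathway PRS per individual
    prs_z = (prs - prs.mean()) / prs.std()       # standardise for interpretability
    status = rng.binomial(1, 1 / (1 + np.exp(-(-0.5 + prs_z))))  # simulated case/control labels

    fit = sm.Logit(status, sm.add_constant(prs_z)).fit(disp=0)
    print("log-odds of LOAD per SD of pathway PRS:", round(float(fit.params[1]), 3))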
Results
The most strongly associated pathways were the immune response (NSNPs = 9,304, P = 5.63 × 10−19, R2 = 0.04) and hemostasis (NSNPs = 7,832, P = 5.47 × 10−7, R2 = 0.015). Regulation of endocytosis, hematopoietic cell lineage, cholesterol transport, clathrin and protein folding were also significantly associated but accounted for less than 1% of the variance. With APOE excluded, all pathways remained significant except proteasome-ubiquitin activity and protein folding.
Conclusions
Genetic risk for LOAD can be split into contributions from different biological pathways. These offer a means to explore disease mechanisms and to stratify patients.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
The USA is the largest consumer of legal, internationally traded wildlife. A proportion of this trade consists of species listed in the Appendices of CITES and recorded in the CITES Trade Database. Using this resource, we quantified wildlife entering the USA for 82 of the most frequently recorded wildlife products and a range of taxonomic groups during 1979–2014. We examined trends in legal trade and in seizures of illegally traded items over time, and relationships between trade and four national measures of biodiversity. We found that: (1) there is an overall positive relationship between legal imports and seizures; (2) Asia was the main region exporting CITES-listed wildlife products to the USA; (3) bears, crocodilians and other mammals (i.e. other than Ursidae, Felidae, Cetacea, Proboscidea, Primates or Rhinocerotidae) increased in both reported legal trade and seizures over time; (4) legal trade in live specimens was reported to be primarily from captive-produced, artificially propagated or ranched sources, whereas traded meat was primarily wild sourced; (5) both seizures and legally traded items of felids and elephants decreased over time; and (6) volumes of both legally traded and seized species were correlated with four attributes of exporting countries: species endemism, species richness, number of IUCN threatened species, and country size. The goal of our analysis was to inform CITES decision-making and species conservation efforts.
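The country-level correlation analysis in point (6) could look roughly like the sketch below; the abstract does not state which correlation statistic was used, so Spearman rank correlation is assumed here purely for illustration, with made-up values.

    # Sketch: rank correlation between per-country trade volume and a national
    # biodiversity attribute (species richness). Values are made up; Spearman is
    # an assumption, as the abstract does not name the statistic used.
    from scipy.stats import spearmanr

    trade_volume = [120, 340, 95, 410, 60, 220, 580, 150]            # items exported to the USA
    species_richness = [900, 1200, 700, 2600, 1500, 800, 3200, 1100]

    rho, p = spearmanr(trade_volume, species_richness)
    print(f"Spearman rho={rho:.2f}, p={p:.3f}")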
Introduction: Trauma and injury contribute significantly to the population's burden of disease, yet limited research exists evaluating the role of trauma bypass protocols. The objective of this study was to assess the impact and effectiveness of a newly introduced prehospital field trauma triage (FTT) standard, which allows paramedics to bypass a closer hospital and transport directly to a trauma centre (TC) provided transport times are within 30 minutes. Methods: We conducted a 12-month multi-centred health record review of paramedic call reports and emergency department health records following the implementation of the 4-step FTT standard (step 1: vital signs and level of consciousness, step 2: anatomical injury, step 3: mechanism, and step 4: special considerations) in nine paramedic services across Eastern Ontario. We included adult trauma patients transported urgently to hospital who met one of the 4 steps of the FTT standard and were therefore eligible for bypass consideration. We developed and piloted a standardized data collection tool and obtained consensus on all data definitions. The primary outcome was the rate of appropriate triage to a TC, defined as any of the following: injury severity score ≥12, admission to an intensive care unit, non-orthopedic operation, or death. We report descriptive and univariate analyses where appropriate. Results: 570 adult patients were included, with the following characteristics: mean age 48.8, male 68.9%, attended by an Advanced Care Paramedic 71.8%; mechanisms of injury: MVC 20.2%, falls 29.6%, stab wounds 10.5%; median initial GCS 14, mean initial BP 132, prehospital fluid administered 26.8%, prehospital intubation 3.5%, transported to a TC 74.6%. Of those transported to a TC, 308 (72.5%) had bypassed a closer hospital prior to TC arrival. Of those that bypassed a closer hospital, 136 (44.2%) were determined to be an “appropriate triage to TC”. Bypassed patients more often met step 1 or step 2 of the standard (186, 66.9%) than step 3 or step 4 (122, 39.6%). An appropriate triage to TC occurred in 104 (55.9%) patients who met step 1 or 2 and 32 (26.2%) patients who met step 3 or 4 of the FTT standard. Conclusion: The FTT standard can identify patients who should be bypassed and transported to a TC, but at the cost of over-triage, potentially burdening the system. More work is needed to develop an FTT standard that will assist paramedics in appropriately identifying patients who require a trauma centre.
Recommending nitrofurantoin to treat uncomplicated cystitis was associated with an increase in nitrofurantoin use from 3.53 to 4.01 prescriptions per 1,000 outpatient visits, but nitrofurantoin resistance in E. coli isolates remained stable at 2%. Concomitant levofloxacin resistance was a significant risk factor for nitrofurantoin resistance in E. coli isolates (odds ratio [OR], 2.72; 95% confidence interval [CI], 1.04–7.17).
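For readers unfamiliar with how an odds ratio and its 95% confidence interval are obtained, the sketch below shows the standard 2×2-table calculation; the cell counts are hypothetical, since the abstract reports only the OR and CI.

    # Sketch: odds ratio and 95% CI from a hypothetical 2x2 table of levofloxacin
    # resistance (exposure) vs nitrofurantoin resistance (outcome) in E. coli.
    # Counts are made up; the abstract reports only the OR and CI.
    import math

    a, b = 8, 92      # levofloxacin-resistant: nitrofurantoin-resistant / susceptible
    c, d = 15, 485    # levofloxacin-susceptible: nitrofurantoin-resistant / susceptible

    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
    hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
    print(f"OR={odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")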
Breakthrough Listen is a 10-yr initiative to search for signatures of technologies created by extraterrestrial civilisations at radio and optical wavelengths. Here, we detail the digital data recording system deployed for Breakthrough Listen observations at the 64-m aperture CSIRO Parkes Telescope in New South Wales, Australia. The recording system currently implements two modes: a dual-polarisation, 1.125-GHz bandwidth mode for single-beam observations, and a 26-input, 308-MHz bandwidth mode for the 21-cm multibeam receiver. The system is also designed to support a 3-GHz single-beam mode for the forthcoming Parkes ultra-wideband feed. In this paper, we present details of the system architecture, provide an overview of hardware and software, and present initial performance results.
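As a back-of-envelope check on the two recording modes described above, the sketch below computes raw voltage data rates assuming Nyquist-sampled complex voltages with 8-bit quantisation per component; the bit depth, and therefore the printed rates, are assumptions rather than figures from the paper.

    # Back-of-envelope raw data rates for the two recording modes, assuming
    # complex voltages at the channel bandwidth with 8 bits per component
    # (the quantisation is an assumption, not a figure from the paper).
    BITS_PER_COMPLEX_SAMPLE = 2 * 8              # real + imaginary, 8 bits each (assumed)

    def data_rate_gbps(n_inputs, bandwidth_hz):
        """Raw voltage data rate in Gbit/s for n_inputs signal paths."""
        return n_inputs * bandwidth_hz * BITS_PER_COMPLEX_SAMPLE / 1e9

    print(f"Single-beam mode (2 pol x 1.125 GHz): {data_rate_gbps(2, 1.125e9):.1f} Gbit/s")
    print(f"Multibeam mode (26 inputs x 308 MHz): {data_rate_gbps(26, 308e6):.1f} Gbit/s")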