Foliar-applied postemergence applications of glufosinate are often made to glufosinate-resistant crops to provide nonselective weed control without significant crop injury. Rainfall, air temperature, solar radiation, and relative humidity near the time of application have been reported to affect glufosinate efficacy. However, previous research may not have captured the full range of weather variability to which glufosinate may be exposed before or following application. Additionally, climate models suggest more extreme weather will become the norm, further expanding the weather range to which glufosinate can be exposed. The objective of this research was to quantify the probability of successful weed control (efficacy ≥85%) with glufosinate applied to key weed species across a broad range of weather conditions. A database of >10,000 North American herbicide evaluation trials was used in this study. The database was filtered to include treatments with a single postemergence application of glufosinate applied to waterhemp [Amaranthus tuberculatus (Moq.) Sauer], morningglory species (Ipomoea spp.), and/or giant foxtail (Setaria faberi Herrm.) <15 cm in height. These species were chosen because they are well represented in the database and listed as common and troublesome weed species in both corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] (Van Wychen 2020, 2022). Individual random forest models were created for each weed species. Low rainfall (≤20 mm) over the 5 d before glufosinate application was detrimental to the probability of successful control of A. tuberculatus and S. faberi. Lower relative humidity (≤70%) and solar radiation (≤23 MJ m⁻² d⁻¹) on the day of application reduced the probability of successful weed control in most cases. Additionally, the probability of successful control decreased for all species when the average air temperature over the first 5 d after application was ≤25 C. As the climate continues to change and become more variable, the risk of unacceptable control of several common weed species with glufosinate is likely to increase.
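The modelling pipeline described above can be sketched as follows. This is a minimal illustration, not the study's code: the file name and column names (efficacy_pct, species, and the four weather covariates) are hypothetical stand-ins for the herbicide-evaluation database.

```python
# Minimal sketch of the approach described above: one random forest per
# weed species, predicting whether glufosinate reaches >=85% control from
# weather covariates. File and column names are hypothetical stand-ins.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

trials = pd.read_csv("glufosinate_trials.csv")  # hypothetical extract
trials["success"] = (trials["efficacy_pct"] >= 85).astype(int)

weather = ["rain_5d_before_mm", "rh_day_of_pct",
           "solar_day_of_mj_m2", "temp_5d_after_c"]

for species, grp in trials.groupby("species"):
    X_train, X_test, y_train, y_test = train_test_split(
        grp[weather], grp["success"], test_size=0.25, random_state=0)
    rf = RandomForestClassifier(n_estimators=500, random_state=0)
    rf.fit(X_train, y_train)
    # Probability of successful control for each held-out trial
    p_success = rf.predict_proba(X_test)[:, 1]
    print(species, round(p_success.mean(), 2))
```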
This paper discusses the application of the principles of factorial design to an experiment in psychology. For the purpose of illustrating the principles, a simple experiment was designed dealing with the determination of the differential limen values of subjects for weights increasing at constant rates. The factorial design was of the type: 4 rates × 7 weights × 2 sexes × 2 sights × 2 dates. The appropriate statistical analysis for this type of design is the analysis of variance. The mathematical formulation of the problem was specified and the appropriate solution for the specific problem was obtained. Greater precision results from this type of design, in comparison with the traditional psychological experiment dealing with a single factor, in that measures are obtained of the effect of each of a number of factors together with their interactions.
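A design of this type maps directly onto a standard factorial analysis of variance. The sketch below is a hedged illustration in modern software; the response dl (differential limen) and the factor column names are hypothetical labels for the design described above.

```python
# Hedged sketch of the factorial analysis described above: a full
# 4 x 7 x 2 x 2 x 2 ANOVA fit with statsmodels. Names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

data = pd.read_csv("limen_experiment.csv")  # hypothetical file
model = smf.ols(
    "dl ~ C(rate) * C(weight) * C(sex) * C(sight) * C(date)",
    data=data).fit()
print(anova_lm(model, typ=2))  # F tests for main effects and interactions
```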
This is the report of the application of the principles of factorial design to an investigation of individual educational development. The specific type of factorial design formulated was a 2 × 3 × 3 × 3 arrangement, that is, the effect of sex, grade location, scholastic standing, and individual order, singly and in all possible combinations was studied in relation to educational development as measured by the Iowa Tests of Educational Development. An application of the covariance method was introduced which resulted in increased precision of this type of experimental design by significantly reducing experimental error. The two concomitant measures used to increase the sensitiveness of the experiment were initial status of individual development and mental age. Without these statistical controls all main effects and two first-order interactions would have been accepted as significant. With their use only sex (doubtful), scholastic standing, and individual order demonstrated significant effects. The chief beauty of the analysis of variance and covariance as an integral part of a self-contained experiment is demonstrated in the complete single analysis of the data. The statistical utilization of the experimental results has also been developed for purposes of estimation and prediction. The mathematical statistician is being continuously required to develop and analyze experimental designs of increasing complexity since the introduction of the analysis of variance and covariance. The mathematical formulation and solution of the problem of this investigation is carried out. The methods illustrated and explained in this study, and modifications and extensions of them are capable of very wide application. The general principles can be used to various degrees and in a number of ways.
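The covariance adjustment described above can be sketched in the same modern form: the two concomitant measures enter as continuous covariates alongside the factorial terms. A hedged illustration only; the variable names are hypothetical, and the study's original computations long predate this software.

```python
# Hedged sketch of the analysis of covariance described above: the two
# concomitant measures (initial status, mental age) as covariates
# alongside the 2 x 3 x 3 x 3 factorial terms. Names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

data = pd.read_csv("ited_scores.csv")  # hypothetical file
model = smf.ols(
    "ited_final ~ initial_status + mental_age"
    " + C(sex) * C(grade) * C(standing) * C(order)",
    data=data).fit()
print(anova_lm(model, typ=2))  # covariate-adjusted F tests
```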
The theoretical basis for the Johnson-Neyman Technique is here presented for the first time in an American journal. In addition, a simplified working procedure is outlined, step-by-step, for an actual problem. The determination of significance is arrived at early in the analysis; and where no significant difference is found, the problem is complete at this point. The plotting of the region of significance where a significant difference does exist has also been simplified by using the procedure of rotation and translation of axes.
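In modern regression terms, the Johnson-Neyman idea is to solve for the covariate (or moderator) values at which a group difference or simple slope crosses the significance threshold. The sketch below illustrates that boundary calculation for a simple interaction model; it conveys the general idea under assumed variable names (y, x, m) and is not the paper's original two-covariate working procedure.

```python
# A modern sketch of the Johnson-Neyman idea for a model y ~ x * m:
# find the moderator values m where the simple slope of x crosses
# significance, i.e. where
# (b_x + b_xm*m)^2 = t_crit^2 * Var(b_x + b_xm*m), a quadratic in m.
# Variable names and the input file are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

df = pd.read_csv("study.csv")  # hypothetical: columns y, x, m
fit = smf.ols("y ~ x * m", data=df).fit()
b, V = fit.params, fit.cov_params()
t2 = stats.t.ppf(0.975, fit.df_resid) ** 2

a = b["x:m"] ** 2 - t2 * V.loc["x:m", "x:m"]
bq = 2 * (b["x"] * b["x:m"] - t2 * V.loc["x", "x:m"])
c = b["x"] ** 2 - t2 * V.loc["x", "x"]
disc = bq ** 2 - 4 * a * c

if disc >= 0:
    roots = sorted([(-bq - np.sqrt(disc)) / (2 * a),
                    (-bq + np.sqrt(disc)) / (2 * a)])
    print("Region-of-significance boundaries:", roots)
else:
    print("Significance of the slope never changes over m.")
```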
The global population and status of Snowy Owls Bubo scandiacus are particularly challenging to assess because individuals are irruptive and nomadic, and the breeding range is restricted to the remote circumpolar Arctic tundra. The International Union for Conservation of Nature (IUCN) uplisted the Snowy Owl to “Vulnerable” in 2017 because the suggested population estimates appeared considerably lower than historical estimates, and it recommended actions to clarify the population size, structure, and trends. Here we present a broad review and status assessment, an effort led by the International Snowy Owl Working Group (ISOWG) and researchers from around the world, to estimate population trends and the current global status of the Snowy Owl. We use long-term breeding data, genetic studies, satellite-GPS tracking, and survival estimates to assess current population trends at several monitoring sites in the Arctic and we review the ecology and threats throughout the Snowy Owl range. An assessment of the available data suggests that current estimates of a worldwide population of 14,000–28,000 breeding adults are plausible. Our assessment of population trends at five long-term monitoring sites suggests that breeding populations of Snowy Owls in the Arctic have decreased by more than 30% over the past three generations and the species should continue to be categorised as Vulnerable under the IUCN Red List Criterion A2. We offer research recommendations to improve our understanding of Snowy Owl biology and future population assessments in a changing world.
To understand healthcare workers’ (HCWs) beliefs and practices toward blood culture (BCx) use.
Design:
Cross-sectional electronic survey and semi-structured interviews.
Setting:
Academic hospitals in the United States.
Participants:
HCWs involved in BCx ordering and collection in adult intensive care units (ICU) and wards.
Methods:
We administered an anonymous electronic survey to HCWs and conducted semi-structured interviews with unit staff and quality improvement (QI) leaders in these institutions to understand their perspectives regarding BCx stewardship between February and November 2023.
Results:
Of 314 HCWs who responded to the survey, most were physicians (67.4%) and most were involved in BCx ordering (82.3%). Most survey respondents reported that clinicians had a low threshold to culture patients for fever (84.4%) and agreed they could safely reduce the number of BCx obtained in their units (65%). However, only half believed BCx was overused. Although most made BCx decisions as a team (74.1%), fewer than half reported that these team discussions occurred daily (42.4%). A third of respondents reported not usually collecting the correct volume per BCx bottle, half were unaware of the improved sensitivity of 2 BCx sets, and most were unsure of the nationally recommended BCx contamination threshold (87.5%). Knowledge regarding the utility of BCx for common infections was limited.
Conclusions:
HCWs’ understanding of best collection practices and yield of BCx was limited.
The spin-down law of pulsars is generally perturbed by two types of timing irregularities: glitches and timing noise. Glitches are sudden changes in the rotational frequency of a pulsar, while timing noise is a discernible stochastic wandering in the phase, period, or spin-down rate of a pulsar. We present the timing results of a sample of glitching pulsars observed using the Ooty Radio Telescope (ORT) and the upgraded Giant Metrewave Radio Telescope (uGMRT). Our findings include timing noise analysis for 17 pulsars, with seven being reported for the first time. We detected five glitches in four pulsars and a glitch-like event in PSR J1825–0935. The frequency evolution of glitches in the pulsars J0742–2822 and J1740–3015 is presented for the first time. Additionally, we report timing noise results for three glitching pulsars. The timing noise was analysed separately in the pre-glitch and post-glitch regions. We observed an increase in the red noise parameters in the post-glitch regions, where exponential recovery was considered in the noise analysis. Timing noise can introduce ambiguities in the correct evaluation of glitch observations; hence, it is important to account for timing noise in glitch analysis. We propose a glitch verification approach designed to distinguish a genuine glitch from strong timing noise, and we demonstrate this technique on the observed data.
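For context, glitch analyses commonly fit a timing model in which the glitch adds permanent and exponentially recovering terms to the pre-glitch spin evolution. The parameterization below is the standard one used in pulsar-timing packages such as TEMPO2; the paper's exact model may differ.

```latex
% Standard glitch timing model (a common parameterization, as in
% TEMPO2-style fits); the paper's exact model may differ.
\phi(t) = \phi_0 + \nu_0 (t - t_0) + \tfrac{1}{2} \dot{\nu}_0 (t - t_0)^2 ,
\qquad t > t_g:\;
\Delta\phi(t) = \Delta\nu_p (t - t_g)
  + \tfrac{1}{2} \Delta\dot{\nu}_p (t - t_g)^2
  + \Delta\nu_d \,\tau_d \left[ 1 - e^{-(t - t_g)/\tau_d} \right]
```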
The association between cannabis and psychosis is established, but the role of underlying genetics is unclear. We used data from the EU-GEI case-control study and UK Biobank to examine the independent and combined effect of heavy cannabis use and schizophrenia polygenic risk score (PRS) on risk for psychosis.
Methods
Genome-wide association study summary statistics from the Psychiatric Genomics Consortium and the Genomic Psychiatry Cohort were used to calculate schizophrenia and cannabis use disorder (CUD) PRS for 1,098 participants from the EU-GEI study and 143,600 from the UK Biobank. Both datasets had information on cannabis use.
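At its core, a PRS is a weighted sum of risk-allele dosages, with weights drawn from GWAS summary statistics. The sketch below shows that computation in minimal form; the file and column names are hypothetical, and the usual quality control, clumping, and p-value thresholding steps are omitted.

```python
# Hedged sketch of a PRS computation: a weighted sum of risk-allele
# dosages, with weights from GWAS summary statistics. QC, clumping, and
# thresholding are omitted; file/column names are hypothetical.
import pandas as pd

weights = pd.read_csv("scz_gwas_weights.csv")   # columns: snp, beta
dosages = pd.read_csv("genotype_dosages.csv")   # columns: sample, snp, dosage

merged = dosages.merge(weights, on="snp")
merged["contrib"] = merged["dosage"] * merged["beta"]
prs = merged.groupby("sample")["contrib"].sum()  # one score per participant
print(prs.head())
```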
Results
In both samples, schizophrenia PRS and cannabis use independently increased risk of psychosis. Schizophrenia PRS was not associated with patterns of cannabis use in the EU-GEI cases or controls or UK Biobank cases. It was associated with lifetime and daily cannabis use among UK Biobank participants without psychosis, but the effect was substantially reduced when CUD PRS was included in the model. In the EU-GEI sample, regular users of high-potency cannabis had the highest odds of being a case independently of schizophrenia PRS (adjusted OR for daily use of high-potency cannabis = 5.09, 95% CI 3.08–8.43, p = 3.21 × 10⁻¹⁰). We found no evidence of interaction between schizophrenia PRS and patterns of cannabis use.
Conclusions
Regular use of high-potency cannabis remains a strong predictor of psychotic disorder independently of schizophrenia PRS, which does not seem to be associated with heavy cannabis use. These are important findings at a time of increasing use and potency of cannabis worldwide.
Foliar-applied postemergence herbicides are a critical component of corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] weed management programs in North America. Rainfall and air temperature around the time of application may affect the efficacy of herbicides applied postemergence in corn or soybean production fields. However, previous research utilized a limited number of site-years and may not capture the range of rainfall and air temperatures that these herbicides are exposed to throughout North America. The objective of this research was to model the probability of achieving successful weed control (≥85%) with commonly applied postemergence herbicides across a broad range of environments. A large database of more than 10,000 individual herbicide evaluation field trials conducted throughout North America was used in this study. The database was filtered to include only trials with a single postemergence application of fomesafen, glyphosate, mesotrione, or fomesafen + glyphosate. Waterhemp [Amaranthus tuberculatus (Moq.) Sauer], morningglory species (Ipomoea spp.), and giant foxtail (Setaria faberi Herrm.) were the weeds of focus. Separate random forest models were created for each weed species by herbicide combination. The probability of successful weed control deteriorated when the average air temperature within the first 10 d after application was <19 or >25 C for most of the herbicide by weed species models. Additionally, drier conditions before postemergence herbicide application reduced the probability of successful control for several of the herbicide by weed species models. As air temperatures increase and rainfall becomes more variable, weed control with many of the commonly used postemergence herbicides is likely to become less reliable.
To determine the reach, adoption, implementation and effectiveness of an intervention to increase children’s vegetable intake in long day care (LDC).
Design:
A 12-week pragmatic cluster randomised controlled trial, informed by the multiphase optimisation strategy (MOST), targeting the mealtime environment and curriculum. Children's vegetable intake and variety were measured at follow-up using a modified Short Food Survey for early childhood education and care and analysed using a two-part mixed model for non-vegetable and vegetable consumers (see the sketch below). Outcome measures were based on the RE-AIM framework.
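A hedged sketch of the two-part analysis: a logistic part for any vegetable consumption and a log-scale part for intake among consumers, so that exp(b) is a ratio of means as reported in the Results. The study used mixed models with centre-level clustering, which this simplified sketch omits; all names are hypothetical.

```python
# Hedged sketch of a two-part ("hurdle") analysis: logistic model for
# any consumption, log-scale linear model among consumers. The study's
# mixed-model clustering is omitted; names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ldc_followup.csv")  # hypothetical: group, veg_serves
df["any_veg"] = (df["veg_serves"] > 0).astype(int)

part1 = smf.logit("any_veg ~ group", data=df).fit()  # consumer vs not
print(np.exp(part1.params))  # part-1 odds ratios

consumers = df[df["any_veg"] == 1].copy()
consumers["log_veg"] = np.log(consumers["veg_serves"])
part2 = smf.ols("log_veg ~ group", data=consumers).fit()
print(np.exp(part2.params))  # exp(b): ratios of mean intake
```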
Setting:
Australian LDC centres.
Participants:
Thirty-nine centres, 120 educators and 719 children at follow-up.
Results:
There was no difference between the intervention and waitlist control groups in the likelihood of consuming any vegetables (versus being a non-vegetable consumer), whether for intake (OR = 0·70 (95 % CI 0·34–1·43), P = 0·32) or variety (OR = 0·73 (95 % CI 0·40–1·32), P = 0·29). Among vegetable consumers (n 652), there was no difference between groups in vegetable variety (exp(b) = 1·07 (95 % CI 0·88–1·32), P = 0·49) or vegetable intake (exp(b) = 1·06 (95 % CI 0·78–1·43), P = 0·71), with an average of 1·51 (95 % CI 1·20–1·82) and 1·40 (95 % CI 1·08–1·72) serves of vegetables per day in the intervention and control groups, respectively. Intervention educators reported higher skills for promoting vegetables at mealtimes, and greater knowledge and skills for teaching the curriculum, than control educators (all P < 0·001). Intervention fidelity was moderate (16/20 and 15/16 centres used the Mealtime environment and Curriculum components, respectively), with good acceptability among educators. The intervention reached 307 of 8556 centres nationally and was adopted by 22 % of eligible centres.
Conclusions:
The pragmatic self-delivered online intervention positively impacted educators' knowledge and skills and was considered acceptable and feasible. Intervention adaptations, using the MOST cyclic approach, could improve the intervention's impact on children's vegetable intake.
Transient acquisition of methicillin-resistant Staphylococcus aureus (MRSA) on healthcare personnel (HCP) gloves and gowns following patient care has been examined. However, the potential for transmission to the subsequent patient has not been studied. We explored the frequency of MRSA transmission from patient to HCP and, in separate encounters, from contaminated HCP gloves and gowns to a subsequent simulated patient, as well as the factors associated with these 2 transmission pathways.
Methods:
We conducted a prospective cohort study with 2 parts. In objective 1, we studied MRSA transmission from random MRSA-positive patients to HCP gloves and gowns after specific routine patient care activities. In objective 2, we simulated subsequent transmission from random HCP gloves and gowns without hand hygiene to the next patient using a manikin proxy.
Results:
For the first objective, among 98 MRSA-positive patients with 333 randomly selected individual patient–HCP interactions, HCP gloves or gowns were contaminated in 54 interactions (16.2%). In a multivariable analysis, performing endotracheal tube care had the greatest odds of glove or gown contamination (OR, 4.06; 95% CI, 1.3–12.6 relative to physical examination). For the second objective, after 147 simulated HCP–patient interactions, the subsequent transmission of MRSA to the manikin proxy occurred 15 times (10.2%).
Conclusion:
After caring for a patient with MRSA, contamination of HCP gloves and gowns, and transmission to subsequent patients during HCP-patient interactions, occur frequently if contact precautions are not used. Proper infection control practices, including the use of gloves and gowns, can prevent this potential onward transmission.
While cannabis use is a well-established risk factor for psychosis, little is known about any association between reasons for first using cannabis (RFUC) and later patterns of use and risk of psychosis.
Methods
We used data from 11 sites of the multicentre European Gene-Environment Interaction (EU-GEI) case–control study: 558 first-episode psychosis patients (FEPp) and 567 population controls who had used cannabis and reported their RFUC.
We ran logistic regressions to examine whether RFUC were associated with first-episode psychosis (FEP) case–control status. Path analysis then examined the relationship between RFUC, subsequent patterns of cannabis use, and case–control status.
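The first analysis step (logistic regression of case-control status on RFUC) can be sketched as follows. A hedged illustration only: the file, column names, and category labels are hypothetical, and the subsequent path-analysis stage is not shown.

```python
# Hedged sketch of the logistic-regression step: case-control status
# regressed on reason for first using cannabis (RFUC), with
# 'because_of_friends' as the reference category. Names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("eu_gei_rfuc.csv")  # hypothetical: case (0/1), rfuc (str)
model = smf.logit(
    "case ~ C(rfuc, Treatment(reference='because_of_friends'))",
    data=df).fit()
print(np.exp(model.params))  # odds ratios relative to 'because of friends'
```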
Results
Controls (86.1%) and FEPp (75.63%) were most likely to report 'because of friends' as their most common RFUC. However, 20.1% of FEPp, compared with 5.8% of controls, reported 'to feel better' as their RFUC (χ² = 50.97; p < 0.001). RFUC 'to feel better' was associated with being a FEPp (OR 1.74; 95% CI 1.03–2.95), while RFUC 'because of friends' was associated with being a control (OR 0.56; 95% CI 0.37–0.83). The path model indicated that RFUC 'to feel better' was associated with heavy cannabis use and with FEPp-control status.
Conclusions
Both FEPp and controls usually started using cannabis with their friends, but more patients than controls had begun to use ‘to feel better’. People who reported their reason for first using cannabis to ‘feel better’ were more likely to progress to heavy use and develop a psychotic disorder than those reporting ‘because of friends’.
Methicillin-resistant Staphylococcus aureus (MRSA) is a significant nosocomial pathogen in the ICU. MRSA contamination of healthcare personnel (HCP) gloves and gowns after providing care to patients with MRSA occurs at a rate of 14%–16% in the ICU setting. Little is known about whether the MRSA isolates identified on HCP gown and gloves following patient care activities are the same as MRSA isolates identified as colonizing or infecting the patient.
Methods:
From a multisite cohort of 388 independent patient MRSA isolates and their corresponding HCP gown and glove isolates, we selected 91 isolates pairs using a probability to proportion size (PPS) sampling method. To determine whether the patient and HCP gown or gloves isolates were genetically similar, we used 5 comparative genomic typing methods: phylogenetic analysis, spa typing, multilocus sequence typing (MLST), large-scale BLAST score ratio (LSBSR), and single-nucleotide variant (SNV) analysis.
Results:
We found that 56 isolate pairs (61.5%) were genetically similar by at least 4 of the 5 methods. In particular, the spa typing and LSBSR analyses each classified >75% of the examined isolate pairs as concordant under the thresholds established for each analysis.
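The decision rule above (an isolate pair counts as genetically similar when at least 4 of the 5 typing methods agree) can be illustrated with a toy example; the per-method results below are hypothetical booleans, not data from the study.

```python
# Toy illustration of the >=4-of-5 concordance rule described above.
# Per-method results are hypothetical (True = methods call the pair
# concordant), not study data.
pair_results = {"phylogenetic": True, "spa": True, "MLST": True,
                "LSBSR": False, "SNV": True}
similar = sum(pair_results.values()) >= 4
print("Genetically similar:", similar)  # True: 4 of 5 methods agree
```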
Conclusions:
Many of the patient MRSA isolates were genetically similar to those on the HCP gown or gloves following a patient care activity. This finding indicates that the patient is often the primary source of the MRSA isolates transmitted to the HCP, which can potentially be spread to other patients or hospital settings through HCP vectors. These results have important implications because they provide additional evidence for hospitals considering ending the use of contact precautions (gloves and gowns) for MRSA patients.
Background: The phase 3 COMET trial (NCT02782741) compared avalglucosidase alfa (n=51) with alglucosidase alfa (n=49) in treatment-naïve late-onset Pompe disease (LOPD). Methods: The primary objective was to determine the effect of avalglucosidase alfa on respiratory muscle function. Secondary/other objectives included its effect on functional endurance, inspiratory/expiratory muscle strength, lower/upper extremity muscle strength, motor function, health-related quality of life, and safety. Results: At Week 49, the change (LS mean ± SE) from baseline in upright forced vital capacity %predicted was greater with avalglucosidase alfa (2.89% ± 0.88%) versus alglucosidase alfa (0.46% ± 0.93%) (absolute difference +2.43%). The primary objective was met, with statistical non-inferiority achieved (p=0.0074); superiority testing narrowly missed significance (p=0.0626). The Week 49 change from baseline in the 6-minute walk test was 30.01 m greater for avalglucosidase alfa (32.21 ± 9.93 m) versus alglucosidase alfa (2.19 ± 10.40 m). Positive results for avalglucosidase alfa were seen for all secondary/other efficacy endpoints. Treatment-emergent adverse events (AEs) occurred in 86.3% of avalglucosidase alfa-treated and 91.8% of alglucosidase alfa-treated participants. Five participants withdrew, 4 for AEs, all on alglucosidase alfa. Serious AEs occurred in 8 avalglucosidase alfa-treated and 12 alglucosidase alfa-treated participants. IgG antidrug antibody responses were similar in both groups; high titers and neutralizing antibodies were more common with alglucosidase alfa. Conclusions: These results demonstrate improvements in clinically meaningful outcome measures and a more favorable safety profile with avalglucosidase alfa versus alglucosidase alfa. Funding: Sanofi Genzyme
Trainees and investigators from underrepresented minority (URM) backgrounds face unique challenges to establishing successful careers in clinical and translational research. Structured training for mentors is an important mechanism to increase the diversity of the research workforce. This article presents data from an evaluation of the University of California, San Francisco (UCSF) Center for AIDS Research (CFAR) Mentoring the Mentors program aimed at improving mentors’ competency in working with diverse mentees in HIV research.
Methods:
Mentors from around the USA who had participated in one of seven separate 2-day training workshops conducted from 2013 to 2020 were invited to participate in an online evaluation survey of their experiences with the training and their subsequent mentoring activities.
Results:
There was a high response rate (80%) among the 226 mentors invited to complete the survey. The 180 respondents were diverse in demographics, professional disciplines, and geographic distribution. Quantitative and qualitative data indicate a lasting positive impact of the training, with sustained improvements documented on a validated measure of self-appraised mentoring competency. Respondents also endorsed high interest in future, follow-up training with continued focus on topics related to mentoring in the context of diversity.
Conclusion:
The evaluation of the UCSF CFAR Mentoring the Mentors program showed lasting impact in improving mentoring practices, coupled with high interest in continued in-depth training in areas focused on diversity, equity, and inclusion.
To investigate the perceived effects of the coronavirus disease (COVID-19) pandemic lockdown measures on food availability, accessibility, dietary practices and strategies used by participants to cope with these measures.
Design:
We conducted a cross-sectional multi-country online survey between May and July 2020. We used a study-specific questionnaire mainly based on the adaptation of questions to assess food security and coping strategies from the World Food Programme’s ‘Emergency Food Security Assessment’ and ‘The Coping Strategy Index’.
Setting:
The questionnaire was hosted online using Google Forms and shared using social media platforms.
Participants:
A total of 1075 adult participants from eighty-two countries completed the questionnaire.
Results:
In anticipation of COVID-19 lockdowns, 62·7 % of participants reported having stockpiled food, mainly cereals (59·5 % of respondents) and legumes (48·8 %). An increase in the prices of staples, such as cereals and legumes, was widely reported, and price increases were identified as an obstacle to food acquisition by 32·7 % of participants. Participants reported having less variety (50·4 %), lower quality (30·2 %) and smaller quantities (39·2 %) of food, with disparities across regions. Vulnerable groups were reported to be struggling to acquire adequate food, especially people with chronic diseases (20·2 %), the elderly (17·3 %) and children (14·5 %). To cope with the situation, participants mostly relied on less preferred foods (49 %), reduced portion sizes (30 %) and/or reduced the number of meals (25·7 %).
Conclusions:
The COVID-19 pandemic negatively impacted food accessibility and availability, altered dietary practices and worsened the food insecurity situation, particularly in the most fragile regions.
To test the feasibility of targeted gown and glove use by healthcare personnel caring for high-risk nursing-home residents to prevent Staphylococcus aureus acquisition in short-stay residents.
Design:
Uncontrolled clinical trial.
Setting:
This study was conducted in 2 community-based nursing homes in Maryland.
Participants:
The study included 322 residents on mixed short- and long-stay units.
Methods:
During a 2-month baseline period, all residents had nose and inguinal fold swabs taken to estimate S. aureus acquisition. The intervention was iteratively developed using a participatory human factors engineering approach. During a 2-month intervention period, healthcare personnel wore gowns and gloves for high-risk care activities while caring for residents with wounds or medical devices, and S. aureus acquisition was measured again. Whole-genome sequencing was used to assess whether the acquisition represented resident-to-resident transmission.
Results:
Among short-stay residents, the methicillin-resistant S. aureus acquisition rate decreased from 11.9% during the baseline period to 3.6% during the intervention period (odds ratio [OR], 0.28; 95% CI, 0.08–0.92; P = .026). The methicillin-susceptible S. aureus acquisition rate decreased from 9.1% during the baseline period to 4.0% during the intervention period (OR, 0.41; 95% CI, 0.12–1.42; P = .15). The S. aureus resident-to-resident transmission rate decreased from 5.9% during the baseline period to 0.8% during the intervention period.
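As a quick arithmetic check, the reported odds ratio for short-stay MRSA acquisition follows directly from the two rates given above:

```python
# Quick check that the reported OR of 0.28 follows from the two
# short-stay MRSA acquisition rates given above.
p_baseline, p_intervention = 0.119, 0.036
odds_ratio = (p_intervention / (1 - p_intervention)) / \
             (p_baseline / (1 - p_baseline))
print(round(odds_ratio, 2))  # 0.28
```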
Conclusions:
Targeted gown and glove use by healthcare personnel for high-risk care activities while caring for residents with wounds or medical devices, regardless of their S. aureus colonization status, is feasible and potentially decreases S. aureus acquisition and transmission in short-stay community-based nursing-home residents.
To examine the efficacy and tolerability of quetiapine SR in patients with schizophrenia switched from quetiapine IR.
Methods:
Randomised, double-blind study (D1444C00146) using dual-matched placebo. Patients clinically stable on fixed doses of quetiapine IR received twice-daily quetiapine IR 400, 600 or 800 mg/day for 4 weeks. Stable patients were then randomised (1:2) to continue taking quetiapine IR or switch to the same total dose of quetiapine SR (active dose once-daily in the evening) for 6 weeks. Primary analysis: % of patients (modified ITT population) discontinuing due to lack of efficacy or with PANSS total increase ≥20% at any visit, using a 6% non-inferiority margin for the upper 95% CI of the treatment difference. Per-protocol (PP) analysis was also performed.
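The primary analysis above is a non-inferiority comparison of failure proportions against a 6% margin. The sketch below illustrates that decision rule with hypothetical counts chosen to approximate the reported MITT percentages; the trial's exact CI method may differ.

```python
# Hedged sketch of the primary non-inferiority rule described above:
# declare quetiapine SR non-inferior if the upper 95% bound on the
# SR - IR difference in failure proportions is below the 6% margin.
# Counts are hypothetical approximations of the reported MITT rates;
# the trial's exact CI method may differ.
import numpy as np
from scipy import stats

n_sr, fail_sr = 331, 30    # hypothetical: ~9.1% failures on SR
n_ir, fail_ir = 166, 12    # hypothetical: ~7.2% failures on IR
p_sr, p_ir = fail_sr / n_sr, fail_ir / n_ir
diff = p_sr - p_ir
se = np.sqrt(p_sr * (1 - p_sr) / n_sr + p_ir * (1 - p_ir) / n_ir)  # Wald SE
upper = diff + stats.norm.ppf(0.975) * se
print(f"difference = {diff:.3f}, upper 95% bound = {upper:.3f}, "
      f"non-inferior = {upper < 0.06}")
```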
Results:
497 patients were randomised (quetiapine SR 331, IR 166); completion rates were 91.5% and 94.0%, respectively. Few patients discontinued due to lack of efficacy or had a PANSS increase ≥20% in either the MITT (n=496) or PP population (n=393): 9.1% and 5.3% for quetiapine SR and 7.2% and 6.2% for quetiapine IR, respectively. Quetiapine SR was non-inferior to quetiapine IR in the PP population (treatment difference: -0.83% [95% CI -6.75, 3.71]; p=0.017) but not in the MITT population (treatment difference: 1.86% [95% CI -3.78, 6.57]; p=0.0431). The incidence (quetiapine SR 38.7%; IR 35.5%) and profile of AEs were similar in both groups.
Conclusion:
Clinically-stable patients receiving quetiapine IR can be switched, without titration, to an equivalent once-daily dose of quetiapine SR without any clinical deterioration or compromise in tolerability.