We studied severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection and vaccination status among six ethnic groups in Amsterdam, the Netherlands. We analysed participants of the Healthy Life in an Urban Setting cohort who were tested for SARS-CoV-2 spike protein antibodies between 17 May and 21 November 2022. We categorized participants with antibodies as only infected, only vaccinated (≥1 dose), or both infected and vaccinated, based on self-reported prior infection and vaccination status and previous seroprevalence data. We compared infection and vaccination status between ethnic groups using multivariable, multinomial logistic regression. Of the 1,482 included participants, 98.5% had SARS-CoV-2 antibodies (P between ethnic groups = 0.899). Being previously infected and vaccinated ranged from 36.2% (95% confidence interval (CI) = 28.3–44.1%) in the African Surinamese group to 64.5% (95% CI = 52.9–76.1%) in the Ghanaian group. Compared to participants of Dutch origin, participants of South-Asian Surinamese (adjusted odds ratio (aOR) = 6.74, 95% CI = 2.61–17.45), African Surinamese (aOR = 23.32, 95% CI = 10.55–51.54), Turkish (aOR = 8.50, 95% CI = 3.05–23.68), or Moroccan (aOR = 22.33, 95% CI = 9.48–52.60) origin were more likely to be only infected than infected and vaccinated, after adjusting for age, sex, household size, trust in the government’s response to the pandemic, and month of study visit. SARS-CoV-2 infection and vaccination status varied across ethnic groups, particularly regarding non-vaccination. As hybrid immunity is most protective against coronavirus disease 2019, future vaccination campaigns should encourage vaccination uptake in demographic groups with infection-only immunity.
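The multinomial comparison above contrasts the odds of being only infected versus infected and vaccinated across ethnic groups, adjusted for covariates. A minimal stdlib sketch, using invented counts (not the cohort's data), shows how the crude, unadjusted version of such an odds ratio is formed before covariate adjustment:

```python
# Crude (unadjusted) odds ratio of being "only infected" versus
# "infected and vaccinated", comparison group vs. reference group.
# Counts are invented for illustration; not the study cohort's data.
counts = {
    "Dutch":  {"only_infected": 10, "infected_and_vaccinated": 200},
    "GroupX": {"only_infected": 30, "infected_and_vaccinated": 120},
}

def crude_or(comparison, reference, table):
    """Odds of being only infected (vs. both) in the comparison group,
    divided by the same odds in the reference group."""
    a = table[comparison]["only_infected"]
    b = table[comparison]["infected_and_vaccinated"]
    c = table[reference]["only_infected"]
    d = table[reference]["infected_and_vaccinated"]
    return (a / b) / (c / d)

print(round(crude_or("GroupX", "Dutch", counts), 2))  # (30/120)/(10/200) = 5.0
```

The adjusted ORs reported in the abstract additionally condition on age, sex, household size, trust, and visit month via multinomial logistic regression; the crude ratio above is only the starting point of that analysis.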
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. 
Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
To conduct feasibility and cost analysis of portable MRI implementation in a remote setting where MRI access is otherwise unavailable.
Methods:
Portable MRI (ultra-low field, 0.064T) was installed in Weeneebayko General Hospital, Moose Factory, Ontario. Adult patients presenting with any indication for neuroimaging were eligible for study inclusion. The scanning period was from November 14, 2021, to September 6, 2022. Images were sent via a secure PACS network for neuroradiologist interpretation, available 24/7. Clinical indications, image quality, and report turnaround time were recorded. A cost analysis was conducted from a healthcare system’s perspective in 2022 Canadian dollars, comparing the cost of portable MRI implementation to that of transporting patients to a center with fixed MRI.
Results:
Portable MRI was successfully implemented in a remote Canadian location. Twenty-five patients received a portable MRI scan. All studies were of diagnostic quality. No clinically significant pathologies were identified on any of the studies. However, based on clinical presentation and the resolution limitations of portable MRI, an estimated 11 patients (44%) would require transfer to a center with fixed MRI for further imaging workup. Cost savings were $854,841 based on 50 patients receiving portable MRI over 1 year. Five-year budget impact analysis showed nearly $8 million saved.
Conclusions:
Portable MRI implementation in a remote setting is feasible, with significant cost savings compared to fixed MRI. This study may serve as a model to democratize MRI access, offering timely care and improved triaging in remote areas where conventional MRI is unavailable.
To understand which anthropometric diagnostic criteria best discriminate higher from lower risk of death in children and explore programme implications.
Design:
A multiple cohort individual data meta-analysis of mortality risk (within 6 months of measurement) by anthropometric case definitions. Sensitivity, specificity, informedness and inclusivity in predicting mortality, face validity and compatibility with current standards and practice were assessed and operational consequences were modelled.
Setting:
Community-based cohort studies in twelve low-income countries between 1977 and 2013 in settings where treatment of wasting was not widespread.
Participants:
Children aged 6 to 59 months.
Results:
Of the twelve anthropometric case definitions examined, four (weight-for-age Z-score (WAZ) < −2), (mid-upper arm circumference (MUAC) < 125 mm), (MUAC < 115 mm or WAZ < −3) and (WAZ < −3) had the highest informedness in predicting mortality. A combined case definition (MUAC < 115 mm or WAZ < −3) was better at predicting deaths associated with weight-for-height Z-score < −3 and concurrent wasting and stunting (WaSt) than the single WAZ < −3 case definition. After the assessment of all criteria, the combined case definition performed best. The simulated workload for programmes admitting based on MUAC < 115 mm or WAZ < −3, when adjusted with a proxy for required intensity and/or duration of treatment, was 1·87 times larger than that of programmes admitting on MUAC < 115 mm alone.
Conclusions:
A combined case definition detects nearly all deaths associated with severe anthropometric deficits, suggesting that therapeutic feeding programmes may achieve higher impact (preventing mortality and improving coverage) by using it. Operational questions remain to be examined before wide-scale adoption can be recommended.
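Informedness, one of the criteria assessed above, is Youden's J: sensitivity plus specificity minus one. A stdlib sketch with hypothetical confusion-matrix counts (invented for illustration, not study data) makes the computation concrete:

```python
# Informedness (Youden's J) of an anthropometric case definition
# for predicting death. Counts below are hypothetical.
def informedness(tp, fn, fp, tn):
    """Sensitivity + specificity - 1; ranges from -1 to 1,
    where 0 means no better than chance."""
    sensitivity = tp / (tp + fn)   # flagged among children who died
    specificity = tn / (tn + fp)   # cleared among children who survived
    return sensitivity + specificity - 1.0

# e.g. a definition that flags 40 of 50 children who later died
# (sensitivity 0.8) and clears 800 of 1000 survivors (specificity 0.8):
print(round(informedness(tp=40, fn=10, fp=200, tn=800), 3))  # 0.6
```

Ranking case definitions by this single number is how the abstract's "highest informedness" comparison can be operationalised.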
To compare the prognostic value of mid-upper arm circumference (MUAC), weight-for-height Z-score (WHZ) and weight-for-age Z-score (WAZ) for predicting death over periods of 1, 3 and 6 months follow-up in children.
Design:
Pooled analysis of twelve prospective studies examining survival after anthropometric assessment. Sensitivity and false-positive ratios to predict death within 1, 3 and 6 months were compared for three individual anthropometric indices and their combinations.
Setting:
Community-based, prospective studies from twelve countries in Africa and Asia.
Participants:
Children aged 6–59 months living in the study areas.
Results:
For all anthropometric indices, receiver operating characteristic curves showed better discrimination for shorter than for longer durations of follow-up. Sensitivity was higher for death within 1 month of follow-up compared with 6 months by 49 % (95 % CI (30, 69)) for MUAC < 115 mm (P < 0·001), 48 % (95 % CI (9·4, 87)) for WHZ < -3 (P < 0·01) and 28 % (95 % CI (7·6, 42)) for WAZ < -3 (P < 0·005). This was accompanied by an increase in false positives of only 3 % or less. For all durations of follow-up, WAZ < -3 identified more children who died and were not identified by WHZ < -3 or by MUAC < 115 mm, 120 mm or 125 mm, but the use of WAZ < -3 led to an increased false-positive ratio of up to 16·4 % (95 % CI (12·0, 20·9)) compared with 3·5 % (95 % CI (0·4, 6·5)) for MUAC < 115 mm alone.
Conclusions:
Frequent anthropometric measurements significantly improve the identification of malnourished children with a high risk of death without markedly increasing false positives. Combining two indices increases sensitivity but also increases false positives among children meeting case definitions.
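The trade-off described above (combining indices raises sensitivity but also the false-positive ratio) can be illustrated with a small, hypothetical cohort in stdlib Python; the records below are invented, not the pooled study data:

```python
# Hypothetical records: (died_within_follow_up, muac_mm, waz).
# Invented for illustration; not the pooled study data.
children = [
    (True, 110, -3.5), (True, 130, -3.2), (True, 120, -2.8),
    (False, 118, -1.0), (False, 140, -0.5), (False, 112, -3.1),
    (False, 130, -3.2),
]

def rates(predicate):
    """Sensitivity and false-positive ratio of a case definition."""
    died = [c for c in children if c[0]]
    survived = [c for c in children if not c[0]]
    sensitivity = sum(map(predicate, died)) / len(died)
    false_positive_ratio = sum(map(predicate, survived)) / len(survived)
    return sensitivity, false_positive_ratio

muac_only = lambda c: c[1] < 115                # MUAC < 115 mm
combined = lambda c: c[1] < 115 or c[2] < -3    # MUAC < 115 mm or WAZ < -3

for name, pred in [("MUAC only", muac_only), ("combined", combined)]:
    sens, fpr = rates(pred)
    print(name, round(sens, 2), round(fpr, 2))
# In this toy cohort the combined definition gains sensitivity
# (0.33 -> 0.67) at the cost of more false positives (0.25 -> 0.5).
```

The same computation, applied per follow-up window, is what underlies the abstract's comparison of 1-, 3- and 6-month sensitivities.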
We summarize what we assess as the past year's most important findings within climate change research: limits to adaptation, vulnerability hotspots, new threats coming from the climate–health nexus, climate (im)mobility and security, sustainable practices for land use and finance, losses and damages, inclusive societal climate decisions and ways to overcome structural barriers to accelerate mitigation and limit global warming to below 2°C.
Technical summary
We synthesize 10 topics within climate research where there have been significant advances or emerging scientific consensus since January 2021. The selection of these insights was based on input from an international open call with broad disciplinary scope. Findings concern: (1) new aspects of soft and hard limits to adaptation; (2) the emergence of regional vulnerability hotspots from climate impacts and human vulnerability; (3) new threats on the climate–health horizon – some involving plants and animals; (4) climate (im)mobility and the need for anticipatory action; (5) security and climate; (6) sustainable land management as a prerequisite to land-based solutions; (7) sustainable finance practices in the private sector and the need for political guidance; (8) the urgent planetary imperative for addressing losses and damages; (9) inclusive societal choices for climate-resilient development and (10) how to overcome barriers to accelerate mitigation and limit global warming to below 2°C.
Social media summary
Science has evidence on barriers to mitigation and how to overcome them to avoid limits to adaptation across multiple fields.
Background: Infantile spasms (IS) is an epileptic encephalopathy, characterized by spasms, hypsarrhythmia, and developmental regression. This is a retrospective case series of children with IS who underwent epilepsy surgery at The Hospital for Sick Children (HSC) in Toronto, Canada. Methods: The records of 223 patients seen in the IS clinic were reviewed. Results: Nineteen patients met inclusion criteria. The etiology of IS was encephalomalacia in six patients (32%), malformations of cortical development in 11 patients (58%), atypical hypoglycaemic injury in one patient (5%), and partial hemimegalencephaly in one patient (5%). Nine patients (47%) underwent hemispherectomy and 10 patients (53%) underwent lobectomy/lesionectomy. Three patients (16%) underwent a second epilepsy surgery. Fifteen patients (79%) were considered ILAE Seizure Outcome Class 1 (completely seizure free; no auras). The percentage of patients who were ILAE Class 1 at most recent follow-up decreased with increasing duration of epilepsy prior to surgery. Developmental outcome was improved in 14/19 (74%) and stable in 5/19 (26%) patients. Conclusions: Our study found excellent seizure freedom rates and improved developmental outcomes following epilepsy surgery in patients with a history of IS with a structural lesion detected on MRI brain.
Healthcare facilities are a well-known high-risk environment for transmission of M. tuberculosis, the etiologic agent of tuberculosis (TB) disease. However, the link between M. tuberculosis transmission in healthcare facilities and its role in the general TB epidemic is unknown. We estimated the proportion of overall TB transmission in the general population attributable to healthcare facilities.
Methods:
We combined data from a prospective, population-based molecular epidemiologic study with a universal electronic medical record (EMR) covering all healthcare facilities in Botswana to identify biologically plausible transmission events occurring at the healthcare facility. Patients with M. tuberculosis isolates of the same genotype visiting the same facility concurrently were considered an overlapping event. We then used TB diagnosis and treatment data to categorize overlapping events into biologically plausible definitions. We calculated the proportion of overall TB cases in the cohort that could be attributable to healthcare facilities.
Results:
In total, 1,881 participants had TB genotypic and EMR data suitable for analysis, resulting in 46,853 clinical encounters at 338 healthcare facilities. We identified 326 unique overlapping events involving 370 individual patients; 91 (5%) had biologic plausibility for transmission occurring at a healthcare facility. A sensitivity analysis estimated that 3%–8% of transmission may be attributable to healthcare facilities.
Conclusions:
Although effective interventions are critical in reducing individual risk for healthcare workers and patients at healthcare facilities, our findings suggest that development of targeted interventions aimed at community transmission may have a larger impact in reducing TB.
ABSTRACT IMPACT: Up to 33% of patients who undergo reconstruction have hostile defects with coexisting soft tissue and osseous defects due to prior radiation, prior failed cranioplasty, or concurrent infection; we seek to identify optimal strategies for these patients based on the experience of a southeastern tertiary referral center. OBJECTIVES/GOALS: Scalp and calvarial defects may result from a number of etiologies, including trauma, burns, tumor resections, infections, osteoradionecrosis, or congenital lesions. Our objective was to retrospectively evaluate the use of alloplastic reconstruction alongside autologous reconstruction for high-risk cranial defects. METHODS/STUDY POPULATION: An IRB-approved retrospective review of patients who underwent cranioplasty of a hostile site at a southeastern tertiary referral center between January 2008 and December 2018 was performed. The patients were stratified into three groups based on the type of implant used: autogenous (bone), alloplastic (PEEK, titanium, PMMA), or mixed (combination of both types of graft). The primary outcome metric was a complication in the year following cranioplasty, identified by flap or bone graft failure, necrosis, or infection. Statistical analysis included t-tests and chi-square tests where appropriate using SPSS. RESULTS/ANTICIPATED RESULTS: There were 43 total cases in this time period: 15 autogenous, 23 alloplastic, and 5 mixed. The purely autogenous group had the highest complication rate (85%) and the alloplastic group had the lowest (38%). When stratified by the specific material used for reconstruction (15 bone, 14 PEEK, 10 titanium, and 5 PMMA), the difference in overall complication rate was statistically significant (p=0.009; chi-square test), with PEEK implants having the lowest complication rate (21%). The analysis documented an overall complication rate that differed significantly between the three groups (p=0.012).
DISCUSSION/SIGNIFICANCE OF FINDINGS: Interestingly, this analysis found that, in the setting of hostile cranial defects, cranioplasty may benefit from the use of prosthetic implants instead of autologous bone grafts, not only to avoid donor-site morbidity but also to reduce overall complications.
Colleges and universities around the world engaged diverse strategies during the COVID-19 pandemic. Baylor University, a community of ~22,700 individuals, was one of the institutions that resumed and sustained operations. The key strategy was the establishment of multidisciplinary teams to develop mitigation strategies and priority areas for action. This population-based team approach, along with implementation of a “Swiss Cheese” risk mitigation model, allowed small clusters to be rapidly addressed through testing, surveillance, tracing, isolation, and quarantine. These efforts were supported by health protocols including face coverings, social distancing, and compliance monitoring. As a result, activities were sustained from August 1 to December 8, 2020. There were 62,970 COVID-19 tests conducted, with 1435 people testing positive, for a positivity rate of 2.28%. A total of 1670 COVID-19 cases were identified, with 235 self-reports. The mean number of tests per week was 3500, with approximately 80 of these positive (11/d). More than 60 student tracers were trained, with over 120 personnel available to contact trace, at a ratio of 1 per 400 university members. The successes and lessons learned provide a framework and pathway for similar institutions to mitigate the ongoing impacts of COVID-19 and sustain operations during a global pandemic.
Goosegrass control options in bermudagrass are limited. Topramezone is one option that offers excellent control of mature goosegrass, but application to bermudagrass results in unacceptable symptoms of bleaching and necrosis typical of hydroxyphenylpyruvate dioxygenase inhibitors. Previous research has shown that adding chelated iron reduced the phytotoxicity of topramezone without reducing the efficacy of the herbicide, resulting in safening when applied to bermudagrass. Our objective was to examine additional iron sources to determine whether similar safening effects occur with other sources. Field trials were conducted in the summers of 2016 to 2018 (Auburn University). Mixtures of topramezone and methylated seed oil were combined with six different commercial iron sources, including sodium ferric ethylenediamine di-o-hydroxyphenyl-acetate (FeEDDHA), ferrous diethylenetriamine pentaacetic acid (FeDTPA), iron citrate, FeSO4, and a combination of iron oxide/sucrate/sulfate, some of which contained nitrogen. Bermudagrass necrosis and bleaching symptoms were visually rated on a 0% to 100% scale. Reflectance (normalized difference vegetation index) and clipping yield measurements were also collected. Application of FeDTPA and FeSO4 reduced symptoms of bleaching and necrosis when applied with topramezone. Other treatments that contained nitrogen did not reduce injury but did reduce bermudagrass recovery time following the appearance of necrosis. Inclusion of small amounts of nitrogen often negated the safening effects of FeSO4. The iron oxide/sucrate/sulfate product had no effect on bleaching or necrosis. Data suggest that the iron source had a differential effect on bleaching and necrosis reduction when applied in combination with topramezone to bermudagrass. Overall, FeSO4 and FeDTPA safened topramezone the most on bermudagrass.
POST goosegrass and other grassy weed control in bermudagrass is problematic. Fewer herbicides that can control goosegrass are available due to regulatory pressure and herbicide resistance. Alternative herbicide options that offer effective control are needed. Previous research demonstrates that topramezone controls goosegrass, crabgrass, and other weed species; however, injury to bermudagrass may be unacceptable. The objective of this research was to evaluate the safening potential of topramezone combinations with different additives on bermudagrass. Field trials were conducted at Auburn University during summer and fall from 2015 to 2018 and 2017 to 2018, respectively. Treatments included topramezone mixtures and methylated seed oil applied in combination with five different additives: triclopyr, green turf pigment, green turf paint, ammonium sulfate, and chelated iron. Bermudagrass bleaching and necrosis symptoms were visually rated. Normalized-difference vegetative index measurements and clipping yield data were also collected. Topramezone plus chelated iron, as well as topramezone plus triclopyr, reduced bleaching potential the best; however, the combination of topramezone plus triclopyr resulted in necrosis that outweighed reductions in bleaching. Masking agents such as green turf paint and green turf pigment were ineffective in reducing injury when applied with topramezone. The combination of topramezone plus ammonium sulfate should be avoided because of the high level of necrosis. Topramezone-associated bleaching symptoms were transient and lasted 7 to 14 d on average. Findings from this research suggest that chelated iron added to topramezone and methylated seed oil mixtures acted as a safener on bermudagrass.
Engagement of frontline staff, along with senior leadership, in competition-style healthcare-associated infection reduction efforts, combined with electronic clinical decision support tools, appeared to reduce antibiotic regimen initiations for urinary tract infections (P = .01). Mean monthly standardized infection and device utilization ratios also decreased (P < .003 and P < .0001, respectively).
Chills and vomiting have traditionally been associated with severe bacterial infections and bacteremia. However, few modern studies have prospectively evaluated the association of these signs with bacteremia, which is the aim of this prospective, multicenter study. Patients presenting to the emergency department with at least one affected vital sign (increased respiratory rate, increased heart rate, altered mental status, decreased blood pressure, or decreased oxygen saturation) were included. A total of 479 patients were prospectively enrolled. Blood cultures were obtained from 197 patients. Of the 32 patients with a positive blood culture, 11 (34%) had experienced shaking chills, compared with 23 (14%) of the 165 patients with a negative blood culture (P = 0.009). A logistic regression was fitted to estimate the odds ratio (OR) for a positive blood culture according to shaking chills. In a univariate model, shaking chills had an OR of 3.23 (95% CI 1.35–7.52); in a multivariate model adjusted for age, sex, and prior antibiotics, the OR was 5.9 (95% CI 2.05–17.17) for those without prior antibiotics. The presence of vomiting was also addressed, but neither a univariate nor a multivariate logistic regression showed any association between vomiting and bacteremia. In conclusion, among patients at the emergency department with at least one affected vital sign, shaking chills but not vomiting were associated with bacteremia.
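The univariate odds ratio for shaking chills can be reproduced directly from the counts given in the abstract (11 of 32 culture-positive and 23 of 165 culture-negative patients had chills):

```python
# 2x2 table from the abstract: shaking chills vs. blood culture result.
a, b = 11, 32 - 11    # culture-positive: chills / no chills
c, d = 23, 165 - 23   # culture-negative: chills / no chills

# Odds ratio = (odds of chills among bacteremic patients)
#            / (odds of chills among non-bacteremic patients)
odds_ratio = (a / b) / (c / d)
print(round(odds_ratio, 2))  # 3.23, matching the reported univariate OR
```

The multivariate OR of 5.9 cannot be recovered from these marginal counts alone, since it conditions on age, sex, and prior antibiotics via the fitted regression.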
Self-care disability is difficulty with or dependence on others to perform activities of daily living, such as eating and dressing. Disablement is worsening self-care disability measured over time. The disablement process model (DPM) is often used to conceptualize gerontology research on self-care disability and disablement; however, no summary of variables that align with person-level DPM constructs exists. This review summarizes the results of 88 studies to identify the nature and role of variables associated with disability and disablement in older adults according to the person-level constructs (e.g., demographic characteristics, chronic pathologies) in the DPM. It also examines the evidence for cross-sectional applications of the DPM and identifies common limitations in extant literature to address in future research. Researchers can apply these results to guide theory-driven disability and disablement research using routinely collected health data from older adults.
The co-production and co-facilitation of recovery-focused education programmes is one way in which service users may be meaningfully involved as partners.
Objectives:
To evaluate the impact of a clinician and peer co-facilitated information programme on service users’ knowledge, confidence, recovery attitudes, advocacy and hope, and to explore their experience of the programme.
Methods:
A sequential design was used involving a pre–post survey to assess changes in knowledge, confidence, advocacy, recovery attitudes and hope following programme participation. In addition, semi-structured interviews with programme participants were completed. Fifty-three participants completed both pre- and post-surveys and twelve individuals consented to interviews.
Results:
The results demonstrated statistically significant changes in service users’ knowledge about mental health issues, confidence and advocacy. These improvements were reflected in the themes which emerged from the interviews with participants (n = 12), who reported enhanced knowledge and awareness of distress and wellness, and a greater sense of hope. In addition, the peer influence helped to normalise experiences for participants, while the dual facilitation engendered equality of participation and increased the opportunity for meaningful collaboration between service users and practitioners.
Conclusions:
The evaluation highlights the potential strengths of a service user and clinician co-facilitated education programme that acknowledges and respects the difference between the knowledge gained through self-experience and the knowledge gained through formal learning.
Fomesafen is a protoporphyrinogen oxidase–inhibitor herbicide with an alternative mode of action that provides PRE weed control in strawberry [Fragaria×ananassa (Weston) Duchesne ex Rozier (pro sp.) [chiloensis×virginiana]] produced in a plasticulture setting in Florida. Plasticulture mulch could decrease fomesafen dissipation and increase crop injury in rotational crops. Field experiments were conducted in Balm, FL, to investigate fomesafen persistence and movement in soil in Florida strawberry systems for the 2014/2015 and 2015/2016 production cycles. Treatments included fomesafen applied preplant at 0, 0.42, and 0.84 kg ai ha−1. Soil samples were taken under the plastic from plots treated with fomesafen at 0.42 kg ha−1 throughout the production cycle. Fomesafen did not injure strawberry or decrease yield. Fomesafen concentration data for the 0.0- to 0.1-m soil depth were described using a three-parameter logistic function. The fomesafen 50% dissipation times were 37 and 47 d for the 2014/2015 and 2015/2016 production cycles, respectively. Fomesafen was last detected in the 0.0- to 0.1-m soil depth at 167 and 194 d after treatment in the 2014/2015 and 2015/2016 production cycles, respectively. Fomesafen concentration remained below 25 ppb at all sampling dates for the 0.1- to 0.2-m and 0.2- to 0.3-m depths. Fomesafen concentration decreased significantly after strawberry was transplanted and likely leached during the overhead and drip irrigation used during crop establishment.
When assessing hepatitis B virus (HBV) status in clinical settings, it is unclear whether self-reports of vaccination history and previous HBV-test results have any diagnostic capacity. Of 3997 participants in a multi-centre HBV-screening study in Paris, France, 1090 were asked questions on their last HBV-test result and vaccination history. Discordance between self-reported history and infection status (determined by serology) was calculated for participants claiming ‘negative’, ‘effective vaccine’, ‘past infection’, or ‘chronic infection’ HBV-status. Serological testing revealed that 320 (29.4%) were non-immunised, 576 (52.8%) were vaccinated, 173 (15.9%) had resolved the infection and 21 (1.9%) were hepatitis B surface antigen positive. In total, 208/426 (48.8%) participants with a self-reported ‘negative’ status had a discordant serological result, of whom 128 (61.5%) were vaccinated and 74 (35.6%) had resolved infections. A total of 153/599 (25.5%) participants self-reporting ‘effective vaccine’ had a discordant serological result, of whom 100 (65.4%) were non-immunised and 50 (32.7%) had resolved infections. Discordance for declaring ‘past’ or ‘chronic infection’ occurred in 9/55 (16.4%) and 3/10 (30.0%) individuals, respectively. In conclusion, self-reported HBV-status based on participant history is partially inadequate for determining serological HBV-status, especially in distinguishing negative from vaccinated individuals. More adapted patient education about HBV-status might be helpful for certain key populations.