To estimate the cost-effectiveness of methicillin-resistant Staphylococcus aureus (MRSA) nares polymerase chain reaction (PCR) use in pediatric pneumonia and tracheitis.
Methods:
We built a cost-effectiveness model based on MRSA prevalence and probability of empiric treatment for MRSA pneumonia or tracheitis, with all parameters varied in sensitivity analyses. The hypothetical patient cohort was <18 years of age and hospitalized in the pediatric intensive care unit for community-acquired pneumonia (CAP) or tracheitis. Two strategies were compared: MRSA nares PCR-guided antibiotic therapy versus usual care. The primary measure was cost per incorrect treatment course avoided. Length of stay and hospital costs unrelated to antibiotic costs were assumed to be the same regardless of PCR use. Both literature data and expert estimates informed sensitivity analysis ranges.
Results:
With a health care system willingness-to-pay threshold for PCR testing of $140 per incorrect treatment course avoided (varied in sensitivity analyses), reflecting the estimated additional cost of MRSA-targeted antibiotics, and a true MRSA nares PCR cost of $64, PCR testing was generally favored if the likelihood of empiric MRSA treatment was >52%. PCR was not favored in some scenarios when MRSA infection prevalence and the likelihood of empiric MRSA treatment were varied simultaneously. Screening became less favorable as MRSA PCR cost increased toward the highest value of its range ($88). Individually varying MRSA colonization rates over wide ranges (0%–30%) had lesser effects on results.
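The favorability threshold reported above can be illustrated with a deliberately simplified calculation. Only the $64 PCR cost and the $140 willingness-to-pay threshold come from the abstract; the prevalence and colonization values below are invented for illustration, and the published model is more detailed (it assumes a PCR with high negative predictive value lets clinicians withhold empiric MRSA coverage from PCR-negative patients):

```python
def cost_per_course_avoided(p_empiric, p_mrsa=0.05, p_colonized=0.08,
                            pcr_cost=64.0):
    """Cost of PCR testing per incorrect MRSA treatment course avoided.

    Hypothetical simplification: an (assumed perfect) negative nares PCR
    avoids empiric MRSA therapy in patients who are neither infected nor
    colonized but would otherwise have been treated empirically.
    p_mrsa and p_colonized are illustrative values, not study parameters.
    """
    # Incorrect courses avoided per patient tested: would-be empiric
    # treatment of uninfected, non-colonized (PCR-negative) patients.
    courses_avoided = p_empiric * (1 - p_mrsa) * (1 - p_colonized)
    return pcr_cost / courses_avoided

WTP = 140.0  # willingness-to-pay per incorrect course avoided (from abstract)
for p in (0.40, 0.52, 0.60):
    cpa = cost_per_course_avoided(p)
    verdict = "favored" if cpa < WTP else "not favored"
    print(f"empiric treatment likelihood {p:.0%}: "
          f"${cpa:.0f} per course avoided -> PCR {verdict}")
```

With these illustrative inputs the strategy flips from unfavorable to favorable as the empiric treatment likelihood crosses roughly the low-50% range, consistent with the >52% threshold reported.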
Conclusions:
MRSA nares PCR use in hospitalized pediatric patients with CAP or tracheitis was generally favored when empiric MRSA treatment rates were moderate or high.
This study evaluated Medicaid claims (MC) data as a valid source for outpatient antimicrobial stewardship programs (ASPs) by comparing it to electronic medical record (EMR) data from a single academic center.
Methods:
This retrospective study compared pediatric patients’ MC data with EMR data from the Marshall Health Network (MHN). Claims were matched to EMR records based on patient Medicaid ID, service date, and provider NPI number. Demographics, antibiotic choice, diagnosis appropriateness, and guideline concordance were assessed across both data sources.
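The claims-to-EMR matching step can be sketched as a join on the composite key named above (Medicaid ID, service date, prescriber NPI). The field names and records below are hypothetical, not the study's actual data or code:

```python
# Illustrative record linkage between Medicaid claims (MC) and EMR rows.
claims = [
    {"medicaid_id": "A100", "service_date": "2023-03-01",
     "npi": "1111111111", "antibiotic": "amoxicillin"},
    {"medicaid_id": "A101", "service_date": "2023-03-02",
     "npi": "2222222222", "antibiotic": "azithromycin"},
]
emr = [
    {"medicaid_id": "A100", "service_date": "2023-03-01",
     "npi": "1111111111", "antibiotic": "amoxicillin",
     "diagnosis": "acute otitis media"},
]

def key(row):
    # Composite match key: patient Medicaid ID + service date + provider NPI.
    return (row["medicaid_id"], row["service_date"], row["npi"])

emr_index = {key(r): r for r in emr}
matched = [(c, emr_index[key(c)]) for c in claims if key(c) in emr_index]

# Agreement on a shared field (here, antibiotic choice) across matched pairs.
agreement = sum(c["antibiotic"] == e["antibiotic"] for c, e in matched)
print(f"matched {len(matched)} of {len(claims)} claims; "
      f"antibiotic agreement {agreement}/{len(matched)}")
```

Exact-key joins like this miss claims whose EMR counterpart records a different service date or provider, which is one reason agreement statistics are computed only over matched pairs.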
Setting:
The study was conducted within the MHN, involving multiple pediatric and family medicine outpatient practices in West Virginia, USA.
Patients:
Pediatric patients receiving care within MHN with Medicaid coverage.
Results:
MC and EMR data showed >90% agreement in antibiotic choice, gender, and date of service. Discrepancies were observed in diagnoses, especially for visits with multiple infectious diagnoses. MC data demonstrated similar accuracy to EMR data in identifying inappropriate prescriptions and assessing guideline concordance. Additionally, MC data provided timely information, enhancing the feasibility of impactful outpatient ASP interventions.
Conclusion:
MC data is a valid and timely resource for outpatient ASP interventions. Insurance providers should be leveraged as key partners to support large-scale outpatient stewardship efforts.
Quality improvement programmes (QIPs) are designed to enhance patient outcomes by systematically introducing evidence-based clinical practices. The CONQUEST QIP focuses on improving the identification and management of patients with COPD in primary care. The process of developing CONQUEST, recruiting, preparing systems for participation, and implementing the QIP across three integrated healthcare systems (IHSs) is examined to identify and share lessons learned.
Approach and development:
This review is organized into three stages: 1) development, 2) preparing IHSs for implementation, and 3) implementation. In each stage, key steps are described with the lessons learned and how they can inform others interested in developing QIPs designed to improve the care of patients with chronic conditions in primary care.
Stage 1 was establishing and working with steering committees to develop the QIP Quality Standards, define the target patient population, assess current management practices, and create a global operational protocol. Additionally, potential IHSs were assessed for the feasibility of integrating the QIP into their primary care practices. Factors assessed included technological infrastructure, QI experience, and capacity for effective implementation.
Stage 2 was preparation for implementation. A key step was enlisting clinical champions to advocate for the QIP, secure participation in primary care, and establish effective communication channels. Preparation also required obtaining IHS approvals, ensuring Health Insurance Portability and Accountability Act compliance, and devising operational strategies for patient outreach and clinical decision support delivery.
Stage 3 was developing three IHS implementation models. With insight into the local context from local clinicians, implementation models were adapted to work with the resources and capacity of the IHSs while ensuring the delivery of essential elements of the programme.
Conclusion:
Developing and launching a QIP across primary care practices requires extensive groundwork, preparation, and committed local champions to help build an adaptable environment that encourages open communication and is receptive to feedback.
We present the Evolutionary Map of the Universe (EMU) survey conducted with the Australian Square Kilometre Array Pathfinder (ASKAP). EMU aims to deliver the touchstone radio atlas of the southern hemisphere. We introduce EMU and review its science drivers and key science goals, updated and tailored to the current ASKAP five-year survey plan. The development of the survey strategy and planned sky coverage is presented, along with the operational aspects of the survey and associated data analysis, together with a selection of diagnostics demonstrating the imaging quality and data characteristics. We give a general description of the value-added data pipeline and data products before concluding with a discussion of links to other surveys and projects and an outline of EMU’s legacy value.
Highly portable and accessible MRI technology will allow researchers to conduct field-based MRI research in community settings. Previous guidance for researchers working with fixed MRI does not address the novel ethical, legal, and societal issues (ELSI) of portable MRI (pMRI). Our interdisciplinary Working Group (WG) previously identified 15 core ELSI challenges associated with pMRI research and recommended solutions. In this article, we distill those detailed recommendations into a Portable MRI Research ELSI Checklist that offers practical operational guidance for researchers contemplating using this technology.
Clinical outcomes of repetitive transcranial magnetic stimulation (rTMS) for treatment-resistant depression (TRD) vary widely, and no mood rating scale is standard for assessing rTMS outcome. It remains unclear whether rTMS is as efficacious in older adults with late-life depression (LLD) as in younger adults with major depressive disorder (MDD). This study examined the effect of age on outcomes of rTMS treatment of adults with TRD. Self-report and observer mood ratings were measured weekly in 687 subjects ages 16–100 years undergoing rTMS treatment using the Inventory of Depressive Symptomatology 30-item Self-Report (IDS-SR), Patient Health Questionnaire 9-item (PHQ), Profile of Mood States 30-item, and Hamilton Depression Rating Scale 17-item (HDRS). All rating scales detected significant improvement with treatment; response and remission rates varied by scale but not by age (response/remission ≥ 60: 38%–57%/25%–33%; <60: 32%–49%/18%–25%). Proportional hazards models showed early improvement predicted later improvement across ages, though early improvements in PHQ and HDRS were more predictive of remission in those <60 years (relative to those ≥60), and greater baseline IDS burden was more predictive of non-remission in those ≥60 years (relative to those <60). These results indicate no significant effect of age on treatment outcomes in rTMS for TRD, though rating instruments may differ in assessment of symptom burden between younger and older adults during treatment.
This is the fourth comprehensive assessment of the population status of all wild bird species in Europe. It identifies Species of European Conservation Concern (SPECs) so that action can be taken to improve their status. Species are categorised according to their global extinction risk, the size and trend of their European population and range, and Europe’s global responsibility for them. Of the 546 species assessed, 207 (38%) are SPECs: 74 (14%) of global concern (SPEC 1); 32 (6%) of European concern and concentrated in Europe (SPEC 2); and 101 (18%) of European concern but not concentrated in Europe (SPEC 3). The proportion of SPECs has remained similar (38–43%) across all four assessments since 1994, but the number of SPEC 1 species of global concern has trebled. The 44 species assessed as Non-SPECs in the third assessment (2017) but as SPECs here include multiple waders, raptors and passerines that breed in arctic, boreal or alpine regions, highlighting the growing importance of northern Europe and mountain ecosystems for bird conservation. Conversely, the 62 species assessed as SPECs in 2017 but as Non-SPECs here include various large waterbirds and raptors that are recovering due to conservation action. Since 1994, the number of specially protected species (listed on Annex I of the EU Birds Directive) qualifying as SPECs has fallen by 33%, while the number of huntable (Annex II) species qualifying as SPECs has risen by 56%. The broad patterns identified previously remain evident: 100 species have been classified as SPECs in all four assessments, including numerous farmland and steppe birds, ducks, waders, raptors, seabirds and long-distance migrants. Many of their populations are heavily depleted or continue to decline and/or contract in range. Europe still holds 3.4–5.4 billion breeding birds, but more action to halt and reverse losses is needed.
People with neuropsychiatric symptoms often experience delay in accurate diagnosis. Although cerebrospinal fluid neurofilament light (CSF NfL) shows promise in distinguishing neurodegenerative disorders (ND) from psychiatric disorders (PSY), its accuracy in a diagnostically challenging cohort longitudinally is unknown.
Methods:
We collected longitudinal diagnostic information (mean = 36 months) from patients assessed at a neuropsychiatry service, categorising diagnoses as ND/mild cognitive impairment/other neurological disorders (ND/MCI/other) and PSY. We pre-specified NfL > 582 pg/mL as indicative of ND/MCI/other.
Results:
Diagnostic category changed from initial to final diagnosis for 23% (49/212) of patients. NfL predicted the final diagnostic category for 92% (22/24) of these and predicted final diagnostic category overall (ND/MCI/other vs. PSY) in 88% (187/212), compared to 77% (163/212) with clinical assessment alone.
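Applying the pre-specified cut-off is a simple classification rule; a toy sketch follows (the 582 pg/mL threshold is from the abstract, but the patient values and labels are invented, not study data):

```python
# Pre-specified CSF NfL cut-off: > 582 pg/mL predicts ND/MCI/other,
# otherwise PSY (per the abstract).
CUTOFF_PG_ML = 582.0

def predict(nfl_pg_ml):
    return "ND/MCI/other" if nfl_pg_ml > CUTOFF_PG_ML else "PSY"

# Invented (NfL value, final diagnostic category) pairs for illustration.
patients = [
    (1450.0, "ND/MCI/other"),
    (240.0, "PSY"),
    (610.0, "ND/MCI/other"),
    (890.0, "PSY"),  # a miss: high NfL but psychiatric final diagnosis
]
correct = sum(predict(nfl) == final for nfl, final in patients)
accuracy = correct / len(patients)
print(f"accuracy on toy data: {accuracy:.0%}")
```

In the study itself this rule was scored against the *final* (longitudinal) diagnosis, which is what allows the comparison with initial clinical assessment alone.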
Conclusions:
CSF NfL improved diagnostic accuracy, with potential to have led to earlier, accurate diagnosis in a real-world setting using a pre-specified cut-off, adding weight to translation of NfL into clinical practice.
Optimum nutrition plays a major role in the achievement and maintenance of good health. The Nutrition Society of the UK and Ireland and the Sabri Ülker Foundation, a charity based in Türkiye and focused on improving public health, combined forces to highlight this important subject. A hybrid conference was held in Istanbul, with over 4000 delegates from sixty-two countries joining the proceedings live online in addition to those attending in person. The primary purpose was to inspire healthcare professionals and nutrition policy makers to better consider the role of nutrition in their interactions with patients and the public at large to reduce the prevalence of non-communicable diseases such as obesity and type 2 diabetes. The event provided an opportunity to share and learn from different approaches in the UK, Türkiye and Finland, highlighting initiatives to strengthen research in the nutritional sciences and translation of that research into nutrition policy. The presenters provided evidence of the links between nutrition and disease risk and emphasised the importance of minimising risk and implementing early treatment of diet-related disease. Suggestions were made including improving health literacy and strengthening policies to improve the quality of food production and dietary behaviour. A multidisciplinary approach is needed whereby Governments, the food industry, non-governmental groups and consumer groups collaborate to develop evidence-based recommendations and appropriate joined-up policies that do not widen inequalities. This summary of the proceedings will serve as a gateway for those seeking to access additional information on nutrition and health across the globe.
To describe national trends in testing and detection of carbapenemases produced by carbapenem-resistant Enterobacterales (CRE) and associate testing with culture and facility characteristics.
Design:
Retrospective cohort study.
Setting:
Department of Veterans’ Affairs medical centers (VAMCs).
Participants:
Patients seen at VAMCs between 2013 and 2018 with cultures positive for CRE, defined by national VA guidelines.
Interventions:
Microbiology and clinical data were extracted from national VA data sets. Carbapenemase testing was summarized using descriptive statistics. Characteristics associated with carbapenemase testing were assessed with bivariate analyses.
Results:
Of 5,778 standard cultures that grew CRE, 1,905 (33.0%) had evidence of molecular or phenotypic carbapenemase testing and 1,603 (84.1%) of these had carbapenemases detected. Among these cultures confirmed as carbapenemase-producing CRE, 1,053 (65.7%) had molecular testing for ≥1 gene. Almost all testing included KPC (n = 1,047, 99.4%), with KPC detected in 914 of 1,047 (87.3%) cultures. Testing and detection of other enzymes was less frequent. Carbapenemase testing increased over the study period from 23.5% of CRE cultures in 2013 to 58.9% in 2018. The South US Census region (38.6%) and the Northeast region (37.2%) had the highest proportions of CRE cultures with carbapenemase testing. High-complexity (vs low) and urban (vs rural) facilities were significantly associated with carbapenemase testing (P < .0001).
Conclusions:
Between 2013 and 2018, carbapenemase testing and detection increased in the VA, largely reflecting increased testing and detection of KPC. Surveillance of other carbapenemases is important due to global spread and increasing antibiotic resistance. Efforts supporting the expansion of carbapenemase testing to low-complexity, rural healthcare facilities and standardization of reporting of carbapenemase testing are needed.
To evaluate opportunities for assessing penicillin allergies among patients presenting to dental clinics.
Design:
Retrospective cross-sectional study.
Setting:
VA dental clinics.
Patients:
Adult patients with a documented penicillin allergy who received an antibiotic from a dentist between January 1, 2015, and December 31, 2018, were included.
Methods:
Chart reviews were completed on random samples of 100 patients who received a noncephalosporin antibiotic and 200 patients who received a cephalosporin. Each allergy was categorized by severity. These categories were used to determine patient eligibility for 3 testing groups based on peer-reviewed algorithms: (1) no testing, (2) skin testing, and (3) oral test-dose challenge. Descriptive and bivariate statistics were used to compare facility and patient demographics first between true penicillin allergy, pseudo penicillin allergy, and missing allergy documentation, and between those who received a cephalosporin and those who did not at the dental visit.
Results:
Overall, 19% of patients lacked documentation of the nature of the allergic reaction, 53% were eligible for skin testing, 27% were eligible for an oral test-dose challenge, and 1% had contraindications to testing. Male patients and African American patients were less likely to receive a cephalosporin.
Conclusions:
Most penicillin-allergic patients in the VA receiving an antibiotic from a dentist are eligible for penicillin skin testing or an oral penicillin challenge. Further research is needed to understand the role of dentists and dental clinics in assessing penicillin allergies.
Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated the potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Based on the results, weather conditions had no consistent impact on weed seed shatter. However, there was a positive correlation between individual weed plant biomass and delayed weed seed–shattering rates during harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs of plants that have escaped early-season management practices and retained seed through harvest. However, smaller individuals of plants within the same population that shatter seed before harvest pose a risk of escaping early-season management and HWSC.
Healthcare personnel (HCP) with unprotected exposures to aerosol-generating procedures (AGPs) on patients with coronavirus disease 2019 (COVID-19) are at risk of infection with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). A retrospective review at an academic medical center demonstrated an infection rate of <1% among HCP involved in AGPs without a respirator and/or eye protection.
We present the data and initial results from the first pilot survey of the Evolutionary Map of the Universe (EMU), observed at 944 MHz with the Australian Square Kilometre Array Pathfinder (ASKAP) telescope. The survey covers $270 \,\mathrm{deg}^2$ of an area covered by the Dark Energy Survey, reaching a depth of 25–30 $\mu\mathrm{Jy\ beam}^{-1}$ rms at a spatial resolution of $\sim$11–18 arcsec, resulting in a catalogue of $\sim$220 000 sources, of which $\sim$180 000 are single-component sources. Here we present the catalogue of single-component sources, together with (where available) optical and infrared cross-identifications, classifications, and redshifts. This survey explores a new region of parameter space compared to previous surveys. Specifically, the EMU Pilot Survey has a high density of sources, and also a high sensitivity to low surface brightness emission. These properties result in the detection of types of sources that were rarely seen in or absent from previous surveys. We present some of these new results here.
We describe the glacial geomorphology and initial geochronology of two ice-free valley systems within the Neptune Range of the Pensacola Mountains, Antarctica. These valleys are characterized by landforms associated with formerly more expanded ice sheet(s) that were at least 200 m thicker than at present. The most conspicuous features are areas of supraglacial debris, discrete debris accumulations separated from modern-day ice and curvilinear ridges and mounds. The landsystem bears similarities to debris-rich cold-based glacial landsystems described elsewhere in Antarctica and the Arctic where buried ice is prevalent. Geochronological data demonstrate multiple phases of ice expansion. The oldest, occurring > 3 Ma, overtopped much of the landscape. Subsequent, less expansive advances into the valleys occurred > 2 Ma and > ~1 Ma. An expansion of some local glaciers occurred < 250 ka. This sequence of glacial stages is similar to that described from the northernmost massif of the Pensacola Mountains (Dufek Massif), suggesting that it represents a regional signal of ice-sheet evolution over the Plio-Pleistocene. The geomorphological record and its evolution over millions of years makes the Neptune Range valleys an area worthy of future research and we highlight potential avenues for this.
United States dentists prescribe 10% of all outpatient antibiotics. Assessing appropriateness of antibiotic prescribing has been challenging due to a lack of guidelines for oral infections. In 2019, the American Dental Association (ADA) published clinical practice guidelines (CPG) on the management of acute oral infections. Our objective was to describe baseline national antibiotic prescribing for acute oral infections prior to the release of the ADA CPG and to identify patient-level variables associated with an antibiotic prescription.
Design:
Cross-sectional analysis.
Methods:
We performed an analysis of national VA data from January 1, 2017, to December 31, 2017. We identified cases of acute oral infections using International Classification of Disease, Tenth Revision, Clinical Modification (ICD-10-CM) codes. Antibiotics prescribed by a dentist within ±7 days of a visit were included. Multivariable logistic regression identified patient-level variables associated with an antibiotic prescription.
Results:
Of the 470,039 VA dental visits with oral infections coded, 12% of patient visits with irreversible pulpitis, 17% with apical periodontitis, and 28% with acute apical abscess received antibiotics. Although the median days’ supply was 7, prolonged use of antibiotics was frequent (≥8 days, 42%–49%). Patients with high-risk cardiac conditions, prosthetic joints, and endodontic, implant, and oral and maxillofacial surgery dental procedures were more likely to receive antibiotics.
Conclusions:
Most treatments of irreversible pulpitis and apical periodontitis cases were concordant with the new ADA guidelines. However, when antibiotics were prescribed, prolonged courses of >7 days were frequent. These findings demonstrate opportunities for the new ADA guidelines to standardize and improve dental prescribing practices.
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention likely is influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds in southern latitudes and shatter rate increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.