Racial and ethnic variations in antibiotic utilization are well documented in outpatient settings, but little is known about inpatient settings. Our objective was to describe national inpatient antibiotic utilization among children by race and ethnicity.
Methods:
This study included hospital visit data from the Pediatric Health Information System between 01/01/2022 and 12/31/2022 for patients aged <20 years. Primary outcomes were the percentage of hospitalization encounters that received an antibiotic and antibiotic days of therapy (DOT) per 1000 patient days. Mixed-effects regression models were used to determine the association of race-ethnicity with outcomes, adjusting for covariates.
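As a rough illustration of this kind of model (a sketch under assumed variable names, not the study's actual code or data), a linear mixed-effects model of DOT with a random intercept per hospital could be fitted in Python with statsmodels:

```python
# Hypothetical sketch, not the study's code: the file name and column names
# (dot, race_ethnicity, age_group, payer, hospital_id) are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

visits = pd.read_csv("phis_2022_encounters.csv")  # assumed: one row per hospitalization

# Linear mixed model: DOT regressed on race/ethnicity plus covariates,
# with a random intercept per hospital to account for clustering.
model = smf.mixedlm(
    "dot ~ C(race_ethnicity, Treatment('NH White')) + C(age_group) + C(payer)",
    data=visits,
    groups=visits["hospital_id"],
)
result = model.fit()
print(result.summary())
```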
Results:
There were 846,530 hospitalizations. Of these children, 45.2% were Non-Hispanic (NH) White, 27.1% were Hispanic, 19.2% were NH Black, 4.5% were NH Other, 3.5% were NH Asian, 0.3% were NH Native Hawaiian/Other Pacific Islander (NHPI), and 0.2% were NH American Indian. Adjusting for covariates, NH Black children had lower odds of receiving antibiotics compared to NH White children (aOR 0.96, 95% CI 0.94–0.97), while NH NHPI children had higher odds of receiving antibiotics (aOR 1.16, 95% CI 1.05–1.29). Children who were Hispanic, NH Asian, NH American Indian, or NH Other received fewer antibiotic DOT compared to NH White children, while NH NHPI children received more antibiotic DOT.
Conclusions:
Antibiotic utilization in children’s hospitals differs by race and ethnicity. Hospitals should assess policies and practices that may contribute to disparities in treatment; antibiotic stewardship programs may play an important role in promoting inpatient pharmacoequity. Additional research is needed to examine individual diagnoses, clinical outcomes, and drivers of variation.
The terminal Ediacaran Period is signaled worldwide by the first appearance of skeletonizing tubular metazoan fossils, e.g., Cloudina Germs, 1972 and Sinotubulites Chen, Chen, and Qian, 1981. Although recent efforts have focused on evaluating the taxic composition and preservation of such assemblages from the southwestern United States, comparable forms reported in the 1980s from Mexico remain to be re-examined. Here, we reassess the latest Ediacaran skeletal materials from the La Ciénega Formation of the Caborca region in Sonora, Mexico, using a combination of analytical methods: optical microscopy of extracted fossils, thin-section petrography, scanning electron microscopy and energy dispersive X-ray spectroscopy, and X-ray tomographic microscopy. From our examination, we conclude that the La Ciénega Formation hosts a polytaxic assemblage of latest Ediacaran tubular organisms preserved through two taphonomic pathways: coarse silicification and calcareous recrystallization that preserves finer details. Further, these fossils show signs that their shells may have been flexible or incompletely mineralized in vivo, and they may also record tentatively interpreted predation traces in the form of drill holes or puncture marks. This work, along with ongoing efforts around the world, helps to provide a framework for biostratigraphic correlation and possible subdivision of the Ediacaran Period, and further shapes our view of metazoan evolution and ecology in the interval directly preceding the Cambrian explosion.
The status of the genera Euparagonimus Chen, 1963 and Pagumogonimus Chen, 1963 relative to Paragonimus Braun, 1899 was investigated using DNA sequences from the mitochondrial cytochrome c oxidase subunit I (CO1) gene (partial) and the nuclear ribosomal DNA second internal transcribed spacer (ITS2). In the phylogenetic trees constructed, the genus Pagumogonimus is clearly not monophyletic and therefore not a natural taxon. Indeed, the type species of Pagumogonimus, P. skrjabini from China, is very closely related to Paragonimus miyazakii from Japan. The status of Euparagonimus is less obvious. Euparagonimus cenocopiosus lies distant from other lung flukes included in the analysis. It can be placed as sister to Paragonimus in some analyses and falls within the genus in others. A recently published morphological study placed E. cenocopiosus within the genus Paragonimus, and this is probably where it should remain.
As the scale of cosmological surveys increases, so does the complexity of their analyses. This complexity can make it difficult to understand an analysis from first principles, necessitating statistically rigorous testing to ensure its results are consistent and reasonable. This is particularly important in multi-probe cosmological analyses like those used in the Dark Energy Survey (DES) and the upcoming Legacy Survey of Space and Time, where accurate uncertainties are vital. In this paper, we present a statistically rigorous method to test the consistency of contours produced in these analyses and apply this method to the Pippin cosmological pipeline used for type Ia supernova cosmology with the DES. We make use of the Neyman construction, a frequentist methodology that leverages extensive simulations to calculate confidence intervals, to perform this consistency check. A true Neyman construction is too computationally expensive for supernova cosmology, so we develop a method for approximating a Neyman construction with far fewer simulations. We find that, for a simulated dataset, the 68% contour reported by the Pippin pipeline and the 68% confidence region produced by our approximate Neyman construction differ by less than one percent near the input cosmology; however, they show more significant differences far from the input cosmology, with a maximum difference of 0.05 in $\Omega_{M}$ and 0.07 in w. This divergence is most impactful for analyses of cosmological tensions, but its impact is mitigated when combining supernovae with other cross-cutting cosmological probes, such as the cosmic microwave background.
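To make the idea concrete, here is a minimal, purely illustrative sketch of a simulation-based Neyman construction in one dimension (toy numbers and a Gaussian stand-in for the mock-survey fits; this is not the Pippin pipeline or the approximation developed in the paper):

```python
# Toy 1-D Neyman construction: for each candidate true value of w, simulate the
# estimator distribution, form a central 68% acceptance interval, then keep every
# true value whose interval contains the observed estimate. All numbers are made up.
import numpy as np

rng = np.random.default_rng(0)
truth_grid = np.linspace(-1.4, -0.6, 81)   # hypothetical grid of true w values
n_sims, sigma = 500, 0.05                  # mocks per grid point, toy estimator scatter
w_observed = -1.02                         # toy "measured" best-fit value

confidence_set = []
for w_true in truth_grid:
    estimates = w_true + sigma * rng.normal(size=n_sims)   # stand-in for full mock fits
    lo, hi = np.percentile(estimates, [16, 84])            # central 68% acceptance band
    if lo <= w_observed <= hi:
        confidence_set.append(w_true)

print(f"68% interval for w: [{min(confidence_set):.3f}, {max(confidence_set):.3f}]")
```

The inversion step (keeping every true value that "accepts" the observation) is what distinguishes a Neyman construction from simply quoting the scatter of the estimator at a single fiducial cosmology.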
The quenching of cluster satellite galaxies is inextricably linked to the suppression of their cold interstellar medium (ISM) by environmental mechanisms. While the removal of neutral atomic hydrogen (H i) at large radii is well studied, how the environment impacts the remaining gas in the centres of galaxies, which are dominated by molecular gas, is less clear. Using new observations from the Virgo Environment traced in CO survey (VERTICO) and archival H i data, we study the H i and molecular gas within the optical discs of Virgo cluster galaxies on 1.2-kpc scales with spatially resolved scaling relations between stellar ($\Sigma_{\star}$), H i ($\Sigma_{\text{H}\,{\small\text{I}}}$), and molecular gas ($\Sigma_{\text{mol}}$) surface densities. Adopting H i deficiency as a measure of environmental impact, we find evidence that, in addition to removing the H i at large radii, the cluster processes also lower the average $\Sigma_{\text{H}\,{\small\text{I}}}$ of the remaining gas even in the central $1.2\,$kpc. The impact on molecular gas is comparatively weaker than on the H i, and we show that the lower $\Sigma_{\text{mol}}$ gas is removed first. In the most H i-deficient galaxies, however, we find evidence that environmental processes reduce the typical $\Sigma_{\text{mol}}$ of the remaining gas by nearly a factor of 3. We find no evidence for environment-driven elevation of $\Sigma_{\text{H}\,{\small\text{I}}}$ or $\Sigma_{\text{mol}}$ in H i-deficient galaxies. Using the ratio of $\Sigma_{\text{mol}}$-to-$\Sigma_{\text{H}\,{\small\text{I}}}$ in individual regions, we show that changes in the ISM physical conditions, estimated using the total gas surface density and midplane hydrostatic pressure, cannot explain the observed reduction in molecular gas content. Instead, we suggest that direct stripping of the molecular gas is required to explain our results.
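For reference, one commonly used estimator of the midplane hydrostatic pressure in a two-component (gas plus stellar) disc, given here only as an illustration and not necessarily in the exact form adopted by VERTICO, is
$$ P_{\mathrm{h}} \approx \frac{\pi}{2}\, G\, \Sigma_{\mathrm{gas}} \left( \Sigma_{\mathrm{gas}} + \frac{\sigma_{\mathrm{gas}}}{\sigma_{\star}}\, \Sigma_{\star} \right), $$
where $\Sigma_{\mathrm{gas}}$ is the total gas surface density and $\sigma_{\mathrm{gas}}$ and $\sigma_{\star}$ are the vertical velocity dispersions of the gas and stars, respectively.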
Rabies virus (RABV) is a deadly zoonosis that circulates in wild carnivore populations in North America. Intensive management within the USA and Canada has been conducted to control the spread of the raccoon (Procyon lotor) variant of RABV and work towards elimination. We examined RABV occurrence across the northeastern USA and southeastern Québec, Canada during 2008–2018 using a multi-method, dynamic occupancy model. Using a 10 km × 10 km grid overlaid on the landscape, we examined the probability that a grid cell was occupied by RABV and relationships with management activities (oral rabies vaccination (ORV) and trap-vaccinate-release efforts), habitat, neighbour effects and temporal trends. We compared raccoon RABV detection probabilities between different surveillance samples (e.g. strange-acting animals, road-kill, public health samples). The management of RABV through ORV was found to be the greatest driver in reducing the occurrence of rabies on the landscape. Additionally, RABV occupancy declined further with increasing duration of ORV baiting programmes. Grid cells north of ORV management were at or near elimination ($\hat{\psi }_{{\rm north}}$ = 0.00, s.e. = 0.15), managed areas had low RABV occupancy ($\hat{\psi }_{{\rm managed}}$ = 0.20, s.e. = 0.29) and enzootic areas had the highest level of RABV occupancy ($\hat{\psi }_{{\rm south}}$ = 0.83, s.e. = 0.06). These results provide evidence that past management actions have been successful in reducing and controlling the raccoon variant of RABV. At a finer scale, we also found that vaccine bait type and bait density impacted RABV occupancy. Detection probabilities varied; samples from strange-acting animals and public health sources had the highest detection rates. Our results support moving the ORV zone south within the USA, given the high elimination probabilities along the US border with Québec. Additional enhanced rabies surveillance is still needed to ensure elimination is maintained.
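To illustrate the occupancy-model logic in miniature (a toy sketch with made-up detection probabilities, not the authors' fitted multi-method dynamic model), the probability of an all-zero detection history in a grid cell mixes "truly unoccupied" with "occupied but missed by every surveillance sample":

```python
# Toy sketch of the occupancy idea: a cell is occupied with probability psi; each
# surveillance sample of type m detects RABV with probability p_m when the cell is
# occupied. All probability values below are illustrative assumptions.
import numpy as np

def prob_no_detection(psi, p_by_method, n_by_method):
    """P(all-zero detection history) for one cell in one season."""
    miss = np.prod([(1.0 - p) ** n for p, n in zip(p_by_method, n_by_method)])
    return (1.0 - psi) + psi * miss

# e.g. strange-acting animal, road-kill, public health samples (hypothetical p values)
print(prob_no_detection(psi=0.2, p_by_method=[0.6, 0.2, 0.5], n_by_method=[1, 3, 1]))
```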
The impact of the coronavirus disease 2019 (COVID-19) pandemic on mental health is still being unravelled. It is important to identify which individuals are at greatest risk of worsening symptoms. This study aimed to examine changes in depression, anxiety and post-traumatic stress disorder (PTSD) symptoms using prospective and retrospective symptom change assessments, and to identify key risk factors and examine their effects.
Method
Online questionnaires were administered to 34 465 individuals (aged 16 years or above) in April/May 2020 in the UK, recruited from existing cohorts or via social media. Around one-third (n = 12 718) of included participants had prior diagnoses of depression or anxiety and had completed pre-pandemic mental health assessments (between September 2018 and February 2020), allowing prospective investigation of symptom change.
Results
Prospective symptom analyses showed small decreases in depression (PHQ-9: −0.43 points) and anxiety [generalised anxiety disorder scale, 7 items (GAD-7): −0.33 points] and increases in PTSD (PCL-6: 0.22 points). Conversely, retrospective symptom analyses demonstrated significant large increases (PHQ-9: 2.40; GAD-7: 1.97), with 55% reporting worsening mental health since the beginning of the pandemic on a global change rating. Across both prospective and retrospective measures of symptom change, worsening depression, anxiety and PTSD symptoms were associated with prior mental health diagnoses, female gender, young age and unemployed/student status.
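The difference between the two change measures can be illustrated with a toy calculation (made-up scores, not study data): the prospective change compares scores measured before and during the pandemic, whereas the retrospective change relies on scores recalled at the during-pandemic assessment, which recall bias can deflate.

```python
# Toy scores, not study data: recalled pre-pandemic scores sit below the measured
# ones, so the retrospective change looks larger than the prospective change.
import pandas as pd

df = pd.DataFrame({
    "phq9_prepandemic":          [6, 12, 3],   # measured Sept 2018 - Feb 2020
    "phq9_during":               [5, 11, 3],   # measured April/May 2020
    "phq9_recalled_prepandemic": [3, 8, 1],    # recalled at the 2020 assessment
})
df["prospective_change"]   = df["phq9_during"] - df["phq9_prepandemic"]
df["retrospective_change"] = df["phq9_during"] - df["phq9_recalled_prepandemic"]
print(df[["prospective_change", "retrospective_change"]].mean())
```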
Conclusions
We highlight the effect of prior mental health diagnoses on worsening mental health during the pandemic and confirm previously reported sociodemographic risk factors. Discrepancies between prospective and retrospective measures of changes in mental health may be related to underestimation of prior symptom severity due to recall bias.
To describe nutrition and physical activity practices, nutrition self-efficacy and barriers, and food programme knowledge within Family Child Care Homes (FCCH), and differences by staffing.
Design:
Baseline, cross-sectional analyses of the Happy Healthy Homes randomised trial (NCT03560050).
Setting:
FCCH in Oklahoma, USA.
Participants:
FCCH providers (n 49; 100 % women; 30·6 % Non-Hispanic Black, 2·0 % Hispanic, 4·1 % American Indian/Alaska Native, 51·0 % Non-Hispanic White; 44·2 ± 14·2 years of age; 53·1 % had additional staff) self-reported nutrition and physical activity practices and policies, nutrition self-efficacy and barriers and food programme knowledge. Differences between providers with and without additional staff were adjusted for multiple comparisons (P < 0·01).
Results:
The prevalence of meeting all nutrition and physical activity best practices ranged from 0·0 to 43·8 % and from 4·1 to 16·7 %, respectively. Average nutrition and physical activity scores were 3·2 ± 0·3 and 3·0 ± 0·5 (max 4·0), respectively. Sum nutrition and physical activity scores were 137·5 ± 12·6 (max 172·0) and 48·4 ± 7·5 (max 64·0), respectively. Providers reported high nutrition self-efficacy and few barriers. The majority of providers (73·9–84·7 %) felt that they could meet food programme best practices; however, knowledge of food programme best practices was lower than anticipated (median 63–67 % accuracy). Providers with additional staff had higher self-efficacy in family-style meal service than those without additional staff (P = 0·006).
Conclusions:
Providers had high self-efficacy in meeting nutrition best practices and reported few barriers. While providers were successfully meeting some individual best practices, few met all. Few differences were observed between FCCH providers with and without additional staff. FCCH providers need additional nutrition training on implementation of best practices.
Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated the potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Based on the results, weather conditions had no consistent impact on weed seed shatter. However, individual weed plant biomass was positively correlated with delayed seed shatter at harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs from plants that have escaped early-season management practices and retained seed through harvest. However, smaller individuals within the same population that shatter seed before harvest risk escaping both early-season management and HWSC.
We present an overview of the Middle Ages Galaxy Properties with Integral Field Spectroscopy (MAGPI) survey, a Large Program on the European Southern Observatory Very Large Telescope. MAGPI is designed to study the physical drivers of galaxy transformation at a lookback time of 3–4 Gyr, during which the dynamical, morphological, and chemical properties of galaxies are predicted to evolve significantly. The survey uses new medium-deep adaptive optics aided Multi-Unit Spectroscopic Explorer (MUSE) observations of fields selected from the Galaxy and Mass Assembly (GAMA) survey, providing a wealth of publicly available ancillary multi-wavelength data. With these data, MAGPI will map the kinematic and chemical properties of stars and ionised gas for a sample of 60 massive (${>}7 \times 10^{10} {\mathrm{M}}_\odot$) central galaxies at $0.25 < z <0.35$ in a representative range of environments (isolated, groups and clusters). The spatial resolution delivered by MUSE with Ground Layer Adaptive Optics ($0.6-0.8$ arcsec FWHM) will facilitate a direct comparison with Integral Field Spectroscopy surveys of the nearby Universe, such as SAMI and MaNGA, and at higher redshifts using adaptive optics, for example, SINS. In addition to the primary (central) galaxy sample, MAGPI will deliver resolved and unresolved spectra for as many as 150 satellite galaxies at $0.25 < z <0.35$, as well as hundreds of emission-line sources at $z < 6$. This paper outlines the science goals, survey design, and observing strategy of MAGPI. We also present a first look at the MAGPI data, and the theoretical framework to which MAGPI data will be compared using the current generation of cosmological hydrodynamical simulations including EAGLE, Magneticum, HORIZON-AGN, and Illustris-TNG. Our results show that cosmological hydrodynamical simulations make discrepant predictions in the spatially resolved properties of galaxies at $z\approx 0.3$. MAGPI observations will place new constraints and allow for tangible improvements in galaxy formation theory.
Maternal nutrition is critical in mammalian development, influencing the epigenetic reprogramming of gametes and embryos as well as fetal programming. We evaluated the effects of different levels of sulfur (S) and cobalt (Co) in the maternal diet throughout the pre- and periconceptional periods on the biochemical and reproductive parameters of the donors and the DNA methylome of the progeny in Bos indicus cattle. The low-S/Co group differed from the control with respect to homocysteine, folic acid, B12, insulin-like growth factor 1, and glucose. The oocyte yield was lower in heifers from the low-S/Co group than in the control heifers. Embryos from the low-S/Co group exhibited 2320 differentially methylated regions (DMRs) across the genome compared with the control embryos. We also characterized candidate DMRs linked to the DNMT1 and DNMT3B genes in the blood and sperm cells of the adult progeny. A DMR located in DNMT1 that was identified in embryos remained differentially methylated in the sperm of the progeny from the low-S/Co group. Therefore, we associated changes in specific compounds in the maternal diet with DNA methylation modifications in the progeny. Our results help to elucidate the impact of maternal nutrition on epigenetic reprogramming in livestock, opening new avenues of research to study the effect of disturbed epigenetic patterns in early life on health and fertility in adulthood. Considering that cattle are physiologically similar to humans with respect to gestational length, our study may serve as a model for studies related to the developmental origin of health and disease in humans.
Susceptibility to infections such as SARS-CoV-2 may be influenced by host genotype. TwinsUK volunteers (n = 3261) who completed the C-19 COVID-19 symptom tracker app enabled classical twin studies of COVID-19 symptoms, including predicted COVID-19, a symptom-based algorithm to predict true infection derived from app users tested for SARS-CoV-2. We found heritability of 49% (32–64%) for delirium; 34% (20–47%) for diarrhea; 31% (8–52%) for fatigue; 19% (0–38%) for anosmia; 46% (31–60%) for skipped meals and 31% (11–48%) for predicted COVID-19. Heritability estimates were not affected by cohabiting or by social deprivation. The results suggest the importance of host genetics in the risk of clinical manifestations of COVID-19 and provide grounds for planning genome-wide association studies to establish specific genes involved in viral infectivity and the host immune response.
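As a back-of-envelope illustration of the classical twin-study logic (Falconer's approximation, with made-up correlations; the study itself would typically use structural-equation twin models):

```python
# Falconer's approximation with hypothetical twin-pair correlations (not TwinsUK
# estimates): heritability is roughly twice the MZ-DZ correlation difference.
r_mz, r_dz = 0.45, 0.25      # hypothetical monozygotic and dizygotic correlations
h2 = 2 * (r_mz - r_dz)       # additive genetic variance (A)
c2 = 2 * r_dz - r_mz         # shared environment (C)
e2 = 1 - r_mz                # unique environment (E)
print(f"A = {h2:.2f}, C = {c2:.2f}, E = {e2:.2f}")   # A = 0.40, C = 0.05, E = 0.55
```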
To examine children’s sugar-sweetened beverage (SSB) and water intakes in relation to implemented intervention activities across the social ecological model (SEM) during a multilevel community trial.
Design:
Children’s Healthy Living was a multilevel, multicomponent community trial that reduced young child obesity (2013–2015). Baseline and 24-month cross-sectional data were analysed from nine intervention arm communities. Implemented intervention activities targeting reduced SSB and increased water consumption were coded by SEM level (child, caregiver, organisation, community and policy). Child SSB and water intakes were assessed by caregiver-completed 2-day dietary records. Multilevel linear regression models examined associations of changes in beverage intakes with activity frequencies at each SEM level.
Setting:
US-Affiliated Pacific region.
Participants:
Children aged 2–8 years (baseline: n 1343; 24 months: n 1158).
Results:
On average (± sd), communities implemented 74 ± 39 SSB and 72 ± 40 water activities. More than 90 % of activities targeted both beverages together. Community-level activities (e.g. social marketing campaign) were most common (61 % of total activities), and child-level activities (e.g. sugar counting game) were least common (4 %). SSB activities across SEM levels were not associated with SSB intake changes. Additional community-level water activities were associated with increased water intake (0·62 ml/d/activity; 95 % CI: 0·09, 1·15) and water-for-SSB substitution (operationalised as SSB minus water: –0·88 ml/d/activity; 95 % CI: –1·72, –0·03). Activities implemented at the organisation level (e.g. strengthening preschool wellness guidelines) and policy level (e.g. SSB tax advocacy) also suggested greater water-for-SSB substitution (P < 0·10).
Conclusions:
Community-level intervention activities were associated with increased water intake, alone and relative to SSB intake, among young children in the Pacific region.
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention likely is influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds in southern latitudes and shatter rate increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
Background: With the emergence of antibiotic-resistant threats and the need for appropriate antibiotic use, laboratory microbiology information is important to guide clinical decision making in nursing homes, where access to such data can be limited. Susceptibility data are necessary to inform antibiotic selection and to monitor changes in resistance patterns over time. To contribute to existing data that describe antibiotic resistance among nursing home residents, we summarized antibiotic susceptibility data from organisms commonly isolated from urine cultures collected as part of the CDC multistate Emerging Infections Program (EIP) nursing home prevalence survey. Methods: In 2017, urine culture and antibiotic susceptibility data for selected organisms were retrospectively collected from nursing home residents' medical records by trained EIP staff. Urine culture results reported as negative (no growth) or contaminated were excluded. Susceptibility results were recorded as susceptible, non-susceptible (resistant or intermediate), or not tested. The pooled mean percentage tested and percentage non-susceptible were calculated for selected antibiotic agents and classes using available data. Susceptibility data were analyzed for organisms with ≥20 isolates. The definition of multidrug resistance (MDR) was based on the CDC and European Centre for Disease Prevention and Control's interim standard definitions. Data were analyzed using SAS v 9.4 software. Results: Among 161 participating nursing homes and 15,276 residents, 300 residents (2.0%) had documentation of a urine culture at the time of the survey, and 229 (76.3%) were positive. Escherichia coli, Proteus mirabilis, Klebsiella spp., and Enterococcus spp. represented 73.0% of all urine isolates (N = 278). There were 215 (77.3%) isolates with reported susceptibility data (Fig. 1). Of these, data were analyzed for 187 (87.0%) (Fig. 2). All isolates tested for carbapenems were susceptible. Fluoroquinolone non-susceptibility was most prevalent among E. coli (42.9%) and P. mirabilis (55.9%). Among Klebsiella spp., the highest percentages of non-susceptibility were observed for extended-spectrum cephalosporins and folate pathway inhibitors (25.0% each). Glycopeptide non-susceptibility was 10.0% for Enterococcus spp. The percentage of isolates classified as MDR ranged from 10.1% for E. coli to 14.7% for P. mirabilis. Conclusions: Substantial levels of non-susceptibility were observed for nursing home residents' urine isolates, with 10% to 56% reported as non-susceptible to the antibiotics assessed. Non-susceptibility was highest for fluoroquinolones, an antibiotic class commonly used in nursing homes, and ≥10% of selected isolates were MDR. Our findings reinforce the importance of nursing homes using susceptibility data from laboratory service providers to guide antibiotic prescribing and to monitor levels of resistance.
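As a toy illustration of the summary measures described above (made-up isolates, not EIP data), percentage tested and percentage non-susceptible can be computed per organism as follows; note the real analysis also restricted to organisms with ≥20 isolates, which is omitted here:

```python
# Made-up isolates, not EIP data: percentage tested and percentage non-susceptible
# per organism, pooling resistant (R) and intermediate (I) results as non-susceptible.
import pandas as pd

isolates = pd.DataFrame({
    "organism": ["E. coli"] * 5 + ["P. mirabilis"] * 3,
    "fluoroquinolone": ["S", "R", "I", "not tested", "R", "S", "R", "R"],
})
for organism, grp in isolates.groupby("organism"):
    tested = grp[grp["fluoroquinolone"] != "not tested"]
    pct_tested = 100 * len(tested) / len(grp)
    pct_ns = 100 * tested["fluoroquinolone"].isin(["R", "I"]).mean()
    print(f"{organism}: {pct_tested:.0f}% tested, {pct_ns:.0f}% non-susceptible")
```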
Background: Antibiotics are among the most commonly prescribed drugs in nursing homes; urinary tract infections (UTIs) are a frequent indication. Although there is no gold standard for the diagnosis of UTIs, various criteria have been developed to inform and standardize nursing home prescribing decisions, with the goal of reducing unnecessary antibiotic prescribing. Using different published criteria designed to guide decisions on initiating treatment of UTIs (ie, symptomatic, catheter-associated, and uncomplicated cystitis), our objective was to assess the appropriateness of antibiotic prescribing among nursing home residents. Methods: In 2017, the CDC Emerging Infections Program (EIP) performed a prevalence survey of healthcare-associated infections and antibiotic use in 161 nursing homes from 10 states: California, Colorado, Connecticut, Georgia, Maryland, Minnesota, New Mexico, New York, Oregon, and Tennessee. EIP staff reviewed resident medical records to collect demographic and clinical information, infection signs, symptoms, and diagnostic testing documented on the day an antibiotic was initiated and 6 days prior. We applied 4 criteria to determine whether initiation of treatment for UTI was supported: (1) the Loeb minimum clinical criteria (Loeb); (2) the Suspected UTI Situation, Background, Assessment, and Recommendation tool (UTI SBAR tool); (3) adaptation of Infectious Diseases Society of America UTI treatment guidelines for nursing home residents (Crnich & Drinka); and (4) diagnostic criteria for uncomplicated cystitis (cystitis consensus) (Fig. 1). We calculated the percentage of residents for whom initiating UTI treatment was appropriate by these criteria. Results: Of 248 residents for whom UTI treatment was initiated in the nursing home, the median age was 79 years [IQR, 19], 63% were female, and 35% were admitted for postacute care. There was substantial variability in the percentage of residents with antibiotic initiation classified as appropriate by each of the criteria, ranging from 8% for the cystitis consensus, to 27% for Loeb, to 33% for the UTI SBAR tool, to 51% for Crnich and Drinka (Fig. 2). Conclusions: Appropriate initiation of UTI treatment among nursing home residents remained low regardless of the criteria used. At best, only half of antibiotic treatments met published prescribing criteria. Although insufficient documentation of infection signs, symptoms, and testing may have contributed to the low percentages observed, adequate documentation in the medical record to support prescribing should be standard practice, as outlined in the CDC Core Elements of Antibiotic Stewardship for nursing homes. Standardized UTI prescribing criteria should be incorporated into nursing home stewardship activities to improve the assessment and documentation of symptomatic UTI and to reduce inappropriate antibiotic use.
To disrupt cycles of health inequity, traceable to dietary inequities in the earliest stages of life, public health interventions should target improving nutritional wellbeing in preconception/pregnancy environments. This requires a deep engagement with pregnant/postpartum people (PPP) and their communities (including their health and social care providers, HSCP). We sought to understand the factors that influence diet during pregnancy from the perspectives of PPP and HSCP, and to outline intervention priorities.
Design:
We carried out thematic network analyses of transcripts from ten focus group discussions (FGD) and one stakeholder engagement meeting with PPP and HSCP in a Canadian city. Identified themes were developed into conceptual maps, highlighting local priorities for pregnancy nutrition and intervention development.
Setting:
FGD and the stakeholder meeting were run in predominantly lower socioeconomic position (SEP) neighbourhoods in the sociodemographically diverse city of Hamilton, Canada.
Participants:
All participants were local, comprising twenty-two lower-SEP PPP and forty-three HSCP.
Results:
Salient themes were resilience, resources, relationships and the embodied experience of pregnancy. Both PPP and HSCP underscored that socioeconomic-political forces operating at multiple levels largely determined the availability of individual and relational resources constraining diet during pregnancy. Intervention proposals focused on cultivating individual and community resilience to improve early-life nutritional environments. Participants called for better-integrated services, greater income supports and strengthened support programmes.
Conclusions:
Hamilton stakeholders foregrounded social determinants of inequity as main factors influencing pregnancy diet. They further indicated a need to develop interventions that build resilience and redistribute resources at multiple levels, from the household to the state.
Introduction: CAEP recently developed the acute atrial fibrillation (AF) and flutter (AFL) [AAFF] Best Practices Checklist to promote optimal care and guidance on cardioversion and rapid discharge of patients with AAFF. We sought to assess the impact of implementing the Checklist in large Canadian EDs. Methods: We conducted a pragmatic stepped-wedge cluster randomized trial in 11 large Canadian ED sites in five provinces, over 14 months. All hospitals started in the control period (usual care) and then crossed over to the intervention period in random sequence, one hospital per month. We enrolled consecutive, stable patients presenting with AAFF whose symptoms required ED management. Our intervention was informed by qualitative stakeholder interviews to identify perceived barriers and enablers for rapid discharge of AAFF patients. The intervention components included local champions, presentation of the Checklist to physicians in group sessions, an online training module, a smartphone app, and targeted audit and feedback. The primary outcome was ED length of stay in minutes from time of arrival to time of disposition, analyzed at the individual patient level using linear mixed-effects regression accounting for the stepped-wedge design. We estimated a required sample size of 800 patients. Results: We enrolled 844 patients with none lost to follow-up. Those in the control (N = 316) and intervention periods (N = 528) were similar for all characteristics, including mean age (61.2 vs 64.2 yrs), duration of AAFF (8.1 vs 7.7 hrs), AF (88.6% vs 82.9%), AFL (11.4% vs 17.1%), and mean initial heart rate (119.6 vs 119.9 bpm). Median lengths of stay for the control and intervention periods were 413.0 vs 354.0 minutes, respectively (P < 0.001). Comparing control to intervention, there was an increase in use of antiarrhythmic drugs (37.4% vs 47.4%; P < 0.01), electrical cardioversion (45.1% vs 56.8%; P < 0.01), and discharge in sinus rhythm (75.3% vs 86.7%; P < 0.001). There was a decrease in ED consultations to cardiology and medicine (49.7% vs 41.1%; P < 0.01), and a small, nonsignificant increase in anticoagulant prescriptions (39.6% vs 46.5%; P = 0.21). Conclusion: This multicenter implementation of the CAEP Best Practices Checklist led to a significant decrease in ED length of stay along with more ED cardioversions, fewer ED consultations, and more discharges in sinus rhythm. Widespread and rigorous adoption of the CAEP Checklist should lead to improved care of AAFF patients in all Canadian EDs.
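A rough sketch of the primary analysis described above (assumed column names, not the trial's actual code): ED length of stay modelled at the patient level with a fixed effect for the intervention, fixed effects for calendar time step, and a random intercept per site, as in a standard stepped-wedge analysis.

```python
# Hedged sketch, not the trial's analysis code: column names
# (ed_los_minutes, intervention, time_step, site_id) are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

patients = pd.read_csv("aaff_patients.csv")          # hypothetical analysis dataset
fit = smf.mixedlm(
    "ed_los_minutes ~ intervention + C(time_step)",  # intervention: 0 control, 1 checklist
    data=patients,
    groups=patients["site_id"],                      # random intercept per ED site
).fit()
print(fit.params["intervention"])                    # adjusted change in LOS (minutes)
```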
Acute change in mental status (ACMS), defined by the Confusion Assessment Method, is used to identify infections in nursing home residents. A medical record review revealed that none of 15,276 residents had an ACMS documented. Using the revised McGeer criteria with a possible ACMS definition, we identified 296 residents with a possible ACMS and 21 additional infections. The use of a possible ACMS definition should be considered for retrospective nursing home infection surveillance.