Prospective audit and feedback (PAF) is an established practice in critical care settings but not in surgical populations. We pilot-tested a structured face-to-face PAF program for our acute-care surgery (ACS) service.
Methods:
This was a mixed-methods study. For the quantitative analysis, the structured PAF period was from August 1, 2017, to April 30, 2019. The ad hoc PAF period was from May 1, 2019, to January 31, 2021. Interrupted time-series segmented negative binomial regression analysis was used to evaluate change in antimicrobial usage measured in days of therapy per 1,000 patient days for all systemic and targeted antimicrobials. Secondary outcomes included C. difficile infections, length of stay and readmission within 30 days. Each secondary outcome was analyzed using a logistic regression or negative binomial regression model. For the qualitative analyses, all ACS surgeons and trainees from November 23, 2015, to April 30, 2019, were invited to participate in an email-based anonymous survey developed using implementation science principles. Responses were measured using counts.
Results:
In total, 776 ACS patients were included in the structured PAF period and 783 patients were included in the ad hoc PAF period. No significant changes in level or trend for antimicrobial usage were detected for all systemic antimicrobials or for targeted antimicrobials. Similarly, no significant differences were detected for secondary outcomes. The survey response rate was 25% (n = 10). Moreover, 50% of respondents agreed that PAF provided them with skills to use antimicrobials more judiciously, and 80% agreed that PAF improved the quality of antimicrobial treatment for their patients.
Conclusion:
Structured PAF showed clinical outcomes similar to ad hoc PAF. Structured PAF was well received and was perceived as beneficial by surgical staff.
To describe the evolution of respiratory antibiotic prescribing during the coronavirus disease 2019 (COVID-19) pandemic across 3 large hospitals that maintained antimicrobial stewardship services throughout the pandemic.
Design:
Retrospective interrupted time-series analysis.
Setting:
A multicenter study was conducted including medical and intensive care units (ICUs) from 3 hospitals within a Canadian epicenter for COVID-19.
Methods:
Interrupted time-series analysis was used to analyze rates of respiratory antibiotic utilization measured in days of therapy per 1,000 patient days (DOT/1,000 PD) in medical units and ICUs. Each of the first 3 waves of the pandemic were compared to the baseline.
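As a reminder of how the DOT metric works: one day of therapy is counted for each antimicrobial a patient receives on a given calendar day, regardless of how many doses are given, and the total is normalized per 1,000 patient days. A small sketch with hypothetical administration records:

```python
# Hypothetical antibiotic administration records: (patient_id, drug, date)
records = [
    (1, "ceftriaxone", "2020-04-01"),
    (1, "ceftriaxone", "2020-04-01"),   # second dose, same day: still 1 DOT
    (1, "ceftriaxone", "2020-04-02"),
    (1, "azithromycin", "2020-04-01"),
    (2, "ceftriaxone", "2020-04-02"),
]
patient_days = 250  # census patient-days for the same period (hypothetical)

# One DOT per distinct (patient, drug, calendar day)
dot = len({(pid, drug, day) for pid, drug, day in records})
rate = dot / patient_days * 1000
print(dot, round(rate, 1))  # 4 DOT -> 16.0 DOT/1,000 PD
```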
Results:
Within the medical units, use of respiratory antibiotics increased during the first wave of the pandemic (rate ratio [RR], 1.76; 95% CI, 1.38–2.25) but returned to the baseline in waves 2 and 3 despite more COVID-19 admissions. In ICU, the use of respiratory antibiotics increased in wave 1 (RR, 1.30; 95% CI, 1.16–1.46) and wave 2 of the pandemic (RR, 1.21; 95% CI, 1.11–1.33) and returned to the baseline in the third wave, which had the most COVID-19 admissions.
Conclusions:
After an initial surge in respiratory antibiotic prescribing, we observed the normalization of prescribing trends at 3 large hospitals throughout the COVID-19 pandemic. This trend may have been due to the timely generation of new research and guidelines developed with frontline clinicians, allowing for the active application of new research to clinical practice.
We evaluated the impact of introducing a mandatory indication field into electronic order entry for targeted antibiotics in adult inpatients.
Design:
Retrospective, before-and-after trial.
Setting:
A 400-bed community hospital.
Interventions:
All adult electronic intravenous (IV) and enteral orders for targeted antibiotics (moxifloxacin, ciprofloxacin, clindamycin, vancomycin, and metronidazole) had a mandatory indication field added. Control antibiotics (amoxicillin-clavulanate, ceftriaxone, and piperacillin-tazobactam) were chosen to track shifts in antibiotic prescribing due to the introduction of the mandatory indication field.
Methods:
Descriptive statistics were used to summarize the primary outcome, measured in defined daily doses (DDD) per 1,000 patient days (PD). Interrupted time-series (ITS) analysis was performed to compare levels and trends in antibiotic usage of targeted and control antibiotics during the 24 months before and after the intervention. Additionally, a descriptive analysis of mandatory indication fields for targeted antibiotics in the postintervention period was conducted.
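Unlike DOT, the DDD metric converts the quantity of drug dispensed into a count of assumed standard daily doses. A minimal sketch of the calculation; the DDD values shown match the WHO ATC/DDD index for the parenteral forms as of this writing but should be checked against the current index, and the dispensing quantities are hypothetical:

```python
# WHO DDDs (grams per day) for the parenteral forms; verify against the ATC/DDD index
WHO_DDD = {"ciprofloxacin_iv": 0.8, "metronidazole_iv": 1.5}

# Grams dispensed over the study period (hypothetical)
grams = {"ciprofloxacin_iv": 400.0, "metronidazole_iv": 300.0}
patient_days = 10000

# Convert grams to DDDs, then normalize per 1,000 patient days
ddds = sum(grams[d] / WHO_DDD[d] for d in grams)  # 500 + 200 = 700 DDD
rate = ddds / patient_days * 1000
print(round(rate, 1))  # 70.0 DDD/1,000 PD
```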
Results:
In total, 4,572 study antibiotic orders were evaluated after the intervention. Preset mandatory indications were selected for 30%–55% of orders. There was decreased usage of targeted antibiotics (mean, 92.02 vs 72.07 DDD/1,000 PD) with increased usage of control antibiotics (mean, 102.73 vs 119.91 DDD/1,000 PD). ITS analysis showed no statistically significant difference in overall antibiotic usage before and after the intervention for all targeted antibiotics.
Conclusion:
This study showed moderate use of preset mandatory indications, suggesting that the preset list of indications can be optimized. There was no impact on overall antibiotic usage with the use of mandatory indications. More prospective research is needed to study the utility of this intervention in different contexts.
An accurate estimate of the average number of hand hygiene opportunities per patient hour (HHO rate) is required to implement group electronic hand hygiene monitoring systems (GEHHMSs). We sought to identify predictors of HHOs to validate and implement a GEHHMS across a network of critical care units.
Design:
Multicenter, observational study (10 hospitals) followed by quality improvement intervention involving 24 critical care units across 12 hospitals in Ontario, Canada.
Methods:
Critical care patient beds were randomized to receive 1 hour of continuous direct observation to determine the HHO rate. A Poisson regression model determined unit-level predictors of HHOs. Estimates of average HHO rates across different types of critical care units were derived and used to implement and evaluate use of GEHHMS.
Results:
During 2,812 hours of observation, we identified 25,417 HHOs. There was significant variability in HHO rate across critical care units. Time of day, day of the week, unit acuity, patient acuity, patient population and use of transmission-based precautions were significantly associated with HHO rate. Using unit-specific estimates of average HHO rate, aggregate HH adherence was 30.0% (1,084,329 of 3,614,908) at baseline with GEHHMS and improved to 38.5% (740,660 of 1,921,656) within 2 months of continuous feedback to units (P < .0001).
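The reported adherence percentages follow directly from the event counts:

```python
# GEHHMS adherence = recorded hand hygiene events / estimated opportunities
baseline = 1_084_329 / 3_614_908
after = 740_660 / 1_921_656
print(round(100 * baseline, 1), round(100 * after, 1))  # 30.0 38.5
```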
Conclusions:
Unit-specific estimates based on known predictors of HHO rate enabled broad implementation of GEHHMS. Further longitudinal quality improvement efforts using this system are required to assess the impact of GEHHMS on both HH adherence and clinical outcomes within critically ill patient populations.
Reward Deficiency Syndrome (RDS) is an umbrella term for all drug and nondrug addictive behaviors arising from a dopamine deficiency (“hypodopaminergia”). There is an opioid-overdose epidemic in the USA, which may result in or worsen RDS. A paradigm shift is needed to combat a system that is not working. This shift involves the recognition of dopamine homeostasis as the ultimate treatment of RDS via precision, genetically guided KB220 variants, called Precision Behavioral Management (PBM). Recognition of RDS as an endophenotype and an umbrella term in the future DSM 6, following the Research Domain Criteria (RDoC), would assist in shifting this paradigm.
Nudging in microbiology is an antimicrobial stewardship strategy to influence decision making through the strategic reporting of microbiology results while preserving prescriber autonomy. The purpose of this scoping review was to identify the evidence that demonstrates the effectiveness of nudging strategies in susceptibility result reporting to improve antimicrobial use.
Methods:
A search for studies was conducted in Ovid MEDLINE, Embase, PsycINFO, and All EBM Reviews. All simulated and vignette studies were excluded. Two reviewers independently performed screening and data extraction.
Results:
Of a total of 1,346 citations screened, 15 relevant studies were identified. Study types included pre- and postintervention (n = 10), retrospective cohort (n = 4), and a randomized controlled trial (n = 1). Most studies were performed in acute-care settings (n = 13), and the remainder were in primary care (n = 2). Most studies used a strategy to alter the default antibiotic choices on the antibiotic report. All studies reported at least 1 outcome of antimicrobial use: utilization (n = 9), appropriateness (n = 7), de-escalation (n = 2), and cost (n = 1). Moreover, 12 studies reported an overall benefit in antimicrobial use outcomes associated with nudging, and 4 studies evaluated the association of nudging strategy with subsequent antimicrobial resistance, with 2 studies noting overall improvement.
Conclusions:
The number of heterogeneous studies evaluating the impact of applying nudging strategies to susceptibility result reports is small; however, most strategies do show promise in altering prescribers’ antibiotic selection. Selective and cascade reporting of targeted agents in a hospital setting represent the majority of current research. Gaps and opportunities for future research identified from our scoping review include performing prospective randomized controlled trials and evaluating approaches other than selective reporting.
Antimicrobial stewardship program (ASP) interventions, such as prospective audit and feedback (PAF), have been shown to reduce antimicrobial use and improve patient outcomes. However, the optimal approach to PAF is unknown.
Objective:
We examined the impact of a high-intensity interdisciplinary rounds-based PAF compared to low-intensity PAF on antimicrobial use on internal medicine wards in a 400-bed community hospital.
Methods:
Prior to the intervention, ASP pharmacists performed low-intensity PAF with a focus on targeted antibiotics. Recommendations were made directly to the internist for each patient. High-intensity, rounds-based PAF was then introduced sequentially to 5 internal medicine wards. This PAF format included twice-weekly interdisciplinary rounds, with a review of all internal medicine patients receiving any antimicrobial agent. Antibiotic use and clinical outcomes were measured before and after the transition to high-intensity PAF. An interrupted time-series analysis was performed adjusting for seasonal and secular trends.
Results:
With the transition from low-intensity to high-intensity PAF, a reduction in overall usage was seen from 483 defined daily doses (DDD)/1,000 patient days (PD) during the low-intensity phase to 442 DDD/1,000 PD in the high-intensity phase (difference, −42; 95% confidence interval [CI], −74 to −9). The reduction in usage was more pronounced in the adjusted analysis, in the latter half of the high-intensity period, and for targeted agents. There were no differences seen in clinical outcomes in the adjusted analysis.
Conclusions:
High-intensity PAF was associated with a reduction in antibiotic use compared to a low-intensity approach without any adverse impact on patient outcomes. A decision to implement a high-intensity PAF approach should be weighed against the increased workload required.
Despite the critical role families play in the care and recovery journeys of people who experience enduring mental distress, they are often excluded by mental health services from the care and decision-making process. International trends in mental health services emphasise promoting a partnership approach between service users, families and practitioners within an ethos of recovery.
Objective:
This paper evaluated the acceptability of and initial outcomes from a clinician and peer co-led family information programme.
Methods:
A sequential design was used involving a pre-post survey to assess changes in knowledge, confidence, advocacy, recovery and hope following programme participation and interviews with programme participants. Participants were recruited from mental health services running the information programme. In all, 86 participants completed both pre- and post-surveys, and 15 individuals consented to interviews.
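A pre-post change of this kind is typically assessed with a paired test on matched scores. A stdlib-only sketch with hypothetical scores (the study's actual instruments, data, and choice of test are not shown here):

```python
import math
from statistics import mean, stdev

# Hypothetical pre/post knowledge scores for the same family members
pre  = [12, 15, 11, 14, 13, 10, 16, 12]
post = [15, 17, 13, 16, 15, 13, 17, 14]

# Paired t statistic: mean difference over its standard error
diffs = [b - a for a, b in zip(pre, post)]
t = mean(diffs) / (stdev(diffs) / math.sqrt(len(diffs)))
print(round(t, 2))  # compare against the t critical value with n-1 = 7 df
```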
Results:
Survey findings indicated a statistically significant change in family members’ knowledge about mental health issues, recovery attitudes, sense of hope and confidence. In addition, the interviews suggested that the programme had a number of other positive outcomes for family members, including increased communication with members of the mental health team and increased awareness of communication patterns within the family unit. Family members valued the opportunity to share their experiences in a ‘safe’ place, learn from each other and provide mutual support.
Conclusion:
The evaluation highlights the importance of developing information programmes in collaboration with family members as well as the strength of a programme that is jointly facilitated by a family member and clinician.
We consider an M/M/1 queue with a removable server that dynamically chooses its service rate from a finite set of rates. If the server is off, the system must warm up for a random, exponentially distributed amount of time before it can begin processing jobs. We show, under the average-cost criterion, that work-conserving policies are optimal. We then demonstrate that the optimal policy can be characterized by a threshold for turning on the server and that the optimal service rate increases monotonically with the number in system. Finally, we present numerical experiments to provide insight into the practicality of having both a removable server and service-rate control.
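The threshold structure can be illustrated with a small event-driven simulation. This is a simplification of the model above: it uses a single fixed service rate rather than rate control, and all parameter values are arbitrary.

```python
import random

def simulate(threshold, lam=0.8, mu=1.2, warm=2.0, horizon=20000, seed=0):
    """Simulate an M/M/1 queue whose server turns off when the system
    empties and must warm up (mean `warm`, exponential) before serving
    again; it restarts once the queue reaches `threshold`. Returns the
    time-average number in system."""
    rng = random.Random(seed)
    t, q, area = 0.0, 0, 0.0
    state = "off"  # off -> warming -> on
    while t < horizon:
        rates, events = [lam], ["arr"]
        if state == "warming":
            rates.append(1.0 / warm); events.append("ready")
        elif state == "on" and q > 0:
            rates.append(mu); events.append("dep")
        dt = rng.expovariate(sum(rates))
        area += q * dt
        t += dt
        ev = rng.choices(events, weights=rates)[0]
        if ev == "arr":
            q += 1
            if state == "off" and q >= threshold:
                state = "warming"   # threshold reached: start warming up
        elif ev == "ready":
            state = "on"
        else:
            q -= 1
            if q == 0:
                state = "off"       # server removed when system empties
    return area / t

avg = simulate(threshold=3)
print(avg)
```

Sweeping `threshold` and comparing the resulting holding and switching costs is one way to probe the threshold policy numerically.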
Natural samples of the substituted basic Cu(II) chloride series Cu4−xMx(OH)6Cl2 (M2+ = Zn, Ni, or Mg) were investigated by single-crystal X-ray diffraction in order to elucidate compositional boundaries associated with paratacamite and its congeners. The compositional ranges examined are Cu3.65Zn0.35(OH)6Cl2 – Cu3.36Zn0.64(OH)6Cl2 and Cu3.61Ni0.39(OH)6Cl2 – Cu3.13Ni0.87(OH)6Cl2, along with a single Mg-bearing phase. The majority of samples studied have trigonal symmetry (R3̄m) analogous to that of herbertsmithite (Zn) and gillardite (Ni), with a ≈ 6.8, c ≈ 14.0 Å. Crystallographic variations for these samples caused by composition are compared with both published and new data for the R3̄m sub-cell of paratacamite, paratacamite-(Mg) and paratacamite-(Ni). The observed trends suggest that the composition of end-members associated with the paratacamite congeners depends upon the nature of the substituting cation.
Increasing numbers of freshwater ecosystems have had sportfish consumption advisories posted in recent years. Advisories are sometimes issued in lieu of environmental remediation if they are considered more cost-effective than “cleaning up” the resource, but this approach assumes that anglers adjust behavior in response to the warning. Previous studies, however, suggest that compliance with advisories can be quite low. In contrast, this study measures a statistically significant response by reservoir anglers to consumption advisories. In particular, anglers are less likely to choose to visit a reservoir with an advisory than a similar reservoir without an advisory. Furthermore, the economic losses due to advisories are quantified for anglers in two regions of Tennessee.
In this article we introduce a new method of mitigating the problem of long wait times for low-priority customers in a two-class queuing system. To this end, we allow class 1 customers to be upgraded to class 2 after they have been in queue for some time. We assume that there are c_i servers at station i, i = 1, 2. The servers at station 1 are flexible in the sense that they can work at either station, whereas the servers at station 2 are dedicated. Holding costs at rate h_i are accrued per customer per unit time at station i, i = 1, 2. This study yields several surprising results. First, we show that stability analysis requires a condition on the order of the service rates. This is unexpected since no such condition is required when the system does not have upgrades. This condition continues to play a role when control is considered. We provide structural results that include a cμ rule when an inequality holds and a threshold policy when the inequality is reversed. A numerical study verifies that the optimal control policy significantly reduces holding costs over the policy that assigns the flexible server to station 1. At the same time, in most cases the optimal control policy reduces waiting times of both customer classes.
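The cμ rule mentioned above serves, at each decision epoch, the nonempty class with the largest product of holding cost and service rate. A minimal sketch of the priority index (class counts, costs, and rates are illustrative, not the article's parameters):

```python
def next_class(queues, h, mu):
    """cμ rule: among nonempty classes, serve the one maximizing
    h[i] * mu[i]; return None if every queue is empty.
    queues: jobs waiting per class; h: holding cost rates; mu: service rates."""
    candidates = [i for i, q in enumerate(queues) if q > 0]
    if not candidates:
        return None
    return max(candidates, key=lambda i: h[i] * mu[i])

# Class 0: h*mu = 3 * 1.0 = 3.0; class 1: h*mu = 1 * 4.0 = 4.0
print(next_class([2, 5], h=[3, 1], mu=[1.0, 4.0]))  # 1
print(next_class([2, 0], h=[3, 1], mu=[1.0, 4.0]))  # 0 (class 1 is empty)
```

The rule's appeal is that a myopic index computed per class, with no look-ahead, is optimal in a range of settings; the article's contribution is identifying when this holds versus when a threshold policy takes over.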
By Gary Marks, Professor of Political Science and founding Director of the Center for European Studies, University of North Carolina, Chapel Hill, and Ian Down, Postdoctoral Fellow, University of California, Davis
Regional regimes – authoritative arrangements facilitating economic integration in a particular region – have sprouted across the globe in recent decades. More than fifty such arrangements currently exist, among them the European Union (EU) and the North American Free Trade Agreement (NAFTA). They are an important – perhaps the most important – form in which competencies, formerly the preserve of sovereign states, have been rebundled. Yet there is no sign of convergence in the policy scope and institutional depth of such regional regimes. In this chapter, we argue that such variation has a decisive effect on their political support. So, to extend Sidney Tarrow's line of argument in Chapter 3, rebundling of authority may shape group support and opposition as well as the strategies that groups adopt to achieve their objectives.
Our focus is on mainstream left political parties in the member states of the EU and NAFTA. Political parties are vital in building regional regimes. When Jean Monnet built his Action Committee for the United States of Europe from the mid-1950s, he sought above all to gain the support of socialists and unions, because he regarded these groups as critical to the future of European integration. He realized that the European Community was a political, not merely a functional, project, and that its future would depend on the support it could muster from political parties and mass organizations (Duchêne 1994: 285ff).
The influence of boron ion implants on the optical and physical properties of photochemically deposited SiO2 films on Hg0.7Cd0.3Te and silicon has been investigated. The distributions of the boron atoms between the SiO2 film and substrate have been determined by a nondestructive neutron depth profiling method. The implants produce an apparent densification of the SiO2 films, which is accompanied by an increase in refractive index and changes in the infrared vibrational spectra for these films.