An analysis of a bloodstream infection (BSI) prevention effort in 17 outpatient hemodialysis centers found reductions in overall BSI and access-related BSI (ARBSI) incidence rates following the implementation of a bundle of BSI prevention interventions. Most of the decrease was identified soon after implementation of the interventions, with reductions sustained through a 15-month period.
We aimed to reevaluate the effect of the interventions using additional data and to investigate the sustainability of the initial observed reductions during an extended period.
The overall evaluation period spanned January 2009 through December 2013. The baseline period consisted of the first year of data, and January 2010 marked the start of the intervention time frame.
In the initial analysis, data were available through March 2011 (intervention month 15). For this report, we analyzed data collected through December 2013, allowing for evaluation of a 48-month intervention period. Herein, intervention months 1–15 are referred to as the “early intervention period” and months 16–48 as the “later intervention period.”
CDC Dialysis BSI Prevention Collaborative
The baseline and early intervention periods of the Centers for Disease Control and Prevention (CDC) Dialysis BSI Prevention Collaborative (“Collaborative”) of 17 outpatient hemodialysis centers have been described previously.
Catheter hub disinfection upon connection or disconnection of central venous catheters (CVCs) was added as an intervention in May 2011.
The last in-person meeting of representatives from the Collaborative occurred in November 2011, and monthly educational conference calls were reduced to quarterly starting March 2012.
Overall BSI and ARBSI incidence were defined according to the CDC National Healthcare Safety Network (NHSN) Dialysis Event Protocol, which defines a BSI as a positive blood culture collected from a hemodialysis patient as an outpatient or within 1 day after hospital admission. An ARBSI is a BSI for which the suspected source of the positive blood culture is the vascular access or is uncertain.
Outcomes were stratified into 2 vascular access groupings: (1) arteriovenous fistulas and grafts and (2) tunneled and nontunneled CVCs. Incidence rates were reported per 100 patient months. Data were restricted to those reported under a monthly reporting plan, which indicates a facility’s intent to follow the protocol.
Since our prior report, we learned that 60 facility months of unanalyzed baseline data and 3 facility months of intervention data had been reported to NHSN without corresponding reporting plans.
The data were verified to have been collected per protocol and prepared for submission under a reporting plan. The involved facilities entered the missing reporting plans, and the previously excluded data were incorporated into the present analysis.
Analytic and Statistical Methods
Effect of interventions
We first reevaluated the effect of the interventions with the additional baseline and intervention data. Consistent with the original model, we used segmented regression
to estimate the baseline rate trend (β1), the level change after the intervention start (β2), and the difference between the baseline and intervention trends (β3*); the intervention-period rate trend was estimated as the sum of the β1 and β3* estimates.
Sustainability of effect
To assess the sustainability of the initial rate reduction, we compared the early and later intervention periods. Two time-dependent variables were added to the original model: one was an indicator of the later intervention period and the other indicated the number of months since the start of that period. The first allowed for estimation of the rate level change immediately after the start of the later intervention period (β4); the second allowed for estimation of the difference in monthly rate trends between the early and later intervention periods (β5).
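Putting the coefficient labels above together, the full model can be written out as follows (a sketch of the specification, not reproduced from the original report; the log-linear form and indicator coding are assumptions consistent with the description, where $t$ indexes calendar month, $I_t$ indicates months in the intervention period beginning at month $t_0$, and $J_t$ indicates months in the later intervention period beginning at month $t_1$):

```latex
\ln(\mathrm{rate}_t) = \beta_0 + \beta_1 t + \beta_2 I_t
  + \beta_3^{*}\,(t - t_0)\,I_t
  + \beta_4 J_t + \beta_5\,(t - t_1)\,J_t
```

Under this coding, β1 + β3* gives the early intervention trend, and β1 + β3* + β5 gives the later intervention trend.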
We used segmented regression models with the number of BSIs or ARBSIs as the outcome, offset by patient months. Because the analysis involved longitudinal data from multiple facilities, clustering was considered in 2 ways: (1) within-facility correlation of errors over time (ie, we assessed specification of a nonindependent residual correlation structure) and (2) between-facility variation in baseline rates (ie, we assessed specification of a random intercept). Because the number of clusters was small (17 facilities), and to safeguard against misspecification of the correlation structure, standard errors were calculated using the Morel, Bokossa, and Neerchal (MBN) sandwich estimator.
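The covariate coding behind this kind of segmented regression can be illustrated with a short sketch (illustrative only; the variable names and month counts below are our assumptions, not the Collaborative's analysis code):

```python
import numpy as np

def design_matrix(n_months, intervention_start):
    """Build segmented-regression covariates.

    Columns: intercept, baseline trend (calendar month), level change
    (post-intervention indicator), and trend change (months elapsed
    since the intervention started).
    """
    t = np.arange(1, n_months + 1)                      # calendar month 1..n
    post = (t >= intervention_start).astype(float)      # intervention indicator
    t_since = np.where(post == 1, t - intervention_start + 1, 0.0)
    return np.column_stack([np.ones(n_months), t, post, t_since])

# Example: 12 baseline months followed by 48 intervention months,
# mirroring the study timeline (intervention starts at month 13).
X = design_matrix(60, 13)
```

A Poisson model with a log patient-months offset would then be fit against columns of `X`; the early/later-period comparison adds two analogous columns indexed from the start of the later period.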
Data analysis was conducted using SAS Version 9.3 (SAS Institute, Cary, NC).
All 17 facilities that were included initially continued reporting their data to the NHSN during the later intervention period. Facilities reported a median of 12 baseline months (range, 0–12) and 48 intervention months (range, 43–48). One facility had a 5-month gap in reporting due to temporary closure. Baseline data were available for 15 facilities; complete data were available for 12.
Main Results and Other Analyses
Unadjusted pooled mean BSI and ARBSI rates (both overall and stratified) decreased from the baseline to the intervention period (Table 1).
TABLE 1 Baseline Versus Intervention Period Percent Changes in Overall and Access-Related Bloodstream Infection Incidence Rates Among Facilities Participating in the CDC Dialysis Bloodstream Infection Prevention Collaborative: Effect of Interventions
Effect of interventions
ARBSI rates dropped by 44% (P = .005) overall and 49% (P = .002) in the CVC stratum immediately following the start of the intervention period (Table 1). No changes in modeled BSI rates were detected.
Sustainability of effect
No immediate changes were seen in BSI or ARBSI rates, overall or in either stratum (Table 2). In addition, the later intervention period rate trend did not statistically differ from the early intervention period rate trend, overall or in either stratum.
TABLE 2 Early Versus Later Intervention Period Percent Changes in Incidence Rates for Overall and Access-Related Bloodstream Infections Among Facilities Participating in the CDC Dialysis Bloodstream Infection Prevention Collaborative: Sustainability of Effect
Our updated analysis showed reductions in ARBSIs among dialysis facilities participating in the Collaborative immediately following intervention implementation overall and in the CVC stratum. Perhaps most importantly, this analysis further demonstrates that the early reductions in ARBSI were sustained for 4 years after the initiation of the intervention. These reductions persisted at facilities even after the formal collaboration ended. Based on the overall baseline rate (1.03 ARBSIs per 100 patient months) and intervention period denominator (46,351 patient months), an estimated 286 of 477 expected ARBSIs (60%) were prevented during the 48-month intervention period. This finding further supports the effectiveness of the interventions and the sustainability of the reductions even after intensive technical assistance from the CDC ended.
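The prevented-infection estimate quoted above follows from simple arithmetic on the reported figures; a minimal check using the rounded numbers in the text (illustrative only):

```python
# Figures reported in the text
baseline_rate = 1.03      # ARBSIs per 100 patient months (overall baseline)
patient_months = 46351    # intervention-period denominator
prevented = 286           # estimated ARBSIs prevented

# Expected ARBSIs had the baseline rate persisted through the intervention
expected = baseline_rate / 100 * patient_months   # ~477

# Share of expected ARBSIs prevented
pct_prevented = prevented / expected * 100        # ~60%
```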
Reduction of BSIs in patients with CVCs has been demonstrated in other quality improvement projects, albeit over shorter follow-up periods.
A recent cluster-randomized trial conducted in outpatient hemodialysis facilities reported a 21% relative reduction in BSI rates in the quarter following implementation of CDC-recommended catheter care practices compared with control facilities; the improvement was sustained over the subsequent 9 months.
Interventions implemented in both the trial and the Collaborative included chlorhexidine for catheter exit site care as well as adherence to catheter “scrub-the-hub” procedure. These studies lend support to the concept that reductions in CVC-related BSIs among patients undergoing hemodialysis are both achievable and sustainable.
While overall BSIs decreased significantly in the previous analysis, the reduction was no longer statistically significant in the current analysis. Several changes since the previous analysis may have reduced precision and thus contributed to the loss of significance: use of the MBN instead of the classical sandwich standard error estimator, inclusion of additional data for certain facilities, and additional intervention-period months for all facilities.
Improvements associated with the intervention implementation were not observed in the fistula-graft stratum. This lack of improvement is not unexpected, however, because the Collaborative interventions primarily focused on CVCs.
Although we did not model stratified rates in the initial analysis, crude rates seemed to show a similar pattern of effect.
Our analysis had several limitations.
First, 5 of the 17 facilities lacked some or all data from the baseline period. Second, we lacked information about the level of adherence to specific interventions within individual facilities. Third, an interrupted time-series design without a concurrent control group limits the capacity to infer a causal relationship between implementation of the bundle of BSI prevention interventions and the observed changes in BSIs and ARBSIs.
Nevertheless, the consistency between these findings and those of the initial analysis suggests that the reported reductions are reliable.
In conclusion, bloodstream infections in hemodialysis patients are preventable through implementation of and adherence to recommended prevention practices focused on catheter care. Importantly for dialysis providers and patient safety advocates, these improvements can be maintained for multiple years after adoption.
We acknowledge and thank the participating facilities for their support and assistance. The findings and conclusions in this report are those of the authors and do not necessarily represent the official position of the Centers for Disease Control and Prevention.
Financial support: This work was supported through personal time and/or salary funds from respective institutional affiliation(s) for each author.
Potential conflicts of interest: The authors have no other relevant conflicts or financial support to report.
Patel, PR, Yi, SH, Booth, S, et al. Bloodstream infection rates in outpatient hemodialysis facilities participating in a collaborative prevention effort: a quality improvement report. Am J Kidney Dis
Wagner, AK, Soumerai, SB, Zhang, F, Ross-Degnan, D. Segmented regression analysis of interrupted time series studies in medication use research. J Clin Pharm Ther
Morel, JG, Bokossa, MC, Neerchal, NK. Small sample correction for the variance of GEE estimators. Biom J
Pronovost, PJ, Goeschel, CA, Colantuoni, E, et al. Sustaining reductions in catheter related bloodstream infections in Michigan intensive care units: observational study. BMJ
Rosenblum, A, Wang, W, Ball, LK, Latham, C, Maddux, FW, Lacson, E Jr. Hemodialysis catheter care strategies: a cluster-randomized quality improvement initiative. Am J Kidney Dis
Penfold, RB, Zhang, F. Use of interrupted time series analysis in evaluating health care quality improvements. Acad Pediatr