Concerns about penicillin-cephalosporin cross-reactivity have historically led to conservative prescribing and avoidance of cephalosporins in patients with penicillin allergy labels, potentially causing suboptimal outcomes. Recent evidence suggests a lower risk of cross-reactivity, prompting a reassessment of alert systems.
Objective:
To assess the impact of limited penicillin cross-reactivity alerts on outpatient cephalosporin use and the incidence of adverse reactions in a healthcare setting.
Methods:
This retrospective cohort study compared cephalosporin prescribing and adverse reactions in patients labeled as penicillin-allergic before and after limiting penicillin cross-reactivity alerts in the electronic medical record at a large academic medical center.
Results:
Among 17,174 patients (8,131 pre- and 9,043 post-implementation), outpatient cephalosporin prescribing increased significantly, by 8% (P < .001), while use of alternative antibiotic classes decreased. Adverse events did not increase significantly from pre- to post-implementation (0.036% vs 0.058%, P = .547), and no severe events were attributable to cross-reactivity. The alert modification reduced alerts by 92% (P < .001).
Conclusion:
The reduction of penicillin-cephalosporin cross-reactivity alerts was associated with increased cephalosporin use without a significant increase in adverse reactions, suggesting that the practice is safe and decreases alert burden.
To evaluate motor proficiency, identify risk factors for abnormal motor scores, and examine the relationship between motor proficiency and health-related quality of life in school-aged patients with CHD.
Study design:
Patients ≥ 4 years old referred to the cardiac neurodevelopmental program between June 2017 and April 2020 were included. Motor skills were evaluated by the therapist-administered Bruininks-Oseretsky Test of Motor Proficiency Second Edition Short Form and the parent-reported Adaptive Behavior Assessment System and Patient-Reported Outcomes Measurement Information System Physical Functioning questionnaires. Neuropsychological status and health-related quality of life were assessed using a battery of validated questionnaires. Demographic, clinical, and educational variables were collected from electronic medical records. General linear modelling was used for multivariable analysis.
Results:
The median motor proficiency score was the 10th percentile, and the cohort (n = 272; mean age: 9.1 years) scored well below normative values on all administered neuropsychological questionnaires. In the final multivariable model, worse motor proficiency score was associated with family income, presence of a genetic syndrome, developmental delay recognised in infancy, abnormal neuroimaging, history of heart transplant, executive dysfunction, and presence of an individualised education plan (p < 0.03 for all predictors). Worse motor proficiency correlated with reduced health-related quality of life. Parent-reported adaptive behaviour (p < 0.001) and physical functioning (p < 0.001) were strongly associated with motor proficiency scores.
Conclusion:
This study highlights the need for continued motor screening for school-aged patients with CHD. Clinical factors, neuropsychological screening results, and health-related quality of life were associated with worse motor proficiency.
Numerous studies have shown longer pre-hospital and in-hospital workflow times and poorer outcomes in women after acute ischemic stroke (AIS) in general and after endovascular treatment (EVT) in particular. We investigated sex differences in acute stroke care of EVT patients over 5 years in a comprehensive Canadian provincial registry.
Methods:
Clinical data of all AIS patients who underwent EVT between January 2017 and December 2022 in the province of Saskatchewan were captured in the Canadian OPTIMISE registry and supplemented with patient data from administrative data sources. Patient baseline characteristics, transport time metrics, and technical EVT outcomes between female and male EVT patients were compared.
Results:
A total of 303 patients underwent EVT between 2017 and 2022: 144 (47.5%) women and 159 (52.5%) men. Women were significantly older (median age 77.5 [interquartile range: 66–85] vs. 71 [59–78], p < 0.001), while men had more intracranial internal carotid artery occlusions (48/159 [30.2%] vs. 26/142 [18.3%], p = 0.03). Last-known-well to comprehensive stroke center (CSC) arrival time (median 232 min [interquartile range 90–432] in women vs. 230 min [90–352] in men), CSC-arrival-to-reperfusion time (median 108 min [88–149] in women vs. 102 min [77–141] in men), reperfusion status (successful reperfusion 106/142 [74.7%] in women vs. 117/158 [74.1%] in men), and modified Rankin score at 90 days did not differ significantly. This held true after adjusting for baseline variables in multivariable analyses.
Conclusion:
While women undergoing EVT in the province of Saskatchewan were on average older than men, they were treated just as fast and achieved similar technical and clinical outcomes compared to men.
To describe epidemiologic and genomic characteristics of a severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) outbreak in a large skilled-nursing facility (SNF), and the strategies that controlled transmission.
Design, setting, and participants:
This cohort study was conducted during March 22–May 4, 2020, among all staff and residents at a 780-bed SNF in San Francisco, California.
Methods:
Contact tracing and symptom screening guided targeted testing of staff and residents; respiratory specimens were also collected through serial point prevalence surveys (PPSs) in units with confirmed cases. Cases were confirmed by real-time reverse transcription–polymerase chain reaction testing for SARS-CoV-2, and whole-genome sequencing (WGS) was used to characterize viral isolate lineages and relatedness. Infection prevention and control (IPC) interventions included restricting from work any staff who had close contact with a confirmed case; restricting movement between units; implementing surgical face masking facility-wide; and the use of recommended PPE (ie, isolation gown, gloves, N95 respirator and eye protection) for clinical interactions in units with confirmed cases.
Results:
Of 725 staff and residents tested through targeted testing and serial PPSs, 21 (3%) were SARS-CoV-2 positive: 16 (76%) staff and 5 (24%) residents. Fifteen cases (71%) were linked to a single unit. Targeted testing identified 17 cases (81%), and PPSs identified 4 cases (19%). Most cases (71%) were identified before IPC interventions could be implemented. WGS was performed on SARS-CoV-2 isolates from 4 staff and 4 residents: 5 were of Santa Clara County lineage and the 3 others were distinct lineages.
Conclusions:
Early implementation of targeted testing, serial PPSs, and multimodal IPC interventions limited SARS-CoV-2 transmission within the SNF.
Background: Central-line–associated bloodstream infections (CLABSIs) are linked with significant morbidity and mortality. An NHSN laboratory-confirmed bloodstream infection (LCBSI) has specific criteria for ascribing an infection to the central line or not, and the criteria used to attribute the pathogen to another site are restrictive. Our objective was to better classify CLABSIs using enhanced criteria to gain a comprehensive understanding of misclassification so that appropriate reduction efforts can be applied. Methods: We conducted a retrospective review of medical records with NHSN-identified CLABSI from July 2017 to December 2018 at 2 geographically proximate hospitals. Trained infectious diseases personnel from 2 tertiary-care academic medical centers, the University of Virginia Health System, a 600-bed medical center in Charlottesville, Virginia, and the Virginia Commonwealth University Health System, an 865-bed medical center in Richmond, Virginia, reviewed charts. We defined “overcaptured” CLABSIs (O-CLABSIs) in 4 categories: O-CLABSI-1 is bacteremia attributable to a primary infectious source; O-CLABSI-2 is bacteremia attributable to neutropenia with gastrointestinal translocation not meeting mucosal barrier injury criteria; O-CLABSI-3 is a positive blood culture attributable to a contaminant; and O-CLABSI-4 is a patient injecting into the line, though not officially documented. Descriptive analyses were performed using the χ2 and Fisher exact tests. Results: We found a large number of O-CLABSIs on chart review (79 of 192, 41%). Overall, 56 of 192 (29%) LCBSIs were attributable to a primary infectious source not meeting the NHSN definition. O-CLABSI proportions differed significantly between the 2 hospitals: hospital A identified 34 of 59 (58%) of its NHSN-identified CLABSIs as O-CLABSIs, and hospital B identified 45 of 133 (34%) (P = .0020) (Table 1). When comparing O-CLABSI types, hospital B had a higher percentage of O-CLABSI-1 than hospital A: 76% versus 64%.
Hospital A had a higher proportion of O-CLABSI-2: 21% versus 7%. Hospitals A and B had similar proportions of O-CLABSI-3: 15% versus 18%. These differences were statistically significant (P < .0001). Discussion: The results from these 2 geographically proximate systems indicate that O-CLABSIs are common. Attribution can vary significantly between institutions, likely depending on differences in the incidence of true CLABSI, patient populations, protocols, and protocol compliance. These findings have implications for interfacility comparisons of publicly reported data. Most importantly, erroneous attribution can result in missed opportunities to direct patient safety efforts to the root cause of the bacteremia and could lead to inappropriate treatment.
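The between-hospital comparison reported above (34/59 vs 45/133 O-CLABSIs, P = .0020) can be reproduced with a Pearson χ2 test on the underlying 2×2 table. A minimal standard-library sketch, not the authors' analysis code (the helper name is ours, and no continuity correction is applied, which is what matches the reported P value):

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square (1 df, no continuity correction) for a 2x2 table
    [[a, b], [c, d]]; returns (chi2, p). For 1 df the survival function is
    P(X > x) = erfc(sqrt(x / 2))."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# Hospital A: 34 O-CLABSIs among 59 reviewed; hospital B: 45 among 133.
chi2, p = chi2_2x2(34, 59 - 34, 45, 133 - 45)
print(round(chi2, 2), round(p, 4))  # chi2 ≈ 9.55, p ≈ 0.002
```

The uncorrected statistic gives p ≈ .002, in line with the reported value; a continuity-corrected test or the Fisher exact test would give a slightly larger p.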
Funding: None
Disclosures: Michelle Doll, Research Grant from Molnlycke Healthcare
The learning hospital is distinguished by the continuous generation, refinement, and implementation of clinical best practices. We describe a model for the learning hospital within the framework of a hospital infection prevention program and argue that a critical assessment of safety practices is possible without significant grant funding. We reviewed 121 peer-reviewed manuscripts published by the VCU Hospital Infection Prevention Program over 16 years. Publications included quasi-experimental studies, observational studies, surveys, interrupted time series analyses, and editorials. We summarized the articles based on their infection prevention focus and provide a brief summary of the findings. We also summarized the involvement of nonfaculty learners in these manuscripts, as well as the contributions of grant funding. Despite the absence of significant grant funding, infection prevention programs can critically assess safety strategies under the learning hospital framework by leveraging a diverse collaboration of motivated nonfaculty learners. This model is a valuable adjunct to traditional grant-funded efforts in infection prevention science and is part of a successful horizontal infection control program.
Head and neck cancer patients receiving radiotherapy can experience a number of toxicities, including weight loss and malnutrition, which can compromise the quality of treatment. The purpose of this retrospective cohort study was to evaluate weight loss and identify predictive factors for this patient group.
Materials and methods
A total of 40 patients treated with radiotherapy since 2012 at the study centre were selected for analysis. Data were collected from patient records. The association between potential risk factors and weight loss was investigated.
Results
Mean weight loss was 5 kg (6%). In all, 24 patients lost >5% of their starting body weight. Age, T-stage, N-stage, chemotherapy, and starting body weight were individually associated with significant differences in weight loss. On multiple linear regression analysis, age and nodal status were predictive.
Conclusion
Younger patients and those with nodal disease were most at risk of weight loss. Other studies have identified the same risk factors along with several other variables. The relative significance of each, along with a number of other potential factors, is yet to be fully understood. Further research is required to help identify patients most at risk of weight loss and to assess interventions aimed at preventing weight loss and malnutrition.
Hog producers in Indiana and Nebraska were surveyed about sources of risk, effectiveness of risk management strategies, and prior participation in and desire for additional risk management education. Ownership of hogs by the producer, size of the operation, and age had significant effects on ratings of both sources of risk and effectiveness of risk management strategies. Probit analysis found that age, prior attendance, knowledge and prior use of the tool, level of integration, and concern about price and performance risk had significant effects on interest in further education about production contracts, futures and options, packer marketing contracts, and financial management.
Introduction: Functionally univentricular hearts palliated with superior or total cavopulmonary connection result in circulations in series. The absence of a pre-pulmonary pump means that cardiac output is more difficult to adjust and control. Continuous monitoring of cardiac output is crucial during cardiac catheter interventions and can provide new insights into the complex physiology of these lesions. Materials and methods: The Icon® cardiac output monitor was used to study the changes in cardiac output during catheter interventions in 15 patients (median age: 6.1 years, range: 4.8–15.3 years; median weight: 18.5 kg, range: 15–63 kg) with cavopulmonary circulations. A total of 19 interventions were undertaken in these patients, and the observed changes in cardiac output were recorded and analysed. Results: Cardiac output increased with creation of stent fenestrations after total cavopulmonary connection (median increase of 22.2%, range: 6.7%–28.6%) and with drainage of significant pleural effusions (16.7% increase). Cardiac output decreased with complete or partial occlusion of fenestrations (median decrease of 10.6%, range: 7.1%–13.4%). There was a consistent increase in cardiac output with stenting of obstructive left pulmonary artery lesions (median increase of 7.7%, range: 5%–14.3%, p = 0.007). Conclusions: Icon® provides a novel technique for the continuous, non-invasive monitoring of cardiac output. It provides a further adjunct for monitoring of physiologically complex patients during catheter interventions. These results are consistent with previously reported series involving manipulation of fenestrations. This is the first report identifying an increase in cardiac output with stenting of obstructive pulmonary arterial lesions.
To our knowledge, no comprehensive, interdisciplinary initiatives have been taken to examine the role of genetic variants on patient-reported quality-of-life outcomes. The overall objective of this paper is to describe the establishment of an international and interdisciplinary consortium, the GENEQOL Consortium, which intends to investigate the genetic disposition of patient-reported quality-of-life outcomes. We have identified five primary patient-reported quality-of-life outcomes as initial targets: negative psychological affect, positive psychological affect, self-rated physical health, pain, and fatigue. The first tangible objective of the GENEQOL Consortium is to develop a list of potential biological pathways, genes and genetic variants involved in these quality-of-life outcomes, by reviewing current genetic knowledge. The second objective is to design a research agenda to investigate and validate those genes and genetic variants of patient-reported quality-of-life outcomes, by creating large datasets. During its first meeting, the Consortium has discussed draft summary documents addressing these questions for each patient-reported quality-of-life outcome. A summary of the primary pathways and robust findings of the genetic variants involved is presented here. The research agenda outlines possible research objectives and approaches to examine these and new quality-of-life domains. Intriguing questions arising from this endeavor are discussed. Insight into the genetic versus environmental components of patient-reported quality-of-life outcomes will ultimately allow us to explore new pathways for improving patient care. If we can identify patients who are susceptible to poor quality of life, we will be able to better target specific clinical interventions to enhance their quality of life and treatment outcomes.
While health warnings are present on cigarette packs around the world, the nature of the warnings varies considerably between countries. In the United States, a small text warning citing the dangers of cigarette smoking is found on the side of all packs. This pilot study sought to determine whether graphic cigarette warning images, like those found in the United Kingdom and Canada, were better at decreasing cravings to smoke than the existing text warnings found on cigarette packs in the United States. Twenty-five smokers seeking treatment to quit at a specialty tobacco treatment program were administered the Brief Questionnaire of Smoking Urges (QSU-Brief), a validated measure of craving, prior to and following exposure to cigarette pack warning images. The graphic cigarette warning images reduced cravings to smoke (6.20-point decrease) more than neutral images (3.36-point decrease) and the current text warnings used in the United States (5.75-point decrease), although these differences were not statistically significant. Based on these pilot data, a larger study could further examine the effectiveness of graphic warning images and whether such warnings hold an advantage over the currently used text warnings.