We sought to compare patient outcomes between carbapenem-resistant Enterobacterales (CRE) and carbapenem-susceptible Enterobacterales (CSE) infections at our academic medical center.
Design:
We conducted a retrospective cohort study of adult patients with a positive culture of E. coli, E. cloacae, K. aerogenes, K. oxytoca, and/or K. pneumoniae admitted to UK HealthCare (January 1, 2010–December 31, 2019). Based on the type of pathogen on the date of the first culture (index date), patients were included in the CRE (i.e., exposed) group or the CSE (comparator) group. Exclusion criteria were age < 18 years, pregnancy, endocarditis, osteomyelitis, necrotizing fasciitis, or cystic fibrosis. We evaluated the impact of CRE vs CSE on a composite outcome of 30-day all-cause mortality or discharge to hospice using Kaplan–Meier survival curves and Cox proportional hazards regression with inverse probability of treatment weights (IPTW).
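The IPTW adjustment described above reweights each patient by the inverse of their probability of being in the observed exposure group, so that measured baseline differences are balanced before fitting the Cox model. A minimal sketch, using stabilized weights and hypothetical propensity scores (not study data or the authors' code):

```python
# Minimal sketch of stabilized inverse probability of treatment weighting.
# Propensity scores and exposure labels below are hypothetical illustrations.

def iptw_weights(propensity, treated):
    """Stabilized weights: exposed get P(T=1)/ps, comparators get P(T=0)/(1-ps)."""
    p_treated = sum(treated) / len(treated)  # marginal probability of exposure
    weights = []
    for ps, t in zip(propensity, treated):
        if t:
            weights.append(p_treated / ps)
        else:
            weights.append((1 - p_treated) / (1 - ps))
    return weights

# Hypothetical cohort: two exposed (CRE) and two comparator (CSE) patients
ps = [0.8, 0.3, 0.6, 0.2]
tr = [1, 1, 0, 0]
w = iptw_weights(ps, tr)  # weights then enter a weighted Cox regression
```

In practice these weights would be passed to a weighted Cox fit (e.g., a survival library's weight argument); the sketch shows only the weight construction.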
Results:
Of 17,839 hospitalized patients, 128 and 6,953 were included in the CRE and CSE groups, respectively. Baseline differences existed in sex assigned at birth, admission source, time to index culture, and infection type/severity. Most CRE index cultures (76%) exhibited resistance only to ertapenem. The IPTW-adjusted HR [95% CI] for the composite outcome was 0.99 [0.65, 1.51] after 30 days. Follow-up analysis in patients with carbapenem-non-susceptible Enterobacterales bloodstream infections on index yielded an HR of 1.38 [0.85, 2.24].
Conclusions:
The estimated risk of the composite outcome did not differ between patients with CRE and CSE in the overall analysis. Although the follow-up analysis identified an increased risk, we cannot statistically distinguish this from a null effect.
We evaluated diagnostic test and antibiotic utilization among 252 patients from 11 US hospitals who were evaluated for coronavirus disease 2019 (COVID-19) pneumonia during the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) omicron variant pandemic wave. In our cohort, antibiotic use remained high (62%) among SARS-CoV-2–positive patients and even higher among those who underwent procalcitonin testing (68%).
Vancomycin therapy is associated with an increased risk of acute kidney injury (AKI). Previous studies suggest that area under the curve (AUC) monitoring reduces the risk of AKI, but literature is lacking to support this in patients receiving longer durations of vancomycin therapy.
Design:
Retrospective cohort study.
Method:
Patients ≥18 years old who were admitted between August 2015 and July 2017 or between October 2017 and September 2019 and received at least 14 days of intravenous (IV) vancomycin therapy were included in the study. Our primary outcome was the incidence of AKI in the trough monitoring and AUC monitoring groups, assessed using Kidney Disease: Improving Global Outcomes criteria. Secondary outcomes included inpatient mortality, median inpatient length of stay, and median intensive care unit length of stay.
Results:
Overall, 582 patients were included in the study: 318 in the trough monitoring group and 264 in the AUC monitoring group. The median duration of vancomycin therapy was 23 days (interquartile range, 16–39). Patients in the trough monitoring group had a higher incidence of AKI than those in the AUC monitoring group (45.6% vs 28.4%, p < 0.001). Furthermore, logistic regression analysis showed that AUC monitoring was associated with 54% lower odds of AKI (OR 0.46, 95% CI 0.31–0.69). All-cause inpatient mortality was numerically higher in the trough monitoring group (12.9% vs 8.3%, p = 0.078).
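The "54% lower" figure follows directly from the reported odds ratio: for an OR below 1, the relative reduction in odds is (1 − OR) × 100. A quick arithmetic check using the value reported above:

```python
# Relative reduction in odds implied by an odds ratio below 1.
# The OR value is taken from the abstract above; this is arithmetic only.
or_auc = 0.46                    # OR for AKI, AUC vs trough monitoring
pct_lower = (1 - or_auc) * 100   # 54% lower odds
print(f"{pct_lower:.0f}% lower odds of AKI")  # → "54% lower odds of AKI"
```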
Conclusions:
In patients who received at least 14 days of IV vancomycin therapy, AUC monitoring was associated with a lower incidence of AKI.
The objective of this study was to determine antibiotic appropriateness based on Loeb minimum criteria (LMC) in patients with and without altered mental status (AMS).
Design:
Retrospective, quasi-experimental study assessing pooled data from 3 periods pertaining to the implementation of a UTI management guideline.
Setting:
Academic medical center in Lexington, Kentucky.
Patients:
Adult patients aged ≥18 years with a collected urinalysis receiving antimicrobial therapy for a UTI indication.
Methods:
Appropriateness of UTI management was assessed in patients prior to an institutional UTI guideline, after guideline introduction and education, and after implementation of a prospective audit-and-feedback stewardship intervention, during September–November of 2017, 2018, and 2019, respectively. Patient data were pooled and compared between patients noted to have AMS and those with classic UTI symptoms. Loeb minimum criteria were used to determine whether UTI diagnosis and treatment were warranted.
Results:
In total, 600 patients were included in the study. AMS was one of the most common indications for testing across the 3 periods (19%–30.5%). Among those with AMS, 25 patients (16.7%) met LMC, significantly fewer than the 151 patients (33.6%) without AMS (P < .001).
Conclusions:
Patients with AMS are prescribed antibiotic therapy without symptoms indicative of UTI at a higher rate than those without AMS, according to LMC. Further antimicrobial stewardship efforts should focus on prescriber education and development of clearly defined criteria for patients with and without AMS.
We assessed breakpoint changes of 13,101 Enterobacterales and Pseudomonas aeruginosa isolates from the past decade. All β-lactams and fluoroquinolones demonstrated decreased susceptibilities following breakpoint changes. Enterobacter cloacae experienced the largest average decrease in susceptibility amongst the Enterobacterales at 5.3% and P. aeruginosa experienced an average decrease in susceptibility of 9.3%.
The purpose of this scoping review is two-fold: to assess the literature that quantitatively measures outcomes of mentorship programs designed to support research-focused junior faculty and to identify mentoring strategies that promote diversity within academic medicine mentoring programs.
Methods:
Studies were identified by searching Medline using MeSH terms for mentoring and academic medicine. Eligibility criteria included studies focused on junior faculty in research-focused positions receiving mentorship in an academic medical center in the USA, with outcomes collected to measure career success (career trajectory, career satisfaction, quality of life, research productivity, leadership positions). Data were abstracted using a standardized data collection form, and best practices were summarized.
Results:
Search terms resulted in 1,842 articles for title and abstract review, with 27 manuscripts meeting inclusion criteria. Two studies focused specifically on women, and four focused on junior faculty from racial/ethnic backgrounds underrepresented in medicine. From the initial search, few studies were designed specifically to increase diversity or to capture outcomes relevant to promotion within academic medicine. Of those that did, most captured the impact on research productivity and career satisfaction. Traditional one-on-one mentorship, structured peer mentorship facilitated by a senior mentor, and peer mentorship in combination with one-on-one mentorship were found to be effective strategies to facilitate research productivity.
Conclusion:
Efforts are needed at the mentee, mentor, and institutional level to provide mentorship to diverse junior faculty on research competencies and career trajectory, create a sense of belonging, and connect junior faculty with institutional resources to support career success.
In object-oriented design, "smells" are symptoms of code violating design principles. When a deadline is looming, design decisions can affect the long-term quality of code or CAD models. Given this, and the similarities between object-oriented code and CAD models, this paper introduces a set of CAD smells. These smells are derived from a top-down review of potential CAD smells mapped against reported code smells that violate the abstraction, modularity, encapsulation, and hierarchy principles. The list was further reviewed considering CAD systems and specific examples (some illustrated in the paper).
The coronavirus disease 2019 (COVID-19) pandemic has required healthcare systems and hospitals to rapidly modify standard practice, including antimicrobial stewardship services. Our study examines the impact of COVID-19 on the antimicrobial stewardship pharmacist.
Design:
A survey was distributed nationally to all healthcare improvement company members.
Participants:
Pharmacist participants were mostly leaders of antimicrobial stewardship programs distributed evenly across the United States and representing urban, suburban, and rural health-system practice sites.
Results:
Participants reported relative increases in time spent completing tasks related to medication access and preauthorization (300%; P = .018) and administrative meeting time (34%; P = .067) during the COVID-19 pandemic compared to before the pandemic. Time spent rounding, making interventions, performing pharmacokinetic services, and medication reconciliation decreased.
Conclusion:
A shift away from clinical activities may negatively affect the utilization of antimicrobials.
Higher milk intake has been associated with a lower stroke risk, but not with risk of CHD. Residual confounding or reverse causation cannot be excluded. Therefore, we estimated the causal association of milk consumption with stroke and CHD risk through instrumental variable (IV) and gene-outcome analyses. IV analysis included 29 328 participants (4611 stroke; 9828 CHD) of the European Prospective Investigation into Cancer and Nutrition (EPIC)-CVD (eight European countries) and European Prospective Investigation into Cancer and Nutrition-Netherlands (EPIC-NL) case-cohort studies. rs4988235, a lactase persistence (LP) SNP that enables digestion of lactose in adulthood, was used as the genetic instrument. Intake of milk was first regressed on rs4988235 in a linear regression model. Next, associations of genetically predicted milk consumption with stroke and CHD were estimated using Prentice-weighted Cox regression. Gene-outcome analysis included 777 024 participants (50 804 cases) from MEGASTROKE (including EPIC-CVD), UK Biobank and EPIC-NL for stroke, and 483 966 participants (61 612 cases) from CARDIoGRAM, UK Biobank, EPIC-CVD and EPIC-NL for CHD. In IV analyses, each additional LP allele was associated with a higher intake of milk in EPIC-CVD (β = 13·7 g/d; 95 % CI 8·4, 19·1) and EPIC-NL (36·8 g/d; 95 % CI 20·0, 53·5). Genetically predicted milk intake was not associated with stroke (HR per 25 g/d 1·05; 95 % CI 0·94, 1·16) or CHD (1·02; 95 % CI 0·96, 1·08). In gene-outcome analyses, there was no association of rs4988235 with risk of stroke (OR 1·02; 95 % CI 0·99, 1·05) or CHD (OR 0·99; 95 % CI 0·95, 1·03). The current Mendelian randomisation analysis does not provide evidence for a causal inverse relationship between milk consumption and stroke or CHD risk.
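The core Mendelian randomisation logic above can be sketched with a Wald ratio: the causal effect of the exposure on the outcome is the gene-outcome association divided by the gene-exposure association. The numbers below are hypothetical illustrations of the calculation, not results from the cited cohorts:

```python
# Wald-ratio sketch for one-instrument Mendelian randomization.
# All input values here are hypothetical, chosen only to illustrate scaling.
import math

def wald_ratio(beta_gene_outcome, beta_gene_exposure):
    """Causal effect of exposure on outcome per unit of exposure."""
    return beta_gene_outcome / beta_gene_exposure

# Hypothetical per-allele associations: log-odds of outcome 0.01,
# milk intake +13.7 g/d (the per-allele scale used in the text).
beta_per_gram = wald_ratio(0.01, 13.7)
or_per_25g = math.exp(beta_per_gram * 25)  # rescaled to a 25 g/d increment
```

Rescaling the per-gram effect to a 25 g/d increment mirrors how the abstract reports hazard ratios per 25 g/d of genetically predicted intake.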
Coronavirus disease 2019 (COVID-19) has migrated to regions that were initially spared, and it is likely that different populations are currently at risk for illness. Herein, we present our observations of the change in characteristics and resource use of COVID-19 patients over time in a national system of community hospitals to help inform those managing surge planning, operational management, and future policy decisions.
To determine risk factors for mortality among COVID-19 patients admitted to a system of community hospitals in the United States.
Design:
Retrospective analysis of patient data collected from the routine care of COVID-19 patients.
Setting:
System of >180 acute-care facilities in the United States.
Participants:
All admitted patients with positive identification of COVID-19 and a documented discharge as of May 12, 2020.
Methods:
Demographic characteristics, vital signs at admission, patient comorbidities, and recorded discharge disposition were determined in this population to construct a logistic regression model estimating the odds of mortality, particularly for those patients characterized as not being critically ill at admission.
Results:
In total, 6,180 COVID-19+ patients were identified as of May 12, 2020. Most COVID-19+ patients (4,808, 77.8%) were admitted directly to a medical-surgical unit with no documented critical care or mechanical ventilation within 8 hours of admission. After adjusting for demographic characteristics, comorbidities, and vital signs at admission in this subgroup, the largest driver of the odds of mortality was patient age (OR, 1.07; 95% CI, 1.06–1.08; P < .001). Decreased oxygen saturation at admission was associated with increased odds of mortality (OR, 1.09; 95% CI, 1.06–1.12; P < .001) as was diabetes (OR, 1.57; 95% CI, 1.21–2.03; P < .001).
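A per-unit odds ratio such as the one for age compounds multiplicatively across units of the covariate, so under the fitted model a 10-year age difference corresponds to roughly doubled odds of mortality. A quick illustration using the point estimate above:

```python
# A per-unit OR compounds multiplicatively across units of the covariate.
or_per_year = 1.07                 # point estimate for age from the abstract
or_per_decade = or_per_year ** 10  # implied OR for a 10-year age difference
print(round(or_per_decade, 2))     # ≈ 1.97
```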
Conclusions:
The identification of factors observable at admission that are associated with mortality in COVID-19 patients who are initially admitted to non-critical care units may help care providers, hospital epidemiologists, and hospital safety experts better plan for the care of these patients.
The aim was to investigate cognitive abnormalities in healthy individuals (no Axis I or II disorders) at risk for bipolar disorder (BD) and schizophrenia (SZ).
Materials and Methods:
Participants were 17 BD-R, 15 SZ-R and 23 controls. All participants underwent assessment of IQ, working memory, verbal memory and learning, visuospatial memory, and verbal and visual recall and recognition. Absence of lifetime Axis I and II disorders was screened for using the Structured Clinical Interview for DSM-IV, and symptomatology was assessed with the Brief Psychiatric Rating Scale (BPRS).
Results:
No difference was found in IQ. The SZ-R underperformed compared to BD-R and controls in working memory. The SZ-R had an increased number of intrusions but did not differ from the BD-R in short delay. The SZ-R showed impairment in long-term recall. No effect of learning was found. SZ-R and BD-R underperformed compared to controls in visuospatial memory. SZ-R showed long-term memory deficits, with higher overall forgetting scores in both visual and verbal tests compared to BD-R and controls. The BD relatives were able to retain more verbal items, but a comparable number of visual items, relative to SZ-R. An effect of BPRS total score was found only for BD-R across all measures.
Conclusions:
BD-R, unlike SZ-R, do not show deficits compared to controls in the dorsal prefrontal cortex (DPFC). The SZ-R show impairments in frontotemporal networks that are preserved in BD-R, supporting deficits in semantic categories in both encoding and retrieval, whereas the impairment shown in BD-R may be mainly attributed to the effect of symptoms.
The aim of this project was to investigate cognitive abnormalities in healthy individuals (no Axis I or II disorders) at risk for bipolar disorder (BD) and schizophrenia (SZ).
Materials and Methods:
Participants were 17 BD-R, 15 SZ-R and 23 controls. All participants underwent assessment of IQ, inhibition, verbal fluency, planning and cognitive set shifting. Absence of lifetime Axis I and II disorders was screened for using the Structured Clinical Interview for DSM-IV, and symptomatology was assessed with the Brief Psychiatric Rating Scale (BPRS).
Results:
No difference was found in IQ. Loss of inhibition was found in both SZ-R and BD-R compared to controls, whereas SZ-R had slower initiation times. SZ-R also failed to inhibit relatively fast erroneous responses, leading to an effect on error rates but not on reaction times. SZ-R and BD-R produced fewer words compared to controls, and the former group made more errors. BD-R achieved a comparable number of categories to controls and made an equal number of errors, whereas SZ-R underperformed compared to both of these groups on both measures. An effect of BPRS total score was found only for BD-R across all measures apart from inhibition.
Conclusions:
Genetic predisposition to SZ may be mediated by deficits in both the ventral prefrontal cortex (VPFC) and the dorsal prefrontal cortex (DPFC). In BD-R, impairment was limited to the VPFC, whereas DPFC function was preserved. The two disorders share inhibition deficits associated with the VPFC.
This prospective, epidemiological British Ophthalmological Surveillance Unit study into ophthalmic complications of functional endoscopic sinus surgery aimed to determine the minimum incidence, presenting features and management throughout the UK.
Methods
Cases of ophthalmic complications of functional endoscopic sinus surgery, between February 2016 and February 2018, were identified through the British Ophthalmological Surveillance Unit reporting card system. Reporting ophthalmic consultants were sent an initial questionnaire, followed by a second questionnaire at six months.
Results
Twenty-six cases of ophthalmic complications of functional endoscopic sinus surgery were reported. The majority (16 cases (62 per cent)) had limitations of ocular motility at presentation. The most common final diagnoses were rectus muscle trauma (33 per cent) and nasolacrimal duct trauma (27 per cent). Using national data, this study reports a minimum incidence of ophthalmic complications of functional endoscopic sinus surgery in the UK of 0.2 per cent over two years.
Conclusion
In terms of ophthalmic complications, functional endoscopic sinus surgery is shown to be safe. Ophthalmic complications are rare, but when they do occur, they commonly result in rectus muscle trauma, often requiring surgical intervention.
The RemoveDEBRIS mission was the first to successfully demonstrate, in orbit, a series of technologies that can be used for the active removal of space debris. The mission started in late 2014, sponsored by a grant from the EC, with a consortium led by the Surrey Space Centre developing the mission from concept to in-orbit demonstrations, which concluded in March 2019. Technologies for the capture of large space debris, such as a net and a harpoon, were successfully tested, together with hardware and software to retrieve data on the kinematics of non-cooperative target debris from observations carried out with on-board cameras. The final demonstration consisted of the deployment of a drag sail to increase the drag of the satellite and accelerate its demise.
Proximal environments could facilitate smoking cessation among low-income smokers by making cessation both appealing to strive for and tenable.
Aims
We sought to examine how home smoking rules and proximal environmental factors such as other household members' and peers' smoking behaviors and attitudes related to low-income smokers' past quit attempts, readiness, and self-efficacy to quit.
Methods
This analysis used data from the baseline survey of the Offering Proactive Treatment Intervention (OPT-IN) study (a randomized controlled trial of proactive tobacco cessation outreach), which was completed by 2,406 participants in 2011–2012. We tested the associations between predictors (home smoking rules and proximal environmental factors) and outcomes (past-year quit attempts, readiness to quit, and quitting self-efficacy).
Results
Smokers who lived in homes with more restrictive household smoking rules, and/or who reported having ‘important others’ who would be supportive of their quitting, were more likely to report having made a quit attempt in the past year and had greater readiness to quit and greater self-efficacy related to quitting.
Conclusions
Adjustments to proximal environments, including strengthening household smoking rules, might encourage cessation even if other household members are smokers.