Many preoperative urine cultures are of low value and may even lead to patient harms. This study sought to understand practices around ordering preoperative urine cultures and prescribing antibiotic treatment.
We interviewed participants using a qualitative semi-structured interview guide. Collected data were coded inductively and with the Dual Process Model (DPM) using MAXQDA software. Data in the “Testing Decision-Making” code were further reviewed using perceived risk as a sensitizing concept.
Results:
We identified themes relating to surgeons’ concerns about de-implementing preoperative urine cultures to detect asymptomatic bacteriuria (ASB) in patients undergoing non-urological procedures: (1) anxiety and uncertainty about missing signs of infection spanned surgical specialties; (2) surgeons perceived risks of negative consequences from omitting urine cultures and treatment before specific procedure sites and types; and (3) participants suggested potential routes for adjusting these perceived risks to facilitate acceptance of de-implementation. Notably, participants suggested that leadership support and peer engagement could improve surgeon buy-in.
Conclusions:
Concerns about perceived risks sometimes outweigh the evidence against routine preoperative urine cultures to detect ASB. Evidence from trusted peers may improve openness to de-implementing preoperative urine cultures.
The intent of this document is to highlight practical recommendations in a concise format designed to assist acute-care hospitals in implementing and prioritizing their surgical-site infection (SSI) prevention efforts. This document updates the Strategies to Prevent Surgical Site Infections in Acute Care Hospitals published in 2014.1 This expert guidance document is sponsored by the Society for Healthcare Epidemiology of America (SHEA). It is the product of a collaborative effort led by SHEA, the Infectious Diseases Society of America (IDSA), the Association for Professionals in Infection Control and Epidemiology (APIC), the American Hospital Association (AHA), and The Joint Commission, with major contributions from representatives of a number of organizations and societies with content expertise.
The Intensity Interferometry technique consists of measuring the spatial coherence (visibility) of an object via its intensity fluctuations over a sufficient range of telescope separations (baselines). This allows us to study the size, shape and morphology of stars with unprecedented resolution. Cherenkov telescopes have a set of characteristics that coincidentally allow for Intensity Interferometry observations: very large reflective surfaces, sensitivity to individual photons, temporal resolution of nanoseconds and the fact that they come in groups of several telescopes. In recent years, the MAGIC Collaboration has developed a deadtime-free Intensity Interferometry setup for its two 17 m diameter Cherenkov telescopes that includes a 4-channel GPU-based real-time correlator, 410–430 nm filters and new ways of splitting its primary mirrors into submirrors using Active Mirror Control (AMC). With this setup, MAGIC can operate as a long-baseline optical interferometer in the baseline range 40–90 m, which translates into angular resolutions of 0.5–1 mas. Additionally, thanks to its AMC, it can simultaneously measure the zero-baseline correlation or, by splitting into submirrors, access shorter baselines under 17 m in multiple u-v plane orientations. The best candidates to observe with this technique are relatively small and bright stars, in other words, massive stars (O, B and A types). We will present the science cases that are currently being proposed for this setup, as well as the prospects for the future of the system and technique, such as the possibility of large-scale implementation with CTA.
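As a quick plausibility check on the quoted numbers, the angular resolution of a two-telescope baseline can be approximated as θ ≈ λ/(2B). Below is a minimal sketch in Python; the λ/(2B) convention is an assumption (resolution criteria vary), and the function name is ours, not from the MAGIC setup:

```python
import math

MAS_PER_RAD = 180 / math.pi * 3600 * 1000  # milliarcseconds per radian

def resolution_mas(wavelength_m: float, baseline_m: float) -> float:
    # theta ~ lambda / (2B), a common convention for the resolution
    # of an intensity-interferometry baseline
    return wavelength_m / (2 * baseline_m) * MAS_PER_RAD

for baseline_m in (40, 90):  # MAGIC baseline range quoted above
    print(f"B = {baseline_m} m -> {resolution_mas(420e-9, baseline_m):.2f} mas")
# B = 40 m -> 1.08 mas
# B = 90 m -> 0.48 mas
```

With the 410–430 nm filters, this reproduces the 0.5–1 mas range quoted above.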
To evaluate the frequency of antibiotic prescribing for common infections via telemedicine compared to face-to-face visits.
Design:
Systematic literature review and meta-analysis.
Methods:
We searched PubMed, CINAHL, Embase (Elsevier platform) and Cochrane CENTRAL to identify studies comparing the frequency of antibiotic prescribing via telemedicine and face-to-face visits, without restriction by publication date or language. We conducted meta-analyses of 5 infections: sinusitis, pharyngitis, otitis media, upper respiratory infection (URI) and urinary tract infection (UTI). Random-effects models were used to obtain pooled odds ratios (ORs). Heterogeneity was evaluated with the I2 statistic and the Cochran Q test.
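For readers unfamiliar with the pooling step, the sketch below shows a DerSimonian–Laird random-effects computation of a pooled OR, I2, and the Cochran Q statistic. It is a minimal illustration with made-up study estimates, not this review's actual code or data:

```python
import math

def pooled_or_dl(log_ors, ses):
    """DerSimonian-Laird random-effects pooling of study log odds ratios.
    Returns (pooled OR, 95% CI, I^2 in %). Illustrative only."""
    w = [1 / se**2 for se in ses]                               # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_ors))  # Cochran Q
    df = len(log_ors) - 1
    tau2 = max(0.0, (q - df) / (sum(w) - sum(wi**2 for wi in w) / sum(w)))
    w_re = [1 / (se**2 + tau2) for se in ses]                   # random-effects weights
    mu = sum(wi * y for wi, y in zip(w_re, log_ors)) / sum(w_re)
    se_mu = math.sqrt(1 / sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    ci = (math.exp(mu - 1.96 * se_mu), math.exp(mu + 1.96 * se_mu))
    return math.exp(mu), ci, i2

# hypothetical per-study log ORs and standard errors, not data from this review
or_, ci, i2 = pooled_or_dl([0.18, 0.26, 0.10], [0.10, 0.12, 0.15])
print(f"pooled OR {or_:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}, I2 {i2:.0f}%")
```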
Results:
Among 3,106 studies screened, 23 studies (1 randomized controlled trial, 22 observational studies) were included in the systematic literature review. Most of the studies (21 of 23) were conducted in the United States. Studies were substantially heterogeneous, but stratified analyses revealed that providers prescribed antibiotics more frequently via telemedicine for otitis media (pooled odds ratio [OR], 1.26; 95% confidence interval [CI], 1.04–1.52; I2 = 31%) and pharyngitis (pooled OR, 1.16; 95% CI, 1.01–1.33; I2 = 0%). We detected no significant difference in the frequency of antibiotic prescribing for sinusitis (pooled OR, 0.86; 95% CI, 0.70–1.06; I2 = 91%), URI (pooled OR, 1.18; 95% CI, 0.59–2.39; I2 = 100%), or UTI (pooled OR, 2.57; 95% CI, 0.88–7.46; I2 = 91%).
Conclusions:
Telemedicine visits for otitis media and pharyngitis were associated with higher rates of antibiotic prescribing. The interpretation of these findings requires caution due to substantial heterogeneity among available studies. Large-scale, well-designed studies with comprehensive assessment of antibiotic prescribing for common outpatient infections comparing telemedicine and face-to-face visits are needed to validate our findings.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
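The PGS associations reported in the Results amount to regressing AAO on a standardized polygenic score. A minimal sketch on simulated data follows; real analyses would include covariates such as sex, cohort, and genetic ancestry principal components, and the effect size here is invented for illustration:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
pgs = rng.standard_normal(n)                   # standardized polygenic score
aao = 25 - 0.35 * pgs + rng.normal(0, 8, n)    # simulated age at onset (years)

# regress AAO on the standardized PGS; beta is in years per s.d. of PGS
X = sm.add_constant(pgs)
fit = sm.OLS(aao, X).fit()
print(fit.params[1], fit.bse[1])               # beta and its standard error
```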
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
To compare the prevalence of select cardiovascular risk factors (CVRFs) in patients with mild cognitive impairment (MCI) versus lifetime history of major depression disorder (MDD) and a normal comparison group using baseline data from the Prevention of Alzheimer’s Dementia with Cognitive Remediation plus Transcranial Direct Current Stimulation (PACt-MD) study.
Design:
Baseline data from a multi-centered intervention study of older adults with MCI, history of MDD, or combined MCI and history of MDD (PACt-MD) were analyzed.
Setting:
Community-based, multi-centered study across 5 academic sites in Toronto.
Participants:
Older adults with MCI, history of MDD, or combined MCI and history of MDD and healthy controls.
Measurements:
We examined the baseline distribution of smoking, hypertension and diabetes in three groups of participants aged 60+ years in the PACt-MD cohort study: MCI (n = 278), MDD (n = 95), and healthy older controls (n = 81). Generalized linear models were fitted to study the effect of CVRFs on MCI and MDD as well as neuropsychological composite scores.
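As an illustration of the modelling step, a logistic GLM of MCI status on hypertension, unadjusted and then adjusted for age, sex, and education, might look like the following sketch. The data are simulated and the variable names are ours; this is not the PACt-MD analysis code:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "age": rng.integers(60, 85, n),
    "sex": rng.integers(0, 2, n),
    "education_yrs": rng.integers(8, 20, n),
    "hypertension": rng.integers(0, 2, n),
})
# simulated outcome: odds of MCI rise with age and with hypertension
logit_p = -8 + 0.1 * df.age + 0.4 * df.hypertension
df["mci"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

unadj = smf.logit("mci ~ hypertension", data=df).fit(disp=0)
adj = smf.logit("mci ~ hypertension + age + sex + education_yrs", data=df).fit(disp=0)
print(unadj.params["hypertension"], adj.params["hypertension"])  # log odds ratios
```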
Results:
In unadjusted analysis, the odds of hypertension were higher in the MCI cohort than in healthy controls (p < .05). Statistical significance was lost after adjusting for age, sex, and education (p > .05). A history of hypertension was associated with lower performance in composite executive function (p < .05) and overall composite neuropsychological test score (p < .05) among a pooled cohort with MCI or MDD.
Conclusions:
This study reinforces the importance of treating modifiable CVRFs, specifically hypertension, as a means of mitigating cognitive decline in patients with at-risk cognitive conditions.
To evaluate the effectiveness of chlorhexidine (CHG) dressings to prevent catheter-related bloodstream infections (CRBSIs).
Design:
Systematic review and meta-analysis.
Methods:
We searched PubMed, CINAHL, EMBASE, and ClinicalTrials.gov for studies (randomized controlled and quasi-experimental trials) with the following criteria: patients with short- or long-term catheters; CHG dressings were used in the intervention group and nonantimicrobial dressings in the control group; CRBSI was an outcome. Random-effects models were used to obtain pooled risk ratios (pRRs). Heterogeneity was evaluated using the I2 test and the Cochran Q statistic.
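Per-study effect sizes for such a meta-analysis are typically log risk ratios with standard errors derived from each study's 2×2 counts; the pooled step then proceeds with random-effects weights as in the earlier sketch. A minimal illustration with hypothetical counts:

```python
import math

def log_rr(events_tx, n_tx, events_ctl, n_ctl):
    """Log risk ratio and its standard error from a single study's counts."""
    rr = (events_tx / n_tx) / (events_ctl / n_ctl)
    se = math.sqrt(1/events_tx - 1/n_tx + 1/events_ctl - 1/n_ctl)
    return math.log(rr), se

# hypothetical counts: CRBSIs / catheters in the CHG vs control arms
y, se = log_rr(12, 800, 20, 790)
lo, hi = math.exp(y - 1.96 * se), math.exp(y + 1.96 * se)
print(f"RR {math.exp(y):.2f} (95% CI {lo:.2f}-{hi:.2f})")
```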
Results:
In total, 20 studies (18 randomized controlled trials; 15,590 catheters) without evidence of publication bias and mainly performed in intensive care units (ICUs) were included. CHG dressings significantly reduced CRBSIs (pRR, 0.71; 95% CI, 0.58–0.87), independent of the CHG dressing type used. Benefits were limited to adults with short-term central venous catheters (CVCs), including onco-hematological patients. For long-term CVCs, CHG dressings decreased exit-site/tunnel infections (pRR, 0.37; 95% CI, 0.22–0.64). Contact dermatitis was associated with CHG dressing use (pRR, 5.16; 95% CI, 2.09–12.70), especially in neonates and pediatric populations, in whom severe reactions occurred. Two studies evaluated CHG resistance and found no evidence of acquired resistance.
Conclusions:
CHG dressings prevent CRBSIs in adults with short-term CVCs, including patients with an onco-hematological disease. CHG dressings might reduce exit-site and tunnel infections in long-term CVCs. In neonates and pediatric populations, proof of CHG dressing effectiveness is lacking and there is an increased risk of serious adverse events. Future studies should investigate CHG effectiveness in non-ICU settings and monitor for CHG resistance.
Pregabalin is indicated for the treatment of GAD in adults in Europe. The efficacy and safety of pregabalin for the treatment of adults and elderly patients with GAD have been demonstrated in 6 of 7 short-term clinical trials of 4 to 8 weeks' duration.
Aims/objectives
To characterise the long-term efficacy and safety of pregabalin in subjects with GAD.
Methods
Subjects were randomised to double-blind treatment with either high-dose pregabalin (450-600 mg/d), low-dose pregabalin (150-300 mg/d), or lorazepam (3-4 mg/d) for 3 months. Treatment was then extended for a further 3 months with either continued active drug or blinded placebo.
Results
At 3 months, the mean change from baseline in Hamilton Anxiety Rating Scale (HAM-A) score for high- and low-dose pregabalin and for lorazepam ranged from -16.0 to -17.4. Mean change from baseline in Clinical Global Impression-Severity (CGI-S) score ranged from -2.1 to -2.3, and mean CGI-Improvement (CGI-I) scores were 1.9 for each active treatment group. At 6 months, improvement was retained in all 3 active drug groups, even in subjects switched to placebo. HAM-A and CGI-S change from baseline scores ranged from -14.9 to -19.0 and -2.0 to -2.5, respectively. Mean CGI-I scores ranged from 1.5 to 2.3. The most frequently reported adverse events were insomnia, fatigue, dizziness, headache, and somnolence.
Conclusions
Efficacy was observed at 3 months, with maintained improvement in anxiety symptoms over 6 months of treatment. These results are consistent with previously reported efficacy and safety trials of shorter duration with pregabalin and lorazepam in subjects with GAD.
Pregabalin is indicated for the treatment of generalised anxiety disorder (GAD) in adults in Europe. When pregabalin is discontinued, a 1-week (minimum) taper is recommended to prevent potential discontinuation symptoms.
Aims/objectives
To evaluate whether a 1-week pregabalin taper, after 3 or 6 months of treatment, is associated with the development of discontinuation symptoms (including rebound anxiety) in subjects with GAD.
Methods
Subjects were randomised to double-blind treatment with low- (150-300 mg/d) or high-dose pregabalin (450-600 mg/d) or lorazepam (3-4 mg/d) for 3 months. After 3 months ~25% of subjects in each group (per the original randomisation) underwent a double-blind, 1-week taper, with substitution of placebo. The remaining subjects continued on active treatment for another 3 months and underwent the 1-week taper at 6 months.
Results
Discontinuation after 3 months was associated with low mean changes in Physician Withdrawal Checklist (PWC) scores (range: +1.4 to +2.3) and Hamilton Anxiety Rating Scale (HAM-A) scores (range: +0.9 to +2.3) for each pregabalin dose and lorazepam. Discontinuation after 6 months was associated with low mean changes in PWC scores (range: -1.0 to +3.0) and HAM-A scores (range: -0.8 to +3.0) for all active drugs and placebo. Incidence of rebound anxiety during pregabalin taper was low and did not appear related to treatment dose or duration.
Conclusions
A 1-week taper following 3 or 6 months of pregabalin treatment was not associated with clinically meaningful discontinuation symptoms as evaluated by changes in the PWC and HAM-A rating scales.
We report a novel strategy to render stainless steel (SS) more versatile as a substrate for preparing electrodes for efficient hydrogen evolution by interface engineering. Our strategy involves the growth of carbon nanotubes (CNTs) by atmospheric pressure chemical vapor deposition (APCVD) as the interface material on the surface of SS. We optimized the procedure to prepare CNTs/SS and demonstrate that CNTs/SS prepared at 700 °C has higher activity for the hydrogen evolution reaction (HER) than samples prepared at other temperatures, which can be attributed to the higher number of defects and the higher content of pyrrolic N obtained at this temperature. Our strategy offers a new approach to employing SS as a substrate for the preparation of highly efficient electrodes and has the potential to be widely used in electrochemistry.
We examined Clostridioides difficile infection (CDI) prevention practices and their relationship with hospital-onset healthcare facility-associated CDI rates (CDI rates) in Veterans Affairs (VA) acute-care facilities.
Design:
Cross-sectional study.
Methods:
From January 2017 to February 2017, we conducted an electronic survey of CDI prevention practices and hospital characteristics in the VA. We linked survey data with CDI rate data for the period January 2015 to December 2016. We stratified facilities according to whether their overall CDI rate per 10,000 bed days of care was above or below the national VA mean, and we examined whether specific CDI prevention practices were associated with an increased risk of a CDI rate above that mean.
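The 2.54 risk ratio reported in the Results is the kind of quantity obtained from a 2×2 table of facilities. A sketch with hypothetical within-group splits (only the group totals of 22 and 60 facilities come from the Results below):

```python
# hypothetical 2x2: facilities by self-reported CDI trend vs the national mean
#                     above mean   below mean
# rate increased           a            b
# rate decreased           c            d
a, b, c, d = 14, 8, 15, 45   # illustrative counts, not the survey data

risk_increase = a / (a + b)  # P(above mean | reported an increase)
risk_decrease = c / (c + d)  # P(above mean | reported a decrease)
print(f"risk ratio = {risk_increase / risk_decrease:.2f}")  # ~2.5, near the 2.54 reported
```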
Results:
All 126 facilities responded (100% response rate). Since implementing CDI prevention practices in July 2012, 60 of 123 facilities (49%) reported a decrease in CDI rates; 22 of 123 facilities (18%) reported an increase, and 41 of 123 (33%) reported no change. Facilities reporting an increase in the CDI rate (vs those reporting a decrease) after implementing prevention practices were 2.54 times more likely to have CDI rates that were above the national mean CDI rate. Whether a facility’s CDI rates were above or below the national mean CDI rate was not associated with self-reported cleaning practices, duration of contact precautions, availability of private rooms, or certification of infection preventionists in infection prevention.
Conclusions:
We found considerable variation in CDI rates. We were unable to identify which particular CDI prevention practices (i.e., bundle components) were associated with lower CDI rates.
Healthcare-associated infections (HAIs) are a significant burden on healthcare facilities. Universal gloving is a horizontal intervention to prevent transmission of pathogens that cause HAI. In this meta-analysis, we aimed to identify whether implementation of universal gloving is associated with decreased incidence of HAI in clinical settings.
Methods:
A systematic literature search was conducted to find all relevant publications using search terms for universal gloving and HAIs. Pooled incidence rate ratios (IRRs) and 95% confidence intervals (CIs) were calculated using random effects models. Heterogeneity was evaluated using the Woolf test and the I2 test.
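For rate data such as these, the per-study effect is a log incidence rate ratio with standard error sqrt(1/e1 + 1/e0), where e1 and e0 are the event counts in each arm; pooling then uses random-effects weights as in the earlier sketch. A minimal illustration with hypothetical counts:

```python
import math

def log_irr(events_tx, time_tx, events_ctl, time_ctl):
    """Log incidence rate ratio and its SE from events and person-time."""
    irr = (events_tx / time_tx) / (events_ctl / time_ctl)
    se = math.sqrt(1 / events_tx + 1 / events_ctl)
    return math.log(irr), se

# hypothetical HAI counts and patient days in each arm, not data from this review
y, se = log_irr(38, 12000, 45, 11500)
print(f"IRR {math.exp(y):.2f}, 95% CI "
      f"{math.exp(y - 1.96 * se):.2f}-{math.exp(y + 1.96 * se):.2f}")
```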
Results:
In total, 8 studies were included. These studies were moderately to substantially heterogeneous (I2 = 59%) and had varied results. Stratified analyses showed a nonsignificant association between universal gloving and incidence of methicillin-resistant Staphylococcus aureus (MRSA; pooled IRR, 0.94; 95% CI, 0.79–1.11) and vancomycin-resistant enterococci (VRE; pooled IRR, 0.94; 95% CI, 0.69–1.28). Studies that implemented universal gloving alone showed a significant association with decreased incidence of HAI (IRR, 0.77; 95% CI, 0.67–0.89), but studies implementing universal gloving as part of intervention bundles showed no significant association with incidence of HAI (IRR, 0.95; 95% CI, 0.86–1.05).
Conclusions:
Universal gloving may be associated with a small protective effect against HAI. Despite limited data, universal gloving may be considered in high-risk settings, such as pediatric intensive care units. Further research should be performed to determine the effects of universal gloving on a broader range of pathogens, including gram-negative pathogens.
Despite a reported worldwide increase, the incidence of extended-spectrum β-lactamase (ESBL) Escherichia coli and Klebsiella infections in the United States is unknown. Understanding the incidence and trends of ESBL infections will aid in directing research and prevention efforts.
OBJECTIVE
To perform a literature review to identify the incidence of ESBL-producing E. coli and Klebsiella infections in the United States.
DESIGN
Systematic literature review.
METHODS
MEDLINE via Ovid, CINAHL, Cochrane Library, NHS Economic Evaluation Database, Web of Science, and Scopus were searched for multicenter (≥2 sites) US studies published between 2000 and 2015 that evaluated the incidence of ESBL-E. coli or ESBL-Klebsiella infections. We excluded studies that examined resistance rates alone or lacked a denominator including uninfected patients (e.g., patient days, device days, number of admissions, or number of discharges). Additionally, articles that were not written in English, contained duplicated data, or pertained to ESBL organisms from food, animals, or the environment were excluded.
RESULTS
Among 51,419 studies examined, 9 were included for review. Incidence rates differed by patient population, time period, and ESBL definition, ranging from 0 infections per 100,000 patient days to 16.64 infections per 10,000 discharges; incidence increased over time from 1997 to 2011. Rates were slightly higher for ESBL-Klebsiella infections than for ESBL-E. coli infections.
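Because the quoted range mixes two denominators (patient days vs discharges), the same underlying count yields different-looking rates and the figures are not directly comparable. A hypothetical conversion with invented numbers:

```python
# illustrative conversion between the denominator conventions named above
cases = 42                # hypothetical ESBL infections in one year
patient_days = 250_000
discharges = 30_000

per_100k_pd = cases / patient_days * 100_000
per_10k_dis = cases / discharges * 10_000
print(f"{per_100k_pd:.1f} per 100,000 patient days")  # 16.8
print(f"{per_10k_dis:.1f} per 10,000 discharges")     # 14.0
```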
CONCLUSION
The incidence of ESBL-E. coli and ESBL-Klebsiella infections in the United States has increased, with slightly higher rates of ESBL-Klebsiella infections. Appropriate estimates of ESBL infections when coupled with other mechanisms of resistance will allow for the appropriate targeting of resources toward research, drug discovery, antimicrobial stewardship, and infection prevention.
Intraspecific interference of populations of sunflower (Helianthus annuus L.) and velvetleaf (Abutilon theophrasti Medic), as well as interspecific populations of these two broadleaf weeds, in sugarbeets (Beta vulgaris L. ‘Mono Hy D2’) was determined in 2-yr field experiments. Sunflower was more competitive in sugarbeets than was velvetleaf. At densities of 6, 12, 18, and 24 sunflower plants/30 m of row, root yields were reduced 40, 52, 67, and 73%, respectively. At the same densities of velvetleaf, root yields were reduced only 14, 17, 25, and 30%, respectively. Interspecific interference of these two broadleaf weeds at the same densities reduced root yields 19, 36, 43, and 56%, respectively. The minimum number of weeds required/30 m of row to reduce sugarbeet root yields was predicted to be 1 for sunflower, 9 to 12 for velvetleaf, and 2 to 7 for an equal population of sunflower and velvetleaf.
Twelve sequential herbicide treatments were compared to cycloate (S-ethyl N-ethylthiocyclohexanecarbamate), a standard treatment, for control of annual weeds in sugarbeets (Beta vulgaris L.) in three field experiments conducted from 1971 through 1977. At harvest, seven sequential treatments had less than 10 annual broadleaf weeds per 30 m of row, whereas there were 40 broadleaf weeds per 30 m of row for the cycloate treatment. Four of these sequential treatments had significantly higher root yields and net returns than the cycloate treatment. Depending on the sequential treatment and year, tonnage was increased 7.3 to 20.3 t/ha, and net returns $150 to $515/ha above those with cycloate. The most effective sequential treatment for control of weeds was a preplanting mixture of 2.2 kg/ha of ethofumesate [(±)-2-ethoxy-2,3-dihydro-3,3-dimethyl-5-benzofuranyl methanesulfonate] plus 1.7 kg/ha of diclofop {2-[4-(2,4-dichlorophenoxy)phenoxy] propanoic acid} followed by a postemergence mixture of 0.6 kg/ha each of desmedipham [ethyl m-hydroxycarbanilate carbanilate (ester)] plus phenmedipham (methyl m-hydroxycarbanilate m-methylcarbanilate). This sequential herbicide treatment increased root yields by an average of 20.3 t/ha and net returns by $515/ha above those with cycloate.
The herbicide 2,4-D [(2,4-dichlorophenoxy)acetic acid] was applied at sublethal rates to sugarbeets (Beta vulgaris L. ‘Mono Hy D2’) in the field at different growth stages to determine its effect on growth and yield. The greatest reduction in top growth occurred when the highest rate of 2,4-D, 0.07 kg/ha, was applied to the oldest plants (12-leaf stage). All rates of 2,4-D reduced the components of sucrose yield (percentage sucrose, percentage purity, and root weight) to the extent that, together, the three components contributed to a significant reduction in recoverable sucrose. The yields of recoverable sucrose were reduced 6.8, 7.8, and 13.2% by the 0.017, 0.035, and 0.07 kg/ha rates, respectively.
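The compounding of the three components is multiplicative: recoverable sucrose scales roughly with root weight × sucrose percentage × purity, so modest losses in each component combine into a larger overall reduction. A hypothetical illustration (the component losses here are invented, not values from the study):

```python
# hypothetical fractional reductions in each component of recoverable sucrose
root_yield_loss, sucrose_loss, purity_loss = 0.05, 0.04, 0.05

combined = 1 - (1 - root_yield_loss) * (1 - sucrose_loss) * (1 - purity_loss)
print(f"combined reduction: {combined:.1%}")  # 13.4%, i.e., small losses compound
```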
Dormant and nondormant kochia [Kochia scoparia (L.) Schrad. # KCHSC] seed populations were buried at six soil depths in Colorado. Portions of both populations remained viable for 36 months. Persistence increased with burial depth in both populations. Seed loss from the initially dormant population was limited to germination in situ, but seed loss from the initially nondormant population included significant viability loss at burial depths of 10 cm or less. Persistence of both populations was regulated by dormancy retention. Shallow tillage practices are predicted to decrease seed persistence in soil and increase successful seedling emergence. Deep tillage practices are predicted to reduce seedling emergence but increase soil seed populations.