Clostridioides difficile infection (CDI) may be misdiagnosed if testing is performed in the absence of signs or symptoms of disease. This study sought to support appropriate testing by estimating the impact of signs, symptoms, and healthcare exposures on pre-test likelihood of CDI.
Methods:
A panel of fifteen experts in infectious diseases participated in a modified UCLA/RAND Delphi study to estimate likelihood of CDI. Consensus, defined as agreement by >70% of panelists, was assessed via a REDCap survey. Items without consensus were discussed in a virtual meeting followed by a second survey.
Results:
All fifteen panelists completed both surveys (100% response rate). In the initial survey, consensus was present on 6 of 15 (40%) items related to risk of CDI. After panel discussion and clarification of questions, consensus (>70% agreement) was reached on all remaining items in the second survey. Antibiotics were identified as the primary risk factor for CDI and grouped into three categories: high-risk (likelihood ratio [LR] 7, 93% agreement among panelists in first survey), low-risk (LR 3, 87% agreement in first survey), and minimal-risk (LR 1, 71% agreement in first survey). Other major factors included new or unexplained severe diarrhea (e.g., ≥ 10 liquid bowel movements per day; LR 5, 100% agreement in second survey) and severe immunosuppression (LR 5, 87% agreement in second survey).
Conclusion:
Infectious disease experts concurred on the importance of signs, symptoms, and healthcare exposures for diagnosing CDI. The resulting risk estimates can be used by clinicians to optimize CDI testing and treatment.
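The panel's likelihood ratios can be combined with a pre-test probability through the odds form of Bayes' theorem (post-test odds = pre-test odds × LR). The sketch below is a minimal illustration of that arithmetic; the 5% baseline prevalence is a hypothetical number for demonstration, not a figure from the study.

```python
def post_test_probability(pre_test_prob, likelihood_ratio):
    """Update a pre-test probability with a likelihood ratio
    using the odds form of Bayes' theorem."""
    pre_odds = pre_test_prob / (1 - pre_test_prob)   # probability -> odds
    post_odds = pre_odds * likelihood_ratio           # apply the LR
    return post_odds / (1 + post_odds)                # odds -> probability

# Illustrative only: a hypothetical 5% baseline CDI probability combined
# with the panel's LR of 7 for high-risk antibiotic exposure.
print(round(post_test_probability(0.05, 7), 3))  # → 0.269
```

Under these assumed inputs, a high-risk antibiotic exposure would raise the estimated probability of CDI from 5% to roughly 27%, which is the kind of calculation the panel's LRs are intended to support.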
We examined whether cannabis use contributes to the increased risk of psychotic disorder for non-western minorities in Europe.
Methods
We used data from the EU-GEI study (collected at sites in Spain, Italy, France, the United Kingdom, and the Netherlands) on 825 first-episode patients and 1026 controls. We estimated the odds ratio (OR) of psychotic disorder for several groups of migrants compared with the local reference population, without and with adjustment for measures of cannabis use.
Results
The OR of psychotic disorder for non-western minorities, adjusted for age, sex, and recruitment area, was 1.80 (95% CI 1.39–2.33). Further adjustment of this OR for frequency of cannabis use had a minimal effect: OR = 1.81 (95% CI 1.38–2.37). The same applied to adjustment for frequency of use of high-potency cannabis. Likewise, adjustments of ORs for most sub-groups of non-western countries had a minimal effect. There were two exceptions. For the Black Caribbean group in London, after adjustment for frequency of use of high-potency cannabis the OR decreased from 2.45 (95% CI 1.25–4.79) to 1.61 (95% CI 0.74–3.51). Similarly, the OR for Surinamese and Dutch Antillean individuals in Amsterdam decreased after adjustment for daily use: from 2.57 (95% CI 1.07–6.15) to 1.67 (95% CI 0.62–4.53).
Conclusions
The contribution of cannabis use to the excess risk of psychotic disorder for non-western minorities was small. However, some evidence of an effect was found for people of Black Caribbean heritage in London and for those of Surinamese and Dutch Antillean heritage in Amsterdam.
Antimicrobial stewardship programs (ASPs) exist to optimize antibiotic use, reduce selection for antimicrobial-resistant microorganisms, and improve patient outcomes. Rapid and accurate diagnosis is essential to optimal antibiotic use. Because diagnostic testing plays a central role in diagnosis, it exerts one of the strongest influences on clinician antibiotic prescribing behaviors. Diagnostic stewardship has consequently emerged to improve clinician test ordering and test result interpretation. Antimicrobial stewardship and diagnostic stewardship share common goals and are synergistic when used together. Whereas antimicrobial stewardship requires a relationship with clinicians and focuses on person-to-person communication, diagnostic stewardship centers on a relationship with the laboratory, hardwiring testing changes into laboratory processes and the electronic health record. Here, we discuss how diagnostic stewardship can optimize the “Four Moments of Antibiotic Decision Making” created by the Agency for Healthcare Research and Quality and work synergistically with ASPs.
While unobscured and radio-quiet active galactic nuclei are regularly being found at redshifts
$z > 6$
, their obscured and radio-loud counterparts remain elusive. We build upon our successful pilot study, presenting a new sample of low-frequency-selected candidate high-redshift radio galaxies (HzRGs) over a sky area 20 times larger. We have refined our selection technique, in which we select sources with curved radio spectra between 72 and 231 MHz from the GaLactic and Extragalactic All-sky Murchison Widefield Array (GLEAM) survey. In combination with the requirements that our GLEAM-selected HzRG candidates have compact radio morphologies and be undetected in near-infrared
$K_{\rm s}$
-band imaging from the Visible and Infrared Survey Telescope for Astronomy Kilo-degree Infrared Galaxy (VIKING) survey, we find 51 new candidate HzRGs over a sky area of approximately
$1200\ \mathrm{deg}^2$
. Our sample also includes two sources from the pilot study: the second-most distant radio galaxy currently known, at
$z=5.55$
, with another source potentially at
$z \sim 8$
. We present our refined selection technique and analyse the properties of the sample. We model the broadband radio spectra between 74 MHz and 9 GHz by supplementing the GLEAM data with both publicly available data and new observations from the Australia Telescope Compact Array at 5.5 and 9 GHz. In addition, deep
$K_{\rm s}$
-band imaging from the High-Acuity Widefield K-band Imager (HAWK-I) on the Very Large Telescope and from the Southern Herschel Astrophysical Terahertz Large Area Survey Regions
$K_{\rm s}$
-band Survey (SHARKS) is presented for five sources. We discuss the prospects of finding very distant radio galaxies in our sample, potentially within the epoch of reionisation at
$z \gtrsim 6.5$
.
Gene × environment (G×E) interactions, i.e. genetic modulation of sensitivity to environmental factors and/or environmental control of gene expression, have not been reliably established in the aetiology of psychotic disorders. Moreover, recent studies have shown associations between polygenic risk scores for schizophrenia (PRS-SZ) and some risk factors for psychotic disorders, challenging the traditional gene v. environment dichotomy. In the present article, we studied the role of G×E interactions between psychosocial stressors (childhood trauma, stressful life events, self-reported discrimination experiences and low social capital) and the PRS-SZ on subclinical psychosis in a population-based sample.
Methods
Data were drawn from the EUropean network of national schizophrenia networks studying Gene-Environment Interactions (EU-GEI) study, in which subjects without psychotic disorders were included in six countries. The sample was restricted to subjects of European descent (n = 706). Subclinical dimensions of psychosis (positive, negative, and depressive) were measured by the Community Assessment of Psychic Experiences (CAPE) scale. Associations between the PRS-SZ and the psychosocial stressors were tested. For each dimension, gene–environment interactions were assessed using linear models, comparing the explained variances of ‘Genetic’ models (fitted solely with the PRS-SZ), ‘Environmental’ models (fitted solely with each environmental stressor), ‘Independent’ models (with the PRS-SZ and each environmental factor), and ‘Interaction’ models (Independent models plus an interaction term between the PRS-SZ and each environmental factor). Likelihood ratio tests (LRTs) compared the fit of the different models.
Results
There were no gene–environment associations. The PRS-SZ was associated with the positive dimension (β = 0.092, R2 = 7.50%), and most psychosocial stressors were associated with all three subclinical psychotic dimensions (except social capital with the positive dimension). For the positive dimension, Independent models fitted better than Environmental and Genetic models. No significant G×E interaction was observed for any dimension.
Conclusions
This study in subjects without psychotic disorders suggests that (i) the aetiological continuum hypothesis may apply particularly to the positive dimension of subclinical psychosis, (ii) genetic and environmental factors have independent effects on the level of this positive dimension, and (iii) interactions between genetic and individual environmental factors could not be identified in this sample.
This SHEA white paper identifies knowledge gaps and challenges in healthcare epidemiology research related to coronavirus disease 2019 (COVID-19) with a focus on core principles of healthcare epidemiology. These gaps, revealed during the worst phases of the COVID-19 pandemic, are described in 10 sections: epidemiology, outbreak investigation, surveillance, isolation precaution practices, personal protective equipment (PPE), environmental contamination and disinfection, drug and supply shortages, antimicrobial stewardship, healthcare personnel (HCP) occupational safety, and return-to-work policies. Each section highlights three critical healthcare epidemiology research questions, with detailed descriptions provided in the supplementary materials. This research agenda calls for translational studies ranging from laboratory-based basic science research to well-designed, large-scale studies and health outcomes research. Research gaps and challenges related to nursing homes and social disparities are included. Collaborations across disciplines, areas of expertise, and diverse geographic locations will be critical.
Acute cannabis administration can produce transient psychotic-like effects in healthy individuals. However, the mechanisms through which this occurs and the factors that predict vulnerability remain unclear. We investigate whether cannabis inhalation leads to psychotic-like symptoms and speech illusion, and whether cannabidiol (CBD) blunts such effects (study 1) or adolescence heightens them (study 2).
Methods
Two double-blind, placebo-controlled studies assessed speech illusion in a white-noise task and psychotic-like symptoms on the Psychotomimetic States Inventory (PSI). Study 1 compared the effects of Cann-CBD (cannabis containing Δ-9-tetrahydrocannabinol (THC) and negligible levels of CBD) with Cann+CBD (cannabis containing THC and CBD) in 17 adults. Study 2 compared the effects of Cann-CBD in 20 adolescents and 20 adults. All participants were healthy individuals who currently used cannabis.
Results
In study 1, relative to placebo, both Cann-CBD and Cann+CBD increased PSI scores but not speech illusion. No differences between Cann-CBD and Cann+CBD emerged. In study 2, relative to placebo, Cann-CBD increased PSI scores and the incidence of speech illusion, with the odds of experiencing speech illusion 3.1 (95% CI 1.3–7.2) times higher after Cann-CBD. No age-group differences were found for speech illusion, but adults showed heightened effects on the PSI.
Conclusions
Inhalation of cannabis reliably increases psychotic-like symptoms in healthy cannabis users and may increase the incidence of speech illusion. CBD did not influence psychotic-like effects of cannabis. Adolescents may be less vulnerable to acute psychotic-like effects of cannabis than adults.
The Murchison Widefield Array (MWA) is an open-access telescope dedicated to studying the low-frequency (80–300 MHz) southern sky. Since beginning operations in mid-2013, the MWA has opened a new observational window in the southern hemisphere, enabling many science areas. The driving science objectives of the original design were to observe 21 cm radiation from the Epoch of Reionisation (EoR), explore the radio time domain, perform Galactic and extragalactic surveys, and monitor solar, heliospheric, and ionospheric phenomena. Altogether,
$60+$
observing programs have recorded 20 000 h of data, producing 146 papers to date. In 2016, the telescope underwent a major upgrade resulting in alternating compact and extended configurations. Other upgrades, including digital back-ends and a rapid-response triggering system, have been developed since the original array was commissioned. In this paper, we review the major results from the prior operation of the MWA and then discuss the new science paths enabled by the improved capabilities. We group these science opportunities by the four original science themes but also include ideas for directions outside these categories.
We used a survey to characterize contemporary infection prevention and antibiotic stewardship program practices across 64 healthcare facilities, and we compared these findings to those of a similar 2013 survey. Notable findings include decreased frequency of active surveillance for methicillin-resistant Staphylococcus aureus, frequent active surveillance for carbapenem-resistant Enterobacteriaceae, and increased support for antibiotic stewardship programs.
Objective:
To ascertain opinions regarding the etiology and preventability of hospital-onset bacteremia and fungemia (HOB) and perspectives on HOB as a potential outcome measure reflecting quality of infection prevention and hospital care.
Design:
Cross-sectional survey.
Participants:
Hospital epidemiologists and infection preventionist members of the Society for Healthcare Epidemiology of America (SHEA) Research Network.
Methods:
A web-based, multiple-choice survey was administered via the SHEA Research Network to 133 hospitals.
Results:
A total of 89 surveys were completed (67% response rate). Overall, 60% of respondents defined HOB as a positive blood culture on or after hospital day 3. Central-line–associated bloodstream infections (CLABSIs) and intra-abdominal infections were perceived as the most frequent etiologies. Moreover, 61% thought that most HOB events are preventable, and 54% viewed HOB as a measure reflecting a hospital’s quality of care. Also, 29% of respondents’ hospitals already collect HOB data for internal purposes. Given a choice to publicly report CLABSIs and/or HOB, 57% favored reporting HOB either alone (22%) or in addition to CLABSI (35%), and 34% favored CLABSI alone.
Conclusions:
Among the majority of SHEA Research Network respondents, HOB is perceived as preventable, reflective of quality of care, and potentially acceptable as a publicly reported quality metric. Further studies on HOB are needed, including validation as a quality measure, assessment of risk adjustment, and formation of evidence-based bundles and toolkits to facilitate measurement and improvement of HOB rates.
The thermal decomposition of caledonite has been examined by simultaneous differential thermal analysis, thermogravimetry and mass spectrometry. Structural H2O and CO2 are liberated endothermically between 300 and 400 °C, leaving a residue of lead sulphate, oxysulphate, and Cu(I) and Cu(II) oxides. A series of sharp endothermic peaks between 850 and 950 °C correspond to phase transition and melting reactions of the PbO–PbSO4 mixture. The sulphate anion breaks down above 880 °C. Mass spectra of the gaseous decomposition products show SO2, SO, and O2, although SO is an artefact arising from ion fragmentation of the SO2 within the mass spectrometer. The residue at 1060 °C is composed predominantly of 2PbO · PbSO4 and Cu(I) and Cu(II) oxides.
Firn temperatures at the Dome Summit South drill site, East Antarctica, are simulated by driving a thermal model of the ice sheet with observed instrumental records over the period 1960–96. The model incorporates firn density and thermal properties to reproduce measured borehole temperatures as shallow as 5 m below the surface, where the seasonal temperature wave is readily apparent. The study shows that ice-sheet temperatures are approximately 0.8°C cooler than mean 4 m air temperatures. It also finds that non-conductive processes such as ventilation and radiation can be simulated at this site by assuming perfect thermal contact between the top ∼1 m of firn and the atmosphere on monthly time-scales.
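The core of such a thermal model is vertical heat conduction forced by a surface temperature series. The sketch below is a minimal explicit finite-difference version of that idea under assumed, hypothetical parameters (uniform diffusivity, fixed grid spacing, zero-flux base); the calibrated model in the study additionally accounts for depth-varying firn density and thermal properties.

```python
import numpy as np

def step_temperature(T, dz, dt, alpha, surface_T):
    """Advance a 1-D firn temperature profile (degC) one explicit
    conduction time step; alpha is thermal diffusivity (m^2/s)."""
    T_new = T.copy()
    T_new[0] = surface_T  # perfect thermal contact with the atmosphere
    # interior nodes: dT/dt = alpha * d^2T/dz^2 (central differences)
    T_new[1:-1] = T[1:-1] + alpha * dt / dz**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    T_new[-1] = T_new[-2]  # zero-flux lower boundary
    return T_new

# Usage (hypothetical numbers): relax a uniform -22 degC column toward
# a -20 degC surface on an hourly time step over a 25 m grid.
T = np.full(50, -22.0)
for _ in range(1000):
    T = step_temperature(T, dz=0.5, dt=3600.0, alpha=1e-6, surface_T=-20.0)
```

With these values the stability number alpha·dt/dz² ≈ 0.014 stays well below the explicit-scheme limit of 0.5, and the warming signal penetrates only the top few metres over the simulated interval, mirroring how a surface forcing diffuses downward into firn.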
We present techniques developed to calibrate and correct Murchison Widefield Array low-frequency (72–300 MHz) radio observations for polarimetry. The extremely wide field-of-view, excellent instantaneous (u, v)-coverage and sensitivity to degree-scale structure that the Murchison Widefield Array provides enable instrumental calibration, removal of instrumental artefacts, and correction for ionospheric Faraday rotation through imaging techniques. With the demonstrated polarimetric capabilities of the Murchison Widefield Array, we discuss future directions for polarimetric science at low frequencies to answer outstanding questions relating to polarised source counts, source depolarisation, pulsar science, low-mass stars, exoplanets, the nature of the interstellar and intergalactic media, and the solar environment.
The current generation of experiments aiming to detect the neutral hydrogen signal from the Epoch of Reionisation (EoR) is likely to be limited by systematic effects associated with removing foreground sources from target fields. In this paper, we develop a model for the compact foreground sources in one of the target fields of the MWA’s EoR key science experiment: the ‘EoR1’ field. The model is based on both the MWA’s GLEAM survey and GMRT 150 MHz data from the TGSS survey, the latter providing higher angular resolution and better astrometric accuracy for compact sources than is available from the MWA alone. The model contains 5 049 sources, some of which have complicated morphology in MWA data, Fornax A being the most complex. The higher resolution data show that 13% of sources that appear point-like to the MWA have complicated morphology such as double and quad structure, with a typical separation of 33 arcsec. We derive an analytic expression for the error introduced into the EoR two-dimensional power spectrum by peeling close double sources as single point sources and show that, for the measured source properties, the error in the power spectrum is confined to high k⊥ modes that do not affect the overall result for the large-scale cosmological signal of interest. The brightest 10 mis-modelled sources in the field contribute 90% of the power bias in the data, suggesting that it is most critical to improve the models of the brightest sources. With this hybrid model, we reprocess data from the EoR1 field and show up to an 8% improvement in calibration accuracy and a factor of two reduction in residual power in k-space from peeling these sources. Implications for future EoR experiments, including the SKA, are discussed in relation to the improvements obtained.
We examined longitudinally the course and predictors of treatment resistance in a large cohort of first-episode psychosis (FEP) patients from initiation of antipsychotic treatment. We hypothesized that antipsychotic treatment resistance is: (a) present at illness onset; and (b) differentially associated with clinical and demographic factors.
Method
The study sample comprised 323 FEP patients who were studied at first contact and at 10-year follow-up. We collated clinical information on severity of symptoms, antipsychotic medication and treatment adherence during the follow-up period to determine the presence, course and predictors of treatment resistance.
Results
Of the 23% of patients who were treatment resistant, 84% were treatment resistant from illness onset. Multivariable regression analysis revealed that a diagnosis of schizophrenia, negative symptoms, younger age at onset, and longer duration of untreated psychosis predicted treatment resistance from illness onset.
Conclusions
The striking majority of treatment-resistant patients do not respond to first-line antipsychotic treatment even at time of FEP. Clinicians must be alert to this subgroup of patients and consider clozapine treatment as early as possible during the first presentation of psychosis.
Around 70% of total seed phosphorus is represented by phytate, which must be hydrolysed to be bioavailable in non-ruminant diets. The limited endogenous phytase activity of non-ruminant animals makes it common practice to add an exogenous phytase source to most poultry and pig feeds. The mature grain phytase activity (MGPA) of cereal seeds provides a route for the seeds themselves to contribute to phytate digestion, but MGPA varies considerably between species, and most varieties in current use make negligible contributions. Currently, all phytases used for feed supplementation and transgenic improvement of MGPA are derived from microbial enzymes belonging to the group of histidine acid phosphatases (HAPs). Cereals contain HAP phytases, but the bulk of MGPA can be attributed to phytases belonging to a completely different group of phosphatases, the purple acid phosphatases (PAPhy). In recent years, increased MGPAs were achieved in cisgenic barley carrying extra copies of barley PAPhy and in the wheat HIGHPHY mutant, where MGPA was increased to ~6200 FTU/kg. In the present study, the effect of replacing 33%, 66% and 100% of a standard wheat with HIGHPHY wheat was compared with a control diet with and without 500 FTU of supplemental phytase. Diets were compared by evaluating broiler performance, ileal Ca and P digestibility and tibia development, using nine replicate pens of four birds per diet over 3 weeks from hatch. There were no differences between treatments in any tibia or bird performance parameters, indicating the control diet did not contain sufficiently low levels of phosphorus to distinguish the effect of phytase addition. However, in a comparison of the two wheats, the ileal Ca and P digestibility coefficients for the 100% HIGHPHY wheat diets were 22.9% and 35.6% higher, respectively, than for the control diet, indicating the wheat PAPhy is functional in the broiler digestive tract.
Furthermore, 33% HIGHPHY replacement of conventional wheat significantly improved Ca and P digestibility over the diet supplemented with exogenous phytase, probably owing to the higher phytase activity in the HIGHPHY diet (1804 v. 1150 FTU). Full replacement with HIGHPHY gave 14.6% and 22.8% higher ileal digestibility coefficients for Ca and P, respectively, than feed supplemented with exogenous HAP phytase at 500 FTU. This indicates that in planta wheat PAPhy has promising potential for improving P and mineral digestibility in animal feed.