Enterotoxigenic Escherichia coli (ETEC) is a well-established cause of traveller's diarrhoea and occasional domestic foodborne illness outbreaks in the USA. Although ETEC are not detected by conventional stool culture methods used in clinical laboratories, syndromic culture-independent diagnostic tests (CIDTs) capable of detecting ETEC have become increasingly prevalent in the last decade. This study describes the epidemiology of ETEC infections reported to the Minnesota Department of Health (MDH) during 2016–2017. ETEC-positive stool specimens were submitted to MDH to confirm the presence of ETEC DNA by polymerase chain reaction (PCR). Cases were interviewed to ascertain illness and exposures. Contemporaneous Salmonella cases were used as a comparison group in a case-case comparison analysis of risk factors. Of 222 ETEC-positive specimens received by MDH, 108 (49%) were concordant by PCR. ETEC was the sixth most frequently reported bacterial enteric pathogen among a subset of CIDT-positive specimens. Sixty-nine (64%) laboratory-confirmed cases had an additional pathogen codetected with ETEC, including enteroaggregative E. coli (n = 40) and enteropathogenic E. coli (n = 39). Although travel is a risk factor for ETEC infection, only 43% of cases travelled internationally, providing evidence for ETEC as an underestimated source of domestically acquired enteric illness in the USA.
In cognitive models of adult psychosis, schematic beliefs about the self and others are important vulnerability and maintaining factors, and are therefore targets for psychological interventions. Schematic beliefs have not previously been investigated in children with distressing unusual, or psychotic-like, experiences (UEDs). The aim of this study was firstly to investigate whether a measure of schematic beliefs, originally designed for adults with psychosis, was suitable for children; and secondly, to examine the association of childhood schematic beliefs with internalising and externalising problems and with UEDs.
Method
Sixty-seven children aged 8–14 years, with emotional and behavioural difficulties, completed measures of UEDs, internalising (depression and anxiety), and externalising (conduct and hyperactivity-inattention) problems, together with the Brief Core Schema Scales (BCSS).
Results
The BCSS was readily completed by participants, and scale psychometric properties were good. Children tended to view themselves and others positively. Internalising and externalising problems and UEDs were all associated with negative schematic beliefs; effect sizes were small to medium.
Conclusions
Schematic beliefs in young people can be measured using the BCSS, and negative schematic beliefs are associated with childhood psychopathology and with UEDs. Schematic beliefs may therefore form a useful target in psychological interventions for young people with UEDs.
Belonging to a social group is one of the most important factors contributing to well-being. The Belonging Regulation model proposes that humans possess a social monitoring system (SMS) that evaluates social inclusion and monitors belonging needs. Here, we used a prospective longitudinal design to examine links between peer victimization experienced across 7 years and social monitoring at the behavioral and neural level in adolescent girls (n = 38, Mage = 15.43 years, SD = .33). Participants completed a social evaluation task during a functional magnetic resonance imaging (fMRI) scan. More severe peer victimization was associated with increased activation to in-group versus out-group peers in the amygdala, ventral striatum, fusiform gyrus, and temporoparietal junction. Moreover, participants who displayed increased activation in these regions reported lower social self-esteem and higher levels of internalizing and externalizing symptoms. These results suggest that exposure to peer victimization across the school years is associated with heightened social monitoring at the neural level during adolescence, which has potential adverse implications for girls’ adjustment and well-being.
We present the first general theory of glacier surging that includes both temperate and polythermal glacier surges, based on coupled mass and enthalpy budgets. Enthalpy (in the form of thermal energy and water) is gained at the glacier bed from geothermal heating plus frictional heating (expenditure of potential energy) as a consequence of ice flow. Enthalpy losses occur by conduction and loss of meltwater from the system. Because enthalpy directly impacts flow speeds, mass and enthalpy budgets must simultaneously balance if a glacier is to maintain a steady flow. If not, glaciers undergo out-of-phase mass and enthalpy cycles, manifest as quiescent and surge phases. We illustrate the theory using a lumped element model, which parameterizes key thermodynamic and hydrological processes, including surface-to-bed drainage and distributed and channelized drainage systems. Model output exhibits many of the observed characteristics of polythermal and temperate glacier surges, including the association of surging behaviour with particular combinations of climate (precipitation, temperature), geometry (length, slope) and bed properties (hydraulic conductivity). Enthalpy balance theory explains a broad spectrum of observed surging behaviour in a single framework, and offers an answer to the wider question of why the majority of glaciers do not surge.
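The coupled budgets described above can be written schematically; the following basal enthalpy balance uses our own illustrative notation, not the paper's formulation:

```latex
\underbrace{\frac{dE}{dt}}_{\text{basal enthalpy change}}
  = \underbrace{G}_{\text{geothermal heating}}
  + \underbrace{\tau_b u_b}_{\text{frictional heating}}
  - \underbrace{Q_c}_{\text{conductive loss}}
  - \underbrace{\rho_w L\, q_w}_{\text{meltwater loss}}
```

Here $E$ is basal enthalpy per unit area, $\tau_b u_b$ is basal drag times sliding speed, and $q_w$ is the meltwater discharge. Steady flow requires this budget and the mass budget to balance simultaneously; a persistently positive right-hand side drives accelerating sliding until drainage catches up, which is the out-of-phase cycling manifest as quiescent and surge phases.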
Calling in staff and preparing the operating room for an urgent surgical procedure is a significant draw on hospital resources and disrupts the care of other patients. It has been common practice to treat open fractures on an urgent basis. Health technology assessment (HTA) methods can be applied to examine this prioritization of care, just as they can be applied to the acquisition of drugs and devices.
Methods:
Our center completed a rapid systematic review of guidelines, systematic reviews, and primary clinical evidence, on urgent surgical debridement and stabilization of open fractures of long bones (“urgent” being defined as within six hours of the injury) compared to surgical debridement and reduction performed at a later time point. Meta-analyses were performed for infection and non-union outcomes and the GRADE system was used to assess the strength of evidence for each conclusion.
Results:
We found no published clinical guidelines for the urgency of treating open fractures. A good systematic review on the topic was published in 2012. We found six cohort studies published since completion of the earlier review. The summary odds ratio for any infection in patients with later treatment was 0.97 (95% confidence interval (CI) 0.78–1.22, sixteen studies, 3,615 patients) and for deep or “major” infections was 1.00 (95% CI 0.74–1.34, nine studies, 2,013 patients). The summary odds ratio of non-union with later treatment was 0.95 (95% CI 0.65–1.41, six studies, 1,308 patients). There was no significant heterogeneity in any of the results (I-squared = 0 percent) and no apparent trends in the results as a function of study size or publication date. We graded the strength of each of the conclusions as very low because they were based on cohort studies where the treating physician could elect immediate treatment for patients with severe soft-tissue injuries or patients at risk of complications. This raises the risk of spectrum bias.
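The summary odds ratios above come from inverse-variance meta-analysis; a minimal fixed-effect sketch of that calculation (using hypothetical 2×2 tables, not the review's data) looks like this:

```python
import math

def pooled_or(studies):
    """Fixed-effect (inverse-variance) pooled odds ratio.

    studies: list of 2x2 tables (a, b, c, d) =
    (events, non-events) for later treatment, then earlier treatment.
    Returns (pooled OR, 95% CI lower, 95% CI upper).
    """
    num = den = 0.0
    for a, b, c, d in studies:
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d   # variance of the log OR
        w = 1.0 / var                         # inverse-variance weight
        num += w * log_or
        den += w
    mean = num / den
    se = math.sqrt(1.0 / den)
    return (math.exp(mean),
            math.exp(mean - 1.96 * se),
            math.exp(mean + 1.96 * se))

# Hypothetical study tables, for illustration only:
demo = [(12, 88, 11, 89), (8, 92, 9, 91)]
or_, lo, hi = pooled_or(demo)
```

A pooled OR near 1 with a CI spanning 1, as in the review's infection and non-union results, indicates no detectable effect of later treatment.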
Conclusions:
Default urgent scheduling of patients with open fractures for surgical debridement and stabilization does not appear to reduce the risk of infection or fracture non-union. Based on this information, our surgery department managers no longer schedule patients with open fractures for immediate surgery unless there are specific circumstances necessitating it.
Alteplase is an effective treatment for ischaemic stroke patients, and it is widely available at all primary stroke centres. The effectiveness of alteplase is highly time-dependent. Large tertiary centres have reported significant improvements in their door-to-needle (DTN) times. However, these same improvements have not been reported at community hospitals.
Methods
Red Deer Regional Hospital Centre (RDRHC) is a 370-bed community hospital that serves approximately 150,000 people in its acute stroke catchment area. The RDRHC participated in a provincial DTN improvement initiative and implemented a streamlined algorithm for the treatment of stroke patients. During the intervention period, the following changes were implemented: early alert of an incoming acute stroke patient to the neurologist and care team, meeting the patient immediately upon arrival, parallel work processes, keeping the patient on the Emergency Medical Services stretcher for transport to the CT scanner, and administering alteplase in the imaging area. Door-to-needle data were collected from July 2007 to December 2017.
Results
A total of 289 patients were treated from July 2007 to December 2017. In the pre-intervention period, 165 patients received alteplase and the median DTN time was 77 minutes [interquartile range (IQR): 60–103 minutes]; in the post-intervention period, 104 patients received alteplase and the median DTN time was 30 minutes (IQR: 22–42 minutes) (p < 0.001). The annual number of patients that received alteplase increased from 9 to 29 in the pre-intervention period to annual numbers of 41 to 63 patients in the post-intervention period.
Conclusion
Community hospitals staffed with community neurologists can achieve median DTN times of 30 minutes or less.
An orbicular diorite from Fisher Lake, California, USA, contains multi-shelled, magmatic orbicules with branching and budding orthopyroxene crystals as well as feather and acicular plagioclase crystals that are oriented perpendicular to the growth horizon. Plagioclase and orthopyroxene show gradual, reverse compositional zoning along the long axes and normal zoning along the short axes. The reverse zoning varies from An87 to An93 and Mg68 to Mg74 over distances of 4 mm and 8 mm respectively. The close proximity of these two minerals makes it likely that only one mechanism is responsible for the reverse zoning. This zoning can be explained by using relevant temperature-composition diagrams and Gibbs free energy-composition plots. Under sudden and moderate undercoolings, which produce high growth but low nucleation rates, the difference in Gibbs free energy (ΔG) between the crystals and liquid is not initially maximized, i.e. initial compositions are not near-to-equilibrium. This results in crystal compositions that are closer to that of the bulk liquid than expected for crystallization under near-to-equilibrium conditions (i.e. very small ΔT). Over time, and under isothermal crystallization conditions, ΔG gradually increases to a maximum producing crystal compositions that also gradually attain near-to-equilibrium compositions. Subsequent to attaining these conditions, normal zoning occurs perpendicular to the crystal growth axes.
Silicate minerals grown from glasses, and rapidly cooled melts, often have non-compact branching or ‘spherulitic’ morphology. The branching patterns are observed in volcanic rocks, glasses, meteorites, slags and sometimes in shallow level intrusive rocks. Experiments, observations, theory and simulations all support the concept that the crystal morphology is the result of growth under diffusion limited conditions. We show that in a silicate melt under appropriate conditions the equations for heat transfer and chemical-diffusion reduce to the Laplace equation. This means that the temperature or chemical gradient is a steady state field. Interaction between this field and a random variable (Brownian motion of growth species) is modelled and yields complex branching objects. The growing cluster affects the field such that an in-filled structure cannot be formed. The branching structures of the model crystal are remarkably similar to those formed in nature, and to those produced in laboratory experiments, implying that the model captures the essence of the branching-growth process.
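The field–walker interaction described here is essentially diffusion-limited aggregation (DLA); a minimal lattice sketch (our own illustration, not the authors' simulation) shows how sticking random walkers yield a branching, non-compact cluster:

```python
import random

def dla_cluster(n_particles=40, size=41, seed=1):
    """Minimal 2-D diffusion-limited aggregation (DLA) sketch.

    Random walkers (the Brownian 'growth species') are released at the
    domain boundary and freeze on first contact with the cluster, so
    growth is limited by the diffusion field -- the mechanism behind the
    non-compact branching morphology. Illustrative only.
    """
    random.seed(seed)
    c = size // 2
    cluster = {(c, c)}                      # seed nucleus at the centre
    steps = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    for _ in range(n_particles):
        # launch each walker from a random point on the domain boundary
        x, y = random.choice([
            (0, random.randrange(size)), (size - 1, random.randrange(size)),
            (random.randrange(size), 0), (random.randrange(size), size - 1)])
        while True:
            if any((x + dx, y + dy) in cluster for dx, dy in steps):
                cluster.add((x, y))         # stick on first contact
                break
            dx, dy = random.choice(steps)   # Brownian step, reflecting walls
            x = min(max(x + dx, 0), size - 1)
            y = min(max(y + dy, 0), size - 1)
    return cluster
```

Because an arriving walker is far more likely to contact a protruding tip than to diffuse into a re-entrant gap, the growing cluster screens its own interior and an in-filled structure cannot form.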
Estimates of the incubation period for Q fever vary substantially between different reviews and expert advice documents. We systematically reviewed and quality appraised the literature to provide an evidence-based estimate of the incubation period of Q fever by the aerosolised infection route. Medline (OVIDSP) and EMBASE were searched with the search limited to human studies and English language. Eligible studies included persons with symptomatic, acute Q fever, and defined exposure to Coxiella burnetii. After review of 7115 titles and abstracts, 320 records were screened at full-text level. Of these, 23 studies contained potentially useful data and were quality assessed, with eight studies (with 403 individual cases where the derivation of incubation period was possible) being of sufficient quality and providing individual-level data to produce a pooled summary. We found a median incubation period of 18 days, with 95% of cases expected to occur between 7 and 32 days after exposure.
To determine whether antimicrobial-impregnated textiles decrease the acquisition of pathogens by healthcare provider (HCP) clothing.
DESIGN
We completed a 3-arm randomized controlled trial to test the efficacy of 2 types of antimicrobial-impregnated clothing compared to standard HCP clothing. Cultures were obtained from each nurse participant, the healthcare environment, and patients during each shift. The primary outcome was the change in total contamination on nurse scrubs, measured as the sum of colony-forming units (CFU) of bacteria.
PARTICIPANTS AND SETTING
Nurses working in medical and surgical ICUs in a 936-bed tertiary-care hospital.
INTERVENTION
Nurse subjects wore standard cotton-polyester surgical scrubs (control), scrubs that contained a complex element compound with a silver-alloy embedded in its fibers (Scrub 1), or scrubs impregnated with an organosilane-based quaternary ammonium and a hydrophobic fluoroacrylate copolymer emulsion (Scrub 2). Nurse participants were blinded to scrub type and randomly participated in all 3 arms during 3 consecutive 12-hour shifts in the intensive care unit.
RESULTS
In total, 40 nurses were enrolled and completed 3 shifts. Analyses of 2,919 cultures from the environment and 2,185 from HCP clothing showed that scrub type was not associated with a change in HCP clothing contamination (P=.70). Mean difference estimates were 0.118 for the Scrub 1 arm (95% confidence interval [CI], −0.206 to 0.441; P=.48) and 0.009 for the Scrub 2 arm (95% CI, −0.323 to 0.342; P=.96) compared to the control. HCP became newly contaminated with important pathogens during 19 of the 120 shifts (16%).
CONCLUSIONS
Antimicrobial-impregnated scrubs were not effective at reducing HCP contamination. However, the environment is an important source of HCP clothing contamination.
In this paper we undertake a quantitative analysis of the dynamic process by which ice underneath a dry porous debris layer melts. We show that the incorporation of debris-layer airflow into a theoretical model of glacial melting can capture the empirically observed features of the so-called Østrem curve (a plot of the melt rate as a function of debris depth). Specifically, we show that the turning point in the Østrem curve can be caused by two distinct mechanisms: the increase in the proportion of ice that is debris-covered and/or a reduction in the evaporative heat flux as the debris layer thickens. This second effect causes an increased melt rate because the reduction in (latent) energy used for evaporation increases the amount of energy available for melting. Our model provides an explicit prediction for the melt rate and the temperature distribution within the debris layer, and provides insight into the relative importance of the two effects responsible for the maximum in the Østrem curve. We use the data of Nicholson and Benn (2006) to show that our model is consistent with existing empirical measurements.
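A toy energy-balance sketch (our own illustration with assumed parameter values, not the paper's model) reproduces the turning point in the Østrem curve: evaporative losses shut off as the debris thickens, while conduction attenuates melt under thick debris:

```python
import math

def melt_rate(h, Q=250.0, E0=150.0, h_E=0.03, h_0=0.05, rho_L=3.34e8):
    """Toy Ostrem-curve sketch (illustrative parameter values only).

    h    : debris thickness (m)
    Q    : energy flux available at the debris surface (W m^-2)
    E0   : evaporative (latent) flux for vanishing debris (W m^-2),
           assumed to decay with thickness on the scale h_E (m)
    h_0  : thickness scale for conductive attenuation (m)
    rho_L: water density times latent heat of fusion (J m^-3)

    Returns an ice melt rate in m s^-1.
    """
    available = Q - E0 * math.exp(-h / h_E)  # evaporation shuts off as h grows
    attenuation = h_0 / (h + h_0)            # conduction through thicker debris
    return max(available, 0.0) * attenuation / rho_L

# Competition between the two effects yields a maximum at intermediate h:
rates = [melt_rate(h / 1000.0) for h in range(0, 301, 10)]  # 0 to 0.3 m
peak = rates.index(max(rates))
```

The interior maximum in `rates` is the hump of the Østrem curve: thin debris suppresses evaporation (freeing energy for melt) faster than it insulates, while thick debris is dominated by conductive attenuation.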
Research was conducted from 2011 to 2014 to determine weed population dynamics and frequency of glyphosate-resistant (GR) Palmer amaranth with herbicide programs consisting of glyphosate, dicamba, and residual herbicides in dicamba-tolerant cotton. Five treatments were maintained in the same plots over the duration of the experiment: three sequential POST applications of glyphosate with or without pendimethalin plus diuron PRE; three sequential POST applications of glyphosate plus dicamba with and without the PRE herbicides; and a POST application of glyphosate plus dicamba plus acetochlor followed by one or two POST applications of glyphosate plus dicamba without PRE herbicides. Additional treatments included alternating years with three sequential POST applications of glyphosate only and glyphosate plus dicamba POST with and without PRE herbicides. The greatest population of Palmer amaranth was observed when glyphosate was the only POST herbicide throughout the experiment. Although diuron plus pendimethalin PRE in a program with only glyphosate POST improved control during the first 2 yr, these herbicides were ineffective by the final 2 yr on the basis of weed counts from soil cores. The lowest population of Palmer amaranth was observed when glyphosate plus dicamba were applied regardless of PRE herbicides or inclusion of acetochlor POST. Frequency of GR Palmer amaranth was 8% or less when the experiment was initiated. Frequency of GR Palmer amaranth varied by herbicide program during 2012 but was similar among all herbicide programs in 2013 and 2014. Similar frequency of GR Palmer amaranth across all treatments at the end of the experiment most likely resulted from pollen movement from Palmer amaranth treated with glyphosate only to any surviving female plants regardless of PRE or POST treatment. These data suggest that GR Palmer amaranth can be controlled by dicamba and that dicamba is an effective alternative mode of action to glyphosate in fields where GR Palmer amaranth exists.
To what extent do political campaigns mobilize voters? Despite the central role of campaigns in American politics and despite many experiments on campaigning, we know little about the aggregate effects of an entire campaign on voter participation. Drawing upon inside information from presidential campaigns and utilizing a geographic research design that exploits media markets spanning state boundaries, we estimate the aggregate effects of a large-scale campaign. We estimate that the 2012 presidential campaigns increased turnout in highly targeted states by 7–8 percentage points, on average, indicating that modern campaigns can significantly alter the size and composition of the voting population. Further evidence suggests that the predominant mechanism behind this effect is traditional ground campaigning, which has dramatically increased in scale in the last few presidential elections. Additionally, we find no evidence of diminishing marginal returns to ground campaigning, meaning that voter contacts, each likely exhibiting small individual effects, may aggregate to large effects over the course of a campaign.
In November 2013, national public health agencies in England and Scotland identified an increase in laboratory-confirmed Salmonella Mikawasima. The role of proton pump inhibitors (PPIs) as a risk factor for salmonellosis is unclear; we therefore captured information on PPI usage as part of our outbreak investigation. We conducted a case-control study, comparing each case with two controls. Adjusted odds ratios (aORs) and 95% confidence intervals (CIs) were estimated using multivariable logistic regression. Thirty-nine of 61 eligible cases were included in the study. The median age of cases was 45 years; 56% were female. Of these, 33% were admitted to hospital and 31% reported taking PPIs. We identified an association between PPIs and non-typhoidal salmonellosis (aOR 8·8, 95% CI 2·0–38·3). There is increasing evidence supporting the existence of an association between salmonellosis and PPIs; however, biological studies are needed to understand the effect of PPIs in the pathogenesis of Salmonella. We recommend future outbreak studies investigate PPI usage to strengthen evidence on the relevance of PPIs in Salmonella infection. These findings should be used to support the development of guidelines for patients and prescribers on the risk of gastrointestinal infection and PPI usage.
Paranoia is one of the commonest symptoms of psychosis but has rarely been studied in a population at risk of developing psychosis. Based on existing theoretical models, including the proposed distinction between ‘poor me’ and ‘bad me’ paranoia, we aimed to test specific predictions about associations between negative cognition, metacognitive beliefs and negative emotions and paranoid ideation and the belief that persecution is deserved (deservedness).
Method
We used data from 117 participants from the Early Detection and Intervention Evaluation for people at risk of psychosis (EDIE-2) trial of cognitive–behaviour therapy, comparing them with samples of psychiatric in-patients and healthy students from a previous study. Multi-level modelling was utilized to examine predictors of both paranoia and deservedness, with post-hoc planned comparisons conducted to test whether person-level predictor variables were associated differentially with paranoia or with deservedness.
Results
Our sample of at-risk mental state participants was not as paranoid, but reported higher levels of ‘bad-me’ deservedness, compared with psychiatric in-patients. We found several predictors of paranoia and deservedness. Negative beliefs about self were related to deservedness but not paranoia, whereas negative beliefs about others were positively related to paranoia but negatively with deservedness. Both depression and negative metacognitive beliefs about paranoid thinking were specifically related to paranoia but not deservedness.
Conclusions
This study provides evidence for the role of negative cognition, metacognition and negative affect in the development of paranoid beliefs, which has implications for psychological interventions and our understanding of psychosis.
Large numbers of evacuees arrived in Dallas, Texas, from Hurricanes Katrina and Rita just 3 weeks apart in 2005 and from Hurricanes Gustav and Ike just 3 weeks apart again in 2008. The Dallas community needed to locate, organize, and manage the response to provide shelter and health care with locally available resources. With each successive hurricane, disaster response leaders applied many lessons learned from prior operations to become more efficient and effective in the provision of services. Mental health services proved to be an essential component. From these experiences, a set of operating guidelines for large evacuee shelter mental health services in Dallas was developed, with involvement of key stakeholders. A generic description of the processes and procedures used in Dallas that highlights the important concepts, key considerations, and organizational steps was then created for potential adaptation by other communities. (Disaster Med Public Health Preparedness. 2015;9:423–429)
(See the commentary by Pfeiffer and Beldavs, on pages 984–986.)
Objective
Describe the epidemiology of carbapenem-resistant Enterobacteriaceae (CRE) and examine the effect of lower carbapenem breakpoints on CRE detection.
Design
Retrospective cohort.
Setting
Inpatient care at community hospitals.
Patients
All patients with CRE-positive cultures were included.
Methods
CRE isolated from 25 community hospitals were prospectively entered into a centralized database from January 2008 through December 2012. Microbiology laboratory practices were assessed using questionnaires.
Results
A total of 305 CRE isolates were detected at 16 hospitals (64%). Patients with CRE had symptomatic infection in 180 cases (59%) and asymptomatic colonization in the remainder (125 cases; 41%). Klebsiella pneumoniae (277 isolates; 91%) was the most prevalent species. The majority of cases were healthcare associated (288 cases; 94%). The rate of CRE detection increased more than fivefold from 2008 (0.26 cases per 100,000 patient-days) to 2012 (1.4 cases per 100,000 patient-days; incidence rate ratio (IRR), 5.3 [95% confidence interval (CI), 1.22–22.7]; P = .01). Only 5 hospitals (20%) had adopted the 2010 Clinical and Laboratory Standards Institute (CLSI) carbapenem breakpoints. The 5 hospitals that adopted the lower carbapenem breakpoints were more likely to detect CRE after implementation of breakpoints than before (4.1 vs 0.5 cases per 100,000 patient-days; P < .001; IRR, 8.1 [95% CI, 2.7–24.6]). Hospitals that implemented the lower carbapenem breakpoints were more likely to detect CRE than were hospitals that did not (3.3 vs 1.1 cases per 100,000 patient-days; P = .01).
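The incidence rate ratios above follow from a standard Poisson rate comparison; a minimal sketch (Wald 95% CI, with hypothetical counts rather than the study's regression output) is:

```python
import math

def poisson_irr(cases1, pdays1, cases0, pdays0):
    """Incidence rate ratio (IRR) with a Wald 95% CI.

    Assumes the case counts are Poisson, so the standard error of
    log(IRR) is sqrt(1/cases1 + 1/cases0). Illustrative sketch only.
    """
    irr = (cases1 / pdays1) / (cases0 / pdays0)
    se = math.sqrt(1.0 / cases1 + 1.0 / cases0)
    lo = math.exp(math.log(irr) - 1.96 * se)
    hi = math.exp(math.log(irr) + 1.96 * se)
    return irr, lo, hi

# Hypothetical counts: 10 cases vs 2 cases over equal patient-days
irr, lo, hi = poisson_irr(10, 1_000_000, 2, 1_000_000)
```

With few cases in the comparison period, the CI around the IRR is wide, which is why the study's fivefold estimate carries an interval spanning roughly 1.2 to 23.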
Conclusions
The rate of CRE detection increased fivefold in community hospitals in the southeastern United States from 2008 to 2012. Despite this, our estimates are likely underestimates of the true rate of CRE detection, given the low adoption of the carbapenem breakpoints recommended in the 2010 CLSI guidelines.