Objective:
Develop and implement a system in the Veterans Health Administration (VA) to alert local medical center personnel in real time when an acute- or long-term care patient/resident is admitted to their facility with a history of colonization or infection with a multidrug-resistant organism (MDRO) previously identified at any VA facility across the nation.
Methods:
An algorithm was developed to extract clinical microbiology and local facility census data from the VA Corporate Data Warehouse, initially targeting carbapenem-resistant Enterobacterales (CRE) and methicillin-resistant Staphylococcus aureus (MRSA). The algorithm was validated by chart review of CRE cases from 2010-2018, trialed and refined in 24 VA healthcare systems over two years, expanded to other MDROs, and implemented nationwide in April 2022 as “VA Bug Alert” (VABA). Use through August 2023 was assessed.
Results:
VABA performed well for CRE with recall of 96.3%, precision of 99.8%, and F1 score of 98.0%. At the 24 trial sites, feedback was recorded for 1,011 admissions with a history of CRE (130), MRSA (814), or both (67). Among Infection Preventionists and MDRO Prevention Coordinators, 338 (33%) reported being previously unaware of the information, and of these, 271 (80%) reported they would not have otherwise known this information. By fourteen months after nationwide implementation, 113/130 (87%) VA healthcare systems had at least one VABA subscriber.
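For context, the recall, precision, and F1 values above follow the standard definitions based on true positives, false positives, and false negatives from the chart-review validation. A minimal Python sketch, using hypothetical counts chosen only to illustrate the arithmetic (not the actual validation data):

```python
# Illustrative only: hypothetical true-positive (tp), false-positive (fp), and
# false-negative (fn) counts; the actual VABA chart-review counts are not shown here.
def precision_recall_f1(tp: int, fp: int, fn: int) -> tuple[float, float, float]:
    """Return (precision, recall, F1) computed from confusion-matrix counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Made-up counts chosen only to roughly reproduce the reported metrics.
p, r, f1 = precision_recall_f1(tp=960, fp=2, fn=37)
print(f"precision={p:.1%}, recall={r:.1%}, F1={f1:.1%}")
```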
Conclusions:
A national system for alerting facilities in real time about patients admitted with an MDRO history was successfully developed and implemented in VA. Next steps include understanding facilitators and barriers to use and coordination with non-VA facilities nationwide.
There is emerging evidence of heterogeneity within treatment-resistant schizophrenia (TRS), with some people not responding to antipsychotic treatment from illness onset and a smaller group becoming treatment-resistant after an initial period of response. It has been suggested that these groups have different aetiologies. Few studies have investigated socio-demographic and clinical differences between early and late onset of TRS.
Objectives
This study aims to investigate socio-demographic and clinical correlates of late onset of TRS.
Methods
Using data from the electronic health records of the South London and Maudsley NHS Foundation Trust, we identified a cohort of people with TRS. Regression analyses were conducted to identify correlates of the length of treatment until the onset of TRS. Analysed predictors included gender, age, ethnicity, positive symptom severity, problems with activities of daily living, psychiatric comorbidities, involuntary hospitalisation, and treatment with long-acting injectable antipsychotics.
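As an illustration of the kind of analysis described (the abstract does not specify the exact model form), a regression of time to TRS on the listed predictors might be set up as follows; the data file, column names, and the choice of an ordinary least-squares model are assumptions:

```python
# Illustrative sketch only: the data file, column names, and choice of an
# ordinary least-squares model are assumptions, not the study's specification.
import pandas as pd
import statsmodels.formula.api as smf

cohort = pd.read_csv("trs_cohort.csv")  # hypothetical extract of the TRS cohort

model = smf.ols(
    "months_to_trs ~ gender + age_at_onset + ethnicity + positive_symptom_severity"
    " + adl_problems + psychiatric_comorbidity + involuntary_admission + lai_treatment",
    data=cohort,
).fit()
print(model.summary())  # coefficients indicate correlates of time to TRS
```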
Results
We observed a continuum of the length of treatment until TRS presentation. Having severe hallucinations and delusions at the start of treatment was associated with a shorter duration of treatment until the presentation of TRS.
Conclusions
Our findings do not support a clear-cut categorisation between early and late TRS based on the length of treatment until the onset of treatment resistance. More severe positive symptoms predict earlier onset of treatment resistance.
Disclosure
DFdF, GKS, EF and IR have received research funding from Janssen and H. Lundbeck A/S. RDH and HS have received research funding from Roche, Pfizer, Janssen and Lundbeck. SES is employed on a grant held by Cardiff University from Takeda Pharmaceutical Comp
Adults who had non-edematous severe acute malnutrition (SAM) during infancy (i.e., marasmus) have worse glucose tolerance and beta-cell function than survivors of edematous SAM (i.e., kwashiorkor). We hypothesized that wasting and/or stunting in SAM is associated with lower glucose disposal rate (M) and insulin clearance (MCR) in adulthood.
We recruited 40 nondiabetic adult SAM survivors (20 marasmus survivors (MS) and 20 kwashiorkor survivors (KS)) and 13 matched community controls. We performed 150-minute hyperinsulinaemic, euglycaemic clamps to estimate M and MCR. We also measured serum adiponectin, anthropometry, and body composition. Data on wasting (weight-for-height) and stunting (height-for-age) were abstracted from the hospital records.
Children with marasmus had lower weight-for-height z-scores (WHZ) (−3.8 ± 0.9 vs. −2.2 ± 1.4; P < 0.001) and lower height-for-age z-scores (HAZ) (−4.6 ± 1.1 vs. −3.4 ± 1.5; P = 0.0092) than those with kwashiorkor. As adults, participants had a mean (SD) age of 27.2 (8.1) years and a BMI of 23.6 (5.0) kg/m². SAM survivors and controls had similar body composition. MS, KS, and controls had similar M (9.1 ± 3.2, 8.7 ± 4.6, and 6.9 ± 2.5 mg·kg−1·min−1, respectively; P = 0.3) and MCR. WHZ and HAZ were not associated with M, MCR, or adiponectin, even after adjusting for body composition.
Wasting and stunting during infancy are not associated with insulin sensitivity and insulin clearance in lean, young, adult survivors of SAM. These data are consistent with the finding that glucose intolerance in malnutrition survivors is mostly due to beta-cell dysfunction.
OBJECTIVES/SPECIFIC AIMS: Delirium is a well-described form of acute brain organ dysfunction characterized by decreased or increased movement, changes in attention and concentration, as well as perceptual disturbances (i.e., hallucinations) and delusions. Catatonia, a neuropsychiatric syndrome traditionally described in patients with severe psychiatric illness, can present as phenotypically similar to delirium and is characterized by increased, decreased, and/or abnormal movements, staring, rigidity, and mutism. Delirium and catatonia can co-occur in the setting of medical illness, but no studies have explored this relationship by age. Our objective was to assess whether advancing age and the presence of catatonia are associated with delirium.
METHODS/STUDY POPULATION: We prospectively enrolled critically ill patients at a single institution who were on a ventilator or in shock and evaluated them daily for delirium using the Confusion Assessment Method for the ICU and for catatonia using the Bush-Francis Catatonia Rating Scale. Measures of association (OR) were assessed with a simple logistic regression model with catatonia as the independent variable and delirium as the dependent variable. Effect measure modification by age was assessed using a likelihood ratio test.
RESULTS/ANTICIPATED RESULTS: We enrolled 136 medical and surgical critically ill patients with 452 matched (concomitant) delirium and catatonia assessments. Median age was 59 years (IQR: 52–68). In our cohort of 136 patients, 58 patients (43%) had delirium only, 4 (3%) had catatonia only, 42 (31%) had both delirium and catatonia, and 32 (24%) had neither. Age was significantly associated with prevalent delirium (i.e., increasing age was associated with decreased risk of delirium) (p = 0.04) after adjusting for catatonia severity. Catatonia was significantly associated with prevalent delirium (p < 0.0001) after adjusting for age. Peak delirium risk was seen in patients aged 55 years with three or more catatonic signs, who had 53.4 times the odds of delirium (95% CI: 16.06, 176.75) compared with those with no catatonic signs. Patients aged 70 years and older with three or more catatonia features had half this risk.
DISCUSSION/SIGNIFICANCE OF IMPACT: Catatonia is significantly associated with prevalent delirium even after controlling for age. These data support an inverted U-shaped risk of delirium after adjusting for catatonia. This relationship and its clinical ramifications need to be examined in a larger sample, including patients with dementia. Additionally, we need to assess which acute brain syndrome (delirium or catatonia) develops first.
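A minimal sketch of the modelling approach described (logistic regression of delirium on catatonia, and a likelihood-ratio test for effect-measure modification by age); the data file and column names are hypothetical:

```python
# Illustrative only: the data file and column names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

df = pd.read_csv("icu_assessments.csv")  # hypothetical daily paired assessments

# Simple logistic regression: catatonia (number of signs) predicting delirium,
# adjusted for age; then the same model with a catatonia-by-age interaction.
base = smf.logit("delirium ~ catatonia_signs + age", data=df).fit(disp=0)
interact = smf.logit("delirium ~ catatonia_signs * age", data=df).fit(disp=0)

# Likelihood-ratio test for effect-measure modification by age
lr_stat = 2 * (interact.llf - base.llf)
df_diff = interact.df_model - base.df_model
p_value = stats.chi2.sf(lr_stat, df_diff)
print(f"LR statistic = {lr_stat:.2f}, p = {p_value:.4f}")
```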
The commissioning and operation of apparatus for neutron diffraction at simultaneous high temperatures and pressures are reported. The design is based on the Paris-Edinburgh cell, using opposed anvils with internal heating. Temperature is measured using neutron radiography. The apparatus has been shown in both on-line and off-line tests to operate to a pressure of 7 GPa and a temperature of 1700°C. It has been used in a neutron diffraction study of the crystal structure of deuterated brucite, and results for 520°C and 5.15 GPa are presented. The diffraction data that can be obtained from the apparatus are of comparable quality to those from previous high-pressure studies at ambient temperature, and are clearly good enough for Rietveld refinement analysis to give structural data of reasonable quality.
High definition video from a towed camera system was used to describe the deep-sea benthic habitats within an elongate depression located at the western margin of Rockall Bank in the Hatton–Rockall Basin. At depths greater than 1190 m, an extensive area (10 km long by 1.5 km wide) of what appeared to be reduced sediments, bacterial mats and flocculent matter indicated possible cold-seep habitat. Plumes of sediment-rich fluid were observed alongside raised elongate features that gave topographic relief to the otherwise flat seafloor. In the deepest section of the depression (1215 m) dense flocculent matter was observed suspended in the water column, in places obscuring the seabed. Away from the bacterial mats, the habitat changed rapidly to sediments dominated by tube-dwelling polychaete worms and then to deep-sea sedimentary habitats more typical for the water depth (sponges and burrowing megafauna in areas of gentle slopes, and coral gardens on steeper slopes).
Field experiments were conducted at five locations in Colorado, Kansas, and Wyoming in 1994–1995 and 1995–1996 to compare the effects of MON 37500 rate and application timings on downy brome control and winter wheat tolerance. MON 37500 at 18 to 35 g ha−1 applied preemergence or fall postemergence reduced downy brome density 71 to 92% in 1995. Spring-applied MON 37500 suppressed downy brome growth but did not reduce plant density. No application reduced downy brome density in 1996. At each location, downy brome was controlled best by MON 37500 applied preemergence or fall postemergence at 35 g ha−1. MON 37500 did not affect wheat height at Archer or Torrington, WY, and Burlington or Stratton, CO, but wheat treated preemergence or fall postemergence was taller than untreated wheat at Hays, KS, in 1995. Spring-postemergence-treated wheat at Hays in 1995 was shorter than untreated, preemergence-, or fall-postemergence-treated wheat. Wheat head density did not differ among treated and untreated plots at Torrington, but herbicide treatment increased wheat yields. Wheat head density increased with all MON 37500 treatments at Hays in 1995, as did yield. However, preemergence and fall-postemergence applications resulted in the highest wheat yields. No herbicide treatment affected head density or yield at Hays in 1996.
Crop yield loss–weed density relationships critically influence calculation of economic thresholds and the resulting management recommendations made by a bioeconomic model. To examine site-to-site and year-to-year variation in winter Triticum aestivum L. (winter wheat)–Aegilops cylindrica Host. (jointed goatgrass) interference relationships, the rectangular hyperbolic yield loss function was fit to data sets from multiyear field experiments conducted in Colorado, Idaho, Kansas, Montana, Nebraska, Utah, Washington, and Wyoming. The model was fit to three measures of A. cylindrica density: fall seedling, spring seedling, and reproductive tiller densities. Two parameters were estimated for each data set using nonlinear regression: i, the slope of the yield loss curve as A. cylindrica density approaches zero, and a, the maximum percentage yield loss as A. cylindrica density becomes very large. Fit of the model to the data was better using spring seedling densities than fall seedling densities, but it was similar for spring seedling and reproductive tiller densities based on the residual mean square (RMS) values. Yield loss functions were less variable among years within a site than among sites for all measures of weed density. For the one site where year-to-year variation was observed (Archer, WY), parameter a varied significantly among years, but parameter i did not. Yield loss functions differed significantly among sites for 7 of 10 comparisons. Site-to-site statistical differences were generally due to variation in estimates of parameter i. Site-to-site and year-to-year variation in winter T. aestivum–A. cylindrica yield loss parameter estimates indicated that management recommendations made by a bioeconomic model cannot be based on a single yield loss function with the same parameter values for the winter T. aestivum-producing region. The predictive ability of a bioeconomic model is likely to be improved when yield loss functions incorporating time of emergence and crop density are built into the model's structure.
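The rectangular hyperbolic yield loss function referred to here is commonly written as YL = iD/(1 + iD/a), where D is weed density, i is the slope of the curve as density approaches zero, and a is the asymptotic maximum percentage yield loss. A minimal curve-fitting sketch along those lines; the data file, column names, and starting values are assumptions:

```python
# Illustrative fit of the rectangular hyperbolic yield loss function
# YL = i*D / (1 + i*D/a); the data file, column names, and starting values
# are assumptions, not the study's actual data.
import pandas as pd
from scipy.optimize import curve_fit

def yield_loss(density, i, a):
    """Percent yield loss at weed density `density` (plants per square metre)."""
    return i * density / (1.0 + i * density / a)

data = pd.read_csv("goatgrass_site_year.csv")  # one hypothetical site-year data set
(i_hat, a_hat), _ = curve_fit(
    yield_loss,
    data["spring_seedling_density"],
    data["percent_yield_loss"],
    p0=[1.0, 50.0],  # starting guesses for i and a
)
print(f"i = {i_hat:.2f}, a = {a_hat:.1f}")
```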
Secale cereale is a serious weed problem in winter Triticum aestivum–producing regions. The interference relationships and economic thresholds of S. cereale in winter T. aestivum in Colorado, Kansas, Nebraska, and Wyoming were determined over 4 yr. Winter T. aestivum density was held constant at recommended planting densities for each site. Target S. cereale densities were 0, 5, 10, 25, 50, or 100 plants m−2. Secale cereale–winter T. aestivum interference relationships across locations and years were determined using a negative hyperbolic yield loss function. Two parameters—I, which represents the percent yield loss as S. cereale density approaches zero, and A, the maximum percent yield loss as S. cereale density increases—were estimated for each data set using nonlinear regression. Parameter I was more stable among years within locations than among locations within years, whereas maximum percentage yield loss was more stable across locations and years. Environmental conditions appeared to have a role in the stability of these relationships. Parameter estimates for I and A were incorporated into a second model to determine economic thresholds. On average, threshold values were between 4 and 5 S. cereale plants m−2; however, the large variation in these threshold values signifies considerable risk in making economic weed management decisions based upon these values.
A high-resolution analysis of benthic foraminifera and of aeolian terrigenous proxies from a 37 m-long marine core located off the Mauritanian margin, spanning the last ~1.2 Ma, documents a possible link between major continental environmental changes and a shift in the isotopic signature of deep waters around 1.0–0.9 Ma, within the so-called Mid-Pleistocene Transition (MPT) time period. The increase in the oxygen isotopic composition of deep waters, as seen in the benthic foraminiferal δ18O values, is consistent with the growth of larger ice sheets known to have occurred during this transition. Deep-water mass δ13C, also estimated from benthic foraminifera, shows a strong depletion over the same time interval. This drastic change in δ13C values is concomitant with a worldwide 0.3‰ decrease observed in the major deep oceanic waters for the MPT time period. The phase relationship between the increase in the aeolian terrigenous signal and this δ13C decrease in our record, as well as in other paleorecords, supports the hypothesis that global aridification, among other processes, contributed to the changes in the isotopic signature of deep-water masses during the MPT. In any case, the isotopic shifts imply major changes in the end-member δ18O and δ13C values of deep waters.
The paucity of modern pollen-rain data from Amazonia constitutes a significant barrier to understanding the Late Quaternary vegetation history of this globally important tropical forest region. Here, we present the first modern pollen-rain data for tall terra firme moist evergreen Amazon forest, collected between 1999 and 2001 from artificial pollen traps within a 500 × 20 m permanent study plot (14°34′50″S, 60°49′48″W) in Noel Kempff Mercado National Park (NE Bolivia). Spearman's rank correlations were performed to assess the extent of spatial and inter-annual variability in the pollen rain, whilst statistically distinctive taxa were identified using Principal Components Analysis (PCA). Comparisons with the floristic and basal area data of the plot (stems ≥10 cm d.b.h.) enabled the degree to which taxa are over/under-represented in the pollen rain to be assessed (using R-rel values). Moraceae/Urticaceae dominates the pollen rain (64% median abundance) and is also an important constituent of the vegetation, accounting for 16% of stems ≥10 cm d.b.h. and ca. 11% of the total basal area. Other important pollen taxa are Arecaceae (cf. Euterpe), Melastomataceae/Combretaceae, Cecropia, Didymopanax, Celtis, and Alchornea. However, 75% of stems and 67% of the total basal area of the plot ≥10 cm d.b.h. belong to species which are unidentified in the pollen rain, the most important of which are Phenakospermum guianensis (a banana-like herb) and the key canopy-emergent trees, Erisma uncinatum and Qualea paraensis.
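A minimal sketch of the two statistical steps named above (Spearman rank correlations between trap samples and a PCA across pollen taxa); the trap-by-taxon matrix and file name are hypothetical:

```python
# Illustrative only: the trap-by-taxon percentage matrix and file name are assumptions.
import pandas as pd
from scipy.stats import spearmanr
from sklearn.decomposition import PCA

# Rows = pollen traps (or trap-years), columns = pollen taxa percentages.
counts = pd.read_csv("pollen_trap_percentages.csv", index_col=0)

# Spearman rank correlation between two traps (spatial variability in pollen rain)
rho, p = spearmanr(counts.iloc[0], counts.iloc[1])
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")

# PCA across taxa; taxa with large loadings on the leading components are
# the "statistically distinctive" ones.
pca = PCA(n_components=2)
scores = pca.fit_transform(counts.values)
loadings = pca.components_
```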
Three models that empirically predict crop yield from crop and weed density were evaluated for their fit to 30 data sets from multistate, multiyear winter wheat–jointed goatgrass interference experiments. The purpose of the evaluation was to identify which model would generally perform best for the prediction of yield (damage function) in a bioeconomic model and which model would best fulfill criteria for hypothesis testing with limited amounts of data. Seven criteria were used to assess the fit of the models to the data. Overall, Model 2 provided the best statistical description of the data. Model 2 regressions were most often statistically significant, as indicated by approximate F tests, explained the largest proportion of total variation about the mean, gave the smallest residual sum of squares, and returned residuals with random distribution more often than Models 1 and 3. Model 2 performed less well based on the remaining criteria. Model 3 outperformed Models 1 and 2 in the number of parameters estimated that were statistically significant. Model 1 outperformed Models 2 and 3 in the proportion of regressions that converged on a solution and more readily exhibited an asymptotic relationship between winter wheat yield and both winter wheat and jointed goatgrass density under the constraint of limited data. In contrast, Model 2 exhibited a relatively linear relationship between yield and crop density and little effect of increasing jointed goatgrass density on yield, thus overpredicting yield at high weed densities when data were scarce. Model 2 had statistical properties that made it superior for hypothesis testing; however, Model 1's properties were determined superior for the damage function in the winter wheat–jointed goatgrass bioeconomic model because it was less likely to cause bias in yield predictions based on data sets of minimum size.
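Two of the fit criteria listed above (residual sum of squares and the proportion of total variation about the mean explained by a model) can be computed directly from observed and predicted yields; a minimal sketch with hypothetical values:

```python
# Illustrative only: observed and predicted winter wheat yields are made-up values.
import numpy as np

observed = np.array([2.8, 3.1, 2.4, 1.9, 1.5, 1.2])   # hypothetical yields, t/ha
predicted = np.array([2.9, 3.0, 2.3, 2.0, 1.6, 1.1])  # from a fitted damage function

residuals = observed - predicted
rss = np.sum(residuals ** 2)                      # residual sum of squares
tss = np.sum((observed - observed.mean()) ** 2)   # total variation about the mean
r_squared = 1.0 - rss / tss                       # proportion of variation explained
print(f"RSS = {rss:.3f}, proportion of variation explained = {r_squared:.3f}")
```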
Research during the past several decades on jointed goatgrass management has focused on individual cultural practices rather than on multi- or interdisciplinary components. Field studies were conducted at Hays, KS, from 1997 to 2003 to evaluate the interaction of crop rotation, fallow weed management, and winter wheat variety on jointed goatgrass density. Extending a wheat–fallow (W–F) rotation to include grain sorghum or grain sorghum and sunflower reduced jointed goatgrass populations more than other cultural practices tested. Fallow treatments were equal in most years, but mechanical fallow resulted in increased jointed goatgrass emergence compared with chemical fallow under drought conditions. Winter wheat cultivars had little effect on jointed goatgrass populations. However, taller, more competitive varieties are favorable for jointed goatgrass control in an integrated management program. No specific combination of crop rotation, fallow weed management, and wheat variety consistently reduced jointed goatgrass density more than other combinations during multiple years.
Over 300 cases of acute toxoplasmosis are confirmed by reference testing in England and Wales annually. We conducted a case-control study to identify risk factors for Toxoplasma gondii infection to inform prevention strategies. Twenty-eight cases and 27 seronegative controls participated. We compared their food history and environmental exposures using logistic regression to calculate odds ratios (OR) and 95% confidence intervals in a model controlling for age and sex. Univariable analysis showed that the odds of eating beef (OR 10·7, P < 0·001), poultry (OR 6·4, P = 0·01) or lamb/mutton (OR 4·9, P = 0·01) were higher for cases than for controls. After adjustment for potential confounders, a strong association between beef and infection remained (OR 5·6, P = 0·01). The small sample size was a significant limitation, and larger studies are needed to fully investigate potential risk factors. The study findings emphasize the need to ensure food is thoroughly cooked and handled hygienically, especially for those in vulnerable groups.
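A minimal sketch of the adjusted odds-ratio calculation described (logistic regression of case status on an exposure, controlling for age and sex); the data file and column names are hypothetical:

```python
# Illustrative only: the data file and column names are assumptions; exposure
# variables are coded 0/1.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("toxoplasma_case_control.csv")  # hypothetical case-control line list

fit = smf.logit("case ~ ate_beef + age + sex", data=df).fit(disp=0)
odds_ratios = np.exp(fit.params)      # exponentiated coefficients are odds ratios
conf_int = np.exp(fit.conf_int())     # 95% confidence intervals on the OR scale
print(odds_ratios["ate_beef"], conf_int.loc["ate_beef"].tolist())
```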
Diarrhoeal diseases are major causes of morbidity and mortality in developing countries. This longitudinal study aimed to identify controllable environmental drivers of intestinal infections amidst a highly contaminated drinking water supply in urban slums and villages of Vellore, Tamil Nadu, in southern India. Three hundred households with children (<5 years) residing in two semi-urban slums and three villages were visited weekly for 12–18 months to monitor gastrointestinal morbidity. Households were surveyed at baseline to obtain information on environmental and behavioural factors relevant to diarrhoea. There were 258 diarrhoeal episodes during the follow-up period, resulting in an overall incidence rate of 0·12 episodes/person-year. Incidence and longitudinal prevalence rates of diarrhoea were twofold higher in the slums compared with the rural communities (P < 0·0002). Regardless of study site, diarrhoeal incidence was highest in infants (<1 year) at 1·07 episodes/person-year, and decreased gradually with increasing age. Higher diarrhoeal rates were associated with the presence of children (<5 years), domesticated animals, and low socioeconomic status. In rural communities, open-field defecation was associated with diarrhoea in young children. This study demonstrates the contribution of site-specific environmental and behavioural factors in influencing endemic rates of urban and rural diarrhoea in a region with highly contaminated drinking water.
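The overall incidence rate quoted above is the number of episodes divided by the person-time at risk; a minimal sketch in which the person-time denominator is an assumed figure consistent with the reported rate (the abstract does not state it):

```python
# Illustrative only: incidence rate = episodes / person-years at risk.
# The episode count comes from the text; the person-time denominator is an
# assumed figure consistent with the reported rate, not a reported value.
episodes = 258
person_years = 2150.0  # assumed total follow-up time
incidence_rate = episodes / person_years
print(f"{incidence_rate:.2f} episodes/person-year")  # ~0.12
```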
Nearly 10% of the world's total forest area is formally owned by communities and indigenous groups, yet knowledge of the conservation (and livelihood) impacts of decentralized forest management approaches remains limited. In this paper, the conservation impact of decentralized forest management on two forests in Tanzania was evaluated using a mixed-method approach. Current forest condition, forest increment and forest use patterns were assessed through forest inventories, and changes in forest disturbance levels before and after the implementation of decentralized forest management were assessed on the basis of analyses of Landsat images. This biophysical evidence was then linked to changes in actual management practices, assessed through records, interviews and participatory observations, to provide a measure of the conservation impact of the policy change. Both forests in the study were found to be in good condition, and extraction was lower than overall forest increment. Divergent changes in forest disturbance levels were in evidence following the implementation of decentralized forest management. The evidence from records, interviews and participatory observations indicated that decentralized management had led to increased control of forest use, and the observed divergence in forest disturbance levels appeared to be linked to differences in the way that village-level forest managers prioritized conservation objectives and forest-based livelihood strategies. The study illustrates that a mixed-methods approach is a valid and promising way to evaluate the impacts of conservation policies, even in the absence of control sites. By carefully linking policy outcomes to policy outputs, such an approach identifies not only whether such policies work as intended, but also the potential mechanisms involved.
A report on Toxoplasma gondii by the UK Advisory Committee on the Microbiological Safety of Food identified the need for more accurate figures on the burden of disease in the UK. We present the first 5 years of data from an enhanced surveillance scheme for toxoplasmosis in England and Wales. Between 2008 and 2012, 1824 cases were reported, with an average of 365 each year. There were 1109 immunocompetent cases, the majority presenting with lymphadenopathy, and 364 immunosuppressed cases, in whom central nervous system and systemic symptoms were most frequently reported. There were also 190 pregnant and 33 congenital cases. Of the pregnant cases, 148 were asymptomatic (probably detected during screening), while 28 suffered a fetal loss or stillbirth. The enhanced surveillance system has led to an improvement in the detection of toxoplasmosis in England and Wales. However, the numbers are still likely to be an underestimate, biased towards the more severe infections.