Diagnostic criteria for major depressive disorder allow for heterogeneous symptom profiles, but genetic analysis of major depressive symptoms has the potential to identify clinical and etiological subtypes. There are several challenges to integrating symptom data from genetically informative cohorts, such as sample size differences between clinical and community cohorts and various patterns of missing data.
Methods
We conducted genome-wide association studies of major depressive symptoms in three cohorts that were enriched for participants with a diagnosis of depression (Psychiatric Genomics Consortium, Australian Genetics of Depression Study, Generation Scotland) and three community cohorts that were not recruited on the basis of diagnosis (Avon Longitudinal Study of Parents and Children, Estonian Biobank, and UK Biobank). We fit a series of confirmatory factor models with factors that accounted for how symptom data were sampled and then compared alternative models with different symptom factors.
Results
The best fitting model had a distinct factor for Appetite/Weight symptoms and an additional measurement factor that accounted for the skip-structure in community cohorts (use of Depression and Anhedonia as gating symptoms).
Conclusion
The results show the importance of assessing the directionality of symptoms (such as hypersomnia versus insomnia) and of accounting for study and measurement design when meta-analyzing genetic association data.
Blood pressure variability (BPV), independent of traditionally targeted average blood pressure levels, is an emerging vascular risk factor for stroke, cerebrovascular disease, and dementia, possibly through links with vascular-endothelial injury. Recent evidence suggests visit-to-visit (e.g., over months, years) BPV is associated with cerebrovascular disease severity, but less is known about relationships with short-term (e.g., < 24 hours) fluctuations in blood pressure. Additionally, it is unclear how BPV may be related to angiogenic growth factors that play a role in cerebral arterial health.
Participants and Methods:
We investigated relationships between short-term BPV, white matter hyperintensities on MRI, and levels of plasma vascular endothelial growth factor (VEGF) in a sample of community-dwelling older adults (n = 57, ages 55-88) without history of dementia or stroke. Blood pressure was collected continuously during a 5-minute resting period. BPV was calculated as variability independent of mean, a commonly used index of BPV uncorrelated with average blood pressure levels. Participants underwent T2-FLAIR MRI to determine severity of white matter lesion burden. Severity of lesions was classified as Fazekas scores (0-3). Participants also underwent venipuncture to determine levels of plasma VEGF. Ordinal logistic regression examined the association between BPV and Fazekas scores. Multiple linear regression explored relationships between BPV and VEGF. Models controlled for age, sex, and average blood pressure.
Results:
Elevated BPV was related to greater white matter lesion burden (i.e., Fazekas score) (systolic: OR = 1.17 [95% CI 1.01, 1.37]; p = .04; diastolic: OR = 2.47 [95% CI 1.09, 5.90]; p = .03) and increased levels of plasma VEGF (systolic: β = .39 [95% CI .11, .67]; adjusted R2 = .16; p = .007; diastolic: β = .48 [95% CI .18, .78]; adjusted R2 = .18; p = .003).
Conclusions:
Findings suggest short-term BPV may be related to cerebrovascular disease burden and angiogenic growth factors relevant to cerebral arterial health, independent of average blood pressure. Understanding the role of BPV in cerebrovascular disease and vascular-endothelial health may help elucidate the increased risk for stroke and dementia associated with elevated BPV.
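The BPV index used in this study, variability independent of mean (VIM), removes the correlation between an individual's blood pressure variability and their mean level by fitting a power law across the cohort. A minimal Python sketch of that computation follows; implementation details, such as the scaling constant, vary across studies:

```python
import numpy as np
from scipy.optimize import curve_fit

def variability_independent_of_mean(sd, mean):
    """VIM: scale each participant's SD so it is uncorrelated with
    their mean BP. The exponent x is estimated cohort-wide from
    SD = a * mean**x; each SD is then divided by mean**x and rescaled
    by the cohort's grand mean so units stay comparable to SD."""
    sd = np.asarray(sd, dtype=float)
    mean = np.asarray(mean, dtype=float)
    # Fit the cohort-wide power relationship SD = a * mean**x
    (a, x), _ = curve_fit(lambda m, a, x: a * m**x, mean, sd, p0=(1.0, 1.0))
    return sd * mean.mean()**x / mean**x
```

When the fitted power law holds exactly, the resulting VIM is constant across participants, which is what makes the index "independent of mean".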
To examine associations between maternal characteristics and feeding styles in Caribbean mothers.
Design:
Participants were mother–child pairs enrolled in a cluster randomised trial of a parenting intervention in three Caribbean islands. Maternal characteristics were obtained by questionnaires when infants were 6–8 weeks old. Items adapted from the Toddler Feeding Behaviour Questionnaire were used to assess infant feeding styles at the age of 1 year. Feeding styles were identified using factor analysis and associations with maternal characteristics assessed using multilevel linear regression.
Setting:
Health clinics in St. Lucia (n = 9), Antigua (n = 10) and Jamaica (n = 20).
Participants:
A total of 405 mother–child pairs from the larger trial.
Results:
Maternal depressive symptoms were associated with uninvolved (β = 0·38, 95 % CI (0·14, 0·62)), restrictive (β = 0·44, 95 % CI (0·19, 0·69)) and forceful (β = 0·31, 95 % CI (0·06, 0·57)) feeding and inversely associated with responsive feeding (β = −0·30, 95 % CI (−0·56, −0·05)). Maternal vocabulary was inversely associated with uninvolved (β = −0·31, 95 % CI (−0·57, −0·06)), restrictive (β = −0·30, 95 % CI (−0·56, −0·04)), indulgent (β = −0·47, 95 % CI (−0·73, −0·21)) and forceful (β = −0·54, 95 % CI (−0·81, −0·28)) feeding. Indulgent feeding was negatively associated with socio-economic status (β = −0·27, 95 % CI (−0·53, −0·00)) and was lower among mothers ≥35 years (β = −0·32, 95 % CI (−0·62, −0·02)). Breast-feeding at 1 year was associated with forceful feeding (β = 0·41, 95 % CI (0·21, 0·61)). No significant associations were found between maternal education, BMI, occupation and feeding styles.
Conclusion:
Services to identify and assist mothers with depressive symptoms may benefit infant feeding style. Interventions to promote responsive feeding may be important for less educated, younger and socio-economically disadvantaged mothers.
Registry-based trials have emerged as a potentially cost-saving study methodology. Early estimates of cost savings, however, conflated the benefits associated with registry utilisation and those associated with other aspects of pragmatic trial designs, which might not all be as broadly applicable. In this study, we sought to build a practical tool that investigators could use across disciplines to estimate the ranges of potential cost differences associated with implementing registry-based trials versus standard clinical trials.
Methods:
We built simulation Markov models to compare unique costs associated with data acquisition, cleaning, and linkage under a registry-based trial design versus a standard clinical trial. We conducted one-way, two-way, and probabilistic sensitivity analyses, varying study characteristics over broad ranges, to determine thresholds at which investigators might optimally select each trial design.
Results:
Registry-based trials were more cost-effective than standard clinical trials 98.6% of the time. Data-related cost savings ranged from $4300 to $600,000 with variation in study characteristics. Cost differences were most sensitive to the number of patients in a study, the number of data elements per patient available in a registry, and the speed with which research coordinators could manually abstract data. Registry incorporation resulted in cost savings when as few as 3768 independent data elements were available and when manual data abstraction took as little as 3.4 seconds per data field.
Conclusions:
Registries offer important resources for investigators. When available, their broad incorporation may help the scientific community reduce the costs of clinical investigation. We offer here a practical tool for investigators to assess potential costs savings.
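As an illustration of the kind of threshold analysis described above, a toy Monte Carlo comparison of data-related costs can be sketched as follows. All parameter ranges and cost components here are hypothetical placeholders, not the study's Markov model inputs:

```python
import random

def simulate_cost_savings(n_sims=10_000, seed=0):
    """Draw hypothetical study characteristics and compare the cost of
    manual data abstraction against a fixed registry linkage/cleaning
    cost. Returns per-simulation savings (positive = registry cheaper)."""
    rng = random.Random(seed)
    savings = []
    for _ in range(n_sims):
        n_patients = rng.randint(100, 5000)
        elements_per_patient = rng.randint(10, 200)
        seconds_per_field = rng.uniform(3.0, 60.0)            # manual abstraction speed
        wage_per_hour = rng.uniform(20.0, 40.0)               # coordinator wage, $/h
        registry_fixed_cost = rng.uniform(5_000.0, 50_000.0)  # linkage + cleaning
        manual_cost = (n_patients * elements_per_patient
                       * seconds_per_field / 3600.0 * wage_per_hour)
        savings.append(manual_cost - registry_fixed_cost)
    return savings

savings = simulate_cost_savings()
share_registry_cheaper = sum(s > 0 for s in savings) / len(savings)
```

Varying one input while holding the others fixed reproduces the one-way sensitivity logic; sampling all inputs at once, as here, corresponds to the probabilistic analysis.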
Heat stress is a global issue constraining pig productivity, and it is likely to intensify under future climate change. Technological advances in earth observation have made tools available that enable the identification and mapping of livestock species at risk of exposure to heat stress due to climate change. Here, we present a methodology to map current and likely future heat stress risk in pigs using R software, combining the effects of temperature and relative humidity. We applied the method to growing-finishing pigs in Uganda. We mapped monthly heat stress risk and quantified the number of pigs exposed to heat stress using 18 general circulation models and projected impacts in the 2050s. Results show that more than 800 000 pigs in Uganda will be affected by heat stress in the future. The results can feed into evidence-based policy, planning and targeted resource allocation in the livestock sector.
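The study combined temperature and relative humidity in R; as a language-neutral sketch (here in Python), one widely used temperature-humidity index (THI) formulation for livestock looks like the following. The risk bands are hypothetical placeholders, not the thresholds applied in this study:

```python
def thi(temp_c, rh_pct):
    """Temperature-humidity index, one common livestock formulation:
    THI = T - (0.55 - 0.0055 * RH) * (T - 14.5),
    with T in degrees Celsius and RH in percent."""
    return temp_c - (0.55 - 0.0055 * rh_pct) * (temp_c - 14.5)

def heat_stress_band(value):
    """Illustrative risk bands only; thresholds for growing-finishing
    pigs should be taken from the species-specific literature."""
    if value < 22.0:
        return "no stress"
    if value < 25.0:
        return "mild"
    if value < 28.0:
        return "moderate"
    return "severe"
```

Applied to gridded temperature and humidity layers, cell-by-cell, this kind of index yields the monthly risk maps the paper describes.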
Neurocognitive and functional neuroimaging studies point to frontal lobe abnormalities in schizophrenia. Molecular and behavioural genetic studies suggest that the frontal lobe is under significant genetic influence. We carried out structural magnetic resonance imaging (MRI) of the frontal lobe in monozygotic (MZ) twins concordant or discordant for schizophrenia and healthy MZ control twins.
Methods:
The sample comprised 21 concordant pairs, 17 discordant affected and 18 discordant unaffected twins from 19 discordant pairs, and 27 control pairs. Groups were matched on sociodemographic variables. Patient groups (concordant, discordant affected) did not differ on clinical variables. Volumes of superior, middle, inferior and orbital frontal gyri were calculated using the Cavalieri principle on the basis of manual tracing of anatomic boundaries. Group differences were investigated covarying for whole-brain volume, gender and age.
Results:
Results for superior frontal gyrus showed that twins with schizophrenia (i.e. concordant twins and discordant affected twins) had reduced volume compared to twins without schizophrenia (i.e. discordant unaffected and control twins), indicating an effect of illness. For middle and orbital frontal gyrus, concordant (but not discordant affected) twins differed from non-schizophrenic twins. There were no group differences in inferior frontal gyrus volume.
Conclusions:
These findings suggest that volume reductions in the superior frontal gyrus are associated with a diagnosis of schizophrenia (in the presence or absence of a co-twin with schizophrenia). On the other hand, volume reductions in middle and orbital frontal gyri are seen only in concordant pairs, perhaps reflecting the increased genetic vulnerability in this group.
Major depression is a significant problem for people with a traumatic brain injury (TBI), and its treatment remains difficult. A promising approach to treating depression is mindfulness-based cognitive therapy (MBCT), a relatively new therapeutic approach rooted in mindfulness-based stress reduction (MBSR) and cognitive behavioral therapy (CBT). We conducted this study to examine the effectiveness of MBCT in reducing depression symptoms among people who have a TBI.
Methods:
Twenty individuals diagnosed with major depression were recruited from a rehabilitation clinic and completed the 8-week MBCT intervention. Instruments used to measure depression symptoms included: BDI-II, PHQ-9, HADS, SF-36 (Mental Health subscale), and SCL-90 (Depression subscale). They were completed at baseline and post-intervention.
Results:
All instruments indicated a statistically significant reduction in depression symptoms post-intervention (p < .05). For example, the total mean score on the BDI-II decreased from 25.2 (9.8) at baseline to 18.2 (11.7) post-intervention (p = .001). Using a PHQ-9 threshold of 10, the proportion of participants with a diagnosis of major depression was reduced by 59% at follow-up (p = .012).
Conclusions:
Most participants reported reductions in depression symptoms after the intervention such that many would not meet the criteria for a diagnosis of major depression. This intervention may provide an opportunity to address a debilitating aspect of TBI and could be implemented concurrently with more traditional forms of treatment, possibly enhancing their success. The next step will involve the execution of multi-site, randomized controlled trials to fully demonstrate the value of the intervention.
Healthcare personnel (HCP) were recruited to provide serum samples, which were tested for antibodies against Ebola or Lassa virus to evaluate for asymptomatic seroconversion.
Setting:
From 2014 to 2016, 4 patients with Ebola virus disease (EVD) and 1 patient with Lassa fever (LF) were treated in the Serious Communicable Diseases Unit (SCDU) at Emory University Hospital. Strict infection control and clinical biosafety practices were implemented to prevent nosocomial transmission of EVD or LF to HCP.
Participants:
All personnel who entered the SCDU, and who were required to measure their temperatures and complete a symptom questionnaire twice daily, were eligible.
Results:
No employee developed symptomatic EVD or LF. EVD and LF antibody studies were performed on serum samples from 42 HCP. The 6 participants who had received investigational vaccination with a chimpanzee adenovirus type 3 vectored Ebola glycoprotein vaccine had high antibody titers to Ebola glycoprotein, but none had a response to Ebola nucleoprotein or VP40, or a response to LF antigens.
Conclusions:
Patients infected with filoviruses and arenaviruses can be managed successfully without causing occupation-related symptomatic or asymptomatic infections. Meticulous attention to infection control and clinical biosafety practices by highly motivated, trained staff is critical to the safe care of patients with an infection from a special pathogen.
Clinical Enterobacteriaceae isolates with a colistin minimum inhibitory concentration (MIC) ≥4 mg/L from a United States hospital were screened for the mcr-1 gene using real-time polymerase chain reaction (RT-PCR) and confirmed by whole-genome sequencing. Four colistin-resistant Escherichia coli isolates contained mcr-1. Two isolates belonged to the same sequence type (ST-632). All subjects had prior international travel and antimicrobial exposure.
To evaluate the association between novel pre- and post-operative biomarker levels and 30-day unplanned readmission or mortality after paediatric congenital heart surgery.
Methods:
Children aged 18 years or younger undergoing congenital heart surgery (n = 162) at Johns Hopkins Hospital from 2010 to 2014 were enrolled in the prospective cohort. Collected novel pre- and post-operative biomarkers included soluble suppression of tumorigenicity 2, galectin-3, N-terminal prohormone of brain natriuretic peptide, and glial fibrillary acidic protein. A model based on clinical variables from the Society of Thoracic Surgeons database was developed and evaluated against two augmented models.
Results:
Unplanned readmission or mortality within 30 days of cardiac surgery occurred among 21 (13%) children. The clinical model augmented with pre-operative biomarkers demonstrated a statistically significant improvement over the clinical model alone, with an area under the receiver-operating characteristic curve of 0.754 (95% confidence interval: 0.65–0.86) compared to 0.617 (95% confidence interval: 0.47–0.76; p-value: 0.012). The clinical model augmented with pre- and post-operative biomarkers also demonstrated a significant improvement over the clinical model alone, with an area under the receiver-operating characteristic curve of 0.802 (95% confidence interval: 0.72–0.89; p-value: 0.003).
Conclusions:
Novel biomarkers add significant predictive value when assessing the likelihood of unplanned readmission or mortality after paediatric congenital heart surgery. Further exploration of the utility of these novel biomarkers during the pre- or post-operative period to identify early risk of mortality or readmission will aid in determining the clinical utility and application of these biomarkers into routine risk assessment.
Using existing data from clinical registries to support clinical trials and other prospective studies has the potential to improve research efficiency. However, little has been reported about staff experiences and lessons learned from implementation of this method in pediatric cardiology.
Objectives:
We describe the process of using existing registry data in the Pediatric Heart Network Residual Lesion Score Study, report stakeholders’ perspectives, and provide recommendations to guide future studies using this methodology.
Methods:
The Residual Lesion Score Study, a 17-site prospective, observational study, piloted the use of existing local surgical registry data (collected for submission to the Society of Thoracic Surgeons-Congenital Heart Surgery Database) to supplement manual data collection. A survey regarding processes and perceptions was administered to study site and data coordinating center staff.
Results:
Survey response rate was 98% (54/55). Overall, 57% perceived that using registry data saved research staff time in the current study, and 74% perceived that it would save time in future studies; 55% noted significant upfront time in developing a methodology for extracting registry data. Survey recommendations included simplifying data extraction processes and tailoring to the needs of the study, understanding registry characteristics to maximise data quality and security, and involving all stakeholders in design and implementation processes.
Conclusions:
Use of existing registry data was perceived to save time and promote efficiency. Consideration must be given to the upfront investment of time and resources needed. Ongoing efforts focussed on automating and centralising data management may aid in further optimising this methodology for future studies.
We sought to address prior limitations of symptom checker accuracy by analysing the diagnostic and triage feasibility of online symptom checkers using a consecutive series of real-life emergency department (ED) patient encounters in a complex patient population: those with hepatitis C or HIV. We aimed to study the diagnostic and triage accuracy of these symptom checkers against the emergency physician-determined diagnosis. A retrospective analysis was performed on 8363 consecutive adult ED patients. Eligible patients included 90 with HIV, 67 with hepatitis C, and 11 with both HIV and hepatitis C. Five online symptom checkers were used for diagnosis (Mayo Clinic, WebMD, Symptomate, Symcat, Isabel), three of which had triage capabilities. Symptom checker output was compared with the ED physician-determined diagnosis for diagnostic accuracy and differential diagnosis listing, along with triage advice. All symptom checkers, whether for combined HIV and hepatitis C, HIV alone or hepatitis C alone, had poor diagnostic accuracy: Top 1 (<20%), Top 3 (<35%), Top 10 (<40%) and listed at all (<45%). Accuracy varied significantly across individual symptom checkers: some were more accurate at listing the diagnosis near the top of the differential, while others were more likely to list the diagnosis at all. With regard to ED triage data, a significantly higher percentage of hepatitis C patients (59.7%; 40/67) than HIV patients (35.6%; 32/90) had an initial diagnosis meeting emergent criteria. Symptom checker diagnostic capabilities are markedly inferior to physician diagnostic capabilities. Complex patients such as those with HIV or hepatitis C may carry a more specific differential diagnosis, warranting symptom checkers with diagnostic algorithms that account for such complexity.
Symptom checkers carry the potential for real-time epidemiological monitoring of patient symptoms, as symptom entries and subsequent symptom checker diagnoses could give health officials a means to track illnesses in specific patient populations and geographic regions. To do this, however, accurate and reliable symptom checkers are needed.
Polyphenol oxidase (PPO) in red clover (RC) has been shown to reduce both lipolysis and proteolysis in silo and has been implicated (in vitro) in the rumen. However, all in vivo comparisons have compared RC with other forages, typically with lower levels of PPO, which introduces confounding factors as to the cause of the greater protection of dietary nitrogen (N) and C18 polyunsaturated fatty acids (PUFA) on RC silage. This study compared two RC silages with contrasting PPO activities when ensiled (RC+ and RC−) against a control of perennial ryegrass silage (PRG) to ascertain the effect of PPO activity on dietary N digestibility and PUFA biohydrogenation. Two studies were performed: the first investigated rumen and duodenal flow in six Hereford×Friesian steers prepared with rumen and duodenal cannulae, and the second investigated whole-tract N balance in six Holstein-Friesian non-lactating dairy cows. All diets were offered at a restricted level based on animal live weight, with each experiment consisting of two 3×3 Latin squares using big bale silages ensiled in 2010 and 2011, respectively. In the first experiment, digesta flow at the duodenum was estimated using a dual-phase marker system with ytterbium acetate and chromium ethylenediaminetetraacetic acid as particulate and liquid phase markers, respectively. Total N intake was higher on the RC silages in both experiments and higher on RC− than RC+. Rumen ammonia-N reflected intake, with ammonia-N per unit of N intake lower on RC+ than RC−. Microbial N duodenal flow was comparable across all silage diets, with non-microbial N higher on RC than PRG and no difference between RC+ and RC−, even when reported on an N intake basis. C18 PUFA biohydrogenation was lower on RC silage diets than PRG, with no difference between RC+ and RC−. The N balance trial showed greater retention of N on RC+ than RC−; however, this response is more likely related to the difference in N intake than to any PPO-driven protection.
The lack of difference between RC silages, despite contrasting levels of PPO, may reflect the similar level of protein-bound-phenol complexing determined in each RC silage. This complexing has previously been associated with PPO's protection mechanism; however, this study has shown that protection is not related to total PPO activity.
The north-west European population of Bewick’s Swan Cygnus columbianus bewickii declined by 38% between 1995 and 2010 and is listed as ‘Endangered’ on the European Red List of birds. Here, we combined information on food resources within the landscape with long-term data on swan numbers, habitat use, behaviour and two complementary measures of body condition, to examine whether changes in food type and availability have influenced the Bewick’s Swan’s use of their main wintering site in the UK, the Ouse Washes and surrounding fens. The maximum number of Bewick’s Swans rose from 620 in winter 1958/59 to a high of 7,491 in winter 2004/05, before falling to 1,073 birds in winter 2013/14. Between winters 1958/59 and 2014/15 the Ouse Washes supported between 0.5 and 37.9% of the total population wintering in north-west Europe (mean ± 95% CI = 18.1 ± 2.4%). Swans fed on agricultural crops, shifting from post-harvest remains of root crops (e.g. sugar beet and potatoes) in November and December to winter-sown cereals (e.g. wheat) in January and February. Inter-annual variation in the area cultivated for these crops did not result in changes in the peak numbers of swans occurring on the Ouse Washes. Behavioural and body condition data indicated that food supplies on the Ouse Washes and surrounding fens remain adequate to allow the birds to gain and maintain good body condition throughout winter with no increase in foraging effort. Our findings suggest that the recent decline in numbers of Bewick’s Swans at this internationally important site was not linked to inadequate food resources.
A cluster of Salmonella Paratyphi B variant L(+) tartrate(+) infections with indistinguishable pulsed-field gel electrophoresis patterns was detected in October 2015. Interviews initially identified nut butters, kale, kombucha, chia seeds and nutrition bars as common exposures. Epidemiologic, environmental and traceback investigations were conducted. Thirteen ill people infected with the outbreak strain were identified in 10 states with illness onset during 18 July–22 November 2015. Eight of 10 (80%) ill people reported eating Brand A raw sprouted nut butters. Brand A conducted a voluntary recall. Raw sprouted nut butters are a novel outbreak vehicle, though contaminated raw nuts, nut butters and sprouted seeds have all caused outbreaks previously. Firms producing raw sprouted products, including nut butters, should consider a kill step to reduce the risk of contamination. People at greater risk for foodborne illness may wish to consider avoiding raw products containing raw sprouted ingredients.
The crystal structure of the high-pressure phase-II of cristobalite has been solved by neutron diffraction (space group P21/c, a = 8.3780(11) Å, b = 4.6018(6) Å, c = 9.0568(13) Å, β = 124.949(7)°, at P = 3.5 GPa). This phase corresponds to a distortion of the high-temperature cubic β-phase, rather than of the ambient temperature and pressure tetragonal α-phase.
Introduction: The ECG diagnosis of acute coronary occlusion (ACO) in the setting of ventricular paced rhythm (VPR) is purported to be impossible. However, VPR has a similar ECG morphology to LBBB. The validated Smith-modified Sgarbossa criteria (MSC) have high sensitivity (Sens) and specificity (Spec) for ACO in LBBB. MSC consist of ≥1 of the following in ≥1 lead: concordant ST elevation (STE) ≥1 mm, concordant ST depression ≥1 mm in V1-V3, or ST/S ratio <−0.25 (in leads with ≥1 mm STE). We hypothesized that the MSC would have higher Sens for diagnosis of ACO in VPR when compared to the original Sgarbossa criteria. We report preliminary findings of the Paced Electrocardiogram Requiring Fast Emergency Coronary Therapy (PERFECT) study. Methods: The PERFECT study is a retrospective, multicenter, international investigation of ED patients from 1/2008 - 12/2016 with VPR on the ECG and symptoms suggestive of acute coronary syndrome (e.g. chest pain or shortness of breath). Data from four sites are presented. Acute myocardial infarction (AMI) was defined by the Third Universal Definition of AMI. A blinded cardiologist adjudicated ACO, defined as a Thrombolysis in Myocardial Infarction (TIMI) score of 0 or 1 on coronary angiography; a pre-defined subgroup of ACO patients with peak cardiac troponin (cTn) >100 times the 99th percentile upper reference limit (URL) of the cTn assay was also analyzed. Another blinded physician measured all ECGs. Statistics were by Mann–Whitney U, chi-square, and McNemar's tests. Results: The ACO and No-AMI groups consisted of 15 and 79 encounters, respectively. For the ACO and No-AMI groups, median age was 78 [IQR 72-82] vs. 70 [61-75] and 13 (86%) vs. 48 (61%) patients were male. The median peak cTn ratio (cTn/URL) was 260 [33-663] and 0.5 [0-1.3] for ACO vs. No-AMI. The Sens and Spec for the MSC and the original Sgarbossa criteria were 67% (95% CI 39-87) vs. 46% (22-72; p=0.25) and 99% (92-100) vs. 99% (92-100; p=0.5).
In a pre-defined subgroup analysis of ACO patients with peak cTn >100 times the URL (n=10), the Sens was 90% (54-100) for the MSC vs. 60% (27-86) for the original Sgarbossa criteria (p=0.25). Conclusion: ACO in VPR is an uncommon condition. The MSC showed good Sens for diagnosis of ACO in the presence of VPR, especially among patients with high peak cTn, and Spec was excellent. These methods and results are consistent with studies that have used the MSC to diagnose ACO in LBBB.
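The three MSC criteria lend themselves to a simple rule check. The sketch below assumes per-lead signed measurements in millimetres (ST deviation and predominant QRS deflection); this input convention is an illustrative assumption, not the PERFECT study's measurement protocol:

```python
def meets_msc(leads):
    """Smith-modified Sgarbossa criteria: positive if any one
    criterion holds in any one lead:
      1. concordant ST elevation >= 1 mm;
      2. concordant ST depression >= 1 mm in V1-V3;
      3. discordant ST elevation with ST/S ratio < -0.25
         (in leads with >= 1 mm ST elevation).
    leads: {name: {"st": mm, "qrs": mm}}, signed values where
    positive = upward deflection (elevation / dominant R wave)."""
    for name, m in leads.items():
        st, qrs = m["st"], m["qrs"]
        concordant = st * qrs > 0  # ST shifts the same way as the QRS
        if concordant and st >= 1.0:
            return True          # criterion 1
        if concordant and st <= -1.0 and name in ("V1", "V2", "V3"):
            return True          # criterion 2
        if st >= 1.0 and qrs < 0 and st / qrs < -0.25:
            return True          # criterion 3 (excessively discordant STE)
    return False
```

For example, a lead with 2 mm STE over a 5 mm S wave gives an ST/S ratio of −0.4 and is positive under criterion 3, whereas 1 mm STE over a 10 mm S wave (ratio −0.1) is appropriately discordant and negative.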
On 27 April 2015, Washington health authorities identified Escherichia coli O157:H7 infections associated with dairy education school field trips held in a barn 20–24 April. Investigation objectives were to determine the magnitude of the outbreak, identify the source of infection, prevent secondary illness transmission and develop recommendations to prevent future outbreaks. Case-finding, hypothesis generating interviews, environmental site visits and a case–control study were conducted. Parents and children were interviewed regarding event activities. Odds ratios (OR) and 95% confidence intervals (CI) were computed. Environmental testing was conducted in the barn; isolates were compared to patient isolates using pulsed-field gel electrophoresis (PFGE). Sixty people were ill, 11 (18%) were hospitalised and six (10%) developed haemolytic uremic syndrome. Ill people ranged in age from <1 year to 47 years (median: 7), and 20 (33%) were female. Twenty-seven case-patients and 88 controls were enrolled in the case–control study. Among first-grade students, handwashing (i.e. soap and water, or hand sanitiser) before lunch was protective (adjusted OR 0.13; 95% CI 0.02–0.88, P = 0.04). Barn samples yielded E. coli O157:H7 with PFGE patterns indistinguishable from patient isolates. This investigation provided epidemiological, laboratory and environmental evidence for a large outbreak of E. coli O157:H7 infections from exposure to a contaminated barn. The investigation highlights the often overlooked risk of infection through exposure to animal environments as well as the importance of handwashing for disease prevention. Increased education and encouragement of infection prevention measures, such as handwashing, can prevent illness.
Developing countries are experiencing an increase in total demand for livestock commodities, as populations and per capita demands increase. Increased production is therefore required to meet this demand and maintain food security. Production increases will lead to proportionate increases in greenhouse gas (GHG) emissions unless offset by reductions in the emissions intensity (Ei) (i.e. the amount of GHG emitted per kg of commodity produced) of livestock production. It is therefore important to identify measures that can increase production whilst reducing Ei cost-effectively. This paper seeks to do this for smallholder agro-pastoral cattle systems in Senegal; ranging from low input to semi-intensified, they are representative of a large proportion of the national cattle production. Specifically, it identifies a shortlist of mitigation measures with potential for application to the various herd systems and estimates their GHG emissions abatement potential (using the Global Livestock Environmental Assessment Model) and cost-effectiveness. Limitations and future requirements are identified and discussed. This paper demonstrates that the Ei of meat and milk from livestock systems in a developing region can be reduced through measures that would also benefit food security, many of which are likely to be cost-beneficial. The ability to make such quantification can assist future sustainable development efforts.