The macro-social and environmental conditions in which people live, such as the level of a country’s development or inequality, are associated with brain-related disorders. However, the relationship between these systemic environmental factors and the brain remains unclear. We aimed to determine the association between the level of development and inequality of a country and the brain structure of healthy adults.
Methods
We conducted a cross-sectional study pooling brain imaging (T1-based) data from 145 magnetic resonance imaging (MRI) studies in 7,962 healthy adults (4,110 women) in 29 different countries. We used a meta-regression approach to relate the brain structure to the country’s level of development and inequality.
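For readers unfamiliar with meta-regression, the sketch below illustrates the general idea: per-study summary effects are regressed on country-level predictors with inverse-variance weights. The column names, simulated numbers and weighting scheme are assumptions for illustration, not the authors' actual pipeline.

```python
# Illustrative sketch of a meta-regression relating per-study brain summaries
# to country-level predictors; all values and the weighting are assumptions.
import pandas as pd
import statsmodels.api as sm

# Hypothetical per-study summaries: a standardised hippocampal-volume effect,
# its sampling variance, and the country's HDI and Gini index.
studies = pd.DataFrame({
    "hippo_vol_z": [0.12, -0.05, 0.30, 0.08, 0.21],
    "var":         [0.02,  0.03, 0.01, 0.02, 0.015],
    "hdi":         [0.93,  0.76, 0.95, 0.89, 0.91],
    "gini":        [31.0,  48.0, 28.0, 34.0, 30.0],
})

X = sm.add_constant(studies[["hdi", "gini"]])
weights = 1.0 / studies["var"]            # precision (inverse-variance) weights
fit = sm.WLS(studies["hippo_vol_z"], X, weights=weights).fit()
print(fit.params)                         # slopes for HDI and Gini
```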
Results
Higher human development was consistently associated with larger hippocampi and greater global cortical surface area, particularly in frontal areas. Greater inequality was most consistently associated with smaller hippocampal volume and a thinner cortex across the brain.
Conclusions
Our results suggest that the macro-economic conditions of a country are reflected in its inhabitants’ brains and may help explain differences in the incidence of brain disorders across the world. The observed variability of brain structure across countries in healthy adults should be considered when developing tools in the field of personalized or precision medicine that are intended for use across the world.
Adaptive structures have the potential to play a significant role in saving resources in the construction industry in the future. Realising them requires actuators that meet the requirements of different buildings and their specific load-bearing structures. In the past, actuators were mainly developed for a single exemplary load-bearing structure. This paper analyses the primary classifications of buildings, discusses the challenges of adaptive structures, and outlines a draft framework for actuators for adaptive structures intended to speed up and simplify their development.
Classical galactosemia (CG) is an inborn error of galactose metabolism. Many CG patients suffer from long-term complications including poor cognitive functioning. There are indications of social dysfunction but limited evidence in the literature. Therefore, this study aims to improve our understanding of social competence in CG by investigating social cognition, neurocognition and emotion regulation.
Methods:
A comprehensive (neuro)psychological test battery, including self and proxy questionnaires, was administered to CG patients without intellectual disability. Social cognition was assessed by facial emotion recognition, Theory of Mind and self-reported empathy. Standardised results were compared to normative data of the general population.
Results:
Data from 23 patients (aged 8–52) were included in the study. At the group level, CG patients reported satisfaction with social roles and no social dysfunction, despite self-reporting lower social skills. They showed deficits in all aspects of social cognition on both performance tests (emotion recognition and Theory of Mind) and self-report questionnaires (empathy). Adults showed lower social participation than the general population. Parents reported lower social functioning, less adaptive emotion regulation and communication difficulties in their children. Large individual differences in scores were present.
Conclusion:
This study shows that CG patients without intellectual disability are satisfied with their social competence, especially social functioning. Nevertheless, deficits in social cognition are present in a large proportion of CG patients. Due to the large variability in scores and discrepancies between self- and proxy-report, an individually tailored, comprehensive neuropsychological assessment including social cognition is advised in all CG patients. Treatment plans need to be customised to the individual patient.
Feeding whole prey to felids has been shown to benefit their gastrointestinal health. Whether this effect is caused by the chemical or physical nature of whole prey is unknown. Fifteen domestic cats, as a model for strict carnivores, were fed either minced mice (MM) or whole mice (WM) to determine the effect of food structure on digestibility, mean urinary excretion time (MUET) of 15N, intestinal microbial activity and fermentation products. All cats were first fed a commercially available extruded diet (EXT) for 10 d, after which faeces samples were collected; the MM and WM diets were then fed for 19 d, with faeces and urine collected from day 11 to 15. Samples for microbiota composition and determination of MUET were obtained from day 16 to 19. The physical structure of the mice diet (minced or not) did not affect large intestinal fermentation, as total SCFA, branched-chain fatty acid (BCFA) and most biogenic amine (BA) concentrations did not differ (P > 0·10). When changing from EXT to the mice diets, the microbial community composition shifted from a carbolytic (Prevotellaceae) to a proteolytic (Fusobacteriaceae) profile, with a reduced faecal acetic to propionic acid ratio and reduced SCFA, total BCFA (P < 0·001), NH3 (P = 0·04), total BA (P < 0·001) and para-cresol (P = 0·08). The results of this study indicate that food structure within a whole-prey diet is less important than the overall diet type, with major shifts in the microbiome and a decrease in potentially harmful fermentation products when the diet changes from extruded to mice. This urges careful consideration of the consequences of prey-based diets for gut health in cats.
Depression is associated with metabolic alterations, including lipid dysregulation, and these associations may vary across individual symptoms. Evaluating these associations from a network perspective yields a more complete insight than single-outcome, single-predictor models.
Methods
We used data from the Netherlands Study of Depression and Anxiety (N = 2498) and leveraged networks capturing associations between 30 depressive symptoms (Inventory of Depressive Symptomatology) and 46 metabolites. Analyses involved five steps: creating a network with Mixed Graphical Models; calculating centrality measures; bootstrapping for stability testing; validating central, stable associations by extra covariate adjustment; and validating them using another data wave collected 6 years later.
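The study fitted Mixed Graphical Models (typically done with the R package mgm); the sketch below only approximates the first three steps in Python, using a partial-correlation network, strength centrality and a subject-level bootstrap on simulated data. Thresholds and names are arbitrary illustrations.

```python
# Approximate Python stand-in for steps 1-3 on simulated continuous data.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
n_sub, n_var = 2498, 10                  # subjects x variables (toy scale)
X = rng.normal(size=(n_sub, n_var))
names = [f"v{i}" for i in range(n_var)]

def partial_corr(data):
    prec = np.linalg.inv(np.corrcoef(data, rowvar=False))
    d = np.sqrt(np.diag(prec))
    pc = -prec / np.outer(d, d)          # partial correlations
    np.fill_diagonal(pc, 0.0)
    return pc

# Step 1: build the network, thresholding weak edges (arbitrary cutoff).
G = nx.Graph()
pc = partial_corr(X)
for i in range(n_var):
    for j in range(i + 1, n_var):
        if abs(pc[i, j]) > 0.05:
            G.add_edge(names[i], names[j], weight=abs(pc[i, j]))

# Step 2: strength centrality = summed absolute edge weights per node.
strength = dict(G.degree(weight="weight"))

# Step 3: bootstrap edge stability by refitting on resampled subjects.
edge_hits = {}
for _ in range(100):
    idx = rng.integers(0, n_sub, n_sub)
    pcb = partial_corr(X[idx])
    for i in range(n_var):
        for j in range(i + 1, n_var):
            if abs(pcb[i, j]) > 0.05:
                key = (names[i], names[j])
                edge_hits[key] = edge_hits.get(key, 0) + 1
```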
Results
The network yielded 28 symptom-metabolite associations. There were 15 highly central variables (8 symptoms, 7 metabolites) and 3 stable links involving the symptoms low energy (fatigue) and hypersomnia. Specifically, fatigue showed consistent associations with a higher mean diameter of VLDL particles and a lower estimated degree of (fatty acid) unsaturation. These associations remained present after adjustment for lifestyle and health-related factors and in the other data wave.
Conclusions
The somatic symptoms fatigue and hypersomnia, together with cholesterol and fatty acid measures, showed central, stable, and consistent relationships in our network. The present analyses showed how metabolic alterations are more consistently linked to specific symptom profiles.
Methicillin-resistant Staphylococcus aureus (MRSA) infection is highly unlikely when nasal-swab results are negative. We evaluated the impact of an electronic prompt regarding MRSA nasal screening on the length of vancomycin therapy for respiratory indications.
Design:
Retrospective, single-center cohort study.
Setting:
Tertiary-care academic medical center (Mayo Clinic) in Jacksonville, Florida.
Patients:
Eligible patients received empiric treatment with vancomycin for suspected or confirmed respiratory infections from January through April 2019 (preimplementation cohort) and from October 2019 through January 2020 (postimplementation cohort).
Intervention:
The electronic health system software was modified to provide a best-practice advisory (BPA) prompt to the pharmacist upon order verification of vancomycin for patients with suspected or confirmed respiratory indications. Pharmacists were prompted to order an MRSA nasal swab if it had not already been ordered by the provider.
Methods:
We reviewed patient records to determine the time from vancomycin prescription to de-escalation. The secondary end point was incidence of acute kidney injury.
Results:
The study included 120 patients (preimplementation, n = 61; postimplementation, n = 59). Median time to de-escalation was significantly shorter in the postimplementation cohort: 42 hours (interquartile range [IQR], 37–61) versus 76 hours (IQR, 52–109) in the preimplementation cohort (P = .002). Acute kidney injury occurred in 11 patients (18%) in the preimplementation cohort and in 3 patients (5%) in the postimplementation cohort (P = .01; number needed to treat, 8).
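As a quick arithmetic check, the sketch below recomputes the reported number needed to treat from the AKI counts; the commented rank-based test indicates how the skewed de-escalation times could be compared, assuming access to the raw hour-level data (which the abstract does not provide).

```python
# Worked check of the reported number needed to treat (a sketch, not the
# study's analysis code).
from math import ceil

pre_aki, pre_n = 11, 61        # AKI events, preimplementation
post_aki, post_n = 3, 59       # AKI events, postimplementation

arr = pre_aki / pre_n - post_aki / post_n   # absolute risk reduction ~0.13
nnt = ceil(1 / arr)                         # rounds up to 8, as reported
print(f"ARR = {arr:.3f}, NNT = {nnt}")

# With the raw per-patient hours (not available here), the skewed
# de-escalation times could be compared with a rank-based test, e.g.:
#   from scipy.stats import mannwhitneyu
#   mannwhitneyu(pre_hours, post_hours, alternative="two-sided")
```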
Conclusions:
Implementation of a BPA notification for MRSA nasal screening helped decrease the time to de-escalation of vancomycin.
Cognitive deficits may be characteristic of only a subgroup of patients with first-episode psychosis (FEP), and the link with clinical and functional outcomes is less pronounced than previously thought. This study aimed to identify cognitive subgroups in a large FEP sample using a clustering approach with healthy controls as a reference group, subsequently linking the cognitive subgroups to clinical and functional outcomes.
Methods
204 FEP patients were included. Hierarchical cluster analysis was performed using the baseline Brief Assessment of Cognition in Schizophrenia (BACS). Cognitive subgroups were compared to 40 controls and linked to longitudinal clinical and functional outcomes (PANSS, GAF, self-reported WHODAS 2.0) up to 12-month follow-up.
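A minimal sketch of this clustering step, assuming z-standardised BACS subtest scores as input; Ward linkage and a three-cluster cut are illustrative choices and may not match the study's exact settings.

```python
# Hierarchical clustering sketch on simulated, z-standardised BACS scores.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
bacs_z = rng.normal(size=(204, 6))    # 204 patients x 6 BACS subtests

Z = linkage(bacs_z, method="ward")    # agglomerative, minimum-variance
clusters = fcluster(Z, t=3, criterion="maxclust")  # cut into 3 clusters

for k in (1, 2, 3):
    print(f"cluster {k}: n = {int((clusters == k).sum())}")
```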
Results
Three distinct cognitive clusters emerged: relative to controls, we found one cluster with preserved cognition (n = 76), one moderately impaired cluster (n = 74) and one severely impaired cluster (n = 54). Patients with severely impaired cognition had more severe clinical symptoms at baseline, 6- and 12-month follow-up as compared to patients with preserved cognition. General functioning (GAF) in the severely impaired cluster was significantly lower than in those with preserved cognition at baseline and showed trend-level effects at 6- and 12-month follow-up. No significant differences in self-reported functional outcome (WHODAS 2.0) were present.
Conclusions
Current results demonstrate the existence of three distinct cognitive subgroups, corresponding with clinical outcome at baseline, 6- and 12-month follow-up. Importantly, the cognitively preserved subgroup was larger than the severely impaired group. Early identification of discrete cognitive profiles can offer valuable information about the clinical outcome but may not be relevant in predicting self-reported functional outcomes.
Nursing home residents with dementia are sensitive to detrimental auditory environments. This paper presents the first literature review of empirical research investigating (1) the (perceived) intensity and sources of sounds in nursing homes, and (2) the influence of sounds on health of residents with dementia and staff.
Design:
A systematic review was conducted in PubMed, Web of Science and Scopus. Study quality was assessed with the Mixed Methods Appraisal Tool. We used a narrative approach to present the results.
Results:
We included 35 studies. Nine studies investigated sound intensity and reported high noise intensity, with averages of 55–68 dB(A) during daytime. In four studies on sound sources, human voices and electronic devices were the most dominant sources. Five cross-sectional studies focused on music interventions and reported positive effects on agitated behaviors. Four randomized controlled trials tested noise reduction as part of an intervention. In two studies, high-intensity sounds were associated with decreased nighttime sleep and increased agitation. The third study found an association between music and less agitation compared with other stimuli. The fourth study did not find an effect of noise on agitation. Two studies reported that a noisy environment had negative effects on staff.
Conclusions:
The need for appropriate auditory environments that are responsive to residents’ cognitive abilities and functioning is not yet recognized widely. Future research needs to place greater emphasis on intervention-based and longitudinal study design.
We examined the 2-year stability of neurological soft signs (NSS) in 29 patients after a first episode of psychosis. The numbers of NSS at inclusion and at 2-year follow-up were similar, but there was a significant increase in the number of NSS in the subgroup of patients whose dosage of antipsychotic medication had increased over time.
The endocannabinoid system (ECS) has been highlighted as one of the most relevant research topics by neurobiologists, pharmacists, basic scientists and clinicians (Skaper and Di Marzo, 2012). Recent work has associated major depressive disorder with the ECS (Ashton and Moore, 2011). Despite the close relationship between depression and bipolar disorders, as far as we know, there is no characterization of the ECS and its congeners in a sample of patients with bipolar disorders.
Aims and objectives
The objective of this work is to characterize the plasma levels of endocannabinoids and congeners in a sample of patients with bipolar disorders.
Method
The clinical group comprised 19 patients with a diagnosis of bipolar disorder made using the SCID-IV (First et al., 1999). The control group comprised 18 first- or second-degree relatives of the patients.
The following endocannabinoids and congeners were quantified: N-palmitoleoylethanolamide (POEA), N-palmitolylethanolamide (PEA), N-oleoylethanolamide (OEA), N-stearoylethanolamide (SEA), N-arachidonoylethanolamide (AEA), N-dihomo-γ-linolenoylethanolamide (DGLEA), N-docosatetraenoylethanolamide (DEA), N-linoleoylethanolamide (LEA), N-docosahexaenoylethanolamide (DHEA), 2-arachidonoylglycerol (2-AG), 2-linoleoylglycerol (2-LG), and 2-oleoylglycerol (2-OG).
Results
The results showed statistically significantly lower levels of AEA, DEA and DHEA in the clinical sample. Previous research also identified lower levels of AEA in depressed women (Hill et al., 2008, 2009). To date, it is unknown whether DEA and DHEA have any effect on endocannabinoid receptors, or whether they have direct effects on endocannabinoids.
Conclusions
It would be necessary to carry out further research with a larger sample, which would allow the control of potential confounding variables.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Diagnosing heart failure (HF) in primary care can be challenging, especially in elderly patients with comorbidities. Insight into the prevalence, age, comorbidity and routine practice of diagnosing HF in general practice may improve the process of diagnosing HF.
Aim
To examine the prevalence of HF in relation to ageing and comorbidities, and routine practice of diagnosing HF in general practice.
Methods
A retrospective cohort study was performed using data from electronic health records of 56 320 adult patients of 11 general practices. HF patients were compared with patients without HF using descriptive analyses and χ² tests. The following comorbidities were considered: chronic obstructive pulmonary disease (COPD), diabetes mellitus (DM), hypertension, anaemia and renal function disorder (RFD). Separate analyses were performed for men and women.
Findings
The point prevalence of HF was 1.2% (95% confidence interval 1.13–1.33) and increased with each age category, from 0.04% (18–44 years) to 20.9% (⩾85 years). All studied comorbidities were significantly (P < 0.001) more common in HF patients than in patients without HF: COPD (24.1% versus 3.1%), DM (34.7% versus 6.5%), hypertension (52.7% versus 16.0%), anaemia (10.9% versus 2.3%) and RFD (61.8% versus 7.5%). N-terminal pro-BNP (NT-proBNP) was recorded in 38.1% of HF patients.
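The sketch below reproduces the style of these estimates, a binomial confidence interval for the prevalence and a χ² test for one comorbidity, using counts reconstructed only approximately from the reported percentages (for illustration, not a re-analysis).

```python
# Wilson CI for point prevalence and a 2x2 chi-square test for COPD,
# with counts rebuilt approximately from the reported percentages.
from scipy.stats import chi2_contingency
from statsmodels.stats.proportion import proportion_confint

n_total = 56320
n_hf = round(0.0123 * n_total)        # ~1.2% point prevalence
lo, hi = proportion_confint(n_hf, n_total, alpha=0.05, method="wilson")
print(f"prevalence = {n_hf / n_total:.2%} (95% CI {lo:.2%}-{hi:.2%})")

# COPD in HF vs non-HF patients as a contingency table.
n_no_hf = n_total - n_hf
copd_hf = round(0.241 * n_hf)
copd_no = round(0.031 * n_no_hf)
table = [[copd_hf, n_hf - copd_hf],
         [copd_no, n_no_hf - copd_no]]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.3g}")
```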
Conclusions
HF is strongly associated with ageing and comorbidities. Diagnostic use of NT-proBNP in routine primary care seems underutilized. Instructing GPs to determine NT-proBNP in patients suspected of HF is recommended, especially in elderly patients with comorbidities.
Following stage 1 palliation, delayed sternal closure may be used as a technique to enhance thoracic compliance but may also prolong the length of stay and increase the risk of infection.
Methods
We reviewed all neonates undergoing stage 1 palliation at our institution between 2010 and 2017 to describe the effects of delayed sternal closure.
Results
During the study period, 193 patients underwent stage 1 palliation, of whom 12 died before an attempt at sternal closure. Among the 25 patients who underwent primary sternal closure, 4 (16%) had sternal reopening within 24 hours. Among the 156 infants who underwent delayed sternal closure at 4 [3,6] days post-operatively, 11 (7.1%) had one or more failed attempts at sternal closure. Patients undergoing primary sternal closure had a shorter duration of mechanical ventilation and intensive care unit length of stay. Patients who failed delayed sternal closure had a longer aortic cross-clamp time (123±42 versus 99±35 minutes, p=0.029) and circulatory arrest time (39±28 versus 19±17 minutes, p=0.0009) than those who did not fail. Failure of delayed sternal closure was also closely associated with Technical Performance Score: 1.3% of patients with a score of 1 failed sternal closure compared with 18.9% of patients with a score of 3 (p=0.0028). Among the haemodynamic and ventilatory parameters studied, only superior caval vein saturation following sternal closure was different between patients who did and did not fail sternal closure (30±7 versus 42±10%, p=0.002). All patients who failed sternal closure did so within 24 hours owing to hypoxaemia, hypercarbia, or haemodynamic impairment.
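A sketch of the failed-versus-successful comparison of cross-clamp times: per-patient values are not available, so data are simulated from the reported means and SDs purely to illustrate the test. Welch's t-test (no equal-variance assumption) is one reasonable choice; the abstract does not state which test the authors used.

```python
# Simulated group comparison mimicking the cross-clamp-time result above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
failed = rng.normal(123, 42, size=11)      # 11 failed delayed closures
success = rng.normal(99, 35, size=145)     # 156 delayed closures - 11 failed

t, p = stats.ttest_ind(failed, success, equal_var=False)
print(f"Welch t = {t:.2f}, p = {p:.3f}")
```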
Conclusion
When performed according to our current clinical practice, sternal closure causes transient and mild changes in haemodynamic and ventilatory parameters. Monitoring of SvO2 following sternal closure may permit early identification of patients at risk for failure.
Mineral phosphorus (P) used to fertilise crops is derived from phosphate rock, which is a finite resource. Preventing and recycling mineral P waste in the food system, therefore, are essential to sustain future food security and the long-term availability of mineral P. The aim of our modelling exercise was to assess the potential of preventing and recycling P waste in a food system, in order to reduce the dependency on phosphate rock. To this end, we modelled a hypothetical food system designed to produce sufficient food for a fixed population with a minimum input requirement of mineral P. This model included representative crop and animal production systems, and was parameterised using data from the Netherlands. We assumed no import or export of feed and food. We furthermore assumed small P soil losses and no net P accumulation in soils, which is typical for northwest European conditions. We first assessed the minimum P requirement in a baseline situation, in which 42% of crop waste is recycled and humans derive 60% of their dietary protein from animals (PA). Results showed that about 60% of the P waste in this food system resulted from wasting P in human excreta. We subsequently evaluated P input for alternative situations to assess the (combined) effect of: (1) preventing waste of crop and animal products, (2) fully recycling waste of crop products, (3) fully recycling waste of animal products and (4) fully recycling human excreta and industrial processing water. Recycling of human excreta showed the most potential to reduce P waste from the food system, followed by prevention and finally recycling of agricultural waste. Fully recycling P could reduce mineral P input by 90%. Finally, for each situation, we studied the impact of varying the share of PA in the human diet from 0% to 80%. The optimal amount of animal protein in the diet depended on whether P waste from animal products was prevented or fully recycled: if it was, then a small amount of animal protein in the human diet resulted in the most sustainable use of P; if it was not, then the most sustainable use of P would result from a complete absence of animal protein in the human diet. Our results apply to our hypothetical situation. The principles included in our model, however, also hold for food systems with, for example, different climatic and soil conditions, farming practices, representative types of crops and animals, and population densities.
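For intuition, the toy linear program below captures the flavour of such a model: choose crop and animal production levels that meet a protein demand while minimising mineral P input, with a flat recycling credit. Every coefficient is invented for illustration; the actual model is far more detailed.

```python
# Toy LP in the spirit of the food-system model described above.
from scipy.optimize import linprog

protein_per_unit = [1.0, 0.8]     # protein yield per unit: crop, animal
p_per_unit = [0.5, 1.4]           # mineral P required per unit produced
recycle_fraction = 0.42           # share of P waste returned to the system

# Net mineral P demand per unit after the recycling credit.
cost = [p * (1.0 - recycle_fraction) for p in p_per_unit]

# One constraint: total protein >= 100 units (expressed as <= for linprog).
res = linprog(c=cost,
              A_ub=[[-v for v in protein_per_unit]], b_ub=[-100.0],
              bounds=[(0, None), (0, None)])
print(res.x, res.fun)   # optimal production mix and minimum mineral P input
```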
Pastoralists have traditional ecological knowledge (TEK), which is important for their livelihoods and for policies and interventions. Pastoralism is under pressure, however, which may result in a decline of the pastoral lifestyle and its related TEK. We therefore addressed the following objectives: (i) to inventorise and assess how pastoralists characterise and value soils and forages in their environment, (ii) to analyse how soil, forage and livestock (i.e. cattle) characteristics relate to herding decisions and (iii) to determine whether the TEK underlying herding decisions differs across generations. Data were collected through focus groups and individual interviews with 72 pastoralists, belonging to three generations and three agro-ecological zones. Using a three-point scale (high, medium, low), four grasses and three tree forages were assessed in terms of nutritional quality for milk, meat, health and strength. Using their own visual criteria, pastoralists identified five different soils, which they selected for herding at different times of the year. Pastoralists stated that Pokuri was the best soil because of its low moisture content, whereas Karaal was the worst because forage hardly grows on it. They stated that perennials, such as Andropogon gayanus and Loxodera ledermannii, were of high nutritional quality, whereas annuals such as Andropogon pseudapricus and Hyparrhenia involucrata were of low nutritional quality. Afzelia africana was perceived to be of high quality for milk production, whereas Khaya senegalensis had the highest quality for meat, health and strength. Pastoralists used first soil, then forage and finally livestock characteristics in their herding decisions. Pastoralists’ TEK was not associated with their generation, but with their agro-ecological zone. This study suggests that pastoralists had common and detailed TEK about soil, forage and livestock characteristics underlying their herding decisions. To conclude, pastoralists use a holistic approach, combining soil, vegetation and livestock TEK in herding decisions. Such TEK can guide restoration or improvement of grazing lands, and land-use planning.
Phenylketonuria (PKU), a genetic metabolic disorder that is characterized by the inability to convert phenylalanine to tyrosine, leads to severe intellectual disability and other cerebral complications if left untreated. Dietary treatment, initiated soon after birth, prevents most brain-related complications. A leading hypothesis postulates that a shortage of brain monoamines may be associated with neurocognitive deficits that are observable even in early-treated PKU. However, there is a paucity of evidence as yet for this hypothesis.
Methods
We therefore assessed in vivo striatal dopamine D2/3 receptor (D2/3R) availability and plasma monoamine metabolite levels, together with measures of impulsivity and executive functioning, in 18 adults with PKU and average intellect (31.2 ± 7.4 years, nine females), most of whom were early and continuously treated. Comparison data were available from 12 healthy controls who did not differ in gender or age.
Results
Mean D2/3R availability was significantly higher (13%; p = 0.032) in the PKU group (n = 15) than in the controls, which may reflect reduced synaptic brain dopamine levels in PKU. The PKU group had lower plasma levels of homovanillic acid (p < 0.001) and 3-methoxy-4-hydroxy-phenylglycol (p < 0.0001), the predominant metabolites of dopamine and norepinephrine, respectively. Self-reported impulsivity levels were significantly higher in the PKU group compared with healthy controls (p = 0.033). Within the PKU group, D2/3R availability showed a positive correlation with both impulsivity (r = 0.72, p = 0.003) and the error rate during a cognitive flexibility task (r = 0.59, p = 0.020).
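A small sketch of the within-group correlation analysis, with simulated values standing in for the real D2/3R availability and impulsivity scores (n = 15).

```python
# Illustrative correlation test on simulated stand-in data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
d23r = rng.normal(1.0, 0.1, size=15)               # D2/3R availability
impulsivity = 5 * d23r + rng.normal(0, 0.3, 15)    # correlated by design

r, p = pearsonr(d23r, impulsivity)
print(f"r = {r:.2f}, p = {p:.3f}")

# The 13% group difference in availability would likewise be assessed with an
# independent-samples test, e.g. scipy.stats.ttest_ind(pku_vals, ctrl_vals).
```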
Conclusions
These findings provide further support for the hypothesis that executive functioning deficits in treated adult PKU may be associated with cerebral dopamine deficiency.
Methyl isonicotinate is one of several patented 4-pyridyl carbonyl compounds being investigated for a variety of uses in thrips pest management. It is probably the most extensively studied thrips non-pheromone semiochemical, with field and glasshouse trapping experiments, and wind tunnel and Y-tube olfactometer studies, in several countries demonstrating a behavioural response that results in increased trap capture of at least 12 thrips species, including cosmopolitan virus vectors such as western flower thrips and onion thrips. Methyl isonicotinate has several of the characteristics required of an effective semiochemical tool and is mainly used as a lure in combination with coloured sticky traps for enhanced monitoring of thrips in greenhouses. Research indicates that this non-pheromone semiochemical has the potential to be used in other thrips management strategies, such as mass trapping, lure-and-kill, lure-and-infect, and as a behavioural synergist in conjunction with insecticides, in a range of indoor and outdoor crops.
Policymakers are concerned about nitrogen and phosphorus exports to water bodies. Exports may be reduced by paying farmers to adopt practices that reduce runoff or by paying performance incentives tied to estimated runoff reductions. We evaluate the cost-effectiveness of practice and performance incentives for reducing nitrogen exports. Performance incentives potentially improve farm-level and allocative efficiencies relative to practice incentives. However, the efficiency improvements can be undermined by baseline shifts when growers adopt crops that enhance the performance payments but cause more pollution. Policymakers must carefully specify rules for performance-incentive programs and payments to avoid such baseline shifting.
We evaluate a proposed Yield Reserve Program designed to compensate farmers for any yield reductions resulting from lowering nitrogen (N) application rates below recommended rates. Assuming that farmers currently follow Extension recommendations for applying N, Yield Reserve Program participation reduces expected net revenue by $10 to $13/ha. The program reduces expected net revenue by $17 to $20/ha for farmers who apply N to maximize expected net revenue. Farmers' costs of participation increase with lower probabilities of inadequate rainfall and higher corn prices, and decline with higher N prices. The Yield Reserve Program can significantly reduce N applications to cropland, which may reduce the N content of surface waters, but the costs to taxpayers and farmers will depend on how the program is implemented.
Background: The pathophysiology of subarachnoid hemorrhage (SAH) is complex and includes disruption of the blood-brain barrier (BBB). We freshly isolated BBB endothelial cells (BECs) by 2 distinct methods after experimental SAH and then interrogated their gene expression profiles with the goal of uncovering new therapeutic targets. Methods: SAH was induced using the prechiasmatic blood injection mouse model. BBB permeability studies were performed by administering intraperitoneal cadaverine dye injections at 24h and 48h. BECs were isolated either by sequential magnetic-based sorting for CD45-CD31+ cells or by fluorescence-activated cell sorting (FACS) for Tie2+Pdgfrb- cells. Total RNA was extracted and analyzed using Affymetrix Mouse Gene 2.0 ST Arrays. Results: BBB impairment occurred at 24h and resolved by 48h after SAH. Analysis of gene expression patterns in BECs at 24h revealed clustering of SAH and sham samples. We identified 707 (2.8%) significantly differentially expressed genes (403 upregulated, 304 downregulated) out of 24,865 interrogated probe sets. Many significantly upregulated genes were involved in inflammatory pathways. These microarray results were validated with real-time polymerase chain reaction (RT-PCR). Conclusions: This study is the first to investigate, in an unbiased manner, whole-genome expression profiles of freshly isolated BECs in an SAH animal model, yielding targets for novel therapeutic intervention.
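The sketch below shows one generic way such a differential-expression screen can be run, per-probe tests with Benjamini-Hochberg FDR control on simulated data; the authors used Affymetrix arrays and their own analysis pipeline, which likely differed.

```python
# Generic differential-expression screen on simulated data. With pure-noise
# inputs as here, few or no probes should pass; the study found 707.
import numpy as np
from scipy.stats import ttest_ind
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(4)
n_probes = 24865
sah = rng.normal(size=(4, n_probes))     # 4 simulated SAH arrays
sham = rng.normal(size=(4, n_probes))    # 4 simulated sham arrays

t, p = ttest_ind(sah, sham, axis=0)      # one test per probe set
reject, p_adj, _, _ = multipletests(p, alpha=0.05, method="fdr_bh")
print(f"{int(reject.sum())} of {n_probes} probes pass the 5% FDR threshold")
```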
Improvements in colorectal cancer (CRC) detection and treatment have led to greater numbers of CRC survivors, for whom there is limited evidence on which to base dietary guidelines to improve survival outcomes. Higher intake of red and processed meat and lower intake of fibre are associated with greater risk of developing CRC, but there is limited evidence regarding associations with survival after CRC diagnosis. Among 3789 CRC cases in the European Prospective Investigation into Cancer and Nutrition (EPIC) cohort, pre-diagnostic consumption of red meat, processed meat, poultry and dietary fibre was examined in relation to CRC-specific mortality (n 1008) and all-cause mortality (n 1262) using multivariable Cox regression models adjusted for CRC risk factors. Pre-diagnostic red meat, processed meat or fibre intakes (defined as quartiles and continuous grams per day) were not associated with CRC-specific or all-cause mortality among CRC survivors; however, a marginal trend across quartiles of processed meat in relation to CRC mortality was detected (P = 0·053). Pre-diagnostic poultry intake was inversely associated with all-cause mortality among women (hazard ratio (HR) per 20 g/d 0·92; 95 % CI 0·84, 1·00), but not among men (HR 1·00; 95 % CI 0·91, 1·09) (P for heterogeneity = 0·10). Pre-diagnostic intake of red meat or fibre was not associated with CRC survival in the EPIC cohort. There is suggestive evidence of an association between poultry intake and all-cause mortality among female CRC survivors and between processed meat intake and CRC-specific mortality; however, further research using post-diagnostic dietary data is required to confirm these relationships.
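A minimal sketch of a Cox proportional-hazards model of this kind, using the lifelines package and simulated data; all column names are hypothetical, and the study's models additionally adjusted for many CRC risk factors.

```python
# Cox proportional-hazards sketch on simulated survival data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 500
df = pd.DataFrame({
    "time_years":   rng.exponential(5.0, n),   # follow-up time
    "died":         rng.integers(0, 2, n),     # event indicator (all-cause)
    "poultry_20g":  rng.normal(1.0, 0.5, n),   # intake in 20 g/d units
    "age_at_diag":  rng.normal(60.0, 8.0, n),  # example covariate
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_years", event_col="died")
cph.print_summary()   # exp(coef) column gives hazard ratios per covariate
```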