Although cognitive remediation (CR) improves cognition and functioning, the key features that promote or inhibit its effectiveness, especially across cognitive domains, remain unknown. Identifying these key features will help to develop CR for greater impact.
Aim
To identify interrelations between cognition, symptoms, and functioning using a novel network analysis approach, and to examine how CR affects these recovery outcomes.
Methods
A secondary analysis of randomized controlled trial data (N = 165) of CR in early psychosis. Regularized partial correlation networks including symptoms, cognition, and functioning were estimated at pre-treatment, at post-treatment, and for change over time. Pre- and post-CR networks were compared on global strength, structure, edge invariance, and centrality invariance.
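For readers unfamiliar with the method, a regularized partial correlation network applies an L1 (graphical lasso) penalty to the inverse covariance matrix and converts it to partial correlations, whose absolute sums give the node-strength centrality used in network comparisons. The sketch below is a minimal Python illustration on simulated placeholder data; the study's actual tooling (likely R packages such as qgraph) and variable set are assumptions.

```python
# Minimal sketch of a regularized partial correlation network (illustrative;
# the original analysis likely used dedicated R tooling, which is an assumption).
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(0)
X = rng.normal(size=(165, 8))        # placeholder: N = 165 participants x 8 measures

model = GraphicalLassoCV().fit(X)    # L1-penalized inverse-covariance estimate
P = model.precision_

# Partial correlations (the network's edge weights): rho_ij = -P_ij / sqrt(P_ii P_jj).
d = np.sqrt(np.diag(P))
edges = -P / np.outer(d, d)
np.fill_diagonal(edges, 0.0)

# Node "strength" centrality: sum of absolute edge weights at each node.
strength = np.abs(edges).sum(axis=0)
print(strength)
```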
Results
Cognition, negative symptoms, and positive symptoms were separable constructs, with symptoms showing independent relationships with cognition. Negative symptoms were central to the CR networks and most strongly associated with change in functioning. Improvements in verbal and visual learning showed independent relationships with improved social functioning and negative symptoms. Only visual learning improvement was positively associated with personal goal achievement. Pre- and post-CR networks did not differ in structure (M = 0.20, p = 0.45) but differed in global strength, reflecting greater overall connectivity in the post-CR network (S = 0.91, p = 0.03).
Conclusions
Negative symptoms influenced network changes following therapy, and their reduction was linked to improvement in verbal and visual learning following CR. Independent relationships between visual and verbal learning and functioning suggest that they may be key intervention targets to enhance social and occupational functioning.
Humans often learn preferentially from ingroup members who share a social identity affiliation, while ignoring or rejecting information when it comes from someone perceived to be from an outgroup. This sort of bias has well-known negative consequences – exacerbating cultural divides, polarization, and conflict – while reducing the information available to learners. Why does it persist? Using evolutionary simulations, we demonstrate that similarity-biased social learning (also called parochial social learning) is adaptive when (1) individual learning is error-prone and (2) sufficient diversity inhibits the efficacy of social learning that ignores identity signals, as long as (3) those signals are sufficiently reliable indicators of adaptive behaviour. We further show that our results are robust to considerations of other social learning strategies, focusing on conformist and pay-off-biased transmission. We conclude by discussing the consequences of our analyses for understanding diversity in the modern world.
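The flavour of such evolutionary simulations can be conveyed with a toy model. The sketch below is not the authors' model: it assumes two groups whose adaptive behaviour differs, an error rate for individual learning, and an identity signal of tunable reliability, and compares parochial (ingroup-only) copying with identity-blind copying.

```python
# Toy model of similarity-biased ("parochial") social learning; not the
# authors' simulation. All parameter values are illustrative assumptions.
import random

def simulate(n=200, generations=100, social_rate=0.8, error=0.3,
             reliability=0.9, parochial=True, seed=1):
    random.seed(seed)
    groups = [random.randint(0, 1) for _ in range(n)]
    # A behaviour is "adaptive" for an agent iff it matches the agent's group.
    behaviours = [random.randint(0, 1) for _ in range(n)]
    for _ in range(generations):
        new = []
        for g in groups:
            if random.random() < social_rate:
                # Social learning; the identity signal correctly flags
                # ingroup models with probability `reliability`.
                if parochial and random.random() < reliability:
                    pool = [b for gg, b in zip(groups, behaviours) if gg == g]
                else:
                    pool = behaviours          # identity-blind copying
                new.append(random.choice(pool))
            else:
                # Error-prone individual learning of the adaptive behaviour.
                new.append(g if random.random() > error else 1 - g)
        behaviours = new
    return sum(b == g for g, b in zip(groups, behaviours)) / n

print("parochial:", simulate(parochial=True))
print("identity-blind:", simulate(parochial=False))
```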
This paper traces the psychological effects of the ‘masked exile’ of a Jewish Tunisian family settled in France. The author provides a rich analysis of a sudden and permanent change of nationality, country, language, urban bustle and family environment, following the ‘Tunisification of Tunisia’ launched by President Bourguiba at the end of the 1950s. A comparison with the situation of later migrant workers from the Maghreb countries is sketched in the second part of the paper.
Hippocampal hyperperfusion has been observed in people at Clinical High Risk for Psychosis (CHR), is associated with adverse longitudinal outcomes, and represents a potential treatment target for novel pharmacotherapies. Whether cannabidiol (CBD) has ameliorative effects on hippocampal regional cerebral blood flow (rCBF) in CHR patients remains unknown.
Methods
Using a double-blind, parallel-group design, 33 CHR patients were randomized to a single oral 600 mg dose of CBD or placebo; 19 healthy controls did not receive any drug. Hippocampal rCBF was measured using Arterial Spin Labeling. We examined differences relating to CHR status (controls v. placebo), effects of CBD in CHR (placebo v. CBD), and linear between-group relationships, such that placebo > CBD > controls or controls > CBD > placebo, using a combination of hypothesis-driven and exploratory whole-brain analyses.
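The linear between-group test can be illustrated with a simple trend analysis on one summary value per participant. The sketch below uses simulated data; the study used voxel-wise neuroimaging inference, and the per-arm splits of the 33 CHR patients are assumed for illustration. It codes the hypothesised ordering controls < CBD < placebo as 0, 1, 2 and tests the linear slope.

```python
# Linear trend across ordered groups (controls < CBD < placebo); simulated
# placeholder values, with assumed per-arm sizes for the 33 CHR patients.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
controls = rng.normal(50, 5, 19)     # healthy controls (n = 19)
cbd = rng.normal(53, 5, 17)          # CHR on CBD (assumed n)
placebo = rng.normal(56, 5, 16)      # CHR on placebo (assumed n)

y = np.concatenate([controls, cbd, placebo])
x = np.concatenate([np.zeros(19), np.ones(17), np.full(16, 2.0)])
result = stats.linregress(x, y)      # slope tests the linear ordering
print(f"slope = {result.slope:.2f}, p = {result.pvalue:.4f}")
```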
Results
Placebo-treated patients had significantly higher hippocampal rCBF bilaterally (all pFWE < 0.01) compared with healthy controls. There were no suprathreshold effects in the CBD v. placebo contrast. However, we found a significant linear relationship in the right hippocampus (pFWE = 0.035) such that rCBF was highest in the placebo group, lowest in controls, and intermediate in the CBD group. Exploratory whole-brain results replicated previous findings of hyperperfusion in the hippocampus, striatum and midbrain in CHR patients, and provided novel evidence of increased rCBF in inferior-temporal and lateral-occipital regions in patients under CBD compared with placebo.
Conclusions
These findings suggest that hippocampal blood flow is elevated in the CHR state and may be partially normalized by a single dose of CBD. CBD therefore merits further investigation as a potential novel treatment for this population.
The present study examines the effects of the frequency of phoneme, syllable, and word units in the Granada corpus of Spanish phonological speech errors. We computed several measures of phoneme and syllable frequency and selected the most sensitive ones, along with word (lexeme) frequency, to compare the frequencies of source, target, and error units at the phoneme, syllable, and word levels. Results showed that phoneme targets have frequencies equivalent to matched controls, whereas source phonemes are lower in frequency than chance (the Weak Source effect) and than target phonemes (the David effect). Target, source, and error syllables and words were also of lower frequency than chance, and error words (when they occurred) were lowest in frequency. Contrary to most current theories, which focus on faulty processing of the target units, the present results suggest that faulty processing of the source units (phonemes, syllables, and words) is an important factor contributing to phonological speech errors. Low-frequency words and syllables have more difficulty ensuring that their phonemes, especially those of low frequency, are output only in their correct locations.
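One way to formalize 'lower in frequency than chance' is a resampling test against the corpus inventory; this is a plausible stand-in, not necessarily the authors' procedure. The sketch below uses made-up lognormal frequencies and an assumed sample of source phonemes purely for illustration.

```python
# Resampling test of whether source units are lower in frequency than chance;
# all frequencies here are simulated placeholders, not the Granada corpus.
import numpy as np

rng = np.random.default_rng(0)
corpus_freqs = rng.lognormal(3.0, 1.0, 10_000)   # hypothetical unit frequencies
source_freqs = rng.lognormal(2.6, 1.0, 300)      # hypothetical source units

observed = source_freqs.mean()
null = np.array([rng.choice(corpus_freqs, source_freqs.size).mean()
                 for _ in range(5_000)])          # chance-level means
p = (null <= observed).mean()                     # one-sided: lower than chance
print(f"observed mean = {observed:.1f}, p = {p:.4f}")
```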
OBJECTIVES/GOALS: Despite efforts to improve COVID-19 health outcomes through testing and vaccination, SARS-CoV-2 has exacerbated health disparities in underserved populations. Through this study we examined socio-contextual factors impacting decisions to test for COVID-19 among Native Americans on the Flathead Reservation and Hispanics in the Yakima Valley. METHODS/STUDY POPULATION: A series of 28 key informant interviews and 6 focus groups (N=39 focus group participants) were completed with community and tribal leaders using an interview guide informed by the Theory of Planned Behavior, Social Cognitive Theory, and the Social Contextual Factor frameworks. The interview guide was designed to examine the socio-contextual factors impacting decisions to test for COVID-19 among Native Americans and Hispanics in the Northwest. A codebook was developed to apply deductive coding to informant responses, followed by an inductive, constant comparison approach. Three analysts met to refine the codebook and assess inter-rater agreement. RESULTS/ANTICIPATED RESULTS: Five themes (social, cultural, health, religious, and political factors) were identified that impacted testing for COVID-19. For social factors, participants discussed the influence of families and friends and of unfair employment practices on decisions to test. Cultural factors included deep-rooted distrust of the government and historical trauma. Health factors reported by participants included the importance of testing to save lives, distrust of the medical system, and health communications around COVID-19 affecting decisions to test. There was some interaction between religious and political factors: while participants mentioned beliefs in putting things in God’s hands, some decisions to test seemed to be affected by their political views. DISCUSSION/SIGNIFICANCE: Several socio-cultural factors influence decisions to test for COVID-19. Understanding the community’s perception of COVID-19 testing is critical for successful implementation of preventive strategies.
OBJECTIVES/GOALS: The COVID-19 pandemic impacted health systems and exposed disparities in access to health care among underserved populations. We examined how the pandemic shaped social, mental, and physical health among Native American and Latino communities in rural and underserved areas. METHODS/STUDY POPULATION: Using the Theory of Planned Behavior, Social Cognitive Theory, and Social Contextual Factor frameworks, we developed interview guides to examine perceptions of the COVID-19 pandemic's effects on social, mental, and physical health among community members. Stakeholders of the Confederated Salish and Kootenai Tribes of the Flathead Reservation in Montana and the Hispanic/Latinx population in the Yakima Valley in Washington were selected through purposeful community engagement. A total of six focus group discussions and 30 key informant interviews were administered in both communities. A codebook was developed and deductive coding was applied to informant responses, followed by an inductive, constant comparison approach. The codebook was further refined and inter-rater agreement was completed by three analysts. RESULTS/ANTICIPATED RESULTS: Four themes were highlighted as areas impacted by the COVID-19 pandemic (mental and physical health, family dynamics, and social disruptions), with few differences among geographic areas or between focus group (n=39) and key informant (n=28) participants. Perceived impacts on mental health included increased stress, anxiety, and depression, while pandemic-related lifestyle or family changes impacted physical health. Participants reported changes to family routines and dynamics due to staying home and social distancing, with more frequent interactions inside the household and limited interactions outside it. Social disruptions reported included impacts on finances, employment, and household staples, though participants highlighted how many community members stepped up to help those in need. DISCUSSION/SIGNIFICANCE: The COVID-19 pandemic had similar impacts on two geographically distinct underserved communities in Montana and Washington. Understanding the community’s experience with the COVID-19 pandemic is critical to identify strategies to support families, community needs, and mental and physical health in underserved communities.
Both maternal and, separately, paternal mental illness are associated with diminished academic attainment among children. However, the differential impacts of diagnostic type and degree of parental burden (e.g. one v. both parents affected) on these functional outcomes are unknown.
Methods
Using the Swedish national patient (NPR) and multi-generation (MGR) registers, 2 226 451 children (1 290 157 parental pairs), born 1 January 1973–31 December 1997, were followed through 31 December 2013. Diagnostic status of all cohort members was defined for eleven psychiatric disorders, and families were classed by exposure: (1) parents affected with any disorder, (2) parents affected with a disorder group (e.g. neuropsychiatric disorders), and (3) parents affected with a specific disorder (e.g. ADHD). Pairs were further defined as ‘unaffected’, ‘single-affected’, or ‘dual-affected’. Among offspring, the study evaluated fulfillment of four academic milestones, from compulsory (primary) school through university (college). Sensitivity analyses considered the impact of the child's own mental health, as well as parental education, on the main effects.
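Analyses of this kind typically estimate adjusted odds ratios from logistic regression with the unaffected pairs as the reference category. A minimal Python sketch, with hypothetical variable names and simulated data standing in for the register linkage, is shown below.

```python
# Adjusted odds ratios via logistic regression with 'unaffected' as reference;
# variable names and data are hypothetical stand-ins for the register linkage.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "milestone": rng.integers(0, 2, 5_000),      # 1 = milestone achieved
    "exposure": rng.choice(["unaffected", "single", "dual"], 5_000),
    "birth_year": rng.integers(1973, 1998, 5_000),
    "sex": rng.integers(0, 2, 5_000),
})

fit = smf.logit(
    "milestone ~ C(exposure, Treatment('unaffected')) + birth_year + sex",
    data=df,
).fit(disp=False)
print(np.exp(fit.params))        # exponentiated coefficients = adjusted ORs
print(np.exp(fit.conf_int()))    # 95% CIs on the odds-ratio scale
```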
Results
Marked reductions in the odds of achievement were observed, emerging at the earliest levels of schooling for both single-affected [adjusted odds ratio (aOR), 0.50; 95% CI 0.49–0.51] and dual-affected (aOR 0.29, 95% CI 0.28–0.30) pairs and persisting thereafter [aOR range (single), 0.52–0.65; aOR range (dual), 0.30–0.40]. This pattern was repeated for analyses within diagnosis/diagnostic group. Main results were robust to adjustment for offspring mental health and parent education level.
Conclusions
Parental mental illness is associated with profound reductions in educational attainment in the subsequent generation, with children from dual-affected families at uniquely high risk.
Hidden cluster problems can manifest when broad ethnic categories are used as proxies for cultural traits, especially when traits are assumed to encode cultural distances between groups. We suggest that a granular understanding of cultural trait distributions within and between ethnic categories is fundamental to the interpretation of heritability estimates as well as of general behavioral outcomes.
Civil society actors have contested the fifty-year-long transition to a global economy based on the principles of neoliberalism. Mobilization against neoliberal measures represents one of the most common forms of social-movement activity across the world. We explore the evolution of resistance to economic liberalization from the 1970s to the current period. Our study highlights several dimensions of civic opposition to the implementation of free-market policies, including forms of neoliberalism; the geographic distribution of protest events across world regions and time; and the outcomes of movement campaigns.
A central function of health technology assessment (HTA) agencies is the production of HTA reports to support evidence-informed policy and decision making. HTA agencies are interested in understanding the mechanisms of HTA impact, which can be understood as the influence or impact of HTA report findings on decision making at various levels of the health system. The members of the International Network of Agencies for HTA (INAHTA) meet at their annual Congress, where impact story sharing is one important activity. This paper summarizes four stories of HTA impact that were finalists for the David Hailey Award for Best Impact Story.
The methods used to measure impact include: document review; claims analysis and review of reimbursement status; citation analysis; qualitative evaluation of stakeholders’ views; and review of media response. HTA agency staff also observed changes in government activities and priorities based on the HTA. Impact assessment can provide information to improve the HTA process, for example, by demonstrating the value of patient and clinician engagement in defining the assessment question and in conducting literature reviews in a more holistic and balanced way.
HTA reports produced by publicly funded HTA agencies are valued by health systems around the globe as they support decision making regarding the appropriate use, pricing, reimbursement, and disinvestment of health technologies. HTAs can also have a positive impact on information sharing between different levels of government and across stakeholder groups. These stories show how HTA can have a significant impact, irrespective of the health system and health technology being assessed.
Differences in health outcomes between meat-eaters and non-meat-eaters might relate to differences in dietary intakes between these diet groups. We assessed intakes of major protein-source foods and other food groups in six groups of meat-eaters and non-meat-eaters participating in the European Prospective Investigation into Cancer and Nutrition (EPIC)-Oxford study.
Materials and methods
Data were from 30,239 participants who answered four questions regarding their consumption of meat, fish, dairy or eggs and completed a food frequency questionnaire (FFQ) in 2010. Participants were categorised as regular meat-eaters (> 50 grams of total/any meat per day: n = 12,997); low meat-eaters (< 50 grams of total/any meat per day: n = 4,650); poultry-eaters (poultry but no red meat: n = 591); fish-eaters (no meat but consumed fish: n = 4,528); vegetarians (no meat or fish: n = 6,672); and vegans (no animal products: n = 801). FFQ foods were categorised into 45 food groups. Analysis of variance was used to test for differences between age-adjusted mean intakes of each food group by diet group.
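The age-adjusted comparison can be implemented as an ANCOVA: regress each food-group intake on diet group plus age and F-test the diet-group factor. Below is a minimal sketch with hypothetical column names and simulated data.

```python
# Age-adjusted comparison of mean intakes across diet groups (ANCOVA);
# column names and data are hypothetical stand-ins for the FFQ dataset.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
groups = ["regular_meat", "low_meat", "poultry", "fish", "vegetarian", "vegan"]
df = pd.DataFrame({
    "legume_g_day": rng.gamma(2.0, 15.0, 3_000),  # one of the 45 food groups
    "diet_group": rng.choice(groups, 3_000),
    "age": rng.uniform(20, 80, 3_000),
})

fit = smf.ols("legume_g_day ~ C(diet_group) + age", data=df).fit()
print(sm.stats.anova_lm(fit, typ=2))              # F-test for the diet-group factor
```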
Results
We found that regular meat-eaters, vegetarians, and vegans consumed about a third, a quarter, and a fifth of their total energy intake, respectively, from high protein-source foods. Compared with regular meat-eaters, low and non-meat-eaters consumed higher amounts of high-protein meat alternatives (soy, legumes, pulses, nuts, seeds) and other plant-based foods (whole grains, vegetables, fruits) and lower amounts of refined grains, fried foods, alcohol, and sugar-sweetened beverages.
Discussion
Overall, our results suggest that there were large differences in the amounts and types of protein-rich and other foods eaten by regular, low and non-meat-eaters. These findings provide insight into potential nutritional explanations for differences in health outcomes between diet groups.
It has been speculated that vegetarians or vegans may have higher risks of fractures than meat eaters, but there is limited evidence from prospective cohorts. We aimed to assess the risks of total and site-specific fractures in people of different diet groups, in a prospective cohort with a large proportion of non-meat eaters.
Materials and methods
In EPIC-Oxford, dietary information was collected at baseline (1993–2001) and at follow-up around 14 years later (≈2010). Participants were categorised into five diet groups (≈20,106 regular meat eaters: ≥ 50 g of meat per day, ≈9,274 low meat eaters: < 50 g of meat per day, ≈8,037 fish eaters, ≈15,499 vegetarians and ≈1,982 vegans, with minor variations in numbers for each outcome after pre-specified exclusions) at both time points. Using multivariable Cox regression adjusted for socio-demographic, lifestyle, and physiological confounders, we estimated the risks of total and site-specific fractures (arm, wrist, hip, leg, ankle, and other main sites i.e. clavicle, rib and vertebra) in the different diet groups, with outcomes identified through record linkage.
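The core survival model is a multivariable Cox regression of time to first fracture on diet group and confounders. The sketch below is a minimal stand-in with hypothetical columns and simulated data, using a single vegetarian-vs-meat-eater indicator rather than the study's full five-group coding.

```python
# Multivariable Cox model for fracture risk; a minimal stand-in with
# hypothetical columns, simulated data, and a single diet-group indicator.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 5_000
df = pd.DataFrame({
    "follow_up_years": rng.uniform(1, 25, n),
    "fracture": rng.integers(0, 2, n),    # 1 = fracture event observed
    "vegetarian": rng.integers(0, 2, n),  # indicator vs. regular meat eaters
    "age": rng.uniform(20, 80, n),
    "bmi": rng.normal(25, 4, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="follow_up_years", event_col="fracture")
print(cph.hazard_ratios_)                 # exp(coef) for each covariate
```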
Results
Over an average of 17.6 years of follow-up, we observed 3,941 cases of total fractures, 566 arm fractures, 889 wrist fractures, 945 hip fractures, 366 leg fractures, 520 ankle fractures, and 467 other main site fractures. Compared with regular meat eaters, vegetarians had marginally higher risks of total fractures (hazard ratios and 95% confidence intervals: 1.10; 1.00–1.20) and arm fractures (1.28; 1.01–1.63), while vegans had significantly higher risks of total fractures (1.44; 1.21–1.72) and leg fractures (2.06; 1.22–3.47), and marginally higher risks of arm fractures (1.60; 1.01–2.54). For hip fractures, the risks were higher in fish eaters (1.28; 1.03–1.59), vegetarians (1.27; 1.05–1.55) and vegans (2.35; 1.67–3.30; p-heterogeneity < 0.0001) than in regular meat eaters. There were no significant differences in risks of wrist, ankle or other main site fractures by diet group. Overall, the significant associations appeared stronger without adjustment for body mass index (e.g. 1.52; 1.27–1.81 in vegans for total fractures), and were slightly attenuated with additional adjustment for total protein (1.41; 1.17–1.69) or dietary calcium (1.32; 1.10–1.59).
Discussion
In conclusion, non-meat eaters, especially vegans, had higher risks of either total or some site-specific fractures, particularly hip fractures. The higher risks might be partly explained by the lower body mass index in these diet groups, but differences in dietary intakes of protein and calcium are likely relevant as well. Given the observational design of this study, causality and potential mechanisms should be further investigated.
Meat intake is thought to play a role in the risk of cancer. The Third Expert Report of the World Cancer Research Fund/American Institute for Cancer Research concluded that red meat was a probable cause, and processed meat a convincing cause, of colorectal cancer. However, evidence for associations between red and processed meat intake and other cancer sites is limited. Furthermore, few studies have investigated the association between poultry intake and cancer risk. Therefore, this study aimed to examine the associations of red meat, processed meat, and poultry intake with incidence at 20 common cancer sites.
Material and methods
We analysed data from 475,264 participants (54% women) in UK Biobank. Participants were aged 37–73 years and cancer free at baseline. Cancer diagnosis and death due to cancer without prior diagnosis during follow-up were determined using data linkage with cancer and death registries (with follow-up until 31 March 2016 for England and Wales, and until 31 October 2015 for Scotland). Information on meat consumption was based on a touchscreen questionnaire completed at baseline covering the type and frequency of meat intake. We used multivariable-adjusted Cox proportional hazards models to determine the association between baseline meat intake and cancer incidence. Analyses of lung cancer risk were restricted to never smokers. All analyses were adjusted for socio-demographic, lifestyle and women-specific factors.
Results
Over a mean 6.9 (SD 1.3) years of follow-up, 28,431 participants were diagnosed with any type of cancer. Red meat intake was positively associated with risk for colorectal cancer (n cases = 3,164; hazard ratio (HR) per 50 g/day higher intake 1.22, 95% confidence interval (CI) 1.05–1.41), breast cancer (n cases = 5,536; 1.12, 1.01–1.24) and prostate cancer (n cases = 5,807; 1.16, 1.03–1.30). Processed meat intake was positively associated with risk for colorectal cancer (n cases = 3,189; HR per 20 g/day higher intake 1.17, 95% CI 1.06–1.30). Poultry intake was positively associated with risk for cancers of the lymphatic and hematopoietic tissues (n cases = 2,431; HR per 30 g/day higher intake 1.16, 95% CI 1.03–1.32).
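Because these hazard ratios are reported per fixed increment of intake, they rescale multiplicatively: if β is the Cox log-hazard coefficient per g/day, the HR for an increment Δ is exp(Δβ). A worked example using the colorectal figure above:

```python
# Rescaling a per-increment hazard ratio (worked example, not study code):
# beta = ln(HR) / increment, and HR for a new increment is exp(increment * beta).
import math

hr_per_50g = 1.22                     # red meat and colorectal cancer, above
beta = math.log(hr_per_50g) / 50      # log-hazard per 1 g/day
print(math.exp(20 * beta))            # ~1.08: the same association per 20 g/day
```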
Discussion
In summary, higher intakes of red and processed meat were associated with a higher risk of colorectal cancer. Red meat consumption was also positively associated with risk of breast and prostate cancer, but these associations are not supported by most previous prospective studies. The positive association of poultry intake with cancers of the lymphatic and hematopoietic tissues requires further investigation.
The evidence on associations of individual foods and dietary fibre with subtypes of stroke (ischaemic and haemorrhagic) is not conclusive. We aimed to investigate this in a large prospective cohort.
Materials and methods
We analysed data on 418,329 men and women from the European Prospective Investigation into Cancer and Nutrition (EPIC) study. Consumption of various animal-sourced foods (red and processed meat, poultry, fish, dairy, egg), plant-sourced foods (fruit and vegetables, legumes, nuts and seeds) and dietary fibre was assessed using validated country-specific questionnaires, calibrated with 24-hour recalls. Using multivariable Cox regressions adjusted for energy intake and socio-demographic, lifestyle and physiological confounders, we estimated hazard ratios of fatal and non-fatal ischaemic, haemorrhagic and total (i.e. ischaemic, haemorrhagic and unspecified) stroke associated with calibrated increment differences in consumption of each food or dietary fibre.
Results
Over an average of 12.7 years of follow-up, we observed 4,281 cases of ischaemic stroke, 1,430 cases of haemorrhagic stroke, and 7,378 cases of total stroke. For ischaemic stroke, lower risks were observed with higher consumption of fruit and vegetables (hazard ratio (HR) per 200 g/d of calibrated intake 0.87; 95% confidence interval (CI) 0.82–0.93) and dietary fibre (per 10 g/d, HR 0.77; 95% CI 0.69–0.86) (p-trend < 0.001 for both); more modest inverse associations were also observed for milk (per 200 g/d, HR 0.95; 95% CI 0.91–0.99, p-trend = 0.02), yogurt (per 100 g/d, HR 0.91; 95% CI 0.85–0.97, p-trend = 0.004) and cheese (per 30 g/d, HR 0.88; 95% CI 0.81–0.97, p-trend = 0.008), while a modest positive association was observed with higher red meat consumption (per 50 g/d, HR 1.14; 95% CI 1.02–1.27, p-trend = 0.02). For haemorrhagic stroke, higher risk was associated with higher egg consumption (per 20 g/d, HR 1.25; 95% CI 1.09–1.43, p-trend = 0.002). For total stroke, associations were consistent with those of both subtypes; we observed inverse associations for fruit and vegetables (HR 0.89, 95% CI 0.85–0.93), dietary fibre (HR 0.80, 95% CI 0.74–0.86), yogurt (HR 0.91, 95% CI 0.87–0.96) and cheese (HR 0.88, 95% CI 0.82–0.94), and positive associations for red and processed meat (HR 1.18, 95% CI 1.05–1.33) and egg (HR 1.07, 95% CI 1.01–1.14).
Discussion
To conclude, risk of ischaemic stroke was inversely associated with consumption of fruit and vegetables, dietary fibre and dairy foods, and positively associated with red meat, while risk of haemorrhagic stroke was positively associated with egg consumption. Causality of the associations cannot be determined in this observational study.
There is evidence that plant-based diets might be associated with a lower risk of ischaemic heart disease (IHD); however, previous studies have not reported on intake of subtypes of fruit and vegetables or sources of dietary fibre. This study aims to assess the associations of major plant foods, their subtypes, and dietary fibre with risk of IHD in the European Prospective Investigation into Cancer and Nutrition (EPIC)-CVD Consortium.
Material and methods
We conducted a prospective analysis of 490,311 men and women in ten European countries without a history of myocardial infarction or stroke at recruitment. Dietary intake was assessed using validated questionnaires and calibrated with 24-hour recall data. Cox regression models, adjusted for IHD risk factors, were used to estimate hazard ratios (HRs) and 95% confidence intervals (CIs).
Results
During a mean of 12.6 years of follow-up, we documented 8,504 myocardial infarction cases or deaths from IHD. Participants consuming at least eight portions (80 grams each) of fruits and vegetables a day had a 10% lower risk of IHD (HR 0.90, 95% CI 0.82–0.98) compared with those consuming fewer than three portions a day. The risk of IHD was 6% lower (95% CI 0.90–0.99; P-trend = 0.009) for a 200 g/day higher intake of fruit and vegetables combined, 3% lower (0.95–1.00; P-trend = 0.021) for a 100 g/day higher fruit intake, and 8% lower (0.86–0.97; P-trend = 0.006) for a 50 g/day higher intake of bananas. Moreover, the risk of IHD was 9% lower (0.83–0.99; P-trend = 0.032) for a 10 g/day higher intake of nuts and seeds, and 10% lower (0.82–0.98; P-trend = 0.020) for a 10 g/day higher intake of total dietary fibre. No associations were observed between legumes, total vegetables, or other subtypes of fruit and vegetables and IHD risk.
Discussion
The results from this large prospective study suggest that higher intakes of fruit and vegetables combined, total fruit, bananas, nuts and seeds, and total fibre are associated with a lower risk of IHD. Given the observational design of this study, causality and potential mechanisms should be further investigated.
Prebiotics are a subtype of dietary fibre selectively fermented by beneficial bacteria in the colon. Preclinical evidence has suggested that prebiotics may be associated with a decreased risk of colorectal cancer. However, the association between dietary intake of prebiotics and colorectal cancer risk has not been investigated prospectively. This study aims to prospectively investigate the association between total prebiotic intake and colorectal cancer risk. Further characterisation of the association by prebiotic subtype (fructans and galacto-oligosaccharides (GOS)) and by colorectal cancer sub-site (colon cancer and rectal cancer) was a secondary objective.
Material and methods:
A total of 53,700 men and women living in England and Scotland who were enrolled in the European Prospective Investigation into Cancer and Nutrition (EPIC)-Oxford study were included in the analysis and followed up for incident colorectal cancers. Validated semi-quantitative food frequency questionnaires administered at baseline were used to calculate daily fructan, GOS and total prebiotic intake. We used multivariable Cox proportional hazards models to assess associations between prebiotic intake and risk of colorectal cancer.
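Fourths-based analyses of this kind derive quartile indicators from the intake distribution and obtain a P for trend by entering the quartile index as a continuous term. A minimal sketch with hypothetical columns and simulated data:

```python
# Fourths of intake plus a trend test; hypothetical columns, simulated data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 4_000
df = pd.DataFrame({
    "years": rng.uniform(1, 20, n),
    "crc": rng.integers(0, 2, n),             # 1 = incident colorectal cancer
    "prebiotic": rng.gamma(2.0, 2.0, n),      # g/day, placeholder values
})
df["fourth"] = pd.qcut(df["prebiotic"], 4, labels=False)   # quartile index 0-3

# Entering the quartile index as a continuous covariate gives the P for trend.
cph = CoxPHFitter().fit(df[["years", "crc", "fourth"]],
                        duration_col="years", event_col="crc")
print(cph.summary[["exp(coef)", "p"]])
```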
Results:
A total of 574 incident cases of colorectal cancer were identified during a mean of 16.1 years of follow-up. Total prebiotic, fructan and GOS intakes were not significantly associated with colorectal cancer risk. The hazard ratios for those in the highest fourths of total prebiotic, fructan and GOS intake compared with those in the lowest fourths were 0.87 (95% confidence interval (CI) 0.66–1.14; P for trend = 0.3), 0.91 (95% CI 0.70–1.18; P for trend = 0.4), and 0.87 (95% CI 0.66–1.15; P for trend = 0.4), respectively. The associations remained non-significant when colorectal cancer sub-sites were investigated separately.
Discussion:
The results from this observational study do not support an association between prebiotic intake and colorectal cancer risk. However, given the biological plausibility of a role for prebiotics in reducing colorectal cancer risk, and because the non-significant association observed in the current study may be due to the small number of cases and the healthy profile of the cohort, further epidemiological research is needed to characterise the association between dietary prebiotic intake and colorectal cancer incidence.
The Interpersonal Dependency Inventory comprises three subscales: Emotional Reliance on Another Person (ER), Lack of Social Self-Confidence (LSS), and Assertion of Autonomy (AUT). Several formulae have been developed for deriving whole-scale scores.
The aim of this study of 621 subjects with addictive disorders was to determine the best formula, using the DSM-IV dependent personality disorder diagnosis as the gold standard. The formula 3ER + LSS − AUT yielded the best values of sensitivity and specificity.
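To make the composite concrete, the sketch below applies 3ER + LSS − AUT to toy subscale scores and computes sensitivity and specificity against a gold-standard diagnosis at an arbitrary cutoff; all values are illustrative, not the study data.

```python
# The whole-scale composite 3*ER + LSS - AUT, with sensitivity/specificity
# against a gold-standard diagnosis; all values are illustrative toys.
import numpy as np

er = np.array([40, 28, 35, 22])      # Emotional Reliance on Another Person
lss = np.array([30, 18, 26, 15])     # Lack of Social Self-Confidence
aut = np.array([20, 27, 18, 30])     # Assertion of Autonomy
dpd = np.array([1, 0, 1, 0])         # DSM-IV dependent PD (gold standard)

score = 3 * er + lss - aut
pred = score >= score.mean()          # one arbitrary operating point
sensitivity = (pred & (dpd == 1)).sum() / (dpd == 1).sum()
specificity = (~pred & (dpd == 0)).sum() / (dpd == 0).sum()
print(sensitivity, specificity)
```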
The aim of this paper was to investigate the diagnostic specificity of the self-critical and dependent depressive experiences in a clinical sample of eating disorder patients and to explore the impact of adverse childhood experiences on these dimensions of personality.
Method
A sample of 94 anorexic and 61 bulimic patients meeting DSM-IV criteria, along with 236 matched controls, were assessed with the Depressive Experiences Questionnaire (DEQ), the abridged version of the Beck Depression Inventory (BDI), and the AMDP Life Events Inventory. Subjects presenting with major depression or a comorbid addictive disorder were excluded from the sample using the Mini International Neuropsychiatric Interview (MINI).
Results
Anorexic and bulimic patients showed higher scores than controls on both self-criticism and dependency sub-scales of the DEQ. Bulimic patients scored significantly higher than anorexic patients on self-criticism and reported more adverse childhood experiences. Finally, negative life events correlated only with self-criticism in the whole sample.
Discussion
Differences in DEQ Self-Criticism between anorexic and bulimic patients could not be accounted for by depression, since bulimic patients did not show higher BDI scores than anorexic patients, and depressive symptoms measured with the BDI were not significant predictors of diagnostic grouping in a multiple logistic regression.
Conclusion
This study supports the diagnostic specificity of the dependent and self-critical depressive dimensions in eating disorders and strengthens previous research on the role of early experiences in the development of these disorders.