Tetralogy of Fallot patients face an elevated risk of developing chylothorax and pleural effusions post-surgery. This patient group exhibits risk factors known to compromise the lymphatic system, such as elevated central venous pressure, pulmonary flow changes, and hypoxia. This study investigates the morphology and function of the lymphatic system in tetralogy of Fallot patients through lymphatic magnetic resonance imaging and near-infrared fluorescence imaging, respectively.
Methods:
Post-repair tetralogy of Fallot patients aged 6–18 years were recruited, along with age- and gender-matched controls. Magnetic resonance imaging was used to assess the morphology of the thoracic lymphatic vessels and the thoracic duct, while near-infrared fluorescence imaging was used to assess lymphatic activity in terms of lymph rate, velocity, and pressure.
Results:
Nine patients and ten controls were included. Echocardiography revealed that two-thirds of the patients had moderate-to-severe pulmonary regurgitation, while none displayed signs of elevated central venous pressure. Magnetic resonance imaging identified three patients with type 3 (out of 4 types) lymphatic abnormalities, while controls had none. The thoracic ducts showed severe (one patient) and moderate (one patient) tortuosity. Mean thoracic duct diameters were 3.3 ± 1.1 mm in patients and 3.0 ± 0.8 mm in controls (p = 0.53). Near-infrared fluorescence imaging revealed no anomalous patterns.
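A minimal sketch of the thoracic duct diameter comparison reported above, using only the summary statistics given in the abstract (means, SDs, group sizes). The abstract does not state which test was used; Welch's two-sample t-test is one reasonable choice for two small, independent groups.

```python
# Group comparison from summary statistics only (abstract values).
from scipy.stats import ttest_ind_from_stats

t_stat, p_value = ttest_ind_from_stats(
    mean1=3.3, std1=1.1, nobs1=9,    # repaired tetralogy of Fallot patients
    mean2=3.0, std2=0.8, nobs2=10,   # age- and gender-matched controls
    equal_var=False,                 # Welch's correction for unequal variances
)
print(f"t = {t_stat:.2f}, p = {p_value:.2f}")  # compare with the reported p = 0.53
```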
Conclusion:
Despite the absence of clinical lymphatic disease, 3/9 of the repaired tetralogy of Fallot patients exhibited lymphatic morphological abnormalities. The significance of these anomalies is currently uncertain. Further research is needed to determine whether the lymphatic alterations in this patient cohort are a result of congenital malformations, haemodynamic shifts, or prenatal and early-life saturation levels.
Abusive head trauma (AHT) is a form of inflicted brain injury that is associated with significant neurological impairment. Given that injuries occur during infancy, cognitive deficits may not become fully apparent for years. It is useful to understand injury factors related to outcomes. A recent study by Eismann et al. (2020) used length of PICU stay as a measure of injury severity and found that it is predictive of short-term and long-term outcomes in AHT. The current study aimed to examine injury severity factors related to acute outcomes (<3 months since injury) within a population of infants admitted to an inpatient rehabilitation unit (IRU).
Participants and Methods:
The sample consisted of 45 infants (32 male, 13 female) hospitalized with suspected AHT. Age at injury was 0-21 months (MED = 4.89 months, SD = 5.48). The majority of patients (93%) had moderate to severe injury based on length of PICU stay (4+ days) [3]. Patients were administered the Mullen Scales of Early Learning (MSEL) during IRU admission, within 3 months of injury (range: 13-68 days; MED: 31 days). Pearson bivariate correlations were used to examine the relationship between MSEL subscales (ELC: Early Learning Composite; VR: Visual Reception; RL: Receptive Language; EL: Expressive Language; FM: Fine Motor; GM: Gross Motor) and the following factors: days since injury and hospitalization time (days in PICU, PICU/General Pediatrics, IRU, total hospitalization). P-values less than .05 were considered significant.
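An illustrative sketch of the correlation analysis described above. The file and column names are hypothetical; the analysis simply pairs each MSEL score with each injury/hospitalization variable and reports Pearson's r and p.

```python
# Pearson bivariate correlations between MSEL subscales and injury/hospitalization factors.
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("aht_iru_sample.csv")  # hypothetical analysis file

msel_cols = ["ELC", "VR", "RL", "EL", "FM", "GM"]
injury_cols = ["days_since_injury", "days_picu", "days_picu_gen_peds",
               "days_iru", "days_total_hospitalization"]

for outcome in msel_cols:
    for predictor in injury_cols:
        r, p = pearsonr(df[outcome], df[predictor])
        flag = "*" if p < .05 else ""   # .05 threshold used in the study
        print(f"{outcome} vs {predictor}: r = {r:.2f}, p = {p:.3f}{flag}")
```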
Results:
Scores on the MSEL Early Learning Composite ranged from exceptionally low to high average (Standard Score Range: <49-111; MED: 82; SD = 18.79). Unlike in prior studies, time in PICU and time in PICU/General Pediatrics were not associated with any MSEL subscales. MSEL scores were moderately correlated with days in IRU (ELC: r = -.44; VR: r = -.37; RL: r = -.32; EL: r = -.36; GM: r = -.29) and total hospitalization time (ELC: r = -.46; VR: r = -.42; RL: r = -.36; EL: r = -.37; GM: r = -.31), such that longer hospitalization was associated with lower scores. A greater number of days since injury was also associated with lower MSEL scores (ELC: r = -.45; VR: r = -.42; RL: r = -.40; EL: r = -.36; FM: r = -.33; GM: r = -.35).
Conclusions:
These results suggest that within an inpatient rehabilitation setting, longer total hospitalization time (including time on the IRU) is moderately associated with worse acute neurobehavioral outcomes. While length of PICU stay has been associated with short-term outcomes in the outpatient setting (Eismann et al., 2020), this was not found in the current inpatient sample, which had more severe injuries (longer PICU stay, inpatient rehabilitation admission). Interestingly, children assessed further out from injury had worse scores on the MSEL, which has previously been noted. Though this seems counterintuitive, it may reflect that participants with more severe injuries had a longer delay before they were capable of engaging in a neurodevelopmental assessment. These findings have implications for prognosticating early outcomes of AHT in an inpatient rehabilitation setting.
Chronic musculoskeletal pain is associated with neurobiological, physiological, and cellular measures. Importantly, we have previously demonstrated that a biobehavioral and psychosocial resilience index appears to have a protective relationship with the same biomarkers. Less is known regarding the relationships between chronic musculoskeletal pain, protective factors, and brain aging. This study investigates the relationships between clinical pain, a resilience index, and brain age. We hypothesized that higher reported chronic pain would correlate with older-appearing brains and that the resilience index would attenuate the strength of the relationship between chronic pain and brain age.
Participants and Methods:
Participants were drawn from an ongoing observational multisite study and included adults with chronic pain who also reported knee pain (N = 135; age = 58.3 ± 8.1; 64% female; 49% non-Hispanic Black, 51% non-Hispanic White; education Mdn = some college; income level Mdn = $30,000 - $40,000; MoCA M = 24.27 ± 3.49). Measures included the Graded Chronic Pain Scale (GCPS) characteristic pain intensity (CPI) and disability scores, total number of painful body sites, and a cognitive screen (MoCA). The resilience index consisted of validated biobehavioral (e.g., smoking, waist/hip ratio, and active coping) and psychosocial measures (e.g., optimism, positive affect, negative affect, perceived stress, and social support). T1-weighted MRI data were obtained. Surface area metrics were calculated in FreeSurfer using the Human Connectome Project's multi-modal cortical parcellation scheme. We calculated brain age in R using previously validated and trained machine learning models. Chronological age was subtracted from predicted brain age to generate a brain age gap (BAG), with higher BAG scores indicating a predicted age older than chronological age. Three parallel hierarchical regression models (each containing one of three pain measures) with three blocks were performed to assess the relationships between chronic pain and the resilience index in relation to BAG, adjusting for covariates. For each model, Block 1 entered the covariates, Block 2 entered a pain score, and Block 3 entered the resilience index.
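A simplified sketch of one of the three hierarchical regression models, assuming a tidy data frame with hypothetical column names (the study's brain-age models were fitted in R; this Python version only illustrates the block structure and R-squared change).

```python
# Hierarchical (blockwise) regression on the brain age gap (BAG).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pain_brain_age.csv")             # hypothetical analysis file
df["bag"] = df["predicted_brain_age"] - df["age"]  # brain age gap

covariates = "age + sex + site + education_income"   # Block 1 (assumed covariate set)
block2 = covariates + " + gcps_cpi"                  # Block 2: one pain measure
block3 = block2 + " + resilience_index"              # Block 3: resilience index

m1 = smf.ols(f"bag ~ {covariates}", data=df).fit()
m2 = smf.ols(f"bag ~ {block2}", data=df).fit()
m3 = smf.ols(f"bag ~ {block3}", data=df).fit()

print("R2 change, Block 2:", m2.rsquared - m1.rsquared)
print("R2 change, Block 3:", m3.rsquared - m2.rsquared)
```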
Results:
GCPS CPI (R2 change = 0.033, p = 0.027) and GCPS disability (R2 change = 0.038, p = 0.017) significantly predicted BAG beyond the effects of the covariates, but total pain sites (p = 0.865) did not. The resilience index was negatively associated with BAG and was a significant predictor in all three models (p < 0.05). With the resilience index added in Block 3, both the GCPS CPI (p = 0.067) and GCPS disability (p = 0.066) measures were no longer significant in their respective models. Additionally, higher education/income (p = 0.016) and study site (p = 0.031) were also significant predictors of BAG.
Conclusions:
In this sample, higher reported chronic pain correlated with older-appearing brains, and higher resilience attenuated this relationship. The biobehavioral and psychosocial resilience index was associated with younger-appearing brains. While our data are cross-sectional, the findings are encouraging in suggesting that interventions targeting both chronic pain and biobehavioral and psychosocial factors (e.g., coping strategies, positive and negative affect, smoking, and social support) might buffer brain aging. Future directions include assessing whether chronic pain and resilience factors can predict brain aging over time.
The Swan Point site in interior Alaska contains a significant multi-component archaeological record dating back to 14,200 cal BP. The site’s radiocarbon (14C) chronology has been presented in scattered publications that mostly focus on specific archaeological periods in Alaska, in particular its terminal Pleistocene components associated with the East Beringian tradition. This paper synthesizes the site’s 14C data and provides sequential Bayesian models for its cultural zones and subzones. The 14C and archaeological record at Swan Point attests that the location was persistently used over the last 14,000 years, even though major changes are evident within regional vegetation and local faunal communities, reflecting long-term trends culminating in Dene-Athabascan history.
Little is known regarding how the risk of suicide in refugees relates to their host country. Specifically, previous literature lacks evidence on the extent to which inter-country differences in structural factors between host countries may explain the association between refugee status and subsequent suicide.
Objectives
We aimed to investigate the risk of suicide among refugees in Sweden and Norway according to their sex, age, region/country of birth and duration of residence.
Methods
Each suicide case aged 18–64 years between 1998 and 2018 (17,572 and 9,443 cases in Sweden and Norway, respectively) was matched with up to 20 population-based controls by sex and age. Multivariate-adjusted conditional logistic regression models yielding adjusted odds ratios (aORs) with 95% confidence intervals (95% CI) were used to test the association between refugee status and suicide.
Results
The aORs for suicide in refugees in Sweden and Norway were 0.5 (95% CI: 0.5-0.6) and 0.3 (95% CI: 0.3-0.4), compared with the Swedish-born and Norwegian-born individuals, respectively. Stratification by region/country of birth showed similar statistically significant lower odds for most refugee groups in both host countries except for refugees from Eritrea (aOR 1.0, 95% CI: 0.7-1.6) in Sweden. The risk of suicide did not vary much across refugee groups by their duration of residence, sex and age.
Conclusions
The finding of broadly similar suicide mortality advantages among refugees in the two host countries may suggest that resiliency and culture/religion-bound attitudes could be more influential for suicide risk among refugees than other post-migration environmental and structural factors in the host country.
COVID-19 reinforced the need for effective leadership and administration within Clinical and Translational Science Award (CTSA) program hubs in response to a public health crisis. The speed, scale, and persistent evolution of the pandemic forced CTSA hubs to act quickly and remain nimble. The switch to virtual environments, paired with the need to support program operations while ensuring the safety and well-being of their teams, highlighted the critical support role provided by leadership and administration. The pandemic also illustrated the value of emergency planning in supporting organizations’ ability to quickly pivot and adapt. Lessons learned from the pandemic and from other cases of adaptive capacity and preparedness can aid program hubs in promoting and sustaining the overall capabilities of their organizations to prepare for future events.
As the USA and the rest of the world raced to fight the COVID-19 pandemic, years of investments from the National Center for Advancing Translational Sciences allowed informatics services and resources at CTSA hubs to play a significant role in addressing the crisis. CTSA hubs worked with local and regional partners to collect data on the pandemic, provide access to relevant patient data, and produce data dashboards to support decision-making. Coordinated efforts, like the National COVID Cohort Collaborative (N3C), helped to aggregate and harmonize clinical data nationwide. Even with significant informatics investments, some CTSA hubs felt unprepared to respond to the fast-moving public health crisis. Many hubs were forced to evolve quickly to meet local needs. Informatics teams expanded critical support at their institutions, including an engagement platform for clinical research, COVID-19 awareness and education activities in the community, and COVID-19 data dashboards. Continued investments in informatics resources will help ensure that tools, resources, practices, and policies are aligned to meet local and national public health needs.
The North American AED Pregnancy Registry (NAAPR) provides crucial data for understanding the risks of antiepileptic drug (AED) exposure in pregnancy. This study aims to quantify the Canadian contribution to NAAPR and compare AED usage in pregnancy in Canada and the USA.
Methods:
Enrollment rate ratios (ERR) to NAAPR, adjusted for the populations of women of childbearing age, were calculated for the USA, Canada, and for the different Canadian provinces. Methods of enrollment to NAAPR and AED usage were compared between the two countries using chi-squared tests.
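A hedged sketch of the two core calculations described above: an enrollment rate ratio (ERR) with a Wald-type 95% CI on the log scale, and a chi-squared test comparing enrollment methods between countries. The Canadian enrollee count is taken from the abstract; the population denominators and contingency-table counts below are placeholders, and the exact adjustment used by the study may differ.

```python
# Enrollment rate ratio with 95% CI, plus a chi-squared test of enrollment method by country.
import numpy as np
from scipy.stats import chi2_contingency

def rate_ratio(events_a, pop_a, events_b, pop_b):
    """Rate ratio of group A vs. group B with a log-scale Wald 95% CI."""
    rr = (events_a / pop_a) / (events_b / pop_b)
    se_log = np.sqrt(1 / events_a + 1 / events_b)
    lo, hi = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * se_log)
    return rr, lo, hi

# Canadian vs. US enrollees per women of childbearing age (population figures are placeholders)
print(rate_ratio(events_a=432, pop_a=7.5e6, events_b=10_215 - 432, pop_b=65e6))

# 2 x k table of enrollment method by country (all counts are placeholders)
table = [[120, 200, 112],      # Canada: provider-referred, social media, other
         [6000, 2500, 1283]]   # USA
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.3g}")
```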
Results:
Between 1997 and 2019, 10,215 pregnant women enrolled in NAAPR: 4.1% were Canadian (n = 432; ERR = 0.39, 95% CI = 0.35–0.43). Within Canada, no patients were enrolled from the three northern territories or from Prince Edward Island. While fewer patients than expected enrolled from Quebec (ERR = 0.35, 95% CI = 0.19–0.58), Nova Scotia had the highest enrollment rate (ERR = 1.55, 95% CI = 0.66–3.11). Compared with their American peers, Canadians were less likely to have been enrolled by their healthcare provider and more likely to have been enrolled via social media (p < 0.01). Canadian women were more likely to be taking carbamazepine (24% vs. 15%; p < 0.01) or valproic acid (8% vs. 4%; p < 0.01).
Conclusion:
The proportion of Canadian enrollees in NAAPR was lower than expected based on the relative population size of Canadian women of reproductive age. Greater Canadian enrollment in NAAPR would contribute to ongoing worldwide efforts to assess the risks of AED use in pregnant women and help quantify rates of AED usage, major congenital malformations, and access to subspecialized epilepsy care within Canada.
Little is known regarding how the risk of suicide in refugees relates to their host country. Specifically, previous literature lacks evidence on the extent to which inter-country differences in structural factors between host countries may explain the association between refugee status and subsequent suicide. We aimed to investigate (1) the risk of suicide in refugees resident in Sweden and Norway, in general, and according to their sex, age, region/country of birth and duration of residence, compared with the risk of suicide in the respective majority host population; and (2) whether factors related to socio-demographics, labour market marginalisation (LMM) and healthcare use might explain the risk of suicide in refugees differently across host countries.
Methods
Using a nested case-control design, each case who died by suicide at age 18–64 years between 1998 and 2018 (17 572 and 9443 cases in Sweden and Norway, respectively) was matched with up to 20 controls from the general population, by sex and age. Multivariate-adjusted conditional logistic regression models yielding adjusted odds ratios (aORs) with 95% confidence intervals (95% CI) were used to test the association between refugee status and suicide. Separate models were controlled for factors related to socio-demographics, previous LMM and healthcare use. Analyses were also stratified by sex and age groups, and by refugees' region/country of birth and duration of residence in the host country.
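A schematic sketch of the matched case-control analysis: conditional logistic regression with cases and controls grouped by matched set. All file and column names are hypothetical, and the actual covariate set in the published models is broader; exponentiated coefficients give the aORs.

```python
# Conditional logistic regression for a nested (matched) case-control design.
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

df = pd.read_csv("suicide_case_control.csv")   # hypothetical analysis file

y = df["suicide_case"]                         # 1 = case, 0 = matched control
X = df[["refugee", "income_quartile", "education_level",
        "prior_lmm", "prior_healthcare_use"]]  # adjustment covariates (illustrative)

model = ConditionalLogit(y, X, groups=df["matched_set_id"])
result = model.fit()

aor = np.exp(result.params["refugee"])
ci_low, ci_high = np.exp(result.conf_int().loc["refugee"])
print(f"aOR for refugee status: {aor:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```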
Results
The aORs for suicide in refugees in Sweden and Norway were 0.5 (95% CI 0.5–0.6) and 0.3 (95% CI 0.3–0.4), compared with the Swedish-born and Norwegian-born individuals, respectively. Stratification by region/country of birth showed similar statistically significant lower odds for most refugee groups in both host countries except for refugees from Eritrea (aOR 1.0, 95% CI 0.7–1.6) in Sweden. The risk of suicide did not vary much across refugee groups by duration of residence, sex and age, except for younger refugees aged 18–24, whose suicide risk did not differ significantly from that of their respective host-country peers. Factors related to socio-demographics, LMM and healthcare use had only a marginal influence on the studied associations in both countries.
Conclusions
Refugees in Sweden and Norway had broadly similar suicide mortality advantages compared with the Swedish-born and Norwegian-born populations, respectively. These findings may suggest that resiliency and culture/religion-bound attitudes towards suicidal behaviour in refugees could be more influential for their suicide risk after resettlement than other post-migration environmental and structural factors in the host country.
Studies that reveal detailed information about trilobite growth, particularly early developmental stages, are crucial for improving our understanding of the phylogenetic relationships within this iconic group of fossil arthropods. Here we document an essentially complete ontogeny of the trilobite Redlichia cf. versabunda from the Cambrian Series 2 (late Stage 4) Ramsay Limestone of Yorke Peninsula in South Australia, including some of the best-preserved protaspides (the earliest biomineralized trilobite larval stage) known for any Cambrian trilobite. These protaspid stages exhibit similar morphological characteristics to many other taxa within the Suborder Redlichiina, especially to closely related species such as Metaredlichia cylindrica from the early Cambrian period of China. Morphological patterns observed across early developmental stages of different groups within the Order Redlichiida are discussed. Although redlichiine protaspides exhibit similar overall morphologies, certain ontogenetic characters within this suborder have potential phylogenetic signal, with different superfamilies characterized by unique trait combinations in these early growth stages.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible by laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves. These will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside of the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to full third-generation detectors at a fraction of the cost. Such sensitivity changes the expected detection rate for post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and could potentially allow for the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
Stress and mental health problems are common among medical students, but studies of intervention programmes during medical school are lacking.
Design and Methods
The students participated in one of two group-session programmes during their third year at medical school. One option was a self-development programme delivered by trained group therapists. Alternatively, the students could choose a programme focusing on themes of special relevance to doctors. The format was a one-and-a-half-hour group session, held once weekly, 12 times in total. The baseline data were gathered before the intervention (T1). We studied the effect 3 months post-intervention (T2) in this first follow-up paper. One class (N = 128) of medical students was given the group intervention programmes. The next year's class (comparison group, N = 152) received no intervention. The main outcome of this study was Perceived Medical School Stress (PMSS), which has been linked to anxiety and depression, as well as need for treatment. We used a multilevel linear model (repeated measures) to test for differences over time.
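A minimal sketch of the repeated-measures multilevel model described above, with PMSS measured at T1 and T2 nested within students. File and column names are hypothetical; the time-by-group interaction term corresponds to the intervention effect reported below.

```python
# Multilevel (mixed-effects) model for repeated PMSS measurements.
import pandas as pd
import statsmodels.formula.api as smf

long_df = pd.read_csv("pmss_long.csv")   # hypothetical: one row per student per time point

model = smf.mixedlm(
    "pmss ~ time * intervention_group + age + gender",
    data=long_df,
    groups=long_df["student_id"],        # random intercept per student
)
result = model.fit()
print(result.summary())                  # inspect the time:intervention interaction term
```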
Results:
Both the intervention and the comparison groups showed a decline in PMSS from T1 to T2. There was a significant interaction between time and the intervention (P = 0.01), indicating an effect of the intervention. This effect was attributable to participation in the self-development groups (P = 0.009). All effects were controlled for age and gender.
Implications
The self-development groups had a beneficial effect on perceived stress among the students at the three-month follow-up.
Effective preventive strategies could reduce disability and the long-term social and health complications associated with depression, but options are limited. Cognitive bias modification (CBM) is a novel, simple, and safe intervention that corrects the attentional and interpretive biases associated with depression.
Objectives
To determine if CBM decreases the one-year onset of major depression in adults at risk.
Methods
This randomised controlled trial will recruit adults with subsyndromal depression living in Australia (parallel design, 1:1 allocation ratio). The intervention will be delivered via the internet over 52 weeks. The primary outcome of interest is the onset of major depression according to DSM-IV-TR criteria. Secondary outcomes of interest include change in the severity of depressive symptoms (Patient Health Questionnaire, PHQ-9) and changes in attentional and interpretive biases. Outcomes will be collected 3, 6, 9 and 12 months after randomisation.
Results
Preliminary data on a subsample of 20 participants showed that the mean±SE PHQ-9 score of controls was 7.5±0.9 at study entry and 7.1±1.5 at week 6 (paired t = 0.29, p = 0.779), whereas the mean±SE scores of active CBM participants were 7.4±1.0 and 4.4±1.1, respectively (paired t = 6.00, p < 0.001). The mean PHQ-9 difference between control and active CBM participants over 6 weeks was 2.6±1.5 points (t = 1.79, p = 0.090). One of 11 controls (9.1%) and none of the 9 active CBM participants showed evidence of clinically significant depressive symptoms at week 6 (i.e., PHQ-9 ≥ 15).
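A sketch of the within-group comparison reported above: a paired t-test on PHQ-9 scores at study entry versus week 6. The arrays below are placeholders with plausible shape and scale, not the study data.

```python
# Paired t-test on baseline vs. week-6 PHQ-9 scores (placeholder values).
import numpy as np
from scipy.stats import ttest_rel

phq9_baseline = np.array([8, 6, 9, 7, 5, 10, 7, 8, 6])   # hypothetical, n = 9
phq9_week6    = np.array([5, 3, 6, 4, 3, 7, 4, 5, 3])

t_stat, p_value = ttest_rel(phq9_baseline, phq9_week6)
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
```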
Conclusions
By March 2015, six months of preliminary data will be available on 165 participants.
Twelve evidence-based persona profiles, representing roles across the translational workforce as well as two patients, were made available through Clinical and Translational Science (CTS) Personas, a project of the Clinical and Translational Science Awards (CTSA) Program National Center for Data to Health (CD2H). The persona profiles were designed and researched to demonstrate the key responsibilities, motivators, goals, software use, pain points, and professional development needs of those working across the spectrum of translation, from basic science to clinical research to public health. The project’s goal was to provide reliable documents that could be used to inform CTSA software development projects, educational resources, and communication initiatives. This paper presents the initiative to create personas for the translational workforce, including the methodology, engagement strategy, and lessons learned. Challenges faced and successes achieved by the project may serve as a roadmap for others searching for best practices in the creation of persona profiles.
Introduction: Simulation has assumed an integral role in the Canadian healthcare system with applications in quality improvement, systems development, and medical education. High-quality simulation-based research (SBR) is required to ensure the effective and efficient use of this tool. This study sought to establish national SBR priorities and describe the barriers and facilitators of SBR in Emergency Medicine (EM) in Canada. Methods: Simulation leads (SLs) from all fourteen Canadian Departments or Divisions of EM associated with an adult FRCP-EM training program were invited to participate in three surveys and a final consensus meeting. The first survey documented active EM SBR projects. Rounds two and three established and ranked priorities for SBR and identified the perceived barriers and facilitators to SBR at each site. Surveys were completed by SLs at each participating institution, and priority research themes were reviewed by senior faculty for broad input and review. Results: Twenty SLs representing all 14 invited institutions participated in all three rounds of the study. Sixty active SBR projects were identified, an average of 4.3 per institution (range 0-17). Forty-nine priorities for SBR in Canada were defined and summarized into seven priority research themes; an additional theme was identified by the senior reviewing faculty. Forty-one barriers and 34 facilitators of SBR were identified and grouped by theme. Fourteen SLs representing 12 institutions attended the consensus meeting and vetted the final list of eight priority research themes for SBR in Canada: simulation in CBME, simulation for interdisciplinary and inter-professional learning, simulation for summative assessment, simulation for continuing professional development, national curricular development, best practices in simulation-based education, simulation-based education outcomes, and simulation as an investigative methodology. Conclusion: This study has summarized the current SBR activity in EM in Canada, as well as its perceived barriers and facilitators. We also provide a consensus on priority research themes in SBR in EM from the perspective of Canadian simulation leaders. This group of SLs has formed a national simulation-based research group which aims to address these identified priorities with multicenter collaborative studies.
Organic pig husbandry systems in Europe are diverse – ranging from indoor systems with a concrete outside run (IN) to outdoor systems all year round (OUT) and combinations of both on one farm (POUT). As this diversity has rarely been taken into account in research projects on organic pig production, the aim of this study was to assess and compare pig health, welfare and productivity in these three systems. Animal health and welfare were assessed using 22 animal-based measures, comprising 17 health, 3 productivity and 2 behavioural measures. These were collected in pregnant sows, weaners and fattening pigs through direct observations and from records within a cross-sectional study on 74 farms (IN: n = 34, POUT: n = 28, OUT: n = 12) in eight countries. Overall, the prevalence of several animal health and welfare issues was low (e.g. median 0% for pigs needing hospitalisation, shoulder lesions, ectoparasites; <5% for runts, tail lesions, conjunctivitis). Exceptions in particular systems were respiratory problems in weaners and fatteners (IN: 60.0%, 66.7%; POUT: 66.7%, 60.0%), weaning diarrhoea (IN: 25.0%), and short tails in fatteners (IN: 6.5%, POUT: 2.3%). Total suckling piglet losses (recorded over a period of 12 months per farm) were high in all three systems (IN: 21.3%; POUT: 21.6%; OUT: 19.2%). OUT had lower prevalences of respiratory problems, diarrhoea and lameness of sows. POUT farms in most cases kept sows outdoors but weaners and fatteners similarly to IN farms, which was reflected in the results for several health and welfare parameters. It can be concluded that European organic pigs kept in all three types of husbandry system showed a low prevalence of health and welfare problems as assessed by our methodology, but respiratory health and diarrhoea should be improved in weaners and fatteners kept indoors, and total piglet mortality in all systems. The results provide benchmarks for organic pig producers and organisations, which can be used in strategies to promote health and welfare improvement. Furthermore, future research should address the identified health and welfare issues (e.g. suckling piglet mortality, weaning diarrhoea), specifically considering the effects of husbandry systems.
The association between lifestyle and survival after colorectal cancer has received limited attention. The female sex hormone oestrogen has been associated with lower colorectal cancer risk and lower mortality after colorectal cancer. Phyto-oestrogens are plant compounds with a structure similar to that of oestrogen, and the main sources in Western populations are plant lignans. We investigated the association between the main lignan metabolite, enterolactone, and survival after colorectal cancer among participants in the Danish Diet, Cancer and Health cohort. We used prediagnosis plasma samples and lifestyle data, together with clinical data from the time of diagnosis, from 416 women and 537 men diagnosed with colorectal cancer. Enterolactone was measured in plasma using a liquid chromatography–tandem mass spectrometry (LC–MS/MS) method. Participants were followed from date of diagnosis until death or end of follow-up. During this time, 210 women and 325 men died (170 women and 215 men died due to colorectal cancer). The Cox proportional hazards model was used to estimate hazard ratios (HR) and 95 % CI. Higher enterolactone concentrations were associated with lower colorectal cancer-specific mortality among women (HR per doubling: 0·88, 95 % CI 0·80, 0·97, P=0·0123). For men, on the contrary, higher enterolactone concentrations were associated with higher colorectal cancer-specific mortality (HR per doubling: 1·10, 95 % CI 1·01, 1·21, P=0·0379). The use of antibiotics affects enterolactone production, and the associations between higher enterolactone and lower colorectal cancer-specific mortality were more pronounced among women who did not use antibiotics (analysis on a subset). Our results suggest that enterolactone is associated with lower risk of mortality among women, but the opposite association was found among men.
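An illustrative sketch of the survival analysis: a Cox proportional hazards model with log2-transformed enterolactone, so that the exponentiated coefficient is the hazard ratio per doubling of plasma concentration. File and column names are hypothetical; the published models were fitted separately for women and men and adjusted for additional covariates beyond the single one shown here.

```python
# Cox proportional hazards model with HR expressed per doubling of enterolactone.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("enterolactone_crc.csv")             # hypothetical analysis file
df["log2_enterolactone"] = np.log2(df["enterolactone_nmol_l"])

cph = CoxPHFitter()
cph.fit(
    df[["follow_up_years", "crc_death", "log2_enterolactone", "age_at_diagnosis"]],
    duration_col="follow_up_years",
    event_col="crc_death",                             # 1 = death from colorectal cancer
)
cph.print_summary()   # exp(coef) for log2_enterolactone = HR per doubling
```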
Unlike planktonic systems, reaction rates in biofilms are often limited by mass transport, which controls the rate of supply of contaminants into the biofilm matrix. To help understand this phenomenon, we investigated the potential of magnetic resonance imaging (MRI) to spatially quantify copper transport and fate in biofilms. For this initial study we utilized an artificial biofilm composed of a 50:50 mix of bacteria and agar. MRI successfully mapped Cu2+ uptake into the artificial biofilm by mapping T2 relaxation rates. A calibration protocol was used to convert T2 values into actual copper concentrations. Immobilization rates in the artificial biofilm were slow compared to the rapid equilibration of planktonic systems. Even after 36 h, the copper front had migrated only 3 mm into the artificial biofilm and at this distance from the copper source, concentrations were very low. This slow equilibration is a result of (1) the time it takes copper to diffuse over such distances and (2) the adsorption of copper onto cell surfaces, which further impedes copper diffusion. The success of this trial run indicates MRI could be used to quantitatively map heavy metal transport and immobilization in natural biofilms.
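A hedged sketch of a T2-to-concentration calibration of the kind described above. It assumes the standard linear paramagnetic relaxivity model R2 = R2_0 + r2·[Cu], where R2 = 1/T2; the calibration standards and numerical values below are made up for illustration and do not come from the study.

```python
# Convert measured T2 values into Cu2+ concentrations via a linear relaxivity calibration.
import numpy as np

# Calibration standards: known Cu2+ concentrations (mM) and measured T2 (s) -- illustrative only
cu_std = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
t2_std = np.array([1.80, 1.10, 0.80, 0.52, 0.30])

r2_std = 1.0 / t2_std
slope, intercept = np.polyfit(cu_std, r2_std, 1)   # r2 relaxivity (per mM) and baseline R2_0

def t2_to_copper(t2_map):
    """Convert a map of T2 values (s) into Cu2+ concentrations (mM)."""
    return (1.0 / np.asarray(t2_map) - intercept) / slope

print(t2_to_copper([1.5, 0.9, 0.4]))   # example voxel T2 values
```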
To identify predominant dietary patterns in four African populations and examine their association with obesity.
Design
Cross-sectional study.
Setting/Subjects
We used data from the Africa/Harvard School of Public Health Partnership for Cohort Research and Training (PaCT) pilot study established to investigate the feasibility of a multi-country longitudinal study of non-communicable chronic disease in sub-Saharan Africa. We applied principal component analysis to dietary intake data collected from an FFQ developed for PaCT to ascertain dietary patterns in Tanzania, South Africa, and peri-urban and rural Uganda. The sample consisted of 444 women and 294 men.
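A minimal sketch of the dietary-pattern derivation: principal component analysis on standardized FFQ food-group intakes, followed by inspection of the component loadings. File and food-group column names are hypothetical, and the pattern labels are assigned only after examining the loadings, as in the study.

```python
# Derive dietary patterns from FFQ food-group intakes with PCA.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

ffq = pd.read_csv("pact_ffq_intakes.csv")          # hypothetical food-group intake data
food_groups = ffq.drop(columns=["participant_id"])

X = StandardScaler().fit_transform(food_groups)
pca = PCA(n_components=2)                          # two retained patterns
scores = pca.fit_transform(X)                      # per-person pattern scores (used for tertiles)

loadings = pd.DataFrame(
    pca.components_.T,
    index=food_groups.columns,
    columns=["mixed_diet", "processed_diet"],      # labels assigned after inspecting loadings
)
print(loadings.sort_values("processed_diet", ascending=False).head(10))
```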
Results
We identified two dietary patterns: the Mixed Diet pattern characterized by high intakes of unprocessed foods such as vegetables and fresh fish, but also cold cuts and refined grains; and the Processed Diet pattern characterized by high intakes of salad dressing, cold cuts and sweets. Women in the highest tertile of the Processed Diet pattern score were 3·00 times more likely to be overweight (95 % CI 1·66, 5·45; prevalence=74 %) and 4·24 times more likely to be obese (95 % CI 2·23, 8·05; prevalence=44 %) than women in this pattern’s lowest tertile (both P<0·0001; prevalence=47 and 14 %, respectively). We found similarly strong associations in men. There was no association between the Mixed Diet pattern and overweight or obesity.
Conclusions
We identified two major dietary patterns in several African populations, a Mixed Diet pattern and a Processed Diet pattern. The Processed Diet pattern was associated with obesity.