Research shows initial COVID-19 lockdowns increased population mental distress. Yet, the mental health impact of repeated lockdowns in England remains unknown.
Aims
To: (a) explore changes in population mental health symptoms over the COVID-19 pandemic period (March 2020 to March 2021) in England, comparing this with trends from a decade before (2009–2019) as well as after (2021–2023); (b) compare the mental health impact of each of the three lockdowns in England with periods of eased restrictions, determining who was most affected; (c) examine the impact of demographics and distinct time periods on the prevalence of mental health symptoms.
Method
We conducted a secondary analysis of a national longitudinal cohort study, utilising data from Waves 1–13 of the UK Household Longitudinal Study and from Waves 1–9 of its COVID-19 Survey. Mental health was assessed using the 12-item General Health Questionnaire. Student's t-tests and logistic regressions were conducted.
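As a concrete illustration of the two analysis types named above, here is a minimal Python sketch, assuming a long-format extract of the survey data with illustrative column names (ghq12, caseness, age, female, lockdown); this is not the authors' code.

```python
# Minimal sketch of the analyses named above: a Student's t-test comparing
# GHQ-12 scores between periods, and a logistic regression of "caseness"
# (GHQ-12 above threshold) on demographic predictors.
# The file name and column names are illustrative assumptions.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

df = pd.read_csv("ukhls_waves.csv")  # hypothetical extract of the survey data

# Student's t-test: mean GHQ-12 score, lockdown vs. eased-restriction periods
t, p = stats.ttest_ind(df.loc[df.lockdown == 1, "ghq12"],
                       df.loc[df.lockdown == 0, "ghq12"])
print(f"t = {t:.2f}, p = {p:.4f}")

# Logistic regression: odds of scoring above the GHQ-12 caseness threshold
model = smf.logit("caseness ~ age + female + lockdown", data=df).fit()
print(model.summary())
```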
Results
There was a significant increase in the prevalence of self-reported mental health symptoms during England's pandemic period, encompassing three lockdowns, compared with the average of rates from the 10 years before. Rates of reported mental health symptoms did not differ significantly across the three lockdowns, but were significantly higher than pre-pandemic rates and declined when restrictions were eased. Rates from the end of the final lockdown to May 2023 remained elevated compared with pre-pandemic levels. Elevated symptoms were observed for women, people working from home, those with health conditions, individuals aged 30–45 years and those experiencing loneliness.
Conclusion
Repeated lockdowns in England had a substantial impact on mental health, indicating a need for ongoing mental health support.
Education can be viewed as a control theory problem in which students seek ongoing exogenous input—either through traditional classroom teaching or other alternative training resources—to minimize the discrepancies between their actual and target (reference) performance levels. Using illustrative data from n = 784 Dutch elementary school students as measured with the Math Garden, a web-based computer adaptive practice and monitoring system, we simulate and evaluate the outcomes of using off-line and finite memory linear quadratic controllers with constraints to forecast students' optimal training durations. By integrating population standards with each student's own latent change information, we demonstrate that adoption of the control theory-guided, person- and time-specific training dosages could yield increased training benefits at reduced costs compared with students' actual observed training durations and a fixed-duration training scheme. The control theory approach also outperforms a linear scheme that provides training recommendations based on observed scores in the presence of noise and missing data. Design-related issues, such as ways to determine the penalty cost of input administration and the size of the control horizon window, are addressed through a series of illustrative and empirically (Math Garden) motivated simulations.
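To make the control-theoretic machinery concrete, here is a minimal finite-horizon linear quadratic regulator in Python. The dynamics, penalty values, and horizon are placeholder assumptions, not the paper's calibrated model; the backward Riccati recursion yields time-varying feedback gains that map a performance deficit to a recommended training dosage.

```python
# Generic finite-horizon discrete LQR via backward Riccati recursion: choose
# training inputs u_t to drive performance x_t toward a reference while
# penalising input effort. A, B, Q, R and the horizon T are illustrative.
import numpy as np

A = np.array([[0.95]])   # performance persistence (scalar state)
B = np.array([[0.10]])   # effect of one unit of training input
Q = np.array([[1.0]])    # penalty on deviation from the reference level
R = np.array([[0.5]])    # penalty cost of administering input
T = 20                   # control horizon window

# Backward pass: P_T = Q, then P_t and feedback gains K_t
P = Q.copy()
gains = []
for _ in range(T):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)
    gains.append(K)
gains.reverse()

# Forward simulation from a deficit of 2 units below the reference (x = -2)
x = np.array([[-2.0]])
for K in gains:
    u = -K @ x           # recommended training dosage at this occasion
    x = A @ x + B @ u    # performance update
print(x[0, 0])           # near 0: the deficit is largely closed
```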
Cross-sectional studies have identified health risks associated with epigenetic aging. However, it is unclear whether these risks make epigenetic clocks ‘tick faster’ (i.e. accelerate biological aging). The current study examines concurrent and lagged within-person changes of a variety of health risks associated with epigenetic aging.
Methods
Individuals from the Great Smoky Mountains Study were followed from age 9 to 35 years. DNA methylation profiles were assessed from blood, at multiple timepoints (i.e. waves) for each individual. Health risks were psychiatric, lifestyle, and adversity factors. Concurrent (N = 539 individuals; 1029 assessments) and lagged (N = 380 individuals; 760 assessments) analyses were used to determine the link between health risks and epigenetic aging.
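The distinction drawn below between subject-level and wave-level associations can be illustrated with person-mean centering; the sketch assumes hypothetical long-format data and column names, not the study's actual pipeline.

```python
# Sketch of separating subject-level (between-person) from wave-level
# (within-person) variation via person-mean centering. File and column
# names (subject_id, bmi, epi_age_accel) are illustrative assumptions.
import pandas as pd

df = pd.read_csv("gsms_waves.csv")  # hypothetical data, one row per assessment

# Subject-level component: each person's mean BMI across waves
df["bmi_between"] = df.groupby("subject_id")["bmi"].transform("mean")
# Wave-level component: deviation from the person's own mean at each wave
df["bmi_within"] = df["bmi"] - df["bmi_between"]

# Correlate each component with epigenetic age acceleration
print(df[["bmi_between", "epi_age_accel"]].corr())  # subject-level association
print(df[["bmi_within", "epi_age_accel"]].corr())   # within-person association
```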
Results
Concurrent models showed that BMI (r = 0.15, P_FDR < 0.01) was significantly correlated with epigenetic aging at the subject level but not at the wave level. Lagged models demonstrated that depressive symptoms (b = 1.67 months per symptom, P_FDR = 0.02) in adolescence accelerated epigenetic aging in adulthood, even when models were fully adjusted for BMI, smoking, and cannabis and alcohol use.
Conclusions
Within persons, changes in health risks were not accompanied by concurrent changes in epigenetic aging, suggesting that these risks are unlikely to immediately 'accelerate' epigenetic aging. However, time-lagged analyses indicated that depressive symptoms in childhood/adolescence predicted epigenetic aging in adulthood. Together, the findings suggest that age-related biological embedding of depressive symptoms is not instant, but that it provides prognostic opportunities. Repeated measurements and longer follow-up times are needed to examine stable and dynamic contributions of childhood experiences to epigenetic aging across the lifespan.
Diagnostic criteria for major depressive disorder allow for heterogeneous symptom profiles but genetic analysis of major depressive symptoms has the potential to identify clinical and etiological subtypes. There are several challenges to integrating symptom data from genetically informative cohorts, such as sample size differences between clinical and community cohorts and various patterns of missing data.
Methods
We conducted genome-wide association studies of major depressive symptoms in three cohorts that were enriched for participants with a diagnosis of depression (Psychiatric Genomics Consortium, Australian Genetics of Depression Study, Generation Scotland) and three community cohorts that were not recruited on the basis of diagnosis (Avon Longitudinal Study of Parents and Children, Estonian Biobank, and UK Biobank). We fit a series of confirmatory factor models with factors that accounted for how symptom data were sampled and then compared alternative models with different symptom factors.
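As a point of reference, the confirmatory factor models compared here share the standard linear measurement structure below; the notation is generic rather than the authors' exact specification, with the measurement factor entering as one additional column of loadings.

```latex
% Generic CFA measurement model: y = vector of symptom indicators,
% eta = latent factors (e.g. general depression, Appetite/Weight, and a
% measurement factor for the community-cohort skip structure),
% Lambda = factor loadings, Psi = factor covariances,
% epsilon = residuals with diagonal covariance Theta.
\[
  \mathbf{y} = \boldsymbol{\Lambda}\boldsymbol{\eta} + \boldsymbol{\varepsilon},
  \qquad
  \operatorname{Cov}(\mathbf{y}) = \boldsymbol{\Lambda}\boldsymbol{\Psi}\boldsymbol{\Lambda}^{\top} + \boldsymbol{\Theta}
\]
```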
Results
The best-fitting model had a distinct factor for Appetite/Weight symptoms and an additional measurement factor that accounted for the skip-structure in community cohorts (use of Depression and Anhedonia as gating symptoms).
Conclusion
The results show the importance of assessing the directionality of symptoms (such as hypersomnia versus insomnia) and of accounting for study and measurement design when meta-analyzing genetic association data.
Auditory verbal hallucinations (AVHs) in schizophrenia have been suggested to arise from failure of corollary discharge mechanisms to correctly predict and suppress self-initiated inner speech. However, it is unclear whether such dysfunction is related to motor preparation of inner speech, during which sensorimotor predictions are formed. The contingent negative variation (CNV) is a slow, negative-going event-related potential that occurs prior to executing an action. A recent meta-analysis revealed a large effect for CNV blunting in schizophrenia. Given that inner speech, like overt speech, has been shown to be preceded by a CNV, the present study tested the notion that AVHs are associated with inner speech-specific motor preparation deficits.
Objectives
The present study aimed to provide a useful framework for directly testing the long-held idea that AVHs may be related to inner speech-specific CNV blunting in patients with schizophrenia. This may hold promise for a reliable biomarker of AVHs.
Methods
Hallucinating (n=52) and non-hallucinating (n=45) patients with schizophrenia, along with matched healthy controls (n=42), participated in a novel electroencephalographic (EEG) paradigm. In the Active condition, participants were asked to imagine a single phoneme at a cued moment while an auditory probe was presented at precisely the same time. In the Passive condition, they were asked to listen passively to the auditory probes. The amplitude of the CNV preceding the production of inner speech was examined.
Results
Healthy controls showed a larger CNV amplitude (p = .002, d = .50) in the Active compared to the Passive condition, replicating previous findings of a CNV preceding inner speech. However, neither patient group showed a difference between the two conditions (p > .05). Importantly, a repeated-measures ANOVA revealed a significant interaction effect (p = .007, ηp² = .05). Follow-up contrasts showed that healthy controls exhibited a larger CNV amplitude in the Active condition than both the hallucinating (p = .013, d = .52) and non-hallucinating patients (p < .001, d = .88). No difference was found between the two patient groups (p = .320, d = .20).
Conclusions
The results indicate that motor preparation of inner speech is disrupted in schizophrenia. While the production of inner speech resulted in a larger CNV than passive listening in healthy controls, indicative of the involvement of motor planning, patients exhibited markedly blunted motor preparatory activity to inner speech. This may reflect dysfunction in the formation of corollary discharges. Interestingly, the deficits did not differ between hallucinating and non-hallucinating patients. Future work is needed to elucidate how specific these inner speech-related motor preparation deficits are to AVHs. Overall, this study provides evidence in support of atypical inner speech monitoring in schizophrenia.
The Korean Basketball League (KBL) holds an annual draft in which teams select new players, mostly graduates of elite college basketball teams, though some come from high school teams. Many factors can influence an athlete's success. In addition to excellent physical and technical attributes, success in sport is also shaped by psychological factors. Several studies have reported that elite athletes can control their anxiety during competition, which may lead to better performance. In particular, the temperament and character of players have been regarded as crucial determinants of a player's performance and goals. In this regard, numerous studies suggest that personality is an important predictor of long-term success in professional sports.
Objectives
Based on previous reports and studies, we hypothesized that the physical status, temperament and character, and neurocognitive functions of basketball players would predict the result of KBL draft selection. In particular, we expected temperament and character to be associated with the result of KBL draft selection, and basketball performance, including average points and average rebounds, to be associated with emotional perception and mental rotation.
Methods
We recruited 44 elite college basketball players (KBL selection, n=17; non-KBL selection, n=27) and 35 age-matched healthy comparison subjects majoring in sports education in college. All participants were assessed with the Temperament and Character Inventory (TCI), Sport Anxiety Scale (SAS), Beck Depression Inventory (BDI), Perceived Stress Scale (PSS-10), Trail Making Test (TMT), and a Computerized Neurocognitive Test (CNT) for emotional perception and mental rotation.
Results
The results showed that the physical status, temperament and character, and neurocognitive functions of college basketball players could predict KBL draft selection. Among the temperament and character dimensions, novelty seeking and reward dependence were associated with KBL draft selection. Basketball performance, including average points and average rebounds, was associated with emotional perception and mental rotation.
Conclusions
These findings confirm that temperament and neurocognitive function are closely related to long-term success as a basketball player. Furthermore, these results may serve as baseline data for identifying potential professional basketball players.
Background: High-grade gliomas (HGG) present challenges with short post-surgery survival and high progression rates. Extracellular vesicles (EVs) in the tumor microenvironment (TME) contribute to a pro-tumorigenic setting. Investigating transfer RNA fragments (tfRNA) in plasma EVs of HGG patients may reveal potential biomarkers and therapeutic targets. This study examines tfRNA in 10 HGG patients at diagnosis, offering insights into the molecular landscape for improved diagnostic and management strategies. Methods: Plasma samples were collected from HGG patients and controls. EVs were isolated from these samples and subsequently analyzed for tfRNA. Results: Analysis of plasma EVs highlighted distinct differences in tfRNA fragments between HGG and control samples. HGG EVs showed a global reduction in tRNA content, higher proportions of 5' tfRNA, and increased nuclear tfRNA compared to controls. A notable biological marker, elevated in HGG, holds potential as a diagnostic indicator. Conclusions: Our study concludes that HGG demonstrate a global reduction in tfRNA content in plasma extracellular vesicles compared to non-cancer controls, echoing findings in other cancers. Despite this, specific tfRNA molecules in HGG show significant differential expression or sorting into EVs, indicating their potential as future biomarkers or therapeutic targets.
Background: Sedation in the PICU masks physical examination findings, leading to diagnostic challenges. In adult models, electroencephalography can evaluate the brain's response to sedation using feedforward connectivity and anteriorization of alpha hubs, proving useful for prognostication. We assessed the feasibility of translating this model to a pediatric population, with the hypothesis that the same markers of adaptive reconfiguration would correlate with a higher potential for recovering consciousness. Methods: Electroencephalograms from children undergoing sedation were analyzed for strength and direction of functional connectivity using the weighted and directed phase lag index. The target population was refined using iterative inclusion criteria. We examined relationships between hub location reconfiguration, directed phase lag index, baseline Glasgow Coma Scale, and 3-month post-treatment Glasgow Outcome Scale-Extended. Results: Evaluation of 14 subjects showed promise in children aged 5–18 years undergoing sedation with midazolam, dexmedetomidine, and propofol. Further analysis of five subjects revealed a correlation between adaptive reconfiguration during anesthesia and both higher baseline Glasgow Coma Scale and Glasgow Outcome Scale-Extended scores post-treatment. Conclusions: The findings indicate that the functional brain network connectivity model may have diagnostic and prognostic potential regarding children's consciousness levels. While the initial data are promising, further analysis of six additional cases is pending and deemed essential to thoroughly evaluate the model's efficacy.
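For readers unfamiliar with the connectivity metric, here is a minimal sketch of the weighted phase lag index between two channels, following the standard formulation (ratio of the magnitude of the mean imaginary cross-spectrum to the mean magnitude of its imaginary part); the frequency band, sampling rate, and signals are illustrative assumptions, not the study's pipeline.

```python
# Minimal weighted phase lag index (wPLI) between two EEG channels:
# |E[imag(Sxy)]| / E[|imag(Sxy)|], where Sxy is the cross-spectrum of the
# band-limited analytic signals. Band edges and test signals are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def wpli(x, y, fs, band=(8.0, 13.0)):
    """wPLI of two 1-D signals in a given frequency band (default: alpha)."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    zx, zy = hilbert(filtfilt(b, a, x)), hilbert(filtfilt(b, a, y))
    im = np.imag(zx * np.conj(zy))          # imaginary part of cross-spectrum
    return np.abs(im.mean()) / np.abs(im).mean()

fs = 256                                     # sampling rate (Hz), assumed
t = np.arange(fs * 10) / fs
x = np.sin(2 * np.pi * 10 * t) + np.random.randn(t.size)
y = np.sin(2 * np.pi * 10 * t + 0.5) + np.random.randn(t.size)  # lagged copy
print(wpli(x, y, fs))                        # high for a consistent phase lag
```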
Early neurodevelopmental deviations, such as abnormal cortical folding patterns, are a candidate biomarker for major depressive disorder (MDD). Previous studies on patterns of abnormal cortical gyrification in MDD have provided valuable insights; however, the findings on cortical folding are controversial.
Objectives
We aimed to investigate the association of MDD with the local gyrification index (LGI) in each cortical region at the whole-brain level and the association of the LGI with clinical characteristics of MDD, including recurrence, remission status, illness duration, severity of depression, and medication status of patients with MDD.
Methods
We obtained T1-weighted images of 234 patients with MDD and 215 healthy controls (HCs). LGI values were automatically calculated using the FreeSurfer software according to the Desikan–Killiany atlas. LGI values from 66 cortical regions in the bilateral hemispheres were analyzed. We compared LGI values between the MDD and HC groups using analysis of covariance, including age, sex, and years of education as covariates. The association between clinical characteristics and LGI values was investigated in the MDD group.
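The group comparison described above amounts to one ANCOVA per cortical region; a minimal Python sketch follows, with hypothetical file and column names, and with one such model fit per region before multiplicity correction.

```python
# Sketch of the ANCOVA described above: one region's LGI regressed on
# diagnostic group with age, sex, and education as covariates. Column names
# are illustrative; in practice one model is fit per region (66 total) and
# p-values are corrected across regions.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("lgi_by_region.csv")  # hypothetical FreeSurfer LGI extract

model = smf.ols("lgi_pars_triangularis ~ group + age + sex + education",
                data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # ANCOVA table: effect of group on LGI
```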
Results
Compared with HCs, patients with MDD showed significantly decreased LGI values in the cortical regions, including the bilateral ventrolateral and dorsolateral prefrontal cortices, medial and lateral orbitofrontal cortices, insula, right rostral anterior cingulate cortex, and several temporal and parietal regions, with the highest effect size in the left pars triangularis (Cohen’s f = 0.361; P = 1.78 × 10⁻¹³). As for the association of clinical characteristics with LGIs within the MDD group, recurrence and longer illness duration of MDD were associated with increased gyrification in several occipital and temporal regions, which showed no significant difference in LGIs between MDD and HC groups.
Conclusions
Considering that the aforementioned cortical regions are involved in emotion regulation, abnormal cortical folding patterns in such regions may be associated with the dysfunction of emotion regulation-related neural circuits, which may lead to MDD. These findings suggest that LGI may be a relatively stable neuroimaging marker associated with the trait of MDD predisposition.
The Developmental Origins of Health and Disease (DOHaD), also termed developmental programming, refers to adaptations during development that predispose an individual or a population towards later-life noncommunicable disease (NCD) conditions or chronic diseases. The developmental trajectory of an individual is determined broadly by the interaction between that individual's genes and the environment. "Environment" in this sense includes maternal and paternal factors: nutritional status before or during pregnancy; stress and exposure to contaminants, drugs or alcohol; maternal diseases of pregnancy that influence transport of substrates and nutrients across the placenta (e.g., preeclampsia, placental insufficiency); and pre-term birth. These factors occur before or around the time of conception, during gestation or in the period after birth: the so-called First 1,000 Days. The interactions between genes and environment determine not only the developmental processes of the fetus and placenta leading to short-term morbidity (low birth weight) and mortality, but also long-term morbidity of multiple systems, including neurodevelopmental disorders such as learning difficulties, poor developmental and cognitive trajectories, mental health and behavioral disorders in children, and metabolic disorders such as obesity and diabetes. In later life, developmental programming contributes to cardiovascular disease such as hypertension and coronary heart disease, type II diabetes, obesity, and immune, behavioral, and neurological disorders.
The mechanisms underlying developmental programming can result from structural changes in tissues or organs, effects on germ cells or stem cells, alterations in the microbiome, or in core inflammatory and immunological processes. There are clear sex differences in these responses, strong intergenerational effects, and variable vulnerability across the life course. Many adjustments occur as adaptive fetal responses to adversity or stress, such as hypoxemia or inappropriate nutrient supply, to ensure survival. The placenta plays a critical role in developmental programming, both in regulating the impact of maternal influences on the fetus, and through its direct impact on fetal development.
This study aimed to evaluate the association between coronavirus disease 2019 infection and olfactory and taste dysfunction in patients presenting to the out-patient department with influenza-like illness who underwent reverse transcription polymerase chain reaction testing for coronavirus, and to determine the sensitivity, specificity, and positive and negative predictive values of olfactory and taste dysfunction and other symptoms in these patients.
Methods
Patients presenting with influenza-like illness to the study centre in September 2020 were included in the study. The symptoms of patients who tested positive for coronavirus on reverse transcription polymerase chain reaction testing were compared with those of patients with negative test results.
Results
During the study period, 909 patients, aged 12–70 years, presented with influenza-like illness; of these, 316 (34.8 per cent) tested positive for coronavirus. Only the symptoms of olfactory and taste dysfunction were significantly more common in patients testing positive for coronavirus than in those testing negative.
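For reference, the diagnostic values named in the aims are computed from a two-by-two table of symptom status against test result; the counts in the sketch below are placeholders, not the study's data.

```python
# Sensitivity, specificity, PPV and NPV from a 2x2 table of symptom presence
# vs. RT-PCR result. The counts passed in are illustrative placeholders.
def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # symptomatic among PCR-positives
        "specificity": tn / (tn + fp),   # asymptomatic among PCR-negatives
        "ppv": tp / (tp + fp),           # PCR-positive among symptomatic
        "npv": tn / (tn + fn),           # PCR-negative among asymptomatic
    }

print(diagnostic_metrics(tp=120, fp=40, fn=196, tn=553))  # made-up counts
```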
Conclusion
During the pandemic, patients presenting to the out-patient department with sudden loss of sense of smell or taste may be considered as positive for coronavirus disease 2019, until proven otherwise.
The optimal preoperative therapy regimen for resectable retroperitoneal sarcoma (RPS) remains unclear. This study compares the impact of preoperative radiation, chemoradiation and chemotherapy on overall survival (OS) in RPS patients.
Materials and Methods:
The National Cancer Database (NCDB) was queried for patients with non-metastatic, resectable RPS (2006–15). The primary endpoint was OS, evaluated by Kaplan–Meier method, log-rank test, Cox multivariable analysis and propensity score matching.
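The survival workflow shared by the NCDB analyses in this section (Kaplan–Meier curves, log-rank test, multivariable Cox model) can be sketched as follows in Python with the lifelines library, assuming hypothetical column names and a binary 0/1 treatment indicator; this is not the study's code.

```python
# Sketch of the NCDB survival workflow: Kaplan-Meier by arm with a log-rank
# test, then a multivariable Cox model. File and column names (months, death,
# arm, age) are illustrative; arm is assumed coded 0/1.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("ncdb_rps.csv")  # hypothetical NCDB extract

# Kaplan-Meier curves by treatment arm, compared with a log-rank test
groups = [g for _, g in df.groupby("arm")]
for g in groups:
    KaplanMeierFitter().fit(g["months"], g["death"]).plot_survival_function()
print(logrank_test(groups[0]["months"], groups[1]["months"],
                   event_observed_A=groups[0]["death"],
                   event_observed_B=groups[1]["death"]).p_value)

# Multivariable Cox proportional hazards model
cph = CoxPHFitter().fit(df[["months", "death", "arm", "age"]],
                        duration_col="months", event_col="death")
cph.print_summary()  # hazard ratios (exp(coef)) with 95% CIs
```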
Results:
A total of 1,253 patients met the inclusion criteria, with 210 patients (17%) receiving chemoradiation, 850 patients (68%) receiving radiation and 193 patients (15%) receiving chemotherapy. On Cox multivariable analysis, when compared with preoperative chemoradiation, preoperative radiation was not associated with improved OS (hazard ratio [HR] 0·98, 95% CI 0·76–1·25, p = 0·84), while preoperative chemotherapy was associated with worse OS (HR 1·64, 95% CI 1·24–2·18, p < 0·001). Similar findings were observed in 199 and 128 matched pairs for preoperative radiation and chemotherapy, respectively, when compared with preoperative chemoradiation.
Findings:
Our study suggested an OS benefit in using preoperative chemoradiation compared to chemotherapy alone, but OS outcomes were comparable between preoperative chemoradiation and radiation alone.
Ventilator-capable skilled nursing facilities (vSNFs) are critical to the epidemiology and control of antibiotic-resistant organisms. During an infection prevention intervention to control carbapenem-resistant Enterobacterales (CRE), we conducted a qualitative study to characterize vSNF healthcare personnel beliefs and experiences regarding infection control measures.
Design:
A qualitative study involving semistructured interviews.
Setting:
One vSNF in the Chicago, Illinois, metropolitan region.
Participants:
The study included 17 healthcare personnel representing management, nursing, and nursing assistants.
Methods:
We used face-to-face, semistructured interviews to measure healthcare personnel experiences with infection control measures at the midpoint of a 2-year quality improvement project.
Results:
Healthcare personnel characterized their facility as a home-like environment, yet they recognized that it is a setting where germs were ‘invisible’ and potentially ‘threatening.’ Healthcare personnel described elaborate self-protection measures to avoid acquisition or transfer of germs to their own household. Healthcare personnel were motivated to implement infection control measures to protect residents, but many identified structural barriers such as understaffing and time constraints, and some reported persistent preference for soap and water.
Conclusions:
Healthcare personnel in vSNFs, from management to frontline staff, understood germ theory and the significance of multidrug-resistant organism transmission. However, their ability to implement infection control measures was hampered by resource limitations and mixed beliefs regarding the effectiveness of infection control measures. Self-protection from acquiring multidrug-resistant organisms was a strong motivator for healthcare personnel both outside and inside the workplace, and it could explain variation in adherence to infection control measures such as a higher hand hygiene adherence after resident care than before resident care.
Induction chemotherapy (iC) followed by concurrent chemoradiation has been shown to improve overall survival (OS) for locally advanced pancreatic cancer (LAPC). However, the survival benefit of stereotactic body radiation therapy (SBRT) versus conventionally fractionated radiation therapy (CFRT) following iC remains unclear.
Materials and methods:
The National Cancer Database (NCDB) was queried for primary stage III, cT4N0-1M0 LAPC (2004–15). Kaplan–Meier analysis, Cox proportional hazards method and propensity score matching were used.
Results:
Among 872 patients, 738 underwent CFRT and 134 received SBRT. Median follow-up was 24·3 and 22·9 months for the CFRT and SBRT cohorts, respectively. The use of SBRT showed improved survival compared with CFRT in both the multivariable analysis (hazard ratio 0·78, p = 0·025) and in 120 propensity-matched pairs (median OS 18·1 versus 15·9 months, p = 0·004).
Findings:
This NCDB analysis suggests a survival benefit with the use of SBRT versus CFRT following iC for LAPC.
This National Cancer Database (NCDB) analysis was performed to evaluate the outcomes of adjuvant chemotherapy (AC) versus observation for resected pancreatic adenocarcinoma treated with neoadjuvant therapy (NT).
Materials and methods:
The NCDB was queried for primary stages I–II cT1-3N0-1M0 resected pancreatic adenocarcinoma treated with NT (2004–2015). Baseline patient, tumour and treatment characteristics were extracted. The primary end point was overall survival (OS). With a 6-month conditional landmark, Kaplan–Meier analysis, multivariable Cox proportional hazards method and 1:1 propensity score matching were used to analyse the data.
Results:
A total of 1,737 eligible patients were identified, of whom 1,247 underwent post-operative observation and 490 received AC. The overall median follow-up was 34·7 months. The addition of AC was associated with improved survival on multivariable analysis (HR 0·78, p < 0·001). AC remained significantly associated with improved OS, with a median OS of 26·3 versus 22·3 months and a 2-year OS of 63·9% versus 52·9% for the observation cohort (p < 0·001). Treatment interaction analysis showed an OS benefit of AC for patients with smaller tumours.
Findings:
Our findings suggest a survival benefit for AC compared to observation following NT and surgery for resectable pancreatic adenocarcinoma, especially in patients with smaller tumours.
Hereditary transthyretin-mediated (hATTR) amyloidosis is a progressive disease caused by mutations in the TTR gene, leading to multisystem organ dysfunction. Pathogenic TTR aggregation, misfolding, and fibrillization lead to deposition of amyloid in multiple organs and frequently involve the peripheral nervous system and the heart. Common neurologic manifestations include sensorimotor polyneuropathy (PN), autonomic neuropathy, small-fiber PN, and carpal tunnel syndrome. Many patients experience significant progression due to diagnostic delays, as hATTR PN is often not considered within the differential diagnosis. Recently, two effective novel disease-modifying therapies, inotersen and patisiran, were approved by Health Canada for the treatment of hATTR PN. Early diagnosis is crucial for the timely introduction of these disease-modifying treatments, which reduce impairment, improve quality of life, and extend survival. In this guideline, we aim to improve awareness and outcomes of hATTR PN by making recommendations for diagnosis, monitoring, and treatment in Canada.
Arctic mining has a bad reputation because the extractive industry is often responsible for a suite of environmental problems. Yet few studies explore the gap between untouched tundra and messy megaproject from a historical perspective. Our paper focuses on Advent City as a case study of the emergence of coal mining in Svalbard (Norway), coupled with the onset of mining-related environmental change. After short but intensive human activity (1904–1908), the ecosystem had a century to respond, and we observe a lasting impact on the flora in particular. With interdisciplinary contributions from historical archaeology, archaeozoology, archaeobotany and botany, supplemented by stable isotope analysis, we examine 1) which human activities initially exerted pressure on the Arctic environment, 2) whether the miners at Advent City were "eco-conscious," that is, whether they showed concern for the environment, and 3) how the local ecosystem reacted after mine closure and site abandonment. Among the remains of typical mining infrastructure, we prioritised localities that revealed the subtleties of long-term anthropogenic impact. Significant pressure resulted from landscape modifications, the import of non-native animals and plants, hunting and fowling, and the indiscriminate disposal of waste material. Where it was possible to identify individual inhabitants, these shared an economic attitude of 'waste not, want not', but they did not hold the environment in high regard. Ground clearances, animal dung and waste dumps continue to have an effect after a hundred years. The anthropogenic interference with the fell field led to habitat creation, especially for vascular plants. The vegetation cover and biodiversity were high, but we recorded no exotic or threatened plant species. Impacted localities generally showed a reduction of the natural patchiness of plant communities, and highly eutrophic conditions were unsuitable for liverworts and lichens. Supplementary isotopic analysis of animal bones added data on the marine reservoir offset in Svalbard, underlining the far-reaching potential of our multi-proxy approach. We conclude that although damaging human–environment interactions formerly took place at Advent City, these were limited and primarily left the visual impact of the ruins. The fell field is such a dynamic area that the subtle anthropogenic effects on the local tundra may soon be lost. The fauna and flora may not recover to what they were before the miners arrived, but they will continue to respond to new post-industrial circumstances.
To investigate the association between parity and the risk of incident dementia in women.
Methods
We pooled baseline and follow-up data for community-dwelling women aged 60 or older from six population-based, prospective cohort studies from four European and two Asian countries. We investigated the association between parity and incident dementia using Cox proportional hazards regression models adjusted for age, educational level, hypertension, diabetes mellitus and cohort, with additional analysis by dementia subtype (Alzheimer dementia (AD) and non-Alzheimer dementia (NAD)).
Results
Of 9756 women dementia-free at baseline, 7010 completed one or more follow-up assessments. The mean follow-up duration was 5.4 ± 3.1 years and dementia developed in 550 participants. The number of parities was associated with the risk of incident dementia (hazard ratio (HR) = 1.07, 95% confidence interval (CI) = 1.02–1.13). Grand multiparity (five or more parities) increased the risk of dementia by 30% compared to 1–4 parities (HR = 1.30, 95% CI = 1.02–1.67). The risk of NAD increased by 12% for every parity (HR = 1.12, 95% CI = 1.02–1.23) and by 60% for grand multiparity (HR = 1.60, 95% CI = 1.00–2.55), but the risk of AD was not significantly associated with parity.
Conclusions
Grand multiparity is a significant risk factor for dementia in women. This may have particularly important implications for women in low and middle-income countries where the fertility rate and prevalence of grand multiparity are high.
Recently, we found that in ovo feeding of l-leucine (l-Leu) afforded thermotolerance, stimulated lipid metabolism and modified amino acid metabolism in male broiler chicks. However, the effects of in ovo feeding of l-Leu on thermoregulation and growth performance until the marketing age of broilers remain unknown. In this study, we investigated the effects of in ovo feeding of l-Leu on body weight (BW) gain under a control thermoneutral temperature or chronic heat stress. We measured changes in body temperature, food intake and organ weight, as well as amino acid metabolism and plasma metabolites, under acute and chronic heat stress in broilers. A total of 168 fertilized Chunky broiler eggs were randomly divided into two treatment groups. The eggs were in ovo fed with l-Leu (34.5 µmol/500 µl per egg) or sterile water (500 µl/egg) during incubation. After hatching, male broilers were selected and assigned to seven to nine replicates (one bird/replicate) in each group for heat challenge experiments. Broilers (29- or 30-day-old) were exposed to acute heat stress (30 ± 1°C) for 120 min or to chronic cyclic and continued heat stress (over 30 ± 1°C; ages 15 to 44 days). In ovo feeding of l-Leu significantly suppressed the rise in body temperature under acute heat stress without affecting food intake, plasma triacylglycerol, non-esterified fatty acids, ketone bodies, glucose, lactic acid or thyroid hormones. Daily body temperature was significantly increased by l-Leu in ovo feeding under chronic heat stress. Interestingly, in ovo feeding of l-Leu resulted in significantly higher daily BW gain compared with the control group under chronic heat stress. Moreover, some essential amino acids, including Leu and isoleucine, were significantly increased in the liver and decreased in the plasma by l-Leu in ovo feeding under acute heat stress. These results suggest that l-Leu in ovo feeding afforded thermotolerance to broilers under acute heat stress, mainly through changes in amino acid metabolism, with effects lasting until marketing age.
An experiment was conducted to test the hypothesis that meat products have digestible indispensable amino acid scores (DIAAS) >100 and that various processing methods will increase standardised ileal digestibility (SID) of amino acids (AA) and DIAAS. Nine ileal-cannulated gilts were randomly allotted to a 9 × 8 Youden square design with nine diets and eight 7-d periods. Values for SID of AA and DIAAS for two reference patterns were calculated for salami, bologna, beef jerky, raw ground beef, cooked ground beef and ribeye roast heated to 56, 64 or 72°C. The SID of most AA was not different among salami, bologna, beef jerky and cooked ground beef, but was less (P < 0·05) than the values for raw ground beef. The SID of AA for 56°C ribeye roast was not different from the values for raw ground beef and 72°C ribeye roast, but greater (P < 0·05) than those for 64°C ribeye roast. For older children, adolescents and adults, the DIAAS for all proteins, except cooked ground beef, were >100 and bologna and 64°C ribeye roast had the greatest (P < 0·05) DIAAS. The limiting AA for this age group were sulphur AA (beef jerky), leucine (bologna, raw ground beef and cooked ground beef) and valine (salami and the three ribeye roasts). In conclusion, meat products generally provide high-quality protein with DIAAS >100 regardless of processing. However, overcooking meat may reduce AA digestibility and DIAAS.
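The DIAAS calculation behind these values follows the standard FAO definition: for each indispensable amino acid, the digestible amino acid content of the food protein is divided by the content of that amino acid in the age-appropriate reference pattern, and the lowest ratio names the score and the limiting amino acid. The sketch below uses made-up amino acid and digestibility values, not the paper's data.

```python
# DIAAS per the standard FAO definition: for each indispensable amino acid,
# digestible content (mg/g protein = AA content x SID) divided by the same
# AA in the reference pattern, x100; the score is the lowest ratio, which
# also identifies the limiting AA. All values below are illustrative.
aa_mg_per_g_protein = {"leucine": 78.0, "valine": 48.0, "sulphur_aa": 35.0}
sid = {"leucine": 0.92, "valine": 0.88, "sulphur_aa": 0.90}        # digestibility
reference = {"leucine": 66.0, "valine": 43.0, "sulphur_aa": 23.0}  # illustrative pattern

ratios = {aa: 100 * aa_mg_per_g_protein[aa] * sid[aa] / reference[aa]
          for aa in reference}
limiting_aa = min(ratios, key=ratios.get)
print(f"DIAAS = {ratios[limiting_aa]:.0f}, limiting AA: {limiting_aa}")
```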