Approximately 60 million individuals worldwide are currently living with dementia. As the median age of the world’s population rises, the number of dementia cases is expected to increase markedly, affecting ∼150 million individuals by 2050. This will create a huge and unsustainable economic and social burden across the globe. Although promising pharmacological treatment options for Alzheimer’s disease – the most common cause of dementia – are starting to emerge, dementia prevention and risk reduction remain vital. In this review, we present evidence from large-scale epidemiological studies and randomised controlled trials to indicate that adherence to healthy dietary patterns could improve cognitive function and lower dementia risk. We outline potential systemic (e.g. improved cardiometabolic health, lower inflammation, modified gut microbiome composition/metabolism, slower pace of aging) and brain-specific (e.g. lower amyloid-β load, reduced brain atrophy and preserved cerebral microstructure and energetics) mechanisms of action. We also explore current gaps in our knowledge and outline potential directions for future research in this area. Our aim is to provide an update on the current state of knowledge, and to galvanise research on this important topic.
The concept of the protein transition represents a shift from a diet rich in animal proteins to one richer in plant-based alternatives, largely in response to environmental sustainability concerns. However, simply swapping dairy protein for plant protein will lead to lower protein quality and a lower intake of key micronutrients that sit naturally within the dairy matrix. Owing to antagonistic effects within the plant food matrix, micronutrients in plant sources exhibit lower bioavailability, which is not reflected in food composition data or dietary guidelines. The dairy matrix effect includes moderation of blood lipid levels, in which calcium plays a key role. Protein recommendations often take a muscle-centric approach. Hence, strategies to increase the anabolic potential of plant proteins have focused on increasing total protein intake to counter the suboptimal amino acid composition relative to dairy protein, or on leucine fortification. However, emerging evidence indicates a role for nutrient interactions and non-nutrient components (milk exosomes, bioactive peptides) of the dairy matrix in modulating postprandial muscle protein synthesis rates. To ensure the food system transformation is environmentally sustainable and optimal from a nutrition perspective, consideration needs to be given to the complementary benefits of different food matrices and the holistic evaluation of foods in the protein transition. This narrative review critically examines the role of dairy in the protein transition, emphasising the importance of the food matrix in nutrient bioavailability and muscle health. By considering both nutritional and sustainability perspectives, we provide a holistic evaluation of dairy’s contribution within evolving dietary patterns.
In major depressive disorder (MDD), only ~35% of patients achieve remission after first-line antidepressant therapy. Using UK Biobank data, we identify sociodemographic, clinical, and genetic predictors of antidepressant response through self-reported outcomes, aiming to inform personalized treatment strategies.
Methods
In UK Biobank Mental Health Questionnaire 2, participants with MDD reported whether specific antidepressants helped them. We tested whether retrospective lifetime response to four selective serotonin reuptake inhibitors (SSRIs) (N = 19,516) – citalopram (N = 8335), fluoxetine (N = 8476), paroxetine (N = 2297) and sertraline (N = 5883) – was associated with sociodemographic (e.g. age, gender) and clinical factors (e.g. episode duration). Genetic analyses evaluated the association between CYP2C19 variation and self-reported response, while polygenic score (PGS) analysis assessed whether genetic predisposition to psychiatric disorders and antidepressant response predicted self-reported SSRI outcomes.
Results
71%–77% of participants reported positive responses to SSRIs. Non-response was significantly associated with alcohol and illicit drug use (OR = 1.59, p = 2.23 × 10⁻²⁰), male gender (OR = 1.25, p = 8.29 × 10⁻⁸), and lower income (OR = 1.35, p = 4.22 × 10⁻⁷). The worst episode lasting over 2 years (OR = 1.93, p = 3.87 × 10⁻¹⁶) and no mood improvement from positive events (OR = 1.35, p = 2.37 × 10⁻⁷) were also associated with non-response. CYP2C19 poor metabolizers had nominally higher non-response rates (OR = 1.31, p = 1.77 × 10⁻²). Higher PGS for depression (OR = 1.08, p = 3.37 × 10⁻⁵) predicted negative SSRI outcomes after multiple testing corrections.
Conclusions
Self-reported antidepressant response in the UK Biobank is influenced by sociodemographic, clinical, and genetic factors, mirroring clinical response measures. While positive outcomes are more frequent than remission reported in clinical trials, these self-reports replicate known treatment associations, suggesting they capture meaningful aspects of antidepressant effectiveness from the patient’s perspective.
Loneliness has become a major public health issue in recent decades owing to its severe impact on health and mortality. Little is known about the relationship between loneliness and social anxiety. This study aimed (1) to explore levels of loneliness and social anxiety in the general population, and (2) to assess whether and how loneliness affects symptoms of social anxiety and vice versa over a period of five years.
Methods
The study combined data from the baseline assessment and the five-year follow-up of the population-based Gutenberg Health Study. Data of N = 15 010 participants at baseline (Mage = 55.01, s.d.age = 11.10) were analyzed. Multiple regression analyses predicting loneliness and symptoms of social anxiety at follow-up from sociodemographic factors, physical illnesses, and mental health indicators at baseline were used to identify relevant covariates. Effects of loneliness on symptoms of social anxiety over five years, and vice versa, were analyzed by autoregressive cross-lagged structural equation models.
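The autoregressive cross-lagged structure used here can be sketched generically as follows (our notation for illustration, not the study's own specification), with each construct at follow-up regressed on both constructs at baseline plus covariates:

```latex
\begin{aligned}
\mathrm{SA}_{t_2} &= \beta_{1}\,\mathrm{SA}_{t_1} + \gamma_{1}\,\mathrm{L}_{t_1} + \mathbf{c}^{\top}\mathbf{X}_{t_1} + \varepsilon_{1},\\
\mathrm{L}_{t_2}  &= \beta_{2}\,\mathrm{L}_{t_1} + \gamma_{2}\,\mathrm{SA}_{t_1} + \mathbf{d}^{\top}\mathbf{X}_{t_1} + \varepsilon_{2},
\end{aligned}
```

where SA denotes symptoms of social anxiety, L loneliness, X the baseline covariates, β the autoregressive (stability) paths, and γ the cross-lagged paths of interest; the reported finding corresponds to a significant γ₂ but a non-significant γ₁.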
Results
At baseline, 1076 participants (7.41%) showed symptoms of social anxiety and 1537 participants (10.48%) reported feelings of loneliness. Controlling for relevant covariates, symptoms of social anxiety had a small significant effect on loneliness five years later (standardized estimate of 0.164, p < 0.001). Conversely, there was no significant effect of loneliness on symptoms of social anxiety when relevant covariates were taken into account.
Conclusions
Findings provided evidence that symptoms of social anxiety are predictive of loneliness. Thus, prevention and intervention efforts for loneliness need to address symptoms of social anxiety.
In response to the COVID-19 pandemic, we rapidly implemented a plasma coordination center within two months to support transfusions for two outpatient randomized controlled trials. The center's design was based on an investigational drug services model and a Food and Drug Administration-compliant database to manage blood product inventory and trial safety.
Methods:
A core investigational team adapted a cloud-based platform to randomize patient assignments and track inventory distribution of control plasma and high-titer COVID-19 convalescent plasma of different blood groups from 29 donor collection centers directly to blood banks serving 26 transfusion sites.
Results:
We performed 1,351 transfusions in 16 months. The transparency of the digital inventory at each site was critical to facilitate qualification, randomization, and overnight shipments of blood group-compatible plasma for transfusions into trial participants. While inventory challenges were heightened with COVID-19 convalescent plasma, the cloud-based system and the flexible approach of the plasma coordination center staff across the blood bank network enabled decentralized procurement and distribution of investigational products to maintain inventory thresholds and overcome local supply chain constraints at the sites.
Conclusion:
The rapid creation of a plasma coordination center for outpatient transfusions is infrequent in the academic setting. Distributing more than 3,100 plasma units to blood banks charged with managing investigational inventory across the U.S. in a decentralized manner posed operational and regulatory challenges while providing opportunities for the plasma coordination center to contribute to research of global importance. This program can serve as a template in subsequent public health emergencies.
Effective regulation is essential for preventing the establishment of new invasive plants and managing the environmental, social, and economic impacts of those already established. Invasive plants are regulated by jurisdictions at a mix of local, regional, national, and international levels. Enhanced coordination of policy and regulations has been identified as a key strategy for addressing the impacts of invasive species; however, coordination between jurisdictions, and even within jurisdictions, is not always considered. To review regulatory coordination in Australia, we compiled a comprehensive dataset of noxious weeds (defined as invasive plants and potentially invasive plants with controls specified in regulation) in each Australian jurisdiction (i.e., state or territory). We found that jurisdictions on average shared ca. 67% (SD = 15%) of noxious weed listings. Neighboring jurisdictions were not more similar than separated jurisdictions in their noxious weed listings. There were significant differences in the biogeographic native ranges of noxious weeds between jurisdictions, with species native to temperate Asia being most frequently listed overall. The predominant likely entry pathway for noxious weeds in Australia was the ornamental trade. Listings were primarily dedicated to proactive control, prohibiting the cultivation of noxious weeds to avoid their naturalization. There were 415 noxious weeds regulated in a harmonious manner across jurisdictions. However, there were 327 noxious weeds regulated by jurisdictions in a discordant manner, potentially leaving neighboring jurisdictions vulnerable to invasion. We suggest jurisdictions reassess the regulation of these 327 discordant noxious weeds in Australia and utilize a national taxonomic standard to avoid problematic synonyms. Improved cohesion of policies could be achieved through wider adoption of existing regulatory systems and codevelopment of regulations between government and industry.
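The reported figure that jurisdictions shared ca. 67% of noxious weed listings on average implies a pairwise set-overlap comparison across jurisdictional lists. As a minimal sketch of how such overlap could be computed (the authors' exact similarity metric is not specified here; Jaccard similarity and the jurisdiction lists below are purely illustrative, not real data):

```python
from itertools import combinations

def jaccard(a: set, b: set) -> float:
    """Proportion of listings shared between two jurisdictions' weed lists."""
    return len(a & b) / len(a | b)

# Hypothetical noxious-weed lists per jurisdiction (illustrative only)
listings = {
    "NSW": {"Lantana camara", "Salvinia molesta", "Opuntia stricta"},
    "QLD": {"Lantana camara", "Salvinia molesta", "Parthenium hysterophorus"},
    "VIC": {"Lantana camara", "Opuntia stricta", "Nassella trichotoma"},
}

# All pairwise overlaps, then the mean across jurisdiction pairs
pairwise = {
    (j1, j2): jaccard(listings[j1], listings[j2])
    for j1, j2 in combinations(listings, 2)
}
mean_overlap = sum(pairwise.values()) / len(pairwise)
print(pairwise)
print(f"mean pairwise overlap: {mean_overlap:.2f}")
```

The same comparison run over a complete national dataset would also surface the discordant listings the abstract highlights, i.e. species regulated in one jurisdiction but absent from a neighbor's list.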
Objective:
Evaluate knowledge and beliefs about dietary nitrate among United Kingdom (UK)-based adults.
Design:
An online questionnaire was administered to evaluate knowledge and beliefs about dietary nitrate. Overall knowledge of dietary nitrate was quantified using a twenty-one-point Nitrate Knowledge Index. Responses were compared between socio-demographic groups.
Setting:
UK.
Participants:
A nationally representative sample of 300 adults.
Results:
Only 19 % of participants had heard of dietary nitrate prior to completing the questionnaire. Most participants (∼70 %) were unsure about the effects of dietary nitrate on health parameters (e.g. blood pressure, cognitive function and cancer risk) or exercise performance. Most participants were unsure of the average population intake (78 %) and acceptable daily intake (83 %) of nitrate. Knowledge of dietary sources of nitrate was generally low, with only ∼30 % of participants correctly identifying foods with higher or lower nitrate contents. Almost none of the participants had deliberately purchased, or avoided purchasing, a food based on its nitrate content. Nitrate Knowledge Index scores were generally low (median (interquartile range (IQR)): 5 (8)), but were significantly higher in individuals who were currently employed v. unemployed (median (IQR): 5 (7) v. 4 (7); P < 0·001), in those with previous nutrition education v. no nutrition education (median (IQR): 6 (7) v. 4 (8); P = 0·012) and in individuals who had heard of nitrate prior to completing the questionnaire v. those who had not (median (IQR): 9 (8) v. 4 (7); P < 0·001).
Conclusions:
This study demonstrates low knowledge of dietary nitrate among UK-based adults. Greater education around dietary nitrate may be valuable to help individuals make more informed decisions about their consumption of this compound.
Spray drying dilute suspensions of bentonitic montmorillonite produces a powder that shows totally random orientation of the crystallites within a sample large enough to diffract X-rays. The powder is collected by an electrostatic precipitator and can be handled in the normal mounting processes without introducing preferred orientation. Electron micrographs show this powder to be composed on a small scale of thin, crumpled, and rolled films. The extremely small montmorillonite crystallites that make up the film are oriented with [001] directions perpendicular to the film surface. Orientation within the plane of the film is random as shown by selected area electron diffraction. Crumpling and rolling of the film is sufficient to make the orientation of [001] directions random in three dimensions in a large sample when X-ray diffraction is registered.
The X-ray diffraction patterns all show diffraction maxima (both hk and 00l), and their relative intensities with respect to each other can be determined. The line broadening of the 06 and the 003 peaks was studied. The average crystallite size as calculated from the line broadening varied from six to eleven unit layers thick for four bentonitic montmorillonites. The average lateral dimension of crystallites varied from 140 Å to 250 Å. Ratios of lateral dimensions to thickness varied from 2.3 to 3.4.
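Crystallite thickness estimates of this kind are conventionally obtained from line broadening via the Scherrer equation, t = Kλ/(β cos θ), where β is the peak width (FWHM, in radians) and θ the Bragg angle. The following is an illustrative calculation only; the wavelength, peak position, and width below are assumed values, not figures from the study:

```python
import math

def scherrer_size(wavelength_A: float, fwhm_deg: float,
                  two_theta_deg: float, K: float = 0.9) -> float:
    """Crystallite size (Å) from the Scherrer equation t = K*lambda/(beta*cos(theta)),
    with beta the FWHM converted to radians and theta the Bragg angle."""
    beta = math.radians(fwhm_deg)
    theta = math.radians(two_theta_deg / 2.0)
    return K * wavelength_A / (beta * math.cos(theta))

# Assumed illustrative values: Cu K-alpha radiation, a broadened 003 reflection
t = scherrer_size(wavelength_A=1.5406, fwhm_deg=0.9, two_theta_deg=17.8)
layers = t / 9.6  # ~9.6 Å per montmorillonite unit layer (approximate)
print(f"thickness ~ {t:.0f} Å ~ {layers:.1f} unit layers")
```

A thickness of roughly 90 Å, i.e. on the order of ten unit layers, is consistent with the six-to-eleven-layer range reported above.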
The general public and scientific community alike are abuzz over the release of ChatGPT and GPT-4. Among many concerns being raised about the emergence and widespread use of tools based on large language models (LLMs) is the potential for them to propagate biases and inequities. We hope to open a conversation within the environmental data science community to encourage the circumspect and responsible use of LLMs. Here, we pose a series of questions aimed at fostering discussion and initiating a larger dialogue. To improve literacy on these tools, we provide background information on the LLMs that underpin tools like ChatGPT. We identify key areas in research and teaching in environmental data science where these tools may be applied, and discuss limitations to their use and points of concern. We also discuss ethical considerations surrounding the use of LLMs to ensure that as environmental data scientists, researchers, and instructors, we can make well-considered and informed choices about engagement with these tools. Our goal is to spark forward-looking discussion and research on how as a community we can responsibly integrate generative AI technologies into our work.
This paper considers testing for unit roots in Gaussian panels with cross-sectional dependence generated by common factors. Within our setup, we can analyze restricted versions of the two prevalent approaches in the literature, that of Moon and Perron (2004, Journal of Econometrics 122, 81–126), who specify a factor model for the innovations, and the PANIC setup proposed in Bai and Ng (2004, Econometrica 72, 1127–1177), who test common factors and idiosyncratic deviations separately for unit roots. We show that both frameworks lead to locally asymptotically normal experiments with the same central sequence and Fisher information. Using Le Cam’s theory of statistical experiments, we obtain the local asymptotic power envelope for unit-root tests. We show that the popular Moon and Perron (2004, Journal of Econometrics 122, 81–126) and Bai and Ng (2010, Econometric Theory 26, 1088–1114) tests only attain the power envelope in case there is no heterogeneity in the long-run variance of the idiosyncratic components. We develop a new test which is asymptotically uniformly most powerful irrespective of possible heterogeneity in the long-run variance of the idiosyncratic components. Monte Carlo simulations corroborate our asymptotic results and document significant gains in finite-sample power if the variances of the idiosyncratic shocks differ substantially among the cross-sectional units.
The protection of non-combatants in times of autonomous warfare raises the question of the timeliness of the international protective emblem. (Fully) autonomous weapon systems are often launched from a great distance, and there may be no possibility for the operators to notice protective emblems at the point of impact; therefore, such weapon systems will need a way to detect protective emblems and react accordingly. In this regard, the present contribution suggests a cross-frequency protective emblem. Technical deployment is considered, as well as interpretation by methods of machine learning. Approaches are explored as to how software can recognize protective emblems under the influence of various boundary conditions. Since a new protective emblem could also be misused, methods of distribution are considered, including encryption and authentication of the received signal. Finally, ethical aspects are examined.
Biophysical conditions played a fundamental role in early human colonization of insular territories, particularly in food-producing societies dealing with limited resources and the challenges of maintaining a sustainable carrying capacity. Studies on past human colonization of small oceanic islands thus offer insights into economic plasticity, ecological impacts, and adaptation of early food-producing groups. On the coast of southern Chile, the earliest evidence of island colonization, by coastal populations with mainland subsistence systems based on the exploitation of marine resources along with gathering, managing, and cultivating plants and hunting terrestrial animals, dates to 950 cal BP. Strikingly, the extent to which these mixed economies contributed to insular colonization efforts is largely unknown. Here we used organic residue analysis of ceramic artifacts to shed light on the subsistence of populations on Mocha Island in southern Chile. We extracted and analyzed lipids from 51 pottery sherds associated with the El Vergel cultural complex that flourished in southern Chile between 950 and 400 cal BP. Chemical and stable isotope analysis of the extracts identified a range of food products, including C3 and C4 plants and marine organisms. The results reveal the central role of mixed subsistence systems in fueling the colonization of Mocha Island.
Georgia lies to the northeast of Türkiye, having a western border on the Black Sea. With a population of some 3·73 million, Georgia has a tradition of gastronomic excellence dating back millennia. However, changing lifestyles and external influences have, as elsewhere, led to problems of suboptimal nutrition, and lifestyle-related diseases and disorders prevail. There is considerable scope for improving the focus on public health (PH) and nutrition in Georgia. With this in mind, the Georgian Nutrition Society teamed up with The Nutrition Society of the UK and Ireland and the Sabri Ülker Foundation, a PH charity based in Istanbul, Türkiye, to host a conference and workshops in Tbilisi, Georgia. The primary purpose was to review the current status of PH and nutrition in Georgia with reference to the situation elsewhere, to share examples of best practice and to identify opportunities for improvement. A particular highlight was the presentation of a programme of nutrition education for family physicians recently implemented in Türkiye. This summary of the proceedings is intended as a blueprint for action in Georgia and also to inspire others to consider how PH might be improved via a focus on balanced nutrition.
Pre-diagnostic stages of psychotic illnesses, including ‘clinical high risk’ (CHR), are marked by sleep disturbances. These sleep disturbances appear to represent a key aspect in the etiology and maintenance of psychotic disorders. We aimed to examine the relationship between self-reported sleep dysfunction and attenuated psychotic symptoms (APS) on a day-to-day basis.
Methods
Seventy-six CHR young people completed the Experience Sampling Methodology (ESM) component of the European Union Gene-Environment Interaction Study, collected through PsyMate® devices, prompting sleep and symptom questionnaires 10 times daily for 6 days. Bayesian multilevel mixed linear regression analyses were performed on time-variant ESM data using the brms package in R. We investigated the day-to-day associations between sleep and psychotic experiences bidirectionally on an item level. Sleep items included sleep onset latency, fragmentation, and quality. Psychosis items assessed a range of perceptual, cognitive, and bizarre thought content common in the CHR population.
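The study fitted Bayesian multilevel models with brms in R; as a rough, simplified analogue of the lagged sleep-to-symptom analysis, the sketch below uses synthetic data and ordinary least squares with per-person intercepts in Python (all variable names, the data, and the planted effect size are our illustrative assumptions, not the study's data or model):

```python
import numpy as np

rng = np.random.default_rng(0)
n_people, n_days = 76, 6

# Synthetic data: nightly awakenings and a next-day "feeling unreal" rating
awakenings = rng.poisson(1.5, size=(n_people, n_days))
person_effect = rng.normal(0, 0.5, size=n_people)
unreal_next_day = (
    0.27 * awakenings                        # planted lagged effect (illustrative)
    + person_effect[:, None]                 # person-level intercepts
    + rng.normal(0, 0.3, size=(n_people, n_days))
)

# Pooled OLS with person dummies as a crude stand-in for a multilevel
# model with random intercepts per participant
y = unreal_next_day.ravel()
x = awakenings.ravel().astype(float)
person = np.repeat(np.arange(n_people), n_days)
X = np.column_stack([x, np.eye(n_people)[person]])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"estimated lagged effect of awakenings: {coef[0]:.2f}")
```

The first coefficient recovers the planted within-person effect of the previous night's awakenings on next-day ratings; the actual analysis additionally tested the reverse (symptom-to-sleep) direction and used Bayesian credible intervals rather than p-values.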
Results
Two of the seven psychosis variables were unidirectionally predicted by the previous night's number of awakenings: every unit increase in the number of nightly awakenings predicted a 0.27 and 0.28 unit increase in feeling unreal or paranoid the next day, respectively. No other sleep variables credibly predicted next-day psychotic symptoms, or vice versa.
Conclusion
In this study, the relationship between sleep disturbance and APS appears specific to the item in question. However, some APS, including perceptual disturbances, had low levels of endorsement amongst this sample. Nonetheless, these results provide evidence for a unidirectional relationship between sleep and some APS in this population.
This review aims to critically evaluate the efficacy of long-chain n-3 PUFA ingestion in modulating muscle protein synthesis (MPS), with application to maintaining skeletal muscle mass, strength and function into later life. Ageing is associated with a gradual decline in muscle mass, specifically atrophy of type II fibres, that is exacerbated by periods of (in)voluntary muscle disuse. At the metabolic level, in otherwise healthy older adults, muscle atrophy is underpinned by anabolic resistance which describes the impaired MPS response to non-pharmacological anabolic stimuli, namely, physical activity/exercise and amino acid provision. Accumulating evidence implicates a mechanistic role for n-3 PUFA in upregulating MPS under stimulated conditions (post-prandial state or following exercise) via incorporation of EPA and DHA into the skeletal muscle phospholipid membrane. In some instances, these changes in MPS with chronic n-3 PUFA ingestion have translated into clinically relevant improvements in muscle mass, strength and function; an observation evidently more prevalent in healthy older women than men. This apparent sexual dimorphism in the adaptive response of skeletal muscle metabolism to EPA and DHA ingestion may be related to a greater propensity for females to incorporate n-3 PUFA into human tissue and/or the larger dose of ingested n-3 PUFA when expressed relative to body mass or lean body mass. Future experimental studies are warranted to characterise the optimal dosing and duration of n-3 PUFA ingestion to prescribe tailored recommendations regarding n-3 PUFA nutrition for healthy musculoskeletal ageing into later life.
Currently, 7 named Sarcocystis species infect cattle: Sarcocystis hirsuta, S. cruzi, S. hominis, S. bovifelis, S. heydorni, S. bovini and S. rommeli; other, unnamed species also infect cattle. Of these parasites of cattle, a complete life cycle description is known only for S. cruzi, the most pathogenic species in cattle. The life cycle of S. cruzi was completed experimentally in 1982, before related parasite species were structurally characterized, and before the advent of molecular diagnostics; to our knowledge, no archived frozen tissues from the cattle employed in the original descriptions remain for DNA characterization. Here, we isolated DNA from a paraffin-embedded kidney of a calf experimentally infected with S. cruzi in 1980; we then sequenced portions of 18S rRNA, 28S rRNA, COX1 and Acetyl CoA genes and verified that each shares 99–100% similarity to other available isolates attributed to S. cruzi from naturally infected cattle. We also reevaluated histological sections of tissues of calves experimentally infected with S. cruzi in the original description, exploiting improvements in photographic technology to render clearer morphological detail. Finally, we reviewed all available studies of the life cycle of S. cruzi, noting that S. cruzi was transmitted between bison (Bison bison) and cattle (Bos taurus) and that the strain of parasite derived from bison appeared more pathogenic than the cattle strain. Based on these newfound molecular, morphological and physiological data, we thereby redescribed S. cruzi and deposited reference material in the Smithsonian Museum for posterity.
This review explores the evolution of dietary protein intake requirements and recommendations, with a focus on skeletal muscle remodelling to support healthy ageing based on presentations at the 2023 Nutrition Society summer conference. In this review, we describe the role of dietary protein for metabolic health and ageing muscle, explain the origins of protein and amino acid (AA) requirements and discuss current recommendations for dietary protein intake, which currently sits at about 0⋅8 g/kg/d. We also critique existing (e.g. nitrogen balance) and contemporary (e.g. indicator AA oxidation) methods to determine protein/AA intake requirements and suggest that existing methods may underestimate requirements, with more contemporary assessments indicating protein recommendations may need to be increased to >1⋅0 g/kg/d. One example of evolution in dietary protein guidance is the transition from protein requirements to recommendations. Hence, we discuss the refinement of protein/AA requirements for skeletal muscle maintenance with advanced age beyond simply the dose (e.g. source, type, quality, timing, pattern, nutrient co-ingestion) and explore the efficacy and sustainability of alternative protein sources beyond animal-based proteins to facilitate skeletal muscle remodelling in older age. We conclude that, whilst a growing body of research has demonstrated that animal-free protein sources can effectively stimulate and support muscle remodelling in a manner that is comparable to animal-based proteins, food systems need to sustainably provide a diversity of both plant and animal source foods, not least for their protein content but other vital nutrients. Finally, we propose some priority research directions for the field of protein nutrition and healthy ageing.
The existing literature provides conflicting evidence of whether a collectivistic value orientation is associated with ethical or unethical behavior. To address this confusion, we integrate collectivism theory and research with prior work on social identity, moral boundedness, group morality, and moral identity to develop a model of the double-edged effects of collectivism on employee conduct. We argue that collectivism is morally bounded depending on who the other is, and thus it inhibits employees’ motivation to engage in unethical pro-self behavior, yet strengthens their motivation to engage in unethical pro-organization behavior. We further predict that these effects are mediated by the psychological mechanism of organizational goal commitment and moderated by a person’s strength of moral identity. Results of three studies conducted in China and the United States and involving both field and experimental data offer strong support for our hypotheses. Theoretical and practical implications of the research are discussed.
This study investigated sex differences in Fe status, and associations between Fe status and endurance and musculoskeletal outcomes, in military training. In total, 2277 British Army trainees (581 women) participated. Fe markers and endurance performance (2·4 km run) were measured at the start (week 1) and end (week 13) of training. Whole-body areal bone mineral density (aBMD) and markers of bone metabolism were measured at week 1. Injuries during training were recorded. Training decreased Hb in men and women (mean change: –0·1 (95 % CI –0·2, –0·0) and –0·7 (95 % CI –0·9, –0·6) g/dl; both P < 0·001) but more so in women (P < 0·001). Ferritin decreased in men and women (–27 (95 % CI –28, –23) and –5 (95 % CI –8, –1) µg/l, both P ≤ 0·001) but more so in men (P < 0·001). Soluble transferrin receptor increased in men and women (2·9 (95 % CI 2·3, 3·6) and 3·8 (95 % CI 2·7, 4·9) nmol/l, both P < 0·001), with no difference between sexes (P = 0·872). Erythrocyte distribution width increased in men (0·3 (95 % CI 0·2, 0·4)%, P < 0·001) but not in women (0·1 (95 % CI –0·1, 0·2)%, P = 0·956). Mean corpuscular volume decreased in men (–1·5 (95 % CI –1·8, –1·1) fL, P < 0·001) but not in women (0·4 (95 % CI –0·4, 1·3) fL, P = 0·087). Lower ferritin was associated with slower 2·4 km run time (P = 0·018), sustaining a lower limb overuse injury (P = 0·048), lower aBMD (P = 0·021) and higher beta C-telopeptide cross-links of type 1 collagen and procollagen type 1 N-terminal propeptide (both P < 0·001), controlling for sex. Improving Fe stores before training may protect Hb in women and improve endurance and protect against injury.