Sarcopenia, the age-related decline in muscle mass and strength, is a contributor to frailty and reduced quality of life. Emerging evidence suggests a role for the gut microbiome in modulating skeletal muscle through microbial species and metabolites, such as short-chain fatty acids (SCFAs), potentially influencing inflammation, nutrient absorption, and glucose and protein metabolism. This review considers the potential of probiotics, prebiotics, and synbiotics as interventions to mitigate sarcopenia based on animal and human studies, while providing a critique of present barriers that need to be addressed. Preclinical models, including germ-free mice and faecal microbiota transplantation, demonstrate that gut microbiota from healthy or young donors may enhance overall muscle health via reductions in inflammatory and muscle atrophy markers. Limited human studies show that probiotics such as Lactobacillus and Bifidobacterium could improve branched-chain amino acid (BCAA) bioavailability and potentially sarcopenia indices, although findings have been inconsistent. In particular, challenges including inconsistent microbial assessments, a lack of dietary control, and interindividual variability due to diet, age, genetics, comorbidities and medications may hinder progress in this field. Delivery methods (e.g. capsules, fermented foods or fortified products) could further complicate efficacy through probiotic stability and dietary restrictions in older adults. Standardised protocols [e.g. the Strengthening The Organisation and Reporting of Microbiome Studies (STORMS) checklist] and multi-omics approaches may be critical to address these limitations and identify microbial signatures linked to sarcopenia outcomes. While preclinical evidence highlights mechanistic pathways pertinent to amino acid metabolism, translating findings to humans requires rigorous experimental trials.
Groundwater iron content varies geographically, and iron intake through drinking water can mitigate iron deficiency (ID). Rice, which provides a major share of daily meals (∼70% of total energy) in Bangladesh, absorbs a substantial amount of water during cooking. This study aimed to estimate the contribution of groundwater iron entrapped in cooked rice and its implications for recommended iron intake. A cross-sectional study was conducted among 25 households in Sirajganj district, Bangladesh, selected by the iron content of their drinking groundwater source. Each household, pre-supplied with 600 g of raw rice (300 g for each cooking), was instructed to cook ‘water-draining rice’ (WDR) and ‘water-sitting rice’ (WSR). Using atomic absorption spectrophotometry, the iron content of filtered and non-filtered water was measured as 0.4 ± 0.2 mg/L and 6.1 ± 2.0 mg/L, respectively. After adjusting for water filtration, the weighted mean total iron content in WDR and WSR was 6.18 mg and 5.70 mg, respectively. Assuming average rice intake, the iron content in WDR and WSR fulfilled approximately 98.15% and 90.62% of the average requirement for non-pregnant, non-lactating (NPNL) women. The water-entrapped iron in cooked WDR and WSR fulfilled about 23.77% and 20.4% of Recommended Dietary Allowances, and 52.83% and 45.30% of Estimated Average Requirements, respectively, in NPNL women, suggesting that groundwater entrapped in cooked rice is an influential dietary iron source. This substantial amount of iron from cooked rice adds a further layer to the environmental contribution of iron in this setting, with the potential to contribute to ID prevention.
There is substantial international variation in recommended vitamin C intake levels. In the USA, the recommendation is 90 mg/d for men and 75 mg/d for women, while in the UK, the current recommendation – established in 1991 – is only 40 mg/d for adults. This UK level was based on the 1953 Sheffield study, which found that 10 mg/d prevents scurvy, with 40 mg/d chosen as the recommended level for yielding somewhat higher plasma levels. In this commentary, we argue that the UK recommendation overlooked key evidence available at the time. Specifically, at least six controlled trials published before 1991 reported benefits from vitamin C supplementation in participants whose baseline vitamin C intake was already 40 mg/d or higher. One randomised controlled trial, published in 1993, found benefits from vitamin C supplementation even at a baseline intake of about 500 mg/d; however, this trial involved ultramarathon runners, and the findings should not be broadly generalised. Nonetheless, such results challenge the assumption that 40 mg/d is universally adequate to maintain full health. We also highlight that the UK recommendations were narrowly focused on preventing dermatological symptoms of scurvy, despite strong evidence – even at the time – that vitamin C deficiency can also cause cardiac dysfunction and greater morbidity due to respiratory infections. We conclude that the current UK vitamin C recommendation should be re-evaluated in light of controlled trial evidence and broader clinical outcomes.
Although many online-based dietary surveys have been developed in recent years, systems that can easily assess dietary balance in the context of the Japanese diet remain insufficient. This study aimed to evaluate the relationship between dietary balance scores from an online survey system based on the Japanese Food Guide Spinning Top and nutrient/food intake calculated using the weighing method from dietary records (DRs), as well as to assess the system’s utility and applicability. An online dietary balance survey and semi-weighted DR assessment with food photographs were conducted in Japanese participants (n = 34). Registered dietitians entered the balance scores into the system based on the participants’ food photographs, and the scores were calculated by the system. Significant positive correlations (p < 0.001) were found between the online dietary balance scores and nutrient/food intake from DRs, especially for ‘grain dishes’ and carbohydrates (r = 0.704); ‘vegetable dishes’ and the vegetable dish group (sum of potatoes, vegetables, mushrooms, and algae) (r = 0.774); ‘main dishes’ and protein (r = 0.661); ‘milk’ and the milk and milk products group (r = 0.744); and ‘fruits’ and the fruits group (r = 0.748). Bland–Altman analysis showed that the dietary balance scores obtained by this system tended to underestimate intake compared with the weighing method. Although there are limitations to the accurate estimation of nutrient and food intake, the online dietary balance scores obtained from the online dietary balance survey system were useful for understanding dietary balance in the Japanese diet.
Kids SIPsmartER is a school-based behavioural intervention for rural Appalachia middle school students with an integrated two-way short message service (SMS) strategy for caregivers. When tested in a cluster randomized controlled trial, the intervention led to significant improvements in sugar-sweetened beverage (SSB) consumption among students and caregivers. This study explores changes in secondary caregiver outcomes, including changes in caregiver SSB-related theory of planned behaviour constructs (affective attitudes, instrumental attitudes, subjective norms, perceived behavioural control, and intentions), parenting practices, and the home environment. Participants included 220 caregivers (93% female, 88% White, 95% non-Hispanic, mean age 40.6 years) in Virginia and West Virginia at baseline and 7 months post-intervention. Relative to control caregivers (n = 102), intervention caregivers (n = 118) showed statistically significant improvements in instrumental attitudes (Coef. = 0.53, 95% CI [0.04, 1.01], p = 0.033), behavioural intentions (Coef. = 0.46, 95% CI [0.05, 0.88], p = 0.027), parenting practices (Coef. = 0.22, 95% CI [0.11, 0.33], p < 0.001), and total home SSB availability (Coef. = –0.25, 95% CI [–0.39, –0.11], p < 0.001), with specific improvements for sweetened juice drinks (Coef. = –0.18, 95% CI [–0.35, –0.01], p = 0.043) and regular soda/soft drinks (Coef. = –0.31, 95% CI [–0.55, –0.07], p = 0.010). In contrast, there were no significant between-group changes for affective attitudes, subjective norms, or perceived behavioural control. Our findings highlight future research areas and fill gaps in the intervention literature. This study is among the few to develop and evaluate a scalable, theory-based caregiver SMS component in a rural, school-based intervention. Combined with evidence that Kids SIPsmartER improved SSB behaviours, our results emphasize the potential of theory-guided SMS interventions to impact SSB-related outcomes.
Despite the multiple advantages of 25-hydroxyvitamin D (calcifediol or 25(OH)D) compared to cholecalciferol, it is used sparingly. This study was planned to assess the safety and efficacy of supplementing daily 25 µg of calcifediol capsules vis-à-vis 100 µg (4000 IU) of cholecalciferol sachets in apparently healthy individuals with vitamin D deficiency in Chandigarh, India (latitude 30.7° N, longitude 76.8° E). It was a prospective, interventional study to evaluate the effects of calcifediol vis-à-vis cholecalciferol. Following initial screening of 70 subjects in each group, 62 were included in the calcifediol and 41 in the cholecalciferol group. Forty-six from the calcifediol and 37 from the cholecalciferol group completed the 6-month follow-up. There was a significant increase in serum 25(OH)D (355% in the cholecalciferol and 574% in the calcifediol group, respectively, p < 0.001) and 1,25(OH)2D (p < 0.001), with a marked decrease in iPTH (p < 0.001) and ALP (p = 0.016) in both groups. Though serum ALP decreased significantly more in the calcifediol group than in the cholecalciferol group, no appreciable difference in other biochemical parameters was noted between the groups. No episodes of hypercalcaemia or incidence of new renal stone disease were observed during follow-up. However, hypercalciuria (spot urine calcium/creatinine ratio > 0.2 mg/mg) was noted in 8/46 individuals in the calcifediol group and 5/37 individuals in the cholecalciferol group at the final visit, with no significant difference between the two groups. This study establishes the efficacy and safety of correcting vitamin D deficiency with daily 25 µg calcifediol capsules as an alternative to 4000 IU (100 µg) cholecalciferol sachets.
Responsible zooarchaeology encompasses: (1) care of reference collections, (2) management of zooarchaeological collections during study, (3) dissemination of results, and (4) long-term curation. Our responses to these challenges must be governed by shared values regarding the professional and ethical treatment of our natural and cultural heritage.
Zooarchaeological research reveals that humans are simultaneously resilient in the face of environmental change, and culpable as drivers of environmental change. Recent research indicates that even habitats thought to be unmodified by human activities were substantially, and often intentionally, altered by humans in the past. Zooarchaeological approaches to studying past environmental conditions generally fall within two primary themes: (1) the interactions between humans, animals, and the environments in which they live, and (2) the consequences of those interactions for both humans and animals.
Zooarchaeological research is guided by the scientific method. Zooarchaeologists distinguish between primary data, which are descriptive observations, and secondary data, which are analytical products derived from primary data. As much primary data as possible should be clearly recorded during the initial study, and these data should be accessible to future researchers.
The ultimate goal of zooarchaeological analysis is to use animal remains, alongside other evidence, to make inferences regarding the biological, cultural, and ecological behavior of people in the past. Secondary data, which are often mathematically derived from primary data, link primary observations about zooarchaeological specimens to larger cultural and ecological processes.
A key dimension of human–animal relationships is predation. People pursue animal resources that support life and health, while ensuring that the costs required to find, catch, transport, process, distribute, and consume these foods do not exceed the benefits they offer. Animals play a key role in human subsistence strategies, and their use and meaning is woven into all other facets of human life, from the sacred to the profane.
Knowledge drawn from ecology, the study of interactions between organisms and their environments, is critical to zooarchaeological interpretation. Using theories and methods common in modern ecology, zooarchaeological research demonstrates the profound impact of human behavior on ecosystems across space and time. Ecological understanding allows zooarchaeologists to understand how humans shaped ecosystems in the past, how those systems shaped us, and how we may adapt to ecological changes in the future.
Taphonomy is the study of the transformation of archaeological deposits from deposition through recovery and analysis. These changes occur prior to excavation (first-order changes) and during excavation and analysis (second-order changes). The taphonomic histories of assemblages vary greatly from site to site, and may not be completely knowable, even using multiple lines of evidence.