Micronutrient deficiencies (MND) are a significant global health issue, particularly affecting children’s growth and cognitive potential and predisposing women of reproductive age (WRA) to adverse health outcomes(1). Over half of global MND cases occur in Sub-Saharan Africa (SSA), with 80% of women estimated to be deficient in at least one of three micronutrients(2). Large-scale food fortification is a cost-effective strategy recommended for combatting widespread MND and has been effectively implemented in many developed countries(3). In developing regions such as SSA, socio-economic barriers and a fragmented food processing industry hinder effective implementation of food fortification(4). As a result, countries with fortification programmes face significant challenges, including low coverage of fortified food in the population and poor compliance with fortification standards by food producers(5). The contribution of food fortification to the nutrient intakes of WRA in SSA has yet to be fully assessed. This study sought to evaluate mandatory food fortification programmes in SSA and estimate the contribution of fortified food consumption to the micronutrient intakes and requirements of WRA. We utilised multi-national fortification data from the Global Fortification Data Exchange, which includes data on country fortification standards and the estimated level of compliance with fortification requirements. Data on the supply and consumption of fortifiable food were also included from the FAO. We calculated the potential nutrient intake from fortified food consumption for each nutrient using country fortification standards and food availability, and adjusted the estimated intake for each nutrient by multiplying it by the estimated compliance percentage. We also assessed what proportion of women’s requirements for essential micronutrients (folate, iron, iodine, vitamin A, and zinc) is met through fortified food consumption, using RNI values from WHO/FAO for WRA. Between 2019 and 2021, we estimated that mandatory fortification of wheat and maize flour, oil and salt in SSA contributed a median of 138 µg DFE of folic acid, 217 µg of iodine, 43 µg RAE of vitamin A, and 2.1 mg of iron and 2.0 mg of zinc daily to the intakes of WRA. These intakes represent 12.8% (0.0-49.2) of iron, 27.5% (0.0-83.2) of zinc, 55.0% (0.0-245.0) of folate, 8.8% (0.0-37.2) of vitamin A and 228.2% (98.2-358.6) of iodine requirements, taking into consideration the lower bioavailability of iron and zinc from the cereal-based diets of SSA populations. In reality, compliance with fortification requirements in SSA is low, estimated at a median of 22% (0.0-83.4) for maize flour, 44% (0.0-72.0) for vegetable oil and 83% (0.0-100.0) for wheat flour fortification, and is a major factor limiting the overall contribution of fortification to micronutrient intakes. Inadequate regulatory monitoring to ensure compliance with fortification requirements in SSA has resulted in lower-quality fortified foods, limiting women’s potential to achieve adequate micronutrient intake through fortified food consumption.
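As a minimal illustration of the estimation approach described above, the Python sketch below combines a fortification standard, per-capita availability of the food vehicle, and estimated compliance into a potential daily intake, and expresses it against an RNI. The function names and example figures (an iodised-salt scenario) are hypothetical and are not the study’s actual inputs or results.

```python
# Hedged sketch of the fortification intake estimate; names and values are illustrative.

def intake_from_fortification(standard_per_kg: float,
                              availability_g_per_day: float,
                              compliance_fraction: float) -> float:
    """Estimated nutrient intake from one fortified food vehicle.

    standard_per_kg        fortification standard, e.g. µg of nutrient per kg of vehicle
    availability_g_per_day per-capita availability/consumption of the vehicle, g/day
    compliance_fraction    estimated share of the vehicle fortified to standard (0-1)
    """
    return standard_per_kg * (availability_g_per_day / 1000.0) * compliance_fraction


def percent_of_rni(intake: float, rni: float) -> float:
    """Express an estimated intake as a percentage of the reference nutrient intake."""
    return 100.0 * intake / rni


# Hypothetical example: salt iodised at 25 mg/kg, 10 g/day availability, 90% compliance.
iodine = intake_from_fortification(standard_per_kg=25_000,  # µg iodine per kg salt
                                   availability_g_per_day=10,
                                   compliance_fraction=0.9)
print(f"{iodine:.0f} µg/day, {percent_of_rni(iodine, 150):.0f}% of a 150 µg RNI")
```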
Sarcopenia is a skeletal muscle disease characterised by low muscle mass, strength and/or impaired physical function that, if left untreated, is associated with a wide range of adverse outcomes including osteoporosis, falls, fractures, disability, hospitalisation, loss of independence and quality of life, and mortality(1,2). There is also growing evidence linking sarcopenia to many other chronic conditions, including type 2 diabetes, fatty liver disease, cognitive impairment and dementia, certain cancers (and post-treatment outcomes), cardiovascular disease and impaired immunity(3,4). Despite its significant impact, awareness and knowledge about this disease amongst healthcare professionals (and the general public), including how to identify and treat/manage sarcopenia, remain low. There are currently no approved pharmacological agents for the treatment of sarcopenia, but there is moderate-to-high level evidence informing clinical practice guidelines that multifaceted interventions incorporating resistance-based training with adequate nutrition, focusing on high-quality protein or multi-nutrient protein-based supplements, can prevent and manage this disease(1,5,6). Meta-analyses of randomised controlled trials consistently demonstrate that progressive resistance training (at least twice weekly) is the most effective approach to elicit gains in muscle mass and strength (independent of age), with the provision of dietary protein or multi-nutrient protein-based supplements providing small added benefits(7). Emerging evidence also indicates that minimal-dose exercise strategies (e.g., resistance/strength “snacking” activities) and limiting sedentary behaviours (breaking up prolonged sitting) may help to attenuate age-related muscle loss. With regards to nutritional factors, most guidelines for older adults recommend a protein intake of 1.2 to 1.6 g/kg/d (25-30 g of protein per meal) incorporating 3-4 g of leucine to support muscle health. However, the benefits of protein alone on muscle-related outcomes are modest and appear mostly limited to those with insufficient or deficient intakes and/or who are sarcopenic, frail and/or malnourished. A wide range of other nutrition-related factors (with and without exercise) have been investigated, including β-hydroxy β-methylbutyrate (HMB), vitamin D, creatine, antioxidants, omega-3 fatty acids, and phospholipids, as well as multi-nutrient supplements and various diets (Mediterranean diet), dietary patterns and foods (dairy products). There is also growing evidence that altering the gut microbiota and the use of probiotics, prebiotics and synbiotics may enhance muscle health. This presentation will provide an update of the evidence related to these factors to help guide decision making for clinical management, and provide an overview of the current criteria used to identify poor muscle health and sarcopenia, including a new muscle health monitoring and management algorithm we have developed.
There is strong evidence that children are particularly vulnerable to the persuasiveness of marketing, and that their exposure to marketing of unhealthy food products influences their preference for and consumption of these products(1). In New Zealand (NZ), marketing is self-regulated by the industry-led Advertising Standards Authority (ASA). The ASA has two relevant codes, the Children’s Advertising and Food and Beverage Advertising Codes; however, product packaging is omitted. We investigated child-appealing marketing techniques displayed on packaged food products in NZ. We also assessed the potential impacts of different nutrient profiling systems to inform future policy design to restrict child-appealing marketing on food products in NZ. This research was conducted using the 2023 Nutritrack dataset, which contains data collected via photographs of packaged food products available in major NZ supermarkets. We focused on product categories that were shown to have a high prevalence of child-appealing marketing in a similar Australian study(2): confectionery, snack foods, cereal bars and breakfast cereals (n=2015 products). The images of products within these selected categories were assessed and coded using the “Child-appealing packaging” criteria developed by Mulligan et al.(3). Mann-Whitney U tests were used to assess differences in nutrient composition between products with and without child-appealing packaging, using information extracted from Nutrient Information Panels. In addition, the Food Standards Australia New Zealand Nutrient Profiling Scoring Criterion (NPSC) and the World Health Organization Nutrient Profiling Model for the Western Pacific Region (WHO WPRO) were applied to all food products identified as appealing to children to determine which products would be ineligible to be marketed to children under these two potential policy options. Overall, 724 (35.9%) of the 2015 products examined had child-appealing packaging. Snack foods had the highest proportion of products with child-appealing packaging (44.5%), followed by confectionery (39.3%), cereal bars (23.3%) and breakfast cereals (22%). The most common child-appealing marketing technique was “child-appealing visual/graphical design of package”, which featured on 513 food items. Overall, compared with products without child-appealing packaging, the median content of energy, protein, total fat, and saturated fat was lower, and the median content of sugar and sodium was higher, in products with child-appealing packaging (all p<0.05). Of the 724 products that were found to have child-appealing packaging, 566 (78.2%) would be considered ineligible to be marketed to children when assessed using the NPSC and 706 (97.5%) would be ineligible using the WHO WPRO. Our research shows that a considerable number of food products available in New Zealand supermarkets are using marketing techniques on their packaging that appeal to children. If policies were introduced to reduce the use of child-appealing marketing on food packaging, the WHO WPRO would provide the highest level of protection for children.
Biodiversity knowledge gaps and biases persist across low-income tropical regions. Genetic data are essential for addressing these issues, supporting biodiversity research and conservation planning. To assess progress in wildlife genetic sampling within the Philippines, I evaluated the scope, representativeness, and growth of publicly available genetic data and research on endemic vertebrates from the 1990s through 2024. Results showed that 82.3% of the Philippines’ 769 endemic vertebrates have genetic data, although major disparities remain. Reptiles had the least complete coverage but exhibited the highest growth, with birds, mammals, and amphibians following in that order. Species confined to smaller biogeographic subregions, with narrow geographic ranges, or classified as threatened or lacking threat assessments were disproportionately underrepresented. Research output on reptiles increased markedly, while amphibian research lagged behind. Although the number of non-unique authors in wildlife genetics studies involving Philippine specimens has grown steeply, Filipino involvement remains low. These results highlight the uneven and non-random distribution of wildlife genetic knowledge within this global biodiversity hotspot. Moreover, the limited participation of Global South researchers underscores broader inequities in wildlife genomics. Closing these gaps and addressing biases creates a more equitable and representative genetic knowledge base and supports its integration into national conservation efforts aligned with global biodiversity commitments.
Global efforts to combat micronutrient deficiencies have often focused on assessing nutrient intakes and supplies(1,2), yet no studies have explored the role of crop selection and land suitability in tackling these deficiencies. This study aims to bridge this gap, using existing estimates of the global prevalence of iron(3) and zinc(4) deficiencies to identify crops with the highest potential to mitigate these deficiencies. Using the USDA food composition database, we established nutrient profiles for 37 widely cultivated crops, focusing on their iron and zinc content per 100 grams. To evaluate these crops’ effectiveness in meeting nutrient requirements, we compared their compositions with Harmonized Average Requirements (H-ARs) for women of reproductive age (WRA), a group particularly vulnerable to micronutrient deficiencies. The H-ARs account for variations in nutrient absorption and bioavailability(5). For each crop, we calculated the percentage of the iron and zinc H-ARs met by 100 grams of the crop. This percentage was then adjusted by deficiency weighting: each crop’s nutrient contribution was multiplied by the global prevalence of the corresponding deficiency. The result was a deficiency-weighted nutrient score for each crop. Soybeans scored highest at 61.67, followed by cowpeas (50.30), pearl millet (33.69), and Phaseolus beans (31.33), indicating their strong potential to address global iron and zinc deficiencies. Next, we integrated these nutrient scores with global land suitability and yield potential data from the Global Agro-Ecological Zones (GAEZ) database to map the regions most suited to growing these nutrient-dense crops. On average, our findings show that Tonga is the most suitable country for soybean cultivation, with a potential yield of 3.77 tons per hectare (tons/ha), Uruguay for cowpeas (2.82 tons/ha), Lithuania for Phaseolus beans (3.93 tons/ha), and Guinea-Bissau for pearl millet (3.87 tons/ha). Through multivariate clustering, we linked global deficiency patterns with yield potential across various regions. Countries such as those in the Caribbean, Eastern, Western, and Middle Africa, and Southern and Southeastern Asia emerged as priority regions where the production of these crops would be most beneficial for combating iron and zinc deficiencies. The results provide valuable insights for aligning agricultural land use practice with nutritional requirements, particularly in regions with high iron and zinc deficiency prevalence.
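A minimal sketch of the deficiency-weighted scoring described above: the contribution of 100 g of a crop to each H-AR is multiplied by the global prevalence of the corresponding deficiency and summed across nutrients. All crop contents, H-AR values and prevalence figures below are hypothetical placeholders (they will not reproduce the study’s published scores); the real analysis used USDA composition data and published prevalence estimates.

```python
# Hedged sketch of a deficiency-weighted nutrient score; all numbers are illustrative.

# Per-100 g nutrient content (mg) for hypothetical crop entries.
crops = {
    "soybean": {"iron": 15.7, "zinc": 4.9},
    "cowpea":  {"iron": 8.3,  "zinc": 3.4},
}

# Hypothetical harmonized average requirements (mg/day) for women of reproductive age.
h_ar = {"iron": 22.4, "zinc": 10.2}

# Hypothetical global deficiency prevalence (fraction of WRA deficient).
prevalence = {"iron": 0.30, "zinc": 0.17}

def deficiency_weighted_score(nutrients: dict) -> float:
    """Sum over nutrients of (% of H-AR met by 100 g) x deficiency prevalence."""
    return sum(100.0 * content / h_ar[n] * prevalence[n]
               for n, content in nutrients.items())

for crop, nutrients in crops.items():
    print(crop, round(deficiency_weighted_score(nutrients), 2))
```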
Household food production is considered a key avenue for improving food security and nutritional status, particularly for low-income people in developing countries. However, little is known about which aspects of home garden production enhance nutritional outcomes. This paper aims to assess how home gardens influence nutritional status while considering the impact of various child, maternal, and household characteristics such as birthweight, age, education, and income. We also examined whether distance to the market mediates this association. We conducted a cross-sectional study of 403 children (24-60 months) and their mothers (18-45 years) in Batticaloa district, Sri Lanka, using a pre-tested structured questionnaire. Maternal and child anthropometric measures were taken; children were classified as stunted, wasted and underweight based on the WHO references, and BMI was calculated for mothers(1). Logistic regression was used to analyse the factors associated with the dependent variables, the nutritional outcomes. Food production diversity was not associated with maternal or child nutritional outcomes. The only production variable associated with a child nutritional outcome was livestock ownership, which was negatively associated with child wasting (P < 0.01). Surprisingly, greater market distance was associated with improved child nutritional status (P < 0.05). Higher levels of maternal education were significantly associated with reduced stunting and underweight in children (P < 0.01). Child birthweight showed a negative association with child underweight (P < 0.01), and we also observed a small negative effect of a child’s age on stunting. These findings suggest that while home gardens can be an entry point, improving nutrition may require a multifaceted approach that addresses a broader range of factors.
There is concern amongst the public, equestrians, animal welfare organisations, and horse-sport governing bodies regarding the welfare of performance horses, but equestrian culture appears slow to change. The present study seeks to increase our understanding of human factors underlying the persistence of welfare-compromising management and training practices within the performance horse world. Individual, semi-structured interviews focused on equestrians’ attitudes were conducted with 22 equestrians from classical equestrian disciplines in the US, Canada, and the UK. Interview transcripts were analysed using reflexive thematic analysis. Five main themes were identified: perception of welfare issues; conflicting conceptions of a good life; objectification of the horse; instrumentalisation of horse care; and enculturation. Participants perceived and were concerned about horse welfare, but expressed dissonance-reducing strategies, including trivialisation, reframing and justification. Participants shared conflicting conceptions of a good life and described how equestrian activities may infringe upon horse welfare. Objectification of horses was among the attitudinal factors identified that may permit persistence of harmful practices, while the instrumentalisation of care theme showed how management practices often focused on performance and the horse’s job more than care about the horse. Finally, enculturation (the process of adopting attitudes and behaviours of a culture) in equestrianism may be fundamental to maintaining practices and attitudes that compromise horse welfare. These findings provide an enhanced understanding of why horse welfare issues persist in classical equestrian disciplines and may inform future human behaviour change strategies to promote improved horse welfare.
In childhood, diets high in sodium and low in potassium contribute to raised blood pressure and cardiovascular disease later in life(1). For New Zealand (NZ) children, bread is a major source of dietary sodium, and fruit, vegetables, and milk are major dietary sources of potassium(2,3). However, the use of iodised salt in NZ bread is mandatory, meaning that reducing the salt, and thus sodium, content could put children at risk of iodine deficiency(4). Our objective was to measure the sodium, potassium, and iodine intake, and blood pressure (BP), of NZ school children aged 8-13 years. A cross-sectional survey was conducted in five primary schools in Auckland and Dunedin. Primary schools were recruited between July 2022 and February 2023 using purposive sampling. Seventy-five children (n=37 boys, 29 girls, and nine children who did not state their gender) took part. The most common ethnicity was NZ European and Other (n=54 or 72%), followed by Māori (indigenous inhabitants; n=9 or 12%) and Pasifika (n=5 or 7%). The main outcomes were 24-hour sodium and potassium intake, sodium to potassium molar ratio, 24-hour iodine intake, and BP. Sodium, potassium, and iodine intake were assessed using 24-hour urine samples, and BP was assessed using standard methods. Differences by gender were tested using two-sample t-tests and nonparametric Wilcoxon two-sample tests. The mean (SD) 24-hour sodium excretion, potassium excretion, and sodium to potassium molar ratio for children with complete samples (n=59) were 2,420 (1,025) mg, 1,567 (733) mg, and 3.0 (1.6), respectively. The median (25th, 75th percentile) urinary iodine excretion was 88 (61, 122) µg per 24 hours, and the mean (SD) systolic and diastolic blood pressure (n=74) were 105 (10) mmHg and 67 (9) mmHg, respectively. There was a significant difference between boys and girls for iodine (77 (43, 96) vs. 98 (72, 127) µg per 24 hours; p=0.02) but not for other outcomes. In conclusion, children consumed more sodium and less potassium and iodine than World Health Organization recommendations(5). However, future research should confirm these findings in a nationally representative sample. Evidence-based, equitable interventions and policies with adequate monitoring should be considered to reduce potentially suboptimal sodium, potassium, and iodine intakes in New Zealand.
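For clarity on the sodium-to-potassium molar ratio reported above, the short sketch below converts 24-hour excretion in mg to millimoles using standard molar masses (about 23.0 g/mol for sodium and 39.1 g/mol for potassium). The inputs are the cohort mean excretions quoted in the abstract, used purely as a worked example; note that the ratio of the means differs from the mean of individual ratios (3.0) reported above.

```python
# Worked example: sodium-to-potassium molar ratio from 24-h urinary excretion (mg).
# Molar masses are standard values; inputs are the cohort means quoted above.

NA_G_PER_MOL = 22.99
K_G_PER_MOL = 39.10

def na_k_molar_ratio(sodium_mg: float, potassium_mg: float) -> float:
    """Ratio of sodium to potassium in mmol (mg divided by molar mass)."""
    return (sodium_mg / NA_G_PER_MOL) / (potassium_mg / K_G_PER_MOL)

# ~2.6 for the mean excretions; the mean of per-child ratios (3.0) is a different statistic.
print(round(na_k_molar_ratio(2420, 1567), 1))
```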
The deliberations for the Pandemic Accord have opened an important moment of reflection on future approaches to pandemic preparedness. The concept had been increasingly prominent in global health discourse for several years before the pandemic and had concretised into a set of standardised mainstream approaches to the prediction of threats. Since 2019, the authors and the wider research team have led a research project on the meanings and practices of preparedness. At its close, the authors undertook 25 interviews to capture reflections of regional and global health actors’ ideas about preparedness, and how and to what extent these were influenced by Covid-19. Here, an analysis of interview responses is presented, with attention to (dis)connections between the views of those occupying positions in regional and global institutions. The interviews revealed that preparedness means different things to different people and institutions. Analysis revealed several domains of preparedness with distinct conceptualisations of what preparedness is, its purposes, and scope. Overall, there appear to be some changes in thinking due to Covid-19, but also strong continuities, especially with respect to a technical focus and an underplaying of the inequities that became evident (in terms of biosocial vulnerabilities but also global-regional disparities) and, related to this, the importance of power and politics. Here, the analysis has revealed three elements, cutting across the domains but particularly strong within the dominant framing of preparedness, which act to sideline direct engagement with power and politics in the meanings and practices of preparedness. These are an emphasis on urgent action, a focus on universal or standardised approaches, and a resort to technical interventions as solutions. A rethinking of pandemic preparedness needs to enable better interconnections across scales and attention to financing that enables more equitable partnerships between states and regions. Such transformation in established hierarchies will require explicit attention to power dynamics and the political nature of preparedness.
Listeria monocytogenes is a major foodborne pathogen that forms biofilms, enhancing its potential to survive under harsh conditions, including existing antimicrobial treatments(1). Natural antimicrobials such as lysozyme and nisin are generally recognized as safe in food applications(2). As a result of repeated exposure to antimicrobials and the formation of biofilms, Listeria monocytogenes has developed resistance, making it more challenging to ensure food safety. The hurdle effect, which combines various antimicrobial compounds, has gained attention for controlling foodborne pathogens. Before applying a combined treatment, it is important to understand the effect of nutrient content on the antimicrobial efficacy of each individual antimicrobial in order to optimize the conditions for designing effective combined treatments, ensuring the maximum potential of each antimicrobial under a given nutrient condition. In preliminary experiments, the effect of nutrient content on the antimicrobial efficacy of nisin and lysozyme against planktonic cells of two different Listeria monocytogenes strains was studied in nutrient-rich (full-strength TSB) and nutrient-reduced (10% TSB) environments at the minimum inhibitory concentrations (MICs) of nisin (1250 µg/mL) and lysozyme (312.5 µg/mL). To study the effect of nutrient content on structural modifications of the Listeria monocytogenes cell envelope and their subsequent impact on antimicrobial efficacy, these nisin and lysozyme concentrations were studied under various nutrient conditions (10%, 30%, 50%, 75%, and 100% TSB). Two sets of cultures for each strain were prepared by pre-growing the microorganism in full-strength TSB (non-preconditioned cells) or in the corresponding strength of TSB (preconditioned cells) prior to inoculation into the various TSB strengths for antimicrobial treatment. Statistical analysis was carried out using two-way analysis of variance (ANOVA), followed by Tukey’s test for post hoc comparisons. The results of this study showed that the number of surviving cells at the end of the treatment was significantly decreased (P<0.05) by both nisin and lysozyme treatments compared with the untreated controls, with the maximum antimicrobial activity in 10% TSB. Under both non-preconditioned and preconditioned cell states, the maximum inhibitory activity of lysozyme was observed in 10% TSB. The maximum inhibitory activity of nisin was observed in 10% TSB under the preconditioned cell state and in 30% TSB under the non-preconditioned cell state. These preliminary results indicate that the nutrient content at the time of antimicrobial treatment and the initial state of the Listeria monocytogenes cells, established through preconditioning growth conditions, influence the antimicrobial efficacy of nisin and lysozyme. This research could provide insights for optimizing future antimicrobial treatments, such as selecting appropriate doses and antimicrobial strategies.
Poor iron status is one of the most prevalent problems facing infants worldwide, in both developing and developed countries(1). A complex interplay of both dietary and non-dietary factors affects iron intake, absorption, and requirements, and subsequently iron status(2). We aimed to describe iron status in an ethnically diverse cohort of urban-dwelling infants. Data were collected from 364 infants aged 7.0 to 10.0 months living in two main urban centres in New Zealand (Auckland and Dunedin) between July 2020 and February 2022. Participants were grouped by total ethnicity, with any participants who did not identify as either Māori or Pacific categorised into a single ‘others’ group. Haemoglobin, plasma ferritin, soluble transferrin receptor (sTfR), C-reactive protein, and alpha-1-acid glycoprotein were obtained from a non-fasting venous blood sample. Inflammation was adjusted for using the Biomarkers Reflecting Inflammation and Nutritional Determinants of Anaemia (BRINDA) method(3). Body iron concentration (mg/kg body weight) was calculated using the ratio of sTfR and ferritin. A total of 96.3% of Pacific infants were iron sufficient, defined as body iron ≥0 mg/kg body weight and haemoglobin (Hb) ≥105 g/L, compared to 82.3% of Māori and 76.0% of ‘other’ (i.e. neither Māori nor Pacific) infants. ‘Other’ infants had the highest prevalence of iron deficiency overall, with 2.8% categorised with iron-deficiency anaemia (IDA) (body iron <0 mg/kg, haemoglobin <105 g/L), 11.8% with early ‘functional’ iron deficiency (body iron <0 mg/kg, haemoglobin ≥105 g/L), and 9.4% with iron depletion (ferritin <15 µg/L, in the absence of early ‘functional’ iron deficiency and iron deficiency anaemia). For Māori infants, 3.2% and 6.5% had IDA and early ‘functional’ iron deficiency respectively, and 8.1% were iron depleted. One (3.7%) Pacific infant was iron depleted, and the remainder were iron sufficient. Plasma ferritin and body iron concentration were, on average, higher in Pacific compared to non-Pacific infants. These findings give an up-to-date and robust understanding of the iron status of infants by ethnicity, highlighting an unexpected finding that infants who are neither Māori nor Pacific may be at higher risk of poor iron status in NZ.
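A minimal sketch of the iron-status classification used above, applying the cut-offs stated in the abstract (body iron <0 mg/kg, haemoglobin <105 g/L, ferritin <15 µg/L). The body-iron equation from the sTfR-to-ferritin ratio is shown in the widely cited Cook et al. form as an assumption, since the abstract does not give the exact equation; in the study, ferritin was first adjusted for inflammation using the BRINDA method.

```python
import math

# Hedged sketch: infant iron-status classification using the thresholds stated above.
# The body-iron formula follows the commonly used Cook et al. form and is an assumption;
# ferritin here is assumed to be inflammation-adjusted (BRINDA) before use.

def body_iron_mg_per_kg(stfr_mg_per_l: float, ferritin_ug_per_l: float) -> float:
    """Body iron (mg/kg) from the sTfR/ferritin ratio (Cook et al.-style formula)."""
    ratio = (stfr_mg_per_l * 1000.0) / ferritin_ug_per_l
    return -(math.log10(ratio) - 2.8229) / 0.1207

def classify(body_iron: float, hb_g_per_l: float, ferritin_ug_per_l: float) -> str:
    if body_iron < 0 and hb_g_per_l < 105:
        return "iron-deficiency anaemia"
    if body_iron < 0:
        return "early 'functional' iron deficiency"
    if ferritin_ug_per_l < 15:
        return "iron depletion"
    return "iron sufficient"

# Example values only, not participant data.
bi = body_iron_mg_per_kg(stfr_mg_per_l=5.0, ferritin_ug_per_l=30.0)
print(round(bi, 1), classify(bi, hb_g_per_l=110, ferritin_ug_per_l=30.0))
```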
Large quantities of macadamia nuts are produced globally every year, resulting in a large volume of by-products such as macadamia husk. This by-product contains high concentrations of phenolic compounds with antioxidant and health-promoting properties(1). Oral delivery of these phenolic compounds via food systems is challenging, as their stability and biological activities can change when exposed to different environmental conditions (e.g., heat, light, oxygen, and pH)(2). Therefore, this study aimed to protect the integrity and stability of phenolic compounds from macadamia husk against such environmental conditions towards their delivery via food and related products. Different extracts of macadamia husk were prepared by conventional solvent extraction (CSE), accelerated solvent extraction (ASE), and ultrasonic probe-assisted extraction (UPAE) using water and organic solvents such as ethanol and methanol mixtures of different concentrations with water. The extracts were characterised by their total phenol content (TPC), total flavonoid content (TFC), and antioxidant properties (using the 2,2-diphenyl-1-picrylhydrazyl (DPPH) radical scavenging activity method). UPLC-HRes-MS/MS analysis was applied for screening and characterising phenolic compounds in macadamia husk extracts (MHE). Liposomes composed of soy lecithin were used to encapsulate the phenolic compounds to maintain their stability and biological activity against environmental conditions. The mean particle size, homogeneity, zeta potential, and encapsulation efficiency were used to characterise the liposome properties. 50% ethanol was the most effective solvent for maximising the TPC (47.90±0.67 mg GAE/g of dry weight (DW)) and TFC (149.85±6.54 mg QE/g of DW) in extracts obtained using the three methods studied. Fifteen phenolic compounds, including phenolic acids (e.g. chlorogenic acid, protocatechuic acid), flavonoids (e.g. catechin, epicatechin, epigallocatechin, gallocatechin) and other polyphenols (e.g. daidzin, myricetin 3-O-arabinoside, quercetin O-glucoside), were identified in the aqueous ethanol extract. The empty (control) liposomes had a mean diameter of 173.23±1.29 nm and exhibited a zeta potential of -80.14 mV. MHE loading significantly (p < 0.05) increased the liposome size (to 186.33±0.29 nm) and reduced the magnitude of the zeta potential (to -77.00±0.73 mV) and the homogeneity of the size distribution. This study shows that 50% ethanol was the most effective solvent for maximising the TPC and TFC in extracts obtained using the three different methods studied. Liposomes containing the phenolic extract exhibited highly negative zeta potential values, indicating favourable stability and long-term protection of the phenolic compounds. Thus, this study provides a promising approach to the extraction and encapsulation of phenolic compounds from New Zealand-grown macadamia husk for their possible incorporation into food products.
Echinococcus equinus is a parasitic cestode primarily maintained within an equine-canine life cycle, with horses, donkeys, mules, and other ungulates serving as intermediate hosts. Although E. equinus has historically been considered non-zoonotic, recent molecular studies suggest that this assumption may need to be re-evaluated. This study aimed to investigate the presence and molecular characterization of E. equinus in equids from Türkiye. A retrospective analysis of 52 equine necropsies performed between 2020 and 2025 identified hydatid cysts in one Arabian horse and two donkeys. Gross and histopathological examination confirmed the presence of hydatid cysts in the liver and lungs, exhibiting characteristic structural features. Molecular identification was conducted through PCR amplification targeting the mitochondrial cytochrome c oxidase subunit 1 (mt-CO1) gene, with all positive samples confirmed as E. equinus by sequence analysis. Phylogenetic analysis demonstrated a close relationship between the obtained sequences and reference E. equinus strains from other geographic regions. These findings provide molecular confirmation of E. equinus in equids from Türkiye and underscore the need for targeted surveillance to better understand its distribution, transmission, and zoonotic relevance, especially considering the first confirmed human case reported in the country in 2021.
The World Health Organization (WHO) has a global initiative to eliminate industrially produced trans fatty acids (iTFAs) from the food supply(1). Formed via the partial hydrogenation of vegetable oils to create hardened vegetable fat, iTFAs can be found in processed foods including fried foods and baked goods. Even small amounts of iTFAs can increase the risk of coronary heart disease. These can be successfully eliminated from the food supply, with the WHO recommending either a ban on partially hydrogenated oils or a limit on iTFA in food of a maximum of 2% of total fat(1). As of June 2024, over 50 countries had one of these regulatory measures in place. The trans-Tasman Food Regulation System is considering policy options to ensure iTFAs are eliminated or reduced as much as possible from the food supply in Australia and New Zealand. Up-to-date data on the presence of iTFAs in the New Zealand food supply are needed to inform this work, as this was last measured in New Zealand in 2007/09 for packaged food and 2013 for fast food. The aim of this survey was to determine the presence and levels of iTFAs in the New Zealand food supply. Since it is not possible to analytically quantify iTFA separately from trans-fats that occur naturally in food products of ruminant origin, such as dairy, beef and lamb products, the sampling plan was designed to target products likely to contain predominantly iTFA and was adapted from the WHO global protocol for measuring trans fatty acid profiles of foods(2) to the New Zealand context. The survey analysed the trans-fat content of 627 products across national supermarkets (275 products), international supermarkets specialising in imported foods (149 products) and ready-to-eat food outlets (203 products from three regions). One hundred and six products (16.9%) contained trans-fat that exceeded 2% of total fat. Twenty-five (4%) of these products were likely to contain predominantly iTFA. The 25 products predominantly containing iTFA included eight products from national supermarkets (mostly bakery products), nine products from international supermarkets (mostly curry pastes and biscuits) and eight products from ready-to-eat food outlets (all fried foods). The median trans-fat content of these 25 products was 3.2% of total fat (assumed to be all iTFA). Over a third of these products contained more than double the recommended WHO limit, with five products containing over four times the limit and one product containing more than 16 times the WHO limit. The remaining 81 products may contain some iTFA, but we were unable to quantify the amount. The results from this survey will be used by New Zealand Food Safety to inform the consideration of regulatory options for reducing iTFAs in foods in New Zealand.
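A small worked sketch of the screening rule described above: a product exceeds the WHO threshold when its trans-fat content is more than 2% of total fat, and the multiple of that limit indicates how far over it is. The product names and figures below are hypothetical examples, not items from the survey.

```python
# Hedged sketch: flag products whose trans fat exceeds the WHO limit of 2% of total fat.
# Product names and values are hypothetical examples, not survey data.

WHO_LIMIT_PCT_OF_TOTAL_FAT = 2.0

products = [
    {"name": "bakery item", "trans_fat_g_per_100g": 1.2, "total_fat_g_per_100g": 18.0},
    {"name": "curry paste", "trans_fat_g_per_100g": 0.1, "total_fat_g_per_100g": 12.0},
]

for p in products:
    trans_pct = 100.0 * p["trans_fat_g_per_100g"] / p["total_fat_g_per_100g"]
    multiple = trans_pct / WHO_LIMIT_PCT_OF_TOTAL_FAT
    status = "exceeds" if trans_pct > WHO_LIMIT_PCT_OF_TOTAL_FAT else "within"
    print(f"{p['name']}: {trans_pct:.1f}% of total fat ({multiple:.1f}x limit, {status})")
```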
Although current estimates suggest that global food production is enough to meet nutritional needs, there are still significant challenges with equitable distribution(1). Tackling these disparities is essential for achieving global nutrition security now and in the future. This study uses the DELTA Model® to analyse global nutrient supply dynamics at national resolution and address nutritional shortfalls in specific countries(2). By examining the distribution of food commodities and nutrients in 2020, we project the future food and nutrient production needed in 2050 to ensure adequate global supply. Our findings indicate that while some nutrients are sufficiently supplied on a global scale, many countries face significant national deficiencies in essential nutrients such as vitamins A, B12 and B2, potassium, and iron. Addressing these gaps will require substantial increases in nutrient supply or redistribution. For example, a 1% increase in global protein, targeted at countries with insufficient protein, could close the 2020 gaps. However, if current consumption patterns persist, the global food system will need a 26% increase in production by 2050 to accommodate population growth and changing consumption patterns. Our study developed a framework for exploring future production scenarios. This involves reducing surplus national nutrient supply linearly over the coming decades while simultaneously increasing production of undersupplied nutrients. This framework provides a more practical assessment of future needs, transitioning from idealized production scenarios to realistic projections. Our study investigated a potential future for nutrient supply to meet minimum requirements by 2050. Calcium and vitamin E are crucial: production must be increased to address significant gaps, given their severe deficiencies in 2020. Energy and fibre production will need to peak between 2030 and 2040 before stabilizing back to near 2020 levels. Predicted changes in nutrient supply from 2020 to 2050 vary: while calcium and vitamin E will need to increase, phosphorus, thiamine and the indispensable amino acids can decrease without compromising global nutrition, with only minor redistribution. These results are essential for determining the food supply required to achieve adequate global nutrient supply in the future. Incorporating these insights into global food balance models will provide key stakeholders with evidence, refine future projections, and inform policy decisions aimed at promoting sustainable healthy diets worldwide.
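The per-country gap logic described above can be illustrated with a toy calculation (this is a hedged sketch, not the DELTA Model® itself): each country’s shortfall is the amount by which its supply falls below requirement, and the sum of shortfalls expressed against total global supply gives the minimum global increase needed if the extra production could be targeted solely at deficit countries. Country names and per-capita figures are hypothetical.

```python
# Hedged toy illustration of a targeted nutrient-gap calculation (not the DELTA Model®).
# Country names and per-capita figures are hypothetical.

countries = {
    # name: (population_millions, supply_per_capita, requirement_per_capita)
    "A": (50, 55.0, 60.0),   # undersupplied
    "B": (120, 80.0, 60.0),  # surplus
    "C": (30, 40.0, 60.0),   # undersupplied
}

total_supply = sum(pop * supply for pop, supply, _ in countries.values())
total_shortfall = sum(pop * max(req - supply, 0.0)
                      for pop, supply, req in countries.values())

# Minimum global increase (%) if the extra production reached only deficit countries.
print(f"targeted increase needed: {100.0 * total_shortfall / total_supply:.1f}%")
```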
The Brazil nut tree Bertholletia excelsa is an icon of Amazon conservation through sustainable use. Moderate disturbance, such as that caused by swidden agriculture, favours this heliophilic species. Our systematic literature review of Bertholletia studies and historical records addresses the following questions: do slash-and-burn farming systems increase Bertholletia density and growth? What do historical records reveal about the links between Bertholletia life history and human occupation? And what policies and regulations shape the current context for harnessing this synergistic potential for sustainable use? Compared with mature forests, seedling/sapling densities in slash-and-burn fallows are greater (11–82 individuals per hectare, with a mean of 29 individuals per hectare) and the plants grow faster. Extant Bertholletia trees that were cut and burned during swidden preparation resprout as forked individuals, supplementing recruitment from new seeds buried by Dasyprocta spp. The presence of large forked Bertholletia trees and the occurrence of anthropogenic soils, particularly brown soils associated with Brazil nut tree groves, provide evidence that extant Bertholletia groves may be islands of active and passive agroecological management by ancestral Indigenous populations and local communities. This supports the notion that conservation through sustainable use can maintain Amazonian megadiversity. Furthermore, fire has been used in the Amazon since the onset of crop cultivation (including Bertholletia) c. 4500 years ago, suggesting that a more effective approach than banning fires would be to implement a systematic and methodical fire and fuel management strategy, given the ineffectiveness of command-and-control policies in this regard. The 124 conservation units and Indigenous lands in the Amazon containing Brazil nut trees reinforce the importance of policies to create protected areas. Evidence suggests that the presence of an Amazonian biocultural forest – a phenomenon resulting from the interaction between human activities and natural processes – can be sustainably used to promote what might be termed ‘sociobiodiversity conservation’.
The nutrition workforce plays a vital role in disease prevention and health promotion, with expanding job opportunities shaped by factors like aging populations, climate change, global food systems, and advancing technologies(1,2). Preparing students for careers that require adaptability involves understanding the valuable skills they possess and identifying any gaps. This research aimed to identify the skills and knowledge valued by students who had recently completed work-based placements, and to explore recent graduates’ experiences, challenges, and preparedness for employment. At the end of their work-based placements, students give presentations sharing their experiences and learning. Permission was sought from ten students to analyse the recordings of these presentations. The presentations were selected to include a range of nutrition fields, including sports nutrition, public health, community nutrition, dietary counselling, food and industry, and nutrition communication. Additionally, a list of graduates (within four years of graduation) from various fields (as above) was compiled, and they were invited to participate. Semi-structured interviews (n=10) were conducted online via Zoom and recorded. The interview guide included open-ended questions on employment experiences, challenges, preparedness, and required skills. The interviews, transcription and analyses were completed by two student researchers between November 2023 and February 2024. Thematic analysis using NVivo software was used to identify themes. The themes developed included the importance of skills relating to: i) communicating complex nutrition concepts to the public, ii) collaborating within diverse teams, and iii) identifying and filling personal knowledge gaps. In addition, graduates felt the practical experience from their university study boosted their preparedness for the workforce, though many struggled to apply their skills in non-traditional roles and to expand their career scope. In summary, an ongoing focus on team-based projects, communication with non-science audiences, and strategies for continuous learning using evidence-based sources is crucial for both undergraduate and postgraduate education.
Head and neck cancer (HNC), characterised by malignant neoplasms originating in the oral cavity, upper aerodigestive tract, sinuses, salivary glands, bone, and soft tissues of the head and neck, is diagnosed in approximately 600 people annually in New Zealand. Although HNC is a less common cancer, it has a profound effect on almost all aspects of the lives of those affected, particularly the nutritional and social domains. This is because the common treatment modality is surgery and/or radiotherapy, which can result in major structural and physiological changes in the affected areas, which in turn affect chewing, swallowing, and speaking(1). Specific nutrition impact symptoms (NIS) of HNC have been identified and are significant predictors of reduced dietary intake and malnutrition risk(2). We aimed to identify and describe the malnutrition risk, prevalence of NIS, and protein and energy intake of community-living adult HNC survivors 6 months–3 years post treatment in New Zealand. Participants were recruited through virtual HNC support groups in New Zealand. A descriptive observational case series design was used. Malnutrition risk was determined using the Patient-Generated Subjective Global Assessment Short Form (PG-SGA SF). Malnutrition was defined as a PG-SGA SF score of 2-8 (mild/suspected to moderate malnutrition) or ≥9 (severely malnourished). NIS were obtained via a validated symptom checklist specific to HNC patients(3), and dietary data were collected using a four-day food record. Participants (N=7) are referred to as PTP1–PTP7. PTP1 was well-nourished. PTP3 through PTP7 were categorised as mildly/suspected to moderately malnourished (scores ranging from 2 to 7), and PTP2 was severely malnourished (score of 16). NIS were experienced by all seven participants, with “difficulty chewing” and “difficulty swallowing” being the most frequently selected and highest-scored NIS interfering with oral intake. PTP2 (severely malnourished) scored loss of appetite, difficulty chewing, and difficulty swallowing highly (interfering “a lot”), indicating a high degree of prevalence and impact. Despite being well-nourished, PTP1 had inadequate energy intake (85.5% of their estimated energy requirement (EER)). PTP2, 3, 6, and 7 also had inadequate energy intake (79.3%, 79.3%, 73.9%, and 99.3%, respectively, of their EER). All participants had adequate protein intake based on a range of 1.2-1.5 g/kg body weight per day. The prevalence of malnutrition and NIS in this case series indicates an urgent need for research to identify the true extent of malnutrition in community-living HNC survivors post treatment.
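To make the adequacy checks above concrete, here is a small sketch that expresses reported energy intake as a percentage of the estimated energy requirement (EER) and checks protein intake against the lower bound of the 1.2-1.5 g/kg body weight per day range used in the case series. The example values are hypothetical, not participant data.

```python
# Hedged sketch of the intake-adequacy arithmetic; example values are hypothetical.

def percent_of_eer(energy_intake_kj: float, eer_kj: float) -> float:
    """Energy intake as a percentage of the estimated energy requirement."""
    return 100.0 * energy_intake_kj / eer_kj

def protein_adequate(protein_g_per_day: float, body_weight_kg: float) -> bool:
    """True if protein intake meets at least the lower bound (1.2 g/kg/day)
    of the 1.2-1.5 g/kg body weight range used in the case series."""
    return protein_g_per_day / body_weight_kg >= 1.2

print(round(percent_of_eer(7500, 9400), 1))                        # e.g. ~79.8% of EER
print(protein_adequate(protein_g_per_day=95, body_weight_kg=70))   # True (≈1.36 g/kg)
```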
In New Zealand, Māori and Pasifika have the lowest foodborne illness notification rates (per 100,000 people) for most foodborne illnesses(1), with underreporting of illness and differing food safety practices as possible contributing factors. New Zealand Food Safety (NZFS) is responsible for regulating the New Zealand food safety system to make sure food is safe and suitable for all New Zealanders. Supporting consumers to make informed food choices and understand safe food preparation practices is a key priority for NZFS(2). As part of this, NZFS communicates food safety advice through various traditional channels, including published material and campaigns. To better understand consumer attitudes, knowledge and behaviours around food safety and suitability, NZFS conducted an online survey of 1602 New Zealanders aged 15 years and over between 24 November and 17 December 2023. The survey used a quota sampling method and included booster samples for Māori and Pasifika. The margin of error was ±2.9% at a 95% confidence interval. The survey was available in English and Te Reo Māori(3). The study highlighted key insights into food safety practices for Māori and Pasifika. For example, NZFS advises consumers not to wash raw chicken due to the potential for cross-contamination during food preparation. In the survey(3), we found that 67% of consumers who prepare chicken said they washed it either sometimes or always; further, 79% of consumers who prepare chicken believe they should. The most common reason for washing raw chicken was hygiene (23%). Even though NZFS messaging is clear not to wash raw chicken, it is concerning that the advice is not adhered to and the risks are not recognised. In the survey, Pasifika who prepare chicken were more likely to say they wash raw chicken either sometimes or always (79% of Pasifika). As a food safety regulator, it is important to understand our Māori and Pasifika consumers and their perceptions, knowledge and behaviours around food safety practices, but also to consider how we can communicate effectively with them. For example, of the most trusted food safety information sources, Māori were more likely to trust friends, family and/or whānau (49%), and Pasifika were most likely to trust health professionals (53%)(3). With a view to better understanding our Māori and Pasifika consumers, models such as Te Whare Tapa Whā(4) (the Māori health model) provide an important and holistic view of health based on the concepts of taha whānau (family and social wellbeing), taha tinana (physical wellbeing), taha hinengaro (mental and emotional wellbeing) and taha wairua (spiritual wellbeing). There are opportunities for NZFS to reflect on and use Te Whare Tapa Whā throughout the survey development and implementation process, through to the delivery of targeted food safety messages.