Excess sodium consumption, mostly from dietary salt, causes high blood pressure and an increased risk of cardiovascular disease(1). In parallel, insufficient potassium intake also contributes to raised blood pressure(2). Switching regular salt for potassium-enriched salt, in which a proportion of the sodium chloride is replaced with potassium chloride, is a promising public health intervention to address both issues(3). However, the supply chain needed to support increased use of potassium-enriched salt in Australia is not well understood. The objectives of this study were to investigate how the salt supply chain operates in Australia and to obtain food industry stakeholder perspectives on the technical barriers and enablers to increased potassium-enriched salt use. Twelve interviews with industry stakeholders (from food companies, salt manufacturers and trade associations) were conducted and thematically analysed using a template analysis method. Two top-level themes were developed: ‘supply chain practices’ and ‘technical barriers and enablers’. The potassium-enriched salt supply chain was described as less well established than the low-cost production and distribution of regular salt in Australia; however, food companies reported no difficulty sourcing potassium chloride. For Australian food industry stakeholders, cost, flavour and functionality were perceived as the key barriers to increased uptake of potassium-enriched salt as a food ingredient. Stakeholders from food companies were hesitant to use potassium-enriched salt because of concerns about bitter or metallic flavours and uncertainty about whether it would provide the same microbial/shelf-life functions or textural quality as regular salt. Potassium-enriched salt manufacturers held the opposing view, stating that potassium-enriched salt is a suitable functional replacement for regular salt and that the flavour differences observed may result from incorrect use of potassium chloride rather than from use of a purpose-made potassium-enriched salt. Stakeholders identified that establishing a stable and affordable supply of potassium-enriched salt in Australia, together with increased support for food technology research and development, would enable greater use. To improve uptake of potassium-enriched salt by the Australian food industry, future efforts should focus on strengthening potassium-enriched salt supply chains and improving its appeal to the food industry as a manufacturing ingredient and to consumers at the point of purchase. Public health advocacy efforts should ensure that industry is equipped with the latest evidence on the feasibility and benefits of using potassium-enriched salt as an ingredient. Ongoing engagement is critical to ensure that industry is aware of its responsibility, and opportunity, to offer healthier foods to consumers by replacing regular salt with potassium-enriched salt in foods.
Ambient air pollution remains a global challenge, with adverse impacts on health and the environment. Addressing air pollution requires reliable data on pollutant concentrations, which form the foundation for interventions aimed at improving air quality. However, in many regions, including the United Kingdom, air pollution monitoring networks are characterized by spatial sparsity, heterogeneous placement, and frequent temporal data gaps, often due to issues such as power outages. We introduce a scalable, data-driven, supervised machine learning framework designed to address temporal and spatial data gaps by filling missing measurements within the United Kingdom. The framework uses LightGBM, a gradient boosting algorithm based on decision trees, for efficient and scalable modeling. This approach provides a comprehensive dataset for England throughout 2018 at a 1 km², hourly resolution. Leveraging machine learning techniques and real-world data from the sparsely distributed monitoring stations, we generate 355,827 synthetic monitoring stations across the study area. Validation was conducted to assess the model’s performance in forecasting, estimating concentrations at missing locations, and capturing peak concentrations. The resulting dataset is of particular interest to a diverse range of stakeholders engaged in downstream assessments supported by outdoor air pollution concentration data for nitrogen dioxide (NO2), ozone (O3), particulate matter with a diameter of 10 μm or less (PM10), particulate matter with a diameter of 2.5 μm or less (PM2.5), and sulphur dioxide (SO2), at a higher resolution than was previously possible.
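To make the gap-filling approach concrete, below is a minimal sketch of how a LightGBM regressor could be trained on observed station data and then used to predict concentrations at unmonitored grid cells and hours. The file names, feature columns and hyperparameters are illustrative assumptions, not details taken from the study.

```python
# Hypothetical sketch of the described approach: train a LightGBM regressor
# on observed station measurements, then predict a pollutant (here NO2) at
# unmonitored 1 km grid cells and hours. Inputs and features are invented.
import lightgbm as lgb
import pandas as pd

obs = pd.read_csv("station_observations.csv")   # hypothetical station data
features = ["lat", "lon", "hour", "month", "wind_speed", "temperature"]

model = lgb.LGBMRegressor(n_estimators=500, learning_rate=0.05)
model.fit(obs[features], obs["no2"])

grid = pd.read_csv("prediction_grid.csv")       # hypothetical 1 km x hourly grid
grid["no2_pred"] = model.predict(grid[features])
```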
Observational studies suggest that 25-hydroxyvitamin D (25(OH)D) concentration is inversely associated with pain. However, findings from intervention trials are inconsistent. We assessed the effect of vitamin D supplementation on pain using data from a large, double-blind, population-based, placebo-controlled trial (the D-Health Trial). 21 315 participants (aged 60–84 years) were randomly assigned to a monthly dose of 60 000 IU vitamin D3 or matching placebo. Pain was measured using the six-item Pain Impact Questionnaire (PIQ-6), administered 1, 2 and 5 years after enrolment. We used regression models (linear for the continuous PIQ-6 score and log-binomial for binary categorisations of the score, namely ‘some or more pain impact’ and ‘presence of any bodily pain’) to estimate the effect of vitamin D on pain. We included 20 423 participants who completed ≥1 PIQ-6. In blood samples collected from 3943 randomly selected participants (∼800 per year), the mean (sd) 25(OH)D concentrations were 77 (25) and 115 (30) nmol/l in the placebo and vitamin D groups, respectively. Most (76 %) participants were predicted to have a 25(OH)D concentration >50 nmol/l at baseline. The mean PIQ-6 score was similar across all surveys (∼50·4). The adjusted mean difference in PIQ-6 score (vitamin D cf placebo) was 0·02 (95 % CI −0·20, 0·25). The proportions of participants with some or more pain impact and with presence of any bodily pain were also similar between groups (both prevalence ratios 1·01, 95 % CI 0·99, 1·03). In conclusion, supplementation with 60 000 IU of vitamin D3/month had a negligible effect on bodily pain.
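As an illustration of the analysis described, the sketch below fits the two kinds of model named in the abstract: a linear model for the continuous PIQ-6 score and a log-binomial model yielding a prevalence ratio for a binary pain outcome, using statsmodels. The data file and variable names are hypothetical, not from the trial.

```python
# Illustrative sketch, not the trial's actual code: an OLS model for the
# continuous PIQ-6 score and a log-binomial GLM (binomial family, log link)
# giving a prevalence ratio. File/column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("dhealth_pain.csv")
X = sm.add_constant(df[["vitamin_d_arm"]])   # 1 = vitamin D, 0 = placebo

# Mean difference in PIQ-6 score (vitamin D vs placebo)
linear = sm.OLS(df["piq6_score"], X).fit()
mean_difference = linear.params["vitamin_d_arm"]

# Prevalence ratio for 'presence of any bodily pain'
log_binomial = sm.GLM(
    df["any_pain"], X,
    family=sm.families.Binomial(link=sm.families.links.Log()),
).fit()
prevalence_ratio = np.exp(log_binomial.params["vitamin_d_arm"])
```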
This chapter reviews the critical role that a contract research organization performs in developing new therapeutics for Alzheimer’s disease (AD). Late-phase AD trials are lengthy and expensive, and require specialized expertise and experience to optimize signal detection. We review the intricacies of AD protocol design, selection of optimal neuropsychiatric tests for different AD stages, rater training, imaging technologies, and the use of biomarkers. Careful planning of trials involves well-considered investigator selection, site monitoring, and patient recruitment and retention strategies. Examples of operational issues in large global trials are also given, including adaptations made necessary by the COVID-19 pandemic.
The first demonstration of laser action in ruby was made in 1960 by T. H. Maiman of Hughes Research Laboratories, USA. Many laboratories worldwide began the search for lasers using different materials, operating at different wavelengths. In the UK, academia, industry and the central laboratories took up the challenge from the earliest days to develop these systems for a broad range of applications. This historical review looks at the contribution the UK has made to the advancement of the technology, the development of systems and components and their exploitation over the last 60 years.
Historically, there are inconsistencies in the calculation of whole-grain intake, particularly through the use of highly variable whole-grain food definitions. The current study aimed to determine the impact of using a whole-grain food definition on whole-grain intake estimation in Australian and Swedish national cohorts and to investigate impacts on apparent associations with CVD risk factors. It utilised the Australian National Nutrition and Physical Activity Survey 2011–2012, the Swedish Riksmaten adults 2010–2011 survey and relevant food composition databases. Whole-grain intakes and associations with CVD risk factors were determined based on consumption of foods complying with the Healthgrain definition (≥30 % whole grain (dry weight), more whole than refined grain and meeting accepted standards for ‘healthy foods’ based on local regulations) and compared with absolute whole-grain intake. Compliance of whole-grain-containing foods with the Healthgrain definition was low in both Sweden (twenty-nine of 155 foods) and Australia (214 of 609 foods). Significant mean differences of up to 24·6 g/10 MJ per d in whole-grain intake were highlighted using Swedish data. Despite these large differences, application of a whole-grain food definition altered very few associations with CVD risk factors: specifically, associations with body weight and blood glucose in Australian adults when the whole-grain food definition was applied, and with some anthropometric measures in Swedish data when a high percentage of whole-grain content was required. Use of whole-grain food definitions appears to have limited impact on measuring whole-grain health benefits but may have greater relevance in public health messaging.
The volume of evidence from scientific research and wider observation is greater than ever before, but much of it is inconsistent and scattered in fragments across increasingly diverse sources. This makes it hard for decision-makers to find, access and interpret all the relevant information on a particular topic, resolve seemingly contradictory results or simply identify where evidence is lacking. Evidence synthesis is the process of searching for and summarising a body of research on a specific topic in order to inform decisions, but it is often poorly conducted and susceptible to bias. In response to these problems, more rigorous methodologies have been developed and made available to the conservation and environmental management community by the Collaboration for Environmental Evidence. We explain when and why these methods are appropriate, and how evidence can be synthesised, shared, used as a public good and made to benefit wider society. We discuss new developments with the potential to address barriers to evidence synthesis and communication, and how these practices might be mainstreamed in conservation decision-making.
Objective:
To determine the impacts of using a whole grain food definition on measurement of whole grain intake compared with calculation of total grams of intake irrespective of the source.
Design:
The Australian whole grain database was expanded to identify foods that comply with the Healthgrain whole grain food definition (≥30 % whole grain on a dry weight basis, whole grain content exceeding refined grain, and meeting accepted standards for healthy foods based on local regulations). Secondary analysis of the National Nutrition and Physical Activity Survey (NNPAS) 2011–2012 dietary intake data included calculation of whole grain intakes based on intake from foods complying with the Healthgrain definition. These were compared with intake values in which grams of whole grain from any food had been included.
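A minimal sketch of the Healthgrain compliance check as stated above, assuming per-food values for whole grain dry-weight percentage and grams of whole and refined grain are available; the ‘healthy food’ criterion depends on local regulations and is reduced to a boolean flag here.

```python
# Sketch of the Healthgrain criteria: >=30 % whole grain (dry weight),
# more whole grain than refined grain, and compliance with local
# 'healthy food' standards (a placeholder flag in this illustration).
def complies_with_healthgrain(whole_grain_dw_pct: float,
                              whole_grain_g: float,
                              refined_grain_g: float,
                              meets_healthy_food_standard: bool) -> bool:
    return (whole_grain_dw_pct >= 30.0
            and whole_grain_g > refined_grain_g
            and meets_healthy_food_standard)

# Example: a bread that is 45 % whole grain (dry weight), with more whole
# than refined grain, meeting the local healthy-food standard.
print(complies_with_healthgrain(45.0, 30.0, 20.0, True))  # True
```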
Setting:
Australia.
Participants:
Australians (≥2 years) who participated in the NNPAS 2011–2012 (n 12 153).
Results:
Following expansion of the whole grain database, 214 of the 609 foods containing any amount of whole grain were compliant with the Healthgrain definition. Significant mean differences (all P < 0·05) of 2·84–6·25 g/d in whole grain intake (5·91–9·44 g/d energy adjusted) were found across all age groups when applying the Healthgrain definition compared with values from foods containing any whole grain.
Conclusions:
Application of a whole grain food definition has a substantial impact on calculations of population whole grain intakes. While use of such definitions may prove beneficial in settings such as whole grain promotion, the resulting underestimation of total intake may affect identification of associations between whole grain intake and health outcomes.
Objective:
To determine the relationship between falls and deficits in specific cognitive domains in older adults.
Design:
An analysis of the English Longitudinal Study of Ageing (ELSA) cohort.
Setting:
United Kingdom community-based.
Participants:
5197 community-dwelling older adults recruited to a prospective longitudinal cohort study.
Measurements:
Data on the occurrence and number of falls during a 12-month follow-up period were assessed against the specific cognitive domains of memory, numeracy skills, and executive function. Binomial logistic regression was performed to evaluate the association between each cognitive domain and the dichotomous outcome of falls in the preceding 12 months, using unadjusted and adjusted models.
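For illustration, here is a minimal sketch of the kind of unadjusted analysis described: a binomial logistic regression of faller status on a single cognitive domain score, with coefficients exponentiated to odds ratios. The data file and column names are hypothetical, not taken from ELSA.

```python
# Hypothetical sketch: logistic regression of faller status (any fall in
# 12 months) on one cognitive domain score, reported as odds ratios.
import numpy as np
import pandas as pd
import statsmodels.api as sm

elsa = pd.read_csv("elsa_falls.csv")             # hypothetical extract
X = sm.add_constant(elsa[["memory_score"]])      # one cognitive domain
model = sm.Logit(elsa["fell_past_12m"], X).fit()

odds_ratios = np.exp(model.params)               # exponentiate to ORs
conf_int = np.exp(model.conf_int())              # 95% CIs on the OR scale
print(odds_ratios, conf_int, sep="\n")
```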
Results:
Of the 5197 participants included in the analysis, 1308 (25%) reported a fall in the preceding 12 months. There was no significant association between the occurrence of a fall and specific forms of cognitive dysfunction after adjusting for self-reported hearing, self-reported eyesight, and functional performance. After adjustment, only orientation (odds ratio [OR]: 0.80; 95% confidence interval [CI]: 0.65–0.98; p = 0.03) and verbal fluency (adjusted OR: 0.98; 95% CI: 0.96–1.00; p = 0.05) remained significant predictors of recurrent falls.
Conclusions:
The cognitive phenotype rather than cognitive impairment per se may predict future falls in those presenting with more than one fall.
Although food from grazed animals is increasingly sought by consumers because of perceived animal welfare advantages, grazing systems present the farmer and the animal with unique challenges. The system depends almost daily on the climate for feed supply, and the importation of large amounts of feed from off farm, with its associated labour and mechanisation costs, can reduce economic viability. Furthermore, the cow may have to walk long distances and must be able to harvest feed efficiently in a highly competitive environment because of the need for high levels of pasture utilisation. She must also be: (1) highly fertile, with a requirement for pregnancy within ~80 days post-calving; (2) ‘easy care’, because large herds are managed with limited labour; (3) able to walk long distances; and (4) robust to changes in feed supply and quality, so that short-term nutritional insults do not unduly influence her production and reproduction cycles. These demands are very different from, and additional to, those placed on cows in housed systems offered pre-made mixed rations. Moreover, growing expectations around environmental sustainability and animal welfare, in conjunction with the need for greater system-level biological efficiency (i.e. ‘sustainable intensification’), will add to the ‘robustness’ requirements of cows in the future. Increasingly, there is evidence that certain genotypes of cows perform better or worse in grazing systems, indicating a genotype×environment interaction. This has led to the development of tailored breeding objectives within countries for important heritable traits to maximise the profitability and sustainability of each production system. To date, these breeding objectives have focussed on the more easily measured traits and those of highest relative economic importance. In the future, there will be greater emphasis on traits that are more difficult to measure but are important to the quality of life of the animal in each production system and to reducing the system’s environmental footprint.
Most studies underline the contribution of heritable factors to psychiatric disorders. However, heritability estimates depend on the population under study, the diagnostic instruments, and study designs, each of which has its own assumptions, strengths, and biases. We aim to test the homogeneity of heritability estimates between two powerful, state-of-the-art study designs for eight psychiatric disorders.
Methods
We assessed heritability based on data of Swedish siblings (N = 4 408 646 full and maternal half-siblings), and based on summary data of eight samples with measured genotypes (N = 125 533 cases and 208 215 controls). All data were based on standard diagnostic criteria. Eight psychiatric disorders were studied: (1) alcohol dependence (AD), (2) anorexia nervosa, (3) attention deficit/hyperactivity disorder (ADHD), (4) autism spectrum disorder, (5) bipolar disorder, (6) major depressive disorder, (7) obsessive-compulsive disorder (OCD), and (8) schizophrenia.
Results
Heritability estimates from the sibling data varied from 0.30 for major depressive disorder to 0.80 for ADHD. The estimates based on measured genotypes were lower, ranging from 0.10 for AD to 0.28 for OCD, but were significant and correlated positively (0.19) with the national sibling-based estimates. When OCD was removed from the data, the correlation increased to 0.50.
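The comparison reported here reduces to correlating two vectors of eight heritability estimates, with OCD excluded in the sensitivity analysis. The sketch below shows that calculation in outline; the estimate values themselves are deliberately left as inputs rather than reproduced.

```python
# Outline of the reported comparison: Pearson correlation between
# sibling-based and SNP-based heritability estimates across disorders,
# optionally excluding some (e.g. OCD). Estimates are inputs, not data.
import numpy as np

def heritability_correlation(sibling_h2, snp_h2, exclude=()):
    disorders = [d for d in sibling_h2 if d not in exclude]
    a = np.array([sibling_h2[d] for d in disorders])
    b = np.array([snp_h2[d] for d in disorders])
    return float(np.corrcoef(a, b)[0, 1])

# Usage (with dicts mapping disorder -> estimate):
#   heritability_correlation(sib, snp)           # reported: 0.19
#   heritability_correlation(sib, snp, {"OCD"})  # reported: 0.50
```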
Conclusions
Given the unique character of each study design, the convergent findings for these eight psychiatric conditions suggest that heritability estimates are robust across different methods. The findings also highlight large differences in genetic and environmental influences between psychiatric disorders, providing future directions for etiological psychiatric research.
Leafy spurge is a troublesome, exotic weed in the northern Great Plains of the United States. It produces showy yellow bracts during June that give the weed a conspicuous appearance. A study was conducted to determine the feasibility of using remote sensing techniques to detect leafy spurge in this phenological stage. Study sites were located in North Dakota and Montana. Plant canopy reflectance measurements showed that leafy spurge had higher visible (0.63- to 0.69-μm) reflectance than several associated plant species. The conspicuous yellow bracts of leafy spurge gave it distinct yellow-green and pink images on conventional color and color-infrared aerial photographs, respectively. Leafy spurge could also be distinguished on conventional color video imagery, where it had a golden-yellow image response. Quantitative data obtained from digitized video images showed that leafy spurge had statistically different digital values from those of associated vegetation and soil. Computer analyses of video images showed that light reflected from leafy spurge populations could be distinguished from that of associated vegetation and quantified, permitting area estimates of leafy spurge populations. Large-format conventional color photographs of Theodore Roosevelt National Park near Medora, ND, were digitized and integrated with a geographic information system to produce a map of leafy spurge populations within the park that can be used to monitor the spread or decline of the weed.
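As a rough illustration of how digital values from digitized imagery can be turned into area estimates, the sketch below thresholds a single-band 8-bit image on a value range assumed to correspond to leafy spurge and converts the pixel count to hectares. The threshold band and pixel size are invented for the example and are not values from the study.

```python
# Hedged sketch: classify pixels whose digital values fall inside a band
# assumed to correspond to leafy spurge, then convert pixel count to area.
import numpy as np

def spurge_area_ha(image: np.ndarray, lo: int = 140, hi: int = 180,
                   pixel_area_m2: float = 4.0) -> float:
    mask = (image >= lo) & (image <= hi)        # pixels in the spurge band
    return mask.sum() * pixel_area_m2 / 10_000  # m^2 -> hectares

rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(480, 640))   # stand-in 8-bit video frame
print(f"estimated infestation: {spurge_area_ha(frame):.1f} ha")
```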
Native polyacrylamide gel electrophoresis (PAGE) and enzyme activity staining were used to identify possible progenitor species of velvetleaf (Abutilon theophrasti Medic. # ABUTH). Multiple forms of superoxide dismutase activity were observed in each of the plants surveyed. Three enzyme forms were common to all the species and biotypes, while one form differed in the velvetleaf biotypes compared with the other species. Multiple forms of peroxidase activity were also detected. The three velvetleaf biotypes possessed identical enzyme forms with minimal similarity to the peroxidase forms found in the other species. Multiple forms of esterase activity separated as two non-overlapping groups: a slowly migrating group was observed in all the velvetleaf biotypes, and a more rapidly migrating group characterized the remaining Abutilon species. The results of this study indicated that the progenitors of velvetleaf were not among the species surveyed and suggested that the progenitors may no longer be extant.
A scentless plant bug feeds on velvetleaf seeds. Fungi, dominated by the genera Fusarium and Alternaria, were isolated from insect-attacked seeds at levels related to insect density on the plants. The combined effects of insect feeding and fungal infection decreased seed germination. Burial of insect-attacked seeds in soil for 24 months reduced seed survival and increased Fusarium infection. The decreases in velvetleaf seed viability and survival in soil caused by a seed-feeding insect and associated seed fungi suggest that subsequent velvetleaf infestations can be reduced through integrated use of the two biological control agents.
Field infestations of a seed-feeding insect that developed from overwintered populations reduced the viability of velvetleaf seed to 17.5 and 15.5% at two locations in central Missouri, compared with 95.5 and 87.5% at insect-free sites. Insect feeding increased the proportion of seed carrying seedborne microorganisms to as much as 98%, compared with an average fungal infection of 8% for seed not exposed to the insect. There was a strong negative correlation between fungal infection associated with insect feeding and percent velvetleaf seed viability. The insect transmits microorganisms externally, much as pollen is carried by various other insect species, rather than by ingestion and regurgitation. The effectiveness of the insect in reducing seed viability and seed production in central Missouri is limited mainly by the time required to build up populations capable of significantly affecting early-season velvetleaf seed production.
The evolution of glyphosate resistance in weedy species places an environmentally benign herbicide in peril. The first report of a dicot plant with evolved glyphosate resistance was horseweed, in 2001. Since then, several species have evolved glyphosate resistance, yet genomic information about nontarget resistance mechanisms in any of them ranges from little to none. Here, we report a study combining iGentifier transcriptome analysis, cDNA sequencing, and heterologous microarray analysis to explore potential molecular and transcriptomic mechanisms of nontarget glyphosate resistance in horseweed. The results indicate that similar molecular mechanisms might underlie nontarget herbicide resistance across multiple resistant plants from different locations, even though resistance among these plants likely evolved independently and available evidence suggests it has evolved at least four separate times. In addition, both the microarray and sequence analyses identified non–target-site resistance candidate genes for follow-on functional genomics analysis.
Reconstructions of past environmental changes are critical for understanding the natural variability of Earth's climate system and for providing a context for present and future global change. Radiocarbon-dated lake sediments from Lake CF3, northeastern Baffin Island, Arctic Canada, are used to reconstruct past environmental conditions over the last 11,200 years. Numerous proxies, including chironomid-inferred July air temperatures, diatom-inferred lakewater pH, and sediment organic matter, reveal a pronounced Holocene thermal maximum as much as 5°C warmer than historic summer temperatures from ∼10,000 to 8500 cal yr B.P. Following rapid cooling ∼8500 cal yr B.P., Lake CF3 proxies indicate cooling through the late Holocene. At many sites in northeastern Canada, the Holocene thermal maximum occurred later than at Lake CF3; this late onset of Holocene warmth is generally attributed to the impacts of the decaying Laurentide Ice Sheet on early Holocene temperatures in northeastern Canada. However, the lacustrine proxies in Lake CF3 apparently responded to insolation-driven warmth, despite the proximity of Lake CF3 to the Laurentide Ice Sheet and its meltwater. The magnitude and timing of the Holocene thermal maximum at Lake CF3 indicate that temperatures and environmental conditions at this site are highly sensitive to changes in radiative forcing.
New Zealand bittercress is reported as new to the United States. While collecting specimens to determine which Cardamine species occur in the nursery trade, New Zealand bittercress was discovered in a container nursery in Wilkes County, North Carolina. The nursery tracked the shipment of contaminated plants to a wholesale nursery in Washington County, Oregon. It was subsequently confirmed that New Zealand bittercress also occurs in a nursery in Clackamas County, Oregon, and it has likely been distributed throughout the United States as a contaminant in container-grown ornamental plants. Thus far, there have been no reports of naturalized populations outside of container nursery crop production facilities.
The primary objective of this study was to assess the impact of catfish imports and tariffs on the U.S. catfish industry, with particular focus on the U.S. International Trade Commission ruling on Vietnam in 2003. Given the importance of Vietnam to the U.S. catfish market, it was assumed that catfish import prices would increase by 35 percent if the maximum tariff were imposed on catfish from Vietnam. With the tariff, domestic catfish prices at the wholesale level would increase by $0.06 per lb, and farm prices by $0.03 per lb. Processor sales would increase by 1.66 percent. Total welfare at the wholesale level would increase from $69.2 million to $71.7 million, an increase of about 3.63 percent, and processor and farm revenue would increase by 4.4 percent and 5.8 percent, respectively. These results represent the greatest possible benefit and suggest modest gains for the U.S. catfish industry.