This study examined whether supplementation with collagen peptides (CP) affects appetite and post-exercise energy intake in healthy active females.
In this randomised, double-blind crossover study, 15 healthy females (23 ± 3 y) consumed 15 g/day of CP or a taste-matched non-energy control (CON) for 7 days. On day 7, participants cycled for 45 min at ∼55% Wmax before consuming the final supplement. Sixty min post-supplementation, an ad libitum meal was provided and energy intake recorded. Subjective appetite sensations were measured daily for 6 days (pre- and 30 min post-supplement), and from pre (0 min) to 280 min post-exercise on day 7. Blood glucose and hormone concentrations (total ghrelin, glucagon-like peptide-1 (GLP-1), peptide YY (PYY), cholecystokinin (CCK), dipeptidyl peptidase-4 (sDPP4), leptin, and insulin) were measured fasted at baseline (day 0), then pre-breakfast (0 min), post-exercise (100 min), post-supplement (115, 130, 145, 160 min) and post-meal (220, 280 min) on day 7.
Ad libitum energy intake was ∼10% (∼41 kcal) lower in the CP trial (P=0.037). There was no difference in gastrointestinal symptoms or subjective appetite sensations throughout the trial (P≥0.412). Total plasma GLP-1 (area under the curve, CON: 6369±2330; CP: 9064±3021 pmol/L; P<0.001) and insulin (+80% at peak; P<0.001) were higher after CP. Plasma ghrelin and leptin were lower in CP (condition effect; P≤0.032). PYY, CCK, sDPP4 and glucose did not differ between CP and CON (P≥0.100).
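The results above summarise plasma GLP-1 as an area under the concentration-time curve across the day-7 sampling points. As a minimal sketch of how such a summary can be computed, the snippet below applies the trapezoidal rule to a hypothetical GLP-1 time course at those sampling times; the concentration values are invented for illustration and are not data from the study, and whether the authors used total or incremental AUC is not stated here.

```python
import numpy as np

# Day-7 sampling times (min) listed in the abstract: pre-breakfast (0),
# post-exercise (100), post-supplement (115, 130, 145, 160), post-meal (220, 280).
t = np.array([0, 100, 115, 130, 145, 160, 220, 280], dtype=float)

# Hypothetical plasma GLP-1 concentrations (pmol/L) for one participant --
# illustrative numbers only, not data from the study.
glp1 = np.array([18, 22, 35, 40, 38, 33, 45, 30], dtype=float)

# Total area under the concentration-time curve by the trapezoidal rule.
auc = np.sum(0.5 * (glp1[1:] + glp1[:-1]) * np.diff(t))
print(f"GLP-1 AUC ≈ {auc:.0f} pmol/L·min")
```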
CP supplementation following exercise increased GLP-1 and insulin concentrations and reduced ad libitum energy intake at a subsequent meal in physically active females.
The stars of the Milky Way carry the chemical history of our Galaxy in their atmospheres as they journey through its vast expanse. Like barcodes, the chemical fingerprints of stars can be read from high-resolution spectroscopy. The fourth data release (DR4) of the Galactic Archaeology with HERMES (GALAH) Survey, based on a decade of observations, provides the chemical abundances of up to 32 elements for 917 588 stars that also have exquisite astrometric data from the Gaia satellite. For the first time, these elements include life-essential nitrogen to complement carbon and oxygen, as well as more measurements of rare-earth elements critical to modern-life electronics, offering unparalleled insights into the chemical composition of the Milky Way. For this release, we use neural networks to simultaneously fit stellar parameters and abundances across the whole wavelength range, leveraging synthetic grids computed with Spectroscopy Made Easy. These grids account for atomic line formation in non-local thermodynamic equilibrium for 14 elements. In a two-iteration process, we first fit stellar labels to all 1 085 520 spectra, then co-add repeated observations and refine these labels using astrometric data from Gaia and 2MASS photometry, improving the accuracy and precision of stellar parameters and abundances. Our validation thoroughly assesses the reliability of spectroscopic measurements and highlights key caveats. GALAH DR4 represents yet another milestone in Galactic archaeology, combining detailed chemical compositions from multiple nucleosynthetic channels with kinematic information and age estimates. The resulting dataset, covering nearly a million stars, opens new avenues for understanding not only the chemical and dynamical history of the Milky Way but also the broader questions of the origin of elements and the evolution of planets, stars, and galaxies.
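The paragraph above describes fitting stellar parameters and abundances simultaneously across the whole wavelength range with neural networks trained on synthetic Spectroscopy Made Easy grids. The sketch below illustrates only the general idea of full-spectrum label fitting: a toy analytic 'emulator' stands in for the trained neural network and the synthetic grid, and the wavelength window, line shapes, labels and optimiser are assumptions for illustration rather than the DR4 pipeline.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy stand-in for a spectrum emulator: two Gaussian lines whose depths and
# widths depend on the labels (Teff, logg, [Fe/H]) in a simple, invented way.
wave = np.linspace(6500.0, 6600.0, 500)   # wavelength grid (Angstrom), illustrative

def emulate_spectrum(labels):
    teff, logg, feh = labels
    width = 0.3 + 0.05 * logg                      # toy gravity-sensitive line width
    depth1 = 0.5 * (1.0 + feh) * (5777.0 / teff)   # toy metal-sensitive line
    depth2 = 0.4 * (5777.0 / teff) ** 2            # toy temperature-sensitive line
    return (1.0
            - depth1 * np.exp(-0.5 * ((wave - 6530.0) / width) ** 2)
            - depth2 * np.exp(-0.5 * ((wave - 6570.0) / width) ** 2))

# "Observed" spectrum: the emulator evaluated at known labels plus noise.
rng = np.random.default_rng(0)
true_labels = np.array([5600.0, 4.4, -0.2])        # Teff (K), logg, [Fe/H]
sigma = 0.01
obs = emulate_spectrum(true_labels) + rng.normal(0.0, sigma, wave.size)

# Fit all labels simultaneously by minimising per-pixel residuals over the
# whole window -- the same spirit as full-spectrum fitting, not the DR4 code.
def residuals(labels):
    return (emulate_spectrum(labels) - obs) / sigma

fit = least_squares(residuals, x0=[5500.0, 4.0, 0.0], x_scale=[1000.0, 1.0, 0.1])
print("recovered labels:", fit.x)
```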
This article reconstructs the size and organisation of the rural market in hired labour in fourteenth-century England, providing a comparative reference point for arrangements elsewhere in medieval Europe. Quantitative assessment of 1,445 manorial court sessions from six manors casts new light on the English labour market, which was larger and less regulated than previously assumed, and shows that the government's wide-ranging labour legislation in the wake of the Black Death was novel in its scale and provisions. Contrary to received wisdom, manorial authorities made few efforts to regulate labour. The older view had placed an over-reliance on the early work of W.O. Ault and had ignored the significance of nil returns. The reasons for the lack of regulation, and its implications for our understanding of the complex interaction between pandemics, labour markets, and legal responses, are explored. Finally, the study illustrates how legal responses to pandemics can have inadvertent yet profound consequences.
Introduction. Some medical centers and surgeons require patients to stop smoking cigarettes prior to elective orthopaedic surgeries in an effort to decrease surgical complications. Given higher rates of smoking among rural individuals, rural patients may be disproportionately impacted by these requirements. We assessed the perceptions and experiences of rural-residing Veterans and clinicians related to this requirement. Methods. We conducted qualitative semistructured one-on-one interviews with 26 rural-residing Veterans, 10 VA orthopaedic surgery staff (from two Veterans Integrated Services Networks), 24 PCPs who serve rural Veterans (14 VA; 10 non-VA), and 4 VA pharmacists. Using the knowledge, attitudes, and behavior framework, we performed conventional content analysis. Results. We found three primary themes across respondents: (1) widely varying knowledge of the requirement and its evidence base; (2) strong personal attitudes toward the requirement; and (3) views on the implementation and possible implications of the requirement. All surgery staff reported knowledge of requirements at their institution. VA PCPs reported knowledge of requirements but typically could not recall specifics. Most patients were unaware. The majority of respondents felt this requirement could increase motivation to quit smoking. Some PCPs felt a more thorough explanation of smoking-related complications would result in increased quit attempts. About half of all patients believed the requirement was reasonable, regardless of their initial awareness of it. Respondents expressed little concern that the requirement might increase rural-urban disparities. Most PCPs and patients felt that there should be exceptions for allowing surgery, while surgical staff disagreed. Discussion. Most respondents thought elective surgery was a good motivator to quit smoking, but patients, PCPs, and surgical staff differed on whether there should be exceptions to the requirement that patients quit preoperatively. Future efforts to augment perioperative smoking cessation may benefit from improving coordination across services and educating patients more about the benefits of quitting.
In recent decades, the use of conditionality backed by benefit sanctions for those claiming unemployment and related benefits has become widespread in the social security systems of high-income countries. Critics argue that sanctions may be ineffective in bringing people back to employment, or indeed harmful in a range of ways. Existing reviews largely assess the labour market impacts of sanctions, but our understanding of the wider impacts is more limited. We report results from a scoping review of the international quantitative research evidence on both the labour market and wider impacts of benefit sanctions. Following systematic search and screening, we extracted data from 94 studies reporting on 253 outcome measures. We provide a narrative summary, paying attention to the ability of the studies to support causal inference. Despite variation in the evidence base and study designs, we found that labour market studies, covering two-thirds of our sample, consistently reported positive impacts for employment but negative impacts for job quality and stability in the longer term, along with increased transitions to non-employment or economic inactivity. Although largely relying on non-experimental designs, wider-outcome studies reported significant associations with increased material hardship and health problems. There was also some evidence that sanctions were associated with increased child maltreatment and poorer child well-being. Lastly, the review highlights the generally poor quality of the evidence base in this area, with few studies employing research methods designed to identify the causal impact of sanctions, especially in relation to wider impacts.
For modern readers, John Gower is likely to appear as a thinker who was ‘conservative’ in his outlook, rigidly elitist in his views and relentlessly condemnatory of those whom he saw as challenging the existing social hierarchy. Barrie Dobson, for instance, awarded him ‘the title of Jeremiah of late fourteenth-century England’ and bemoaned his ‘monotonous, heavy-handed and extremely pessimistic approach to his theme’. If Gower's reputation is justified, then it owes a good deal to his treatment of peasants in general and to his response to their participation in the Great Revolt of 1381 in particular, where he portrays the rebels as vicious animals without reason or merit. Gower was writing during a period of tumultuous change in English society and economy, following successive devastating outbreaks of plague after 1348. Here we set out the reality of the changes which English society underwent during this period and examine the intellectual framework that Gower employed to make sense of them. The abundance of central government and local sources which has survived from this period has allowed historians in recent years to recover a great deal about the lives of the lowest ranks of society, to reassess the changes after the Black Death and to re-interpret the causes of the Great Revolt. It will be argued here that fundamental changes to the labour market shook the ideological and moral framework of society, and that the attempts of the ruling elite to make sense of what was happening – and their inability to prevent it – informed the development of a poetic common voice in Ricardian England. The historical evidence indicates that Gower was not only a conservative thinker but also backward-looking even by the standards of his own day.
The Peasants and the Third Estate
Of the estimated 2.8m people in England in the 1370s, the overwhelming majority – perhaps 85% or more – belonged to what contemporary social commentators defined as the third estate, the laboratores, whose ordained role was to provide food, clothing and shelter through manual labour for the other two estates: the oratores, who prayed for society and sought to bring it salvation, and the bellatores, whose task was to provide physical protection and justice.
The Black Death first reduced England’s population by nearly one half and then prevented demographic recovery. Volatility characterised the 1350s and 1360s, due to extreme weather conditions, poor harvests, contracting output, disrupted markets, labour shortages and a high turnover of people. Towns struggled to assimilate the influx of migrants. The availability of land on favourable terms, and of well-paid employment, greatly benefited the lower orders of society, but caused consternation to the ruling elite. The government responded with a wave of legislation to regulate labour mobility, prices and wages, so as to impose upon workers the discipline of manual labour deemed essential to the common profit. By the 1380s equilibrium had replaced the volatility. The economy had contracted and shifted from arable production to pastoral and manufactured products. Towns were smaller, but their residents tended to be wealthier. The attitude of the authorities to labour had become more realistic and less idealistic, emphasising its noble qualities rather than denouncing its vices.
Although conceptual design is a fundamental process through which design decisions are made, its focus is on finding the right solution. Is finding the right solution enough for a good design? Defining the problem or applying a solution-focused process may not be enough to create the differentiation that today's variable conditions demand. This can be overcome by seeking meaning instead of seeking a solution. The purpose of this article is to develop an approach to the conceptual design process in engineering design that starts from design thinking and focuses on seeking meaning for products. Focusing on a search for meaning in engineering design offers advantages such as creating unique values and sustainable competitive advantage.
Ferrierite is the name for a series of zeolite-group minerals that includes three species with the same ferrierite framework (FER) crystal structure but different extra-framework cations. Recent studies have shown that ferrierite can exhibit a fibrous-asbestiform crystal habit and may possess the same properties as carcinogenic fibrous erionite. Characterisation of the ferrierite in and around a mine location will be helpful in assessing the potential for toxic outcomes of exposure in the mine and in any local population.
The zeolite-rich tuff deposit of Lovelock, Nevada, USA is the largest occurrence of diagenetic ferrierite-Mg. A previous survey reported that ferrierite hosted in these rocks displays a fibrous morphology. However, those observations concerned a limited number of samples, and until now there has been little evidence of a widespread occurrence of fibrous ferrierite in the Lovelock deposit.
The main goal of this study was to perform a mineralogical and morphometric characterisation of the tuff deposit at Lovelock and evaluate the distribution of fibrous ferrierite in the outcrop. For this purpose, a multi-analytical approach including powder X-ray diffraction, scanning and transmission microscopies, micro-Raman spectroscopy, thermal analyses, and surface-area determination was applied.
The results show that fibrous ferrierite is widespread and intermixed with mordenite and orthoclase, although its spatial distribution in the bedrock varies. The crystal habit of the ferrierite ranges from prismatic to asbestiform (elongated, thin and slightly flexible), and fibres are aggregated in bundles. According to the WHO counting criteria, most of the ferrierite fibres can be classified as breathable. Pending confirmatory in vitro and in vivo tests of the actual toxicity/pathogenicity potential of this mineral fibre, a precautionary approach to mining operations in this area is recommended to reduce the risk of exposure.
New approaches are needed to safely reduce emergency admissions to hospital by targeting interventions effectively in primary care. A predictive risk stratification tool (PRISM) identifies each registered patient's risk of an emergency admission in the following year, allowing practitioners to identify and manage those at higher risk. We evaluated the introduction of PRISM in primary care in one area of the United Kingdom, assessing its impact on emergency admissions and other service use.
METHODS:
We conducted a randomized stepped wedge trial with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. PRISM was implemented in eleven primary care practice clusters (total thirty-two practices) over a year from March 2013. We analyzed routine linked data outcomes for 18 months.
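To make the analysis concrete, the sketch below shows one way an adjusted intervention effect could be estimated from patient-level admission counts in a stepped wedge design of this kind: a Poisson GEE with practices as clusters and person-time as exposure, fitted to simulated data. The variable names, simulated numbers and the choice of a GEE are assumptions made for illustration; they are not the trial's actual analysis model or data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated patient-level data -- purely illustrative, not trial data.
rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "practice": rng.integers(0, 32, n),          # 32 practices acting as clusters
    "phase": rng.integers(0, 2, n),              # 0 = control phase, 1 = intervention phase
    "risk_group": rng.integers(1, 5, n),         # ranked risk group (1 = lowest)
    "years_at_risk": rng.uniform(0.5, 1.5, n),   # person-time denominator
})
rate = 0.20 * np.exp(0.05 * df["phase"] + 0.30 * (df["risk_group"] - 1))
df["admissions"] = rng.poisson(rate * df["years_at_risk"])

# Design matrix: intervention phase plus risk-group indicators.
exog = pd.get_dummies(df["risk_group"], prefix="risk", drop_first=True).astype(float)
exog["phase"] = df["phase"].astype(float)
exog = sm.add_constant(exog)

# Poisson GEE with exchangeable correlation within practice clusters and
# person-time as exposure; the "phase" coefficient is the adjusted
# intervention effect on the emergency admission rate.
model = sm.GEE(df["admissions"], exog, groups=df["practice"],
               family=sm.families.Poisson(),
               cov_struct=sm.cov_struct.Exchangeable(),
               exposure=df["years_at_risk"])
print(model.fit().summary())
```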
RESULTS:
We included outcomes for 230,099 registered patients, assigned to ranked risk groups.
Overall, the rate of emergency admissions was higher in the intervention phase than in the control phase: adjusted difference in number of emergency admissions per participant per year at risk, delta = .011 (95 percent Confidence Interval, CI .010, .013). Patients in the intervention phase spent more days in hospital per year: adjusted delta = .029 (95 percent CI .026, .031). Both effects were consistent across risk groups.
Primary care activity increased in the intervention phase overall: delta = .011 (95 percent CI .007, .014), except in the two highest risk groups, which showed a decrease in the number of days with recorded activity.
CONCLUSIONS:
Introduction of a predictive risk model in primary care was associated with increased emergency episodes across the general practice population and at each risk level, in contrast to the intended purpose of the model. Future evaluation work could assess the impact of targeting of different services to patients across different levels of risk, rather than the current policy focus on those at highest risk.
Emergency admissions to hospital are a major financial burden on health services. In one area of the United Kingdom (UK), we evaluated a predictive risk stratification tool (PRISM) designed to support primary care practitioners to identify and manage patients at high risk of admission. We assessed the costs of implementing PRISM and its impact on health services costs. At the same time as the study, but independent of it, an incentive payment (‘QOF’) was introduced to encourage primary care practitioners to identify high risk patients and manage their care.
METHODS:
We conducted a randomized stepped wedge trial in thirty-two practices, with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. We analysed routine linked data on patient outcomes for 18 months (February 2013 – September 2014). We assigned standard unit costs in pound sterling to the resources utilized by each patient. Cost differences between the two study phases were used in conjunction with differences in the primary outcome (emergency admissions) to undertake a cost-effectiveness analysis.
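As a minimal illustration of how a cost difference and an outcome difference combine in a cost-effectiveness calculation, the snippet below forms an incremental cost-effectiveness ratio from the two per-patient adjusted differences reported in these abstracts (GBP 76 per patient per year and 0.011 emergency admissions per participant per year). Pairing these figures in this way is purely illustrative arithmetic, not the study's published analysis.

```python
# Incremental cost-effectiveness ratio: extra cost per additional emergency admission.
def incremental_ratio(delta_cost_gbp, delta_admissions):
    """Incremental cost (GBP) per additional emergency admission per patient-year."""
    return delta_cost_gbp / delta_admissions

# Adjusted differences reported in the abstracts above, used only to show the arithmetic.
print(f"GBP {incremental_ratio(76.0, 0.011):,.0f} per additional emergency admission")
```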
RESULTS:
We included outcomes for 230,099 registered patients. We estimated a PRISM implementation cost of GBP0.12 per patient per year.
Costs of emergency department attendances, outpatient visits, emergency and elective admissions to hospital, and general practice activity were higher per patient per year in the intervention phase than in the control phase (adjusted δ = GBP76, 95 percent Confidence Interval, CI GBP46, GBP106), an effect that was consistent across risk levels and generally increased with risk level.
CONCLUSIONS:
Despite low reported use of PRISM, it was associated with increased healthcare expenditure. This effect was unexpected and in the opposite direction to that intended. We cannot disentangle the effects of introducing the PRISM tool from those of imposing the QOF targets; however, since across the UK predictive risk stratification tools for emergency admissions have been introduced alongside incentives to focus on patients at risk, we believe that our findings are generalizable.
A predictive risk stratification tool (PRISM) to estimate a patient's risk of an emergency hospital admission in the following year was trialled in general practice in an area of the United Kingdom. PRISM's introduction coincided with a new incentive payment (‘QOF’) in the regional contract for family doctors to identify and manage the care of people at high risk of emergency hospital admission.
METHODS:
Alongside the trial, we carried out a complementary qualitative study of processes of change associated with PRISM's implementation. We aimed to describe how PRISM was understood, communicated, adopted, and used by practitioners, managers, local commissioners and policy makers. We gathered data through focus groups, interviews and questionnaires at three time points (baseline, mid-trial and end-trial). We analyzed data thematically, informed by Normalisation Process Theory (1).
RESULTS:
All groups showed high awareness of PRISM, but raised concerns about whether it could identify patients not yet known, and about whether there were sufficient community-based services to respond to care needs identified. All practices reported using PRISM to fulfil their QOF targets, but after the QOF reporting period ended, only two practices continued to use it. Family doctors said PRISM changed their awareness of patients and focused them on targeting the highest-risk patients, though they were uncertain about the potential for positive impact on this group.
CONCLUSIONS:
Though external factors supported its uptake in the short term, with a focus on the highest risk patients, PRISM did not become a sustained part of normal practice for primary care practitioners.
A gravity survey was conducted on the Windmill Islands, East Antarctica, during the 2004–05 summer season. The aim of the study was to investigate the subsurface geology of the Windmill Islands area. Ninety-seven gravity stations were established. Additionally, 49 observations from a survey in 1993–94 were re-reduced and merged with the 2004–05 data. A three-dimensional subsurface model was constructed from the merged gravity dataset to determine the subsurface geology of the Windmill Islands. The main country rock in the Windmill Islands is a Garnet-bearing Granite Gneiss. A relatively dense intrusive charnockite unit, the Ardery Charnockite, generates the dominant gravity high of the study area and has been modelled to extend to depths of 7–13 km. It has moderate to steep contacts against the surrounding Garnet-bearing Granite Gneiss. The Ardery Charnockite surrounds a less dense granite pluton, the Ford Granite, which is modelled to a depth of 6–12 km and creates a localized gravity low. This granitic pluton extends at depth towards the east. The modelling process has also shown that Mitchell Peninsula is linked to the adjacent Law Dome ice cap by an ‘ice ramp’ of approximately 100 m thickness.
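To give a feel for the forward problem that underlies this kind of 3-D gravity modelling, the sketch below evaluates the textbook vertical gravity anomaly of a buried sphere with a density contrast, the simplest analogue of a dense intrusive body within less dense country rock. The depth, radius and +150 kg/m³ contrast are illustrative values, not parameters taken from the Windmill Islands model.

```python
import numpy as np

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2

def sphere_anomaly_mgal(x_m, depth_m, radius_m, drho_kgm3):
    """Vertical gravity anomaly (mGal) of a buried sphere along a surface profile."""
    mass = (4.0 / 3.0) * np.pi * radius_m**3 * drho_kgm3       # excess mass (kg)
    gz = G * mass * depth_m / (x_m**2 + depth_m**2) ** 1.5     # m/s^2
    return gz * 1e5                                            # 1 mGal = 1e-5 m/s^2

# Profile from -20 km to +20 km across an illustrative body:
# 3 km radius, 5 km depth, +150 kg/m^3 density contrast.
x = np.linspace(-20e3, 20e3, 201)
anomaly = sphere_anomaly_mgal(x, depth_m=5e3, radius_m=3e3, drho_kgm3=150.0)
print(f"peak anomaly ≈ {anomaly.max():.1f} mGal")
```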
The charge that the United States Supreme Court exercised a conservative influence upon the nation’s constitutional life during the period from 1864 to 1938 is impossible to refute. The Supreme Court during the period from the end of the Civil War to the New Deal era has been portrayed as having largely abdicated its obligation to protect society’s common interests in favour of a laissez-faire constitutionalism reflecting the social and political views of new and powerful economic interests. The judicial conservatism of the late nineteenth and early twentieth centuries conflicted with the political ideals of Progressives and with the direction taken by American policy-makers since the acceptance of Franklin D. Roosevelt’s New Deal in the 1930s. Historians have labelled the Court’s laissez-faire conservative style as undesirable, if not consciously immoral.
Nevertheless, the problem of understanding the ideas which lay at the foundation of judicial conservatism should be addressed. General legal historians have preferred to begin and end their inquiries into early influences on the judicial mind with a short overview of legal education and leave aside the possible influence of college studies. In recent years, historians have broadened their investigations of the intellectual underpinnings of late nineteenth-century legal thought in an attempt to provide the sort of synthetic account of legal thought suggested by Perry Miller’s Life of the Mind—a work which attempts to connect the thought of leading members of the bar to intellectual currents outside the legal sphere. The result has been a limited rehabilitation of the Supreme Court’s reputation during the Gilded Age.
This paper asks whether where someone lives bears any association with their attitudes to inequality and income redistribution, focusing on the relative contribution of neighbourhood income, density and ethnic composition. People on higher incomes showed higher support for redistribution when living in more deprived neighbourhoods. People with lower levels of altruism had higher levels of support for redistribution in neighbourhoods of higher density. People living in more ethnically mixed neighbourhoods had higher levels of support for redistribution on average, but this support declined for Whites with low levels of altruism as the deprivation of the neighbourhood increased. Current trends which sustain or extend income and wealth inequalities, reflected in patterns of residence, may undermine social cohesion over the medium to long term. This may be offset to some extent by trends of rising residential ethnic diversity.