The effect of dietary FODMAPs (fermentable oligo-, di- and mono-saccharides and polyols) in healthy adults is poorly documented. This study compared specific effects of low and moderate FODMAP intake (relative to typical intake) on the faecal microbiome, participant-reported outcomes and gastrointestinal physiology. In a single-blind cross-over study, 25 healthy participants were randomised to one of two provided diets, ‘low’ (LFD) <4 g/d or ‘moderate’ (MFD) 14-18 g/d, for 3 weeks each, with ≥2-week washout between. Endpoints were assessed in the last week of each diet. The faecal bacterial/archaeal and fungal communities were characterised by 16S rRNA and ITS2 profiling, and by metagenomic sequencing, in the 18 participants from whom high-quality DNA was extracted. There were no differences in gastrointestinal or behavioural symptoms (fatigue, depression, anxiety), or in faecal characteristics and biochemistry (including short-chain fatty acids). Mean colonic transit time (telemetry) was 23 (95% confidence interval: 15, 30) h with the MFD compared with 34 (24, 44) h with the LFD (n=12; p=0.009). Fungal diversity (richness) increased in response to the MFD, but bacterial richness was reduced, coincident with expansion of the relative abundances of Bifidobacterium, Anaerostipes, and Eubacterium. Metagenomic analysis showed expansion of polyol-utilising Bifidobacterium and Anaerostipes with the MFD. In conclusion, short-term alterations of FODMAP intake are not associated with symptomatic, stool or behavioural manifestations in healthy adults, but remarkable shifts within the bacterial and fungal populations were observed. These findings emphasise the need to quantitatively assess all microbial domains and their interrelationships to improve understanding of the consequences of diet for gut function.
Geoarchaeological research as part of the AHRC-funded Living with Monuments (LwM) project investigated the upper Kennet river system across the Avebury World Heritage landscape. The results demonstrate that in the early–mid-Holocene (c. 9500–1000 bc) there was very low erosion of disturbed soils into the floodplain, with floodplain deposits confined to a naturally forming bedload fluvial deposit aggrading in a shallow channel of inter-linked deeper pools. At the time of the Neolithic monument building in the 4th–early 3rd millennium bc, the river was wide and shallow with areas of presumed braid plain. Between c. 4000 and 1000 bc, a human-induced signature of soil erosion became a minor component of fluvial sedimentation in the Kennet palaeo-channel, but it was small in scale and localised. This strongly suggests that there is little evidence of widespread woodland removal associated with Neolithic farming and monument building, despite the evidently large timber requirements for Neolithic sites like the West Kennet palisade enclosures. Consequently, there was relatively light human disturbance of the hinterland and valley slopes over the longue durée until the later Bronze Age/Early Iron Age, with a predominance of pasture over arable land. Rather than large Neolithic monument complexes being constructed within woodland clearings, representing ancestral and sacred spaces, the substantially more open landscape provided a suitable setting, with areas of sarsen spreads potentially easily visible. During the period c. 3000–1000 bc, the sediment load within the channel slowly increased with alluvial deposition of increasingly humic silty clays across the valley floor. However, this represents only small-scale landscape disturbance.
It is from the Late Bronze Age–Early Iron Age that the anthropogenic signal of human-driven alluviation becomes dominant and overtakes the bedload fluvial signal across the floodplain, with localised colluvial deposits on the floodplain margins. Subsequently, the alluvial archive describes more extensive human impact across this landscape, including the disturbance of loessic-rich soils in the catchment. The deposition of floodplain-wide alluvium continues throughout the Roman, medieval, and post-medieval periods, correlating with the development of a low-flow, single channel, with alluvial sediments describing a decreasing energy in the depositional environment.
To address increasingly pressing social–environmental challenges, the transformative strand of sustainability science seeks to move beyond a descriptive-analytical stance in order to explore and contribute to the implementation of radical alternatives to dominant and unsustainable paradigms, norms, and values. However, in many cases, academia is not currently structured to support and reward inter-/trans-disciplinary and transformative endeavors. This paper introduces a theory of change for the Future Earth Pathways Initiative, and similar initiatives, to help leverage the capacity of sustainability scientists to engage in transformative research.
Technical summary
The increasing body of descriptive-analytical knowledge produced by sustainability science over the last two decades has largely failed to trigger the transformation of policies, norms, and behaviors it was aiming to inform. The emergent transformative strand of sustainability science is a proactive alternative approach seeking to play an active role in processes of societal change by developing knowledge about options, solutions, and pathways, and by participating in their implementation. In principle, scientists can enhance their contribution to more sustainable futures by engaging in transformative research. However, a lack of skills and competencies, relatively immature transformative methods and concepts, and an institutional landscape still geared toward disciplinary and descriptive-analytical research hinder the sustainability science community from engaging more widely in transformative research. In this paper, the Future Earth Pathways Initiative introduces a theory of change (ToC) for increasing the capacity of sustainability scientists to engage in this type of research. This ToC ultimately aims to build a growing community of practitioners engaged in transformative research, to advance concepts, methods, and paradigms to foster ‘fit-for-purpose transformative research’, and to shape institutions to nurture transformative research-friendly contexts.
Social media summary
What would a theory of change for leveraging the transformative capacity of sustainability science look like?
Incorporating emerging knowledge into Emergency Medical Service (EMS) competency assessments is critical to reflect current evidence-based out-of-hospital care. However, a standardized approach is needed to incorporate new evidence into EMS competency assessments because of the rapid pace of knowledge generation.
Objective:
The objective was to develop a framework to evaluate and integrate new source material into EMS competency assessments.
Methods:
The National Registry of Emergency Medical Technicians (National Registry) and the Prehospital Guidelines Consortium (PGC) convened a panel of experts. A Delphi method, consisting of virtual meetings and electronic surveys, was used to develop a Table of Evidence matrix that defines sources of EMS evidence. In Round One, participants listed all potential sources of evidence available to inform EMS education. In Round Two, participants categorized these sources into: (a) levels of evidence quality; and (b) type of source material. In Round Three, the panel revised a proposed Table of Evidence. Finally, in Round Four, participants provided recommendations on how each source should be incorporated into competency assessments depending on type and quality. Descriptive statistics were calculated with qualitative analyses conducted by two independent reviewers and a third arbitrator.
Results:
In Round One, 24 sources of evidence were identified. In Round Two, these were classified into high- (n = 4), medium- (n = 15), and low-quality (n = 5) of evidence, followed by categorization by purpose into providing recommendations (n = 10), primary research (n = 7), and educational content (n = 7). In Round Three, the Table of Evidence was revised based on participant feedback. In Round Four, the panel developed a tiered system of evidence integration from immediate incorporation of high-quality sources to more stringent requirements for lower-quality sources.
Conclusion:
The Table of Evidence provides a framework for the rapid and standardized incorporation of new source material into EMS competency assessments. Future goals are to evaluate the application of the Table of Evidence framework in initial and continued competency assessments.
Approximately 80 million people live with chronic hepatitis B virus (HBV) infection in the WHO Africa Region. The natural history of HBV infection in this population is poorly characterised, and may differ from patterns observed elsewhere due to differences in prevailing genotypes, environmental exposures, co-infections, and host genetics. Existing research is largely drawn from small, single-centre cohorts, with limited follow-up time. The Hepatitis B in Africa Collaborative Network (HEPSANET) was established in 2022 to harmonise the process of ongoing data collection, analysis, and dissemination from 13 collaborating HBV cohorts in eight African countries. Research priorities for the next 5 years were agreed upon through a modified Delphi survey prior to baseline data analysis being conducted. Baseline data on 4,173 participants with chronic HBV mono-infection were collected, of whom 38.3% were women and the median age was 34 years (interquartile range 28–42). In total, 81.3% of cases were identified through testing of asymptomatic individuals. HBeAg-positivity was seen in 9.6% of participants. Follow-up of HEPSANET participants will generate evidence to improve the diagnosis and management of HBV in this region.
This paper focuses upon alterity and how we can more fully embrace intimations of otherness in our dealings with prehistoric monuments. Taking as its inspiration recent attempts to explain such structures, and the landscapes of which they were part, it makes two arguments. First, that while ethnographic analogies offer a vital point of departure for thinking through the possibilities raised by alterity and otherness, we may well have been overlooking a rich set of data—derived from careful excavation and painstaking metrical analyses—that has been sitting in front of us for a very long time. Second, despite over a decade of sustained critical debate, we seem remarkably timid when it comes to seeing where these data might take us. Through the lens of two Late Neolithic stone circles from southern Britain (one big, one small), research into measurement units and alignments is allied with recent excavation and survey data in order to explore ideas of hybridity, nomad-geometry and the arresting/manipulation of time and motion. Placing these glimpses of alterity front and centre, they are then used to establish new starting-points for the interpretation of these structures.
Many male prisoners have significant mental health problems, including anxiety and depression. High proportions struggle with homelessness and substance misuse.
Aims
This study aims to evaluate whether the Engager intervention improves mental health outcomes following release.
Method
The design is a parallel randomised superiority trial that was conducted in the North West and South West of England (ISRCTN11707331). Men serving a prison sentence of 2 years or less were individually allocated 1:1 to either the intervention (Engager plus usual care) or usual care alone. Engager included psychological and practical support in prison, on release and for 3–5 months in the community. The primary outcome was the Clinical Outcomes in Routine Evaluation Outcome Measure (CORE-OM), 6 months after release. Primary analysis compared groups based on intention-to-treat (ITT).
Results
In total, 280 men were randomised out of the 396 who were potentially eligible and agreed to participate; 105 did not meet the mental health inclusion criteria. There was no mean difference in the ITT complete case analysis between groups (92 in each arm) for change in the CORE-OM score (1.1, 95% CI –1.1 to 3.2, P = 0.325) or secondary analyses. There were no consistent clinically significant between-group differences for secondary outcomes. Full delivery was not achieved, with 77% (108/140) receiving community-based contact.
Conclusions
Engager is the first trial of a collaborative care intervention adapted for prison leavers. The intervention was not shown to be effective using standard outcome measures. Further testing of different support strategies for prison leavers with mental health problems is needed.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
The Avebury henge is one of the famous megalithic monuments of the European Neolithic, yet much remains unknown about the detail and chronology of its construction. Here, the results of a new geophysical survey and re-examination of earlier excavation records illuminate the earliest beginnings of the monument. The authors suggest that Avebury's Southern Inner Circle was constructed to memorialise and monumentalise the site of a much earlier ‘foundational’ house. The significance here resides in the way that traces of habitation may take on special social and historical value, leading to their marking and commemoration through major acts of monument building.
Many countries face the challenge of an aging population. Development of suitable technologies to support frail elderly people living in care homes, sheltered housing or at home remains a concern. Technology evaluation in real-life conditions is often lacking, and randomized controlled trials of ‘pre-designed’ technologies are expensive and fail to deliver. A novel alternative would be ‘living labs’: real-life test and experimentation environments where users and producers co-create innovations and large-scale data can be collected.
Methods:
The goal of the living labs and Data Driven Research and Innovation (DDRI) Programme is to use data-driven analytics and insights to support technology development for independent living, healthy aging and more cost-effective care. This involves a cluster of long-term residential care facilities providing 24/7 living lab settings, linked to an embedded innovation hub. DDRI also encompasses private vehicles (e.g. sensors in cars) to enable elderly people to drive safely for longer. Collaborations have been established with universities in England, Scotland and Ireland and with international industry partners.
Results:
Several projects are underway: (i) developing machine-learning algorithms from non-intrusive sensor data to build a well-being representation for individual residents/citizens; (ii) evaluating innovative interventions for good sleep environments and nutritional support; and (iii) establishing an ethics framework to ensure that the needs of residents, families and staff are embedded in the design, communication, and evaluation of future DDRI projects. In addition, fifteen interdisciplinary doctoral fellowships are in place, six universities are working closely with individual living lab settings, and an innovation hub has been established in one care home for horizon-scanning and strategic technology selection and implementation.
Conclusions:
Over the next five years, a national network of 20 residential living labs with over 1,500 participants will be established. Generation of new user-led technologies, blueprints for capture of individual data at significant scale, and ethical and organizational guidelines will be developed. Intelligent mobility via data capture/feedback in vehicles will be established.
Important Bird and Biodiversity Areas (IBAs) are sites identified as being globally important for the conservation of bird populations on the basis of an internationally agreed set of criteria. We present the first review of the development and spread of the IBA concept since it was launched by BirdLife International (then ICBP) in 1979 and examine some of the characteristics of the resulting inventory. Over 13,000 global and regional IBAs have so far been identified and documented in terrestrial, freshwater and marine ecosystems in almost all of the world’s countries and territories, making this the largest global network of sites of significance for biodiversity. IBAs have been identified using standardised, data-driven criteria that have been developed and applied at global and regional levels. These criteria capture multiple dimensions of a site’s significance for avian biodiversity and relate to populations of globally threatened species (68.6% of the 10,746 IBAs that meet global criteria), restricted-range species (25.4%), biome-restricted species (27.5%) and congregatory species (50.3%); many global IBAs (52.7%) trigger two or more of these criteria. IBAs range in size from < 1 km2 to over 300,000 km2 and have an approximately log-normal size distribution (median = 125.0 km2, mean = 1,202.6 km2). They cover approximately 6.7% of the terrestrial, 1.6% of the marine and 3.1% of the total surface area of the Earth. The launch in 2016 of the KBA Global Standard, which aims to identify, document and conserve sites that contribute to the global persistence of wider biodiversity, and whose criteria for site identification build on those developed for IBAs, is a logical evolution of the IBA concept. The role of IBAs in conservation planning, policy and practice is reviewed elsewhere. 
Future technical priorities for the IBA initiative include completion of the global inventory, particularly in the marine environment, keeping the dataset up to date, and improving the systematic monitoring of these sites.
The following paper takes a critical look at the role that Geographical Information Systems (GIS) can play within the broad context of landscape-based archaeological research. It will be argued that the rapid acceptance of GIS by archaeologists has not been without its problems, with a number of archaeologists wondering whether, despite the hype, any new approaches have been introduced at all. This, it will be argued, is a direct result of GIS-based applications tending to work within a largely inherited theoretical framework and, more importantly, lacking at present a critical theory of practice.
The aim of the paper is to move beyond critique and to suggest how GIS can provide not only an efficient means of generating simple distribution maps but also a flexible environment within which to bridge developments in theory and practice. Using an ongoing case study centred upon flood events in the palaeo-floodplain of the river Tisza, the implications of using GIS to welcome uncertainty into the analytical environment are explored and a number of approaches advocated. The significance of these developments for expanding our interpretive frameworks is explored through the foregrounding and challenging of a number of dualistic modes of thought that are actively encouraged and reinforced by the use of traditional GIS.
Although crop diversity has been identified as essential to enhance global food security and adapt to climate change, high loss of genetic resources is occurring due to agricultural industrialization and market requirements. Value chain development is an emerging market strategy that seeks to simultaneously achieve agrobiodiversity conservation and economic goals, though little empirical evidence exists regarding the extent to which value chains encourage biodiversity maintenance. This study considers the conservation of native potatoes among households in the highlands of Peru where value chain development is being pursued to create market niches for certain native potato varieties. Utilizing a mixed-methods case study approach, the findings of this study indicate that the conservers of native varieties are the households with more endowed resource bases as well as those that sell native varieties in value chains. However, the findings suggest that value chains themselves likely have only a marginal effect on conservation. Native potato conservation and potato production for value chains exist as two separate livelihood activities, and households with more resources are best positioned to engage in both. While value chains allow households to capitalize on the economic value of certain native varieties, the production of other native varieties allows households to fulfill cultural values. Based on these findings, this study concludes that value chain opportunities for native varieties should continue to be identified but they alone are not an adequate strategy to conserve agrobiodiversity. Therefore, in addition to value chain development, a full suite of conservation schemes should be implemented simultaneously.
This paper focuses upon the web of practices and transformations bound up in the extraction and movement of megaliths during the Neolithic of southern Britain. The focus is on the Avebury landscape of Wiltshire, where over 700 individual megaliths were employed in the construction of ceremonial and funerary monuments. Because these stones were locally sourced, little consideration has been given to the process of acquisition and movement of the sarsens that make up key monuments such as the Avebury henge and its avenues; attention has instead focused on the middle-distance transportation of sarsen out of this region to Stonehenge. Though stone movements were local, we argue they were far from lacking in significance, as indicated by the subsequent monumentalization of at least two locations from which they were likely acquired. We argue that since such stones embodied place(s), their removal, movement and resetting represented a remarkably dynamic and potentially disruptive reconfiguration of the world as it was known. Megaliths were never inert or stable matter, and we need to embrace this in our interpretative accounts if we are to understand the very different types of monument that emerged in prehistory as a result.
Family carers of people with dementia frequently report acting abusively toward them, and carer psychological morbidity predicts this. We investigated whether START (STrAtegies for RelaTives), a psychological intervention which reduces depression and anxiety in family carers, also reduces abusive behavior in carers of people living in their own homes. We also explored the longitudinal course of carer abusive behavior over two years.
Methods:
We included self-identified family carers who gave support at least weekly to people with dementia referred in the previous year to three UK mental health services and a neurological dementia service. We randomly assigned these carers to START, an eight-session, manual-based coping intervention, or treatment as usual (TAU). Carer abusive behavior (Modified Conflict Tactic Scale (MCTS) score ≥2 representing significant abuse) was assessed at baseline, 4, 8, 12, and 24 months.
Results:
We recruited 260 carers, 173 to START and 87 to TAU. There was no evidence that abusive behavior levels differed between randomization groups or changed over time. A quarter of carers still reported significant abuse after two years, but those not acting abusively at baseline did not become abusive.
Conclusion:
There was no evidence that START, which reduced carer anxiety and depression, reduced carer abusive behavior. For ethical reasons, we frequently intervened to manage concerning abuse reported in both groups, which may have disguised an intervention effect. Future dementia research should include elder abuse as an outcome, and consider carefully how to manage detected abuse.
As a result of the exclusive use of extremely small megaliths (miniliths), the prehistoric stone settings of Exmoor, south-west England, challenge current approaches to the interpretation of monumental stone architecture during the later Neolithic and Early Bronze Age. The broader context of the practice of erecting tiny upright stones (a seemingly diverse and widespread phenomenon) has been explored, as have the reasons why this diminutive architecture has tended to escape sustained critical comment (smaller stone elements being relegated to a generalised background or subsidiary role such as ‘packing’); yet attempts to explain the settings themselves have been remarkably few. Drawing upon the results of ten years of piecemeal fieldwork on the moor, the present paper seeks to rectify this, arguing that far from being generalised ritual structures or metaphorical expressions of hunting groups, the tiny stones were, instead, an integral part of a dynamic human–animal landscape of movement and pause.
Depressed subjects have deficits in facial emotion recognition that resemble the deficits found in persons with focal right hemisphere brain damage. To locate the brain regions responsible for this problem, the authors imaged regional cerebral blood flow (rCBF) with H2 15O positron emission tomography in 10 mood-disordered patients, as well as in 10 age- and sex-matched healthy comparison subjects, while the subjects matched photographs for facial emotion or, as a control, facial identity. While matching faces for emotion, mood-disordered subjects had decreased rCBF activation bilaterally in their temporal lobes, as well as in the right insula, compared with healthy comparison subjects. Abnormal function of limbic and paralimbic regions may partially explain the facial emotion-recognition deficits previously noted in depressed subjects.