We present a time series of ¹⁴CO₂ for the period 1910–2021 recorded by annual plants collected in the southwestern United States, centered near Flagstaff, Arizona. This time series is dominated by five commonly occurring annual plant species in the region, which is considered broadly representative of the southern Colorado Plateau. Most samples (1910–2015) were previously archived herbarium specimens, with additional samples harvested from field experiments in 2015–2021. We used this novel time series to develop a smoothed local record with uncertainties for “bomb spike” ¹⁴C dating of recent terrestrial organic matter. Our results highlight the potential importance of local records, as we document a delayed arrival of the 1963–1964 bomb spike peak, lower values in the 1980s, and elevated values in the last decade in comparison with the most current Northern Hemisphere Zone 2 record. It is impossible to retroactively collect atmospheric samples, but archived annual plants serve as faithful scribes: samples from herbaria around the world may be an under-utilized resource for improving understanding of the modern carbon cycle.
The impact of the coronavirus disease 2019 (COVID-19) pandemic on mental health is still being unravelled. It is important to identify which individuals are at greatest risk of worsening symptoms. This study aimed to examine changes in depression, anxiety and post-traumatic stress disorder (PTSD) symptoms using prospective and retrospective assessments of symptom change, and to identify and examine the effect of key risk factors.
Method
Online questionnaires were administered to 34 465 individuals (aged 16 years or above) in April/May 2020 in the UK, recruited from existing cohorts or via social media. Around one-third (n = 12 718) of included participants had prior diagnoses of depression or anxiety and had completed pre-pandemic mental health assessments (between September 2018 and February 2020), allowing prospective investigation of symptom change.
Results
Prospective symptom analyses showed small decreases in depression (PHQ-9: −0.43 points) and anxiety [Generalised Anxiety Disorder scale, 7 items (GAD-7): −0.33 points] and increases in PTSD (PCL-6: 0.22 points). Conversely, retrospective symptom analyses demonstrated significant large increases (PHQ-9: 2.40; GAD-7: 1.97), with 55% reporting worsening mental health since the beginning of the pandemic on a global change rating. Across both prospective and retrospective measures of symptom change, worsening depression, anxiety and PTSD symptoms were associated with prior mental health diagnoses, female gender, young age and unemployed/student status.
Conclusions
We highlight the effect of prior mental health diagnoses on worsening mental health during the pandemic and confirm previously reported sociodemographic risk factors. Discrepancies between prospective and retrospective measures of changes in mental health may be related to recall bias-related underestimation of prior symptom severity.
Current dam discharge patterns in Noxon Rapids Reservoir reduce concentration and exposure times (CET) of herbicides used for aquatic plant management. Herbicide applications during periods of low dam discharge may increase herbicide CETs and improve efficacy. Applications of rhodamine WT dye were monitored under peak (736 to 765 m³ s⁻¹) and minimum (1.4 to 2.8 m³ s⁻¹) dam discharge patterns to quantify water-exchange processes. Whole-plot dye half-life under minimal discharge was 33 h, a 15-fold increase over the dye half-life during peak discharge. Triclopyr concentrations measured during minimum discharge within the treated plot increased from 214 ± 25 µg L⁻¹ at 0 h after treatment (HAT) to 1,243 ± 36 µg L⁻¹ at 48 HAT. Endothall concentrations measured during minimum discharge in the same plot increased from 164 ± 78 µg L⁻¹ at 0 HAT to 2,195 ± 1,043 µg L⁻¹ at 48 HAT. Eurasian watermilfoil (Myriophyllum spicatum L.) occurrence in the treatment plot was 66%, 8%, and 14% during pretreatment, 5 wk after treatment (WAT), and 52 WAT, respectively. Myriophyllum spicatum occurrence in the nontreated plot was 68%, 71%, and 83% during pretreatment, 5 WAT, and 52 WAT, respectively. Curlyleaf pondweed (Potamogeton crispus L.) occurrence in the treatment plot was 29%, 0%, and 97% during pretreatment, 5 WAT, and 52 WAT, respectively. Potamogeton crispus occurrence in the nontreated plot increased from 24% at 0 WAT to 83% at 52 WAT. Native species richness declined from 3.3 species per point to 2.1 in the treatment plot in the year of treatment but returned to pretreatment numbers by 52 WAT. Native species richness did not change during the study in the nontreated reference plot. Herbicide applications during periods of low flow can increase CETs and improve control, whereas applications during times of high flow shorten CETs and could reduce treatment efficacy.
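As an illustrative aside (not part of the study above), the reported dye half-lives translate directly into residual concentration fractions under an assumed first-order dissipation model. The short Python sketch below uses the stated 33 h whole-plot half-life and a ~2.2 h peak-discharge half-life back-calculated from the reported 15-fold difference; it is a hedged illustration, not the authors' analysis.

# Hedged sketch: first-order dissipation implied by the reported dye half-lives.
# The ~2.2 h peak-discharge value is back-calculated from the 15-fold difference (33 h / 15).
def remaining_fraction(hours_after_treatment, half_life_h):
    # Fraction of the initial concentration remaining after t hours,
    # assuming exponential (first-order) water exchange.
    return 0.5 ** (hours_after_treatment / half_life_h)

for half_life in (33.0, 33.0 / 15.0):  # minimum vs. peak dam discharge
    frac_24h = remaining_fraction(24, half_life)
    print(f"half-life {half_life:4.1f} h -> {frac_24h:.1%} of dye remains at 24 HAT")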
The COVID-19 pandemic has shone a spotlight on how health outcomes are unequally distributed among different population groups, with disadvantaged communities and individuals being disproportionately affected in terms of infection, morbidity and mortality, as well as vaccine access. Recently, there has been considerable debate about how social disadvantage and inequality intersect with developmental processes to result in a heightened susceptibility to environmental stressors, economic shocks and large-scale health emergencies. We argue that DOHaD Society members can make important contributions to addressing issues of inequality and improving community resilience in response to COVID-19. In order to do so, it is beneficial to engage with and adopt a social justice framework. We detail how DOHaD can align its research and policy recommendations with a social justice perspective to ensure that we contribute to improving the health of present and future generations in an equitable and socially just way.
This paper presents the current state of mathematical modelling of the electrochemical behaviour of lithium-ion batteries (LIBs) as they are charged and discharged. It reviews the models developed by Newman and co-workers for both dilute and moderately concentrated electrolytes and indicates the modelling assumptions required for their development. Particular attention is paid to the interface conditions imposed between the electrolyte and the active electrode material; necessary conditions are derived for one of these, the Butler–Volmer relation, in order to ensure physically realistic solutions. Insight into the origin of the differences between various models found in the literature is gained by considering formulations obtained using different measures of the electric potential. Materials commonly used for electrodes in LIBs are considered, and the various mathematical models used to describe lithium transport in them are discussed. The problem of upscaling from models of behaviour at the single-electrode-particle scale to the cell scale is addressed using homogenisation techniques, resulting in the pseudo-2D model commonly used to describe charge transport and discharge behaviour in lithium-ion cells. Numerical solution of this model is discussed and illustrative results for a common device are computed.
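For orientation, one commonly quoted textbook form of the Butler–Volmer relation mentioned above (given here for reference only; the paper's precise statement and the necessary conditions it derives may differ) is

j = j_0 \left[ \exp\!\left( \frac{\alpha_a F \eta}{R T} \right) - \exp\!\left( -\frac{\alpha_c F \eta}{R T} \right) \right], \qquad \eta = \phi_s - \phi_e - U(c_s),

where j is the interfacial current density, j_0 the exchange current density (a function of the electrolyte and particle-surface lithium concentrations), \alpha_a and \alpha_c the anodic and cathodic transfer coefficients, F Faraday's constant, R the gas constant, T the temperature, \phi_s and \phi_e the solid- and electrolyte-phase potentials, and U(c_s) the open-circuit potential of the electrode material.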
There are significant drug–drug interactions between human immunodeficiency virus antiretroviral therapy and intranasal steroids, which can lead to high serum concentrations of iatrogenic steroids and, subsequently, Cushing's syndrome.
Method
All articles in the literature on cases of intranasal steroid and antiretroviral therapy interactions were reviewed. Full-length manuscripts were analysed and the relevant data were extracted.
Results
A literature search and further cross-referencing yielded a total of seven reports on drug–drug interactions of intranasal corticosteroids and human immunodeficiency virus protease inhibitors, published between 1999 and 2019.
Conclusion
The use of potent steroids metabolised via CYP3A4, such as fluticasone and budesonide, is not recommended for patients taking ritonavir or cobicistat. Mometasone should be used cautiously with ritonavir because of its pharmacokinetic similarities to fluticasone. There was a delayed onset of symptoms in many cases, most likely due to the relatively lower systemic bioavailability of intranasal fluticasone.
Alexander disease (AD) is a rare and ultimately lethal leukodystrophy, typically presenting in infants who exhibit developmental delay, macrocephaly, seizures, spasticity and quadriparesis. Classic infantile forms are generally due to sporadic mutations in GFAP that result in the massive deposition of intra-astrocytic Rosenthal fibres, particularly in the frontal white matter. However, phenotypic manifestations are broad and include both juvenile and adult forms that often display infratentorial pathology and a paucity of leukodystrophic features. We describe the unique case of an 8.5-year-old female who presented with an 8-month history of progressively worsening vomiting and cachexia, whose extensive multidisciplinary systemic workup, including GI biopsies, proved negative. Neuroimaging ultimately revealed bilaterally symmetric and anterior-predominant supratentorial signal alterations in the white matter plus a 1.7 × 1.2 × 0.7 mm right dorsal medullary mass. Biopsy of this presumed low-grade glioma revealed features in keeping with AD, which was later confirmed on whole exome sequencing. The proband exhibited a pathogenic p.Arg239Cys heterozygous missense mutation in GFAP, which was apparently inherited from her asymptomatic mother (1% mosaicism in the mother’s blood). Germline mosaic inheritance patterns of young-onset AD, particularly those presenting with a tumor-like mass of the brainstem, are scarcely reported in the literature and serve to expand the clinicopathologic spectrum of AD.
LEARNING OBJECTIVES
This presentation will enable the learner to:
1. Recognize an uncommon clinical presentation of AD.
2. Describe the underlying genetics of AD, including a rare familial juvenile onset form featuring germline mosaicism.
A Doyle–Fuller–Newman (DFN) model for the charge and discharge of nano-structured lithium iron phosphate (LFP) cathodes is formulated on the basis that lithium transport within the nanoscale LFP electrode particles is much faster than cell discharge, and is therefore not rate limiting. We present some numerical solutions to the model and show that for relevant parameter values, and a variety of C-rates, it is possible for sharp discharge fronts to form and intrude into the electrode from its outer edge(s). These discharge fronts separate regions of fully utilised LFP electrode particles from those that are not. Motivated by this observation an asymptotic solution to the model is sought. The results of the asymptotic analysis of the DFN model lead to a reduced order model, which we term the reaction front model (or RFM). Favourable agreement is shown between solutions to the RFM and the full DFN model in appropriate parameter regimes. The RFM is significantly cheaper to solve than the DFN model, and therefore has the potential to be used in scenarios where computational costs are prohibitive, e.g. in optimisation and parameter estimation problems or in engineering control systems.
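Readers who wish to experiment numerically with the full DFN model (as opposed to the reduced-order RFM derived above) can do so with the open-source PyBaMM library; the Python sketch below is an illustrative assumption about one way to set this up, not the authors' code, and the 'Prada2013' LFP parameter set is assumed to be available in the installed PyBaMM version.

import pybamm

# Hedged sketch: solve a standard DFN model with PyBaMM (not the authors' RFM).
model = pybamm.lithium_ion.DFN()                        # full Doyle-Fuller-Newman model
params = pybamm.ParameterValues("Prada2013")            # an LFP cell parameter set (assumed available)
sim = pybamm.Simulation(model, parameter_values=params)
sim.solve([0, 3600])                                    # simulate up to one hour of discharge
sim.plot()                                              # inspect voltage, concentrations, etc.

Comparing such full-model solutions against a reduced description is the kind of check that the RFM validation described above relies on.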
Wild radish (Raphanus raphanistrum L.) is a weed found globally in agricultural systems. The facultative winter annual nature of this plant and its high genetic variability make modeling its growth and phenology difficult. In the present study, R. raphanistrum natural seedbanks exhibited a biphasic pattern of emergence, with emergence peaks occurring in both fall and spring. Traditional sigmoidal models were inadequate to fit this pattern, regardless of the predictive environmental variable, so a corresponding biphasic model (sigmoidal + Weibull) with best-fit parameters was used to describe emergence. Each best-fit chronological, thermal, and hydrothermal model accounted for at least 85% of the variation in the validation data. Observations on phenology progression from four cohorts were used to create a common model that described all cohorts adequately. Different phenological stages were described using chronological, thermal, hydrothermal, daylength-dependent thermal time, and daylength-dependent hydrothermal time. Integrating daylength and temperature into the models was important for predicting reproductive stages of R. raphanistrum.
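To make the biphasic (sigmoidal + Weibull) model structure concrete, the Python sketch below fits a generic two-flush cumulative emergence curve to synthetic data; the functional form and every parameter value are illustrative assumptions, not the authors' fitted coefficients or data.

import numpy as np
from scipy.optimize import curve_fit

def biphasic(x, A, x50, b, lam, k, x0):
    # Cumulative emergence (0-1): logistic (fall flush) + Weibull (spring flush).
    logistic = A / (1.0 + np.exp(-(x - x50) / b))
    weibull = (1.0 - A) * (1.0 - np.exp(-np.clip((x - x0) / lam, 0.0, None) ** k))
    return logistic + weibull

# Synthetic "observed" emergence data, for demonstration only.
rng = np.random.default_rng(1)
x_obs = np.linspace(0, 2000, 60)            # e.g. accumulated thermal time
y_obs = biphasic(x_obs, 0.4, 300, 40, 600, 3, 900) + rng.normal(0, 0.02, x_obs.size)

popt, _ = curve_fit(biphasic, x_obs, y_obs, p0=[0.5, 250, 50, 500, 2, 800], maxfev=10000)
print("fitted parameters:", np.round(popt, 2))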
This project will work closely with existing service partners involved in street-level services and will focus on testing and evaluating three approaches to street-level intervention for youth who are homeless and have moderate or severe mental illness. Youth will be asked to choose their preferred service approach:
Housing First initiatives, focused on interventions designed to move youth into appropriate and available housing with ongoing housing supports;
Treatment First initiatives, providing mental health/addiction supports and treatment solutions; and
Simultaneous attention to both housing and treatment together.
Our primary objective is to understand the service delivery preferences of homeless youth and understand the outcomes of these choices. Our research questions include:
1. Which approaches to service are chosen by youth?
2. What are the differences and similarities between groups choosing each approach?
3. What are the critical ingredients needed to effectively implement services for homeless youth from the perspectives of youth, families and service providers?
Focus groups with staff and family members will be conducted to assist in understanding the nature of each service approach, changes that evolve within services, and facilitators and barriers to service delivery. This work will be important in determining which approach is chosen by youth and why. Evaluating the outcomes associated with each choice will provide valuable information about the service options chosen by youth. This will assist in better identifying weaknesses in the services offered and inform further development of treatment options that youth will accept.
Gene × environment (G × E) interactions in eating pathology have been increasingly investigated; however, studies have been limited by sample size owing to the difficulty of obtaining genetic data.
Objective
To synthesize existing G × E research in the eating disorders (ED) field and provide a clear picture of the current state of knowledge with analyses of larger samples.
Method
Complete data from seven studies investigating community (n = 1750, 64.5% female) and clinical (n = 426, 100% female) populations, identified via systematic review, were included. Data were combined to perform five analyses: 5-HTTLPR × Traumatic Life Events (0–17 events) to predict ED status (n = 909), 5-HTTLPR × Sexual and Physical Abuse (n = 1097) to predict bulimic symptoms, 5-HTTLPR × Depression to predict bulimic symptoms (n = 1256), and 5-HTTLPR × Impulsivity to predict disordered eating (n = 1149).
Results
The low function (s) allele of 5-HTTLPR interacted with number of traumatic life events (P < .01) and sexual and physical abuse (P < .05) to predict increased likelihood of an ED in females but not males (Fig. 1). No other G × E interactions were significant, possibly due to the medium to low compatibility between datasets (Fig. 1).
Conclusion
Early promising results suggest that increased knowledge of G × E interactions could be achieved if studies increased uniformity of measuring ED and environmental variables, allowing for continued collaboration to overcome the restrictions of obtaining genetic samples.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Given the common view that pre-exercise nutrition/breakfast is important for performance, the present study investigated whether breakfast influences resistance exercise performance via a physiological or psychological effect. Twenty-two resistance-trained, breakfast-consuming men completed three experimental trials, consuming water only (WAT) or semi-solid breakfasts containing 0 g/kg (PLA) or 1·5 g/kg (CHO) maltodextrin. PLA and CHO meals contained xanthan gum and low-energy flavouring (approximately 122 kJ), and subjects were told both ‘contained energy’. At 2 h post-meal, subjects completed four sets of back squat and bench press to failure at 90 % of ten-repetition maximum. Blood samples were taken pre-meal, 45 min and 105 min post-meal to measure serum/plasma glucose, insulin, ghrelin, glucagon-like peptide-1 and peptide tyrosine-tyrosine concentrations. Subjective hunger/fullness was also measured. Total back squat repetitions were greater in CHO (44 (sd 10) repetitions) and PLA (43 (sd 10) repetitions) than WAT (38 (sd 10) repetitions; P < 0·001). Total bench press repetitions were similar between trials (WAT 37 (sd 7) repetitions; CHO 39 (sd 7) repetitions; PLA 38 (sd 7) repetitions; P = 0·130). Performance was similar between CHO and PLA trials. Hunger was suppressed and fullness increased similarly in PLA and CHO, relative to WAT (P < 0·001). During CHO, plasma glucose was elevated at 45 min (P < 0·05), whilst serum insulin was elevated (P < 0·05) and plasma ghrelin suppressed at 45 and 105 min (P < 0·05). These results suggest that breakfast/pre-exercise nutrition enhances resistance exercise performance via a psychological effect, although a potential mediating role of hunger cannot be discounted.
Individual organisms on land and in the ocean sequester massive amounts of the carbon emitted into the atmosphere by humans. Yet the role of ecosystems as a whole in modulating this uptake of carbon is less clear. Here, we study several different mechanisms by which climate change and ecosystems could interact. We show that climate change could cause changes in ecosystems that reduce their capacity to take up carbon, further accelerating climate change. More research on – and better governance of – interactions between climate change and ecosystems is urgently required.
Sleep disturbances are prevalent in cancer patients, especially those with advanced disease. There are few published intervention studies that address sleep issues in advanced cancer patients during the course of treatment. This study assesses the impact of a multidisciplinary quality of life (QOL) intervention on subjective sleep difficulties in patients with advanced cancer.
Method
This randomized trial investigated the comparative effects of a multidisciplinary QOL intervention (n = 54) vs. standard care (n = 63) on sleep quality in patients with advanced cancer receiving radiation therapy as a secondary endpoint. The intervention group attended six intervention sessions, while the standard care group received informational material only. Sleep quality was assessed using the Pittsburgh Sleep Quality Index (PSQI) and Epworth Sleepiness Scale (ESS), administered at baseline and weeks 4 (post-intervention), 27, and 52.
Results
The intervention group had a statistically significant improvement compared with the control group at week 4 in the PSQI total score and in two components: sleep quality and daytime dysfunction. At week 27, although both groups showed improvements in sleep measures from baseline, there were no statistically significant differences between groups in the PSQI total score, any of its component scores, or the ESS. At week 52, the intervention group used less sleep medication relative to baseline than control patients (p = 0.04) and had a lower ESS score (7.6 vs. 9.3, p = 0.03).
Significance of results
A multidisciplinary intervention to improve QOL can also improve the sleep quality of advanced cancer patients undergoing radiation therapy. Patients who completed the intervention also reported using less sleep medication.
Feed represents a substantial proportion of production costs in the dairy industry and is a useful target for improving overall system efficiency and sustainability. The objective of this study was to develop methodology to estimate the economic value for a feed efficiency trait and the associated methane production relevant to Canada. The approach quantifies the level of economic savings achieved by selecting animals that convert consumed feed into product while minimizing the feed energy used for inefficient metabolism, maintenance and digestion. We define a selection criterion trait called Feed Performance (FP) as a 1 kg increase in more efficiently used feed in a first parity lactating cow. The impact of a change in this trait on the total lifetime value of more efficiently used feed via correlated selection responses in other life stages is then quantified. The resulting improved conversion of feed was also applied to determine the resulting reduction in output of emissions (and their relative value based on a national emissions value) under an assumption of constant methane yield, where methane yield is defined as kg methane/kg dry matter intake (DMI). Overall, increasing the FP estimated breeding value by one unit (i.e. 1 kg of more efficiently converted DMI during the cow’s first lactation) translates to a total lifetime saving of 3.23 kg in DMI and 0.055 kg in methane with the economic values of CAD $0.82 and CAD $0.07, respectively. Therefore, the estimated total economic value for FP is CAD $0.89/unit. The proposed model is robust and could also be applied to determine the economic value for feed efficiency traits within a selection index in other production systems and countries.
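As a quick check, the figures reported above can be recombined to reproduce the stated economic value; in the Python sketch below the per-kilogram prices are back-calculated from the abstract's own numbers (they are derived quantities, not values stated by the authors).

# Arithmetic behind the Feed Performance (FP) economic value, per +1 kg FP EBV.
dmi_saved_kg = 3.23          # lifetime dry matter intake saved (kg), as reported
methane_saved_kg = 0.055     # lifetime methane avoided (kg), constant methane yield assumed

# Implied unit values, back-calculated from the reported CAD $0.82 and CAD $0.07.
feed_value_per_kg = 0.82 / dmi_saved_kg          # ~CAD $0.25 per kg DMI
methane_value_per_kg = 0.07 / methane_saved_kg   # ~CAD $1.27 per kg CH4

total = dmi_saved_kg * feed_value_per_kg + methane_saved_kg * methane_value_per_kg
print(f"economic value of FP: CAD ${total:.2f} per unit")   # ~CAD $0.89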
Following publication, errors were discovered in the y-axis labels of the electron and hole concentration plots in the following figure panels: figure 4c, figure 4d, figure 5c, figure 5d, figure 6c, figure 6d, figure 8c and figure 8d. The error does not affect the description, analysis or conclusions. The correct representation of the figure panels is shown here.
Introduction: Although use of point of care ultrasound (PoCUS) protocols for patients with undifferentiated hypotension in the Emergency Department (ED) is widespread, our previously reported SHoC-ED study showed no clear survival or length of stay benefit for patients assessed with PoCUS. In this analysis, we examine whether the use of PoCUS changed fluid administration and rates of other emergency interventions between patients with different shock types. The primary comparison was between cardiogenic and non-cardiogenic shock types. Methods: A post-hoc analysis was completed on the database from an RCT of 273 patients who presented to the ED with undifferentiated hypotension (SBP <100 mmHg or shock index >1) and who had been randomized to receive standard care with or without PoCUS in 6 centres in Canada and South Africa. PoCUS-trained physicians performed scans after initial assessment. Shock categories and diagnoses recorded at 60 minutes after ED presentation were used to allocate patients into subcategories of shock for analysis of treatment. We analyzed actual care delivered, including initial IV fluid bolus volumes (mL), rates of inotrope use and major procedures. Standard statistical tests were employed. Sample size was powered at 0.80 (α: 0.05) for a moderate difference. Results: Although there were expected differences in the mean fluid bolus volume between patients with non-cardiogenic and cardiogenic shock, there was no difference in fluid bolus volume between the control and PoCUS groups (non-cardiogenic control 1878 mL (95% CI 1550–2206 mL) vs. non-cardiogenic PoCUS 1687 mL (1458–1916 mL); and cardiogenic control 768 mL (194–1341 mL) vs. cardiogenic PoCUS 981 mL (341–1620 mL)). Likewise, there were no differences in rates of inotrope administration or major procedures for any of the subcategories of shock between the control group and PoCUS group patients. The most common subcategory of shock was distributive. Conclusion: Despite differences in care delivered by subcategory of shock, we did not find any significant difference in actual care delivered between patients who were examined using PoCUS and those who were not. This may help to explain the previously reported lack of outcome difference between groups.