Noradrenergic activation in the central and peripheral nervous systems is a putative mechanism explaining the link between hypertension and affective disorders.
Aims
We investigated whether these stress-sensitive comorbidities may be dependent on basal noradrenergic activity and whether vascular responses to centrally acting stimuli vary according to noradrenergic activity.
Method
We examined the relation of affective disorders and stress-mediated vascular responses to plasma concentrations of normetanephrine, a measure of noradrenergic activity, in subjects with primary hypertension (n = 100, mean ± s.d. age 43 ± 11 years, 54% male). Symptoms of depression and anxiety were evaluated with the Patient Health Questionnaire-9 (PHQ-9), the 16-item Quick Inventory of Depressive Symptomatology-Self Report (QIDS-SR-16) and the Generalized Anxiety Disorder-7 (GAD-7). Forearm blood flow (strain gauge plethysmography) was used to assess vascular responses to mental stress and to device-guided breathing (DGB), interventions that respectively increase or decrease noradrenergic activity in the prefrontal cortex and locus coeruleus.
Results
Rates of low mood and high anxiety were two- to threefold higher in hypertensive subjects in the highest compared with the lowest normetanephrine tertile (each P < 0.005). Forearm vasodilator responses to mental stress and vasoconstrictor responses to DGB were attenuated in those with high compared with low normetanephrine (increases of 28.3 ± 21% v. 47.1 ± 30% for mental stress and decreases of 3.7 ± 21% v. 18.6 ± 15% for DGB in the highest versus lowest normetanephrine tertiles, each P ≤ 0.01).
Conclusions
A hyperadrenergic state in hypertension is associated with mood disturbance and impaired stress-modulated vasomotor responses. This association may be mediated by chronic stress impinging on pathways regulating central arousal and peripheral sympathetic nerve activity.
Despite its debilitating consequences, cancer-associated malnutrition often goes under-detected owing to the lack of a standardised diagnostic tool (1). The Global Leadership Initiative on Malnutrition (GLIM) criteria were established in 2018 to standardise the diagnosis of malnutrition globally (2). The aim of this study was to determine the association between GLIM-diagnosed malnutrition and overall survival in a large cohort of patients with mixed cancer types. This is the first study to stratify patients according to treatment intent and is one of the largest studies to identify reduced muscle mass using gold-standard CT analysis of body composition.
Patients receiving anti-cancer treatment for solid tumours were enrolled in a cross-sectional study of nutritional status between 2011 and 2016. The GLIM criteria were retrospectively applied. CT images at the third lumbar vertebra (L3) were used to quantify skeletal muscle index, which was categorised according to previously published cut-points. Survival analysis was carried out using Kaplan-Meier curves and Cox regression.
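Survival analysis of this kind rests on the Kaplan-Meier product-limit estimator. As a generic illustration only (not the study's actual analysis code, which would normally use a dedicated statistics package), a minimal version can be written in plain Python:

```python
from collections import Counter

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimates.

    times  : follow-up times (e.g. months)
    events : 1 if the event (death) was observed, 0 if censored
    Returns (time, survival probability) pairs at each event time.
    """
    deaths = Counter(t for t, e in zip(times, events) if e)
    leaving = Counter(times)          # all subjects exiting the risk set at each time
    n_at_risk = len(times)
    surv = 1.0
    curve = []
    for t in sorted(leaving):
        d = deaths.get(t, 0)
        if d:
            surv *= 1 - d / n_at_risk # product-limit step at each event time
            curve.append((t, surv))
        n_at_risk -= leaving[t]       # remove deaths and censored subjects
    return curve
```

Cox regression then models how covariates such as malnutrition stage scale the hazard; in practice both steps are done with a survival library rather than by hand.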
Of 1405 patients enrolled, 52.5% were male. Mean age was 62 years (SD 12 years). The most common cancer diagnosis was gastrointestinal (44.5%), and 60.3% had metastatic disease. In total, 40.4% of participants were diagnosed with GLIM malnutrition (14.8% with stage 1 moderate and 25.6% with stage 2 severe malnutrition). Median follow-up time was 102.4 months (95% CI 99.6–105.2 months). Median survival for those without malnutrition was 30.4 months (95% CI 23.5–37.2 months), versus 11.0 months (95% CI 6.6–15.4 months, p<0.001) for stage 1 moderate and 10.0 months (95% CI 8.1–11.9 months, p<0.001) for stage 2 severe malnutrition. Multivariable analysis (controlling for gender, age, cancer site, GLIM malnutrition and treatment intent) demonstrated a hazard ratio (HR) for death of 1.499 (95% CI 1.233–1.822, p<0.001) for stage 1 moderate and 1.548 (95% CI 1.322–1.800, p<0.001) for stage 2 severe malnutrition. The prevalence of stage 2 severe malnutrition was significantly higher in the palliative cohort (receiving supportive measures) (32.7%) than in patients treated with curative intent (18.2%, p=0.004).
This study is one of the largest to date to use CT analysis to accurately identify reduced muscle mass, and it confirms that the GLIM criteria can be used to predict overall survival in a large mixed-cancer cohort. These findings suggest that malnutrition, regardless of GLIM severity grading, has a significant impact on overall survival. Future research should focus on determining oncology-specific cut-points for the GLIM criteria.
Iron deficiency has been associated with heart failure severity and mortality in children and adults. Intravenous iron therapy has been associated with improved outcomes for adults with heart failure. However, little is known about its impact and safety in children. We performed a single-centre review of all intravenous iron sucrose infusions prescribed to hospitalised patients ≤ 21 years of age with a primary cardiac diagnosis from 2020 to 2022. Ninety-one children (median age 6 years, weight 18 kg) received 339 iron sucrose infusions with a median dose of 6.5 mg/kg [5.1 mg/kg, 7.0 mg/kg]. At initial infusion, the majority (n = 63, 69%) had CHD, 70 patients (77%) were being managed by the advanced cardiac therapy team for heart failure, 13 (14%) were listed for heart transplant, 32 (35%) were on at least one vasoactive infusion, and 5 (6%) were supported with a ventricular assist device. Twenty infusions (6%) were associated with 27 possible infusion-related adverse events in 15 patients. There were no episodes of anaphylaxis or life-threatening adverse events. The most common adverse events were hypotension (n = 12), fever (n = 5), tachycardia (n = 3), and nausea/vomiting (n = 3). Eight of 20 infusion-related adverse events required intervention, and two infusions were associated with escalation in a patient’s level of care. Following intravenous iron repletion, patients’ serum iron, serum ferritin, transferrin saturation, and haemoglobin increased (p < 0.05 for all). In children hospitalised with cardiac disease, intravenous iron sucrose repletion is safe and may improve haemoglobin and iron parameters, including transferrin saturation and ferritin levels.
Plasmodium simium, a parasite of platyrrhine monkeys, is known to cause human malaria outbreaks in Southeast Brazil. It has been hypothesized that, upon the introduction of Plasmodium vivax into the Americas at the time of the European colonization, the human parasite adapted to neotropical anophelines of the Kerteszia subgenus and to local monkeys, along the Atlantic coast of Brazil, to give rise to a sister species, P. simium. Here, to obtain new insights into the origins and adaptation of P. simium to new hosts, we analysed whole-genome sequence (WGS) data from 31 P. simium isolates together with a global sequence dataset of 1086 P. vivax isolates. Population genomic analyses revealed that P. simium comprises a discrete parasite lineage with greatest genetic similarity to P. vivax populations from Latin America – especially those from the Amazon Basin of Brazil – and to ancient European P. vivax isolates, consistent with Brazil as the most likely birthplace of the species. We show that P. simium displays half the amount of nucleotide diversity of P. vivax from Latin America, as expected from its recent origin. We identified pairs of sympatric P. simium isolates from monkeys and from humans as closely related as meiotic half-siblings, revealing ongoing zoonotic transmission of P. simium. Most critically, we show that P. simium currently causes most, and possibly all, malarial infections usually attributed to P. vivax along the Serra do Mar Mountain Range of Southeast Brazil.
Enlist E3® soybean is resistant to 2,4-D, glyphosate, and glufosinate, allowing these herbicides to be applied postemergence sequentially or as tank mixes. The objectives of this experiment were to evaluate the effect of postemergence herbicide application timing and sequence, with or without a preemergence application of micro-encapsulated acetochlor, on waterhemp and common lambsquarters control, soybean yield, and economic returns. Field experiments were conducted in Rosemount and Franklin, Minnesota, in 2021 and 2022. Site, herbicide application timing, and sequence influenced weed control, yield, and profitability. In Rosemount, preemergence followed by (fb) two-pass postemergence programs, including 2,4-D + glyphosate applied at mid-postemergence with or without S-metolachlor, resulted in ≥95% waterhemp control at 28 d after late postemergence application. In Franklin, where weed density was lower, two-pass postemergence programs that included at least one application of 2,4-D + glyphosate (with or without S-metolachlor), regardless of preemergence application, provided ≥97% control of waterhemp and common lambsquarters at 28 d after late postemergence. This level of control was comparable to that of a preemergence herbicide fb a mid-postemergence application of 2,4-D + glyphosate + S-metolachlor at that site. In Rosemount, including acetochlor as the preemergence herbicide in the preemergence fb postemergence programs improved soybean yield by 32% and partial returns by US$384.50 ha−1 compared to postemergence-only programs. In contrast, the preemergence application did not affect yield or profitability in Franklin. The highest soybean yield (2,925.7 kg ha−1) in Rosemount resulted after glufosinate was applied early postemergence fb 2,4-D + glyphosate applied mid-postemergence.
This yield was comparable to that of glufosinate applied early postemergence fb 2,4-D + glyphosate + S-metolachlor applied mid-postemergence and the two-pass glufosinate (early postemergence fb mid-postemergence) program, highlighting the importance of early season weed control. In Franklin, 2,4-D + glyphosate + S-metolachlor (applied mid-postemergence) fb glufosinate (applied late postemergence) provided a yield that was similar to the aforementioned programs at that site.
Psychotic disorders are severe mental health conditions frequently associated with long-term disability, reduced quality of life and premature mortality. Early Intervention in Psychosis (EIP) services aim to provide timely, comprehensive packages of care for people with psychotic disorders. However, it is not clear which components of EIP services contribute most to the improved outcomes they achieve.
Aims
We aimed to identify associations between specific components of EIP care and clinically significant outcomes for individuals treated for early psychosis in England.
Method
This national retrospective cohort study of 14 874 EIP individuals examined associations between 12 components of EIP care and outcomes over a 3-year follow-up period, by linking data from the National Clinical Audit of Psychosis (NCAP) to routine health outcome data held by NHS England. The primary outcome was time to relapse, defined as psychiatric inpatient admission or referral to a crisis resolution (home treatment) team. Secondary outcomes included duration of admissions, detention under the Mental Health Act, emergency department and general hospital attendances and mortality. We conducted multilevel regression analyses incorporating demographic and service-level covariates.
Results
Smaller care coordinator case-loads and the use of clozapine for eligible people were associated with reduced relapse risk. Physical health interventions were associated with reductions in mortality risk. Other components, such as cognitive–behavioural therapy for psychosis (CBTp), showed associations with improvements in secondary outcomes.
Conclusions
Smaller case-loads should be prioritised and protected in EIP service design and delivery. Initiatives to improve the uptake of clozapine should be integrated into EIP care. Other components, such as CBTp and physical health interventions, may have specific benefits for those eligible. These findings highlight impactful components of care and should guide resource allocation to optimise EIP service delivery.
Mass Gathering Medicine focuses on mitigating issues at Mass Gathering Events. Medical skills can vary substantially among staff, and the literature provides no specific guidance on staff training. This study highlights expert opinions on minimum training for medical staff to formalize preparation for a mass gathering.
Methods
This was a three-round Delphi study. Experts were recruited at Mass Gathering conferences, and researchers emailed participation requests through Stat59 software. Consent was obtained verbally and within the Stat59 software. All responses were anonymous. In the first round, experts generated open-ended statements; the second and third rounds used a 7-point linear ranking scale. A statement reached consensus if its responses had a standard deviation (SD) of 1.0 or less.
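The consensus rule above (responses on the 7-point scale with SD ≤ 1.0) is straightforward to make concrete. The sketch below is generic; whether the study computed the sample or the population SD is not stated, so the sample SD here is an assumption:

```python
from statistics import stdev

def reaches_consensus(ratings, sd_threshold=1.0):
    """Delphi consensus rule: a statement reaches consensus when the
    spread of its 7-point ratings is small (sample SD <= threshold)."""
    return stdev(ratings) <= sd_threshold
```

A tightly clustered panel (e.g. ratings of 6s and 7s) passes, while a polarised panel does not.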
Results
Round 1 generated 137 open-ended statements, of which 73 proceeded to round 2; 28.7% (21/73) reached consensus. In round 3, 40.3% (21/52) of the remaining statements reached consensus. Priority themes included venue-specific information, staff orientation to operations and capabilities, and community coordination. Mass casualty preparation and triage were also highlighted as a critical focus.
Conclusions
This expert consensus framework emphasizes core training areas, including venue-specific operations, mass casualty response, triage, and life-saving skills. The heterogeneity of Mass Gatherings makes instituting universal standards challenging. The conclusions highlight recurrent themes of priority among multiple experts.
After Hurricane Ida, faith-based organizations were vital to disaster response. However, this community resource remains understudied. This exploratory study examines local faith-based organizational involvement in storm recovery by evaluating response activities, prevalence and desire for formal disaster education, coordination with other organizations, effect of storm damage on response, and observations for future response.
Methods
An exploratory survey was administered to community leaders throughout the Bayou Region of Louisiana, consisting of questions on demographics, response efforts, coordination with other organizations, formal disaster training, the impact of storm damage on the ability to respond, and insights into future response.
Results
Faith-based organizations are active during storm response. There is a need and desire for formal disaster education. Many organizations experienced storm damage but continued serving their communities. Other emerging themes included the importance of clear communication, building stronger relationships with other organizations before a disaster, and coordination of resources.
Conclusions
Faith-based organizations serve an important role in disaster response. Though few have formal training, they are ready and present in the area of impact, specifically in hurricane response. In the midst of organizational and personal damage, these organizations respond quickly and effectively to provide a necessary part of the disaster management team.
For multi-scale differential equations (or fast–slow equations), one often encounters problems in which a key system parameter slowly passes through a bifurcation. In this article, we show that a pair of prototypical reaction–diffusion equations in two space dimensions can exhibit delayed Hopf bifurcations. Solutions that approach attracting/stable states before the instantaneous Hopf point stay near these states for long, spatially dependent times after these states have become repelling/unstable. We use the complex Ginzburg–Landau equation and the Brusselator models as prototypes. We show that there exist two-dimensional spatio-temporal buffer surfaces and memory surfaces in the three-dimensional space-time. We derive asymptotic formulas for them for the complex Ginzburg–Landau equation and show numerically that they exist also for the Brusselator model. At each point in the domain, these surfaces determine how long the delay in the loss of stability lasts, that is, to leading order, when the spatially dependent onset of the post-Hopf oscillations occurs. The onset of the oscillations in these partial differential equations is also a hard onset.
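A minimal sketch of the slow-passage setup the abstract describes, written for the complex Ginzburg–Landau equation with a slowly drifting bifurcation parameter (the notation here is generic and need not match the article's):

```latex
\[
\partial_t A = (\mu + i\,\omega)\,A + (1 + i\,\alpha)\,\Delta A - (1 + i\,\beta)\,|A|^2 A,
\qquad \mu = \mu_0 + \varepsilon t, \quad 0 < \varepsilon \ll 1 .
\]
```

The instantaneous Hopf point is crossed when $\mu$ passes through zero, but solutions attracted to the trivial state while $\mu<0$ remain near it until a later, spatially dependent time; the buffer and memory surfaces quantify that delay.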
Herbaceous perennials must annually rebuild the aboveground photosynthetic architecture from carbohydrates stored in crowns, rhizomes, and roots. Knowledge of carbohydrate utilization and storage can inform management decisions and improve control outcomes for invasive perennials. We monitored the nonstructural carbohydrates in a population of the hybrid Bohemian knotweed [Polygonum ×bohemicum (J. Chrtek & Chrtková) Zika & Jacobson [cuspidatum × sachalinense]; syn.: Fallopia ×bohemica (Chrtek and Chrtková) J.P. Bailey] and in Japanese knotweed [Polygonum cuspidatum Siebold & Zucc.; syn.: Fallopia japonica (Houtt.) Ronse Decr.]. Carbohydrate storage in crowns followed seasonal patterns typical of perennial herbaceous dicots corresponding to key phenological events. Starch was consistently the highest nonstructural carbohydrate present. Sucrose levels did not show a consistent inverse relationship with starch levels. Lateral distribution of starch in rhizomes and, more broadly, total nonstructural carbohydrates sampled before dormancy break showed higher levels in rhizomes compared with crowns. Total nonstructural carbohydrate levels in crowns reached seasonal lows at an estimated 22.6% of crown dry weight after accumulating 1,453.8 growing degree days (GDD) by the end of June, mainly due to depleted levels of stored starch, with the estimated minimum of 12.3% reached by 1,220.3 GDD accumulated by mid-June. Depletion corresponded to rapid development of vegetative canopy before entering the reproductive phase in August. Maximum starch accumulation in crowns followed complete senescence of aboveground tissues by mid- to late October. Removal of aboveground shoot biomass in late June to early July with removal of vegetation regrowth in early September before senescence would optimize the use of time and labor to deplete carbohydrate reserves. 
Additionally, foliar-applied systemic herbicide translocation to belowground tissue should be maximized with applications in late August through early fall to optimize downward translocation with assimilate movement to rebuild underground storage reserves. Fall applications should be made before loss of healthy leaf tissue, with the window for control typically ending by late September in Minnesota.
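The growing degree day (GDD) totals cited above are cumulative sums of daily heat units. As a generic sketch using the common averaging method (the base temperature actually used by the study is not stated here, so the 10 °C default is an assumption):

```python
def growing_degree_days(daily_min_max, base_temp=10.0):
    """Accumulate growing degree days from daily (Tmin, Tmax) pairs in deg C.

    Averaging method: each day contributes max(0, (Tmin + Tmax)/2 - base).
    base_temp is an illustrative assumption, not the study's stated value.
    """
    total = 0.0
    for tmin, tmax in daily_min_max:
        total += max(0.0, (tmin + tmax) / 2 - base_temp)
    return total
```

Days whose mean temperature falls below the base contribute nothing, which is why GDD accumulate slowly in spring and fall.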
Glufosinate serves as both a primary herbicide option and a complement to glyphosate and other postemergence herbicides for managing herbicide-resistant weed species. Enhancing broadleaf weed control with glufosinate through effective mixtures may mitigate further herbicide resistance evolution in soybean and other glufosinate-resistant cropping systems. Two field experiments were conducted in 2020 and 2021 at four locations in Wisconsin (Arlington, Brooklyn, Janesville, and Lancaster) and one in Illinois (Macomb) to evaluate the effects of postemergence-applied glufosinate mixed with inhibitors of protoporphyrinogen oxidase (PPO) (flumiclorac-pentyl, fluthiacet-methyl, fomesafen, and lactofen; Group 14 herbicides), bentazon (a Group 6 herbicide), and 2,4-D (a Group 4 herbicide) on waterhemp control, soybean phytotoxicity, and yield. The experiments were established in a randomized complete block design with four replications. The first experiment focused on soybean phytotoxicity 14 d after treatment (DAT) and yield in the absence of weed competition. All treatments received a preemergence herbicide, with postemergence herbicide applications occurring between the V3 and V6 soybean growth stages, depending on the site-year. The second experiment evaluated the effect of herbicide treatments on waterhemp control 14 DAT and on soybean yield. Lactofen, applied alone or with glufosinate, produced the greatest phytotoxicity to soybean at 14 DAT, but this injury did not translate into yield loss. Mixing glufosinate with 2,4-D, bentazon, and PPO-inhibitor herbicides did not increase waterhemp control, nor did it affect soybean yield compared to when glufosinate was applied alone, but it may be an effective practice to reduce selection pressure for glufosinate-resistant waterhemp.
The extent to which legislators pursue their privately held preferences in office has important implications for representative democracy and is exceedingly difficult to measure. Many models of legislative decision-making tacitly assume that members are willing and able to carry out the wishes of their constituents so as to maximize their reelection prospects and, in so doing, relegate their personal preferences. This project explores this assumption by examining the role that members’ place of birth plays in shaping legislative behavior, apart from other politically relevant factors like partisanship. We find that birthplace exerts an independent influence on members’ voting behavior. Using a variety of geographic measures, we find that members who are born in close proximity to one another tend to exhibit similar patterns in roll call voting, even when accounting for partisanship, constituency attributes, and a variety of other determinants of voting. We also demonstrate in a secondary analysis that the agricultural composition of members’ birthplace influences their support for agricultural protection. Our findings suggest that members’ personal history shapes the representational relationship they have with their constituents.
The stellar age and mass of galaxies have been suggested as the primary determinants for the dynamical state of galaxies, with environment seemingly playing no or only a very minor role. We use a sample of 77 galaxies at intermediate redshift ($z\sim0.3$) in the Middle-Ages Galaxies Properties with Integral field spectroscopy (MAGPI) Survey to study the subtle impact of environment on galaxy dynamics. We use a combination of statistical techniques (simple and partial correlations and principal component analysis) to isolate the contribution of environment on galaxy dynamics, while explicitly accounting for known factors such as stellar age, star formation histories, and stellar masses. We consider these dynamical parameters: high-order kinematics of the line-of-sight velocity distribution (parametrised by the Gauss-Hermite coefficients $h_3$ and $h_4$), kinematic asymmetries $V_{\textrm{asym}}$ derived using kinemetry, and the observational spin parameter proxy $\lambda_{R_e}$. Of these, the mean $h_4$ is the only parameter found to have a significant correlation with environment as parametrised by group dynamical mass. This correlation exists even after accounting for age and stellar mass trends. We also find that satellite and central galaxies exhibit distinct dynamical behaviours, suggesting they are dynamically distinct classes. Finally, we confirm that variations in the spin parameter $\lambda_{R_e}$ are most strongly (anti-)correlated with age as seen in local studies, and show that this dependence is well-established by $z\sim0.3$.
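The partial-correlation step described above — isolating the environment signal after removing age and stellar-mass trends — follows the standard first-order formula. The sketch below is a generic textbook implementation, not the MAGPI analysis pipeline:

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / sqrt(vx * vy)

def partial_corr(x, y, z):
    """First-order partial correlation of x and y, controlling for z:
    r_xy.z = (r_xy - r_xz * r_yz) / sqrt((1 - r_xz^2)(1 - r_yz^2))."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / sqrt((1 - rxz ** 2) * (1 - ryz ** 2))
```

If the correlation between a dynamical parameter and environment survives after partialling out age and mass, the environmental dependence is not merely a by-product of those known trends.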
Australian children fall short of national dietary guidelines with only 63 % consuming adequate fruit and 10 % enough vegetables. Before school care operates as part of Out of School Hours Care (OSHC) services and provides opportunities to address poor dietary habits in children. The aim of this study was to describe the food and beverages provided in before school care and to explore how service-level factors influence food provision.
Design:
A cross-sectional study was conducted in OSHC services. Each service's before school care was visited twice between March and June 2021. Direct observation was used to capture food and beverage provision and child and staff behaviour during breakfast. Interviews with staff collected information on service characteristics. Foods were categorised using the Australian Dietary Guidelines, and frequencies were calculated. Fisher’s exact test was used to compare food provision with service characteristics.
Setting:
The before school care of OSHC services in New South Wales, Australia.
Participants:
Twenty-five OSHC services.
Results:
Fruit was provided on 22 % (n 11) of days and vegetables on 12 % (n 6). Services with nutrition policies containing specific language on food provision (i.e. measurable) were more likely to provide fruit compared with those with policies using non-specific language (P= 0·027). Services that reported receiving training in healthy eating provided more vegetables than those who had not received training (P= 0·037).
Conclusions:
Before school care can be supported to improve food provision through staff professional development and advocating to regulatory bodies for increased specificity requirements in the nutrition policies of service providers.
To compare rates of clinical response in children with Clostridioides difficile infection (CDI) treated with metronidazole vs vancomycin.
Design:
A retrospective cohort study was performed as a secondary analysis of a previously established prospective cohort of hospitalized children with CDI. For 187 participants 2–17 years of age who were treated with metronidazole and/or vancomycin, the primary outcome of clinical response (defined as resolution of diarrhea within 5 days of treatment initiation) was identified retrospectively. Baseline variables associated with the primary outcome were included in a logistic regression propensity score model estimating the likelihood of receiving metronidazole vs vancomycin. Logistic regression using inverse probability of treatment weighting (IPTW) was used to estimate the effect of treatment on clinical response.
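The IPTW step reweights each subject by the inverse of the probability of the treatment actually received, so that the weighted groups resemble each other on baseline covariates. A minimal sketch of that weighting (assuming propensity scores have already been estimated; this is illustrative, not the study's code):

```python
def iptw_weights(treated, propensity):
    """Inverse probability of treatment weights.

    treated    : 1 if the subject received the treatment of interest, else 0
    propensity : estimated probability of receiving that treatment
    Treated subjects get 1/p; controls get 1/(1 - p).
    """
    return [1 / p if t else 1 / (1 - p) for t, p in zip(treated, propensity)]
```

The outcome model (here, logistic regression for clinical response) is then fitted using these weights.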
Results:
One hundred seven subjects received metronidazole and 80 subjects received vancomycin as primary treatment. There was no univariable association between treatment group and clinical response; 78.30% (N = 83) of the metronidazole treatment group and 78.75% (N = 63) of the vancomycin group achieved clinical response (P = 0.941). After adjustment using propensity scores with IPTW, the odds of a clinical response for participants who received metronidazole was 0.554 (95% CI: 0.272, 1.131) times the odds of those who received vancomycin (P = 0.105).
Conclusions:
In this observational cohort study of pediatric inpatients with CDI, the rate of resolution of diarrhea after 5 days of treatment did not differ among children who received metronidazole vs vancomycin.
We present a 1000 km transect of phase-sensitive radar measurements of ice thickness, basal reflection strength, basal melting and ice-column deformation across the Ross Ice Shelf (RIS). Measurements were gathered at varying intervals in austral summer between 2015 and 2020, connecting the grounding line with the distant ice shelf front. We identified changing basal reflection strengths revealing a variety of basal conditions influenced by ice flow and by ice–ocean interaction at the ice base. Reflection strength is lower across the central RIS, while strong reflections in the near-front and near-grounding line regions correspond with higher basal melt rates, up to 0.47 ± 0.02 m a−1 in the north. Melting from atmospherically warmed surface water extends 150–170 km south of the RIS front. Melt rates up to 0.29 ± 0.03 m a−1 and 0.15 ± 0.03 m a−1 are observed near the grounding lines of the Whillans and Kamb Ice Streams, respectively. Although difficult to compare directly, our surface-based observations generally agree with the basal melt pattern provided by satellite-based methods but yield a distinctly smoother pattern. Our work delivers a precise measurement of basal melt rates across the RIS, a rare insight that also provides an early 21st-century baseline.
This chapter studies the voting behavior of members of the House of Representatives. If the presence of Fox News in a district shapes potential candidates’ perceptions about district party composition and the constituency’s electoral preferences, the same is likely true of sitting House members. Here, of course, the expectation is not about how these perceptions affect the decision to run for office; instead, they affect decisions about how to perform so as to stay in office. Much like potential candidates, sitting members of Congress have to make inferences about what their constituents want. Typically, they make these inferences based on their perceptions of the partisan composition of their district, among other considerations. If sitting members are influenced like potential candidates, Fox News might shift their perceptions in the direction of thinking their district is more right-leaning. Alternatively, based on our evidence from Chapter 3, they might feel more vulnerable to challenges from potential candidates to their (ideological) right. In either case, a reasonable expectation, which we find evidence for, is that member roll call votes will move in a rightward direction, especially among Democrats representing more competitive districts.