The nutrition workforce plays a vital role in disease prevention and health promotion, with expanding job opportunities shaped by factors such as aging populations, climate change, global food systems, and advancing technologies(1,2). Preparing students for careers that require adaptability involves understanding the valuable skills they possess and identifying any gaps. This research aimed to identify the skills and knowledge valued by students who had recently completed work-based placements, and to explore recent graduates’ experiences, challenges, and preparedness for employment. At the end of their work-based placements, students give presentations sharing their experiences and learning. Permission was sought from ten students to analyse the recordings of these presentations. The presentations were selected to cover a range of nutrition fields, including sports nutrition, public health, community nutrition, dietary counselling, the food industry, and nutrition communication. Additionally, a list of graduates (within four years of graduation) from the same range of fields was compiled, and they were invited to participate. Semi-structured interviews (n=10) were conducted online via Zoom and recorded. The interview guide included open-ended questions on employment experiences, challenges, preparedness, and required skills. The interviews, transcription and analyses were completed by two student researchers between November 2023 and February 2024. Thematic analysis using NVivo software was used to identify themes. The themes developed centred on the importance of skills relating to: (i) communicating complex nutrition concepts to the public; (ii) collaborating within diverse teams; and (iii) identifying and filling personal knowledge gaps. In addition, graduates felt that practical experience from their university study boosted their preparedness for the workforce, though many struggled to apply their skills in non-traditional roles and to expand their career scope. In summary, an ongoing focus on team-based projects, communication with non-science audiences, and strategies for continuous learning using evidence-based sources is crucial for both undergraduate and postgraduate education.
Background: While efgartigimod usage is expected to reduce immunoglobulin (IG) utilization in generalized myasthenia gravis (gMG), evidence in clinical practice is limited. Methods: In this retrospective cohort study, patients with gMG treated with efgartigimod for ≥1 year were identified from US medical/pharmacy claims data (April 2016-January 2024) and data from the My VYVGART Path patient support program (PSP). The number of IG courses during the 1 year before and after efgartigimod initiation (index date) was evaluated. Patients with ≥6 annual IG courses were considered chronic IG users. Myasthenia Gravis Activities of Daily Living (MG-ADL) scores before and after index were obtained from the PSP where available. Descriptive statistics were used without adjustment for covariates. Results: 167 patients with ≥1 IG claim before index were included. Prior to efgartigimod initiation, the majority of patients (62%) received IG chronically. During the 1 year after index, the number of IG courses fell by 95% (pre: 1531, post: 75), and 89% (n=149/167) of patients fully discontinued IG usage. Mean (SD) best follow-up MG-ADL scores were significantly reduced after index (8.0 [4.1] to 2.8 [2.1], P<0.05, n=73/167, 44%). Conclusions: Based on US claims, IG utilization was substantially reduced among patients who continued efgartigimod for ≥1 year, with patients demonstrating a favorable MG-ADL response.
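As a quick arithmetic check, the headline reductions follow directly from the counts reported above; the sketch below (a verification of the reported figures, not study code) reproduces them:

```python
# Back-of-envelope check of the utilization figures reported in the abstract.
pre_courses, post_courses = 1531, 75       # IG courses in the year before/after index
reduction = 1 - post_courses / pre_courses
print(f"Relative reduction in IG courses: {reduction:.1%}")     # ~95.1%, reported as 95%

discontinued, cohort = 149, 167            # patients who fully discontinued IG
print(f"Full IG discontinuation: {discontinued / cohort:.0%}")  # 89%, as reported
```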
DSM-5 specifies bulimia nervosa (BN) severity based on specific thresholds of compensatory behavior frequency. There is limited empirical support for such severity groupings. This limited support could be because the DSM-5’s compensatory behavior frequency cutpoints are inaccurate or because compensatory behavior frequency does not capture true underlying differences in severity. In support of the latter possibility, some work has suggested that shape/weight overvaluation or the use of single versus multiple purging methods may be better severity indicators. We used structural equation modeling (SEM) Trees to empirically determine the ideal variables and cutpoints for differentiating BN severity, and compared the SEM Tree groupings to alternative severity classifiers: the DSM-5 indicators, single versus multiple purging methods, and a binary indicator of shape/weight overvaluation.
Methods
Treatment-seeking adolescents and adults with BN (N = 1017) completed self-report measures assessing BN and comorbid symptoms. SEM Trees specified an outcome model of BN severity and recursively partitioned this model into subgroups based on shape/weight overvaluation and compensatory behaviors. We then compared groups on clinical characteristics (eating disorder symptoms, depression, anxiety, and binge eating frequency).
Results
SEM Tree analyses resulted in five severity subgroups, all based on shape/weight overvaluation: overvaluation <1.25; overvaluation 1.25–3.74; overvaluation 3.75–4.74; overvaluation 4.75–5.74; and overvaluation ≥5.75. SEM Tree groups explained 1.63–6.41 times the variance explained by other severity schemes.
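As a minimal illustration of how these empirically derived cutpoints partition patients, the sketch below assigns a severity subgroup from an overvaluation score. The function name and the 1-5 group numbering are ours for illustration; this is not the authors' code.

```python
# Assign the SEM Tree-derived severity subgroup (1 = least severe) from a
# shape/weight overvaluation score, using the cutpoints reported above.
def overvaluation_subgroup(score: float) -> int:
    cutpoints = [1.25, 3.75, 4.75, 5.75]
    return 1 + sum(score >= c for c in cutpoints)

assert overvaluation_subgroup(0.5) == 1   # overvaluation < 1.25
assert overvaluation_subgroup(4.0) == 3   # 3.75-4.74
assert overvaluation_subgroup(6.0) == 5   # >= 5.75
```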
Conclusions
Shape/weight overvaluation outperformed the DSM-5 severity scheme and single versus multiple purging methods, suggesting the DSM-5 severity scheme should be reevaluated. Future research should examine the predictive utility of this severity scheme.
Posttraumatic stress disorder (PTSD) has been associated with advanced epigenetic age cross-sectionally, but the association between these variables over time is unclear. This study conducted meta-analyses to test whether new-onset PTSD diagnosis and changes in PTSD symptom severity over time were associated with changes in two metrics of epigenetic aging over two time points.
Methods
We conducted meta-analyses of the association between change in PTSD diagnosis and symptom severity and change in epigenetic age acceleration/deceleration (age-adjusted DNA methylation age residuals as per the Horvath and GrimAge metrics) using data from 7 military and civilian cohorts participating in the Psychiatric Genomics Consortium PTSD Epigenetics Workgroup (total N = 1,367).
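For readers unfamiliar with the metric, age-adjusted DNA methylation age residuals are conventionally obtained by regressing the epigenetic clock estimate on chronological age and keeping the residuals, so that positive values indicate age acceleration. The sketch below uses simulated data and is illustrative only; it is not the workgroup's analysis pipeline.

```python
import numpy as np

# Simulated data standing in for a cohort: chronological age and a noisy
# epigenetic clock estimate (e.g., Horvath or GrimAge output).
rng = np.random.default_rng(0)
chron_age = rng.uniform(20, 70, size=200)
dnam_age = chron_age + rng.normal(0, 4, size=200)

# Regress clock age on chronological age; residuals are the acceleration metric.
slope, intercept = np.polyfit(chron_age, dnam_age, deg=1)
age_residuals = dnam_age - (intercept + slope * chron_age)  # + = accelerated aging
```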
Results
Meta-analysis revealed that the interaction between Time 1 (T1) Horvath age residuals and new-onset PTSD over time was significantly associated with Horvath age residuals at T2 (meta β = 0.16, meta p = 0.02, p-adj = 0.03). The interaction between T1 Horvath age residuals and changes in PTSD symptom severity over time was significantly related to Horvath age residuals at T2 (meta β = 0.24, meta p = 0.05). No associations were observed for GrimAge residuals.
Conclusions
Results indicated that individuals who developed new-onset PTSD or showed increased PTSD symptom severity over time evidenced greater epigenetic age acceleration at follow-up than would be expected based on baseline age acceleration. This suggests that PTSD may accelerate biological aging over time and highlights the need for intervention studies to determine if PTSD treatment has a beneficial effect on the aging methylome.
Functional cognitive disorder is an increasingly recognised subtype of functional neurological disorder for which treatment options are currently limited. We have developed a brief online group acceptance and commitment therapy (ACT)-based intervention.
Aims
To assess the feasibility of conducting a randomised controlled trial of this intervention versus treatment as usual (TAU).
Method
The study was a parallel-group, single-blind randomised controlled trial, with participants recruited from cognitive neurology, neuropsychiatry and memory clinics in London. Participants were randomised into two groups: ACT + TAU or TAU alone. Feasibility was assessed on the basis of recruitment and retention rates, the acceptability of the intervention, and a signal of efficacy on the primary outcome measure, the Acceptance and Action Questionnaire II (AAQ-II), although the study was not powered to demonstrate efficacy statistically. Outcome measures were collected at baseline and at 2, 4 and 6 months post-intervention, including assessments of quality of life, memory, anxiety, depression and healthcare use.
Results
We randomised 44 participants, with a participation rate of 51.1% (95% CI 40.8–61.5%); 36% of referred participants declined involvement. Retention was high: 81.8% of ACT participants attended at least four sessions, and 64.3% reported being ‘satisfied’ or ‘very satisfied’, compared with 0% in the TAU group. Psychological flexibility as measured using the AAQ-II showed a trend towards modest improvement in the ACT group at 6 months. Other measures (quality of life, mood, memory satisfaction) also demonstrated small to modest positive trends.
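The reported participation interval is consistent with a Wilson score interval for a proportion. The sketch below reproduces it, assuming a denominator of 86 referrals (44 randomised at roughly 51% participation); that denominator is our inference, not a figure stated in the abstract.

```python
from statsmodels.stats.proportion import proportion_confint

# Wilson 95% CI for 44 participants out of an assumed 86 eligible referrals.
low, high = proportion_confint(count=44, nobs=86, alpha=0.05, method="wilson")
print(f"95% CI: {low:.1%} to {high:.1%}")  # ~40.8% to 61.5%, matching the abstract
```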
Conclusions
It has proven feasible to conduct a randomised controlled trial of ACT versus TAU.
Meaningful medical data are crucial for response teams in the aftermath of disaster. Electronic Medical Record (EMR) systems have revolutionized healthcare by facilitating real-time data collection, storage, and analysis. These capabilities are particularly relevant for post-disaster and austere environments. fEMR, an EMR system designed for such settings, enables rapid documentation of patient information, treatments, and outcomes, ensuring critical data capture.
Objectives:
Data collected through fEMR can be leveraged to perform comprehensive monitoring and evaluation (M&E) of emergency medical services, assess operational needs and efficiency, and support public health syndromic surveillance.
Method/Description:
Analyzing these data can reveal patterns and trends and support assessment of treatment effectiveness. These insights facilitate data-driven decision-making and the optimization of medical protocols. fEMR’s real-time reports enhance situational awareness and operational coordination among response units. The aggregated data can detect emerging trends, characterize case mix, and facilitate after-action reviews, contributing to continuous improvement in emergency preparedness and response strategies. The system also supports fulfilling reporting requirements for health agencies and funding organizations, ensuring accountability and transparency.
Results/Outcomes:
EMRs like fEMR are vital for emergency response teams, supporting immediate patient care and ongoing M&E of disaster response efforts. Their robust data management capabilities support evidence-based practices and strategic planning, improving the effectiveness of emergency medical services in disaster scenarios.
Conclusion:
The effective use of fEMR in disaster response scenarios highlights its significance in enhancing operational efficiency, ensuring accountability, and improving the overall effectiveness of emergency medical services through comprehensive data management and real-time reporting.
Herbaceous perennials must annually rebuild the aboveground photosynthetic architecture from carbohydrates stored in crowns, rhizomes, and roots. Knowledge of carbohydrate utilization and storage can inform management decisions and improve control outcomes for invasive perennials. We monitored the nonstructural carbohydrates in a population of the hybrid Bohemian knotweed [Polygonum ×bohemicum (J. Chrtek & Chrtková) Zika & Jacobson [cuspidatum × sachalinense]; syn.: Fallopia ×bohemica (Chrtek and Chrtková) J.P. Bailey] and in Japanese knotweed [Polygonum cuspidatum Siebold & Zucc.; syn.: Fallopia japonica (Houtt.) Ronse Decr.]. Carbohydrate storage in crowns followed seasonal patterns typical of perennial herbaceous dicots, corresponding to key phenological events. Starch was consistently the most abundant nonstructural carbohydrate present. Sucrose levels did not show a consistent inverse relationship with starch levels. Lateral distribution of starch in rhizomes and, more broadly, of total nonstructural carbohydrates sampled before dormancy break showed higher levels in rhizomes than in crowns. Total nonstructural carbohydrate levels in crowns reached seasonal lows at an estimated 22.6% of crown dry weight after accumulating 1,453.8 growing degree days (GDD) by the end of June, mainly due to depleted levels of stored starch, with starch reaching an estimated minimum of 12.3% by 1,220.3 GDD, accumulated by mid-June. Depletion corresponded to rapid development of the vegetative canopy before entering the reproductive phase in August. Maximum starch accumulation in crowns followed complete senescence of aboveground tissues by mid- to late October. Removal of aboveground shoot biomass in late June to early July, with removal of regrowth in early September before senescence, would optimize the use of time and labor to deplete carbohydrate reserves. Additionally, foliar application of systemic herbicide in late August through early fall should maximize translocation to belowground tissue, as herbicide moves downward with assimilates rebuilding underground storage reserves. Fall applications should be made before loss of healthy leaf tissue, with the window for control typically ending by late September in Minnesota.
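For reference, growing degree days are conventionally accumulated as the daily mean temperature above a base temperature, floored at zero. The sketch below shows this standard formula; the 10 °C base is an assumed placeholder, as the abstract does not state the base temperature used for knotweed.

```python
# Daily growing degree days: mean temperature above an assumed base, floored at 0.
def daily_gdd(t_max: float, t_min: float, t_base: float = 10.0) -> float:
    return max(0.0, (t_max + t_min) / 2 - t_base)

# Season total accumulates daily values (example temperatures are illustrative).
season_gdd = sum(daily_gdd(tmax, tmin) for tmax, tmin in [(24, 12), (28, 15), (18, 8)])
```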
This chapter centers on the major descriptive findings of L2 research, focusing on ordered and systematic development. We review and discuss such things as morpheme orders, developmental stages/sequences, unmarked before marked, and U-shaped development, among others. We also review the evidence for L1 influence on ordered development. We touch on the nature of internal (e.g., Universal Grammar, general learning mechanisms) and external constraints (e.g., quantity and quality of input and interaction with that input, frequency) as underlying factors in ordered development. We also briefly touch upon variability during staged development.
In this chapter we review the qualitative difference between explicit knowledge and implicit knowledge (underlying mental representation). The chapter focuses on whether instruction affects the latter. We review the accepted finding that instruction does not affect ordered development. We also review the issue of whether instruction affects rate of development and ultimate attainment. We review important variables in the research on instructed acquisition including type of knowledge measured, the nature of assessments used in the research, and short-term vs. long-term studies, among others.
This chapter lays the foundation for how the field of second language acquisition arose. We briefly review the pioneering work in the late 1950s and 1960s in first language acquisition (e.g., Berko Gleason, Brown, Klima & Bellugi). We also review the generative revolution in linguistics and how it laid the groundwork for the idea of constrained language acquisition. We then review the seminal articles by S. Pit Corder (1967) and Larry Selinker (1972) that posited the major questions in second language acquisition, followed by the pioneering work that mirrored research in first language acquisition (e.g., Dulay & Burt, Krashen, Wode). The chapter ends with the major question that launched second language acquisition research in the early 1970s: Are L1 and L2 acquisition similar or different?
This chapter defines what kind of input contains the data necessary for acquisition (communicatively embedded input) and focuses on its fundamental role in acquisition. Subsequently, we review the claims on the role of output and interaction, focusing on these major issues: Comprehensible output is necessary for acquisition; comprehensible output is beneficial for acquisition; comprehensible output does little to nothing for acquisition. We also discuss the nature of interaction more generally, focusing on whether interaction affects the acquisition of formal features of language.
In this chapter we touch on the idea of inter-learner variability in outcome (i.e., how far learners get) as well as rate of acquisition among different learners. We then link these issues to the idea of individual differences as explanatory factors. We focus on the most studied: motivation, aptitude, and working memory.