The European General Practitioners Research Network (EGPRN) designed and validated a comprehensive definition of multimorbidity using a systematic literature review and qualitative research throughout Europe. This survey assessed which criteria in the EGPRN concept of multimorbidity could detect decompensating patients in residential care within a primary care cohort at a six-month follow-up.
Method:
Family physicians included all multimorbid patients encountered in their residential care homes from July to December 2014. Inclusion criteria were those of the EGPRN definition of multimorbidity. Exclusion criteria were patients under legal protection and those unable to complete the 2-year follow-up. Decompensation was defined as the occurrence of death or hospitalization for more than seven days. Statistical analysis at the six-month follow-up combined univariate and multivariate approaches, including both automatic classification and expert decision. A multiple correspondence analysis and a hierarchical clustering on principal components confirmed the consistency of the results. Finally, a logistic regression was performed to identify and quantify risk factors for decompensation.
Findings:
A total of 12 family physicians participated in the study, and 64 patients were analyzed. Comparison of participant characteristics identified two variables that differed significantly between the two groups (decompensation and nothing to report): pain (p = 0.004) and the use of psychotropic drugs (p = 0.019). The final logistic regression model identified pain as the main risk factor for decompensation.
Conclusion:
Action should be taken by the health teams and their physicians to prevent decompensation in patients in residential care who are experiencing pain.
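The study's own data are not reproduced here; as an illustration of the final modelling step described in the Method above, the following sketch fits a logistic regression of decompensation on the two variables highlighted in the Findings (pain and psychotropic drug use). Variable names and values are invented for the example.

```python
# Illustrative sketch, not the study's actual analysis: a logistic regression of
# decompensation on pain and psychotropic drug use. Data are randomly generated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 64
pain = rng.integers(0, 2, n)                 # 1 = pain present
psychotropic = rng.integers(0, 2, n)         # 1 = psychotropic drug use
logit = -1.5 + 1.2 * pain + 0.8 * psychotropic
decompensation = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # 1 = death or >7-day hospitalization

X = sm.add_constant(np.column_stack([pain, psychotropic]))
model = sm.Logit(decompensation, X).fit(disp=False)
print(model.summary(xname=["intercept", "pain", "psychotropic"]))
print("odds ratios:", np.exp(model.params))
```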
We investigate the consequences of periodic, on–off glucose infusion on the glucose–insulin regulatory system based on a system-level mathematical model with two explicit time delays. Studying the effects of such infusion protocols is mathematically challenging yet a promising direction for probing the system response to infusion. We pay special attention to the interplay of periodic infusion with intermediate-time-scale, ultradian oscillations that arise as a result of the physiological response of glucose uptake and back-release into the bloodstream. By using numerical solvers and numerical continuation software, we investigate the response of the model to different infusion patterns, explore how these patterns affect the overall levels of glucose and insulin, and how this can lead to entrainment. By doing so, we provide a road-map of system responses that can potentially help identify new, less-invasive, test strategies for detecting abnormal responses to glucose uptake without falling into lockstep with the infusion pattern.
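To make the numerical setup more concrete, the sketch below integrates a generic system with two explicit delays driven by a periodic on-off (square-wave) infusion, the kind of experiment described above. The right-hand side, delay values, and forcing parameters are placeholders for illustration only, not the paper's physiological glucose-insulin model.

```python
# Schematic two-delay system under square-wave infusion (placeholder model).
import numpy as np

tau1, tau2 = 5.0, 15.0        # two explicit time delays (assumed values)
dt = 0.01
T_end = 2000.0
period, duty = 120.0, 0.5     # infusion period and on-fraction (assumed)

def infusion(t):
    """Periodic on-off infusion: on for the first `duty` fraction of each period."""
    return 1.0 if (t % period) < duty * period else 0.0

n = int(T_end / dt)
lag1, lag2 = int(tau1 / dt), int(tau2 / dt)
G = np.full(n, 5.0)           # "glucose-like" state, constant history
I = np.full(n, 1.0)           # "insulin-like" state

for k in range(max(lag1, lag2), n - 1):
    t = k * dt
    # Placeholder feedback terms using the delayed states
    dG = infusion(t) - 0.1 * G[k] - 0.05 * I[k - lag2] * G[k]
    dI = 0.2 * G[k - lag1] / (1.0 + G[k - lag1]) - 0.1 * I[k]
    G[k + 1] = G[k] + dt * dG
    I[k + 1] = I[k] + dt * dI
# G and I can now be inspected for entrainment to the forcing period.
```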
The European General Practitioners Research Network (EGPRN) designed and validated a comprehensive definition of multimorbidity using a systematic literature review and qualitative research throughout Europe. Identification of risk factors for decompensation would be an interesting challenge for family physicians (FPs) in the management of multimorbid patients. The aim was to assess which items from the EGPRN’s definition of multimorbidity could identify outpatients at risk of decompensation at 24 months.
Methods:
In this cohort study, a total of 120 multimorbid patients from Western Brittany, France, were included by general practitioners between 2014 and 2015. Each patient's status, either "decompensation" (hospitalization of at least 7 days or death) or "nothing to report" (NTR), was recorded at 24 months of follow-up.
Findings:
At 24 months, there were 44 patients (36.6%) in the decompensation group. Two variables were significant risk factors for decompensation: the number of visits to the FP per year (HR = 1.06 [95% CI 1.03–1.10], P < 0.001) and the total number of diseases (HR = 1.12 [95% CI 1.013–1.33], P = 0.039).
Conclusion:
FPs should be warned that a high number of consultations and a high total number of diseases may predict death or hospitalization. These results need to be confirmed by large-scale cohorts in primary care.
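For readers unfamiliar with how hazard ratios such as those above are obtained, here is a hedged sketch of a Cox proportional-hazards model relating time to decompensation to FP visits per year and total number of diseases. The data frame and column names are invented for illustration.

```python
# Cox proportional-hazards sketch on randomly generated stand-in data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 120
df = pd.DataFrame({
    "fp_visits_per_year": rng.poisson(8, n),
    "n_diseases": rng.poisson(6, n),
    "followup_months": rng.uniform(1, 24, n),   # time to event or censoring
    "decompensation": rng.integers(0, 2, n),    # 1 = death or >=7-day hospitalization
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_months", event_col="decompensation")
cph.print_summary()   # hazard ratios with 95% confidence intervals
```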
The RDA for dietary protein is likely insufficient for individuals with cystic fibrosis (CF). This study sought to characterise protein intake and diet quality in adults with cystic fibrosis (awCF), before and after elexacaftor/tezacaftor/ivacaftor (ETI) therapy, compared with healthy controls. Dietary intake was assessed by diet diary in awCF at baseline (BL, n 40) and at follow-up > 3 months post ETI therapy (FUP, n 40) and in age-matched healthy controls (CON, n 80) free from known disease at a single time point. Protein intake dose and daily distribution, protein quality, protein source and overall diet quality were calculated for each participant. Both CON (1·39 (sd 0·47) g·kg⁻¹·day⁻¹) and CF (BL: 1·44 (sd 0·52) g·kg⁻¹·day⁻¹, FUP: 1·12 (sd 0·32) g·kg⁻¹·day⁻¹) had a higher mean daily protein intake than the protein RDA of 0·75 g·kg⁻¹·day⁻¹. There was a significant reduction in daily protein intake in the CF group at FUP (P = 0·0003, d = 0·73), with levels below the alternative suggested dietary intake of ≥ 1·2 g·kg⁻¹·day⁻¹. There were no sex differences or noticeable effects on protein quality or source following the commencement of ETI therapy when compared with CON (all P > 0·05), although overall diet quality decreased between time points (P = 0·027, d = 0·57). The observed reduction in daily protein intake in the present cohort emphasises the importance of ensuring appropriate dietary protein intake to promote healthy ageing in adults with CF. More research is needed to establish an evidence base for dietary protein requirements in this at-risk population.
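As a small worked example of the within-group comparison reported above, the sketch below runs a paired t-test with an effect size on toy per-kilogram protein intakes; the numbers are invented and the exact effect-size formula used in the study may differ.

```python
# Paired comparison of baseline vs follow-up protein intake (toy data).
import numpy as np
from scipy import stats

baseline = np.array([1.5, 1.3, 1.6, 1.2, 1.7])   # toy values, g/kg/day
followup = np.array([1.1, 1.0, 1.3, 0.9, 1.4])

t, p = stats.ttest_rel(baseline, followup)        # paired t-test
diff = baseline - followup
cohens_d = diff.mean() / diff.std(ddof=1)         # within-subject effect size

print(f"t = {t:.2f}, p = {p:.4f}, d = {cohens_d:.2f}")
print("proportion below 1.2 g/kg/day at follow-up:", np.mean(followup < 1.2))
```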
Previous studies have shown that repetitive transcranial magnetic stimulation (rTMS) can treat suicidal symptoms; however, the effects of rTMS on suicidal ideation (SI) in late-life depression (LLD) have not been well-characterized, particularly with theta burst stimulation (TBS).
Methods
Data were analyzed from 84 older adults with depression from the FOUR-D trial (ClinicalTrials.gov identifier: NCT02998580), who received either bilateral standard rTMS or bilateral TBS targeting the dorsolateral prefrontal cortex. The primary outcome was change in the Beck Scale for Suicide Ideation (SSI). The secondary outcome was remission of SI. Demographic, cognitive, and clinical characteristics that may moderate the effects of rTMS or TBS on SI were explored.
Results
There was a statistically significant change in total SSI score over time [χ²(7) = 136.018, p < 0.001], with no difference between the two treatment groups. SI remission rates were 55.8% in the standard rTMS group and 53.7% in the TBS group. In the standard rTMS group, there was no difference in remission of SI between males and females, whereas in the TBS group remission was higher in females (χ²(1) = 6.87, p = 0.009). There was a significant correlation between time to remission of SI and the RCI z-score for D-KEFS inhibition/switching [rs = −0.389, p = 0.012].
Conclusions
Both bilateral rTMS and bilateral TBS were effective in reducing SI in LLD. There may be sex differences in response to TBS, with females having more favorable response in reducing SI. There may be an association between improvement in cognitive flexibility and inhibition and reduction of SI.
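To illustrate the two inferential steps reported in the Results above, here is a toy sketch of a chi-square test of SI remission by sex and a Spearman correlation between time to remission and a cognitive change score; all numbers are invented.

```python
# Chi-square test and Spearman correlation on toy data.
import numpy as np
from scipy.stats import chi2_contingency, spearmanr

# Hypothetical 2x2 table: rows = female/male, columns = remitted/not remitted
table = np.array([[18, 6],
                  [10, 12]])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")

# Hypothetical pairs: time to SI remission (weeks) vs RCI z-score
time_to_remission = np.array([2, 4, 6, 3, 8, 5])
rci_z = np.array([1.1, 0.4, -0.2, 0.8, -0.9, 0.1])
rho, p_rho = spearmanr(time_to_remission, rci_z)
print(f"rho = {rho:.2f}, p = {p_rho:.3f}")
```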
This chapter presents an overview of social interaction, technology, and language learning within the context of a cross-cultural exchange project. Interaction with others and being an active participant in an environment where the language is used is crucial to language learning. We will first look at social interaction and situate it in the context of language teaching and learning. Next, we present some primary themes of social interaction and discuss the practices that inform the role social interaction plays in collaborative projects in language classes. We provide examples of how technological tools were used to facilitate virtual social interaction between language students in France and the United States. Finally, the chapter concludes by offering insights for cross-cultural projects that prioritize social interaction.
Older adults with treatment-resistant depression (TRD) benefit more from treatment augmentation than switching. It is useful to identify moderators that influence these treatment strategies for personalised medicine.
Aims
Our objective was to test whether age, executive dysfunction, comorbid medical burden, comorbid anxiety or the number of previous adequate antidepressant trials could moderate the superiority of augmentation over switching. A significant moderator would influence the differential effect of augmentation versus switching on treatment outcomes.
Method
We performed a preplanned moderation analysis of data from the Optimizing Outcomes of Treatment-Resistant Depression in Older Adults (OPTIMUM) randomised controlled trial (N = 742). Participants were 60 years old or older with TRD. Participants were either (a) randomised to antidepressant augmentation with aripiprazole (2.5–15 mg), bupropion (150–450 mg) or lithium (target serum drug level 0.6 mmol/L) or (b) switched to bupropion (150–450 mg) or nortriptyline (target serum drug level 80–120 ng/mL). Treatment duration was 10 weeks. The two main outcomes of this analysis were (a) symptom improvement, defined as change in Montgomery–Åsberg Depression Rating Scale (MADRS) scores from baseline to week 10 and (b) remission, defined as MADRS score of 10 or less at week 10.
Results
Of the 742 participants, 480 were randomised to augmentation and 262 to switching. The number of previous adequate antidepressant trials was a significant moderator of depression symptom improvement (b = −1.6, t = −2.1, P = 0.033, 95% CI [−3.0, −0.1], where b is the regression coefficient (i.e. the effect size) and t is its t-statistic). The effect was similar across all augmentation strategies. No other putative moderators were significant.
Conclusions
Augmenting was superior to switching antidepressants only in older patients with fewer than three previous antidepressant trials. This suggests that other intervention strategies should be considered following three or more trials.
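The moderation analysis reported above corresponds to testing a treatment-by-moderator interaction; the sketch below shows one standard way to fit such a model, with invented variable names and randomly generated data standing in for the trial dataset.

```python
# OLS moderation sketch: treatment-by-moderator interaction on toy data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 742
df = pd.DataFrame({
    "augmentation": rng.integers(0, 2, n),   # 1 = augmentation arm, 0 = switch arm
    "prior_trials": rng.integers(1, 6, n),   # previous adequate antidepressant trials
})
df["madrs_change"] = (
    -8 - 2 * df["augmentation"] + 1.5 * df["augmentation"] * df["prior_trials"]
    + rng.normal(scale=6, size=n)
)

model = smf.ols("madrs_change ~ augmentation * prior_trials", data=df).fit()
print(model.summary())   # 'augmentation:prior_trials' is the moderation coefficient
```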
Cortical excitability has been proposed as a novel neurophysiological marker of neurodegeneration in Alzheimer’s dementia (AD). However, the link between cortical excitability and structural changes in AD is not well understood.
Objective:
To assess the relationship between cortical excitability and motor cortex thickness in AD.
Methods:
In 62 participants with AD (38 females, mean ± SD age = 74.6 ± 8.0 years) and 47 healthy control (HC) individuals (26 females, mean ± SD age = 71.0 ± 7.9 years), the transcranial magnetic stimulation resting motor threshold (rMT) was determined and T1-weighted MRI scans were obtained. The skull-to-cortex distance was measured manually for each participant using MNI coordinates of the motor cortex (x = −40, y = −20, z = 52).
Results:
The mean skull-to-cortex distance did not differ significantly between participants with AD (22.9 ± 4.3 mm) and HC (21.7 ± 4.3 mm). Participants with AD had lower motor cortex thickness than healthy individuals (t(92) = −4.4, p < 0.001) and lower rMT (i.e., higher excitability) than HC (t(107) = −2.0, p = 0.045). In the combined sample, rMT was positively correlated with motor cortex thickness (r = 0.2, df = 92, p = 0.036); however, this association did not remain significant after controlling for age, sex and diagnosis.
Conclusions:
Patients with AD have decreased cortical thickness in the motor cortex and higher motor cortex excitability. This suggests that cortical excitability may be a marker of neurodegeneration in AD.
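For concreteness, the sketch below mirrors the two analyses described above, an unadjusted correlation between rMT and motor cortex thickness followed by a model adjusting for age, sex and diagnosis, using invented variable names and randomly generated stand-in data.

```python
# Unadjusted correlation and covariate-adjusted regression on toy data.
import numpy as np
import pandas as pd
from scipy.stats import pearsonr
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 109
df = pd.DataFrame({
    "motor_cortex_thickness": rng.normal(2.4, 0.3, n),   # mm, toy values
    "age": rng.normal(73, 8, n),
    "sex": rng.choice(["F", "M"], n),
    "diagnosis": rng.choice(["AD", "HC"], n),
})
df["rmt"] = 40 + 5 * df["motor_cortex_thickness"] + rng.normal(scale=5, size=n)

r, p = pearsonr(df["rmt"], df["motor_cortex_thickness"])
print(f"unadjusted: r = {r:.2f}, p = {p:.3f}")

adjusted = smf.ols("rmt ~ motor_cortex_thickness + age + C(sex) + C(diagnosis)", data=df).fit()
print(adjusted.summary())   # thickness coefficient after adjustment
```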
This Element offers a primer for the study of meaning in a Construction Grammar approach. It reviews the main principles of meaning shared across constructionist frameworks, including its ubiquity in grammatical structure, its usage-based formation, and its nature as the output of cognitive representations. It also reviews the importance given to meaning in construction-based explanations of sentence composition, innovative language use, and language change. Paradoxically, the Element shows that there is no systematic framework delineating the rich structure of constructional meaning, which has led to theoretical disagreements and inconsistencies. It therefore proposes an operational model of meaning for practitioners of Construction Grammar. It details the characteristics of a complex interface of semantic, pragmatic, and social meaning, and shows how this framework sheds light on recent theoretical issues. The Element concludes by considering ways in which this framework can be used for future descriptive and theoretical research questions.
An expanding scholarship on interracial intimacy in colonial contexts has generally focused on cases of administrative disputes or judicial conflicts that brought “mixed couples,” “mixed-race families,” or “métis” to the attention of colonial authorities. But what of the lives and experiences of those who did not contest their legal status, remaining under the administrative radar and thus virtually invisible in the archives of the colonial state? This article tackles these issues in the context of the French colony of New Caledonia by analyzing the trajectory of a household established out of official sight, made up of a French settler, a Kanak woman, and their descendants. The goal here is to understand what this phenomenon of relative social invisibility reveals about the scope and limits of colonial domination “at ground level.” Combining ethnographic fieldwork and archival research, the article traces the conditions under which this family configuration was able to emerge and then endure for over fifty years. It finally disappeared after the death of the French settler, when each of the wider family groups—European and Kanak—to varying degrees sought to efface this awkward past within their respective social worlds.
This study provides a comprehensive analysis of the snow and avalanche climate of the Chic-Chocs region of the Gaspé Peninsula, located in the northeastern Appalachians of eastern Canada. The data revealed two major components of the snow and avalanche climate: a cold snow cover combined with a maritime influence causing melt/ice layers through rain-on-snow events. The CRCM6-SNOWPACK model chain represented well the seasonal means of the climatic indicators, snow grain types and avalanche problem types that characterize the snow and avalanche climate of the study region. A global comparison shows that this snow and avalanche climate differs from other areas in western North America but is similar to Mount Washington (New Hampshire, USA) and central Japan. A clustering based solely on avalanche problem types showed that the onset date of wet snow problems divided most of the winters into three clusters. We compare these clusters with those of the French Alps and find some similarities, moving beyond a traditional snow and avalanche climate description. The paper concludes that advanced snow cover modeling combined with avalanche problem type characterization is a suitable approach to improve the understanding and classification of snow and avalanche climates, ultimately contributing to improved forecasting and risk management in similar regions.
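As a schematic illustration of the avalanche-problem-based clustering mentioned above, the sketch below groups winters into three clusters by the onset date of wet snow problems. The onset dates and the use of k-means are assumptions made for illustration, not the study's exact procedure.

```python
# Cluster winters by wet snow problem onset date (toy data).
import numpy as np
from sklearn.cluster import KMeans

# Onset of the first wet snow problem, in days since 1 December, one value per winter
onset_days = np.array([[35], [40], [95], [100], [60], [65], [110], [38], [70], [105]])

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(onset_days)
for label in range(3):
    members = np.where(kmeans.labels_ == label)[0].tolist()
    print(f"cluster {label}: winters {members}")
```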
In this study, we tackle the challenge of inferring the initial conditions of a Rayleigh–Taylor mixing zone for modelling purposes by analysing zero-dimensional (0-D) turbulent quantities measured at an unspecified time. This approach assesses the extent to which 0-D observations retain the memory of the flow, evaluating their effectiveness in determining initial conditions and, consequently, in predicting the flow’s evolution. To this end, we generated a comprehensive dataset of direct numerical simulations, focusing on miscible fluids with low density contrasts. The initial interface deformations in these simulations are characterised by an annular spectrum parametrised by four non-dimensional numbers. To study the sensitivity of 0-D turbulent quantities to initial perturbation distributions, we developed a surrogate model using a physics-informed neural network (PINN). This model enables computation of the Sobol indices for the turbulent quantities, disentangling the effects of the initial parameters on the growth of the mixing layer. Within a Bayesian framework, we employ a Markov chain Monte Carlo (MCMC) method to determine the posterior distributions of initial conditions and time, given various state variables. This analysis sheds light on inertial and diffusive trajectories, as well as the progressive loss of initial conditions memory during the transition to turbulence. Furthermore, it identifies which turbulent quantities serve as better predictors of Rayleigh–Taylor mixing zone dynamics by more effectively retaining the memory of the flow. By inferring initial conditions and forward propagating the maximum a posteriori (MAP) estimate, we propose a strategy for modelling the Rayleigh–Taylor transition to turbulence.
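To make the sensitivity-analysis step above more concrete, here is a hedged sketch of computing Sobol indices for a scalar 0-D quantity with respect to four initial-perturbation parameters. The surrogate function, parameter names and bounds are placeholders, not the paper's PINN or its actual parametrisation.

```python
# Sobol sensitivity analysis of a cheap stand-in surrogate.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 4,
    "names": ["k_min", "k_max", "amplitude", "steepness"],   # assumed parameters
    "bounds": [[1.0, 8.0], [16.0, 64.0], [0.01, 0.1], [0.5, 4.0]],
}

def surrogate(x):
    """Placeholder for the PINN surrogate: maps initial parameters to a 0-D quantity."""
    k_min, k_max, amp, steep = x
    return amp * np.log(k_max / k_min) / steep   # toy response, not physical

samples = saltelli.sample(problem, 512)
Y = np.array([surrogate(x) for x in samples])
Si = sobol.analyze(problem, Y)
print("first-order indices:", Si["S1"])
print("total-order indices:", Si["ST"])
```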
Objectives/Goals: Manual skin assessment in chronic graft-versus-host disease (cGVHD) can be time consuming and inconsistent (>20% affected area) even for experts. Building on previous work, we explore methods to use unmarked photos to train artificial intelligence (AI) models, aiming to improve performance by expanding and diversifying the training data without additional burden on experts. Methods/Study Population: Common to many medical imaging projects, we have a small number of expert-marked patient photos (N = 36, n = 360) and many unmarked photos (N = 337, n = 25,842). Dark skin (Fitzpatrick type 4+) is underrepresented in both sets: 11% of patients in the marked set and 9% in the unmarked set. In addition, a set of 20 expert-marked photos from 20 patients, 20% with dark skin types, was withheld from training to assess model performance. Our gold standard markings were manual contours drawn around affected skin by a trained expert. Three AI training methods were tested. Our established baseline uses only the small number of marked photos (supervised method). The semi-supervised method uses a mix of marked and unmarked photos with human feedback. The self-supervised method uses only unmarked photos without any human feedback. Results/Anticipated Results: We evaluated performance by comparing predicted skin areas with expert markings. The error was the absolute difference between the percentage areas marked by the AI model and the expert, where lower is better. Across all test patients, the median error was 19% (interquartile range 6–34) for the supervised method and 10% (5–23) for the semi-supervised method, which incorporated unmarked photos from 83 patients. On dark skin types, the median error was 36% (18–62) for supervised and 28% (14–52) for semi-supervised, compared with a median error on light skin of 18% (5–26) for supervised and 7% (4–17) for semi-supervised. Self-supervised training, using all 337 unmarked patients, is expected to further improve performance and consistency due to increased data diversity. Full results will be presented at the meeting. Discussion/Significance of Impact: By automating skin assessment for cGVHD, AI could improve accuracy and consistency compared with manual methods. If translated to clinical use, this would ease clinical burden and scale to large patient cohorts. Future work will focus on ensuring equitable performance across all skin types, providing fair and accurate assessments for every patient.
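The evaluation metric described above reduces to a simple calculation; the sketch below computes it on invented numbers.

```python
# Median absolute error between AI-predicted and expert-marked affected-area percentages.
import numpy as np

expert_pct = np.array([12.0, 40.0, 5.0, 55.0, 22.0])   # % affected area, expert markings
model_pct  = np.array([18.0, 31.0, 9.0, 60.0, 15.0])   # % affected area, AI predictions

error = np.abs(model_pct - expert_pct)
q25, median, q75 = np.percentile(error, [25, 50, 75])
print(f"median error {median:.0f}% (IQR {q25:.0f}-{q75:.0f})")
```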
Geriatric (old age) psychiatry faces growing challenges amid Europe’s ageing population. This editorial emphasises the need for specialised training, mentorship and subspecialty recognition to attract young psychiatrists. By addressing structural gaps and fostering innovation, the field offers a rewarding career in enhancing older adults’ mental healthcare and quality of life.
The intuition that knowledge requires the satisfaction of some sort of anti-luck condition is widely shared. I examine the claim that modal robustness is sufficient for satisfying this condition: for a true belief to be non-luckily true, it is sufficient that this belief is safe and sensitive. I argue that this claim is false by arguing that, at least when it comes to beliefs in necessary truths, satisfying the anti-luck condition requires satisfying a non-modal condition. I also advance a plausible candidate for this condition and argue for the implausibility of mathematical Platonism on this basis.
There is a growing focus on understanding the complexity of dietary patterns and how they relate to health and other factors. Approaches that have not traditionally been applied to characterise dietary patterns, such as latent class analysis and machine learning algorithms, may offer opportunities to characterise dietary patterns in greater depth than previously considered. However, there has not been a formal examination of how this wide range of approaches has been applied to characterise dietary patterns. This scoping review synthesised literature from 2005 to 2022 applying methods not traditionally used to characterise dietary patterns, referred to as novel methods. MEDLINE, CINAHL and Scopus were searched using keywords including latent class analysis, machine learning and least absolute shrinkage and selection operator. Of 5274 records identified, 24 met the inclusion criteria. Twelve of twenty-four articles were published since 2020. Studies were conducted across seventeen countries. Nine studies used approaches with applications in machine learning, such as classification models, neural networks and probabilistic graphical models, to identify dietary patterns. The remaining studies applied methods such as latent class analysis, mutual information and treelet transform. Fourteen studies assessed associations between dietary patterns characterised using novel methods and health outcomes, including cancer, cardiovascular disease and asthma. There was wide variation in the methods applied to characterise dietary patterns and in how these methods were described. The extension of reporting guidelines and quality appraisal tools relevant to nutrition research to consider specific features of novel methods may facilitate consistent reporting and enable synthesis to inform policies and programs.
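As one concrete example of the novel methods named above, the sketch below applies the LASSO to randomly generated food-group intakes to select variables associated with an outcome. It is illustrative only and does not reproduce any reviewed study.

```python
# LASSO variable selection on simulated food-group data.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n_people, n_food_groups = 200, 15
X = rng.normal(size=(n_people, n_food_groups))      # standardised food-group intakes
y = 0.8 * X[:, 0] - 0.5 * X[:, 3] + rng.normal(scale=0.5, size=n_people)

lasso = LassoCV(cv=5).fit(X, y)
selected = np.flatnonzero(lasso.coef_ != 0)
print("food groups retained by the LASSO:", selected.tolist())
```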
Cumulative exposure to anticholinergic and sedative medications has been associated with worsening physical function in older adults. We evaluated the feasibility of measuring physical function using wearable devices and explored the impact of reducing the anticholinergic and sedative medication burden in a pilot study of community-dwelling adults aged 60 years and older. Evaluations included the 10-meter walk test (10MWT), the Short Physical Performance Battery (SPPB), and the mini-BESTest. Two participants per month were recruited at one clinic in 2022. The five participants had a median age of 67 years, a median Drug Burden Index (DBI) of 1.7, and four were female. The feasibility analysis showed that the 10MWT and SPPB were completed on 12/12 assessment occasions, and the mini-BESTest on 11/12. An exploratory analysis showed clinically meaningful improvements in gait speed (mean +0.18 m/s) and SPPB score (mean +2.2 points). We showed the feasibility of measuring physical function with wearable devices during deprescribing of anticholinergic and sedative medications.
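As a minimal illustration of the wearable-derived outcome above, the sketch below converts 10MWT completion times into gait speed and reports the change between two visits; the timings are hypothetical.

```python
# Gait speed from a wearable-timed 10-metre walk test (hypothetical timings).
def gait_speed(distance_m: float, time_s: float) -> float:
    """Average gait speed in metres per second."""
    return distance_m / time_s

speed_before = gait_speed(10.0, 12.5)   # hypothetical baseline timing (s)
speed_after = gait_speed(10.0, 10.4)    # hypothetical follow-up timing (s)
print(f"change in gait speed: {speed_after - speed_before:+.2f} m/s")
```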
The diet proposed by the EAT-Lancet Commission has faced criticism concerning its affordability. This study aimed to investigate the cost associated with a greater alignment to the EAT-Lancet reference diet in the province of Québec, Canada. The dietary habits of 1147 French-speaking adults were assessed using repeated web-based 24-h recall data collected between 2015 and 2017 in the cross-sectional PRÉDicteurs Individuels, Sociaux et Environnementaux (PREDISE) study. Diet costs were calculated using a Nielsen food price database. Usual dietary intakes and diet costs were estimated using the National Cancer Institute’s multivariate Markov Chain Monte Carlo method. Adherence to the EAT-Lancet diet was assessed using the EAT-Lancet dietary index (EAT-I). Associations between diet costs and EAT-I scores were evaluated using linear regression models with restricted cubic splines. After adjustment for energy intake, a higher EAT-I score (75th v. 25th percentiles) was associated with a 1·0 $CAD increase in daily diet costs (95 % CI, 0·7, 1·3). This increase in diet costs was mostly driven by the following component scores of the EAT-I (75th v. 25th percentiles, higher scores reflecting greater adherence): vegetables (1·6 $CAD/d, 95 % CI: 1·2, 2·1), free sugars (1·6 $CAD/d, 95 % CI: 1·3, 1·9), fish and plant-based proteins (1·4 $CAD/d, 95 % CI: 1·0, 1·8), fruits (0·9 $CAD/d, 95 % CI: 0·4, 1·3) and whole grains (0·4 $CAD/d, 95 % CI: 0·0, 0·8). Inversely, a greater score for the poultry and eggs component was associated with reduced diet costs (–1·2 $CAD/d, 95 % CI: −1·7, −0·7). This study suggests that adhering to the EAT-Lancet diet may be associated with an increase in diet costs in the province of Québec.
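The spline-based regression described above can be sketched as follows, using invented variable names and simulated data; this illustrates the model type (restricted cubic splines, adjusted for energy intake), not the study's exact specification or estimation method.

```python
# OLS with restricted (natural) cubic splines via patsy's cr(), on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 1147
df = pd.DataFrame({
    "eat_i_score": rng.uniform(20, 80, n),
    "energy_kcal": rng.normal(2100, 400, n),
})
df["diet_cost_cad"] = (
    6 + 0.04 * df["eat_i_score"] + 0.001 * df["energy_kcal"] + rng.normal(scale=1, size=n)
)

model = smf.ols("diet_cost_cad ~ cr(eat_i_score, df=4) + energy_kcal", data=df).fit()

# Contrast predicted costs at the 75th vs 25th percentile of the index score
low, high = np.percentile(df["eat_i_score"], [25, 75])
pred = np.asarray(model.predict(pd.DataFrame({"eat_i_score": [low, high],
                                              "energy_kcal": [2100, 2100]})))
print(f"predicted cost difference (75th vs 25th percentile): {pred[1] - pred[0]:.2f} $CAD/day")
```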