The scope of unconscious processing has long been, and still remains, a hotly debated issue. This is driven in part by the current diversity of methods to manipulate and measure perceptual consciousness. Here, we provide ten recommendations and nine outstanding issues about designing experimental paradigms, analyzing data, and reporting the results of studies on unconscious processing. These were formed through dialogue among a group of researchers representing a range of theoretical backgrounds. We acknowledge that some of these recommendations naturally do not align with some existing approaches and are likely to change following theoretical and methodological development. Nevertheless, we hold that at this stage of the field they are instrumental in evoking a much-needed discussion about the norms of studying unconscious processes and helping researchers make more informed decisions when designing experiments. In the long run, we aim for this paper and future discussions around the outstanding issues to lead to a more convergent corpus of knowledge about the extent – and limits – of unconscious processing.
Social media platforms such as Instagram and TikTok have transformed how individuals seek and engage with dietary advice, leading to the rapid propagation of unverified nutrition information that can result in poor dietary choices, contribute to disordered eating, and exacerbate broader health issues(1). This study aimed to measure the accuracy and frequency of nutrition misinformation, identify thematic types of misinformation, and understand user engagement metrics on Instagram and TikTok. A mixed-methods approach was employed, analysing 500 posts (250 from each platform) collected over a six-month period from September 15, 2023, to March 15, 2024, using keywords such as ‘healthy eating’, ‘healthy food’, ‘diet’, and ‘weight loss’. Posts were evaluated for accuracy using a modified version of a previously developed Social Media Evaluation Checklist(2) and an evidence coding framework. Results indicated a higher prevalence of misinformation on TikTok than on Instagram, with a significant portion (p < 0.05) of posts created by users lacking relevant credentials. The most common content theme was cooking/recipes and meal plans. TikTok posts with mostly accurate and completely inaccurate information had higher engagement than Instagram posts. TikTok had a larger proportion of completely inaccurate posts (10.8%), mainly related to weight loss. Descriptive statistics for engagement metrics showed TikTok posts had significantly (p < 0.05) higher likes and comments than Instagram posts across various accuracy levels. For Instagram, completely accurate posts had a mean of 4,318 likes and 40 comments, while mostly accurate posts had a mean of 25,153 likes and 186 comments. For TikTok, completely accurate posts had a mean of 146,327 likes and 423 comments, while mostly accurate posts had a mean of 75,804 likes and 483 comments. An independent t-test revealed significant (p < 0.05) differences in likes and comments between platforms for posts with varying accuracy levels.
The study highlights the need for social media platforms to verify the qualifications of individuals providing nutrition advice and implement measures to promote accurate information. Further research is necessary to develop effective strategies to mitigate the spread of misinformation via social media platforms and its impact on public health.
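Platform engagement differences of the kind reported above are typically tested with an independent-samples t-test. A minimal sketch (Welch's variant, which does not assume equal variances, using invented like counts rather than the study's data):

```python
from scipy import stats

# Invented like counts for illustration only (not the study's data)
instagram_likes = [4200, 4500, 3900, 4600, 4400, 4100, 4700, 4300]
tiktok_likes = [140000, 152000, 148000, 139000, 151000, 145000, 150000, 146000]

# Welch's t-test: equal_var=False drops the equal-variance assumption,
# appropriate when the two platforms have very different engagement spreads
result = stats.ttest_ind(tiktok_likes, instagram_likes, equal_var=False)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```

With real data, each accuracy level (completely accurate, mostly accurate, and so on) would get its own comparison, ideally with a multiple-comparison correction.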
In RISE, TV-46000 once monthly (q1m) or once every 2 months (q2m) significantly extended time to impending schizophrenia relapse. The current study (SHINE, NCT03893825) evaluated the long-term safety, tolerability, and effect of TV-46000.
Methods
Patients completing RISE without relapse (rollover) or newly recruited (de novo) were eligible. The de novo and placebo rollover cohorts were randomized 1:1 to q1m or q2m for ≤56 weeks; the TV-46000 rollover cohort continued its assigned regimen. Exploratory efficacy endpoints included time to impending relapse and patient-centered outcomes (PCOs), including the Schizophrenia Quality of Life Scale (SQLS).
Results
334 patients were randomized and received TV-46000 q1m (n=172) or q2m (n=162), for 202.3 patient-years [PY] of TV-46000 treatment. Treatment-emergent adverse events (AEs) reported for ≥5% of patients were: overall – injection site pain (event rate/100 PY, n [%]; 23.23, 16 [5%]); de novo (n=109) – injection site pain (56.10, 11 [10%]), injection site nodule (16.03, 6 [6%]), blood creatine phosphokinase increased (16.03, 8 [7%]), urinary tract infection (10.69, 7 [6%]); placebo rollover (n=53) – tremor (18.50, 5 [9%]); TV-46000 rollover (n=172) – headache (7.97, 8 [5%]). Serious AEs reported for ≥2 patients were worsening schizophrenia and hyperglycemia. Kaplan–Meier estimates for remaining relapse-free at week 56 were 0.98 (2% risk; q1m) and 0.88 (12%; q2m). SQLS scores improved for q1m (least-squares mean change [SE], −2.16 [0.98]) and q2m (−0.43 [0.98]); other PCOs (5-Level EuroQoL 5-Dimensions Questionnaire, Personal and Social Performance Scale, Drug Attitudes Inventory 10-item version) remained stable.
Conclusions
TV-46000 had a favorable long-term benefit–risk profile in patients with schizophrenia.
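The week-56 relapse-free estimates above come from Kaplan–Meier analysis. As a minimal illustration of the product-limit method (with hypothetical follow-up times, not trial data), in plain Python:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimate S(t) at each distinct event time.

    times:  follow-up time for each subject
    events: 1 if the event (e.g. relapse) was observed, 0 if censored
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        at_t = [e for tt, e in data if tt == t]
        deaths = sum(at_t)
        if deaths:
            surv *= 1 - deaths / n_at_risk   # product-limit update
            curve.append((t, surv))
        # everyone observed at time t (event or censored) leaves the risk set
        n_at_risk -= len(at_t)
        i += len(at_t)
    return curve

# Hypothetical weeks to relapse; 0 = censored (still relapse-free) at week 56
times  = [10, 20, 30, 56, 56, 56, 56, 56, 56, 56]
events = [ 1,  1,  1,  0,  0,  0,  0,  0,  0,  0]
print(kaplan_meier(times, events))   # ≈ [(10, 0.9), (20, 0.8), (30, 0.7)]
```

The estimate at the last event time plays the role of the week-56 relapse-free probability reported for each dosing arm.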
Working memory encompasses the limited incoming information that can be held in mind for cognitive processing. To date, we have little information on the effects of bilingualism on working memory because, absent evidence, working memory tasks cannot be assumed to measure the same constructs across language groups. To garner evidence regarding the measurement equivalence in Spanish and English, we examined second-grade children with typical development, including 80 bilingual Spanish–English speakers and 167 monolingual English speakers in the United States, using a test battery for which structural equation models have been tested – the Comprehensive Assessment Battery for Children – Working Memory (CABC-WM). Results established measurement invariance across groups up to the level of scalar invariance.
Background: Patients with an acute ischemic stroke (AIS) are selected to receive reperfusion therapy using either computed tomography with CT angiography (CT-CTA) or magnetic resonance imaging (MRI). The aim of this study was to compare CT and MRI as the primary imaging modality for AIS patients undergoing endovascular therapy (EVT). Methods: Data for AIS patients treated between January 2018 and January 2021 were extracted from two prospective multicenter EVT cohorts: the ETIS registry in France (MRI) and the OPTIMISE registry in Canada (CT). Demographics, procedural data, and outcomes were collected. We assessed the association of qualifying imaging (CT vs. MRI) with time metrics and functional outcome. Results: 4059 patients selected by MRI and 1324 patients selected by CT were included in the study. Demographics were similar between the two groups. The median imaging-to-arterial-puncture time was 37 minutes longer in the MRI group. Patients selected by CT had more favorable 90-day functional outcomes (mRS 0-2) than patients selected by MRI (48.5% vs 44.4%; adjusted OR (aOR) 1.54, 95% CI 1.31 to 1.80, p<0.001). Conclusions: Patients with AIS undergoing EVT who were selected with MRI rather than CT had longer imaging-to-arterial-puncture delays and worse functional outcomes at 90 days.
Obesity is a significant health issue in Aotearoa; effective and pragmatic strategies to facilitate weight loss are urgently required. Growing recognition of the circadian rhythm’s impact on metabolism has popularised diets like time-restricted eating (TRE)(1). The 16:8 TRE method involves limiting food intake to an 8-hour daily eating window and can lead to weight loss without other substantial changes to diet(2). Nonetheless, TRE requires accountability and tolerating hunger for short periods. Continuous glucose monitors (CGM) are small wearable biofeedback devices that measure interstitial glucose levels, scanned via smartphones. By providing immediate feedback on the physiological effects of eating and fasting, CGM use may promote adherence to TRE(3). This pilot study aimed to 1) investigate how CGM affects adherence to TRE and 2) assess the feasibility of CGM use while undertaking TRE. This two-arm randomised controlled trial enrolled healthy adults from Dunedin, assigning them to TRE-only or TRE+CGM groups for 14 days. Successful adherence to TRE was defined a priori as maintaining an 8-hour eating window on 80% of days. CGM feasibility was defined a priori as scanning the glucose monitor thrice daily on 80% of days. Secondary outcomes included well-being, anthropometry, glucose levels, and overall TRE and CGM experiences via semi-structured interviews. Twenty-two participants were randomised into two groups: TRE-only (n = 11) and TRE+CGM (n = 11, with n = 2 excluded from analysis post-randomisation for medical reasons). Participants had a diverse range of ethnicities, the mean age was 32 (±14.9) years, and 55% were female. The TRE+CGM group adhered to the 8-hour eating window for an average of 10.0 days (range 2-14) compared with 8.6 days (range 2-14) in the TRE-only group. Both groups had similar mean eating window durations of 8.1 hours.
Five (56%) participants in the TRE+CGM group achieved the a priori criteria for TRE adherence, compared with three (27%) in the TRE-only group. Participants in the TRE+CGM group performed an average of 8.2 (±5.6) daily scans, with 7 (78%) participants meeting the a priori CGM feasibility criteria. Neither group reported consistent adverse psychological impacts in DASS-21 and WHO-5 scores. Interviews highlighted that CGM increased hunger tolerance during fasting, as participants felt reassured by their normal glucose levels. CGM aided TRE accountability by acting as a biological tracker of food intake. Participants reported that TRE led to improved energy and self-efficacy, a more productive daily routine, and healthier food choices. Promisingly, 72% of participants would use CGM and undertake TRE in future. This study demonstrates that using CGM while undertaking TRE is feasible and can improve adherence by enhancing hunger tolerance and accountability. Overall, participants experienced increased awareness of eating habits and physiological mechanisms. Over the longer term, this simple and synergistic approach may be a helpful weight loss strategy.
Adequate dietary fibre (DF) intake is recommended to relieve constipation and improve gut health(1). It is often assumed that individuals with constipation have relatively low DF intake and do not meet the recommended adequate intake of 25 g and 30 g per day for females and males, respectively. The 2008/09 New Zealand Adult Nutrition Survey confirmed that mean DF intake was 17.9 grams (g) per day for females and 22.8 g per day for males, well below the recommended adequate intake(2). With the continuous shift of dietary patterns over time, we sought to compare the current usual DF intake of two cohorts of New Zealand adults: those with constipation and those without constipation but with relatively low DF intake. We report baseline dietary data from two randomised controlled dietary studies (Kiwifruit Ingestion to Normalise Gut Symptoms (KINGS) (ACTRN12621000621819) and Bread Related Effects on microbiAl Distribution (BREAD) (ACTRN12622000884707)) conducted in Christchurch, New Zealand in 2021 and 2022, respectively. The KINGS study enrolled adults with either functional constipation or constipation-predominant irritable bowel syndrome, who consumed either two green kiwifruit or maltodextrin for four weeks. The BREAD study, a crossover study, enrolled healthy adults without constipation but with relatively low DF intake (<18 g for females, <22 g for males), who consumed two types of bread with different DF content, each for four weeks separated by a two-week washout period. All participants completed a non-consecutive three-day food diary at baseline. Dietary data were entered into FoodWorks Online Professional (Xyris Software Australia, 2021) to assess mean daily DF intake. Fifty-six adults each from the KINGS study (n = 48 females, n = 8 males; mean age ± standard deviation: 42.8 ± 12.6 years) and the BREAD study (n = 33 females, n = 23 males; mean age: 40.4 ± 13.4 years) completed a baseline food diary.
In the KINGS study, females with constipation had a mean daily DF intake of 25.0 ± 9.4 g, whilst males consumed 26.9 ± 5.0 g per day. In the BREAD study, females without constipation had a mean daily DF intake of 19.4 ± 5.8 g, whereas males had 22.6 ± 8.5 g per day. There was a statistically significant difference in mean daily DF intake between females with and without constipation (p < 0.001) but not between males (p = 0.19). These two studies found that DF intakes among females with constipation were not as low as previously assumed, as on average they met their adequate intake of 25 g. Further data analysis from the KINGS and BREAD studies will reveal the effects of using diet to manage constipation and promote better gut health in these two cohorts of New Zealand adults.
Nutrition intervention trials play a key role in informing clinical and dietary guidelines. Within these trials, participants are required to change their behaviours; however, researchers seldom systematically consider how to support participants with these changes, contributing to poor adherence. Here we evaluate how using a behaviour change framework to develop support within a dietary intervention impacts young adults’ adherence to required trial behaviours. In the Protein Diet Satisfaction (PREDITION) trial, 80 young adults were randomised to a flexitarian or vegetarian diet for 10 weeks to investigate the psychological and cardiometabolic effects of moderate lean red meat consumption as part of a balanced diet(1). To understand these outcomes, it was key that participants within the trial (i) ate a healthy, basal vegetarian diet (excluding meat, poultry, and fish not provided by the research team) and (ii) reported their dietary intake daily on a smartphone application (required to evaluate intervention compliance). To enhance adherence to these behaviours, the Nine Principles framework was used to develop behaviour change support (BCS)(2). Key components of the BCS included access to a dietitian-led Facebook group, text reminders, and food delivery. Effectiveness was measured using the following analyses of the 78 participants who completed the study: pre-post change in targeted dietary habits over time using a subscore of the Healthy Diet Habits Index, adherence score to reporting over 10 weeks, Facebook group engagement, and impact evaluation. Analysis included linear imputation modelling, t-tests, and chi-square analysis. The total Healthy Diet Habits Index subscore (out of 16) significantly increased from baseline to week 10 (10.6 ± 2.6 to 11.2 ± 2.6, p = 0.011), demonstrating maintenance of a healthy diet. Overall adherence to reporting was high across the 10 weeks, with a total population mean reporting score of 90.4 ± 14.6 out of a possible 100.
This strengthens study validity, allowing us to confidently report whether participants complied with study requirements of consuming the intervention protein (red meat or plant-based meat alternatives) on top of a basal vegetarian diet. Although relatively low active Facebook engagement was observed (on average <1 ‘react’ per post), most participants agreed that the text messages and Facebook group supported them to adhere to recording (63%) and eating healthily (60%), respectively. This is the first study to provide an example of how a framework can be used to systematically develop, implement, and assess BCS within a nutrition trial. This appears to be a promising way to enhance adherence to study-related behaviours, including the burdensome task of reporting dietary intake. We believe this has great potential to improve research validity and decrease resource waste, not only for the PREDITION trial but in future dietary intervention trials.
The diagnosis of functional constipation (FC) relies on patient-reported outcomes evaluated as criteria based on the clustering of symptoms. Although the ROME IV criteria for FC diagnosis are relevant for a multicultural population(1), how an individual’s lifestyle, environment, and culture may influence the pathophysiology of FC remains a gap in our knowledge. Building on insights into mechanisms underpinning disorders of gut-brain interaction (formerly functional gastrointestinal disorders) in the COMFORT Cohort(2), this study aimed to investigate differences in gastrointestinal (GI) symptom scores between Chinese and non-Chinese New Zealanders with FC in comparison to healthy controls. The Gastrointestinal Understanding of Functional Constipation In an Urban Chinese and Urban non-Chinese New Zealander Cohort (GUTFIT) study was a longitudinal cohort study that aimed to determine a comprehensive profile of characteristics and biological markers of FC in Chinese and non-Chinese New Zealanders. Chinese (classified according to maternal and paternal ethnicity) or non-Chinese (mixed ethnicities) adults living in Auckland, classified as with or without FC based on ROME IV, were enrolled. GI symptoms, anthropometry, quality of life, diet, and biological samples were assessed monthly for 3 months, from March to June 2023. Demographics were obtained through self-reported questionnaires, and GI symptoms were assessed using the Gastrointestinal Symptom Rating Scale (GSRS) and the Structured Assessment of Gastrointestinal Symptoms Scale (SAGIS). This analysis is a cross-sectional assessment of patient-reported outcomes of GI symptoms. Of 78 enrolled participants, 66 completed the study (male, n = 10; female, n = 56) and were distributed across: Chinese with FC (Ch-FC; n = 11), Chinese control (Ch-CON; n = 19), non-Chinese with FC (NCh-FC; n = 16), non-Chinese control (NCh-CON; n = 20).
Mean ± SD age, body mass index, and waist circumference were 40 ± 9 years, 22.7 ± 2.5 kg/m2, and 78.0 ± 7.6 cm, respectively. Ethnicity did not impact SAGIS domain scores for GI symptoms (ethnicity × FC severity interaction p > 0.05). Yet the constipation symptoms domain of the GSRS was scored differently depending on ethnicity and FC status (ethnicity × FC interaction p < 0.05). In post hoc comparison, NCh-FC tended to have higher GSRS constipation severity scores than Ch-FC (3.8 ± 0.8 versus 3.4 ± 1.0 out of 8, p < 0.1). Although constipation symptom severity tended to be higher in NCh-FC, on the whole, ethnicity did not explain variation in this cohort. FC status was a more important predictor of GI symptom scores. Future research will assess differences in symptom burden to explore ethnicity-specific characteristics of FC.
Distinct pathophysiology has been identified in disorders of gut-brain interaction (DGBI), including functional constipation (FC)(1,2), yet the causes remain unclear. Identifying how modifiable factors (i.e., diet) differ depending on gastrointestinal health status is important to understand relationships between dietary intake, pathophysiology, and the disease burden of FC. Given that dietary choices are culturally influenced, understanding the ethnicity-specific diets of individuals with FC is key to informing appropriate symptom management and prevention strategies. Despite distinct genetic and cultural features of Chinese populations with increasing FC incidence(3), DGBI characteristics are primarily described in Caucasian populations(2). We therefore aimed to identify how the dietary intake of Chinese individuals with FC differs from that of non-Chinese individuals with FC, relative to healthy controls. The Gastrointestinal Understanding of Functional Constipation In an Urban Chinese and Urban non-Chinese New Zealander Cohort (GUTFIT) study was a longitudinal case-control study using systems biology to investigate the multi-factorial aetiology of FC. Here we conducted a cross-sectional dietary intake assessment, comparing Chinese individuals with FC (Ch-FC) against three control groups: a) non-Chinese with FC (NCh-FC), b) Chinese without FC (Ch-CON), and c) non-Chinese without FC (NCh-CON). Recruitment from Auckland, New Zealand (NZ) identified Chinese individuals based on self-identification alongside both parents self-identifying as Chinese, and FC using the ROME IV criteria. Dietary intake was captured using 3-day food diaries recorded on consecutive days, including one weekend day. Nutrient analysis was performed with Foodworks 10 and statistical analysis with SPSS using a generalised linear model (ethnicity and FC status as fixed factors). Of 78 enrolled participants, 66 completed the study and 64 (39.4 ± 9.2 years) completed a 3-day food diary at the baseline assessment.
More participants were female (84%) than male (16%). FC and ethnicity status allocated participants into 1 of 4 groups: Ch-FC (n = 11), Ch-CON (n = 18), NCh-FC (n = 16), NCh-CON (n = 19). Within NCh, ethnicities included NZ European (30%), non-Chinese Asian (11%), Other European (11%), and Latin American (2%). Fibre intake did not differ between Ch-FC and NCh-FC (ethnicity × FC status interaction p > 0.05) but was independently lower overall for FC than CON individuals (21.8 ± 8.7 versus 27.0 ± 9.7 g, p < 0.05) and overall for Ch than NCh (22.1 ± 8.0 versus 27.0 ± 10.4 g, p < 0.05). Carbohydrate, protein, and fat intakes did not differ across groups (p > 0.05 for each). In the context of fibre and macronutrient intake, there is no difference between Ch-FC and NCh-FC. Therefore, fibre and macronutrients are unlikely to contribute to potential pathophysiological differences in FC between ethnic groups. A more detailed assessment of dietary intake concerning micronutrients, types of fibre, or food choices may be indicated to ascertain whether other dietary differences exist.
OBJECTIVES/GOALS: To tackle population-level health disparities, quality dashboards can leverage individual socioeconomic status (SES) measures, which are not always readily accessible. This study aimed to assess the feasibility of a population health management strategy for colorectal cancer (CRC) screening rates using the HOUSES index and heatmap analysis. METHODS/STUDY POPULATION: We applied the 2019 Minnesota Community Measurement data for optimal CRC screening to eligible Mayo Clinic Midwest panel patients. SES was defined by the HOUSES index, a validated SES measure based on publicly available property data for the U.S. population. We first assessed the association of suboptimal CRC screening rates with the HOUSES index, adjusting for age, sex, race/ethnicity, comorbidity, and Zip-code-level deprivation using a mixed effects logistic regression model. We then assessed changes in the performance ranking of individual clinics (i.e., % of patients with optimal CRC screening) before and after adjusting for the HOUSES index. Geographical hotspots with high proportions of low SES and high proportions of suboptimal CRC screening were superimposed to identify target populations for outreach. RESULTS/ANTICIPATED RESULTS: A total of 58,382 adults from 41 clinics were eligible for CRC screening assessment in 2019 (53% female). Patients with lower SES, defined by HOUSES quartiles 1-3, had significantly lower CRC screening rates compared to those with the highest SES (HOUSES quartile 4) (adj. OR [95% CI]: 0.52 [0.50-0.56] for Q1, 0.66 [0.62-0.70] for Q2, and 0.81 [0.76-0.85] for Q3). The ranking of 26 out of 41 (63%) clinics went down after adjusting for the HOUSES index, suggesting a disproportionately higher proportion of underserved patients with suboptimal CRC screening.
We were able to successfully identify hotspots of suboptimal CRC screening (areas with greater than 130% of the expected value) and overlay them with areas with a higher proportion of underserved patients (HOUSES Q1), which can be used for data-driven targeted interventions such as mobile health clinics. DISCUSSION/SIGNIFICANCE: The HOUSES index and associated heatmap analysis can contribute to advancing health equity. This approach can aid health care organizations in meeting the newly established standards of The Joint Commission, which have elevated health equity to a national safety priority.
To investigate the symptoms of SARS-CoV-2 infection, their dynamics, and their discriminatory power for the disease using longitudinally, prospectively collected information reported at the time of occurrence, we analysed data from a large UK phase 3 COVID-19 vaccine clinical trial. The alpha variant was the predominant strain. Participants were assessed for SARS-CoV-2 infection via nasal/throat PCR at recruitment, at vaccination appointments, and when symptomatic. Statistical techniques were implemented to infer estimates representative of the UK population, accounting for multiple symptomatic episodes associated with one individual. An optimal diagnostic model for SARS-CoV-2 infection was derived. The 4-month prevalence of SARS-CoV-2 was 2.1%, increasing to 19.4% (16.0%–22.7%) in participants reporting loss of appetite and 31.9% (27.1%–36.8%) in those with anosmia/ageusia. The model identified anosmia and/or ageusia, fever, congestion, and cough as significantly associated with SARS-CoV-2 infection. Symptom dynamics differed markedly between the two groups: symptoms started slowly, peaked later, and lasted longer in PCR-positive participants, whereas they declined consistently in PCR-negative participants, who reported, on average, fewer than 3 days of symptoms. Anosmia/ageusia peaked late in confirmed SARS-CoV-2 infection (day 12), indicating low discriminatory power for early disease diagnosis.
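The rise from 2.1% overall prevalence to 31.9% among those with anosmia/ageusia is a conditional probability. A toy calculation, with invented counts chosen only to mirror the reported percentages:

```python
# Invented counts for illustration only (not trial data)
n_total = 10000
n_pcr_positive = 210        # overall 4-month prevalence: 210/10000 = 2.1%
n_anosmia = 400             # participants reporting anosmia/ageusia
n_anosmia_positive = 128    # of whom PCR-positive

overall = n_pcr_positive / n_total
conditional = n_anosmia_positive / n_anosmia   # P(PCR+ | anosmia/ageusia)
print(f"overall: {overall:.1%}, given anosmia/ageusia: {conditional:.1%}")
```

Symptom-conditional prevalences of this form are what give a symptom its diagnostic value, subject to the timing caveat the abstract raises for anosmia/ageusia.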
We identify a set of essential recent advances in climate change research with high policy relevance, across natural and social sciences: (1) looming inevitability and implications of overshooting the 1.5°C warming limit, (2) urgent need for a rapid and managed fossil fuel phase-out, (3) challenges for scaling carbon dioxide removal, (4) uncertainties regarding the future contribution of natural carbon sinks, (5) intertwinedness of the crises of biodiversity loss and climate change, (6) compound events, (7) mountain glacier loss, (8) human immobility in the face of climate risks, (9) adaptation justice, and (10) just transitions in food systems.
Technical summary
The Intergovernmental Panel on Climate Change Assessment Reports provide the scientific foundation for international climate negotiations and constitute an unmatched resource for researchers. However, the assessment cycles take multiple years. As a contribution to cross- and interdisciplinary understanding of climate change across diverse research communities, we have streamlined an annual process to identify and synthesize significant research advances. We collected input from experts in various fields using an online questionnaire and prioritized a set of 10 key research insights with high policy relevance. This year, we focus on: (1) the looming overshoot of the 1.5°C warming limit, (2) the urgency of fossil fuel phase-out, (3) challenges to scale up carbon dioxide removal, (4) uncertainties regarding future natural carbon sinks, (5) the need for joint governance of biodiversity loss and climate change, (6) advances in understanding compound events, (7) accelerated mountain glacier loss, (8) human immobility amidst climate risks, (9) adaptation justice, and (10) just transitions in food systems. We present a succinct account of these insights, reflect on their policy implications, and offer an integrated set of policy-relevant messages. This science synthesis and science communication effort also forms the basis for an annual policy report that helps elevate climate science in time for the United Nations Climate Change Conference.
Social media summary
We highlight recent and policy-relevant advances in climate change research – with input from more than 200 experts.
Older brain age – as estimated from structural MRI data – is known to be associated with detrimental mental and physical health outcomes in older adults. Social isolation, which has similarly detrimental effects on health, may be associated with accelerated brain aging, though little is known about how different trajectories of social isolation across the life course moderate this association. We examined the associations between social isolation trajectories from age 5 to age 38 and brain age assessed at age 45.
Methods
We previously created a typology of social isolation based on onset during the life course and persistence into adulthood, using group-based trajectory analysis of longitudinal data from a New Zealand birth cohort. The typology comprises four groups: ‘never-isolated’, ‘adult-only’, ‘child-only’, and persistent ‘child-adult’ isolation. A brain age gap estimate (brainAGE) – the difference between the age predicted from structural MRI data and chronological age – was derived at age 45. We undertook analyses of brainAGE with trajectory group as the predictor, adjusting for sex, family socio-economic status, and a range of familial and child-behavioral factors.
Results
Older brain age in mid-adulthood was associated with trajectories of social isolation after adjustment for family and child confounders, particularly for the ‘adult-only’ group compared to the ‘never-isolated’ group.
Conclusions
Although our findings are associational, they indicate that preventing social isolation, particularly in mid-adulthood, may help to avert accelerated brain aging associated with negative health outcomes later in life.
Premixed turbulent flames, encountered in power generation and propulsion engines, are an archetype of a randomly advected, self-propagating surface. While such a flame is known to exhibit large-scale intermittent flapping, the possible intermittency of its small-scale fluctuations has been largely disregarded. Here, we experimentally reveal the inner intermittency of a premixed turbulent V-flame, while clearly distinguishing this small-scale feature from large-scale outer intermittency. From temporal measurements of the fluctuations of the flame, we find a frequency spectrum that has a power-law subrange with an exponent close to $-2$, which is shown to follow from Kolmogorov phenomenology. Crucially, however, the moments of the temporal increment of the flame position are found to scale anomalously, with exponents that saturate at higher orders. This signature of small-scale inner intermittency is shown to originate from high-curvature, cusp-like structures on the flame surface, which have significance for modelling the heat release rate and other key properties of premixed turbulent flames.
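A $-2$ spectral exponent of the kind reported here can be illustrated with a toy signal: a Brownian path (integrated white noise) has an $f^{-2}$ power spectrum, recoverable from a log-log fit over the power-law subrange. A sketch, not the experimental analysis:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2**16
x = np.cumsum(rng.standard_normal(n))       # Brownian path: PSD ~ f^-2

psd = np.abs(np.fft.rfft(x))**2             # periodogram (unnormalised)
f = np.fft.rfftfreq(n)                      # frequencies in cycles/sample
band = (f > 1e-3) & (f < 1e-1)              # power-law subrange for the fit
slope = np.polyfit(np.log(f[band]), np.log(psd[band]), 1)[0]
print(f"fitted spectral exponent: {slope:.2f}")   # close to -2
```

Note that a matching second-order spectrum says nothing about intermittency; it is the anomalous scaling of higher-order increment moments, as the abstract describes, that distinguishes the flame signal from such a Gaussian surrogate.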
We summarize what we assess as the past year's most important findings within climate change research: limits to adaptation, vulnerability hotspots, new threats coming from the climate–health nexus, climate (im)mobility and security, sustainable practices for land use and finance, losses and damages, inclusive societal climate decisions and ways to overcome structural barriers to accelerate mitigation and limit global warming to below 2°C.
Technical summary
We synthesize 10 topics within climate research where there have been significant advances or emerging scientific consensus since January 2021. The selection of these insights was based on input from an international open call with broad disciplinary scope. Findings concern: (1) new aspects of soft and hard limits to adaptation; (2) the emergence of regional vulnerability hotspots from climate impacts and human vulnerability; (3) new threats on the climate–health horizon – some involving plants and animals; (4) climate (im)mobility and the need for anticipatory action; (5) security and climate; (6) sustainable land management as a prerequisite to land-based solutions; (7) sustainable finance practices in the private sector and the need for political guidance; (8) the urgent planetary imperative for addressing losses and damages; (9) inclusive societal choices for climate-resilient development; and (10) how to overcome barriers to accelerate mitigation and limit global warming to below 2°C.
Social media summary
Science has evidence on barriers to mitigation and how to overcome them to avoid limits to adaptation across multiple fields.
Personality disorders are frequently encountered by all healthcare professionals and can often pose a diagnostic dilemma due to the overlap of traits amongst the various subtypes. The ICD 10 classification comprised succinct parameters for the 10 subtypes of personality disorders but lacked a global approach to address the complexity of the disease. The ICD 11 classification provides a more structured approach to aid in clinical diagnosis.
Objectives
A literature review of the diagnostic applicability of ICD 11 classification of personality disorders is presented in comparison with the ICD 10 classification.
Methods
A retrospective analysis of the literature outlining the ICD 10 and 11 classifications of personality disorders, exploring the differences in evidence-based applications of both.
Results
The ICD 11 classification of personality disorders supersedes the ICD 10 classification in describing the severity of the personality dysfunction in conjunction with a wide range of trait domain qualifiers, thus enabling the clinician to portray the disease dynamically. The current evidence available on the utility of the ICD 11 classification gives a promising outlook for its application in clinical settings.
Conclusions
The ICD 11 has transformed the classification of personality disorders by projecting a dimensional description of personality functioning, aiming to overcome the diagnostic deficiencies in the ICD 10 classification. The versatility offered by the application of the ICD 11 classification can be pivotal in reshaping the focus and intensity of clinical management of the disease.
The coronavirus disease 2019 (COVID-19) pandemic is bringing to light the long-neglected area of mental health. Current evidence demonstrates an increase in mental, neurological and substance use conditions globally. Although long established as a leading cause of disease burden, mental health has historically been grossly underfunded. This analysis seeks to demonstrate the extent to which funding for mental health has been prioritised within the international COVID-19 response.
Methods
The authors analysed the development and humanitarian funding through data provided by the International Aid Transparency Initiative. Project-level COVID-19 data from January 2020 to March 2021 were reviewed for mental health relevance. Relevant projects were then classified into categories based on populations of concern for mental health and the degree of COVID-19 involvement. Financial information was assessed through project transaction data in US Dollars.
Results
Of the 8319 projects provided, 417 were mental health relevant. Mental health-relevant funding accounted for less than 2% of all COVID-19 development and humanitarian funding. Target populations which received the majority of mental health relevant funding were children and humanitarian populations, and 46% of funding went towards activities which combined COVID-19 responses with general humanitarian actions. Over half of mental health relevant funding was received by ten countries, and ten donor organisations provided almost 90% of funding.
Conclusion
This analysis shows that the international donor community is currently falling short in supporting mental health within and beyond the COVID-19 pandemic. As the pandemic continues, sustainable country-led awareness, treatment and prevention for mental, neurological and substance use conditions must be prioritised.
Dichanthium annulatum is one of the dominant grasses of India, North Africa, Southeast Asia, China, Australia, Fiji, New Guinea, Cuba, Haiti and Puerto Rico. This drought-tolerant grass is an excellent fodder in mixed pastures. Developing varieties with improved quality and tolerance to various abiotic stresses is hampered by its apomictic nature. Germplasm collection, characterization, genetic diversity analysis and core subset development, followed by selection for desirable traits, seems to be the most plausible breeding tool for developing new cultivars. In the present study, 498 genotypes collected from different agro-ecological zones in India were included. Genotypes were characterized for various metric and non-numeric traits, as well as for nutritional parameters. Agglomerative clustering analysis, using the Euclidean distance method, showed 14 distinct clusters. High variability was recorded for green forage yield, quantitative traits and nutritive quality parameters. A core subset of 50 accessions was identified, which captured most of the morphological and nutritional variability present in the total germplasm. Clustering of genotypes was observed to be related to the climatic conditions of the place of collection. High genetic variability observed for various morphological traits as well as forage yield indicated that these genotypes, or subsets of genotypes, can be evaluated under different abiotic stress conditions such as salt, light and moisture stress for the identification of suitable varieties for the respective areas. Variability was attributed to inter-generic and inter-specific crossing, together with the occasional presence of sexual plants in nature.
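The workflow named in the abstract (agglomerative clustering on Euclidean distances, then extraction of a core subset spanning the observed variability) can be sketched generically as below. This is a minimal illustration, not the study's pipeline: the trait matrix is synthetic, and the centroid-based core-selection rule is an assumed stand-in for whatever core-collection method the authors used.

```python
# Sketch of an agglomerative-clustering + core-subset workflow,
# assuming standardized trait scores; all data here are synthetic.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)

# Toy trait matrix: rows = genotypes, columns = standardized trait
# scores (e.g. green forage yield, plant height, crude protein).
traits = rng.normal(size=(60, 5))

# Pairwise Euclidean distances, then agglomerative (hierarchical)
# linkage with average linkage between clusters.
distances = pdist(traits, metric="euclidean")
tree = linkage(distances, method="average")

# Cut the dendrogram into a fixed number of clusters
# (the study reports 14 clusters for its 498 genotypes).
labels = fcluster(tree, t=14, criterion="maxclust")

# Assumed core-selection rule: take the genotype nearest each
# cluster centroid, so the core spans every cluster's variability.
core = []
for c in np.unique(labels):
    members = np.where(labels == c)[0]
    centroid = traits[members].mean(axis=0)
    nearest = members[
        np.argmin(np.linalg.norm(traits[members] - centroid, axis=1))
    ]
    core.append(int(nearest))
```

Cutting with `criterion="maxclust"` fixes the number of clusters in advance; in practice the cut height (and hence the cluster count) would be read off the dendrogram, as the study did in arriving at 14 clusters.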