Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
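As a rough illustration of the polygenic-scoring step (not the authors' pipeline; the genotype matrix, effect sizes and diagnostic labels below are simulated and all variable names are hypothetical), a PRS is typically a weighted sum of risk-allele dosages, and its ability to separate two diagnostic groups can be checked with a simple logistic model:

```python
# Illustrative sketch only -- not the study's analysis code. A polygenic risk
# score (PRS) is computed as a weighted sum of risk-allele dosages using
# per-variant effect sizes from an external GWAS, then used to separate groups.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_people, n_variants = 1000, 500

dosages = rng.integers(0, 3, size=(n_people, n_variants)).astype(float)  # 0/1/2 risk alleles
gwas_betas = rng.normal(0.0, 0.05, size=n_variants)                      # external per-variant effects

prs = dosages @ gwas_betas                 # weighted allele sum
prs = (prs - prs.mean()) / prs.std()       # standardise

diagnosis = rng.integers(0, 2, size=n_people)  # 1 = BPD, 0 = MDD (simulated labels)

model = LogisticRegression().fit(prs.reshape(-1, 1), diagnosis)
auc = roc_auc_score(diagnosis, model.predict_proba(prs.reshape(-1, 1))[:, 1])
print(f"case-case discrimination AUC: {auc:.3f}")  # ~0.5 here, since labels are random
```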
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support the view that controls, MDD patients and BPD patients lie primarily on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
This review aims to highlight the relative importance of lifestyle-associated cardiovascular disease (CVD) risk factors among individuals with inflammatory bowel disease (IBD) and to examine the effectiveness of lifestyle interventions in improving these CVD risk factors. Adults with IBD are at higher risk of CVD due to systemic and gut inflammation. In addition, tobacco smoking, dyslipidaemia, hypertension, obesity, physical inactivity and poor diet can also increase CVD risk. Typical IBD-related behavioural modifications, including food avoidance and reduced physical activity, as well as frequent corticosteroid use, can further increase CVD risk. We reviewed seven studies and found that there is insufficient evidence to draw conclusions about the effects of diet and/or physical activity interventions on CVD risk outcomes among populations with IBD. However, the limited findings suggest that people with IBD can adhere to a healthy diet or a Mediterranean diet (for which there is the most evidence) and safely participate in moderately intense aerobic and resistance training to potentially improve anthropometric risk factors. This review highlights the need for more robust controlled trials with larger sample sizes to assess and confirm the effects of lifestyle interventions in mitigating modifiable CVD risk factors among the IBD population.
Information on the time spent completing cognitive testing is often collected, but such data are not typically considered when quantifying cognition in large-scale community-based surveys. We sought to evaluate the added value of timing data over and above traditional cognitive scores for the measurement of cognition in older adults.
Method:
We used data from the Longitudinal Aging Study in India-Diagnostic Assessment of Dementia (LASI-DAD) study (N = 4,091) to assess the added value of timing data over and above traditional cognitive scores, using item-specific regression models for 36 cognitive test items. Models were adjusted for age, gender, interviewer, and item score.
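A minimal sketch of one such item-specific model is shown below, using simulated data and hypothetical variable names (the actual LASI-DAD outcome construction and interviewer adjustment are not reproduced here):

```python
# Minimal sketch of one item-specific model, assuming hypothetical column names;
# the actual LASI-DAD variables, outcome, and full covariate set may differ.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "cognition": rng.normal(0, 1, n),        # overall cognitive performance (standardised)
    "item_time": rng.gamma(2.0, 10.0, n),    # seconds spent on one test item
    "item_score": rng.integers(0, 2, n),     # correct / incorrect on that item
    "age": rng.integers(60, 90, n),
    "gender": rng.choice(["F", "M"], n),
})

# Quintiles of completion time, with the median quintile (Q3) as the reference.
df["time_q"] = pd.qcut(df["item_time"], 5, labels=["Q1", "Q2", "Q3", "Q4", "Q5"])
df["time_q"] = df["time_q"].cat.reorder_categories(["Q3", "Q1", "Q2", "Q4", "Q5"])

fit = smf.ols("cognition ~ C(time_q) + item_score + age + C(gender)", data=df).fit()
print(fit.params.filter(like="time_q"))  # coefficients for Q1, Q2, Q4, Q5 versus Q3
```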
Results:
Compared to Quintile 3 (median time), taking longer to complete specific items was associated (p < 0.05) with lower cognitive performance for 67% (Quintile 5) and 28% (Quintile 4) of items. Responding quickly (Quintile 1) was associated with higher cognitive performance for 25% of simpler items (e.g., orientation for year), but with lower cognitive functioning for 63% of items requiring higher-order processing (e.g., digit span test). Results were consistent in a range of different analyses adjusting for factors including education, hearing impairment, and language of administration and in models using splines rather than quintiles.
Conclusions:
Response times from cognitive testing may contain important information on cognition not captured in traditional scoring. Incorporation of this information has the potential to improve existing estimates of cognitive functioning.
Abacs approximating the product-moment correlation for both explicit and implicit selection are presented. These abacs give accuracy to within .01 of the corresponding analytic estimate.
The weighted Euclidean distances model in multidimensional scaling (WMDS) represents individual differences as dimension saliences which can be interpreted as the orientations of vectors in a subject space. It has recently been suggested that the statistics of directions would be appropriate for carrying out tests of location with such data. The nature of the directional representation in WMDS is reviewed and it is argued that, since dimension saliences are almost always positive, the directional representations will usually be confined to the positive orthant. Conventional statistical techniques are appropriate for angular representations of the individual differences, which will yield angles in the interval (0, 90) so long as dimension saliences are nonnegative, a restriction which can be imposed. Ordinary statistical methods are also appropriate with several linear indices which can be derived from WMDS results. Directional statistics may be applied more fruitfully to vector representations of preferences.
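For concreteness, the angular representation described above can be written out explicitly (the notation below is assumed for illustration, not quoted from the paper): for subject i with nonnegative saliences w_i1 and w_i2 on a two-dimensional solution, the direction of the subject vector is

```latex
% Notation assumed for illustration; not taken from the original paper.
\theta_i \;=\; \arctan\!\left(\frac{w_{i2}}{w_{i1}}\right),
\qquad
w_{i1},\, w_{i2} \ge 0 \;\Longrightarrow\; \theta_i \in [0^\circ,\, 90^\circ],
```

so ordinary statistical methods can be applied to the angles directly, provided the saliences are constrained to be nonnegative, as the abstract argues.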
This study explored the prospective use of the Ages and Stages Questionnaires-3 in follow-up after cardiac surgery.
Materials and Method:
For children undergoing cardiac surgery at 5 United Kingdom centres, the Ages and Stages Questionnaires-3 were administered 6 months and 2 years later, with an outcome based on pre-defined cut-points: Red = 1 or more domain scores >2 standard deviations below the normative mean, Amber = 1 or more domain scores 1–2 standard deviations below the normal range based on the manual, Green = scores within the normal range based on the manual.
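A small sketch of the traffic-light rule described above, assuming domain scores have already been converted to z-scores against the ASQ-3 normative mean (the manual's actual domain-specific cut-off values are not reproduced here):

```python
# Sketch of the traffic-light rule, assuming domain scores are expressed as
# z-scores relative to the ASQ-3 normative mean; the published manual cut-offs
# are applied per domain and are not reproduced here.
def classify_asq3(domain_z_scores):
    """Return 'Red', 'Amber', or 'Green' from a list of domain z-scores."""
    if any(z < -2.0 for z in domain_z_scores):
        return "Red"    # one or more domains > 2 SD below the normative mean
    if any(-2.0 <= z < -1.0 for z in domain_z_scores):
        return "Amber"  # one or more domains 1-2 SD below the normative mean
    return "Green"      # all domains within the normal range

print(classify_asq3([-0.4, 0.2, -1.5]))  # Amber
print(classify_asq3([-2.3, 0.0, 0.5]))   # Red
```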
Results:
From a cohort of 554 children <60 months old at surgery, 306 participated in the postoperative assessment: 117 (38.3%) were scored as Green, 57 (18.6%) as Amber, and 132 (43.1%) as Red. Children aged 6 months at first assessment (neonatal surgery) were more likely to score Red (113/124, 85.6%) than older age groups (32/182, 17.6%). Considering risk factors of congenital heart complexity, univentricular status, congenital comorbidity, and child age in a logistic regression model for the outcome of Ages and Stages score Red, only younger age was significant (p < 0.001). 87 children had surgery in infancy and were reassessed as toddlers. Of these, 43 (49.2%) improved, 30 (34.5%) stayed the same, and 13 (16.1%) worsened. Improved scores were predominantly in those who had a first assessment at 6 months old.
Discussion:
The Ages and Stages Questionnaires results are most challenging to interpret in young infants assessed at 6 months of age who are affected by complex congenital heart disease (CHD).
The association between cannabis and psychosis is established, but the role of underlying genetics is unclear. We used data from the EU-GEI case-control study and UK Biobank to examine the independent and combined effect of heavy cannabis use and schizophrenia polygenic risk score (PRS) on risk for psychosis.
Methods
Genome-wide association study summary statistics from the Psychiatric Genomics Consortium and the Genomic Psychiatry Cohort were used to calculate schizophrenia and cannabis use disorder (CUD) PRS for 1098 participants from the EU-GEI study and 143,600 from the UK Biobank. Both datasets had information on cannabis use.
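A hedged sketch of the kind of model implied by this design (simulated data and hypothetical variable names, not the EU-GEI or UK Biobank analysis code): case status regressed on schizophrenia PRS, cannabis use and their interaction, with CUD PRS as an additional covariate.

```python
# Hedged sketch only -- data are simulated and variable names are assumed.
# Psychosis case status is regressed on schizophrenia PRS, daily cannabis use,
# their interaction, and CUD PRS; exponentiated coefficients give odds ratios.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 5000
df = pd.DataFrame({
    "case": rng.integers(0, 2, n),            # 1 = psychosis case, 0 = control
    "scz_prs": rng.normal(0, 1, n),           # standardised schizophrenia PRS
    "cud_prs": rng.normal(0, 1, n),           # standardised CUD PRS
    "daily_cannabis": rng.integers(0, 2, n),  # 1 = daily use, 0 = otherwise
})

fit = smf.logit("case ~ scz_prs * daily_cannabis + cud_prs", data=df).fit(disp=0)
print(np.exp(fit.params))  # ORs for PRS, cannabis use, their interaction, and CUD PRS
```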
Results
In both samples, schizophrenia PRS and cannabis use independently increased risk of psychosis. Schizophrenia PRS was not associated with patterns of cannabis use in the EU-GEI cases or controls or UK Biobank cases. It was associated with lifetime and daily cannabis use among UK Biobank participants without psychosis, but the effect was substantially reduced when CUD PRS was included in the model. In the EU-GEI sample, regular users of high-potency cannabis had the highest odds of being a case independently of schizophrenia PRS (OR daily use high-potency cannabis adjusted for PRS = 5.09, 95% CI 3.08–8.43, p = 3.21 × 10−10). We found no evidence of interaction between schizophrenia PRS and patterns of cannabis use.
Conclusions
Regular use of high-potency cannabis remains a strong predictor of psychotic disorder independently of schizophrenia PRS, which does not seem to be associated with heavy cannabis use. These are important findings at a time of increasing use and potency of cannabis worldwide.
Cannabis use and familial vulnerability to psychosis have been associated with social cognition deficits. This study examined the potential relationship between cannabis use and cognitive biases underlying social cognition and functioning in patients with first episode psychosis (FEP), their siblings, and controls.
Methods
We analyzed a sample of 543 participants with FEP, 203 siblings, and 1168 controls from the EU-GEI study using a correlational design. We used logistic regression analyses to examine the influence of clinical group, lifetime cannabis use frequency, and potency of cannabis use on cognitive biases, accounting for demographic and cognitive variables.
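For readers unfamiliar with how the odds ratios below are obtained, a minimal sketch: each OR and its 95% CI are typically recovered by exponentiating a fitted logistic coefficient and its Wald interval. The coefficient and standard error below are hypothetical values, chosen only so that they roughly reproduce the FRP figure reported in the Results.

```python
# Sketch only: recovering an odds ratio and 95% CI from a fitted logistic
# regression coefficient. Beta and SE are hypothetical illustrative values.
import numpy as np

beta, se = 0.496, 0.194                        # hypothetical coefficient and standard error
or_point = np.exp(beta)                        # odds ratio
ci_low, ci_high = np.exp(beta - 1.96 * se), np.exp(beta + 1.96 * se)
print(f"OR = {or_point:.3f}, 95% CI {ci_low:.3f}-{ci_high:.3f}")  # ~1.642, 1.123-2.402
```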
Results
FEP patients showed increased odds of facial recognition processing (FRP) deficits (OR = 1.642, CI 1.123–2.402) relative to controls but not of speech illusions (SI) or jumping to conclusions (JTC) bias, with no statistically significant differences relative to siblings. Daily and occasional lifetime cannabis use were associated with decreased odds of SI (OR = 0.605, CI 0.368–0.997 and OR = 0.646, CI 0.457–0.913, respectively) and JTC bias (OR = 0.625, CI 0.422–0.925 and OR = 0.602, CI 0.460–0.787, respectively) compared with lifetime abstinence, but not with FRP deficits, in the whole sample. Within the cannabis user group, low-potency cannabis use was associated with increased odds of SI (OR = 1.829, CI 1.297–2.578), FRP deficits (OR = 1.393, CI 1.031–1.882), and JTC bias (OR = 1.661, CI 1.271–2.171) relative to high-potency cannabis use, with comparable effects in the three clinical groups.
Conclusions
Our findings suggest increased odds of cognitive biases in FEP patients who have never used cannabis and in low-potency users. Future studies should elucidate this association and its potential implications.
Edited by
David M. Greer, Boston University School of Medicine and Boston Medical Center; Neha S. Dangayach, Icahn School of Medicine at Mount Sinai and Mount Sinai Health System
There are many central nervous system (CNS) pathologies that are managed in the neurointensive care unit. Neurocritical patients are a diverse group with vastly different presentations, management, expected duration of their clinical course, and disease-related long-term outcomes. Clinical entities include traumatic brain injury (TBI), ischemic stroke, aneurysmal subarachnoid hemorrhage (aSAH), intraparenchymal hemorrhages (ICH), spinal cord injury (SCI), brain tumors, postoperative craniotomy patients, and nonsurgical diseases, such as myasthenia gravis, Guillain–Barré syndrome, and CNS infections (meningitis and encephalitis).
There are a variety of bedside neurosurgical and neurocritical care procedures that may be required to provide care, mitigate the effects of the primary neurologic pathology, and improve outcomes. Despite the many advances in neurosurgical and neurocritical care in the last several decades, complications from these procedures, while generally rare, can still occur (Table 16.1).
Field experiments were conducted at Clayton and Rocky Mount, NC, during summer 2020 to determine the growth and fecundity of Palmer amaranth plants that survived glufosinate with and without grass competition in cotton. Glufosinate (590 g ai ha−1) was applied to Palmer amaranth early postemergence (5 cm tall), mid-postemergence (7 to 10 cm tall), and late postemergence (>10 cm tall) and at orthogonal combinations of those timings. Nontreated Palmer amaranth was grown in weedy, weed-free in-crop (WFIC) and weed-free fallow (WFNC) conditions for comparison. Palmer amaranth control decreased as larger plants were treated; no plants survived the sequential glufosinate applications in either experiment. The apical and circumferential growth of Palmer amaranth surviving glufosinate treatments was reduced by more than 44% compared to the WFIC and WFNC Palmer amaranth in both experiments. The biomass of Palmer amaranth plants surviving glufosinate was reduced by more than 62% when compared with the WFIC and WFNC plants in all experiments. The fecundity of Palmer amaranth surviving glufosinate treatments was reduced by more than 73% compared to WFNC Palmer amaranth in all experiments. Remarkably, plants that survived glufosinate were as fecund as WFIC plants only in the Grass Competition experiment. The results demonstrate that despite decreased vegetative growth, Palmer amaranth surviving glufosinate treatment remains fecund and can be as fecund as nontreated plants in cotton. These results suggest that a glufosinate-treated grass weed may not exert a significant interspecific competition effect on Palmer amaranth that survives glufosinate. Glufosinate should be applied to 5 to 7 cm Palmer amaranth to halt its vegetative growth and reproductive capacity.
Palmer amaranth (Amaranthus palmeri S. Watson, AMAPA) is one of the most troublesome weeds in North America due to its rapid growth rate, substantial seed production, competitiveness and the evolution of herbicide-resistant populations. Though frequently encountered in the South, Midwest, and Mid-Atlantic regions of the United States, A. palmeri was recently identified in soybean [Glycine max (L.) Merr.] fields in Genesee, Orange, and Steuben counties, NY, where glyphosate was the primary herbicide for in-crop weed control. This research, conducted in 2023, aimed to (1) describe the dose response of three putative resistant NY A. palmeri populations to glyphosate, (2) determine their mechanisms of resistance, and (3) assess their sensitivity to other postemergence herbicides commonly used in NY crop production systems. Based on the effective dose necessary to reduce aboveground biomass by 50% (ED50), the NY populations were 42 to 67 times more resistant to glyphosate compared with a glyphosate-susceptible population. Additionally, the NY populations had elevated EPSPS gene copy numbers ranging from 25 to 135 located within extrachromosomal circular DNA (eccDNA). Label rate applications of Weed Science Society of America (WSSA) Group 2 herbicides killed up to 42% of the NY populations of A. palmeri. Some variability was observed among populations in response to WSSA Group 5 and 27 herbicides. All populations were effectively controlled by labeled rates of herbicides belonging to WSSA Groups 4, 10, 14, and 22. Additional research is warranted to confirm whether NY populations have evolved multiple resistance to herbicides within other WSSA groups and to develop effective A. palmeri management strategies suitable for NY crop production.
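Dose-response work of this kind commonly fits a log-logistic curve to estimate the ED50; the sketch below uses made-up doses and biomass values rather than the study's data or its actual fitting software.

```python
# Illustrative three-parameter log-logistic dose-response fit for an ED50
# estimate; doses and biomass values are fabricated, not the study's data.
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(dose, upper, ed50, slope):
    """Biomass as a percentage of untreated, declining with herbicide dose."""
    return upper / (1.0 + (dose / ed50) ** slope)

doses = np.array([0.01, 27.0, 54.0, 108.0, 217.0, 434.0, 868.0, 1736.0])  # g ae ha-1
biomass = np.array([100.0, 96.0, 88.0, 70.0, 48.0, 25.0, 12.0, 5.0])      # % of untreated

params, _ = curve_fit(log_logistic, doses, biomass, p0=[100.0, 200.0, 1.0])
print(f"Estimated ED50 = {params[1]:.1f} g ae ha-1")
```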
Coastal wetlands are hotspots of carbon sequestration, and their conservation and restoration can help to mitigate climate change. However, there remains uncertainty about when and where coastal wetland restoration can most effectively act as natural climate solutions (NCS). Here, we synthesize current understanding to illustrate the requirements for coastal wetland restoration to benefit climate, and discuss potential paths forward that address key uncertainties impeding implementation. To be effective as NCS, coastal wetland restoration projects must accrue climate cooling benefits that would not occur without management action (additionality), be implementable (feasibility) and persist over management-relevant timeframes (permanence). Several issues add uncertainty to understanding whether these minimum requirements are met. First, coastal wetlands serve as both a landscape source and sink of carbon for other habitats, increasing uncertainty in additionality. Second, coastal wetlands can potentially migrate outside of project footprints as they respond to sea-level rise, increasing uncertainty in permanence. To address these first two issues, a system-wide approach may be necessary, rather than basing cooling benefits only on changes that occur within project boundaries. Third, the need for NCS to function over management-relevant decadal timescales means that methane responses may need to be included in coastal wetland restoration planning and monitoring. Finally, there is uncertainty about how much data are required to justify restoration action. We summarize the minimum data required to make a binary decision on whether there is a net cooling benefit from a management action, noting that these data are more readily available than the data required to quantify the magnitude of cooling benefits for carbon crediting purposes. By reducing uncertainty, coastal wetland restoration can be implemented at the scale required to contribute significantly to addressing the current climate crisis.
Seattle Children’s Research Institute is identifying the amount and type of health equity scholarship being conducted institution-wide. However, methods for categorizing how scholarship is equity-focused are lacking. We developed and evaluated the reliability of a health equity scholarship coding schema applied to Seattle Children’s affiliated scholarship.
Methods:
A 2021–2022 Ovid MEDLINE affiliation search yielded 3551 affiliated scholarship records, with 1079 records identified via an existing filter as scholarship addressing social determinants of health. Through reliability testing and examination of concordance and discordance across three independent coders of these records, we developed a coding schema to classify health equity scholarship (yes/no). When a record was classified as health equity scholarship, the coders assigned a maturity rating from one to five reflecting the scholarship's progress towards addressing inequities. Subsequent reliability testing, including a new coder, was conducted for 992 subsequent affiliated scholarship records (Oct 2022–June 2023), with additional testing of the sensitivity and specificity of the existing filter relative to the new coding schema.
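A minimal sketch of the agreement statistic used below (Fleiss' kappa) on simulated coder ratings; the project's real records and coding data are not reproduced here.

```python
# Minimal sketch of Fleiss' kappa on simulated ratings from three coders;
# not the project's actual data or reliability pipeline.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

rng = np.random.default_rng(3)
# 200 scholarship records rated by 3 coders as health equity scholarship (1) or not (0).
ratings = rng.integers(0, 2, size=(200, 3))

table, _ = aggregate_raters(ratings)                 # records x categories count table
print(f"Fleiss' kappa: {fleiss_kappa(table):.2f}")   # near 0 for random ratings
```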
Results:
Reliability for identifying health equity scholarship was consistently high (Fleiss kappas ≥ .78) and categorization of health equity scholarship into maturity levels was moderate (Fleiss kappas ≥ .47). The coding schema identified additional health equity scholarship not captured by an existing filter for social determinants of health scholarship. Based on the new schema, 23.3% of Seattle Children’s affiliated scholarship published October 2022–June 2023 was health equity focused.
Conclusions:
This new coding schema can be used to identify and categorize health equity scholarship to help quantitate the health equity focus of portfolios of human-focused research.
Military Servicemembers and Veterans are at elevated risk for suicide, but rarely self-identify to their leaders or clinicians regarding their experience of suicidal thoughts. We developed an algorithm to identify posts containing suicide-related content on a military-specific social media platform.
Methods
Publicly-shared social media posts (n = 8449) from a military-specific social media platform were reviewed and labeled by our team for the presence/absence of suicidal thoughts and behaviors and used to train several machine learning models to identify such posts.
Results
The best performing model was a deep learning (RoBERTa) model that incorporated post text and metadata and detected the presence of suicidal posts with relatively high sensitivity (0.85), specificity (0.96), precision (0.64), F1 score (0.73), and an area under the precision-recall curve of 0.84. Compared to non-suicidal posts, suicidal posts were more likely to contain explicit mentions of suicide, descriptions of risk factors (e.g. depression, PTSD) and help-seeking, and first-person singular pronouns.
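As a hedged illustration of how the reported metrics can be computed from a classifier's outputs (the labels, scores, and 0.5 threshold below are simulated and assumed, not the study's data or model):

```python
# Sketch of the evaluation metrics named above, computed on simulated labels
# and predicted probabilities; threshold, data, and scores are hypothetical.
import numpy as np
from sklearn.metrics import confusion_matrix, average_precision_score

rng = np.random.default_rng(4)
y_true = rng.integers(0, 2, size=1000)                              # 1 = suicidal post
scores = np.clip(y_true * 0.6 + rng.normal(0.3, 0.2, 1000), 0, 1)   # predicted probabilities
y_pred = (scores >= 0.5).astype(int)

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
precision = tp / (tp + fp)
f1 = 2 * precision * sensitivity / (precision + sensitivity)
auprc = average_precision_score(y_true, scores)                     # area under PR curve
print(f"sens={sensitivity:.2f} spec={specificity:.2f} prec={precision:.2f} "
      f"F1={f1:.2f} AUPRC={auprc:.2f}")
```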
Conclusions
Our results demonstrate the feasibility and potential promise of using social media posts to identify at-risk Servicemembers and Veterans. Future work will use this approach to deliver targeted interventions to social media users at risk for suicide.
We assessed adverse events in hospitalized patients receiving selected vesicant antibiotics or vasopressors administered through midline catheters or peripherally inserted central catheters (PICC). The rates of catheter-related bloodstream infections, thrombosis, and overall events were similar across the two groups, while occlusion was higher in the PICC group.
Globally, mental disorders account for almost 20% of disease burden and there is growing evidence that mental disorders are associated with various social determinants. Tackling the United Nations Sustainable Development Goals (UN SDGs), which address known social determinants of mental disorders, may be an effective way to reduce the global burden of mental disorders.
Objectives
To examine the evidence base for interventions that seek to improve mental health through targeting the social determinants of mental disorders.
Methods
We conducted a systematic review of reviews, using a five-domain conceptual framework which aligns with the UN SDGs (PROSPERO registration: CRD42022361534). PubMed, PsycInfo, and Scopus were searched from 01 January 2012 until 05 October 2022. Citation follow-up and expert consultation were used to identify additional studies. Systematic reviews including interventions seeking to change or improve a social determinant of mental disorders were eligible for inclusion. Study screening, selection, data extraction, and quality appraisal were conducted in accordance with PRISMA guidelines. The AMSTAR-2 was used to assess included reviews and results were narratively synthesised.
Results
Over 20,000 records were screened, and 101 eligible reviews were included. Most reviews were of low, or critically low, quality. Reviews included interventions which targeted sociocultural (n = 31), economic (n = 24), environmental (n = 19), demographic (n = 15), and neighbourhood (n = 8) determinants of mental disorders. Interventions demonstrating the greatest promise for improved mental health from high and moderate quality reviews (n = 37) included: digital and brief advocacy interventions for female survivors of intimate partner violence; cash transfers for people in low- and middle-income countries; improved work schedules, parenting programs, and job clubs in the work environment; psychosocial support programs for vulnerable individuals following environmental events; and social and emotional learning programs for school students. Few effective neighbourhood-level interventions were identified.
Conclusions
This review presents interventions with the strongest evidence base for the prevention of mental disorders and highlights synergies where addressing the UN SDGs can be beneficial for mental health. A range of issues across the literature were identified, including barriers to conducting randomised controlled trials and lack of follow-up limiting the ability to measure long-term mental health outcomes. Interdisciplinary and novel approaches to intervention design, implementation, and evaluation are required to improve the social circumstances and mental health experienced by individuals, communities, and populations.
Researchers increasingly rely on aggregations of radiocarbon dates from archaeological sites as proxies for past human populations. This approach has been critiqued on several grounds, including the assumptions that material is deposited, preserved, and sampled in proportion to past population size. However, various attempts to quantitatively assess the approach suggest there may be some validity in assuming date counts reflect relative population size. To add to this conversation, here we conduct a preliminary analysis coupling estimates of ethnographic population density with late Holocene radiocarbon dates across all counties in California. Results show that counts of late Holocene radiocarbon-dated archaeological sites increase significantly as a function of ethnographic population density. This trend is robust across varying sampling windows over the last 5000 years, though the majority of variation in dated-site counts remains unexplained by population density. Outliers reveal how departures from the central trend may be influenced by regional differences in research traditions, development-driven contract work, organic preservation, and landscape taphonomy. Overall, this exercise provides some support for the “dates-as-data” approach and offers insights into the conditions where the underlying assumptions may or may not hold.
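One simple way to probe the relationship described here is a county-level count regression of dated-site counts on population density; the sketch below uses simulated data and an assumed Poisson specification rather than the study's actual analysis.

```python
# Hedged sketch of a county-level count regression; data are simulated, not the
# California radiocarbon compilation, and the Poisson form is an assumption.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n_counties = 58
density = rng.gamma(2.0, 0.5, n_counties)              # ethnographic persons per km^2
expected = np.exp(1.0 + 0.8 * np.log(density + 0.01))  # assumed positive relationship
sites = rng.poisson(expected)                          # dated-site counts per county

df = pd.DataFrame({"dated_sites": sites, "log_density": np.log(density + 0.01)})
fit = smf.glm("dated_sites ~ log_density", data=df, family=sm.families.Poisson()).fit()
print(fit.params)  # a positive slope indicates counts rise with population density
```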
Throughout the COVID-19 pandemic, many areas in the United States experienced healthcare personnel (HCP) shortages tied to a variety of factors. Infection prevention programs, in particular, faced increasing workload demands with little opportunity to delegate tasks to others without specific infectious diseases or infection control expertise. Shortages of clinicians providing inpatient care to critically ill patients during the early phase of the pandemic were multifactorial, largely attributed to increasing demands on hospitals to provide care to patients hospitalized with COVID-19 and to furloughs.1 HCP shortages and challenges during later surges, including the Omicron variant-associated surges, were largely attributed to HCP infections and associated work restrictions during isolation periods and the need to care for family members, particularly children, with COVID-19. Additionally, the detrimental physical and mental health impact of COVID-19 on HCP has led to attrition, which further exacerbates shortages.2 Demands increased in post-acute and long-term care (PALTC) settings, which already faced critical staffing challenges, difficulty with recruitment, and high rates of turnover. Although individual healthcare organizations and state and federal governments have taken actions to mitigate recurring shortages, additional work and innovation are needed to develop longer-term solutions to improve healthcare workforce resiliency. The critical role of those with specialized training in infection prevention, including healthcare epidemiologists, was well demonstrated in pandemic preparedness and response. The COVID-19 pandemic underscored the need to support growth in these fields.3 This commentary outlines the need to develop the US healthcare workforce in preparation for future pandemics.