The perinatal period has gained increasing attention from developmental psychopathologists; however, experiences during birth have been minimally examined using this framework. The current study aimed to evaluate longitudinal associations between childhood maltreatment, negative birth experiences, and postpartum mental health across levels of self-reported emotion dysregulation and respiratory sinus arrhythmia (RSA). Expectant mothers (N = 223) participated in a longitudinal study from the third trimester of pregnancy to 7 months postpartum. Participants contributed prenatal resting RSA and completed questionnaires prenatally, 24 hours after birth, and 7 months postpartum. Results indicated that more childhood maltreatment was associated with higher birth fear and postpartum anxiety and depressive symptoms. Resting RSA moderated the association between childhood maltreatment and birth fear, such that more childhood maltreatment and higher resting RSA were associated with increased birth fear. Additionally, self-reported prenatal emotion dysregulation moderated the association between childhood maltreatment and postpartum depressive symptoms, such that more childhood maltreatment and higher emotion dysregulation were associated with increased depressive symptoms. Emotion dysregulation across multiple levels may amplify vulnerability to negative birth experiences and postpartum psychopathology among individuals with childhood maltreatment histories. Thus, emotion dysregulation may be a worthwhile intervention target, in the context of trauma-informed care, during the perinatal period.
This chapter describes the value of using Contemporary Integrative Interpersonal Theory (CIIT) to understand the self and social impairments that define personality disorders as a group. CIIT’s major tenets are summarized, with a particular emphasis on elaborating how the self and self-functioning are an integral part of interpersonal experience and expression. A generic definition of adaptive interpersonal functioning is provided along with a demonstration of how CIIT can accommodate specific constructs and diagnoses using borderline personality disorder and narcissism as examples.
Around 1000 years ago, Madagascar experienced the collapse of populations of large vertebrates that ultimately resulted in many species going extinct. The factors that led to this collapse appear to have differed regionally, but in some ways, key processes were similar across the island. This review evaluates four hypotheses that have been proposed to explain the loss of large vertebrates on Madagascar: overkill, aridification, synergy, and subsistence shift. We explore regional differences in the paths to extinction and the significance of a prolonged extinction window across the island. The data suggest that people who arrived early and depended on hunting, fishing, and foraging had little effect on Madagascar’s large endemic vertebrates. Megafaunal decline was triggered initially by aridification in the driest bioclimatic zone, and by the arrival of farmers and herders in the wetter bioclimatic zones. Ultimately, it was the expansion of agropastoralism across both wet and dry regions that drove large endemic vertebrates to extinction everywhere.
Traditional foods are increasingly being incorporated into modern diets. This is largely driven by consumers seeking alternative food sources that have superior nutritional and functional properties. Within Australia, Aboriginal and Torres Strait Islander peoples are looking to develop their traditional foods for commercial markets. However, supporting evidence to suggest these foods are safe for consumption within the wider general population is limited. At the 2022 NSA conference a keynote presentation titled ‘Decolonising food regulatory frameworks to facilitate First Peoples food sovereignty’ was presented. This presentation was followed by a manuscript titled ‘Decolonising food regulatory frameworks: Importance of recognising traditional culture when assessing dietary safety of traditional foods’, which was published in the conference proceedings journal(1). These pieces examined the current regulatory frameworks that are used to assess traditional foods and proposed a way forward that would allow Traditional Custodians to successfully develop their foods for modern markets. Building upon the previously highlighted works, this presentation will showcase best practice Indigenous engagement and collaboration principles in the development of traditionally used food products. To achieve this, we collaborated with a collective of Gamilaraay peoples who are looking to reignite their traditional grain practices and develop grain-based food products. To meet the current food safety regulatory requirements, we needed to understand how this grain would fit into modern diets, which included understanding the history of use, elucidating the nutritional and functional properties that can be attributed to the grain, and developing a safety dossier(2) so that the Traditional Custodians can confidently take their product to market. 
To aid the Traditional Custodians in performing their due diligence, we have systematically analysed the dietary safety of the selected native grain and compared it side-by-side with commonly consumed wheat in a range of in vitro bioassays and chemical analyses. From a food safety perspective, we show that the native grain is equivalent to commonly consumed wheat. The native grain has been shown to be no more toxic than wheat within our biological screening systems. Chemical analysis showed that the levels of contaminants are below tolerable limits, and we were not able to identify any chemical classes of concern. Our initial findings support the history of safe use and suggest that the tested native grain species would be no less safe than commonly consumed wheat. This risk assessment and a previously published nutritional study(3) provide an overall indication that the grain is nutritionally superior and viable for commercial development. The learnings from this project can direct the future risk assessment of traditional foods and therefore facilitate the safe market access of a broader range of traditionally used foods. Importantly, the methods presented are culturally safe and financially viable for the small businesses hoping to enter the market.
The Society for Healthcare Epidemiology of America, the Association for Professionals in Infection Control and Epidemiology, the Infectious Diseases Society of America, and the Pediatric Infectious Diseases Society represent the core expertise regarding healthcare infection prevention and infectious diseases and have written this multisociety statement for healthcare facility leaders, regulatory agencies, payors, and patients to strengthen requirements and expectations around facility infection prevention and control (IPC) programs. Based on a systematic literature search and formal consensus process, the authors advocate raising the expectations for facility IPC programs, moving to effective programs that are:
• Foundational and influential parts of the facility’s operational structure
• Resourced with the correct expertise and leadership
• Prioritized to address all potential infectious harms
This document discusses the IPC program’s leadership—a dyad model that includes both physician and infection preventionist leaders—its reporting structure, expertise, and competencies of its members, and the roles and accountability of partnering groups within the healthcare facility. The document outlines a process for identifying minimum IPC program medical director support. It applies to all types of healthcare settings except post-acute long-term care and focuses on resources for the IPC program. Long-term acute care hospital (LTACH) staffing and antimicrobial stewardship programs will be discussed in subsequent documents.
Herbicide drift to sensitive crops can result in significant injury, yield loss, and even crop destruction. When pesticide drift is reported to the Georgia Department of Agriculture (GDA), tissue samples are collected and analyzed for residues. Seven field studies were conducted in 2020 and 2021 in cooperation with the GDA to evaluate the effect of (1) time interval between simulated drift event and sampling, (2) low-dose herbicide rates, and (3) the sample collection methods on detecting herbicide residues in cotton (Gossypium hirsutum L.) foliage. Simulated drift rates of 2,4-D, dicamba, and imazapyr were applied to non-tolerant cotton in the 8- to 9-leaf stage with plant samples collected at 7 or 21 d after treatment (DAT). During collection, plant sampling consisted of removing entire plants or removing new growth occurring after the 7-leaf stage. Visual cotton injury from 2,4-D reached 43% to 75% at 0.001 and 0.004 kg ae ha−1, respectively; for dicamba, it was 9% to 41% at 0.003 or 0.014 kg ae ha−1, respectively; and for imazapyr, it was 1% to 74% with 0.004 and 0.03 kg ae ha−1 rates, respectively. Yield loss was observed with both rates of 2,4-D (11% to 51%) and with the high rate of imazapyr (52%); dicamba did not influence yield. Herbicide residues were detected in 88%, 88%, and 69% of samples collected from plants treated with 2,4-D, dicamba, and imazapyr, respectively, at 7 DAT compared with 25%, 16%, and 22% when samples were collected at 21 DAT, highlighting the importance of sampling quickly after a drift event. Although the interval between drift event and sampling, drift rate, and sampling method can all influence residue detection for 2,4-D, dicamba, and imazapyr, the factor with the greatest influence is the amount of time between drift and sample collection.
Metabolite supplementation during in vitro embryo development improves blastocyst quality; however, our understanding of the incorporation of metabolites during in vitro maturation (IVM) is limited. Two important metabolites, follistatin and choline, have beneficial impacts during in vitro culture; however, the effects of supplementation during IVM are unknown. The objective of this study was to investigate the effects of combining choline and follistatin during IVM on bovine oocytes and subsequent early embryonic development. We hypothesized that supplementation of choline with follistatin would synergistically improve oocyte quality and subsequent early embryonic development. Small follicles were aspirated from slaughterhouse ovaries to obtain cumulus oocyte complexes for IVM with choline (0, 1.3 or 1.8 mM) and follistatin (0 or 10 ng/mL) supplementation in a 3 × 2 design. A subset of oocytes underwent transcriptomic analysis; the remaining oocytes were used for IVF and in vitro culture (IVC). Transcript abundance of CEPT1 tended to be reduced in oocytes supplemented with 1.8 mM choline and follistatin compared to control oocytes (P = 0.07). The combination of follistatin with 1.8 mM choline supplementation during maturation tended (P = 0.08) to reduce CPEB4 in oocytes. In the blastocysts, HDCA8, NANOG, SAV1 and SOX2 were increased with 1.8 mM choline supplementation without follistatin (P < 0.05), while HDCA8 and SOX2 were increased when follistatin was incorporated (P < 0.05). The combination of choline and follistatin during oocyte maturation may provide a beneficial impact on early embryonic development. Further research is warranted to investigate the interaction between these two metabolites during early embryonic development and their long-term influence on fetal development.
Depression is the leading cause of disability worldwide(1). The microbiota-gut-brain axis may play a role in the aetiology of depression, and probiotics show promise for improving mood and depressive state(2). Further evidence is required to support proposed mechanisms and to test efficacy in high-risk populations, such as those with sub-threshold depression (which may be 2-3 times more prevalent than diagnosed depression)(3). The aims were to assess the efficacy of a probiotic compared with placebo in reducing the severity of depressive symptoms in participants with subthreshold depression, and to investigate potential mechanistic markers of inflammatory, antioxidant status and stress response. A double-blind, randomised, placebo-controlled trial was conducted in participants meeting the diagnosis of subthreshold depression (DSM-5); aged 18-65 years; ≥18.5 kg/m2 body mass index; not taking antidepressants, centrally acting medications, probiotics nor antibiotics for at least 6 weeks. The probiotic (4 × 109 AFU/CFU, 2.5 g freeze-dried powder containing Lactobacillus fermentum LF16 (DSM26956), L. rhamnosus LR06 (DSM21981), L. plantarum LP01 (LMG P-21021), Bifidobacterium longum BL04 (DSM 23233)) or placebo was taken daily for 3 months. Data were collected at 3 study visits (pre-, mid- (6 weeks), post-intervention). Self-reported questionnaires measured psychological symptoms (Beck Depression Inventory, BDI; Hospital Anxiety Depression Scale, HADS) and quality of life. Blood and salivary samples were collected for biomarkers including cortisol awakening response (CAR). General linear models examined within-group and between-group differences across all time points. Thirty-nine participants completed the study (n = 19 probiotic; n = 20 placebo); analyses followed the intention-to-treat principle. The probiotic group decreased in BDI score by −6.5 (95% CI −12.3; −0.7) and −7.6 (95% CI −13.4; −1.8) at 6 and 12 weeks, respectively.
The HADS-A score decreased in the probiotic group by −2.8 (95% CI −5.2; −0.4) and −2.7 (95% CI −5.1; −0.3) at 6 and 12 weeks, respectively. The HADS-D score decreased in the probiotic group by −3.0 (95% CI −5.4; −0.7) and −2.5 (−4.9; −0.2) at 6 and 12 weeks of intervention, respectively. No between-group differences were found. There were no changes in perceived stress or quality of life scores. The probiotic group had reduced hs-CRP levels (7286.2 ± 1205.8 ng/dL vs. 5976.4 ± 1408.3; P = 0.003) and increased total glutathione (14.2 ± 8.9 ng/dL vs. 9.3 ± 4.7; P = 0.049) compared to placebo, post-intervention. Lower levels of CAR were found in the probiotic group compared to placebo (−0.04 ± 0.17 μg/dL vs. 0.16 ± 0.25; P = 0.009). A significant reduction in depressive symptoms and anxiety was observed within the probiotic group only. These results were supported by improvements observed in biomarkers, suggesting probiotics may improve psychological wellbeing in adults experiencing sub-threshold depression, by potential pathways involved in central nervous system homeostasis and inflammation. Future analyses are required to understand changes within the intestinal microbiota and to clarify how their metabolites facilitate emotional processing.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. 
Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Acute clinical deterioration in hospital inpatients can be caused by a range of factors including dementia, delirium, substance withdrawal and psychiatric disturbance, creating challenges in diagnosis and often requiring a management plan with input from multiple disciplines. Staff forums and the broader literature have confirmed that healthcare staff working in non-mental health settings may not be as skilled in recognising and managing early signs of emerging and/or escalating clinical agitation. The BoC RRT is a consultation service within the Division of Medicine and Consultation-Liaison (CL) Psychiatry. Staffed by medical registrars and mental health nurses, the collaboration provides a unique healthcare response to acute general wards. The BoC RRT has been implemented to address the rising number of incidents whereby staff and patient safety are compromised. Using evidence-based skills, the team aimed to: respond to episodes of clinical agitation that require an internal security response; assist ward referrals by exploring biopsychosocial contributors to behaviour; develop individual patient support plans; and review and reduce restrictive intervention practices.
Objectives
To determine if the rapid response model has influenced:
- The impact on staff/patient safety
- Frequency of emergency responses for aggression
- Frequency of restrictive intervention use
Methods
This project was approved as a quality assurance project (QA2022018). The patients within scope of the BoC RRT include inpatients in medical and surgical wards; patients in Emergency Departments, mental health units and outpatient clinics, and visitors, are excluded. The evaluation of the pilot used a PDSA (Plan, Do, Study, Act) cycle when implementing new improvements. A mixed methods approach explored the impact of the BoC RRT. Staff consultation identified challenges in responding to scenarios in which there is risk of harm to staff and patients. Staff feedback and emergency response data were monitored.
Results
In 2021, there were approximately 720 code greys per month requiring a security response. Since the implementation of the BoC RRT, this number has fallen to 527 per month. Reviewing restrictive intervention practices has identified areas for policy review and a need for education. Staff consultation found that nurses were confident caring for patients exhibiting clinical agitation associated with delirium and dementia; however, caring for people with mental health or substance use disorders was more challenging.
Conclusions
These interim results indicate that the BoC RRT has been generally well received by clinical staff. The decline in code grey responses suggests it is having a positive impact on the early identification and management of clinical agitation in hospital inpatients. There is support for this response model to continue beyond the pilot phase, and for further research.
Mental health problems are elevated in autistic individuals, but there is limited evidence on the developmental course of problems across childhood. We compare the level and growth of anxious-depressed, behavioral, and attention problems in an autistic and a typically developing (TD) cohort.
Methods
Latent growth curve models were applied to repeated parent-report Child Behavior Checklist data from age 2–10 years in an inception cohort of autistic children (Pathways, N = 397; 84% boys) and a general population TD cohort (Wirral Child Health and Development Study; WCHADS; N = 884, 49% boys). Percentile plots were generated to quantify the differences between autistic and TD children.
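The latent growth curve logic described in the Methods can be illustrated with a much simpler two-stage approximation: fit a least-squares line to each child's repeated scores, then average the intercepts (level) and slopes (growth) within each group. A true latent growth curve model estimates these latent factors jointly with measurement error; this sketch, with entirely synthetic scores, is only meant to make the level-versus-growth distinction concrete.

```python
# Two-stage stand-in for a latent growth curve model: stage 1 fits a
# least-squares line (intercept, slope) to each child's repeated scores;
# stage 2 averages those within groups. All data below are synthetic
# illustrations, not values from the study.

def fit_line(ages, scores):
    """Closed-form OLS intercept and slope for one child."""
    n = len(ages)
    mx = sum(ages) / n
    my = sum(scores) / n
    sxx = sum((a - mx) ** 2 for a in ages)
    sxy = sum((a - mx) * (s - my) for a, s in zip(ages, scores))
    slope = sxy / sxx
    return my - slope * mx, slope

def group_growth(children, ages):
    """Mean intercept (level) and mean slope (growth) across children."""
    fits = [fit_line(ages, scores) for scores in children]
    n = len(fits)
    return (sum(f[0] for f in fits) / n, sum(f[1] for f in fits) / n)

ages = [2, 4, 6, 8, 10]                             # assessment ages, years
autistic = [[8, 9, 9, 10, 10], [7, 8, 9, 9, 11]]    # synthetic CBCL-like scores
typical  = [[3, 3, 4, 4, 4], [2, 3, 3, 4, 5]]

level_a, slope_a = group_growth(autistic, ages)
level_t, slope_t = group_growth(typical, ages)
print(level_a, slope_a, level_t, slope_t)
```

In the full model, group differences in the latent level correspond to elevated problems overall, while differences in the latent slope correspond to the diverging growth patterns the Results describe.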
Results
Autistic children showed elevated levels of mental health problems, but this difference was substantially reduced by accounting for IQ and sex differences between the autistic and TD samples. There were small differences in growth patterns; anxious-depressed problems were particularly elevated at preschool age and attention problems in late childhood. Higher family income predicted a lower initial level on all three dimensions, but a steeper increase in anxious-depressed problems. Higher IQ predicted a lower level of attention problems and a faster decline over childhood. Female sex predicted a higher level of anxious-depressed problems and a faster decline in behavioral problems. Social-affect autism symptom severity predicted an elevated level of attention problems. Autistic girls' problems were particularly elevated relative to their same-sex non-autistic peers.
Conclusions
Autistic children, and especially girls, show elevated mental health problems compared to TD children and there are some differences in predictors. Assessment of mental health should be integrated into clinical practice for autistic children.
Testing of asymptomatic patients for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) (ie, “asymptomatic screening”) to attempt to reduce the risk of nosocomial transmission has been extensive and resource intensive, and such testing is of unclear benefit when added to other layers of infection prevention mitigation controls. In addition, the logistic challenges and costs related to screening program implementation, data noting the lack of substantial aerosol generation with elective controlled intubation, extubation, and other procedures, and the adverse patient and facility consequences of asymptomatic screening call into question the utility of this infection prevention intervention. Consequently, the Society for Healthcare Epidemiology of America (SHEA) recommends against routine universal use of asymptomatic screening for SARS-CoV-2 in healthcare facilities. Specifically, preprocedure asymptomatic screening is unlikely to provide incremental benefit in preventing SARS-CoV-2 transmission in the procedural and perioperative environment when other infection prevention strategies are in place, and it should not be considered a requirement for all patients. Admission screening may be beneficial during times of increased virus transmission in some settings where other layers of controls are limited (eg, behavioral health, congregate care, or shared patient rooms), but widespread routine use of admission asymptomatic screening is not recommended over strengthening other infection prevention controls. In this commentary, we outline the challenges surrounding the use of asymptomatic screening, including logistics and costs of implementing a screening program, and adverse patient and facility consequences. We review data pertaining to the lack of substantial aerosol generation during elective controlled intubation, extubation, and other procedures, and we provide guidance for when asymptomatic screening for SARS-CoV-2 may be considered in a limited scope.
While unobscured and radio-quiet active galactic nuclei are regularly being found at redshifts $z > 6$, their obscured and radio-loud counterparts remain elusive. We build upon our successful pilot study, presenting a new sample of low-frequency-selected candidate high-redshift radio galaxies (HzRGs) over a sky area 20 times larger. We have refined our selection technique, in which we select sources with curved radio spectra between 72 and 231 MHz from the GaLactic and Extragalactic All-sky Murchison Widefield Array (GLEAM) survey. In combination with the requirements that our GLEAM-selected HzRG candidates have compact radio morphologies and be undetected in near-infrared $K_{\rm s}$-band imaging from the Visible and Infrared Survey Telescope for Astronomy Kilo-degree Infrared Galaxy (VIKING) survey, we find 51 new candidate HzRGs over a sky area of approximately $1200\ \mathrm{deg}^2$. Our sample also includes two sources from the pilot study: the second-most distant radio galaxy currently known, at $z=5.55$, with another source potentially at $z \sim 8$. We present our refined selection technique and analyse the properties of the sample. We model the broadband radio spectra between 74 MHz and 9 GHz by supplementing the GLEAM data with both publicly available data and new observations from the Australia Telescope Compact Array at 5.5 and 9 GHz. In addition, deep $K_{\rm s}$-band imaging from the High-Acuity Widefield K-band Imager (HAWK-I) on the Very Large Telescope and from the Southern Herschel Astrophysical Terahertz Large Area Survey Regions $K_{\rm s}$-band Survey (SHARKS) is presented for five sources. We discuss the prospects of finding very distant radio galaxies in our sample, potentially within the epoch of reionisation at $z \gtrsim 6.5$.
The Hierarchical Taxonomy of Psychopathology (HiTOP) has emerged out of the quantitative approach to psychiatric nosology. This approach identifies psychopathology constructs based on patterns of co-variation among signs and symptoms. The initial HiTOP model, which was published in 2017, is based on a large literature that spans decades of research. HiTOP is a living model that undergoes revision as new data become available. Here we discuss advantages and practical considerations of using this system in psychiatric practice and research. We especially highlight limitations of HiTOP and ongoing efforts to address them. We describe differences and similarities between HiTOP and existing diagnostic systems. Next, we review the types of evidence that informed development of HiTOP, including populations in which it has been studied and data on its validity. The paper also describes how HiTOP can facilitate research on genetic and environmental causes of psychopathology as well as the search for neurobiologic mechanisms and novel treatments. Furthermore, we consider implications for public health programs and prevention of mental disorders. We also review data on clinical utility and illustrate clinical application of HiTOP. Importantly, the model is based on measures and practices that are already used widely in clinical settings. HiTOP offers a way to organize and formalize these techniques. This model already can contribute to progress in psychiatry and complement traditional nosologies. Moreover, HiTOP seeks to facilitate research on linkages between phenotypes and biological processes, which may enable construction of a system that encompasses both biomarkers and precise clinical description.
We opened this volume with sobering stories of the dire global challenges before us. Indeed, one would not be hard pressed to find stories of the urgency of our various environmental and social crises. While we wrote this book, the COVID-19 pandemic raged, towns in the Arctic reached unprecedented temperatures, countless hectares of forests fell while fossil fuels continued to be violently extracted from the earth, and Black, Indigenous and people of colour continued to be exploited and oppressed. Yet, despite all this, or rather because of it, we wish to begin our conclusion with hope and determination. Drawing on Solnit (2016), we believe that there is a spaciousness in the uncertainties posed by the challenges before us in that they offer new possibilities for being, thinking and acting – for renewal and purposeful redirection in our trajectory – and it is through a reawakened awareness of our rich and dynamic relationships to place that we can find a better way forward.
The Hierarchical Taxonomy of Psychopathology (HiTOP) is a classification system that seeks to organize psychopathology using quantitative evidence – yet the current model was established by narrative review. This meta-analysis provides a quantitative synthesis of literature on transdiagnostic dimensions of psychopathology to evaluate the validity of the HiTOP framework.
Methods
Published studies estimating factor-analytic models from Diagnostic and Statistical Manual of Mental Disorders (DSM) diagnoses were screened. A total of 120,596 participants from 35 studies assessing 23 DSM diagnoses were included in the meta-analytic models. Data were pooled into a meta-analytic correlation matrix using a random effects model. Exploratory factor analyses were conducted using the pooled correlation matrix. A hierarchical structure was estimated by extracting one to five factors representing levels of the HiTOP framework, then calculating congruence coefficients between factors at sequential levels.
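The random-effects pooling step can be sketched for a single cell of the correlation matrix: each study's correlation is Fisher-z transformed, combined with DerSimonian-Laird random-effects weights, and back-transformed. The correlations and sample sizes below are hypothetical, not values from the included studies.

```python
import math

def pool_correlations(rs, ns):
    """DerSimonian-Laird random-effects pooling of Pearson correlations
    via the Fisher z transform. rs: per-study correlations; ns: sizes."""
    zs = [math.atanh(r) for r in rs]
    vs = [1.0 / (n - 3) for n in ns]           # large-sample variance of z
    w = [1.0 / v for v in vs]                  # fixed-effect weights
    z_fe = sum(wi * zi for wi, zi in zip(w, zs)) / sum(w)
    q = sum(wi * (zi - z_fe) ** 2 for wi, zi in zip(w, zs))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(rs) - 1)) / c)   # between-study variance
    w_re = [1.0 / (v + tau2) for v in vs]      # random-effects weights
    z_re = sum(wi * zi for wi, zi in zip(w_re, zs)) / sum(w_re)
    return math.tanh(z_re)                     # back-transform to r

# Hypothetical correlations between two DSM diagnoses across three studies
print(round(pool_correlations([0.40, 0.50, 0.35], [200, 500, 150]), 3))
```

Repeating this for every diagnosis pair yields the pooled correlation matrix on which the exploratory factor analyses are run.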
Results
Five transdiagnostic dimensions fit the DSM diagnoses well (comparative fit index = 0.92, root mean square error of approximation = 0.07, and standardized root-mean-square residual = 0.03). Most diagnoses had factor loadings >|0.30| on the expected factors, and congruence coefficients between factors indicated a hierarchical structure consistent with the HiTOP framework.
Conclusions
A model closely resembling the HiTOP framework fit the data well, and the placement of DSM diagnoses within transdiagnostic dimensions was largely confirmed, supporting it as a valid structure for conceptualizing and organizing psychopathology. Results also suggest transdiagnostic research should (1) use traits, narrow symptoms, and dimensional measures of psychopathology instead of DSM diagnoses, (2) assess a broader array of constructs, and (3) increase focus on understudied pathologies.
Financial literacy is a core life skill for participating in modern society. But how many of us have been educated about money: the importance of budgeting and saving for a rainy day, how bank accounts and debt work, and when it makes sense to save for a pension? Our brief research to date indicates a shockingly low level of financial literacy in the general population. And it does not look like this will get better soon; regarding improving financial literacy, the Financial Services Authority stated in 2003 that “Never has the need been so great or so urgent”. And yet many children will go through school without an hour spent studying financial literacy. Furthermore, efforts to improve financial literacy at older ages are either non-existent or piecemeal at best.
The consequences of poor financial literacy are especially damaging for vulnerable people. Vulnerable groups of people are most at risk of making poor financial decisions throughout their lives, which has negative consequences for saving, home ownership, debt levels, retirement and financial inclusion. In this paper, we consider various mechanisms to protect such financial customers, whilst recognising that improving financial literacy is not a silver bullet to improve customer outcomes from financial products.
Financial literacy cannot realistically be raised to the point where the public can understand many financial products without support and advice. But awareness of basic financial literacy principles can surely be raised, including the most important: knowing when to seek support and advice before making important financial decisions. The paper suggests some key principles for financial literacy and considers methods and tools to give the public access to much-needed support and advice.
Understanding place-based contributors to health requires geographically and culturally diverse study populations, but sharing location data is a significant challenge to multisite studies. Here, we describe a standardized and reproducible method to perform geospatial analyses for multisite studies. Using census tract-level information, we created software for geocoding and geospatial data linkage that was distributed to a consortium of birth cohorts located throughout the USA. Individual sites performed geospatial linkages and returned tract-level information for 8810 children to a central site for analyses. Our generalizable approach demonstrates the feasibility of geospatial analyses across study sites to promote collaborative translational research.
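The distributed-linkage design described above can be sketched in a few lines: each site geocodes its participants' addresses to census tracts locally, then returns only tract-level covariates to the central site, so raw addresses never leave the site. The tract identifiers and covariate values below are hypothetical placeholders; the consortium's actual geocoding software and tract boundary data are not shown.

```python
# Hypothetical tract-level exposure table (census tract FIPS -> covariates).
# In the real workflow this would come from linked geospatial datasets.
TRACT_DATA = {
    "25025010100": {"pm25": 8.1, "area_deprivation": 32},
    "25025010200": {"pm25": 7.4, "area_deprivation": 57},
}

def link_participant(tract_fips, tract_data=TRACT_DATA):
    """Return tract-level covariates for a participant's census tract.

    Only these linked covariates are sent to the central site for
    pooled analysis; the participant's address stays at the local site.
    """
    row = tract_data.get(tract_fips)
    if row is None:
        return None  # tract not covered by the linkage tables
    return {"tract": tract_fips, **row}

print(link_participant("25025010100"))
```

The privacy-preserving step is simply that the return value contains tract-level information only, which is what makes the approach workable across multiple sites with different data-sharing constraints.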
The SPARC tokamak is a critical next step towards commercial fusion energy. SPARC is designed as a high-field ($B_0 = 12.2$ T), compact ($R_0 = 1.85$ m, $a = 0.57$ m), superconducting, D-T tokamak with the goal of producing fusion gain $Q>2$ from a magnetically confined fusion plasma for the first time. Currently under design, SPARC will continue the high-field path of the Alcator series of tokamaks, utilizing new magnets based on rare earth barium copper oxide high-temperature superconductors to achieve high performance in a compact device. The goal of $Q>2$ is achievable with conservative physics assumptions ($H_{98,y2} = 0.7$) and, with the nominal assumption of $H_{98,y2} = 1$, SPARC is projected to attain $Q \approx 11$ and $P_{\textrm {fusion}} \approx 140$ MW. SPARC will therefore constitute a unique platform for burning plasma physics research with high density ($\langle n_{e} \rangle \approx 3 \times 10^{20}\ \textrm {m}^{-3}$), high temperature ($\langle T_e \rangle \approx 7$ keV) and high power density ($P_{\textrm {fusion}}/V_{\textrm {plasma}} \approx 7\ \textrm {MW}\,\textrm {m}^{-3}$) relevant to fusion power plants. SPARC's place in the path to commercial fusion energy, its parameters and the current status of SPARC design work are presented. This work also describes the basis for global performance projections and summarizes some of the physics analysis that is presented in greater detail in the companion articles of this collection.
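The quoted power density can be checked from the other numbers in the abstract with a simple elongated-torus volume estimate, $V \approx 2\pi^2 R_0 a^2 \kappa$. The elongation $\kappa \approx 1.75$ used below is an assumption (it is not stated in the abstract), chosen only to illustrate that the figures are mutually consistent.

```python
import math

# Back-of-envelope consistency check of P_fusion / V_plasma ~ 7 MW/m^3,
# using an elongated-torus volume V = 2 * pi^2 * R0 * a^2 * kappa.
R0 = 1.85        # major radius [m], from the abstract
a = 0.57         # minor radius [m], from the abstract
kappa = 1.75     # plasma elongation -- an assumed value, not given above
P_fusion = 140   # projected fusion power [MW], from the abstract

V = 2 * math.pi**2 * R0 * a**2 * kappa   # approximate plasma volume [m^3]
power_density = P_fusion / V             # [MW / m^3]
print(f"V ~ {V:.1f} m^3, P/V ~ {power_density:.1f} MW/m^3")
```

With these inputs the volume comes out near 21 m³ and the power density near 7 MW m⁻³, matching the abstract's figure.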
It is not clear to what extent associations between schizophrenia, cannabis use and cigarette use are due to a shared genetic etiology. We, therefore, examined whether schizophrenia genetic risk associates with longitudinal patterns of cigarette and cannabis use in adolescence and mediating pathways for any association to inform potential reduction strategies.
Methods
Associations between schizophrenia polygenic scores and longitudinal latent classes of cigarette and cannabis use from ages 14 to 19 years were investigated in up to 3925 individuals in the Avon Longitudinal Study of Parents and Children. Mediation models were estimated to assess the potential mediating effects of a range of cognitive, emotional, and behavioral phenotypes.
Results
The schizophrenia polygenic score, based on single nucleotide polymorphisms meeting a training-set p threshold of 0.05, was associated with late-onset cannabis use (OR = 1.23; 95% CI 1.08–1.41), but not with cigarette or early-onset cannabis use classes. This association was not mediated through lower IQ, victimization, emotional difficulties, antisocial behavior, impulsivity, or poorer social relationships during childhood. Sensitivity analyses adjusting for genetic liability to cannabis or cigarette use, using polygenic scores excluding the CHRNA5-A3-B4 gene cluster, or basing scores on a 0.5 training-set p threshold, provided results consistent with our main analyses.
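The thresholded polygenic score used above is, at its core, a weighted sum of risk-allele counts over SNPs whose training-set p-value passes the chosen cutoff. The sketch below illustrates only that construction; the SNP identifiers, effect sizes, p-values, and genotype counts are hypothetical, not values from this study.

```python
# Minimal sketch of a p-value-thresholded polygenic score:
# score = sum over included SNPs of (effect size * risk-allele count).
# All values below are hypothetical placeholders.
snps = [
    # (snp_id, effect_size (log odds ratio), training-set p-value)
    ("rs0001", 0.05, 0.001),
    ("rs0002", -0.03, 0.040),
    ("rs0003", 0.02, 0.300),  # excluded at p_threshold = 0.05
]

# One individual's risk-allele counts (0, 1, or 2 per SNP).
genotype = {"rs0001": 2, "rs0002": 1, "rs0003": 0}

def polygenic_score(genotype, snps, p_threshold=0.05):
    """Weighted sum of risk-allele counts over SNPs passing the threshold."""
    return sum(beta * genotype[snp_id]
               for snp_id, beta, p in snps if p <= p_threshold)

print(polygenic_score(genotype, snps))
```

Raising the threshold to 0.5 (as in the sensitivity analysis) simply admits more SNPs into the sum, trading specificity of the included variants for coverage.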
Conclusions
Our study provides evidence that genetic risk for schizophrenia is associated with patterns of cannabis use during adolescence. Investigation of pathways other than the cognitive, emotional, and behavioral phenotypes examined here is required to identify modifiable targets to reduce the public health burden of cannabis use in the population.