This chapter describes the value of using Contemporary Integrative Interpersonal Theory (CIIT) to understand the self and social impairments that define personality disorders as a group. CIIT’s major tenets are summarized, with a particular emphasis on elaborating how the self and self-functioning are an integral part of interpersonal experience and expression. A generic definition of adaptive interpersonal functioning is provided along with a demonstration of how CIIT can accommodate specific constructs and diagnoses using borderline personality disorder and narcissism as examples.
The Paragaricocrinidae is an enigmatic late Paleozoic family of camerate crinoids that retained a robustly constructed calyx more typical of Devonian to Early Mississippian crinoids. The discovery of the oldest member of this family, Tuscumbiacrinus madisonensis n. gen. n. sp., initiated a phylogenetic investigation of the Paragaricocrinidae and consideration of its diversification and paleobiogeographic distribution. Phylogenetic analyses demonstrate the need to describe Tuscumbiacrinus n. gen. and conduct revisions to preexisting taxa, resulting in the description of Palenciacrinus mudaensis n. gen. n. sp.; Pulcheracrinus n. gen.; Nipponicrinus hashimotoi n. gen. n. sp.; and Nipponicrinus akiyoshiensis n. gen. n. sp. Megaliocrinus exotericus Strimple is reassigned to Pulcheracrinus n. gen. In addition to its anachronistic morphology, the family is known from relatively few specimens through its ca. 76-million-year duration. This pattern is unlikely to have resulted from low fossil sampling alone, and instead likely reflects low abundance and/or taxonomic richness of a long-lived waning clade. From its apparent origination in Laurussia during the Mississippian, the Paragaricocrinidae diversified into a cosmopolitan clade. Following a diversity drop during the Pennsylvanian, the Paragaricocrinidae persisted but exemplified characteristics of a dead clade walking until its eventual extinction during the middle Permian (Wordian).
Leptospirosis in NZ has historically been associated with male workers in livestock industries; however, the disease epidemiology is changing. This study identified risk factors amid these shifts. Participants (95 cases:300 controls) were recruited nationwide between 22 July 2019 and 31 January 2022, and controls were frequency-matched by sex (90% male) and rurality (65% rural). Multivariable logistic regression models, adjusted for sex, rurality, age, and season—with one model additionally including occupational sector—identified risk factors including contact with dairy cattle (aOR 2.5; 95% CI: 1.0–6.0), activities with beef cattle (aOR 3.0; 95% CI: 1.1–8.2), cleaning urine/faeces from yard surfaces (aOR 3.9; 95% CI: 1.5–10.3), uncovered cuts/scratches (aOR 4.6; 95% CI: 1.9–11.7), evidence of rodents (aOR 2.2; 95% CI: 1.0–5.0), and work water supply from multiple sources—especially creeks/streams (aOR 7.8; 95% CI: 1.5–45.1) or roof-collected rainwater (aOR 6.6; 95% CI: 1.4–33.7). When adjusted for occupational sector, risk factors remained significant except for contact with dairy cattle, and slaughter without gloves emerged as a risk (aOR 3.3; 95% CI: 0.9–12.9). This study highlights novel behavioural factors, such as uncovered cuts and inconsistent glove use, alongside environmental risks from rodents and natural water sources.
The Society for Healthcare Epidemiology of America, the Association for Professionals in Infection Control and Epidemiology, the Infectious Diseases Society of America, and the Pediatric Infectious Diseases Society represent the core expertise regarding healthcare infection prevention and infectious diseases and have written a multisociety statement for healthcare facility leaders, regulatory agencies, payors, and patients to strengthen requirements and expectations around facility infection prevention and control (IPC) programs. Based on a systematic literature search and formal consensus process, the authors advocate raising the expectations for facility IPC programs, moving to effective programs that are:
• Foundational and influential parts of the facility’s operational structure
• Resourced with the correct expertise and leadership
• Prioritized to address all potential infectious harms
This document discusses the IPC program's leadership—a dyad model that includes both physician and infection preventionist leaders—its reporting structure, the expertise and competencies of its members, and the roles and accountability of partnering groups within the healthcare facility. The document outlines a process for identifying minimum IPC program medical director support. It applies to all types of healthcare settings except post-acute long-term care and focuses on resources for the IPC program. Long-term acute care hospital (LTACH) staffing and antimicrobial stewardship programs will be discussed in subsequent documents.
The articles compiled here offer examples of how the impacts of anthropogenic climate change in coastal settings are monitored and measured, how the broader public can be involved in these efforts, and how planning for mitigation can come about. The case studies are drawn from the southeastern United States and the British Isles, and they indicate the great potential that cooperating communities of practice can offer for addressing climate-change impacts on cultural heritage.
[I]t was natural once the conflict with Britain reached the stage where independence was the only real alternative to submission that the men of the Revolution should turn to constitution making.
Interval estimates of the Pearson, Kendall tau-a, and Spearman correlations are reviewed, and an improved standard error for the Spearman correlation is proposed. The sample size required to yield a confidence interval having the desired width is examined. A two-stage approximation to the sample size requirement is shown to give accurate results.
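To make the interval construction concrete, here is a minimal sketch of a 95% confidence interval for the Spearman correlation built on the Fisher z-scale; the variance-inflated standard error sqrt((1 + r²/2)/(n − 3)) is one published adjustment to the classical 1/sqrt(n − 3), and the function name and example values are illustrative rather than the paper's exact procedure:

```python
import math

def spearman_ci95(rs, n):
    """Approximate 95% CI for a Spearman correlation rs computed from n pairs.

    Works on the Fisher z-scale with the adjusted standard error
    sqrt((1 + rs^2/2) / (n - 3)) rather than the classical 1/sqrt(n - 3).
    """
    z = 0.5 * math.log((1 + rs) / (1 - rs))       # Fisher z-transform
    se = math.sqrt((1 + rs ** 2 / 2) / (n - 3))   # adjusted standard error
    zcrit = 1.959963984540054                     # 97.5th percentile of N(0, 1)
    lo, hi = z - zcrit * se, z + zcrit * se
    return math.tanh(lo), math.tanh(hi)           # back-transform to r-scale

lo, hi = spearman_ci95(0.5, 50)   # roughly (0.24, 0.69)
```

The interval is asymmetric around the point estimate, reflecting the bounded correlation scale; a two-stage sample-size approach of the kind reviewed here would then choose n so that the interval width meets the target.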
Depression is the leading cause of disability worldwide(1). The microbiota-gut-brain axis may play a role in the aetiology of depression, and probiotics show promise for improving mood and depressive state(2). Further evidence is required to support proposed mechanisms and to establish efficacy in high-risk populations, such as those with sub-threshold depression (which may be 2-3 times more prevalent than diagnosed depression)(3). The aims were to assess the efficacy of a probiotic compared with placebo in reducing the severity of depressive symptoms in participants with subthreshold depression, and to investigate potential mechanistic markers of inflammatory status, antioxidant status, and stress response. A double-blind, randomised, placebo-controlled trial was conducted in participants meeting the diagnosis of subthreshold depression (DSM-5); aged 18-65 years; body mass index ≥18.5 kg/m²; and not taking antidepressants, centrally acting medications, probiotics, or antibiotics for at least 6 weeks. The probiotic (4 × 10⁹ AFU/CFU, 2.5 g freeze-dried powder containing Lactobacillus fermentum LF16 (DSM26956), L. rhamnosus LR06 (DSM21981), L. plantarum LP01 (LMG P-21021), and Bifidobacterium longum BL04 (DSM 23233)) or placebo was taken daily for 3 months. Data were collected at 3 study visits (pre-, mid- (6 weeks), and post-intervention). Self-reported questionnaires measured psychological symptoms (Beck Depression Inventory, BDI; Hospital Anxiety and Depression Scale, HADS) and quality of life. Blood and salivary samples were collected for biomarkers including the cortisol awakening response (CAR). General linear models examined within-group and between-group differences across all time points. Thirty-nine participants completed the study (n = 19 probiotic; n = 20 placebo); analyses followed the intention-to-treat principle. The BDI score in the probiotic group changed by −6.5 (95% CI −12.3; −0.7) and −7.6 (95% CI −13.4; −1.8) at 6 and 12 weeks, respectively.
The HADS-A score changed in the probiotic group by −2.8 (95% CI −5.2; −0.4) and −2.7 (95% CI −5.1; −0.3) at 6 and 12 weeks, respectively. The HADS-D score changed in the probiotic group by −3.0 (95% CI −5.4; −0.7) and −2.5 (95% CI −4.9; −0.2) at 6 and 12 weeks of intervention, respectively. No between-group differences were found. There were no changes in perceived stress or quality of life scores. The probiotic group had reduced hs-CRP levels (7286.2 ± 1205.8 ng/dL vs. 5976.4 ± 1408.3; P = 0.003) and increased total glutathione (14.2 ± 8.9 ng/dL vs. 9.3 ± 4.7; P = 0.049) compared to placebo post-intervention. Lower levels of CAR were found in the probiotic group compared to placebo (−0.04 ± 0.17 μg/dL vs. 0.16 ± 0.25; P = 0.009). A significant reduction in depressive symptoms and anxiety was observed within the probiotic group only. These results were supported by improvements observed in biomarkers, suggesting probiotics may improve psychological wellbeing in adults experiencing sub-threshold depression via potential pathways involved in central nervous system homeostasis and inflammation. Future analyses are required to understand changes within the intestinal microbiota and to clarify how their metabolites facilitate emotional processing.
Identifying patients at imminent risk of death is critical in the management of trauma patients. This study measures the vital sign thresholds associated with death among trauma patients.
Methods:
This study included data from patients ≥15 years of age in the American College of Surgeons Trauma Quality Improvement Program (TQIP) database. Patients with vital signs of zero were excluded. Documented prehospital and emergency department (ED) vital signs included systolic blood pressure, heart rate, respiratory rate, and calculated shock index (SI). The area under the receiver operating characteristic curve (AUROC) was used to assess the accuracy of these variables for predicting 24-hour survival. Optimal thresholds to predict mortality were identified using Youden's Index, 90% specificity, and 90% sensitivity. Additional analyses examined patients 70+ years of age.
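The Youden's Index step can be sketched as follows; the function below is a toy illustration (the names, and the convention that values below the threshold predict death, as with systolic pressure, are assumptions for the sketch, not the study's code):

```python
def youden_threshold(values, died):
    """Return the cut-point maximising Youden's J = sensitivity + specificity - 1.

    Assumes lower vital-sign values (e.g. systolic pressure) predict death.
    """
    best_j, best_t = -1.0, None
    for t in sorted(set(values)):                 # each observed value is a candidate
        tp = sum(1 for v, d in zip(values, died) if d and v < t)
        fn = sum(1 for v, d in zip(values, died) if d and v >= t)
        fp = sum(1 for v, d in zip(values, died) if not d and v < t)
        tn = sum(1 for v, d in zip(values, died) if not d and v >= t)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        if sens + spec - 1 > best_j:
            best_j, best_t = sens + spec - 1, t
    return best_t, best_j

# toy data: three deaths at low systolic pressures, three survivors at higher ones
t, j = youden_threshold([80, 85, 90, 120, 125, 130], [1, 1, 1, 0, 0, 0])
```

The 90%-specificity and 90%-sensitivity thresholds mentioned above would be found by the same scan, fixing one operating characteristic instead of maximising J.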
Results:
There were 1,439,221 subjects in the 2019-2020 datasets who met inclusion criteria for this analysis, of whom 10,270 (0.7%) died within 24 hours. The optimal threshold for prehospital systolic pressure was 110, pulse rate was 110, SI was 0.9, and respiratory rate was 15. The optimal threshold for the ED systolic was 112, pulse rate was 107, SI was 0.9, and respiratory rate was 21. In the elderly sub-analysis, the optimal threshold for prehospital systolic was 116, pulse rate was 100, SI was 0.8, and respiratory rate was 21. The optimal threshold for ED systolic was 121, pulse rate was 95, SI was 0.8, and respiratory rate was 21.
Conclusions:
Systolic blood pressure (SBP) and SI were the best predictors of mortality among trauma patients. The SBP values predictive of mortality were significantly higher than the traditional 90 mm Hg threshold. These findings highlight the need for better methods to guide resuscitation, as initial vital signs have limited accuracy in predicting subsequent mortality.
Understanding the effect of phosphorus (P) fertilization on weed interference with sweet corn is important for deciding appropriate fertilization levels and weed control programs. Field experiments were conducted in 2020 and 2021 in Belle Glade, FL, to determine the influence of P fertilization levels (0 or residual P, 62.5, and 120 kg P2O5 ha−1) on the critical period of weed control (CPWC) in sweet corn on organic soils. Experimental plots were subjected to increasing durations of weed interference and weed-free period treatments for each P fertilization level. The beginning and end of the CPWC based on 5% and 10% acceptable yield loss (AYL) levels were determined by fitting log-logistic and Gompertz models to represent the increasing duration of weed interference and duration of the weed-free period, respectively. The log-logistic curves did not estimate the beginning of the CPWC at 5% AYL for 0 and 125 kg P2O5 ha−1 because the estimated upper limits of the curves were lower than the 95% relative yield used for estimation of 5% AYL. Based on a 10% AYL level, the length of the CPWC in sweet corn under optimum P fertilization levels was estimated to be 27 d, from the 6- to 7-leaf stage until the silking stage of growth. Reducing P fertilization by 50% increased the CPWC to 36 d, from the 5-leaf stage until the silking to blister stage of growth. Lack of P fertilization increased the CPWC to 64 d, from sweet corn emergence until the blister to milk stage of growth. These results show that the beginning of the CPWC in sweet corn is delayed and the end is shortened as P fertilization level increases. Therefore, a reduction in P fertilization will require a more intensive weed management program for sweet corn because of the prolonged duration of the CPWC.
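To illustrate how a fitted Gompertz weed-free curve yields the end of the CPWC, the sketch below inverts y = a·exp(−b·e^(−ct)) at the 90% relative-yield threshold (10% AYL); the parameter values are purely hypothetical, not the fitted values from these experiments:

```python
import math

def gompertz_yield(t, a, b, c):
    """Relative yield (%) after a weed-free period of t days (Gompertz model)."""
    return a * math.exp(-b * math.exp(-c * t))

def cpwc_end(threshold, a, b, c):
    """Solve a*exp(-b*exp(-c*t)) = threshold for t: the end of the CPWC."""
    return -math.log(math.log(a / threshold) / b) / c

# hypothetical parameters: upper limit a = 100% relative yield
a, b, c = 100.0, 2.0, 0.08
t_end = cpwc_end(90.0, a, b, c)   # weed-free days needed to cap yield loss at 10%
```

The analogous inversion of the log-logistic interference curve gives the beginning of the CPWC; this also shows why the estimation fails when a fitted upper limit a drops below the yield threshold, since log(a/threshold) is then undefined on the required branch, as the abstract reports for the 5% AYL curves.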
To assess the utility of the Mini Mental State Exam (MMSE) and Montreal Cognitive Assessment (MoCA) for tracking cognitive changes in Huntington's Disease (HD).
Participants and Methods:
Currently, the most frequently used brief assessment of global cognitive functioning is the MMSE. Although the MMSE is helpful for distinguishing individuals without significant cognitive impairment from those with dementia, it is not particularly sensitive to more subtle cognitive deficits. The MoCA is another brief cognitive screening tool that has been shown to be more sensitive to mild impairment and may have greater usefulness in subcortical dementias because of its more extensive assessment of executive function. Although the MoCA appears to have high sensitivity and specificity in a variety of neurological populations, little is currently known about its efficacy in tracking cognitive decline in individuals with HD. We used a mixed effects model to analyze MMSE and MoCA scores collected prospectively during 5 years of follow-up for 163 patients with HD seen at one academic HDSA Center of Excellence. Baseline mean age for the HD cohort was 51.35 years, mean education was 14.46 years, and mean CAG repeat length was 43.95. Mean follow-up time was 3.33 years.
Results:
Mean MMSE and MoCA scores at baseline were 25.13 (SD=1.66) and 22.76 (SD=3.70), respectively. At baseline, age and gender were not associated with MMSE and MoCA scores, while years of education were. Neither age nor gender predicted rate of decline for the MoCA, while years of education predicted rate of decline for the MMSE. For the MMSE, each year of education predicted on average a 0.51-point higher score at enrollment; for the MoCA, each year of education predicted on average a 0.79-point higher score at enrollment. The mean rate of decline on the MMSE was 0.48 points per year (p<.001), while that on the MoCA was only 0.31 points annually (p<.001) in the first five years of observation.
Conclusions:
The MMSE and MoCA decline significantly over time in an unselected HD population. The smaller rate of decline on the MoCA may be due, in part, to the greater variability in baseline MoCA (SD=3.70) vs. MMSE (SD=1.66) scores in our HD cohort. Unlike cortical dementias, such as Alzheimer's disease (AD), where declines of 2-3 points per year have been described for the MMSE and MoCA, much lower annual rates of decline have been reported in subcortical dementias such as Parkinson's disease. To our knowledge, this is the first report of the rate of cognitive decline on the MMSE and MoCA in HD: such information is vital for adequately preparing patients and families for future needs, in addition to planning for interventional/treatment trials in HD.
Feeding difficulties after congenital heart surgery are a common concern for caregivers of children with CHD. Insight into the intricacies of their experience is lacking. With a better understanding, healthcare providers can continue to optimize the approach and support mechanisms for these families. This study explores the psychosocial impacts on caregivers, defines barriers to care, and identifies areas to improve their care.
Study Design:
This mixed-methods study combined semi-structured interviews with surveys. Purposive sampling targeted caregivers of a child who underwent heart surgery and was discharged with alternative enteral feeding access. A hybrid inductive-deductive methodology was used to analyse interview transcripts. Survey scores were compared to interview content for concordance.
Results:
Fifteen interviews were conducted with socio-demographically diverse caregivers. Feeding difficulties were often identified as their greatest challenge, with the laborious feeding schedule, sleep deprivation, and tube management being common contributors. Most caregivers described feeling overwhelmed and worried. Time-intensive feeding schedules and lack of appropriate childcare options precluded caregivers’ ability to work. Barriers to care included imperfect feeding education, proximity of specialist clinics, and issues with medical supply companies. Caregiver proposals for improved care addressed easing the transition home, improving emotional support mechanisms, and intensifying feeding therapy for expedited tube removal.
Conclusion:
This study describes the psychosocial toll on the caregiver, typical barriers to care, and ideas for improved provision of care. These themes and ideas can be used to advance the family-centered approach to feeding difficulties after heart surgery.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. 
Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
This study investigates the dose escalation to dominant intra-prostatic lesions (DILs) that is achievable using single-source-strength (SSS) and dual-source-strength (DSS) low-dose-rate (LDR) prostate brachytherapy and a sector-based plan approach.
Methods:
Twenty patients were retrospectively analysed. Image registration and planning were undertaken using VariSeed v9·0. SSS and DSS boost plans were produced and compared to clinical plans. Dosimetric robustness to seed displacement for SSS and DSS plans was compared to clinical plans using Monte Carlo simulations.
Results:
Fourteen out of 20 patients had DIL identifiable on magnetic resonance imaging. Median increase in sector D90 of 27% (p < 0·0001) and sector V150 of 31% (p < 0·0001) was achieved with SSS planning without exceeding local rectum and urethra dose constraints. DSS plans achieved dose distributions not statistically significantly different from the SSS plans with a median of eight fewer seeds and two fewer needles. SSS and DSS plan sensitivity to random seed displacement was similar to the clinical plans.
Conclusions:
Treatment planning using VariSeed to produce SSS and DSS focal boost plans is feasible for LDR prostate brachytherapy to achieve a median escalation in sector D90 of 27% without exceeding local urethral and rectal constraints. SSS and DSS plan dosimetric robustness was similar to clinical plan dosimetric robustness.
Atrazine and S-metolachlor are the herbicides most relied on by growers to control weeds in sweet corn crops grown in the Everglades Agricultural Area (EAA) in southern Florida. Alternative weed management programs are needed. Field experiments were conducted in 2021 and 2022 to evaluate the efficacy of 1) pyroxasulfone (183 and 237 g ha−1) alone or as a premix with carfentrazone-ethyl (13 and 17 g ha−1) or fluthiacet-methyl (6 and 7 g ha−1), and S-metolachlor (1,790 g ha−1) alone or in combination with atrazine (3,360 g ha−1), applied preemergence (PRE); 2) mesotrione (105 g ha−1), topramezone (25 g ha−1), and tembotrione (92 g ha−1) applied postemergence (POST) alone or in combination with atrazine (560 and 2,240 g ha−1) or bentazon (1,120 g ha−1); and 3) mechanical cultivation alone at the fourth and the fourth followed by the sixth leaf stages of sweet corn. PRE-applied herbicides did not provide acceptable control of fall panicum, common lambsquarters, or common purslane, probably due to a lack of incorporation into the soil because of limited rainfall. POST-applied topramezone alone or in combination with atrazine or bentazon resulted in effective fall panicum control (>91%). Topramezone alone provided 83% and 88% control of common lambsquarters and common purslane, respectively, whereas atrazine added to topramezone resulted in >94% control of both weed species. Mesotrione and tembotrione plus atrazine provided excellent control (>93%) of both broadleaf weed species but poor fall panicum control (<72%). Mechanical cultivation alone did not effectively control any weeds. Overall, treatments that contained topramezone resulted in greater sweet corn yield. These results show that combinations of topramezone, mesotrione, and tembotrione with atrazine resulted in improved broadleaf weed control.
Fall panicum control was improved only with the combination of topramezone with atrazine, showing that atrazine is an important mixture component of these herbicides to provide effective POST weed control in sweet corn on organic soils of the EAA.
Studies have reported mixed findings regarding the impact of the coronavirus disease 2019 (COVID-19) pandemic on pregnant women and birth outcomes. This study used a quasi-experimental design to account for potential confounding by sociodemographic characteristics.
Methods
Data were drawn from 16 prenatal cohorts participating in the Environmental influences on Child Health Outcomes (ECHO) program. Women exposed to the pandemic (delivered between 12 March 2020 and 30 May 2021) (n = 501) were propensity-score matched on maternal age, race and ethnicity, and child sex assigned at birth with 501 women who delivered before 11 March 2020. Participants reported on perceived stress, depressive symptoms, sedentary behavior, and emotional support during pregnancy. Infant gestational age (GA) at birth and birthweight were gathered from medical record abstraction or maternal report.
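As a minimal sketch of the matching step, the function below performs greedy 1:1 nearest-neighbour matching on the propensity score without replacement; the names and scores are illustrative, and the ECHO analysis may have used a different matching algorithm:

```python
def match_controls(treated_scores, control_scores):
    """Greedy 1:1 nearest-neighbour propensity-score matching without replacement.

    Assumes at least as many controls as treated participants.
    Returns {treated_index: matched_control_index}.
    """
    available = list(enumerate(control_scores))   # unmatched (index, score) pairs
    matches = {}
    for i, ps in enumerate(treated_scores):
        best = min(available, key=lambda pair: abs(pair[1] - ps))
        matches[i] = best[0]
        available.remove(best)                    # each control used at most once
    return matches

# toy propensity scores for two exposed women and three potential controls
pairs = match_controls([0.2, 0.8], [0.75, 0.25, 0.5])
```

Matching without replacement, as here, keeps the two groups the same size (501:501 in the study); production analyses typically also add a caliper on the score distance so that poor matches are discarded rather than forced.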
Results
After adjusting for propensity matching and covariates (maternal education, public assistance, employment status, prepregnancy body mass index), results showed a small effect of pandemic exposure on shorter GA at birth, but no effect on birthweight adjusted for GA. Women who were pregnant during the pandemic reported higher levels of prenatal stress and depressive symptoms, but neither mediated the association between pandemic exposure and GA. Sedentary behavior and emotional support were each associated with prenatal stress and depressive symptoms in opposite directions, but no moderation effects were revealed.
Conclusions
There was no strong evidence for an association between pandemic exposure and adverse birth outcomes. Furthermore, results highlight the importance of reducing maternal sedentary behavior and encouraging emotional support for optimizing maternal health regardless of pandemic conditions.
While unobscured and radio-quiet active galactic nuclei are regularly being found at redshifts $z > 6$, their obscured and radio-loud counterparts remain elusive. We build upon our successful pilot study, presenting a new sample of low-frequency-selected candidate high-redshift radio galaxies (HzRGs) over a sky area 20 times larger. We have refined our selection technique, in which we select sources with curved radio spectra between 72–231 MHz from the GaLactic and Extragalactic All-sky Murchison Widefield Array (GLEAM) survey. In combination with the requirements that our GLEAM-selected HzRG candidates have compact radio morphologies and be undetected in near-infrared $K_{\rm s}$-band imaging from the Visible and Infrared Survey Telescope for Astronomy Kilo-degree Infrared Galaxy (VIKING) survey, we find 51 new candidate HzRGs over a sky area of approximately $1200\ \mathrm{deg}^2$. Our sample also includes two sources from the pilot study: the second-most distant radio galaxy currently known, at $z=5.55$, with another source potentially at $z \sim 8$. We present our refined selection technique and analyse the properties of the sample. We model the broadband radio spectra between 74 MHz and 9 GHz by supplementing the GLEAM data with both publicly available data and new observations from the Australia Telescope Compact Array at 5.5 and 9 GHz. In addition, deep $K_{\rm s}$-band imaging from the High-Acuity Widefield K-band Imager (HAWK-I) on the Very Large Telescope and from the Southern Herschel Astrophysical Terahertz Large Area Survey Regions $K_{\rm s}$-band Survey (SHARKS) is presented for five sources. We discuss the prospects of finding very distant radio galaxies in our sample, potentially within the epoch of reionisation at $z \gtrsim 6.5$.
The hippocampus is a complex brain structure with key roles in cognitive and emotional processing and with subregion abnormalities associated with a range of disorders and psychopathologies. Here we combine data from two large independent young adult twin/sibling cohorts to obtain the most accurate estimates to date of genetic covariation between hippocampal subfield volumes and the hippocampus as a single volume. The combined sample included 2148 individuals, comprising 1073 individuals from 627 families (mean age = 22.3 years) from the Queensland Twin IMaging (QTIM) Study, and 1075 individuals from 454 families (mean age = 28.8 years) from the Human Connectome Project (HCP). Hippocampal subfields were segmented using FreeSurfer version 6.0 (CA4 and dentate gyrus were phenotypically and genetically indistinguishable and were summed to a single volume). Multivariate twin modeling was conducted in OpenMx to decompose variance into genetic and environmental sources. Bivariate analyses of hippocampal formation and each subfield volume showed that 10%–72% of subfield genetic variance was independent of the hippocampal formation, with greatest specificity found for the smaller volumes; for example, CA2/3 with 42% of genetic variance being independent of the hippocampus; fissure (63%); fimbria (72%); hippocampus-amygdala transition area (41%); parasubiculum (62%). In terms of genetic influence, whole hippocampal volume is a good proxy for the largest hippocampal subfields, but a poor substitute for the smaller subfields. Additive genetic sources accounted for 49%–77% of total variance for each of the subfields in the combined sample multivariate analysis. In addition, the multivariate analyses were sufficiently powered to identify common environmental influences (replicated in QTIM and HCP for the molecular layer and CA4/dentate gyrus, and accounting for 7%–16% of total variance for 8 of 10 subfields in the combined sample). 
This provides the clearest indication yet from a twin study that factors such as home environment may influence hippocampal volumes (albeit with caveats).
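The logic of decomposing variance into genetic and environmental sources can be illustrated with the classical Falconer approximation from monozygotic (MZ) and dizygotic (DZ) twin correlations; the study itself fitted full multivariate ACE models in OpenMx, and the correlations below are made-up numbers:

```python
def falconer_ace(r_mz, r_dz):
    """Classical ACE variance decomposition from twin correlations.

    A (additive genetic), C (common environment), and E (unique environment)
    proportions are estimated as a2 = 2(rMZ - rDZ), c2 = 2*rDZ - rMZ,
    and e2 = 1 - rMZ, which sum to 1 by construction.
    """
    a2 = 2 * (r_mz - r_dz)
    c2 = 2 * r_dz - r_mz
    e2 = 1 - r_mz
    return a2, c2, e2

# hypothetical twin correlations for a subfield volume
a2, c2, e2 = falconer_ace(0.6, 0.4)
```

Structural-equation modeling generalises this arithmetic: it fits A, C, and E paths to the full MZ/DZ covariance structure (and, in the bivariate case, splits genetic variance into shared and subfield-specific components, as in the percentages reported above).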
The Hierarchical Taxonomy of Psychopathology (HiTOP) has emerged out of the quantitative approach to psychiatric nosology. This approach identifies psychopathology constructs based on patterns of co-variation among signs and symptoms. The initial HiTOP model, which was published in 2017, is based on a large literature that spans decades of research. HiTOP is a living model that undergoes revision as new data become available. Here we discuss advantages and practical considerations of using this system in psychiatric practice and research. We especially highlight limitations of HiTOP and ongoing efforts to address them. We describe differences and similarities between HiTOP and existing diagnostic systems. Next, we review the types of evidence that informed development of HiTOP, including populations in which it has been studied and data on its validity. The paper also describes how HiTOP can facilitate research on genetic and environmental causes of psychopathology as well as the search for neurobiologic mechanisms and novel treatments. Furthermore, we consider implications for public health programs and prevention of mental disorders. We also review data on clinical utility and illustrate clinical application of HiTOP. Importantly, the model is based on measures and practices that are already used widely in clinical settings. HiTOP offers a way to organize and formalize these techniques. This model already can contribute to progress in psychiatry and complement traditional nosologies. Moreover, HiTOP seeks to facilitate research on linkages between phenotypes and biological processes, which may enable construction of a system that encompasses both biomarkers and precise clinical description.