The First Large Absorption Survey in H I (FLASH) is a large-area radio survey for neutral hydrogen in and around galaxies in the intermediate redshift range 0.4 < z < 1.0, using the 21-cm H I absorption line as a probe of cold neutral gas. The survey uses the ASKAP radio telescope and will cover 24,000 deg² of sky over the next five years. FLASH breaks new ground in two ways: it is the first large H I absorption survey to be carried out without any optical preselection of targets, and we use an automated Bayesian line-finding tool to search through large datasets and assign a statistical significance to potential line detections. Two Pilot Surveys, covering around 3000 deg² of sky, were carried out in 2019–22 to test and verify the strategy for the full FLASH survey. The processed data products from these Pilot Surveys (spectral-line cubes, continuum images, and catalogues) are public and available online. In this paper, we describe the FLASH spectral-line and continuum data products and discuss the quality of the H I spectra and the completeness of our automated line search. Finally, we present a set of 30 new H I absorption lines that were robustly detected in the Pilot Surveys, almost doubling the number of known H I absorption systems at 0.4 < z < 1.0. The detected lines span a wide range in H I optical depth, including three lines with a peak optical depth τ > 1, and appear to be a mixture of intervening and associated systems. Interestingly, around two-thirds of the lines found in this untargeted sample are detected against sources with a peaked-spectrum radio continuum, which are only a minor (5–20%) fraction of the overall radio-source population. The detection rate for H I absorption lines in the Pilot Surveys (0.3 to 0.5 lines per 40 deg² ASKAP field) is a factor of two below the expected value. One possible reason for this is the presence of a range of spectral-line artefacts in the Pilot Survey data that have now been mitigated and are not expected to recur in the full FLASH survey. A future paper in this series will discuss the host galaxies of the H I absorption systems identified here.
Diagnosis in psychiatry faces familiar challenges. Validity and utility remain elusive, and confusion regarding the fluid and arbitrary border between mental health and illness is increasing. The mainstream strategy has been conservative and iterative, retaining current nosology until something better emerges. However, this has led to stagnation. New conceptual frameworks are urgently required to catalyze a genuine paradigm shift.
Methods
We outline candidate strategies that could pave the way for such a paradigm shift. These include the Research Domain Criteria (RDoC), the Hierarchical Taxonomy of Psychopathology (HiTOP), and Clinical Staging, which all promote a blend of dimensional and categorical approaches.
Results
These alternative, still-heuristic transdiagnostic models provide varying levels of clinical and research utility. RDoC was intended to provide a framework to reorient research beyond the constraints of DSM. HiTOP began as a nosology derived from statistical methods and is now pursuing clinical utility. Clinical Staging aims both to expand the scope and to refine the utility of diagnosis by including the dimension of timing. None is yet fit for purpose. Yet they are relatively complementary, and it may be possible for them to operate as an ecosystem. Time will tell whether they have the capacity, singly or jointly, to deliver a paradigm shift.
Conclusions
Several heuristic models have been developed that separately or synergistically build infrastructure to enable new transdiagnostic research to define the structure, development, and mechanisms of mental disorders, to guide treatment and better meet the needs of patients, policymakers, and society.
The performance of, and confidence in, fault detection and diagnostic systems can be undermined by data pipelines that feature multiple compounding sources of uncertainty. These issues further inhibit the deployment of data-based analytics in industry, where variable data quality and lack of confidence in model outputs are already barriers to adoption. The methodology proposed in this paper supports trustworthy data pipeline design by transferring knowledge gained from one fully observed data pipeline to a similar, under-observed case. The transfer of uncertainties provides insight into uncertainty drivers without repeating the computational or cost overhead of fully redesigning the pipeline. A SHAP-based, human-readable explainable AI (XAI) framework was used to rank and explain the impact of each choice in a data pipeline, allowing positive and negative performance drivers to be decoupled and highly performing pipelines to be selected. This empirical approach is demonstrated in bearing fault classification case studies using well-understood open-source data.
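The idea of ranking each pipeline design choice by its average marginal contribution can be sketched with exact Shapley values over a toy scoring function. Everything below — the choice names, the gains, and the scoring function — is hypothetical and is not the paper's pipeline, data, or code:

```python
# Illustrative sketch only: exact Shapley values used to rank the impact of
# each design choice in a data pipeline. All names and numbers are hypothetical.
from itertools import combinations
from math import factorial

CHOICES = ["denoise", "resample", "feature_select"]

def score(enabled):
    """Toy pipeline accuracy as a function of the set of enabled choices."""
    gains = {"denoise": 0.10, "resample": 0.02, "feature_select": 0.15}
    acc = 0.60 + sum(gains[c] for c in enabled)
    if "denoise" in enabled and "feature_select" in enabled:
        acc += 0.05  # interaction: feature selection works better on clean data
    return acc

def shapley(choice):
    """Average marginal contribution of `choice` over all coalitions of choices."""
    others = [c for c in CHOICES if c != choice]
    n, value = len(CHOICES), 0.0
    for k in range(len(others) + 1):
        for subset in combinations(others, k):
            weight = factorial(k) * factorial(n - k - 1) / factorial(n)
            value += weight * (score(set(subset) | {choice}) - score(set(subset)))
    return value

ranking = sorted(CHOICES, key=shapley, reverse=True)
print(ranking)  # → ['feature_select', 'denoise', 'resample']
```

Shapley values are efficient — they sum exactly to score(all choices) − score(no choices) — which is what allows positive and negative performance drivers to be decoupled cleanly. (A SHAP-based framework, as in the paper, approximates these values for trained models; the exact enumeration above is only tractable for a handful of choices.)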
Observations of radiocarbon (14C) in Earth’s atmosphere and other carbon reservoirs are important to quantify exchanges of CO2 between reservoirs. The amount of 14C is commonly reported in the so-called Delta notation, i.e., Δ14C, the decay- and fractionation-corrected departure of the ratio of 14C to total C from that ratio in an absolute international standard; this Delta notation permits direct comparison of 14C/C ratios in the several reservoirs. However, as Δ14C of atmospheric CO2, Δ14CO2, is based on the ratio of 14CO2 to total atmospheric CO2, its value can and does change not just because of change in the amount of atmospheric 14CO2 but also because of change in the amount of total atmospheric CO2, complicating ascription of change in Δ14CO2 to change in one or the other quantity. Here we suggest that presentation of atmospheric 14CO2 amount as a mole fraction relative to dry air (moles of 14CO2 per mole of dry air in Earth’s atmosphere), or as moles or molecules of 14CO2 in Earth’s atmosphere, all readily calculated from Δ14CO2 and the amount of atmospheric CO2 (with slight dependence on δ13CO2), complements presentation only as Δ14CO2, and can provide valuable insight into the evolving budget and distribution of atmospheric 14CO2.
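The conversion being advocated can be written out explicitly. The following is a sketch using the conventional Stuiver–Polach Delta definition, with Δ14C expressed in per mil; the standard-ratio value is approximate, and the δ13C fractionation correction is omitted:

```latex
% 14C/C ratio recovered from the Delta notation (Delta^{14}C in per mil);
% R_std \approx 1.176 \times 10^{-12} is the approximate absolute 14C/C
% ratio of the standard. n_air is the total moles of dry air.
\frac{^{14}\mathrm{C}}{\mathrm{C}}
  = R_{\mathrm{std}}\left(1 + \frac{\Delta^{14}\mathrm{C}}{1000}\right),
\qquad
x_{^{14}\mathrm{CO_2}} = x_{\mathrm{CO_2}}\,\frac{^{14}\mathrm{C}}{\mathrm{C}},
\qquad
n_{^{14}\mathrm{CO_2}} = x_{^{14}\mathrm{CO_2}}\, n_{\mathrm{air}}
```

For example, Δ14C = 0 per mil and an illustrative CO2 mole fraction of 420 ppm would give a 14CO2 mole fraction of roughly 5 × 10⁻¹⁶ mol per mol of dry air.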
Loss of control eating is more likely to occur in the evening and is uniquely associated with distress. No studies have examined the effect of treatment on within-day timing of loss of control eating severity. We examined whether time of day differentially predicted loss of control eating severity at baseline (i.e. pretreatment), end-of-treatment, and 6-month follow-up for individuals with binge-eating disorder (BED), hypothesizing that loss of control eating severity would increase throughout the day pretreatment and that this pattern would be less pronounced following treatment. We explored differential treatment effects of cognitive-behavioral guided self-help (CBTgsh) and Integrative Cognitive-Affective Therapy (ICAT).
Methods
Individuals with BED (N = 112) were randomized to receive CBTgsh or ICAT and completed a 1-week ecological momentary assessment protocol at baseline, end-of-treatment, and 6-month follow-up to assess loss of control eating severity. We used multilevel models to assess within-day slope trajectories of loss of control eating severity across assessment periods and treatment type.
Results
Within-day increases in loss of control eating severity were reduced at end-of-treatment and 6-month follow-up relative to baseline. Evening acceleration of loss of control eating severity was greater at 6-month follow-up relative to end-of-treatment. Within-day increases in loss of control severity did not differ between treatments at end-of-treatment; however, evening loss of control severity intensified for individuals who received CBTgsh relative to those who received ICAT at 6-month follow-up.
Conclusions
Findings suggest that treatment reduces evening-shifted loss of control eating severity, and that this effect may be more durable following ICAT relative to CBTgsh.
Whole-body tissue protein turnover is regulated, in part, by the postprandial rise in plasma amino acid concentrations, although minimal data exist on the amino acid response following non-animal-derived protein consumption. We hypothesised that the ingestion of novel plant- and algae-derived dietary protein sources would elicit divergent plasma amino acid responses when compared with vegan- and animal-derived control proteins. Twelve healthy young (male (m)/female (f): 6/6; age: 22 ± 1 years) and 10 healthy older (m/f: 5/5; age: 69 ± 2 years) adults participated in a randomised, double-blind, cross-over trial. During each visit, volunteers consumed 30 g of protein from milk, mycoprotein, pea, lupin, spirulina or chlorella. Repeated arterialised venous blood samples were collected at baseline and over a 5-h postprandial period to assess circulating amino acid, glucose and insulin concentrations. Protein ingestion increased plasma total and essential amino acid concentrations (P < 0·001), to differing degrees between sources (P < 0·001), and the increase was further modulated by age (P < 0·001). Postprandial maximal plasma total and essential amino acid concentrations were highest for pea (2828 ± 106 and 1480 ± 51 µmol·l−1) and spirulina (2809 ± 99 and 1455 ± 49 µmol·l−1) and lowest for chlorella (2053 ± 83 and 983 ± 35 µmol·l−1) (P < 0·001), but were not affected by age (P > 0·05). Postprandial total and essential amino acid availabilities were highest for pea, spirulina and mycoprotein and lowest for chlorella (all P < 0·05), but no effect of age was observed (P > 0·05). The ingestion of a variety of novel non-animal-derived dietary protein sources elicits divergent plasma amino acid responses, which are further modulated by age.
Knowledge graphs have become a common approach for knowledge representation. Yet, the application of graph methodology is elusive due to the sheer number and complexity of knowledge sources. In addition, semantic incompatibilities hinder efforts to harmonize and integrate across these diverse sources. As part of The Biomedical Translator Consortium, we have developed a knowledge graph–based question-answering system designed to augment human reasoning and accelerate translational scientific discovery: the Translator system. We have applied the Translator system to answer biomedical questions in the context of a broad array of diseases and syndromes, including Fanconi anemia, primary ciliary dyskinesia, multiple sclerosis, and others. A variety of collaborative approaches have been used to research and develop the Translator system. One recent approach involved the establishment of a monthly “Question-of-the-Month (QotM) Challenge” series. Herein, we describe the structure of the QotM Challenge; the six challenges that have been conducted to date on drug-induced liver injury, cannabidiol toxicity, coronavirus infection, diabetes, psoriatic arthritis, and ATP1A3-related phenotypes; the scientific insights that have been gleaned during the challenges; and the technical issues that were identified over the course of the challenges and that can now be addressed to foster further development of the prototype Translator system. We close with a discussion on Large Language Models such as ChatGPT and highlight differences between those models and the Translator system.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. 
Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Shortness of breath, or dyspnea, is the subjective experience of breathing discomfort and is a common, distressing, and debilitating symptom of lung cancer. There are no efficacious pharmacological treatments, but there is suggestive evidence that cognitive–behavioral treatments could relieve dyspnea. For this, understanding the psychological, behavioral, and social factors that may affect dyspnea severity is critical. To this end, patients with dyspnea were interviewed with questions framed by the cognitive–behavioral model—emphasizing thoughts, emotions, and behaviors as contributors and outcomes of dyspnea.
Methods
Two trained individuals conducted semi-structured interviews with lung cancer patients (N = 15) reporting current dyspnea. Interviews assessed patients’ cognitive–behavioral experiences with dyspnea. Study personnel used a grounded theory approach for qualitative analysis to code the interviews. Inter-rater reliability of codes was high (κ = 0.90).
Results
Thoughts: Most common were patients’ catastrophic thoughts about their health and about whether they were receiving enough oxygen when breathless. Emotions: Anxiety about dyspnea was the most common, followed by anger, sadness, and shame related to dyspnea. Behaviors: Patients rested and took deep breaths to relieve acute episodes of dyspnea. To reduce the likelihood of dyspnea, patients planned their daily activity or reduced their physical activity at the expense of engagement in hobbies and functional activities.
Significance of results
Patients identified cognitive–behavioral factors (thoughts, emotions, and behaviors) that coalesce with dyspnea. The data provide meaningful insights into potential cognitive–behavioral interventions that could target contributors to dyspnea.
Cognitive behavioural therapy (CBT) is an effective treatment for depression but a significant minority of clients do not complete therapy, do not respond to it, or subsequently relapse. Non-responders, and those at risk of relapse, are more likely to have adverse childhood experiences, early-onset depression, co-morbidities, interpersonal problems and heightened risk. This is a heterogeneous group of clients who are currently difficult to treat.
Aim:
The aim was to develop a CBT model of depression that will be effective for difficult-to-treat clients who have not responded to standard CBT.
Method:
The method was to unify theory, evidence and clinical strategies within the field of CBT to develop an integrated CBT model. Single case methods were used to develop the treatment components.
Results:
A self-regulation model of depression has been developed. It proposes that depression is maintained by repeated interactions of self-identity disruption, impaired motivation, disengagement, rumination, intrusive memories and passive life goals. Depression is more difficult to treat when these processes become interlocked. Treatment based on the model builds self-regulation skills and restructures self-identity, rather than targeting negative beliefs. A bespoke therapy plan is formed out of ten treatment components, based on an individual case formulation.
Conclusions:
A self-regulation model of depression is proposed that integrates theory, evidence and practice within the field of CBT. It has been developed with difficult-to-treat cases as its primary purpose. A case example is described in a concurrent article (Barton et al., 2022) and further empirical tests are on-going.
Dental healthcare personnel (DHCP) are at high risk of exposure to coronavirus disease 2019 (COVID-19). We sought to identify how DHCP changed their use of personal protective equipment (PPE) as a result of the COVID-19 pandemic, and to pilot an educational video designed to improve knowledge of proper PPE use.
Design:
The study comprised 2 sets of semistructured qualitative interviews.
Setting:
The study was conducted in 8 dental clinics in a Midwestern metropolitan area.
Participants:
In total, 70 DHCP participated in the first set of interviews; 63 DHCP participated in the second set of interviews.
Methods:
In September–November 2020 and March–October 2021, we conducted 2 sets of semistructured interviews addressing (1) PPE use in the dental community during COVID-19 and (2) feedback on the utility of an educational donning and doffing video.
Results:
Overall, 86% of DHCP reported having prior training. DHCP increased their use of PPE during COVID-19, specifically N95 respirators and face shields. DHCP reported real-world challenges in applying infection control methods, often resulting in PPE modification and reuse. DHCP reported double masking and sterilization methods to extend N95 respirator use. Additional PPE challenges included shortages, discomfort, and compatibility with specialty dental equipment. DHCP found the educational video helpful and relevant to clinical practice. Fewer than half of DHCP reported prior exposure to a similar video.
Conclusions:
DHCP experienced significant challenges related to PPE access and routine use in dental clinics during the COVID-19 pandemic. An educational video improved awareness and uptake of appropriate PPE use among DHCP.
The objective of this study was to determine the physical evaluations and assessment tools used by a group of Canadian healthcare professionals treating adults with spasticity.
Methods:
A cross-sectional web-based 19-question survey was developed to determine the types of physical evaluations, tone-related impairment measurements, and assessment tools used in the management of adults with spasticity. The survey was distributed to healthcare professionals from the Canadian Advances in Neuro-Orthopedics for Spasticity Congress database.
Results:
Eighty study participants (61 physiatrists and 19 other healthcare professionals) completed the survey and were included. Nearly half (46.3%, 37/80) of the participants reported having an inter- or trans-disciplinary team managing individuals with spasticity. The most frequently used physical evaluations were visual observation of movement, determination of available range of motion, assessment of tone during velocity-dependent passive range of motion (looking for a spastic catch, spasticity, and clonus), and evaluation of gait. The most frequently used spasticity tools were the Modified Ashworth Scale, goniometer, and Goal Attainment Scale. Results were similar in brain- and spinal cord-predominant etiologies. To evaluate goals, qualitative description was used most often (37.5%).
Conclusion:
Our findings provide a better understanding of the spasticity management landscape in Canada with respect to staffing, physical evaluations, and outcome measurements used in clinical practice. For all etiologies of spasticity, visual observation of patient movement, Modified Ashworth Scale, and qualitative goal outcomes descriptions were most commonly used to guide treatment and optimize outcomes. Understanding the current practice of spasticity assessment will help provide guidance for clinical evaluation and management of spasticity.
Excess unabsorbed iron in the gastrointestinal tract may select for enteric pathogens and increase the incidence and severity of infectious disease. Aspergillus oryzae (Ao) is a filamentous fungus that can accumulate and store large amounts of iron and, when used as a supplement or fortificant, has absorption similar to that of ferrous sulphate (FeSO4) in humans. The objective of this study was to determine the effect of iron-enriched Ao (Ao iron), compared with FeSO4, on iron accumulation, growth and motility of the Gram-negative enteric pathogen Salmonella Typhimurium. S. Typhimurium was cultured in media containing no added iron or 1 μM elemental iron as either Ao iron or FeSO4. S. Typhimurium cultured with FeSO4 accumulated more iron than that cultured with Ao iron. Expression of genes regulated by the iron-activated transcriptional repressor Fur did not differ between control and Ao iron, but decreased in S. Typhimurium cultured with FeSO4 compared with both groups. Growth of S. Typhimurium was greater when cultured with FeSO4 than with Ao iron or control. S. Typhimurium swam faster, had greater acceleration and travelled further when cultured with FeSO4 compared with Ao iron and control; swim speed, acceleration and distance travelled did not differ between Ao iron and control. These findings provide evidence that Ao iron reduces the virulence of a common enteric pathogen in vitro. Further research is required to determine whether iron-enriched Ao is a suitable iron supplement to improve iron delivery in areas with a high infection burden.
Ingestion of mycoprotein stimulates skeletal muscle protein synthesis (MPS) rates to a greater extent than concentrated milk protein when matched for leucine content, potentially attributable to the wholefood nature of mycoprotein. We hypothesised that bolus ingestion of mycoprotein as part of its wholefood matrix would stimulate MPS rates to a greater extent compared with a leucine-matched bolus of protein concentrated from mycoprotein. Twenty-four healthy young (age, 21 ± 2 years; BMI, 24 ± 3 kg·m−2) males received primed, continuous infusions of L-[ring-2H5]phenylalanine and completed a bout of unilateral resistance leg exercise before ingesting either 70 g mycoprotein (MYC; 31·4 g protein, 2·5 g leucine; n 12) or 38·2 g of a protein concentrate obtained from mycoprotein (PCM; 28·0 g protein, 2·5 g leucine; n 12). Blood and muscle samples (vastus lateralis) were taken pre- and (4 h) post-exercise/protein ingestion to assess postabsorptive and postprandial myofibrillar protein fractional synthetic rates (FSR) in resting and exercised muscle. Protein ingestion increased plasma essential amino acid and leucine concentrations (P < 0·0001), but more rapidly (both 60 v. 90 min; P < 0·0001) and to greater magnitudes (1367 v. 1346 μmol·l–1 and 298 v. 283 μmol·l–1, respectively; P < 0·0001) in PCM compared with MYC. Protein ingestion increased myofibrillar FSR (P < 0·0001) in both rested (MYC, Δ0·031 ± 0·007 %·h–1 and PCM, Δ0·020 ± 0·008 %·h–1) and exercised (MYC, Δ0·057 ± 0·011 %·h–1 and PCM, Δ0·058 ± 0·012 %·h–1) muscle, with no differences between conditions (P > 0·05). Mycoprotein ingestion results in equivalent postprandial stimulation of resting and post-exercise myofibrillar protein synthesis rates irrespective of whether it is consumed within or without its wholefood matrix.
Seabirds are declining globally and are one of the most threatened groups of birds. To halt or reverse this decline they need protection both on land and at sea, requiring site-based conservation initiatives based on seabird abundance and diversity. The Important Bird and Biodiversity Area (IBA) programme is a method of identifying the most important places for birds based on globally agreed standardised criteria and thresholds. However, while great strides have been made identifying terrestrial sites, at-sea identification is lacking. The Chagos Archipelago, central Indian Ocean, supports four terrestrial IBAs (tIBAs) and two proposed marine IBAs (mIBAs). The mIBAs are seaward extensions to breeding colonies based on outdated information, and other types of mIBA have not been explored. Here, we review the proposed seaward-extension mIBAs using up-to-date seabird status and distribution information, and use global positioning system (GPS) tracking of Red-footed Booby Sula sula – one of the most widely distributed breeding seabirds on the archipelago – to identify any pelagic mIBAs. We demonstrate that, due to the overlapping boundaries of the seaward extension to breeding colonies and the pelagic areas of importance, there is a single mIBA in the central Indian Ocean that lies entirely within the Chagos Archipelago Marine Protected Area (MPA). Covering 62,379 km², it constitutes ~10% of the MPA and, if designated, would become the 11th largest mIBA in the world and the 4th largest in the Indian Ocean. Our research strengthens the evidence of the benefits of large-scale MPAs for the protection of marine predators and provides a scientific foundation stone for marine biodiversity hotspot research in the central Indian Ocean.
This article is a clinical guide which discusses the “state-of-the-art” usage of the classic monoamine oxidase inhibitor (MAOI) antidepressants (phenelzine, tranylcypromine, and isocarboxazid) in modern psychiatric practice. The guide is for all clinicians, including those who may not be experienced MAOI prescribers. It discusses indications, drug-drug interactions, side-effect management, and the safety of various augmentation strategies. There is a clear and broad consensus (more than 70 international expert endorsers), based on 6 decades of experience, for the recommendations herein exposited. They are based on empirical evidence and expert opinion—this guide is presented as a new specialist-consensus standard. The guide provides practical clinical advice, and is the basis for the rational use of these drugs, particularly because it improves and updates knowledge, and corrects the various misconceptions that have hitherto been prominent in the literature, partly due to insufficient knowledge of pharmacology. The guide suggests that MAOIs should always be considered in cases of treatment-resistant depression (including those melancholic in nature), and prior to electroconvulsive therapy—while taking into account patient preference. In selected cases, they may be considered earlier in the treatment algorithm than has previously been customary, and should not be regarded as drugs of last resort; they may prove decisively effective when many other treatments have failed. The guide clarifies key points on the concomitant use of incorrectly proscribed drugs such as methylphenidate and some tricyclic antidepressants. It also illustrates the straightforward “bridging” methods that may be used to transition simply and safely from other antidepressants to MAOIs.
To determine the impact of various aerosol mitigation interventions and to establish duration of aerosol persistence in a variety of dental clinic configurations.
Methods:
We performed aerosol measurement studies in endodontic, orthodontic, periodontic, pediatric, and general dentistry clinics. We used an optical aerosol spectrometer and wearable particulate matter sensors to measure real-time aerosol concentration from the vantage point of the dentist during routine care in a variety of clinic configurations (eg, open bay, single room, partitioned operatories). We compared the impact of aerosol mitigation strategies (eg, ventilation and high-volume evacuation (HVE)) and the prevalence of particulate matter in the dental clinic environment before, during, and after high-speed drilling, slow-speed drilling, and ultrasonic scaling procedures.
Results:
Conical and ISOVAC HVE were superior to standard-tip evacuation for aerosol-generating procedures. When aerosols were detected in the environment, they were rapidly dispersed within minutes of completing the aerosol-generating procedure. Few aerosols were detected in dental clinics, regardless of configuration, when conical and ISOVAC HVE were used.
Conclusions:
Dentists should consider using conical or ISOVAC HVE rather than standard-tip evacuators to reduce aerosols generated during routine clinical practice. Furthermore, when such effective aerosol mitigation strategies are employed, dentists need not leave dental chairs fallow between patients because aerosols are rapidly dispersed.
Evidence suggests that cognitive subtypes exist in schizophrenia that may reflect different neurobiological trajectories. We aimed to identify whether IQ-derived cognitive subtypes are present in early-phase schizophrenia-spectrum disorder and examine their relationship with brain structure and markers of neuroinflammation.
Method
161 patients with recent-onset schizophrenia spectrum disorder (<5 years) were recruited. Estimated premorbid and current IQ were calculated using the Wechsler Test of Adult Reading and a 4-subtest WAIS-III. Cognitive subtypes were identified with k-means clustering. FreeSurfer was used to analyse 3.0 T MRI scans. Blood samples were analysed for hs-CRP, IL-1RA, IL-6 and TNF-α.
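The subtype-derivation step can be illustrated with a minimal k-means (Lloyd's algorithm) sketch. The (premorbid IQ, current IQ) pairs and starting centroids below are hypothetical; the study's actual data, preprocessing, and software are not reproduced here:

```python
# Minimal k-means (Lloyd's algorithm) on hypothetical
# (estimated premorbid IQ, current IQ) pairs. Illustrative only.

def kmeans(points, centroids, iters=50):
    clusters = [[] for _ in centroids]
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid
        # (squared Euclidean distance).
        clusters = [[] for _ in centroids]
        for p in points:
            d = [(p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 for c in centroids]
            clusters[d.index(min(d))].append(p)
        # Update step: move each centroid to its cluster mean
        # (keep the old centroid if the cluster is empty).
        centroids = [
            (sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
            if cl else c
            for cl, c in zip(clusters, centroids)
        ]
    return centroids, clusters

# Hypothetical pairs for the three subtypes: preserved (both high),
# deteriorated (high premorbid, lower current), compromised (both low).
points = [(106, 104), (110, 108), (102, 101),   # preserved-like
          (108, 86), (104, 82), (111, 88),      # deteriorated-like
          (84, 82), (80, 79), (87, 85)]         # compromised-like
centroids, clusters = kmeans(points, centroids=[(105, 105), (105, 85), (85, 80)])
print([len(cl) for cl in clusters])  # → [3, 3, 3]
```

In practice the number of clusters would be chosen and validated (e.g. with silhouette or stability criteria), scores would typically be standardised first, and multiple random initialisations would be used rather than the fixed starting centroids shown here.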
Results
Three subtypes were identified indicating preserved (PIQ), deteriorated (DIQ) and compromised (CIQ) IQ. Absolute total brain volume was significantly smaller in CIQ compared to PIQ and DIQ, and intracranial volume was smaller in CIQ than PIQ (F(2, 124) = 6.407, p = 0.002) indicative of premorbid smaller brain size in the CIQ group. CIQ had higher levels of hs-CRP than PIQ (F(2, 131) = 5.01, p = 0.008). PIQ showed differentially impaired processing speed and verbal learning compared to IQ-matched healthy controls.
Conclusions
The findings add validity to a neurodevelopmental subtype of schizophrenia, identified by comparing estimated premorbid and current IQ and characterised by smaller premorbid brain volume and higher measures of low-grade inflammation (CRP).