Termination of an existing failed corn stand before replanting is essential. Two studies were conducted in Stoneville and Verona, MS, from 2020 to 2021 to evaluate the timing of corn or soybean replanting following different herbicide treatments applied to simulated failed stands of corn. Treatments included paraquat alone at 841 g ai ha−1, paraquat at 841 g ha−1 + metribuzin at 211 g ai ha−1, and clethodim at 51 g ai ha−1 + glyphosate at 1,121 g ae ha−1, all applied at the V2 growth stage. Replant timings were 1 and 7 d after herbicide treatment (DAT). Pooled across replant timings, paraquat + metribuzin provided the greatest control at 3 DAT in both studies. At 14 and 21 DAT, clethodim + glyphosate controlled more corn than did paraquat + metribuzin or paraquat alone. Control of a simulated failed corn stand with paraquat alone never exceeded 50% from 3 to 21 DAT. Soybean yields in all plots receiving a herbicide treatment targeting the simulated failed corn stand were similar and ≥2,150 kg ha−1. When applied at the V2 corn growth stage, both clethodim + glyphosate and paraquat + metribuzin controlled a simulated failed stand of corn. This study demonstrated the importance of terminating failed stands of corn before replanting because of dramatic yield reductions in plots not treated with herbicide.
Healthcare-associated infections (HAIs) result in substantial patient harm and avoidable costs. Pay-for-performance (PFP) programs through the Centers for Medicare and Medicaid Services (CMS) have driven reductions in HAIs such as central line-associated bloodstream infections (CLABSI) and methicillin-resistant Staphylococcus aureus bacteremia through robust infection prevention programs and practices. Hospital-onset bacteremia and fungemia (HOB) has been proposed as an alternative quality measure for public reporting and PFP and was endorsed by the National Quality Forum in 2022. This broad measure is designed as an electronic quality measure that avoids manual abstraction and excludes risk adjustment. HOB would substantially expand the scope of existing bloodstream infection measurement and is currently being considered for voluntary reporting in 2025. In this article, we present arguments for and against adopting HOB as a PFP measure linked to CMS payments.
Exposure to environmentally transmitted parasites should increase with population density due to accumulation of infective parasites in space. However, resource competition also increases with density, lowering immunity and increasing susceptibility, offering an alternative pathway for density-dependent infection. To test the relationships between these two processes and parasitism, we examined associations between host density, resource availability, immunity, and counts of 3 common helminth parasites using a long-term study of red deer. We found evidence that immunity increased with resource availability while parasite counts declined with immunity. We also found that greater density correlated with reduced resource availability, and while density was positively associated with both strongyle and tissue worm burdens, resource availability was independently and negatively associated with the same burdens. Our results support separate roles of density-dependent exposure and susceptibility in driving infection, providing evidence that resource competition is an important driver of infection, exacerbating effects of density-dependent increases in exposure.
High density should drive greater parasite exposure. However, evidence linking density with infection generally relies on density proxies or measures of population size rather than on measures of individuals per unit space within a continuous population. We used a long-term study of wild sheep to link within-population spatiotemporal variation in host density with individual parasite counts. Although four parasites exhibited strong positive relationships with local density, these relationships were mostly restricted to juveniles and faded in adults. Furthermore, one ectoparasite showed strong negative relationships across all age classes. In contrast, population size – a measure of global density – had limited explanatory power, and its effects were distinct from, rather than subsuming, those of spatial density. These results indicate that local and global density can exhibit diverse and contrasting effects on infection within populations. Spatial measures of within-population local density may provide substantial insight beyond temporal metrics based on population size, and investigating them more widely could be revealing.
Healthcare-prescribed opioids are a known contributor to the opioid epidemic. Locally, there was an identified opportunity to improve opioid prescribing practices in cardiac surgical patients. The cardiac surgical team sought to standardise prescribing practices in postoperative patients and reduce opioid prescriptions at discharge. The improvement work was undertaken at a large midwestern freestanding children’s hospital with over 400 beds and 120 cardiac surgeries annually. A multidisciplinary team was formed, using the Model for Improvement to guide the work. The key improvement interventions included standardised evidence-based prescribing guidelines based on patient age and surgical approach, enhanced pain management with non-opioid medications, and integration of the prescribing guidelines into the electronic health record. The primary outcome measure was the rate of compliance with the prescribing guidelines; secondary measures included morphine equivalent dosing at discharge, opioid-free discharge, and length of stay. Opioid re-prescriptions were tracked as a balancing measure. There were 289 patients included in the primary study period (January 2019 through December 2021). Sustainability of key outcomes was tracked through December 2022. Guideline compliance increased from 24% to 100%. Morphine equivalent dosing decreased from a baseline of 36.25 in 2019 to 22.5 in 2021 and 0 in 2022. Opioid-free discharges decreased from 8% (2019) to 1.5% (2021) and 0% in 2022. Establishment of, and compliance with, standardised guidelines for post-operative cardiac surgical pain management yielded a reduction in morphine equivalent dosing, an increase in opioid-free discharges, and no increase in length of stay or opioid re-prescriptions.
Mass Gathering Medicine focuses on mitigating issues at Mass Gathering Events. Medical skills can vary substantially among staff, and the literature provides no specific guidance on staff training. This study highlights expert opinions on minimum training for medical staff to formalize preparation for a mass gathering.
Methods
This is a 3-round Delphi study. Experts were recruited at Mass Gathering conferences, and researchers emailed participation requests through Stat59 software. Consent was obtained verbally and within the Stat59 software. All responses were anonymous. In the first round, experts generated open-ended opinion statements. The second and third rounds used a 7-point linear ranking scale. A statement reached consensus if its responses had a standard deviation (SD) of 1.0 or less.
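To make the consensus rule concrete, here is a minimal Python sketch that flags statements whose 7-point ratings have an SD at or below 1.0. The statements and ratings are invented, and the sample standard deviation is assumed (the abstract does not say which SD variant Stat59 computes).

```python
import statistics

CONSENSUS_SD = 1.0  # threshold used in rounds 2 and 3

def reached_consensus(ratings, threshold=CONSENSUS_SD):
    """A statement reaches consensus when the SD of its
    7-point ratings is at or below the threshold."""
    return statistics.stdev(ratings) <= threshold

# Hypothetical ratings from a panel of eight experts (scale 1-7).
statements = {
    "Staff should receive venue-specific orientation": [6, 7, 6, 7, 6, 6, 7, 6],
    "All staff must hold an advanced trauma qualification": [2, 7, 4, 6, 1, 5, 3, 7],
}

for text, ratings in statements.items():
    sd = statistics.stdev(ratings)
    status = "consensus" if reached_consensus(ratings) else "no consensus"
    print(f"SD = {sd:.2f} -> {status}: {text}")
```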
Results
Round 1 generated 137 open-ended statements, of which 73 proceeded to round 2; 21 of these (28.8%) reached consensus. In round 3, 21 of the remaining 52 statements (40.4%) reached consensus. Priority themes included venue-specific information, staff orientation to operations and capabilities, and community coordination. Mass casualty preparation and triage were also highlighted as critical areas of focus.
Conclusions
This expert consensus framework emphasizes core training areas, including venue-specific operations, mass casualty response, triage, and life-saving skills. The heterogeneity of Mass Gatherings makes instituting universal standards challenging. The conclusions highlight themes that multiple experts identified as priorities.
The best prehospital transport strategy for patients with suspected stroke due to possible large vessel occlusion varies by jurisdiction and available resources. A foundational problem is the lack of a definitive diagnosis at the scene. Rural stroke presentations pose the most problematic triage and destination decisions. We describe the implementation and 5-year experience of a rural field consultation approach serving rural patients with acute stroke in Alberta, Canada.
Methods:
The protocols established through the rural field consultation system and the subsequent transport patterns for suspected stroke patients during the first 5 years of implementation are presented. Outcomes are reported using home time and data are summarized using descriptive statistics.
Results:
From April 2017 to March 2022, 721 patients met the definition for a rural field consultation, and 601 were included in the analysis. Most patients (n = 541, 90%) were transported by ground ambulance. Intravenous thrombolysis was provided to 65 patients (10.8%), and 106 (17.6%) underwent endovascular thrombectomy. The median time from first medical contact to arterial access was 3.2 h (range 1.3–7.6) for direct transfers, compared with 6.5 h (range 4.6–7.9) for patients arriving indirectly at the comprehensive stroke center (CSC). Only a small proportion of patients (n = 5, 0.8%) were routed suboptimally to a primary stroke center and then to a CSC, where they underwent endovascular therapy.
Conclusions:
The rural field consultation system was associated with shortened delays to recanalization and demonstrated that it is feasible to improve access to acute stroke care for rural patients.
This study aimed to assess the impact of hypertensive disorders of pregnancy on infant neurodevelopment by comparing 6-month and 2-year psychomotor development outcomes of infants exposed to gestational hypertension (GH) or preeclampsia (PE) versus normotensive pregnancy (NTP). Participating infants were children of women enrolled in the Postpartum Physiology, Psychology and Paediatric (P4) cohort study who had NTP, GH, or PE. Six-month and 2-year Ages and Stages Questionnaire (ASQ-3) scores were categorised as passes or fails according to domain-specific cut-off values. For the 2-year Bayley Scales of Infant and Toddler Development (BSID-III) assessment, scores > 2 standard deviations below the mean in a domain were defined as developmental delay. Infants (n = 369; 190 male) exposed to PE (n = 75), compared with those exposed to GH (n = 20) or NTP (n = 274), were more likely to be born small for gestational age and premature. After adjustment, at 2 years, prematurity was significantly associated with failing any domain of the ASQ-3 (p = 0.015), and maternal tertiary education with higher cognitive scores on the BSID-III (p = 0.013). However, PE and GH exposure were not associated with clinically significant risks of delayed infant neurodevelopment in this study. Larger, multicentre studies are required to further clarify early childhood neurodevelopmental outcomes following hypertensive pregnancies.
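To make the developmental-delay criterion concrete, the short Python sketch below flags BSID-III domain scores falling more than 2 standard deviations below the mean, assuming the standard composite scaling of mean 100 and SD 15 (so the cut-off is 70); the example scores are invented.

```python
# Illustrative BSID-III delay flagging, assuming composite scores
# scaled to mean 100, SD 15 (so "> 2 SD below the mean" means < 70).
MEAN, SD = 100, 15
CUTOFF = MEAN - 2 * SD  # 70

def developmental_delay(domain_scores):
    """Return the domains scoring more than 2 SD below the mean."""
    return [d for d, s in domain_scores.items() if s < CUTOFF]

# Hypothetical 2-year assessment for one infant.
scores = {"cognitive": 95, "language": 68, "motor": 102}
print(developmental_delay(scores))  # ['language']
```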
Chemical, biological, radiological, and nuclear (CBRN) incidents pose increasing transborder risks globally, necessitating enhanced health sector preparedness.
Objectives:
This study aimed to develop a comprehensive CBRN preparedness assessment tool (PAT), operational response guidelines (ORG), and tabletop simulation scenarios for the health sectors of the Middle East and North Africa (MENA) region.
Method/Description:
A mixed-methods approach comprised a systematic review of the English- and French-language literature up to 2022, modified expert interviews (MIM), and an online Delphi questionnaire. Content analysis was performed on the interview data. Using R-Studio™, consensus metrics and artificial intelligence techniques, including natural language processing, sentiment analysis, and unsupervised machine learning (ML) clustering algorithms, were deployed for advanced data analysis across all phases.
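The abstract does not detail the clustering pipeline, so the following minimal Python sketch shows one common approach to unsupervised clustering of free-text interview data, TF-IDF features followed by k-means (the study itself used R-Studio; the response fragments and cluster count here are invented):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Hypothetical fragments of expert-interview responses.
responses = [
    "Regional cooperation through WHO networks must improve",
    "Standardized tabletop exercises build CBRN readiness",
    "Hospitals need decontamination capacity and PPE stockpiles",
    "Cross-border information sharing is currently too slow",
    "Simulation training should follow a common regional template",
    "Emergency departments lack antidote and PPE reserves",
]

# TF-IDF turns each response into a sparse term-weight vector.
X = TfidfVectorizer(stop_words="english").fit_transform(responses)

# Cluster responses into themes; k=3 is arbitrary for this toy example.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
for label, text in sorted(zip(km.labels_, responses)):
    print(label, text)
```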
Results/Outcomes:
The literature review identified 63 relevant studies illustrating various preparedness strategies. The MIM’s thematic analysis, reinforced by AI-driven content analysis, emphasized the need for stronger inter-regional cooperation facilitated by organizations such as WHO and standardized tabletop simulation training. A robust consensus was achieved on the proposed assessment tool and operational response guidelines. ML analysis identified distinct expert clusters, providing additional consensus perspectives.
Conclusion:
The study emphasized the urgency of collaborative CBRN response strategies within MENA, highlighting the innovative contribution of the proposed PAT, ORG, and simulation scenarios. This work advocates a dynamic, resilient approach to disaster medicine preparedness, which is crucial for regional security and global health resilience, especially in the MENA region. It also highlights the significant role of AI analysis methods in enriching analytical outcomes in disaster medicine research and promoting data-informed preparedness strategies.
The Red Cross Red Crescent Health Information System (RCHIS) is an electronic health record (EHR) and health information management system (HIS) designed for international disaster responses, with a cloud-based server and a local server to bridge temporary internet outages. This architecture allows for remote information management and operational support where data processing agreements allow it.
Objectives:
To describe the adaptation of a cloud-based health information system to a fully offline setting and the improvement of business continuity in case of a system failure.
Method/Description:
An analysis of the existing RCHIS architecture was conducted to identify components and procedures that work only on the cloud-based server with an active internet connection. Offline alternatives were identified and developed to ensure full offline operational capacity and redundancy.
Results/Outcomes:
A mechanism to set up a second local server for redundancy improves business continuity planning, and locally stored backups allow recovery of data without an internet connection. Instead of creating new user accounts in the cloud and emailing a one-time password (OTP), a mechanism was added to create accounts on the local server and display the OTP directly. Offline generation of the WHO EMT MDS report was also embedded.
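As an illustration of the local account-creation change, the Python sketch below creates an account on a local server and displays a one-time password on screen instead of emailing it. The function and storage here are hypothetical stand-ins, not RCHIS's actual implementation.

```python
import secrets
import hashlib

# Hypothetical local account store standing in for the local server's DB.
local_accounts = {}

def create_local_account(username: str) -> str:
    """Create an account on the local server and return a one-time
    password for on-screen display (no email needed offline)."""
    otp = secrets.token_urlsafe(9)  # cryptographically strong OTP
    local_accounts[username] = {
        # Store only a hash, never the OTP itself.
        "otp_hash": hashlib.sha256(otp.encode()).hexdigest(),
        "must_change_password": True,
    }
    return otp

# Displayed to the administrator on site instead of emailed.
print("One-time password:", create_local_account("emt.nurse01"))
```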
Conclusion:
Adding the capability to work fully offline to RCHIS meant significant software architecture changes. Despite losing some of the benefits, such as remote information management, RCHIS is now a robust offline tool for deployment in settings without any internet connectivity. Having a local server also means that we can comply with data sovereignty rules where they exist.
Historically, medical response efforts to large-scale disaster events have highlighted significant variability in the capabilities of responding medical providers and emergency medical teams (EMTs). Analysis of the 2010 Haiti earthquake response found that a number of medical teams were poorly prepared, inexperienced, or lacked the competencies to provide the level of medical care required, highlighting the need for medical team standards.
The World Health Organization (WHO) EMT initiative that followed created minimum team standards for responding international EMTs to improve the quality and timeliness of medical services. At present, however, there remains a lack of globally recognized minimum competency standards at the level of the individual disaster medical responder, allowing continued variability in patient care.
Objectives:
This study examines existing competencies for physicians, nurses, and paramedics who are members of deployable disaster response teams.
Method/Description:
A scoping review of published English-language articles on existing competencies for physicians, nurses, and paramedics who are members of deployable disaster response teams was conducted in Ovid MEDLINE, Ovid Embase, CINAHL, Scopus, and Web of Science Core Collection. A total of 3,474 identified articles will be reviewed.
Results/Outcomes:
Data to be analyzed by October 1, 2024.
Conclusion:
There is a need to develop minimum standards for healthcare providers on disaster response teams. Identification of key existing competencies for disaster responders will provide the foundation for the creation of globally recognized minimum competency standards for individuals seeking to join an EMT in the future and will guide training and curricula development.
Objectives/Goals: Cognitive decline is the second-greatest fear of the aging population. Diet is associated with brain aging; therefore, the objective is to determine the effects of a Western diet (WD) on cognitive decline and the efficacy of a Mediterranean diet (MeDi) fecal microbiota transplant (FMT) against WD-induced cognitive deficit progression in aged rats. Methods/Study Population: For Study 1, 12-month-old Fischer 344 rats (NIA Aging Colony) will be randomly assigned to a WD, MeDi, or control diet (positive control) for 6 or 12 months. Microbiota composition, blood pressure, and body composition (DXA scan) will be assessed longitudinally. Groups will undergo a battery of neurobehavioral assessments to measure cognitive performance. At the end of the study, mitochondrial bioenergetic assays in isolated cerebral microvessels will be used to determine changes in cerebrovascular function. For Study 2, 18-month-old Fischer 344 rats (NIA Aging Colony) will be randomly assigned to a WD, MeDi, or control diet for 6 months. At month 4, the WD + MeDi-FMT group will receive once-weekly MeDi-FMT for two months. Assessments will be performed as described for Study 1. Results/Anticipated Results: It is anticipated that WD-related gut dysbiosis will increase blood pressure, fat-free mass, and neurovascular dysfunction, and will induce cognitive impairment relative to a MeDi. With MeDi-FMT as an intervention, measurable improvements in cognitive function relative to a WD are anticipated through the regulation of gut dysbiosis, blood pressure, fat-free mass, and neurovascular dysfunction. Discussion/Significance of Impact: These results are expected to have an important positive impact because they will provide insights into WD-induced, gut dysbiosis-associated cognitive impairments and evaluate the roles and mechanisms of MeDi-FMT as a therapeutic intervention in aged rats.
Objectives/Goals: The timing of neurosurgery is highly variable for post-hemorrhagic hydrocephalus (PHH) of prematurity. We sought to utilize microvascular imaging (MVI) in ultrasound (US) to identify biomarkers to discern the opportune time for intervention and to analyze cerebrospinal fluid (CSF) characteristics as they pertain to neurosurgical outcome. Methods/Study Population: The inclusion criteria are admission to the neonatal intensive care unit (NICU) with a diagnosis of Papile grade III or IV intraventricular hemorrhage (IVH). Exclusion criteria are congenital hydrocephalus and hydrocephalus secondary to myelomeningocele, brain tumor, or vascular malformation. We are a level IV tertiary referral center. Our current clinical care pathway utilizes brain US at admission and at weekly intervals. Patients who meet certain clinical and radiographic parameters undergo temporary or permanent CSF diversion. Results/Anticipated Results: Neuroendoscopic lavage (NEL) was implemented at our institution for PHH of prematurity in fall 2022. To date, 20 patients have been diagnosed with grade III or IV IVH, of whom 12 qualified for NEL. Our preliminary safety and feasibility results, as well as the innovative bedside technique pioneered at our institution, are currently in the revision stage for publication. Preliminary MVI data indicate that hyperemia may reflect venous congestion in the germinal matrix, which should alert the neurosurgeon to delay intervention to avoid progression of intraventricular blood. With regard to CSF characteristics, we anticipate that protein, cell count, hemoglobin, iron, and ferritin will decrease with NEL. Discussion/Significance of Impact: The timing of intervention for PHH of prematurity is highly variable. We expect that MVI will offer radiographic biomarkers to guide optimal timing of neurosurgical intervention. A better understanding of CSF characteristics could inform the neurosurgeon regarding optimal timing of permanent CSF diversion based on specific CSF parameters.
We examine repetition as an institution that affects coordination failure in a game with and without pre-play communication. We use probit regression with random effects to test hypotheses regarding the frequency and form of coordination failure in the presence of repeated play versus one-shot games. Our results indicate that repetition without pre-play communication results in a lower frequency of coordination failure relative to one-shot game outcomes. This result is reversed when pre-play communication is allowed. Our evidence also suggests that repeated play coordination failures tend to be suboptimal Nash equilibria, whereas one-shot game coordination failures are disequilibria regardless of the presence of pre-play communication.
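As a sketch of this style of estimation, the snippet below fits a probit model of coordination failure on simulated session data. The data-generating process is invented, and subject-clustered standard errors are used as a simple stand-in for the paper's random-effects specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 400

# Simulated observations: treatment indicators for repetition and
# pre-play communication, with subject IDs for clustering.
df = pd.DataFrame({
    "repeated": rng.integers(0, 2, n),
    "communication": rng.integers(0, 2, n),
    "subject": rng.integers(0, 40, n),
})
# Invented data-generating process: repetition lowers failure, but the
# effect reverses when communication is present (as in the abstract).
latent = 0.3 - 0.8 * df.repeated + 0.9 * df.repeated * df.communication
df["failure"] = (latent + rng.normal(size=n) > 0).astype(int)

X = sm.add_constant(
    df[["repeated", "communication"]].assign(
        rep_x_comm=df.repeated * df.communication)
)
# Pooled probit with subject-clustered SEs (stand-in for random effects).
fit = sm.Probit(df["failure"], X).fit(
    cov_type="cluster", cov_kwds={"groups": df["subject"]})
print(fit.summary())
```

A true random-effects probit would instead integrate a subject-level intercept out of the likelihood, as in Stata's xtprobit or R's lme4::glmer with a probit link.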
Law enforcement officials face numerous decisions regarding their enforcement choices. One important and often controversial decision is the amount of knowledge that law enforcement distributes to the community regarding its policing strategies. Assuming the goal is to minimize criminal activity (alternatively, maximize citation rates), our theoretical analysis suggests that agencies should reveal (shroud) their resource allocation if criminals are uncertainty seeking, and shroud (reveal) their allocation if criminals are uncertainty averse. We run a laboratory experiment to test our theoretical framework and find that enforcement behavior is approximately optimal given the observed non-expected-utility uncertainty preferences of criminals.
Invasive plants’ ability to extend their range depends upon their local environments and both positive and negative interactions with native species. Interactions between invasive and native plants may be indirectly linked to the soil fungal community, which may enhance or suppress invasion through mutualism or parasitism. Many invasive plants preferentially select fungal communities or change soil chemistry to gain a competitive advantage, and such changes can persist even after the invader is removed; these persistent changes are known as legacy effects. Yellow toadflax (Linaria vulgaris Mill.) is an invasive forb that is aggressive in the western United States but nonaggressive in the midwestern United States. We evaluated the relationship between soil abiotic properties, nitrogen (N) enrichment, arbuscular mycorrhizal fungal (AMF) community composition, and L. vulgaris invasion in aggressive (CO) and nonaggressive (IL) populations. We collected soil from uninvaded and invaded sites in Gothic, CO, and near Chicago, IL, and sequenced AMF community composition for each site. Using the same soil, we grew L. vulgaris and native species in pots for 120 d, with half of the pots receiving N fertilization, and then harvested biomass. We also injected a 15N-labeled tracer into pots and analyzed plant tissue for 15N enrichment and net uptake rates (NUR). In CO soil, L. vulgaris rhizomes sprouted more in invaded soil, whereas in IL soil, L. vulgaris sprouted only in uninvaded soil. N fertilization had no impact on biomass, and NUR did not differ significantly between any treatments. AMF communities differed between the two sites but were not significantly influenced by invasion history. Our results suggest that L. vulgaris leaves legacy effects but that these effects differ between aggressive and nonaggressive populations. Legacy effects may facilitate reinvasion in CO, but we did not find conclusive evidence of legacy effects in IL, and differences between the sites could be shaped by endemic AMF communities.
Elucidation of transphasic mechanisms (i.e., mechanisms that occur across illness phases) underlying negative symptoms could inform early intervention and prevention efforts and additionally identify treatment targets that could be effective regardless of illness stage. This study examined whether a key reinforcement learning behavioral pattern, characterized by reduced learning from rewards, that has been found to underlie negative symptoms in those with a schizophrenia diagnosis also contributes to negative symptoms in those at clinical high-risk (CHR) for psychosis.
Methods
CHR youth (n = 46) and healthy controls (CN; n = 51) completed an explicit reinforcement learning task with two phases. During the acquisition phase, participants learned to select between pairs of stimuli that were probabilistically reinforced with feedback indicating receipt of monetary gains or avoidance of losses. Following training, the transfer phase required participants to select, without feedback, between pairs combining stimuli from the acquisition phase and novel stimuli. These transfer pairings allowed inferences about the contributions of prediction error and value representation mechanisms to reinforcement learning deficits.
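For readers unfamiliar with this task family, the sketch below simulates a basic Q-learning agent on gain and loss-avoidance versions of a probabilistic selection pair; the reinforcement probabilities and learning parameters are illustrative, not those used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

def run_acquisition(p_correct, reward, punish, alpha=0.2, beta=3.0, trials=200):
    """Q-learning on one stimulus pair: choosing the 'correct' stimulus
    yields `reward` with probability p_correct, else `punish` (and the
    reverse for the other stimulus)."""
    q = np.zeros(2)  # value estimates for the two stimuli
    for _ in range(trials):
        # Softmax choice between the two stimuli.
        p = np.exp(beta * q) / np.exp(beta * q).sum()
        choice = rng.choice(2, p=p)
        lucky = rng.random() < (p_correct if choice == 0 else 1 - p_correct)
        outcome = reward if lucky else punish
        q[choice] += alpha * (outcome - q[choice])  # prediction-error update
    return q

# Gain pair: the correct choice wins $1 80% of the time, else $0.
print("gain pair Q-values:", run_acquisition(0.8, 1.0, 0.0))
# Loss-avoidance pair: the correct choice avoids a $1 loss 80% of the time.
print("loss pair Q-values:", run_acquisition(0.8, 0.0, -1.0))
```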
Results
In acquisition, CHR participants displayed impaired learning specifically from gains, and these deficits were associated with greater negative symptom severity. Transfer performance indicated that these acquisition deficits were largely driven by value representation deficits. In addition to negative symptoms, this profile of deficits was associated with a greater risk of conversion to psychosis and lower functioning.
Conclusions
Impairments in positive reinforcement learning, specifically in effectively representing reward value, may be an important transphasic mechanism of negative symptoms and a marker of psychosis liability.
Negative symptoms are a key feature of several psychiatric disorders. Difficulty identifying common neurobiological mechanisms that cut across diagnostic boundaries might result from equifinality (i.e., multiple mechanistic pathways to the same clinical profile), both within and across disorders. This study used a data-driven approach to identify unique subgroups of participants with distinct reward processing profiles to determine which profiles predicted negative symptoms.
Methods
Participants were a transdiagnostic sample of youth from a multisite study of psychosis risk, including 110 individuals at clinical high-risk for psychosis (CHR; meeting psychosis-risk syndrome criteria), 88 help-seeking participants who failed to meet CHR criteria and/or who presented with other psychiatric diagnoses, and a reference group of 66 healthy controls. Participants completed clinical interviews and behavioral tasks assessing four reward processing constructs indexed by the RDoC Positive Valence Systems: hedonic reactivity, reinforcement learning, value representation, and effort–cost computation.
Results
k-means cluster analysis of the clinical participants identified three subgroups with distinct reward processing profiles, characterized primarily by a value representation deficit (54%), a generalized reward processing deficit (17%), or a hedonic reactivity deficit (29%). Clusters did not differ in rates of clinical group membership or psychiatric diagnoses. Elevated negative symptoms were only present in the generalized deficit cluster, which also displayed greater functional impairment and higher psychosis conversion probability scores.
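A minimal Python sketch of this kind of profile clustering, assuming standardized scores on the four reward processing constructs (the planted profiles, sample sizes, and noise level are invented for illustration):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)

# Simulated z-scored task measures for 198 clinical participants:
# hedonic reactivity, reinforcement learning, value representation,
# and effort-cost computation (columns), with three planted profiles.
profiles = np.array([
    [0.0, 0.0, -1.2, 0.0],     # value representation deficit
    [-1.0, -1.0, -1.0, -1.0],  # generalized deficit
    [-1.2, 0.0, 0.0, 0.0],     # hedonic reactivity deficit
])
sizes = [107, 34, 57]
X = np.vstack([rng.normal(profiles[i], 0.5, size=(n, 4))
               for i, n in enumerate(sizes)])

km = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = km.fit_predict(StandardScaler().fit_transform(X))
print("cluster sizes:", np.bincount(labels))
print("cluster centers (z-units):\n", km.cluster_centers_.round(2))
```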
Conclusions
Contrary to the equifinality hypothesis, results suggested one global reward processing deficit pathway to negative symptoms independent of diagnostic classification. Assessment of reward processing profiles may have utility for individualized clinical prediction and treatment.
This study aimed to understand the current landscape of USA-based disaster medicine (DM) programs through the lens of alumni and program directors (PDs). The data obtained from this study will provide valuable information to future learners as they consider careers in disaster medicine and will allow PDs to refine curricular offerings.
Methods
Two separate surveys were sent to USA-based DM program directors and alumni. The surveys gathered information regarding current training characteristics, career trajectories, and the outlook of DM training.
Results
The study had a 57% response rate among PDs and a 42% response rate among alumni. Most programs last 1 year and accept one to two fellows per class. More than 60% of programs offer additional advanced degrees. Half of the responding programs accept international medical graduates (IMGs), and only 25% accept applicants without MD/DO/MBBS training. Most alumni hold academic or governmental positions post-training. Furthermore, many alumni report that fellowship training offered an advantage in the job market and allowed them to expand their clinical practice.
Conclusions
The field of disaster medicine is continuously evolving owing to increased recognition of the important roles DM specialists play in healthcare. Fellowship training programs are experiencing a similar evolution, with an increasing trend toward standardization. Furthermore, graduates from these programs see their training as a worthwhile investment in career opportunities.
The neural correlates of working memory (WM) in schizophrenia (SZ) have been extensively studied using the multisite fMRI data acquired by the Functional Biomedical Informatics Research Network (fBIRN) consortium. Although univariate and multivariate analysis methods have been variously employed to localize brain responses under differing task conditions, important hypotheses regarding the representation of mental processes in the spatio-temporal patterns of neural recruitment, and the differential organization of these mental processes in patients versus controls, have not been addressed in this context. This paper uses a multivariate state-space model (SSM) to analyze the differential representation and organization of mental processes in controls and patients performing the Sternberg Item Recognition Paradigm (SIRP) WM task. The SSM can not only predict the mental state of the subject from the data but also yield estimates of the spatial distribution and temporal ordering of neural activity, along with estimates of the hemodynamic response. The dynamical Bayesian modeling approach used in this study found significant differences in the predictability and organization of working memory processes between SZ patients and healthy subjects. Prediction of some stimulus types from imaging data was significantly less accurate in the SZ group than in controls, reflecting a greater level of disorganization/heterogeneity of their mental processes. Moreover, the changes in accuracy of predicting the mental state of the subject with respect to parametric modulations, such as memory load and task duration, may have important implications for neurocognitive models of WM processes in both SZ and healthy adults. Additionally, the SSM was used to compare the spatio-temporal patterns of mental activity across subjects in a holistic fashion and to derive a low-dimensional representation space for the SIRP task, in which subjects were found to cluster according to their diagnosis.
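The authors' SSM also estimates spatial maps and the hemodynamic response, which the following does not attempt; as a much simpler illustration of decoding mental states from fMRI time series, this Python sketch fits a Gaussian hidden Markov model (via the hmmlearn package, an assumption, not the paper's implementation) to simulated data and scores recovery of the true state sequence. All dimensions and parameters are invented.

```python
from itertools import permutations

import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(3)

# Simulate a block-design run: 3 mental states (e.g., encode, probe,
# rest), each producing a different mean pattern over 10 "voxels".
means = rng.normal(0, 1, size=(3, 10))
true_states = np.repeat([0, 1, 2, 0, 1, 2], 40)          # 240 scans
X = means[true_states] + rng.normal(0, 0.8, size=(240, 10))

# Fit an HMM and decode the most likely state sequence.
hmm = GaussianHMM(n_components=3, covariance_type="diag",
                  n_iter=100, random_state=0).fit(X)
decoded = hmm.predict(X)

# States are recovered only up to a label permutation, so report the
# accuracy under the best-matching relabeling.
acc = max(np.mean(np.array(perm)[decoded] == true_states)
          for perm in permutations(range(3)))
print(f"decoding accuracy (best label permutation): {acc:.2f}")
```

Unlike this toy, the paper's SSM additionally models the hemodynamic response linking hidden mental states to the measured BOLD signal.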