Do constituents care how judges are chosen? We conduct two nationally representative survey experiments focusing on state trial courts. Our first study indicates that respondents prefer judges who are elected to those who are appointed, though this does not affect their perceptions of the judiciary’s legitimacy. Our second study explores three potential mechanisms: efficacy, experience with democracy, and perceived ideological proximity. We find evidence that real-world experience with judicial elections is associated with a preference for such elections, but we do not find evidence for other mechanisms. Our study offers important new evidence for assessing proposed reforms to judicial selection.
Aquatic ecosystems - lakes, ponds and streams - are hotspots of biodiversity in the cold and arid environment of Continental Antarctica. Environmental change is expected to increasingly alter Antarctic aquatic ecosystems and modify the physical characteristics and interactions within the habitats that they support. Here, we describe physical and biological features of the peripheral ‘moat’ of a closed-basin Antarctic lake. These moats mediate connectivity amongst streams, lake and soils. We highlight the cyclical moat transition from a frozen winter state to an active open-water summer system, through refreeze as winter returns. Summer melting begins at the lakebed, initially creating an ice-constrained lens of liquid water in November, which swiftly progresses upwards, creating open water in December. Conversely, freezing progresses slowly from the water surface downwards, with water at 1 m bottom depth remaining liquid until May. Moats support productive, diverse benthic communities that are taxonomically distinct from those under the adjacent permanent lake ice. We show how ion ratios suggest that summer exchange occurs amongst moats, streams, soils and sub-ice lake water, perhaps facilitated by within-moat density-driven convection. Moats occupy a small but dynamic area of lake habitat, are disproportionately affected by recent lake-level rises and may thus be particularly vulnerable to hydrological change.
The three-dimensional order shown by the two-layer hydrates of Na- and Ca-vermiculite, prepared from Mg-vermiculite from Llano, Texas, has enabled clear, two-dimensional Fourier projections of their interlayer structures to be obtained. Structure factor calculations were made in space group C2 and with unit-cell dimensions of a = 5.358 Å, b = 9.232 Å, and β = 96.82°; for Na-vermiculite c = 14.96 Å and for Ca-vermiculite c = 15.00 Å. In Na-vermiculite the interlayer cations are octahedrally coordinated to water molecules, with the sodium-water polyhedra located only between the triads of oxygen atoms forming the bases of tetrahedra in adjacent silicate layers. In Ca-vermiculite the interlayer cations are in both octahedral and 8-fold (distorted cubic) coordination with water molecules. The octahedrally coordinated Ca ions are between the bases of tetrahedra in adjacent silicate layers, but the 8-fold coordinated Ca ions are between the ditrigonal cavities. In both Na- and Ca-vermiculite some water molecules are drawn appreciably from the planar networks towards the ditrigonal cavities. The three-dimensional order observed for these vermiculites contrasts with the stacking disorder reported for Mg-vermiculite from Llano. The distinct crystallographic behavior of Na+, Ca2+, and Mg2+ in the hydration layers of Llano vermiculite probably depends on cation sizes and field strengths, together with the need to achieve local charge balance near the sites of tetrahedral Al-for-Si substitution.
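For readers less familiar with the method, structure factor calculations of the kind described above compare observed reflection intensities with values computed from a trial atomic arrangement using the standard crystallographic relation (a general expression, not a result specific to this study):

```latex
F(hkl) = \sum_{j} f_j \exp\!\left[ 2\pi i \left( h x_j + k y_j + l z_j \right) \right]
```

where f_j is the atomic scattering factor and (x_j, y_j, z_j) are the fractional coordinates of atom j in the unit cell; agreement between calculated |F(hkl)| values and the observed intensities is what supports the proposed interlayer models.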
A vermiculite-aniline intercalate with a basal spacing of 14.78 Å was investigated by one- and two-dimensional X-ray diffraction methods. The intercalate, prepared by ion exchange between Na-saturated vermiculite from Llano, Texas, and a 1% aniline hydrochloride solution, contains only one aniline cation per single layer cell. A reduced effective cell-charge is believed to be responsible for this. Structure factor calculations were made in space group C2/c and with unit cell dimensions of a = 5.33, b = 9.18, c = 29.78 Å, and β = 97.0°. However, extra reflections in the a*b* plane, which are similar to those in a vermiculite-benzidine intercalate, showed that after aniline intercalation the true unit cell became primitive. The aniline cations are distributed statistically over equivalent crystallographic sites in the interlayer space. The organic molecules are orientated with their planes vertical and their nitrogen atoms over the projected centers of the ditrigonal cavities into which they key. The aniline cations form ordered arrays upon the silicate layers by packing into rows. Perpendicular to [010], populated and vacant rows alternate. Along populated rows aromatic ring planes are alternately parallel and perpendicular to [010]. With small adjustments this model is similar to that of benzidine-vermiculite.
Patients with Parkinson's disease (PD) commonly exhibit executive dysfunction early in the disease course, which may or may not predict further cognitive decline over time. The early emergence of visuospatial and memory impairments, in contrast, is a more consistent predictor of an evolving dementia syndrome. Most prior fMRI studies have focused on mechanisms of executive dysfunction and have demonstrated that PD patients exhibit hyperactivation that depends on the degree of cognitive impairment, suggestive of compensatory strategies. No study has evaluated whether PD patients with normal cognition (PD-NC) and PD patients with mild cognitive impairment (PD-MCI) exhibit compensatory activation patterns during visuospatial task performance.
Participants and Methods:
Ten PD-NC, 12 PD-MCI, and 14 age- and sex-matched healthy controls (HC) participated in the study. PD participants were diagnosed with MCI based on the Movement Disorders Society Task Force Level II (comprehensive) assessment. Functional magnetic resonance imaging (fMRI) was performed during a motion discrimination task that required participants to identify the direction of horizontal global coherent motion embedded within dynamic visual noise under Low and High coherence conditions. Behavioral accuracy and functional activation were evaluated using 3 × 2 analyses of covariance (ANCOVAs) (Group [HC, PD-NC, PD-MCI] × Coherence [High vs. Low]), accounting for age, sex, and education. Analyses were performed in R (v4.1.2; R Core Team, 2013).
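As an illustration only: the authors report running these analyses in R, but a minimal sketch of a comparable Group × Coherence model with age, sex, and education as covariates is shown below in Python; the data file and column names are hypothetical.

```python
# Hypothetical sketch of a 3 x 2 (Group x Coherence) ANCOVA-style model.
# The original analyses were conducted in R; this only illustrates the model structure.
import pandas as pd
import statsmodels.formula.api as smf

# Long-format data: one row per participant per coherence condition
# (hypothetical columns: subject, group, coherence, accuracy, age, sex, education)
df = pd.read_csv("motion_discrimination_long.csv")

# Mixed model with a random intercept per subject to handle the repeated Coherence factor
model = smf.mixedlm(
    "accuracy ~ C(group) * C(coherence) + age + C(sex) + education",
    data=df,
    groups=df["subject"],
).fit()
print(model.summary())
```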
Results:
PD-MCI patients (0.702 ± 0.269) exhibited significantly lower accuracy on the motion discrimination task than HC (0.853 ± 0.241; p = 0.033) and PD-NC (0.880 ± 0.208; p = 0.039). A Group × Coherence interaction was identified in which several regions, including orbitofrontal, posterior parietal, and occipital cortex, showed increased activation during High relative to Low coherence trials in the PD patient groups but not in the HC group. HC showed default mode deactivation and frontal-parietal activation during Low relative to High coherence trials that was not evident in the patient groups.
Conclusions:
PD-MCI patients exhibited worse visuospatial performance on a motion discrimination task than PD-NC and HC participants and exhibited hyperactivation of the posterior parietal and occipital regions during motion discrimination, suggesting possible compensatory activation.
Adolescents often experience heightened socioemotional sensitivity warranting their use of regulatory strategies. Yet, little is known about how key socializing agents help regulate teens’ negative emotions in daily life, or about the implications for long-term adjustment. We examined adolescent girls’ interpersonal emotion regulation (IER) with parents and peers in response to negative social interactions, defined as parent and peer involvement in the teen’s enactment of emotion regulation strategies. We also tested associations between rates of daily parental and peer IER and depressive symptoms, concurrently and one year later. Adolescent girls (N = 112; mean age = 12.39 years) at temperamental risk for depressive disorders completed a 16-day ecological momentary assessment protocol measuring reactivity to negative social interactions, parental and peer IER, and current negative affect. Results indicated that adolescents used more adaptive strategies with peers and more maladaptive strategies with parents in daily life. Both parental and peer IER down-regulated negative affect, reflected by girls’ decreased likelihood of experiencing continued negative affect. Higher proportions of parental adaptive IER predicted reduced depressive symptoms one year later. Findings suggest that both parents and peers effectively help adolescent girls down-regulate everyday negative emotions; however, parents may offer more enduring benefits for long-term adjustment.
Recent meta-analyses demonstrate that small-quantity lipid-based nutrient supplements (SQ-LNS) for young children significantly reduce child mortality, stunting, wasting, anaemia and adverse developmental outcomes. Cost considerations should inform policy decisions. We developed a modelling framework to estimate the cost and cost-effectiveness of SQ-LNS and applied the framework in the context of rural Uganda.
Design:
We adapted costs from a costing study of micronutrient powder (MNP) in Uganda, and based effectiveness estimates on recent meta-analyses and Uganda-specific estimates of baseline mortality and the prevalence of stunting, wasting, anaemia and developmental disability.
Setting:
Rural Uganda.
Participants:
Not applicable.
Results:
Providing SQ-LNS daily to all children in rural Uganda (> 1 million) for 12 months (from 6 to 18 months of age) via the existing Village Health Team system would cost ∼$52 per child (2020 US dollars) or ∼$58·7 million annually. SQ-LNS could avert an average of > 242 000 disability-adjusted life years (DALYs) annually as a result of preventing 3689 deaths, > 160 000 cases of moderate or severe anaemia and ∼6000 cases of developmental disability. The estimated cost per DALY averted is $242.
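As a rough arithmetic check on the figures reported above (values taken directly from the abstract; rounding is approximate):

```python
# Back-of-envelope check of the reported cost-effectiveness figures (2020 US dollars)
annual_cost = 58.7e6            # reported total annual programme cost (~$58.7 million)
cost_per_child = 52             # reported cost per child per year
dalys_averted = 242_000         # reported DALYs averted annually (given as > 242 000)

children_reached = annual_cost / cost_per_child
cost_per_daly = annual_cost / dalys_averted

print(round(children_reached))  # ~1.13 million children, consistent with "> 1 million"
print(round(cost_per_daly))     # ~243, consistent with the reported ~$242 per DALY averted
```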
Conclusions:
In this context, SQ-LNS may be more cost-effective than other options such as MNP or the provision of complementary food, although the total cost for a programme including all age-eligible children would be high. Strategies to reduce costs, such as targeting to the most vulnerable populations and the elimination of taxes on SQ-LNS, may enhance financial feasibility.
Most neuropsychological tests were developed without the benefit of modern psychometric theory. We used item response theory (IRT) methods to determine whether a widely used test – the 26-item Matrix Reasoning subtest of the WAIS-IV – might be used more efficiently if it were administered using computerized adaptive testing (CAT).
Method:
Data on the Matrix Reasoning subtest from 2197 participants enrolled in the National Neuropsychology Network (NNN) were analyzed using a two-parameter logistic (2PL) IRT model. Simulated CAT results were generated to examine optimal short forms using fixed-length CATs of 3, 6, and 12 items and scores were compared to the original full subtest score. CAT models further explored how many items were needed to achieve a selected precision of measurement (standard error ≤ .40).
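For context, the 2PL model and the information-based stopping rule referred to above can be sketched as follows; the item parameters below are invented for illustration and are not the Matrix Reasoning estimates used in the study.

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL item response function: probability of a correct response at ability theta."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information contributed by a 2PL item at ability theta."""
    p = p_correct(theta, a, b)
    return a ** 2 * p * (1.0 - p)

# Invented discrimination (a) and difficulty (b) parameters, for illustration only
a = np.array([1.2, 0.9, 1.5, 1.1, 1.3])
b = np.array([-1.0, -0.3, 0.2, 0.8, 1.5])

theta = 0.0             # provisional ability estimate
administered = [2, 3]   # items given so far (a CAT would pick the most informative next item)

# The standard error of the ability estimate shrinks as item information accumulates;
# administration can stop once SE <= .40, i.e. approximate reliability 1 - SE^2 = .84
se = 1.0 / np.sqrt(item_information(theta, a[administered], b[administered]).sum())
reliability = 1.0 - se ** 2
print(se, reliability)
```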
Results:
The fixed-length CATs of 3, 6, and 12 items correlated well with full-length test scores (r = .90, .97, and .99, respectively). To achieve a standard error of .40 (approximate reliability = .84), only 3–7 items had to be administered for a large percentage of individuals.
Conclusions:
This proof-of-concept investigation suggests that the widely used Matrix Reasoning subtest of the WAIS-IV might be shortened by more than 70% in most examinees while maintaining acceptable measurement precision. If similar savings could be realized in other tests, the accessibility of neuropsychological assessment might be markedly enhanced, and more efficient time use could lead to broader subdomain assessment.
The prolonged COVID-19 pandemic has created unique and complex challenges in operational and capacity planning for pediatric emergency departments, as initially low pediatric patient volumes gave way to unpredictable patient surges during the Delta and Omicron waves. Compounded by widespread hospital supply-chain issues, staffing shortages due to infection and attrition, and a concurrent pediatric mental health crisis, these surges have pushed pediatric emergency department leaders to re-examine traditionally defined clinical processes and adopt innovative operational strategies. This study describes the strategic surge response and lessons learned by 3 major freestanding academic pediatric emergency departments in the western United States to help inform current and future pediatric pandemic preparedness.
Among patients with a history of infection with extended-spectrum β-lactamase (ESBL)-producing organisms, uncertainty remains regarding whether all such patients require ESBL-targeted therapy when presenting with a subsequent infection. We sought to determine the risks associated with a subsequent ESBL infection to help inform empiric antibiotic decisions.
Methods:
We conducted a retrospective cohort study of adult patients with a positive index culture for Escherichia coli or Klebsiella pneumoniae (EC/KP) who received medical care during 2017. Risk assessments were performed to identify factors associated with subsequent infection caused by ESBL-producing EC/KP.
Results:
In total, 200 patients were included in the cohort: 100 with ESBL-producing EC/KP and 100 with ESBL-negative EC/KP. Overall, 100 patients (50%) developed a subsequent infection; 22 of these infections were caused by ESBL-producing EC/KP, 43 by other bacteria, and 35 had no or negative cultures. Subsequent infection caused by ESBL-producing EC/KP occurred only when the index culture was also ESBL producing (22 vs 0). Among those with an ESBL-producing index culture, the incidences of subsequent infection caused by ESBL-producing EC/KP and by other bacteria were similar (22 vs 18; P = .428). Factors associated with subsequent infection caused by ESBL-producing EC/KP included a history of ESBL-producing index culture, time ≤180 days between index culture and subsequent infection, male sex, and a Charlson comorbidity index score >3.
Conclusions:
History of ESBL-producing EC/KP culture is associated with subsequent infection caused by ESBL-producing EC/KP, particularly within 180 days after the historical culture. Among patients presenting with infection and a history of ESBL-producing EC/KP, other factors should be considered in making empiric antibiotic decisions, and ESBL-targeted therapy may not always be warranted.
To evaluate the clinical impact of the BioFire FilmArray Pneumonia Panel (PNA panel) in critically ill patients.
Design:
Single-center, preintervention and postintervention retrospective cohort study.
Setting:
Tertiary-care academic medical center.
Patients:
Adult ICU patients.
Methods:
Patients with quantitative bacterial cultures obtained by bronchoalveolar lavage or tracheal aspirate either before (January–March 2021, preintervention period) or after (January–March 2022, postintervention period) implementation of the PNA panel were randomly screened until 25 patients per study month (75 in each cohort) who met the study criteria were included. Antibiotic use from the day of culture collection through day 5 was compared.
Results:
The primary outcome of median time to first antibiotic change based on microbiologic data was 50 hours before the intervention versus 21 hours after the intervention (P = .0006). Also, 56 postintervention regimens (75%) were eligible for change based on PNA panel results; actual change occurred in 30 regimens (54%). Median antibiotic days of therapy (DOTs) were 8 before the intervention versus 6 after the intervention (P = .07). For the patients with antibiotic changes made based on PNA panel results, the median time to first antibiotic change was 10 hours. For patients who were initially on inadequate therapy, time to adequate therapy was 67 hours before the intervention versus 37 hours after the intervention (P = .27).
Conclusions:
The PNA panel was associated with decreased time to first antibiotic change and fewer antibiotic DOTs. Its impact may have been larger if a higher percentage of potential antibiotic changes had been implemented. The PNA panel is a promising tool to enhance antibiotic stewardship.
To test the feasibility of targeted gown and glove use by healthcare personnel caring for high-risk nursing-home residents to prevent Staphylococcus aureus acquisition in short-stay residents.
Design:
Uncontrolled clinical trial.
Setting:
This study was conducted in 2 community-based nursing homes in Maryland.
Participants:
The study included 322 residents on mixed short- and long-stay units.
Methods:
During a 2-month baseline period, all residents had nose and inguinal fold swabs taken to estimate S. aureus acquisition. The intervention was iteratively developed using a participatory human factors engineering approach. During a 2-month intervention period, healthcare personnel wore gowns and gloves for high-risk care activities while caring for residents with wounds or medical devices, and S. aureus acquisition was measured again. Whole-genome sequencing was used to assess whether the acquisition represented resident-to-resident transmission.
Results:
Among short-stay residents, the methicillin-resistant S. aureus acquisition rate decreased from 11.9% during the baseline period to 3.6% during the intervention period (odds ratio [OR], 0.28; 95% CI, 0.08–0.92; P = .026). The methicillin-susceptible S. aureus acquisition rate went from 9.1% during the baseline period to 4.0% during the intervention period (OR, 0.41; 95% CI, 0.12–1.42; P = .15). The S. aureus resident-to-resident transmission rate decreased from 5.9% during the baseline period to 0.8% during the intervention period.
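As an arithmetic check, the reported odds ratios follow directly from the acquisition proportions given above (a sketch using the abstract's percentages; small discrepancies reflect rounding of the underlying counts):

```python
def odds_ratio(p_intervention, p_baseline):
    """Odds ratio comparing intervention-period vs baseline-period acquisition proportions."""
    odds_intervention = p_intervention / (1 - p_intervention)
    odds_baseline = p_baseline / (1 - p_baseline)
    return odds_intervention / odds_baseline

print(round(odds_ratio(0.036, 0.119), 2))  # MRSA: ~0.28, matching the reported OR of 0.28
print(round(odds_ratio(0.040, 0.091), 2))  # MSSA: ~0.42, close to the reported OR of 0.41
```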
Conclusions:
Targeted gown and glove use by healthcare personnel for high-risk care activities while caring for residents with wounds or medical devices, regardless of their S. aureus colonization status, is feasible and potentially decreases S. aureus acquisition and transmission in short-stay community-based nursing-home residents.
Compulsory admission procedures for patients with mental disorders vary between countries in Europe. The Ethics Committee of the European Psychiatric Association (EPA) launched a survey on involuntary admission procedures for patients with mental disorders in 40 countries, gathering information from all National Psychiatric Associations that are members of the EPA, in order to develop recommendations for improving involuntary admission processes and promoting voluntary care.
Methods.
The survey focused on legislation of involuntary admissions and key actors involved in the admission procedure as well as most common reasons for involuntary admissions.
Results.
We analyzed the categorical survey data thematically; the themes highlight that both medical and legal actors are involved in involuntary admission procedures.
Conclusions.
We conclude that the legal reasons for compulsory admission should be reworded in order to remove stigmatization of the patient; that raising awareness of involuntary admission procedures and patient rights among both patients and family advocacy groups is paramount; that communication about procedures should be widely available in lay language for the general population; and that training sessions and guidance should be available for legal and medical practitioners. Finally, people working in the field need to remain constantly aware of the ethical challenges surrounding compulsory admissions.
We examined demographic, clinical, and psychological characteristics of a large cohort (n = 368) of adults with dissociative seizures (DS) recruited to the CODES randomised controlled trial (RCT) and explored differences associated with age at onset of DS, gender, and DS semiology.
Methods
Prior to randomisation within the CODES RCT, we collected demographic and clinical data on 368 participants. We assessed psychiatric comorbidity using the Mini-International Neuropsychiatric Interview (M.I.N.I.) and a screening measure of personality disorder and measured anxiety, depression, psychological distress, somatic symptom burden, emotional expression, functional impact of DS, avoidance behaviour, and quality of life. We undertook comparisons based on reported age at DS onset (<40 v. ⩾40), gender (male v. female), and DS semiology (predominantly hyperkinetic v. hypokinetic).
Results
Our cohort was predominantly female (72%) and characterised by high levels of socio-economic deprivation. Two-thirds had predominantly hyperkinetic DS. Of the total, 69% had ⩾1 comorbid M.I.N.I. diagnosis (median number = 2), with agoraphobia being the most common concurrent diagnosis. Clinical levels of distress were reported by 86% and characteristics associated with maladaptive personality traits by 60%. Moderate-to-severe functional impairment, high levels of somatic symptoms, and impaired quality of life were also reported. Women had a younger age at DS onset than men.
Conclusions
Our study highlights the burden of psychopathology and socio-economic deprivation in a large, heterogeneous cohort of patients with DS. The lack of clear differences based on gender, DS semiology and age at onset suggests these factors do not add substantially to the heterogeneity of the cohort.
Acute change in mental status (ACMS), defined by the Confusion Assessment Method, is used to identify infections in nursing home residents. A medical record review revealed that none of 15,276 residents had an ACMS documented. Using the revised McGeer criteria with a possible ACMS definition, we identified 296 residents and 21 additional infections. The use of a possible ACMS definition should be considered for retrospective nursing home infection surveillance.
Basic Self disturbances (BSD), including changes of the 'pre-reflexive' sense of self and the loss of first-person perspective, are characteristic of schizophrenia spectrum disorders and highly prevalent in subjects at 'ultra-high risk' for psychosis (UHR). The current literature indicates that cortical midline structures (CMS) may be implicated in the neurobiological substrates of the 'basic self' in healthy controls.
Objectives
Neuroanatomical investigation of BSD in a UHR sample
Aims
To test the hypotheses that: (i) UHR subjects have higher Examination of Anomalous Self-Experience (EASE) scores compared with controls; (ii) UHR subjects have neuroanatomical alterations in the CMS compared with controls; and (iii) within UHR subjects, EASE scores are directly related to structural CMS alterations.
Methods
Thirty-two UHR subjects (27 antipsychotic-naïve) and 17 healthy controls (HC) were assessed with the 57-item semi-structured EASE interview. Voxel-based morphometry (VBM) was conducted in the same subjects, with a priori regions of interest (ROIs) defined in the CMS (anterior/posterior cingulate and medial prefrontal cortex).
Results
Despite high variability in the UHR group, the overall EASE score was higher (t test, p < 0.01; Cohen's d = 2.91) in the UHR group (mean = 30.15, SD = 16.46) than in the HC group (mean = 1.79, SD = 2.83). UHR subjects had gray matter reductions in the CMS compared with HC (p < 0.05, FWE-corrected). Across the whole sample, lower gray matter volume in the anterior cingulate was correlated with higher EASE scores (p < 0.05).
Conclusions
This study provides preliminary evidence that gray matter reductions in the CMS are correlated with BSD in UHR people.
Externalizing disorders are known to be partly heritable, but the biological pathways linking genetic risk to the manifestation of these costly behaviors remain under investigation. This study sought to identify neural phenotypes associated with genomic vulnerability for externalizing disorders.
Methods
One hundred fifty-five White, non-Hispanic veterans were genotyped using a genome-wide array and underwent resting-state functional magnetic resonance imaging. Genetic susceptibility was assessed using an independently developed polygenic score (PS) for externalizing, and functional neural networks were identified using graph-theory-based network analysis. Performance on inhibitory-control tasks and psychiatric diagnoses (alcohol/substance use disorders) were used to measure externalizing phenotypes.
Results
The externalizing PS predicted connectivity in a brain circuit (10 nodes, 9 links) centered on the left amygdala that included several cortical [bilateral inferior frontal gyrus (IFG) pars triangularis, left rostral anterior cingulate cortex (rACC)] and subcortical (bilateral amygdala, hippocampus, and striatum) regions. Directional analyses revealed that the bilateral amygdala influenced the left prefrontal cortex (IFG) in participants scoring higher on the externalizing PS, whereas the opposite direction of influence was observed for those scoring lower on the PS. Polygenic variation was also associated with higher participation coefficients for the bilateral amygdala and left rACC, suggesting that genes related to externalizing modulated the extent to which these nodes functioned as communication hubs.
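For readers unfamiliar with the metric, the participation coefficient referenced above is the standard graph-theoretic quantity P_i = 1 - sum_m (k_im / k_i)^2, where k_im is node i's degree within module m and k_i its total degree. A minimal sketch with a made-up adjacency matrix and module assignment (purely illustrative, not the study's network) follows.

```python
import numpy as np

def participation_coefficient(adj, modules):
    """Participation coefficient P_i = 1 - sum_m (k_im / k_i)^2 for each node of an
    undirected network; k_im is node i's degree within module m, k_i its total degree."""
    adj = np.asarray(adj, dtype=float)
    modules = np.asarray(modules)
    k = adj.sum(axis=1)
    pc = np.ones_like(k)
    for m in np.unique(modules):
        k_im = adj[:, modules == m].sum(axis=1)
        pc -= np.divide(k_im, k, out=np.zeros_like(k_im), where=k > 0) ** 2
    return pc

# Tiny toy network: 4 nodes in two modules (illustrative only)
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 1],
                [1, 1, 0, 0],
                [0, 1, 0, 0]])
print(participation_coefficient(adj, modules=[0, 0, 1, 1]))  # higher values = more hub-like
```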
Conclusions
Findings suggest that externalizing polygenic risk is associated with disrupted connectivity in a neural network implicated in emotion regulation, impulse control, and reinforcement learning. Results provide evidence that this network represents a genetically associated neurobiological vulnerability for externalizing disorders.
Stream sediment geochemistry provides an innovative method of assessing the basinal history of the Caledonian slate belts. Despite glaciation, the stream sediment geochemical patterns spatially mimic the outcrop of underlying bedrock lithologies. However, erosion from rock to sediment by fluvial processes may either increase or reduce an element’s abundance depending on the nature of its mineral host. An element held in heavy, resistate minerals will be concentrated, whereas one residing in unstable ferromagnesian minerals, which readily break down to clays during weathering, may be preferentially removed. Examples are provided from the Cr-Ni-V-Mg, base metal and Rb-Sr element suites. Primary and secondary bedrock patterns are recognized in the stream sediments. Primary patterns follow the original composition of the source bedrock, with steep gradients in the element distribution coinciding with lithostratigraphical boundaries. Such patterns also reveal subtle divisions within the established geological units for which the main compositional control was the nature of the ancient sedimentary provenance. Secondary patterns reflect remobilization of elements within the bedrock and so may cut across lithostratigraphical boundaries. These patterns (or their absence) are influenced by the thermal histories of the Caledonian basins, and so are indicative of the geotectonic regime in which the sedimentary sequences were originally deposited.
Posttraumatic stress disorder (PTSD) and stress/trauma exposure are cross-sectionally associated with advanced DNA methylation age relative to chronological age. However, longitudinal inquiry and examination of associations between advanced DNA methylation age and a broader range of psychiatric disorders is lacking. The aim of this study was to examine if PTSD, depression, generalized anxiety, and alcohol-use disorders predicted acceleration of DNA methylation age over time (i.e. an increasing pace, or rate of advancement, of the epigenetic clock).
Methods
Genome-wide DNA methylation and a comprehensive set of psychiatric symptoms and diagnoses were assessed in 179 Iraq/Afghanistan war veterans who completed two assessments over the course of approximately 2 years. Two DNA methylation age indices (Horvath and Hannum), each a weighted index of an array of genome-wide DNA methylation probes, were quantified. The pace of the epigenetic clock was operationalized as change in DNA methylation age as a function of time between assessments.
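A minimal sketch of how that pace variable could be computed is shown below; the file and column names are hypothetical, and the upstream estimation of Horvath/Hannum DNAm age uses the published probe weights, which are not reproduced here.

```python
import pandas as pd

# Hypothetical wide-format data: one row per veteran, DNAm age estimated at both assessments
df = pd.read_csv("dnam_age_two_waves.csv")  # columns: id, dnam_age_t1, dnam_age_t2, years_between

# Pace of the epigenetic clock: change in DNA methylation age per year of follow-up
df["clock_pace"] = (df["dnam_age_t2"] - df["dnam_age_t1"]) / df["years_between"]
print(df[["id", "clock_pace"]].head())
```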
Results
Analyses revealed that alcohol-use disorders (p = 0.001) and PTSD avoidance and numbing symptoms (p = 0.02) at Time 1 were associated with an increasing pace of the epigenetic clock over time, per the Horvath (but not the Hannum) index of cellular aging.
Conclusions
This is the first study to suggest that posttraumatic psychopathology is longitudinally associated with a quickened pace of the epigenetic clock. Results raise the possibility that accelerated cellular aging is a common biological consequence of stress-related psychopathology, which carries implications for identifying mechanisms of stress-related cellular aging and developing interventions to slow its pace.