Embedding climate resilient development principles in planning, urban design, and architecture means ensuring that transformation of the built environment helps achieve carbon neutrality, effective adaptation, and well-being for people and nature. Planners, urban designers, and architects are called to bridge the domains of research and practice and evolve their agency and capacity, developing methods and tools that are consistent across spatial scales to ensure the convergence of outcomes towards targets. Shaping change necessitates an innovative action-driven framework with multi-scale analysis of urban climate factors and co-mapping, co-design, and co-evaluation with city stakeholders and communities. This Element analyses how urban climate factors, system efficiency, form and layout, building envelope and surface materials, and green/blue infrastructure affect key metrics and indicators related to complementary aspects such as greenhouse gas emissions, impacts of extreme weather events, spatial and environmental justice, and human comfort. This title is also available as open access on Cambridge Core.
Recent increases in homophobic and transphobic harassment, hate crimes, anti-lesbian, gay, bisexual, transgender, gender nonconforming, and queer (LGBTQ+) legislation, and discrimination in healthcare toward LGBTQ+ persons require urgent attention.
This study describes seriously ill LGBTQ+ patients’ and partners’ experiences of discriminatory care delivered by healthcare providers.
Methods
Qualitative data from a mixed-methods study using an online survey were analyzed using a grounded theory approach. Seriously ill LGBTQ+ persons, their spouses/partners and widows were recruited from a wide range of organizations serving the LGBTQ+ community. Respondents were asked to describe instances where they felt they received poor care from a healthcare provider because they were LGBTQ+.
Results
Six main themes emerged: (1) disrespectful care; (2) inadequate care; (3) abusive care; (4) discriminatory care toward persons who identify as transgender; (5) discriminatory behaviors toward partners; and (6) intersectional discrimination. The findings provide evidence that some LGBTQ+ patients receive poor care at a vulnerable time in their lives. Transgender patients experience unique forms of discrimination that disregard or belittle their identity.
Significance of Results
Professional associations, accrediting bodies, and healthcare organizations should set standards for nondiscriminatory, respectful, competent, safe, and affirming care for LGBTQ+ patients. Healthcare organizations should implement mechanisms for identifying problems and ensuring nondiscrimination in services and employment; safety for patients and staff; strategies for outreach and marketing to the LGBTQ+ community; and ongoing staff training to ensure high-quality care for LGBTQ+ patients, partners, families, and friends. Policy actions are needed to combat discrimination and disparities in healthcare, including passage of the Equality Act by Congress.
In-patient mental health rehabilitation services provide specialist treatment to people with complex psychosis. On average, rehabilitation admissions last around a year and usually follow several years of recurrent and often lengthy psychiatric hospital admissions.
Aims
To compare in-patient service use before and after an in-patient rehabilitation admission, using electronic patient healthcare records in one National Health Service Trust in London.
Method
We carried out a retrospective cohort study comprising individuals with an in-patient rehabilitation admission lasting ≥84 days between 1 January 2010 and 30 April 2019, with ≥365 days of records available before and after their rehabilitation admission. We used negative binomial regression models to compare the number of in-patient days before and after the rehabilitation admission.
Results
A total of 172 individuals met our eligibility criteria. The median percentage of days spent as an in-patient before the rehabilitation admission was 29% (interquartile range 18–52%), and 8% (interquartile range 0–31%) after the admission. The regression model adjusted for potential confounder variables produced an incidence rate ratio of 0.520 (95% CI 0.367–0.737).
Conclusions
The rate of in-patient service use was halved in the period after an in-patient rehabilitation admission compared with the period before. This suggests that in-patient rehabilitation is a clinical and cost-effective intervention in the treatment and support of people with complex psychosis.
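The before/after comparison above can be illustrated with a minimal sketch of an unadjusted incidence rate ratio. The counts below are hypothetical; the study's reported IRR of 0.520 came from a negative binomial model adjusted for confounders, which this sketch does not reproduce.

```python
# Illustrative sketch (not the study's actual model): an unadjusted
# incidence rate ratio (IRR) comparing in-patient days after vs before
# a rehabilitation admission, using hypothetical counts.

def incidence_rate_ratio(events_after, days_after, events_before, days_before):
    """Ratio of event rates in two observation periods."""
    rate_after = events_after / days_after
    rate_before = events_before / days_before
    return rate_after / rate_before

# Hypothetical example: 80 in-patient days over 1000 days of follow-up
# after rehabilitation vs 290 in-patient days over 1000 days before.
irr = incidence_rate_ratio(80, 1000, 290, 1000)
print(round(irr, 3))  # 0.276 (unadjusted; the study's adjusted IRR was 0.520)
```

An IRR below 1 indicates fewer in-patient days per day of follow-up after the rehabilitation admission than before.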
A subgroup of CHDs can only be treated palliatively through a Fontan circulation. In a failing Fontan situation, serum proteins are lost non-specifically, which can also lead to a loss of vaccine antibodies; heart transplantation may then be the only feasible option.
Patient:
We describe a 17-year-old patient born with a hypoplastic left heart complex, who underwent Fontan completion at the age of 4 years and developed a failing Fontan physiology. Therefore, a Fontan takedown with creation of a reverse 1½ circulation was performed. Multiple exacerbations of protein-losing enteropathy occurred, with hypoproteinaemia, hypoalbuminaemia, and hypogammaglobulinaemia. The patient was hospitalised several times and treated with intravenous immunoglobulins and albumin for symptom control. Before one of these substitutions, immunoglobulin G against measles, mumps, and rubella was determined: the patient’s serum was positive for measles and rubella but negative for mumps. After administration of the intravenous therapy, the lacking antibodies were replenished, and the test for mumps was positive.
Method:
Serum samples were analysed by neutralisation test and enzyme-linked immunosorbent assay (ELISA).
Conclusion:
Although the patient had been vaccinated according to national guidelines, we saw an intermittent immune deficiency for mumps, but not for rubella and measles. For patients with a failing Fontan circulation, we recommend testing vaccine antibodies against mumps, measles, and rubella with an ELISA and, if negative, with a neutralisation test, especially in view of a possible heart transplantation, to detect a possible immune deficiency.
This paper presents a flexible SiGe monolithic microwave integrated circuit (MMIC) chipset for 120 GHz ultra-wideband frequency-modulated continuous wave radar systems. The highly integrated chipset is implemented with multiple-input and multiple-output radar in mind which leads to transmit and receive MMICs with four integrated channels in each chip. The transmitter achieves an output power of 12.9 dBm with a total power consumption of only 403 mW. The receiver chip incorporates a sub-harmonic approach for suppression of leakage radiation at 120 GHz through a receive channel. Both chips integrate active multiplier chains that are driven by a third reference dual band voltage-controlled oscillator (VCO) MMIC that can deliver an output at center frequencies of 15 or 30 GHz. The reference VCO MMIC demonstrates relative tuning ranges of 32%.
This study aims to outline Clostridioides difficile infection (CDI) trends and outcomes in Mexican healthcare facilities during the COVID-19 pandemic.
Design:
Observational case series.
Setting:
Sixteen public hospitals and private academic healthcare institutions across eight states in Mexico from January 2016 to December 2022.
Patients:
CDI patients.
Methods:
Demographic, clinical, and laboratory data of CDI patients were obtained from clinical records. Cases were classified as community or healthcare-associated infections, with incidence rates calculated as cases per 10,000 patient days. Risk factors for 30-day all-cause mortality were analyzed by multivariate logistic regression.
Results:
We identified 2,356 CDI cases: 2,118 (90%) were healthcare-associated, and 232 (10%) were community-associated. Common comorbidities included hypertension, diabetes, and cancer. Previous high use of proton-pump inhibitors, steroids, and antibiotics was observed. Recurrent infection occurred in 112 (5%) patients, and 30-day mortality in 371 (16%). Risk factors associated with death were a high Charlson score, prior use of steroids, concomitant use of antibiotics, leukopenia, leukocytosis, elevated serum creatinine, hypoalbuminemia, septic shock or abdominal sepsis, and SARS-CoV-2 coinfection. The healthcare-associated CDI incidence remained stable at 4.78 cases per 10,000 patient days during the pre-pandemic and pandemic periods. However, the incidence was higher in public hospitals.
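The incidence measure used in these results (cases per 10,000 patient-days) is a simple rate. A minimal sketch, with hypothetical counts rather than the study's raw data:

```python
# Sketch of the incidence calculation described in the Methods: CDI cases
# expressed per 10,000 patient-days. The counts below are hypothetical.

def incidence_per_10000(cases, patient_days):
    """Cases per 10,000 patient-days of exposure."""
    return cases / patient_days * 10_000

# Hypothetical hospital-year: 52 healthcare-associated cases over
# 108,800 patient-days.
print(round(incidence_per_10000(52, 108_800), 2))  # 4.78
```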
Conclusions:
Our study underscores the need for routine epidemiology surveillance and standardized CDI classification protocols in Mexican institutions. Though CDI rates in our country align with those in some European countries, disparities between public and private healthcare sectors emphasize the importance of targeted interventions.
Accurate measurement of transcutaneous oxygen saturation is important for the assessment of cyanosis in CHD. The aim of this study was to evaluate supplementary transcutaneous oxygen saturation measurement with an Apple Watch® in children with cyanotic heart disease.
Material and methods:
During a six-minute walk test, transcutaneous oxygen saturation was measured simultaneously with an oximeter (Nellcor, Medtronic, USA) and an Apple Watch® Series 7 (Apple Inc., USA) in 36 children with cyanotic heart disease.
Results:
Median age was 9.2 (IQR 5.7–13.8) years. Transcutaneous oxygen saturation measurement with the Apple Watch® was possible in 35/36 subjects before and 34/36 subjects after the six-minute walk test. Children in whom Apple Watch® measurement was not possible had a transcutaneous oxygen saturation < 85% on the oximeter. Before the six-minute walk test, median transcutaneous oxygen saturation was 93% (IQR 91–97%) measured by the oximeter and 95% (IQR 93–96%) by the Apple Watch®. After a median walking distance of 437 (IQR 360–487) m, transcutaneous oxygen saturation dropped to 92% (IQR 88–95%, p < 0.001) by the oximeter and to 94% (IQR 90–96%, p = 0.013) measured with the Apple Watch®.
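As a sketch of how such paired readings might be summarised as median (IQR), the following uses Python's standard statistics module on watch-minus-oximeter differences; the readings are hypothetical, not study data.

```python
import statistics

# Illustrative sketch: summarising paired SpO2 readings as median (IQR).
# All values below are hypothetical, not taken from the study.
oximeter    = [91, 92, 93, 93, 94, 95, 96, 97]
apple_watch = [93, 94, 95, 95, 95, 96, 96, 97]

# Positive differences mean the watch reads higher than the oximeter.
diffs = [w - o for w, o in zip(apple_watch, oximeter)]
median_diff = statistics.median(diffs)
q1, _, q3 = statistics.quantiles(diffs, n=4, method="inclusive")

print(median_diff)  # 1.5
print(q3 - q1)      # interquartile range of the differences
```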
Conclusion:
In children with mild cyanosis, measurement of transcutaneous oxygen saturation with an Apple Watch® showed valid results only if transcutaneous oxygen saturation was > 85%, with higher values being measured by the smartwatch. In children with moderate or severe cyanosis, transcutaneous oxygen saturation measurement with the Apple Watch® was not reliable and cannot be recommended for monitoring oxygen saturation at home.
Clozapine is the most effective antipsychotic for treatment-resistant psychosis. However, clozapine is underutilised in part because of potential agranulocytosis. Accumulating evidence indicates that below-threshold haematological readings in isolation are not diagnostic of life-threatening clozapine-induced agranulocytosis (CIA).
Aims
To examine the prevalence and timing of CIA using different diagnostic criteria and to explore demographic differences of CIA in patients registered on the UK Central Non-Rechallenge Database (CNRD).
Method
We analysed data of all patients registered on the UK Clozaril® Patient Monitoring Service Central Non-Rechallenge Database (at least one absolute neutrophil count (ANC) < 1.5 × 10⁹/L and/or white blood cell count < 3.0 × 10⁹/L) between May 2000 and February 2021. We calculated prevalence rates of agranulocytosis using threshold-based and pattern-based criteria, stratified by demographic factors (gender, age and ethnicity). Differences in epidemiology based on rechallenge status and clozapine indication were explored. The proportion of patients who recorded agranulocytosis from a normal ANC was explored.
Results
Of the 3029 patients registered on the CNRD with 283 726 blood measurements, 593 (19.6%) were determined to have threshold-based agranulocytosis and 348 (11.4%) pattern-based agranulocytosis. In the total sample (75 533), the prevalence of threshold-based and pattern-based agranulocytosis was 0.8% and 0.5%, respectively. The median time to agranulocytosis was 32 (IQR 184) weeks under the threshold-based definition and 15 (IQR 170) weeks under the pattern-based definition. Among age groups, the prevalence of pattern-based and threshold-based agranulocytosis was highest in the >48 age group. Prevalence rates were greatest for White (18%) and male individuals (13%), and lowest for Black individuals (0.1%). The proportion of people who were determined to have pattern-based agranulocytosis without passing through neutropenia was 70%.
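The haematological criteria involved here can be sketched as simple predicates. The CNRD registration thresholds (ANC < 1.5 and/or WBC < 3.0 ×10⁹/L) are stated in the abstract; the threshold-based agranulocytosis cut-off of ANC < 0.5 ×10⁹/L used below is the conventional definition and is an assumption here, as the abstract does not specify it.

```python
# Hedged sketch of the haematological thresholds discussed above.
# Units throughout: ×10^9 cells/L.

def cnrd_registrable(anc, wbc):
    """CNRD registration criteria as stated in the abstract."""
    return anc < 1.5 or wbc < 3.0

def threshold_agranulocytosis(anc):
    """Conventional threshold definition (assumed, not from the abstract)."""
    return anc < 0.5

print(cnrd_registrable(1.2, 4.0))      # True  (neutropenic reading)
print(threshold_agranulocytosis(1.2))  # False (neutropenia, not agranulocytosis)
print(threshold_agranulocytosis(0.3))  # True
```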
Conclusions
Threshold-based definition of agranulocytosis may over-diagnose CIA. Monitoring schemes should take into consideration neutrophil patterns to correctly identify clinically relevant CIA. In marked contrast to previous studies, CIA occurred least in Black individuals and most in White individuals.
The endemic Little Vermilion Flycatcher (LVF) Pyrocephalus nanus has suffered a drastic decline on Santa Cruz Island, Galapagos, where it was common 30 years ago. Currently, fewer than 40 individuals remain in the last remnants of natural humid forest in the Galapagos National Park on the island. This small population has low reproductive success, which is contributing to its decline in Santa Cruz. Previous studies have identified Avian Vampire Fly Philornis downsi parasitism, changes in food sources, and habitat alteration as threats to this species. In Santa Cruz, invasive plants may strongly affect the reproductive success of LVF because they limit accessibility to prey near the ground, the preferred foraging niche of these birds. Since 2019, we restored the vegetation in seven plots of 1 ha each by removing invasive blackberry plants and other introduced plant species. In all nests that reached late incubation, we also reduced the number of Avian Vampire Fly larvae. In this study, we compared foraging and perch height, pair formation, incubation time, and reproductive success between managed and unmanaged areas. As predicted, we found significantly lower foraging height and perch height in 2021 in managed areas compared with unmanaged areas. In 2020, the daily failure rate (DFR) of nests in the egg stage did not differ between management types; however, in 2021, the DFR in the egg stage was significantly lower in managed areas than in unmanaged areas. The DFR during the nestling stage was similar between managed and unmanaged areas in 2020, but in 2021, only nests in managed areas reached the nestling stage. Females brooded significantly more during the incubation phase in managed areas. Additionally, we found significantly higher reproductive success in managed areas compared with unmanaged areas in 2021, but not in 2020. 
Habitat restoration is a long-term process and these findings suggest that habitat management positively affects this small population in the long term.
Production and utilization of crop residues as mulch and effective weed management are two central elements in the successful implementation of Conservation Agriculture (CA) systems in southern Africa. Yet, the challenges of crop residue availability for mulch or the difficulties in managing weed proliferation in CA systems are bigger than a micro-level focus on weeds and crop residues themselves. The bottlenecks are symptoms of broader systemic complications that cannot be resolved without appreciating the interactions between the current scientific understanding of CA and its application in smallholder systems, private incentives, social norms, institutions, and government policy. In this paper, we elucidate a series of areas that represent some unquestioned answers about chemical weed control and unanswered questions about how to maintain groundcover, demanding more research along the natural and social sciences continuum. In some communities, traditional rules that allow free-range grazing of livestock after harvesting present a barrier to surface crop residue management. On the other hand, many of the communities either burn, remove, or incorporate the residues into the soil, thus hindering the near-permanent soil cover required in CA systems. The lack of soil cover also means that weed management through soil mulch is unachievable. Herbicides are often a successful stopgap solution to weed control, but they are costly, and most farmers do not use them as recommended, which reduces efficacy. Besides, the use of herbicides can cause environmental hazards and may affect human health. Here, we suggest further assessment of the manipulation of crop competition, the use of vigorously growing cover crops, exploration of allelopathy, and use of microorganisms in managing weeds and reducing seed production to deplete the soil weed seed bank.
We also suggest in situ production of plant biomass, use of unpalatable species for mulch generation and change of grazing by-laws towards a holistic management of pastures to reduce the competition for crop residues. However, these depend on the socio-economic status dynamics at farmer and community level.
Electronic measurement systems in the THz frequency range are often bulky and expensive devices. While some compact single-chip systems operating in the high millimeter-wave frequency range have recently been published, compact measurement systems in the low THz frequency range are still rare. The emergence of new silicon-germanium (SiGe) semiconductor technologies allows the integration of system components operating in the low THz frequency range, such as oscillators, frequency multipliers, frequency dividers, and antennas, into a compact monolithic microwave integrated circuit (MMIC) that contains most of the components needed to implement a low-cost and compact frequency-modulated continuous-wave radar transceiver. This article presents a single-transceiver solution containing all necessary components. It introduces a 0.48 THz radar transceiver MMIC with a tuning range of 43 GHz and an output power of up to −9.4 dBm in the SG13G3 130 nm SiGe technology by IHP. The MMIC is complemented by a dielectric lens antenna made of polytetrafluoroethylene, providing up to 39.3 dBi of directivity and half-power beam widths of 0.95° in transmit and receive directions. The suppression of clutter from unwanted targets deviating more than 6° from antenna boresight is higher than 24.6 dB in the E- and H-planes.
Binge-eating disorder (BED) co-occurs with neurobehavioral alterations in the processing of disorder-relevant content such as visual food stimuli. Whether neurofeedback (NF) directly targeting them is suited for treatment remains unclear. This study sought to determine feasibility and estimate effects of individualized, functional near-infrared spectroscopy-based real-time NF (rtfNIRS-NF) and high-beta electroencephalography-based NF (EEG-NF), assuming superiority over waitlist (WL).
Methods
Single-center, assessor-blinded feasibility study with randomization to rtfNIRS-NF, EEG-NF, or WL and assessments at baseline (t0), post-assessment (t1), and 6-month follow-up (t2). NF comprised twelve 60-min food-specific rtfNIRS-NF or EEG-NF sessions over 8 weeks. The primary outcome was interview-assessed binge-eating frequency at t1. Secondary outcomes included feasibility, eating disorder symptoms, mental and physical health, weight management-related behavior, executive functions, and brain activity at t1 and t2.
Results
In 72 patients (intent-to-treat), the results showed feasibility of NF regarding recruitment, attrition, adherence, compliance, acceptance, and assessment completion. Binge eating improved at t1 by −8.0 episodes, without superiority of NF v. WL (−0.8 episodes, 95% CI −2.4 to 4.0), but with improved estimates in NF at t2 relative to t1. NF was better than WL for food craving, anxiety symptoms, and body mass index, but overall effects were mostly small. Brain activity changes were near zero.
Conclusions
The results show feasibility of food-specific rtfNIRS-NF and EEG-NF in BED, and no posttreatment differences v. WL, but possible continued improvement of binge eating. Confirmatory and mechanistic evidence is warranted in a double-blind randomized design with long-term follow-up, considering dose–response relationships and modes of delivery.
In England, a range of mental health crisis care models and approaches to organising crisis care systems have been implemented, but characteristics associated with their effectiveness are poorly understood.
Aims
To (a) develop a typology of catchment area mental health crisis care systems and (b) investigate how crisis care service models and system characteristics relate to psychiatric hospital admissions and detentions.
Method
Crisis systems data were obtained from a 2019 English national survey. Latent class analyses were conducted to identify discernible typologies, and mixed-effects negative binomial regression models were fitted to explore associations between crisis care models and admissions and detention rates, obtained from nationally reported data.
Results
No clear typology of catchment area crisis care systems emerged. Regression models suggested that provision of a crisis telephone service within the local crisis system was associated with an 11.6% lower admission rate and a 15.3% lower detention rate. Provision of a crisis cafe was associated with a 7.8% lower admission rate. The provision of a crisis assessment team separate from the crisis resolution and home treatment service was associated with a 12.8% higher admission rate.
Conclusions
The configuration of crisis care systems varies considerably in England, but we could not derive a typology that convincingly categorised crisis care systems. Our results suggest that a crisis phone line and a crisis cafe may be associated with lower admission rates. However, our findings suggest crisis assessment teams, separate from home treatment teams, may not be associated with reductions in admission and detentions.
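The percentage differences reported in these results correspond directly to fitted rate ratios from the negative binomial models: a rate ratio RR implies a (1 − RR) × 100% lower rate. A minimal sketch of that conversion:

```python
# Sketch of the relation between a fitted rate ratio and the
# percentage difference in rates quoted in the Results.

def pct_lower(rate_ratio):
    """Percentage reduction in rate implied by a rate ratio < 1."""
    return (1 - rate_ratio) * 100

# e.g. the reported 11.6% lower admission rate for crisis telephone
# services corresponds to a rate ratio of about 0.884.
print(round(pct_lower(0.884), 1))  # 11.6
```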
We recently reported on the radio-frequency attenuation length of cold polar ice at Summit Station, Greenland, based on bi-static radar measurements of radio-frequency bedrock echo strengths taken during the summer of 2021. Those data also allow studies of (a) the relative contributions of coherent (such as discrete internal conducting layers with sub-centimeter transverse scale) vs incoherent (e.g. bulk volumetric) scattering, (b) the magnitude of internal layer reflection coefficients, (c) limits on signal propagation velocity asymmetries (‘birefringence’) and (d) limits on signal dispersion in-ice over a bandwidth of ~100 MHz. We find that (1) attenuation lengths approach 1 km in our band, (2) after averaging 10 000 echo triggers, reflected signals observable over the thermal floor (to depths of ~1500 m) are consistent with being entirely coherent, (3) internal layer reflectivities are ≈ −60 to −70 dB, (4) birefringent effects for vertically propagating signals are smaller by an order of magnitude relative to South Pole and (5) within our experimental limits, glacial ice is non-dispersive over the frequency band relevant for neutrino detection experiments.
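The layer reflectivities quoted above are power ratios expressed in decibels; a minimal sketch of the conversion:

```python
import math

# Illustrative sketch: expressing a reflected/incident power ratio in dB,
# as used for the internal-layer reflectivities of roughly -60 to -70 dB.

def reflectivity_db(power_ratio):
    """Power ratio -> decibels (10 log10 for power quantities)."""
    return 10 * math.log10(power_ratio)

print(round(reflectivity_db(1e-6), 1))  # -60.0 (power ratio of 10^-6)
print(round(reflectivity_db(1e-7), 1))  # -70.0
```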
The Eighth World Congress of Pediatric Cardiology and Cardiac Surgery (WCPCCS) will be held in Washington DC, USA, from Saturday, 26 August, 2023 to Friday, 1 September, 2023, inclusive. The Eighth World Congress of Pediatric Cardiology and Cardiac Surgery will be the largest and most comprehensive scientific meeting dedicated to paediatric and congenital cardiac care ever held. At the time of writing, the Eighth World Congress of Pediatric Cardiology and Cardiac Surgery has 5,037 registered attendees (and rising) from 117 countries, a truly diverse and international faculty of over 925 individuals from 89 countries, over 2,000 individual abstracts and poster presenters from 101 countries, and a Best Abstract Competition featuring 153 oral abstracts from 34 countries. For information about the Eighth World Congress of Pediatric Cardiology and Cardiac Surgery, please visit www.WCPCCS2023.org. The purpose of this manuscript is to review the activities related to global health and advocacy that will occur at the Eighth World Congress of Pediatric Cardiology and Cardiac Surgery.
Acknowledging the need for urgent change, we wanted to take the opportunity to bring a common voice to the global community and issue the Washington DC WCPCCS Call to Action on Addressing the Global Burden of Pediatric and Congenital Heart Diseases. A copy of this Washington DC WCPCCS Call to Action is provided in the Appendix of this manuscript. This Washington DC WCPCCS Call to Action is an initiative aimed at increasing awareness of the global burden, promoting the development of sustainable care systems, and improving access to high quality and equitable healthcare for children with heart disease as well as adults with congenital heart disease worldwide.
The objective of this contribution is to present the use of Model-Based Systems Engineering (MBSE) within the engineering design community. Based on a differentiation between SE and MBSE, the definition of three core MBSE elements (modelling method, modelling language, and modelling tool), and the three major aspects of a consistent system model (requirements, behaviour, and structure), a structured review is conducted, focussing on the understanding and motivation behind MBSE as well as the modelling of systems. The review includes 93 publications from the Design Society library and the proceedings of the CIRP Design conferences from 2011 to 2022. The review points out that there is an increasing application of MBSE within the engineering design community, mainly focussing on architecture definition or combined engineering activities. Only a small portion of the works (16 publications) provide a consistent approach, as these publications link all aspects of the system model and consider all three MBSE elements. It can be concluded that there is a diffuse understanding of MBSE and that different motivations are given for applying more formal system models as well as modelling tools.
Frontal ablation, the combination of submarine melting and iceberg calving, changes the geometry of a glacier's terminus, influencing glacier dynamics, the fate of upwelling plumes and the distribution of submarine meltwater input into the ocean. Directly observing frontal ablation and terminus morphology below the waterline is difficult, however, limiting our understanding of these coupled ice–ocean processes. To investigate the evolution of a tidewater glacier's submarine terminus, we combine 3-D multibeam point clouds of the subsurface ice face at LeConte Glacier, Alaska, with concurrent observations of environmental conditions during three field campaigns between 2016 and 2018. We observe terminus morphology that was predominantly overcut (52% in August 2016, 63% in May 2017 and 74% in September 2018), accompanied by high multibeam sonar-derived melt rates (4.84 m d−1 in 2016, 1.13 m d−1 in 2017 and 1.85 m d−1 in 2018). We find that periods of high subglacial discharge lead to localized undercut discharge outlets, but adjacent to these outlets the terminus maintains significantly overcut geometry, with an ice ramp that protrudes 75 m into the fjord in 2017 and 125 m in 2018. Our data challenge the assumption that tidewater glacier termini are largely undercut during periods of high submarine melting.
The use of personal protective equipment (PPE) in prehospital emergency care has significantly increased since the onset of the coronavirus disease 2019 (COVID-19) pandemic. Several studies investigating the potential effects of PPE use by Emergency Medical Service providers on the quality of chest compressions during resuscitation have been inconclusive.
Study Objectives:
This study aimed to determine whether the use of PPE affects the quality of chest compressions or influences select physiological biomarkers that are associated with stress.
Methods:
This was a prospective randomized, quasi-experimental crossover study with 35 Emergency Medical Service providers who performed 20 minutes of chest compressions on a manikin. Two iterations were completed in a randomized order: (1) without PPE and (2) with PPE consisting of Tyvek, goggles, KN95 mask, and nitrile gloves. The rate and depth of chest compressions were measured. Salivary cortisol, lactate, end-tidal carbon dioxide (EtCO2), and body temperature were measured before and after each set of chest compressions.
Results:
There were no differences in the quality of chest compressions (rate and depth) between the two groups (P >.05). After performing chest compressions, the group with PPE did not have elevated levels of cortisol, lactate, or EtCO2 when compared to the group without PPE, but did have a higher body temperature (P <.001).
Conclusion:
The use of PPE during resuscitation did not lower the quality of chest compressions, nor did it lead to higher stress-associated biomarker levels, with the exception of body temperature.
Reward processing has been proposed to underpin the atypical social feature of autism spectrum disorder (ASD). However, previous neuroimaging studies have yielded inconsistent results regarding the specificity of atypicalities for social reward processing in ASD.
Aims
Utilising a large sample, we aimed to assess reward processing in response to reward type (social, monetary) and reward phase (anticipation, delivery) in ASD.
Method
Functional magnetic resonance imaging during social and monetary reward anticipation and delivery was performed in 212 individuals with ASD (7.6–30.6 years of age) and 181 typically developing participants (7.6–30.8 years of age).
Results
Across social and monetary reward anticipation, whole-brain analyses showed hypoactivation of the right ventral striatum in participants with ASD compared with typically developing participants. Further, region of interest analysis across both reward types yielded ASD-related hypoactivation in both the left and right ventral striatum. Across delivery of social and monetary reward, hyperactivation of the ventral striatum in individuals with ASD did not survive correction for multiple comparisons. Dimensional analyses of autism and attention-deficit hyperactivity disorder (ADHD) scores were not significant. In categorical analyses, post hoc comparisons showed that ASD effects were most pronounced in participants with ASD without co-occurring ADHD.
Conclusions
Our results do not support current theories linking atypical social interaction in ASD to specific alterations in social reward processing. Instead, they point towards a generalised hypoactivity of ventral striatum in ASD during anticipation of both social and monetary rewards. We suggest this indicates attenuated reward seeking in ASD independent of social content and that elevated ADHD symptoms may attenuate altered reward seeking in ASD.