Attention-deficit/hyperactivity disorder (ADHD) is a highly prevalent psychiatric condition that frequently originates in early development and is associated with a variety of functional impairments. Despite a large functional neuroimaging literature on ADHD, our understanding of the neural basis of this disorder remains limited, and existing primary studies on the topic include somewhat divergent results.
Objectives
The present meta-analysis aims to advance our understanding of the neural basis of ADHD by identifying the most statistically robust patterns of abnormal neural activation throughout the whole brain in individuals diagnosed with ADHD compared to age-matched healthy controls.
Methods
We conducted a meta-analysis of task-based functional magnetic resonance imaging (fMRI) activation studies of ADHD. Following PRISMA guidelines, this included a comprehensive PubMed search, predetermined inclusion criteria, and two independent coding teams who evaluated studies and included all task-based, whole-brain, fMRI activation studies that compared participants diagnosed with ADHD to age-matched healthy controls. We then performed multilevel kernel density analysis (MKDA), a well-established, whole-brain, voxelwise approach that quantitatively combines existing primary fMRI studies, with ensemble thresholding (p<0.05-0.0001) and multiple comparisons correction.
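For readers unfamiliar with MKDA, its core step can be illustrated with a short sketch: each study's reported peak coordinates are convolved with a spherical kernel to form a binary comparison-indicator map, and these maps are averaged across studies to give a voxelwise activation density. The code below is an illustration only, assuming peaks are already in a common stereotactic space; it omits the study weighting, ensemble thresholding and Monte Carlo multiple-comparisons correction used in the actual analysis.

```python
import numpy as np

def mkda_density(study_peaks, grid_shape=(91, 109, 91), voxel_size=2.0, radius=10.0):
    """Illustrative MKDA-style density map (not the full published pipeline).

    study_peaks : list of (n_i, 3) arrays of peak coordinates in mm, one per study.
    Returns the proportion of studies with a peak within `radius` mm of each voxel.
    """
    zi, yi, xi = np.indices(grid_shape)
    coords = np.stack([xi, yi, zi], axis=-1) * voxel_size      # voxel centres in mm
    density = np.zeros(grid_shape)
    for peaks in study_peaks:
        indicator = np.zeros(grid_shape, dtype=bool)           # binary indicator map for this study
        for p in np.atleast_2d(peaks):
            indicator |= np.linalg.norm(coords - p, axis=-1) <= radius
        density += indicator
    return density / len(study_peaks)
```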
Results
Participants diagnosed with ADHD (N=1,550), relative to age-matched healthy controls (N=1,340), exhibited statistically significant (p<0.05-0.0001; FWE-corrected) patterns of abnormal activation in multiple regions of the cerebral cortex and basal ganglia across a variety of cognitive control tasks.
Conclusions
This study advances our understanding of the neural basis of ADHD and may aid in the development of new brain-based clinical interventions as well as diagnostic tools and treatment matching protocols for patients with ADHD. Future studies should also investigate the similarities and differences in neural signatures between ADHD and other highly comorbid psychiatric disorders.
Understanding characteristics of healthcare personnel (HCP) with SARS-CoV-2 infection supports the development and prioritization of interventions to protect this important workforce. We report detailed characteristics of HCP who tested positive for SARS-CoV-2 from April 20, 2020 through December 31, 2021.
Methods:
CDC collaborated with Emerging Infections Program sites in 10 states to interview HCP with SARS-CoV-2 infection (case-HCP) about their demographics, underlying medical conditions, healthcare roles, exposures, personal protective equipment (PPE) use, and COVID-19 vaccination status. We grouped case-HCP by healthcare role. To describe residential social vulnerability, we merged geocoded HCP residential addresses with CDC/ATSDR Social Vulnerability Index (SVI) values at the census tract level. We defined highest and lowest SVI quartiles as high and low social vulnerability, respectively.
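As an illustration of the tract-level linkage described above, the sketch below joins case-HCP records to CDC/ATSDR SVI values by census-tract FIPS code and labels the highest and lowest quartiles as high and low social vulnerability. File and column names are hypothetical; the interview data and geocoded addresses are not distributed with this report.

```python
import pandas as pd

# Hypothetical file and column names, used for illustration only.
hcp = pd.read_csv("case_hcp_geocoded.csv")       # one row per case-HCP, with residential tract FIPS
svi = pd.read_csv("svi_census_tracts.csv")       # CDC/ATSDR SVI, one row per census tract

merged = hcp.merge(svi[["tract_fips", "svi_overall"]], on="tract_fips", how="left")

# Quartiles of the overall SVI ranking; quartile 4 = high, quartile 1 = low vulnerability.
merged["svi_quartile"] = pd.qcut(merged["svi_overall"], 4, labels=False) + 1
merged["social_vulnerability"] = merged["svi_quartile"].map(
    lambda q: "high" if q == 4 else ("low" if q == 1 else "middle")
)
```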
Results:
Our analysis included 7,531 case-HCP. Most case-HCP with roles as certified nursing assistant (CNA) (444, 61.3%), medical assistant (252, 65.3%), or home healthcare worker (HHW) (225, 59.5%) reported their race and ethnicity as either non-Hispanic Black or Hispanic. More than one third of HHWs (166, 45.2%), CNAs (283, 41.7%), and medical assistants (138, 37.9%) reported a residential address in the high social vulnerability category. The proportion of case-HCP who reported using recommended PPE at all times when caring for patients with COVID-19 was lowest among HHWs compared with other roles.
Conclusions:
To mitigate SARS-CoV-2 infection risk in healthcare settings, infection prevention and control interventions should be specific to HCP roles and educational backgrounds. Additional interventions are needed to address high social vulnerability among HHWs, CNAs, and medical assistants.
Bromide-containing impurities were found to decrease the thermal stability of quaternary alkyl ammonium-modified layered silicates. Improved purification procedures completely removed bromide and led to a 20°C to >100°C increase in organically modified layered silicate thermal stability. Using mass spectrometry and thermal and electrochemical analysis, N,N-dimethyl-N,N-dioctadecyl quaternary ammonium-modified montmorillonite and fluorinated synthetic mica were found to degrade primarily through elimination and nucleophilic attack by these anions. The nature of residual bromides was identified and quantified, and the efficiency of removing these anions was found to be solvent dependent; sequential extraction, first ethanol then tetrahydrofuran, gave the best results. This exhaustive extraction method represents a viable alternative to the use of expensive, more thermally stable onium ion treatments for layered silicates.
Different fertilization strategies can be adopted to optimize the productive components of an integrated crop–livestock system. The current research evaluated how the application of P and K to soybean (Glycine max (L.) Merr.) or Urochloa brizantha (Hochst. ex A. Rich.) R. D. Webster cv. BRS Piatã, with or without nitrogen in the pasture phase, affects the accumulation and chemical composition of forage and animal productivity. The treatments were distributed in randomized blocks with three replications. Four fertilization strategies were tested: (1) conventional fertilization with P and K in the crop phase (CF–N); (2) conventional fertilization with nitrogen in the pasture phase (CF + N); (3) system fertilization with P and K in the pasture phase (SF–N); (4) system fertilization with nitrogen in the pasture phase (SF + N). System fertilization increased forage accumulation from 15 710 to 20 920 kg DM/ha/year compared to conventional fertilization without nitrogen. Stocking rate (3.1 vs. 2.8 AU/ha; SEM = 0.12) and gain per area (458 vs. 413 kg BW/ha; SEM = 27.9) were higher in the SF–N than CF–N, although the average daily gain was lower (0.754 vs. 0.792 kg LW/day; SEM = 0.071). N application in the pasture phase, in both conventional and system fertilization, resulted in higher crude protein, stocking rate and gain per area. Applying nitrogen and relocating P and K from the crop to the pasture phase increases animal productivity and improves forage chemical composition in integrated crop–livestock systems.
Area-based conservation is a widely used approach for maintaining biodiversity, and there are ongoing discussions over what is an appropriate global conservation area coverage target. To inform such debates, it is necessary to know the extent and ecological representativeness of the current conservation area network, but this is hampered by gaps in existing global datasets. In particular, although data on privately and community-governed protected areas and other effective area-based conservation measures are often available at the national level, it can take many years to incorporate these into official datasets. This suggests a complementary approach is needed based on selecting a sample of countries and using their national-scale datasets to produce more accurate metrics. However, every country added to the sample increases the costs of data collection, collation and analysis. To address this, here we present a data collection framework underpinned by a spatial prioritization algorithm, which identifies a minimum set of countries that are also representative of 10 factors that influence conservation area establishment and biodiversity patterns. We then illustrate this approach by identifying a representative set of sampling units that cover 10% of the terrestrial realm, which included areas in only 25 countries. In contrast, selecting 10% of the terrestrial realm at random included areas across a mean of 162 countries. These sampling units could be the focus of future data collation on different types of conservation area. Analysing these data could produce more rapid and accurate estimates of global conservation area coverage and ecological representativeness, complementing existing international reporting systems.
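The prioritization algorithm itself is not reproduced here, but the general idea of selecting a small yet representative set of sampling units can be sketched with a greedy complementarity heuristic: repeatedly add the unit that covers the most still-unmet representation targets (for example, classes of the 10 factors). This is a generic minimum-set sketch under assumed inputs, not the authors' implementation, and it ignores the 10% area target and spatial constraints.

```python
def greedy_minimum_set(units, targets, covers):
    """Greedy minimum-set heuristic: `covers[u]` is the set of representation
    targets (e.g. factor classes) that sampling unit u would contribute.
    Returns a small list of units covering all achievable targets."""
    unmet, chosen = set(targets), []
    while unmet:
        best = max(units, key=lambda u: len(covers[u] & unmet))
        gained = covers[best] & unmet
        if not gained:
            break                      # remaining targets cannot be covered
        chosen.append(best)
        unmet -= gained
    return chosen

# Toy usage with made-up units and factor classes.
covers = {"A": {"biome1", "gov_private"}, "B": {"biome2"}, "C": {"biome1", "biome2", "gov_state"}}
print(greedy_minimum_set(covers.keys(), {"biome1", "biome2", "gov_state", "gov_private"}, covers))
```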
As the scale of cosmological surveys increases, so does the complexity in the analyses. This complexity can often make it difficult to derive the underlying principles, necessitating statistically rigorous testing to ensure the results of an analysis are consistent and reasonable. This is particularly important in multi-probe cosmological analyses like those used in the Dark Energy Survey (DES) and the upcoming Legacy Survey of Space and Time, where accurate uncertainties are vital. In this paper, we present a statistically rigorous method to test the consistency of contours produced in these analyses and apply this method to the Pippin cosmological pipeline used for type Ia supernova cosmology with the DES. We make use of the Neyman construction, a frequentist methodology that leverages extensive simulations to calculate confidence intervals, to perform this consistency check. A true Neyman construction is too computationally expensive for supernova cosmology, so we develop a method for approximating a Neyman construction with far fewer simulations. We find that for a simulated dataset, the 68% contour reported by the Pippin pipeline and the 68% confidence region produced by our approximate Neyman construction differ by less than a percent near the input cosmology; however, they show more significant differences far from the input cosmology, with a maximal difference of 0.05 in $\Omega_{M}$ and 0.07 in w. This divergence is most impactful for analyses of cosmological tensions, but its impact is mitigated when combining supernovae with other cross-cutting cosmological probes, such as the cosmic microwave background.
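To make the Neyman construction concrete, the toy sketch below builds acceptance intervals of the best-fit parameter for a grid of assumed true cosmologies and then inverts them: the confidence set is every true value whose acceptance interval contains the observed best fit. It is written against an assumed user-supplied `simulate_and_fit` function and is not the approximation actually developed for the DES analysis.

```python
import numpy as np

def neyman_band(true_grid, simulate_and_fit, n_sims=500, cl=0.68):
    """For each assumed true parameter, simulate datasets, fit each one, and
    record the central `cl` interval of the best-fit values (acceptance band)."""
    lo, hi = (1 - cl) / 2, 1 - (1 - cl) / 2
    return {
        theta: tuple(np.quantile([simulate_and_fit(theta) for _ in range(n_sims)], [lo, hi]))
        for theta in true_grid
    }

def confidence_set(observed_fit, band):
    """Invert the band: keep every true value whose interval contains the observed fit."""
    return [theta for theta, (a, b) in band.items() if a <= observed_fit <= b]
```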
The impact of the coronavirus disease 2019 (COVID-19) pandemic on mental health is still being unravelled. It is important to identify which individuals are at greatest risk of worsening symptoms. This study aimed to examine changes in depression, anxiety and post-traumatic stress disorder (PTSD) symptoms using prospective and retrospective symptom change assessments, and to identify and examine the effect of key risk factors.
Method
Online questionnaires were administered to 34 465 individuals (aged 16 years or above) in April/May 2020 in the UK, recruited from existing cohorts or via social media. Around one-third (n = 12 718) of included participants had prior diagnoses of depression or anxiety and had completed pre-pandemic mental health assessments (between September 2018 and February 2020), allowing prospective investigation of symptom change.
Results
Prospective symptom analyses showed small decreases in depression (PHQ-9: −0.43 points) and anxiety [Generalised Anxiety Disorder 7-item scale (GAD-7): −0.33 points] and increases in PTSD symptoms (PCL-6: 0.22 points). Conversely, retrospective symptom analyses demonstrated significant large increases (PHQ-9: 2.40; GAD-7: 1.97), with 55% reporting worsening mental health since the beginning of the pandemic on a global change rating. Across both prospective and retrospective measures of symptom change, worsening depression, anxiety and PTSD symptoms were associated with prior mental health diagnoses, female gender, young age and unemployed/student status.
Conclusions
We highlight the effect of prior mental health diagnoses on worsening mental health during the pandemic and confirm previously reported sociodemographic risk factors. Discrepancies between prospective and retrospective measures of changes in mental health may be related to recall bias-related underestimation of prior symptom severity.
Since the advent of direct-acting antiviral therapy, the elimination of hepatitis C virus (HCV) as a public health concern is now possible. However, identification of those who remain undiagnosed, and re-engagement of those who are diagnosed but remain untreated, will be essential to achieve this. We examined the extent of HCV infection among individuals undergoing liver function tests (LFT) in primary care. Residual biochemistry samples for 6007 patients, who had venous blood collected in primary care for LFT between July 2016 and January 2017, were tested for HCV antibody. Through data linkage to national and sentinel HCV surveillance databases, we also examined the extent of diagnosed infection, attendance at specialist services and HCV treatment for those found to be HCV positive. Overall HCV antibody prevalence was 4.0% and was highest for males (5.0%), those aged 37–50 years (6.2%) and those with an ALT result of 70 or greater (7.1%). Of those testing positive, 68.9% had been diagnosed with HCV in the past, 84.9% of them before the study period. Most (92.5%) of those diagnosed with chronic infection had attended specialist liver services, and while 67.7% had ever been treated, only 38% had successfully cleared infection. More than half of HCV-positive people required assessment, and potentially treatment, for their HCV infection but were not engaged with services during the study period. LFT in primary care are a key opportunity to diagnose, re-diagnose and re-engage patients with HCV infection and highlight the importance of GPs in efforts to eliminate HCV as a public health concern.
The coronavirus disease (COVID-19) pandemic has presented unique challenges to pediatric emergency medicine (PEM) departments. The purpose of this study was to identify these challenges and ascertain how centers overcame barriers in creating solutions to continue to provide high-quality care and keep their workforce safe during the early pandemic.
Methods:
This is a qualitative study based on semi-structured interviews with physicians in leadership positions who have disaster or emergency management experience. Participants were identified through purposive sampling. Interviews were recorded and transcribed electronically. Themes and codes were extracted from the transcripts by 2 independent coders. Constant comparison analysis was performed until thematic saturation was achieved. Member-checking was completed to ensure trustworthiness.
Results:
Fourteen PEM-trained physicians participated in this study. Communication, leadership and planning, clinical practice, and personal adaptations were the principal themes identified. Recommendations elicited included improving communication strategies; increasing emergency department (ED) representation within hospital-wide incident command; preparing for a surge and accepting adult patients; managing personal protective equipment supply and usage; developing testing strategies; and the adaptations individuals made to their practice to keep themselves and their families safe.
Conclusions:
By sharing COVID-19 experiences and offering solutions to commonly encountered problems, pediatric EDs may be better prepared for future pandemics.
To describe nutrition and physical activity practices, nutrition self-efficacy and barriers, and food programme knowledge within Family Child Care Homes (FCCH), and differences by staffing.
Design:
Baseline, cross-sectional analyses of the Happy Healthy Homes randomised trial (NCT03560050).
Setting:
FCCH in Oklahoma, USA.
Participants:
FCCH providers (n 49, 100 % women, 30·6 % Non-Hispanic Black, 2·0 % Hispanic, 4·1 % American Indian/Alaska Native, 51·0 % Non-Hispanic white, 44·2 ± 14·2 years of age; 53·1 % had additional staff) self-reported nutrition and physical activity practices and policies, nutrition self-efficacy and barriers and food programme knowledge. Differences between providers with and without additional staff were adjusted for multiple comparisons (P < 0·01).
Results:
The prevalence of meeting all nutrition and physical activity best practices ranged from 0·0 to 43·8 % and from 4·1 to 16·7 %, respectively. Average nutrition and physical activity scores were 3·2 ± 0·3 and 3·0 ± 0·5 (max 4·0), respectively. Sum nutrition and physical activity scores were 137·5 ± 12·6 (max 172·0) and 48·4 ± 7·5 (max 64·0), respectively. Providers reported high nutrition self-efficacy and few barriers. The majority of providers (73·9–84·7 %) felt that they could meet food programme best practices; however, knowledge of food programme best practices was lower than anticipated (median 63–67 % accuracy). Providers with additional staff had higher self-efficacy in family-style meal service than those without (P = 0·006).
Conclusions:
Providers had high self-efficacy in meeting nutrition best practices and reported few barriers. While providers were successfully meeting some individual best practices, few met all. Few differences were observed between FCCH providers with and without additional staff. FCCH providers need additional nutrition training on implementation of best practices.
Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated the potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Based on the results, weather conditions had no consistent impact on weed seed shatter. However, there was a positive correlation between individual weed plant biomass and delayed weed seed–shattering rates during harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs of plants that have escaped early-season management practices and retained seed through harvest. However, smaller individuals of plants within the same population that shatter seed before harvest pose a risk of escaping early-season management and HWSC.
The SARS-CoV-2 pandemic has highlighted the need for rapid creation and management of ICU field hospitals with effective remote monitoring, which is dependent on the rapid deployment and integration of an Electronic Health Record (EHR). We describe the use of simulation to evaluate a rapidly scalable hub-and-spoke model for EHR deployment and monitoring using asynchronous training.
Methods:
We adapted existing commercial EHR products to serve as the point of entry from a simulated hospital and a separate system for tele-ICU support and monitoring of the interfaced data. To train our users, we created a modular video-based curriculum to facilitate asynchronous training. Effectiveness of the curriculum was assessed through completion of common ICU documentation tasks in a high-fidelity simulation. Additional endpoints included assessment of EHR navigation, user satisfaction (Net Promoter), system usability (System Usability Scale, SUS) and cognitive load (NASA-TLX).
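Of the instruments listed, the System Usability Scale has a simple closed-form score; for reference, the standard scoring is sketched below (ten items rated 1-5; odd-numbered items contribute rating − 1, even-numbered items 5 − rating, and the sum is scaled by 2.5 onto a 0-100 range). This is the generic SUS formula, not code from the study.

```python
def sus_score(responses):
    """Standard System Usability Scale score from ten 1-5 ratings (item 1 first)."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum((r - 1) if i % 2 == 0 else (5 - r)   # i == 0 is item 1 (odd-numbered)
                for i, r in enumerate(responses))
    return total * 2.5                                # scale to 0-100

# Example: a moderately positive response pattern.
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))     # -> 75.0
```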
Results:
A total of 5 participants achieved 100% task completion in all domains except ventilator data (91%). The systems demonstrated high satisfaction (Net Promoter = 65.2), acceptable usability (SUS = 66.5) and acceptable cognitive load (NASA-TLX = 41.5), with higher levels of cognitive load correlating with the number of screens employed.
Conclusions:
Clinical usability of a comprehensive and rapidly deployable EHR was acceptable in an intensive care simulation that was preceded by < 1 hour of video education about the EHR. This model should be considered in plans for an integrated clinical response with remote and accessory facilities.
We present an overview of the Middle Ages Galaxy Properties with Integral Field Spectroscopy (MAGPI) survey, a Large Program on the European Southern Observatory Very Large Telescope. MAGPI is designed to study the physical drivers of galaxy transformation at a lookback time of 3–4 Gyr, during which the dynamical, morphological, and chemical properties of galaxies are predicted to evolve significantly. The survey uses new medium-deep adaptive optics aided Multi-Unit Spectroscopic Explorer (MUSE) observations of fields selected from the Galaxy and Mass Assembly (GAMA) survey, providing a wealth of publicly available ancillary multi-wavelength data. With these data, MAGPI will map the kinematic and chemical properties of stars and ionised gas for a sample of 60 massive (${>}7 \times 10^{10} {\mathrm{M}}_\odot$) central galaxies at $0.25 < z <0.35$ in a representative range of environments (isolated, groups and clusters). The spatial resolution delivered by MUSE with Ground Layer Adaptive Optics ($0.6-0.8$ arcsec FWHM) will facilitate a direct comparison with Integral Field Spectroscopy surveys of the nearby Universe, such as SAMI and MaNGA, and at higher redshifts using adaptive optics, for example, SINS. In addition to the primary (central) galaxy sample, MAGPI will deliver resolved and unresolved spectra for as many as 150 satellite galaxies at $0.25 < z <0.35$, as well as hundreds of emission-line sources at $z < 6$. This paper outlines the science goals, survey design, and observing strategy of MAGPI. We also present a first look at the MAGPI data, and the theoretical framework to which MAGPI data will be compared using the current generation of cosmological hydrodynamical simulations including EAGLE, Magneticum, HORIZON-AGN, and Illustris-TNG. Our results show that cosmological hydrodynamical simulations make discrepant predictions in the spatially resolved properties of galaxies at $z\approx 0.3$. MAGPI observations will place new constraints and allow for tangible improvements in galaxy formation theory.
To determine whether age, gender and marital status are associated with prognosis for adults with depression who sought treatment in primary care.
Methods
Medline, Embase, PsycINFO and Cochrane Central were searched from inception to 1st December 2020 for randomised controlled trials (RCTs) of adults seeking treatment for depression from their general practitioners, that used the Revised Clinical Interview Schedule so that there was uniformity in the measurement of clinical prognostic factors, and that reported on age, gender and marital status. Individual participant data were gathered from all nine eligible RCTs (N = 4864). Two-stage random-effects meta-analyses were conducted to ascertain the independent association between: (i) age, (ii) gender and (iii) marital status, and depressive symptoms at 3–4, 6–8, and 9–12 months post-baseline and remission at 3–4 months. Risk of bias was evaluated using QUIPS and quality was assessed using GRADE. PROSPERO registration: CRD42019129512. Pre-registered protocol https://osf.io/e5zup/.
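For context, the second stage of a two-stage individual participant data meta-analysis pools one adjusted association estimate per trial with random-effects weights. A minimal DerSimonian-Laird sketch is given below; it is a generic illustration with assumed inputs, not the analysis code used in this review.

```python
import numpy as np

def dersimonian_laird(estimates, variances):
    """Pool per-study estimates (e.g. per-trial regression coefficients) with
    DerSimonian-Laird random-effects weights; returns pooled estimate, SE and tau^2."""
    y, v = np.asarray(estimates, float), np.asarray(variances, float)
    w = 1.0 / v                                             # fixed-effect (inverse-variance) weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)                      # Cochran's Q
    tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_star = 1.0 / (v + tau2)                               # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    return pooled, np.sqrt(1.0 / np.sum(w_star)), tau2
```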
Results
There was no evidence of an association between age and prognosis before or after adjusting for depressive ‘disorder characteristics’ that are associated with prognosis (symptom severity, durations of depression and anxiety, comorbid panic disorder, and a history of antidepressant treatment). Difference in mean depressive symptom score at 3–4 months post-baseline per 5-year increase in age = 0 (95% CI: −0.02 to 0.02). There was no evidence for a difference in prognoses for men and women at 3–4 months or 9–12 months post-baseline, but men had worse prognoses at 6–8 months (percentage difference in depressive symptoms for men compared to women: 15.08% (95% CI: 4.82 to 26.35)). However, this was largely driven by a single study that contributed data at 6–8 months and not the other time points. Further, there was little evidence for an association after adjusting for depressive ‘disorder characteristics’ and employment status (12.23% (−1.69 to 28.12)). Participants that were either single (percentage difference in depressive symptoms for single participants: 9.25% (95% CI: 2.78 to 16.13)) or no longer married (8.02% (95% CI: 1.31 to 15.18)) had worse prognoses than those that were married, even after adjusting for depressive ‘disorder characteristics’ and all available confounders.
Conclusion
Clinicians and researchers will continue to routinely record age and gender, but despite their importance for incidence and prevalence of depression, they appear to offer little information regarding prognosis. Patients that are single or no longer married may be expected to have slightly worse prognoses than those that are married. Ensuring this is recorded routinely alongside depressive ‘disorder characteristics’ in clinic may be important.
Healthcare personnel with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection were interviewed to describe activities and practices in and outside the workplace. Among 2,625 healthcare personnel, workplace-related factors that may increase infection risk were more common among nursing-home personnel than hospital personnel, whereas selected factors outside the workplace were more common among hospital personnel.
Susceptibility to infection such as SARS-CoV-2 may be influenced by host genotype. TwinsUK volunteers (n = 3261) completing the C-19 COVID-19 symptom tracker app allowed classical twin studies of COVID-19 symptoms, including predicted COVID-19, a symptom-based algorithm to predict true infection, derived from app users tested for SARS-CoV-2. We found heritability of 49% (32−64%) for delirium; 34% (20−47%) for diarrhea; 31% (8−52%) for fatigue; 19% (0−38%) for anosmia; 46% (31−60%) for skipped meals; and 31% (11−48%) for predicted COVID-19. Heritability estimates were not affected by cohabiting or by social deprivation. The results suggest the importance of host genetics in the risk of clinical manifestations of COVID-19 and provide grounds for planning genome-wide association studies to establish specific genes involved in viral infectivity and the host immune response.
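The abstract does not state the exact model used, but classical twin designs estimate heritability from the gap between monozygotic and dizygotic twin correlations. As a back-of-envelope illustration only (full analyses typically fit ACE structural equation models), Falconer's decomposition is sketched below.

```python
def falconer_ace(r_mz, r_dz):
    """Falconer's approximation from twin-pair correlations:
    additive genetics a2 = 2*(rMZ - rDZ), shared environment c2 = 2*rDZ - rMZ,
    unique environment e2 = 1 - rMZ. Illustration only; not the study's model."""
    a2 = 2 * (r_mz - r_dz)
    c2 = 2 * r_dz - r_mz
    e2 = 1 - r_mz
    return a2, c2, e2

# Toy example: rMZ = 0.45, rDZ = 0.22 gives heritability of roughly 0.46.
print(falconer_ace(0.45, 0.22))   # ≈ (0.46, -0.01, 0.55)
```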
Haematopoietic stem cell transplantation is an important and effective treatment strategy for many malignancies, marrow failure syndromes, and immunodeficiencies in children, adolescents, and young adults. Despite advances in supportive care, patients undergoing transplant are at increased risk of developing cardiovascular co-morbidities.
Methods:
This study was performed as a feasibility study of a rapid cardiac MRI protocol to substitute for echocardiography in the assessment of left ventricular size and function, pericardial effusion, and right ventricular hypertension.
Results:
A total of 13 patients were enrolled for the study (age 17.5 ± 7.7 years, 77% male, 77% white). Mean study time was 13.2 ± 5.6 minutes for MRI and 18.8 ± 5.7 minutes for echocardiogram (p = 0.064). Correlation between left ventricular ejection fraction by MRI and echocardiogram was good (ICC 0.76; 95% CI 0.47, 0.92). None of the patients had documented right ventricular hypertension. Patients were given a survey regarding their experiences, with the majority both perceiving that the echocardiogram took longer (7/13) and indicating they would prefer the MRI if given a choice (10/13).
Conclusion:
A rapid cardiac MRI protocol was shown to be a feasible substitute for echocardiography in the assessment of key factors prior to, or in follow-up after, haematopoietic stem cell transplantation.
Subglacial sediments have the potential to reveal information about the controls on glacier flow and changes in ice-sheet history, and to characterise life in those environments. Retrieving sediments from beneath the ice, through hot-water-drilled access holes at remote field locations, presents many challenges. Motivated by the need to minimise weight and corer diameter and to simplify assembly and operation, British Antarctic Survey, in collaboration with UWITEC, developed a simple mechanical percussion corer. At depths over 1000 m, however, manual operation of the percussion hammer is compromised by the lack of clear operator feedback at the surface. To address this, we present a new auto-release-recovery percussion hammer mechanism that makes coring operations depth independent and improves hammer efficiency. Using a single rope tether for both the corer and hammer operation, this modified percussion corer is relatively simple to operate, easy to maintain and has successfully operated at a depth of >2130 m.