The effect of varying concentrations of Al3+ and H+ upon the simultaneous diffusion of 85Sr and 86Rb was measured in salt-free aliquots of clay having different Al:H ratios. The Sr and Rb saturation of the CEC was held constant while the exchangeable Al and H were varied from Al52:H1 to Al12:H34. Aliquots of each clay treatment were dually tagged with 85Sr and 86Rb. Self-diffusion of Sr and Rb was measured at 4, 24, 48, and 75°C. Radioassay of 85Sr and 86Rb was made with an automatic gamma-detection system equipped with a 400-channel analyzer and card-punch unit. The self-diffusion equation was programmed for the 7040 computer to permit the simultaneous calculation of 85Sr and 86Rb self-diffusion coefficients. Rb diffusion was not significantly altered as the Al3+ concentration was increased from 12 to 52 per cent. The diffusion of Sr was significantly increased as Al3+ increased from 12 to 52 per cent. The faster-diffusing Rb ion had a greater energy of activation than Sr (4·8 vs. 3·6 kcal/mole); however, the Arrhenius frequency factor (a measure of the probability of ion exchange) was much greater for Rb than for Sr (28·8 and 0·4 × 10−4, respectively). Altering the Rb and Sr saturation and the complementary ions resulted in changes in the diffusivity, the energy of activation, and the frequency factor for these ions, but not always in the same direction or to the same degree.
Background: Sex and gender are related but distinct determinants of disease, treatment response, and research reproducibility whose consideration is increasingly required for research funding. Nevertheless, the quality of sex and gender reporting in neurological randomized controlled trials (RCTs) remains unknown. Methods: This ongoing study of RCTs associated with Food and Drug Administration neurological drug approvals aims to determine the frequency of accurate reporting of RCT participants’ sex and gender. Secondary outcomes include changes in reporting over time and RCT design characteristics. Results: Preliminary analysis included 145 RCTs (153,410 participants) associated with 77 medications approved in 1985-2023, most commonly for epilepsy (19%), migraine (16%), and multiple sclerosis (16%). Sixty-six RCTs (45.5%) used sex-related terms appropriately. Nine RCTs (6.2%) reported gender accurately. Fifty-three RCTs (37%) used sex- or gender-related terms interchangeably. There were no statistically significant differences in the proportions of studies reporting sex and/or gender accurately when comparing those published until versus after 2017. No RCT reported sex or gender collection methods, definitions of sex or gender, or including sex or gender minority participants. Conclusions: Preliminary results suggest shortcomings in reporting sex and, especially, gender accurately and inclusively among neurological drug RCTs and no significant improvement thereof in recent years.
Although food insecurity affects a significant proportion of young children in New Zealand (NZ)(1), evidence of its association with dietary intake and sociodemographic characteristics in this population is lacking. This study aims to assess the household food security status of young NZ children and its association with energy and nutrient intake and sociodemographic factors. This study included 289 caregiver and child (1-3 years old) dyads from the same household in either Auckland, Wellington, or Dunedin, NZ. Household food security status was determined using a validated and NZ-specific eight-item questionnaire(2). Usual dietary intake was determined from two 24-hour food recalls, using the multiple source method(3). The prevalence of inadequate nutrient intake was assessed using the Estimated Average Requirement (EAR) cut-point method and full probability approach. Sociodemographic factors (i.e., socioeconomic status, ethnicity, caregiver education, employment status, household size and structure) were collected from questionnaires. Linear regression models were used to estimate associations, with statistical significance set at p <0.05. Over 30% of participants had experienced food insecurity in the past 12 months. Of all eight indicator statements, “the variety of foods we are able to eat is limited by a lack of money,” had the highest proportion of participants responding “often” or “sometimes” (35.8%). Moderately food insecure children exhibited higher fat and saturated fat intakes, consuming 3.0 (0.2, 5.8) g/day more fat, and 2.0 (0.6, 3.5) g/day more saturated fat compared to food secure children (p<0.05). Severely food insecure children had lower protein intake (g/kg/day) compared to food secure children (p<0.05). In comparison to food secure children, moderately and severely food insecure children had lower fibre intake, consuming 1.6 (2.8, 0.3) g/day and 2.6 (4.0, 1.2) g/day less fibre, respectively.
Severely food insecure children had the highest prevalence of inadequate calcium (7.0%) and vitamin C (9.3%) intakes, compared with food secure children [prevalence of inadequate intakes: calcium (2.3%) and vitamin C (2.8%)]. Household food insecurity was more common in those of Māori or Pacific ethnicity; living in areas of high deprivation; having a caregiver who was younger, not in paid employment, or had low educational attainment; living with ≥2 other children in the household; and living in a sole-parent household. Food insecure young NZ children consume a diet that exhibits lower nutritional quality in certain measures compared to their food-secure counterparts. Food insecurity was associated with various sociodemographic factors that are closely linked with poverty or low income. As such, there is an urgent need for poverty mitigation initiatives to safeguard vulnerable young children from the adverse consequences of food insecurity.
The Eastern Gangetic Plains are a densely populated region of South Asia with comparatively low productivity yet a strong potential to intensify production to meet growing food demands. Conservation agriculture-based sustainable intensification (CASI) has gained academic and policy traction in the region, yet despite considerable promotional activities, uptake remains limited. Based on emerging evidence delving beyond a binary classification of adoption, this qualitative study seeks to explore the experiences and perspectives of smallholder farmers who express positive sentiments about CASI, yet have not progressed to (autonomous) adoption. After thematic coding of semi-structured interviews with 44 experimenting farmers and 38 interested non-users, ten common themes emerged that explain why farmers stagnate in their adoption process. Seven of the ten themes were non-specific to CASI and would constrain promotion and uptake of any agri-system change, highlighting the need for contextual clarity when promoting practice changes in smallholder systems. We summarise this in the ‘four T's’ that must be addressed to enable agricultural change in smallholder systems: Targeting; Training; Targeted incentives; and Time. Through this more nuanced evaluation approach, we argue for a stronger focus on enabling environments, rather than generic evaluations of technological performance, if promotional efforts are to be successful and emerging sustainable intensification technologies are to be adopted by smallholder farmers.
In recognition of an increasing number of high-consequence infectious disease events, a group of subject-matter experts identified core safety principles that can be applied across all donning and doffing protocols for personal protective equipment.
Ultra-processed plant-based foods, such as plant-based burgers, have gained in popularity. Particularly in the out-of-home (OOH) environment, evidence regarding their nutritional profile and environmental sustainability is still evolving. Plant-based burgers available at selected OOH sites were randomly sampled in Amsterdam, Copenhagen, Lisbon and London. Plant-based burgers (patty, bread and condiment) (n 41) were lab analysed for their energy, macronutrients, amino acids and minerals content per 100 g and serving and were compared with reference values. For the plant-based burgers, the median values per 100 g were 234 kcal, 20·8 g carbohydrates, 3·5 g dietary fibre and 12·0 g fat, including 0·08 g TFS and 2·2 g SFA. Protein content was 8·9 g/100 g, with low protein quality according to amino acid composition. Median Na content was 389 mg/100 g, equivalent to 1 g salt. Compared with references, the median serving provided 31% of energy intake, based on 2000 kcal per day, and contributed to the reference values for carbohydrates (17–28%), dietary fibre (42%), protein (40%), total fat (48%), SFA (26%) and Na (54%). One serving provided 15–23% of the reference values for Ca, K and Mg, while higher contributions were found for Zn, Mn, P and Fe (30–67%). The ultra-processed plant-based burgers provide protein, dietary fibre and essential minerals and contain relatively high levels of energy, Na and total fats. The amino acid composition indicated low protein quality. The multifaceted nutritional profile of plant-based burgers highlights the need for manufacturers to implement improvements to better support healthy dietary habits, including reducing energy, Na and total fats.
Derived from the National Pediatric Cardiology Quality Improvement Collaborative registry, the NEONATE risk score predicted freedom from interstage mortality or heart transplant for patients with single ventricle CHD and aortic arch hypoplasia discharged home following Stage 1 palliation.
Objectives:
We sought to validate the score in an external, modern cohort.
Methods:
This was a retrospective cohort analysis of single ventricle CHD and aortic arch hypoplasia patients enrolled in the National Pediatric Cardiology Quality Improvement Collaborative Phase II registry from 2016 to 2020, who were discharged home after Stage 1 palliation. Points were allocated per the NEONATE score (Norwood type—Norwood/Blalock–Taussig shunt: 3, Hybrid: 12; extracorporeal membrane oxygenation post-op: 9, Opiates at discharge: 6, No Digoxin at discharge: 9, Arch Obstruction on discharge echo: 9, Tricuspid regurgitation ≥ moderate on discharge echo: 12; Extra oxygen plus ≥ moderate tricuspid regurgitation: 28). The composite primary endpoint was interstage mortality or heart transplant.
Results:
In total, 1026 patients met inclusion criteria; 61 (6%) met the primary outcome. Interstage mortality occurred in 44 (4.3%) patients at a median of 129 (IQR 62, 195) days, and 17 (1.7%) were referred for heart transplant at a median of 167 (IQR 114, 199) days of life. The median NEONATE score was 0 (0, 9) in those who survived to Stage 2 palliation compared to 9 (0, 15) in those who experienced interstage mortality or heart transplant (p < 0.001). Applying a NEONATE score cut-off of 17 points that separated patients into low- and high-risk groups in the learning cohort provided 91% specificity, negative predictive value of 95%, and overall accuracy of 87% (85.4–89.5%).
Conclusion:
In a modern cohort of patients with single ventricle CHD and aortic arch hypoplasia, the NEONATE score remains useful at discharge post-Stage 1 palliation to predict freedom from interstage mortality or heart transplant.
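The point allocations and 17-point cut-off described above can be expressed as a simple tally. This is a minimal sketch, not registry code: the dictionary keys are our own illustrative names, and the assumption that scores at or above the cut-off fall in the high-risk group is ours.

```python
# Published NEONATE point values (from the abstract); key names are illustrative.
NEONATE_POINTS = {
    "norwood_bt_shunt": 3,          # Norwood/Blalock-Taussig shunt
    "hybrid_stage1": 12,            # Hybrid Stage 1 palliation
    "postop_ecmo": 9,               # ECMO post-op
    "opiates_at_discharge": 6,
    "no_digoxin_at_discharge": 9,
    "arch_obstruction_on_echo": 9,
    "tr_moderate_or_more": 12,      # Tricuspid regurgitation >= moderate
    "extra_oxygen_plus_tr": 28,     # Extra oxygen plus >= moderate TR
}

def neonate_score(findings):
    """Sum the points for the findings present (a set of keys above)."""
    return sum(NEONATE_POINTS[f] for f in findings)

def risk_group(score, cutoff=17):
    """Dichotomize at the published 17-point cut-off (>= cutoff assumed high-risk)."""
    return "high" if score >= cutoff else "low"
```

For example, a patient with a hybrid Stage 1 who is discharged on opiates would tally 12 + 6 = 18 points, placing them above the cut-off.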
The Wisconsin high-temperature superconductor axisymmetric mirror experiment (WHAM) will be a high-field platform for prototyping technologies, validating interchange stabilization techniques and benchmarking numerical code performance, enabling the next step up to reactor parameters. A detailed overview of the experimental apparatus and its various subsystems is presented. WHAM will use electron cyclotron heating to ionize and build a dense target plasma for neutral beam injection of fast ions, stabilized by edge-biased sheared flow. At 25 keV injection energies, charge exchange dominates over impact ionization and limits the effectiveness of neutral beam injection fuelling. This paper outlines an iterative technique for self-consistently predicting the neutral beam driven anisotropic ion distribution and its role in the finite beta equilibrium. Beginning with recent work by Egedal et al. (Nucl. Fusion, vol. 62, no. 12, 2022, p. 126053) on the WHAM geometry, we detail how the FIDASIM code is used to model the charge exchange sources and sinks in the distribution function, and both are combined with an anisotropic magnetohydrodynamic equilibrium solver method to self-consistently reach an equilibrium. We compare this with recent results using the CQL3D code adapted for the mirror geometry, which includes the high-harmonic fast wave heating of fast ions.
The National Pediatric Cardiology Quality Improvement Collaborative (NPC-QIC) lacks a rigorous enrollment audit process, unlike other collaborative networks. Most centers require individual families to consent to participate. It is unknown whether there is variation across centers or biases in enrollment.
Methods:
We used the Pediatric Cardiac Critical Care Consortium (PC4) registry to assess enrollment rates in NPC-QIC for those centers participating in both registries using indirect identifiers (date of birth, date of admission, gender, and center) to match patient records. All infants born 1/1/2018–12/31/2020 and admitted within 30 days of life were eligible. In PC4, all infants with a fundamental diagnosis of hypoplastic left heart or variant or who underwent a surgical or hybrid Norwood or variant were eligible. Standard descriptive statistics were used to describe the cohort and center match rates were plotted on a funnel chart.
Results:
Of 898 eligible NPC-QIC patients, 841 were linked to 1,114 eligible PC4 patients (match rate 75.5%) in 32 centers. Match rates were lower in patients of Hispanic/Latino ethnicity (66.1%, p = 0.005), and those with any specified chromosomal abnormality (57.4%, p = 0.002), noncardiac abnormality (67.8%, p = 0.005), or any specified syndrome (66.5%, p = 0.001). Match rates were lower for patients who transferred to another hospital or died prior to discharge. Match rates varied from 0 to 100% across centers.
Conclusions:
It is feasible to match patients between the NPC-QIC and PC4 registries. Variation in match rates suggests opportunities for improvement in NPC-QIC patient enrollment.
Conservation agriculture-based sustainable intensification (CASI) is gaining prominence as an agricultural pathway to poverty reduction and enhancement of sustainable food systems among government and development actors in the Eastern Gangetic Plains (EGP) of South Asia. Despite substantial investment in research and extension programs and a growing understanding of the agronomic, economic and labor-saving benefits of CASI, uptake remains limited. This study explores farmer experiences and perspectives to establish why farmers choose not to implement CASI systems despite a strong body of recent scientific evidence establishing the benefits of them doing so. Through thematic coding of semi-structured interviews, key constraints are identified, which establishes a narrative that current households' resources are insufficient to enable practice change, alongside limited supporting structures for resource supplementation. Such issues create a dependency on subsidies and outside support, a situation that is likely to impact any farming system change given the low-risk profiles of farmers and their limited resource base. This paper hence sets out broad implications for creating change in smallholder farming systems in order to promote the adoption of sustainable agricultural technologies in resource-poor smallholder contexts, especially with regard to breaking the profound poverty cycles in which smallholder farmers find themselves, cycles that are unlikely to be broken by the current set of technologies promoted to them.
Early in the COVID-19 pandemic, the World Health Organization stressed the importance of daily clinical assessments of infected patients, yet current approaches frequently consider cross-sectional timepoints, cumulative summary measures, or time-to-event analyses. Statistical methods are available that make use of the rich information content of longitudinal assessments. We demonstrate the use of a multistate transition model to assess the dynamic nature of COVID-19-associated critical illness using daily evaluations of COVID-19 patients from 9 academic hospitals. We describe the accessibility and utility of methods that consider the clinical trajectory of critically ill COVID-19 patients.
The surgical treatment of insular gliomas requires specialized knowledge. Over the last three decades, increased momentum in surgical resection of insular gliomas shifted the focus from one of expectant management to maximal safe resection to establish a diagnosis, characterize tumor genetics, treat preoperative symptoms (i.e., seizures), and delay malignant transformation through tumor cytoreduction. A comprehensive review of the literature was performed regarding insular glioma classification/genetics, insular anatomy, surgical approaches, and patient outcomes. Modern large, published series of insular resections have reported a median 80% resection, 80% improvement in preoperative seizures, and postsurgical permanent neurologic deficits of less than 10%. Avoiding major complications depends on recognizing and preserving eloquent cortex for language and respecting the lateral lenticulostriate arteries.
Plasmodium coatneyi has been proposed as an animal model for human Plasmodium falciparum malaria as it appears to replicate many aspects of pathogenesis and clinical symptomatology. As part of the ongoing evaluation of the rhesus macaque model of severe malaria, a detailed ultrastructural analysis of the interaction between the parasite and both the host erythrocytes and the microvasculature was undertaken. Tissue (brain, heart and kidney) from splenectomized rhesus macaques and blood from spleen-intact animals infected with P. coatneyi were examined by electron microscopy. In all three tissues, similar interactions (sequestration) between infected red blood cells (iRBC) and blood vessels were observed with evidence of rosette and auto-agglutinate formation. The iRBCs possessed caveolae similar to P. vivax and knob-like structures similar to P. falciparum. However, the knobs often appeared incompletely formed in the splenectomized animals in contrast to the intact knobs exhibited by spleen-intact animals. Plasmodium coatneyi infection in the monkey replicates many of the ultrastructural features particularly associated with P. falciparum in humans and as such supports its use as a suitable animal model. However, the possible effect on host–parasite interactions and the pathogenesis of disease due to the use of splenectomized animals needs to be taken into consideration.
A chloroacetamide herbicide by application timing factorial experiment was conducted in 2017 and 2018 in Mississippi to investigate chloroacetamide use in a dicamba-based Palmer amaranth management program in cotton production. Herbicides used were S-metolachlor or acetochlor, and application timings were preemergence, preemergence followed by (fb) early postemergence, preemergence fb late postemergence, early postemergence alone, late postemergence alone, and early postemergence fb late postemergence. Dicamba was included in all preemergence applications, and dicamba plus glyphosate was included with all postemergence applications. Differences in cotton and weed response due to chloroacetamide type were minimal, and cotton injury at 14 d after late postemergence application was less than 10% for all application timings. Late-season weed control was reduced up to 30% and 53% if chloroacetamide application occurred preemergence or late postemergence only, respectively. Late-season weed densities were minimized if multiple applications were used instead of a single application. Cotton height was reduced by up to 23% if a single application was made late postemergence relative to other application timings. Chloroacetamide application at any timing except preemergence alone minimized late-season weed biomass. Yield was maximized by any treatment involving multiple applications or early postemergence alone, whereas applications preemergence or late postemergence alone resulted in up to 56% and 27% yield losses, respectively. While no yield loss resulted from delaying the first of sequential applications until early postemergence, forgoing a preemergence application is not advisable given the multiple factors, such as inclement weather, that may delay timely postemergence applications.
When presenting with a first episode of psychosis (FEP), migrants can have different demographic and clinical characteristics from the native-born population; this was examined in an Irish Early Intervention for Psychosis service.
Methods:
All cases of treated FEP from three local mental health services within a defined catchment area were included. Psychotic disorder diagnoses were determined using the SCID and symptom and functioning domains were measured using validated and reliable measures.
Results:
From a cohort of 612 people, 21.1% were first-generation migrants and there was no difference in the demographic characteristics, diagnoses, symptoms or functioning between migrants and those born in the Republic of Ireland, except that migrants from Africa presented with less insight. Among those admitted, 48.6% of migrant admissions were involuntary, compared to 37.7% for the native-born population (p = 0.09).
Conclusions:
First-generation migrants now make up a significant proportion of people presenting with a FEP to an Irish EI for psychosis service. Broadly the demographic and clinical characteristics of migrants and those born in the Republic of Ireland are similar, except for less insight in migrants from Africa and a trend for a higher proportion of involuntary admissions in the total migrant group.
Efforts to move community engagement in research from marginalized to mainstream include the NIH requiring community engagement programs in all Clinical and Translational Science Awards (CTSAs). However, the COVID-19 pandemic has exposed how little these efforts have changed the dominant culture of clinical research. When faced with the urgent need to generate knowledge about prevention and treatment of the novel coronavirus, researchers largely neglected to involve community stakeholders early in the research process. This failure cannot be divorced from the broader context of systemic racism in the US that has contributed to Black, Indigenous, and People of Color (BIPOC) communities bearing a disproportionate toll from COVID-19, being underrepresented in COVID-19 clinical trials, and expressing greater hesitancy about COVID-19 vaccination. We call on research funders and research institutions to take decisive action to make community engagement obligatory, not optional, in all clinical and translational research and to center BIPOC communities in this process. Recommended actions include funding agencies requiring all research proposals involving human participants to include a community engagement plan, providing adequate funding to support ongoing community engagement, including community stakeholders in agency governance and proposal reviews, promoting racial and ethnic diversity in the research workforce, and making a course in community-engaged research a requirement for Master of Clinical Research curricula.
Typical enteropathogenic Escherichia coli (tEPEC) infection is a major cause of diarrhoea and contributor to mortality in children <5 years old in developing countries. Data were analysed from the Global Enteric Multicenter Study examining children <5 years old seeking care for moderate-to-severe diarrhoea (MSD) in Kenya. Stool specimens were tested for enteric pathogens, including by multiplex polymerase chain reaction for gene targets of tEPEC. Demographic, clinical and anthropometric data were collected at enrolment and ~60 days later; multivariable logistic regressions were constructed. Of 1778 MSD cases enrolled from 2008 to 2012, 135 (7.6%) children tested positive for tEPEC. In a case-to-case comparison among MSD cases, tEPEC was independently associated with presentation at enrolment with a loss of skin turgor (adjusted odds ratio (aOR) 2.08, 95% confidence interval (CI) 1.37–3.17), and convulsions (aOR 2.83, 95% CI 1.12–7.14). At follow-up, infants with tEPEC were more likely than those without to be underweight (OR 2.2, 95% CI 1.3–3.6) and wasted (OR 2.5, 95% CI 1.3–4.6). Among MSD cases, tEPEC was associated with mortality (aOR 2.85, 95% CI 1.47–5.55). This study suggests that tEPEC contributes to morbidity and mortality in children. Interventions aimed at defining and reducing the burden of tEPEC and its sequelae should be urgently investigated, prioritised and implemented.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible by laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves. These will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside of the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high-circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to full third-generation detectors at a fraction of the cost. Such sensitivity changes expected event rates for detection of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year and potentially allow for the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
Background: With an aging population, increasingly complex care, and frequent re-admissions, prevention of healthcare-associated infections (HAIs) in nursing homes (NHs) is a federal priority. However, few contemporary sources of HAI data exist to inform surveillance, prevention, and policy. Prevalence surveys (PSs) are an efficient approach to generating data to measure the burden and describe the types of HAI. In 2017, the Centers for Disease Control and Prevention (CDC) performed its first large-scale HAI PS through the Emerging Infections Program (EIP) to measure the prevalence and describe the epidemiology of HAI in NH residents. Methods: NHs from several states (CA, CO, CT, GA, MD, MN, NM, NY, OR, & TN) were randomly selected and asked to participate in a 1-day HAI PS between April and October 2017; participation was voluntary. EIP staff reviewed available medical records for NH residents present on the survey date to collect demographic and basic clinical information and infection signs and symptoms. HAIs with onset on or after NH day 3 were identified using revised McGeer infection definitions applied to data collected by EIP staff and were reported to the CDC through a web-based system. Data were reviewed by CDC staff for potential errors and to validate HAI classifications prior to analysis. HAI prevalence (number of residents with ≥1 HAI per number of surveyed residents × 100) and 95% CIs were calculated overall (pooled mean) and for selected resident characteristics. Data were analyzed using SAS v9.4 software. Results: Among 15,296 residents in 161 NHs, 358 residents with 375 HAIs were identified. The most common HAI sites were skin (32%), respiratory tract (29%), and urinary tract (20%). Cellulitis, soft-tissue or wound infection, symptomatic UTI, and cold or pharyngitis were the most common individual HAIs (Fig. 1).
Overall HAI prevalence was 2.3 per 100 residents (95% CI, 2.1–2.6); at the NH level, the median HAI prevalence was 1.8 and ranged from 0 to 14.3 (interquartile range, 0–3.1). At the resident level (Fig. 2), HAI prevalence was significantly higher in persons admitted for postacute care with diabetes, with a pressure ulcer, receiving wound care, or with a device. Conclusions: In this large-scale survey, 1 in 43 NH residents had an HAI on a given day. Three HAI types comprised >80% of infections. In addition to identifying characteristics that place residents at higher risk for HAIs, these findings provide important data on HAI epidemiology in NHs that can be used to expand HAI surveillance and inform prevention policies and practices.
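The pooled-mean prevalence above (358 residents with an HAI among 15,296 surveyed, i.e. 2.3 per 100; 95% CI, 2.1–2.6) can be reproduced with a normal-approximation binomial interval. The abstract does not state which interval method the CDC analysis used, so this is a plausible sketch under that assumption rather than the actual analysis code.

```python
import math

def hai_prevalence(cases, surveyed, z=1.96):
    """Point prevalence per 100 residents with a normal-approximation 95% CI.

    Prevalence is defined as in the survey: residents with >=1 HAI
    divided by residents surveyed, times 100. The Wald (normal
    approximation) interval is an assumption, not the stated method.
    """
    p = cases / surveyed
    se = math.sqrt(p * (1 - p) / surveyed)
    return 100 * p, 100 * (p - z * se), 100 * (p + z * se)

# Figures from the survey: 358 residents with an HAI among 15,296 surveyed.
prev, lo, hi = hai_prevalence(358, 15296)
# Rounds to 2.3 per 100 residents (95% CI, 2.1-2.6), matching the pooled mean.
```

With the survey's counts this reproduces the reported pooled mean to one decimal place; center-level medians and subgroup comparisons would require the resident-level data.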