Embed climate-focused energy awareness in every step of your educational program with this unique guide to specifying, designing, implementing and evaluating educational energy initiatives. Discover how to design programs for different learner groups, and keep learners engaged; develop energy-focused project-based hands-on experiential teaching approaches; champion professional development; embed systems, modelling, and computational analysis within curricula; and address issues in justice and equity. This uniquely interdisciplinary approach spans engineering, the physical sciences, and the social sciences, supporting instructors in delivering programs that feed global demand for energy-related climate education, while highlighting ways to avoid the pitfalls of engineering-only energy programs. Ideal for academics involved in teaching and developing undergraduate and graduate courses in energy, academic educational program managers, and professionals in energy-related early career onboarding, this is your key to unlock an empowered energy transition workforce.
Malnutrition from poor diet is a persistent issue in Sri Lanka, especially among women and children. High rates of undernutrition and micronutrient deficiencies are documented among rural poor communities(1). Household food production may enhance maternal and child nutrition directly by increasing access to diverse foods and indirectly by providing income to diversify diets(2). This study explores the cross-sectional relationship between household food production and individual dietary diversity among women aged 18-45 years and children aged 2-5 years in Batticaloa district, Sri Lanka. We randomly selected 450 low-income mother-child pairs receiving a Samurdhi subsidy and having a home garden. Through face-to-face interviews, we gathered information on the types of crops grown and livestock reared in the preceding 12 months. Production quantity and utilization were also detailed. Additionally, socio-demographic information and market access were obtained. To measure dietary diversity (DD), we used a 10-food-group scale for women and a 7-food-group scale for children. Women who consumed five or more food groups were defined as meeting the Minimum Dietary Diversity for Women (MDD-W), whereas children who consumed four or more food groups met the minimum standard. Multiple linear regression and binary logistic regression were used to identify the factors predicting individual DD. Complete data for 411 pairs were analysed. The results showed that only 15.3% of the women met MDD-W, with a mean dietary diversity score (DDS) of 3.3 (SD = 1.2). Children had a mean DDS of 3.3 (SD = 1.2), and 41.1% of them met the minimum diversity. Regression analysis indicated that growing leafy vegetables was positively associated with increased dietary diversity of women (β = 0.337; 95% CI: 0.13, 0.54; p = 0.001) and children (β = 0.234; 95% CI: 0.05, 0.42; p = 0.013) but not with meeting the minimum diversity. Moreover, monthly income above 35,000 LKR, higher education level, a secondary income source and food security were also positively associated with women’s DD. Conversely, living further away from the main road reduced women’s DD. Interestingly, livestock ownership was associated with women meeting the MDD-W but not with children meeting the minimum diversity. For children, monthly income was a strong predictor of DD and of meeting the minimum diversity. Surprisingly, living far from the market was associated with increased DD in children (β = 0.018; 95% CI: 0.01, 0.03; p = 0.013), while distance to the main road had a similar effect as in women. Notably, selling produce at the market contributed to children meeting the minimum dietary diversity (β = 0.573; 95% CI: 0.14, 1.02; p = 0.013). These findings suggest that enhancing household food production could play a crucial role in improving dietary diversity and addressing malnutrition, particularly in rural Sri Lankan communities, and potentially in other similar settings.
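As an illustration of the scoring step described above, the following minimal sketch computes the two dietary diversity indicators. Only the cut-offs (five or more of ten groups for MDD-W, four or more of seven groups for children) come from the abstract; the food-group lists and input format are assumptions based on standard FAO/WHO groupings.

```python
# Hedged sketch of the dietary diversity scoring described above.
# The food-group lists are assumed (standard FAO MDD-W and IYCF groupings);
# only the cut-offs (>=5/10 for women, >=4/7 for children) follow the study.

WOMEN_GROUPS = [
    "grains_roots_tubers", "pulses", "nuts_seeds", "dairy",
    "meat_poultry_fish", "eggs", "dark_green_leafy_vegetables",
    "other_vitamin_a_rich_fruits_vegetables", "other_vegetables", "other_fruits",
]

CHILD_GROUPS = [
    "grains_roots_tubers", "legumes_nuts", "dairy", "flesh_foods",
    "eggs", "vitamin_a_rich_fruits_vegetables", "other_fruits_vegetables",
]

def diversity_score(consumed: set[str], groups: list[str]) -> int:
    """Number of reference food groups eaten during the recall period."""
    return sum(group in consumed for group in groups)

def meets_mdd_w(consumed: set[str]) -> bool:
    """Women meet MDD-W at five or more of the ten food groups."""
    return diversity_score(consumed, WOMEN_GROUPS) >= 5

def meets_child_minimum(consumed: set[str]) -> bool:
    """Children meet the minimum at four or more of the seven food groups."""
    return diversity_score(consumed, CHILD_GROUPS) >= 4

# Example: a woman reporting four groups falls short of MDD-W.
example = {"grains_roots_tubers", "pulses", "dark_green_leafy_vegetables", "other_vegetables"}
print(diversity_score(example, WOMEN_GROUPS), meets_mdd_w(example))  # 4 False
```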
Household food production is considered a key avenue for improving food security and nutritional status, particularly for low-income people in developing countries. However, little is known about which aspects of home garden production enhance nutritional outcomes. This paper aims to assess how home gardens influence nutritional status while considering the impact of various child, maternal, and household characteristics such as birthweight, age, education, and income. We also examined whether distance to the market mediates this association. We conducted a cross-sectional study of 403 children (24-60 months) and their mothers (18-45 years) in Batticaloa district, Sri Lanka, using a pre-tested structured questionnaire. Maternal and child anthropometric measures were taken; children were classified as stunted, wasted, or underweight based on the WHO references, and BMI was calculated for mothers(1). Logistic regression was used to analyse the factors associated with the dependent variable, nutritional outcomes. Food production diversity was not associated with maternal or child nutritional outcomes. The only production variable associated with child nutritional outcomes was livestock ownership, which was negatively associated with child wasting (P < 0.01). Surprisingly, increased market distance was associated with reduced child undernutrition (P < 0.05). Higher levels of maternal education were significantly associated with reduced stunting and underweight in children (P < 0.01). Child birthweight showed a negative association with child underweight (P < 0.01), and we also observed a small negative effect of a child’s age on stunting. These findings suggest that while home gardens can be an entry point, improving nutrition may require a multifaceted approach that addresses a broader range of factors.
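For readers unfamiliar with the anthropometric classification step, the sketch below shows maternal BMI and the conventional WHO cut-off (z-score below −2) used to flag stunting, wasting, and underweight. The z-scores are assumed to be precomputed against the WHO growth references (e.g., with WHO Anthro); variable names are illustrative, not the study's code.

```python
# Hedged sketch of the anthropometric classification described above.
# Z-scores are assumed precomputed against the WHO child growth references;
# the -2 SD cut-off is the conventional WHO threshold.

def maternal_bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index = weight (kg) / height (m) squared."""
    return weight_kg / height_m ** 2

def classify_child(haz: float, whz: float, waz: float) -> dict[str, bool]:
    """Flag undernutrition from height-for-age, weight-for-height, and
    weight-for-age z-scores using the -2 SD cut-off."""
    return {
        "stunted": haz < -2.0,
        "wasted": whz < -2.0,
        "underweight": waz < -2.0,
    }

print(round(maternal_bmi(52.0, 1.55), 1))            # 21.6
print(classify_child(haz=-2.3, whz=-1.1, waz=-2.1))  # stunted and underweight, not wasted
```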
Micronutrient malnutrition is a public health concern in many developing countries, including Sri Lanka. Rural poor households are more vulnerable to micronutrient malnutrition due to their monotonous rice-based diet, which lacks dietary diversification(1). Despite the potential of home gardens to increase food access and diversity, their contribution to household dietary diversity remains unclear. This study aimed to investigate the impact of home gardens on dietary diversity among rural Sri Lankan households. Low-income households with children under five were randomly selected from the Samurdhi beneficiary list, and 450 households with a home garden agreed to be interviewed. We collected information on the types of crops and livestock produced over the past 12 months and their utilisation. We also collected the socio-demographic characteristics of the households. We measured household dietary diversity using the Household Dietary Diversity Score (HDDS) based on FAO guidelines. Multiple linear regression was used to identify the predictors of HDDS. Complete data sets were available for only 411 households, which were included in the analysis. The HDDS ranged from 3 to 10 with a mean of 6.4 (±1.37 SD), indicating a moderate level of dietary diversity. However, only 20.4% of the households met the adequacy threshold, i.e. a score higher than the third quartile(2). Cereals, and fats and oils, were the only food groups consumed by all households. Although many households produced fruits (67.2%) and reared livestock (48.2%), consumption of these groups was the lowest among the 12 food groups. Predictors of HDDS included monthly household income, which had a strong positive relationship, especially earnings above 35,000 LKR (β = 1.02; S.E = 0.246; p < 0.001). Surprisingly, living far from the market was associated with increased HDDS (β = 0.026; S.E = 0.008; p = 0.004). Conversely, living further away from the main road reduced the HDDS (β = −0.133; S.E = 0.049; p = 0.007). Growing staples reduced the HDDS (β = −0.395; S.E = 0.174; p = 0.023), whereas growing leafy vegetables increased dietary diversity (β = 0.394; S.E = 0.154; p = 0.010). Selling homegrown products also increased HDDS (β = 0.276; S.E = 0.136; p = 0.043). However, other covariates, such as the education level of the female adult, household food security status, home garden yield (kg), and livestock richness, which showed significant correlations in the bivariate analysis, were not significant in the multiple regression analysis. Although all households in this district engage in some form of home gardening, 79.6% of households did not have adequate dietary diversity. There is a need to understand how home gardens can better contribute to dietary diversity.
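To make the modelling step concrete, here is a minimal sketch of a multiple linear regression of HDDS on the kinds of predictors reported above. The data frame and column names are assumptions for illustration; this is not the study's actual data or analysis code.

```python
# Minimal sketch of a multiple linear regression of HDDS on household
# predictors, assuming a pandas DataFrame `df` with one row per household.
# Column names are illustrative; this is not the study's code or data.
import pandas as pd
import statsmodels.api as sm

predictors = [
    "income_above_35000_lkr",    # 1 if monthly income exceeds 35,000 LKR
    "distance_to_market_km",
    "distance_to_main_road_km",
    "grows_staples",             # 1/0 indicators for production choices
    "grows_leafy_vegetables",
    "sells_homegrown_produce",
]

def fit_hdds_model(df: pd.DataFrame):
    """Ordinary least squares regression of the HDDS on the predictors."""
    X = sm.add_constant(df[predictors])
    return sm.OLS(df["hdds"], X).fit()

# results = fit_hdds_model(df)
# print(results.summary())  # coefficients, standard errors, p-values
```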
SHEA, in partnership with ASGE, APIC, AAMI, AORN, HSPA, IDSA, SGNA, and The Joint Commission, developed this multisociety infection prevention guidance document for individuals and organizations that engage in sterilization or high-level disinfection (HLD). This document follows the CDC Guideline for Disinfection and Sterilization in Healthcare Facilities. This guidance is based on a synthesis of published scientific evidence, theoretical rationale, current practices, practical considerations, writing group consensus, and consideration of potential harm when applicable. The supplementary material includes a summary of recommendations. The guidance provides an overview of the Spaulding Classification and considerations around manufacturers’ instructions for use (MIFUs). Its recommendations address: point-of-use treatment prior to sterilization or HLD, preparation of reusable medical devices at the location of processing, sterilization, and immediate use steam sterilization (IUSS), HLD of lumened and non-lumened devices, processing of reusable medical devices used with lubricating or defoaming agents, monitoring for effectiveness of processing, handling of devices after HLD, augments and alternatives to HLD, processing of investigational devices, tracking of reusable medical devices, and approaches to implementation.
Duchenne muscular dystrophy is a devastating neuromuscular disorder characterized by the loss of dystrophin, inevitably leading to cardiomyopathy. Despite publications on prophylaxis and treatment with cardiac medications to mitigate cardiomyopathy progression, gaps remain in the specifics of medication initiation and optimization.
Method:
This document is an expert opinion statement, addressing a critical gap in cardiac care for Duchenne muscular dystrophy. It provides thorough recommendations for the initiation and titration of cardiac medications based on disease progression and patient response. Recommendations are derived from the expertise of the Advance Cardiac Therapies Improving Outcomes Network and are informed by established guidelines from the American Heart Association, American College of Cardiology, and Duchenne Muscular Dystrophy Care Considerations. These expert-derived recommendations aim to navigate the complexities of Duchenne muscular dystrophy-related cardiac care.
Results:
Comprehensive recommendations for initiation, titration, and optimization of critical cardiac medications are provided to address Duchenne muscular dystrophy-associated cardiomyopathy.
Discussion:
The management of Duchenne muscular dystrophy requires a multidisciplinary approach. However, the diversity of healthcare providers involved in Duchenne muscular dystrophy care can result in variations in cardiac care, complicating efforts to standardize treatment and improve patient outcomes. The aim of this report is to provide a roadmap for managing Duchenne muscular dystrophy-associated cardiomyopathy by elucidating the timing and dosage nuances crucial for optimal therapeutic efficacy, ultimately improving cardiac outcomes and quality of life for individuals with Duchenne muscular dystrophy.
Conclusion:
This document seeks to establish a standardized framework for cardiac care in Duchenne muscular dystrophy, aiming to improve cardiac prognosis.
Efficient evidence generation to assess the clinical and economic impact of medical therapies is critical amid rising healthcare costs and aging populations. However, drug development and clinical trials remain far too expensive and inefficient for all stakeholders. On October 25–26, 2023, the Duke Clinical Research Institute brought together leaders from academia, industry, government agencies, patient advocacy, and nonprofit organizations to explore how different entities and influencers in drug development and healthcare can realign incentive structures to efficiently accelerate evidence generation that addresses the highest public health needs. Prominent themes surfaced, including competing research priorities and incentives, inadequate representation of patient populations in clinical trials, opportunities to better leverage existing technology and infrastructure in trial design, and a need for heightened transparency and accountability in research practices. The group determined that, together, these elements contribute to an inefficient and costly clinical research enterprise, amplifying disparities in population health and sustaining gaps in evidence that impede advancements in equitable healthcare delivery and outcomes. The goal of addressing the identified challenges is to ultimately make clinical trials faster, more inclusive, and more efficient across diverse communities and settings.
Diquat²⁺ (1,1′-ethylene-2,2′-dipyridinium ion) and paraquat²⁺ (1,1′-dimethyl-4,4′-dipyridinium ion) were competitively adsorbed by Na-saturated kaolinites, smectites and expanded and collapsed vermiculites. The relative preference for one or the other cation varied with the surface charge densities of the adsorbents and the location of the adsorption site, i.e. internal or external. Minerals with high surface charge exhibited preference for diquat whereas minerals with low surface charge preferred paraquat. Expanded vermiculites preferentially adsorbed diquat on internal surfaces. Collapsed vermiculites generally showed a preference for paraquat. Smectites and kaolinites preferentially adsorbed paraquat.
Surface charge densities of the layer silicates vs. the relative preference for diquat revealed two linear relationships, one for internal adsorption and one for external adsorption. Internal adsorption was characterized by a strong preference for paraquat on low-charged smectites, a relative decreasing preference for paraquat with higher-charged smectites, and a strong preference for diquat on high-charged expanded vermiculites.
Preferential adsorption for paraquat by kaolinite was quite similar to adsorption of paraquat on the external sites of vermiculites. There was no apparent relationship between competitive adsorption and surface charge density of kaolinite.
The COVID-19 pandemic has had major direct (e.g., deaths) and indirect (e.g., social inequities) effects in the United States. While the public health response to the epidemic featured some important successes (e.g., universal masking, and rapid development and approval of vaccines and therapeutics), there were systemic failures (e.g., inadequate public health infrastructure) that overshadowed these successes. Key deficiencies in the U.S. response were shortages of personal protective equipment (PPE) and supply chain failures. Recommendations are provided for mitigating supply shortages and supply chain failures in healthcare settings in future pandemics. Some key recommendations for preventing shortages of essential components of infection control and prevention include increasing the stockpile of PPE in the U.S. Strategic National Stockpile, increased transparency of the Stockpile, invoking the Defense Production Act at an early stage, and rapid review and authorization by FDA/EPA/OSHA of non-U.S.-approved products. Recommendations are also provided for mitigating shortages of diagnostic testing, medications and medical equipment.
Throughout the COVID-19 pandemic, many areas in the United States experienced healthcare personnel (HCP) shortages tied to a variety of factors. Infection prevention programs, in particular, faced increasing workload demands with little opportunity to delegate tasks to others without specific infectious diseases or infection control expertise. Shortages of clinicians providing inpatient care to critically ill patients during the early phase of the pandemic were multifactorial, largely attributed to increasing demands on hospitals to provide care to patients hospitalized with COVID-19, as well as furloughs.1 HCP shortages and challenges during later surges, including the Omicron variant-associated surges, were largely attributed to HCP infections and the associated work restrictions during isolation periods, and to the need to care for family members, particularly children, with COVID-19. Additionally, the detrimental physical and mental health impact of COVID-19 on HCP has led to attrition, which further exacerbates shortages.2 Demands increased in post-acute and long-term care (PALTC) settings, which already faced critical staffing challenges, difficulty with recruitment, and high rates of turnover. Although individual healthcare organizations and state and federal governments have taken actions to mitigate recurring shortages, additional work and innovation are needed to develop longer-term solutions to improve healthcare workforce resiliency. The critical role of those with specialized training in infection prevention, including healthcare epidemiologists, was well demonstrated in pandemic preparedness and response. The COVID-19 pandemic underscored the need to support growth in these fields.3 This commentary outlines the need to develop the US healthcare workforce in preparation for future pandemics.
Throughout history, pandemics and their aftereffects have spurred society to make substantial improvements in healthcare. After the Black Death in 14th century Europe, changes were made to elevate standards of care and nutrition that resulted in improved life expectancy.1 The 1918 influenza pandemic spurred a movement that emphasized public health surveillance and detection of future outbreaks and eventually led to the creation of the World Health Organization Global Influenza Surveillance Network.2 Most recently, the COVID-19 pandemic exposed many of the pre-existing problems within the US healthcare system, which included (1) a lack of capacity to manage a large influx of contagious patients while simultaneously maintaining routine and emergency care for non-COVID patients; (2) a “just in time” supply network that led to shortages and competition among hospitals, nursing homes, and other care sites for essential supplies; and (3) longstanding inequities in the distribution of healthcare and the healthcare workforce. The decades-long shift from domestic manufacturing to a reliance on global supply chains has compounded ongoing gaps in preparedness for supplies such as personal protective equipment and ventilators. Inequities in racial and socioeconomic outcomes highlighted during the pandemic have accelerated the call to focus on diversity, equity, and inclusion (DEI) within our communities. The pandemic accelerated cooperation between government entities and the healthcare system, resulting in swift implementation of mitigation measures and new therapies and vaccinations at unprecedented speeds, despite our fragmented healthcare delivery system and political divisions. Still, widespread misinformation or disinformation and political divisions contributed to eroded trust in the public health system and prevented an even uptake of mitigation measures, vaccines and therapeutics, impeding our ability to contain the spread of the virus in this country.3 Ultimately, the lessons of COVID-19 illustrate the need to better prepare for the next pandemic. Rising microbial resistance, emerging and re-emerging pathogens, increased globalization, an aging population, and climate change are all factors that increase the likelihood of another pandemic.4
The Society for Healthcare Epidemiology in America (SHEA) strongly supports modernization of data collection processes and the creation of publicly available data repositories that include a wide variety of data elements and mechanisms for securely storing both cleaned and uncleaned data sets that can be curated as clinical and research needs arise. These elements can be used for clinical research and quality monitoring and to evaluate the impacts of different policies on different outcomes. Achieving these goals will require dedicated, sustained, and long-term funding to support data science teams and the creation of central data repositories that include data sets that can be “linked” via a variety of different mechanisms, as well as data sets that include institutional, state, and local policies and procedures. A team-based approach to data science is strongly encouraged and supported to achieve the goal of a sustainable, adaptable national shared data resource.
Tight focusing with very small f-numbers is necessary to achieve the highest at-focus irradiances. However, tight focusing imposes strong demands on precise in-focus target positioning to achieve the highest on-target irradiance. We describe several near-infrared, visible, ultraviolet and soft and hard X-ray diagnostics employed in a ∼10²² W/cm² laser–plasma experiment. We used femtosecond laser pulses with nearly 10 J total energy, focused into an approximately 1.3-μm focal spot on 5–20 μm thick stainless-steel targets. We discuss the applicability of these diagnostics for determining the best in-focus target position with approximately 5 μm accuracy (i.e., around half of the short Rayleigh length) and show that several diagnostics (in particular, 3$\omega$ reflection and on-axis hard X-rays) can ensure this accuracy. We demonstrated target positioning within several micrometers of the focus, ensuring over 80% of the ideal peak laser intensity on-target. Our approach is relatively fast (it requires 10–20 laser shots) and does not rely on the coincidence of low-power and high-power focal planes.
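As a rough consistency check (assuming an ideal Gaussian focus, which real focal spots only approximate), the on-axis intensity a distance $z$ from focus follows $I(z) = I_0/\left[1+(z/z_R)^2\right]$, where $z_R$ is the Rayleigh length; a positioning error of half a Rayleigh length ($z = z_R/2$, about 5 μm here) therefore gives $I/I_0 = 1/(1+1/4) = 0.8$, i.e., roughly 80% of the ideal peak intensity, consistent with the figure quoted above.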
Structural and diffraction criteria for distinguishing between t-1M, c-1M, m-1M, and 3T illite varieties are described. The t-1M illite corresponds to a one-layer monoclinic structure with vacant trans-sites. The c-1M illite has vacant cis-octahedra forming one of two symmetrically independent point systems, while the other cis-octahedra as well as the trans-octahedra are occupied; the m-1M illite corresponds to a structure in which cations are statistically distributed over the available trans- and cis-sites. For t-1M, c-1M, and m-1M, the values of |c cos β/a| are equal to 0.39–0.41, 0.29–0.31, and 0.333, respectively. Application of these criteria demonstrates that illite samples described in the literature as the 3T polytype are usually c-1M instead. The relatively common occurrence of c-1M illite in association with t-1M and 2M1 polytypes has been recognized in illite from hydrothermal alterations around uranium deposits located in the Athabasca basement (Saskatchewan, Canada). The c-1M illite from these deposits was previously described as the 3T polytype.
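As a small illustration of the diffraction criterion described above, the sketch below classifies a one-layer structure from its unit-cell parameters using the |c cos β/a| ranges given in the text. The function, the example cell parameters, and the tolerance around the m-1M value of 0.333 are assumptions for illustration only.

```python
# Hedged sketch of the |c*cos(beta)/a| criterion described above.
# The ranges for t-1M (0.39-0.41) and c-1M (0.29-0.31) come from the text;
# the tolerance used to match the m-1M value of 0.333 is an assumption.
import math

def classify_1m_illite(a: float, c: float, beta_deg: float, m_tol: float = 0.005) -> str:
    """Classify a one-layer monoclinic illite from unit-cell parameters (angstroms, degrees)."""
    ratio = abs(c * math.cos(math.radians(beta_deg)) / a)
    if 0.39 <= ratio <= 0.41:
        return "t-1M (vacant trans-sites)"
    if 0.29 <= ratio <= 0.31:
        return "c-1M (vacant cis-sites)"
    if abs(ratio - 0.333) <= m_tol:
        return "m-1M (statistical cation distribution)"
    return f"unclassified (|c*cos(beta)/a| = {ratio:.3f})"

# Example with illustrative (not measured) cell parameters:
print(classify_1m_illite(a=5.2, c=10.2, beta_deg=99.0))  # c-1M (vacant cis-sites)
```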
In 2015, the United Nations articulated the ambition to move toward a prosperous, socially inclusive, and environmentally sustainable future for all by adopting the Sustainable Development Goals (SDGs). However, little is known about the pathways that could lead to their concurrent achievement. We provide an overview of the current literature on quantitative pathways toward the SDGs, indicate the commonly used methods and indicators, and identify the most comprehensive pathways that have been published to date. Our results indicate that there is a need for more scenarios toward the full set of SDGs, using a wider range of underlying narratives.
Technical Summary
Quantitative goal-seeking scenario studies could help to explore the systems transformations needed to implement the 2030 Agenda for Sustainable Development by identifying enabling conditions and accounting for the synergies and trade-offs between the SDGs. Given that the SDGs were adopted some time ago, we review here the existing global scenario literature to determine what it can offer in this context. We found only a few scenarios that address a large set of SDGs, while many more deal with specific clusters of 2–6 SDGs. We identified the most frequent clusters and compared the results of the most comprehensive sustainable development scenarios. The latter comparison is complicated by the diversity of methods, indicators, and assumptions used. Therefore, we suggest that an effort is needed to develop a wider set of scenarios that would achieve multiple SDGs, using a more standardized framework of targets and indicators.
Social Media Summary
This study reviews the current global pathways toward the SDGs and shows the need for a broader set of SDG scenarios.
We evaluated diagnostic test and antibiotic utilization among 252 patients from 11 US hospitals who were evaluated for coronavirus disease 2019 (COVID-19) pneumonia during the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) omicron variant pandemic wave. In our cohort, antibiotic use remained high (62%) among SARS-CoV-2–positive patients and even higher among those who underwent procalcitonin testing (68%).
Metacognition is defined as the ability to observe, monitor, and make judgments about one’s own cognitive status. Judgments of learning (JOLs) and retrospective confidence judgments (RCJs) are two elements of metacognition related to memory, or metamemory. JOLs refer to one’s prediction of their memory performance prior to completing a memory task, while RCJs describe one’s subjective assessment of their memory performance after they have completed the task. Traumatic brain injury (TBI) is known to negatively impact general metacognitive functioning. However, the nuanced effects of TBI on constituent metacognitive subprocesses like JOLs and RCJs remain unclear. This study aimed to characterize patterns of brain activity that occur when individuals with TBI render JOLs and RCJs during a metamemory task. Differences between JOL- and RCJ-related patterns of activation were also explored.
Participants and Methods:
Twenty participants with moderate-to-severe TBI completed a metacognition task while undergoing functional magnetic resonance imaging (fMRI). Participants were first exposed to target slides with a set of polygons placed in specific locations, then asked to identify the target slides within a set of distractors. Before identifying the target slides, participants rated how well they believed they would remember the polygons’ shape and location (JOL). After answering, they rated how confident they were that the answer they provided was correct (RCJ). First-level time series analyses of fMRI data were conducted for each participant using FSL FEAT. Higher-level random effects modeling was then performed to assess average activation across all participants. Finally, contrasts were applied to examine and compare JOL- and RCJ-specific patterns of activation.
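The study itself used FSL FEAT; purely to illustrate the same first-level/higher-level contrast logic, here is a sketch using the nilearn GLM interface instead. The TR, event coding, and function names are assumptions, and this is not the authors' pipeline.

```python
# Illustrative sketch of the analysis logic (subject-level GLM, then a
# group-level random-effects model and JOL-vs-RCJ contrasts) using nilearn
# rather than FSL FEAT, which the study actually used. TR, smoothing, and
# event coding are assumptions; this is not the authors' pipeline.
import pandas as pd
from nilearn.glm.first_level import FirstLevelModel
from nilearn.glm.second_level import SecondLevelModel

def first_level_jol_vs_rcj(func_img, events: pd.DataFrame, tr: float = 2.0):
    """Fit a subject-level GLM; `events` needs onset/duration/trial_type
    columns with trial_type values 'jol' and 'rcj'."""
    model = FirstLevelModel(t_r=tr, hrf_model="spm", smoothing_fwhm=5.0)
    model.fit(func_img, events=events)
    return {
        "jol": model.compute_contrast("jol", output_type="effect_size"),
        "rcj": model.compute_contrast("rcj", output_type="effect_size"),
        "jol_gt_rcj": model.compute_contrast("jol - rcj", output_type="effect_size"),
    }

def group_map(effect_maps: list):
    """Random-effects (one-sample) model across participants' effect maps."""
    design = pd.DataFrame({"intercept": [1] * len(effect_maps)})
    model = SecondLevelModel().fit(effect_maps, design_matrix=design)
    return model.compute_contrast("intercept", output_type="z_score")
```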
Results:
JOLs were associated with activation of the left frontal gyri, bilateral anterior cingulate, left insula, and right putamen (p < 0.01). RCJs were associated with activation of the bilateral frontal gyri, bilateral posterior and anterior cingulate, left insula, right putamen, and left thalamus (p < 0.01). Compared to RCJs, JOLs demonstrated greater left insula activation (p < 0.01). Compared to JOLs, RCJs demonstrated greater activation of the left superior frontal gyrus, bilateral middle frontal gyrus, and bilateral anterior cingulate (p < 0.01).
Conclusions:
The areas of activation found in this study were consistent with structures previously identified in the broader metacognition literature. Overall, RCJs produced activity in a greater number of regions, and that activity was more bilaterally distributed compared to JOLs. Moreover, several regions that were active during both metacognitive subprocesses tended to be even more active during RCJs. One hypothesis for this observation is that RCJs, unlike JOLs, additionally involve reflecting on one’s immediate memory of having just completed the task, which may require greater recruitment of resources. Importantly, these findings suggest that, while different metacognitive subprocesses may recruit similar brain circuitry, some subprocesses may require more potent and widespread activation of this circuitry than others. As such, subprocesses with greater activational needs and complexity, such as RCJs, may be more susceptible to damage caused by TBI. Future research should aim to compare patterns of activation associated with certain metacognitive subprocesses between survivors of TBI and healthy controls.