Objective:
To understand caregivers’ perceptions about their children’s mealtime social experiences at school and how they believe these social experiences impact their children’s consumption of meals at school (both meals brought from home and school meals).
Design:
Qualitative data were originally collected as part of a larger mixed methods study using an embedded-QUAN dominant research design.
Setting:
Semi-structured interviews were conducted with United States (U.S.) caregivers over Zoom™ in English and Spanish during the 2021–2022 school year. The interview guide contained 14 questions on caregivers’ perceptions about their children’s experiences with school meals.
Participants:
Caregivers of students in elementary, middle and high schools in rural, suburban and urban communities in California (n 46) and Maine (n 20) were interviewed. Most (60·6 %) were caregivers of children who were eligible for free or reduced-price meals.
Results:
Caregivers reported that an important benefit of eating meals at school is their child’s opportunity to socialise with their peers. Caregivers also stated that their child’s favourite aspect of school lunch is socialising with friends. However, some caregivers reported the cafeteria environment caused their children to feel anxious and not eat. Other caregivers reported that their children sometimes skipped lunch and chose to socialise with friends rather than wait in long lunch lines.
Conclusions:
Socialising during school meals is important to both caregivers and students. Policies such as increasing lunch period lengths and holding recess before lunch have been found to promote school meal consumption and could reinforce the positive social aspects of mealtime for students.
Disease-modifying therapies (DMTs) for Alzheimer’s disease (AD) are emerging treatment options. This study aimed to estimate the potential health system and associated environmental impacts of DMTs by modeling future bed-days and carbon dioxide equivalent (CO2e) emissions for the UK population under various scenarios for access to and efficacy of DMTs.
Methods
A cohort Markov model was developed to predict the UK population distribution from 2020 to 2040 across five health states: cognitively unimpaired and four stages of AD (mild cognitive impairment and mild, moderate, and severe dementia). These distributions were estimated using national population projections, AD prevalence data, and stage-specific transition rates. Annual bed-days per person for each state and associated CO2e emissions from published literature were applied to estimate total bed-days and emissions. Modeled scenarios combined ranges of DMT efficacy estimates (20 to 30%) and access levels (25 to 58% of eligible patients receiving treatment), elicited from expert opinion, to explore the extent of potential DMT impacts.
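For readers who want to see the mechanics, a minimal cohort Markov sketch of this kind of projection is shown below. The states mirror those in the abstract, but the transition probabilities, bed-day rates and emission factor are illustrative assumptions, not the study's inputs.

```python
import numpy as np

# Minimal cohort Markov sketch (illustrative numbers only, not the study's inputs).
# States: cognitively unimpaired (CU), MCI, mild, moderate, severe dementia.
states = ["CU", "MCI", "mild", "moderate", "severe"]

# Hypothetical annual transition matrix (rows sum to 1); a DMT scenario scales
# down the progression probabilities by an assumed efficacy.
P = np.array([
    [0.97, 0.03, 0.00, 0.00, 0.00],
    [0.00, 0.80, 0.20, 0.00, 0.00],
    [0.00, 0.00, 0.75, 0.25, 0.00],
    [0.00, 0.00, 0.00, 0.70, 0.30],
    [0.00, 0.00, 0.00, 0.00, 1.00],
])

bed_days_per_person = np.array([0.5, 1.0, 2.0, 5.0, 12.0])  # assumed annual bed-days per state
kg_co2e_per_bed_day = 130.0                                  # assumed emission factor

pop = np.array([1_000_000, 50_000, 30_000, 20_000, 10_000], dtype=float)

def project(pop, years, efficacy=0.0):
    """Project cumulative bed-days and CO2e, optionally slowing progression by `efficacy`."""
    off = np.triu(P, k=1) * (1 - efficacy)        # reduce progression probabilities
    Pd = np.diag(1 - off.sum(axis=1)) + off       # renormalise rows so they sum to 1
    total_bed_days = 0.0
    for _ in range(years):
        pop = pop @ Pd
        total_bed_days += pop @ bed_days_per_person
    return total_bed_days, total_bed_days * kg_co2e_per_bed_day / 1e9  # bed-days, Mt CO2e

print(project(pop, 20, efficacy=0.0))    # no-access scenario
print(project(pop, 20, efficacy=0.25))   # hypothetical 25% efficacy scenario
```

Scaling down the progression probabilities by an assumed efficacy is one simple way to represent a DMT scenario; the study's actual scenario definitions may differ.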
Results
Without DMT access, annual bed-days across the four AD stages were projected to increase from 5.5 million to 8.6 million from 2020 to 2040, with cumulative bed-days totaling 140 million. Associated annual emissions increased from 0.7 Mt to 1.1 Mt CO2e, reaching 17 Mt CO2e cumulatively from 2020 to 2040. Under the various high-access (58% of eligible patients treated) DMT efficacy scenarios, relative to no DMT access, annual reductions of 430 thousand to 650 thousand bed-days and 54 kt to 81 kt CO2e were estimated by 2040, and cumulative emissions decreased by 419 kt to 633 kt CO2e. Decreasing DMT access to 25%, assuming 25% DMT efficacy, reduced annual bed-days by 230 thousand by 2040, and annual emission savings decreased to 29 kt CO2e.
Conclusions
DMTs for AD may contribute to efforts by healthcare systems to reduce the carbon emissions from hospital inpatient care. Environmental sustainability should be considered as part of a holistic value proposition when assessing the benefits of new medicines.
School-based interventions encouraging children to replace sugar-sweetened beverages with water show promise for reducing child overweight. However, students experiencing child food insecurity (CFI) may not respond to nutrition interventions in the same way as children who are food-secure.
Design:
The Water First cluster-randomised trial found that school water access and promotion prevented child overweight and increased water intake. This secondary analysis used mixed-effects regression to evaluate the interaction between the Water First intervention and food insecurity, measured using the Child Food Security Assessment, on child weight status (anthropometric measurements) and dietary intake (student 24-h recalls, beverage intake surveys).
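As an illustration of the analytic approach described (not the trial's actual code), a mixed-effects model with an intervention-by-food-insecurity interaction and a random intercept for school could be specified as follows; the data file and column names are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical columns: bmi_change (7-month change in weight status), intervention (0/1),
# cfi_level ("none"/"low"/"high" from the Child Food Security Assessment), school_id (cluster).
df = pd.read_csv("water_first_followup.csv")  # assumed file name

# Mixed-effects model with an intervention x food-insecurity interaction and a
# random intercept for school, mirroring the cluster-randomised design.
model = smf.mixedlm(
    "bmi_change ~ intervention * C(cfi_level)",
    data=df,
    groups=df["school_id"],
)
result = model.fit()
print(result.summary())  # the interaction terms test effect modification by CFI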
Setting:
Eighteen elementary schools (serving ≥ 50 % of children from low-income households), in which drinking water had not been previously promoted, in the San Francisco Bay Area.
Participants:
Students in fourth-grade classes (n 1056).
Results:
Food insecurity interacted with the intervention. Among students with no CFI, the intervention group had a greater reduction in the prevalence of obesity from baseline to 7 months (–0·04, CI –0·08, 0·01) than no-CFI controls (0·01, CI –0·01, 0·04) (P = 0·04). Among students with high CFI, the intervention group had a pronounced increase in the volume of water consumed between baseline and 7 months (86·2 %, CI 21·7, 185·0 %) compared with high-CFI controls (–13·6 %, CI –45·3, 36·6 %) (P = 0·02).
Conclusions:
Addressing food insecurity in the design of water promotion interventions may enhance the benefit to children, reducing the prevalence of obesity.
The Amsterdam Instrumental Activities of Daily Living Questionnaire (A-IADL-Q) is well validated and commonly used to assess difficulties in everyday functioning related to dementia. To facilitate interpretation and clinical implementation across different European countries, we aim to provide normative data and a diagnostic cutoff for dementia.
Methods:
Cross-sectional data from the Dutch Brain Research Registry (N = 1,064; mean age = 62 ± 11 years; 69.5% female), the European Medical Information Framework-Alzheimer’s Disease 90+ study (N = 63; mean age = 92 ± 2 years; 52.4% female), and the European Prevention of Alzheimer’s Dementia Longitudinal Cohort Study (N = 247; mean age = 63 ± 7 years; 72.1% female) were used. The generalized additive models for location, scale, and shape framework was used to obtain normative values (Z-scores). The beta distribution was applied, and combinations of age, sex, and educational attainment were modeled. The optimal cutoff for dementia was calculated using the area under the receiver operating characteristic curve (AUC-ROC) and the Youden Index, using data from the Amsterdam Dementia Cohort (N = 2,511; mean age = 64 ± 8 years; 44.4% female).
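A small sketch of the cutoff-derivation step, using simulated Z-scores rather than the cohort data: the ROC curve is built on the sign-flipped Z-scores (lower scores indicate worse functioning) and the Youden Index picks the threshold. All values below are illustrative only.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Simulated data: y is dementia status (1 = dementia), z is the A-IADL-Q Z-score.
rng = np.random.default_rng(0)
z = np.concatenate([rng.normal(0, 1, 500), rng.normal(-2.5, 1, 200)])
y = np.concatenate([np.zeros(500), np.ones(200)])

# Because lower Z-scores indicate impairment, use -z as the "score" for the ROC.
fpr, tpr, thresholds = roc_curve(y, -z)
auc = roc_auc_score(y, -z)

# Youden Index J = sensitivity + specificity - 1, maximised over thresholds.
j = tpr - fpr
best = np.argmax(j)
cutoff = -thresholds[best]  # convert back to the Z-score scale
print(f"AUC = {auc:.2f}, optimal Z-score cutoff = {cutoff:.2f}")
```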
Results:
The best normative model accounted for a cubic-like decrease in IADL performance with age that was more pronounced in those with low compared with medium/high educational attainment. The cutoff for dementia was 1.85 standard deviations below the population mean (AUC = 0.97; 95% CI [0.97–0.98]).
Conclusion:
We provide regression-based norms for A-IADL-Q and a diagnostic cutoff for dementia, which help improve clinical assessment of IADL performance across European countries.
Strategies are needed to ensure greater participation of underrepresented groups in diabetes research. We examined the impact of a remote study protocol on enrollment in diabetes research, specifically the Pre-NDPP clinical trial. Recruitment was conducted among 2807 diverse patients in a safety-net healthcare system. Results indicated nearly three-fold greater odds of enrolling under remote versus in-person protocols (AOR 2.90; 95% CI 2.29–3.67; P < 0.001). Priority populations with significantly higher enrollment included Latinx and Black individuals, Spanish speakers, and individuals who had Medicaid or were uninsured. A remote study design may promote overall recruitment into clinical trials, while effectively supporting enrollment of underrepresented groups.
Though diet quality is widely recognised as linked to risk of chronic disease, health systems have been challenged to find a user-friendly, efficient way to obtain information about diet. The Penn Healthy Diet (PHD) survey was designed to fill this void. The purposes of this pilot project were to assess the patient experience with the PHD, to validate the accuracy of the PHD against related items in a diet recall and to explore scoring algorithms in relation to the Healthy Eating Index (HEI)-2015 computed from the recall data. A convenience sample of participants in the Penn Health BioBank was surveyed with the PHD, the Automated Self-Administered 24-hour recall (ASA24) and questions about their experience. Kappa scores and Spearman correlations were used to compare related questions in the PHD with the ASA24. Three scoring approaches were computed: numerical scoring, regression tree and weighted regression. Participants assessed the PHD as easy to use and were willing to repeat the survey at least annually. The three scoring algorithms were strongly associated with HEI-2015 scores when applied to the National Health and Nutrition Examination Survey 2017–2018 data from which the PHD was developed, and moderately associated with the pilot replication data. The PHD is acceptable to participants and at least moderately correlated with the HEI-2015. Further validation in a larger sample will enable the selection of the strongest scoring approach.
Ancylostoma caninum is the most common nematode parasite of dogs in the United States. The present study aimed to describe the molecular epidemiology of A. caninum isolates from the central and eastern states of the United States using the partial mitochondrial cytochrome c oxidase subunit 1 (cox1) gene and to compare them with those reported globally. We isolated eggs from faecal samples of dogs and characterized each isolate based on cox1 sequences. A total of 60 samples originating from Kansas, Iowa, New York, Florida and Massachusetts were included. Twenty-five haplotypes were identified in the United States dataset, with high haplotype diversity (0.904). Sequence data were compared with sequences from other world regions available in GenBank. Global haplotype analysis demonstrated 35 haplotypes, with a haplotype diversity of 0.931. Phylogenetic and network analyses provide evidence for moderate geographical structuring of A. caninum haplotypes. Our results provide an updated summary of A. caninum haplotypes and data for neutral genetic markers with utility for tracking hookworm populations. Sequences have been deposited in GenBank (ON980650–ON980674). Further studies of isolates from other regions are essential to understand the genetic diversity of this parasite.
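For context, haplotype (gene) diversity is typically computed with Nei's estimator, Hd = n/(n − 1) × (1 − Σ p_i²), where p_i are the haplotype frequencies and n is the number of sequences. A small sketch with made-up haplotype labels (not the study's data) follows.

```python
from collections import Counter

def haplotype_diversity(haplotype_ids):
    """Nei's haplotype (gene) diversity: Hd = n/(n-1) * (1 - sum(p_i**2))."""
    n = len(haplotype_ids)
    counts = Counter(haplotype_ids)
    sum_p2 = sum((c / n) ** 2 for c in counts.values())
    return n / (n - 1) * (1 - sum_p2)

# Toy example with hypothetical haplotype labels:
sample = ["H1"] * 5 + ["H2"] * 3 + ["H3"] * 2 + ["H4", "H5"]
print(round(haplotype_diversity(sample), 3))
```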
We conducted a retrospective review of a hybrid antimicrobial restriction process demonstrating adherence to appropriate use criteria in 72% of provisional-only orders, in 100% of provisional orders followed by ID orders, and in 97% of ID-initiated orders. Therapy interruptions occurred in 24% of provisional orders followed by ID orders.
Objective:
To examine associations between household food insecurity and children’s physical activity and sedentary behaviours.
Design:
Secondary analysis was conducted on the Healthy Communities Study, an observational study from 2013 to 2015. Household food insecurity was assessed by two items from the US Department of Agriculture’s 18-item US Household Food Security Survey Module. Physical activity was measured using the 7-d Physical Activity Behaviour Recall instrument. Data were analysed using multilevel statistical modelling.
Setting:
A total of 130 communities in the USA.
Participants:
In total, 5138 US children aged 4–15 years.
Results:
No association was found between household food insecurity and child physical activity. A significant interaction between household food insecurity and child sex was observed for sedentary behaviours (P = 0·03).
Conclusions:
Additional research capturing a more detailed assessment of children’s experiences of food insecurity in relation to physical activity is warranted. Future studies may consider adopting qualitative study designs or utilising food insecurity measures that specifically target child-level food insecurity. Subsequent research may also seek to further explore sub-group analyses by sex.
In 2019, California and Wilmington, Delaware, implemented policies requiring healthier default beverages with restaurant kids’ meals. The current study assessed restaurant beverage offerings and manager perceptions.
Design:
Pre-post menu observations were conducted in California and Wilmington. Observations of cashiers/servers during orders were conducted pre-post implementation in California and post-implementation in Wilmington. Changes in California were compared using multilevel logistic regression and paired t tests. Post-implementation, managers were interviewed.
Setting:
Inside and drive-through ordering venues in a sample of quick-service restaurants in low-income California communities and all restaurants in Wilmington subject to the policy, the month before and 7–12 months after policy implementation.
Participants:
Restaurant observations (California n 110; Wilmington n 14); managers (California n 75; Wilmington n 15).
Results:
Pre-implementation, the most common kids’ meal beverages on California menus were unflavoured milk and water (78·8 %, 52·0 %); in Wilmington, juice, milk and sugar-sweetened beverages were most common (81·8 %, 66·7 % and 46·2 %). Post-implementation, menus including only policy-consistent beverages significantly increased in California (9·7 % to 66·1 %, P < 0·0001), but remained constant in Wilmington (30·8 %). During orders, cashiers/servers offering only policy-consistent beverages significantly decreased post-implementation in California (5·0 % to 1·0 %, P = 0·002). Few managers (California 29·3 %; Wilmington 0 %) reported policy knowledge, although most expressed support. Most managers wanted additional information for customers and staff.
Conclusions:
While the proportion of menus offering only policy-consistent kids’ meal default beverages increased in California, offerings did not change in Wilmington. In both jurisdictions, managers lacked policy knowledge, and few cashiers/servers offered only policy-consistent beverages. Additional efforts are needed to strengthen implementation of kids’ meal beverage policies.
ABSTRACT IMPACT: Measuring and analyzing qualitative and quantitative traits using phenomics approaches will yield previously unrecognized heart failure subphenotypes and has the potential to improve our knowledge of heart failure pathophysiology, identify novel biomarkers of disease, and guide the development of targeted therapeutics for heart failure. OBJECTIVES/GOALS: Current classification schemes fail to capture the broader pathophysiologic heterogeneity in heart failure. Phenomics offers a newer unbiased approach to identify subtypes of complex disease syndromes, like heart failure. The goal of this research is to use data-driven associations to redefine the classification of the heart failure syndrome. METHODS/STUDY POPULATION: We will identify < 10 subphenotypes of patients with heart failure using unsupervised machine learning approaches for dense multidimensional quantitative (i.e. demographics, comorbid conditions, physiologic measurements, clinical laboratory, imaging, and medication variables; disease diagnosis, procedure, and billing codes) and qualitative data extracted from an integrated health system electronic health record. The heart failure subphenotypes we identify from the integrated health system electronic health record will be replicated in other heart failure population datasets using unsupervised learning approaches. We will explore the potential to establish associations between identified subphenotypes and clinical outcomes (e.g. all-cause mortality, cardiovascular mortality). RESULTS/ANTICIPATED RESULTS: We expect to identify < 10 mutually exclusive phenogroups of patients with heart failure that have differential risk profiles and clinical trajectories. DISCUSSION/SIGNIFICANCE OF FINDINGS: We will attempt to derive and validate a data-driven unbiased approach to the categorization of novel phenogroups in heart failure. This has the potential to improve our knowledge of heart failure pathophysiology, identify novel biomarkers of disease, and guide the development of targeted therapeutics for heart failure.
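A minimal sketch of the kind of unsupervised approach described, using k-means over standardised EHR-derived features and a silhouette criterion to choose fewer than 10 phenogroups. The input file, feature set and choice of clustering algorithm are illustrative assumptions rather than the project's specified methods.

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Hypothetical feature table of EHR-derived variables (demographics, labs,
# imaging measurements, etc.); file and column names are assumptions.
X = pd.read_csv("hf_features.csv").select_dtypes("number").dropna()
Xs = StandardScaler().fit_transform(X)

# Evaluate candidate solutions with fewer than 10 clusters and keep the one
# with the best silhouette score, one simple way to pick a phenogroup count.
best_k, best_score, best_labels = None, -1.0, None
for k in range(2, 10):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(Xs)
    score = silhouette_score(Xs, labels)
    if score > best_score:
        best_k, best_score, best_labels = k, score, labels

print(f"Selected {best_k} phenogroups (silhouette = {best_score:.2f})")
```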
Trifludimoxazin, a new protoporphyrinogen oxidase–inhibiting herbicide, is being evaluated for possible use as a soil-residual active herbicide treatment in cotton for control of small-seeded annual broadleaf weeds. Laboratory and greenhouse studies were conducted to compare the vertical mobility and cotton tolerance of trifludimoxazin with those of flumioxazin and saflufenacil, two protoporphyrinogen oxidase–inhibiting herbicides currently registered for use in cotton, in three West Texas soils. Vertical soil mobility of trifludimoxazin was similar to that of flumioxazin in the Acuff loam and Olton loam soils, but trifludimoxazin was more mobile than flumioxazin in the Amarillo loamy sand soil. The depth of trifludimoxazin movement after a 2.5-cm irrigation event ranged from 2.5 to 5.0 cm in all soils, which would not allow for crop selectivity based on herbicide placement, because the ideal cotton seeding depth is 0.6 to 2.54 cm. Greenhouse studies indicated that PRE treatments were more injurious than the 14-d preplant treatment when summarized across soils for the three herbicides (43% and 14% injury, respectively). No differences in visual cotton response or dry weight were observed after the trifludimoxazin preplant treatment compared with the nontreated control within each of the three West Texas soils, and the response was similar to that of the flumioxazin preplant treatment across soils. On the basis of these results, a use pattern for trifludimoxazin in cotton may be established with a preplant restriction of more than 14 d before cotton planting.
Understanding the clinical risk factors for COVID-19 disease severity and outcomes requires a combination of data from electronic health records and patient reports. To facilitate the collection of patient-reported data, as well as accelerate and standardize the collection of data about host factors, we have constructed a COVID-19 survey. This survey is freely available to the scientific community to send electronically for patients to complete online. This patient survey is designed to be comprehensive, yet not overly burdensome, to gather data useful for a range of clinical investigations, and to accommodate a wide variety of implementation settings including at a COVID-19 testing site, at home during infection or after recovery, and/or for individuals while they are hospitalized. A widely adopted standardized survey that can be implemented online with minimal resources can serve as a critical tool for combining and comparing data across studies to improve our understanding of COVID-19 disease.
We consider various aspects of longevity trend risk viewed through the prism of a finite time window. We show the broad equivalence of value-at-risk (VaR) capital requirements at a p-value of 99.5% to conditional tail expectations (CTEs) at 99%. We also show how deferred annuities carry higher risk, which can require double the solvency capital of equivalently aged immediate annuities. However, results vary considerably with the choice of model, and so longevity trend-risk capital can only be determined through consideration of multiple models to inform actuarial judgement. This model risk is even starker when trying to value longevity derivatives. We briefly discuss the importance of using smoothed models and describe two methods to considerably shorten VaR and CTE run times.
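As a toy numerical illustration of the VaR/CTE comparison (not the paper's models or data), both quantities can be read off a set of simulated one-year capital outcomes:

```python
import numpy as np

def var_and_cte(losses, var_level=0.995, cte_level=0.99):
    """99.5% VaR and 99% CTE of a simulated capital-requirement distribution."""
    losses = np.sort(np.asarray(losses))
    var_99_5 = np.quantile(losses, var_level)
    tail = losses[losses >= np.quantile(losses, cte_level)]  # worst 1% of outcomes
    cte_99 = tail.mean()
    return var_99_5, cte_99

# Toy run-off simulations (e.g. changes in annuity liability value over one year);
# a skewed right tail stands in for adverse longevity-trend scenarios.
rng = np.random.default_rng(1)
sims = rng.lognormal(mean=0.0, sigma=0.25, size=10_000) - 1.0
var_99_5, cte_99 = var_and_cte(sims)
print(f"VaR(99.5%) = {var_99_5:.3f}, CTE(99%) = {cte_99:.3f}")
```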
Neuroticism is a risk factor for selected mental and physical illnesses and is inversely associated with intelligence. Intelligence appears to interact with neuroticism and mitigate its detrimental effects on physical health and mortality. However, the inter-relationships of neuroticism and intelligence for major depressive disorder (MDD) and psychological distress have not been well examined.
Methods:
Associations and interactions between neuroticism and general intelligence (g) on MDD, self-reported depression, and psychological distress were examined in two population-based cohorts: Generation Scotland: Scottish Family Health Study (GS:SFHS, n = 19,200) and UK Biobank (n = 90,529). The Eysenck Personality Scale Short Form-Revised measured neuroticism and g was extracted from multiple cognitive ability tests in each cohort. Family structure was adjusted for in GS:SFHS.
Results:
Neuroticism was strongly associated with increased risk for depression and higher psychological distress in both samples. Although intelligence conferred no consistent independent effects on depression, it did increase the risk for depression across samples once neuroticism was adjusted for. Results suggest that higher intelligence may ameliorate the association between neuroticism and self-reported depression, although no significant interaction was found for clinical MDD. Intelligence was inversely associated with psychological distress across cohorts. An interaction was found across samples such that lower psychological distress was associated with higher intelligence and lower neuroticism, although effect sizes were small.
Conclusions:
From two large cohort studies, our findings suggest that intelligence acts as a protective factor in mitigating the effects of neuroticism on psychological distress. Intelligence does not confer protection against a diagnosis of depression in those high in neuroticism.
The Age-Period-Cohort-Improvement (APCI) model is a new addition to the canon of mortality forecasting models. It was introduced by the Continuous Mortality Investigation as a means of parameterising a deterministic targeting model for forecasting, but this paper shows how it can be implemented as a fully stochastic model. We demonstrate a number of interesting features of the APCI model, including which parameters to smooth and how much better the model fits the data compared with some other, related models. However, this better fit also sometimes results in higher value-at-risk (VaR)-style capital requirements for insurers, and we explore why this is by looking at the density of the VaR simulations.
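For readers unfamiliar with the model, the APCI predictor combines an age effect, an age-modulated improvement trend, a period effect and a cohort effect. Up to the centring and sign conventions, which vary between presentations, it is usually written as

log m_{x,t} = α_x + β_x (t − t̄) + κ_t + γ_{t−x},

where m_{x,t} is the central mortality rate at age x in year t, t̄ is the mean year of the fitting window, α_x is the age effect, β_x modulates the age-specific improvement trend, κ_t is the period effect and γ_{t−x} is the cohort effect.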
The chemical, mineralogical, and textural changes involved in the weathering of basalt have been traced through various stages from fresh rock (which has a cation exchange capacity of 10 meq/100 g due to the presence of a swelling chlorite mineral) to reddened basaltic rubble consisting of interstratified montmorillonite-illite, hematite, and anatase. The cation exchange capacities of the rocks increase progressively with the formation of secondary clay from labradorite as Al, Fe, and Ti accumulate and Si, Mg, Ca, and Na are depleted—much of the K is retained in the secondary clay mineral. The weathering is considered to be contemporaneous with the formation of the Antrim bauxites but not so intense.
Introduction: The World Health Organization recommends emergency care training for laypeople in low-resource settings, but the effects of these programs on patient outcomes and community health have not been systematically reviewed. Our objective was to identify the individual and community health effects of educating laypeople to deliver emergency care in low-resource settings. Methods: We conducted a systematic review to address this question: in low-resource populations (P), does emergency care education for laypeople (I) confer any measurable effect on patient morbidity and mortality, or community capacity and resilience for emergency health conditions (O), in comparison with no training or other education (C)? We searched 12 electronic databases and grey literature for quantitative studies. We conducted duplicate and independent title and abstract screening, methodological and outcomes extraction, and study quality assessment using the Effective Public Health Practice Tool. We developed a narrative summary of findings (PROSPERO: CRD42014009685). Results: We reviewed 16,017 abstracts and 372 full-text papers; 38 met inclusion criteria. Most topically relevant papers were excluded because they assessed educational outcomes. Cardiopulmonary resuscitation training (6 papers) improved cardiac arrest survival and enhanced capacity to respond to cardiac arrest in rural Norway, Denmark and commercial aircraft operations. A public education campaign in remote Denmark improved absolute cardiac arrest survival by 5.4% (95% CI 2-12). Lay trauma training (12 papers) reduced absolute injury mortality and improved community capacity in Iraq, Cambodia, Iran and Indigenous New Zealand communities. A trauma care program in Iraq and Cambodia reduced absolute mortality by 25% (95% CI 17.2-33). Education for mothers on paediatric fevers in Ethiopia was associated with 40% relative reductions in under-5 mortality (95% CI 29.2-50.6). Similar training improved access to care for paediatric malnutrition, malaria, pneumonia, and gastrointestinal disease in Nigeria, Kenya, Senegal, Burkina Faso, Mali, and India (13 papers). Overdose education and naloxone distribution was associated with reductions in opioid overdose deaths (3 papers), including in Massachusetts, where high-uptake communities for overdose education had significantly lower overdose fatality rates than no-uptake communities (rate ratio 0.54, 95% CI 0.39-0.76). Community education improved measures of access to emergency care for remote Indigenous populations in Canada, Alaska and Nepal (3 papers) and adolescent mental health capacity in Australia (1 paper). Studies were of low or medium quality. Conclusion: In addition to established interventions for injury and cardiac arrest, emergency care training can improve community capacity in underserviced populations and save lives in opioid overdose, paediatric infectious disease and malnutrition.
Introduction: Current guideline recommendations for optimal management of non-purulent skin and soft tissue infections (SSTIs) are based on expert consensus. There is currently a lack of evidence to guide emergency physicians on when to select oral versus intravenous antibiotic therapy. The primary objective was to identify risk factors associated with oral antibiotic treatment failure. A secondary objective was to describe the epidemiology of adult emergency department (ED) patients with non-purulent SSTIs. Methods: We performed a health records review of adults (age ≥ 18 years) with non-purulent SSTIs treated at two tertiary care EDs. Patients were excluded if they had a purulent infection or infected ulcers without surrounding cellulitis. Treatment failure was defined as any of the following after a minimum of 48 hours of oral therapy: (i) hospitalization for SSTI; (ii) change in class of oral antibiotic owing to infection progression; or (iii) change to intravenous therapy owing to infection progression. Multivariable logistic regression was used to identify predictors independently associated with the primary outcome of oral antibiotic treatment failure after a minimum of 48 hours of oral therapy. Results: We enrolled 500 patients (mean age 64 years; 279 (55.8%) male; 126 (25.2%) with diabetes); the hospital admission rate was 29.6%. The majority of patients (70.8%) received at least one intravenous antibiotic dose in the ED. Of 288 patients who had received a minimum of 48 hours of oral antibiotics, there were 85 oral antibiotic treatment failures (29.5%). Tachypnea at triage (odds ratio [OR]=6.31, 95% CI=1.80 to 22.08), chronic ulcers (OR=4.90, 95% CI=1.68 to 14.27), history of MRSA colonization or infection (OR=4.83, 95% CI=1.51 to 15.44), and cellulitis in the past 12 months (OR=2.23, 95% CI=1.01 to 4.96) were independently associated with oral antibiotic treatment failure. Conclusion: This is the first study to evaluate potential predictors of oral antibiotic treatment failure for non-purulent SSTIs in the ED. We observed a high rate of treatment failure and hospitalization. Tachypnea at triage, chronic ulcers, history of MRSA colonization or infection, and cellulitis within the past year were independently associated with oral antibiotic treatment failure. Emergency physicians should consider these risk factors when deciding on oral versus intravenous antimicrobial therapy for non-purulent SSTIs being managed as outpatients.
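As an illustration of the multivariable step (not the study's actual analysis code), a logistic model with the four reported predictors could be fit and converted to odds ratios as follows; the data file and variable names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical variable names for the health-records dataset; actual coding may differ.
# failure: 1 = oral antibiotic treatment failure; predictors are 0/1 indicators.
df = pd.read_csv("ssti_oral_cohort.csv")  # assumed file of patients with >= 48 h oral therapy

model = smf.logit(
    "failure ~ tachypnea_triage + chronic_ulcer + mrsa_history + cellulitis_past_year",
    data=df,
).fit()

# Odds ratios with 95% confidence intervals, in the format the abstract reports.
or_table = np.exp(pd.concat([model.params, model.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table)
```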