Objective:
To characterise children’s lunchbox contents for food, waste and packaging.
Design:
A cross-sectional study was conducted. Lunchboxes were photographed at two time points on the same day: before first morning break to capture food and packaging and post-lunch break to capture food waste. Contents were coded using an audit tool developed using REDCap.
Setting:
Twenty-three sites across metropolitan Adelaide, South Australia, including fourteen preschools and nine primary schools in low (n 8), medium (n 7) and high (n 8) socioeconomic areas.
Participants:
Preschool (ages 3–5 years) to Grade 7 primary school (ages 6–13 years) students.
Results:
In total, 673 lunchboxes were analysed. Grain foods dominated (with at least half of them being discretionary varieties), with 92 % of lunchboxes having at least one item from that category, followed by fruits (78 %), snacks (62 %), dairy (32 %) and vegetables (26 %). Lunchboxes of preschool children contained more fruits (92 % v. 65 %; χ2(1) = 73·3, P < 0·01), vegetables (36 % v. 16 %; χ2(1) = 34·0, P < 0·01) and dairy items (45 % v. 19 %; χ2(1) = 53·6, P < 0·01), compared with lunchboxes of primary school children. Snack foods were more prevalent in primary school (68 %) than preschool (55 %; χ2(1) = 11·2, P < 0·01). Discretionary foods appeared more frequently, and single-use packaging accounted for half (53 %) of all packaging in lunchboxes, primarily from snacks and grain foods. Preschool children had less single-use packaging but more food waste. Vegetables were the most wasted food group.
Conclusions:
Sandwiches, fruits and various snacks are typical lunchbox foods, often accompanied by single-use packaging. Considering both health and environmental factors in lunchbox choices could benefit children and sustainability efforts in schools.
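The preschool-versus-primary comparisons above are Pearson chi-square tests on 2 × 2 tables (food group present/absent by school type). As a minimal sketch with hypothetical counts (not the study’s raw data), the 1-df statistic can be computed directly:

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 df) for a 2x2 contingency table
    laid out as [[a, b], [c, d]] (rows: group; columns: item present/absent)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts (illustrative only): fruit present/absent in
# preschool vs primary school lunchboxes, summing to 673 overall.
preschool_fruit, preschool_none = 184, 16   # ~92 % with fruit
primary_fruit, primary_none = 307, 166      # ~65 % with fruit
stat = chi2_2x2(preschool_fruit, preschool_none, primary_fruit, primary_none)
```

A statistic above the 1-df critical value of 3.84 corresponds to P < 0.05.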
Amid resurgent geopolitical fissures and in the aftermath of the Covid-19 pandemic, there is a growing awareness in the sector of the need for, and concern about, national and international collaboration in archaeological projects. This article reflects on present-day challenges for international collaboration in central Eurasian archaeology and furthers a much-needed discussion about (re)integrating local narratives with inter-regional trends in future research. Responsible and practical proposals for bridging collaborator differences in institutional or publishing obligations, language capacities and access to resources are discussed.
This study identified 26 late invasive primary surgical site infections (IP-SSI) within 4–12 months of transplantation among 2073 solid organ transplant (SOT) recipients at Duke University Hospital over the period 2015–2019. Thoracic organ transplants accounted for 25 of these late IP-SSI. Surveillance for late IP-SSI should be maintained for at least one year following transplant.
This study aimed to understand rural–urban differences in the uptake of COVID-19 vaccinations during the peak period of the national vaccination roll-out in Aotearoa New Zealand (NZ). Using a linked national dataset of health service users aged 12+ years and COVID-19 immunization records, age-standardized rates of vaccination uptake were calculated at fortnightly intervals, between June and December 2021, by rurality, ethnicity, and region. Rate ratios were calculated for each rurality category with the most urban areas (U1) used as the reference. Overall, rural vaccination rates lagged behind urban rates, despite early rapid rural uptake. By December 2021, a rural–urban gradient developed, with age-standardized coverage for R3 areas (most rural) at 77%, R2 81%, R1 83%, U2 85%, and U1 (most urban) 89%. Age-based assessments illustrate the rural–urban vaccination uptake gap was widest for those aged 12–44 years, with older people (65+) having broadly consistent levels of uptake regardless of rurality. Variations from national trends are observable by ethnicity. Early in the roll-out, Indigenous Māori residing in R3 areas had a higher uptake than Māori in U1, and Pacific peoples in R1 had a higher uptake than those in U1. The extent of differences in rural–urban vaccine uptake also varied by region.
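The age-standardised coverage figures above come from direct standardisation: age-specific rates weighted by a standard population’s age distribution, with rate ratios taken against the U1 reference. A small sketch, using two made-up age bands and an assumed standard population rather than the NZ data:

```python
def age_standardised_rate(events, populations, std_pop):
    """Directly age-standardised rate: age-specific rates weighted by a
    standard population's age distribution."""
    rates = [e / p for e, p in zip(events, populations)]
    return sum(w * r for w, r in zip(std_pop, rates)) / sum(std_pop)

# Hypothetical two-band example (younger, older); counts are illustrative only.
rural = age_standardised_rate([300, 450], [400, 500], std_pop=[60, 40])
urban = age_standardised_rate([800, 950], [900, 1000], std_pop=[60, 40])
rate_ratio = rural / urban  # <1 indicates lower rural coverage, as reported
```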
Two independent temporal-spatial clusters of hospital-onset Rhizopus infections were evaluated using whole-genome sequencing (WGS). Phylogenetic analysis confirmed that isolates within each cluster were unrelated despite epidemiological suspicion of outbreaks. The ITS1 region alone was insufficient for accurate analysis. WGS has utility for rapid rule-out of suspected nosocomial Rhizopus outbreaks.
A conceptual model, based on field observations and assumed physics of a perennial firn aquifer near Helheim Glacier (southeast Greenland), is evaluated via steady-state 2-D simulation of liquid water flow and energy transport with phase change. The simulation approach allows natural representation of flow and energy advection and conduction that occur in vertical meltwater recharge through the unsaturated zone and in lateral flow within the saturated aquifer. Agreement between measured and simulated aquifer geometry, temperature, and recharge and discharge rates confirms that the conceptual field-data-based description of the aquifer is consistent with the primary physical processes of groundwater flow, energy transport and phase change. Factors that are found to control simulated aquifer configuration include surface temperature, meltwater recharge rate, residual total-water saturation and capillary fringe thickness. Simulation analyses indicate that the size of perennial firn aquifers depends primarily on recharge rates from surface snowmelt. Results also imply that the recent aquifer expansion, likely due to a warming climate, may eventually produce lakes on the ice-sheet surface that would affect the surface energy balance.
Background:
Methicillin-resistant Staphylococcus aureus (MRSA) is an important pathogen in neonatal intensive care units (NICU) that confers significant morbidity and mortality.
Objective:
Improving our understanding of MRSA transmission dynamics, especially among high-risk patients, is an infection prevention priority.
Methods:
We investigated a cluster of clinical MRSA cases in the NICU using a combination of epidemiologic review and whole-genome sequencing (WGS) of isolates from clinical and surveillance cultures obtained from patients and healthcare personnel (HCP).
Results:
Phylogenetic analysis identified 2 genetically distinct clades and revealed multiple silent-transmission events between HCP and infants. The predominant outbreak strain harbored multiple virulence factors. Epidemiologic investigation and genomic analysis identified an HCP colonized with the dominant MRSA outbreak strain who cared for most NICU patients who were infected or colonized with the same strain, including 1 NICU patient with severe infection 7 months before the described outbreak. These results guided implementation of infection prevention interventions that prevented further transmission events.
Conclusions:
Silent transmission of MRSA between HCP and NICU patients likely contributed to a NICU outbreak involving a virulent MRSA strain. WGS enabled data-driven decision making to inform implementation of infection control policies that mitigated the outbreak. Prospective WGS coupled with epidemiologic analysis can be used to detect transmission events and prompt early implementation of control strategies.
McPherson, Smith-Lovin, and Cook’s (2001) Annual Review of Sociology piece “Birds of a Feather” (“Birds”, hereafter) focused on the phenomenon of homophily – the empirical reality that connections are more likely between similar others than dissimilar others.
Agenesis of the corpus callosum (AgCC) is associated with a range of cognitive deficits, including mild to moderate problems in higher order executive functions evident in neuropsychological assessments. Previous research has also suggested a lack of self-awareness in persons with AgCC.
Method:
We investigated daily executive functioning and self-awareness in 36 individuals with AgCC by analyzing self-ratings on the Behavior Rating Inventory of Executive Function-Adult Version (BRIEF-A), as well as ratings on the same instrument from close relatives. Discrepancies between self- and informant-ratings were compared to the normative sample and exploratory analyses examined possible moderating effects of participant and informant characteristics.
Results:
Significant deficiencies were found in the Behavioral Regulation and Metacognitive indices for both the self and informant results, with elevated frequency of metacognition scores in the borderline to clinical range. Informants also endorsed elevated frequency of borderline to clinically significant behavioral regulation scores. The proportion of AgCC participants whose self-ratings indicated less metacognitive impairment than informant-ratings was greater than in the normative sample. Self-ratings of behavioral regulation impairment decreased with age and informant-ratings of metacognition were higher in males than females.
Conclusions:
These findings provide evidence that individuals with AgCC experience mild to moderate executive functioning problems in everyday behavior which are observed by others. Results also suggest a lack of self-understanding or insight into the severity of these problems in the individuals with AgCC, particularly with respect to their metacognitive functioning.
Background: Shared Healthcare Intervention to Eliminate Life-threatening Dissemination of MDROs in Orange County, California (SHIELD OC) was a CDC-funded regional decolonization intervention from April 2017 through July 2019 involving 38 hospitals, nursing homes (NHs), and long-term acute-care hospitals (LTACHs) to reduce MDROs. Decolonization in NH and LTACHs consisted of universal antiseptic bathing with chlorhexidine (CHG) for routine bathing and showering plus nasal iodophor decolonization (Monday through Friday, twice daily every other week). Hospitals used universal CHG in ICUs and provided daily CHG and nasal iodophor to patients in contact precautions. We sought to evaluate whether decolonization reduced hospitalization and associated healthcare costs due to infections among residents of NHs participating in SHIELD compared to nonparticipating NHs. Methods: Medicaid insurer data covering NH residents in Orange County were used to calculate hospitalization rates due to a primary diagnosis of infection (counts per member quarter), hospital bed days/member-quarter, and expenditures/member quarter from the fourth quarter of 2015 to the second quarter of 2019. We used a time-series design and a segmented regression analysis to evaluate changes attributable to the SHIELD OC intervention among participating and nonparticipating NHs. Results: Across the SHIELD OC intervention period, intervention NHs experienced a 44% decrease in hospitalization rates, a 43% decrease in hospital bed days, and a 53% decrease in Medicaid expenditures when comparing the last quarter of the intervention to the baseline period (Fig. 1). These data translated to a significant downward slope, with a reduction of 4% per quarter in hospital admissions due to infection (P < .001), a reduction of 7% per quarter in hospitalization days due to infection (P < .001), and a reduction of 9% per quarter in Medicaid expenditures (P = .019) per NH resident. 
Conclusions: The universal CHG bathing and nasal decolonization intervention adopted by NHs in the SHIELD OC collaborative resulted in large, meaningful reductions in hospitalization events, hospitalization days, and healthcare expenditures among Medicaid-insured NH residents. The findings led CalOptima, the Medicaid provider in Orange County, California, to launch an NH incentive program that provides dedicated training and covers the cost of CHG and nasal iodophor for OC NHs that enroll.
Funding: None
Disclosures: Gabrielle M. Gussin, University of California, Irvine, Stryker (Sage Products): Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes. Clorox: Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes. Medline: Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes. Xttrium: Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes.
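The segmented regression described in the Methods fits per-quarter trends to the outcome series before and during the intervention. As a rough sketch, assuming hypothetical quarterly hospitalization rates rather than the Medicaid data, the segment slopes can be estimated by ordinary least squares and compared:

```python
def ols_slope(xs, ys):
    """Least-squares slope of y on x: the per-quarter trend within a segment."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)

# Hypothetical hospitalization rates per member-quarter (illustrative only).
baseline = [0.050, 0.051, 0.049, 0.050, 0.052, 0.051]  # pre-intervention quarters
interv = [0.049, 0.046, 0.043, 0.040, 0.037, 0.034]    # intervention quarters
pre_slope = ols_slope(range(len(baseline)), baseline)
post_slope = ols_slope(range(len(interv)), interv)     # negative: declining trend
```

A significantly more negative slope in the intervention segment is the kind of change the SHIELD OC analysis attributes to decolonization.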
Although we find Gangestad & Simpson's argument intriguing, we question some of its underlying assumptions, including: (1) that fluctuating asymmetry (FA) is consistently heritable; (2) that symmetry is driving the effects; (3) that use of parametric tests with FA is appropriate; and (4) that a short-term mating strategy produces more offspring than a long-term strategy.
A national need is to prepare for and respond to accidental or intentional disasters categorized as chemical, biological, radiological, nuclear, or explosive (CBRNE). These incidents require specific subject-matter expertise, yet have commonalities. We identify 7 core elements comprising CBRNE science that require integration for effective preparedness planning and public health and medical response and recovery. These core elements are (1) basic and clinical sciences, (2) modeling and systems management, (3) planning, (4) response and incident management, (5) recovery and resilience, (6) lessons learned, and (7) continuous improvement. A key feature is the ability of relevant subject matter experts to integrate information into response operations. We propose the CBRNE medical operations science support expert as a professional who (1) understands that CBRNE incidents require an integrated systems approach, (2) understands the key functions and contributions of CBRNE science practitioners, (3) helps direct strategic and tactical CBRNE planning and responses through first-hand experience, and (4) provides advice to senior decision-makers managing response activities. Recognition of both CBRNE science as a distinct competency and the establishment of the CBRNE medical operations science support expert informs the public of the enormous progress made, broadcasts opportunities for new talent, and enhances the sophistication and analytic expertise of senior managers planning for and responding to CBRNE incidents.
The Virtual Personalities Model is a motive-based neural network model that provides both a psychological model and a computational implementation that explicates the dynamics and often large within-person variability in behavior that arises over time. At the same time the same model can produce—across many virtual personalities—between-subject variability in behavior that when factor analyzed yields familiar personality structure (e.g., the Big Five). First, we describe our personality model and its implementation as a neural network model. Second, we focus on detailing the neurobiological underpinnings of this model. Third, we examine the learning mechanisms, and their biological substrates, as ways that the model gets “wired up,” discussing Pavlovian and Instrumental conditioning, Pavlovian to Instrumental transfer, and habits. Finally, we describe the dynamics of how initial differences in propensities (e.g., dopamine functioning), wiring differences due to experience, and other factors could operate together to develop and change personality over time, and how this might be empirically examined. Thus, our goal is to contribute to the rising chorus of voices seeking a more precise neurobiologically based science of the complex dynamics underlying personality.
Parenting, Eating and Activity for Child Health (PEACH) is a multi-component lifestyle intervention for families with overweight and obese children. PEACH was translated from an efficacious randomised-controlled trial (RCT) and delivered at scale as PEACH Queensland (QLD) in Queensland, Australia. The aim of this study is to explore pre–post changes in parenting, and child-level eating, activity and anthropometry, in the PEACH QLD service delivery project. PEACH QLD enrolled 926 overweight/obese children (817 families). Pre-programme evaluation was completed for 752 children and paired pre–post-programme evaluation data were available for 388 children. At baseline, children with pre–post-programme data were (mean) 8·8 years old, and at follow-up were 9·3 years old, with mean time between pre–post-programme measures of 0·46 years. Outcomes reflected each domain of the PEACH programme: parenting, eating behaviour of the child and activity behaviours (means reported). Parents reported improvements in parenting self-efficacy (3·6 to 3·7, P=0·001). Children had improved eating behaviours: eating more daily serves of vegetables (2·0 to 2·6, P=0·001) and fewer non-milk sweetened beverages (0·9 to 0·6, P=0·001) and discretionary foods (2·2 to 1·5, P=0·001). Children spent more time in moderate-to-vigorous physical activity (86 to 105 min/d, P=0·001) and less time in sedentary screen-based behaviours (190 to 148 min/d, P=0·001). Consequently, there were significant improvements in mean BMIz (−0·112; P<0·001) and weight status (healthy weight/overweight/obese/morbidly obese prevalence from 0/22/33/45 % to 2/27/34/37 %, P<0·001). When delivered at scale, PEACH remains an effective family-based, multi-component, lifestyle weight management programme for overweight and obese children whose families engage in the programme.
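The pre-post P values above reflect paired comparisons within children. The abstract does not name the exact test used, but a paired t statistic on within-child change is one standard approach; a minimal sketch with made-up vegetable-serve counts (not the PEACH data):

```python
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired t statistic on within-child differences (post minus pre)."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / n ** 0.5)

# Hypothetical daily vegetable serves for five children (illustrative only).
pre_serves = [2, 1, 3, 2, 2]
post_serves = [3, 2, 3, 3, 2]
t_stat = paired_t(pre_serves, post_serves)  # positive: serves increased
```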
For nine years, an REU program placed over 200 undergraduate researchers at Northeastern University, the University of Massachusetts Lowell, and the University of New Hampshire through the NSF-funded Nanoscale Science and Engineering Center for High-rate Nanomanufacturing. The cross-university professional development program included university-based research skills, communication skills with the Boston Museum of Science, and a unique method for researcher evaluation of the societal impact of their decisions. This work presents the impacts of this research program as measured at program end, along with the career progress of the REU participants, recent interviews with REU participants, and reflections by REU program leaders.
Anxiety is the most pervasive childhood mental health disorder today. This study examined the parent component of a school-based universal prevention and early intervention program. Participating parents (N = 122) completed four measures before and after the parent program: the Anxiety Sensitivity Index, the Center for Epidemiological Studies-Depression scale, the Penn State Worry Questionnaire, and the Screen for Child Anxiety Related Emotional Disorders. The effectiveness of the program was investigated by analysing mean scores of the parent self-reported anxiety symptoms and parent reports of child anxiety symptoms. The main analyses conducted were 2 × 2 between-within ANOVAs for each measure. The hypothesis that parents who participated in the program (n = 20) would report reduced anxiety symptoms for themselves and for their children when compared to parents who did not attend (n = 120) was not confirmed. Parents’ satisfaction with the program was also examined, with high acceptability ratings providing strong social validity for this program. Implications of the findings, strengths, limitations and suggestions for further research are discussed.
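In a 2 × 2 between-within design like the one above, the group × time interaction is equivalent to comparing pre-post change scores between the two groups with a two-sample t test. A minimal sketch with made-up change scores (not the study’s data):

```python
from statistics import mean, variance

def pooled_t(group1, group2):
    """Two-sample pooled-variance t statistic; applied to pre-post change
    scores, it tests the group x time interaction of a 2x2 mixed ANOVA."""
    n1, n2 = len(group1), len(group2)
    sp2 = ((n1 - 1) * variance(group1) + (n2 - 1) * variance(group2)) / (n1 + n2 - 2)
    return (mean(group1) - mean(group2)) / (sp2 * (1 / n1 + 1 / n2)) ** 0.5

# Hypothetical anxiety change scores (post minus pre); negative = improvement.
attended = [-3, -1, -2, 0, -4, -2]
control = [-1, 0, 1, -1, 0, 1]
t_stat = pooled_t(attended, control)  # negative: attendees improved more
```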
Objective.
To describe the rates of several key outcomes and healthcare-associated infections (HAIs) among hospitals that participated in the Duke Infection Control Outreach Network (DICON).
Design and Setting.
Prospective, observational cohort study of patients admitted to 24 community hospitals from 2003 through 2009.
Methods.
The following data were collected and analyzed: incidence of central line-associated bloodstream infections (CLABSIs), ventilator-associated pneumonia (VAP), catheter-associated urinary tract infections (CAUTIs), and HAIs caused by methicillin-resistant Staphylococcus aureus (MRSA); employee exposures to bloodborne pathogens (EBBPs); physician EBBPs; patient-days; central line-days; ventilator-days; and urinary catheter-days. Poisson regression was used to determine whether incidence rates of these HAIs and exposures changed during the first 5 and 7 years of participation in DICON; nonrandom clustering of each outcome was controlled for. Cost saved and lives saved were calculated on the basis of published estimates.
Results.
In total, we analyzed 6.5 million patient-days, 4,783 EBBPs, 2,948 HAIs due to MRSA, and 2,076 device-related infections. Rates of employee EBBPs, HAIs due to MRSA, and device-related infections decreased significantly during the first 5 years of participation in DICON (P < .05 for all models; average decrease was approximately 50%); in contrast, physician EBBPs remained unchanged. In aggregate, 210 CLABSIs, 312 cases of VAP, 332 CAUTIs, 1,042 HAIs due to MRSA, and 1,016 employee EBBPs were prevented. Each hospital saved approximately $100,000 per year of participation, and collectively the hospitals may have prevented 52–105 deaths from CLABSI or VAP. The 7-year analysis demonstrated that these trends continued with further participation.
Conclusions.
Hospitals with long-term participation in an infection control network decreased rates of significant HAIs by approximately 50%, decreased costs, and saved lives.
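Device-associated rates of the kind tracked here (CLABSI, VAP, CAUTI) are conventionally expressed per 1,000 device-days. A small sketch with hypothetical surveillance counts:

```python
def rate_per_1000_days(events, device_days):
    """Device-associated infection rate per 1,000 device-days
    (e.g. CLABSIs per 1,000 central-line days)."""
    return 1000 * events / device_days

# Hypothetical surveillance counts (illustrative only).
year1 = rate_per_1000_days(events=24, device_days=12000)  # 2.0 per 1,000
year5 = rate_per_1000_days(events=12, device_days=12000)  # 1.0 per 1,000, a ~50% decrease
```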