Vitamin A deficiency (VAD) poses significant health risks and is prevalent in children and adolescents in India. This study aimed to determine the effect of seasonal variation and availability of vitamin A-rich (VA-rich) foods on serum retinol in adolescents. Data on serum retinol levels from adolescents (n 2297, mean age 14 years) from the Comprehensive National Nutrition Survey (2016–2018) in India were analysed, with VAD defined as serum retinol < 0·7 µmol/L. Five states were selected based on a comparable under-five mortality rate and the seasonal spread of the data collection period. Dietary data from adolescents and children ≤ 4 years old were used to assess VA-rich food consumption. A linear mixed model framework was employed to analyse the relationship between serum retinol, month of the year and VA-rich food consumption, with a priori ranking to control for multiple hypothesis testing. Consumption of VA-rich foods, particularly fruits and vegetables/roots and tubers, showed seasonal patterns, with higher consumption during summer and monsoon months. Significant associations were found between serum retinol concentrations and age, month of sampling, and consumption of VA-rich foods and fish. VAD prevalence was lowest in August, coinciding with higher consumption of VA-rich fruits and foods. These findings highlight the importance of considering seasonality when assessing VAD prevalence and of interpreting survey findings carefully. Intentional design, analysis and reporting of surveys to capture seasonal variation are crucial for accurate assessment and interpretation of VAD prevalence, including during monitoring and evaluation of programmes, and to ensure that public health strategies are appropriately informed.
Evidence-based insertion and maintenance bundles are effective in reducing the incidence of central line-associated bloodstream infections (CLABSI) in intensive care unit (ICU) settings. We studied the adoption and compliance of CLABSI prevention bundle programs and CLABSI rates in ICUs in a large network of acute care hospitals across Canada.
It is widely accepted that meeting recommended protein intake is protective of muscle mass(1). Insufficient intake is related to accelerated sarcopenia and impaired physical function, contributing to increased mortality and morbidity. The recommended target set by the American dietary guidelines is 0.8 g of protein per kg of body weight, based on data collated by the National Academies published in 2005(2). Currently, approximately 50% of women and 30% of men do not meet these targets(3). Analysing current patterns of intake is therefore of public interest, as awareness of the factors that influence protein intake can inform improved strategy.
To investigate the factors which determine an individual’s protein intake and how they can be used to predict daily intake.
A secondary data analysis of longitudinal data collected in the National Health and Nutrition Examination Survey (NHANES) between 2011 and 2020 has been carried out(4). Data was accessed from the public domain on the Centers for Disease Control and Prevention (CDC) website. The study protocol received approval from the research ethics review board of the National Center for Health Statistics (NCHS) of the CDC. Average protein intake has been calculated and participant demographics reported. STATA software has been used to carry out a bivariate regression of factors associated with protein intake, an adjusted multivariate regression analysis and a parsimonious model.
19,601 participants (52.4% women) aged 20 and over had valid protein data. Adjusted regression analysis generated three model fits; the parsimonious model excluded BMI categories and household income, which had statistically insignificant impacts on protein intake. Men consumed 23.99g more protein per day than women (p<0.001; 95% CI 23.09 to 24.89). Individuals over 65 consumed 13.92g less protein per day than those aged 20-35 years old (p<0.001; 95% CI −15.25 to −12.59). Mexican American individuals consumed 7.47g more protein than Non-Hispanic White individuals (p<0.001; 95% CI 5.89 to 9.04), and Non-Hispanic White individuals consumed 2.95g more protein than non-Hispanic Black individuals (p<0.001; 95% CI 1.77 to 4.13). Those with the lowest educational attainment consumed 10.77g less protein than individuals with a college degree (p<0.001; 95% CI −12.79 to −8.74). From 2011 to 2020, there was a gradual decline in protein intake, which is statistically significant when comparing 2011/12 with 2015/16 and with 2017/2020.
Protein intake can be predicted by an individual’s gender, age, ethnicity, level of education attainment and time period. This study informs policymakers that individuals aged 65 and above are at risk of insufficient protein intake and there has been a general decrease in protein consumption over time. This provides evidence to support initiatives focused on this age category to maximise change and reduce rates of sarcopenia.
Essential minerals are cofactors for synthesis of neurotransmitters supporting cognition and mood. An 8-week fully-blind randomised controlled trial of multinutrients for attention-deficit/hyperactivity disorder (ADHD) demonstrated three times as many children (age 6–12) had significantly improved behaviour (‘treatment responders’) on multinutrients (54 %) compared with placebo (18 %). The aim of this secondary study was to evaluate changes in fasted plasma and urinary mineral concentrations following the intervention and their role as mediators and moderators of treatment response. Fourteen essential or trace minerals were measured in plasma and/or urine at baseline and week eight from eighty-six participants (forty-nine multinutrients, thirty-seven placebos). Two-sample t tests/Mann–Whitney U tests compared 8-week change between treatment and placebo groups, which were also evaluated as potential mediators. Baseline levels were evaluated as potential moderators, using logistic regression models with clinical treatment response as the outcome. After 8 weeks, plasma boron, Cr (in females only), Li, Mo, Se and vanadium and urinary iodine, Li and Se increased more with multinutrients than placebo, while plasma phosphorus decreased. These changes did not mediate treatment response. However, baseline urinary Li trended towards moderation: participants with lower baseline urinary Li were more likely to respond to multinutrients (P = 0·058). Additionally, participants with higher baseline Fe were more likely to be treatment responders regardless of the treatment group (P = 0·036). These results show that multinutrient treatment response among children with ADHD is independent of their baseline plasma mineral levels, while baseline urinary Li levels show potential as a non-invasive biomarker of treatment response requiring further study.
Understanding characteristics of healthcare personnel (HCP) with SARS-CoV-2 infection supports the development and prioritization of interventions to protect this important workforce. We report detailed characteristics of HCP who tested positive for SARS-CoV-2 from April 20, 2020 through December 31, 2021.
Methods:
CDC collaborated with Emerging Infections Program sites in 10 states to interview HCP with SARS-CoV-2 infection (case-HCP) about their demographics, underlying medical conditions, healthcare roles, exposures, personal protective equipment (PPE) use, and COVID-19 vaccination status. We grouped case-HCP by healthcare role. To describe residential social vulnerability, we merged geocoded HCP residential addresses with CDC/ATSDR Social Vulnerability Index (SVI) values at the census tract level. We defined highest and lowest SVI quartiles as high and low social vulnerability, respectively.
Results:
Our analysis included 7,531 case-HCP. Most case-HCP with roles as certified nursing assistant (CNA) (444, 61.3%), medical assistant (252, 65.3%), or home healthcare worker (HHW) (225, 59.5%) reported their race and ethnicity as either non-Hispanic Black or Hispanic. More than one third of HHWs (166, 45.2%), CNAs (283, 41.7%), and medical assistants (138, 37.9%) reported a residential address in the high social vulnerability category. The proportion of case-HCP who reported using recommended PPE at all times when caring for patients with COVID-19 was lowest among HHWs compared with other roles.
Conclusions:
To mitigate SARS-CoV-2 infection risk in healthcare settings, infection prevention and control interventions should be specific to HCP roles and educational backgrounds. Additional interventions are needed to address high social vulnerability among HHWs, CNAs, and medical assistants.
The 57Fe Mössbauer spectra of six nontronite samples were measured at temperatures of 4.2 and 1.3 K. Three of the nontronites gave a complex magnetic hyperfine spectrum showing magnetic ordering at 4.2 K, and the other three required a lower temperature of 1.3 K to produce similar magnetic ordering. The spectra were computer-fitted with three closely overlapping sextets which are considered to arise from: (1) Fe3+ that is ordered magnetically in the cis-octahedral sites with a greater number of neighboring tetrahedral Fe3+ ions (51 T); (2) the cis-octahedral sites with a greater number of neighboring Si4+ ions (46 T); and (3) the tetrahedral sites (41 T). In an untreated sample a further sextet corresponding to interlayer Fe3+ (36 T) was identified. The magnetic ordering was complicated and not directly related to the iron content of these sites; it probably depended also on the overall composition and structural order of the particular nontronite. The ordering appears to have been essentially two-dimensional, consistent with the layer structure of this material.
Cave carbonate mineral deposits (speleothems) contain trace elements that are intensively investigated for their significance as palaeoclimate and environmental proxies. However, chlorine, which is abundant in marine and meteoric waters, has been overlooked as a potential palaeo-proxy, while cosmogenic 36Cl could, in principle, provide a solar irradiance proxy. Here, total Cl concentrations analysed from various speleothems were low (3–14 mg/kg), with variations linked to crystal fabrics. High-resolution synchrotron radiation micro X-ray fluorescence (μ-XRF) trace element mapping showed Cl often associated with Na, Si, and Al. We propose that speleothems incorporate Cl in two fractions: (1) water soluble (e.g., fluid inclusions) and (2) water insoluble and strongly bound (e.g., associated with detrital particulates). However, disparities indicated that alternate unidentified mechanisms for Cl incorporation were present, raising important questions regarding incorporation of many trace elements into speleothems. Our first measurements of 36Cl/Cl ratios in speleothems required large samples due to low Cl concentrations, limiting the potential of 36Cl as a solar irradiance proxy. Critically, our findings highlight a knowledge gap into how Cl and other trace elements are incorporated into speleothems, how the incorporation mechanisms and final elemental concentrations are related to speleothem fabrics, and the significance this may have for how trace elements in speleothems are interpreted as palaeoclimate proxies.
Chapter 5 gives an extended empirical example of the Benford agreement procedure for assessing the validity of social science data. The example uses country-level data collected and estimated by the Sea Around Us organization on the dollar values of reported and unreported fish landings from 2010 to 2016. We report Benford agreement analyses for the Sea Around Us data (1) by reporting status, (2) by decade, (3) for a large fishing region of 22 West African countries, and (4) for each of the 22 individual countries in West Africa.
Chapter 4 begins with a discussion of the types and kinds of data most suitable for an analysis that uses the Benford probability distribution. Next we describe an R computer program – program Benford – designed to evaluate observed data for agreement with the Benford probability distribution; and we give an example of output from the program using a typical dataset. We then move to an overview of our workflow of Benford agreement analyses where we outline our process for assessing the validity of data using Benford agreement analyses. We end the chapter with a discussion of the concept of Benford validity, which we will employ in subsequent chapters.
Chapter 7 takes a closer look at some of the Sea Around Us fish-landings data that we assessed for Benford agreement in Chapter 5. We chose these data because of the mixed agreement findings among them: while the full dataset and several sets of subgroups indicated that the data exhibited Benford validity, when we analyzed West African countries individually, a number of them were found to have unacceptable Benford agreement and therefore problematic Benford validity. We present ways in which researchers can assess the impact of unacceptable Benford agreement on their analyses.
Chapter 3 describes and illustrates the Benford probability distribution. A brief summary of the origin and evolution of the Benford distribution is given, and the development and assessment of various measures of goodness of fit between an empirical distribution and the Benford distribution are described and illustrated. These measures are Pearson’s chi-squared, Wilks’ likelihood-ratio, Hardy and Ramanujan’s partition theory, Fisher’s exact test, Kuiper’s measure, Tam Cho and Gaines’ d measure, Cohen’s w measure, and Nigrini’s MAD measure.
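As a minimal illustration of the idea behind these fit measures (this is not the book’s program Benford, which is written in R; function names here are hypothetical), the Benford first-digit probabilities and a Pearson chi-squared statistic against them can be sketched in Python:

```python
import math
from collections import Counter

# Benford first-digit probabilities: P(d) = log10(1 + 1/d), for d = 1..9
BENFORD = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def first_digit(x):
    """Leading non-zero decimal digit of a positive number."""
    while x < 1:
        x *= 10
    while x >= 10:
        x /= 10
    return int(x)

def benford_chi_squared(values):
    """Pearson chi-squared statistic comparing observed first-digit
    counts against the Benford expectation (8 degrees of freedom)."""
    counts = Counter(first_digit(v) for v in values)
    n = len(values)
    return sum((counts.get(d, 0) - n * p) ** 2 / (n * p)
               for d, p in BENFORD.items())
```

Large values of the statistic indicate departure from the Benford distribution; the other measures listed above (Kuiper, MAD, and so on) quantify the same observed-versus-expected discrepancy in different ways.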
Chapter 6 provides a second empirical example of the Benford agreement procedure: here we analyze new daily COVID-19 cases at the US state level and at the global level across nations. Both the state-level and the global analyses consider time as a variable. Specifically, we examine (1) for the United States, new reports of COVID-19 between January 22, 2020 and November 16, 2021 at the state level, and (2) for the cross-national data, new reports of COVID-19 between February 24, 2020 and January 13, 2022. At the state level, we report Benford agreement analyses for (1) the full dataset, (2) cases grouped alphabetically, (3) cases grouped regionally, (4) cases grouped by days of the week, and (5) cases grouped by their governor’s party (Republican or Democratic). We then turn our Benford agreement analysis to global cross-national COVID-19 data to assess whether Benford agreement of COVID-19 varies across countries.
This chapter gives an overview of the remainder of the book. We first provide commonsense and social science examples of reliability and validity, two necessary conditions that data must possess for conclusions based upon them to be trustworthy. We next introduce Benford’s law and offer a brief overview of other social science studies that have employed it to check the accuracy of their data. We then turn to an overview of our Benford agreement analysis procedure and introduce the concept of Benford validity. The chapter concludes with a plan for the remainder of the book.