Increased frequency and occurrence of herbicide-resistant biotypes heighten the need for alternative wild oat management strategies. There is an opportunity to exploit the height differential between wild oat and crops by targeting wild oat between panicle emergence and the onset of seed shed. Two field studies were conducted from 2015 to 2017: the first in Lacombe, AB, and the second in Lacombe, AB, and Saskatoon, SK. In the first study, we compared panicle removal methods (hand clipping, use of a hedge trimmer, and a selective herbicide crop-topping application) with a weedy check and an industry-standard in-crop herbicide application in wheat. These treatments were tested early (at panicle emergence), late (at initiation of seed shed), or in combination at one location over three years. In the second study, we investigated the optimal timing of panicle removal with a hedge trimmer, using weekly removals compared with a weedy check, in wheat and lentil at two locations (Lacombe, AB, and Saskatoon, SK) over three years. Among the tested methods, the early crop-topping treatment consistently had the largest impact on wild oat density, dockage, seedbank, and subsequent-year crop yield. The early treatment, or the combination of the early and late treatments, tended to reduce wild oat populations the following season more than the late treatments alone. In the second study, subsequent wild oat populations were not influenced by panicle removal timing, only by crop and location interactions. Panicle removal timing did significantly affect wild oat dockage in the year of treatment, but no consistent optimal timing could be identified. Together, the two studies highlight a number of additional questions to be investigated, as well as the opportunity to manage wild oat seedbank inputs at the panicle emergence stage of the wild oat lifecycle.
To prioritise and refine a set of evidence-informed statements into advice messages to promote vegetable liking in early childhood, and to determine applicability for dissemination of advice to relevant audiences.
A nominal group technique (NGT) workshop and a Delphi survey were conducted to prioritise and achieve consensus (≥70% agreement) on 30 evidence-informed maternal (perinatal and lactation stage), infant (complementary feeding stage) and early years (family diet stage) vegetable-related advice messages. Messages were validated via triangulation analysis against the strength of evidence from an Umbrella review of strategies to increase children’s vegetable liking, and gaps in advice from a Desktop review of vegetable feeding advice.
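The ≥70% agreement threshold used in the Delphi survey is straightforward to operationalise. As a minimal illustrative sketch (the panellist ratings and function name below are hypothetical, not the study's instrument), a consensus check for one statement in a Delphi round might look like:

```python
# Hypothetical sketch of a Delphi-round consensus check (>=70% agreement).

def reaches_consensus(ratings, agree_values=("agree", "strongly agree"), threshold=0.70):
    """Return True if the share of agreeing panellists meets the threshold."""
    agreeing = sum(1 for r in ratings if r in agree_values)
    return agreeing / len(ratings) >= threshold

# Ten hypothetical panellist ratings for one advice message: 7 of 10 agree.
round1 = ["agree", "strongly agree", "agree", "disagree", "agree",
          "neutral", "strongly agree", "agree", "agree", "disagree"]
print(reaches_consensus(round1))  # 7/10 = 70% -> True
```

Statements failing the threshold would typically be revised and recirculated in a subsequent round until consensus or a stopping rule is reached.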
A purposeful sample of key stakeholders (NGT workshop, n=8 experts; Delphi survey, n=23 end-users).
Participant consensus identified the most highly ranked priority messages associated with the strategies of: ‘in-utero exposure’ (perinatal and lactation, n=56 points); and ‘vegetable variety’ (complementary feeding, n=97 points; family diet, n=139 points). Triangulation revealed two strategies (‘repeated exposure’ and ‘variety’) and their associated advice messages suitable for policy and practice, 12 for research and four for food industry.
Supported by national and state feeding guideline documents and resources, the advice messages relating to ‘repeated exposure’ and ‘variety’ to increase vegetable liking can be communicated to families and caregivers by healthcare practitioners. The food industry provides a vehicle for advice promotion and product development. Further research, where stronger evidence is needed, could further inform strategies for policy and practice, and food industry application.
To determine which established diet quality indices best predict weight-related outcomes in young women.
In this cross-sectional analysis, we collected dietary information using the Harvard FFQ and measured body fat percentage (BF%) by dual-energy X-ray absorptiometry. We used FFQ data to derive five diet quality indices: Recommended Food Score (RFS), Healthy Eating Index 2015 (HEI-2015), Alternate Healthy Eating Index 2010 (AHEI-2010), alternate Mediterranean Diet Score (aMED) and Healthy Plant-Based Diet Index (HPDI).
University of Massachusetts at Amherst.
Two hundred sixty healthy women aged 18–30 years.
The AHEI-2010 and HPDI were associated with BMI and BF%, such that a ten-point increase in either diet score was associated with a 1·2 percentage-point lower BF% and a 0·5 kg/m2 lower BMI (P < 0·05). Odds of excess body fat (i.e. BF% > 32 %) were 50 % lower for those in the highest v. lowest tertile of the AHEI-2010 (P = 0·04). Neither the RFS nor HEI-2015 was associated with BMI or BF%; the aMED was associated with BMI but not BF%.
These results suggest that diet quality tends to be inversely associated with BMI and BF% in young women, but that this association is not observed for all diet quality indices. Diet indices may have limited utility in populations where the specific healthful foods and food groups emphasised by the index are not widely consumed. Future research should aim to replicate these findings in longitudinal studies that compare body composition changes over time across diet indices in young women.
One of the most common concerns for parents is their child’s sleep behaviour. Inadequate sleep can impact cognitive, behavioural and social-emotional functioning. There are predictable developmental changes that occur in sleep behaviour. It is important to know that sleep problems throughout childhood and adolescence are common and that there is a spectrum from sleep problems through to diagnosed sleep disorders. This chapter starts with a brief overview of what sleep is, how it is regulated, steps for assessment and theoretical underpinnings that aid in further understanding treatment principles for behaviourally based sleep problems (e.g., cognitive and behavioural theories and the 4-P model). Then a developmental framework is used to outline common behaviourally based sleep problems experienced across developmental stages and the range of family-based behavioural interventions that can be applied from infancy through to adolescence. Throughout the chapter, the impact of behaviourally based sleep problems on the family is considered. Finally, the role of the therapist in working with children experiencing behaviourally based sleep problems and the importance of implementing a core competencies approach are discussed.
This study compared the level of education and tests from multiple cognitive domains as proxies for cognitive reserve.
The participants were educationally, ethnically, and cognitively diverse older adults enrolled in a longitudinal aging study. We examined independent and interactive effects of education, baseline cognitive scores, and MRI measures of cortical gray matter change on longitudinal cognitive change.
Baseline episodic memory was related to cognitive decline independent of brain and demographic variables and moderated (weakened) the impact of gray matter change. Education moderated (strengthened) the gray matter change effect. Non-memory cognitive measures did not incrementally explain cognitive decline or moderate gray matter change effects.
Episodic memory showed strong construct validity as a measure of cognitive reserve. Education effects on cognitive decline were dependent upon the rate of atrophy, indicating education effectively measures cognitive reserve only when atrophy rate is low. Results indicate that episodic memory has clinical utility as a predictor of future cognitive decline and better represents the neural basis of cognitive reserve than other cognitive abilities or static proxies like education.
Identifying developmental endophenotypes on the pathway between genetics and behavior is critical to uncovering the mechanisms underlying neurodevelopmental conditions. In this proof-of-principle study, we explored whether early disruptions in visual attention are a unique or shared candidate endophenotype of autism spectrum disorder (ASD) and attention-deficit/hyperactivity disorder (ADHD). We calculated the duration of the longest look (i.e., peak look) to faces in an array-based eye-tracking task for 335 14-month-old infants with and without first-degree relatives with ASD and/or ADHD. We leveraged parent-report and genotype data available for a proportion of these infants to evaluate the relation of looking behavior to familial (n = 285) and genetic liability (using polygenic scores, n = 185) as well as ASD and ADHD-relevant temperament traits at 2 years of age (shyness and inhibitory control, respectively, n = 272) and ASD and ADHD clinical traits at 6 years of age (n = 94).
Results showed that longer peak looks at the face were associated with elevated polygenic scores for ADHD (β = 0.078, p = .023), but not ASD (β = 0.002, p = .944), and with elevated ADHD traits in mid-childhood (F(1,88) = 6.401, p = .013, ηp² = 0.068; ASD: F(1,88) = 3.218, p = .076), but not in toddlerhood (ps > 0.2). This pattern of results did not emerge when considering mean peak look duration across face and nonface stimuli. Thus, alterations in attention to faces during spontaneous visual exploration may be more consistent with a developmental endophenotype of ADHD than ASD. Our work shows that dissecting paths to neurodevelopmental conditions requires longitudinal data incorporating polygenic contribution, early neurocognitive function, and clinical phenotypic variation.
Background: A prolonged outbreak of carbapenemase-producing Serratia marcescens (CPSM) was identified in our quaternary healthcare center over a 2-year period from 2015 through 2017. A reservoir of IMP-4–producing S. marcescens in sink drains of clinical hand basins (CHB) was implicated in propagating transmission, supported by evidence from whole-genome sequencing (WGS). We assessed the impact of a manual bioburden reduction intervention on further transmission of CPSM. Methods: Environmental sampling of frequently touched wet and dry areas around CPSM clinical cases was undertaken to identify potential reservoirs and transmission pathways. After identifying CHB as a source of CPSM, a widespread annual CHB cleaning intervention involving manual scrubbing of sink drains and the proximal pipes was implemented. Pre- and postintervention point prevalence surveys (PPS) of CHB drains were performed to assess for CPSM colonization. Surveillance for subsequent transmission was conducted through weekly screening of patients, annual screening of CHB in transmission areas, and 6-monthly whole-hospital PPS of patients. All CPSM isolates were assessed by WGS. Results: In total, 6 patients were newly identified with CPSM from 2015 to 2017 (4.3 transmission events per 100,000 surveillance bed days [SBD]; 95% CI, 1.6–9.4). All clinical CPSM isolates were linked to CHB isolates by WGS. The CHB cleaning intervention reduced CHB colonization with CPSM in transmission areas from 72% to 28% (ARR, 0.44; 95% CI, 0.25–0.63). A single further clinical case of CPSM linked to the CHB isolates was detected over 2 years of surveillance from 2017 to 2019 following the implementation of the annual CHB cleaning program (0.7 transmissions per 100,000 SBD; 95% CI, 0.0–3.9). No transmissions were linked to undertaking the cleaning intervention.
Conclusions: A simple intervention targeted at reducing the biological burden of CPSM in CHB drains at regular intervals was effective in preventing transmission of carbapenemase-producing Enterobacterales from the hospital environment to patients over a prolonged period of intensive surveillance. These findings highlight the importance of detailed cleaning for controlling the spread of multidrug-resistant organisms from healthcare environments.
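The two headline figures in the abstract, the transmission rate per 100,000 surveillance bed days and the absolute risk reduction in basin colonisation, reduce to simple arithmetic. As an illustrative sketch (the function name is ours, and the SBD denominator is inferred from the reported rate, not stated in the abstract):

```python
# Illustrative arithmetic behind the reported figures; not the study's analysis code.

def rate_per_100k(events, denominator_days):
    """Incidence per 100,000 units of the denominator (here, surveillance bed days)."""
    return events / denominator_days * 100_000

# 6 new cases over an inferred ~139,500 SBD reproduces the reported ~4.3 per 100,000.
print(round(rate_per_100k(6, 139_500), 1))  # -> 4.3

# Absolute risk reduction in CHB colonisation: 72% pre-intervention vs 28% post.
arr = round(0.72 - 0.28, 2)
print(arr)  # -> 0.44
```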
Infants struggle to understand familiar words spoken in unfamiliar accents. Here, we examine whether accent exposure facilitates accent-specific adaptation. Two types of pre-exposure were examined: video-based (i.e., listening to pre-recorded stories; Experiment 1) and live interaction (reading books with an experimenter; Experiments 2 and 3). After video-based exposure, Canadian English-learning 15- to 18-month-olds failed to recognize familiar words spoken in an unfamiliar accent. However, after face-to-face interaction with a Mandarin-accented talker, infants showed enhanced recognition for words produced in Mandarin English compared to Australian English. Infants with live exposure to an Australian talker were not similarly facilitated, perhaps due to the lower vocabulary scores of the infants assigned to the Australian exposure condition. Thus, live exposure can facilitate accent adaptation, but this ability is fragile in young infants and is likely influenced by vocabulary size and the specific mapping between the speaker and the listener's phonological system.
Southeastern Appalachian Ohio has more than double the national average of diabetes and a critical shortage of healthcare providers. Paradoxically, there is limited research focused on primary care providers’ experiences treating people with diabetes in this region. This study explored providers’ perceived barriers to and facilitators for treating patients with diabetes in southeastern Appalachian Ohio.
We conducted in-depth interviews with healthcare providers who treat people with diabetes in rural southeastern Ohio. Interviews were transcribed, coded, and analyzed via content and thematic analyses using NVivo 12 software (QSR International, Chadstone, VIC, Australia).
Qualitative analysis revealed four themes: (1) patients’ diabetes fatalism and helplessness: providers recounted story after story of patients believing that their diabetes was inevitable and that they were helpless to prevent or delay diabetes complications. (2) Comorbid psychosocial issues: providers described high rates of depression, anxiety, incest, abuse, and post-traumatic stress disorder among people with diabetes in this region. (3) Inter-connected social determinants interfering with diabetes care: providers identified major barriers including lack of access to providers, lack of access to transportation, food insecurity, housing insecurity, and financial insecurity. (4) Providers’ cultural understanding and recommendations: providers emphasized the importance of understanding of the values central to Appalachian culture and gave culturally attuned clinical suggestions for how to use these values when working with this population.
Evidence-based interventions tailored to Appalachian culture and training designed to increase the cultural competency and cultural humility of primary care providers may be effective approaches to reduce barriers to diabetes care in Appalachian Ohio.
We examined whether change in added sugar intake is associated with change in δ13C, a novel sugar biomarker, in thirty-nine children aged 5–10 years selected from a Colorado (USA) prospective cohort of children at increased risk for type 1 diabetes. Reported added sugar intake via FFQ and δ13C in erythrocytes were measured at two time points a median of 2 years apart. Change in added sugar intake was associated with change in the δ13C biomarker, where for every 1-g increase in added sugar intake between the two time points, there was an increase in δ13C of 0⋅0082 (P = 0⋅0053), independent of change in HbA1c and δ15N. The δ13C biomarker may be used as a measure of compliance in an intervention study of children under the age of 10 years who are at increased risk for type 1 diabetes, in which the goal is to reduce dietary sugar intake.
There is increasing evidence that both black and green tea are beneficial for prevention of cardiovascular disease (CVD). We conducted a systematic review and meta-analysis evaluating the effects of tea flavonoids on CVD and all-cause mortality outcomes. Searches across five databases, including PubMed and Embase, were conducted through November 2018 to identify randomized controlled trials (RCTs) and prospective cohort studies reporting cardiovascular and all-cause mortality outcomes. Two investigators independently conducted abstract and full-text screenings, data extractions, and risk of bias (ROB) assessments using the Nutrition Evidence Library Bias Assessment Tool (NEL BAT). Mixed-effects dose-response meta-regression and standard random-effects meta-analyses were performed for outcomes with ≥ 4 studies. No RCTs met the inclusion criteria; 38 prospective cohort studies were included in the systematic review. NEL BAT scores ranged from 0 to 15 (0 being the lowest risk). Our linear meta-regression model showed that each cup increase in daily tea consumption (about 280 mg and 338 mg of total flavonoids for black and green tea, respectively) was associated with a 3–4% lower risk of CVD mortality (predicted adjusted relative risk (RR) = 0.96; 95% CI 0.93–0.99 for green tea and RR = 0.97; 95% CI 0.94–0.99 for black tea). Furthermore, each cup increase in daily tea consumption was associated with a 2% lower risk of all-cause mortality (predicted adjusted RR = 0.98; 95% CI 0.97–0.99 for black tea and RR = 0.98; 95% CI 0.96–0.99 for green tea). Two studies reported multivariable Cox regression analysis results for the relationship between black tea intake and risk of all-cause mortality.
The results from these two studies were combined with our linear meta-regression result in a random-effects meta-analysis, which showed that each cup increase in daily black tea consumption was associated with an average 3% lower risk of all-cause mortality (pooled adjusted RR = 0.97; 95% CI 0.87–1.00), with large heterogeneity (I² = 81.4%; p = 0.005). Current evidence indicates that increased tea consumption may reduce cardiovascular and all-cause mortality in a dose-response manner. This systematic review was registered on PROSPERO.
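The core step behind combining per-study relative risks is inverse-variance pooling on the log scale. The sketch below is illustrative only: the study inputs are hypothetical, and it shows the simpler fixed-effect version (the authors' random-effects model additionally estimates a between-study variance component).

```python
# Illustrative fixed-effect inverse-variance pooling of relative risks.
# Study inputs are hypothetical, not taken from the review.
import math

def pool_rrs(studies):
    """studies: list of (rr, ci_lower, ci_upper); returns pooled RR and its 95% CI."""
    weights, weighted_log_rrs = [], []
    for rr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from CI width
        w = 1 / se**2                                    # inverse-variance weight
        weights.append(w)
        weighted_log_rrs.append(w * math.log(rr))
    pooled_log = sum(weighted_log_rrs) / sum(weights)
    pooled_se = 1 / math.sqrt(sum(weights))
    return (math.exp(pooled_log),
            math.exp(pooled_log - 1.96 * pooled_se),
            math.exp(pooled_log + 1.96 * pooled_se))

rr, lo, hi = pool_rrs([(0.95, 0.85, 1.06), (0.99, 0.93, 1.05)])
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

A random-effects analysis would widen the interval when between-study heterogeneity (such as the I² = 81.4% reported here) is large, by adding the estimated between-study variance to each study's sampling variance before weighting.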
The material in this volume ranges from Germanic epic and early Welsh saints’ lives to twenty-first-century comic books. This is characteristic of the Arthurian Literature series, which since its inception in 1981 has always cast its net very widely over Western European culture. We are delighted that the founding editor, Richard Barber, has contributed a characteristically stimulating interdisciplinary study of swords belonging to Arthurian and other heroes. He himself has heroic stature in the world of Arthurian studies, both as an historian and as an editor and publisher. Andrew Rabin's discussion of Caradog's Vita Gildae throws light on the complex attitudes to Arthur of contemporaries of Geoffrey of Monmouth in a time of political turmoil in England, the Anarchy: Arthur is represented both as a tyrannical ruler and a conciliator, an ambivalence which Rabin notes in other Latin accounts of the king produced at this time. Christopher Berard also considers the use of Arthurian material for political purposes: borrowings from Geoffrey's Historia appear in a chronicle of Anglo-Scottish relations in the time of Edward I, a well-known admirer of the Arthurian legend. Berard argues that these borrowings would have appealed to the clerical élite of the time. Usha Vishnuvajjala focuses on women and their friendships in Ywain and Gawain, the only known close English adaptation of a romance by Chrétien. She argues that this text does not align with received wisdom about medieval friendship, or with conventional binaries about stereotypical gendered behaviour. Natalie Goodison considers the mixture of sacred and secular in The Turke and Gawain, and finds fascinating alchemical parallels for a puzzling beheading episode. Mary Bateman discusses the views on native and foreign sources of three sixteenth-century defenders of Arthur, both English and Welsh – John Leland, John Prise and Humphrey Llwyd – and their responses to the criticisms of Polydore Vergil.
In twentieth-century reception history, John Steinbeck was an ardent Arthurian enthusiast: Elaine Treharne and William J. Fowler look at the significance of his annotations to his copy of Malory as he worked on a modern adaptation, the posthumously published The Acts of King Arthur and his Noble Knights.
We analyzed antibiotic use data from 29 southeastern US hospitals over a 5-year period to determine changes in antibiotic use after the fluoroquinolone US Food and Drug Administration (FDA) advisory update in 2016. Fluoroquinolone use declined both before and after the FDA announcement, and the use of select, alternative antibiotics increased after the announcement.
Fluoroquinolones are among the 4 most commonly prescribed antibiotic classes.1,2 Postmarketing reports of serious adverse events linked to fluoroquinolones include tendonitis, neuropathy, hypoglycemia, psychiatric side effects, and possible aortic vessel rupture, leading to safety label changes in July 2008 and August 2013.3 In July 2016, the US Food and Drug Administration (FDA) strengthened the “black box” warning following an initial safety announcement in May 2016, recommending avoidance of fluoroquinolones for uncomplicated infections such as acute exacerbation of chronic bronchitis, uncomplicated urinary tract infections, and acute bacterial sinusitis.4 Concerns over safety and the association with Clostridioides difficile infection have led inpatient antimicrobial stewardship programs (ASPs) to develop initiatives to promote avoidance of quinolones. The objective of this study was to quantify the effect of the 2016 FDA “black box” update on inpatient antibiotic use among a cohort of southeastern US hospitals.
To examine the relationship between protein intake and the risk of incident premenstrual syndrome (PMS).
Nested case–control study. FFQ were completed every 4 years during follow-up. Our main analysis assessed protein intake 2–4 years before PMS diagnosis (for cases) or reference year (for controls). Baseline (1991) protein intake was also assessed.
Nurses’ Health Study II (NHS2), a large prospective cohort study of registered female nurses in the USA.
Participants were premenopausal women between the ages of 27 and 44 years (mean: 34 years), without diagnosis of PMS at baseline, without a history of cancer, endometriosis, infertility, irregular menstrual cycles or hysterectomy. Incident cases of PMS (n 1234) were identified by self-reported diagnosis during 14 years of follow-up and validated by questionnaire. Controls (n 2426) were women who did not report a diagnosis of PMS during follow-up and confirmed experiencing minimal premenstrual symptoms.
In logistic regression models adjusting for smoking, BMI, B-vitamins and other factors, total protein intake was not associated with PMS development. For example, the OR for women with the highest intake of total protein 2–4 years before their reference year (median: 103·6 g/d) v. those with the lowest (median: 66·6 g/d) was 0·94 (95 % CI 0·70, 1·27). Additionally, intakes of specific protein sources and amino acids were not associated with PMS. Furthermore, results substituting carbohydrates and fats for protein were also null.
Overall, protein consumption was not associated with risk of developing PMS.