The authors report on ancient DNA data from two human skeletons buried within the chancel of the 1608–1616 church at the North American colonial settlement of Jamestown, Virginia. Available archaeological, osteological and documentary evidence suggests that these individuals are Sir Ferdinando Wenman and Captain William West, kinsmen of the colony's first Governor, Thomas West, Third Baron De La Warr. Genomic analyses of the skeletons identify unexpected maternal relatedness, as both carried the mitochondrial haplogroup H10e. In this unusual case, aDNA prompted further historical research that led to the discovery of illegitimacy in the West family, an aspect of identity omitted, likely intentionally, from genealogical records.
The purpose of this paper is to review scientific evidence concerning pathogens that could potentially be transmitted via bovine semen. As a result of a careful analysis of the characteristics of infections that may cause transmission of disease through semen, effective control procedures can be identified that provide minimal constraint to the introduction of new bulls into herds for natural breeding and importation of valuable novel genetics through artificial insemination. The potential for transmission through bovine semen and corresponding effective control procedures are described for bovine herpesvirus 1, bovine viral diarrhea virus, bovine leukemia virus, lumpy skin disease virus, bluetongue virus, foot-and-mouth disease virus, and Schmallenberg virus. Brief consideration is also provided regarding the potential for transmission via semen of Tritrichomonas foetus, Campylobacter fetus venerealis, Brucella abortus, Leptospira spp., Histophilus somni, Ureaplasma diversum, Mycobacterium avium subsp. paratuberculosis, Chlamydiaceae, Mycobacterium bovis, Coxiella burnetii, Mycoplasma mycoides subsp. mycoides and Neospora caninum. Thoughtful and systematic control procedures can ensure the safety of introducing new bulls and cryopreserved semen into cattle production systems.
In 2010, a grower survey was administered to 1,299 growers in 22 states to determine changes in weed management in the United States from 2006 to 2009. The majority of growers had not changed weed management practices in the previous 3 yr; however, 75% reported using weed management practices targeted at glyphosate-resistant (GR) weeds. Growers were asked to rate their efforts at controlling GR weeds and rate the effectiveness of various practices for controlling/preventing GR weeds regardless of whether they were personally using them. Using the herbicide labeled rate, scouting fields, and rotating crops were among the practices considered by growers as most effective in managing GR weeds. Sixty-seven percent of growers reported effective management of GR weeds. Between the 2005 and 2010 Benchmark surveys, the frequency of growers using specific actions to manage GR weeds increased markedly. Although the relative effectiveness of practices, as perceived by growers, remained the same, the effectiveness rating of tillage and the use of residual and POST herbicides increased.
Almost 1,650 corn, cotton, and soybean growers in 22 states participated in a 2010 telephone survey to determine their attitudes with regard to which weed species were most problematic in glyphosate-resistant (GR) crop production systems for corn, cotton, and soybean. The survey is a follow-up to a previous 2005 to 2006 survey that utilized a smaller set of growers from fewer states. In general, growers continued to estimate weed populations as low, and few challenges have been created following adoption of GR cropping systems. Pigweed and foxtail species were dominant overall, whereas other species were more commodity and state specific. Corn, cotton, and soybean growers cited velvetleaf, annual morningglory, and waterhemp, respectively, as predominant weeds. Growers in the South region were more likely to report pigweed and waterhemp (Amaranthus spp.), whereas growers in the East and West reported horseweed. When growers were asked which GR weeds they had personally experienced, horseweed was reported in all regions, but growers in the South more frequently reported pigweed, whereas growers in the East and West regions more frequently reported waterhemp. Comparisons with the previous 2005 survey indicated that more growers believed they were experiencing GR weeds and were more aware of specific examples in their state. In particular, the Amaranthus complex was of greatest concern in continuously cropped soybean and cotton.
A 2010 survey of 1,299 corn, cotton, and soybean growers was conducted to determine their attitudes and awareness regarding glyphosate-resistant (GR) weeds and resultant implications on weed management practices. An additional 350 growers included in the current study participated in a 2005 survey, and these answers were compared across time so that cross-sectional and longitudinal comparisons of responses could be made. Most growers surveyed in 2010 were aware of the potential for weeds to evolve resistance to glyphosate; however, many growers were not aware of glyphosate resistance in specific weeds in their county or state. Growers in the South were different from growers in other geographic regions and were significantly more aware of local cases of GR weeds. Awareness of GR weeds did not increase appreciably from 2005 to 2010, but the percentage who reported GR weeds as problematic was significantly higher. Grower reports of GR weeds on-farm in 2010 were up considerably from 2005, with growers in the South reporting significantly more instances than growers in other regions. Growers in the South were also more likely to consider glyphosate resistance a serious problem. Overall, 30% of growers did not consider GR weeds to be a problem. It appears that most growers received information about glyphosate resistance from farm publications, although in the South this percentage was less than for other geographic regions. Growers in the South received more information from universities and extension sources.
Approximately 1,300 growers from 22 states were surveyed during 2010 to determine herbicide use. Cropping systems included continuous glyphosate-resistant corn, cotton, and soybean, and various combinations of these crops and rotations with non–glyphosate-resistant crops. The most commonly used herbicide for both fall and spring applications was glyphosate followed by synthetic auxin herbicides. Herbicide application in spring was favored over application in the fall. The percentage of growers in a glyphosate-only system was as high as 69% for some cropping systems. Excluding glyphosate, the most frequently used herbicides included photosystem II, mitotic, and protoporphyrinogen oxidase inhibitors. A higher percentage of growers integrated herbicides other than glyphosate during 2010 compared with 2005. Extensive educational efforts have promoted resistance management by increasing the diversity of herbicides in glyphosate-resistant cropping systems. However, a considerable percentage of growers continued use of only glyphosate from the period of 2005 to 2010, and this practice most likely will continue to exert a high level of selection for evolved glyphosate-resistant weed species.
Background
The majority of children requiring emergency care are treated in general emergency departments (EDs) with variable levels of pediatric care expertise. The goal of the Translating Emergency Knowledge for Kids (TREKK) initiative is to implement the latest research in pediatric emergency medicine in general EDs to reduce clinical variation.
Objectives
To determine national pediatric information needs, seeking behaviours, and preferences of health care professionals working in general EDs.
Methods
An electronic cross-sectional survey was conducted with health care professionals in 32 Canadian general EDs. Data were collected in the EDs using iPads and in-person data collectors.
Results
A total of 1,471 surveys were completed (57.1% response rate). Health care professionals sought information on children’s health care by talking to colleagues (n=1,208, 82.1%), visiting specific medical/health websites (n=994, 67.7%), and attending professional development opportunities (n=941, 64.4%). Preferred child health resources included protocols and accepted treatments for common conditions (n=969, 68%), clinical pathways and practice guidelines (n=951, 66%), and evidence-based information on new diagnoses and treatments (n=866, 61%). Additional pediatric clinical information is needed about multisystem trauma (n=693, 49%), severe head injury (n=615, 43%), and meningitis (n=559, 39%). Health care professionals preferred to receive child health information through professional development opportunities (n=1,131, 80%) and printed summaries (n=885, 63%).
Conclusion
By understanding health care professionals’ information seeking behaviour, information needs, and information preferences, knowledge synthesis and knowledge translation initiatives can be targeted to improve pediatric emergency care. The findings from this study will inform the following two phases of the TREKK initiative to bridge the research-practice gap in Canadian general EDs.
An increasing number of studies have reported a heritable component for the regulation of energy intake and eating behaviour, although the individual polymorphisms and their ‘effect size’ are not fully elucidated. The aim of the present study was to examine the relationship between specific SNP and appetite responses and energy intake in overweight men. In a randomised cross-over trial, forty overweight men (age 32 (sd 9) years; BMI 27 (sd 2) kg/m2) attended four sessions 1 week apart and received three isoenergetic and isovolumetric servings of dairy snacks or water (control) in random order. Appetite ratings were determined using visual analogue scales and energy intake at an ad libitum lunch was assessed 90 min after the dairy snacks. Individuals were genotyped for SNP in the fat mass and obesity-associated (FTO), leptin (LEP) and leptin receptor (LEPR) genes and a variant near the melanocortin-4 receptor (MC4R) locus. The postprandial fullness rating over the full experiment following intake of the different snacks was 17·2 % (P= 0·026) lower in A carriers compared with TT homozygotes for rs9939609 (FTO, dominant) and 18·6 % (P= 0·020) lower in G carriers compared with AA homozygotes for rs7799039 (LEP, dominant). These observations indicate that FTO and LEP polymorphisms are related to the variation in the feeling of fullness and may play a role in the regulation of food intake. Further studies are required to confirm these initial observations and investigate the ‘penetrance’ of these genotypes in additional population subgroups.
With the substantial economic and social burden of CVD, the need to modify diet and lifestyle factors to reduce risk has become increasingly important. Milk and dairy products, being one of the main contributors to SFA intake in the UK, are a potential target for dietary SFA reduction. Supplementation of the dairy cow's diet with a source of MUFA or PUFA may have beneficial effects on consumers' CVD risk by partially replacing milk SFA, thus reducing entry of SFA into the food chain. A total of nine chronic human intervention studies have used dairy products, modified through bovine feeding, to establish their effect on CVD risk markers. Of these studies, the majority utilised modified butter as their primary test product and used changes in blood cholesterol concentrations as their main risk marker. Of the eight studies that measured blood cholesterol, four reported a significant reduction in total and LDL-cholesterol (LDL-C) following chronic consumption of modified milk and dairy products. Data from one study suggested that a significant reduction in LDL-C could be achieved in both the healthy and hypercholesterolaemic population. Thus, evidence from these studies suggests that consumption of milk and dairy products with modified fatty acid composition, compared with milk and dairy products of typical milk fat composition, may be beneficial to CVD risk in healthy and hypercholesterolaemic individuals. However, current evidence is insufficient and further work is needed to investigate the complex role of milk and cheese in CVD risk and explore the use of novel markers of CVD risk.
Dietary regulation of appetite may contribute to the prevention and management of excess body weight. The present study examined the effect of consumption of individual dairy products as snacks on appetite and subsequent ad libitum lunch energy intake. In a randomised cross-over trial, forty overweight men (age 32 (sd 9) years; BMI 27 (sd 2) kg/m2) attended four sessions 1 week apart and received three isoenergetic (841 kJ) and isovolumetric (410 ml) servings of dairy snacks or water (control) 120 min after breakfast. Appetite profile was determined throughout the morning and ad libitum energy intake was assessed 90 min after the intake of snacks. Concentrations of amino acids, glucose, insulin, ghrelin and peptide tyrosine tyrosine were measured at baseline (0 min) and 80 min after the intake of snacks. Although the results showed that yogurt had the greatest suppressive effect on appetite, this could be confounded by the poor sensory ratings of yogurt. Hunger rating was 8, 10 and 24 % (P < 0·001) lower after the intake of yogurt than cheese, milk and water, respectively. Energy intake was 11, 9 and 12 % (P < 0·02) lower after the intake of yogurt, cheese and milk, respectively, compared with water (4312 (se 226) kJ). Although there was no difference in the postprandial responses of hormones, alanine and isoleucine concentrations were higher after the intake of yogurt than cheese and milk (P < 0·05). In conclusion, all dairy snacks reduced appetite and lunch intake compared with water. Yogurt had the greatest effect on suppressing subjective appetite ratings, but did not affect subsequent food intake compared with milk or cheese.
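The reported percentage reductions can be translated into approximate absolute lunch intakes relative to the water control. A minimal sketch, assuming the stated 4312 kJ water baseline and the 11, 9 and 12 % reductions for yogurt, cheese and milk (the derived kJ values are illustrative back-calculations, not figures reported in the study):

```python
# Back-calculate approximate absolute lunch energy intakes from the
# percentage reductions reported relative to the water control.
water_kj = 4312  # mean lunch intake after water (control), kJ

# Reported percentage reductions relative to water (illustrative use)
reductions = {"yogurt": 0.11, "cheese": 0.09, "milk": 0.12}

for snack, cut in reductions.items():
    intake = water_kj * (1 - cut)  # approximate absolute intake, kJ
    print(f"{snack}: ~{intake:.0f} kJ")
```

This puts the yogurt, cheese and milk lunches at roughly 3.8–3.9 MJ each, a difference of about 390–520 kJ versus the water control.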
As the incidence of obesity is reaching ‘epidemic’ proportions, there is currently widespread interest in the impact of dietary components on body-weight and food intake regulation. The majority of data available from both epidemiological and intervention studies provide evidence of a negative but modest association between milk and dairy product consumption and BMI and other measures of adiposity, with indications that higher intakes result in increased weight loss and lean tissue maintenance during energy restriction. The purported physiological and molecular mechanisms underlying the impact of dairy constituents on adiposity are incompletely understood but may include effects on lipolysis, lipogenesis and fatty acid absorption. Furthermore, accumulating evidence indicates an impact of dairy constituents, in particular whey protein derivatives, on appetite regulation and food intake. The present review summarises available data and provides an insight into the likely contribution of dairy foods to strategies aimed at appetite regulation, weight loss or the prevention of weight gain.
Three in vitro methods, one enzymic and two microbial, were applied satisfactorily to the determination of the dry matter digestibility of forages, but failed when applied to a variety of concentrate feeds. The microbial methods had the advantage that the proportion of weight lost from forages during the in vitro process approximated closely with their determined in vivo digestibilities.
The microbial method based on sheep faeces was as effective as that using rumen liquor in digesting ruminant feedstuffs in vitro. Since sheep faeces are readily obtained from intact animals, the faeces liquor method would seem to have a distinct advantage in use.
Recent research has suggested that autistic social impairment (ASI) is continuously distributed in nature and that subtle autistic-like social impairments aggregate in the family members of children with pervasive developmental disorders (PDDs). This study examined the longitudinal course of quantitatively characterized ASI in 3- to 18-year-old boys with and without PDD. We obtained assessments of 95 epidemiologically ascertained male–male twin pairs and a clinical sample of 95 affected children using the Social Responsiveness Scale (SRS), at two time points, spaced 1–5 years apart. Longitudinal course was examined as a function of age, familial loading for PDD, and autistic severity at baseline. Interindividual variation in SRS scores was highly preserved over time, with test–retest correlation of 0.90 for the entire sample. SRS scores exhibited modest general improvement over the study period; individual trajectories varied as a function of severity at baseline and were highly familial. Quantitative measurements of ASI reflect heritable traitlike characteristics. Such measurements can serve as reliable indices of phenotypic severity for genetic and neurobiologic studies, and have potential utility for ascertaining incremental response to intervention.
The use of faecal inoculum in in vitro feed evaluation methods was examined by Balfe (1985). However, there is limited information concerning the chemical composition of faeces and factors affecting this. The chemical composition of faeces may reflect the microbial population and hence its fermentative activity. A knowledge of the faecal composition is essential as this affects the quality of faecal inoculum. The objective of this work was therefore to study the relationships between diet and the chemical composition of faeces using data obtained from sheep.