An accurate accounting of prior sport-related concussion (SRC) is critical to optimizing the clinical care of athletes with SRC. Yet, obtaining such a history via medical records or lifetime monitoring is often not feasible, necessitating the use of self-report histories. The primary objective of the current project is to determine the degree to which athletes consistently report their SRC history on serial assessments throughout their collegiate athletic career.
Participants and Methods:
Data were obtained from the NCAA-DoD CARE Consortium and included 1621 athletes (914 male) from a single Division 1 university who participated in athletics during the 2014-2017 academic years. From this initial cohort, 752 athletes completed a second-year assessment and 332 completed a third-year assessment. Yearly assessments included a brief self-report survey that queried SRC history of the previous year. Consistency of self-reported SRC history was defined as reporting the same number of SRC on subsequent yearly evaluation as had been reported the previous year.
For every year of participation, the number of SRC reported on the baseline exam (Reported) and the number of SRC recorded by athletes and medical staff during the ensuing season (Recorded) were tabulated. In a subsequent year, the expected number of SRC (Expected) was computed as the sum of Reported and Recorded. For participation years in which Expected could be computed, the reporting deviation (RepDev) was the difference between the number of SRC expected to be reported at a baseline exam, based on the previous participation year's data, and the number of SRC actually reported by the athlete or medical record during that exam. A second deviation, RepDevSO, considered only those SRC that occurred while the participant was enrolled in the current study. One-way intraclass correlations (ICC) were computed between the expected and reported numbers of SRC.
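The bookkeeping described above reduces to simple arithmetic per athlete-year; the following sketch illustrates it (function name, sign convention, and the toy counts are ours for illustration, not study data):

```python
def reporting_deviation(reported_prev, recorded_prev, reported_next):
    """RepDev for one athlete-year.

    reported_prev : SRC count self-reported at the previous year's baseline exam
    recorded_prev : SRC recorded by the athlete/medical staff during that season
    reported_next : SRC count self-reported at the current year's baseline exam
    """
    expected = reported_prev + recorded_prev   # Expected = Reported + Recorded
    return reported_next - expected            # 0 => consistent reporting

# Toy cohort (hypothetical numbers): (reported_prev, recorded_prev, reported_next)
athletes = [(2, 1, 3), (0, 0, 0), (1, 0, 0), (1, 1, 3)]
devs = [reporting_deviation(*a) for a in athletes]          # [0, 0, -1, 1]
consistent = sum(d == 0 for d in devs) / len(devs)          # fraction with RepDev of 0
```

A negative deviation corresponds to under-reporting relative to expectation and a positive one to over-reporting; only the fraction with a deviation of exactly 0 counts as consistent, mirroring the study's definition.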
Results:
341 athletes had a history of at least one SRC and 206 of those (60.4%) had a RepDev of 0. The overall ICC for RepDev was 0.761 (95% CI 0.73-0.79). The presence of depression (ICC 0.87, 95% CI 0.79-0.92) and loss of consciousness (ICC 0.80, 95% CI 0.72-0.86) were associated with higher ICCs compared to athletes without these variables. Female athletes demonstrated higher self-report consistency (ICC 0.82, 95% CI 0.79-0.85) compared to male athletes (ICC 0.72, 95% CI 0.68-0.76). Differences in the classification of RepDev according to sex and sport were found to be significant (χ2=77.6, df=56, p=0.03). The sports with the highest consistency were Women’s Tennis, Men’s Diving, and Men’s Tennis with 100% consistency between academic years. Sports with the lowest consistency were Women’s Gymnastics (69%), Men’s Lacrosse (70%), and Football (72%). 96 athletes had at least one study-only SRC in the previous year and 69 of those (71.9%) had a RepDevSO of 0 (ICC 0.673, 95% CI 0.64-0.71).
Conclusions:
Approximately 40% of athletes do not consistently report their SRC history, potentially further complicating the clinical management of SRC. These findings encourage clinicians to be aware of factors which could influence the reliability of self-reported SRC history.
In difficult-to-treat depression (DTD) the outcome metrics historically used to evaluate treatment effectiveness may be suboptimal. Metrics based on remission status and on single end-point (SEP) assessment may be problematic given infrequent symptom remission, temporal instability, and poor durability of benefit in DTD.
Methods
Self-report and clinician assessment of depression symptom severity were regularly obtained over a 2-year period in a chronic and highly treatment-resistant registry sample (N = 406) receiving treatment as usual, with or without vagus nerve stimulation. Twenty alternative metrics for characterizing symptomatic improvement were evaluated, contrasting SEP metrics with integrative (INT) metrics that aggregated information over time. Metrics were compared in effect size and discriminating power when contrasting groups that did (N = 153) and did not (N = 253) achieve a threshold level of improvement in end-point quality-of-life (QoL) scores, and in their association with continuous QoL scores.
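An integrative (INT) metric of the kind contrasted above — the proportion of the observation period spent in partial response or better — can be sketched as follows. The 25% symptom-reduction threshold for partial response is a common convention, and the function name and example scores are illustrative, not the registry's data or code:

```python
def proportion_in_partial_response(baseline, scores, threshold=0.25):
    """Fraction of follow-up assessments showing at least `threshold`
    (default 25%) symptom reduction from baseline -- one way to
    aggregate improvement over time rather than at a single end point."""
    improved = [(baseline - s) / baseline >= threshold for s in scores]
    return sum(improved) / len(improved)

# Hypothetical severity scores over five follow-up assessments
frac = proportion_in_partial_response(40, [38, 30, 28, 24, 31])  # 3 of 5 assessments
```

Unlike a single end-point (SEP) percentage change, this aggregate is insensitive to whether the final visit happens to catch the patient on a good or bad day, which is one reason time-based metrics may suit an unstable condition like DTD.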
Results
Metrics based on remission status had smaller effect size and poorer discrimination of the binary QoL outcome and weaker associations with the continuous end-point QoL scores than metrics based on partial response or response. The metrics with the strongest performance characteristics were the SEP measure of percentage change in symptom severity and the INT metric quantifying the proportion of the observation period in partial response or better. Both metrics contributed independent variance when predicting end-point QoL scores.
Conclusions
Revision is needed in the metrics used to quantify symptomatic change in DTD with consideration of INT time-based measures as primary or secondary outcomes. Metrics based on remission status may not be useful.
Axisymmetric standing waves occur across a wide range of free surface flows. When these waves reach a critical height (steepness), wave breaking and jet formation occur. For travelling surface gravity waves, wave breaking is generally considered to limit wave height and reversible wave motion. In the ocean, the behaviour of directionally spread waves lies between the limits of purely travelling (two dimensions) and axisymmetric (three dimensions). Hence, understanding wave breaking and jet formation on axisymmetric surface gravity waves is an important step in understanding extreme and breaking waves in the ocean. We examine an example of axisymmetric wave breaking and jet formation colloquially known as the ‘spike wave’, created in the FloWave circular wave tank at the University of Edinburgh, UK. We generate this spike wave with maximum crest amplitudes of 0.15–6.0 m (0.024–0.98 when made non-dimensional by characteristic radius), with wave breaking occurring for crest amplitudes greater than 1.0 m (0.16 non-dimensionalised). Unlike two-dimensional travelling waves, wave breaking does not limit maximum crest amplitude, and our measurements approximately follow the jet height scaling proposed by Ghabache et al. (J. Fluid Mech., vol. 761, 2014, pp. 206–219) for cavity collapse. The spike wave is predominantly created by linear dispersive focusing. A trough forms, then collapses producing a jet, which is sensitive to the trough's shape. The evolution of the jets that form in our experiments is predicted well by the hyperbolic jet model proposed by Longuet-Higgins (J. Fluid Mech., vol. 127, 1983, pp. 103–121), previously applied to jets forming on bubbles.
Fibricola and Neodiplostomum are diplostomid genera with very similar morphology that are currently separated based on their definitive hosts. Fibricola spp. are normally found in mammals, while Neodiplostomum spp. typically parasitize birds. Previously, no DNA sequence data were available for any member of Fibricola. We generated nuclear ribosomal and mtDNA sequences of Fibricola cratera (type-species), Fibricola lucidum and 6 species of Neodiplostomum. DNA sequences were used to examine phylogenetic interrelationships among Fibricola and Neodiplostomum and re-evaluate their systematics. Molecular phylogenies and morphological study suggest that Fibricola should be considered a junior synonym of Neodiplostomum. Therefore, we synonymize the two genera and transfer all members of Fibricola into Neodiplostomum. Specimens morphologically identified as Neodiplostomum cratera belonged to 3 distinct phylogenetic clades based on mitochondrial data. One of those clades also included sequences of specimens identified morphologically as Neodiplostomum lucidum. Further study is necessary to resolve the situation regarding the morphology of N. cratera. Our results demonstrated that some DNA sequences of N. americanum available in GenBank originate from misidentified Neodiplostomum banghami. Molecular phylogenetic data revealed at least 2 independent host-switching events between avian and mammalian hosts in the evolutionary history of Neodiplostomum; however, the directionality of these host-switching events remains unclear.
Approximately one-third of individuals in a major depressive episode will not achieve sustained remission despite multiple, well-delivered treatments. These patients experience prolonged suffering and disproportionately utilize mental and general health care resources. The recently proposed clinical heuristic of ‘difficult-to-treat depression’ (DTD) aims to broaden our understanding and focus attention on the identification, clinical management, treatment selection, and outcomes of such individuals. Clinical trial methodologies developed to detect short-term therapeutic effects in treatment-responsive populations may not be appropriate in DTD. This report reviews three essential challenges for clinical intervention research in DTD: (1) how to define and subtype this heterogeneous group of patients; (2) how, when, and by what methods to select, acquire, compile, and interpret clinically meaningful outcome metrics; and (3) how to choose among alternative clinical trial design options to promote causal inference and generalizability. The boundaries of DTD are uncertain, and an evidence-based taxonomy and reliable assessment tools are preconditions for clinical research and subtyping. Traditional outcome metrics in treatment-responsive depression may not apply to DTD, as they largely reflect only short-term symptomatic change and do not incorporate durability of benefit, side effect burden, or sustained impact on quality of life or daily function. The trial methodology will also require modification as trials will likely be of longer duration to examine the sustained impact, raising complex issues regarding control group selection, blinding and its integrity, and concomitant treatments.
To describe the cumulative seroprevalence of severe acute respiratory coronavirus virus 2 (SARS-CoV-2) antibodies during the coronavirus disease 2019 (COVID-19) pandemic among employees of a large pediatric healthcare system.
Design, setting, and participants:
Prospective observational cohort study open to adult employees at the Children’s Hospital of Philadelphia, conducted April 20–December 17, 2020.
Methods:
Employees were recruited starting with high-risk exposure groups, utilizing e-mails, flyers, and announcements at virtual town hall meetings. At baseline, 1 month, 2 months, and 6 months, participants reported occupational and community exposures and gave a blood sample for SARS-CoV-2 antibody measurement by enzyme-linked immunosorbent assays (ELISAs). A post hoc Cox proportional hazards regression model was performed to identify factors associated with increased risk for seropositivity.
Results:
In total, 1,740 employees were enrolled. At 6 months, the cumulative seroprevalence was 5.3%, which was below estimated community point seroprevalence. Seroprevalence was 5.8% among employees who provided direct care and was 3.4% among employees who did not perform direct patient care. Most participants who were seropositive at baseline remained positive at follow-up assessments. In a post hoc analysis, direct patient care (hazard ratio [HR], 1.95; 95% confidence interval [CI], 1.03–3.68), Black race (HR, 2.70; 95% CI, 1.24–5.87), and exposure to a confirmed case in a nonhealthcare setting (HR, 4.32; 95% CI, 2.71–6.88) were associated with statistically significant increased risk for seropositivity.
Conclusions:
Employee SARS-CoV-2 seroprevalence rates remained below the point-prevalence rates of the surrounding community. Provision of direct patient care, Black race, and exposure to a confirmed case in a nonhealthcare setting conferred increased risk. These data can inform occupational protection measures to maximize protection of employees within the workplace during future COVID-19 waves or other epidemics.
In response to advancing clinical practice guidelines regarding concussion management, service members, like athletes, complete a baseline assessment prior to participating in high-risk activities. While several studies have established test stability in athletes, no investigation to date has examined the stability of baseline assessment scores in military cadets. The objective of this study was to assess the test–retest reliability of a baseline concussion test battery in cadets at U.S. Service Academies.
Methods:
All cadets participating in the Concussion Assessment, Research, and Education (CARE) Consortium investigation completed a standard baseline battery that included memory, balance, symptom, and neurocognitive assessments. Annual baseline testing was completed during the first 3 years of the study. A two-way mixed-model analysis of variance (intraclass correlation coefficient, ICC(3,1)) and Kappa statistics were used to assess the stability of the metrics at 1-year and 2-year time intervals.
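The ICC(3,1) named above is computed from the mean squares of a two-way mixed-model ANOVA as (MSR − MSE) / (MSR + (k − 1)·MSE), with subjects as rows and test occasions as columns. A minimal pure-Python sketch (function name and toy scores are ours, not the study's code):

```python
def icc3_1(scores):
    """ICC(3,1), consistency form, from a two-way mixed-model ANOVA.

    scores: list of rows, one per subject, each with k repeated measurements
    (e.g. baseline scores from k annual test administrations).
    """
    n, k = len(scores), len(scores[0])
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(scores[i][j] for i in range(n)) / n for j in range(k)]
    # Between-subjects mean square (rows)
    msr = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    # Residual mean square after removing subject and occasion effects
    sse = sum((scores[i][j] - row_means[i] - col_means[j] + grand) ** 2
              for i in range(n) for j in range(k))
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse)
```

Because this is the consistency form, a systematic shift between occasions (e.g. every cadet improving by the same amount on retest through practice effects) does not lower the coefficient; only changes in the rank ordering of subjects do.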
Results:
ICC values for the 1-year test interval ranged from 0.28 to 0.67 and from 0.15 to 0.57 for the 2-year interval. Kappa values ranged from 0.16 to 0.21 for the 1-year interval and from 0.29 to 0.31 for the 2-year test interval. Across all measures, the observed effects were small, ranging from 0.01 to 0.44.
Conclusions:
This investigation noted less than optimal reliability for the most common concussion baseline assessments. While none of the assessments met or exceeded the accepted clinical threshold, the effect sizes were relatively small suggesting an overlap in performance from year-to-year. As such, baseline assessments beyond the initial evaluation in cadets are not essential but could aid concussion diagnosis.
A greater understanding of the rumen microbiota and its function may help find new strategies to improve feed efficiency in cattle. This study aimed to investigate whether the cattle breed affects specific ruminal taxonomic microbial groups and functions associated with feed conversion ratio (FCR), using two genetically related Angus breeds as a model. Total RNA was extracted from 24 rumen content samples collected from purebred Black and Red Angus bulls fed the same forage diet and then subjected to metatranscriptomic analysis. Multivariate discriminant analysis (sparse partial least square discriminant analysis (sPLS-DA)) and analysis of composition of microbiomes were conducted to identify microbial signatures characterizing Black and Red Angus cattle. Our analyses revealed relationships among bacterial signatures, host breeds and FCR. Although Black and Red Angus are genetically similar, sPLS-DA detected 25 bacterial species and 10 functions that differentiated the rumen microbial signatures between those two breeds. In Black Angus, we identified the bacterial taxa Chitinophaga pinensis and Clostridium stercorarium, along with microbial functions involving the large and small subunit ribosomal proteins L16 and S7, as exhibiting a higher abundance in the rumen microbiome. In Red Angus, by contrast, we identified the poorly characterized bacterial taxon Oscillibacter valericigenes with a higher abundance, together with pathways related to carbohydrate metabolism. Analysis of composition of microbiomes revealed that C. pinensis and C. stercorarium exhibited a higher abundance in Black Angus compared to Red Angus associated with FCR, suggesting that these bacterial species may play a key role in the feed conversion efficiency of forage-fed bulls. This study highlights how the discovery of signatures of bacterial taxa and their functions can be used to harness the full potential of the rumen microbiome in Angus cattle.
Ruminants are unique among livestock due to their ability to efficiently convert plant cell wall carbohydrates into meat and milk. This ability is a result of the evolution of an essential symbiotic association with a complex microbial community in the rumen that includes vast numbers of bacteria, methanogenic archaea, anaerobic fungi and protozoa. These microbes produce a diverse array of enzymes that convert ingested feedstuffs into volatile fatty acids and microbial protein which are used by the animal for growth. Recent advances in high-throughput sequencing and bioinformatic analyses have helped to reveal how the composition of the rumen microbiome varies significantly during the development of the ruminant host, and with changes in diet. These sequencing efforts are also beginning to explain how shifts in the microbiome affect feed efficiency. In this review, we provide an overview of how meta-omics technologies have been applied to understanding the rumen microbiome, and the impact that diet has on the rumen microbial community.
Freak or rogue waves are so called because of their unexpectedly large size relative to the population of smaller waves in which they occur. The 25.6 m high Draupner wave, observed in a sea state with a significant wave height of 12 m, was one of the first confirmed field measurements of a freak wave. The physical mechanisms that give rise to freak waves such as the Draupner wave are still contentious. Through physical experiments carried out in a circular wave tank, we attempt to recreate the freak wave measured at the Draupner platform and gain an understanding of the directional conditions capable of supporting such a large and steep wave. Herein, we recreate the full scaled crest amplitude and profile of the Draupner wave, including bound set-up. We find that the onset and type of wave breaking play a significant role and differ significantly for crossing and non-crossing waves. Crucially, breaking becomes less crest-amplitude limiting for sufficiently large crossing angles and involves the formation of near-vertical jets. In our experiments, we were only able to reproduce the scaled crest and total wave height of the wave measured at the Draupner platform for conditions where two wave systems cross at a large angle.
Moringa oleifera seeds are currently used as a livestock feed across tropical regions of the world owing to their availability and palatability. However, limited knowledge exists on the effects of the raw seeds on ruminant metabolism. As such, the rumen simulation technique (Rusitec) was used to evaluate the effects of substituting increasing concentrations of ground Moringa seeds (0, 100, 200 and 400 g/kg concentrate dry matter (DM)) in the diet on rumen fermentation and methane production. Two identical Rusitec apparatuses, each with eight fermenters, were used, with the first 8 days used for adaptation and days 9 to 16 used for measurements. Fermenters were fed a total mixed ration with Urochloa brizantha as the forage. Disappearance of DM, CP, NDF and ADF linearly decreased (P<0.01) with increasing concentrations of Moringa seeds in the diet. Total volatile fatty acid production and the acetate to propionate ratio also linearly decreased (P<0.01); however, only the 400 g/kg (concentrate DM basis) treatment differed (P<0.01) from the control. Methane production (%), total microbial incorporation of 15N and total production of microbial N linearly decreased (P<0.01) as the inclusion of Moringa seeds increased. Though the inclusion of Moringa seeds in the diet decreased CH4 production, this arose from an unfavourable decrease in diet digestibility and rumen fermentation parameters.
For sufficiently directionally spread surface gravity wave groups, the set-down of the wave-averaged free surface, first described by Longuet-Higgins and Stewart (J. Fluid Mech. vol. 13, 1962, pp. 481–504), can turn into a set-up. Using a multiple-scale expansion for two crossing wave groups, we examine the structure and magnitude of this wave-averaged set-up, which is part of a crossing wave pattern that behaves as a modulated partial standing wave: in space, it consists of a rapidly varying standing-wave pattern slowly modulated by the product of the envelopes of the two groups; in time, it grows and decays on the slow time scale associated with the translation of the groups. Whether this crossing wave pattern actually enhances the surface elevation at the point of focus depends on the phases of the linear wave groups, unlike the set-down, which is always negative and inherits the spatial structure of the underlying envelope(s). We present detailed laboratory measurements of the wave-averaged free surface, examining both single wave groups, varying the degree of spreading from small to very large, and the interaction between two wave groups, varying both the degree of spreading and the crossing angle between the groups. In both cases, we find good agreement between the experiments, our simple expressions for the set-down and set-up, and existing second-order theory based on the component-by-component interaction of individual waves with different frequencies and directions. We predict and observe a set-up for wave groups with a Gaussian angular amplitude distribution with standard deviations of above $30{-}40^{\circ }$ ($21{-}28^{\circ }$ for energy spectra), which is relatively large for realistic sea states, and for crossing sea states with angles of separation of $50{-}70^{\circ }$ and above, which are known to occur in the ocean.
In the near future, ruminants may be forced to consume low-quality water since potable drinking water will become increasingly scarce in some regions of the world. A completely randomized design trial was completed to evaluate the effect of increasing concentrations of total dissolved salts (TDS) (640, 3187, 5740 and 8326 mg TDS/l) in drinking water on the performance, diet digestibility, microbial protein synthesis, nitrogen (N) and water balance using 24 Red Sindhi heifers (200 ± 5 kg) that were fed Buffel (Cenchrus ciliaris) grass hay and concentrate in a ratio of 50 : 50. After a 15-day diet adaptation period, the digestion study was completed over a 5-day period and the performance trial was completed over a 56-day period. Dry matter intake, average daily gain, feed:gain, intake and digestibility of most feed components were unaffected by the concentration of salt in the water. However, intake and digestibility of neutral detergent fibre declined linearly as TDS inclusion rate increased. Further, the inclusion of TDS resulted in a linear increase in the intake of drinking water and total (food plus drinking) water intake. Similarly, TDS inclusion levels resulted in a linear increase in total water excretion, with urine being the major route of water excretion. In contrast, increasing concentrations of TDS caused a linear decrease in creatinine and allantoin excretions. Finally, increasing the inclusion rate of TDS resulted in a linear decrease in N retention and a linear increase in urinary N excretion, which may pose a considerable challenge for farmers with respect to the reduction and management of nutrient losses.
A number of sophisticated modelling approaches are available to investigate potential associations between antimicrobial use (AMU) and resistance (AMR) in animal health settings. All have their advantages and disadvantages, making it unclear as to which model is most appropriate. We used advanced regression modelling to investigate AMU-AMR associations in faecal non-type-specific Escherichia coli (NTSEC) isolates recovered from 275 pens of feedlot cattle. Ten modelling strategies were employed to investigate AMU associations with resistance to chloramphenicol, ampicillin, sulfisoxazole, tetracycline and streptomycin. Goodness-of-fit statistics did not show a consistent advantage for any one model type. Three AMU-AMR associations were significant in all models. Recent parenteral tetracycline use increased the odds of finding tetracycline-resistant NTSEC [odds ratios (OR) 1·1–3·2]; recent parenteral sulfonamide use increased the odds of finding sulfisoxazole-resistant NTSEC (OR 1·4–2·5); and recent parenteral macrolide use decreased the odds of recovering ampicillin-resistant NTSEC (OR 0·03–0·2). Other results varied markedly depending on the modelling approach, emphasizing the importance of exploring and reporting multiple modelling methods based on a balanced consideration of important factors such as study design, mathematical appropriateness, research question and target audience.
The objectives of this study were to determine: (1) the effect of wheat dried distillers grain with solubles (DDGS) inclusion, and (2) dietary feed enzyme (FE; Econase XT) supplementation in a finishing diet containing wheat DDGS on fatty acid profile of the pars costalis diaphragmatis muscle of beef cattle. A total of 160 crossbred yearling steers with initial BW of 495±38 kg were blocked by BW and randomized into 16 pens (10 head/pen). The pens were randomly assigned to one of the four treatments: (1) control (CON; 10% barley silage and 90% barley grain-based concentrate, dry matter (DM) basis); (2) diet containing 30% wheat DDGS in place of barley grain without FE (WDG); (3) WDG diet supplemented with low FE (WDGL; 1 ml FE/kg DM); and (4) WDG diet supplemented with high FE (2 ml FE/kg DM). The pars costalis diaphragmatis muscle samples were collected from cattle at slaughter at the end of the finishing period (120 days) with a targeted live weight of 650 kg. No differences in organic matter intake, final BW and average daily gain were observed among treatments. However, steers fed WDG had greater (P<0.01) feed conversion ratio than those fed CON, and increasing FE application in wheat DDGS-based diets tended (P<0.10) to linearly decrease feed conversion ratio. In assessing the effects of including WDG diets without FE, concentration of total polyunsaturated fatty acids (PUFA) in muscle tended to be greater (P<0.10) for steers fed WDG than steers fed CON. In addition, inclusion of wheat DDGS into the diet increased (P<0.05) concentration of CLA and vaccenic acid (VA) in muscle and also resulted in a higher (P<0.05) ratio of n-6/n-3 PUFA compared with that from steers fed CON diet. Increasing FE application in wheat DDGS-based diets did not modify the concentrations of individual or total fatty acids. These results suggest that inclusion of wheat DDGS in finishing diets may improve fatty acid profile of beef muscle which could benefit human health.
In vitro batch cultures were used to screen four fibrolytic enzyme mixtures at two dosages added to a 60 : 40 silage : concentrate diet containing the C4 tropical grass Andropogon gayanus ensiled at two maturities – vegetative stage (VS) and flowering stage (FS). Based on these studies, one enzyme mixture was selected to treat the same diets and evaluate its impact on fermentation using an artificial rumen (Rusitec). In vitro batch cultures were conducted as a completely randomized design with two runs, four replicates per run and 12 treatments in a factorial arrangement (four enzyme mixtures × three doses). Enzyme additives (E1, E2, E3 and E4) were commercial products and contained a range of endoglucanase, exoglucanase and xylanase activities. Enzymes were added to the complete diet 2 h before incubation at 0, 2 and 4 μl/g of dry matter (DM). Gas production (GP) was measured after 3, 6, 12, 24 and 48 h of incubation. Disappearance of DM (DMD), NDF (NDFD) and ADF (ADFD) was determined after 24 and 48 h. For all four enzyme mixtures, a dosage effect (P<0.05) was observed for NDFD and ADFD after 24 h and for DMD, NDFD and ADFD after 48 h of incubation of the VS diet. For the FS diet, a dosage effect was observed for GP and NDFD after 24 h and for GP, DMD, NDFD and ADFD after 48 h of incubation. There was no difference among enzyme mixtures, nor was there an enzyme × dose interaction for the studied parameters. Because it had the greatest numerical effect on NDF disappearance at the lowest cost, enzyme mixture E2 at 4 µl/g of diet DM was selected for the Rusitec experiment. The enzyme did not affect (P>0.05) DM, N, NDF or ADF disappearance after 48 h of incubation, nor daily ammonia-N, volatile fatty acid or CH4 production. However, enzyme application increased (P<0.05) microbial N production in both the feed particle-associated (loosely associated) and silage feed particle-bound (firmly associated) fractions. With A. gayanus silage diets, degradation may not be limited by microbial colonization, but rather by the ability of fibrolytic enzymes to degrade plant cell walls within this recalcitrant forage.
The objective of this study was to develop emission factors (EF) for methane (CH4) emissions from enteric fermentation in cattle native to Benin. Information on livestock characteristics and diet practices specific to the Benin cattle population were gathered from a variety of sources and used to estimate EF according to Tier 2 methodology of the 2006 Intergovernmental Panel on Climate Change (IPCC) Guidelines for National Greenhouse Gas Inventories. Most cattle from Benin are Bos taurus represented by Borgou, Somba and Lagune breeds. They are mainly multi-purpose, being used for production of meat, milk, hides and draft power and grazed in open pastures and crop lands comprising tropical forages and crops. Estimated enteric CH4 EFs varied among cattle breeds and subcategory owing to differences in proportions of gross energy intake expended to meet maintenance, production and activity. EFs ranged from 15.0 to 43.6, 16.9 to 46.3 and 24.7 to 64.9 kg CH4/head per year for subcategories of Lagune, Somba and Borgou cattle, respectively. Average EFs for cattle breeds were 24.8, 29.5 and 40.2 kg CH4/head per year for Lagune, Somba and Borgou cattle, respectively. The national EF for cattle from Benin was 39.5 kg CH4/head per year. This estimated EF was 27.4% higher than the default EF suggested by IPCC for African cattle with the exception of dairy cattle. The outcome of the study underscores the importance of obtaining country-specific EF to estimate global enteric CH4 emissions.
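At its core, the Tier 2 methodology cited above converts a gross energy intake estimate into an annual emission factor via IPCC (2006) Equation 10.21. A minimal sketch (the example gross-energy value is illustrative, not a Benin-specific estimate):

```python
def enteric_ch4_ef(ge_mj_per_day, ym_percent=6.5):
    """IPCC (2006) Tier 2 enteric CH4 emission factor (Eq. 10.21).

    EF (kg CH4/head per year) = GE * (Ym/100) * 365 / 55.65
    ge_mj_per_day : gross energy intake (MJ/head per day), built up from
                    maintenance, production and activity requirements
    ym_percent    : CH4 conversion factor (% of GE lost as CH4);
                    6.5% is the IPCC default for cattle on forage diets
    55.65         : energy content of methane (MJ/kg CH4)
    """
    return ge_mj_per_day * (ym_percent / 100) * 365 / 55.65

ef = enteric_ch4_ef(100.0)  # illustrative GE of 100 MJ/day -> ~42.6 kg CH4/head/yr
```

Because EF scales linearly with gross energy intake, the breed and subcategory differences reported above follow directly from differences in the estimated energy expended on maintenance, production and activity.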
The current study compared beef production, quality and fatty acid (FA) profiles of yearling steers fed a control diet containing 70 : 30 red clover silage (RCS) : barley-based concentrate, a diet containing 11% sunflower seed (SS) substituted for barley, and diets containing SS with 15% or 30% wheat dried distillers’ grain with solubles (DDGS). Additions of DDGS were balanced by reductions in RCS and SS to maintain crude fat levels in diets. Two pens of eight animals were fed per diet for an average period of 208 days. Relative to the control diet, feeding the SS diet increased (P<0.05) average daily gain, final live weight and proportions of total n-6 FA, non-conjugated 18:2 biohydrogenation products (i.e. atypical dienes) with the first double bond at carbon 8 or 9 from the carboxyl end, conjugated linoleic acid isomers with the first double bond from carbon 7 to 10 from the carboxyl end, and t-18:1 isomers, and reduced (P<0.05) the proportions of total n-3 FA, conjugated linolenic acids, branched-chain FA, odd-chain FA and 16:0. Feeding DDGS-15 and DDGS-30 diets v. the SS diet further increased (P<0.05) average daily gains, final live weight, carcass weight, hot dressing percentage, fat thickness, rib-eye muscle area, and improved instrumental and sensory panel meat tenderness. However, in general, feeding DDGS-15 or DDGS-30 diets did not change FA proportions relative to feeding the SS diet. Overall, adding SS to a RCS-based diet enhanced muscle proportions of 18:2n-6 biohydrogenation products, and further substitutions of DDGS in the diet improved beef production and quality while maintaining proportions of potentially functional bioactive FA including vaccenic and rumenic acids.
Forage sorghum (FS) (Sorghum bicolor (L.) Moench) is a key feed source for ruminants owing to its high yield and drought tolerance. The present paper assessed the agronomic characteristics, silage quality, intake and digestibility of five new Brazilian sorghum cultivars (BRS Ponta Negra variety, BRS 655 hybrid, BR 601 hybrid, BRS 506 variety and BRS 610 hybrid). Forages were grown (randomized complete block design) in a typical Brazilian north-eastern semi-arid climate, irrigated with 267 mm of water, harvested as plants reached the soft dough stage of grain maturity and ensiled under laboratory and farm conditions. Apparent digestibility of the silages was determined using 25 Santa Inês lambs. BRS 506 outperformed the other cultivars in dry matter (DM) and digestible DM yields per hectare. BRS 506 also exhibited the lowest neutral detergent fibre (NDF) and acid detergent fibre (ADF) contents and the highest in vitro dry matter digestibility (IVDMD) of the cultivars examined. BRS 655 produced the lowest lactic acid level and the highest pH and ammonia-N concentration. There was no difference in intake or digestibility of DM among cultivars. Silages produced from BRS Ponta Negra resulted in higher crude protein (CP) intake than those from BRS 655. Silages made from BRS 506 and BRS Ponta Negra resulted in greater digestibility of CP than those produced from BRS 655. Intake of NDF from silages of BRS Ponta Negra and BRS 610 was higher than that of the other cultivars. Although the north-eastern Brazilian FS cultivars exhibited characteristics similar to other cultivars grown in dry regions around the world, the results indicated that BRS 506 had a yield advantage and higher nutritive value under Brazilian semi-arid conditions compared with the other cultivars examined.
Twenty ruminally cannulated beef heifers were fed a high corn grain diet in a randomized block design to determine the effect of three direct-fed microbial (DFM) strains of Propionibacterium on ruminal fermentation, nutrient digestibility and methane (CH4) emissions. The heifers were blocked into five groups on the basis of body weight and used in five 28-day periods. Dietary treatments were (1) Control and three strains of Propionibacterium: (2) P169, (3) P5 and (4) P54. Strains were administered directly into the rumen at 5×10⁹ CFU with 10 g of a maltodextrin carrier in a gel capsule; Control heifers received the carrier only. All heifers were fed the basal diet (10 : 90 forage to concentrate, dry matter basis). Rumen contents were collected on days 15 and 18, ruminal pH was measured continuously between days 15 and 22, enteric CH4 emissions were measured between days 19 and 22 and diet digestibility was measured from days 25 to 28. Mean ruminal pH was 5.91 and was not affected by treatment. Similarly, the duration of time that pH was below 5.8 and below 5.6 was not affected by treatment. Likewise, total and major volatile fatty acid profiles were similar among treatments. No effects were observed on dry matter intake or total tract digestibility of nutrients. Total enteric CH4 production was not affected by Propionibacterium strain and averaged 139 g/day. Similarly, mean CH4 yield (g CH4/kg of dry matter intake) was similar for all treatments. The relative abundance of total Propionibacteria in the rumen increased with administration of DFM and was greater 3 h post-dosing relative to Control, but returned to baseline levels before feeding. Populations of Propionibacterium P169 were higher at 3 and 9 h than at 0 h.
In conclusion, the moderate persistency of the inoculated strains within the ruminal microbiome and pre-existing high propionate production, due to elevated levels of starch fermentation, might have reduced the efficacy of the Propionibacterium strains in increasing the molar proportion of propionate and thereby reducing CH4 emissions.