We present the serendipitous radio-continuum discovery of a likely Galactic supernova remnant (SNR) G305.4–2.2. This object displays a remarkable circular symmetry in shape, making it one of the most circular Galactic SNRs known. Nicknamed Teleios due to its symmetry, it was detected in the new Australian Square Kilometre Array Pathfinder (ASKAP) Evolutionary Map of the Universe (EMU) radio–continuum images with an angular size of 1 320$^{\prime\prime}$$\times$1 260$^{\prime\prime}$ and PA = 0$^\circ$. While there is a hint of possible H$\alpha$ and gamma-ray emission, Teleios is exclusively seen at radio–continuum frequencies. Interestingly, Teleios is not only almost perfectly symmetric, but it also has one of the lowest surface brightnesses discovered among Galactic SNRs and a steep spectral index of $\alpha$=–0.6$\pm$0.3. Our best estimates from Hi studies and the $\Sigma$–D relation place Teleios as a type Ia SNR at a distance of either $\sim$2.2 kpc (near-side) or $\sim$7.7 kpc (far-side). This indicates two possible scenarios: either a young (under 1 000 yr) or a somewhat older SNR (over 10 000 yr). With a corresponding diameter of 14/48 pc, our evolutionary studies place Teleios in either the early or late Sedov phase, depending on the distance/diameter estimate. However, our modelling also predicts X-ray emission, which we do not see in the present generation of eROSITA images. We also explored a type Iax explosion scenario that would point to a much closer distance of $\lt$1 kpc and a Teleios size of only $\sim$3.3 pc, which would be similar to the only known type Iax remnant, SN 1181. Unfortunately, all examined scenarios have their challenges, and no definitive supernova (SN) origin type can be established at this stage. Remarkably, Teleios has retained its symmetrical shape even as it aged to such a diameter, suggesting expansion into a rarefied and isotropic ambient medium. The low radio surface brightness and the lack of pronounced polarisation can be explained by a high level of ambient rotation measure (RM), with the largest RM being observed at Teleios’s centre.
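As a quick consistency check on the quoted sizes (not part of the original analysis), the physical diameter follows from the small-angle relation D ≈ θ·d. The sketch below uses the mean angular diameter of ~1 290″ and the two candidate distances, and reproduces the quoted ~14 pc and ~48 pc figures.

```python
import math

# Small-angle estimate of the physical diameter of Teleios (illustrative check,
# not the authors' pipeline): D [pc] = theta [rad] * d [pc].
ARCSEC_TO_RAD = math.pi / (180.0 * 3600.0)

theta_arcsec = 0.5 * (1320.0 + 1260.0)   # mean angular diameter from the 1 320" x 1 260" fit
for d_kpc in (2.2, 7.7, 1.0):            # near-side, far-side, and the <1 kpc type Iax case
    d_pc = d_kpc * 1000.0
    diameter_pc = theta_arcsec * ARCSEC_TO_RAD * d_pc
    print(f"d = {d_kpc:>4.1f} kpc  ->  D ~ {diameter_pc:.1f} pc")
# Prints ~14 pc (near side) and ~48 pc (far side); a ~3.3 pc type Iax-sized
# remnant would correspond to a distance well below 1 kpc.
```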
The prevalence of ADHD diagnoses more than doubled in VA settings between 2009 and 2016 (Hale et al., 2020). However, attentional difficulties are not exclusive to ADHD and can also be seen in non-neurodevelopmental disorders, including depression, anxiety, substance use, and PTSD (Marshall et al., 2018; Suhr et al., 2008). Further, patients can easily feign symptoms of ADHD, and few instruments are available for accurate detection (Robinson & Rogers, 2018). Given the significant symptom overlap and rising rates of reported ADHD among Veterans, accurate detection of feigned ADHD is essential.
This study examined the utility of the experimental Dissimulation ADHD scale (Ds-ADHD; Robinson & Rogers, 2018) on the MMPI-2 in detecting feigned ADHD presentation within a mixed sample of Veterans.
Participants and Methods:
In this retrospective study, 173 Veterans (Mage = 36.18, SDage = 11.10, Medu = 14.01, SDedu = 2.11, 88% male, 81% White, and 17% Black) were referred for neuropsychological evaluation of ADHD that included the MMPI-2 and up to 10 performance validity tests (PVTs). Participants were assigned to a credible group (n=146) if they passed all PVTs or a non-credible group (n=27) if they failed two or more PVTs. Group assignment was also clinically confirmed. The Ds-ADHD was used to differentiate groups with credible versus non-credible performance on cognitive measures. Consistent with Robinson and Rogers’ study, “true” answers (i.e., erroneous stereotypes) were coded as 1 and “false” answers were coded as 2, creating a 10- to 20-point scale. Lower scores were associated with a higher likelihood of a feigned ADHD presentation.
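As an illustration of the coding rule described above (the specific MMPI-2 items that make up the Ds-ADHD scale are not listed here, so the item identifiers below are placeholders), a minimal scoring sketch:

```python
# Illustrative Ds-ADHD scoring sketch. The ten item identifiers and the example
# responses are hypothetical; only the coding rule follows the text:
# "true" (endorsing an erroneous ADHD stereotype) = 1, "false" = 2, summed over
# 10 items, giving a 10-20 range in which LOWER totals suggest feigned ADHD.

DS_ADHD_ITEMS = ["item_01", "item_02", "item_03", "item_04", "item_05",
                 "item_06", "item_07", "item_08", "item_09", "item_10"]

def score_ds_adhd(responses: dict[str, bool]) -> int:
    """responses maps each Ds-ADHD item to True ('true') or False ('false')."""
    return sum(1 if responses[item] else 2 for item in DS_ADHD_ITEMS)

# A respondent endorsing 7 of the 10 erroneous stereotypes as 'true':
example = {item: (i < 7) for i, item in enumerate(DS_ADHD_ITEMS)}
print(score_ds_adhd(example))   # 7*1 + 3*2 = 13, i.e. in the range flagged by low cut scores
```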
Results:
Preliminary analyses revealed no significant group differences in age, education, race, or gender (ps > .05). An ANOVA indicated a significant difference between groups (F[1, 171] = 10.44, p = .001; Cohen’s d = .68) for Ds-ADHD raw scores; Veterans in the non-credible group reported more “erroneous stereotypes” of ADHD (M raw score = 13.33, SD = 2.20) than those in the credible group (M = 14.82, SD = 2.20). A ROC analysis indicated an AUC of .691 (95% CI = .58 to .80). In addition, a cut score of <12 resulted in specificity of 91.8% and sensitivity of 18.5%, whereas a cut score of <13 resulted in specificity of 83.6% and sensitivity of 44.4%.
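For readers unfamiliar with how the cut-score operating characteristics above are computed, the sketch below shows the calculation on hypothetical score vectors (the study's item-level data are not reproduced here): a case is flagged as feigning when its Ds-ADHD total falls below the cut, sensitivity is the flagged proportion of the non-credible group, and specificity is the unflagged proportion of the credible group.

```python
# Hypothetical Ds-ADHD totals; only the sensitivity/specificity logic mirrors the text.
credible     = [16, 15, 14, 14, 13, 15, 17, 12, 14, 16]   # placeholder credible group
non_credible = [11, 13, 12, 10, 14, 11]                    # placeholder non-credible group

def operating_characteristics(cut: int):
    # A "cut score of <cut" treats totals strictly below the cut as feigned.
    flagged_noncred = sum(score < cut for score in non_credible)
    flagged_cred = sum(score < cut for score in credible)
    sensitivity = flagged_noncred / len(non_credible)
    specificity = 1 - flagged_cred / len(credible)
    return sensitivity, specificity

for cut in (12, 13):
    sens, spec = operating_characteristics(cut)
    print(f"cut < {cut}: sensitivity = {sens:.2f}, specificity = {spec:.2f}")
```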
Conclusions:
The Ds-ADHD scale demonstrated significant differences between credible and non-credible respondents in a real-world setting. Previously, this scale had primarily been studied within laboratory settings. Further, results indicate a cut score of <12 could be used to achieve adequate specificity (i.e., >90%), similar to findings from a study examining SVT-based groups (Winiarski et al., 2023). These results differ slightly from prior research by Robinson and Rogers (2018), who indicated a cut score of <13 based on the initial simulation-based study. In similar clinical settings, where there are high rates of psychiatric comorbidity, a cut score of <12 may prove clinically useful. However, this cut score was associated with low sensitivity within this mixed Veteran sample. Further research should focus on replicating findings within other clinical settings, including ones with larger non-credible samples.
Accurate identification of Attention-Deficit/Hyperactivity Disorder (ADHD) is complicated by possible secondary gain, overlap of symptoms with psychiatric disorders, and face validity of measures (Suhr et al., 2011; Shura et al., 2017). To assist with diagnostic clarification, an experimental Dissimulation ADHD scale (Ds-ADHD; Robinson & Rogers, 2018) on the MMPI-2 was found to distinguish credible from non-credible respondents defined by Performance Validity Test (PVT)-based group assignment in Veterans (Burley et al., 2023). However, symptom and performance validity have been understood as distinct constructs (Van Dyke et al., 2013), with Symptom Validity Tests (SVTs) more accurately identifying over-reporting of symptoms in ADHD (White et al., 2022). The current study sought to evaluate the effectiveness of the Ds-ADHD scale using an SVT, namely the Infrequency Index of the CAARS (CII; Suhr et al., 2011), for group assignment within a mixed sample of Veterans.
Participants and Methods:
In this retrospective study, 187 Veterans (Mage = 36.76, SDage = 11.25, Medu = 14.02, SDedu = 2.10, 83% male, 19% Black, 78% White) were referred for neuropsychological evaluation of ADHD and administered a battery that included internally consistent MMPI-2 and CAARS profiles. Veterans were assigned to a credible group (n=134) if CII was <21 or a non-credible group (n=53) if CII was >21. The Ds-ADHD scale was calculated for the MMPI-2. Consistent with Robinson and Rogers (2018), “true” answers (i.e., erroneous stereotypes) were coded as 1 and “false” answers were coded as 2, creating a 10- to 20-point scale. Lower scores were associated with a higher likelihood of a feigned ADHD presentation.
Results:
Analyses revealed no significant differences in age, education, race, or gender (ps > .05) between credible and non-credible groups. An ANOVA indicated a significant difference between groups (F[1,185] = 24.78, p < .001; Cohen’s d = 0.80) for Ds-ADHD raw scores. Veterans in the non-credible group reported more “erroneous stereotypes” of ADHD (M raw score = 13.23, SD = 2.10) than those in the credible group (M = 14.94, SD = 2.13). A ROC analysis indicated an AUC of .72 (95% CI = .64 to .80). In addition, a Ds-ADHD cut score of <12 resulted in specificity of 94.5% and sensitivity of 22.6%, whereas a cut score of <13 resulted in specificity of 85.8% and sensitivity of 50.9%. Results were essentially unchanged when other CII cut scores recommended in the literature were analyzed; specifically, analyses were repeated with group assignment defined by a CII cut score of <18 and again after removing an intermediate group (CII = 18 to 21; n=24).
Conclusions:
The Ds-ADHD scale demonstrated significant differences between credible and non-credible respondents in a Veteran population. Results suggest a cut score of <12 had adequate specificity (.95) with low sensitivity (.23). This is consistent with findings using PVTs for group assignment that indicated a cut score of <12 had adequate specificity (.92) with low sensitivity (.19; Burley et al., 2023). Taken together, findings suggest that the Ds-ADHD scale demonstrates utility in distinguishing credible from non-credible responding. Further research should evaluate the utility of the scale in other clinical populations.
The MMPI-2-RF contains scales that assess different types of invalid response styles, especially potential symptom over-reporting (e.g., F-r, Fs, Fp-r, FBS-r, RBS). However, these scales are not designed to specifically capture non-credible symptom reports associated with Attention-Deficit/Hyperactivity Disorder (ADHD). Robinson & Rogers (2018) proposed the experimental Dissimulation ADHD validity scale (Ds-ADHD) on the MMPI-2-RF that was effective in distinguishing credible and non-credible ADHD diagnoses in a simulator-based study. Within the current study, the Ds-ADHD scale was compared to the established MMPI-2-RF validity scales within a mixed sample of U.S. Military Veterans.
Participants and Methods:
173 Veterans (Mage = 36.18, SDage = 11.10, Medu = 14.01, SDedu = 2.11, 88% male, 81% White, 17% Black) completed a neuropsychological evaluation which included an internally consistent MMPI-2-RF profile and up to 10 performance validity tests (PVTs) as well as a question about a possible ADHD diagnosis. Participants were assigned to the credible group (n=146) if they completed at least two PVTs and passed all of them, and to the non-credible group (n=27) if they failed two or more PVTs. Group assignment was clinically confirmed. The Ds-ADHD scale was calculated according to Robinson and Rogers (2018): responses of “true” (i.e., erroneous stereotypes) were coded as 1 and “false” answers were coded as 2, creating a 10- to 20-point scale. Thus, lower scores would be associated with a higher likelihood of a feigned ADHD presentation. Other MMPI-2-RF validity scales of interest included F-r, Fs, Fp-r, FBS-r, and RBS.
Results:
The established MMPI-2-RF validity scales were significantly correlated with PVT group membership, but correlations were weak to moderately strong (rS ranged from -.43 to -.18; ps < .05). A series of stepwise regression models was completed with the Ds-ADHD scale and one of the MMPI-2-RF validity scales as independent variables, with group membership as the dependent variable. The Ds-ADHD scale contributed uniquely to each model (ΔR² ranged from .03 to .04, ps < .05). The established MMPI-2-RF validity scales effectively classified group membership (AUC values ranged from .57 to .68), and the Ds-ADHD scale had a marginally higher AUC (.69); however, it was not statistically significantly stronger than any of the established scales (ps > .05).
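A minimal sketch of the incremental-validity logic described above, using simulated data (the real scale scores are not available here): group membership is regressed on an established validity scale, the Ds-ADHD score is added in a second step, and the change in model fit indicates its unique contribution.

```python
# Illustrative two-step (hierarchical) logistic regression with simulated data;
# variable names and simulated effect sizes are placeholders, not study values.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 173
rbs = rng.normal(50, 10, n)                         # placeholder established validity scale (T-scores)
ds_adhd = rng.normal(14.5, 2.2, n)                  # placeholder Ds-ADHD raw scores
logit = -4 + 0.05 * rbs - 0.25 * (ds_adhd - 14.5)
group = rng.binomial(1, 1 / (1 + np.exp(-logit)))   # 1 = non-credible

step1 = sm.Logit(group, sm.add_constant(np.column_stack([rbs]))).fit(disp=0)
step2 = sm.Logit(group, sm.add_constant(np.column_stack([rbs, ds_adhd]))).fit(disp=0)

# Unique contribution of Ds-ADHD over the established scale: pseudo-R2 change
# plus a likelihood-ratio statistic for the added predictor (1 df).
lr_stat = 2 * (step2.llf - step1.llf)
print("pseudo-R2 change:", round(step2.prsquared - step1.prsquared, 3))
print("LR chi-square (1 df):", round(lr_stat, 2))
```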
Conclusions:
Clinicians interested in identifying potentially simulated ADHD presentations with the MMPI-2-RF may desire to calculate the Ds-ADHD scale, which previously had support only from a simulator-based study. The Ds-ADHD scale significantly contributed to each model, suggesting that it helped explain group membership over and above each of the traditional MMPI-2-RF validity scales. However, it only had a marginally stronger ability to classify participants, indicating that there may be diminishing returns for clinicians. Among the traditional validity scales, RBS and F-r best classified groups, and FBS-r was the least effective. This study employed a cross-sectional design in a mixed sample of Veterans undergoing a neuropsychological evaluation. Future research should focus on replicating the findings using a credible sample limited to independently verified diagnoses of ADHD.
We propose an economic reformulation of contribution policy integrating: (1) formalization of sustainability as the steady-state contribution rate, incorporating both the expected return on risky assets and a low-risk discount rate for liabilities; (2) derivation of contribution adjustment policies required for convergence toward the target funded ratio and contribution rate; and (3) a stylized optimization framework for simultaneous determination of the target portfolio return and funded ratio. This analysis provides new theoretical insights into the basis for pre-funding vs. pay-as-you-go, resting on the convexity of the long-run risk–return relationship, and also potentially practical guidelines for contribution policy.
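To make the notion of a steady-state contribution rate concrete, here is one simple illustrative identity (a sketch under stylised assumptions, not the paper's formal model): assets earn the expected risky return r, liabilities L are valued at a low-risk discount rate, all flows grow at rate g, and the plan holds a constant target funded ratio F = A/L. Requiring assets to grow in line with liabilities then pins down the steady-state contribution.

```latex
% Stylised steady-state sketch (illustrative only, not the paper's derivation).
% Assets earn the expected risky return r; contributions C_t and benefits B_t
% are paid at the end of the period; all flows grow at rate g; the target
% funded ratio F = A_t / L_t is held constant, with L_t valued at a low-risk rate.
\begin{align*}
  A_{t+1} &= (1+r)\,A_t + C_t - B_t, \qquad A_{t+1} = (1+g)\,A_t \\
  \Rightarrow\; C_t^{*} &= B_t - (r - g)\,F\,L_t ,
\end{align*}
% i.e. the steady-state contribution equals benefit outgo less the excess of the
% expected risky return over growth, earned on the funded share of liabilities;
% dividing C_t^{*} by payroll gives the steady-state contribution rate.
```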
The current study investigated the effects of pre-grazing herbage mass (PGHM, 1500 or 2500 kg dry matter (DM)/ha) and post-grazing sward height (PGSH, 4 or 6 cm) on herbage production and its nutritive value and DM intake, grazing behaviour and growth of Charolais steers (n = 96; 12 months of age; 396 ± 19.0 kg) during a 222-day grazing season, and the subsequent effect of an indoor finishing diet (grass silage alone or supplemented with concentrates) for 146 days, on performance and carcass traits. Steers were assigned to one of 12 grazing groups, and each group was assigned to a 2 (PGHM) × 2 (PGSH) factorial arrangement of treatments. At the end of the grazing season, live-weight was 16 kg heavier for PGHM-1500 than PGHM-2500 and 34 kg heavier for PGSH-6 than PGSH-4. After indoor finishing, there was no difference in carcass weight between PGHM treatments, but PGSH-6 had a 19 kg heavier carcass than PGSH-4. Herbage production was 881 and 517 kg DM/ha greater for PGHM-2500 than PGHM-1500 and for PGSH-4 than PGSH-6, respectively. Grazing stocking rate did not differ between PGHM treatments but PGSH-4 carried 1.35 more steers/ha than PGSH-6. Supplementing concentrates during the indoor period increased carcass weight (42 kg) and fat score (2.10 units). In conclusion, grazing to 6 rather than 4 cm increased individual carcass weight but not carcass weight gain/ha. Compared to PGHM-2500, grazing PGHM-1500 increased steer live-weight gain at pasture, but did not affect carcass weight following indoor finishing.
As TeV gamma-ray astronomy progresses into the era of the Cherenkov Telescope Array (CTA), there is a desire for the capacity to instantaneously follow up on transient phenomena and continuously monitor gamma-ray flux at energies above $10^{12}\,\mathrm{eV}$. To this end, a worldwide network of Imaging Air Cherenkov Telescopes (IACTs) is required to provide triggers for CTA observations and complementary continuous monitoring. An IACT array sited in Australia would contribute significant coverage of the Southern Hemisphere sky. Here, we investigate the suitability of a small IACT array and how different design factors influence its performance. Monte Carlo simulations were produced based on the Small-Sized Telescope (SST) and Medium-Sized Telescope (MST) designs from CTA. Angular resolution improved with larger baseline distances up to 277 m between telescopes, and energy thresholds were lower at 1 000 m altitude than at 0 m. The ${\sim} 300\,\mathrm{GeV}$ energy threshold of MSTs proved more suitable for observing transients than the ${\sim}1.2\,\mathrm{TeV}$ threshold of SSTs. An array of four MSTs at 1 000 m was estimated to give a 5.7$\sigma$ detection of an RS Ophiuchi-like nova eruption from a 4-h observation. We conclude that an array of four MST-class IACTs at an Australian site would ideally complement the capabilities of CTA.
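The abstract does not spell out how the 5.7σ figure is derived, but detection significances for IACT on/off observations are conventionally computed with the Li & Ma (1983, Eq. 17) formula; the sketch below implements that formula with placeholder counts, purely as an illustration of the method.

```python
import math

def li_ma_significance(n_on: float, n_off: float, alpha: float) -> float:
    """Li & Ma (1983) Eq. 17 significance for an on/off counting observation.

    n_on  : counts in the on-source region
    n_off : counts in the off-source (background) region(s)
    alpha : ratio of on-source to off-source exposure
    """
    term_on = n_on * math.log((1 + alpha) / alpha * n_on / (n_on + n_off))
    term_off = n_off * math.log((1 + alpha) * n_off / (n_on + n_off))
    return math.sqrt(2.0 * (term_on + term_off))

# Placeholder counts for a short observation (NOT the simulated RS Oph values):
print(f"{li_ma_significance(n_on=120, n_off=500, alpha=0.2):.2f} sigma")
```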
There is a requirement in some beef markets to slaughter bulls at under 16 months of age. This requires high levels of concentrate feeding. Increasing the slaughter age of bulls to 19 months facilitates the inclusion of a grazing period, thereby decreasing the cost of production. Recent data indicate few quality differences in longissimus thoracis (LT) muscle from conventionally reared 16-month-old bulls and 19-month-old bulls that had a grazing period prior to finishing on concentrates. The aim of the present study was to expand this observation to additional commercially important muscles/cuts. The production systems selected were concentrates offered ad libitum and slaughter at under 16 months of age (16-C) or at 19 months of age (19-CC) to examine the effect of age per se, and the cheaper alternative for 19-month bulls described above (19-GC). The results indicate that muscles from 19-CC were redder and had more intramuscular fat and higher cook loss than those from 16-C. No differences in muscle objective texture or sensory texture and acceptability were found between treatments. The expected differences in composition and quality between the muscles were generally consistent across the production systems examined. Therefore, for the type of animal and range of ages investigated, the effect of the production system on LT quality was generally representative of the effect on the other muscles analysed. In addition, the data do not support the under-16-month age restriction, based on meat acceptability, in commercial suckler bull production.
Breeding values for feed intake and feed efficiency in beef cattle are generally derived indoors on high-concentrate (HC) diets. Within temperate regions of north-western Europe, however, the majority of a growing beef animal’s lifetime dietary intake comes from grazed grass and grass silage. Using 97 growing beef cattle, the objective of the current study was to assess the repeatability of both feed intake and feed efficiency across 3 successive dietary test periods comprising grass silage plus concentrates (S+C), grazed grass (GRZ) and a HC diet. Individual DM intake (DMI), DMI/kg BW and feed efficiency-related parameters, residual feed intake (RFI) and gain to feed ratio (G : F) were assessed. There was a significant correlation for DMI between the S+C and GRZ periods (r = 0.32; P < 0.01) as well as between the S+C and HC periods (r = 0.41; P < 0.001), whereas there was no association for DMI between the GRZ and HC periods. There was a significant correlation for DMI/kg BW between the S+C and GRZ periods (r = 0.33; P < 0.01) and between the S+C and HC periods (r = 0.40; P < 0.001), but there was no association for the trait between the GRZ and HC periods. There was a significant correlation for RFI between the S+C and GRZ periods (r = 0.25; P < 0.05) as well as between S+C and HC periods (r = 0.25; P < 0.05), whereas there was no association for RFI between the GRZ and HC periods. Gain to feed ratio was not correlated between any of the test periods. A secondary aspect of the study demonstrated that traits recorded in the GRZ period relating to grazing bite rate, the number of daily grazing bouts and ruminating bouts were associated with DMI (r = 0.28 to 0.42; P < 0.05 - 0.001), DMI/kg BW (r = 0.36 to 0.45; P < 0.01 - 0.001) and RFI (r = 0.31 to 0.42; P < 0.05 - 0.001). Additionally, the number of ruminating boli produced per day and per ruminating bout were associated with G : F (r = 0.28 and 0.26, respectively; P < 0.05). Results from this study demonstrate that evaluating animals for both feed intake and feed efficiency indoors on HC diets may not reflect their phenotypic performance when consuming conserved forage-based diets indoors or when grazing pasture.
Cellular mitochondrial function has been suggested to contribute to variation in feed efficiency (FE) among animals. The objective of this study was to determine mitochondrial abundance and activities of various mitochondrial respiratory chain complexes (complex I (CI) to complex IV (CIV)) in liver and muscle tissue from beef cattle phenotypically divergent for residual feed intake (RFI), a measure of FE. Individual DM intake (DMI) and growth were measured in purebred Simmental heifers (n = 24) and bulls (n = 28) with an initial mean BW (SD) of 372 kg (39.6) and 387 kg (50.6), respectively. All animals were offered concentrates ad libitum and 3 kg of grass silage daily, and feed intake was recorded for 70 days. Residuals of the regression of DMI on average daily gain (ADG), mid-test BW^0.75 and backfat (BF), using all animals, were used to compute individual RFI coefficients. Animals were ranked within sex, by RFI, into high (inefficient; top third of the population), medium (middle third of the population) and low (efficient; bottom third of the population) terciles. Statistical analysis was carried out using the MIXED procedure of SAS v 9.3. Overall mean ADG (SD) and daily DMI (SD) for heifers were 1.2 (0.4) and 9.1 (0.5) kg, respectively, and for bulls were 1.8 (0.3) and 9.5 (1.02) kg, respectively. Heifers and bulls ranked as high RFI consumed 10% and 15% more feed (P < 0.05), respectively, than their low RFI counterparts. There was no effect of RFI on mitochondrial abundance in either liver or muscle (P > 0.05). An RFI × sex interaction was apparent for CI activity in muscle. High RFI animals had an increased activity (P < 0.05) of CIV in liver tissue compared to their low RFI counterparts; however, the relevance of that observation is not clear. Our data provide no clear evidence that cellular mitochondrial function within either skeletal muscle or hepatic tissue has an appreciable contributory role to overall variation in FE among beef cattle.
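As context for how the RFI coefficients above are obtained (a generic sketch with simulated values, not the study's SAS code), RFI is the residual from a linear regression of DMI on average daily gain, metabolic mid-test body weight (BW^0.75) and backfat, after which animals are ranked into terciles.

```python
# Generic residual-feed-intake (RFI) sketch with simulated data; the regression
# structure follows the description in the text, but all numbers are placeholders.
import numpy as np

rng = np.random.default_rng(1)
n = 52
adg = rng.normal(1.5, 0.3, n)              # average daily gain, kg/day
mid_bw = rng.normal(380, 45, n) ** 0.75    # metabolic mid-test body weight, kg^0.75
backfat = rng.normal(6.0, 1.5, n)          # backfat depth, mm
dmi = 2.0 + 1.5 * adg + 0.07 * mid_bw + 0.1 * backfat + rng.normal(0, 0.5, n)

# Predicted DMI from the multiple regression; RFI = observed - predicted.
X = np.column_stack([np.ones(n), adg, mid_bw, backfat])
beta, *_ = np.linalg.lstsq(X, dmi, rcond=None)
rfi = dmi - X @ beta

# Rank into terciles (the study did this within sex): low = efficient, high = inefficient.
cuts = np.quantile(rfi, [1 / 3, 2 / 3])
tercile = np.digitize(rfi, cuts)           # 0 = low RFI, 1 = medium, 2 = high
print({label: int((tercile == k).sum()) for k, label in enumerate(["low", "medium", "high"])})
```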
Finishing late-maturing bulls on grass may alter the antioxidant/prooxidant balance, leading to beef with higher susceptibility to lipid oxidation and a lower colour stability compared to bulls finished on cereal concentrates. In this context, lipid oxidation and colour stability of beef from late-maturing bulls finished on pasture, with or without concentrate supplements, or indoors on concentrate was assessed. Charolais- or Limousin-sired bulls (n = 48) were assigned to four production systems: (1) pasture only (P), (2) pasture plus 25% dietary DM intake as barley-based concentrate (PC25), (3) pasture plus 50% dietary DM intake as barley-based concentrate (PC50) or (4) a barley-based concentrate ration (C). Following slaughter and postmortem ageing, M. longissimus thoracis et lumborum was subjected to simulated retail display (4°C, 1000 lux for 12 h out of 24 h) for 3, 7, 10 and 14 days in modified atmosphere packs (O2 : CO2; 80 : 20). Lipid oxidation was determined using the 2-thiobarbituric acid-reactive substances assay; α-tocopherol was determined by HPLC; fatty acid methyl esters were determined using gas chromatography. Using a randomised complete block design, treatment means were compared by either ANOVA or repeated measures ANOVA using the MIXED procedure of SAS. Total polyunsaturated fatty acid (PUFA) concentrations were not affected by treatment; n-3 PUFAs were higher (P < 0.001) and the ratio of n-6 to n-3 PUFAs was lower (P < 0.001) in muscle from P, PC25 and PC50 compared to C. α-Tocopherol concentration was higher in muscle from P compared to PC50 and C bulls (P = 0.001) and decreased (P < 0.001) in all samples by day 14. Lipid oxidation was higher in muscle from C compared to P bulls on day 10 and day 14 of storage (P < 0.01). Finishing on pasture without supplementation did not affect beef colour stability and led to lower lipid oxidation, possibly due to the higher α-tocopherol concentration compared to concentrate-finished beef.
Colostrum-derived passive immunity is central to the health, performance and welfare of neonatal beef-suckler calves, and to the economics of beef-farming enterprises. Compared to dairy calves, mainly Holstein-Friesian, much less research has been carried out on passive immunity and associated factors in beef calves. Thus, this review aimed to summarise and interpret published information and highlight areas requiring further research. The transfer of immunoglobulin G1 (IgG1) from blood to mammary secretions is greater for beef × dairy cows compared to most beef breed types. Considerable between-animal variance is evident in first-milking colostrum yield and immunoglobulin concentration of beef-suckler cow breed types. First-milking colostrum immunoglobulin concentrations are similar for within-quarter fractions and for the front and rear quarters of the udder. First-milking colostrum yield is higher for beef × dairy cows than beef × beef and purebred beef breeds, and higher for multiparous than primiparous cows, but generally colostrum immunoglobulin concentration is relatively similar for each of the respective categories. Consequently, colostrum immunoglobulin mass (volume × concentration) production in beef cows seems to be primarily limited by colostrum volume. The effect of maternal nutrition during late gestation on colostrum yield is not well documented; however, most studies provide evidence that colostrum immunoglobulin concentration is not adversely affected by under-nutrition. Factors that impinge upon the duration between birth and first suckling, including dam parity, udder and teat anatomy and especially dystocia, negatively impact calf passive immunity. Colostrum immunoglobulin mass ingested relative to birth weight post-parturition is the most important variable determining calf passive immunity. Research indicates that feeding the beef calf a colostrum volume equivalent to 5% of birth weight shortly after parturition, with subsequent suckling of the dam (or a second feed) 6 to 8 h later, ensures adequate passive immunity, equivalent to a well-managed suckling situation. Within beef-suckler cow genotypes, calf passive immunity is similar for many common beef breeds, but is generally higher for calves from beef × dairy cows. Compared to older cows, calves from younger cows, especially primiparous animals, have lower serum immunoglobulin concentrations. Most studies have shown no adverse impact of maternal dietary restriction on calf passive immunity. The prevalence of failure of passive transfer (FPT) in beef calves varies considerably across studies depending on the test used, the cut-off value assumed and how FPT is classified. The accuracy and precision of the methodologies used to determine immunoglobulin concentrations are concerning; caution is required in interpreting laboratory results regarding defining colostrum ‘quality’ and calf passive immune ‘status’. Further research is warranted on colostrum-related factors limiting passive immunity of beef calves, and on the validation of laboratory test cut-off points for determining FPT, based on their relationships with key health and performance measures.
Feed efficiency in growing cattle (i.e. the animal’s ability to reach a market or adult BW with the least amount of feed intake) is a key factor in the beef cattle industry. Feeding systems have made huge progress in understanding the dietary factors influencing average feed efficiency. However, there exists a considerable amount of animal-to-animal variation around the average feed efficiency observed in beef cattle reared in similar conditions, which is still far from being understood. This review aims to identify biological determinants and molecular pathways involved in the between-animal variation in feed efficiency with particular reference to growing beef cattle phenotyped for residual feed intake (RFI). Moreover, the review attempts to distinguish true potential determinants from those revealed through simple associations or indirectly linked to RFI through their association with feed intake. The most representative and most studied biological processes that seem to be connected to feed efficiency were reviewed, such as feeding behaviour, digestion and methane production, rumen microbiome structure and functioning, energy metabolism at the whole body and cellular levels, protein turnover, hormone regulation and body composition. In addition, an overall molecular network analysis was conducted to unravel networks and their linked functions involved in between-animal variation in feed efficiency. The results from this review suggest that feeding and digestive-related mechanisms could be associated with RFI mainly because they co-vary with feed intake. Although much more research is warranted, especially with high-forage diets, the role of feeding and digestive-related mechanisms as true determinants of animal variability in feed efficiency could be minor. Concerning the metabolic-related mechanisms, despite the scarcity of studies using reference methods, it seems that feed-efficient animals have a significantly lower energy metabolic rate independent of the associated intake reduction. This lower heat production in feed-efficient animals may result from a decreased protein turnover and a higher efficiency of ATP production in mitochondria, both mechanisms also identified in the molecular network analysis. In contrast, hormones and body composition could not be conclusively related to animal-to-animal variation in feed efficiency. The analysis of potential biological networks underlying RFI variations highlighted other significant pathways such as lipid metabolism and immunity and stress response. Finally, emerging knowledge suggests that metabolic functions underlying genetic variation in feed efficiency could be associated with other important traits in animal production. This emphasizes the relevance of understanding the biological basis of relevant animal traits to better define future balanced breeding programmes.
Improvements in feed efficiency of beef cattle have the potential to increase producer profitability and simultaneously lower the environmental footprint of beef production. Although there are many different approaches to measuring feed efficiency, residual feed intake (RFI) has increasingly become the measure of choice. Defined as the difference between an animal’s actual and predicted feed intake (based on weight and growth), RFI is conceptually independent of growth and body size. In addition, other measurable traits related to energy expenditure, such as estimates of body composition, can be included in the calculation of RFI to also force independence from these traits. Feed efficiency is a multifactorial and complex trait in beef cattle and inter-animal variation stems from the interaction of many biological processes influenced, in turn, by physiological status and management regimen. Thus, the purpose of this review was to summarise and interpret current published knowledge and provide insight into research areas worthy of further investigation. Indeed, where sufficient suitable reports exist, meta-analyses were conducted to mitigate ambiguity between studies. We have identified a paucity of information on the contribution of key biological processes, including appetite regulation, post-ruminal nutrient absorption, and cellular energetics and metabolism, to the efficiency of feed utilisation in cattle. In addition, insufficient information exists on the relationship between RFI status and productivity-related traits at pasture, a concept critical to the overall lifecycle of beef production systems. Overall, published data on the effect of RFI status on both terminal and maternal traits, coupled with the moderate repeatability and heritability of the trait, suggest that breeding for improved RFI, as part of a multi-trait selection index, is both possible and cumulative, with benefits evident throughout the production cycle. Although the advent of genomic selection, with associated improved prediction accuracy, will expedite the introgression of elite genetics for feed efficiency within beef cattle populations, there are challenges associated with this approach which may, in the long term, be overcome by increased international collaborative effort but, in the short term, will not obviate the ongoing requirement for accurate measurement of the primary phenotype.
The aim of this study was to analyse cow reproductive performance on 37 Irish suckler beef farms and determine how reproductive efficiency influences farm profitability. The main reproductive factors associated with gross output value per livestock unit (GO/LU) were average age at first calving (r=−0.19, P<0.01) and number of months with a calving (r=−0.15, P<0.05). A 1 month increase in average age at first calving was shown to reduce GO/LU by €14 across suckler farms. Average age at first calving was positively correlated with calving interval (r=0.21, P<0.001) and the number of months with a calving (r=0.18, P<0.01). Number of months with a calving was also positively correlated with calf mortality (r=0.21, P<0.01). However, these relationships between reproductive variables had no statistically significant impact on farm financial performance. It is therefore concluded that additional analysis at animal level is required to determine key reproductive indicators contributing to farm profitability.
This study aimed to examine the effects of replacing rolled barley (high in starch) with citrus pulp (high in digestible fibre) in a supplement on intake and performance of young growing cattle offered grass silage ad libitum for 101 days. Weaned, early- and late-maturing breed, male suckled beef calves (n=120) were blocked by sire breed, gender and weight and from within block randomly assigned to one of two concentrate supplements based mainly on rolled barley (BAR) or citrus pulp (CIT) and formulated to have similar concentrations of true protein digestible in the small intestine. On day 87, blood samples were taken before and 2 h after feeding, and rumen fluid samples were collected 2 h post-feeding. Supplement type did not affect (P>0.05) grass silage intake, live weight gain, final live weight, ultrasonically assessed body composition or measurements of skeletal size. Rumen pH (6.64 v. 6.79), ammonia (51 v. 81 mg/l) and acetate-to-propionate ratio (2.7 v. 3.2) were lower (P<0.001) for CIT than BAR. In conclusion, citrus pulp can replace barley in concentrate supplements for growing cattle without negatively affecting performance.
This study aimed to compare the quality of beef from suckler bulls raised on a high-energy concentrate ration and slaughtered at different carcass weights (CW)/ages. In total, 42 spring-born, Charolais and Limousin-sired, weaned suckler bulls were provided with a finishing diet of ad libitum concentrates and grass silage until they reached target CW of 340, 380 and 420 kg. Intramuscular fat (IMF) content tended (P<0.06) to be higher for 420 kg CW than for 380 and 340 kg CW. Sensory tenderness was lower (P<0.001) for 420 kg CW than for 380 and 340 kg CW. Juiciness was higher (P<0.05) for 420 kg CW than for 380 kg CW. Flavour liking was higher (P<0.05) for 420 and 380 kg CW (which did not differ) than for 340 kg CW. Overall, an increase in CW resulted in a slight increase in IMF content which could be responsible for the increase in juiciness and flavour liking of the beef. An increase in CW led to a decrease in the tenderness of the beef even though the overall liking of the beef was not affected.
The performance of early-maturing breed sired suckler bulls finished at pasture, with or without concentrate supplementation, at 15 or 19 months of age was evaluated. In total, 60 Aberdeen Angus-sired bulls were assigned to a two (slaughter age (SA): 15 (S15) or 19 (S19) months)×two (finishing strategies (FS): grass only or grass+barley-based concentrate) factorial arrangement. There were no (P>0.05) SA×FS interactions. Increasing SA increased carcass weight (265 v. 355 kg), kill-out proportion (542 v. 561 g/kg), conformation (6.7 v. 8.3, 1 to 15) (P<0.001) and fat (5.8 v. 6.8) scores (P<0.01), and resulted in yellower subcutaneous fat (‘b’ value, 6.6 v. 8.3) and darker muscle (‘L’ value, 30.0 v. 28.3) (P<0.01). Supplementation reduced estimated herbage intake by 0.60 and 0.47 kg dry matter (DM)/kg DM of concentrates for S15 and S19, respectively. Supplementation increased carcass weight (+6.7%, P<0.001) and kill-out proportion (+1.8%, P=0.06) but had no effect on carcass fat and conformation scores or fat and muscle colour. In conclusion, carcasses were adequately finished, with or without concentrates for S19, but not for S15. Supplementation had no effect, and age had relatively minor effects, on fat and muscle colour.
This experiment aimed to assess the effect of different indoor winter growth rates (WGR) followed by different concentrate supplementation levels at pasture on meat quality of 90 bulls. During the first winter, bulls were offered grass silage ad libitum and either 3 kg (WGR3) or 6 kg (WGR6) of concentrates. After turn-out to pasture, bulls were offered: grass without supplementation (PO), grass plus 0.2 predicted dry matter intake (DMI) as concentrates (PL) or grass plus 0.4 predicted DMI as concentrates (PH). After finishing, colour, chemical composition (unaged), instrumental texture and sensory characteristics (14 days of ageing) of longissimus thoracis were measured. WGR6 bulls had heavier carcasses than WGR3 bulls. There was an interaction between WGR and supplementation for instrumental texture and redness (a). Within WGR3, PO beef was the most tender, whereas within WGR6, PL was the most tender. However, these differences were not detected by the sensory panel. Within WGR3, redness was the lowest for PL, whereas within WGR6, PO was the least red. No differences were found for chemical composition. The multivariate analysis highlighted WGR as the main variable affecting meat quality characteristics. In conclusion, variations in growth path exerted minor effects on appearance and instrumental texture, which did not affect the perception of bull beef by a trained sensory panel.
Accommodating cattle indoors during the winter is widely practiced throughout Europe. There is currently no legislation surrounding the space allowance and floor type that should be provided to cattle during this time; however, concerns have been raised regarding the type of housing systems currently in use. The objective of the study was to investigate the effect of space allowance and floor type on performance and welfare of finishing beef heifers. Continental crossbred heifers (n=240; mean initial live weight, 504 (SD 35.8) kg) were blocked by breed, weight and age and randomly assigned to one of four treatments: (i) 3.0 m2, (ii) 4.5 m2 and (iii) 6.0 m2 space allowance per animal on a fully slatted concrete floor and (iv) 6.0 m2 space allowance per animal on a straw-bedded floor, for 105 days. Heifers were offered a total mixed ration ad libitum. Dry matter intake was recorded on a pen basis and refusals were weighed back twice weekly. Heifers were weighed, dirt scored and blood sampled every 3 weeks. Whole blood was analysed for complete cell counts and serum samples were assayed for metabolite concentrations. Behaviour was recorded continuously using IR cameras from days 70 to 87. Heifers’ hooves were inspected for lesions at the start of the study and again after slaughter. Post-slaughter, carcass weight, conformation and fat scores and hide weight were recorded. Heifers housed at 4.5 m2 had a greater average daily live weight gain (ADG) than those on both of the other concrete slat treatments; however, space allowance had no effect on carcass weight. Heifers accommodated on straw had a greater ADG (0.15 kg) (P<0.05), greater hide weight (P<0.01), better feed conversion ratio (P<0.05) and greater dirt scores (P<0.05) at slaughter than heifers accommodated on concrete slats at 6.0 m2. The number of heifers lying at any one time was greater (P<0.001) on straw than on concrete slats. Space allowance and floor type had no effect on the number of hoof lesions gained or on any of the haematological or metabolic variables measured. It was concluded that increasing space allowance above 3.0 m2/animal on concrete slats was of no benefit to animal performance but it did improve animal cleanliness. Housing heifers on straw instead of concrete slats improved ADG and increased lying time; however, carcass weight was not affected.