Background
Patients with posttraumatic stress disorder (PTSD) exhibit smaller volumes in commonly reported brain regions, including the amygdala and hippocampus, which are associated with fear and memory processing. In the current study, we conducted a voxel-based morphometry (VBM) meta-analysis using whole-brain statistical maps with neuroimaging data from the ENIGMA-PGC PTSD working group.
Methods
T1-weighted structural neuroimaging scans from 36 cohorts (PTSD n = 1309; controls n = 2198) were processed using a standardized VBM pipeline (ENIGMA-VBM tool). We meta-analyzed the resulting statistical maps for voxel-wise differences in gray matter (GM) and white matter (WM) volumes between PTSD patients and controls, performed subgroup analyses considering the trauma exposure of the controls, and examined associations between regional brain volumes and clinical variables including PTSD (CAPS-4/5, PCL-5) and depression severity (BDI-II, PHQ-9).
Results
PTSD patients exhibited smaller GM volumes across the frontal and temporal lobes, and cerebellum, with the most significant effect in the left cerebellum (Hedges’ g = 0.22, p_corrected = .001), and smaller cerebellar WM volume (peak Hedges’ g = 0.14, p_corrected = .008). We observed similar regional differences when comparing patients to trauma-exposed controls, suggesting these structural abnormalities may be specific to PTSD. Regression analyses revealed PTSD severity was negatively associated with GM volumes within the cerebellum (p_corrected = .003), while depression severity was negatively associated with GM volumes within the cerebellum and superior frontal gyrus in patients (p_corrected = .001).
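As a concrete illustration of the effect-size metric reported above, the following minimal Python sketch computes Hedges’ g for a single two-group comparison. This is a deliberate simplification: the actual ENIGMA-VBM analysis is voxel-wise, adjusts for covariates, and pools cohort-level estimates in a meta-analysis, and the data below are hypothetical.

import numpy as np

def hedges_g(patients, controls):
    # Pooled-SD standardized mean difference with the small-sample
    # bias-correction factor J = 1 - 3 / (4 * (n1 + n2) - 9).
    n1, n2 = len(patients), len(controls)
    m1, m2 = np.mean(patients), np.mean(controls)
    v1 = np.var(patients, ddof=1)
    v2 = np.var(controls, ddof=1)
    s_pooled = np.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    j = 1 - 3 / (4 * (n1 + n2) - 9)
    return j * (m1 - m2) / s_pooled

# Hypothetical gray matter volumes (arbitrary units), with group sizes
# matching the study (PTSD n = 1309, controls n = 2198).
rng = np.random.default_rng(0)
print(hedges_g(rng.normal(9.8, 1.0, 1309), rng.normal(10.0, 1.0, 2198)))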
Conclusions
PTSD patients exhibited widespread regional differences in brain volumes, with greater regional deficits appearing to reflect more severe symptoms. Our findings add to the growing literature implicating the cerebellum in PTSD psychopathology.
Multicenter clinical trials are essential for evaluating interventions but often face significant challenges in study design, site coordination, participant recruitment, and regulatory compliance. To address these issues, the National Institutes of Health’s National Center for Advancing Translational Sciences established the Trial Innovation Network (TIN). The TIN offers a scientific consultation process, providing access to clinical trial and disease experts who offer input and recommendations throughout the trial’s duration, at no cost to investigators. This approach aims to improve trial design, accelerate implementation, foster interdisciplinary teamwork, and spur innovations that enhance multicenter trial quality and efficiency. The TIN leverages resources of the Clinical and Translational Science Awards (CTSA) program, complementing local capabilities at the investigator’s institution. The Initial Consultation process focuses on the study’s scientific premise, design, site development, recruitment and retention strategies, funding feasibility, and other support areas. As of June 1, 2024, the TIN has provided 431 Initial Consultations to increase efficiency and accelerate trial implementation by delivering customized support and tailored recommendations. Across a range of clinical trials, the TIN has developed standardized, streamlined, and adaptable processes. We describe these processes, provide operational metrics, and include a set of lessons learned for consideration by other trial support and innovation networks.
The First Large Absorption Survey in H i (FLASH) is a large-area radio survey for neutral hydrogen in and around galaxies in the intermediate redshift range $0.4\lt z\lt1.0$, using the 21-cm H i absorption line as a probe of cold neutral gas. The survey uses the ASKAP radio telescope and will cover 24,000 deg$^2$ of sky over the next five years. FLASH breaks new ground in two ways – it is the first large H i absorption survey to be carried out without any optical preselection of targets, and we use an automated Bayesian line-finding tool to search through large datasets and assign a statistical significance to potential line detections. Two Pilot Surveys, covering around 3000 deg$^2$ of sky, were carried out in 2019-22 to test and verify the strategy for the full FLASH survey. The processed data products from these Pilot Surveys (spectral-line cubes, continuum images, and catalogues) are public and available online. In this paper, we describe the FLASH spectral-line and continuum data products and discuss the quality of the H i spectra and the completeness of our automated line search. Finally, we present a set of 30 new H i absorption lines that were robustly detected in the Pilot Surveys, almost doubling the number of known H i absorption systems at $0.4\lt z\lt1$. The detected lines span a wide range in H i optical depth, including three lines with a peak optical depth $\tau\gt1$, and appear to be a mixture of intervening and associated systems. Interestingly, around two-thirds of the lines found in this untargeted sample are detected against sources with a peaked-spectrum radio continuum, which are only a minor (5–20%) fraction of the overall radio-source population. The detection rate for H i absorption lines in the Pilot Surveys (0.3 to 0.5 lines per 40 deg$^2$ ASKAP field) is a factor of two below the expected value. One possible reason for this is the presence of a range of spectral-line artefacts in the Pilot Survey data that have now been mitigated and are not expected to recur in the full FLASH survey. A future paper in this series will discuss the host galaxies of the H i absorption systems identified here.
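For reference, the quoted peak optical depths relate to the observed absorption via the standard 21-cm relations (a general result, not specific to this survey): the peak optical depth is $\tau_{\rm peak} = -\ln(1 - \Delta S_{\rm peak}/(c_f\,S_{\rm cont}))$, where $\Delta S_{\rm peak}$ is the peak absorbed flux density, $S_{\rm cont}$ the background continuum flux density and $c_f$ the covering factor; the corresponding H i column density is $N_{\rm HI} \approx 1.823\times10^{18}\,(T_{\rm spin}/c_f)\int \tau(v)\,{\rm d}v$ cm$^{-2}$, with the spin temperature $T_{\rm spin}$ in K and velocity in km s$^{-1}$. Note that neither $c_f$ nor $T_{\rm spin}$ is constrained by the absorption line alone.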
We present the first results from a new backend on the Australian Square Kilometre Array Pathfinder, the Commensal Realtime ASKAP Fast Transient COherent (CRACO) upgrade. CRACO records millisecond time resolution visibility data and searches for dispersed fast transient signals, including fast radio bursts (FRBs), pulsars, and ultra-long period objects (ULPOs). With the visibility data, CRACO can localise transient events to arcsecond-level precision after detection. Here, we describe the CRACO system and report results from a sky survey carried out by CRACO at 110-ms resolution during its commissioning phase. During the survey, CRACO detected two FRBs (including one discovered solely with CRACO, FRB 20231027A), reported more precise localisations for four pulsars, discovered two new RRATs, and detected one known ULPO, GPM J1839$-$10, through its sub-pulse structure. We present a sensitivity calibration of CRACO, finding that it achieves the expected sensitivity of 11.6 Jy ms to bursts of 110 ms duration or less. CRACO is currently running at 13.8 ms time resolution and aims to reach 1.7 ms time resolution before the end of 2024. At that resolution, CRACO has an expected sensitivity of 1.5 Jy ms to bursts of 1.7 ms duration or less and can detect $10\times$ more FRBs than the current CRAFT incoherent sum system (i.e. 0.5–2 localised FRBs per day), enabling us to better constrain the models for FRBs and use them as cosmological probes.
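The quoted sensitivities are mutually consistent under the usual radiometer-equation scaling, in which the limiting fluence for a matched-duration burst scales as the square root of the time resolution: $F_{\min}(\Delta t) \simeq F_{\min}(\Delta t_0)\sqrt{\Delta t/\Delta t_0}$, so $11.6~{\rm Jy\,ms}\times\sqrt{1.7/110}\approx 1.4~{\rm Jy\,ms}$, in line with the quoted 1.5 Jy ms expectation.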
Background
Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
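For readers unfamiliar with PRS, the core computation is a weighted sum of risk-allele dosages. The Python sketch below is a generic illustration; the five-SNP example is hypothetical, and a real pipeline additionally involves SNP selection and linkage-disequilibrium adjustment before scoring.

import numpy as np

def polygenic_risk_score(dosages, betas):
    # PRS_i = sum_j beta_j * g_ij, where g_ij in [0, 2] is the
    # (possibly imputed) allele dosage of individual i at SNP j and
    # beta_j is the GWAS effect size (log-odds per allele).
    return float(np.dot(betas, dosages))

# Hypothetical example with 5 SNPs for one individual.
betas = np.array([0.02, -0.01, 0.03, 0.00, 0.05])
dosages = np.array([1.0, 2.0, 0.0, 1.0, 2.0])
print(polygenic_risk_score(dosages, betas))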
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support the view that controls, MDD patients and BPD patients primarily lie on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
New technologies and disruptions related to coronavirus disease 2019 (COVID-19) have led to the expansion of decentralized approaches to clinical trials. Remote tools and methods hold promise for increasing trial efficiency and reducing burdens and barriers by facilitating participation outside of traditional clinical settings and taking studies directly to participants. The Trial Innovation Network, established in 2016 by the National Center for Advancing Translational Sciences to address critical roadblocks in clinical research and accelerate the translational research process, has consulted on over 400 research study proposals to date. Its recommendations for decentralized approaches have included eConsent, participant-informed study design, remote intervention, study task reminders, social media recruitment, and return of results for participants. Some clinical trial elements have worked well when decentralized, while others, including remote recruitment and patient monitoring, need further refinement and assessment to determine their value. Partially decentralized, or “hybrid,” trials offer a first step to optimizing remote methods. Decentralized processes demonstrate potential to improve urban-rural diversity, but their impact on inclusion of racially and ethnically marginalized populations requires further study. To optimize inclusive participation in decentralized clinical trials, efforts must be made to build trust among marginalized communities and to ensure access to remote technology.
Bovine tuberculosis (bTB) is a chronic, infectious and zoonotic disease of domestic and wild animals caused mainly by Mycobacterium bovis. This study investigated farm management factors associated with recurrent bTB herd breakdowns (n = 2935) disclosed in the period 23 May 2016 to 21 May 2018, and is a follow-up to our 2020 paper on long-duration bTB herd breakdowns. A case-control study design was used to construct an explanatory set of farm-level management factors associated with recurrent bTB herd breakdowns. In Northern Ireland, a Department of Agriculture, Environment and Rural Affairs (DAERA) veterinarian investigates bTB herd breakdowns using standardised guidelines to allocate a disease source. In this study, source was strongly linked to carryover of infection, suggesting that the diagnostic tests had failed to clear herd infection during the breakdown period. Other factors associated with recurrent bTB herd breakdowns were herd size and herd type (dairy herds accounted for 43% of cases), two variables that are intrinsically linked. Further associated risk factors were the time of slurry application, badger access to silage clamps, badger setts in the locality, cattle grazing silage fields immediately post-harvest, the number of parcels of land the farmer associated with bTB, the number of land parcels used for grazing, and the region of the country.
This article is a clinical guide which discusses the “state-of-the-art” usage of the classic monoamine oxidase inhibitor (MAOI) antidepressants (phenelzine, tranylcypromine, and isocarboxazid) in modern psychiatric practice. The guide is for all clinicians, including those who may not be experienced MAOI prescribers. It discusses indications, drug-drug interactions, side-effect management, and the safety of various augmentation strategies. There is a clear and broad consensus (more than 70 international expert endorsers), based on 6 decades of experience, for the recommendations set out herein. They are based on empirical evidence and expert opinion; this guide is presented as a new specialist-consensus standard. The guide provides practical clinical advice, and is the basis for the rational use of these drugs, particularly because it improves and updates knowledge and corrects the various misconceptions that have hitherto been prominent in the literature, partly due to insufficient knowledge of pharmacology. The guide suggests that MAOIs should always be considered in cases of treatment-resistant depression (including those melancholic in nature) and prior to electroconvulsive therapy, while taking account of patient preference. In selected cases, they may be considered earlier in the treatment algorithm than has previously been customary, and should not be regarded as drugs of last resort; they may prove decisively effective when many other treatments have failed. The guide clarifies key points on the concomitant use of incorrectly proscribed drugs such as methylphenidate and some tricyclic antidepressants. It also illustrates the straightforward “bridging” methods that may be used to transition simply and safely from other antidepressants to MAOIs.
The role of the Eurasian badger (Meles meles) as a wildlife host has complicated the management of bovine tuberculosis (bTB) in cattle. Badger ranging behaviour has previously been found to be altered by culling of badgers and has been suggested to increase the transmission of bTB either among badgers or between badgers and cattle. In 2014, a five-year bTB intervention research project in a 100 km² area in Northern Ireland was initiated involving selective removal of dual path platform (DPP) VetTB (immunoassay) test positive badgers and vaccination followed by release of DPP test negative badgers (‘Test and Vaccinate or Remove’). Home range sizes, based on position data obtained from global positioning system collared badgers, were compared between the first year of the project, where no DPP test positive badgers were removed, and follow-up years 2–4 when DPP test positive badgers were removed. A total of 105 individual badgers were followed over 21 200 collar tracking nights. Using multivariable analyses, neither annual nor monthly home ranges differed significantly in size between years, suggesting they were not significantly altered by the bTB intervention that was applied in the study area.
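The abstract does not state which home-range estimator was used, so as a generic illustration the Python sketch below computes a minimum convex polygon (MCP) home range, one common estimator, from projected GPS fixes; the simulated fixes are hypothetical.

import numpy as np
from scipy.spatial import ConvexHull

def mcp_area_km2(fixes_xy_m):
    # Minimum convex polygon area from (n, 2) projected coordinates
    # in metres; for 2-D input, ConvexHull.volume is the enclosed area.
    hull = ConvexHull(fixes_xy_m)
    return hull.volume / 1e6  # m^2 -> km^2

# Hypothetical GPS fixes for one collared badger (metres on a local grid).
rng = np.random.default_rng(1)
fixes = rng.normal(0.0, 400.0, size=(200, 2))
print(f"MCP home range: {mcp_area_km2(fixes):.2f} km^2")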
This study determined farm management factors associated with long-duration bovine tuberculosis (bTB) breakdowns disclosed in the period 23 May 2016 to 21 May 2018, in a study area in Northern Ireland not previously subject to investigation. A farm-level epidemiological investigation (n = 2935) was completed when one or more Single Intradermal Comparative Cervical Test (SICCT) reactors were disclosed, or when one or more confirmed lesions (positive histological and/or bacteriological result) were found at routine slaughter. A case-control study design was used to construct an explanatory set of management factors associated with long-duration bTB herd breakdowns, with a case (n = 191) defined as an investigation into a breakdown of 365 days or longer. Purchase of infected animal(s) had the strongest association as the most likely source of infection for long-duration bTB herd breakdowns, followed by badgers and then cattle-to-cattle contiguous herd spread. However, 73.5% (95% CI 61.1–85.9%) of the herds whose most likely source was purchased infection were beef fattening herds. This result demonstrates two subpopulations of prolonged bTB breakdowns: the first being beef fattening herds, whose main source was the continued purchase of infected animals, and the second being primary production herds (dairy, beef cows and mixed) at risk from multiple sources.
Biodiversity offsetting aims to achieve at least no net loss of biodiversity by fully compensating for residual development-induced biodiversity losses after the mitigation hierarchy (avoid, minimize, remediate) has been applied. Actions used to generate offsets can include securing site protection, or maintaining or enhancing the condition of targeted biodiversity at an offset site. Protection and maintenance actions aim to prevent future biodiversity loss, so such offsets are referred to as averted loss offsets. However, the benefits of such approaches can be highly uncertain and opaque, because assumptions about the change in likelihood of loss as a result of the offset action are often implicit. As a result, the gain generated by averting losses can be intentionally or inadvertently overestimated, leading to offset outcomes that are insufficient for achieving no net loss of biodiversity. We present a method and decision tree to guide consistent and credible estimation of the likelihood of biodiversity loss for a proposed offset site with and without protection, for use when calculating the amount of benefit associated with the protection component of averted loss offsets. In circumstances such as when a jurisdictional offset policy applies to most impacts, plausible estimates of averted loss can be very low. Averting further loss of biodiversity is desirable, and averted loss offsets can be a valid approach for generating tangible gains. However, overestimation of averted loss benefits poses a major risk to biodiversity.
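One common way to formalise the averted-loss benefit described here (a general illustration; the paper’s own calculation may differ in detail) is as the expected difference in loss probabilities: if $B$ is the biodiversity value of the offset site, and $P_{\rm without}$ and $P_{\rm with}$ are the estimated probabilities of losing that biodiversity over the policy horizon without and with protection, the creditable gain is $G = B\,(P_{\rm without} - P_{\rm with})$. When a jurisdictional policy already protects most sites from the impacts in question, $P_{\rm without}$ is small and the plausible gain $G$ is correspondingly small.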
Fully slatted concrete floors are labour-efficient, cost-effective and thus common in beef cattle housing. However, the welfare of cattle accommodated on them has been questioned. The objective of this study was to evaluate the effect of floor and diet on hoof health and lying behaviours of housed dairy-origin bulls, from a mean age of 8 months to slaughter at 15.5 months old. Forty-eight bulls, which had a mean initial live weight of 212 (SD = 23.7) kg, were allocated to one of four treatments, which consisted of two floors and two diets arranged in a 2 × 2 factorial design. The floors evaluated were a fully slatted concrete floor and a fully slatted concrete floor overlaid with rubber, while the diets offered were either a high concentrate diet or a grass-silage-based diet supplemented with concentrates. The mean total duration of the study was 216 days. Floor had no significant effect on claw measurements taken on day 62 or 139. However, bulls accommodated on slats overlaid with rubber tended to have a greater front toe length pre-slaughter than those accommodated on concrete slats (P = 0.063). Floor had no significant effect on the net growth of toes or heels over the duration of the study. The number of bruises (P < 0.01) and the bruising score (P < 0.05) on day 62 were significantly higher in bulls accommodated on fully slatted concrete floors than in those on concrete slats overlaid with rubber, but there was no significant effect of floor on these parameters on day 139 or at the pre-slaughter measurement. There was a tendency for bulls accommodated on concrete slats to have a higher probability of sole bruising at the end of the experiment than those accommodated on slats overlaid with rubber (P = 0.052). Diet had no significant effect on toe length or heel height, number of bruises, or overall bruising score at any time point of the study. There was little evidence in the current study to suggest that bulls lying on fully slatted concrete floors could not express lying postures similar to those on concrete slats overlaid with rubber.
Fully slatted concrete floors are prevalent in beef cattle housing. However, concerns have been raised about the welfare of cattle accommodated on slats. The objective of this study was to evaluate the effect of diet and floor type on the intake, performance and cleanliness of dairy-origin bulls from a mean age of 8 months to slaughter at 15.5 months old. Forty-eight bulls, which had a mean initial live weight of 212 kg (SD = 23.7), were allocated to one of four treatments which consisted of two floors and two diets, arranged in a 2×2 factorial design. The floors evaluated were a fully slatted concrete floor and a fully slatted concrete floor covered with rubber, while the diets offered were either a high concentrate diet or a grass silage-based diet supplemented with concentrates. Over the entire experimental period, floor type had no significant effect on intake. Interestingly, however, when bulls were offered concentrates ad libitum, those accommodated on rubber covered slats consumed more concentrates than those accommodated on concrete slats. No effect of floor type on intake was noted when bulls were offered the grass silage supplemented with concentrate diet. There were no significant interactions between floor and diet on animal performance. Animals accommodated on rubber covered slats had a significantly better performance than those accommodated on concrete slats, as assessed by live weight at slaughter and live weight gain/day (P < 0.01) and estimated carcass gain/day (P < 0.05). The diet offered had no significant effect on animal performance. Bulls accommodated on rubber covered slats were significantly cleaner than those accommodated on concrete slats on day 97 (P < 0.001), but there was no significant effect of floor type at other time points in the experiment. It is concluded from this study that diet plays an important role in determining bulls’ performance response to covering concrete slatted floors with rubber. Bulls offered a high concentrate diet had a higher concentrate intake and higher performance, but a similar feed conversion ratio (FCR), when accommodated on rubber covered slats compared with those accommodated on fully slatted concrete floors. Animals offered this intensive diet were less efficient (as measured by a higher FCR) than those offered a supplemented grass silage-based diet.
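As an illustration of how a 2×2 factorial design of this kind is typically analysed, the Python sketch below fits a two-way ANOVA with a floor × diet interaction to simulated data. The data, effect sizes and variable names are hypothetical; the paper’s own statistical model is not specified in the abstract.

import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# 48 bulls: 2 floors (CS, RS) x 2 diets (HC, SIL), 12 per cell;
# response is daily live weight gain (kg/day), simulated with a
# small floor effect for illustration.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "floor": np.repeat(["CS", "RS"], 24),
    "diet": np.tile(np.repeat(["HC", "SIL"], 12), 2),
})
df["adg"] = 1.3 + 0.1 * (df["floor"] == "RS") + rng.normal(0, 0.15, 48)

# Two-way ANOVA with main effects and the floor x diet interaction.
model = ols("adg ~ C(floor) * C(diet)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))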
This study aimed to evaluate the effect of using different floor types to accommodate growing and finishing beef cattle on lameness. In all, 80 dairy origin bulls were blocked according to live weight and breed into 20 groups, and randomly allocated within groups to one of four treatments. The floor types studied were fully slatted flooring throughout the entire experimental period (CS); fully slatted flooring covered with rubber strips throughout the entire experimental period (RS); fully slatted flooring during the growing period followed by a solid floor covered with straw bedding during the finishing period (CS-S); and fully slatted flooring during the growing period followed by fully slatted flooring covered with rubber strips during the finishing period (CS-RS). The total duration of the study was 204 days. The first 101 days was defined as the growing period, with the remainder of the study defined as the finishing period. During the growing period, there was a tendency for bulls accommodated on CS to have a higher locomotion score compared with those accommodated on RS (P=0.059). However, floor type had no significant effect on locomotion score during the finishing period. There was also no significant effect of floor type on digital dermatitis during either the growing or the finishing period. Floor type had no significant effect on swelling at the leg joints at the end of the finishing period. Bulls accommodated on RS had the lowest probability of bruised soles during both the growing and finishing periods (P<0.01). Growing bulls accommodated on CS had significantly greater front heel height net growth compared with those accommodated on RS (P<0.05). However, bulls accommodated on RS had a tendency to have greater front toe net growth compared with those accommodated on CS (P=0.087). Finishing bulls accommodated on CS-RS had the greatest front toe net growth (P<0.001). Heel height net growth was greatest in bulls accommodated on CS-S (P<0.001). Floor type had no significant effect on mean maximum hoof temperature during the growing period. Finishing bulls accommodated on CS-S had a significantly lower mean maximum hoof temperature compared with those accommodated on any other floor type (P<0.001). The study concluded that rubber flooring is a suitable alternative to fully slatted flooring, reducing the prevalence of bruised soles. Despite greater toe net growth in bulls accommodated on rubber flooring, there was no effect of floor type on locomotion score, suggesting that increased toe net growth does not adversely affect walking ability. In addition, although mean maximum hoof temperature was lowest in bulls accommodated on straw bedding, there was no evidence to suggest this is indicative of improved hoof health.
Determination of the proportion of bovine tuberculosis (bTB) breakdowns attributed to a herd purchasing infected animals has not been previously quantified using data from the Animal and Public Health Information System (APHIS) database in Northern Ireland. We used a case–control study design to account for the infection process occurring in the disclosing bTB breakdown herds. Cases (N = 6926) were cattle moving to a future confirmed bTB breakdown where they would disclose as a confirmed bTB reactor or a Lesion at Routine Slaughter (LRS). Controls (N = 303 499) were cattle moving to a future confirmed bTB breakdown where they did not become a bTB reactor or LRS. Our study showed that cattle leaving herds which disclosed bTB within 450 days had increased odds of becoming a confirmed bTB reactor or LRS compared with cattle which left herds that remained free for 450 days (odds ratio (OR) = 2.09; 95% CI 1.96–2.22). Of the 12 060 confirmed bTB breakdowns included in our study (2007–2015 inclusive), 31% (95% CI 29.8–31.5) contained a confirmed bTB reactor(s) or LRS(s) at the disclosing test which entered the herd within the previous 450 days. After controlling for the infection process occurring in the disclosing bTB breakdown herd, our study showed that 6.4% (95% CI 5.9–6.8) of bTB breakdowns in Northern Ireland were directly attributable to the movement of infected animals.
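For illustration, an odds ratio of the kind reported above, with a Wald 95% confidence interval, can be computed from a 2×2 exposure table as in the Python sketch below. The counts shown are hypothetical; the study’s estimate was derived from the APHIS movement data within a case–control framework, not from a raw table.

import math

def odds_ratio_wald_ci(a, b, c, d, z=1.96):
    # 2x2 table: a = exposed cases, b = exposed controls,
    #            c = unexposed cases, d = unexposed controls.
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

# Hypothetical counts.
print(odds_ratio_wald_ci(120, 880, 60, 940))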
The aim of this study was to evaluate the effect of using different floor types to accommodate growing and finishing beef cattle on their performance, cleanliness, carcass characteristics and meat quality. In total, 80 dairy origin young bulls (mean initial live weight 224 kg (SD=28.4 kg)) were divided according to live weight into 20 blocks of four animals each. The total duration of the experimental period was 204 days. The first 101 days was defined as the growing period, with the remainder of the study defined as the finishing period. Cattle were randomly assigned within blocks to one of four floor type treatments: fully slatted flooring throughout the entire experimental period (CS); fully slatted flooring covered with rubber strips throughout the entire experimental period (RS); fully slatted flooring during the growing period followed by a solid floor covered with straw bedding during the finishing period (CS-S); and fully slatted flooring during the growing period followed by fully slatted flooring covered with rubber strips during the finishing period (CS-RS). Bulls were offered grass silage ad libitum, supplemented with concentrates, during the growing period. During the finishing period, bulls were offered concentrates supplemented with chopped barley straw. There was no significant effect of floor type on total dry matter intake (DMI), feed conversion ratio, daily live weight gain or back fat depth during the growing and finishing periods. Compared with bulls accommodated on CS, RS and CS-RS, bulls accommodated on CS-S had a significantly lower straw DMI (P<0.01). Although bulls accommodated on CS and CS-S were significantly dirtier compared with those accommodated on RS and CS-RS on days 50 (P<0.05) and 151 (P<0.01), there was no effect of floor type on the cleanliness of bulls at the end of the growing and finishing periods. There was also no significant effect of floor type on carcass characteristics or meat quality. However, bulls accommodated on CS-S tended to have less channel, cod and kidney fat (P=0.084) compared with those accommodated on CS, RS and CS-RS. Overall, floor type had no effect on the performance, cleanliness, carcass characteristics or meat quality of growing or finishing beef cattle.
The aim of this 3-year study was to compare two suckler cow genotypes, namely Limousin×Holstein (LH) (sourced from the dairy herd) and Stabiliser (ST) (a composite breed), in terms of performance at calving. Both dam genotypes were bred to a ST sire and calved in spring/early summer. There was no significant effect of dam genotype on concentrations of casein, lactose, protein or urea nitrogen in the colostrum. Colostrum from LH cows had a significantly higher fat concentration compared with that from ST cows (P<0.05). Dam genotype had no effect on the incidence of calving difficulty, cow temperament or mothering ability score. There was a significant difference in milk supply scores between the two breeds of cows when the 3 years of data were combined (P=0.002), with a higher percentage of LH cows having a plentiful supply of milk compared with ST cows and, conversely, a higher percentage of ST cows having limited milk compared with LH cows. However, this was not a consistent effect over the 3 years. This study demonstrated that both dam breeds exhibit high maternal attributes at calving. However, further work is required to investigate whether LH cows have a more plentiful milk supply, since this has the potential to influence the growth rate of progeny.
The most important factors known to influence the eating quality of beef are well established and include both pre- and post-slaughter events, with many of the determinants interacting with each other. A substantial programme of work has been conducted by the Agri-Food and Biosciences Institute in Northern Ireland aimed at quantifying those factors of most importance to the local beef industry. In the Northern Ireland studies, post-slaughter effects such as carcase chilling and electrical stimulation, ageing, carcase hanging and cooking method were shown to have a greater impact on eating quality than pre-slaughter activities such as animal handling and lairage time. However, animal breed, particularly the use of dairy-breed animals, was also shown to significantly improve eating quality. Many of these factors were found to interact with each other.
This study aimed to evaluate levels of beef cow fertility using calving interval (CI; measured in days) as a measure, and to investigate the effects of breed, season, year and progeny gender on CI. The CI data comprised 273 764 records collected between 1997 and 2012 and included the seven most common breeds (and their crosses) in Northern Ireland (Charolais, Limousin, Belgian Blue, Simmental, Blonde d’Aquitaine, Aberdeen Angus and Hereford), accounting for 94.1% of beef dams recorded. Mean CI for all cows was 395 days, 30 days longer than the optimum of 365 days. Charolais and Belgian Blue dams had the longest CI (P<0.05). Cows older than 144 months had a longer CI (P<0.05) compared with cows younger than 144 months. Charolais sires had a shorter subsequent CI of 392 days (P<0.05) compared with the other breeds. Cows calving in June had the shortest subsequent CI (376 days; P<0.05), whereas cows calving in November had the longest subsequent CI (410 days). Progeny gender did not significantly affect CI. This study establishes that the level of beef cow fertility in Northern Ireland, using CI as a measure, is suboptimal and that there are opportunities for improvement. Factors identified as influencing CI included dam breed, sire breed and month of parturition. This knowledge can be used to direct breeding programmes and inform knowledge transfer protocols to improve the sustainability of beef production.
To assess the impact of farm management on herd fertility, a survey of 105 beef farms in Northern Ireland was conducted to establish the relationship between management variables and fertility. Each herd’s average calving interval (CI) and the proportion of cows with a CI > 450 days (extended calving interval, ECI) were calculated to establish herd fertility. The relationship between each response variable (CI and proportion ECI) and each explanatory variable (respondents’ answers to the questionnaire) was examined using univariate linear regression analyses. Explanatory variables found to be associated with the response variables were then modelled against each response in turn using a fully automated multivariate stepwise regression algorithm employing forward selection with backward elimination. The optimum targets of a 365-day CI and 0 cows per hundred calved with ECI were not widely attained in the current study. The distribution of CI and proportion ECI in the current study suggests more realistic targets would be a 379-day CI and 5 cows per hundred calved with ECI in commercial beef breeding herds. Six management factors were found to be associated with herd fertility: herd vaccination, bull selection, fertility management, breeding female management, perception of the extension service (rural education provided by the government) and record keeping. Respondents who vaccinated cows had a reduction of 5 cows per hundred calved in the proportion of cows with ECI, and as the number of vaccines administered to a cow increased, the CI decreased. Regular vaccination of breeding bulls was associated with a 9-day reduction in CI. Bull selection strategy had several associations with herd fertility; most notable was that respondents who used visual selection rather than estimated breeding values (EBVs) to select bulls had a 15-day longer CI and a 7 cows per hundred calved higher proportion of cows with ECI. For each 0.01 increase in the proportion of cows served by artificial insemination, CI increased by 0.16 days. Respondents who rated their beef breeding herd fertility as ‘very good’ had lower ECI and CI than those who rated it as poor or satisfactory. Condition scoring of cows at weaning lowered ECI by 5 cows per hundred calved. Those who perceived the extension service to be very useful had the lowest CI and lowest ECI. Respondents who did not keep a record of CI to assess herd fertility had an 11-day longer CI and a 6 cows per hundred calved higher proportion ECI than those who did. In conclusion, the survey found a number of important variables linked to improved fertility, including selecting sires based on EBVs and using a robust vaccination programme.
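As a generic illustration of the stepwise procedure described above, the Python sketch below implements forward selection with backward elimination on OLS p-values. The entry and removal thresholds are assumptions; the survey’s exact criteria are not given here.

import pandas as pd
import statsmodels.api as sm

def stepwise_select(X, y, p_enter=0.05, p_remove=0.10):
    # X: DataFrame of candidate explanatory variables; y: response
    # (e.g. herd CI). Repeatedly adds the most significant remaining
    # variable, then drops any selected variable that has become
    # non-significant, until no further change occurs.
    selected = []
    while True:
        changed = False
        # Forward step: try each remaining variable in turn.
        pvals = pd.Series(dtype=float)
        for c in (c for c in X.columns if c not in selected):
            fit = sm.OLS(y, sm.add_constant(X[selected + [c]])).fit()
            pvals[c] = fit.pvalues[c]
        if not pvals.empty and pvals.min() < p_enter:
            selected.append(pvals.idxmin())
            changed = True
        # Backward step: drop the least significant selected variable.
        if selected:
            fit = sm.OLS(y, sm.add_constant(X[selected])).fit()
            pv = fit.pvalues.drop("const")
            if pv.max() > p_remove:
                selected.remove(pv.idxmax())
                changed = True
        if not changed:
            return selected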