Background: Ischemic stroke often results in long-term motor impairments due to disrupted corticospinal pathways. Transcranial magnetic stimulation (TMS) motor mapping is a non-invasive technique used to assess corticospinal integrity by measuring motor evoked potentials (MEPs). This study investigates whether MEP amplitudes can predict impairment severity and functional performance in chronic stroke. Methods: Four non-human primates (NHPs) with chronic stroke (> six months) following transient right middle cerebral artery occlusion underwent TMS motor mapping using neuronavigation under ketamine anesthesia. Single pulses of TMS (50-70% of maximum stimulator output) were applied to the affected and contralesional primary motor cortices to elicit MEPs and assess cortical excitability. Intramuscular electromyography recorded muscle responses from the biceps, extensor digitorum longus, and abductor pollicis brevis. Neurological dysfunction was evaluated daily for three weeks using the NHP Stroke Scale, NHP Upper Extremity Motor Dysfunction Scale, and the primate Rankin Scale. Results: MEPs were present in NHP1, NHP3, and NHP4 but absent in NHP2. Stronger MEPs correlated with lower impairment severity and better functional performance, while NHP2 exhibited higher impairment and poorer performance. Conclusions: MEP presence and strength can serve as biomarkers of motor recovery potential, highlighting their role in assessing corticospinal integrity and functional outcomes.
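As a concrete illustration of the core measurement in the abstract above, the sketch below computes a peak-to-peak MEP amplitude from a single EMG trace. This is a minimal sketch under assumed values (sampling rate, a 15–50 ms post-stimulus response window, synthetic data) and is not the study's actual analysis pipeline.

```python
import numpy as np

def mep_amplitude(emg, fs, stim_idx, win_ms=(15, 50)):
    """Peak-to-peak MEP amplitude in a post-stimulus window.

    emg      : 1-D EMG trace in mV (hypothetical single-trial recording)
    fs       : sampling rate in Hz
    stim_idx : sample index of the TMS pulse
    win_ms   : assumed window after the pulse in which the MEP is expected
    """
    start = stim_idx + int(win_ms[0] * fs / 1000)
    stop = stim_idx + int(win_ms[1] * fs / 1000)
    segment = emg[start:stop]
    return segment.max() - segment.min()

# Synthetic example: background noise plus a deflection ~20-30 ms post-pulse
fs, stim_idx = 5000, 1000
rng = np.random.default_rng(0)
emg = rng.normal(0, 0.01, 5000)
emg[stim_idx + 100 : stim_idx + 150] += 0.3   # simulated MEP
print(f"MEP amplitude: {mep_amplitude(emg, fs, stim_idx):.2f} mV")
```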
It remains unclear which individuals with subthreshold depression benefit most from psychological intervention, and what long-term effects this has on symptom deterioration, response and remission.
Aims
To synthesise psychological intervention benefits in adults with subthreshold depression up to 2 years, and explore participant-level effect-modifiers.
Method
Randomised trials comparing psychological intervention with inactive control were identified via systematic search. Authors were contacted to obtain individual participant data (IPD), analysed using Bayesian one-stage meta-analysis. Treatment–covariate interactions were added to examine moderators. Hierarchical-additive models were used to explore treatment benefits conditional on baseline Patient Health Questionnaire 9 (PHQ-9) values.
Results
IPD from 10 671 individuals (50 studies) were included. We found significant effects on depressive symptom severity up to 12 months (standardised mean difference [s.m.d.] = −0.48 to −0.27). Effects could not be ascertained up to 24 months (s.m.d. = −0.18). Similar findings emerged for 50% symptom reduction (relative risk = 1.27–2.79), reliable improvement (relative risk = 1.38–3.17), deterioration (relative risk = 0.67–0.54) and close-to-symptom-free status (relative risk = 1.41–2.80). Among participant-level moderators, only initial depression and anxiety severity were highly credible (P > 0.99). Predicted treatment benefits decreased with lower symptom severity but remained minimally important even for very mild symptoms (s.m.d. = −0.33 for PHQ-9 = 5).
Conclusions
Psychological intervention reduces the symptom burden in individuals with subthreshold depression up to 1 year, and protects against symptom deterioration. Benefits up to 2 years are less certain. We find strong support for intervention in subthreshold depression, particularly with PHQ-9 scores ≥ 10. For very mild symptoms, scalable treatments could be an attractive option.
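The analysis above is a Bayesian one-stage IPD meta-analysis with treatment–covariate interactions; as a rough frequentist sketch of the same structure, one can fit a mixed model with random study intercepts and a treatment × baseline interaction. All data and effect sizes below are simulated for illustration and are not the study's estimates.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated IPD: participants nested in studies, with a treatment effect
# that grows with baseline PHQ-9 (all numbers invented for illustration)
rng = np.random.default_rng(1)
n, k = 2000, 20
df = pd.DataFrame({
    "study": rng.integers(0, k, n),
    "treat": rng.integers(0, 2, n),
    "phq9_base": rng.integers(5, 15, n),
})
study_fx = rng.normal(0, 0.5, k)                      # between-study heterogeneity
df["phq9_fu"] = (df["phq9_base"]
                 - df["treat"] * (0.5 + 0.15 * (df["phq9_base"] - 10))
                 + study_fx[df["study"]] + rng.normal(0, 2, n))

# One-stage model: random study intercepts, treatment x baseline interaction
model = smf.mixedlm("phq9_fu ~ treat * phq9_base", df, groups=df["study"]).fit()
print(model.summary())
```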
Associations between childhood trauma, neurodevelopment, alcohol use disorder (AUD), and posttraumatic stress disorder (PTSD) are understudied during adolescence.
Methods
Using 1652 participants (51.75% female, baseline mean age = 14.3) from the Collaborative Study of the Genetics of Alcoholism, we employed latent growth curve models to (1) examine associations of childhood physical, sexual, and non-assaultive trauma (CPAT, CSAT, and CNAT) with repeated measures of alpha band EEG coherence (EEGc), and (2) assess whether EEGc trajectories were associated with AUD and PTSD symptoms. Sex-specific models accommodated sex differences in trauma exposure, AUD prevalence, and neural development.
Results
In females, CSAT was associated with higher mean levels of EEGc in left frontocentral (LFC, β = 0.13, p = 0.01) and interhemispheric prefrontal (PFI, β = 0.16, p < 0.01) regions, but diminished growth in LFC (β = −0.07, p = 0.02) and PFI (β = −0.07, p = 0.02). In males, CPAT was associated with lower mean levels (β = −0.17, p = 0.01) and increased growth (β = 0.11, p = 0.01) of LFC EEGc. Slope of LFC EEGc was inversely associated with AUD symptoms in females (β = −1.81, p = 0.01). Intercepts of right frontocentral and PFI EEGc were associated with AUD symptoms in males, but in opposite directions. Significant associations between EEGc and PTSD symptoms were also observed in trauma-exposed individuals.
Conclusions
Childhood assaultive trauma is associated with changes in frontal alpha EEGc and subsequent AUD and PTSD symptoms, though patterns differ by sex and trauma type. EEGc findings may inform emerging treatments for PTSD and AUD.
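A random-intercept, random-slope mixed model is a close frequentist cousin of the latent growth curve models used above: a trauma main effect shifts the mean level (intercept) and a trauma × time term shifts growth (slope). The sketch below uses simulated data and invented coefficients, purely to show the model form.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated repeated EEG-coherence measures for a growth-model sketch
rng = np.random.default_rng(2)
n_sub, n_visits = 300, 4
ids = np.repeat(np.arange(n_sub), n_visits)
time = np.tile(np.arange(n_visits), n_sub)
trauma = np.repeat(rng.integers(0, 2, n_sub), n_visits)
u0 = np.repeat(rng.normal(0, 0.3, n_sub), n_visits)    # subject-level intercepts
u1 = np.repeat(rng.normal(0, 0.05, n_sub), n_visits)   # subject-level slopes
eegc = (1.0 + 0.13 * trauma + (0.10 - 0.07 * trauma) * time
        + u0 + u1 * time + rng.normal(0, 0.2, n_sub * n_visits))
df = pd.DataFrame({"id": ids, "time": time, "trauma": trauma, "eegc": eegc})

# trauma -> mean level; trauma:time -> growth rate
fit = smf.mixedlm("eegc ~ trauma * time", df, groups=df["id"],
                  re_formula="~time").fit()
print(fit.summary())
```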
We present the first of two papers dedicated to verifying the Australian Epoch of Reionisation pipeline (AusEoRPipe) through simulation. The AusEoRPipe aims to disentangle 21-cm radiation emitted by gas surrounding the very first stars from contaminating foreground astrophysical sources and has been in development for close to a decade. In this paper, we build an accurate 21-cm sky model that can be used by the WODEN simulation software to create visibilities containing a predictable 21-cm signal. We verify that the power spectrum (PS) estimator CHIPS can recover this signal in the absence of foregrounds. We also investigate how measurements in Fourier space are correlated and how their gridded density affects the PS. We measure and fit for this effect using Gaussian-noise simulations of the Murchison Widefield Array (MWA) phase I layout. We find a gridding density correction factor of 2.651 appropriate for integrations equal to or greater than 30 minutes of data, which contain observations with multiple primary beam pointings and LSTs. Paper II of this series will use the results of this paper to test the AusEoRPipe in the presence of foregrounds and instrumental effects.
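As a toy, one-dimensional version of the gridded-noise experiment described above (the real analysis uses CHIPS on simulated MWA visibilities), the sketch below grids complex Gaussian noise samples and compares the weight-normalised power with the analytic expectation for independent noise. For independent samples the ratio sits near 1; correlated, densely gridded samples push it above 1, which is the effect the quoted 2.651 density correction factor compensates for.

```python
import numpy as np

rng = np.random.default_rng(3)
n_samples, n_cells, sigma = 200_000, 256, 1.0

# Noise-only "visibilities" scattered over a coarse 1-D grid of uv cells
cells = rng.integers(0, n_cells, n_samples)
vis = rng.normal(0, sigma, n_samples) + 1j * rng.normal(0, sigma, n_samples)

grid = np.zeros(n_cells, complex)
weights = np.zeros(n_cells)
np.add.at(grid, cells, vis)        # sum samples falling into each cell
np.add.at(weights, cells, 1.0)     # count samples per cell

# Weight-normalised power versus the expectation for a mean of N samples,
# each of variance 2*sigma^2 (real + imaginary parts)
power = np.abs(grid / np.maximum(weights, 1)) ** 2
expected = 2 * sigma**2 / weights.mean()
print(f"empirical/expected power ratio: {power.mean() / expected:.3f}")
```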
The Stop the Bleed course aims to improve bystander hemorrhage control skills and may be improved with point-of-care aids. We sought to create and examine a variety of cognitive aids to identify an optimal method to augment bystander hemorrhage control skills in an emergency scenario.
Methods:
Randomized trial of 346 college students. Effects of a visual or visual-audio aid on hemorrhage control skills were assessed through randomization into groups with and without prior training or familiarization with aids, compared with controls. Tourniquet placement, wound packing skills, and participant comfort were assessed during a simulated active shooter scenario.
Results:
A total of 325 (94%) participants were included in the final analyses. Participants who had attended training (odds ratio [OR], 12.67; P = 9.3 × 10⁻¹¹), were provided a visual-audio aid (OR, 1.96; P = 0.04), and were primed on their aid (OR, 2.23; P = 0.01) were superior in tourniquet placement, with fewer errors (P < 0.05). Using an aid did not improve wound packing scores compared with bleeding control training alone (P > 0.05). Aid use improved comfort and likelihood to intervene in emergency hemorrhage scenarios (P < 0.05).
Conclusions:
Cognitive aids can improve bystander hemorrhage control skills. Effects were strongest among participants who had previously been trained and who used an aid combining visual and audio feedback that they had been introduced to during their course training.
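The odds ratios above are consistent with a logistic regression of tourniquet placement success on training, aid type and priming. A minimal sketch with simulated data and invented coefficients follows; the study's actual covariate set and model specification may differ.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated trial data: binary predictors and a binary placement outcome
rng = np.random.default_rng(4)
n = 325
df = pd.DataFrame({
    "trained": rng.integers(0, 2, n),
    "audio_aid": rng.integers(0, 2, n),
    "primed": rng.integers(0, 2, n),
})
logit_p = -1.5 + 2.5 * df["trained"] + 0.7 * df["audio_aid"] + 0.8 * df["primed"]
df["success"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

fit = smf.logit("success ~ trained + audio_aid + primed", df).fit(disp=False)
odds_ratios = np.exp(fit.params)      # OR per predictor
conf_int = np.exp(fit.conf_int())     # 95% CI on the OR scale
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```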
Virtual reality has emerged as a unique educational modality for medical trainees. However, incorporation of virtual reality curricula into formal training programmes has been limited. We describe a multi-centre effort to develop, implement, and evaluate the efficacy of a virtual reality curriculum for residents participating in paediatric cardiology rotations.
Methods:
A virtual reality software program (“The Stanford Virtual Heart”) was utilised. Users are placed “inside the heart” and explore non-traditional views of cardiac anatomy. Modules for six common congenital heart lesions were developed, including narrative scripts. A prospective case–control study was performed involving three large paediatric residency programmes. From July 2018 to June 2019, trainees participating in an outpatient cardiology rotation completed a 27-question, validated assessment tool. From July 2019 to February 2020, trainees completed the virtual reality curriculum and assessment tool during their cardiology rotation. Qualitative feedback on the virtual reality experience was also gathered. Intervention and control group performances were compared using univariate analyses.
Results:
There were 80 trainees in the control group and 52 in the intervention group. Trainees in the intervention group achieved higher scores on the assessment (20.4 ± 2.9 versus 18.8 ± 3.8 out of 27 questions answered correctly, p = 0.01). Further analysis showed significant improvement in the intervention group for questions specifically testing visuospatial concepts. In total, 100% of users recommended integration of the programme into the residency curriculum.
Conclusions:
Virtual reality is an effective and well-received adjunct to clinical curricula for residents participating in paediatric cardiology rotations. Our results support continued virtual reality use and expansion to include other trainees.
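The group comparison reported above (20.4 ± 2.9 versus 18.8 ± 3.8 correct answers) is a standard two-sample test. A minimal sketch using simulated scores and Welch's t-test (the abstract does not state which t-test variant was used):

```python
import numpy as np
from scipy import stats

# Simulated assessment scores matching the reported group sizes and moments
rng = np.random.default_rng(5)
control = rng.normal(18.8, 3.8, 80)
intervention = rng.normal(20.4, 2.9, 52)

t, p = stats.ttest_ind(intervention, control, equal_var=False)  # Welch's t-test
print(f"t = {t:.2f}, p = {p:.4f}")
```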
Understanding how cardiovascular structure and physiology guide management is critically important in paediatric cardiology. However, few validated educational tools are available to assess trainee knowledge. To address this deficit, paediatric cardiologists and fellows from four institutions collaborated to develop a multimedia assessment tool for use with medical students and paediatric residents. This tool was developed in support of a novel 3-dimensional virtual reality curriculum created by our group.
Methods:
Educational domains were identified, and questions were iteratively developed by a group of clinicians from multiple centres to assess understanding of key concepts. To evaluate content validity, content experts completed the assessment and reviewed items, rating item relevance to educational domains using a 4-point Likert scale. An item-level content validity index was calculated for each question, and a scale-level content validity index was calculated for the assessment tool, with scores of ≥0.78 and ≥0.90, respectively, representing excellent content validity.
Results:
The mean content expert assessment score was 92% (range 88–97%). Two questions were answered correctly by ≤50% of content experts. The item-level content validity index for 29 out of 32 questions was ≥0.78, and the scale-level content validity index was 0.92. Qualitative feedback included suggestions for future improvement. Questions with ≤50% content expert agreement and item-level content validity index scores <0.78 were removed, yielding a 27-question assessment tool.
Conclusions:
We describe a multi-centre effort to create and validate a multimedia assessment tool which may be implemented within paediatric trainee cardiology curricula. Future efforts may focus on content refinement and expansion to include additional educational domains.
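The validity indices above follow the standard definitions: the item-level CVI is the proportion of experts rating an item 3 or 4 on the 4-point relevance scale, and the scale-level CVI (average method) is the mean of the item-level CVIs. A minimal sketch with hypothetical ratings:

```python
import numpy as np

# Hypothetical relevance ratings: 32 items x 9 experts on a 4-point scale
rng = np.random.default_rng(6)
ratings = rng.integers(1, 5, size=(32, 9))

# An item rated 3 or 4 counts as "relevant" to its educational domain
i_cvi = (ratings >= 3).mean(axis=1)    # item-level CVI per question
s_cvi = i_cvi.mean()                   # scale-level CVI (average method)
keep = i_cvi >= 0.78                   # retention threshold from the abstract
print(f"S-CVI = {s_cvi:.2f}; items retained: {keep.sum()} of {len(keep)}")
```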
Racial disparities in colorectal cancer (CRC) can be addressed through increased adherence to screening guidelines. In real-life encounters, patients may be more willing to follow screening recommendations delivered by a race concordant clinician. The growth of telehealth to deliver care provides an opportunity to explore whether these effects translate to a virtual setting. The primary purpose of this pilot study is to explore the relationships between virtual clinician (VC) characteristics and CRC screening intentions after engagement with a telehealth intervention leveraging technology to deliver tailored CRC prevention messaging.
Methods:
Using a posttest-only design with three factors (VC race-matching, VC gender, intervention type), participants (N = 2267) were randomised to one of eight intervention treatments. Participants self-reported perceptions and behavioral intentions.
Results:
The benefits of matching participants with a racially similar VC trended positive but did not reach statistical significance. Specifically, race-matching was positively associated with screening intentions for Black participants but not for White participants (b = 0.29, p = 0.10). Importantly, perceptions of credibility, attractiveness, and message relevance significantly influenced screening intentions and the relationship with race-matching.
Conclusions:
To reduce racial CRC screening disparities, investments are needed to identify patient-focused interventions to address structural barriers to screening. This study suggests that telehealth interventions that match Black patients with a Black VC can enhance perceptions of credibility and message relevance, which may then improve screening intentions. Future research is needed to examine how to increase VC credibility and attractiveness, as well as message relevance without race-matching.
Flavonoids have shown anti-hypertensive and anti-atherosclerotic properties; the impact of habitual flavonoid intake on vascular function, central haemodynamics and arterial stiffness may therefore be important. We investigated the relationship between habitual flavonoid consumption and measures of central blood pressure and arterial stiffness. We performed a cross-sectional analysis of 381 non-smoking healthy older adults (mean age 66·0 (sd 4·1) years; BMI 26·4 (sd 4·41) kg/m²; 41 % male) recruited as part of the Australian Research Council Longevity Intervention study. Flavonoid intake (i.e. flavonols, flavones, flavanones, anthocyanins, isoflavones, flavan-3-ol monomers, proanthocyanidins, theaflavins/thearubigins and total consumption) was estimated from FFQ using the US Department of Agriculture food composition databases. Measures of central haemodynamics and arterial stiffness included central systolic blood pressure (cSBP), central diastolic blood pressure (cDBP), central mean arterial pressure (cMAP) and central augmentation index (cAIx). After adjusting for demographic and lifestyle confounders, each 1 sd (44·3 mg/d) higher anthocyanin intake was associated with significantly lower cDBP (−1·56 mmHg, 95 % CI −2·65, −0·48) and cMAP (−1·62 mmHg, 95 % CI −2·82, −0·41). Similarly, each 1 sd (19·5 mg/d) higher flavanone intake was associated with ~1 % lower cAIx (−0·93 %, 95 % CI −1·77, −0·09). These associations remained significant after additional adjustment for (1) a dietary quality score and (2) other major nutrients that may affect blood pressure or arterial stiffness (i.e. Na, K, Ca, Mg, n-3 fatty acids, total protein and fibre). This study suggests a possible benefit of dietary anthocyanin and flavanone intake on central haemodynamics and arterial stiffness; these findings require corroboration in further research.
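The per-sd estimates above come from covariate-adjusted linear regression on a standardised exposure. A minimal sketch with simulated data follows; the real model adjusted for many more demographic and lifestyle confounders, and all effect sizes here are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated cohort: exposure (mg/d, sd ~ 44 as in the abstract) and covariates
rng = np.random.default_rng(7)
n = 381
df = pd.DataFrame({
    "anthocyanin": rng.gamma(1.0, 44.0, n),
    "age": rng.normal(66, 4.1, n),
    "bmi": rng.normal(26.4, 4.4, n),
    "male": rng.integers(0, 2, n),
})
df["cdbp"] = 75 - 0.035 * df["anthocyanin"] + 0.1 * df["age"] + rng.normal(0, 8, n)

# Standardise the exposure so the coefficient reads "per sd/d of intake"
df["antho_sd"] = (df["anthocyanin"] - df["anthocyanin"].mean()) / df["anthocyanin"].std()
fit = smf.ols("cdbp ~ antho_sd + age + bmi + male", df).fit()
print(fit.params["antho_sd"], fit.conf_int().loc["antho_sd"].values)
```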
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible by laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves. These will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside of the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high-circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to full third-generation detectors at a fraction of the cost. Such sensitivity changes the expected rate of post-merger remnant detections from approximately one per few decades with two A+ detectors to a few per year, and would potentially allow for the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
UK trees require increased conservation efforts due to sparse and fragmented populations. Ex situ conservation, including seed banking, can be used to better manage these issues. We conducted accelerated ageing tests on seeds of 22 UK native woody species in order to assess their likely longevity and optimize their conservation in a seed bank. Germination at four ageing time points was determined to construct survival curves; multiple samples within a species showed comparable responses for most species tested, with the exception of Fraxinus excelsior. Of all species studied, one could be classified as very short-lived, four as short-lived and 17 as medium, with none exceeding the medium category. Although some taxonomic trends were observed, the results indicate the need for caution when making broad conclusions about potential seed storage life at the species, genus or family level. Longevity predictions were compared with the actual performance of older collections held in long-term storage at the Millennium Seed Bank, Kew. Although most collections remain high in viability after more than 20 years in storage, for short-lived species at least, there is some indication that accelerated ageing predicts longevity under seed bank conditions. For species with reduced potential longevity, such as Fagus sylvatica and Ulmus glabra, additional storage options are recommended for long-term gene banking.
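Seed survival curves from accelerated ageing are conventionally summarised with a probit model of germination against ageing time, from which a p50 (the ageing time at which viability falls to 50%) can be read off. The sketch below uses synthetic counts, not the Kew data.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic ageing experiment: germination counts out of 50 seeds
# at four ageing time points (days are invented for illustration)
days = np.array([0, 5, 10, 20], float)
germinated = np.array([48, 45, 30, 8])
n_seeds = np.full(4, 50)

# Binomial GLM with a probit link, fit to (successes, failures) counts
X = sm.add_constant(days)
fit = sm.GLM(np.column_stack([germinated, n_seeds - germinated]), X,
             family=sm.families.Binomial(link=sm.families.links.Probit())).fit()
b0, b1 = fit.params
print(f"p50 ≈ {-b0 / b1:.1f} days of ageing")   # probit(0.5) = 0 => b0 + b1*t = 0
```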
Introduction: Prognostication and disposition among older Emergency Department (ED) patients with suspected infection remain challenging. Frailty is increasingly recognized as a predictor of poor prognosis among critically ill patients; however, its association with clinical outcomes among older ED patients with suspected infection is unknown. Methods: We conducted a multicentre prospective cohort study at two tertiary care EDs. We included older ED patients (≥ 75 years) presenting with suspected infection. Frailty at baseline (prior to the index illness) was explicitly measured for all patients by the treating physicians using the Clinical Frailty Scale (CFS). We defined frailty as a CFS score of 5–8. The primary outcome was 30-day mortality. We used multivariable logistic regression to adjust for known confounders. We also compared the prognostic accuracy of frailty against the Systemic Inflammatory Response Syndrome (SIRS) and Quick Sequential Organ Failure Assessment (qSOFA) criteria. Results: We enrolled 203 patients, of whom 117 (57.6%) were frail. Frail patients were more likely to develop septic shock (adjusted odds ratio [aOR]: 1.83, 95% confidence interval [CI]: 1.08–2.51) and more likely to die within 30 days of ED presentation (aOR 2.05, 95% CI: 1.02–5.24). Sensitivity for mortality was highest for the CFS (73.1%, 95% CI: 52.2–88.4), as compared with SIRS ≥ 2 (65.4%, 95% CI: 44.3–82.8) or qSOFA ≥ 2 (38.4%, 95% CI: 20.2–59.4). Conclusion: Frailty is a highly prevalent prognostic factor that can be used to risk-stratify older ED patients with suspected infection. ED clinicians should consider screening for frailty in order to optimize disposition in this population.
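The sensitivity estimates above are simple proportions with binomial confidence intervals. As a minimal sketch, the counts below are hypothetical values chosen to reproduce the reported 73.1% CFS sensitivity; the study's actual cell counts may differ.

```python
from statsmodels.stats.proportion import proportion_confint

# Hypothetical counts: criterion-positive decedents out of all decedents
true_pos, deaths = 19, 26     # 19/26 = 73.1%, matching the reported CFS value

sens = true_pos / deaths
# Exact (Clopper-Pearson) 95% CI via the beta method
lo, hi = proportion_confint(true_pos, deaths, alpha=0.05, method="beta")
print(f"sensitivity = {sens:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```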
Public hospital systems have struggled to identify ways of cutting costs while improving the quality of mental health treatment, even more so since the economic downturn.
Objective
To compare mental health care expenditures and quality at two large sites, Boston and Madrid, and to analyze the share of expenditure corresponding to pharmacy, ER, outpatient and inpatient care.
Methods
Data are mental health electronic records from three hospitals in Madrid (n=31,183 person-years) and in Boston (n=8,805). Adequacy of care was measured as four or more visits within the last year. Unadjusted comparisons of variables were conducted using t-tests. Multivariate generalized linear regression models were computed with a log link and residual variance proportional to the mean squared, adjusting for covariates. Results were also adjusted for World Bank Purchasing Power Parity and converted to U.S. dollars.
Results
The annual average treatment expenditure was $4,874 in Boston and $2,693 in Madrid. Boston patients had a higher rate of use (13.6% vs 5.3%) and greater annual expenditure ($25,175 vs $15,470) for inpatient services (p<0.05). Conversely, Madrid patients used and spent more on outpatient treatments (87% vs 84%; $1,670 vs $1,378; p<0.05). Being at the Boston site and having a bipolar, psychotic or alcohol disorder were significant positive predictors of total expenditure. Adequacy of care was greater in Boston (32.8% vs 23.1%).
Conclusions
Emphasis on outpatient care appears to reduce inpatient stays and overall expenditures. Earlier recognition, owing to more open access to treatment in Spain, may help decrease costs. Bipolar, psychotic and alcohol disorders are associated with higher costs.
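A GLM with a log link and residual variance proportional to the mean squared, as described in the Methods above, corresponds to the Gamma family. A minimal sketch with simulated expenditure records follows; site and diagnosis effects are invented for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Simulated expenditure records with binary site and diagnosis indicators
rng = np.random.default_rng(8)
n = 5000
df = pd.DataFrame({
    "boston": rng.integers(0, 2, n),
    "bipolar": rng.integers(0, 2, n),
    "psychotic": rng.integers(0, 2, n),
})
mu = np.exp(7.9 + 0.6 * df["boston"] + 0.4 * df["bipolar"] + 0.5 * df["psychotic"])
df["spend"] = rng.gamma(shape=2.0, scale=mu / 2.0)   # variance proportional to mu^2

# Gamma GLM with log link: coefficients exponentiate to cost ratios
fit = smf.glm("spend ~ boston + bipolar + psychotic", df,
              family=sm.families.Gamma(link=sm.families.links.Log())).fit()
print(np.exp(fit.params))   # multiplicative effects on expected spend
```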
Prior studies have identified that individuals with comorbid substance use disorder and mental health disorder are at greater risk of benzodiazepine abuse than individuals who present with a mental health disorder without an accompanying substance use disorder. These studies were conducted in predominantly white populations, and little is known about whether the same associations are seen in safety net health care networks. The literature is also mixed as to whether psychiatrists' prescription of benzodiazepines places individuals at undue risk of benzodiazepine abuse.
We used 2013–2015 electronic health record data from a Boston healthcare system. Patients with benzodiazepine abuse were identified if they had received treatment under the ICD-9 code 304.1. Benzodiazepine abuse was compared between patients with only mental illness and patients with comorbid substance and mental health disorder, in unadjusted comparisons and adjusted regression models. Covariates in the regression models were used to identify subgroups at higher risk of benzodiazepine abuse.
Individuals with benzodiazepine abuse had higher rates of emergency room and inpatient use than patients with other mental health and/or substance use disorders. Those with comorbid substance and mental disorder were significantly more likely than individuals with mental or substance use disorder alone to abuse benzodiazepines (P < .01). Among those with benzodiazepine abuse, 93.3% were diagnosed with a mental illness, 75.6% were diagnosed with a substance use disorder (other than benzodiazepine), and 64.4% had comorbid anxiety disorder and substance use disorder. These analyses suggest that patients with benzodiazepine abuse have complex presentations and intensive service use.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Chronic non-malignant pain (CNMP) is defined as pain lasting a minimum of three months. Chronic pain affects about 20% of the adult population worldwide. Moreover, pain is more common in patients with depression, anxiety, substance-use disorders, and low socioeconomic status. We aimed to better understand the influence of pain on the substance use and treatment use patterns of individuals who have clinically recognized pain and a substance use disorder.
Methods
Patients with pain disturbances were identified in electronic health records (EHR) through ICD-9 code 338*, written medical diagnoses, or a diagnosis of fibromyalgia. A patient was considered to have a substance use disorder if they received treatment for illicit drug or alcohol abuse or dependence. We combined 2010–2012 EHR data from primary care and specialty mental health settings in a Boston healthcare system (n = 131,966 person-years) and a specialty mental health care setting in Madrid, Spain (n = 43,309 person-years).
Results
We identified that 35.3% of individuals with clinically recognized pain also had a substance use disorder, compared with only 10.6% of individuals without clinically recognized pain (P < 0.01). Those with comorbid pain and substance use disorder were significantly more likely than their specialty care counterparts without this comorbidity to be seen in the emergency room (56.5% vs. 36.6%, respectively, P < 0.01).
Conclusion
The findings suggest that CNMP is associated with an increased risk of substance use disorder.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
US Latinos have higher rates of substance use disorders (SUDs) than Latinas, but Latinas face substantial barriers to treatment and tend to enter care with higher SUD severity. Immigrant Latinas may face greater barriers to care than their native-born counterparts despite lower overall SUD prevalence. This study aimed to identify how the SUD treatment needs of Latinos are addressed depending on patient gender and immigrant status within an urban healthcare system serving a diverse population.
Methods
Data from electronic health records of adult Latino/a primary care patients (n = 29,887 person-years) were used to identify rates of SUD treatment in primary and specialty care. Treatment characteristics and receipt of adequate care were compared by gender and immigrant status.
Results
Tobacco was the most frequently treated substance followed by alcohol and other drugs. Forty-six percent of SUD patients had a comorbid psychiatric condition. Treatment rates ranged from 2.52% (female non-immigrants) to 8.38% (male immigrants). Women had lower treatment rates than men, but male and female immigrants had significantly higher treatment rates than their non-immigrant counterparts. Receipt of minimally adequate outpatient care varied significantly by gender and immigrant status (female non-immigrants 12.5%, immigrants 28.57%; male non-immigrants 13.46%, immigrants 17.09%) in unadjusted and adjusted analyses.
Discussion
Results indicate an overall low prevalence of SUD treatment in the healthcare system. Low rates of minimally adequate care illustrate the challenge of delivering integrated behavioral healthcare for Latinos with SUD. Results also demonstrate gender and immigrant status disparities in an unexpected direction, with immigrant women receiving the highest rates of adequate care.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
There is no comprehensive evidence on the influence of sleep disturbances (SD) on substance use disorders (SUD) or treatment use patterns of individuals with comorbid disturbances.
Objective/aim
To better understand comorbidities and treatment use patterns of individuals with SD and SUD.
Methods
We combined 2010–2012 electronic health record (EHR) data from a healthcare system in Boston (n = 131,966 person-years) and one in Madrid, Spain (n = 43,309 person-years). Patients with sleep disturbances (SD) were identified in the EHR through ICD-9 codes and medical records, and substance use disorders (SUD) were identified by documented treatment for drug or alcohol abuse or dependence. Rates of SUD were compared between individuals with and without SD. Among those with both, adequacy of mental health treatment (defined as eight or more outpatient visits, or four or more outpatient visits with a psychotropic prescription) and ER use were compared.
Results
Overall, 21.1% of individuals with SD also had a SUD, compared with only 10.6% of individuals without SD (P < .01). Those with comorbidities were more likely than their specialty care counterparts without comorbidities to be seen in the ER (57.1% vs. 36.6%, respectively, P < .05). Limiting the sample to only those with both SD and SUD in specialty mental health care (n = 268 in Boston and n = 28 in Madrid), 49.2% of Boston patients received adequate care compared with 38.5% of Madrid patients, and 57.8% of Boston patients had any ER use in the last year vs. 50% of Madrid patients.
Conclusions
SD is correlated with SUD and comorbid patients are more likely to use emergency services.
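The adequacy-of-care definition above (eight or more outpatient visits, or four or more with a psychotropic prescription) is straightforward to operationalise on visit-level records. A minimal sketch with toy data; the column names are assumptions, not the study's schema.

```python
import pandas as pd

# Toy visit-level records: one row per outpatient visit
visits = pd.DataFrame({
    "patient_id":      [1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2],
    "psychotropic_rx": [1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0],
})

per_pt = visits.groupby("patient_id").agg(
    n_visits=("patient_id", "size"),
    n_rx_visits=("psychotropic_rx", "sum"),
)
# Adequate care: >= 8 visits, OR >= 4 visits with a psychotropic prescription
per_pt["adequate"] = (per_pt["n_visits"] >= 8) | (per_pt["n_rx_visits"] >= 4)
print(per_pt)
```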
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Yukon Territory (YT) is a remote region in northern Canada with ongoing spread of tuberculosis (TB). To explore the utility of whole genome sequencing (WGS) for TB surveillance and monitoring in a setting with detailed contact tracing and interview data, we used a mixed-methods approach. Our analysis included all culture-confirmed cases in YT (2005–2014) and incorporated data from 24-locus Mycobacterial Interspersed Repetitive Units-Variable Number of Tandem Repeats (MIRU-VNTR) genotyping, WGS and contact tracing. We compared field-based (contact investigation (CI) data + MIRU-VNTR) and genomic-based (WGS + MIRU-VNTR + basic case data) investigations to identify the most likely source of each person's TB and assessed the knowledge, attitudes and practices of programme personnel around genotyping and genomics using online, multiple-choice surveys (n = 4) and an in-person group interview (n = 5). Field- and genomics-based approaches agreed for 26 of 32 (81%) cases on the likely location of TB acquisition. There was less agreement in the identification of specific source cases (13/22 or 59% of cases). Single-locus MIRU-VNTR variants and limited genetic diversity complicated the analysis. Qualitative data indicated that participants viewed genomic epidemiology as a useful tool to streamline investigations, particularly in differentiating latent TB reactivation from recent transmission. Based on this, genomic data could be used to enhance CIs, focus resources, target interventions and aid in TB programme evaluation.
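Comparing field-based and genomics-based source attributions, as above, amounts to measuring inter-method agreement; raw agreement can be supplemented with Cohen's kappa to account for chance. The labels below are invented for illustration (the abstract reports 26/32 agreement on location of acquisition).

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical location attributions per case from each approach
field   = ["A", "A", "B", "B", "C", "A", "C", "B"]
genomic = ["A", "A", "B", "C", "C", "A", "C", "A"]

agree = sum(f == g for f, g in zip(field, genomic)) / len(field)
kappa = cohen_kappa_score(field, genomic)   # agreement beyond chance
print(f"raw agreement = {agree:.0%}, Cohen's kappa = {kappa:.2f}")
```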
The bovine appeasing substance (BAS) is expected to have calming effects in cattle experiencing stressful situations. Therefore, this study investigated the impacts of BAS administration during two of the most stressful events within beef production systems: weaning and feedlot entry. In experiment 1, 186 Bos indicus-influenced calves (73 heifers, 113 bulls) were weaned at 211 ± 1 days of age (day 0). At weaning, calves were ranked by sex and BW, and assigned to receive BAS (Nutricorp, Araras, SP, Brazil; n = 94) or water (CON; n = 92). Treatments (5 ml) were topically applied to the nuchal skin area of each animal. Calf BW was recorded and samples of blood and tail-switch hair were collected on days 0, 15 and 45. Calves that received BAS had greater (P < 0.01) BW gain from day 0 to 15 compared with CON. Overall BW gain (days 0 to 45) and BW on days 15 and 45 were also greater (P ≤ 0.03) in BAS v. CON. Plasma haptoglobin concentration was less (P < 0.01) in BAS v. CON on day 15, whereas cortisol concentrations in plasma and tail-switch hair did not differ between treatments (P ≥ 0.13). In experiment 2, 140 B. indicus-influenced bulls (∼27 months of age) from 2 different pasture-based systems (70 bulls/origin) were transported to a commercial feedlot (≤ 200-km transport; day -1). On day 0, bulls were ranked by source and BW, and assigned to receive BAS (n = 70) or CON (n = 70); the same sampling procedures as in experiment 1 were followed. Bulls receiving BAS had greater (P = 0.04) BW gain from day 0 to 15, but less (P < 0.01) BW gain from day 15 to 45 compared to CON. No other treatment effects were detected (P > 0.14). Therefore, BAS administration to beef calves alleviated the haptoglobin response associated with weaning, and improved calf growth during the subsequent 45 days. Administration of BAS to beef bulls at feedlot entry improved BW gain during the initial 15 days, but these benefits were not sustained throughout the 45-day experiment.