Functional impairment in daily activities, such as work and socializing, is part of the diagnostic criteria for major depressive disorder and most anxiety disorders. Despite evidence that symptom severity and functional impairment are partially distinct, functional impairment is often overlooked. To assess whether functional impairment captures diagnostically relevant genetic liability beyond that of symptoms, we aimed to estimate the heritability of, and genetic correlations between, key measures of current depression symptoms, anxiety symptoms, and functional impairment.
Methods
In 17,130 individuals with lifetime depression or anxiety from the Genetic Links to Anxiety and Depression (GLAD) Study, we analyzed total scores from the Patient Health Questionnaire-9 (depression symptoms), Generalized Anxiety Disorder-7 (anxiety symptoms), and Work and Social Adjustment Scale (functional impairment). Genome-wide association analyses were performed with REGENIE. Heritability was estimated using GCTA-GREML and genetic correlations with bivariate-GREML.
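For reference, the SNP-based heritability and genetic correlation estimated with GREML can be written in terms of genetic and residual variance components. These are the standard definitions rather than anything specific to the GLAD analysis:

```latex
% Standard definitions (not study-specific)
% SNP-based heritability: share of phenotypic variance captured by genotyped SNPs
h^2_{\mathrm{SNP}} = \frac{\sigma^2_g}{\sigma^2_g + \sigma^2_e}

% Genetic correlation between traits i and j (bivariate GREML)
r_g = \frac{\mathrm{cov}_g(i,j)}{\sqrt{\sigma^2_{g,i}\,\sigma^2_{g,j}}}
```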
Results
The phenotypic correlations were moderate across the three measures (Pearson’s r = 0.50–0.69). All three scales were found to be under low but significant genetic influence (single-nucleotide polymorphism-based heritability [h2SNP] = 0.11–0.19) with high genetic correlations between them (rg = 0.79–0.87).
Conclusions
Among individuals with lifetime depression or anxiety from the GLAD Study, the genetic variants that underlie symptom severity largely overlap with those influencing functional impairment. This suggests that self-reported functional impairment, while clinically relevant for diagnosis and treatment outcomes, does not reflect substantial additional genetic liability beyond that captured by symptom-based measures of depression or anxiety.
The transition from breastmilk to solid foods (weaning) is a critical stage in infant development and plays a decisive role in the maturation of the complex microbial community inhabiting the human colon. Diet is a major factor shaping the colonic microbiota, which ferments nutrients reaching the colon unabsorbed by the host to produce a variety of microbial metabolites influencing host physiology(1). Therefore, making adequate dietary choices during weaning can positively modulate the colonic microbiota, ultimately contributing to health in infancy and later life(2). However, our understanding of how complementary foods impact the colonic microbiota of weaning infants is limited. To address this knowledge gap, we employed a metagenome-scale modelling approach to simulate the impact of complementary foods, either combined with breastmilk or with breastmilk and other foods, on the production of organic acids by colonic microbes of weaning infants(3). Complementary foods and combinations of foods with the greatest impact on the in silico microbial production of organic acids were identified. These foods and food combinations were further tested in vitro, individually or in combination with infant formula. Fifty-three food samples were digested using a protocol adapted from INFOGEST to mimic infant digestion and then fermented with faecal inoculum from 6 New Zealand infants (5–11 months old). After 24 h of fermentation, the production of organic acids was measured by gas chromatography. Differences in organic acid production between samples were determined using the Tukey Honestly Significant Difference test to account for multiple comparisons. Microbial composition was characterised by amplicon sequencing of the V3–V4 regions of the bacterial 16S rRNA gene. Taxonomy was assigned using the DADA2 pipeline and the SILVA database (version 138.1). Bioinformatic and statistical analyses were conducted using the R packages phyloseq and ANCOM-BC2, with Holm–Bonferroni adjustment for multiple comparisons in differential abundance testing. Blackcurrant and raspberries increased the production of acetate and propionate (Tukey's test, p<0.05) and the relative abundance of the genus Parabacteroides (Dunnett's test, adjusted p<0.05) compared with other foods. Raspberries also increased the abundance of the genus Eubacterium (Dunnett's test, adjusted p<0.05). When combined with infant formula, black beans stood out for increasing the production of butyrate (Tukey's test, p<0.05) and the relative abundance of the genus Clostridium (Dunnett's test, adjusted p<0.05). In conclusion, this study provides new evidence on how complementary foods, both individually and in combination with other dietary components, influence the colonic microbiota of weaning infants in vitro. Insights generated by this research can help design future clinical trials, ultimately enhancing our understanding of the relationship between human nutrition and colonic microbiota composition and function in post-weaning life.
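As an illustration of the multiple-comparison step described above, a minimal sketch of a Tukey HSD comparison of organic acid production across food treatments is given below. The study's analyses used R packages; this Python/statsmodels version, the file name, and the column names ("food", "acetate") are assumptions for illustration only, not the authors' code.

```python
# Minimal sketch: Tukey HSD across food treatments (illustrative only).
# Assumes a CSV with one row per fermentation vessel and columns
# "food" (treatment label) and "acetate" (concentration after 24 h of fermentation).
import pandas as pd
from statsmodels.stats.multicomp import pairwise_tukeyhsd

df = pd.read_csv("fermentation_acetate.csv")  # hypothetical file name

# Pairwise comparisons of mean acetate production between foods,
# with the family-wise error rate controlled at alpha = 0.05.
tukey = pairwise_tukeyhsd(endog=df["acetate"], groups=df["food"], alpha=0.05)
print(tukey.summary())
```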
Micronutrient deficiencies (MND) are a significant global health issue, particularly affecting children's growth and cognitive potential and predisposing women of reproductive age (WRA) to adverse health outcomes(1). Over half of global MND cases occur in Sub-Saharan Africa (SSA), with 80% of women estimated to be deficient in at least one of three micronutrients(2). Large-scale food fortification is a cost-effective strategy recommended for combating widespread MND and has been effectively implemented in many developed countries(3). In developing regions such as SSA, socio-economic barriers and a fragmented food processing industry hinder effective implementation of food fortification(4). As a result, countries with fortification programmes face significant challenges, including low coverage of fortified food in the population and poor compliance with fortification standards by food producers(5). The contribution of food fortification to nutrient intakes of WRA in SSA has yet to be fully assessed. This study sought to evaluate mandatory food fortification programmes in SSA and estimate the contribution of fortified food consumption to micronutrient intakes and requirements of WRA. We utilised multi-national fortification data from the Global Fortification Data Exchange, which includes data on country fortification standards and the estimated level of compliance with fortification requirements. Data on the supply and consumption of fortifiable food were also included from the FAO. We calculated the potential nutrient intake from fortified food consumption for each nutrient using country fortification standards and food availability. We adjusted the estimated intake for each nutrient by multiplying it by the estimated compliance percentage. We also assessed what proportion of women's requirements for essential micronutrients (folate, iron, iodine, vitamin A, and zinc) are met through fortified food consumption, using RNI values from WHO/FAO for WRA. Between 2019 and 2021, we estimated that mandatory fortification of wheat and maize flour, oil and salt in SSA contributed a median of 138 µg DFE of folic acid, 217 µg of iodine, 43 µg RAE of vitamin A, and 2.1 mg and 2.0 mg of iron and zinc, respectively, to the daily intakes of WRA. These intakes represent 12.8% (0.0–49.2) of iron, 27.5% (0.0–83.2) of zinc, 55.0% (0.0–245.0) of folate, 8.8% (0.0–37.2) of vitamin A and 228.2% (98.2–358.6) of iodine requirements, taking into consideration the lower bioavailability of iron and zinc from the cereal-based diets of SSA populations. In reality, compliance with fortification requirements in SSA is low, estimated at a median of 22% (0.0–83.4) for maize flour, 44% (0.0–72.0) for vegetable oil and 83% (0.0–100.0) for wheat flour fortification, and is a major factor limiting the overall contribution of fortification to micronutrient intakes. Inadequate regulatory monitoring to ensure compliance with fortification requirements in SSA has resulted in lower-quality fortified foods, limiting women's potential to achieve adequate micronutrient intake through fortified food consumption.
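The intake estimation described above is essentially a chain of multiplications (standard × consumption × compliance, then compared with the RNI). The sketch below illustrates that arithmetic only; all numbers, food vehicles, and field names are placeholder assumptions, not the study's data.

```python
# Illustrative sketch of the fortification-intake calculation described above.
# All values below are placeholder assumptions, not figures from the study.

# Per food vehicle: fortification standard (µg nutrient per g food),
# daily consumption (g/day), and estimated compliance (fraction of food actually fortified to standard).
vehicles = {
    "wheat_flour": {"standard_ug_per_g": 1.5, "consumption_g_day": 100, "compliance": 0.83},
    "maize_flour": {"standard_ug_per_g": 1.0, "consumption_g_day": 50, "compliance": 0.22},
}

RNI_UG_DAY = 400  # example WHO/FAO reference nutrient intake for the nutrient (µg/day)

# Potential intake = standard x consumption, adjusted by estimated compliance.
intake_ug_day = sum(
    v["standard_ug_per_g"] * v["consumption_g_day"] * v["compliance"]
    for v in vehicles.values()
)

percent_of_rni = 100 * intake_ug_day / RNI_UG_DAY
print(f"Estimated intake: {intake_ug_day:.0f} µg/day ({percent_of_rni:.1f}% of the RNI)")
```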
This longitudinal survey examined the effect of the National Healthcare Safety Network’s (NHSN) recently updated Clostridioides difficile test method definition on reporting of hospital-onset C. difficile. Among six hospitals with ≥ 5 years of data available, the updated NHSN definition was associated with improved concordance between predicted versus reported cases.
As hybrid work arrangements have become more prevalent in the wake of the COVID-19 pandemic, the alignment between jobs and workers has also evolved, arguably in ways that research has yet to fully capture. We build on the theoretical foundation of person-environment fit – and person-job fit specifically – to investigate how employees’ work arrangements and their perceived fit with their work arrangements influence important personal (e.g., work-life balance, stress) and work-related (e.g., organizational commitment, engagement) outcomes. Quantitative evidence from a survey of 427 hybrid workers supports the idea that the extent to which an individual’s desires, needs, and values align with their work arrangement plays an important role in their personal and work-related well-being. We advocate for expanding the conceptualization of person-job and person-environment fit models to incorporate work arrangements and provide recommendations for research and practice.
Background: Our prior six-year review (n=2165) revealed that 24% of patients undergoing posterior decompression surgeries (laminectomy or discectomy) sought emergency department (ED) care within three months post-surgery. We established an integrated Spine Assessment Clinic (SAC) to enhance patient outcomes and minimize unnecessary ED visits through pre-operative education, targeted quality improvement (QI) interventions, and early post-operative follow-up. Methods: We reviewed 13 months of posterior decompression data (n=205) following SAC implementation. These patients received individualized, comprehensive pre-operative education and follow-up phone calls within 7 days post-surgery. ED visits within 90 days post-surgery were tracked using provincial databases and compared to our pre-SAC implementation data. Results: Of 205 patients, 24 (11.6%) accounted for 34 ED visits within 90 days post-op, a significant reduction in the proportion of patients with ED visits from 24% to 11.6%; overall ED utilization decreased from 42.1% to 16.6% (when accounting for multiple visits by the same patient). Early interventions, including wound monitoring, outpatient bloodwork, and prescription adjustments for pain management, helped mitigate ED visits. Patient satisfaction surveys (n=62) indicated 92% were "highly satisfied" and 100% would recommend the SAC. Conclusions: The SAC reduced ED visits after posterior decompression surgery by over 50% through pre-operative education, focused QI initiatives, and an individualized, proactive approach.
Cattle (Bos spp.) grazing on weed-mixed forage biomass may potentially spread weed seeds, leading to plant invasions across pasturelands. Understanding the possibility and intensity of this spread is crucial for developing effective weed control methods in grazed areas. This research undertook an in vitro experiment to evaluate the germination and survival of five dominant weed species in the southern United States [Palmer amaranth (Amaranthus palmeri S. Watson), yellow foxtail [Setaria pumila (Poir.) Roem. & Schult.], johnsongrass [Sorghum halepense (L.) Pers.], field bindweed (Convolvulus arvensis L.), and pitted morningglory (Ipomoea lacunosa L.)] upon incubation in rumen fluid for eight time periods (0, 4, 8, 12, 24, 48, 72, and 96 h). For the 96-h treatment, a full Tilley and Terry procedure was applied after 48 h to stop fermentation, followed by incubation for another 48 h to simulate abomasal digestion. Seed germination upon incubation varied significantly among weed species, with I. lacunosa reaching zero germination after only 24 h of incubation, whereas A. palmeri and S. halepense retained up to 3% germination even after 96 h of incubation. The hard seed coats of A. palmeri and S. halepense likely made them highly resistant, whereas the I. lacunosa seed coat became easily permeable and ruptured under rumen fluid incubation. This suggests that cattle grazing can selectively affect the seed distribution and invasiveness of weeds in grazed grasslands and rangelands, including designated invasive and noxious weed species. As grazing is a significant component of animal husbandry, a major economic sector in the U.S. South, our research provides important insights into the potential role of grazing as a dispersal mechanism for some of the most troublesome arable weeds in the United States. The results offer opportunities for devising customized feeding and grazing practices, combined with timely removal of weeds in grazeable lands at the pre-flowering stage, for effective containment of weeds.
Religious language plays a pivotal role in shaping political behavior and attitudes. This study investigates how representatives utilize religious rhetoric when addressing the House floor and their constituents, and how this language is influenced by congressional leadership. The election of the openly religious Mike Johnson as House Speaker in 2023 provides a unique case for exploring these dynamics. Using difference-in-differences and triple-difference models, we analyze House speeches and newsletters from before and after Johnson became Speaker to assess changes in religious speech between Republican and Democratic representatives. Our findings reveal a significant increase in the use of religious language in newsletters sent by Republicans after Johnson became Speaker, while religious speech on the House floor remains unchanged. Overall, our findings contribute to the literature on the relationship between religion, partisanship, and congressional leadership, highlighting the potential influence of the Speaker of the House on religious communication to constituents.
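As a sketch of the difference-in-differences setup described above: the interaction of a party indicator with a post-Speakership indicator gives the difference-in-differences estimate. The column names, file name, and clustering variable below are illustrative assumptions, not the authors' code or data.

```python
# Minimal difference-in-differences sketch for the newsletter analysis described above.
# Assumed columns: "religious" (1 if the newsletter uses religious language),
# "republican" (1 for Republican members), "post" (1 after Johnson became Speaker),
# "member_id" (member identifier for clustered standard errors).
import pandas as pd
import statsmodels.formula.api as smf

newsletters = pd.read_csv("newsletters.csv")  # hypothetical file

# Linear probability model; the coefficient on republican:post is the
# difference-in-differences estimate: the change in religious language among
# Republicans after the Speakership change, net of the change among Democrats.
did = smf.ols("religious ~ republican * post", data=newsletters).fit(
    cov_type="cluster", cov_kwds={"groups": newsletters["member_id"]}
)
print(did.summary())
```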
This study investigates screening practices for antimicrobial-resistant organisms (AROs) in seventy-five hospitals participating in the Canadian Nosocomial Infection Surveillance Program (CNISP). Screening practices varied, with widespread methicillin-resistant Staphylococcus aureus (MRSA) screening, selective carbapenemase-producing organism (CPO) screening, and limited vancomycin-resistant Enterococcus (VRE) screening. These findings may help interpret ARO rates within CNISP hospitals and inform screening practices.
To estimate the cost-effectiveness of methicillin-resistant Staphylococcus aureus (MRSA) nares polymerase chain reaction (PCR) use in pediatric pneumonia and tracheitis.
Methods:
We built a cost-effectiveness model based on MRSA prevalence and probability of empiric treatment for MRSA pneumonia or tracheitis, with all parameters varied in sensitivity analyses. The hypothetical patient cohort was <18 years of age and hospitalized in the pediatric intensive care unit for community-acquired pneumonia (CAP) or tracheitis. Two strategies were compared: MRSA nares PCR-guided antibiotic therapy versus usual care. The primary measure was cost per incorrect treatment course avoided. Length of stay and hospital costs unrelated to antibiotic costs were assumed to be the same regardless of PCR use. Both literature data and expert estimates informed sensitivity analysis ranges.
Results:
With a health care system willingness-to-pay threshold for PCR testing of $140 per incorrect treatment course avoided (varied in sensitivity analyses), reflecting the estimated additional cost of MRSA-targeted antibiotics, and an MRSA nares PCR cost of $64, PCR testing was generally favored if the likelihood of empiric MRSA treatment was >52%. PCR was not favored in some scenarios when MRSA infection prevalence and the likelihood of empiric MRSA treatment were varied simultaneously. PCR testing became less favorable as the MRSA PCR cost increased toward the highest value of its range ($88). Individual variation of MRSA colonization rates over wide ranges (0%–30%) had smaller effects on results.
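The favorability condition above reduces to a simple expected-cost comparison: PCR is favored when its cost per incorrect treatment course avoided falls below the willingness-to-pay threshold. The sketch below is a deliberate simplification of the published decision model, using the base-case dollar figures quoted above; the nares-negative probability and the probability logic are illustrative assumptions.

```python
# Simplified sketch of the decision comparison described above (not the published model).
# Dollar figures are the base-case values from the Results; the probability structure
# is an assumption and omits details of the authors' decision tree.

PCR_COST = 64.0   # assumed cost of an MRSA nares PCR test ($)
WTP = 140.0       # willingness-to-pay per incorrect treatment course avoided ($)

def incorrect_courses_avoided(p_empiric_mrsa_tx: float, p_nares_negative: float) -> float:
    """Expected incorrect MRSA treatment courses avoided per patient when a negative
    nares PCR is used to withhold or stop empiric MRSA-targeted antibiotics."""
    return p_empiric_mrsa_tx * p_nares_negative

def pcr_favored(p_empiric_mrsa_tx: float, p_nares_negative: float = 0.85) -> bool:
    avoided = incorrect_courses_avoided(p_empiric_mrsa_tx, p_nares_negative)
    if avoided == 0:
        return False
    cost_per_course_avoided = PCR_COST / avoided
    return cost_per_course_avoided <= WTP

# With ~85% of patients assumed PCR-negative, testing is favored once the likelihood
# of empiric MRSA treatment is roughly above one-half, echoing the >52% result.
print(pcr_favored(0.55))  # True
print(pcr_favored(0.40))  # False
```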
Conclusions:
MRSA nares PCR use in hospitalized pediatric patients with CAP or tracheitis was generally favored when empiric MRSA treatment rates were moderate or high.
This study evaluated Medicaid claims (MC) data as a valid source for outpatient antimicrobial stewardship programs (ASPs) by comparing it to electronic medical record (EMR) data from a single academic center.
Methods:
This retrospective study compared pediatric patients’ MC data with EMR data from the Marshall Health Network (MHN). Claims were matched to EMR records based on patient Medicaid ID, service date, and provider NPI number. Demographics, antibiotic choice, diagnosis appropriateness, and guideline concordance were assessed across both data sources.
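A sketch of the record-matching step described above is given below. The column names, file names, and agreement check are illustrative assumptions, not the actual MHN or Medicaid schema.

```python
# Illustrative sketch of matching Medicaid claims to EMR encounters, as described above.
# Column and file names are assumptions, not the actual MHN/Medicaid schema.
import pandas as pd

claims = pd.read_csv("medicaid_claims.csv", dtype=str)
emr = pd.read_csv("emr_encounters.csv", dtype=str)

# Match on the three keys used in the study: Medicaid ID, date of service, and provider NPI.
matched = claims.merge(
    emr,
    on=["medicaid_id", "service_date", "provider_npi"],
    how="inner",
    suffixes=("_claim", "_emr"),
)

# Agreement on antibiotic choice between the two data sources.
agreement = (matched["antibiotic_claim"] == matched["antibiotic_emr"]).mean()
print(f"Antibiotic agreement: {agreement:.1%}")
```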
Setting:
The study was conducted within the MHN, involving multiple pediatric and family medicine outpatient practices in West Virginia, USA.
Patients:
Pediatric patients receiving care within MHN with Medicaid coverage.
Results:
MC and EMR data showed >90% agreement in antibiotic choice, gender, and date of service. Discrepancies were observed in diagnoses, especially for visits with multiple infectious diagnoses. MC data demonstrated similar accuracy to EMR data in identifying inappropriate prescriptions and assessing guideline concordance. Additionally, MC data provided timely information, enhancing the feasibility of impactful outpatient ASP interventions.
Conclusion:
MC data is a valid and timely resource for outpatient ASP interventions. Insurance providers should be leveraged as key partners to support large-scale outpatient stewardship efforts.
The Child Opportunity Index is a composite index of 29 indicators of social determinants of health linked to the United States Census. Disparities in the treatment of Wolff–Parkinson–White syndrome have not been reported. We hypothesise that lower Child Opportunity Index levels are associated with greater disease burden (antiarrhythmic use, ablation success, and Wolff–Parkinson–White recurrence) and ablation utilisation.
Methods:
A retrospective, single-centre study was performed with Wolff–Parkinson–White patients who received care from January 2021 to July 2023. After excluding patients under 5 years of age and those with haemodynamically significant CHD, 267 patients were included (45% high, 30% moderate, and 25% low Child Opportunity Index). Multi-level logistic and log-linear regression models were used to assess the relationship between Child Opportunity Index levels and outcomes.
Results:
Patients in the low Child Opportunity Index group were more likely to be Black (p < 0.0001) and to have public insurance (p = 0.0006); however, there were no significant differences in ablation utilisation (p = 0.44) or time from diagnosis to ablation (p = 0.37) between groups. There was an inverse relationship between Child Opportunity Index level and emergency department use (p = 0.007). The low group had 2.8 times greater odds of having one or more emergency department visits compared with the high group (p = 0.004).
Conclusion:
The Child Opportunity Index was not associated with ablation utilisation, whereas it showed an inverse relationship with emergency department use. These findings suggest that while social determinants of health, as measured by the Child Opportunity Index, may influence emergency department utilisation, they do not appear to impact the overall management and procedural timing of Wolff–Parkinson–White treatment.
Patients with cancer frequently experience insomnia that significantly impacts their quality of life, worsens existing symptoms, and potentially hinders treatment outcomes and recovery. Here, we report on 3 cancer patients whose insomnia was improved with low-dose olanzapine.
Methods
A retrospective review of medical records was conducted for 3 cancer patients with insomnia treated with olanzapine at Johns Hopkins Hospital. Data collection included the type of cancer diagnosis, the severity of insomnia experienced by each patient, and treatment results and outcomes.
Results
Olanzapine improved sleep in all 3 patients and decreased nausea/vomiting and anxiety in patients 2 and 3.
Significance of results
A low dose of olanzapine has the potential to treat insomnia in cancer patients. The ideal dosing regimens and potential risks are unclear, especially for long-term use. More research and clinical trials are needed to evaluate the off-label use of olanzapine for insomnia, including its efficacy and risks, and to optimize the dosage to reduce side effects in cancer patients. Oncology providers should consider olanzapine as a potential treatment for insomnia, especially given its potential benefits for co-occurring symptoms such as nausea and anxiety.
Antibiotics are essential to combating infections; however, their misuse and overuse have contributed to antimicrobial resistance (AMR). Antimicrobial stewardship programs (ASPs) are a strategy to combat AMR and are mandatory for accreditation in Canadian hospitals. The Canadian Nosocomial Infection Surveillance Program (CNISP) sought to capture a snapshot of ASP practices within its network of Canadian acute care hospitals. The objectives of the survey were to describe the status, practices, and process indicators of ASPs across acute care hospitals participating in CNISP.
Design:
The survey explored the following items related to ASPs: 1) program structure and leadership; 2) human, technical, and financial resources allocated; 3) inventory of interventions carried out and implemented; 4) tracking of antimicrobial use; and 5) educational and promotional components.
Methods:
CNISP developed a 34-item survey in both English and French. The survey was administered to 109 participating CNISP hospitals from June to August 2024, and responses were analyzed descriptively.
Results:
Ninety-seven percent (106/109) of CNISP hospitals responded to the survey. Eighty-four percent (89/106) reported having a formal ASP in place at the time of the study. Ninety percent (80/89) of acute care hospitals with an ASP performed prospective audit and feedback for antibiotic agents and 85% (76/89) had formal surveillance of quantitative antimicrobial use. Additionally, just over 80% (74/89) provided education to their prescribers and other healthcare staff.
Conclusions:
CNISP acute care hospitals employ multiple key components of ASPs, including implementing interventions and monitoring/tracking antimicrobial use. Some acute care hospitals did not have an ASP, highlighting areas for investigation and improvement.
We present the Evolutionary Map of the Universe (EMU) survey conducted with the Australian Square Kilometre Array Pathfinder (ASKAP). EMU aims to deliver the touchstone radio atlas of the southern hemisphere. We introduce EMU and review its science drivers and key science goals, updated and tailored to the current ASKAP five-year survey plan. The development of the survey strategy and planned sky coverage is presented, along with the operational aspects of the survey and associated data analysis, together with a selection of diagnostics demonstrating the imaging quality and data characteristics. We give a general description of the value-added data pipeline and data products before concluding with a discussion of links to other surveys and projects and an outline of EMU’s legacy value.
In 2010, USAID catalyzed the formation of One Health University Networks as part of a holistic response designed to promote the One Health approach for addressing complex health challenges. This globally connected One Health University network now includes the African One Health University Network (AFROHUN) and the Southeast Asia University Network (SEAOHUN) and has representation from over 120 universities in 17 countries across Africa and Southeast Asia. Over more than 15 years of USAID investment, these networks have trained more than 85,000 students, in-service professionals and faculty around the world in One Health principles and collaborative problem solving, grounded in One Health core competencies. These One Health practitioners have gone on to contribute to improved global health security in their communities and countries. The evolution and maturation of these networks is a testament to a strong vision and dedication to the task by leadership and donors. As the global academic community continues to refine and adapt training methodologies for ‘future ready’ individuals, resources and examples from One Health University Networks stand as a legacy to build upon.
Nuts are nutrient-rich, energy-dense foods that are associated with better diet quality in children(1), yet intake in Australian children remains low(2). Prospective studies have demonstrated positive associations between nut consumption and cognitive performance in children(3), while randomised controlled trials (RCTs) assessing nut consumption and cognitive performance in adults have reported inconsistent findings(4). This 2-phase cross-over RCT examined the feasibility of Australian children eating an almond-enriched diet (30 g almonds, 5 days per week) compared with a nut-free diet for 8 weeks each. Associated changes in diet quality, lifestyle factors and cognitive performance were also measured. Forty children (48% female, 8–13 years) who were low habitual nut consumers (<30 g/day) and free from nut allergies and cognitive, behavioural or medical conditions that could affect study outcomes were enrolled. Feasibility outcomes included retention, compliance with study foods and changes in ratings of liking and palatability of almonds. Other outcomes were assessed before and after each 8-week diet phase, separated by a 2-week washout. Parent/guardian–child dyads completed questionnaires about diet (diet quality score), physical activity, and sleep behaviour. Sleep quality and length were recorded for 7 nights prior to clinic visits. At each visit, sleepiness was assessed (Karolinska Sleepiness Scale) before children completed a computerised test battery (COMPASS) to assess cognitive performance across attention/concentration, executive function, memory, processing speed and verbal fluency domains. Analyses were performed using SPSS 26.0 software, with statistical significance defined as p < 0.05. Data were analysed using mixed effects models, with diet and time as fixed effects, a random effect of participant ID, and controlling for diet order, age, sex and sleepiness. Retention was excellent, with all participants completing the study, and mean compliance with almonds was 98%. Mean liking and palatability ratings declined after 8 weeks (−23 points, p = 0.006) but remained favourable. There were no significant changes in diet quality, physical activity or sleep (behaviour, length or quality) during the study. Changes in cognitive performance over time and between diets ranged from trivial to small (Cohen's d = 0.01–0.28) for all tests, failing to reach significance except for simple reaction time (faster response over time, d = −0.1, F(1,115.7) = 4.455, p = 0.037) and Peg and Ball response time (faster after the nut-free diet, d = 0.28, F(1,115.4) = 4.176, p = 0.043). This study demonstrated that it is feasible to conduct an almond-enriched dietary intervention in Australian children, with excellent retention and compliance with study requirements. Whilst significant changes in the scientific outcomes were limited, the study was not powered to detect them. Rather, these data will be valuable for determining the sample sizes required in future studies assessing nut interventions and cognitive performance in children.
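A minimal sketch of the mixed-model specification described above is shown below. The study's analyses were run in SPSS; this Python/statsmodels version, the file name, and the column names are illustrative assumptions only.

```python
# Illustrative mixed-effects model for one cognitive outcome, mirroring the analysis
# described above: diet and time as fixed effects, a random intercept per participant,
# and adjustment for diet order, age, sex and sleepiness. Column names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

scores = pd.read_csv("compass_scores.csv")  # hypothetical long-format data

model = smf.mixedlm(
    "reaction_time ~ diet + time + diet_order + age + sex + sleepiness",
    data=scores,
    groups=scores["participant_id"],  # random intercept for each child
)
result = model.fit()
print(result.summary())
```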
An important component of post-release monitoring of biological control of invasive plants is the tracking of species interactions. During post-release monitoring following the initial releases of the weevil Ceutorhynchus scrobicollis Nerenscheimer and Wagner (Coleoptera: Curculionidae) on garlic mustard, Alliaria petiolata (Marschall von Bieberstein) Cavara and Grande (Brassicaceae), in Ontario, Canada, we identified larvae of the tumbling flower beetle, Mordellina ancilla LeConte (Coleoptera: Mordellidae), in garlic mustard stems. This study documents the life history of M. ancilla on garlic mustard to assess potential interactions between M. ancilla and C. scrobicollis as a biological control agent. Garlic mustard stems were sampled at eight sites across southern Ontario over the course of one year to record the prevalence of this association and to observe the beetle's life cycle on the plant. We found M. ancilla to be a widespread stem-borer of late second-year and dead garlic mustard plants across sampling locations. This is the first host record for M. ancilla on garlic mustard. The observed life cycle of M. ancilla indicates that it is unlikely to negatively impact the growth and reproduction of garlic mustard and that it is unlikely to affect the use of C. scrobicollis as a biological control agent.
Root research on field-grown crops is hindered by the difficulty of estimating root biomass in soil. Root washing, the current standard method, is laborious and expensive. Biochemical methods that quantify root biomass in soil by targeting species-specific DNA have potential as a more efficient assay. We combined an efficient DNA extraction method, designed specifically to extract DNA from soil, with well-established quantitative PCR (qPCR) methods to estimate the root biomass of 22 wheat varieties grown in field trials over two seasons. We also developed an assay for estimating the root biomass of black-grass, a common weed of wheat cultivation.
Methods
Two robust qPCR assays were developed to estimate the quantity of plant root DNA in soil samples, one specific to wheat and barley, and a second specific to black-grass.
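As background to the quantification step, the sketch below shows the standard way of converting qPCR quantification cycles (Cq) into DNA quantities via a standard curve. This is the generic absolute-quantification approach, not code from the study, and the dilution-series values are placeholder assumptions.

```python
# Illustrative absolute quantification from a qPCR standard curve (not the study's code).
# Assumes a dilution series of known target DNA amounts with measured Cq values.
import numpy as np

# Standard curve: known DNA input (ng) and observed quantification cycles (Cq); placeholder values.
standard_ng = np.array([10.0, 1.0, 0.1, 0.01])
standard_cq = np.array([18.1, 21.5, 24.9, 28.3])

# Fit Cq = slope * log10(ng) + intercept.
slope, intercept = np.polyfit(np.log10(standard_ng), standard_cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1  # amplification efficiency (~1.0 is ideal)

def cq_to_ng(cq: float) -> float:
    """Interpolate an unknown soil sample's target DNA quantity (ng) from its Cq."""
    return 10 ** ((cq - intercept) / slope)

print(f"Amplification efficiency: {efficiency:.2f}")
print(f"Sample at Cq 23.0 ~ {cq_to_ng(23.0):.3f} ng target DNA")
```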
Results
The DNA qPCR method was comparable, with high correlations, to the results of root washing from soil cores taken from winter wheat field trials. The DNA qPCR assay showed that both variety and depth were significant factors in the distribution of root biomass in replicated field trials.
Conclusions
The results suggest that these DNA qPCR assays are a useful, high-throughput tool for investigating the genetic basis of wheat root biomass distribution in field-grown crops, and the impact of black-grass root systems on crop production.