Given increased survival among adults with congenital heart disease (CHD), we aimed to determine how outcomes of infective endocarditis in these patients differ from those of patients with structurally normal hearts in the general population.
Methods:
We conducted a retrospective cross-sectional study identifying infective endocarditis hospitalisations in patients 18 years and older from the National Inpatient Sample database between 2001 and 2016 using International Classification of Diseases diagnosis and procedure codes. Weighting was used to create national annual estimates indexed to the United States population, and multivariable logistic regression analysis determined variable associations. Outcome variables were mortality and surgery. The primary predictor variable was the presence or absence of CHD.
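The weighting step can be illustrated with a minimal sketch (hypothetical toy records; the NIS supplies a discharge-level sampling weight, here called `weight`): national estimates are weighted sums of sampled discharge records rather than raw counts.

```python
# Toy discharge records with hypothetical sampling weights: each
# sampled record stands in for `weight` discharges nationally.
records = [
    {"weight": 5.0, "chd": True,  "died": False},
    {"weight": 5.0, "chd": False, "died": True},
    {"weight": 4.8, "chd": False, "died": False},
]

# National estimate = sum of weights, not the raw record count.
national_estimate = sum(r["weight"] for r in records)
# Subgroup estimates (e.g. adults with CHD) are weighted the same way.
chd_estimate = sum(r["weight"] for r in records if r["chd"])
```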
Results:
We identified 1,096,858 estimated infective endocarditis hospitalisations, of which 17,729 (1.6%) were adults with CHD. A 125% increase in infective endocarditis hospitalisations occurred for adult CHD patients over the study period (p < 0.001). Adults with CHD were significantly less likely to experience mortality (5.4% vs. 9.5%, OR 0.54, CI 0.47–0.63, p < 0.001) and more likely to undergo in-hospital surgery (31.6% vs. 6.7%, OR 6.49, CI 6.03–6.98, p < 0.001) compared to the general population. CHD severity was not associated with increased mortality (p = 0.53). Microbiologic aetiology of infective endocarditis varied between groups (p < 0.001), with Streptococcus identified more commonly in adults with CHD compared to patients with structurally normal hearts (36.2% vs. 14.4%).
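As a plausibility check, the unadjusted mortality odds ratio can be recomputed from the proportions reported above (a toy arithmetic sketch; the published OR may additionally be model-adjusted):

```python
# Odds = p / (1 - p); the odds ratio compares CHD mortality (5.4%)
# with general-population mortality (9.5%), as reported in the abstract.
def odds(p):
    return p / (1 - p)

or_mortality = odds(0.054) / odds(0.095)  # ≈ 0.54, matching the reported OR
```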
Conclusions:
Adults with CHD hospitalised for infective endocarditis are less likely to experience mortality and more likely to undergo surgery than the general population.
Iron (Fe) minerals play a crucial role in biogeochemical cycles due to their ubiquity in nature, high adsorption capacity and redox activity towards many other elements. Mixed-valent Fe minerals are unique in that they contain both Fe(II) and Fe(III). For example, magnetite (Fe(II)Fe(III)2O4) nanoparticles (MNPs) can affect the availability and mobility of nutrients and contaminants, owing to their high surface-area-to-volume ratio and the presence of both Fe(II) and Fe(III), which allows redox transformation of inorganic and organic contaminants. Recent studies have shown that magnetite can serve as an electron source and sink for Fe(II)-oxidizing and Fe(III)-reducing microorganisms, storing and releasing electrons; thus, it functions as a biogeobattery. However, the ability of MNPs to act as biogeobatteries over consecutive redox cycles, and the consequences for mineral integrity and identity, remain unknown. Here, we show that MNPs worked as biogeobatteries over two consecutive redox cycles spanning 41 days. MNPs were first oxidized by the autotrophic nitrate-reducing Fe(II)-oxidizing culture KS and subsequently reduced by the Fe(III)-reducing Geobacter sulfurreducens. In addition to reduced magnetite, we identified the Fe(II) mineral vivianite after reduction, suggesting partial reductive dissolution of MNPs and re-crystallization of Fe2+ with phosphate from the growth medium. Measurements of the Fe(II)/Fe(III) ratio revealed microbial oxidation and reduction in both the first redox cycle (oxidation: 0.29±0.014, reduction: 0.75±0.023) and the second (oxidation: 0.30±0.015, reduction: 1.64±0.10). Relative changes in magnetic susceptibility (∆κ in %) were greater for the second oxidation (–8.7±1.99%) than the first (–3.9±0.19%) but smaller for the second reduction (+14.29±0.39%) compared to the first (+25.42±1.31%).
Our results suggest that MNPs served as biogeobatteries but became less stable over time, which has significant consequences for associated contaminants, nutrients and bioavailability for Fe-metabolizing microorganisms.
During the menopausal transition, women often encounter a range of physical and psychological symptoms which negatively impact health-related quality of life (HRQoL)(1). Diet quality has previously been identified as a modifiable factor associated with mitigating the severity of these symptoms in peri-menopausal and menopausal women(2). We therefore explored the independent associations between adherence to a Mediterranean diet (MedDiet) and the severity of menopausal symptoms in peri-menopausal and menopausal women living in Australia. We also explored the association between MedDiet adherence and HRQoL in the same cohort. We conducted a cross-sectional study of Australian peri-menopausal or menopausal women aged between 40 and 60 years. An 86-item self-administered questionnaire was used to assess the relationship between adherence to a MedDiet and severity of symptoms. MedDiet adherence was assessed using the Mediterranean Diet Adherence Screener (MEDAS); the Menopause Rating Scale (MRS) was used to assess the severity of menopausal symptoms related to somatic, psychological and urinary-genital domains; and the 36-item short form survey instrument (SF-36) was used to assess HRQoL. Multivariable linear regression analysis (with 95% CIs) was used to investigate the independent associations between adherence to a MedDiet, severity of menopausal symptoms and HRQoL subscales using one unadjusted and five adjusted predictor models. A total of n = 207 participants (50.7 ± 4.3 years; BMI: 28.0 ± 7.4 kg/m2) were included in the final analyses. Participants reported low-to-moderate adherence to a MedDiet (5.2 ± 1.8; range: 1-11). We showed that MedDiet adherence was not associated with severity of menopausal symptoms.
However, when assessing individual dietary constituents of the MEDAS, we showed that low consumption of sugar-sweetened beverages (<250 ml per day) was inversely associated with joint and muscle complaints, independent of all covariates (β = −0.149; CI: −0.118, −0.022; P = 0.042). Furthermore, adherence to a MedDiet was positively associated with the physical function subscale of HRQoL (β = 0.173, CI: 0.001, 0.029; P = 0.031), and a low intake of red and processed meats (≤ 1 serve per day) was positively associated with the general health subscale (β = 0.296, CI: 0.005, 0.014; P < 0.001), independent of all covariates used in the fully adjusted model. Our results suggest that diet quality may be related to severity of menopausal symptoms and HRQoL in peri-menopausal and menopausal women. However, longitudinal analyses and robust clinical trials are needed to better elucidate these findings.
Informal carers play an essential role in the care of individuals with Parkinson’s disease (PD). This role, however, is often fraught with difficulties, including emotional, physical, and financial strain. Coping styles and relationship quality have been hypothesized to influence the impact of stressors. The aim of this study was to examine the relationship between carers’ coping style, relationship quality, and carer burden.
Design:
Cross-sectional.
Participants:
Thirty-nine PD patient carer dyads were included in the study.
Measurements:
Participants completed self-rated questionnaires including the Dyadic Adjustment Scale, Zarit Burden Interview, and Brief Coping Orientation to Problems Experienced Inventory.
Results:
Correlational analyses found significant positive correlations between carer burden and all three coping styles (problem-focused, emotion-focused, and dysfunctional). There was also a moderate association between carers’ perceived relationship quality and satisfaction and carer burden. Regression analyses found that carers’ gender, severity of PD, relationship quality, and emotion-focused and dysfunctional coping styles did not predict carer burden. Conversely, problem-focused coping style did predict carer burden.
Conclusion:
The results highlight that there is no one perfect way to react to and care for a loved one, and they serve as important information for practitioners who design and implement interventions.
Background: On dual-energy CT (DECT), the ratio of maximum iodine concentration within brain parenchyma to that in the superior sagittal sinus has been shown to predict hemorrhagic transformation. We aimed to determine whether this ratio also predicts the development of an infarct. Methods: 53 patients with small infarct cores (ASPECTS ≥ 7) and good endovascular recanalization (mTICI 2b/3) were enrolled. Maximum brain parenchymal iodine concentration on DECT relative to the superior sagittal sinus (iodine ratio) was correlated with the development of an infarct on follow-up CT. Results: All patients showed contrast staining; 52 developed infarcts in the area of staining. The extent of infarction (smaller than, equal to, or larger than the area of staining) did not correlate with the iodine ratio. Conclusions: Brain parenchyma with contrast staining on post-procedure head CT almost invariably goes on to infarct; however, the extent of infarct development is not predicted by the intensity of contrast staining.
n = 53 patients with successful recanalization of anterior circulation LVO infarct (mTICI 2b/3) with post-procedural parenchymal iodine staining

F/U infarct extent                                 Number   Hemorrhage (n)   Iodine ratio on initial CT (median/range)
0: No infarct in area of staining                       1                0   101 (101-101)*
1: Infarct smaller than staining                        8                0   138 (64-341)*
2: Infarct equal to staining                           14                0   140 (74-259)*
3: Infarct larger than staining                        30                6   120 (23-1715)*
0,1: No infarct or infarct smaller than staining        9                0   114 (64-341)*
2,3: Infarct equal to or larger than staining          44                6   126 (23-1714)*
All                                                    53                6   123 (23-1714)*
There was no correlation between the degree of contrast staining on the initial post-procedural CT, expressed as the iodine ratio, and F/U infarct extent.
Clinical trials continue to face significant challenges in participant recruitment and retention. The Recruitment Innovation Center (RIC), part of the Trial Innovation Network (TIN), has been funded by the National Center for Advancing Translational Sciences of the National Institutes of Health to develop innovative strategies and technologies to enhance participant engagement in all stages of multicenter clinical trials. In collaboration with investigator teams and liaisons at Clinical and Translational Science Award institutions, the RIC is charged with the mission to design, field-test, and refine novel resources in the context of individual clinical trials. These innovations are disseminated via newsletters, publications, a virtual toolbox on the TIN website, and RIC-hosted collaboration webinars. The RIC has designed, implemented, and provided customized recruitment support for 173 studies across many diverse disease areas. This support has incorporated site feasibility assessments, community input sessions, recruitment materials recommendations, social media campaigns, and an array of study-specific suggestions. The RIC’s goal is to evaluate the efficacy of these resources and provide access to all investigator teams, so that more trials can be completed on time, within budget, with diverse participation, and with enough accrual to power statistical analyses and make substantive contributions to the advancement of healthcare.
To prioritise and refine a set of evidence-informed statements into advice messages to promote vegetable liking in early childhood, and to determine applicability for dissemination of advice to relevant audiences.
Design:
A nominal group technique (NGT) workshop and a Delphi survey were conducted to prioritise and achieve consensus (≥70 % agreement) on thirty evidence-informed maternal (perinatal and lactation stage), infant (complementary feeding stage) and early years (family diet stage) vegetable-related advice messages. Messages were validated via triangulation analysis against the strength of evidence from an Umbrella review of strategies to increase children’s vegetable liking, and gaps in advice from a Desktop review of vegetable feeding advice.
Setting:
Australia.
Participants:
A purposeful sample of key stakeholders (NGT workshop, n 8 experts; Delphi survey, n 23 end users).
Results:
Participant consensus identified the most highly ranked priority messages associated with the strategies of: ‘in-utero exposure’ (perinatal and lactation, n 56 points) and ‘vegetable variety’ (complementary feeding, n 97 points; family diet, n 139 points). Triangulation revealed two strategies (‘repeated exposure’ and ‘variety’) and their associated advice messages suitable for policy and practice, twelve for research and four for food industry.
Conclusions:
Supported by national and state feeding guideline documents and resources, the advice messages relating to ‘repeated exposure’ and ‘variety’ to increase vegetable liking can be communicated to families and caregivers by healthcare practitioners. The food industry provides a vehicle for advice promotion and product development. Further research, where stronger evidence is needed, could further inform strategies for policy and practice, and food industry application.
Clinical trial participation among US Hispanics remains low, despite a significant effort by research institutions nationwide. ResearchMatch, a national online platform, has matched 113,372 individuals interested in participating in research with studies conducted by 8778 researchers. To increase accessibility to Spanish speakers, we translated the ResearchMatch platform into Spanish by implementing tenets of health literacy and respecting linguistic and cultural diversity across the US Hispanic population. We describe this multiphase process, preliminary results, and lessons learned.
Methods:
Translation of the ResearchMatch site consisted of several activities including: (1) improving the English language site’s reading level, removing jargon, and using plain language; (2) obtaining a professional Spanish translation of the site and incorporating iterative revisions by a panel of bilingual community members from diverse Hispanic backgrounds; (3) technical development and launch; and (4) initial promotion.
Results:
The Spanish language version was launched in August 2018, after 11 months of development. Community input improved the initial translation, and early registration and use by researchers demonstrate the utility of Spanish ResearchMatch in engaging Hispanics. Over 12,500 volunteers in ResearchMatch self-identify as Hispanic (8.5%). From August 2018 to March 2020, 162 volunteers registered through the Spanish language version of ResearchMatch, and over 500 new and existing volunteers have registered a preference to receive messages about studies in Spanish.
Conclusion:
By applying the principles of health literacy and cultural competence, we developed a Spanish language translation of ResearchMatch. Our multiphase approach to translation included key principles of community engagement that should prove informative to other multilingual web-based platforms.
Systematic, national surveillance of outbreaks of intestinal infectious disease has been undertaken by Public Health England (PHE) since 1992. Between 1992 and 2002, there were 19 outbreaks linked to raw drinking milk (RDM) or products made using raw milk, involving 229 people; 36 of these were hospitalised. There followed an eleven-year period (2003–2013) in which no outbreaks linked to RDM were reported. However, since 2014, seven outbreaks of Escherichia coli O157:H7 (n = 3) or Campylobacter jejuni (n = 4) caused by contaminated RDM have been investigated and reported. Between 2014 and 2017, there were 114 cases, five reported hospitalisations and one death. The data presented within this review indicate that the risk associated with RDM consumption has increased since 2014. Despite the labelling requirements and recommendations that children should not consume RDM, almost a third of outbreak cases were children. In addition, there has been an increase in consumer popularity and in registered RDM producers in the UK. The Food Standards Agency (FSA) continue to provide advice on RDM to consumers and have recently made additional recommendations to enhance existing controls around registration and hygiene of RDM producers.
Epoch of Reionisation (EoR) data analysis requires unprecedented levels of accuracy in radio interferometer pipelines. We have developed an imaging power spectrum analysis to meet these requirements and generate robust 21 cm EoR measurements. In this work, we build a signal path framework to mathematically describe each step in the analysis, from data reduction in the Fast Holographic Deconvolution (FHD) package to power spectrum generation in the εppsilon package. In particular, we focus on the distinguishing characteristics of FHD/εppsilon: highly accurate spectral calibration, extensive data verification products, and end-to-end error propagation. We present our key data analysis products in detail to facilitate understanding of the prominent systematics in image-based power spectrum analyses. As a verification to our analysis, we also highlight a full-pipeline analysis simulation to demonstrate signal preservation and lack of signal loss. This careful treatment ensures that the FHD/εppsilon power spectrum pipeline can reduce radio interferometric data to produce credible 21 cm EoR measurements.
We apply two methods to estimate the 21-cm bispectrum from data taken within the Epoch of Reionisation (EoR) project of the Murchison Widefield Array (MWA). Using data acquired with the Phase II compact array allows a direct bispectrum estimate to be undertaken on the multiple redundantly spaced triangles of antenna tiles, as well as an estimate based on data gridded to the uv-plane. The direct and gridded bispectrum estimators are applied to 21 h of high-band (167–197 MHz; z = 6.2–7.5) data from the 2016 and 2017 observing seasons. Analytic predictions for the bispectrum bias and variance for point-source foregrounds are derived. We compare the output of these approaches, the foreground contribution to the signal, and future prospects for measuring the bispectra with redundant and non-redundant arrays. We find that some triangle configurations yield bispectrum estimates that are consistent with the expected noise level after 10 h, while equilateral configurations are strongly foreground-dominated. Careful choice of triangle configurations may be made to reduce foreground bias that hinders power spectrum estimators, and the 21-cm bispectrum may be accessible in less time than the 21-cm power spectrum for some wave modes, with detections in hundreds of hours.
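The direct estimator described above can be sketched in a few lines: for a closed triangle of baselines (u1 + u2 + u3 = 0), the bispectrum is the triple product of the three baseline visibilities, averaged over redundant triangles and time samples. This is a minimal toy sketch with synthetic visibilities, not the MWA pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic visibilities for the three baselines of one closed triangle,
# measured on 8 redundant copies over 100 time samples (toy numbers).
n_redundant, n_times = 8, 100
signal = 2.0 + 0.5j  # common toy sky term on each baseline
visibilities = [
    signal + 0.1 * (rng.standard_normal((n_redundant, n_times))
                    + 1j * rng.standard_normal((n_redundant, n_times)))
    for _ in range(3)
]

# Direct bispectrum estimate: average the triple product over all
# redundant triangles and times; zero-mean noise terms average away.
bispectrum = np.mean(visibilities[0] * visibilities[1] * visibilities[2])
```

Averaging over many redundant triangles is what suppresses the noise terms, which is why the redundantly spaced Phase II layout suits the direct estimator.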
A national need is to prepare for and respond to accidental or intentional disasters categorized as chemical, biological, radiological, nuclear, or explosive (CBRNE). These incidents require specific subject-matter expertise, yet have commonalities. We identify 7 core elements comprising CBRNE science that require integration for effective preparedness planning and public health and medical response and recovery. These core elements are (1) basic and clinical sciences, (2) modeling and systems management, (3) planning, (4) response and incident management, (5) recovery and resilience, (6) lessons learned, and (7) continuous improvement. A key feature is the ability of relevant subject matter experts to integrate information into response operations. We propose the CBRNE medical operations science support expert as a professional who (1) understands that CBRNE incidents require an integrated systems approach, (2) understands the key functions and contributions of CBRNE science practitioners, (3) helps direct strategic and tactical CBRNE planning and responses through first-hand experience, and (4) provides advice to senior decision-makers managing response activities. Recognition of both CBRNE science as a distinct competency and the establishment of the CBRNE medical operations science support expert informs the public of the enormous progress made, broadcasts opportunities for new talent, and enhances the sophistication and analytic expertise of senior managers planning for and responding to CBRNE incidents.
Shiga toxin-producing Escherichia coli (STEC) infection can cause serious illness including haemolytic uraemic syndrome. The role of socio-economic status (SES) in differential clinical presentation and exposure to potential risk factors amongst STEC cases has not previously been reported in England. We conducted an observational study using a dataset of all STEC cases identified in England, 2010–2015. Odds ratios for clinical characteristics of cases and foodborne, waterborne and environmental risk factors were estimated using logistic regression, stratified by SES, adjusting for baseline demographic factors. Incidence was higher in the highest SES group compared to the lowest (RR 1.54, 95% CI 1.19–2.00). Odds of Accident and Emergency attendance (OR 1.35, 95% CI 1.10–1.75) and hospitalisation (OR 1.71, 95% CI 1.36–2.15) because of illness were higher in the most disadvantaged compared to the least, suggesting potential lower ascertainment of milder cases or delayed care-seeking behaviour in disadvantaged groups. Advantaged individuals were significantly more likely to report salad/fruit/vegetable/herb consumption (OR 1.59, 95% CI 1.16–2.17), non-UK or UK travel (OR 1.76, 95% CI 1.40–2.27; OR 1.85, 95% CI 1.35–2.56) and environmental exposures (walking in a paddock, OR 1.82, 95% CI 1.22–2.70; soil contact, OR 1.52, 95% CI 1.09–2.13), suggesting other unmeasured risks, such as person-to-person transmission, could be more important in the most disadvantaged group.
Shiga toxin-producing Escherichia coli (STEC) is a pathogen that can cause bloody diarrhoea and severe complications. Cases occur sporadically but outbreaks are also common. Understanding the incubation period distribution and the factors influencing it will help in the investigation of exposures and consequent disease control. We extracted individual patient data for STEC cases associated with outbreaks with a known source of exposure in England and Wales. The incubation period was derived and cases were described according to patient and outbreak characteristics. We tested for heterogeneity in reported incubation period between outbreaks and described the pattern of heterogeneity. We employed a multi-level regression model to examine the relationship of patient characteristics, such as age, gender and reported symptoms, and outbreak characteristics, such as mode of transmission, with the incubation period. A total of 205 cases from 41 outbreaks were included in the study, of which 64 cases (31%) were from a single outbreak. The median incubation period was 4 days. Cases reporting bloody diarrhoea reported shorter incubation periods compared with cases without bloody diarrhoea, and likewise, cases aged between 40 and 59 years reported shorter incubation periods compared with other age groups. It is recommended that public health officials consider the characteristics of cases involved in an outbreak in order to inform the outbreak investigation and the period of exposure to be investigated.
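The incubation period derivation is simple enough to sketch: subtract each case's exposure date from its symptom-onset date and summarize the distribution by its median (the dates below are hypothetical, not the study data):

```python
from datetime import date
from statistics import median

# Hypothetical (exposure, onset) date pairs for outbreak cases.
cases = [
    (date(2020, 6, 1), date(2020, 6, 4)),
    (date(2020, 6, 1), date(2020, 6, 6)),
    (date(2020, 6, 2), date(2020, 6, 5)),
    (date(2020, 6, 2), date(2020, 6, 9)),
    (date(2020, 6, 3), date(2020, 6, 7)),
]

# Incubation period per case = onset date minus exposure date, in days.
incubation_days = [(onset - exposure).days for exposure, onset in cases]
median_incubation = median(incubation_days)  # the summary statistic reported
```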
It is well established that there is a high prescribing rate of psychotropic agents in residential aged care (RAC). The appropriateness of these medications has become controversial, given the limited data on efficacy and growing evidence of associated adverse outcomes.
Objective:
To assess psychotropic prescribing in RAC including identification of potentially inappropriate prescriptions (PIPs) and common psychological and behavioral symptoms indicated for prescribing. These were viewed in context of dementia and different RAC facilities.
Methods:
Electronic care plans of 779 RAC residents across 12 facilities were examined to elucidate psychotropic prescribing rates, PIPs, and indications for use.
Results:
Almost one in two residents (48.1%) was prescribed a psychotropic drug. The primary reasons for prescribing were depression (61.5%), anxiety (26.7%), sleep problems (25.4%), agitation (13.7%), psychosis (11.0%), and other behaviors (7.2%). Residents with dementia (56.6%) were more likely to be prescribed a drug for agitation and psychosis, and had a significantly increased prescription rate for antidepressants (OR = 1.50, 95% CI = 1.08–2.08, p = 0.01) and antipsychotics (OR = 1.88, 95% CI = 1.23–2.88, p < 0.01). Conversely, residents with dementia were less likely to receive medication to combat sleeping difficulties, with significantly lower benzodiazepine prescribing (OR = 0.63, 95% CI = 0.44–0.91, p = 0.01). Over half of all psychotropic prescriptions (54.0%) were potentially inappropriate based on the Beers Criteria. There was high variability of prescribing rates between homes.
Conclusion:
There is a high prescribing rate of potentially inappropriate medications. Residents with dementia are more likely to receive medication for agitation and psychosis, and are less likely to receive medication to combat sleeping difficulties.
Early detection of karyotype abnormalities, including aneuploidy, could aid producers in identifying animals which, for example, would not be suitable candidate parents. Genome-wide genetic marker data in the form of single nucleotide polymorphisms (SNPs) are now being routinely generated on animals. The objective of the present study was to describe the statistics that could be generated from the allele intensity values from such SNP data to diagnose karyotype abnormalities; of particular interest was whether detection of aneuploidy was possible with both commonly used genotyping platforms in agricultural species, namely the Applied Biosystems™ Axiom™ and the Illumina platform. The hypothesis was tested using a case study of a set of dizygotic X-chromosome monosomy 53,X sheep twins. Genome-wide SNP data were available from the Illumina platform (11 082 autosomal and 191 X-chromosome SNPs) on 1848 male and 8954 female sheep and available from the Axiom™ platform (11 128 autosomal and 68 X-chromosome SNPs) on 383 female sheep. Genotype allele intensity values, either as their original raw values or transformed to logarithm intensity ratio (LRR), were used to accurately diagnose two dizygotic (i.e. fraternal) twin 53,X sheep, both of which received their single X chromosome from their sire. This is the first reported case of 53,X dizygotic twins in any species. Relative to the X-chromosome SNP genotype mean allele intensity values of normal females, the mean allele intensity value of SNP genotypes on the X chromosome of the two females monosomic for the X chromosome was 7.45 to 12.4 standard deviations less, and were easily detectable using either the Axiom™ or Illumina genotype platform; the next lowest mean allele intensity value of a female was 4.71 or 3.3 standard deviations less than the population mean depending on the platform used.
Both 53,X females could also be detected based on the genotype LRR, although this was more easily detectable when comparing the mean LRR of the X chromosome of each female to the mean LRR of her autosomes. On autopsy, the ovaries of the two sheep were small for their age and evidence of prior ovulation was not appreciated. In both sheep, the density of primordial follicles in the ovarian cortex was lower than normally found in ovine ovaries, and primary follicle development was not observed. Mammary gland development was very limited. These results substantiate previous studies in other species showing that aneuploidy can be readily detected using SNP genotype allele intensity values that are generally already available, and the approach proposed in the present study was agnostic to genotyping platform.
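The detection rule described above (flagging animals whose mean X-chromosome allele intensity falls many standard deviations below the female population mean) can be sketched as follows; the intensity values are synthetic, chosen only so that a monosomic animal sits near half the normal level:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic per-animal mean X-chromosome allele intensities: normal
# females cluster near 1.0; a 53,X female with one X copy sits near 0.55.
normal_females = rng.normal(loc=1.0, scale=0.02, size=500)
suspected = np.array([0.55, 0.56])  # two hypothetical monosomic twins
intensities = np.concatenate([normal_females, suspected])

# Standardize against the normal-female distribution and flag animals
# far below the mean (the study reports 7.45-12.4 SD for the twins).
z = (intensities - normal_females.mean()) / normal_females.std()
flagged = np.where(z < -7)[0]  # indices of putative 53,X animals
```

The same screen works on either genotyping platform because it uses only within-platform means and standard deviations, which is the sense in which the approach is platform-agnostic.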
Conventionally, perennial ryegrass evaluations are conducted under simulated grazing studies to identify varieties with the best phenotypic performance. However, cut-plot environments differ greatly from those experienced on commercial farms, as varieties are not exposed to the same stress levels in test environments. It could be argued that plot-based testing regimes provide little direction to plant breeders in the development of advanced varieties. Varietal phenotypic performance needs to be quantified in ‘commercial’ situations. The objective of the current study was to evaluate the phenotypic performance of a range of perennial ryegrass varieties under commercial farm conditions. Monocultures of 11 Irish Recommended List perennial ryegrass varieties were sown on 66 commercial farms throughout Ireland, where performance was evaluated over a 3-year period from 2013 to 2015, inclusive. A linear mixed model was used to quantify variety effects on grassland phenotypic performance characteristics. No significant overall variety effect was estimated for total, seasonal or silage herbage production. Despite the lack of overall variety effects, pairwise comparisons found significant performance differences between individual varieties. Grazed herbage yield is of primary importance and was shown to be correlated strongly with total production (0.71); grazed herbage yield differed significantly by variety, with a range of 1927 kg dry matter (DM)/ha between the highest and lowest performing varieties. Sward quality (dry matter digestibility [DMD]) and density were influenced by variety, with a range of 44 g/kg DM for DMD and 0.7 ground score units between the highest and lowest performing varieties.
Results of the current study show that on-farm evaluation is effective in identifying the most suitable varieties for intensive grazing regimes, and the phenotypic variance identified among varieties’ performance for many traits should allow for improved genetic gain in areas such as DM production, persistence and grazing efficiency.