We investigated the effects of maternal vitamin and mineral supplementation throughout gestation on gene expression in the jejunal mucosa of neonatal calves. Crossbred Angus heifers (n = 14) were estrus synchronized, bred to female-sexed semen, and randomly assigned to a basal diet (Control, CON; n = 7) or the basal diet plus vitamin and mineral supplement (Treatment, VTM; n = 7). After parturition, calves were removed from their dams before suckling, fed colostrum replacer, and euthanized 30 h after the first feeding. A subsample of the mucosa of the mid-jejunum was collected, and total RNA was isolated. Gene expression was measured using RNA-Seq, and differentially expressed genes (DEGs) were identified using DESeq2. We identified 528 DEGs from the jejunal mucosa between the VTM and CON calves (P ≤ 0.05 and |log2FC| ≥ 0.5). The DEGs were associated with nutrient transport, lipid metabolism, and immune-related biological processes and pathways. Interestingly, genes underlying the complement and coagulation cascades were mostly downregulated in calves born to VTM dams. On the other hand, the cytokine-cytokine receptor interaction KEGG pathway showed most genes upregulated (LIFR, KDR, TNFRSF4, TNFSF18, FLT1, and TNFRSF12A). Our results show that vitamin and mineral supplementation throughout gestation affects genes underlying tissue structure, nutrient transport and metabolism, and immune system pathways in neonates. The implications of such changes and the long-term outcomes on herd health and performance warrant further research.
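The significance filter described above (P ≤ 0.05 and |log2FC| ≥ 0.5) can be sketched in a few lines. The gene names and values below are illustrative placeholders, not the study's data; a real analysis would apply the same thresholds to a DESeq2 results table.

```python
# Hedged sketch of the DEG filter described in the abstract
# (P <= 0.05 and |log2FC| >= 0.5). All rows here are hypothetical.
results = [
    {"gene": "LIFR",    "log2fc":  0.9, "pvalue": 0.01},
    {"gene": "KDR",     "log2fc":  0.6, "pvalue": 0.04},
    {"gene": "TNFRSF4", "log2fc": -0.7, "pvalue": 0.03},
    {"gene": "GENE_X",  "log2fc":  0.2, "pvalue": 0.20},
]

def is_deg(row, p_cut=0.05, lfc_cut=0.5):
    """Apply both significance thresholds to one gene's test result."""
    return row["pvalue"] <= p_cut and abs(row["log2fc"]) >= lfc_cut

degs = sorted(r["gene"] for r in results if is_deg(r))
print(degs)  # genes passing both thresholds
```

Note that the absolute value on the fold change keeps both up- and downregulated genes, which is why the abstract can report directionality separately for each pathway.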
To report on the design and results of an innovative nurse practitioner (NP)-led specialist primary care service for children facing housing instability.
Background:
During 2017–2018, children aged 0–14 years represented 23% of the total population receiving support from specialist homeless services in Australia. The impact of housing instability on Australian children is considerable, resulting in disengagement from social institutions including health and education, and poorer physical and mental health outcomes across the lifespan. Current services fail to adequately address health and educational needs of children facing housing insecurity. Research identifies similar circumstances for children in other high-income countries. This paper outlines the design, and reports on results of, an innovative NP-led primary care service for children facing housing instability introduced into three not-for-profit faith-based services in one Australian state.
Methods:
Between 2019 and 2021, 66 children of parents experiencing housing instability received a standardized health assessment by an NP, with referral where appropriate. Data from the standardized tool, such as condition and severity, were recorded to determine common conditions. In addition, comprehensive case notes recorded by the NP were used to understand potential causes of conditions and referral needs, including potential barriers.
Findings:
The 66 children assessed were aged between 7 weeks and 16 years. Developmental delay, low immunization rates, and dental caries were the most common conditions identified. Access to appropriate services was inhibited by cost, disengagement, and COVID-19.
Conclusion:
Given their advanced skills and knowledge, embedding NPs in specialist homeless services is advantageous to help vulnerable children.
Tetflupyrolimet (Dodhylex™ Active, FMC Corporation) is a novel herbicide inhibiting de novo pyrimidine biosynthesis that controls grassy weeds preemergence in rice (Oryza sativa L.) production. Field trials were conducted from 2021 to 2024 to evaluate turfgrass tolerance to tetflupyrolimet applications for annual bluegrass (Poa annua L.) and smooth crabgrass [Digitaria ischaemum (Schreb.) Schreb. ex Muhl.] control. Tolerance was evaluated on seven turfgrass species, including creeping bentgrass (Agrostis stolonifera L.), Kentucky bluegrass (Poa pratensis L.), tall fescue [Schedonorus arundinaceus (Schreb.) Dumort.; syn.: Festuca arundinacea Schreb.], hybrid bermudagrass [Cynodon dactylon (L.) Pers. × Cynodon transvaalensis Burtt-Davy], and manilagrass [Zoysia matrella (L.) Merr.] at various mowing heights ranging from 3.8 to 12.5 mm. Separate experiments were conducted on each turfgrass species to evaluate tolerance in both fall and spring. Tetflupyrolimet was applied at rates of 0, 25, 50, 100, 200, 400, 800, 1600, 3200, or 6400 g ai ha−1. No injury was observed on any warm-season turfgrass species in either season, whereas cool-season grass tolerance varied among species each season; however, cool-season turfgrass tolerance for all species was greater in spring than fall. While efficacy of tetflupyrolimet (400 g ha−1) for preemergence D. ischaemum control varied among years, mixtures of tetflupyrolimet (400 g ha−1), pyroxasulfone (128 g ai ha−1), and rimsulfuron (35 g ai ha−1) applied preemergence or early postemergence effectively controlled multiple-resistant P. annua in both seasons. Overall, these findings highlight that warm-season turfgrasses are highly tolerant of tetflupyrolimet applications for P. annua or D. ischaemum control.
Studies show that people with severe mental illness (SMI) have a greater risk of dying from colorectal cancer (CRC). These studies mostly predate the introduction of national bowel cancer screening programmes (NBCSPs), and it is unknown whether such programmes have reduced the disparity in CRC-related mortality for people with SMI.
Methods
We compared mortality rates following CRC diagnosis at colonoscopy between a nationally representative sample of people with and without SMI who participated in Australia’s NBCSP. Participation was defined as the return of a valid immunochemical faecal occult blood test (iFOBT). We also compared mortality rates between people with SMI who did and did not participate in the NBCSP. SMI was defined as receiving two or more Pharmaceutical Benefits Scheme prescriptions for second-generation antipsychotics or lithium.
Results
Amongst NBCSP participants, the incidence of CRC in the SMI cohort was lower than in the controls (hazard ratio [HR] 0.77, 95% confidence interval [CI] 0.61–0.98). In spite of this, their all-cause mortality rate was 1.84 times higher (95% CI 1.12–3.03), although there was only weak evidence of a difference in CRC-specific mortality (HR 1.82; 95% CI 0.93–3.57). People with SMI who participated in the NBCSP had better all-cause survival than those who were invited to participate but did not return a valid iFOBT (HR 0.67, 95% CI 0.50–0.88). The benefit of participation was strongest for males with SMI and included improved all-cause and CRC-specific survival.
Conclusions
Participation in the NBCSP may be associated with improved survival following a CRC diagnosis for people with SMI, especially males, although they still experienced greater mortality than the general population. Approaches to improving CRC outcomes in people with SMI should include targeted screening and increased awareness about the benefits of participation.
Trial registration
Australian and New Zealand Clinical Trials Registry (Trial ID: ACTRN12620000781943).
Fetal growth restriction (FGR) is associated with increased risk of developing non-communicable diseases. We have a placenta-specific nanoparticle gene therapy protocol that increases placental expression of human insulin-like growth factor 1 (hIGF1), for the treatment of FGR in utero. We aimed to characterize the effects of FGR on hepatic gluconeogenesis pathways during early stages of FGR establishment, and to determine whether placental nanoparticle-mediated hIGF1 therapy could resolve differences in the FGR fetus. Female Hartley guinea pigs (dams) were fed either a Control or Maternal Nutrient Restriction (MNR) diet using established protocols. At GD30-33, dams underwent ultrasound guided, transcutaneous, intraplacental injection of hIGF1 nanoparticle or PBS (sham) and were sacrificed 5 days post-injection. Fetal liver tissue was fixed and snap frozen for morphology and gene expression analysis. In female and male fetuses, liver weight as a percentage of body weight was reduced by MNR, and not changed with hIGF1 nanoparticle treatment. In female fetal livers, expression of hypoxia inducible factor 1 (Hif1α) and tumor necrosis factor (Tnfα) were increased in MNR compared to Control, but reduced in MNR + hIGF1 compared to MNR. In male fetal liver, MNR increased expression of Igf1 and decreased expression of Igf2 compared to Control. Igf1 and Igf2 expression was restored to Control levels in the MNR + hIGF1 group. These data provide further insight into the sex-specific mechanistic adaptations seen in FGR fetuses and demonstrate that disruption to fetal developmental mechanisms may be returned to normal by treatment of the placenta.
The objective of this study was to investigate changes in serum biomarkers of acute brain injury, including white matter and astrocyte injury, during chronic foetal hypoxaemia. We have previously shown histopathological changes in myelination and neuronal density in fetuses with chronic foetal hypoxaemia at a level consistent with congenital heart disease (CHD).
Methods:
Mid-gestation foetal sheep (110 ± 3 days gestation) were cannulated and attached to a pumpless, low-resistance oxygenator circuit, and incubated in a sterile fluid environment mimicking the intrauterine environment. Fetuses were maintained with an oxygen delivery of 20–25 ml/kg/min (normoxemia) or 14–16 ml/kg/min (hypoxaemia). Myelin Basic Protein and Glial Fibrillary Acidic Protein serum levels in the two groups were assessed by ELISA at baseline and at 7, 14, and 21 days of support.
Results:
Based on overlapping 95% confidence intervals, there were no statistically significant differences in either Myelin Basic Protein or Glial Fibrillary Acidic Protein serum levels between the normoxemic and hypoxemic groups, at any time point. No statistically significant correlations were observed between oxygen delivery and levels of Myelin Basic Protein and Glial Fibrillary Acidic Protein.
Conclusion:
Chronic foetal hypoxaemia during mid-gestation is not associated with elevated serum levels of acute white matter (Myelin Basic Protein) or astrocyte injury (Glial Fibrillary Acidic Protein), in this model. In conjunction with our previously reported findings, our data support the hypothesis that the brain dysmaturity with impaired myelination found in fetuses with chronic hypoxaemia is caused by disruption of normal developmental pathways rather than by direct cellular injury.
Major depressive disorder (MDD) is a common, debilitating, phenotypically heterogeneous disorder with heritability ranging from 30% to 50%. Compared with other psychiatric disorders, its high prevalence, moderate heritability, and strong polygenicity have posed major challenges for gene mapping in MDD. Studies of common genetic variation in MDD, driven by large international collaborations such as the Psychiatric Genomics Consortium, have confirmed the highly polygenic nature of the disorder and implicated over 100 genetic risk loci to date. Rare copy number variants associated with MDD risk were also recently identified. The goal of this review is to present a broad picture of our current understanding of the epidemiology, genetic epidemiology, molecular genetics, and gene–environment interplay in MDD. Insights into the impact of genetic factors on the aetiology of this complex disorder hold great promise for improving clinical care.
Optimising oilseed rape canopy size through correct management is crucial for maximising yield. Plant growth regulators (PGRs) and nitrogen (N) fertiliser are generally applied at a flat rate; however, variable applications may be useful for the optimisation of canopy size. The aim of this paper was to understand the potential for spectral reflectance indices to predict green area index (GAI) and crop N content in winter oilseed rape, with specific focus on the Fritzmeier Isaria Crop Sensor. Three large oilseed rape chessboard experiments were set up in 2015 and 2016 in the UK. The results show good correlations between the Isaria indices and both GAI and crop N content, suggesting that the Isaria may be a useful tool for variably applying PGRs and N fertiliser to oilseed rape.
A range of precision farming technologies are used commercially for variable rate applications of nitrogen (N) for cereals, yet these usually adjust N rates from a pre-set value, rather than predicting economically optimal N requirements on an absolute basis. This paper reports chessboard experiments set up to examine variation in N requirements, to develop and test systems for its prediction, and to assess its predictability. Results showed very substantial variability in fertiliser N requirements within fields, typically >150 kg ha−1, and large variation in optimal yields, typically >2 t ha−1. Despite this, calculated increases in yield and gross margin with N requirements perfectly matched across fields were surprisingly modest (compared with the uniform average rate). Implications are discussed, including the causes of the large variation in grain yield that remained after N limitations were removed.
People with a history of self-harm are at a far greater risk of suicide than the general population. However, the relationship between self-harm and suicide is complex.
Aims
To undertake the first systematic review and meta-analysis of prospective studies of risk factors and risk assessment scales to predict suicide following self-harm.
Method
We conducted a search for prospective cohort studies of populations who had self-harmed. For the review of risk scales we also included studies examining the risk of suicide in people under specialist mental healthcare, in order to broaden the scope of the review and increase the number of studies considered. Differences in predictive accuracy between populations were examined where applicable.
Results
Twelve studies on risk factors and seven studies on risk scales were included. Four risk factors emerged from the meta-analysis, with robust effect sizes that showed little change when adjusted for important potential confounders: previous episodes of self-harm (hazard ratio (HR) = 1.68, 95% CI 1.38–2.05, K = 4), suicidal intent (HR = 2.70, 95% CI 1.91–3.81, K = 3), physical health problems (HR = 1.99, 95% CI 1.16–3.43, K = 3) and male gender (HR = 2.05, 95% CI 1.70–2.46, K = 5). The included studies evaluated only three risk scales (Beck Hopelessness Scale (BHS), Suicide Intent Scale (SIS) and Scale for Suicide Ideation). Where meta-analyses were possible (BHS, SIS), the analysis was based on sparse data and high heterogeneity was observed. Positive predictive values ranged from 1.3% to 16.7%.
Conclusions
The four risk factors that emerged, although of interest, are unlikely to be of much practical use because they are comparatively common in clinical populations. No scales have sufficient evidence to support their use. The use of these scales, or an over-reliance on the identification of risk factors in clinical practice, may provide false reassurance and is, therefore, potentially dangerous. Comprehensive psychosocial assessments of the risks and needs that are specific to the individual should be central to the management of people who have self-harmed.
Cryptosporidium, a parasite known to cause large drinking and recreational water outbreaks, is tolerant of chlorine concentrations used for drinking water treatment. Human laboratory-based surveillance for enteric pathogens detected a cryptosporidiosis outbreak in Baker City, Oregon during July 2013 associated with municipal drinking water. Objectives of the investigation were to confirm the outbreak source and assess outbreak extent. The watershed was inspected and city water was tested for contamination. To determine the community attack rate, a standardized questionnaire was administered to randomly sampled households. Weighted attack rates and confidence intervals (CIs) were calculated. Water samples tested positive for Cryptosporidium species; a Cryptosporidium parvum subtype common in cattle was detected in human stool specimens. Cattle were observed grazing along watershed borders; cattle faeces were observed within watershed barriers. The city water treatment facility chlorinated, but did not filter, water. The community attack rate was 28·3% (95% CI 22·1–33·6), sickening an estimated 2780 persons. Watershed contamination by cattle probably caused this outbreak; water treatments effective against Cryptosporidium were not in place. This outbreak highlights vulnerability of drinking water systems to pathogen contamination and underscores the need for communities to invest in system improvements to maintain multiple barriers to drinking water contamination.
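The step from the survey's weighted attack rate to an estimated community case count is simple arithmetic, sketched below. Only the 28·3% attack rate and its 95% CI come from the investigation; the community population used here is a hypothetical round number, not the figure behind the reported estimate of 2780 persons.

```python
# Hedged sketch: extrapolating a household-survey attack rate to a
# community-level case estimate. Population size is hypothetical.
attack_rate = 0.283             # weighted community attack rate (from abstract)
ci_low, ci_high = 0.221, 0.336  # reported 95% CI for the attack rate
population = 10_000             # hypothetical city population (illustrative)

estimated_cases = round(attack_rate * population)
case_range = (round(ci_low * population), round(ci_high * population))
print(estimated_cases, case_range)
```

Propagating the CI endpoints through the same multiplication, as done here, gives a plausible range for the case count under the assumption that the population denominator is known exactly.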
OBJECTIVE
To observe patient care across hemodialysis facilities enrolled in the National Opportunity to Improve Infection Control in ESRD (end-stage renal disease) (NOTICE) project in order to evaluate adherence to evidence-based practices aimed at prevention of infection.
SETTING AND PARTICIPANTS
Thirty-four hemodialysis facilities were randomly selected from among 772 facilities in 4 end-stage renal disease participating networks. Facility selection was stratified on dialysis organization affiliation, size, socioeconomic status, and urban/rural status.
MEASUREMENTS
Trained infection control evaluators used an infection control worksheet to observe 73 distinct infection control practices at the hemodialysis facilities, from October 1, 2011, through January 31, 2012.
RESULTS
There was considerable variation in infection control practices across enrolled facilities. Overall adherence to recommended practices was 68% (range, 45%–92%) across all facilities. Overall adherence to expected hand hygiene practice was 72% (range, 10%–100%). Compliance with hand hygiene before and after procedures was high; however, hand hygiene compliance during procedures averaged 58%. Use of chlorhexidine as the specific agent for exit site care was 19% overall but varied from 0% to 35% by facility type. Perfect performance on the 8 checklists ranged from 0% (meeting every item on the disinfection practices checklist) to 22% (the arteriovenous access practices checklist at initiation).
CONCLUSIONS
Our findings suggest that there are many areas for improvement in hand hygiene and other infection prevention practices in end-stage renal disease. These NOTICE project findings will help inform the development of a larger quality improvement initiative at dialysis facilities.
Although rare, typhoid fever cases acquired in the United States continue to be reported. Detection and investigation of outbreaks in these domestically acquired cases offer opportunities to identify chronic carriers. We searched surveillance and laboratory databases for domestically acquired typhoid fever cases, used a space–time scan statistic to identify clusters, and classified clusters as outbreaks or non-outbreaks. From 1999 to 2010, domestically acquired cases accounted for 18% of 3373 reported typhoid fever cases; their isolates were less often multidrug-resistant (2% vs. 15%) compared to isolates from travel-associated cases. We identified 28 outbreaks and two possible outbreaks within 45 space–time clusters of ⩾2 domestically acquired cases, including three outbreaks involving ⩾2 molecular subtypes. The approach detected seven of the ten outbreaks published in the literature or reported to CDC. Although this approach did not definitively identify any previously unrecognized outbreaks, it showed the potential to detect outbreaks of typhoid fever that may escape detection by routine analysis of surveillance data. Sixteen outbreaks had been linked to a carrier. Every case of typhoid fever acquired in a non-endemic country warrants thorough investigation. Space–time scan statistics, together with shoe-leather epidemiology and molecular subtyping, may improve outbreak detection.
This paper describes the system architecture of a newly constructed radio telescope – the Boolardy Engineering Test Array, which is a prototype of the Australian Square Kilometre Array Pathfinder telescope. Phased array feed technology is used to form multiple simultaneous beams per antenna, providing astronomers with unprecedented survey speed. The test array described here is a six-antenna interferometer, fitted with prototype signal processing hardware capable of forming at least nine dual-polarisation beams simultaneously, allowing several square degrees to be imaged in a single pointed observation. The main purpose of the test array is to develop beamforming and wide-field calibration methods for use with the full telescope, but it will also be capable of limited early science demonstrations.
Antibiograms have effectively improved antibiotic prescribing in acute-care settings; however, their effectiveness in skilled nursing facilities (SNFs) is currently unknown.
Objective.
To develop SNF-specific antibiograms and identify opportunities to improve antibiotic prescribing.
Design and Setting.
Cross-sectional and pretest-posttest study among residents of 3 Maryland SNFs.
Methods.
Antibiograms were created using clinical culture data from a 6-month period in each SNF. We also used admission clinical culture data from the acute care facility primarily associated with each SNF for transferred residents. We manually collected all data from medical charts, and antibiograms were created using WHONET software. We then used a pretest-posttest study to evaluate the effectiveness of an antibiogram on changing antibiotic prescribing practices in a single SNF. Appropriate empirical antibiotic therapy was defined as an empirical antibiotic choice that sufficiently covered the infecting organism, considering antibiotic susceptibilities.
Results.
We reviewed 839 patient charts from SNF and acute care facilities. During the initial assessment period, 85% of initial antibiotic use in the SNFs was empirical, and thus only 15% of initial antibiotics were based on culture results. Fluoroquinolones were the most frequently used empirical antibiotics, accounting for 54.5% of initial prescribing instances. Among patients with available culture data, only 35% of empirical antibiotic prescribing was determined to be appropriate. In the single SNF in which we evaluated antibiogram effectiveness, prevalence of appropriate antibiotic prescribing increased from 32% to 45% after antibiogram implementation; however, this was not statistically significant (P = .32).
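A pretest–posttest comparison of two proportions like the 32% vs. 45% result above can be sketched with a pooled two-proportion z-test. The counts below are hypothetical (the abstract reports only percentages and P = .32, and the study's actual test may have differed), so this illustrates the method rather than reproducing the reported statistic.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test for a difference in two independent proportions,
    using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical counts: 16/50 appropriate before (32%) vs. 18/40 after (45%).
z, p = two_proportion_z(16, 50, 18, 40)
print(round(z, 2), round(p, 3))
```

With samples of this size, a 13-percentage-point improvement does not reach significance, consistent in spirit with the non-significant change the abstract reports.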
Conclusions.
Implementation of antibiograms may be effective in improving empirical antibiotic prescribing in SNFs.
The prevalence of mental disorders among prisoners is considerably higher than in the general population. This is an important public health issue as the vast majority of prisoners stay in custody for less than 9 months and, when not in prison, offenders' lifestyles are frequently chaotic, characterized by social exclusion, instability and unemployment. Multi-disciplinary mental health inreach services were introduced to target care towards prisoners with severe mental illness (SMI) in a similar way to that provided by Community Mental Health Teams outside prison. The aim was to establish the proportion of prisoners with SMI who were assessed and managed by prison mental health inreach services.
Method
A two-phase prevalence survey in six prisons in England measured SMI upon reception into custody. Case-note review established the proportion of those with SMI subsequently assessed and treated by inreach services.
Results
Of 3492 prisoners screened, 23% had SMI. Inreach teams assessed only 25% of these unwell prisoners, and accepted just 13% onto their caseloads.
Conclusions
Inreach teams identified and managed only a small proportion of prisoners with SMI. Prison-based services need to improve screening procedures and develop effective care pathways to ensure access to appropriate services. Improved identification of mental illness is needed in both the community and the Criminal Justice System to better engage with socially transient individuals who have chaotic lifestyles and complex needs.
Dr. Loran B. Smith passed away in Topeka, Kansas, on July 24, 2009. He was born on July 23, 1946. He was the son of Gordon T. and Edith A. (Hibbard) Smith of Medford, Massachusetts. Loran received his bachelor's degree at Salem State College (Massachusetts) in 1968 and a master's from Oklahoma State in 1971, and then taught at Black Hills State (Spearfish, South Dakota) from 1971–1974 and Augustana College in Sioux Falls from 1974–1977. He received his Ph.D. from the University of Nebraska-Lincoln in 1980 and taught at Missouri Southern State College in Joplin until 1982. He then came to Washburn University of Topeka, where he taught until his death. While “Doc” Smith (as the students referred to him) published enough to be awarded tenure and promotion to professor, that was not his forte. Loran was a gifted teacher. His CV lists 23 teaching awards, including Washburn's Faculty Certificate of Merit, a university-wide teaching honor based on student elections, from 1985–1998. Loran was also extremely active in faculty governance and other service to the university and the Topeka community. He was on the university's faculty governing body from 1996–2006, serving as its vice president in 2002 and president from 2003–2005. He was the chairman of the Social Science Division for almost all of the 1990s, and he also served as the chairman of the college's curriculum committee during that same time span. As Washburn is an open-admission university, we have retention problems not experienced by most universities. Loran researched, organized, and ran a college experience program for at-risk students. He was very active in ASPA, serving as the Kansas chapter president from 1987–1988; indeed, his auto license plate read “KS ASPA” and was purchased for him by students he had recruited into ASPA.
Loran's main area of academic interest was state and local government, and he was the election-night expert for one of the local TV stations here in the capital of Kansas from 1984–1992. What occupied most of his time and energy outside of his official academic duties was serving as the faculty advisor for a local chapter of the Sigma Phi Epsilon fraternity. Doc Smith took what was a typical college fraternity and turned it into a modern association of men that consistently had the highest average GPA of all the fraternities and sororities on campus. It was not unusual for Loran to pay a student's tuition and fraternity house bill, buy students books, or lend money to a needy student. Loran had a reputation for frugality (his apartment had a TV but no cable, a rotary phone, and he rented all of his furniture and appliances). Loran's tightness with money turned out to be a big benefit for the fraternity. One chapter official put it this way: “Through his notorious tight-fisted watch over finances, the Chapter was able to wipe out a significant debt to the National Housing Corporation ahead of schedule and helped the chapter build a significant savings by 2000.” Those who knew Loran assumed he was not married, but in truth Loran was married to his job. Not only was Loran in his office nearly every evening until 10:00 p.m., he was there all day Saturday and Sunday too, and, more often than not, there was a student in that office talking with him.
Common mental health problems are highly prevalent in primary care, and the UK National Service Framework for mental health demands that effective and accessible services be made available. Although built upon a strong evidence base, traditional psychological therapies are often limited in terms of their applicability and availability. As a consequence, innovative self-help programmes are increasingly being advocated as an alternative means of managing mental illness within primary care. This study reports the results of a three-month evaluation of a self-help service provided by a busy UK urban Primary Care Trust. Levels of utilization, effectiveness and stakeholder acceptability were examined through a combination of quantitative and qualitative data. A total of 662 patients were referred to the self-help clinics over a three-month period, 67% of whom attended their first appointment. The mean number of sessions per patient was 2.8 (SD = 2.4), with an average total time of 69.6 min (SD = 48.2). Mean Clinical Outcomes in Routine Evaluation (CORE-OM) scores improved significantly between baseline and three-month follow-up (P < 0.001), with 39% of patients demonstrating a clinically significant improvement. Both self-help therapists and referring general practitioners reported moderate to high satisfaction with the self-help treatment model, and the majority of patients perceived the intervention to be appropriate to their needs. The data demonstrated that, whilst there was a clear need for a simple self-help service based in primary care, the ultimate success of this provision necessitates a well-developed infrastructure capable of providing sufficient support and information to ensure that it is flexible and responsive to individual needs.