This study examined whether supplementation with collagen peptides (CP) affects appetite and post-exercise energy intake in healthy active females.
In this randomised, double-blind crossover study, 15 healthy females (23 ± 3 y) consumed 15 g/day of CP or a taste-matched non-energy control (CON) for 7 days. On day 7, participants cycled for 45 min at ∼55% Wmax before consuming the final supplement. Sixty min post-supplementation, an ad libitum meal was provided and energy intake recorded. Subjective appetite sensations were measured daily for 6 days (pre- and 30 min post-supplement), and from pre-exercise (0 min) to 280 min post-exercise on day 7. Blood glucose and hormone concentrations (total ghrelin, glucagon-like peptide-1 (GLP-1), peptide YY (PYY), cholecystokinin (CCK), dipeptidyl peptidase-4 (sDPP4), leptin, and insulin) were measured fasted at baseline (day 0), then pre-breakfast (0 min), post-exercise (100 min), post-supplement (115, 130, 145, 160 min) and post-meal (220, 280 min) on day 7.
Ad libitum energy intake was ∼10% (∼41 kcal) lower in the CP trial (P=0.037). There was no difference in gastrointestinal symptoms or subjective appetite sensations throughout the trial (P≥0.412). Total plasma GLP-1 (area under the curve, CON: 6369±2330; CP: 9064±3021 pmol/L; P<0.001) and insulin (+80% at peak) were higher after CP (P<0.001). Plasma ghrelin and leptin were lower in CP (condition effect; P≤0.032). PYY, CCK, sDPP4 and glucose were not different between CP and CON (P≥0.100).
CP supplementation following exercise increased GLP-1 and insulin concentrations and reduced ad libitum energy intake at a subsequent meal in physically active females.
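The total hormone responses above are summarized as area under the curve. As a minimal sketch of that summary (using the day-7 sampling times from the abstract, but hypothetical GLP-1 concentrations, not the study's data), the trapezoidal rule gives:

```python
import numpy as np

# Day-7 sampling times (min) from the protocol; the GLP-1 concentrations
# (pmol/L) are illustrative placeholder values, not the study's data.
times = np.array([0, 100, 115, 130, 145, 160, 220, 280], dtype=float)
glp1 = np.array([10.0, 12.0, 25.0, 40.0, 38.0, 33.0, 28.0, 20.0])

# Trapezoidal rule: mean of adjacent concentrations times interval width.
auc = np.sum((glp1[1:] + glp1[:-1]) / 2.0 * np.diff(times))
# auc is the total AUC in pmol/L x min
```

Irregularly spaced sampling (dense around the supplement, sparse after the meal) is handled naturally because each trapezoid uses its own interval width.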
Heath forests, known locally as kerangas, in Indonesia and Malaysia form a distinct and understudied ecoregion. We document the distribution and ecological significance of the largest extent of kerangas in Kalimantan, Indonesian Borneo. We mapped 16,586 km2 of kerangas to the nearest one square kilometre across Kalimantan, showing a significant reduction from previous estimates. About 19% of this area exists as a poorly documented mosaic landscape in Central Kalimantan’s Rungan-Kahayan region. Here, peat-based forests transition to heath and dipterocarp forests, making it difficult to reliably classify these forests for conservation planning. Using remote sensing and tree plot data, we identified three forest types—kerangas, low pole, and mixed swamp. Vegetation structure is influenced by soil, topography, and hydrology, while peat depth and elevation affect species diversity. Our findings indicate that these forests are dynamic ecosystems with diverse vegetation communities adapted to peat as well as sandy soils. Lowland heath forests in Rungan-Kahayan exhibit higher tree densities than other Bornean heath forests, reflecting unique ecological adaptations to challenging environments. Despite covering just 3% of Kalimantan’s forest area, these ecosystems remain largely unprotected, facing threats from land conversion and fire. Our study highlights the ecological complexity of kerangas and underscores the urgent need for targeted conservation and further research on these forests.
The impact of chronic pain and opioid use on cognitive decline and mild cognitive impairment (MCI) is unclear. We investigated these associations in early older adulthood, considering different definitions of chronic pain.
Methods:
Men in the Vietnam Era Twin Study of Aging (VETSA; n = 1,042) underwent cognitive testing and medical history interviews at average ages 56, 62, and 68. Chronic pain was defined using pain intensity and interference ratings from the SF-36 over 2 or 3 waves (categorized as mild versus moderate-to-severe). Opioid use was determined by self-reported medication use. Amnestic and non-amnestic MCI were assessed using the Jak-Bondi approach. Mixed models and Cox proportional hazards models were used to assess associations of pain and opioid use with cognitive decline and risk for MCI.
Results:
Moderate-to-severe, but not mild, chronic pain intensity (β = −.10) and interference (β = −.23) were associated with greater declines in executive function. Moderate-to-severe chronic pain intensity (HR = 1.75) and interference (HR = 3.31) were associated with a higher risk of non-amnestic MCI. Opioid use was associated with a faster decline in verbal fluency (β = −.18) and a higher risk of amnestic MCI (HR = 1.99). There were no significant interactions between chronic pain and opioid use on cognitive decline or MCI risk (all p-values > .05).
Discussion:
Moderate-to-severe chronic pain intensity and interference were related to executive function decline and greater risk of non-amnestic MCI, while opioid use was related to verbal fluency decline and greater risk of amnestic MCI. Lowering chronic pain severity while reducing opioid exposure may help clinicians mitigate later cognitive decline and dementia risk.
Smooth scouringrush is an herbaceous perennial with an extensive underground rhizome system that has invaded no-till dryland production fields in the inland Pacific Northwest. The objective of this field study was to determine whether there were any short- or long-term benefits to tank-mixing chlorsulfuron + metsulfuron with glyphosate for smooth scouringrush control. Field studies were conducted at three sites across eastern Washington from 2020 to 2024. Glyphosate was applied during fallow periods at 0, 1,260, 2,520, and 3,780 g ae ha−1 with and without chlorsulfuron + metsulfuron applied at 21.9 + 4.4 g ai ha−1. Smooth scouringrush stem density was evaluated 1, 2, and 3 yr after treatment. Chlorsulfuron + metsulfuron provided excellent control of smooth scouringrush (<5 plants m−2) for the first 2 yr at all three sites, and there was no observed benefit of tank-mixing with glyphosate. This continued to be the case 3 yr after treatment at two of the sites, but at one site, adding glyphosate at 2,520 or 3,780 g ha−1 to chlorsulfuron + metsulfuron decreased stem density compared to chlorsulfuron + metsulfuron applied alone. For treatments containing glyphosate only, the greatest efficacy 3 yr after treatment was achieved at the highest application rate of 3,780 g ha−1. Although no short-term benefit was observed in adding glyphosate to chlorsulfuron + metsulfuron for smooth scouringrush control, at one of three sites the duration of control was increased by at least 1 yr with the addition of glyphosate at a rate of 2,520 g ha−1 or more and an organosilicone surfactant as tank-mix partners.
Posttraumatic stress disorder (PTSD) has been associated with advanced epigenetic age cross-sectionally, but the association between these variables over time is unclear. This study conducted meta-analyses to test whether new-onset PTSD diagnosis and changes in PTSD symptom severity over time were associated with changes in two metrics of epigenetic aging over two time points.
Methods
We conducted meta-analyses of the association between change in PTSD diagnosis and symptom severity and change in epigenetic age acceleration/deceleration (age-adjusted DNA methylation age residuals as per the Horvath and GrimAge metrics) using data from 7 military and civilian cohorts participating in the Psychiatric Genomics Consortium PTSD Epigenetics Workgroup (total N = 1,367).
Results
Meta-analysis revealed that the interaction between Time 1 (T1) Horvath age residuals and new-onset PTSD over time was significantly associated with Horvath age residuals at T2 (meta β = 0.16, meta p = 0.02, p-adj = 0.03). The interaction between T1 Horvath age residuals and changes in PTSD symptom severity over time was significantly related to Horvath age residuals at T2 (meta β = 0.24, meta p = 0.05). No associations were observed for GrimAge residuals.
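The moderation analysis described here regresses T2 age residuals on T1 residuals, PTSD status change, and their interaction. A minimal numpy sketch on simulated data (variable names, effect sizes, and sample size are illustrative assumptions, not the workgroup's data):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Hypothetical stand-ins for the cohort variables.
t1 = rng.normal(size=n)                           # T1 Horvath age residuals
ptsd = rng.integers(0, 2, size=n).astype(float)   # new-onset PTSD (0/1)
# Simulate T2 residuals with a small interaction effect for illustration.
t2 = 0.5 * t1 + 0.16 * t1 * ptsd + rng.normal(scale=0.5, size=n)

# Design matrix: intercept, main effects, and the T1 x PTSD interaction.
X = np.column_stack([np.ones(n), t1, ptsd, t1 * ptsd])
beta, *_ = np.linalg.lstsq(X, t2, rcond=None)
beta_interaction = beta[3]   # estimate of the interaction coefficient
```

A positive interaction coefficient means individuals who develop PTSD show a steeper carry-forward of baseline age acceleration to follow-up, matching the direction of the meta-analyzed effect.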
Conclusions
Results indicated that individuals who developed new-onset PTSD or showed increased PTSD symptom severity over time evidenced greater epigenetic age acceleration at follow-up than would be expected based on baseline age acceleration. This suggests that PTSD may accelerate biological aging over time and highlights the need for intervention studies to determine if PTSD treatment has a beneficial effect on the aging methylome.
Limited access to multiple sclerosis (MS)-focused care in rural areas can decrease the quality of life in individuals living with MS while influencing both physical and mental health.
Methods:
The objectives of this research were to compare demographic and clinical outcomes in participants with MS who reside within urban, semi-urban and rural settings within Newfoundland and Labrador. All participants were assessed by an MS neurologist, and data collection included participants’ clinical history, date of diagnosis, disease-modifying therapy (DMT) use, measures of disability, fatigue, pain, heat sensitivity, depression, anxiety and disease activity.
Results:
Overall, no demographic differences were observed between rural and urban areas. Furthermore, the categorization of primary residence did not demonstrate any differences in physical disability or indicators of disease activity. A significantly higher percentage of participants were prescribed platform or high-efficacy DMTs in semi-urban areas; a higher percentage of participants in urban and rural areas were prescribed moderate-efficacy DMTs. Compared to depression, anxiety was more prevalent within the entire cohort. Comparable levels of anxiety were measured across all areas, yet individuals in rural settings experienced greater levels of depression. Individuals living with MS in either an urban or rural setting demonstrated clinical similarities, which were relatively equally managed by DMTs.
Conclusion:
Despite greater levels of depression in rural areas, the results of this study highlight that an overall comparable level and continuity of care is provided to individuals living with MS within rural and urban Newfoundland and Labrador.
This study introduces the prostate cancer linear energy transfer sensitivity index (PCLSI) as a novel method to predict relative biological effectiveness (RBE) in prostate cancer using linear energy transfer (LET) in proton therapy based on screening for DNA repair mutations.
Materials and Methods:
Five prostate cancer cell lines with DNA repair mutations known to cause sensitivity to LET and DNA repair inhibitors were examined using published data. Relative Du145 LET sensitivity data were leveraged to deduce the LET equivalent of olaparib doses. The PCLSI model was built using three of the prostate cancer cell lines (LNCaP, 22Rv1 and Du145) with DNA mutation frequency from patient cohorts. The PCLSI model was compared against two established RBE models, McNamara and McMahon, for LET-optimized prostate cancer treatment plans.
Results:
The PCLSI model relies on the presence of eight DNA repair mutations: AR, ATM, BRCA1, BRCA2, CDH1, ETV1, PTEN and TP53, which are most likely to predict increased LET sensitivity and RBE in proton therapy. In the LET-optimized plan, the PCLSI model indicates that prostate cancer cells with these DNA repair mutations are more sensitive to increased LET than the McNamara and McMahon RBE models predict, with expected RBE increases ranging from 11% to 33% at 2 keV/µm.
Conclusions:
The PCLSI model predicts increasing RBE as a function of LET in the presence of certain genetic mutations. The integration of LET-optimized proton therapy and genetic mutation profiling could be a significant step toward the use of individualized medicine to improve outcomes using RBE escalation without the potential toxicity of physical dose escalation.
Poor weight gain in infants with single ventricle cardiac physiology between stage 1 and stage 2 palliative surgeries is associated with worse outcomes. The growth of infants with single ventricle physiology, enrolled in home monitoring programmes in the United Kingdom, has not been widely described.
Aim:
To explore the growth of infants with single ventricle physiology supported by a home monitoring programme, at a tertiary centre in the South of England.
Methods:
A retrospective review of two cohorts, comparing weight gain amongst infants with single ventricle physiology, before and following the implementation of a home monitoring programme. Inclusion was dependent on a diagnosis compatible with single ventricle physiology during the interstage.
Results:
Enrolment into a home monitoring programme (cohort 2) was associated with 55% more infants being discharged home during the interstage period (p < 0.05). Interstage mortality did not differ between cohorts. There were no differences in interstage growth velocity between cohorts (cohort 1: 23.98 ± 11.7 g/day; cohort 2: 23.82 ± 8.3 g/day); however, infants in cohort 2 experienced less growth deceleration early in life and achieved catch-up growth at 12–23 months. Interstage nasogastric feeding, regardless of cohort, was associated with worse growth outcomes.
Conclusion:
A home monitoring programme for infants with single ventricle physiology provides the opportunity for infants to be safely discharged home to their families and cared for at home during the interstage. Infants in the home monitoring programme experienced better growth, achieving weight restoration at 12–23 months.
Peripheral inflammatory markers, including serum interleukin 6 (IL-6), are associated with depression, but less is known about how these markers associate with depression at different stages of the life course.
Methods
We examined the associations between serum IL-6 levels at baseline and subsequent depression symptom trajectories in two longitudinal cohorts: ALSPAC (age 10–28 years; N = 4,835) and UK Biobank (39–86 years; N = 39,613) using multilevel growth curve modeling. Models were adjusted for sex, BMI, and socioeconomic factors. Depressive symptoms were measured using the Short Moods and Feelings Questionnaire in ALSPAC (max time points = 11) and the Patient Health Questionnaire-2 in UK Biobank (max time points = 8).
Results
Higher baseline IL-6 was associated with worse depression symptom trajectories in both cohorts (largest effect size: 0.046 [ALSPAC, age 16 years]). These associations were stronger in the younger ALSPAC cohort, where, additionally, higher IL-6 levels at age 9 years were associated with worse depression symptom trajectories in females compared with males. Weaker sex differences were observed in the older cohort, UK Biobank. However, statistically significant associations (pFDR < 0.05) were of smaller effect sizes, typical of large cohort studies.
Conclusions
These findings suggest that systemic inflammation may influence the severity and course of depressive symptoms across the life course, which is apparent regardless of age and differences in measures and number of time points between these large, population-based cohorts.
Climate and land-use changes are major threats to amphibian conservation. However, amphibians on tropical oceanic islands appear to have been overlooked with regards to their vulnerability to global anthropogenic threats. Here we examine whether there are gaps in research evaluating the vulnerability of tropical oceanic island amphibians to climate and land-use changes. We carried out a systematic review of the literature on experimental studies published during 1 July 1998–30 June 2022, to evaluate whether there are knowledge gaps in relation to geographical scope, taxonomic representation, life stage assessment, the factors affecting amphibians and how species and populations respond to these factors. Of 327 articles on climate change and 451 on land-use change, only 18 reported research carried out on tropical oceanic islands, all on anurans, and < 20% of their authors were affiliated with an oceanic island institution. These 18 studies were on only five islands, and the range of families and life stages assessed was limited. We also found uneven research into the factors affecting oceanic island amphibians and their responses; analyses involving the effect of temperature on amphibian range expansion or contraction were the most common, with few studies of the effects of salinity. The scarcity and unevenness of research from oceanic islands limit our understanding of the effects of climate and land-use changes on amphibians. We discuss potential reasons for these knowledge gaps and recommend ways to address them, such as more equitable distribution of resources and provision of training and research opportunities for island-based biologists.
Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support that controls, MDD patients and BPD patients primarily lie on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
Both the speed and accuracy of responding are important measures of performance. A well-known interpretive difficulty is that participants may differ in their strategy, trading speed for accuracy, with no change in underlying competence. Another difficulty arises when participants respond slowly and inaccurately (rather than quickly but inaccurately), e.g., due to a lapse of attention. We introduce an approach that combines response time and accuracy information and addresses both situations. The modeling framework assumes two latent competing processes. The first, the error-free process, always produces correct responses. The second, the guessing process, results in all observed errors and some of the correct responses (but does so via non-specific processes, e.g., guessing in compliance with instructions to respond on each trial). Inferential summaries of the speed of the error-free process provide a principled assessment of cognitive performance reducing the influences of both fast and slow guesses. Likelihood analysis is discussed for the basic model and extensions. The approach is applied to a data set on response times in a working memory test.
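One way to make the two-process assumption concrete is a small simulation: on each trial the error-free process and the guessing process race, the faster one produces the response, and guesses are correct only at chance. The distributional choices below (lognormal error-free finishing times, shifted-exponential guesses, a two-alternative task) are illustrative assumptions, not the paper's specification:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Illustrative finishing-time distributions for the two latent processes.
t_error_free = rng.lognormal(mean=-0.5, sigma=0.4, size=n)  # always correct
t_guess = 0.2 + rng.exponential(scale=1.0, size=n)          # non-specific guess

rt = np.minimum(t_error_free, t_guess)   # faster process produces the response
guessed = t_guess < t_error_free
# A guess is correct only by chance (here: two alternatives, p = 0.5).
correct = ~guessed | (rng.random(n) < 0.5)

accuracy = correct.mean()
```

Note how the simulation reproduces both interpretive difficulties from the text: fast guesses contaminate the short end of the RT distribution, while slow guesses (lapses) produce slow, inaccurate trials; summarizing the speed of the error-free process alone screens out both.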
Hypereutrophic Grand Lake St Marys (GLSM) is a large (52 km2), shallow (mean depth ~ 1.5 m) reservoir in an agricultural watershed of western Ohio (USA). GLSM suffers from extensive cyanobacterial harmful algal blooms (cHABs) that persist much of the year, resulting in total microcystin concentrations that are often above safe contact levels. Over two summers (2020 and 2021), two phosphorus (P) binding agents (alum and lanthanum/bentonite clay Phoslock, respectively), in conjunction with a P-binding algaecide (SeClear) in 2021, were applied to a 3.24-ha enclosure to mitigate cHAB activity and create a ‘safe’ recreational space for the public. We evaluated these applications by comparing total phosphorus (TP), total microcystin, total chlorophyll, and phycocyanin concentrations within the enclosure and the adjacent lake. Some evidence for short-term reductions in TP, microcystin, chlorophyll, and phycocyanin concentrations was observed following each P-binding treatment, but all parameters rapidly returned to or exceeded pre-application levels within 2–3 weeks after treatment. These results suggest that in-lake chemical treatments to mitigate cHABs are unlikely to provide long-lasting benefits in these semi-enclosed areas of large, shallow, hypereutrophic systems, and resources may be better applied toward reducing external nutrient loads (P and nitrogen) from the watershed.
Italian ryegrass [Lolium perenne L. ssp. multiflorum (Lam.) Husnot] has become a major annual weed in wheat (Triticum aestivum L.) production systems in the inland Pacific Northwest. With large genetic variability and abundant seed production, L. perenne ssp. multiflorum has 74 documented cases of herbicide resistance globally, covering 8 different mechanisms of action. Harvest weed seed control (HWSC) systems were introduced in Australia in response to the widespread evolution of multiple herbicide resistance in rigid ryegrass (Lolium rigidum Gaudin) and wild radish (Raphanus raphanistrum L.). The efficacy of these systems for any given weed species is directly related to the proportion of total seed retained by that species at harvest time. From 2017 to 2020, ten L. perenne ssp. multiflorum plants were collected from three different slope aspects at each location in Washington, USA. Collections were initiated in each field when it was visually apparent that seed fill was nearly complete, and seed shatter had not yet occurred. Collection continued at near-weekly intervals until the fields were harvested. The number of filled florets on a spikelet was used to assess the degree of seed shatter over time. Seed shatter at harvest was 67% of the total number of florets on each spikelet. Seed shatter was closely aligned with wheat kernel development in both spring and winter wheat. The high percentage of L. perenne ssp. multiflorum seeds that are shattered by harvest may make HWSC less effective than for L. rigidum in Australia; however, seeds with the greatest biomass tend to not shatter before harvest, which may increase the efficacy of HWSC for managing the soil seedbank. Strategies like planting earlier-maturing wheat cultivars could help HWSC be more effective by having wheat harvest begin earlier, when more L. perenne ssp. multiflorum seeds are still on the mother plant.
We compare two initial specimen diversion devices evaluated over 3 months to investigate their utility in lowering blood culture contamination rates to 1% or below. Overall contamination rates during trial periods were 2.46% and 2.60% but usage was low, whereas device-specific contamination rates were 0.68% and 0.8%, respectively.
Weeds are one of the greatest challenges to snap bean (Phaseolus vulgaris L.) production. Anecdotal observation posits certain species frequently escape the weed management system by the time of crop harvest, hereafter called residual weeds. The objectives of this work were to (1) quantify the residual weed community in snap bean grown for processing across the major growing regions in the United States and (2) investigate linkages between the density of residual weeds and their contributions to weed canopy cover. In surveys of 358 fields across the Northwest (NW), Midwest (MW), and Northeast (NE), residual weeds were observed in 95% of the fields. While a total of 109 species or species-groups were identified, one to three species dominated the residual weed community of individual fields in most cases. It was not uncommon to have >10 weeds m−2 with a weed canopy covering >5% of the field’s surface area. Some of the most abundant and problematic species or species-groups escaping control included amaranth species such as smooth pigweed (Amaranthus hybridus L.), Palmer amaranth (Amaranthus palmeri S. Watson), redroot pigweed (Amaranthus retroflexus L.), and waterhemp [Amaranthus tuberculatus (Moq.) Sauer]; common lambsquarters (Chenopodium album L.); large crabgrass [Digitaria sanguinalis (L.) Scop.]; and ivyleaf morningglory (Ipomoea hederacea Jacq.). Emerging threats include hophornbeam copperleaf (Acalypha ostryifolia Riddell) in the MW and sharppoint fluvellin [Kickxia elatine (L.) Dumort.] in the NW. Beyond crop losses due to weed interference, the weed canopy at harvest poses a risk to contaminating snap bean products with foreign material. Random forest modeling predicts the residual weed canopy is dominated by C. album, D. sanguinalis, carpetweed (Mollugo verticillata L.), I. hederacea, amaranth species, and A. ostryifolia. This is the first quantitative report on the weed community escaping control in U.S. snap bean production.
Identifying persons with HIV (PWH) at increased risk for Alzheimer’s disease (AD) is complicated because memory deficits are common in HIV-associated neurocognitive disorders (HAND) and a defining feature of amnestic mild cognitive impairment (aMCI; a precursor to AD). Recognition memory deficits may be useful in differentiating these etiologies. Therefore, neuroimaging correlates of different memory deficits (i.e., recall, recognition) and their longitudinal trajectories in PWH were examined.
Design:
We examined 92 PWH from the CHARTER Program, ages 45–68, without severe comorbid conditions, who received baseline structural MRI and baseline and longitudinal neuropsychological testing. Linear and logistic regression examined neuroanatomical correlates (i.e., cortical thickness and volumes of regions associated with HAND and/or AD) of memory performance at baseline and multilevel modeling examined neuroanatomical correlates of memory decline (average follow-up = 6.5 years).
Results:
At baseline, thinner pars opercularis cortex was associated with impaired recognition (p = 0.012; p = 0.060 after correcting for multiple comparisons). Worse delayed recall was associated with thinner pars opercularis (p = 0.001) and thinner rostral middle frontal cortex (p = 0.006) cross-sectionally, even after correcting for multiple comparisons. Delayed recall and recognition were not associated with medial temporal lobe (MTL), basal ganglia, or other prefrontal structures. Recognition impairment was variable over time, and there was little decline in delayed recall. Baseline MTL and prefrontal structures were not associated with delayed recall.
Conclusions:
Episodic memory was associated with prefrontal structures, and MTL and prefrontal structures did not predict memory decline. There was relative stability in memory over time. Findings suggest that episodic memory is more related to frontal structures, rather than encroaching AD pathology, in middle-aged PWH. Additional research should clarify if recognition is useful clinically to differentiate aMCI and HAND.
Diagnostic criteria for major depressive disorder allow for heterogeneous symptom profiles but genetic analysis of major depressive symptoms has the potential to identify clinical and etiological subtypes. There are several challenges to integrating symptom data from genetically informative cohorts, such as sample size differences between clinical and community cohorts and various patterns of missing data.
Methods
We conducted genome-wide association studies of major depressive symptoms in three cohorts that were enriched for participants with a diagnosis of depression (Psychiatric Genomics Consortium, Australian Genetics of Depression Study, Generation Scotland) and three community cohorts who were not recruited on the basis of diagnosis (Avon Longitudinal Study of Parents and Children, Estonian Biobank, and UK Biobank). We fit a series of confirmatory factor models with factors that accounted for how symptom data was sampled and then compared alternative models with different symptom factors.
Results
The best fitting model had a distinct factor for Appetite/Weight symptoms and an additional measurement factor that accounted for the skip-structure in community cohorts (use of Depression and Anhedonia as gating symptoms).
Conclusion
The results show the importance of assessing the directionality of symptoms (such as hypersomnia versus insomnia) and of accounting for study and measurement design when meta-analyzing genetic association data.
This study documents several correlations observed during the first run of the plasma wakefield acceleration experiment E300 conducted at FACET-II, using a single drive electron bunch. The established correlations include those between the measured maximum energy loss of the drive electron beam and the integrated betatron X-ray signal; between the calculated total beam energy deposited in the plasma and the integrated X-ray signal; among three cameras measuring visible light emission; and between the visible plasma light and the X-ray signal. The integrated X-ray signal correlates almost linearly with both the maximum energy loss of the drive beam and the energy deposited into the plasma, demonstrating its usability as a measure of energy transfer from the drive beam to the plasma. Visible plasma light is found to be a useful indicator of the presence of a wake at three locations that are, overall, two metres apart. Despite the complex dynamics and vastly different time scales, the X-ray radiation from the drive bunch and the visible light emission from the plasma may prove to be effective non-invasive diagnostics for monitoring the energy transfer from the beam to the plasma in future high-repetition-rate experiments.