Interferometric Synthetic Aperture Radar (InSAR) is an active remote sensing method that uses repeated radar scans of the Earth's solid surface to measure relative deformation at centimeter precision over a wide swath. It has revolutionized our understanding of the earthquake cycle, volcanic eruptions, landslides, glacier flow, ice grounding lines, ground fluid injection/withdrawal, underground nuclear tests, and other applications requiring high spatial resolution measurements of ground deformation. This book examines the theory behind and the applications of InSAR for measuring surface deformation. The most recent generation of InSAR satellites has transformed the method from investigating tens to hundreds of SAR images to processing thousands to tens of thousands of images using a wide range of computer facilities. This book is intended for students and researchers in the physical sciences, particularly for those working in geophysics, natural hazards, space geodesy, and remote sensing. This title is also available as Open Access on Cambridge Core.
The next-generation radio astronomy instruments are providing a massive increase in sensitivity and coverage, largely through increasing the number of stations in the array and the frequency span sampled. The two primary problems encountered when processing the resultant avalanche of data are the need for abundant storage and the constraints imposed by I/O, as I/O bandwidths drop significantly on cold storage. An example of this is the data deluge expected from the SKA Telescopes of more than 60 PB per day, all to be stored on the buffer filesystem. While compressing the data is an obvious solution, the impacts on the final data products are hard to predict. In this paper, we chose an error-controlled compressor – MGARD – and applied it to simulated SKA-Mid and real pathfinder visibility data, in noise-free and noise-dominated regimes. As the data have an implicit error level in the system temperature, using an error bound in compression provides a natural compression metric. MGARD ensures that the errors incurred by compression adhere to the user-prescribed tolerance. To measure the degradation of images reconstructed from the lossy compressed data, we proposed a list of diagnostic measures and, through a series of experiments, explored the trade-off between error bounds and the corresponding compression ratios, as well as the impact on the science quality of the resulting data products. We studied the global and local impacts on the output images for continuum and spectral line examples. We found that relative error bounds of as much as 10%, which provide compression ratios of about 20, have a limited impact on continuum imaging, as the increased noise is less than the image RMS, whereas a 1% error bound (compression ratio of 8) introduces an increase in noise about an order of magnitude less than the image RMS.
For extremely sensitive observations and for very precious data, we would recommend a $0.1\%$ error bound with compression ratios of about 4. These have noise impacts two orders of magnitude less than the image RMS levels. At these levels, the limits are due to instabilities in the deconvolution methods. We compared the results to the alternative compression tool DYSCO, in both the impacts on the images and in the relative flexibility. MGARD provides better compression for similar error bounds and has a host of potentially powerful additional features.
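The core idea of error-bounded lossy compression can be illustrated with a minimal stand-in: quantise the data to a step set by a relative error bound, then entropy-code the result. This is not MGARD's actual algorithm or API (MGARD uses multigrid decompositions); it is only a sketch of how a user-prescribed tolerance translates into a guaranteed maximum error and a measurable compression ratio. The zlib back end and the mock visibility data are illustrative assumptions.

```python
import zlib
import numpy as np

def compress_with_relative_bound(data, rel_bound):
    """Quantise to a step set by the relative error bound, then
    entropy-code the integer codes. A toy stand-in for an
    error-controlled compressor such as MGARD (whose real API differs)."""
    scale = rel_bound * np.max(np.abs(data))    # absolute error tolerance
    step = 2 * scale                            # quantiser bin width
    codes = np.round(data / step).astype(np.int64)
    payload = zlib.compress(codes.tobytes(), 9)
    recon = codes.astype(np.float64) * step     # decompressed values
    # the round-off error is at most half a bin, i.e. within tolerance
    assert np.max(np.abs(recon - data)) <= scale + 1e-12
    return data.nbytes / len(payload), recon

rng = np.random.default_rng(0)
vis = rng.normal(size=100_000)                  # mock visibility amplitudes
for bound in (0.10, 0.01, 0.001):
    ratio, _ = compress_with_relative_bound(vis, bound)
    print(f"rel bound {bound:>5}: compression ratio ~ {ratio:.1f}")
```

Looser bounds give coarser quantisation and hence higher ratios, mirroring the 10%/1%/0.1% trade-off reported above, though the exact ratios depend entirely on the data and codec.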
The interaction of helminth infections with type 2 diabetes (T2D) has been a major area of research in the past few years. This paper therefore systematically reviews the effects of helminth infections on metabolism and immune regulation related to T2D, including the mechanisms through which both direct and indirect effects are mediated. Specifically, the possible therapeutic role of helminths in T2D management, probably mediated through the modulation of host metabolic pathways and immune responses, is of special interest. This paper discusses the current possibilities for translating helminth therapy from basic laboratory research to clinical application, as well as existing and future challenges. Although preliminary studies suggest the potential of helminth therapy for T2D patients, its safety and efficacy still need to be confirmed by larger-scale clinical studies.
Plants exhibit diverse morphological, anatomical and physiological responses to hypoxia stress from soil waterlogging, yet coordination between these responses is not fully understood. Here, we present a mechanistic model to simulate how rooting depth, root aerenchyma (porous tissue arising from localized cell death) and root barriers to radial oxygen loss (ROL) interact to influence waterlogging survival. Our model revealed an interaction between rooting depth and the relative effectiveness of aerenchyma and ROL barriers for prolonging waterlogging survival. Whereas the formation of shallow roots increases waterlogging survival time, the positive effect of aerenchyma becomes more apparent with increased rooting depth. While ROL barriers further increased survival in combination with aerenchyma in deep-rooted plants, they had little positive effect in the absence of aerenchyma. Furthermore, as ROL barriers limit root-to-soil oxygen diffusion bidirectionally, our model revealed optimality in the timing of ROL barrier formation. These findings highlight the importance of coordination between morphological and anatomical responses in the waterlogging resilience of plants.
Shifts in food acquisition during the COVID-19 pandemic may have affected diet. Assessing changes in diet is needed to inform food assistance programs aimed at mitigating diet disparities during future crises. This longitudinal study assessed changes in diet among a low-income, racially diverse population from March-November 2020.
Methods
Survey data were collected from 291 adults living in Austin, TX. Multivariable ordinal logistic regression models assessed relationships between changes in consumption of fresh, frozen, and canned fruits and vegetables (FV) and of sugar-sweetened beverages (SSBs), and the following food acquisition factors: food security, difficulty finding food, food bank usage, and food shopping method.
Results
Adjusted models indicated individuals with consistent food insecurity had increased odds of reporting a higher category of consumption for frozen (aOR = 2.13, P < 0.05, 95% CI 1.18–3.85) and canned (aOR = 4.04, P < 0.01, 95% CI 2.27–7.20) FV and SSBs (aOR = 3.01, P < 0.01, 95% CI 1.65–5.51). Individuals who reported using a food bank were more likely to report increased consumption of frozen (aOR = 2.14, P < 0.05, 95% CI 1.22–3.76) and canned FV (aOR = 2.91, P < 0.01, 95% CI 1.69–4.99).
Conclusions
Shifts in food acquisition factors were associated with changes in diet. Findings demonstrate the need for more robust food assistance programs that specifically focus on all dimensions of food security.
Background: The combination of PARP inhibitors and immune checkpoint inhibitors has been proposed as a potentially synergistic treatment for IDH-mutant glioma, targeting dysregulated homologous recombination repair pathways. This study analyzed the cell-free DNA methylome of patients in a phase 2 trial of the PARP inhibitor olaparib and the PD-1 inhibitor durvalumab. Methods: Patients with recurrent high-grade IDH-mutant gliomas were enrolled in a phase II open-label study (NCT03991832). Serum was collected at baseline and monthly, and cell-free methylated DNA immunoprecipitation and high-throughput sequencing (cfMeDIP-seq) was performed. Binomial GLMnet models were developed, and model performance was assessed using validation set data. Results: Twenty-nine patients were enrolled between 2020 and 2023. Patients received olaparib 300 mg twice daily and durvalumab 1500 mg IV every 4 weeks. The overall response rate was 10% by RANO criteria. A total of 144 plasma samples were profiled with cfMeDIP-seq, along with 30 healthy controls. The enriched circulating tumour DNA methylome during response periods exhibited a highly specific signature, accurately discriminating response from failure (AUC 0.98 ± 0.03). Additionally, samples taken on treatment could be discriminated from samples taken off therapy (AUC 0.74 ± 0.11). Conclusions: The cell-free plasma DNA methylome exhibits highly specific signatures that enable accurate prediction of response to therapy.
Milk fat is a crucial component for evaluating the production performance and nutritional value of goat milk. Previous research indicated that the composition of the ruminal microbiota plays a significant role in regulating milk fat percentage in ruminants. Thus, this study aimed to identify key ruminal microorganisms and blood metabolites relevant to milk fat synthesis in dairy goats as a means to explore their role in regulating milk fat synthesis. Sixty clinically healthy Xinong Saanen dairy goats at mid-lactation, of similar body weight and similar milk yield, were used in a 15-day feeding study. Based on the daily milk yield of the goats and the milk component determinations on the 1st and 8th days, the five goats with the highest milk fat content (H group) and the five with the lowest milk fat content (L group) were selected for further analysis. Before the morning feeding on the 15th day of the experiment, samples of milk, blood and ruminal fluid were collected for analyses of components, volatile fatty acids, microbiota and metabolites. Results revealed that acetate content in the rumen of the H group was greater than that of the L group. The H group was rich in beneficial bacteria, including Ruminococcaceae_UCG-005, Saccharofermentans, Ruminococcaceae_UCG-002 and Prevotellaceae_UCG-3, which are important for plant cellulose and hemicellulose degradation and immune regulation. Metabolomics analysis revealed that the H group had greater relative serum concentrations of 4-acetamidobutanoic acid and azelaic acid, and lower relative concentrations of arginyl-alanine, SM(d18:1/12:0) and DL-tryptophan. These altered metabolites are involved in the sphingolipid signaling pathway and in arginine and proline metabolism. Overall, this study identified key ruminal microorganisms and serum metabolites associated with milk fat synthesis in dairy goats.
These findings offer insights for enhancing the quality of goat milk and contribute to a better understanding of the regulatory mechanisms involved in milk fat synthesis in dairy goats.
Multicenter clinical trials are essential for evaluating interventions but often face significant challenges in study design, site coordination, participant recruitment, and regulatory compliance. To address these issues, the National Institutes of Health’s National Center for Advancing Translational Sciences established the Trial Innovation Network (TIN). The TIN offers a scientific consultation process, providing access to clinical trial and disease experts who provide input and recommendations throughout the trial’s duration, at no cost to investigators. This approach aims to improve trial design, accelerate implementation, foster interdisciplinary teamwork, and spur innovations that enhance multicenter trial quality and efficiency. The TIN leverages resources of the Clinical and Translational Science Awards (CTSA) program, complementing local capabilities at the investigator’s institution. The Initial Consultation process focuses on the study’s scientific premise, design, site development, recruitment and retention strategies, funding feasibility, and other support areas. As of June 1, 2024, the TIN has provided 431 Initial Consultations to increase efficiency and accelerate trial implementation by delivering customized support and tailored recommendations. Across a range of clinical trials, the TIN has developed standardized, streamlined, and adaptable processes. We describe these processes, provide operational metrics, and include a set of lessons learned for consideration by other trial support and innovation networks.
This study explored mental workload recognition methods for carrier-based aircraft pilots utilising multi-sensor physiological signal fusion and portable devices. A simulated carrier-based aircraft flight experiment was designed, and subjective mental workload scores and electroencephalogram (EEG) and photoplethysmogram (PPG) signals from six pilot cadets were collected using the NASA Task Load Index (NASA-TLX) and portable devices. The subjective scores of the pilots in three flight phases were used to label the data into three mental workload levels. Features were extracted from the physiological signals, and the interrelations between mental workload and physiological indicators were evaluated. Machine learning and deep learning algorithms were used to classify the pilots’ mental workload, and the performances of single-modal and multimodal fusion methods were investigated. The results showed that the multimodal fusion methods outperformed the single-modal methods, achieving higher accuracy, precision, recall and F1 score. Among all the classifiers, the random forest classifier with feature-level fusion obtained the best results, with an accuracy of 97.69%, precision of 98.08%, recall of 96.98% and F1 score of 97.44%. The findings demonstrate the effectiveness and feasibility of the proposed method, offering insights into mental workload management and the enhancement of flight safety for carrier-based aircraft pilots.
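Feature-level fusion of this kind — concatenating per-modality feature vectors before a single classifier sees them — can be sketched as follows. The feature values below are synthetic stand-ins for the study's real EEG/PPG features (band powers, heart-rate variability metrics, etc.), so the resulting score is illustrative only, not a reproduction of the reported 97.69% accuracy.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 300
labels = rng.integers(0, 3, size=n)              # three workload levels
# Mock per-modality feature blocks, shifted by workload level so the
# classes are separable; real work would extract these from EEG/PPG.
eeg = rng.normal(size=(n, 8)) + labels[:, None] * 0.8
ppg = rng.normal(size=(n, 4)) + labels[:, None] * 0.5
fused = np.hstack([eeg, ppg])                    # feature-level fusion

X_tr, X_te, y_tr, y_te = train_test_split(
    fused, labels, test_size=0.3, stratify=labels, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("macro F1:", f1_score(y_te, clf.predict(X_te), average="macro"))
```

The alternative, decision-level fusion, would instead train one classifier per modality and combine their predictions (e.g. by voting); the abstract's result suggests the early, feature-level combination worked best here.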
The stars of the Milky Way carry the chemical history of our Galaxy in their atmospheres as they journey through its vast expanse. Like barcodes, we can extract the chemical fingerprints of stars from high-resolution spectroscopy. The fourth data release (DR4) of the Galactic Archaeology with HERMES (GALAH) Survey, based on a decade of observations, provides the chemical abundances of up to 32 elements for 917 588 stars that also have exquisite astrometric data from the Gaia satellite. For the first time, these elements include life-essential nitrogen, complementing carbon and oxygen, as well as more measurements of rare-earth elements critical to modern electronics, offering unparalleled insights into the chemical composition of the Milky Way. For this release, we use neural networks to simultaneously fit stellar parameters and abundances across the whole wavelength range, leveraging synthetic grids computed with Spectroscopy Made Easy. These grids account for atomic line formation in non-local thermodynamic equilibrium for 14 elements. In a two-iteration process, we first fit stellar labels to all 1 085 520 spectra, then co-add repeated observations and refine these labels using astrometric data from Gaia and 2MASS photometry, improving the accuracy and precision of stellar parameters and abundances. Our validation thoroughly assesses the reliability of spectroscopic measurements and highlights key caveats. GALAH DR4 represents yet another milestone in Galactic archaeology, combining detailed chemical compositions from multiple nucleosynthetic channels with kinematic information and age estimates. The resulting dataset, covering nearly a million stars, opens new avenues for understanding not only the chemical and dynamical history of the Milky Way but also the broader questions of the origin of elements and the evolution of planets, stars, and galaxies.
Ecological momentary assessment (EMA) may be a valid and acceptable method of assessing dietary intake in young adults(1). EMA may overcome some of the limitations associated with traditional dietary assessment methods such as high respondent burden and memory biases(2) by capturing time-sensitive data via concise dietary surveys. However, most dietary EMA studies either deliver signal-contingent EMAs at fixed intervals or rely on the user’s memory to self-initiate event-contingent EMAs whenever they ate. This may be inappropriate for young adults due to their highly variable eating patterns(1). Young adults are particularly vulnerable to weight gain due to major life transitions and, for this population, dietary information may need to be collected in near real-time to improve recall accuracy(3). Therefore, the aim of this study was to examine the feasibility (response rate) and acceptability of an EMA protocol that delivered dietary surveys at times personalised to young adults’ (18–30 years) eating patterns and to compare this to the feasibility and acceptability of EMAs delivered at fixed intervals. A randomised, double-blinded crossover design with two four-day treatment arms was used. In one arm, participants received six EMAs per day at fixed intervals. In the other arm, EMAs were delivered at times tailored to participants’ usual eating schedules (ranging from two to six EMAs per day). Usual eating schedules were determined using time-stamped food and beverage images captured by participants over the four days immediately prior to treatments. EMA questions included, but were not limited to, time of consumption and type of food or beverage group consumed. Response rates were calculated as the percentage of EMAs responded to out of the EMAs delivered. At the end of each arm, participants completed an acceptability survey assessing their opinion of the number of EMAs per day, length of the EMAs, and number of recording days.
Twenty-three subjects were included (13 female; mean age 26, SD 2.1 years). Mean response rates of the fixed interval and personalised schedule treatments were 65.1% (SE 3.7%) and 66.3% (SE 3.7%), respectively. Compared to the fixed interval treatment, EMAs delivered during the personalised schedule treatment did not align more closely with participants’ eating times; the average time difference between EMA delivery and reported eating time was 1.7 hours for both treatments. Participants in both treatments reported receiving too many EMAs per day but found the length of the EMAs and the number of recording days to be ‘just right’. In conclusion, EMAs delivered on a personalised schedule may not improve participant adherence. Due to the irregular nature of young adults’ eating patterns, the timing of EMA delivery is difficult to tailor. Future definitive trials should use more sophisticated methods of personalisation, such as wearable sensors to trigger event-contingent EMAs.
The World Cancer Research Fund and the American Institute for Cancer Research recommend a plant-based diet to cancer survivors, which may reduce chronic inflammation and excess adiposity associated with worse survival. We investigated associations of plant-based dietary patterns with inflammation biomarkers and body composition in the Pathways Study, in which 3659 women with breast cancer provided validated food frequency questionnaires approximately 2 months after diagnosis. We derived three plant-based diet indices: overall plant-based diet index (PDI), healthful plant-based diet index (hPDI) and unhealthful plant-based diet index (uPDI). We assayed circulating inflammation biomarkers related to systemic inflammation (high-sensitivity C-reactive protein [hsCRP]), pro-inflammatory cytokines (IL-1β, IL-6, IL-8, TNF-α) and anti-inflammatory cytokines (IL-4, IL-10, IL-13). We estimated areas (cm2) of muscle and visceral and subcutaneous adipose tissue (VAT and SAT) from computed tomography scans. Using multivariable linear regression, we calculated the differences in inflammation biomarkers and body composition for each index. Per 10-point increase for each index: hsCRP was significantly lower by 6·9 % (95 % CI 1·6%, 11·8%) for PDI and 9·0 % (95 % CI 4·9%, 12·8%) for hPDI but significantly higher by 5·4 % (95 % CI 0·5%, 10·5%) for uPDI, and VAT was significantly lower by 7·8 cm2 (95 % CI 2·0 cm2, 13·6 cm2) for PDI and 8·6 cm2 (95 % CI 4·1 cm2, 13·2 cm2) for hPDI but significantly higher by 6·2 cm2 (95 % CI 1·3 cm2, 11·1 cm2) for uPDI. No significant associations were observed for other inflammation biomarkers, muscle, or SAT. A plant-based diet, especially a healthful plant-based diet, may be associated with reduced inflammation and visceral adiposity among breast cancer survivors.
Posttraumatic stress disorder (PTSD) has been associated with advanced epigenetic age cross-sectionally, but the association between these variables over time is unclear. This study conducted meta-analyses to test whether new-onset PTSD diagnosis and changes in PTSD symptom severity over time were associated with changes in two metrics of epigenetic aging over two time points.
Methods
We conducted meta-analyses of the association between change in PTSD diagnosis and symptom severity and change in epigenetic age acceleration/deceleration (age-adjusted DNA methylation age residuals as per the Horvath and GrimAge metrics) using data from 7 military and civilian cohorts participating in the Psychiatric Genomics Consortium PTSD Epigenetics Workgroup (total N = 1,367).
Results
Meta-analysis revealed that the interaction between Time 1 (T1) Horvath age residuals and new-onset PTSD over time was significantly associated with Horvath age residuals at T2 (meta β = 0.16, meta p = 0.02, p-adj = 0.03). The interaction between T1 Horvath age residuals and changes in PTSD symptom severity over time was significantly related to Horvath age residuals at T2 (meta β = 0.24, meta p = 0.05). No associations were observed for GrimAge residuals.
Conclusions
Results indicated that individuals who developed new-onset PTSD or showed increased PTSD symptom severity over time evidenced greater epigenetic age acceleration at follow-up than would be expected based on baseline age acceleration. This suggests that PTSD may accelerate biological aging over time and highlights the need for intervention studies to determine if PTSD treatment has a beneficial effect on the aging methylome.
The 1994 discovery of Shor's quantum algorithm for integer factorization—an important practical problem in the area of cryptography—demonstrated quantum computing's potential for real-world impact. Since then, researchers have worked intensively to expand the list of practical problems that quantum algorithms can solve effectively. This book surveys the fruits of this effort, covering proposed quantum algorithms for concrete problems in many application areas, including quantum chemistry, optimization, finance, and machine learning. For each quantum algorithm considered, the book clearly states the problem being solved and the full computational complexity of the procedure, making sure to account for the contribution from all the underlying primitive ingredients. Separately, the book provides a detailed, independent summary of the most common algorithmic primitives. It has a modular, encyclopedic format to facilitate navigation of the material and to provide a quick reference for designers of quantum algorithms and quantum computing researchers.
Guideline-based tobacco treatment is infrequently offered. Electronic health record-enabled patient-generated health data (PGHD) has the potential to increase patient treatment engagement and satisfaction.
Methods:
We evaluated outcomes of a strategy to enable PGHD in a medical oncology clinic from July 1, 2021 to December 31, 2022. Among 12,777 patients, 82.1% received a tobacco screener about use and interest in treatment as part of eCheck-in via the patient portal.
Results:
We attained a broad reach (82.1%) and moderate response rate (30.9%) for this low-burden PGHD strategy. Patients reporting current smoking (n = 240) expressed interest in smoking cessation medication (47.9%) and counseling (35.8%). Most tobacco treatment requests made by patients via PGHD were addressed by their providers (40.6–80.3%). Among patients with active smoking, those who received/answered the screener (n = 309) were more likely to receive tobacco treatment compared with usual care patients who did not have the patient portal (n = 323) (OR = 2.72, 95% CI = 1.93–3.82, P < 0.0001), using propensity scores to adjust for the effects of age, sex, race, insurance, and comorbidity. Patients who received yet ignored the screener (n = 1024) were also more likely to receive tobacco treatment than usual care patients, but to a lesser extent (OR = 2.20, 95% CI = 1.68–2.86, P < 0.0001). We mapped observed and potential benefits to the Translational Science Benefits Model (TSBM).
Discussion:
PGHD via the patient portal appears to be a feasible, acceptable, scalable, and cost-effective approach to promote patient-centered care and tobacco treatment in cancer patients. Importantly, the PGHD approach serves as a real-world example of cancer prevention leveraging the TSBM.
The phenomenon of focusing of microwave beams in a plasma near a turning-point caustic is discussed by exploiting the analytical solution to the Gaussian beam-tracing equations in the two-dimensional (2-D) linear-layer problem. The location of maximum beam focusing and the beam width at that location are studied in terms of the beam initial conditions. This focusing must be taken into account to interpret Doppler backscattering (DBS) measurements. We find that the filter function that characterises the scattering intensity contribution along the beam path through the plasma is inversely proportional to the beam width, predicting enhanced scattering from the beam focusing region. We show that the DBS signal enhancement for decreasing incident angles between the beam path and the density gradient is due to beam focusing and not due to forward scattering, as was originally proposed by Gusakov et al. (Plasma Phys. Control. Fusion, vol. 56, 2014, 025009; Plasma Phys. Rep., vol. 43 (6), 2017, pp. 605–613). The analytic beam model is used to predict the measurement of the $k_y$ density-fluctuation wavenumber power spectrum via DBS, showing that, in an NSTX-inspired example, the spectral exponent of the turbulent, intermediate-to-high $k_y$ density-fluctuation spectrum might be quantitatively measurable via DBS, but not the spectral peak corresponding to the driving scale of the turbulent cascade.
This chapter covers quantum algorithmic primitives for loading classical data into a quantum algorithm. These primitives are important in many quantum algorithms, and they are especially essential for algorithms for big-data problems in the area of machine learning. We cover quantum random access memory (QRAM), an operation that allows a quantum algorithm to query a classical database in superposition. We carefully detail caveats and nuances that appear for realizing fast large-scale QRAM and what this means for algorithms that rely upon QRAM. We also cover primitives for preparing arbitrary quantum states given a list of the amplitudes stored in a classical database, and for performing a block-encoding of a matrix, given a list of its entries stored in a classical database.
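The QRAM operation described above is conventionally written as a unitary query acting in superposition (standard notation, not specific to this book):

```latex
% QRAM query on classical data x_0, \dots, x_{N-1}:
\[
  \mathrm{QRAM}\colon \; \sum_{i=0}^{N-1} \alpha_i \, | i \rangle | 0 \rangle
  \;\mapsto\; \sum_{i=0}^{N-1} \alpha_i \, | i \rangle | x_i \rangle .
\]
% A single query addresses all N entries in superposition; the caveats
% discussed in the chapter concern the hardware cost of realizing this
% operation quickly at large N.
```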
This chapter covers the multiplicative weights update method, a quantum algorithmic primitive for certain continuous optimization problems. This method is a framework for classical algorithms, but it can be made quantum by incorporating the quantum algorithmic primitive of Gibbs sampling and amplitude amplification. The framework can be applied to solve linear programs and related convex problems, or generalized to handle matrix-valued weights and used to solve semidefinite programs.
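The classical skeleton of the multiplicative weights framework — which quantum versions accelerate by replacing the weight bookkeeping with subroutines such as Gibbs sampling — can be sketched on a generic "experts" instance. The loss matrix and learning rate below are illustrative assumptions, not from the book:

```python
import numpy as np

def mwu(loss_rounds, eta=0.1):
    """Classical multiplicative weights: keep a distribution over
    'experts' and exponentially down-weight those that incur loss."""
    n = loss_rounds.shape[1]
    w = np.ones(n)                      # initial uniform weights
    total = 0.0
    for losses in loss_rounds:          # per-round losses in [0, 1]
        p = w / w.sum()
        total += p @ losses             # expected loss this round
        w *= np.exp(-eta * losses)      # multiplicative update
    return total, w / w.sum()

rng = np.random.default_rng(0)
T, n = 500, 10
L = rng.random((T, n))
L[:, 3] *= 0.2                          # expert 3 is reliably better
alg_loss, p = mwu(L)
print(f"algorithm loss {alg_loss:.1f}, "
      f"best expert loss {L.sum(axis=0).min():.1f}")
```

The method's guarantee is that the total expected loss stays within an additive $O(\eta T + \ln(n)/\eta)$ of the best single expert; the semidefinite-programming variant replaces the weight vector with a matrix exponential, which is where quantum Gibbs sampling enters.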
This chapter covers quantum algorithmic primitives related to linear algebra. We discuss block-encodings, a versatile and abstract access model that features in many quantum algorithms. We explain how block-encodings can be manipulated, for example by taking products or linear combinations of them. We discuss the techniques of quantum signal processing, qubitization, and quantum singular value transformation, which unify many quantum algorithms into a common framework.
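For reference, the block-encoding access model mentioned above is commonly defined as follows in the quantum singular value transformation literature:

```latex
% A unitary U is an (\alpha, a, \varepsilon)-block-encoding of a matrix A if
\[
  \bigl\| A \;-\; \alpha \,
    \bigl( \langle 0 |^{\otimes a} \otimes I \bigr) \, U \,
    \bigl( | 0 \rangle^{\otimes a} \otimes I \bigr) \bigr\|
  \;\le\; \varepsilon ,
\]
% i.e. A/\alpha sits (up to error \varepsilon) in the top-left block of U:
\[
  U \;=\; \begin{pmatrix} A/\alpha & \cdot \\ \cdot & \cdot \end{pmatrix} .
\]
% Block-encodings compose: multiplying an (\alpha, a, \delta)-block-encoding
% of A with a (\beta, b, \epsilon)-block-encoding of B yields an
% (\alpha\beta,\; a + b,\; \alpha\epsilon + \beta\delta)-block-encoding of AB.
```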