In this chapter we introduce the problem of analyzing low-probability events, the subject of large deviation theory. The problem is usually solved by computing moment-generating functions and applying Fenchel-Legendre conjugation. It turns out, however, that these steps can be interpreted information-theoretically in terms of information projection. We show how to solve the information projection problem in the special case of linear constraints, connecting the solution to exponential families.
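To make the first two steps concrete, the classical Chernoff-Cramér argument combines the log moment-generating function with its Fenchel-Legendre conjugate; the notation below is generic rather than the chapter's own, and standard regularity conditions are omitted.

```latex
% Log-MGF, its Fenchel-Legendre conjugate, and the resulting Chernoff bound
\[
  \psi(\lambda) = \log \mathbb{E}\, e^{\lambda X}, \qquad
  \psi^{*}(a)   = \sup_{\lambda \in \mathbb{R}} \bigl\{ \lambda a - \psi(\lambda) \bigr\},
\]
\[
  \mathbb{P}\Bigl( \tfrac{1}{n} \textstyle\sum_{i=1}^{n} X_i \ge a \Bigr)
  \le e^{-n\, \psi^{*}(a)}
  \qquad \text{for } a > \mathbb{E}[X].
\]
% Information-projection view: the same exponent is a minimum KL divergence,
% attained by an exponentially tilted (exponential-family) member of P
\[
  \psi^{*}(a) = \min_{Q:\ \mathbb{E}_Q[X] \ge a} D(Q \,\|\, P),
  \qquad
  \frac{dQ_{\lambda}}{dP}(x) = e^{\lambda x - \psi(\lambda)}.
\]
```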
In Chapter 20 we study data transmission with constraints on the channel input. For example, how many bits per channel use can we transmit under constraints on the codewords? To answer this question in general, we need to extend the setup and coding theorems to channels with input constraints. After doing that we will apply these results to compute the capacities of various Gaussian channels (memoryless, with intersymbol interference and subject to fading).
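As a point of reference for the constrained setting, the memoryless additive white Gaussian noise channel with an average power constraint has the classical capacity formula below; the notation is generic and not tied to the chapter.

```latex
% AWGN channel Y = X + Z, Z ~ N(0, sigma^2), average power constraint E[X^2] <= P
\[
  C(P) = \max_{P_X:\ \mathbb{E}[X^2] \le P} I(X;Y)
       = \frac{1}{2} \log_2\!\left( 1 + \frac{P}{\sigma^2} \right)
  \quad \text{bits per channel use},
\]
% achieved by a Gaussian input X ~ N(0, P)
```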
This study introduces an innovative methodology for mortality forecasting, which integrates signature-based methods within the functional data framework of the Hyndman–Ullah (HU) model. This new approach, termed the Hyndman–Ullah with truncated signatures (HUts) model, aims to enhance the accuracy and robustness of mortality predictions. By utilizing signature regression, the HUts model is able to capture complex, nonlinear dependencies in mortality data, which enhances forecasting accuracy across various demographic conditions. The model is applied to mortality data from 12 countries, and its forecasting performance is compared against variants of the HU models across multiple forecast horizons. Our findings indicate that, overall, the HUts model not only provides more precise point forecasts but also shows robustness against data irregularities, such as those observed in countries with historical outliers. The integration of signature-based methods enables the HUts model to capture complex patterns in mortality data, making it a powerful tool for actuaries and demographers. Prediction intervals are also constructed with bootstrapping methods.
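As a rough illustration of the feature construction behind such signature-based regressions (not the HUts implementation itself), the sketch below computes a truncated signature up to level 2 for a piecewise-linear path with plain numpy; the toy mortality curve and function names are our own assumptions.

```python
import numpy as np

def truncated_signature_level2(path):
    """Level-1 and level-2 signature terms of a piecewise-linear path.

    path : array of shape (n_points, d); each row is a point of the path.
    Returns (S1, S2) with S1 of shape (d,) and S2 of shape (d, d).
    Uses Chen's identity segment by segment; a straight segment with
    increment `delta` contributes delta at level 1 and
    0.5 * outer(delta, delta) at level 2.
    """
    path = np.asarray(path, dtype=float)
    d = path.shape[1]
    S1 = np.zeros(d)
    S2 = np.zeros((d, d))
    for delta in np.diff(path, axis=0):
        S2 += np.outer(S1, delta) + 0.5 * np.outer(delta, delta)
        S1 += delta
    return S1, S2

# Toy illustration (hypothetical data): treat (age, log mortality rate) pairs
# as a 2-D path and use the flattened signature terms as regression features.
ages = np.linspace(0, 100, 21)
log_rates = -5.0 + 0.04 * ages               # made-up log mortality curve
S1, S2 = truncated_signature_level2(np.column_stack([ages, log_rates]))
features = np.concatenate([S1, S2.ravel()])  # plug into any linear regressor
print(features.shape)                        # (6,)
```

Higher truncation levels follow the same Chen-identity update with higher-order tensors, at the cost of a rapidly growing feature dimension.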
This paper introduces a theoretical framework that offers a closed-form expression for the tail variance (TV) of the novel family of generalised hyper-elliptical (GHE) distributions. The GHE family combines an elliptical distribution with the generalised inverse Gaussian (GIG) distribution, resulting in a highly adaptable and powerful model. Expanding upon the findings of Ignatieva and Landsman (2021, Insurance: Mathematics and Economics, 101, 437–465) regarding the tail conditional expectation (TCE), this study demonstrates the significance of the TV as an additional risk measure that provides valuable insights into the tail risk and effectively captures the variability within the loss distribution’s tail. To validate the theoretical results, we perform an empirical analysis on two specific cases: the Laplace–GIG and the Student-t–GIG mixtures. By incorporating the TV derived for the GHE family, we are able to quantify correlated risks in a multivariate portfolio more efficiently. This contribution is particularly relevant to the insurance and financial industries, as it offers a reliable method for accurately assessing the risks associated with extreme losses. Overall, this paper presents an innovative and rigorous approach that enhances our understanding of risk assessment within the financial and insurance sectors. The derived expressions for the TV, in addition to the TCE, within the GHE family of distributions provide valuable insights and practical tools for effectively managing risk.
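For readers comparing the two risk measures, the standard definitions of the tail conditional expectation and tail variance at confidence level q are as follows (generic notation, not the paper's closed-form GHE expressions).

```latex
\[
  \mathrm{TCE}_q(X) = \mathbb{E}\bigl[\, X \mid X > \mathrm{VaR}_q(X) \,\bigr],
  \qquad
  \mathrm{TV}_q(X)  = \mathrm{Var}\bigl(\, X \mid X > \mathrm{VaR}_q(X) \,\bigr)
                    = \mathbb{E}\bigl[\, X^2 \mid X > \mathrm{VaR}_q(X) \,\bigr]
                      - \mathrm{TCE}_q(X)^2,
\]
% where VaR_q(X) denotes the q-level quantile of the loss distribution of X
```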
Engineering machines are becoming increasingly complex and possess more control variables, increasing the complexity and versatility of the control systems. Different configurations of the control system, referred to as policies, can result in similar output behavior but with different resource or component life usage. There is therefore an opportunity to find optimal policies with respect to economic decisions. While many solutions have been proposed to find such economic policy decisions at the asset level, we consider this problem at the fleet level. In this case, the optimal operation of each asset is affected by the state of all other assets in the fleet. Challenges introduced by considering multiple assets include the construction of economic multi-objective optimization criteria, the handling of rare events such as failures, the application of fleet-level constraints, and scalability. The proposed solution presents a framework for economic fleet optimization. The framework is demonstrated for economic criteria relating to resource usage, component lifing, and maintenance scheduling, but is generically extensible. Direct optimization of lifetime distributions is considered in order to avoid the computational burden of discrete event simulation of rare events. Results are provided for a real-world case study targeting the optimal economic operation of a fleet of aerospace gas turbine engines.
This paper proposes to solve the vortex gust mitigation problem on a 2D, thin flat plate using onboard measurements. The objective is to solve the discrete-time optimal control problem of finding the pitch rate sequence that minimizes the lift perturbation, that is, a cost criterion based on the lift coefficient obtained from the unsteady vortex lattice method. The controller is modeled as an artificial neural network, and it is trained to minimize this criterion using deep reinforcement learning (DRL). To be optimal, we show that the controller must take as inputs the locations and circulations of the gust vortices, but these quantities are not directly observable from the onboard sensors. We therefore propose to use a Kalman particle filter (KPF) to estimate the gust vortices online from the onboard measurements. The reconstructed input is then used by the controller to calculate the appropriate pitch rate. We evaluate the performance of this method for gusts composed of one to five vortices. Our results show that (i) controllers deployed with full knowledge of the vortices are able to efficiently mitigate the lift disturbance induced by the gusts, (ii) the KPF performs well in reconstructing gusts composed of fewer than three vortices, but shows more mixed results for gusts composed of more vortices, and (iii) adding a KPF to the controller recovers a significant part of the performance loss due to the unobservable gust vortices.
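To give a feel for the online estimation step, the sketch below runs a plain bootstrap particle filter (a simpler relative of the KPF used in the paper) for a single point vortex observed through noisy induced-velocity measurements at one probe; the dynamics, sensor model, noise levels, and all names are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def induced_velocity(state, probe=np.array([0.0, 0.0])):
    """Velocity induced at `probe` by a 2-D point vortex (Biot-Savart law).
    state: (..., 3) array holding [x, y, circulation]."""
    dx = probe[0] - state[..., 0]
    dy = probe[1] - state[..., 1]
    r2 = dx**2 + dy**2 + 1e-6                # small core to avoid the singularity
    u = -state[..., 2] * dy / (2.0 * np.pi * r2)
    v =  state[..., 2] * dx / (2.0 * np.pi * r2)
    return np.stack([u, v], axis=-1)

# "True" vortex convecting past the probe (assumed toy dynamics)
dt, n_steps, meas_std = 0.05, 60, 0.02
true_state = np.array([-2.0, 0.3, 1.0])      # [x, y, circulation]
advect = np.array([1.0, 0.0, 0.0])           # convects downstream at unit speed

# Bootstrap particle filter over [x, y, circulation]
n_p = 2000
particles = np.column_stack([
    rng.uniform(-3.0, 0.0, n_p),             # x
    rng.uniform(-1.0, 1.0, n_p),             # y
    rng.uniform(0.2, 2.0, n_p),              # circulation
])
weights = np.full(n_p, 1.0 / n_p)

for _ in range(n_steps):
    true_state = true_state + advect * dt
    meas = induced_velocity(true_state) + rng.normal(0.0, meas_std, 2)

    # propagate particles with the assumed dynamics plus process noise
    particles = particles + advect * dt + rng.normal(0.0, [0.02, 0.02, 0.01], (n_p, 3))

    # reweight by the Gaussian measurement likelihood (stabilized in log space)
    err = induced_velocity(particles) - meas
    loglik = -0.5 * np.sum(err**2, axis=-1) / meas_std**2
    weights *= np.exp(loglik - loglik.max())
    weights /= weights.sum()

    # multinomial resampling when the effective sample size collapses
    if 1.0 / np.sum(weights**2) < n_p / 2:
        idx = rng.choice(n_p, size=n_p, p=weights)
        particles, weights = particles[idx], np.full(n_p, 1.0 / n_p)

print("true vortex     :", true_state)
print("filter estimate :", weights @ particles)
```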
A recent outbreak of cryptosporidiosis (Cryptosporidium parvum, subtype IIdA23G1) among veterinary students associated with extracurricular activities involving lambs is described from Norway. Although cryptosporidiosis outbreaks among veterinary students have been frequently reported, this is among the first linked to lamb contact. Cryptosporidium oocysts were detected in samples from two students and three lambs. A questionnaire, distributed immediately after the outbreak was recognized, identified an attack rate of 50% based on exposure and illness among exposed students (28 of 56), despite most reporting good or very good hygiene measures. Laboratory diagnostics confirmed infection in two of these. The illness lasted over a week in most students (up to 15 days), but contact with health services was negligible. In addition to implementing measures to reduce the likelihood of further such outbreaks among veterinary students, it is recommended that future outbreaks of diarrhoea among ruminants on farms be investigated for aetiological agents.
Recently, Kurisu and Otsu (2022b, Econometric Theory 38(1), 172–193) derived the uniform convergence rates for the nonparametric deconvolution estimators proposed by Li and Vuong (1998, Journal of Multivariate Analysis 65(2), 139–165). This article shows that faster uniform convergence rates can be established for their estimators under the same assumptions. In addition, a new class of deconvolution estimators based on a variant of Kotlarski’s identity is also proposed. It is shown that in some cases, these new estimators can have faster uniform convergence rates than the existing estimators.
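For context, one common statement of Kotlarski's identity, which underlies both the Li-Vuong estimators and the variant proposed here, is given below in generic notation with regularity conditions omitted.

```latex
% Repeated noisy measurements of a latent X:
%   Y_1 = X + eps_1,  Y_2 = X + eps_2,
%   with X, eps_1, eps_2 mutually independent and E[eps_1] = 0
\[
  \varphi_X(t)
  = \exp\!\left( \int_{0}^{t}
      \frac{\partial \varphi_{Y_1,Y_2}(u,s)/\partial u \,\big|_{u=0}}
           {\varphi_{Y_1,Y_2}(0,s)} \, ds \right),
  \qquad
  \varphi_{Y_1,Y_2}(u,s) = \mathbb{E}\, e^{\,i(uY_1 + sY_2)},
\]
% so the characteristic function of the unobserved X (and hence its density,
% by Fourier inversion) is identified from the joint distribution of (Y_1, Y_2).
```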
We establish theoretical results about the low frequency contamination (i.e., long memory effects) induced by general nonstationarity for estimates such as the sample autocovariance and the periodogram, and deduce consequences for heteroskedasticity and autocorrelation robust (HAR) inference. We present explicit expressions for the asymptotic bias of these estimates. We show theoretically that nonparametric smoothing over time is robust to low frequency contamination. Nonstationarity can have consequences for both the size and power of HAR tests. Under the null hypothesis there are larger size distortions than when data are stationary. Under the alternative hypothesis, existing long-run variance (LRV) estimators tend to be inflated and HAR tests can exhibit dramatic power losses. Our theory indicates that long-bandwidth or fixed-b HAR tests suffer more from low frequency contamination than HAR tests based on heteroskedasticity and autocorrelation consistent (HAC) estimators, whereas the recently introduced double kernel HAC (DK-HAC) estimators do not suffer from this problem. We present second-order Edgeworth expansions under nonstationarity about the distribution of HAC and DK-HAC estimators and about the corresponding t-test in the regression model. The results show that the distortions in the rejection rates can be induced by time variation in the second moments even when there is no break in the mean.
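For reference, the class of kernel-based long-run variance estimators whose contamination is being characterized takes the familiar form below (generic notation); as noted in the abstract, estimators that additionally smooth over time, such as the DK-HAC class, are robust to this contamination.

```latex
\[
  \widehat{\Omega}_{\mathrm{HAC}}
  = \sum_{j=-(T-1)}^{T-1} k\!\left( \frac{j}{b_T} \right) \widehat{\Gamma}(j),
  \qquad
  \widehat{\Gamma}(j) = \frac{1}{T} \sum_{t=|j|+1}^{T} \widehat{v}_t\, \widehat{v}_{t-|j|}',
\]
% v_t denotes the estimated scores or moment conditions, k a kernel, and b_T a
% bandwidth; low frequency contamination enters through the bias of the sample
% autocovariances \widehat{\Gamma}(j) when the moments of v_t vary over time.
```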
In practice, nondestructive testing (NDT) procedures tend to consider experiments (and their respective models) as distinct, conducted in isolation, and associated with independent data. In contrast, this work looks to capture the interdependencies between acoustic emission (AE) experiments (as meta-models) and then use the resulting functions to predict the model hyperparameters for previously unobserved systems. We utilize a Bayesian multilevel approach (similar to deep Gaussian Processes) where a higher-level meta-model captures the inter-task relationships. Our key contribution is how knowledge of the experimental campaign can be encoded between tasks as well as within tasks. We present an example of AE time-of-arrival mapping for source localization, to illustrate how multilevel models naturally lend themselves to representing aggregate systems in engineering. We constrain the meta-model based on domain knowledge, then use the inter-task functions for transfer learning, predicting hyperparameters for models of previously unobserved experiments (for a specific design).
In this paper, we analyze a polling system on a circle. Random batches of customers arrive at a circle, where each customer, independently, obtains a location that is uniformly distributed on the circle. A single server cyclically traverses the circle to serve all customers. Using mean value analysis, we derive the expected number of waiting customers within a given distance of the server. We exploit this to obtain closed-form expressions for both the mean batch sojourn time and the mean time to delivery.
In May 2017, whole-genome sequencing (WGS) became the primary subtyping method for Salmonella in Canada. As a result of the increased discriminatory power provided by WGS, 16 multi-jurisdictional outbreaks of Salmonella associated with frozen raw breaded chicken products were identified between 2017 and 2019. The majority (15/16) were associated with S. Enteritidis, while the remaining outbreak was associated with S. Heidelberg. The 16 outbreaks included a total of 487 cases with ages ranging from 0 to 98 years (median: 24 years); 79 hospitalizations and two deaths were reported. Over the course of the outbreak investigations, 14 frozen raw breaded chicken products were recalled, and one was voluntarily withdrawn from the market. After previous changes to labelling and the issuance of public communication for these products proved ineffective at reducing illnesses, new industry requirements were issued in 2019, requiring the implementation of measures at the manufacturing/processing level to reduce Salmonella to below detectable amounts in frozen raw breaded chicken products. Since implementation, no further outbreaks of Salmonella associated with frozen raw breaded chicken have been identified in Canada, a testament to the effectiveness of these risk mitigation measures.
This commentary explores MENA’s AI governance, addressing gaps, showcasing successful strategies, and comparing national approaches. It emphasizes current deficiencies, highlights regional contributions to global AI governance, and offers insights into effective frameworks. The study reveals distinctions and trends in MENA’s national AI strategies, serving as a concise resource for policymakers and industry stakeholders.
Shiga toxin-producing Escherichia coli (STEC) is a group of bacteria that causes gastrointestinal illness and occasionally causes large foodborne outbreaks. It represents a major public health concern due to its ability to cause severe illness, which can sometimes be fatal. This study was undertaken as part of a rapid investigation into a national foodborne outbreak of STEC O145. On 22 May 2024, United Kingdom (UK) public health agencies and laboratories identified an increase in stool specimen submissions and patients testing positive for STEC. Whole genome sequencing (WGS) identified the causative agent as serotype O145:H28 stx2a/eae, with cases belonging to the same five single nucleotide polymorphism (SNP) single-linkage cluster. By 3 July 2024, 288 cases had been linked to the cluster. Most cases were adults (87%) and females (57%); 49% were hospitalized, with a further 10% attending emergency care. Descriptive epidemiology and analytical studies were conducted, which identified consumption of nationally distributed pre-packed sandwiches as a common food exposure. The implicated food business operators voluntarily recalled ready-to-eat sandwiches and wraps containing lettuce on 14 June 2024.
The financial burden of hospitalization from life-threatening infectious diseases on the U.S. healthcare system is substantial and continues to increase. The purpose of this study was to identify key predictors of high hospital charges for infective endocarditis at a major university-affiliated cardiac care centre in West Virginia.
A retrospective review of electronic medical records was undertaken of all adult patients admitted for endocarditis between 2014 and 2018. Multiple linear regression analysis was used to assess the total charges billed to the patient account for the endocarditis hospitalization recorded in the medical record.
Hospital charges increased 12-fold during 2014–2018. Among the 486 patients, the median hospital charge was $198 678. About 47% of the patients underwent surgery, incurring 70% of the total charges. Patients with hospital stays of ≥50 days accounted for a third of all charges. The multiple linear regression model accounted for 85% of the linear variance in the hospital charges. Median charges increased by 30.87% for patients with ≥9 consultations, 60.32% for those who died in the hospital, and 81.85% for those who underwent surgical intervention.
The study findings showed that complex care requiring multiple consultations, surgical interventions, and longer hospital stays was significantly associated with higher hospital charges for endocarditis treatment.
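The reported percentage increases are consistent with a regression fit to log-transformed charges, an assumption on our part since the abstract does not state the transformation; under that reading, a coefficient β converts to a percentage change in charges as follows.

```latex
% Hypothetical log-linear reading of the reported effects
\[
  \%\Delta\,\text{charges} = 100 \times \bigl( e^{\beta} - 1 \bigr),
\]
% e.g. the reported 60.32% increase for in-hospital death would correspond to
% beta = ln(1.6032) \approx 0.47 on the log-charge scale.
```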