Deep neural network models have substantial advantages over traditional statistical and machine learning methods, making this class of models particularly promising for adoption by actuaries. Nonetheless, several important aspects of these models have not yet been studied in detail in the actuarial literature: the effect of hyperparameter choice on the accuracy and stability of network predictions, methods for producing uncertainty estimates, and the design of deep learning models for explainability. To allow actuaries to incorporate deep learning safely into their toolkits, we review these areas in the context of a deep neural network for forecasting mortality rates.
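The abstract's two themes of prediction stability and uncertainty estimation can be illustrated with a deep-ensemble sketch: the same small network is refit from several random initialisations on synthetic Gompertz-style log-mortality data, and the spread of the ensemble's predictions serves as a simple uncertainty estimate. Everything here (the data-generating law, the one-hidden-layer architecture, the hyperparameters) is an illustrative assumption, not the paper's model.

```python
import numpy as np

# Synthetic Gompertz-style log-mortality data (hypothetical, for illustration):
# log m(x) = a + b*x plus noise, over ages 30..90.
ages = np.linspace(30, 90, 61)
log_m = -9.0 + 0.09 * ages + np.random.default_rng(0).normal(0, 0.05, ages.size)
x = (ages - ages.mean()) / ages.std()          # standardise the input

def train_net(seed, hidden=8, lr=0.05, epochs=2000):
    """One-hidden-layer tanh network fit by full-batch gradient descent."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0, 0.5, (1, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    X, y = x[:, None], log_m[:, None]
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)               # forward pass
        err = (H @ W2 + b2) - y                # dL/dpred for MSE/2 loss
        gW2 = H.T @ err / len(X); gb2 = err.mean(0)
        dH = (err @ W2.T) * (1 - H**2)         # backprop through tanh
        gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
        W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1
    return lambda xs: (np.tanh(xs[:, None] @ W1 + b1) @ W2 + b2).ravel()

# Deep ensemble: refit from 5 seeds; the standard deviation across members
# quantifies the sensitivity of the prediction to initialisation.
nets = [train_net(s) for s in range(5)]
x_new = np.array([(70 - ages.mean()) / ages.std()])
preds = np.array([f(x_new)[0] for f in nets])
print(f"log m(70): mean={preds.mean():.3f}, ensemble sd={preds.std():.3f}")
```

A large ensemble spread at a given age would flag exactly the kind of hyperparameter-driven instability the abstract highlights.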
The objectives of this study were to define risk factors for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection in University of Cambridge (UoC) students during a period of increased incidence in October and November 2020. The study design was a survey.
Routine public health surveillance identified an increase in the numbers of UoC students with confirmed SARS-CoV-2 positivity in the 10 days after a national lockdown was announced in the UK on 5 November 2020. Cases were identified both through symptom-triggered testing and a universal asymptomatic testing programme. An online questionnaire was sent to all UoC students on 25 November 2020 to investigate risk factors for testing positive in the period after 30 October 2020. The questionnaire asked about symptoms, SARS-CoV-2 test results, aspects of university life, and attendance at social events in the week prior to lockdown. Univariate and multivariable analyses were undertaken evaluating potential risk factors for SARS-CoV-2 positivity.
Among 3980 students responding to the questionnaire, 99 (2.5%) reported testing SARS-CoV-2 positive in the period studied; 28 (28%) were asymptomatic. We found strong independent associations between SARS-CoV-2 positivity and attendance at two social settings in the City of Cambridge (adjusted odds ratios 13.0 (95% CI 6.2–26.9) and 14.2 (95% CI 2.9–70.0)), with weaker evidence of association with three further social settings. By contrast, we did not observe strong independent associations between disease risk and accommodation type or attendance at a range of activities associated with the university curriculum.
In conclusion, attendance at social settings can facilitate widespread SARS-CoV-2 transmission in university students. Efforts to constrain transmission in higher education settings need to address risks outside university premises as well as maintaining a COVID-safe environment within them.
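The multivariable analysis behind adjusted odds ratios of this kind can be sketched with a logistic model fitted by iteratively reweighted least squares (IRLS). The covariates, effect sizes, and sample below are hypothetical stand-ins, not the survey data; the point is only how an adjusted odds ratio and its Wald confidence interval arise from the fitted coefficients.

```python
import numpy as np

# Hypothetical data: attendance at a social venue plus one accommodation
# covariate, with an assumed true venue effect of 2.5 on the log-odds scale.
rng = np.random.default_rng(1)
n = 4000
venue = rng.binomial(1, 0.1, n)            # attended the social setting
shared_flat = rng.binomial(1, 0.5, n)      # accommodation covariate
logit = -4.0 + 2.5 * venue + 0.1 * shared_flat
positive = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Multivariable logistic regression via Newton/IRLS iterations.
X = np.column_stack([np.ones(n), venue, shared_flat])
beta = np.zeros(3)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    H = X.T @ (X * (p * (1 - p))[:, None])   # Fisher information
    beta += np.linalg.solve(H, X.T @ (positive - p))

# Adjusted odds ratio and 95% Wald interval for venue attendance.
se = np.sqrt(np.diag(np.linalg.inv(H)))
or_venue = np.exp(beta[1])
ci = np.exp(beta[1] + np.array([-1.96, 1.96]) * se[1])
print(f"adjusted OR (venue) = {or_venue:.1f}, 95% CI {ci[0]:.1f}-{ci[1]:.1f}")
```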
Annual seasonal influenza vaccination is recommended in many locations for individuals at high risk of developing post-infection complications. However, reduced vaccine immunogenicity and effectiveness have been observed among repeat vaccinees in some influenza seasons. We investigated the impact of repeated influenza vaccination on relative vaccine effectiveness (VE) among individuals recommended for influenza vaccination in a retrospective cohort study using data from the Clinical Practice Research Datalink, a primary care database in the United Kingdom. Relative VE was estimated against general practitioner-diagnosed influenza-like illnesses (GP-ILI) and medically attended acute respiratory illnesses (MAARI) among participants who had been repeatedly vaccinated compared with first-time vaccinees using proportional hazards models. Relative VE against MAARI may be reduced for individuals above 65 years old who were vaccinated in the current and previous influenza seasons for some influenza seasons. However, these findings were not conclusive as we could not exclude the possibility of residual confounding in our dataset. The use of routinely collected data from electronic health records to examine the effects of repeated vaccination needs to be complemented with sufficient efforts to include negative control outcomes to rule out residual confounding.
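The proportional hazards comparison of repeat versus first-time vaccinees can be sketched with a one-covariate Cox model fitted by Newton's method on the Breslow partial likelihood. This is a stylised toy, not the CPRD analysis: event times are simulated from an assumed log hazard ratio of 0.4, with no censoring and no adjustment covariates.

```python
import numpy as np

# Simulated cohort: a binary repeat-vaccination indicator and exponential
# event times whose rate depends on an assumed log hazard ratio of 0.4.
rng = np.random.default_rng(2)
n = 800
repeat_vax = rng.binomial(1, 0.5, n)
true_log_hr = 0.4
t = rng.exponential(1 / np.exp(true_log_hr * repeat_vax))

order = np.argsort(t)                 # every subject fails (no censoring)
x = repeat_vax[order].astype(float)

# Newton-Raphson on the Breslow partial likelihood for a single covariate.
beta = 0.0
for _ in range(20):
    score, info = 0.0, 0.0
    for i in range(n):                # risk set at the i-th event: subjects i..n-1
        xr = x[i:]
        w = np.exp(beta * xr)
        xbar = (w * xr).sum() / w.sum()             # weighted mean in risk set
        score += x[i] - xbar                        # partial-likelihood score
        info += (w * (xr - xbar) ** 2).sum() / w.sum()  # observed information
    beta += score / info
print(f"estimated hazard ratio: {np.exp(beta):.2f}")
```

In the VE setting, a hazard ratio above 1 for repeat vaccinees against MAARI would correspond to the reduced relative VE the abstract describes.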
Many classic networks grow by hooking small components via vertices. We introduce a class of networks that grows by fusing the edges of a small graph to an edge chosen uniformly at random from the network. For this random edge-hooking network, we study the local degree profile, that is, the evolution of the average degree of a vertex over time. For a special subclass, we further determine the exact distribution and an asymptotic gamma-type distribution. We also study the “core,” which consists of the well-anchored edges that experience fusing. A central limit theorem emerges for the size of the core.
At the end, we look at an alternative model of randomness attained by preferential hooking, favoring edges that experience more fusing. Under preferential hooking, the core still follows a Gaussian law but with different parameters. Throughout, Pólya urns are systematically used as a method of proof.
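The urn dynamics behind the core can be mimicked with a stylised simulation; the details here are assumptions rather than the paper's exact model. Each step picks an edge uniformly at random, that edge joins the "core" of well-anchored edges that have experienced fusing, and the hooked graph contributes k new edges. Repeating the run shows the concentration and Gaussian-type fluctuations of the core size.

```python
import numpy as np

def core_size(steps, k=2, seed=0):
    """Uniform edge-hooking sketch: track (core edges, total edges) as an urn."""
    rng = np.random.default_rng(seed)
    edges, core = 1, 0            # start from a single non-core edge
    for _ in range(steps):
        picked_core = rng.random() < core / edges   # uniform pick hits the core?
        if not picked_core:
            core += 1             # a fresh edge becomes well anchored
        edges += k                # edges contributed by the hooked graph
    return core

# Repeated runs: the core size concentrates around edges/(k+1) with
# Gaussian-looking fluctuations, in the spirit of the paper's CLT.
sizes = np.array([core_size(2000, seed=s) for s in range(300)])
print(f"core size after 2000 steps: mean={sizes.mean():.1f}, sd={sizes.std():.1f}")
```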
The SARS-CoV-2 Omicron variant has increased infectivity and immune escape compared with previous variants, and caused the surge of massive COVID-19 waves globally. Despite a vast majority (~90%) of the population of Santa Fe city, Argentina had been vaccinated and/or had been infected by SARS-CoV-2 when Omicron emerged, the epidemic wave that followed its arrival was by far the largest one experienced in the city. A serosurvey conducted prior to the arrival of Omicron allowed to assess the acquired humoral defences preceding the wave and to conduct a longitudinal study to provide individual-level real-world data linking antibody levels and protection against COVID-19 during the wave. A very large proportion of 1455 sampled individuals had immunological memory against COVID-19 at the arrival of Omicron (almost 90%), and about half (48.9%) had high anti-spike immunoglobulin G levels (>200 UI/ml). However, the antibody titres varied greatly among the participants, and such variability depended mainly on the vaccine platform received, on having had COVID-19 previously and on the number of days elapsed since last antigen exposure (vaccine shot or natural infection). A follow-up of 514 participants provided real-world evidence of antibody-mediated protection against COVID-19 during a period of high risk of exposure to an immune-escaping highly transmissible variant. Pre-wave antibody titres were strongly negatively associated with COVID-19 incidence and severity of symptoms during the wave. Also, receiving a vaccine shot during the follow-up period reduced the COVID-19 risk drastically (15-fold). These results highlight the importance of maintaining high defences through vaccination at times of high risk of exposure to immune-escaping variants.
We study approximations for the Lévy area of Brownian motion which are based on the Fourier series expansion and a polynomial expansion of the associated Brownian bridge. Comparing the asymptotic convergence rates of the Lévy area approximations, we see that the approximation resulting from the polynomial expansion of the Brownian bridge is more accurate than the Kloeden–Platen–Wright approximation, whilst still only using independent normal random vectors. We then link the asymptotic convergence rates of these approximations to the limiting fluctuations for the corresponding series expansions of the Brownian bridge. Moreover, and of interest in its own right, the analysis we use to identify the fluctuation processes for the Karhunen–Loève and Fourier series expansions of the Brownian bridge is extended to give a stand-alone derivation of the values of the Riemann zeta function at even positive integers.
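The object being approximated can be made concrete by computing the Lévy area A = (1/2)∫(W¹ dW² − W² dW¹) of a planar Brownian path with a simple left-point Riemann sum on a fine grid. This illustrates the quantity itself, not the paper's Fourier or polynomial expansions; the step count and path count are arbitrary choices.

```python
import numpy as np

# Sample many 2-D Brownian paths on [0,1] and approximate the Levy area
# by the left-point (Ito) Riemann sum over a fine partition.
rng = np.random.default_rng(3)
n_steps, n_paths = 2000, 400
dt = 1.0 / n_steps
dW = rng.normal(0, np.sqrt(dt), (n_paths, n_steps, 2))
W = np.cumsum(dW, axis=1) - dW        # left endpoints W_{t_i}, starting at 0
A = 0.5 * np.sum(W[:, :, 0] * dW[:, :, 1] - W[:, :, 1] * dW[:, :, 0], axis=1)
print(f"Levy area over [0,1]: mean={A.mean():.3f}, var={A.var():.3f}")
```

For a standard two-dimensional Brownian motion the Lévy area at time 1 has mean 0 and variance 1/4, which the sample moments should roughly reproduce.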
The risk factors specific to the elderly population for severe coronavirus disease 2019 (COVID-19) caused by the Omicron variant of concern (VOC) are not yet clear. We performed an exploratory analysis using logistic regression to identify risk factors for severe COVID-19 illness among 4,868 older adults with a positive severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) test result who were admitted to a healthcare facility between 1 January 2022 and 16 May 2022. We then conducted one-to-one propensity score (PS) matching for three factors – dementia, admission from a long-term care facility and poor physical activity status – and used Fisher's exact test to compare the proportion of severe COVID-19 cases in the matched data. We also estimated the average treatment effect on treated (ATT) in each PS matching analysis. Of the 4,868 cases analysed, 1,380 were severe. Logistic regression analysis showed that age, male sex, cardiovascular disease, cerebrovascular disease, chronic lung disease, renal failure and/or dialysis, physician-diagnosed obesity, admission from a long-term care facility and poor physical activity status were risk factors for severe disease. Vaccination and dementia were identified as factors associated with non-severe illness. The ATT for dementia, admission from a long-term care facility and poor physical activity status was −0.04 (95% confidence interval −0.07 to −0.01), 0.09 (0.06 to 0.12) and 0.17 (0.14 to 0.19), respectively. Our results suggest that poor physical activity status and living in a long-term care facility have a substantial association with the risk of severe COVID-19 caused by the Omicron VOC, while dementia may be associated with non-severe illness.
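The PS-matching step can be sketched end to end on hypothetical data: a logistic propensity model for the exposure, greedy 1:1 nearest-neighbour matching on the score, and Fisher's exact test on severity in the matched pairs. The covariate, the effect sizes, and the matching rule (greedy, no caliper) are all illustrative assumptions.

```python
import numpy as np
from scipy.stats import fisher_exact

# Hypothetical cohort: age drives both long-term-care residence (the
# exposure) and severe illness, with an assumed exposure effect of 0.8
# on the log-odds scale.
rng = np.random.default_rng(4)
n = 1200
age = rng.normal(80, 8, n)
ltc = rng.binomial(1, 1 / (1 + np.exp(-(age - 85) / 5)))
severe = rng.binomial(1, 1 / (1 + np.exp(-(-2 + 0.03 * (age - 80) + 0.8 * ltc))))

# Propensity scores P(ltc | age) from an IRLS-fitted logistic model.
X = np.column_stack([np.ones(n), (age - age.mean()) / age.std()])
b = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ b))
    H = X.T @ (X * (p * (1 - p))[:, None])
    b += np.linalg.solve(H, X.T @ (ltc - p))
ps = 1 / (1 + np.exp(-X @ b))

# Greedy 1:1 nearest-neighbour matching on the propensity score.
treated, control = np.where(ltc == 1)[0], np.where(ltc == 0)[0]
used, pairs = set(), []
for i in treated:
    j = min((c for c in control if c not in used),
            key=lambda c: abs(ps[c] - ps[i]))
    used.add(j); pairs.append((i, j))

# Fisher's exact test on severity within the matched sample.
t_idx = np.array([i for i, _ in pairs]); c_idx = np.array([j for _, j in pairs])
table = [[severe[t_idx].sum(), len(pairs) - severe[t_idx].sum()],
         [severe[c_idx].sum(), len(pairs) - severe[c_idx].sum()]]
odds, pval = fisher_exact(table)
print(f"matched pairs={len(pairs)}, severe: treated {severe[t_idx].mean():.2f} "
      f"vs control {severe[c_idx].mean():.2f}, p={pval:.3g}")
```

The difference in matched severe proportions is the sample analogue of the ATT the abstract reports.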
This paper studies the open-loop equilibrium strategies for a class of non-zero-sum reinsurance–investment stochastic differential games between two insurers with a state-dependent mean expectation in an incomplete market. Both insurers are able to purchase proportional reinsurance contracts and invest their wealth in a risk-free asset and a risky asset whose price is modeled by a general stochastic volatility model. The surplus processes of the two insurers are driven by two standard Brownian motions. The objective of each insurer is to find the equilibrium investment and reinsurance strategies that balance the expected return and variance of its relative terminal wealth. Using forward–backward stochastic differential equations (FBSDEs), we derive the sufficient conditions and obtain the general solutions of equilibrium controls for the two insurers. Furthermore, we apply our theoretical results to two special stochastic volatility models (the Hull–White model and the Heston model). Numerical examples are also provided to illustrate our results.
Understanding the subsurface is crucial in building a sustainable future, particularly for urban centers. Importantly, the thermal effects that anthropogenic infrastructure, such as buildings, tunnels, and ground heat exchangers, can have on this shared resource need to be well understood to avoid issues, such as overheating the ground, and to identify opportunities, such as extracting and utilizing excess heat. However, obtaining data for the subsurface can be costly, typically requiring the drilling of boreholes. Bayesian statistical methodologies can be used towards overcoming this, by inferring information about the ground by combining field data and numerical modeling, while quantifying associated uncertainties. This work utilizes data obtained in the city of Cardiff, UK, to evaluate the applicability of a Bayesian calibration (using GP surrogates) approach to measured data and associated challenges (previously not tested) and to obtain insights on the subsurface of the area. The importance of the data set size is analyzed, showing that more data are required in realistic (field data), compared to controlled conditions (numerically-generated data), highlighting the importance of identifying data points that contain the most information. Heterogeneity of the ground (i.e., input parameters), which can be particularly prominent in large-scale subsurface domains, is also investigated, showing that the calibration methodology can still yield reasonably accurate results under heterogeneous conditions. Finally, the impact of considering uncertainty in subsurface properties is demonstrated in an existing shallow geothermal system in the area, showing a higher than utilized ground capacity, and the potential for a larger scale system given sufficient demand.
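The calibration workflow described (field data plus numerical modelling, with a GP surrogate standing in for the expensive forward model) can be sketched in one dimension. Everything here is a toy assumption: a made-up forward model for a single ground parameter, a small GP training set, and a grid posterior with Gaussian measurement noise.

```python
import numpy as np

def forward(theta):
    """Stand-in for an expensive subsurface forward model (monotone toy)."""
    return 0.5 * theta + 0.3 * np.sin(theta)

# Train a GP surrogate (noise-free RBF regression) on 8 forward-model runs.
theta_tr = np.linspace(0, 3, 8)
y_tr = forward(theta_tr)
def rbf(a, b, ell=0.8):
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)
K = rbf(theta_tr, theta_tr) + 1e-8 * np.eye(8)   # jitter for stability
alpha = np.linalg.solve(K, y_tr)
def surrogate(theta):
    return rbf(np.atleast_1d(theta), theta_tr) @ alpha

# "Field" data: the true parameter is 1.7, observed with measurement noise.
rng = np.random.default_rng(5)
sigma = 0.05
obs = forward(1.7) + rng.normal(0, sigma, 5)

# Bayesian calibration: uniform prior on [0, 3] times Gaussian likelihood,
# evaluated on a grid using the cheap surrogate instead of the real model.
grid = np.linspace(0, 3, 601)
pred = surrogate(grid)
loglik = -0.5 * ((obs[None, :] - pred[:, None])**2).sum(1) / sigma**2
post = np.exp(loglik - loglik.max())
post /= post.sum() * (grid[1] - grid[0])         # normalise the density
print(f"posterior mode for theta: {grid[np.argmax(post)]:.2f}")
```

The posterior spread, not shown here, is what carries the uncertainty quantification the abstract emphasises for subsurface properties.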
Nontyphoidal salmonellosis is the leading reported foodborne illness in Florida. Although the diversity of Salmonella serotypes circulating in Florida has been identified, the geographical characteristics of the major serotypes are poorly described. Here we examined the geospatial patterns of 803 whole-genome sequenced Salmonella isolates within seven major serotypes (Enteritidis, Newport, Javiana, Sandiego, Braenderup, Typhimurium and I 4,[5],12:i:-) with the metadata obtained from the Florida Department of Health during 2017–2018. Geographically, the distribution of incidence rates varied distinctively between serotypes. Illnesses with Enteritidis and Newport serotypes were widespread in Florida. The incidence rate for Javiana was relatively higher in the north compared to the south. Typhimurium was concentrated in the northwest, while I 4,[5],12:i:-, the monophasic variant of Typhimurium, was limited to the south. We also evaluated space–time clustering of isolates at the zip code level using scan statistic models. Space–time clusters were detected for each major serotype during 2017–2018. The multinomial scan statistic found the risk of illness with Javiana was higher in the north and southwest in the fall of 2017 compared to other major serotypes. This serotype-specific clustering analysis will assist in further unpacking the associations between distinct reservoirs and illnesses with major serotypes in Florida.
Network mutual aid platforms are among the most popular risk-sharing models of recent years, with almost 200 million members in China. However, current mutual aid platforms do not satisfy actuarial rules in either the apportionment method or the pricing principle. Hence, this paper proposes a variety of mutual aid models that enable members with different risks to exchange their risks in a transparent and actuarially fair way. In addition, decision-making frameworks are constructed for participants choosing among the mutual aid platform, similar insurance products, or no risk sharing. Decisions are made based on the principle of maximizing expected utility. Moreover, the optimization problems of maximizing profit and minimizing risk are constructed, respectively. Through the principles of individual fairness and relative fairness, the platform's adverse selection problem can also be reduced. Finally, an actual mutual aid plan is compared with similar insurance products to discuss the advantages of the optimized plan.
We describe the investigation and management of a Cryptosporidium parvum outbreak linked to consumption of pasteurised milk from a vending machine. Multiple-locus variable-number tandem-repeat analysis was newly used, confirming that the C. parvum detected in human cases was indistinguishable from that in a calf on the farm. This strengthened the evidence for milk from an on-farm vending machine as the source of the outbreak because of post-pasteurisation contamination. Bacteriological indicators of post-pasteurisation contamination persisted after the initial hygiene improvement notice. We propose that on-farm milk vending machines may represent an emerging public health risk.
In a dynamic panel data model, the number of moment conditions increases rapidly with the time dimension, resulting in a large dimensional covariance matrix of the instruments. As a consequence, the generalized method of moments (GMM) estimator exhibits a large bias in small samples, especially when the autoregressive parameter is close to unity. To address this issue, we propose a regularized version of the one-step GMM estimator using three regularization schemes based on three different ways of inverting the covariance matrix of the instruments. Under double asymptotics, we show that our regularized estimators are consistent and asymptotically normal. These regularization schemes involve a tuning or regularization parameter which needs to be chosen. We derive a data-driven selection of this regularization parameter based on an approximation of the higher-order mean square error and show its optimality. As an empirical application, we estimate a model of income dynamics.
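Two of the regularization schemes in the spirit of those mentioned (Tikhonov/ridge-type and spectral cutoff) can be sketched as different regularised inverses of an ill-conditioned instrument covariance matrix; the panel-GMM machinery around them is omitted, and the matrix and tuning values are arbitrary illustrations.

```python
import numpy as np

# An ill-conditioned "instrument covariance" matrix: columns of Z are scaled
# so that the eigenvalues of S decay sharply, as with many moment conditions.
rng = np.random.default_rng(6)
Z = rng.normal(size=(200, 40)) @ np.diag(np.linspace(1, 1e-4, 40))
S = Z.T @ Z / 200

eigval, eigvec = np.linalg.eigh(S)

def tikhonov_inv(S, alpha):
    """Ridge-type regularised inverse (S + alpha*I)^{-1}."""
    return np.linalg.inv(S + alpha * np.eye(S.shape[0]))

def spectral_cutoff_inv(eigval, eigvec, alpha):
    """Invert only the eigenvalues above the threshold alpha; zero the rest."""
    inv = np.where(eigval > alpha, 1.0 / np.maximum(eigval, alpha), 0.0)
    return eigvec @ np.diag(inv) @ eigvec.T

# The tuning parameter alpha trades bias against the size of the inverse,
# which is what the paper's data-driven selection rule has to balance.
for a in (1e-2, 1e-4):
    t = tikhonov_inv(S, a)
    c = spectral_cutoff_inv(eigval, eigvec, a)
    print(f"alpha={a:g}: ||S_tik^-1||={np.linalg.norm(t):.1f}, "
          f"||S_cut^-1||={np.linalg.norm(c):.1f}")
```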
In the usual shock models, the shocks arrive from a single source. Bozbulut and Eryilmaz [(2020). Generalized extreme shock models and their applications. Communications in Statistics – Simulation and Computation 49(1): 110–120] introduced two types of extreme shock models when the shocks arrive from one of $m\geq 1$ possible sources. In Model 1, the shocks arrive from different sources over time. In Model 2, the shocks initially come from one randomly chosen source among the $m$ sources, and all subsequent shocks continue to arrive from that source. In this paper, we prove that the lifetime of Model 1 is less than that of Model 2 in the usual stochastic ordering. We further show that if the inter-arrival times of shocks have increasing failure rate distributions, then the usual stochastic ordering can be generalized to the hazard rate ordering. We study the stochastic behavior of the lifetime of Model 2 with respect to the severity of shocks using the notion of majorization. We apply the new stochastic ordering results to show that the age replacement policy under Model 1 is more costly than under Model 2.
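The ordering between the two models can be checked by Monte Carlo in a minimal illustrative setup (an assumption, not the paper's general framework): each source j delivers shocks that destroy the system with a fixed probability p[j], and the lifetime is the index of the first fatal shock.

```python
import numpy as np

# Assumed per-source failure probabilities for m = 3 sources.
rng = np.random.default_rng(7)
p = np.array([0.05, 0.10, 0.30])
n_sims = 20000

def lifetime_model1():
    """Model 1: each arriving shock comes from a freshly chosen uniform source."""
    t = 0
    while True:
        t += 1
        if rng.random() < p[rng.integers(len(p))]:
            return t

def lifetime_model2():
    """Model 2: one source is chosen at the start; the lifetime is geometric."""
    return rng.geometric(p[rng.integers(len(p))])

l1 = np.array([lifetime_model1() for _ in range(n_sims)])
l2 = np.array([lifetime_model2() for _ in range(n_sims)])
print(f"mean lifetime: Model 1 = {l1.mean():.1f}, Model 2 = {l2.mean():.1f}")
```

In this setup Model 1's lifetime is geometric with the averaged probability, while Model 2's is a mixture of geometrics; the mixture's longer mean reflects the stochastic ordering the paper proves.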
To assess whether there is some signal in a big database, aggregate tests for the global null hypothesis of no effect are routinely applied in practice before more specialized analysis is carried out. Although a plethora of aggregate tests is available, each test has its strengths but also its blind spots. In a Gaussian sequence model, we study whether it is possible to obtain a test with substantially better consistency properties than the likelihood ratio (LR; i.e., Euclidean norm-based) test. We establish an impossibility result, showing that in the high-dimensional framework we consider, the set of alternatives for which a test may improve upon the LR test (i.e., its superconsistency points) is always asymptotically negligible in a relative volume sense.
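The LR test discussed is just the squared Euclidean norm of the observation vector in the Gaussian sequence model Y = mu + noise, compared against a chi-square quantile. A minimal sketch, with an arbitrary dimension and an arbitrary sparse alternative:

```python
import numpy as np
from scipy.stats import chi2

# Gaussian sequence model: Y_i = mu_i + eps_i, eps_i ~ N(0, 1), i = 1..d.
# The LR test for the global null mu = 0 rejects when ||Y||^2 > chi2 quantile.
rng = np.random.default_rng(8)
d = 500
crit = chi2.ppf(0.95, df=d)

y_null = rng.normal(0, 1, d)                 # data under the global null
mu = np.zeros(d); mu[:25] = 3.0              # one (sparse) alternative
y_alt = mu + rng.normal(0, 1, d)

print(f"||Y||^2 under null: {np.sum(y_null**2):.1f}  (critical value {crit:.1f})")
print(f"||Y||^2 under alt : {np.sum(y_alt**2):.1f}")
```

The paper's impossibility result concerns how little of the alternative space any competing test can handle better than this simple norm-based statistic.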
This paper proposes a model-free test for the strict stationarity of a potentially vector-valued time series using the discrete Fourier transform (DFT) approach. We show that the DFT of a residual process based on the empirical characteristic function weakly converges to a zero spectrum in the frequency domain for a strictly stationary time series and a nonzero spectrum otherwise. The proposed test is powerful against various types of nonstationarity including deterministic trends and smooth or abrupt structural changes. It does not require smoothed nonparametric estimation and, thus, can detect the Pitman sequence of local alternatives at the parametric rate $T^{-1/2}$, faster than all existing nonparametric tests. We also design a class of derivative tests based on the characteristic function to test the stationarity in various moments. Monte Carlo studies demonstrate that our test has reasonable size and excellent power. Our empirical application of exchange rates strongly suggests that both nominal and real exchange rate returns are nonstationary, which the augmented Dickey–Fuller and Kwiatkowski–Phillips–Schmidt–Shin tests may overlook.
This paper develops a test for homogeneity of the threshold parameter in threshold regression models. The test has a natural interpretation from time series perspectives and can also be applied to test for additional change points in the structural break models. The limiting distribution of the test statistic is derived, and the finite sample properties are studied in Monte Carlo simulations. We apply the new test to the tipping point problem studied by Card, Mas, and Rothstein (2008, Quarterly Journal of Economics 123, 177–218) and statistically justify that the location of the tipping point varies across tracts.
We consider the problem of group testing (pooled testing), first introduced by Dorfman. For nonadaptive testing strategies, we refer to a nondefective item as “intruding” if it only appears in positive tests. Such items cause misclassification errors in the well-known COMP algorithm and can make other algorithms produce an error. It is therefore of interest to understand the distribution of the number of intruding items. We show that, under Bernoulli matrix designs, this distribution is well approximated in a variety of senses by a negative binomial distribution, allowing us to understand the performance of the two-stage conservative group testing algorithm of Aldridge.
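The distribution of intruding items can be explored directly by simulation under a Bernoulli matrix design; the counts below treat a nondefective item as intruding when it appears in no negative test (items in no tests at all are counted too, since COMP also misclassifies those). The design parameters are arbitrary choices, and the check against the negative binomial is limited to its overdispersion (variance exceeding the mean).

```python
import numpy as np

# Bernoulli design: n items, k defectives, T tests, each item placed in each
# test independently with probability q.
rng = np.random.default_rng(9)
n, k, T, q = 500, 10, 120, 0.03

counts = []
for _ in range(400):
    design = rng.random((T, n)) < q            # test-by-item inclusion matrix
    defective = np.zeros(n, bool); defective[:k] = True
    positive = design[:, defective].any(1)     # a test is positive iff it hits a defective
    # A nondefective item is intruding if no negative test contains it.
    in_negative = design[~positive][:, ~defective].any(0)
    counts.append((~in_negative).sum())
counts = np.array(counts)
print(f"intruding items: mean={counts.mean():.2f}, var={counts.var():.2f}")
```

The variance exceeding the mean is the signature of the negative binomial approximation relative to a Poisson one.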
We developed an agent-based model using a trial emulation approach to quantify effect measure modification of spillover effects of pre-exposure prophylaxis (PrEP) for HIV among men who have sex with men (MSM) in the Atlanta-Sandy Springs-Roswell metropolitan area, Georgia. PrEP may affect not only the individual prescribed, but also their partners and beyond; this phenomenon is known as spillover. We simulated a two-stage randomised trial with eligible components (≥3 agents with ≥1 HIV+ agent) first randomised to intervention or control (no PrEP). Within intervention components, agents were randomised to PrEP with coverage of 70%, providing insight into a high PrEP coverage strategy. We evaluated effect modification by component-level characteristics and estimated spillover effects on HIV incidence using an extension of randomisation-based estimators. We observed an attenuation of the spillover effect when agents were in components with a higher prevalence of either drug use or bridging potential (an agent acting as a mediator between ≥2 connected groups of agents). The estimated spillover effects were larger in magnitude among components with either higher HIV prevalence or greater density (the number of existing partnerships relative to all possible partnerships). Consideration of effect modification is important when evaluating the spillover of PrEP among MSM.
We use mollification to regularize the problem of deconvolution of random variables. This regularization method offers a unifying and generalizing framework in which to compare the benefits of various filter-type techniques such as deconvolution kernels, Tikhonov, or spectral cutoff methods. In particular, the mollifier approach allows us to relax some restrictive assumptions required for deconvolution kernels, and has better stabilizing properties compared with spectral cutoff or Tikhonov. We show that this approach achieves optimal rates of convergence for both finitely and infinitely smoothing convolution operators under Besov and Sobolev smoothness assumptions on the unknown probability density. The qualification can be arbitrarily high depending on the choice of the mollifier function. We propose an adaptive choice of the regularization parameter using the Lepskiĭ method, and we provide simulations to compare the finite sample properties of our estimator with respect to the well-known regularization methods.
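One of the comparator methods (spectral cutoff, not the mollifier itself) can be sketched for the basic deconvolution problem Y = X + eps with known Gaussian noise: divide the empirical characteristic function of Y by the noise characteristic function, truncate high frequencies, and invert. The sample size, noise level, and cutoff below are illustrative assumptions.

```python
import numpy as np

# Contaminated observations: X ~ N(2, 1) is unknown, eps ~ N(0, sigma^2) known.
rng = np.random.default_rng(10)
n, sigma = 5000, 0.4
x = rng.normal(2.0, 1.0, n)
y = x + rng.normal(0, sigma, n)

# Empirical characteristic function of Y divided by the noise cf, with a
# spectral cutoff |t| <= 3 as the regularisation.
t = np.linspace(-6, 6, 501)
phi_y = np.exp(1j * t[:, None] * y[None, :]).mean(1)
phi_eps = np.exp(-0.5 * sigma**2 * t**2)
cutoff = np.abs(t) <= 3.0
phi_x = np.where(cutoff, phi_y / phi_eps, 0)

# Inverse Fourier transform on a grid gives the density estimate of X.
xs = np.linspace(-2, 6, 201)
dens = (np.exp(-1j * xs[:, None] * t[None, :]) * phi_x).sum(1).real
dens *= (t[1] - t[0]) / (2 * np.pi)
print(f"estimated mode near {xs[np.argmax(dens)]:.2f} (true mean 2.0)")
```

The instability the abstract attributes to spectral cutoff shows up here as ringing when the cutoff is raised while the noise cf in the denominator shrinks.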