Exponential random graph models, or ERGMs, are a flexible and general class of models for modeling dependent data. While the early literature has shown them to be powerful in capturing many network features of interest, recent work highlights difficulties related to the models’ ill behavior, such as most of the probability mass being concentrated on a very small subset of the parameter space. This behavior limits both the applicability of ERGMs as models for real data and the feasibility of inference and parameter estimation via the usual Markov chain Monte Carlo algorithms. To address this problem, we propose a new exponential family of models for random graphs that builds on the standard ERGM framework. Specifically, we solve the problems of computational intractability and “degenerate” model behavior by an interpretable support restriction. We introduce a new parameter based on the graph-theoretic notion of degeneracy, a measure of sparsity whose value is commonly low in real-world networks. The new model family is supported on the sample space of graphs with bounded degeneracy and is called degeneracy-restricted ERGMs, or DERGMs for short. Since DERGMs generalize ERGMs (the latter are obtained from the former by setting the degeneracy parameter to be maximal), they inherit good theoretical properties while placing their mass more uniformly over realistic graphs. The support restriction allows the use of new (and fast) Monte Carlo methods for inference, making the models scalable and computationally tractable. We study various theoretical properties of DERGMs and illustrate how the support restriction improves model behavior. We also present a fast Monte Carlo algorithm for parameter estimation that avoids many issues faced by Markov chain Monte Carlo algorithms used for inference in ERGMs.
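The graph-theoretic degeneracy that bounds the DERGM support can be computed greedily by repeatedly removing a minimum-degree vertex. A minimal pure-Python sketch (adjacency as a dict of neighbor sets; names are illustrative, not from the paper):

```python
def degeneracy(adj):
    """Degeneracy of an undirected graph: the smallest k such that every
    subgraph has a vertex of degree at most k. Computed by repeatedly
    removing a minimum-degree vertex and tracking the largest degree seen."""
    deg = {v: len(nbrs) for v, nbrs in adj.items()}
    removed = set()
    k = 0
    for _ in range(len(adj)):
        # pick a remaining vertex of minimum current degree
        v = min((u for u in adj if u not in removed), key=lambda u: deg[u])
        k = max(k, deg[v])
        removed.add(v)
        for w in adj[v]:
            if w not in removed:
                deg[w] -= 1
    return k

# A triangle has degeneracy 2; a path has degeneracy 1.
triangle = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
path = {0: {1}, 1: {0, 2}, 2: {1}}
```

This O(n^2) version favors clarity; a bucket-queue implementation achieves linear time.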
This paper studies a generalization of the Gerber-Shiu expected discounted penalty function [Gerber and Shiu (1998). On the time value of ruin. North American Actuarial Journal 2(1): 48–72] in the context of the perturbed compound Poisson insurance risk model, where the moments of the total discounted claims and the discounted small fluctuations (arising from the Brownian motion) until ruin are also included. In particular, the latter quantity is represented by a stochastic integral and has never been analyzed in the literature to the best of our knowledge. Recursive integro-differential equations satisfied by our generalized Gerber-Shiu function are derived, and these are transformed to defective renewal equations where the components are identified. Explicit solutions are given when the individual claim amounts are distributed as a combination of exponentials. Numerical illustrations are provided, including the computation of the covariance between discounted claims and discounted perturbation until ruin.
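For reference, the classical Gerber-Shiu expected discounted penalty function being generalized here is usually written (notation assumed, not taken from this paper) as

$$\phi(u) = \mathbb{E}\!\left[ e^{-\delta T}\, w\!\left(U(T^-),\, |U(T)|\right) \mathbf{1}_{\{T<\infty\}} \,\middle|\, U(0)=u \right],$$

where $T$ is the ruin time, $\delta \ge 0$ the force of interest, $U(T^-)$ the surplus immediately before ruin, $|U(T)|$ the deficit at ruin, and $w$ a penalty function; the generalization studied here augments this expectation with moments of the total discounted claims and of the discounted Brownian fluctuations up to $T$.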
This paper introduces a neural network (NN) approach for fitting the Lee-Carter (LC) and the Poisson Lee-Carter models on multiple populations. We develop NNs that replicate the structure of the individual LC models and allow their joint fitting by simultaneously analysing the mortality data of all the considered populations. The NN architecture is specifically designed to calibrate each individual model using all available information, instead of the population-specific subset of data used in traditional estimation schemes. A large set of numerical experiments performed on all the countries in the Human Mortality Database shows the effectiveness of our approach. In particular, the resulting parameter estimates appear smooth and less sensitive to the random fluctuations often present in mortality rate data, especially for low-population countries. In addition, forecasting performance is significantly improved.
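As background, the classical (non-neural) single-population Lee-Carter fit that these NNs replicate can be sketched via SVD. A minimal numpy version under the usual identifiability constraints (the b_x sum to 1, the k_t are centred); function and variable names are illustrative, not from the paper:

```python
import numpy as np

def lee_carter_fit(log_m):
    """Classical SVD fit of log m_{x,t} = a_x + b_x k_t + eps_{x,t}.
    log_m: (n_ages, n_years) array of log central death rates."""
    a = log_m.mean(axis=1)                       # age effect a_x
    U, s, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
    b, k = U[:, 0], s[0] * Vt[0]
    k = k * b.sum()                              # impose sum(b) = 1 ...
    b = b / b.sum()                              # (assumes sum(b) != 0)
    a = a + b * k.mean()                         # ... and sum(k) = 0
    k = k - k.mean()
    return a, b, k
```

Forecasting then typically extrapolates k_t with a random walk with drift; the point of the paper's NN approach is to calibrate such structures jointly across populations rather than one population at a time.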
We present a uniqueness result for Gibbs point processes with interactions that come from a non-negative pair potential; in particular, we provide an explicit uniqueness region in terms of the activity z and the inverse temperature $\beta$. The technique relies on applying the classical Dobrushin criterion in the continuous setting. We also compare our result with two other uniqueness methods, cluster expansion and disagreement percolation, which can also be applied to this type of interaction.
This study aimed to describe the clinical manifestations of adenovirus infections and identify potential risk factors for co-infection with chlamydia, viruses and bacteria in hospitalised children from Hangzhou, China. From January to December 2019, the characteristics of hospitalised children infected with adenovirus at Hangzhou Children's Hospital and Zhejiang Xiaoshan Hospital were collected. The clinical factors related to co-infection with chlamydia, viruses and bacteria were assessed using multivariate logistic regression analyses. A total of 5989 children were infected with adenovirus, of whom 573 were hospitalised for adenovirus infection. The severity of adenovirus respiratory infection was categorised as follows: mild (bronchiolitis, 73.6%), moderate (bronchopneumonia, 17.6%) or severe (pneumonia, 8.8%). Of the 573 hospitalised children, 280 presented with co-infection with chlamydia, viruses or bacteria, while the remaining 293 had adenovirus infection only. Multivariate stepwise logistic regression analyses indicated that elevated ferritin was associated with an increased risk of chlamydia co-infection (odds ratio (OR) 6.50; 95% confidence interval (CI) 1.56–27.11; P = 0.010), whereas an increased white blood cell (WBC) count was associated with a reduced risk of viral co-infection (OR 0.84; 95% CI 0.75–0.95; P = 0.006). The study indicates that elevated ferritin levels are associated with chlamydia co-infection, and that WBC levels may affect viral co-infection, in hospitalised children infected with adenovirus.
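For illustration only (the counts below are hypothetical, not from this study), an unadjusted odds ratio with a 95% Wald confidence interval, the kind of quantity reported above, can be computed from a 2×2 table as:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald confidence interval from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    Assumes all cell counts are positive."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

The ORs in the study are adjusted estimates from a multivariable logistic model, so they control for the other covariates; the sketch above shows only the crude single-factor calculation.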
A diffusion approximation to a risk process under dynamic proportional reinsurance is considered. The goal is to minimise the discounted time in drawdown; that is, the time where the distance of the present surplus to the running maximum is larger than a given level $d > 0$. We calculate the value function and determine the optimal reinsurance strategy. We conclude that the drawdown measure stabilises process paths but has a drawback as it also prevents surpassing the initial maximum. That is, the insurer is, under the optimal strategy, not interested in any more profits. We therefore suggest using optimisation criteria that do not avoid future profits.
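The objective being minimised, the discounted time in drawdown, can be approximated on a discretised surplus path. A small sketch with illustrative names (X_t the surplus, M_t its running maximum, d and the discount rate delta as in the text; not the paper's method, which works with the value function of the diffusion):

```python
import math

def discounted_time_in_drawdown(path, d, delta, dt):
    """Riemann-sum approximation of int_0^T e^{-delta t} 1{M_t - X_t > d} dt
    on a grid of step dt, where M_t is the running maximum of X_t."""
    total = 0.0
    running_max = path[0]
    for i, x in enumerate(path):
        running_max = max(running_max, x)
        if running_max - x > d:                  # the process is in drawdown
            total += math.exp(-delta * i * dt) * dt
    return total
```

On the path [0, 1, 2, 1, 2] with d = 0.5, delta = 0 and dt = 1, only the fourth point is in drawdown, so the approximation returns 1.0.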
This paper illustrates the potential impacts of climate change on financial markets, focusing on their long-term significance. It uses a top-down modelling tool developed by Ortec Finance in partnership with Cambridge Econometrics that combines climate science with macro-economic and financial effects to examine the possible impacts of three plausible (not extreme) climate pathways. The paper first considers the impact on gross domestic product (GDP), finding that GDP is lower in all three pathways, with the most severe reduction in the Failed Transition Pathway where the Paris Agreement climate targets are not met. The model then translates these GDP impacts into financial market effects. In the Failed Transition Pathway, cumulative global equity returns are approximately 50% lower over the period 2020–2060 than in the climate-uninformed base case. For the other two pathways where the Paris Agreement targets are met, the corresponding figures are 15% and 25% lower returns than in the base case. Results are provided for other asset classes too. These demonstrate that climate change represents a significant market risk, with implications for financial planning, modelling and regulation.
This paper demonstrates how climate scenario analysis can be used for forward-looking assessment of the risks and opportunities for financial institutions, using a case study for a UK defined benefit pension scheme. It uses a top-down modelling tool developed by Ortec Finance in partnership with Cambridge Econometrics to explore the possible impacts of three plausible (not extreme) climate pathways on the scheme’s assets and liabilities. It finds that the funding risks are greater under all three climate pathways than under the climate-uninformed base scenario. In the absence of changes to the investment strategy or recovery plan, the time taken to reach full funding is increased by three to nine years. Given that most models currently used by actuaries do not make explicit adjustments for climate change, these modelled results suggest it is quite likely that pension schemes are systematically underestimating the funding risks they face.
Big data has been widely reported to facilitate epidemic prevention and control in health care during the coronavirus disease 2019 (COVID-19) pandemic. However, there is still a lack of practical experience in applying it to hospital prevention and control. This study describes the design, implementation and preliminary results of an innovative big data-driven COVID-19 risk personnel screening management system in a hospital. Our screening system integrates data sources across four dimensions: Health Quick Response (QR) code, abroad travelling history, transportation close contact personnel and key surveillance personnel. Its screening targets cover all patients, care partners and staff who come to the hospital. As of November 2021, nearly 690 000 people, amounting to 5.79 million person-time, had used automated COVID-19 risk screening and monitoring. A total of 10 376 person-time (0.18%) with abnormal QR codes were identified, 242 person-time with abroad travelling history were identified, 925 person-time were marked based on the data of key surveillance personnel, no transportation close contact personnel were reported, and no COVID-19 nosocomial infection occurred in the hospital. Through the application of this system, the hospital's expenditure on manpower and material resources for epidemic prevention and control has also been significantly reduced. Collectively, this study demonstrates an effective and efficient model for the use of digital health technology in response to the COVID-19 pandemic. Based on data from multiple sources, this system plays an irreplaceable role in identifying close contacts or suspicious persons and can significantly reduce the social burden caused by COVID-19, especially the human resource and economic costs of hospital prevention and control. It may provide guidance for clinical epidemic prevention and control in hospitals, as well as for future public health emergencies.
Rabies in cattle is a viral disease with mandatory notification in Brazil; transmitted by Desmodus rotundus, it causes an invariably fatal acute encephalitis. To understand the dynamics of this disease in Tocantins state, Brazil, an analysis of the time series of rabies cases in cattle between 2006 and 2019 was carried out to describe the pattern of its occurrence, aiming to provide the Official Veterinary Service (OVS) with relevant information to enable the improvement of control actions provided for in the guidelines of the National Program for the Control of Rabies in Herbivores (NPCRH). The statistical analyses of the time series under study were performed using the RStudio software, version 1.1.463, in which the existence of trend, cyclicality and seasonality of rabies cases in cattle was assessed. These analyses showed that this disease is endemic in Tocantins state, with epidemic outbreaks that can occur every 3 or 4 years, without a seasonality pattern. The autoregressive integrated moving average (ARIMA(4,1,4)) model predicted the occurrence of approximately 38 rabies cases in cattle in 2022, and all monthly records of this disease remained within the predicted confidence interval (95% CI) in 2020 and 2021, demonstrating good predictive capacity and allowing the OVS to intervene in ongoing processes to achieve better control of this disease.
Consider the following iterated process on a hypergraph H. Each vertex v starts with some initial weight $x_v$. At each step, uniformly at random select an edge e in H, and for each vertex v in e replace the weight of v by the average value of the vertex weights over all vertices in e. This is a generalization of an interactive process on graphs which was first introduced by Aldous and Lanoue. In this paper we use the eigenvalues of a Laplacian for hypergraphs to bound the rate of convergence for this iterated averaging process.
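The iterated process described above can be simulated directly; a minimal sketch (an edge is chosen uniformly at random at each step; names are illustrative):

```python
import random

def averaging_step(weights, edge):
    """Replace the weight of every vertex in the chosen edge by the
    average weight over that edge, as in the process described above."""
    avg = sum(weights[v] for v in edge) / len(edge)
    for v in edge:
        weights[v] = avg

def run(weights, edges, steps, seed=0):
    """Run the iterated averaging process for a fixed number of steps,
    picking an edge uniformly at random each time."""
    rng = random.Random(seed)
    for _ in range(steps):
        averaging_step(weights, rng.choice(edges))
    return weights
```

Each step preserves the total weight, and on a connected hypergraph the weights converge to the global average; the rate of that convergence is what the Laplacian eigenvalue bounds in the paper control.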
The presence of antimicrobial resistance (AMR) genes in Escherichia coli inhabiting anthropogenic rivers is an important public health concern because plasmid-mediated AMR genes can easily spread to other pathogens by horizontal gene transfer. Besides β-lactams, quinolones and aminoglycosides are the major antibiotics used against E. coli. In the present study, we investigated the presence of plasmid-mediated quinolone resistance (PMQR) and aminoglycoside resistance genes in E. coli isolated from a major river of northern India. Our results revealed that the majority of the strains were phenotypically susceptible to fluoroquinolones and to some aminoglycosides such as amikacin, netilmicin, tobramycin and gentamicin. However, 16.39% of the strains were resistant to streptomycin, 8.19% to kanamycin and 3.30% to gentamicin. Of the various PMQR genes investigated, only qnrS1 was present, in 24.59% of the strains, along with ISEcl2. Among aminoglycoside-resistance genes, strA-strB were present in 16.39%, aphA1 in 8.19% and aacC2 in only 3.30% of the strains. Although no correlation was observed between phenotypic resistance to fluoroquinolones and the presence of PMQR genes, phenotypic resistance to streptomycin, kanamycin and gentamicin correlated exactly with the presence of the genes strA-strB, aphA1 and aacC2, respectively. Moreover, all the AMR genes discerned in aquatic E. coli were found to be situated on conjugative plasmids and are thus easily transferable. Our study accentuates the importance of routine surveillance of urban rivers to curtail the spread of AMR genes in aquatic pathogens.
This paper develops an estimation methodology for network data generated from a system of simultaneous equations, which allows for network interdependencies via spatial lags in the endogenous and exogenous variables, as well as in the disturbances. By allowing for higher-order spatial lags, our specification provides important flexibility in modeling network interactions. The estimation methodology builds, among others, on the two-step generalized method of moments estimation approach introduced in Kelejian and Prucha (1998, Journal of Real Estate Finance and Economics 17, 99–121; 1999, International Economic Review 40, 509–533; 2004, Journal of Econometrics 118, 27–50). The paper considers limited and full information estimators, and one- and two-step estimators, and establishes their asymptotic properties. In contrast to some of the earlier two-step estimation literature, our asymptotic results facilitate joint tests for the absence of all forms of network spillovers.
We propose a statistical procedure to determine the dimension of the nonstationary subspace of cointegrated functional time series taking values in the Hilbert space of square-integrable functions defined on a compact interval. The procedure is based on sequential application of a proposed test for the dimension of the nonstationary subspace. To avoid estimation of the long-run covariance operator, our test is based on a variance ratio-type statistic. We derive the asymptotic null distribution and prove consistency of the test. Monte Carlo simulations show good performance of our test and provide evidence that it outperforms the existing testing procedure. We apply our methodology to three empirical examples: age-specific U.S. employment rates, Australian temperature curves, and Ontario electricity demand.
We estimate and test for multiple structural breaks in distribution via an empirical characteristic function approach. By minimizing the sum of squared generalized residuals, we can consistently estimate the break fractions. We propose a sup-F type test for structural breaks in distribution as well as an information criterion and a sequential testing procedure to determine the number of breaks. We further construct a class of derivative tests to gauge possible sources of structural breaks, which is asymptotically more powerful than the smoothed nonparametric tests for structural breaks. Simulation studies show that our method performs well in determining the appropriate number of breaks and estimating the unknown breaks. Furthermore, the proposed tests have reasonable size and excellent power in finite samples. In an application to exchange rate returns, our tests are able to detect structural breaks in distribution and locate the break dates. Our tests also indicate that the documented breaks appear to occur in variance and higher-order moments, but not so often in mean.
Neural network-based learning of the distribution of non-dispatchable renewable electricity generation from sources such as photovoltaics (PV) and wind, as well as of load demands, has recently gained attention. Normalizing flow density models are particularly well suited for this task due to their training through direct log-likelihood maximization. However, research from the field of image generation has shown that standard normalizing flows can only learn smeared-out versions of manifold distributions. Previous works on normalizing flow-based scenario generation do not address this issue, and the smeared-out distributions result in the sampling of noisy time series. In this paper, we exploit the isometry of principal component analysis (PCA), which sets up the normalizing flow in a lower-dimensional space while maintaining direct and computationally efficient likelihood maximization. We train the resulting principal component flow (PCF) on data of PV and wind power generation as well as load demand in Germany in the years 2013–2015. The results of this investigation show that the PCF preserves critical features of the original distributions, such as the probability density and frequency behavior of the time series. The application of the PCF is, however, not limited to renewable power generation but extends to any dataset, time series or otherwise, that can be efficiently reduced using PCA.
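The PCA reduction and reconstruction at the heart of the PCF can be sketched in a few lines of numpy (the flow itself is omitted; function names are illustrative, not from the paper):

```python
import numpy as np

def pca_fit(X, n_components):
    """Fit PCA on the rows of X; return the mean and the top principal
    directions (rows of W are orthonormal)."""
    mu = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:n_components]

def pca_encode(X, mu, W):
    """Project data into the lower-dimensional space where the flow lives."""
    return (X - mu) @ W.T

def pca_decode(Z, mu, W):
    """Map back to the original space; isometric since W has orthonormal rows."""
    return Z @ W + mu
```

Because encode/decode is an isometry on the retained subspace, log-likelihoods of the flow in the reduced space transfer directly, which is what keeps the training objective a tractable log-likelihood maximization.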