The bonus-malus system (BMS) is a widely recognized and commonly employed risk management tool. A well-designed BMS can match expected insurance payments with estimated claims even in a diverse group of risks. Although there has been abundant research on improving bonus-malus (BM) systems, one important property has been overlooked: the stationary distribution of a BMS satisfies the monotone likelihood ratio (MLR) property. The MLR property for stationary probabilities helps explain why riskier policyholders are more likely to remain in higher premium categories, while less risky policyholders are more likely to move toward lower premiums. This study establishes the property for BMSs described by an ergodic Markov chain with at most one claim per period and a +1/-d transition rule. We derive this result from the linear recurrences that characterize the stationary distribution; this represents a novel analytical approach in this domain. We also illustrate a practical implication of our findings: in the BM design problem, the premium scale is automatically monotonic.
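The MLR property is easy to probe numerically. The sketch below is a minimal illustration, not the paper's construction: it assumes one labeling convention (a claim-free year moves the policyholder down one level, a single claim moves them up d levels, capped at the worst class), solves for the stationary distributions of a low-risk and a high-risk policyholder, and checks that their likelihood ratio is monotone in the level. The level count, d, and claim probabilities are illustrative choices.

```python
import numpy as np

def stationary(n, d, p):
    # Transition matrix of a BMS with levels 0..n-1 (higher = more malus):
    # a claim-free year moves the policyholder down one level, a single
    # claim moves them up d levels, capped at the worst class n-1.
    P = np.zeros((n, n))
    for i in range(n):
        P[i, max(i - 1, 0)] += 1 - p   # no claim this period
        P[i, min(i + d, n - 1)] += p   # one claim this period
    # Solve pi @ P = pi together with sum(pi) = 1.
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

# Stationary distributions of a low-risk and a high-risk policyholder.
lo = stationary(n=8, d=2, p=0.05)
hi = stationary(n=8, d=2, p=0.15)
ratio = hi / lo   # should be nondecreasing in the level (MLR property)
```

Under this convention the ratio increases with the BM level: the riskier profile's stationary mass shifts toward the malus classes, which is exactly what the MLR property formalizes.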
Networks describe complex relationships between individual actors. In this work, we address the question of how to determine whether a parametric model, such as a stochastic block model or latent space model, fits a data set well and will extrapolate to similar data. We use recent results in random matrix theory to derive a general goodness-of-fit (GoF) test for dyadic data. We show that our method, when applied to a specific model of interest, provides a straightforward, computationally fast way of selecting parameters in a number of commonly used network models. For example, we show how to select the dimension of the latent space in latent space models. Unlike other network GoF methods, our general approach does not require simulating from a candidate parametric model, which can be cumbersome with large graphs, and eliminates the need to choose a particular set of statistics on the graph for comparison. It also allows us to perform GoF tests on partial network data, such as Aggregated Relational Data. We show with simulations that our method performs well in many situations of interest. We analyze several empirically relevant networks and show that our method leads to improved community detection algorithms.
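As a rough illustration of the spectral idea (not the authors' exact statistic), the sketch below fits a rank-d approximation to an adjacency matrix and inspects the largest eigenvalue of the entrywise-standardized residual matrix: random matrix theory predicts that under a well-fitting model this eigenvalue sits near the bulk edge (about 2), while a misspecified dimension inflates it. The function name, clipping constants, and the Erdős–Rényi example are illustrative assumptions.

```python
import numpy as np

def residual_edge_stat(A, d):
    # Rank-d spectral fit of E[A], then the largest eigenvalue (in absolute
    # value) of the standardized residual; values near 2 suggest a good fit.
    n = A.shape[0]
    vals, vecs = np.linalg.eigh(A)
    idx = np.argsort(np.abs(vals))[::-1][:d]
    P = (vecs[:, idx] * vals[idx]) @ vecs[:, idx].T   # estimated edge probabilities
    P = np.clip(P, 1e-6, 1 - 1e-6)
    R = (A - P) / np.sqrt((n - 1) * P * (1 - P))      # standardized residuals
    return np.abs(np.linalg.eigvalsh(R)).max()

# Erdos-Renyi graph: a rank-1 fit should be adequate, so the statistic
# is expected to land near the bulk edge of about 2.
rng = np.random.default_rng(1)
n = 200
upper = np.triu(rng.random((n, n)) < 0.1, k=1)
A = (upper | upper.T).astype(float)
stat = residual_edge_stat(A, d=1)
```

Scanning `d = 1, 2, 3, ...` and stopping once the statistic falls to the bulk edge gives a simple, simulation-free dimension-selection rule in the spirit of the paper's approach.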
Pneumococcal conjugate vaccines (PCVs) have influenced population dynamics of Streptococcus pneumoniae in the nasopharynx and may have contributed to increased Staphylococcus aureus colonization. This study assessed the prevalence of colonization, antibiotic resistance patterns, and associated risk factors for colonization and co-colonization of S. aureus and S. pneumoniae in healthy Peruvian children post-PCV introduction. Nasopharyngeal swabs from children <24 months were collected in five hospitals in Lima (2018–2019). Microbiological identification and antibiotic susceptibility tests were performed, and multinomial regression evaluated factors influencing colonization. Among 894 children, 19.7% were colonized with S. aureus, 20.3% with S. pneumoniae, and 2.9% co-colonized. Of the 176 S. aureus strains isolated, 1.7% were methicillin resistant and 20.5% were clindamycin resistant; no resistance to trimethoprim-sulfamethoxazole (SXT) was found. Among 182 S. pneumoniae strains isolated, 48.9% were resistant to macrolides, 74.7% to SXT; no resistance to penicillin was found. Breastfeeding and vaccination with PCV13 were associated with a reduced prevalence of S. aureus colonization, while vaccination with PCV13 increased the prevalence of S. pneumoniae colonization, mainly by non-vaccine serotypes. This study highlights the need to continue monitoring the changes in colonization dynamics and antimicrobial resistance patterns after vaccine introduction, to guide empirical therapy and future vaccine strategies.
Invasive Escherichia coli disease (IED) is associated with high hospitalization and mortality rates, particularly among adults aged ≥60 years. O-antigens are virulence factors required for E. coli survival. To inform the development of EXPEC9V, a novel glycoconjugate vaccine targeting E. coli O-antigens (no longer in active clinical development), this retrospective observational study describes O-serotype prevalence among E. coli isolates from IED patients. Eligible patients were identified from medical record databases (9 January 2018–8 November 2019) across 17 tertiary care hospitals in Europe, North America, and Asia. To estimate the vaccine serotype coverage of EXPEC9V, E. coli isolates were O-serotyped using whole-genome sequencing and agglutination. Antimicrobial susceptibility testing was also performed. Nine hundred and two patients were enrolled, of whom 690 (76.5%) were aged ≥60 years. Common serotypes were O25, O2, O6, O1, O15, O75, O16, O4, and O18, with O25 the most frequently reported (17.3%). In patients aged ≥60 years, 422/637 E. coli isolates were EXPEC9V O-serotypes. EXPEC9V O-serotype prevalence did not substantially differ when stratified by sex, presence of a positive blood culture, sepsis, fatality, or multidrug resistance. Consistent with previous studies, serotype O25 was the most prevalent and was associated with ~20% of cases. An EXPEC9V vaccine serotype coverage of 66.2% was observed for IED patients aged ≥60 years.
This paper investigates the time N until a random walk first exceeds a specified barrier. Letting $X_i, i \geq 1,$ be a sequence of independent, identically distributed random variables with a log-concave density or probability mass function, we derive both lower and upper bounds on the probability $P(N > n),$ as well as bounds on the expected value $E[N].$ For barriers of the form $a + b \sqrt{k},$ where $a$ is nonnegative, $b$ is positive, and $k$ is the number of steps, we provide additional bounds on $E[N].$
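The quantities the paper bounds are straightforward to estimate by Monte Carlo. The sketch below, with illustrative parameters (Gaussian increments, whose density is log-concave as the paper assumes), simulates the first time a random walk exceeds the square-root barrier $a + b\sqrt{k}$ and estimates $E[N]$ and $P(N > n)$; it is a simulation aid for intuition, not the paper's analytical bounds.

```python
import numpy as np

rng = np.random.default_rng(0)

def first_passage(a, b, mu, sigma, max_steps, n_paths):
    # N = first k with S_k > a + b*sqrt(k), where S_k is a random walk
    # with Normal(mu, sigma) increments (a log-concave density).
    k = np.arange(1, max_steps + 1)
    barrier = a + b * np.sqrt(k)
    steps = rng.normal(mu, sigma, size=(n_paths, max_steps))
    S = np.cumsum(steps, axis=1)
    crossed = S > barrier                  # broadcast over all paths
    # argmax picks the first True; paths that never cross are censored.
    N = np.where(crossed.any(axis=1), crossed.argmax(axis=1) + 1, max_steps + 1)
    return N

N = first_passage(a=2.0, b=1.0, mu=0.5, sigma=1.0, max_steps=2000, n_paths=20000)
print("estimated E[N]:", N.mean())
print("estimated P(N > 10):", (N > 10).mean())
```

Such estimates are useful for checking how tight the paper's lower and upper bounds are for a given increment distribution and barrier.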
Introduced over a century ago, Whittaker–Henderson smoothing remains widely used by actuaries in constructing one-dimensional and two-dimensional experience tables for mortality, disability, and other life insurance risks. In this paper, we reinterpret this smoothing technique within a modern statistical framework and address six practically relevant questions about its use. First, we adopt a Bayesian perspective on this method to construct credible intervals. Second, in the context of survival analysis, we clarify how to choose the observation and weight vectors by linking the smoothing technique to a maximum likelihood estimator. Third, we improve accuracy by relaxing the method’s reliance on an implicit normal approximation. Fourth, we select the smoothing parameters by maximizing a marginal likelihood function. Fifth, we improve computational efficiency when dealing with numerous observation points and consequently parameters. Finally, we develop an extrapolation procedure that ensures consistency between estimated and predicted values through constraints.
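For readers unfamiliar with the method: one-dimensional Whittaker–Henderson smoothing minimizes a weighted fidelity term plus a penalty on order-$q$ differences, with closed-form solution $\hat{\theta} = (W + \lambda D_q^\top D_q)^{-1} W y$. The sketch below is a generic textbook implementation with made-up data, not the paper's extended estimator.

```python
import numpy as np

def whittaker_henderson(y, w, lam, q=2):
    # Minimize sum_i w_i (y_i - theta_i)^2 + lam * sum ((Delta^q theta)_i)^2;
    # the closed-form solution is theta = (W + lam * D'D)^{-1} W y.
    n = len(y)
    D = np.diff(np.eye(n), n=q, axis=0)   # order-q difference matrix, (n-q) x n
    W = np.diag(w)
    return np.linalg.solve(W + lam * D.T @ D, W @ y)

# Made-up crude log-mortality rates at eight consecutive ages.
y = np.log([0.0020, 0.0025, 0.0021, 0.0030, 0.0036, 0.0042, 0.0055, 0.0060])
theta = whittaker_henderson(y, w=np.ones_like(y), lam=10.0)
```

Larger `lam` pulls the fit toward a degree-(q-1) polynomial; the paper's fourth question concerns choosing this parameter by maximizing a marginal likelihood rather than by hand.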
In our digital world, reusing data to inform decisions, advance science, and improve people’s lives should be easier than ever. However, data reuse remains limited, complex, and challenging. Some of this complexity requires rethinking the consent and public participation processes that surround it. First, to ensure the legitimacy of uses, including normative aspects like agency and data sovereignty. Second, to enhance data quality and mitigate risks, especially since data are proxies that can misrepresent realities or be detached from their original context or purpose of use. Third, because data, both as a good and as infrastructure, are the building blocks of technologies and of knowledge in the public interest that can help societies work towards the well-being of their people and the environment. Using the case study of the European Health Data Space, we propose a multidimensional, polytopic framework with multiple intersections for democratising decision-making and improving how meaningful participation and consent processes are conducted at various levels and from the point of view of institutions, regulations, and practices.
Risk-sharing rules have been applied to mortality pooling products to ensure these products are actuarially fair and self-sustaining. However, most of the existing studies on the risk-sharing rules of mortality pooling products assume deterministic mortality rates, whereas the literature on mortality models provides empirical evidence suggesting that mortality rates are stochastic and correlated between cohorts. In this paper, we extend existing risk-sharing rules and introduce a new risk-sharing rule, named the joint expectation (JE) rule, to ensure the actuarial fairness of mortality pooling products while accounting for stochastic and correlated mortality rates. Moreover, we perform a systematic study of how the choice of risk-sharing rule, the volatility and correlation of mortality rates, pool size, account balance, and age affect the distribution of mortality credits. Then, we explore a dynamic pool that accommodates heterogeneous members and allows new entrants, and we track the income payments for different members over time. Furthermore, we compare different risk-sharing rules under the scenario of a systematic shock in mortality rates. We find that the account balance affects the distribution of mortality credits for the regression rule, while it has no effect under the proportional, JE, and alive-only rules. We also find that a larger pool size increases the sensitivity to the deviation in total mortality credits for cohorts with mortality rates that are volatile and highly correlated with those of other cohorts, under the stochastic regression rule. Finally, we find that risk-sharing rules significantly influence the effect of longevity shocks on fund balances since, under different risk-sharing rules, fund balances have different sensitivities to deviations in mortality credits.
This paper examines the optimal design of peer-to-peer (P2P) insurance models that combine outside insurance purchases with P2P risk sharing among heterogeneous risks. Participants contribute deposits to collectively cover the premium for group-based insurance against tail risks and to share uncovered losses. We analyze the cost structure by decomposing it into a fixed premium for outside coverage and a variable component for shared losses, the latter of which may be partially refunded if aggregate losses are sufficiently low. We derive closed-form solutions for the optimal sharing rule that maximizes a mean-variance objective from the perspective of a central or social planner, and we characterize its theoretical properties. Building on this foundation, we further investigate the choice of deposit for the common fund. Finally, we also provide numerical illustrations.
This paper studies a long-standing problem of risk exchange and optimal resource allocation among multiple entities in a continuous-time pure risk-exchange economy. We establish a novel risk exchange mechanism that allows entities to share and transfer risks dynamically over time. To achieve Pareto optimality, we formulate the problem as a stochastic control problem and derive explicit solutions for the optimal investment, consumption, and risk exchange strategies using a martingale method. To highlight practical applications of the proposed solution, we apply our results to a target benefit pension plan, illustrating the potential benefits of risk sharing within this pension system. Numerical examples show the sensitivity of investment portfolios, the adjustment item, and allocation ratios to specific parameters. It is observed that an increase in the aggregate endowment process results in a rise in the adjustment item. Furthermore, the allocation ratios exhibit a positive correlation with the weights of the agents.
Robots need a sense of touch to handle objects effectively, and force sensors provide a straightforward way to measure touch or physical contact. However, contact force data are typically sparse and difficult to analyze, as they appear only during contact and are often affected by noise. Many researchers have consequently relied on vision-based methods for robotic manipulation. However, vision has limitations, such as occlusions that block the camera’s view, making it ineffective or insufficient for dexterous tasks involving contact. This article presents a method for robotic systems operating under quasi-static conditions to perform contact-rich manipulation using only force/torque measurements. First, the interaction forces/torques between the manipulated object and its environment are collected in advance. A potential function is then constructed from the collected force/torque data using Gaussian process regression with derivatives. Next, we develop haptic dynamic movement primitives (Haptic DMPs) to generate robot trajectories. Unlike conventional DMPs, which primarily focus on kinematic aspects, our Haptic DMPs incorporate force-based interactions by integrating the constructed potential energy. The effectiveness of the proposed method is demonstrated through numerical tasks, including the classical peg-in-hole problem.
Human interactions in the online world comprise a combination of positive and negative exchanges. These diverse interactions can be captured using signed network representations, where edges take positive or negative weights to indicate the sentiment of the interaction between individuals. Signed networks offer valuable insights into online political polarization by capturing antagonistic interactions and ideological divides on social media platforms. This study analyzes polarization on Menéame, a Spanish social media platform that facilitates engagement with news stories through comments and voting. Using a dual-method approach—Signed Hamiltonian Eigenvector Embedding for Proximity for signed networks and Correspondence Analysis for unsigned networks—we investigate how including negative ties enhances the understanding of structural polarization levels across different conversation topics on the platform. While the unsigned Menéame network effectively delineates ideological communities, only by incorporating negative ties can we identify ideologically extreme users who engage in antagonistic behaviors: without them, the most extreme users remain indistinguishable from their less confrontational ideological peers.
Disaster Risk Financing (DRF) presents a massive challenge to governments worldwide in protecting against catastrophic disaster losses. This study explores the development of a Disaster Fund that optimally integrates various DRF instruments, considering several real-world factors, including limited reserves, constrained risk horizons, risk aversion, risk tolerance, insurance structures, and premium pricing strategies. We demonstrate that the Value-at-Risk (VaR) and Tail VaR constraints are equivalent when the government has a limited risk horizon. Furthermore, we investigate the optimality of various insurance structures under different premium principles, conduct comparative statics on key parameters, and analyze the influence of a VaR constraint on the optimal mix of disaster financing instruments. Lastly, we apply our Disaster Fund model to the National Flood Insurance Program dataset to assess the optimal disaster financing strategy within the context of our framework.
Ponzi schemes are financial frauds that are pervasive throughout the world. Since they cause serious harm to society, it is of interest to study them so that they can be prevented. Typically, a Ponzi scheme is instigated by a promoter who promises above-average investment returns and uses funds from early investors to pay later investors. These scams can occasionally last a long time, but they are ultimately unsustainable. This paper describes some well-known Ponzi schemes and identifies their common characteristics. We also review some of the approaches used to model Ponzi schemes.
Leptospirosis remains a significant occupational zoonosis in New Zealand, and emerging serovar shifts warrant a closer examination of climate-related transmission pathways. This study aimed to examine whether total monthly rainfall is associated with reported leptospirosis in humans in New Zealand. Poisson and negative binomial models were developed to examine the relationship between rainfall at 0-, 1-, 2-, and 3-month lags and the incidence of leptospirosis during the month of the report. For every additional cm of rainfall, total monthly rainfall was positively associated with the occurrence of human leptospirosis by a factor of 1.017 at the 1-month lag (95% CI: 1.007–1.026), 1.023 at the 2-month lag (95% CI: 1.013–1.032), and 1.018 at the 3-month lag (95% CI: 1.009–1.028). Variation was present in the magnitude of association for each of the individual serovars considered, suggesting different exposure pathways. Assuming that the observed associations are causal, this study supports the expectation that additional human cases are likely to occur with increased rainfall. This provides the first evidence for including rainfall in a leptospirosis early warning system, and for designing targeted communication and prevention measures and allocating resources, particularly after heavy rainfall in New Zealand.