This paper concerns an insurance firm’s surplus process observed at renewal inspection times, with a focus on assessing the probability of the surplus level dropping below zero. For various types of inter-inspection time distributions, an explicit expression for the corresponding transform is given. In addition, Cramér–Lundberg-type asymptotics are established. Also, an importance sampling-based Monte Carlo algorithm is proposed, and is shown to be logarithmically efficient.
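The logarithmic efficiency claimed above typically rests on exponential tilting, the change of measure underlying Cramér–Lundberg asymptotics. As a hedged illustration (not the paper's algorithm), the sketch below estimates a finite-horizon ruin probability for a discrete-time surplus process with exponential claims, drawing claims from the distribution tilted by the adjustment coefficient; all parameter values are assumptions chosen for the example.

```python
import numpy as np
from scipy.optimize import brentq

# Discrete-time surplus: U_n = u + n*c - (sum of n i.i.d. Exp(lam) claims).
lam, c, u, horizon = 1.0, 2.0, 10.0, 400
rng = np.random.default_rng(7)

# Adjustment coefficient theta*: E[exp(theta*(X - c))] = 1 for X ~ Exp(lam),
# i.e. lam / (lam - theta) * exp(-theta * c) = 1 (positive root).
theta = brentq(lambda th: lam / (lam - th) * np.exp(-th * c) - 1.0,
               1e-8, lam - 1e-8)

def ruin_probability_is(n_paths=20_000):
    """Importance sampling: claims drawn from the tilted law Exp(lam - theta);
    each ruined path contributes the likelihood ratio of its sampled claims."""
    total = 0.0
    for _ in range(n_paths):
        surplus, logw = u, 0.0
        for _ in range(horizon):
            x = rng.exponential(1.0 / (lam - theta))          # tilted claim
            logw += np.log(lam / (lam - theta)) - theta * x   # log f / f_tilt
            surplus += c - x
            if surplus < 0.0:                                 # ruin observed
                total += np.exp(logw)
                break
    return total / n_paths

est = ruin_probability_is()
```

Under the tilted measure the surplus drifts downward, so ruin occurs on almost every path, and the likelihood-ratio weights keep the estimator unbiased with far lower relative variance than crude Monte Carlo for large `u`.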
New-generation pneumococcal conjugate vaccines (PCVs) are available to replace PCV13 for childhood and adult immunization. Beyond cost-effectiveness evaluations, which have highly variable results, the comparative immunogenicity of the new vaccines (PCV15, PCV20, PCV21) and their coverage of invasive pneumococcal disease (IPD) and carriage strains in different age groups should be considered, along with the antibody susceptibility, antibiotic resistance, invasiveness, and virulence of the serotypes included in each vaccine. These topics are discussed in light of the Canadian experience. The optimal strategy would be a 2+1 PCV20 schedule for children, PCV21 for elderly adults, and a dual PCV20+PCV21 schedule for adults at very high IPD risk. Shifting from PCV13 to PCV15 for children entails a risk of increased IPD incidence in adults, because the additional serotypes are of low virulence and could be replaced by more invasive and virulent serotypes. This risk can reasonably be excluded if PCV20 replaces PCV13, as the former covers additional serotypes that are highly invasive and virulent. It is recognized that off-label use of PCV20 according to a 2+1 schedule could be problematic for some jurisdictions, as this is not authorized in all countries. In Canada, however, the 2+1 PCV20 schedule was authorized based on the same dataset submitted elsewhere.
Motivated by recent developments of quasi-stationary Monte Carlo methods, we investigate the stability of quasi-stationary distributions of killed Markov processes under perturbations of the generator. We first consider a general bounded self-adjoint perturbation operator, and then study a particular unbounded perturbation corresponding to truncation of the killing rate. In both scenarios, we quantify the difference between eigenfunctions of the smallest eigenvalue of the perturbed and unperturbed generators in a Hilbert space norm. As a consequence, $\mathcal{L}^1$-norm estimates of the difference of the resulting quasi-stationary distributions in terms of the perturbation are provided.
This study investigated the incidence and risk factors associated with myocarditis and pericarditis following COVID-19 vaccination, addressing a notable gap in understanding the safety profile of these vaccines. Through meticulous data selection from the National Health Insurance System (NHIS) database of Korea, the researchers employed both a case-crossover study and a nested case-control design to analyze temporal patterns and risk factors related to carditis occurrences post-immunization. Key findings revealed a significant association between COVID-19 vaccination and the occurrence of carditis, with a strong temporal correlation observed within 10 days post-vaccination. Noteworthy factors contributing to carditis risk included the interval between vaccination and carditis onset, specific comorbidities, and medication use. The study concluded by recommending an extended post-vaccination surveillance duration of at least 10 days and underscored the importance of considering individual medical histories and concurrent medication use in assessing vaccine-induced carditis risk. This study may contribute to the understanding of vaccine safety profiles and emphasizes the significance of comprehensive post-vaccination monitoring protocols.
Within an infrastructure to monitor vaccine effectiveness (VE) against hospitalization due to COVID-19 and COVID-19 related deaths from November 2022 to July 2023 in seven countries in real-world conditions (VEBIS network), we compared two approaches: (a) estimating the VE of the first, second, or third COVID-19 booster dose administered during the autumn of 2022, and (b) estimating the VE of the autumn vaccination dose regardless of the number of prior doses (autumnal booster approach). Retrospective cohorts were constructed using electronic health records at each participating site. Cox regressions with time-varying vaccination status were fitted, and site-specific estimates were combined using random-effects meta-analysis. VE estimates from both approaches were mostly similar, particularly shortly after the start of the vaccination campaign, and showed similar timing of VE waning. However, autumnal booster estimates were more precise and showed a clearer trend, particularly compared to third-booster estimates, as calendar time increased after the vaccination campaign and during periods of lower SARS-CoV-2 activity. Moreover, the decrease in protection with increasing calendar time was clearer and more precisely estimated than when comparing protection by number of doses. Therefore, estimating VE under an autumnal booster framework emerges as a preferred method for future monitoring of COVID-19 vaccination campaigns.
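The pooling step described here, combining site-specific estimates via random-effects meta-analysis, can be sketched with the DerSimonian–Laird estimator. This is a generic illustration, not the VEBIS network's code; the site-level log hazard ratios and standard errors below are made-up values.

```python
import numpy as np

def dersimonian_laird(theta, se):
    """Random-effects pooling of site-specific effect estimates
    (e.g. log hazard ratios) with the DerSimonian-Laird tau^2."""
    theta, se = np.asarray(theta, float), np.asarray(se, float)
    w = 1.0 / se**2                                  # inverse-variance weights
    theta_fe = np.sum(w * theta) / np.sum(w)         # fixed-effect pooled value
    q = np.sum(w * (theta - theta_fe) ** 2)          # Cochran's Q
    df = len(theta) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                    # between-site variance
    w_re = 1.0 / (se**2 + tau2)                      # random-effects weights
    pooled = np.sum(w_re * theta) / np.sum(w_re)
    se_pooled = np.sqrt(1.0 / np.sum(w_re))
    return pooled, se_pooled, tau2

# Hypothetical site-level log hazard ratios (VE = 1 - HR) and standard errors
log_hr = [-0.9, -0.7, -1.1, -0.8]
se = [0.15, 0.20, 0.25, 0.18]
pooled, se_p, tau2 = dersimonian_laird(log_hr, se)
ve = 1.0 - np.exp(pooled)   # pooled vaccine effectiveness
```

When the sites agree exactly, Cochran's Q is zero, tau^2 collapses to zero, and the random-effects result reduces to the fixed-effect (inverse-variance) pooled estimate.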
Processing and extracting actionable information, such as fault or anomaly indicators originating from vibration telemetry, is both challenging and critical for an accurate assessment of mechanical system health and subsequent predictive maintenance. In the setting of predictive maintenance for populations of similar assets, the knowledge gained from any single asset should be leveraged to provide improved predictions across the entire population. In this paper, a novel transfer learning approach to population-level health monitoring is presented. The new methodology is applied to monitor multiple rotating plant assets in a power generation scenario. The focus is on the detection of statistical anomalies as a means of identifying deviations from the typical operating regime from a time series of telemetry data. This is a challenging task because the machine is observed under different operating regimes. The proposed methodology can effectively transfer information across different assets, automatically identifying segments with common statistical characteristics and using them to enrich the training of the local supervised learning models. The proposed solution leads to a substantial reduction in mean square error relative to a baseline model.
Denmark is one of the leading countries in establishing digital solutions in the health sector. When SARS-CoV-2 arrived in February 2020, a real-time surveillance system could be rapidly built on existing infrastructure. This rapid data integration for COVID-19 surveillance enabled a data-driven response. Here we describe (a) the setup of the automated, real-time surveillance and vaccination monitoring system for COVID-19 in Denmark, including primary stakeholders, data sources, and algorithms, (b) the outputs for various stakeholders, (c) how outputs were used for action, and (d) reflections on challenges and lessons learnt. Outputs were tailored to four main stakeholder groups: four outputs provided direct information to individual citizens, four to complementary systems and researchers, 25 to decision-makers, and 15 informed the public, aiding transparency. Core elements in infrastructure needed for automated surveillance had been in place for more than a decade. The COVID-19 epidemic was a pressure test that allowed us to explore the system’s potential and identify challenges for future pandemic preparedness. The system described here constitutes a model for future infectious disease surveillance in Denmark. With the current pandemic threat posed by avian influenza viruses, lessons learnt from the COVID-19 pandemic remain topical and relevant.
The EU’s Common European Data Space (CEDS) aims to create a single market for data-sharing in Europe, build trust among stakeholders, uphold European values, and benefit society. However, if norms and social values are not considered, the EU’s values and the common good of European society may be overlooked in favour of the economic benefits of organisations. We propose that the concept of “data commons” is relevant for defining openness versus enclosure of data in data spaces and is important when considering the balance and trade-off between individual (market) and collective (societal) benefits from data-sharing within the CEDS. Commons are open-access resources governed by a group, either formally by regulation or informally by local customs. Applying the data commons to the CEDS would promote data-sharing for the “common good.” However, we propose that the data commons approach should be balanced with the market-based approach to the CEDS in an inclusive hybrid data governance approach: one that meets material, price-driven interests while stimulating collective learning in online networks, forming social communities that offer participants a shared identity and social recognition.
This report explores key considerations in relation to adopting a dynamic discount rate funding approach and the impacts of doing so in a range of areas, including funding volatility, investment strategy and end game objectives. It considers the advantages and disadvantages of this approach from the perspective of a range of stakeholders, as well as the challenges that need to be overcome in order to fully implement and support the approach, for example data challenges and the new skills required in the industry. The report includes sample modelling to highlight the practical issues that arise when adopting this approach. It describes a step-by-step approach for assessing the risks to be considered when determining an appropriate level of assets to provide funding for a sample set of pension scheme cash flows, as summarised in the table below.
Steps involved in determining the funding buffer and discount rate
Step 1: Create an asset portfolio based on best estimate liability cash flows
Step 2: Adjustment for investment costs
Step 3: Buffer: allowance for asset-side risks
Step 4: Buffer: allowance for asset-liability mismatch risk (reinvestment and disinvestment risk)
Step 5: Buffer: allowance for liability-side risks
Step 6: Buffer: consideration of risk diversification when determining the buffer
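As a purely illustrative sketch of Steps 3–6, independent buffer components might be aggregated with a square-root-of-sum-of-squares rule, a common diversification treatment. The component figures and the aggregation rule here are assumptions for illustration, not taken from the report's modelling.

```python
import math

# Hypothetical buffer components (as % of best-estimate liabilities)
asset_risk = 6.0        # Step 3: allowance for asset-side risks
mismatch_risk = 3.0     # Step 4: asset-liability mismatch (re/disinvestment) risk
liability_risk = 4.0    # Step 5: allowance for liability-side risks

# Undiversified buffer: simple sum of the components
undiversified = asset_risk + mismatch_risk + liability_risk

# Step 6: square-root-of-sum-of-squares aggregation, which assumes the
# component risks are independent, gives a smaller diversified buffer
diversified = math.sqrt(asset_risk**2 + mismatch_risk**2 + liability_risk**2)
```

The diversified buffer is always no larger than the simple sum; the gap between the two is the diversification benefit recognised in Step 6.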
It also considers how a dynamic discount rate approach fits within the proposed future funding regulations. Finally, the report puts forward recommendations for the IFoA, Scheme Actuaries and TPR.
Consequences of schemes adopting a dynamic discount rate approach could include very different investment strategies with investment in a wider pool of assets, less use of leveraged Liability Driven Investment, fewer schemes targeting buy-out as their end game strategy and an increase in technical work for actuaries in advising on the optimisation of asset and liability cash flows.
Early detection and active management of invasive group A Streptococcus (iGAS) infection outbreaks are essential. Here, we describe the changing epidemiology of outbreaks of iGAS in England between 2015 and 2019, a period of increasing incidence of iGAS infection. Data on iGAS infections were extracted from national public health management records and laboratory records. Outbreaks were described by size, duration, setting, and emm type. Overall, 194 outbreaks were identified, and reports increased each year, from 16 outbreaks in 2015 to 61 in 2019. The median outbreak size was 3 cases (n = 37; 19%), with 27% of outbreaks recording 4–10 cases (n = 53) and 7% recording more than 10 cases (n = 13). Outbreak duration ranged from 0 to 170 weeks (median 7). Settings of outbreaks changed over the study period, with increasing numbers observed in multiple settings. This study provides new insights into the changing burden of iGAS infection and outbreaks in England.
Methicillin-resistant Staphylococcus aureus (MRSA) spa type t4549 is increasingly prevalent in Denmark, yet its epidemiological sources remain unclear. This study aimed to generate hypotheses about possible risk factors that may be associated with MRSA t4549 infections. We conducted a nationwide case–case questionnaire study comparing MRSA t4549 cases to other MRSA types (t002, t008, t127, t304, and t223) reported between January 2022 and November 2023. The analysis, which included descriptive statistics and logistic regression, found that 75% of MRSA t4549 cases were male. Infections were significantly more frequent in the foot (28%) and toe (54%) compared to other MRSA types. Key risk factors identified were contact with pheasants (OR = 8.70, 95% CI: 1.25–174.29), participation in indoor team sports (OR = 7.54, 95% CI: 1.58–54.82), and swimming (OR = 4.15, 95% CI: 1.97–9.03). Although the limited number of cases warrants cautious interpretation, it is crucial to emphasize the need for preventive measures at both the individual and sports facility levels. Further environmental studies are needed to clarify the role of the environment and wildlife in MRSA t4549 transmission. The increasing prevalence of this spa type in Denmark underlines the importance of implementing effective public health strategies to reduce the risk of MRSA transmission.
We study Pareto optimality in a decentralized peer-to-peer risk-sharing market where agents’ preferences are represented by robust distortion risk measures that are not necessarily convex. We obtain a characterization of Pareto-optimal allocations of the aggregate risk in the market, and we show that the shape of the allocations depends primarily on each agent’s assessment of the tail of the aggregate risk. We quantify the latter via an index of probabilistic risk aversion, and we illustrate our results using concrete examples of popular families of distortion functions. As an application of our results, we revisit the market for flood risk insurance in the United States. We present the decentralized risk sharing arrangement as an alternative to the current centralized market structure, and we characterize the optimal allocations in a numerical study with historical flood data. We conclude with an in-depth discussion of the advantages and disadvantages of a decentralized insurance scheme in this setting.
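Distortion risk measures of the kind studied here admit a simple empirical form as a Choquet integral: sort the losses in decreasing order and weight them by increments of the distortion function applied to survival probabilities. The following is a minimal sketch; the particular distortion functions and the simulated loss sample are illustrative assumptions, not the paper's specification.

```python
import numpy as np

def distortion_risk(losses, g):
    """Empirical distortion risk measure (Choquet integral): the i-th largest
    loss x_(i) receives weight g(i/n) - g((i-1)/n) for distortion g."""
    x = np.sort(np.asarray(losses, float))[::-1]   # losses, largest first
    n = len(x)
    grid = np.arange(n + 1) / n
    w = g(grid[1:]) - g(grid[:-1])                 # distortion weight increments
    return float(np.sum(x * w))

rng = np.random.default_rng(0)
losses = rng.pareto(3.0, size=10_000)   # heavy-tailed aggregate risk sample

mean_risk = distortion_risk(losses, lambda u: u)        # identity g: the mean
ph_risk = distortion_risk(losses, lambda u: u ** 0.7)   # concave (tail-averse) g
s_shaped = distortion_risk(losses, lambda u: 3 * u**2 - 2 * u**3)  # non-convex g
```

The identity distortion recovers the sample mean, a concave distortion overweights the largest losses and so dominates the mean, and an S-shaped distortion (neither convex nor concave) illustrates the non-convex preferences the abstract refers to.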
Metal–organic polyhedra (MOPs) are discrete, porous metal–organic assemblies known for their wide-ranging applications in separation, drug delivery, and catalysis. As part of The World Avatar (TWA) project—a universal and interoperable knowledge model—we have previously systematized known MOPs and expanded the explorable MOP space with novel targets. Although these data are available via a complex query language, a more user-friendly interface is desirable to enhance accessibility. To address a similar challenge in other chemistry domains, the natural language question-answering system “Marie” has been developed; however, its scalability is limited due to its reliance on supervised fine-tuning, which hinders its adaptability to new knowledge domains. In this article, we introduce an enhanced database of MOPs and a first-of-its-kind question-answering system tailored for MOP chemistry. By augmenting TWA’s MOP database with geometry data, we enable the visualization of not just empirically verified MOP structures but also machine-predicted ones. In addition, we renovated Marie’s semantic parser to adopt in-context few-shot learning, allowing seamless interaction with TWA’s extensive MOP repository. These advancements significantly improve the accessibility and versatility of TWA, marking an important step toward accelerating and automating the development of reticular materials with the aid of digital assistants.
In today’s world, smart algorithms—artificial intelligence (AI) and other intelligent systems—are pivotal for promoting the development agenda. They offer novel support for decision-making across policy planning domains, such as analysing poverty alleviation funds and predicting mortality rates. To comprehensively assess their efficacy and implications in policy formulation, this paper conducts a systematic review of 207 publications. The analysis underscores their integration within and across stages of the policy planning cycle: problem diagnosis and goal articulation; resource and constraint identification; design of alternative solutions; outcome projection; and evaluation. However, disparities exist in smart algorithm applications across stages, economic development levels, and Sustainable Development Goals (SDGs). While these algorithms predominantly focus on resource identification (29%) and contribute significantly to designing alternatives—such as long-term national energy policies—and projecting outcomes, including predicting multi-scenario land-use ecological security strategies, their application in evaluation remains limited (10%). Additionally, low-income nations have yet to fully harness AI’s potential, while upper-middle-income countries effectively leverage it. Notably, smart algorithm applications for SDGs also exhibit unevenness, with more emphasis on SDG 11 than on SDG 5 and SDG 17. Our study identifies literature gaps. Firstly, despite theoretical shifts, a disparity persists between physical and socioeconomic/environmental planning applications. Secondly, there is limited attention to policy-making in development initiatives, which is critical for improving lives. Future research should prioritise developing adaptive planning systems using emerging powerful algorithms to address uncertainty and complex environments. Ensuring algorithmic transparency, human-centered approaches, and responsible AI are crucial for AI accountability, trust, and credibility.
In recent years, there has been a global trend among governments to provide free and open access to data collected by Earth-observing satellites with the purpose of maximizing the use of this data for a broad array of research and applications. Yet, there are still significant challenges facing non-remote sensing specialists who wish to make use of satellite data. This commentary explores an illustrative case study to provide concrete examples of these challenges and barriers. We then discuss how the specific challenges faced within the case study illuminate some of the broader issues in data accessibility and utility that could be addressed by policymakers who aim to improve the reach of their data, increase the range of research and applications that it enables, and improve equity in data access and use.
The alignment of artificial intelligence (AI) systems with societal values and the public interest is a critical challenge in the field of AI ethics and governance. Traditional approaches, such as Reinforcement Learning with Human Feedback (RLHF) and Constitutional AI, often rely on pre-defined high-level ethical principles. This article critiques these conventional alignment frameworks through the philosophical perspectives of pragmatism and public interest theory, arguing against their rigidity and disconnect from practical impacts. It proposes an alternative alignment strategy that reverses the traditional logic, focusing on empirical evidence and the real-world effects of AI systems. By emphasizing practical outcomes and continuous adaptation, this pragmatic approach aims to ensure that AI technologies are developed according to principles derived from the observable impacts of technology applications.
The preferential attachment model is a natural and popular random graph model for a growing network that contains very well-connected ‘hubs’. We study the higher-order connectivity of such a network by investigating the topological properties of its clique complex. We concentrate on the Betti numbers, a sequence of topological invariants of the complex related to the numbers of holes (equivalently, repeated connections) of different dimensions. We prove that the expected Betti numbers grow sublinearly fast, with the trivial exceptions of those at dimensions 0 and 1. Our result also shows that preferential attachment graphs undergo infinitely many phase transitions within the parameter regime where the limiting degree distribution has an infinite variance. In other words, preferential attachment favors higher-order connectivity. We illustrate our theoretical results with simulations.
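As a toy illustration in the spirit of the simulations mentioned (not the paper's own analysis), one can generate a preferential attachment graph and count the low-dimensional simplices of its clique complex: vertices, edges, and triangles. Computing Betti numbers themselves requires a topology library, but these counts already determine the Euler characteristic of the 2-skeleton. networkx's Barabási–Albert generator stands in for the model here.

```python
import networkx as nx

# Preferential attachment graph: each new node attaches m = 3 edges,
# preferentially to high-degree nodes.
G = nx.barabasi_albert_graph(n=2000, m=3, seed=1)

# Simplex counts of the clique complex up to dimension 2:
n0 = G.number_of_nodes()                    # 0-simplices (vertices)
n1 = G.number_of_edges()                    # 1-simplices (edges)
n2 = sum(nx.triangles(G).values()) // 3     # 2-simplices (each triangle is
                                            # counted once at each of its
                                            # three vertices)

# Euler characteristic of the 2-skeleton: alternating sum of simplex counts
euler = n0 - n1 + n2
```

Hub vertices accumulate many mutually connected neighbours, so triangles (and higher cliques) concentrate around them, which is the elementary mechanism behind preferential attachment favoring higher-order connectivity.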
Machine learning’s integration into reliability analysis holds substantial potential to ensure infrastructure safety. Despite the merits of a flexible tree structure and formulable expressions, random forest (RF) and evolutionary polynomial regression (EPR) cannot contribute to reliability-based design because they lack uncertainty quantification (UQ), which hampers broader application. This study introduces quantile regression and variational inference (VI), tailored to RF and EPR respectively for UQ, and explores their capability in identifying material indices. Specifically, quantile-based RF (QRF) quantifies uncertainty by weighting the distribution of observations in leaf nodes, while VI-based EPR (VIEPR) approximates the parametric posterior distribution of the polynomial coefficients. The compression index of clays is taken as an exemplar to develop the models, which are compared in terms of accuracy and reliability, and also against their deterministic counterparts. The results indicate that QRF outperforms VIEPR, exhibiting higher accuracy and confidence in UQ. In regions of sparse data, the predicted uncertainty grows as errors increase, demonstrating the validity of the UQ. The generalization ability of QRF is further verified on a new creep index database. The proposed uncertainty-incorporated modeling approaches can accommodate diverse preferences and hold significant promise across broad scientific computing domains.
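The leaf-weighting idea behind QRF can be sketched generically: fit a standard random forest, record which leaf each training point falls in, and at prediction time take empirical quantiles of the training targets that share leaves with the query point. This is an illustrative sketch on synthetic heteroscedastic data, not the paper's implementation, and the hyperparameters are arbitrary.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

class QuantileForest:
    """Minimal quantile-regression-forest sketch: predictive quantiles come
    from the training targets sharing leaves with the query, pooled across trees."""

    def __init__(self, **rf_kwargs):
        self.rf = RandomForestRegressor(**rf_kwargs)

    def fit(self, X, y):
        self.rf.fit(X, y)
        self.y_train_ = np.asarray(y, float)
        self.train_leaves_ = self.rf.apply(X)   # (n_train, n_trees) leaf ids
        return self

    def predict_quantiles(self, X, qs=(0.05, 0.5, 0.95)):
        leaves = self.rf.apply(X)               # leaf id per tree per query
        out = np.empty((len(X), len(qs)))
        for i, row in enumerate(leaves):
            # pool the training targets co-located with the query in each tree
            pooled = np.concatenate([
                self.y_train_[self.train_leaves_[:, t] == leaf]
                for t, leaf in enumerate(row)
            ])
            out[i] = np.quantile(pooled, qs)
        return out

# Heteroscedastic synthetic data: noise grows with x, so the 5-95% band widens
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(600, 1))
y = X[:, 0] + rng.normal(0.0, 0.05 + 0.3 * X[:, 0])

qf = QuantileForest(n_estimators=100, min_samples_leaf=10, random_state=0).fit(X, y)
q = qf.predict_quantiles(np.array([[0.1], [0.9]]))   # rows: x=0.1 and x=0.9
```

Because the quantiles are read off the empirical leaf distributions rather than a fitted parametric form, the predicted interval automatically widens in the noisier region, which mirrors the abstract's observation that predicted uncertainty grows where the data are sparse or errors are large.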