Telomeres are repeating DNA sequences found at the ends of chromosomes; they shorten with age and are implicated in senescence. Cross-species analyses of telomere shortening rates (TSR) and telomere lengths are important for understanding the mechanisms underlying senescence, lifespan and the life-history strategies of different species. Whittemore et al. (2019) generated a new dataset on variation in TSR, lifespan and body mass. In phylogenetically uncorrected analyses, they found that TSR correlates negatively with lifespan. We re-ran analyses of their dataset using appropriate phylogenetic corrections. We found a strong phylogenetic signal in the association between TSR and body mass. We were able to corroborate Whittemore et al.'s major findings, including when correcting for body mass in a multivariate analysis. However, since laboratory mice have different telomere lengths, and potentially different telomere dynamics, than wild mice, we also removed mice from the analysis, which attenuated most associations.
Dynamic nuclear polarization (DNP) is a technique in magic-angle spinning (MAS) nuclear magnetic resonance (NMR) that enhances sensitivity and helps to overcome the low polarization of detected nuclei. Recent research showed that methyl groups, which undergo active reorientation dynamics and cause heteronuclear cross relaxation at typical DNP temperatures around 100 K, may be used as a pinpoint source of polarization for selective, site-specific probing. In this study, we investigated the cross-relaxation behavior of methyl groups in nicotine and caffeine under DNP. These effects could be useful for investigating receptor/ligand binding.
This research investigates the impact of deprivation on demographic inequalities among adults in England and Wales. Using demographic measures including the modal age at death, life expectancy, lifespan variation and mortality, it shows a negative correlation between these outcomes and deprivation as measured by the 2015 Index of Multiple Deprivation. Although life expectancy is increasing overall and the gap between men and women is narrowing, improvements are slower in more deprived areas, such that the gap between rich and poor is slowly worsening over time. Men are more adversely affected by deprivation than women: in 2015, the gap in period life expectancy at age 30 between the top and bottom 1% of deprived neighbourhoods was 10.9 years for men and 8.4 years for women. Between 2001 and 2015, male mortality rates at age 44 were 4.4 times greater in the most deprived 10% of neighbourhoods than in the 10% least deprived, and were much higher than in intervening deciles. The worst deprivation is concentrated in specific areas: in 22 out of 326 English districts, 25% or more of neighbourhoods are in the most deprived 10%, and in 5 districts the figure is 40% or above.
This paper compares the reliability functions of the cold standby, hot standby and load-sharing redundancy configurations, each composed of two identical components meeting a given system requirement. Thus far, no research has examined the conditions under which one configuration is more reliable than another, because their reliability functions have no closed form even when the component follows a Weibull lifetime distribution. In this paper, two analytical results are obtained when the reliability of each configuration is expressed in terms of the design and operational loads of the component. First, higher reliability can be achieved in a cold standby configuration than in a load-sharing configuration if the increase in component reliability obtained from the reduction in operational load is not significant. Second, a cold standby configuration exhibits better reliability and carries a higher load than a hot standby configuration if the design load can be increased with only a small decrease in component reliability.
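The hot/cold standby comparison can be made concrete numerically. The sketch below is ours, not the paper's load-based model: it ignores design and operational loads, assumes a perfect switch for the cold standby unit, and simply evaluates the two textbook reliability functions for a Weibull component, with the cold standby convolution integral computed by the trapezoid rule.

```python
import math

def weibull_sf(t, shape, scale):
    """Survival function (reliability) of a Weibull(shape, scale) component."""
    return math.exp(-((t / scale) ** shape))

def weibull_pdf(t, shape, scale):
    """Density of the Weibull lifetime; taken as 0 at the boundary t <= 0."""
    if t <= 0:
        return 0.0
    z = (t / scale) ** shape
    return (shape / scale) * (t / scale) ** (shape - 1) * math.exp(-z)

def hot_standby(t, shape, scale):
    """Two identical energized components in parallel:
    the system fails only when both components have failed."""
    r = weibull_sf(t, shape, scale)
    return 1.0 - (1.0 - r) ** 2

def cold_standby(t, shape, scale, n=2000):
    """Standby unit switched in (perfectly) when the active one fails:
    R(t) = R1(t) + int_0^t f(u) R1(t - u) du, via the trapezoid rule."""
    if t <= 0:
        return 1.0
    h = t / n
    acc = 0.0
    for i in range(n + 1):
        u = i * h
        w = 0.5 if i in (0, n) else 1.0
        acc += w * weibull_pdf(u, shape, scale) * weibull_sf(t - u, shape, scale)
    return weibull_sf(t, shape, scale) + h * acc
```

Because the cold standby lifetime is the sum of the two component lifetimes while the hot standby lifetime is their maximum, the cold configuration is at least as reliable at every mission time under these idealised assumptions; for an exponential component (shape = 1, rate 1) the closed form is R_cold(t) = e^(-t)(1 + t).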
Rules of thumb (RoTs) are proposed as a means of promoting higher levels of Defined Contribution (DC) pension saving and of stimulating debate about the high and uncertain cost of pension provision, leading to the development of solutions. The Lifetime Pension Contribution (LPC) tells young people what pension contribution is required over a full working life to achieve a decent retirement income, calculated as 23% of average UK earnings. Another RoT is that each 1% of earnings contributed provides a pension of 1.5% of earnings. Other RoTs show how costs vary by retirement age and whether the saver's retirement planning is on track. The current high cost of pensions is partly due to low interest rates and the inefficiencies of the DC market, with inadequate bulk purchasing power and risk sharing. RoTs might help encourage higher employer contributions, whether through automatic enrolment or on a voluntary basis.
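The two headline RoTs combine with simple arithmetic. A minimal illustration (the function name and the step of combining the two rules are ours, not the paper's):

```python
def pension_from_contribution(contribution_pct):
    """RoT from the abstract: each 1% of earnings contributed over a
    full working life provides a pension of about 1.5% of earnings."""
    return 1.5 * contribution_pct

# Combining the two rules: a Lifetime Pension Contribution of 23% of
# earnings would, on this rule of thumb, provide a pension of about
# 34.5% of earnings.
lpc_pct = 23.0
print(pension_from_contribution(lpc_pct))  # 34.5
```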
Raw milk cheeses are commonly consumed in France and are also a common source of foodborne outbreaks (FBOs). Both an FBO surveillance system and a laboratory-based surveillance system aim to detect Salmonella outbreaks. In early August 2018, five familial FBOs due to Salmonella spp. were reported to a regional health authority. Investigation identified common exposure to a raw goats' milk cheese, from which Salmonella spp. were also isolated, leading to an international product recall. Three weeks later, on 22 August, a national increase in Salmonella Newport ST118 was detected through laboratory surveillance. Concomitantly, isolates from the earlier familial clusters were confirmed as S. Newport ST118. Interviews with a selection of the laboratory-identified cases revealed exposure to the same cheese, including exposure to batches not covered by the previous recall, leading to an expansion of the recall. The outbreak affected 153 cases, including six in Scotland. S. Newport was detected in the cheese and in the milk of one of the producer's goats. The difference between the two alerts generated by this outbreak highlights the timeliness of the FBO system and the precision of the laboratory-based surveillance system. It is also a reminder of the risks associated with raw milk cheeses.
In a case-crossover design, a case's exposure during a risk period is compared with the case's exposures during referent periods. The selection of referents for this self-controlled design is determined by the referent selection strategy (RSS). Previous research mainly focused on systematic bias associated with the RSS. We additionally focused on how the RSS determines the number of referents per risk period, sensitivity to overdispersion and time-varying confounding.
We illustrated the consequences of different RSSs using a simulation study informed by data on meteorological variables and Legionnaires’ disease. By randomising the event and exposure time series, we explored the statistical power associated with time-stratified and fixed bidirectional RSSs and their susceptibility to systematic and confounding bias. In addition, we investigated how a high number of events on the same date (e.g. outbreaks) affected coefficient estimation. As illustrated by our work, referent selection alone can be insufficient to control for time-varying confounding bias. In contrast to systematic bias, confounding bias can be hard to detect. We studied potential solutions: varying the model parameters and link function, outlier removal, and aggregating the input data over smaller areas. Our simulation study offers a framework for researchers looking to detect and avoid bias in case-crossover studies.
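As an illustration of one of the strategies discussed, a common time-stratified RSS matches each event day to all other days in the same calendar month that share its day of week, so that day-of-week and seasonal patterns are controlled by design. A minimal sketch under that convention (the function name is ours; the paper's simulation details are not reproduced):

```python
from datetime import date, timedelta

def time_stratified_referents(event_day):
    """Return all days in the same year-month as `event_day` that fall
    on the same weekday, excluding the event day itself."""
    refs = []
    d = event_day.replace(day=1)
    # Walk through the calendar month day by day.
    while d.month == event_day.month:
        if d.weekday() == event_day.weekday() and d != event_day:
            refs.append(d)
        d += timedelta(days=1)
    return refs
```

For an event on Wednesday 18 March 2020, the referents are the other Wednesdays of that month: 4, 11 and 25 March.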
Cerebral toxoplasmosis is a leading cause of central nervous system disorders in acquired immune deficiency syndrome. This study aimed to investigate the clinical course of cerebral toxoplasmosis in human immunodeficiency virus (HIV)-infected individuals. The study included 90 HIV-infected patients with cerebral toxoplasmosis who underwent inpatient treatment. In cases of a positive enzyme immunoassay, HIV infection was confirmed with an immunoblot test. The HIV-1 ribonucleic acid level was determined using the polymerase chain reaction method. Flow cytometry was used to count CD4 (cluster of differentiation 4) cells. Pathomorphological examination included autopsy, gross and microscopic examination of internal organs, histological and other methods. The incidence of cerebral toxoplasmosis increases significantly at CD4 counts below 100 cells/μl (P < 0.001) and at HIV viral loads above 50 copies/ml (P < 0.05). The clinical picture of cerebral toxoplasmosis included focal symptoms, cognitive impairment, toxic syndrome, mild cerebral symptoms and meningeal signs. Given the absence of a specific clinical picture and of characteristic laboratory and instrumental findings, cerebral toxoplasmosis needs to be diagnosed with a combination of diagnostic methods: clinical examination, laboratory testing, immunological examination, molecular genetic testing and neuroradiological imaging.
Medicine is becoming increasingly reliant on diagnostic, prognostic and screening tests for the successful treatment of patients. With new tests being developed all the time, a more informed understanding of their benefits and drawbacks is crucial. Providing readers with the tools needed to evaluate and interpret these tests, numerous real-world examples demonstrate the practical application and relevance of the material. The mathematics involved is rigorously explained in simple and informative language. Topics covered include the diagnostic process, the reliability and accuracy of tests, and quantifying treatment benefits using randomized trials, amongst others. Engaging illustrations act as visual representations of the concepts discussed in the book, complementing the textual explanation. Based on decades of experience teaching in a clinical research training program, this fully updated second edition is an essential guide for anyone looking to select, develop or market medical tests.
Regression modelling involving heavy-tailed response distributions, which have heavier tails than the exponential distribution, has become increasingly popular in many insurance settings, including non-life insurance. Mixed Exponential models are a natural choice for the distribution of heavy-tailed claim sizes, since their tails are not exponentially bounded. This paper introduces a general family of mixed Exponential regression models with varying dispersion which can efficiently capture the tail behaviour of losses. Our main contribution is an Expectation-Maximization (EM)-type algorithm that facilitates maximum likelihood (ML) estimation for our class of mixed Exponential models, allowing regression specifications for both the mean and dispersion parameters. Finally, a real data application based on motor insurance data illustrates the versatility of the proposed EM-type algorithm.
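The paper's algorithm additionally handles regression structure for the mean and dispersion parameters; as a hedged illustration of the core E-step/M-step iteration only, the following fits a plain two-component Exponential mixture without covariates (the function name and initialisation scheme are ours):

```python
import math

def em_exponential_mixture(data, n_iter=200):
    """Minimal EM for a two-component Exponential mixture (no covariates).
    Returns (mixing weights, rate parameters)."""
    xs = sorted(data)
    lo, hi = xs[: len(xs) // 2], xs[len(xs) // 2 :]
    # Initialise one component on the small claims, one on the large ones.
    rates = [len(hi) / sum(hi), len(lo) / sum(lo)]
    weights = [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: posterior probability that each observation belongs
        # to each component, given the current parameters.
        resp = []
        for x in data:
            dens = [w * r * math.exp(-r * x) for w, r in zip(weights, rates)]
            total = sum(dens)
            resp.append([d / total for d in dens])
        # M-step: closed-form updates of the mixing weights and rates.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            weights[k] = nk / len(data)
            rates[k] = nk / sum(r[k] * x for r, x in zip(resp, data))
    return weights, rates
```

Each M-step maximises the expected complete-data log-likelihood in closed form, which is what makes the EM approach attractive for mixed Exponential claim-size models.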
We study the relation between one-year premium risk and ultimate premium risk. In practice, the one-year risk is sometimes related to the ultimate risk using a so-called emergence pattern formula, which postulates a linear relation between the two risks. We define the true emergence pattern of the ultimate loss for the one-year premium risk based on a conditional distribution of the ultimate loss derived from a multivariate distribution of the claims development process. We investigate three models commonly used in claims reserving and prove that the true emergence pattern formulas differ from the linear emergence pattern formula used in practice. We show that the one-year risk, when measured by VaR, can be under- or overestimated if the linear emergence pattern formula is applied. We present two modifications of the linear emergence pattern formula. These modifications allow us to go beyond the claims development models investigated in the first part and to work with an arbitrary distribution of the ultimate loss.
A generalised property exposure rating framework is presented here to address two issues arising in the standard approach to exposure rating, especially in the context of direct insurance and facultative reinsurance (D&F) property pricing:
(a) What to do when the main assumption of exposure rating, scalability – that is, that the probability of a given damage ratio does not depend on the maximum possible loss (MPL) but only on the type of property – breaks down.
(b) How to take account of the uncertainty around the MPL, that is, the fact that the MPL (unlike the insured value, IV) is an informed estimate rather than a contractual feature.
The first difficulty is addressed by making the exposure rating framework more flexible: exposure curves are introduced that are a mixture of scale-dependent and scale-independent losses, where the weight given to the two components is a function of the MPL. This makes it possible to model the impact on expected losses of changes in the underlying deductibles, a classic problem in D&F pricing normally solved with the help of deductible impact tables. The second difficulty is addressed by working out the mathematical implications of having a finite probability of exceeding the MPL and extending the exposure curve up to the maximum of the MPL and the IV. A practical application of this framework, in which the scale-independent and scale-dependent losses are identified with attritional and large losses respectively, is described. A discussion of how this generalised framework can be calibrated on actual data is included, and implementation code for the framework is made available.
This paper presents analytical representations for an optimal insurance contract under a distortion risk measure and in the presence of model uncertainty. We incorporate ambiguity aversion and the distortion risk measure through the model of Robert and Therond [(2014) ASTIN Bulletin: The Journal of the IAA, 44(2), 277–302], as per the framework of Klibanoff et al. [(2005) A smooth model of decision making under ambiguity. Econometrica, 73(6), 1849–1892]. Explicit optimal insurance indemnity functions are derived when the decision maker (DM) applies Value-at-Risk as the risk measure and is ambiguous about the loss distribution. Our results show that: (1) under model uncertainty, ambiguity aversion results in a distorted probability distribution over the set of possible models, with a bias in favor of the model that yields a larger risk; (2) a more ambiguity-averse DM demands more insurance coverage; (3) for a given budget, uncertainty about the loss distribution results in a higher risk level for the DM.
This paper studies deep learning approaches for finding optimal reinsurance and dividend strategies for insurance companies. Because the control processes terminate at the random financial ruin time, a Markov chain approximation-based iterative deep learning algorithm is developed to study this type of infinite-horizon optimal control problem. The optimal controls are approximated by deep neural networks for both regular and singular types of dividend strategies. The framework of Markov chain approximation plays a key role in building the iterative equations and in initializing the algorithm. We apply our method to classic dividend and reinsurance problems and compare the learning results with existing analytical solutions. The feasibility of our method for complicated problems is demonstrated by applying it to an optimal dividend, reinsurance and investment problem under a high-dimensional diffusive model with jumps and regime switching.
Pheochromocytoma (PCC) is a rare, mostly benign tumour of the adrenal medulla. Hereditary PCC accounts for ~35% of cases and has been associated with germline mutations in several cancer susceptibility genes (e.g., KIF1B, SDHB, VHL, SDHD, RET). We performed whole-exome sequencing in a family with four PCC-affected patients in two consecutive generations and identified a potential novel candidate pathogenic variant in the REXO2 gene that affects splicing (c.531-1G>T (NM_015523.3)) and co-segregated with the phenotype in the family. REXO2 encodes the RNA exonuclease 2 protein and localizes to 11q23, a chromosomal region displaying allelic imbalance in PCC. The REXO2 protein has been associated with DNA repair, replication and recombination processes, and thus its inactivation may contribute to tumorigenesis. While this study suggests that the novel REXO2 variant underlies PCC in this family, additional functional studies are required to establish the putative role of the REXO2 gene in PCC predisposition.
The substantially updated third edition of the popular Actuarial Mathematics for Life Contingent Risks is suitable for advanced undergraduate and graduate students of actuarial science, for trainee actuaries preparing for professional actuarial examinations, and for life insurance practitioners who wish to increase or update their technical knowledge. The authors provide intuitive explanations alongside mathematical theory, equipping readers to understand the material in sufficient depth to apply it in real-world situations and to adapt their results in a changing insurance environment. Topics include modern actuarial paradigms, such as multiple state models, cash-flow projection methods and option theory, all of which are required for managing the increasingly complex range of contemporary long-term insurance products. Numerous exam-style questions allow readers to prepare for traditional professional actuarial exams, and extensive use of Excel ensures that readers are ready for modern, Excel-based exams and for the actuarial work environment. The Solutions Manual (ISBN 9781108747615), available for separate purchase, provides detailed solutions to the text's exercises.