Consider a two-type Moran population of size $N$ with selection and mutation, where the selective advantage of the fit individuals is amplified at extreme environmental conditions. Assume selection and mutation are weak with respect to $N$, and extreme environmental conditions rarely occur. We show that, as $N\to\infty$, the type frequency process with time sped up by $N$ converges to the solution to a Wright–Fisher-type SDE with a jump term modeling the effect of the environment. We use an extension of the ancestral selection graph (ASG) to describe the genealogical picture of the model. Next, we show that the type frequency process and the line-counting process of a pruned version of the ASG satisfy a moment duality. This relation yields a characterization of the asymptotic type distribution. We characterize the ancestral type distribution using an alternative pruning of the ASG. Most of our results are stated in annealed and quenched form.
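To fix ideas, a schematic of the kind of limit and duality involved (the coefficients and notation here are illustrative, not the authors' exact statement): with $\sigma$ the selection strength, $\theta\nu_0, \theta\nu_1$ the mutation rates, and $N(dt,dy)$ a Poisson random measure governing the rare environmental events, a Wright–Fisher-type jump SDE takes the form
$$dX_t = \big(\sigma X_t(1-X_t) + \theta\nu_0(1-X_t) - \theta\nu_1 X_t\big)\,dt + \sqrt{2X_t(1-X_t)}\,dB_t + \int_{(0,1]} y\,X_{t-}(1-X_{t-})\,N(dt,dy),$$
and a moment duality with a line-counting process $(L_t)_{t\ge 0}$ reads $\mathbb{E}[X_t^n \mid X_0=x] = \mathbb{E}[x^{L_t} \mid L_0=n]$.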
A multivariate Poisson distribution is a natural choice for modeling count data stemming from correlated random variables; however, it is limited by the underlying univariate model assumption that the data are equi-dispersed. Alternative models include a multivariate negative binomial and a multivariate generalized Poisson distribution, which themselves suffer from analogous limitations, as described in Chapter 1. While these distributions motivate the need to instead consider a multivariate analog of the univariate COM–Poisson, such model development varies in how it takes into account (or gives rise to) certain distributional qualities. This chapter summarizes these efforts: for each approach, readers will first learn about any bivariate COM–Poisson distribution formulations, followed by any multivariate analogs. Because these models are multidimensional generalizations of the univariate COM–Poisson, they each contain analogous forms of the Poisson, Bernoulli, and geometric distributions as special cases. The methods discussed in this chapter are trivariate reduction, compounding, the Sarmanov family of distributions, and copulas.
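For reference, the univariate COM–Poisson pmf underlying these constructions is $P(X=x) = \lambda^x/\big((x!)^{\nu} Z(\lambda,\nu)\big)$ with normalizing constant $Z(\lambda,\nu) = \sum_{j\ge 0} \lambda^j/(j!)^{\nu}$; it recovers the Poisson ($\nu=1$), geometric ($\nu=0$, $\lambda<1$), and Bernoulli ($\nu\to\infty$) distributions. A minimal Python sketch (illustrative, not tied to any package in the chapter) evaluates the pmf in log space to avoid factorial overflow:

```python
import math

def com_poisson_pmf(x, lam, nu, terms=200):
    """P(X = x) for the COM-Poisson, with log Z evaluated by a
    truncated log-sum-exp (illustrative code, lam > 0)."""
    logw = [j * math.log(lam) - nu * math.lgamma(j + 1) for j in range(terms)]
    m = max(logw)
    log_z = m + math.log(sum(math.exp(w - m) for w in logw))
    return math.exp(x * math.log(lam) - nu * math.lgamma(x + 1) - log_z)

print(com_poisson_pmf(2, lam=3.0, nu=1.0))  # ~0.224, the Poisson(3) pmf at 2
print(com_poisson_pmf(2, lam=0.5, nu=0.0))  # geometric case: (1-lam)*lam**2 = 0.125
```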
Edited by David Lynch, Federal Reserve Board of Governors; Iftekhar Hasan, Fordham University Graduate Schools of Business; Akhtar Siddique, Office of the Comptroller of the Currency
Retail credit risk is an important risk for many banks. This chapter describes various retail credit risk models in great detail and reviews the ways they may be validated. Validation principles are described for models used for risk management, stress testing, and other applications. The classes of models include both static scoring models and multi-period loss forecasting models; within the latter class, roll rate models, vintage-based models, and various other models are described. Account- and loan-level models are also described, including the Cox proportional hazards model and the multinomial logit model. In each case, the authors discuss the academic underpinnings, the industry usage, and the choices that are commonly made under various circumstances. The role of data in determining these choices is also discussed.
from Chapter 12 - Validation of Models Used by Banks to Estimate Their Allowance for Loan and Lease Losses
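To make the multi-period loss-forecasting idea described above concrete, here is a minimal roll-rate sketch in Python; the delinquency buckets, transition probabilities, and 12-month horizon are made-up illustrations, not figures from the chapter:

```python
import numpy as np

# Balances roll monthly between delinquency buckets
# [current, 30 DPD, 60 DPD, 90+ DPD / charge-off].
roll = np.array([
    [0.97, 0.03, 0.00, 0.00],   # current -> ...
    [0.40, 0.30, 0.30, 0.00],   # 30 DPD  -> ...
    [0.15, 0.10, 0.35, 0.40],   # 60 DPD  -> ...
    [0.00, 0.00, 0.00, 1.00],   # 90+ DPD treated as absorbing
])

balances = np.array([1_000.0, 40.0, 15.0, 0.0])  # bucket balances ($m)

# 12-month projection: the charge-off estimate is the mass
# absorbed into the last bucket after twelve monthly rolls.
projected = balances @ np.linalg.matrix_power(roll, 12)
print(f"projected 12-month charge-offs: ${projected[-1]:.1f}m")
```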
This chapter provides a unified discussion of the framework for model validation. It describes how model validation developed over time across various disciplines. It then describes the various approaches that are applied for validation of risk management models at financial institutions.
This chapter presents a general approach for evaluating the empirical performance of VaR models. The approach leverages data used in standard VaR backtesting, filtering on a number of selected conditioning variables, to perform tests on specific properties of the model. This simple but general approach can be used to test a wide range of model properties and provide useful information on potential areas of improvement for a VaR model.
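A minimal sketch of the filtered-backtest idea, assuming 99% VaR exceedance indicators and a single conditioning variable split at its median; the function and data below are placeholders, not the chapter's exact tests:

```python
import numpy as np
from scipy.stats import binomtest

def conditional_coverage_test(exceed, cond, alpha=0.01):
    """Backtest VaR coverage separately on subsamples defined by a
    conditioning variable (here: above/below its median)."""
    results = {}
    for label, mask in {"low": cond <= np.median(cond),
                        "high": cond > np.median(cond)}.items():
        k, n = int(exceed[mask].sum()), int(mask.sum())
        results[label] = (k, n, binomtest(k, n, alpha).pvalue)
    return results

rng = np.random.default_rng(0)
exceed = rng.random(1000) < 0.01   # placeholder exceedance indicators
vol = rng.lognormal(size=1000)     # placeholder conditioning variable
print(conditional_coverage_test(exceed, vol))
```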
While the Poisson model motivated much of the classical control chart theory for count data, several works note its constraining equi-dispersion assumption. Dispersion must be addressed because over-dispersed data can produce false out-of-control detections when using Poisson limits, while under-dispersed data will produce Poisson limits that are too broad, resulting in potential false negatives and out-of-control states that require a longer study period to detect. Section 6.1 introduces the Shewhart COM–Poisson control chart, demonstrating its flexibility in assessing in- or out-of-control status, along with advancements made to this chart. These initial works led to a wellspring of flexible control chart development motivated by the COM–Poisson distribution. Section 6.2 describes a generalized exponentially weighted moving average control chart, and Section 6.3 describes cumulative sum charts for monitoring COM–Poisson processes. Meanwhile, Section 6.4 introduces generally weighted moving average charts based on the COM–Poisson, and Section 6.5 presents the Conway–Maxwell–Poisson chart via the progressive mean statistic. Finally, the chapter concludes with discussion.
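As a concrete (and purely illustrative) instance of a Shewhart-type COM–Poisson chart, the center line and 3-sigma limits can be computed numerically from the truncated pmf; this is a generic construction under assumed parameter values, not a specific published chart:

```python
import numpy as np
from scipy.special import gammaln

def com_poisson_shewhart_limits(lam, nu, kmax=500):
    """Shewhart-style 3-sigma limits for a COM-Poisson count chart.
    Mean and variance come from the truncated, renormalized pmf."""
    k = np.arange(kmax)
    logw = k * np.log(lam) - nu * gammaln(k + 1)
    p = np.exp(logw - logw.max())
    p /= p.sum()                       # pmf on 0..kmax-1
    mean = (k * p).sum()
    var = ((k - mean) ** 2 * p).sum()
    lcl = max(0.0, mean - 3 * np.sqrt(var))
    ucl = mean + 3 * np.sqrt(var)
    return lcl, ucl

# e.g. lam = 4, nu = 0.8 (an over-dispersed process)
print(com_poisson_shewhart_limits(lam=4.0, nu=0.8))
```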
This chapter provides an alternative to exceedance- or density-based evaluation of VaR models, based on empirical likelihood. We also outline a method to infer risk exposure by assessing whether some measure of interval forecast error (e.g., the distance between the empirical distribution of the PITs and the posited uniform distribution) is related to some measure of risk exposure, such as the volatility of given risk factors.
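One way to operationalize this idea, sketched below with placeholder data: compute a windowed Kolmogorov–Smirnov distance between the PITs and the uniform distribution, then regress it on windowed realized volatility (the window length and volatility proxy are assumptions, not the chapter's choices):

```python
import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(1)
pits = rng.random(2000)               # PITs from a hypothetical risk model
returns = rng.normal(0, 0.01, 2000)   # matching portfolio returns

window = 250
ks_dist, vol = [], []
for start in range(0, len(pits) - window, window):
    sl = slice(start, start + window)
    ks_dist.append(kstest(pits[sl], "uniform").statistic)  # forecast-error proxy
    vol.append(returns[sl].std())                          # risk-exposure proxy

slope, intercept = np.polyfit(vol, ks_dist, 1)
print(f"slope of KS distance on volatility: {slope:.2f}")
```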
This chapter assesses the accuracy and possible misspecification of VaR models and compares PIT-based with exceedance-based backtesting results for the same sample of real portfolios. It investigates results from a set of tests used to assess the unconditional coverage, conditional coverage, and independence properties of the realized VaR exceptions. It also presents a comprehensive overview of tests used to assess the uniformity and independence properties of a series of PIT estimates generated from real-world risk models. The analysis includes tests based on the empirical CDF (e.g., Kolmogorov–Smirnov, Cramér–von Mises, and Anderson–Darling) as well as tests of dependence based on regression analysis of observed PITs.
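A small subset of this battery can be sketched as follows. Note two substitutions: the independence check uses a Ljung–Box autocorrelation test as a simple stand-in for the regression-based tests the chapter describes, and the Anderson–Darling test is omitted because scipy has no built-in version against the uniform distribution:

```python
import numpy as np
from scipy.stats import kstest, cramervonmises
from statsmodels.stats.diagnostic import acorr_ljungbox

def pit_diagnostics(pits):
    """Uniformity and (serial) independence checks for a PIT series."""
    ks = kstest(pits, "uniform")
    cvm = cramervonmises(pits, "uniform")
    lb = acorr_ljungbox(pits - pits.mean(), lags=[10])  # serial dependence
    return {"KS p": ks.pvalue, "CvM p": cvm.pvalue,
            "Ljung-Box p": float(lb["lb_pvalue"].iloc[0])}

print(pit_diagnostics(np.random.default_rng(2).random(1000)))
```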
As the Reynolds number increases, large-eddy simulation (LES) of complex flows becomes increasingly intractable because near-wall turbulent structures become increasingly small. Wall modeling reduces the computational requirements of LES by enabling the use of coarser cells at the walls. This paper presents a machine-learning methodology to develop data-driven wall-shear-stress models that can directly operate, a posteriori, on the unstructured grid of the simulation. The model architecture is based on graph neural networks. The model is trained on a database that includes fully developed boundary layers, adverse pressure gradients, separated boundary layers, and laminar–turbulent transition. The relevance of the trained model is verified a posteriori for the simulation of a channel flow, a backward-facing step, and a linear blade cascade.
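A bare-bones message-passing sketch of such a model is given below; the input features, layer sizes, and sum aggregation are illustrative assumptions, not the paper's architecture:

```python
import torch

class WallModelGNN(torch.nn.Module):
    """Minimal graph model mapping near-wall node features to a
    predicted wall shear stress per node."""
    def __init__(self, in_dim=3, hidden=32):
        super().__init__()
        self.encode = torch.nn.Linear(in_dim, hidden)
        self.message = torch.nn.Linear(2 * hidden, hidden)
        self.decode = torch.nn.Linear(hidden, 1)  # wall shear stress

    def forward(self, x, edge_index):
        # x: [num_nodes, in_dim], e.g. (velocity magnitude, wall distance, viscosity)
        # edge_index: [2, num_edges], connectivity of the unstructured grid
        h = torch.relu(self.encode(x))
        src, dst = edge_index
        msg = torch.relu(self.message(torch.cat([h[src], h[dst]], dim=-1)))
        agg = torch.zeros_like(h).index_add_(0, dst, msg)  # sum messages per node
        return self.decode(h + agg)

# toy graph: 4 wall nodes in a line, edges in both directions
x = torch.rand(4, 3)
edges = torch.tensor([[0, 1, 2, 1, 2, 3], [1, 2, 3, 0, 1, 2]])
print(WallModelGNN()(x, edges).shape)  # torch.Size([4, 1])
```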
This chapter introduces the Conway–Maxwell–Poisson regression model, along with adaptations of the model to account for zero-inflation, censoring, and data clustering. Section 5.1 motivates the consideration and development of the various COM–Poisson regressions. Section 5.2 introduces the regression model and discusses related issues, including parameter estimation, hypothesis testing, and statistical computing in R. Section 5.3 advances that work to address excess zeros, while Section 5.4 describes COM–Poisson models that incorporate repeated measures and longitudinal studies. Section 5.5 focuses attention on the R statistical packages and functionality associated with regression analysis that accommodates excess zeros and/or clustered data as described in the two previous sections. Section 5.6 considers a generalized additive model based on the COM–Poisson. Finally, Section 5.7 informs readers of other statistical software that can also conduct COM–Poisson regression, discussing the associated functionality. The chapter concludes with discussion.
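As a rough indication of what such a regression involves, the sketch below fits $\log\lambda_i = x_i^{\top}\beta$ with a common dispersion $\nu$ by direct maximum likelihood; this is a bare-bones illustration under simulated data, not one of the packages discussed in the chapter:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def com_poisson_nll(params, X, y, kmax=200):
    """Negative log-likelihood for a COM-Poisson regression with
    log(lambda_i) = x_i' beta and common dispersion nu."""
    beta, log_nu = params[:-1], params[-1]
    nu = np.exp(log_nu)                  # keep nu > 0
    log_lam = X @ beta
    k = np.arange(kmax)
    # log Z(lambda_i, nu) via log-sum-exp, one row per observation
    logw = np.outer(log_lam, k) - nu * gammaln(k + 1)
    logZ = np.logaddexp.reduce(logw, axis=1)
    ll = y * log_lam - nu * gammaln(y + 1) - logZ
    return -ll.sum()

rng = np.random.default_rng(3)
X = np.column_stack([np.ones(300), rng.normal(size=300)])
y = rng.poisson(np.exp(0.5 + 0.3 * X[:, 1]))  # Poisson data: expect nu ~ 1
fit = minimize(com_poisson_nll, x0=np.zeros(3), args=(X, y), method="BFGS")
print("beta:", fit.x[:2], "nu:", np.exp(fit.x[2]))
```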