Survey studies offer a balance between large-sample analysis and more sample-specific studies: they can draw on a large cross-section of companies while still allowing researchers to put specific qualitative questions to respondents. Survey studies also allow us to measure and quantify decision-making processes and beliefs. Survey data analysis can thus be seen as a bridge connecting qualitative and quantitative studies in corporate finance research. This chapter covers the most commonly used techniques in survey data analysis. In particular, it focuses on the assumptions and applications of principal components analysis (PCA), but also briefly explains factor analysis and discusses the similarities and differences between the two methods. The chapter includes an application of PCA to ownership concentration, examining its different dimensions. Finally, lab work on PCA and a mini case study are provided.
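As a hedged illustration of the technique (not code from the chapter), the sketch below runs PCA on simulated ownership-concentration measures; all variable names and data are invented for the example.

```python
# Minimal PCA sketch on hypothetical ownership-concentration measures.
# Column names (top1_stake, top5_stake, herfindahl) are illustrative only.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "top1_stake": rng.uniform(0, 60, 500),
    "top5_stake": rng.uniform(10, 90, 500),
    "herfindahl": rng.uniform(0, 1, 500),
})

X = StandardScaler().fit_transform(df)   # PCA is scale-sensitive: standardize first
pca = PCA()
scores = pca.fit_transform(X)            # component scores for each firm

print(pca.explained_variance_ratio_)     # share of variance per component
print(pca.components_)                   # loadings: how each measure enters a component
```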
Panel data consist of multiple observations for each unit in the data. The units can be investors, firms, households, and so on. Panel datasets allow us to follow these units over time, providing an intuitive understanding of each unit's behavior, and panel-data analysis tends to address causality better than cross-sectional analysis. This chapter provides a wide range of examples of panel-data techniques, with the main focus on linear panel-data models. It covers pooled OLS estimators, the fixed-effects model, the least-squares dummy variable (LSDV) estimator, the difference-in-differences model, the between estimator, the random-effects model, the Hausman–Taylor random-effects IV method, and, briefly, dynamic panel-data models. The chapter also briefly reviews stationarity and the generalized method of moments (GMM). An application of linear panel-data models, as well as lab work and a mini case study, are provided at the end of the chapter.
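To make the fixed-effects model concrete, here is a minimal sketch (not from the chapter) of the within estimator, which demeans each firm's data to sweep out firm fixed effects and is numerically equivalent to LSDV for the slope coefficients; the panel layout and variable names are simulated.

```python
# Fixed-effects (within) estimator via entity demeaning on a simulated panel.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_firms, n_years = 100, 8
df = pd.DataFrame({
    "firm": np.repeat(np.arange(n_firms), n_years),
    "year": np.tile(np.arange(n_years), n_firms),
})
firm_effect = rng.normal(size=n_firms)[df["firm"]]   # unobserved heterogeneity
df["profitability"] = rng.normal(size=len(df))
df["leverage"] = 0.5 * df["profitability"] + firm_effect + rng.normal(size=len(df))

# Within transformation: subtract each firm's mean to sweep out fixed effects
demeaned = df.groupby("firm")[["leverage", "profitability"]].transform(lambda s: s - s.mean())
res = sm.OLS(demeaned["leverage"], demeaned["profitability"]).fit()
print(res.params)  # slope close to the true 0.5; firm effects removed
```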
The aim of regression analysis is to summarize the observed data and study how a dependent variable responds as the values of the independent variable(s) change. Many models examine this relationship by estimating the parameters of a regression model. The classical linear regression model (CLRM) is the basis of all the other models discussed in this book. This chapter discusses the CLRM in detail using the ordinary least squares (OLS) estimation method; OLS results can also serve as a benchmark in more advanced analyses. The focus is on the assumptions and applications of this technique, starting from a simple regression model with one independent variable and then covering multiple linear regression models with several independent variables. The chapter provides an application to the capital asset pricing model, lab work on the CLRM, and a mini case study.
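In the spirit of the chapter's CAPM application, a single-regressor CLRM can be estimated by OLS as in the hedged sketch below; the returns are simulated, not the chapter's data.

```python
# Single-regressor CLRM by OLS: excess stock returns on excess market returns.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
market_excess = rng.normal(0.005, 0.04, 240)          # 20 years of monthly data (simulated)
stock_excess = 0.001 + 1.2 * market_excess + rng.normal(0, 0.05, 240)

X = sm.add_constant(market_excess)                    # intercept (alpha) and slope (beta)
res = sm.OLS(stock_excess, X).fit()
print(res.summary())                                  # estimated beta should be close to 1.2
```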
ML methods are increasingly being used in (corporate) finance studies, with impressive applications. ML methods can be applied with the aim of reducing prediction error, but they can also extend existing traditional econometric methods. The performance of ML models depends on the quality of the input data and the choice of model. There are many ML models, each with its own specific details, so it is essential to select the appropriate model(s) for the analysis. This chapter briefly reviews some broad types of ML methods. It covers supervised learning, which tends to achieve superior prediction performance by using more flexible functional forms than OLS in the prediction model. It explains unsupervised learning methods, which derive and learn structural information from the data. Finally, the chapter also discusses some limitations and drawbacks of ML, as well as potential remedies.
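As one hedged illustration of the flexibility argument (simulated data, not an example from the chapter), the sketch below compares OLS with a random forest on a nonlinear prediction task:

```python
# Compare OLS with a more flexible supervised learner on the same task.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.uniform(-2, 2, size=(2000, 3))
y = np.sin(X[:, 0]) * X[:, 1] + 0.1 * rng.normal(size=2000)   # nonlinear target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for model in (LinearRegression(), RandomForestRegressor(random_state=0)):
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(type(model).__name__, mean_squared_error(y_te, pred))  # forest wins here
```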
Event studies are commonly applied in corporate finance, with a focus on testing market efficiency hypotheses and evaluating the effects of corporate decisions on firm values, stock prices, and other outcome variables. The chapter discusses the event-study model using examples from (i) the return predictability literature; (ii) the effects of firm-level and macro news on stock returns, testing semi-strong-form efficiency; and (iii) insider trading, testing strong-form efficiency. For short-term event studies, the chapter reviews abnormal return (AR) and cumulative abnormal return (CAR) calculations and discusses statistical tests of ARs and CARs. It also covers long-term event studies, discussing buy-and-hold abnormal returns as well as the calendar-time portfolio approach. The chapter provides an application of a short-term event study by examining how stock prices respond to news of a CEO's departure. The chapter ends with lab work and a mini case study.
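As a hedged sketch of the standard market-model calculation (simulated returns; the window lengths are illustrative, not the chapter's choices):

```python
# Short-horizon ARs and CAR: estimate alpha and beta on an estimation window,
# then compute abnormal returns over the event window.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
market = rng.normal(0.0004, 0.01, 280)
stock = 0.0002 + 1.1 * market + rng.normal(0, 0.012, 280)

est, event = slice(0, 250), slice(250, 261)           # estimation and event windows
res = sm.OLS(stock[est], sm.add_constant(market[est])).fit()
alpha, beta = res.params

ar = stock[event] - (alpha + beta * market[event])    # abnormal returns
car = ar.cumsum()                                     # cumulative abnormal returns
print(ar, car[-1])
```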
This paper concerns an insurance firm's surplus process observed at renewal inspection times, with a focus on assessing the probability of the surplus level dropping below zero. For various types of inter-inspection time distributions, an explicit expression for the corresponding transform is given. In addition, Cramér–Lundberg-type asymptotics are established. Finally, an importance-sampling-based Monte Carlo algorithm is proposed and shown to be logarithmically efficient.
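For orientation: in the classical Cramér–Lundberg model (a standard result quoted here for context, not the paper's precise statement), the ruin probability $\psi(u)$ for initial surplus $u$ satisfies $\psi(u) \sim C e^{-\gamma u}$ as $u \to \infty$, where $\gamma > 0$ is the adjustment (Lundberg) coefficient and $C > 0$ is a model-dependent constant. The asymptotics established in the paper are of this exponential type, adapted to the surplus observed at inspection times.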
New-generation pneumococcal conjugate vaccines (PCVs) are available to replace PCV13 for childhood and adult immunization. Besides cost-effectiveness evaluations, which have highly variable results, the comparative immunogenicity of the new vaccines (PCV15, PCV20, PCV21) and their coverage of invasive pneumococcal disease (IPD) and carriage strains in different age groups should be considered, as should the antibody susceptibility, antibiotic resistance, invasiveness, and virulence of the serotypes included in each vaccine. These topics are discussed based on the Canadian experience. The optimal strategy would be a 2+1 PCV20 schedule for children, PCV21 for elderly adults, and a dual PCV20+PCV21 schedule for adults at very high IPD risk. Shifting from PCV13 to PCV15 for children entails a risk of increased IPD incidence in adults, because the additional serotypes are of low virulence and could be replaced by more invasive and virulent ones. This risk can reasonably be excluded if PCV20 replaces PCV13, since the additional serotypes PCV20 covers are highly invasive and virulent. It is recognized that off-label use of PCV20 according to a 2+1 schedule could be problematic for some jurisdictions, as this schedule is not authorized in all countries. In Canada, however, the 2+1 PCV20 schedule was authorized based on the same dataset submitted elsewhere.
Motivated by recent developments in quasi-stationary Monte Carlo methods, we investigate the stability of quasi-stationary distributions of killed Markov processes under perturbations of the generator. We first consider a general bounded self-adjoint perturbation operator, and then study a particular unbounded perturbation corresponding to truncation of the killing rate. In both scenarios, we quantify the difference between the eigenfunctions corresponding to the smallest eigenvalue of the perturbed and unperturbed generators in a Hilbert space norm. As a consequence, $\mathcal{L}^1$-norm estimates of the difference between the resulting quasi-stationary distributions are provided in terms of the perturbation.
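For context, recall the standard definition (added here for orientation, not specific to this paper): a probability measure $\nu$ is a quasi-stationary distribution of a Markov process $X$ killed at time $\tau$ if $\mathbb{P}_\nu(X_t \in A \mid \tau > t) = \nu(A)$ for every measurable set $A$ and all $t \geq 0$; that is, conditioned on survival, the law of the process started from $\nu$ remains $\nu$.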
The study investigated the incidence of, and risk factors for, myocarditis and pericarditis following SARS-CoV-2 vaccination, addressing a notable gap in understanding the safety profile of these vaccines. Using data from the National Health Insurance System (NHIS) database of Korea, the researchers employed both a case-crossover study and a nested case-control design to analyze temporal patterns and risk factors related to carditis occurrences post-immunization. Key findings revealed a significant association between SARS-CoV-2 vaccination and the occurrence of carditis, with a strong temporal correlation observed within 10 days post-vaccination. Noteworthy factors contributing to carditis risk included the interval between vaccination and carditis, specific comorbidities, and medication use. The study concluded by recommending an extended post-vaccination surveillance period of at least 10 days and underscored the importance of considering individual medical histories and concurrent medication use when assessing vaccine-induced carditis risk. This study may contribute to understanding vaccine safety profiles and emphasizes the significance of comprehensive post-vaccination monitoring protocols.
Within an infrastructure to monitor vaccine effectiveness (VE) against hospitalization due to COVID-19 and COVID-19-related deaths from November 2022 to July 2023 in seven countries under real-world conditions (the VEBIS network), we compared two approaches: (a) estimating VE of the first, second, or third COVID-19 booster dose administered during the autumn of 2022, and (b) estimating VE of the autumn vaccination dose regardless of the number of prior doses (the autumnal booster approach). Retrospective cohorts were constructed using Electronic Health Records at each participating site. Cox regressions with time-varying vaccination status were fitted, and site-specific estimates were combined using random-effects meta-analysis. VE estimates under the two approaches were mostly similar, particularly shortly after the start of the vaccination campaign, and showed similar timing of VE waning. However, autumnal booster estimates were more precise and showed a clearer trend, particularly compared to third-booster estimates, as calendar time increased after the vaccination campaign and during periods of lower SARS-CoV-2 activity. Moreover, the decline in protection with increasing calendar time was clearer and more precisely estimated than when comparing protection by number of doses. Estimating VE under an autumnal booster framework therefore emerges as a preferred method for future monitoring of COVID-19 vaccination campaigns.
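As a hedged sketch of the pooling step (a standard DerSimonian–Laird random-effects calculation with invented numbers, not the network's actual estimates or code):

```python
# Random-effects meta-analysis of site-specific log hazard ratios (DerSimonian-Laird).
import numpy as np

log_hr = np.array([-1.2, -0.9, -1.5, -1.1])     # site-specific log(HR) estimates (made up)
se = np.array([0.30, 0.25, 0.40, 0.20])         # their standard errors (made up)

w = 1 / se**2                                    # fixed-effect (inverse-variance) weights
theta_fe = np.sum(w * log_hr) / np.sum(w)
Q = np.sum(w * (log_hr - theta_fe) ** 2)         # Cochran's Q heterogeneity statistic
k = len(log_hr)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1 / (se**2 + tau2)                        # random-effects weights
theta_re = np.sum(w_re * log_hr) / np.sum(w_re)
ve = 1 - np.exp(theta_re)                        # VE = 1 - pooled hazard ratio
print(theta_re, ve)
```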
Processing and extracting actionable information, such as fault or anomaly indicators originating from vibration telemetry, is both challenging and critical for an accurate assessment of mechanical system health and subsequent predictive maintenance. In predictive maintenance for populations of similar assets, the knowledge gained from any single asset should be leveraged to improve predictions across the entire population. In this paper, a novel transfer-learning approach to population-level health monitoring is presented and applied to multiple rotating plant assets in a power generation scenario. The focus is on detecting statistical anomalies as a means of identifying deviations from the typical operating regime in a time series of telemetry data; this is challenging because each machine is observed under different operating regimes. The proposed methodology can effectively transfer information across different assets, automatically identifying segments with common statistical characteristics and using them to enrich the training of the local supervised learning models. The proposed solution leads to a substantial reduction in mean square error relative to a baseline model.
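The sketch below is a toy illustration of the general idea of borrowing statistically similar segments from another asset to enrich local training; it is not the paper's algorithm, and the matching test, data, and anomaly score are all invented for the example.

```python
# Toy cross-asset enrichment: keep another asset's segments whose distribution
# matches the local regime, then fit a simple Gaussian anomaly score.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
local = rng.normal(0.0, 1.0, 200)                  # sparse local training data
other_segments = [rng.normal(0.0, 1.0, 500),       # similar regime
                  rng.normal(4.0, 1.0, 500)]       # different regime

# Keep only segments statistically similar to the local data (two-sample KS test)
matched = [s for s in other_segments if stats.ks_2samp(local, s).pvalue > 0.05]
train = np.concatenate([local, *matched])          # enriched training set

mu, sigma = train.mean(), train.std()
def anomaly_score(x):
    return np.abs(x - mu) / sigma                  # z-score; large values flagged

print(len(matched), anomaly_score(np.array([0.1, 5.0])))
```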
Denmark is one of the leading countries in establishing digital solutions in the health sector. When SARS-CoV-2 arrived in February 2020, a real-time surveillance system could be rapidly built on existing infrastructure, and this rapid data integration for COVID-19 surveillance enabled a data-driven response. Here we describe (a) the setup of the automated, real-time surveillance and vaccination monitoring system for COVID-19 in Denmark, including primary stakeholders, data sources, and algorithms; (b) outputs for various stakeholders; (c) how outputs were used for action; and (d) reflections on challenges and lessons learnt. Outputs were tailored to four main stakeholder groups: four outputs provided direct information to individual citizens, four to complementary systems and researchers, 25 to decision-makers, and 15 informed the public, aiding transparency. The core infrastructure needed for automated surveillance had been in place for more than a decade. The COVID-19 epidemic was a pressure test that allowed us to explore the system's potential and identify challenges for future pandemic preparedness. The system described here constitutes a model for future infectious disease surveillance in Denmark. With the current pandemic threat posed by avian influenza viruses, lessons learnt from the COVID-19 pandemic remain topical and relevant.
The EU’s Common European Data Space (CEDS) aims to create a single market for data-sharing in Europe, build trust among stakeholders, uphold European values, and benefit society. However, if norms and social values are not considered, the values of the EU and the benefits for the common good of European society may be overlooked in favour of the economic benefits of organisations. We propose that the concept of “data commons” is relevant for defining openness versus enclosure of data in data spaces, and is important when considering the balance and trade-off between individual (market) and collective (societal) benefits from data-sharing within the CEDS. Commons are open-access resources governed by a group, either formally by regulation or informally by local customs. Applying the data commons to the CEDS would promote data-sharing for the “common good.” However, we propose that the data commons approach should be balanced against the market-based approach to the CEDS in an inclusive hybrid data governance approach: one that meets material, price-driven interests while stimulating collective learning in online networks to form social communities that offer participants a shared identity and social recognition.
This report explores key considerations in adopting a dynamic discount rate funding approach and the impacts of doing so in a range of areas, including funding volatility, investment strategy, and end-game objectives. It considers the advantages and disadvantages of this approach from the perspective of a range of stakeholders, and the challenges that must be overcome to fully implement and support the approach, for example data challenges and the new skills required in the industry. The report includes sample modelling to highlight the practical issues that arise when adopting this approach. It describes a step-by-step approach for assessing the risks to be considered when determining an appropriate level of assets to provide funding for a sample set of pension scheme cash flows, as summarised in the table below (an illustrative numerical sketch follows the table).
Steps involved in determining the funding buffer and discount rate
Step 1: Create an asset portfolio based on best estimate liability cash flows
Step 2: Adjustment for investment costs
Step 3: Buffer: allowance for asset-side risks
Step 4: Buffer: allowance for asset-liability mismatch risk (reinvestment and disinvestment risk)
Step 5: Buffer: allowance for liability-side risks
Step 6: Buffer: consideration of risk diversification when determining the buffer
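The toy sketch below loosely mirrors the six steps above with invented figures; it is not the report's model, and all assumptions (cash flows, yield, cost adjustment, buffer sizes, diversification rule) are illustrative only.

```python
# Toy sketch: present value of best-estimate pension cash flows at an assumed
# portfolio yield, with simple multiplicative buffers mirroring Steps 2-6.
import numpy as np

cash_flows = np.full(30, 10.0)                 # £10m per year for 30 years (invented)
years = np.arange(1, 31)
portfolio_yield = 0.045                        # assumed dynamic discount rate

pv = np.sum(cash_flows / (1 + portfolio_yield) ** years)         # Step 1
pv *= 1.002                                    # Step 2: investment cost adjustment (assumed 0.2%)
buffers = {"asset": 0.05, "mismatch": 0.02, "liability": 0.03}   # Steps 3-5 (assumed)
diversified = np.sqrt(sum(b**2 for b in buffers.values()))       # Step 6: simple diversification
required_assets = pv * (1 + diversified)
print(round(pv, 1), round(required_assets, 1))
```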
It also considers how a dynamic discount rate approach fits within the proposed future funding regulations. Finally, the report puts forward recommendations for the Institute and Faculty of Actuaries (IFoA), Scheme Actuaries, and The Pensions Regulator (TPR).
Consequences of schemes adopting a dynamic discount rate approach could include: materially different investment strategies, with investment in a wider pool of assets; less use of leveraged Liability-Driven Investment; fewer schemes targeting buy-out as their end-game strategy; and an increase in technical work for actuaries advising on the optimisation of asset and liability cash flows.