Virtual reality (VR) is increasingly used in learning and can be experienced with a head-mounted display as a 3D immersive version (immersive virtual reality [IVR]) or on a PC (or another computer) as a 2D desktop-based version (desktop virtual reality [DVR]). A research gap concerns the effect of IVR and DVR on learners’ skill retention. To address this gap, we designed an experiment in which learners were trained and tested on a procedural industrial assembly task. We found nonsignificant differences in the number of errors, time to completion, satisfaction, self-efficacy, and motivation. The results support the view that DVR and IVR are similarly useful for skill retention. These insights may help researchers and practitioners decide which form of VR to use.
This study aimed to provide a reference for evaluating the achievability of hepatitis B virus (HBV) elimination in a high-endemicity city with universal neonatal vaccination in place for over 30 years. Between September 2018 and October 2020, 2085 citizens from 1143 geographically random households in Hong Kong completed a questionnaire and underwent blood testing for HBV markers (anti-HBs, HBsAg, anti-HBc, HBeAg). We evaluated the epidemiology and examined factors associated with HBV exposure, vaccination, and chronic diseases. The proportion of households with HBsAg-positive index participants was 9.2% (95% CI 7.5%–10.9%). The age- and sex-adjusted HBsAg prevalence was 6.3% (95% CI 5.3%–7.4%); it exceeded 10% in those born in 1960–1970 and among non-local-born citizens, and was <1% in people born after the introduction of neonatal vaccination. Among 155 HBsAg-positive participants, 59% were aware of their infection status, 10% were on treatment, and 10/150 (6.7%) were HBeAg positive. More than 40% (872/2064) tested negative for both HBsAg and anti-HBs, attributable to the lack of immunity in older adults and the waning of vaccine-induced immunity. Hong Kong has remained at a high-intermediate HBV endemicity state. The moderate level of anti-HBs positivity and very low treatment coverage (10%) among HBsAg-positive participants pose challenges for achieving the HBV elimination target.
In this article, the performance parameters of an electric vehicle were investigated, and its operating point was defined using the core components (battery, inverter, and motor). The test vehicle, a 2023 Cadillac Lyriq provided by General Motors Inc., was driven on specified road segments, and the real-time data were retrieved using the integrated controller area network architecture. The neoVI Fire 2 tool was connected to the vehicle system to record the dynamic data, and Vehicle Spy software was used to convert the data into a readable format. Finally, the vector electric vehicle operating point was proposed, and the corresponding behavior was interpreted. This methodology could assist researchers in understanding the dynamic behavior of electric vehicle parameters to develop integrated techniques that augment performance in real time.
The uptake of electric vehicles (EVs) and renewable energy technologies is changing the magnitude, variability, and direction of power flows in electricity networks. To ensure a successful transition to a net zero energy system, it will be necessary for a wide range of stakeholders to understand the impacts of these changing flows on networks. However, there is a gap between those with the data and capabilities to understand electricity networks, such as network operators, and those working on adjacent parts of the energy transition jigsaw, such as electricity suppliers and EV charging infrastructure operators. This paper describes the electric vehicle network analysis tool (EVENT), developed to help make network analysis accessible to a wider range of stakeholders in the energy ecosystem who might not have the bandwidth to curate and integrate disparate datasets and carry out electricity network simulations. EVENT analyses the potential impacts of low-carbon technologies on congestion in electricity networks, helping to inform the design of products and services. To demonstrate EVENT’s potential, we use an extensive smart meter dataset provided by an energy supplier to assess the impacts of electricity smart tariffs on networks. Results suggest both network operators and energy suppliers will have to work much more closely together to ensure that the flexibility of customers to support the energy system can be maximized, while respecting safety and security constraints within networks. EVENT’s modular and open-source approach enables integration of new methods and data, future-proofing the tool for long-term impact.
This paper extends the standard double-exponential jump-diffusion (DEJD) model to allow successive jumps to have different effects on the asset price process. The double-exponentially distributed jump sizes are no longer assumed to share the same parameters; instead, these parameters may take a series of different values to reflect growing or diminishing effects of successive jumps. The mathematical analysis of the stock price requires the introduction of a number of distributions that extend the hypoexponential (HE) distribution. Under this generalized setting, the European option price is derived in closed form, which ensures its computational convenience. Through numerical examples, we examine the effects on the return distributions of growing and diminishing severity of the jumps expected in the near future, and investigate how option prices and the shapes of implied volatility smiles are influenced by the varying severity of jumps. These results demonstrate the benefits of the modeling flexibility provided by our extension.
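As a rough illustration of the modelling idea only (not the paper's closed-form pricing), the sketch below simulates a jump-diffusion in which the k-th jump on each path draws its double-exponential tail parameters from a jump-indexed list, so later jumps can be made more or less severe. All parameter values here are invented for the example.

```python
import numpy as np

def simulate_dejd_paths(s0=100.0, mu=0.05, sigma=0.2, lam=3.0, p_up=0.4,
                        etas_up=None, etas_down=None,
                        T=1.0, n_steps=252, n_paths=10000, seed=0):
    """Terminal prices under a double-exponential jump-diffusion where
    successive jumps may have different exponential-tail parameters."""
    rng = np.random.default_rng(seed)
    # The k-th jump on a path uses etas_*[k] (capped at the last entry),
    # modelling growing or diminishing jump severity.
    if etas_up is None:
        etas_up = [10.0, 8.0, 6.0]     # upward jumps grow in severity
    if etas_down is None:
        etas_down = [15.0, 12.0, 9.0]  # downward jumps grow in severity
    dt = T / n_steps
    log_s = np.full(n_paths, np.log(s0))
    jump_count = np.zeros(n_paths, dtype=int)
    for _ in range(n_steps):
        # Diffusion increment
        log_s += (mu - 0.5 * sigma**2) * dt \
            + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
        # Jump increment: at most one jump per step (fine for lam*dt << 1)
        for i in np.where(rng.random(n_paths) < lam * dt)[0]:
            k = jump_count[i]
            if rng.random() < p_up:
                log_s[i] += rng.exponential(1.0 / etas_up[min(k, len(etas_up) - 1)])
            else:
                log_s[i] -= rng.exponential(1.0 / etas_down[min(k, len(etas_down) - 1)])
            jump_count[i] += 1
    return np.exp(log_s)

prices = simulate_dejd_paths()
call_payoff = np.maximum(prices - 100.0, 0.0)
print(round(call_payoff.mean(), 2))  # Monte Carlo estimate of an undiscounted call payoff
```

Setting `etas_up` and `etas_down` to single-entry lists recovers the standard DEJD model as a special case.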
Open government and open data are often presented as the Asterix and Obelix of modern government: one cannot discuss the one without involving the other. Modern government, in this narrative, should open itself up, be more transparent, and allow the governed to have a say in their governance. The use of technologies, and especially the communication of governmental data, is then thought to be one of the crucial instruments helping governments achieve these goals. Much open government data research, hence, focuses on the publication of open government data, their reuse, and re-users. Recent research trends, by contrast, divert from this focus on data and emphasize the importance of studying open government data in practice, in interaction with practitioners, while simultaneously paying attention to their political character. This commentary looks more closely at the implications of emphasizing the practical and political dimensions of open government data. It argues that researchers should explicate how and in what way open government data policies present solutions to what kinds of problems. Such explications should be based on a detailed empirical analysis of how different actors do or do not do open data. The key question to be continuously asked and answered when studying and implementing open government data is how the solutions openness presents latch onto the problems they aim to solve.
The promised merits of data-driven innovation in general, and algorithmic systems in particular, hardly need enumeration. However, as decision-making tasks are increasingly delegated to algorithmic systems, questions about accountability arise. These pressing questions of algorithmic accountability, particularly with regard to data-driven innovation in the public sector, deserve ample scholarly attention. This paper therefore brings together perspectives from governance studies and critical algorithm studies to assess how algorithmic accountability succeeds or falls short in practice, and analyses the Dutch System Risk Indication (SyRI) as an empirical case. Dissecting a concrete case teases out to what degree archetypical accountability practices and processes function in relation to algorithmic decision-making processes, and which new questions concerning algorithmic accountability emerge therein. The case is approached through the analysis of “scavenged” material. We found that while these archetypical accountability processes and practices can be incredibly productive in dealing with algorithmic systems, they are simultaneously at risk. The current accountability configurations hinge predominantly on the ex ante sensitivity and responsiveness of the political fora. When these prove insufficient, mitigation in medias res or ex post is very difficult for other actants. In part, this is not a new phenomenon, but it is amplified in relation to algorithmic systems. Different fora ask different kinds of medium-specific questions of the actor, from different perspectives and with varying power relations. These algorithm-specific considerations relate to the decision-making around an algorithmic system, its functionality, and its deployment. Strengthening ex ante political accountability fora with respect to these algorithm-specific considerations could help mitigate this.
Under the assumption that sequences of graphs equipped with resistances, associated measures, walks and local times converge in a suitable Gromov-Hausdorff topology, we establish asymptotic bounds on the distribution of the $\varepsilon$-blanket times of the random walks in the sequence. The precise nature of these bounds ensures convergence of the $\varepsilon$-blanket times of the random walks if the $\varepsilon$-blanket time of the limiting diffusion is continuous at $\varepsilon$ with probability 1. This result enables us to prove annealed convergence in various examples of critical random graphs, including critical Galton-Watson trees and the Erdős-Rényi random graph in the critical window. We highlight that proving continuity of the $\varepsilon$-blanket time of the limiting diffusion relies on the scale invariance of a finite measure that gives rise to realizations of the limiting compact random metric space, and therefore we expect our results to hold for other examples of random graphs with a similar scale invariance property.
We introduce an approach for damage detection in gearboxes based on the analysis of sensor data with the multi-resolution dynamic mode decomposition (mrDMD). The application focus is the condition monitoring of wind turbine gearboxes under varying load conditions, in particular irregular and stochastic wind fluctuations. We analyze data stemming from a simulated vibration response of a simple nonlinear gearbox model in a healthy and damaged scenario and under different wind conditions. With mrDMD applied on time-delay snapshots of the sensor data, we can extract components in these vibration signals that highlight features related to damage and enable its identification. A comparison with Fourier analysis, time synchronous averaging, and empirical mode decomposition shows the advantages of the proposed mrDMD-based data analysis approach for damage detection.
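The pipeline sketched in this abstract, time-delay embedding of sensor data followed by a DMD-type decomposition, can be illustrated with plain (single-level) DMD on a synthetic signal; the multi-resolution recursion of mrDMD and the gearbox model itself are omitted, and all signals and parameters below are invented.

```python
import numpy as np

def hankel_embed(x, delay):
    """Stack `delay` time-shifted copies of signal x into a Hankel matrix."""
    n = len(x) - delay + 1
    return np.stack([x[i:i + n] for i in range(delay)])

def dmd_eigs(X, rank):
    """Exact DMD on snapshot matrix X: eigenvalues of the reduced linear
    operator advancing X[:, k] -> X[:, k+1]."""
    X1, X2 = X[:, :-1], X[:, 1:]
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    A_tilde = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)
    return np.linalg.eigvals(A_tilde)

# Synthetic "vibration" signal: a carrier tone plus a weak fault tone.
t = np.linspace(0, 1, 2000)
healthy = np.sin(2 * np.pi * 50 * t)
damaged = healthy + 0.1 * np.sin(2 * np.pi * 130 * t)

eigs = dmd_eigs(hankel_embed(damaged, delay=40), rank=4)
freqs = np.abs(np.angle(eigs)) / (2 * np.pi * (t[1] - t[0]))
print(np.sort(np.round(freqs, 1)))  # dominant mode frequencies in Hz
```

Even at rank 4, the weak second tone shows up as its own eigenvalue pair, which is the basic mechanism by which damage-related components can be separated from the healthy response.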
A detailed analysis of management and performance fees for asset managers and investment funds is undertaken. While fund fees are considered a cost of capital for investors, the structuring of such fee mechanisms in a fund can also influence a fund manager’s decisions and investment strategy, thereby also influencing the investment performance of investors’ funds. The study allows for an assessment of the effect of fee structures and the potential for asymmetric incentives to arise that may promote adverse risk-taking behaviours by the fund manager, to the detriment of the investor or retiree who places a portion of their retirement savings into a managed fund with such fee structures. As such, understanding the mechanism of fee charging, as well as pricing the fees correctly, is vital. An exploration of the application of actuarial distortion pricing methods for complete and incomplete market valuation is performed on a variety of path-dependent, option-like performance fee structures for various funds in the European and American markets. Furthermore, several scenario analyses and sensitivity studies are undertaken. The class of net asset value (NAV) models adopted is that of Lévy processes, and the pricing is performed via Monte Carlo techniques.
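A minimal illustration of the fee-valuation setting: Monte Carlo estimation of annual management and high-water-mark performance fees, with geometric Brownian motion standing in for general Lévy NAV dynamics and distortion pricing omitted. Every parameter value and the fee schedule itself are assumptions for the sketch.

```python
import numpy as np

def expected_fees(s0=100.0, mu=0.08, sigma=0.25, r=0.02,
                  mgmt_rate=0.02, perf_rate=0.20,
                  T=5, n_paths=20000, seed=1):
    """Monte Carlo present value of management + performance fees for a
    fund charging an annual high-water-mark performance fee."""
    rng = np.random.default_rng(seed)
    nav = np.full(n_paths, s0)
    hwm = np.full(n_paths, s0)   # high-water mark per path
    mgmt_pv = np.zeros(n_paths)
    perf_pv = np.zeros(n_paths)
    for year in range(1, T + 1):
        z = rng.standard_normal(n_paths)
        nav *= np.exp((mu - 0.5 * sigma**2) + sigma * z)  # one year of GBM
        disc = np.exp(-r * year)
        # Management fee: fixed fraction of NAV, charged annually.
        mgmt = mgmt_rate * nav
        # Performance fee: fraction of gains above the high-water mark.
        perf = perf_rate * np.maximum(nav - hwm, 0.0)
        nav -= mgmt + perf
        hwm = np.maximum(hwm, nav)
        mgmt_pv += disc * mgmt
        perf_pv += disc * perf
    return mgmt_pv.mean(), perf_pv.mean()

mgmt, perf = expected_fees()
print(f"PV management fees: {mgmt:.2f}, PV performance fees: {perf:.2f}")
```

The performance fee is option-like, so its value grows with NAV volatility, which is one concrete way the asymmetric risk-taking incentive discussed above can be made visible.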
In the context of mortality forecasting, “rotation” refers to the phenomenon that mortality decline accelerates at older ages but decelerates at younger ages. Since rotation is typically subtle, it is difficult to confirm and model in a statistical, data-driven manner. In this paper, we attempt to overcome this challenge by proposing an alternative modeling approach. The approach encompasses a new model structure, which includes a component devoted to measuring rotation. It also features a modeling technique known as ANCOVA, which allows us to statistically detect rotation and extrapolate the phenomenon into the future. Our proposed approach yields plausible mortality forecasts that are similar to those produced by Li et al. [Extending the Lee-Carter method to model the rotation of age patterns of mortality decline for long-term projections. Demography 50 (6), 2037–205], and may be considered more advantageous than the approach of Li et al. in the sense that it is able to generate not only static but also stochastic forecasts.
This systematic literature review aimed to provide an overview of the characteristics and methods of studies applying the disability-adjusted life years (DALY) concept to infectious diseases within European Union (EU)/European Economic Area (EEA)/European Free Trade Association (EFTA) countries and the United Kingdom. Electronic databases and grey literature were searched for articles reporting the assessment of DALYs and their components. We considered studies in which researchers performed DALY calculations using primary epidemiological data input sources. We screened 3053 studies, of which 2948 were excluded; 105 studies met our inclusion criteria. Of these studies, 22 were multi-country and 83 were single-country studies, of which 46 were from the Netherlands. Food- and water-borne diseases were the most frequently studied infectious diseases. Between 2015 and 2022, the number of burden of infectious disease studies published was 1.6 times higher than between 2000 and 2014. Almost all studies (97%) estimated DALYs using the incidence- and pathogen-based approach and without social weighting functions; however, there was less methodological consensus with regard to the disability weights and life tables applied. The number of burden of infectious disease studies undertaken across Europe has increased over time. The development and use of guidelines will promote the conduct of burden of infectious disease studies and facilitate comparability of their results.
Model order reduction (MOR) can provide low-dimensional numerical models for fast simulation. Unlike intrusive methods, nonintrusive methods are attractive because they can be applied even without access to full order models (FOMs). Since nonintrusive MOR methods strongly rely on snapshots of the FOMs, constructing good snapshot sets becomes crucial. In this work, we propose a novel active-learning-based approach for use in conjunction with nonintrusive MOR methods. It is based on two crucial novelties. First, our approach uses joint space sampling to prepare a data pool of the training data. The training data are selected from the data pool using a greedy strategy supported by an error estimator based on Gaussian process regression. Second, we introduce a case-independent validation strategy based on probably approximately correct learning. While the methods proposed here can be applied to different MOR methods, we test them here with artificial neural networks and operator inference.
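The greedy, error-estimator-driven selection of training data described above can be sketched as uncertainty sampling with a Gaussian-process surrogate. This is a one-dimensional toy version under invented parameters, not the authors' implementation, and it selects parameter points rather than full FOM snapshots.

```python
import numpy as np

def gp_posterior_std(X_train, X_pool, length=0.5, noise=1e-6):
    """Posterior predictive std of a zero-mean GP with RBF kernel,
    used as an error estimator for unlabelled candidate parameters."""
    def k(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / length**2)
    K = k(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = k(X_train, X_pool)
    v = np.linalg.solve(K, Ks)
    var = 1.0 - np.sum(Ks * v, axis=0)  # prior variance is 1
    return np.sqrt(np.maximum(var, 0.0))

def greedy_select(pool, n_select, n_init=2, seed=0):
    """Greedily pick parameters where the GP uncertainty is largest,
    mimicking snapshot selection for a non-intrusive ROM."""
    rng = np.random.default_rng(seed)
    chosen = list(rng.choice(len(pool), n_init, replace=False))
    while len(chosen) < n_select:
        rest = [i for i in range(len(pool)) if i not in chosen]
        std = gp_posterior_std(pool[chosen], pool[rest])
        chosen.append(rest[int(np.argmax(std))])
    return sorted(chosen)

pool = np.linspace(0.0, 1.0, 21)  # candidate parameter values (the data pool)
idx = greedy_select(pool, n_select=6)
print(pool[idx])  # selected training parameters, spread across the domain
```

Because the acquisition criterion is posterior uncertainty, the selected points naturally spread out over the parameter domain instead of clustering, which is the practical payoff of greedy selection over random sampling.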
The book graph $B_n ^{(k)}$ consists of $n$ copies of $K_{k+1}$ joined along a common $K_k$. In the prequel to this paper, we studied the diagonal Ramsey number $r(B_n ^{(k)}, B_n ^{(k)})$. Here we consider the natural off-diagonal variant $r(B_{cn} ^{(k)}, B_n^{(k)})$ for fixed $c \in (0,1]$. In this more general setting, we show that an interesting dichotomy emerges: for very small $c$, a simple $k$-partite construction dictates the Ramsey function and all nearly-extremal colourings are close to being $k$-partite, while, for $c$ bounded away from $0$, random colourings of an appropriate density are asymptotically optimal and all nearly-extremal colourings are quasirandom. Our investigations also open up a range of questions about what happens for intermediate values of $c$.
This paper develops estimation methods for the mean and covariance functions of functional data with additional covariate information. By combining local linear smoothing with a general weighting scheme, we explicitly characterize the covariate-incorporated mean and covariance functions for irregularly spaced and sparsely observed longitudinal data, as typically encountered in engineering or biomedical studies, as well as for densely measured functional data. Theoretically, we establish the uniform convergence rates of the estimators under the general weighting scheme. A Monte Carlo simulation is conducted to investigate the finite-sample performance of the proposed approach. Two applications, to children's growth data and to a white matter tract dataset obtained from the Alzheimer's Disease Neuroimaging Initiative study, are also provided.
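A local linear smoother with a pluggable weighting scheme, the basic building block of the approach described above, might look as follows on pooled irregular observations (toy data, Gaussian kernel, invented bandwidth).

```python
import numpy as np

def local_linear(t_obs, y_obs, t_grid, h=0.08, weights=None):
    """Local linear estimate of a mean function from pooled, irregularly
    observed data; `weights` implements a general weighting scheme
    (e.g. equal weight per observation vs. per subject)."""
    if weights is None:
        weights = np.ones_like(y_obs)
    est = np.empty(len(t_grid))
    for j, t0 in enumerate(t_grid):
        d = t_obs - t0
        k = weights * np.exp(-0.5 * (d / h)**2)  # Gaussian kernel weights
        # Weighted least squares fit of y ~ a + b*d; the estimate is a.
        S0, S1, S2 = k.sum(), (k * d).sum(), (k * d * d).sum()
        T0, T1 = (k * y_obs).sum(), (k * d * y_obs).sum()
        est[j] = (S2 * T0 - S1 * T1) / (S0 * S2 - S1**2)
    return est

rng = np.random.default_rng(3)
t_obs = rng.uniform(0, 1, 400)  # irregularly spaced observation times
y_obs = np.sin(2 * np.pi * t_obs) + 0.2 * rng.standard_normal(400)
grid = np.linspace(0.1, 0.9, 9)
print(np.round(local_linear(t_obs, y_obs, grid), 2))
```

Passing per-subject weights (e.g. `1/m_i` for a subject with `m_i` observations) switches between the observation-level and subject-level weighting schemes whose convergence rates the paper studies.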
For decades, proponents of the Internet have promised that it would one day provide a seamless way for everyone in the world to communicate with each other, without introducing new boundaries, gatekeepers, or power structures. What happened? This article explores the system-level characteristics of a key design feature of the Internet that helped it to achieve widespread adoption, as well as the system-level implications of certain patterns of use that have emerged over the years as a result of that feature. Such patterns include the system-level acceptance of particular authorities, mechanisms that promote and enforce the concentration of power, and network effects that implicitly penalize those who do not comply with decisions taken by privileged actors. We provide examples of these patterns and offer some key observations, toward the development of a general theory of why they emerged despite our best efforts, and we conclude with some suggestions on how we might mitigate the worst outcomes and avoid similar experiences in the future.
In this paper, we consider a mixed dividend strategy in a dual risk model. The mixed dividend strategy combines a threshold dividend with a Parisian implementation delay dividend under periodic observation. Given a series of discrete observation points, when the surplus level exceeds the predetermined bonus barrier at an observation point, the Parisian implementation delay dividend is carried out immediately, and the threshold dividend is paid continuously during the delay period. We study the Gerber-Shiu expected discounted penalty function and the expected discounted dividend payments before ruin in this dual risk model. Numerical illustrations are given to study the influence of relevant parameters on the ruin-related quantities and the selection of the optimal dividend barrier for a given initial surplus level.
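To make the setting concrete, the following toy simulation estimates expected discounted dividends in a dual risk model (expenses at a constant rate, exponential gains at Poisson times) under a simplified periodic-observation barrier strategy. The Parisian delay, the threshold refinement, and the Gerber-Shiu analysis of the paper are not implemented, and all parameters are invented.

```python
import numpy as np

def discounted_dividends(u0=10.0, c=1.0, lam=1.0, mean_gain=1.5,
                         barrier=15.0, obs_dt=1.0, delta=0.03,
                         T=20.0, n_paths=2000, seed=2):
    """Monte Carlo estimate of expected discounted dividends: at each
    observation time, any surplus above `barrier` is paid out."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_paths):
        u, t, paid = u0, 0.0, 0.0
        next_obs = obs_dt
        while t < T:
            w = rng.exponential(1.0 / lam)  # time to next gain (memoryless)
            # Ruin if expenses exhaust the surplus before the next event.
            if t + u / c < min(t + w, next_obs):
                break
            if next_obs < t + w:            # observation occurs first
                u -= c * (next_obs - t)
                t = next_obs
                if u > barrier:             # pay out the excess, discounted
                    paid += np.exp(-delta * t) * (u - barrier)
                    u = barrier
                next_obs += obs_dt
            else:                           # gain arrives first
                u += rng.exponential(mean_gain) - c * w
                t += w
        total += paid
    return total / n_paths

print(round(discounted_dividends(), 2))
```

Sweeping `barrier` in such a simulation is the Monte Carlo analogue of the paper's numerical search for the optimal dividend barrier at a given initial surplus.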
There is a close relationship between random graphs and percolation. In fact, percolation and random graphs have been viewed as “the same phenomenon expressed in different languages” (Albert and Barabási, ). Early ideas on percolation (although not under that name) in molecular chemistry can be found in the articles by Flory () and Stockmayer ().
Percolation can be defined more generally than as a process on , . In this chapter, we motivate the main ideas and theory of percolation on more general graphs by application to polymer gelation and amorphous computing.