Clustering is a method of allocating data points into groups, known as clusters, based on similarity. The notion of expressing similarity mathematically and then maximizing it (that is, minimizing dissimilarity) can be formulated as an optimization problem. Spectral clustering is one such approach, and it has been successfully applied to the visualization of clusters and the mapping of points into clusters in two and three dimensions. Higher-dimensional problems have remained largely untouched, owing to their complexity and, most importantly, to the lack of an understanding of what “similarity” means in higher dimensions. In this paper, we apply spectral clustering to long time-series EEG (electroencephalogram) data. We develop several models, based on different similarity functions and different approaches to spectral clustering itself. The results of the numerical experiments demonstrate that the models are accurate and can be used for time-series classification.
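For a reader unfamiliar with spectral clustering, the pipeline summarized above (similarity function, graph Laplacian, eigenvector-based partition) can be sketched in a few lines. The code below is an illustrative sketch, not the paper's actual models: it assumes a Gaussian similarity on raw samples of synthetic sinusoidal "EEG-like" series and splits them using the sign of the Fiedler vector.

```python
import numpy as np

def spectral_bipartition(X, sigma=1.0):
    """Split the rows of X (each row one time series) into two clusters
    using the sign pattern of the Fiedler vector of the graph Laplacian."""
    # Gaussian (RBF) similarity between every pair of series
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # Unnormalized graph Laplacian L = D - W
    L = np.diag(W.sum(axis=1)) - W
    # Eigenvector of the second-smallest eigenvalue (the Fiedler vector)
    _, vecs = np.linalg.eigh(L)
    return (vecs[:, 1] > 0).astype(int)

# Two toy groups of noisy sinusoids: low- versus high-frequency
t = np.linspace(0, 1, 200)
rng = np.random.default_rng(0)
slow = [np.sin(2 * np.pi * 2 * t) + 0.1 * rng.standard_normal(t.size) for _ in range(5)]
fast = [np.sin(2 * np.pi * 20 * t) + 0.1 * rng.standard_normal(t.size) for _ in range(5)]
labels = spectral_bipartition(np.vstack(slow + fast), sigma=3.0)
```

The choice of the similarity bandwidth `sigma` is exactly the kind of modelling decision the paper explores through its different similarity functions.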
We find solutions that describe the levelling of a thin fluid film, comprising a non-Newtonian power-law fluid, that coats a substrate and evolves under the influence of surface tension. We consider the evolution from periodic and localized initial conditions as separate cases. The particular (similarity) solutions in each of these two cases exhibit the generic property that the profiles are weakly singular (that is, higher-order derivatives do not exist) at points where the pressure gradient vanishes. Numerical simulations of the thin film equation, with either periodic or localized initial condition, are shown to approach the appropriate particular solution.
Many industrial design problems are characterized by the lack of an analytical expression defining the relationship between design variables and chosen quality metrics. Evaluating the quality of new designs is therefore restricted to running a predetermined process such as physical testing of prototypes. When these processes carry a high cost, choosing how to gather further data can be very challenging, whether the end goal is to accurately predict the quality of future designs or to find an optimal design. In the multi-fidelity setting, one or more approximations of a design’s performance are available at varying costs and accuracies. Surrogate modelling methods have long been applied to problems of this type, combining data from multiple sources into a model which guides further sampling. Many challenges still exist, however; foremost among them is choosing when and how to rely on available low-fidelity sources. This tutorial-style paper presents an introduction to the field of surrogate modelling for multi-fidelity expensive black-box problems, including classical approaches and open questions in the field. An illustrative example using Australian elevation data is provided to show the potential pitfalls of blindly trusting or ignoring low-fidelity sources, a question that has recently gained much interest in the community.
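The core multi-fidelity idea, correcting a cheap low-fidelity model with a handful of expensive high-fidelity samples, can be illustrated with a toy additive-correction surrogate. Both functions below are hypothetical stand-ins (not the paper's elevation example), and the quadratic discrepancy model is just one simple choice.

```python
import numpy as np

# Hypothetical stand-ins for an expensive simulation (high fidelity)
# and a cheap, biased approximation (low fidelity) of the same quantity.
def f_hi(x):
    return np.sin(8 * x) + x        # expensive "truth"

def f_lo(x):
    return 0.8 * np.sin(8 * x)      # cheap approximation

# Additive correction: model the discrepancy f_hi - f_lo from a few
# costly samples, then apply it to the cheap model everywhere.
x_hi = np.linspace(0, 1, 6)                            # few expensive runs
delta = np.polyfit(x_hi, f_hi(x_hi) - f_lo(x_hi), 2)   # quadratic discrepancy

def surrogate(x):
    return f_lo(x) + np.polyval(delta, x)

x_test = np.linspace(0, 1, 101)
err_corrected = np.max(np.abs(surrogate(x_test) - f_hi(x_test)))
err_lo_alone = np.max(np.abs(f_lo(x_test) - f_hi(x_test)))
```

Even this crude correction reduces the worst-case error relative to trusting the low-fidelity source alone, which is the motivating observation behind more sophisticated co-kriging-style surrogates.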
There are several factors that can cause the excessive accumulation of biofluid in human tissue, such as pregnancy, local trauma, allergic responses or the use of certain therapeutic medications. This study further investigates the shear-dependent peristaltic flow of a Phan–Thien–Tanner (PTT) fluid within a planar channel, incorporating the phenomenon of electro-osmosis; the research is driven by the potential biomedical applications of this knowledge. The PTT model captures the non-Newtonian features of a physiological fluid in a symmetric planar channel, and the study is significant in demonstrating that chyme in the small intestine can be modelled as a PTT fluid. The governing equations for the flow of the ionic liquid, thermal radiation and heat transfer, along with the Poisson–Boltzmann equation within the electrical double layer, are discussed. The long-wavelength ($\delta \ll 1$) and low-Reynolds-number ($Re \to 0$) approximations are used to simplify the simultaneous equations. The solutions analyse the Debye electronic length parameter, Helmholtz–Smoluchowski velocity, Prandtl number and thermal radiation. Additionally, streamlines are used to examine the phenomenon of entrapment. Graphs are used to explain the influence of different parameters on the flow and temperature. The findings of the current model have practical implications for the design of microfluidic devices for different particle transport phenomena at the micro level. Additionally, the noteworthy results highlight the advantages of electro-osmosis in controlling both flow and heat transfer. Ultimately, our objective is to use these findings as a guide for the advancement of lab-on-a-chip systems.
Online algorithms are a rich area of research with widespread applications in scheduling, combinatorial optimization, and resource allocation problems. This lucid textbook provides an accessible but rigorous introduction to online algorithms for graduate and senior undergraduate students. In-depth coverage of most of the important topics is presented, with special emphasis on elegant analysis. The book starts with classical online paradigms like ski-rental, paging, list-accessing and bin packing, where the performance of online algorithms is studied under worst-case input, and moves on to newer paradigms like 'beyond worst case', where online algorithms are augmented with predictions from machine learning algorithms. The book goes on to cover applied problems such as routing in communication networks, server provisioning in cloud systems, communication with energy harvested from renewable sources, and submodular partitioning. Finally, a wide range of solved examples and practice exercises is included, allowing hands-on exposure to the concepts.
Singularly perturbed ordinary differential equations often exhibit Stokes’ phenomenon, which describes the appearance and disappearance of oscillating exponentially small terms across curves in the complex plane known as Stokes lines. These curves originate at singular points in the leading-order solution to the differential equation. In many important problems, it is impossible to obtain a closed-form expression for these leading-order solutions, and it is therefore challenging to locate these singular points. We present evidence that the analytic leading-order solution of a linear differential equation can be replaced with a numerical rational approximation using the adaptive Antoulas–Anderson (AAA) method. Despite such an approximation having completely different singularity types and locations, we show that the subsequent exponential asymptotic analysis accurately predicts the exponentially small behaviour present in the solution. For sufficiently small values of the asymptotic parameter, this approach breaks down; however, the range of validity may be extended by increasing the number of poles in the rational approximation. We present a related nonlinear problem and discuss the challenges that arise due to nonlinear effects. Overall, our approach allows for the study of exponentially small asymptotic effects without requiring an exact analytic form for the leading-order solution; this permits exponential asymptotic methods to be used in a much wider range of applications.
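The key idea above, that a rational approximation's poles can stand in for the true singularities of a leading-order solution, can be illustrated in miniature. The AAA method itself is implemented in Chebfun and in recent SciPy releases; as a simpler self-contained stand-in, the sketch below uses a Padé approximant of $\tan(x)$ and shows that a zero of the denominator lands close to the true singularity at $\pi/2$. The function and degrees chosen here are illustrative only.

```python
import numpy as np
from scipy.interpolate import pade

# Taylor coefficients of tan(x) about x = 0: x + x^3/3 + 2x^5/15 + ...
taylor = [0.0, 1.0, 0.0, 1.0 / 3.0, 0.0, 2.0 / 15.0]

# [3/2] Pade approximant: numerator p and denominator q as poly1d objects
p, q = pade(taylor, 2)

# Zeros of the denominator approximate the poles of tan(x); the pole
# nearest the origin should sit close to pi/2 ~ 1.5708
poles = q.r
nearest = poles[np.argmin(np.abs(poles - np.pi / 2))]
```

The approximate pole differs from $\pi/2$ by about one percent; as with AAA in the paper, the singularity type of the approximation (a simple pole) need not match the true singularity for the location estimate to be useful.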
We consider planar flow involving two viscous fluids in a porous medium. One fluid is injected through a line source at the origin and moves radially outwards, pushing the second, ambient fluid outwards. There is an interface between the two fluids and if the inner injected fluid is of lower viscosity, the interface is unstable to small disturbances and radially directed unstable Saffman–Taylor fingers are produced. A linearized theory is presented and is compared with nonlinear results obtained using a numerical spectral method. An additional theory is also discussed, in which the sharp interface is replaced with a narrow diffuse interfacial region. We show that the nonlinear results are in close agreement with the linearized theory for small-amplitude disturbances at early times, but that large-amplitude fingers develop at later times and can even detach completely from the initial injection region.
Mathematical modelling of microwaves travelling through bauxite ore provides a way to compute moisture content in the free space transmission method given data on signal attenuation, phase shift and variable bauxite depth. We extend a recently developed four-layer model that uses coupled ordinary differential wave equations for the electric field together with continuity boundary conditions at interfaces between ore, air and antenna to find a solution that incorporates multiple internal reflections in ore and air. The model provides good fits to data, depending on ore permittivity and conductivity.
Our extensions are to use effective medium models to obtain electromagnetic properties of the ore mixture from moisture content and to incorporate the damping effects of scattering from the ore surface. Our model leads to a formula for the received signal showing how the signal strength $S$ and phase shift depend on the moisture content of the bauxite ore, through the effects of moisture on permittivity and conductivity. We show that $S$ may be noninvertible, indicating that attenuation data alone cannot be used to infer moisture content. Combining with phase data typically resolves the noninvertibility. Reducing the operating frequency dramatically improves the usefulness of signal strength data for inferring moisture content.
Almost by definition of risk, rare events play a crucial role. We tackle this problem by presenting some basic tools from extreme value theory (EVT). From a statistical point of view, the workhorses are the block maxima method (BMM) and the peaks over threshold method (POTM). Besides giving the mathematical formulation, we exemplify both approaches via simulated examples. Once these tools are in place, we can provide estimators of the relevant risk measures such as high-exceedance probabilities, quantiles and return periods. In a crucial part of the book, we then estimate these quantities for sea-level data at Hoek van Holland near Rotterdam. We obtain estimates, including confidence intervals, for the dike height needed to withstand a 1-in-10 000-years storm event. Further applications concern financial data and data from the L’Aquila earthquake. For the latter, we present dynamic models for earthquake aftershocks. After an excursion to the world of records in athletics, we present the signature application of EVT through the story of the sinking of the MV Derbyshire. We show how an application of EVT techniques has saved many lives at sea.
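The block maxima method described above can be sketched concisely: take one maximum per block (here, per synthetic "year"), fit a generalized extreme value (GEV) distribution, and read off a return level. The data below are simulated, not the Hoek van Holland series, and note that SciPy's `genextreme` uses the shape convention $c = -\xi$.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)

# Synthetic "daily" observations: 100 years x 365 days of exponential noise
daily = rng.exponential(scale=1.0, size=(100, 365))

# Block maxima method: one maximum per year
annual_max = daily.max(axis=1)

# Fit the GEV distribution to the annual maxima (maximum likelihood)
c, loc, scale = genextreme.fit(annual_max)

# 100-year return level: the level exceeded on average once per 100 years
return_level = genextreme.ppf(1 - 1 / 100, c, loc=loc, scale=scale)
```

For exponential daily data the annual maxima are approximately Gumbel (shape near zero), so the fitted shape parameter should be small; in a real dike-height study the confidence interval around such a return level matters as much as the point estimate.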
In this chapter we end our more strenuous hike and start to enjoy a technically more relaxed stroll through the landscape of risk. An important feature throughout the book is that, for all the data examples given, we first start with a section “About the data”. In this way, you gain experience in finding and preparing the relevant data before starting a statistical analysis. In order to discuss the consequences of climate change, we analyze the Hadley Centre Central England Temperature (HadCET) dataset. For the implications of a rise in temperature for the loss in volume of alpine glaciers, we apply the techniques learned to the case of the Lower Arolla Glacier in Switzerland. For an application to the realm of agricultural production, we look at the consequences of climate change for wine production. For this, we analyze data from a specific wine producer in France. We end this stroll with an interesting and perhaps somewhat surprising story of the astronomer Johannes Kepler. The story relates to his second marriage and the measurement of the volume of wine barrels.
We have reached the end of our stroll. We find ourselves in the company of Alexandre Dumas who, in 1850, wrote “The Black Tulip”. In it, he combines the story of the tulip mania in the Netherlands with the tragic story of the brothers de Witt. In our final example of “About the data”, we reconstruct the historic trading data of tulip bulbs, which turns out to be a detective story in its own right. Prices for tulip bulbs crashed on February 3, 1637. We also include the story of the growing of the first black tulip in 1986. Johan de Witt was tragically lynched by a politically motivated mob on August 20, 1672. With him, we meet a politician who, through his mathematical training, was able to solve an important problem from the realm of life insurance risk, the pricing of annuities. His publication “Waerdye” is our final example on risk communication. We leave the closing lines of our book to Shakespeare’s Hamlet, who spoke the following words to Horatio: “There are more things in heaven and earth, Horatio, than are dreamt of in your philosophy.” We hope that we were able to convince you that these words very much apply to the realm of risk.
When it comes to natural disasters, earthquakes and tsunamis all too often top the list of worst calamities. Using several examples we will try to improve our understanding of how they occur. In later chapters, we discuss whether science indeed has techniques that can lead to statistical modeling. The examples discussed include the 2004 Boxing Day tsunami, killing more than 220 000 people, the 2011 Tōhoku earthquake and tsunami, which included the major nuclear disaster in Fukushima, and the volcanic explosion at the kingdom of Tonga on January 15, 2022. From each of these events we discuss specifics concerning risk, both in understanding as well as communication. We start the chapter with a brief, non-technical discussion of (Daniel) Bernoulli’s principle in incompressible fluids. This allows us to learn how tsunamis are formed and propagate across oceans causing catastrophic inundations to lower-lying coastal areas, often very far away. Especially for the Tōhoku and Fukushima case, we discuss the crucial difference between an "if" approach to risk management versus a "what if" one. The Tonga explosion highlights the importance of modeling such extremal events, taking the global geometric shape of our planet into account.
This rather long chapter constitutes part of the hike in our walk/hike/stroll set-up. We introduce the reader to the basics of stochastics (representing both probability and statistics) necessary for the more technical discussions on risk later. The path followed starts from the probability space (a theoretical concept we quickly leave aside); we then move to the notion of a random variable and its distribution function, including the most important discrete as well as continuous examples. Historical examples as well as pedagogical ones are always included in order to support the understanding of the new concepts introduced. These examples often show that there is more to randomness than meets the eye. For the applications discussed later, we will measure statistical uncertainty through the concept of confidence intervals. These can be based either on some asymptotic theory involving the famous bell curve, the normal distribution, or on some form of resampling known under the name of bootstrapping. Further, we add some tools that are very important for measuring and communicating risk; these include the concepts of return periods and quantile functions.
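The bootstrap confidence intervals mentioned above admit a very short sketch: resample the data with replacement many times, recompute the statistic, and take the empirical quantiles of the resampled values. This percentile bootstrap on synthetic data is one simple variant, not necessarily the one used in the book.

```python
import numpy as np

def bootstrap_ci(data, stat=np.mean, n_boot=5000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for a statistic."""
    rng = np.random.default_rng(seed)
    # Recompute the statistic on n_boot resamples drawn with replacement
    reps = np.array([stat(rng.choice(data, size=data.size, replace=True))
                     for _ in range(n_boot)])
    lo, hi = np.quantile(reps, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# Synthetic sample: 200 observations from a normal with mean 10, sd 2
rng = np.random.default_rng(1)
sample = rng.normal(10.0, 2.0, size=200)
lo, hi = bootstrap_ci(sample)
```

Unlike intervals built on the normal approximation, this construction makes no distributional assumption about the statistic, which is why it reappears throughout the book's data analyses.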