We investigate the effect of high wind speeds on the breakup mechanisms that govern the formation of a spray from nozzles that form liquid sheets, which subsequently break up. The fragmentation mechanism of liquid sheets from spray nozzles has recently been described in detail under quiescent conditions. At high wind speeds, measurements of the droplet size distribution reveal two characteristic drop sizes rather than one, suggesting the existence of two distinct breakup mechanisms. High-speed images of the spray are used to identify these two mechanisms. We show that the smaller droplets result from the breakup of ‘bags’ formed in the spray sheet by the wind, while the larger droplets result from the breakup of the remaining perforated sheet. Based on the two mechanisms, a probability density function is constructed and fitted to the measured droplet size distributions. We show that the spray sheet destabilises owing to the Rayleigh–Taylor instability induced by the airflow, and that the experimentally observed breakup length and the size of the holes blown in the sheet are predicted by the fastest-growing wavenumber. From this, a theoretical prediction for the droplet size from bag breakup and remaining-sheet breakup is derived.
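The idea of resolving two characteristic drop sizes from a measured size distribution can be sketched numerically. The snippet below is purely illustrative, not the authors' construction: it generates synthetic droplet diameters from two hypothetical breakup modes (small ‘bag’ droplets and larger sheet fragments, with made-up lognormal parameters) and recovers the two modes by fitting a two-component Gaussian mixture to the log-diameters with a short EM loop.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative synthetic data (parameters are invented, not from the paper):
# two droplet populations with distinct characteristic diameters (in microns).
small = rng.lognormal(mean=np.log(20), sigma=0.3, size=4000)    # 'bag' droplets
large = rng.lognormal(mean=np.log(120), sigma=0.25, size=2000)  # sheet fragments
logd = np.log(np.concatenate([small, large]))

# EM for a 2-component 1-D Gaussian mixture on log-diameters
w = np.array([0.5, 0.5])                 # mixture weights
mu = np.array([logd.min(), logd.max()])  # well-separated initial means
var = np.array([1.0, 1.0])
for _ in range(200):
    # E-step: responsibility of each component for each observation
    pdf = np.exp(-0.5 * (logd[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    r = w * pdf
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means and variances
    n_k = r.sum(axis=0)
    w = n_k / len(logd)
    mu = (r * logd[:, None]).sum(axis=0) / n_k
    var = (r * (logd[:, None] - mu) ** 2).sum(axis=0) / n_k

# The fitted means, back-transformed, give the two characteristic diameters
print(np.sort(np.exp(mu)))
```

With well-separated modes, the fit recovers characteristic diameters close to the two generating values; a real analysis would of course use the mechanism-specific density derived in the paper rather than a generic mixture.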
Our primary result concerns the positivity of specific kernels constructed using the q-ultraspherical polynomials. In other words, it concerns a two-parameter family of bivariate, compactly supported distributions. Moreover, this family has the property that all its conditional moments are polynomials in the conditioning random variable. The significance of this result is evident for researchers working on distribution theory, orthogonal polynomials, q-series theory, and the so-called quantum polynomials, so it may interest only a limited audience. That is why we put our results into a broader context: we recall the theory of Hilbert–Schmidt operators and the idea of Lancaster expansions (LEs) of bivariate distributions that are absolutely continuous with respect to the product of their marginal distributions. Applications of LEs can be found in mathematical statistics and in the construction of Markov processes with polynomial conditional moments (the best known of these processes is the Wiener process).
The neo-Kantian transcendentalist reading of the epistemic status of logical axioms in Frege holds that he is committed to the neo-Kantian idea that we are epistemically justified in accepting logical axioms because accepting them is necessary for achieving epistemically crucial goals. However, I show that Frege stops short of full commitment to neo-Kantian transcendentalism because he struggles to accept the idea that such a teleological reason can constitute an epistemic warrant. This interpretation brings out some crucial aspects of his philosophy of logic, such as his understanding of the relationship between the simplicity and the sufficiency of logical systems.
When implementing Markov chain Monte Carlo (MCMC) algorithms, perturbation caused by numerical errors is sometimes inevitable. This paper studies how the perturbation of MCMC affects the convergence speed and approximation accuracy. Our results show that when the original Markov chain converges to stationarity fast enough and the perturbed transition kernel is a good approximation to the original transition kernel, the corresponding perturbed sampler has fast convergence speed and high approximation accuracy as well. Our convergence analysis is conducted under either the Wasserstein metric or the $\chi^2$ metric, both of which are widely used in the literature. The results can be extended to obtain non-asymptotic error bounds for MCMC estimators. We demonstrate how to apply our convergence and approximation results to the analysis of specific sampling algorithms, including random walk Metropolis, the Metropolis-adjusted Langevin algorithm with perturbed target densities, and parallel tempering Monte Carlo with perturbed densities. Finally, we present some simple numerical examples to verify our theoretical claims.
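The setting can be illustrated with a minimal sketch, assuming a toy example not taken from the paper: a random walk Metropolis chain targeting a standard normal, and a second chain whose target density carries a small numerical perturbation. The density names and the size of the perturbation `eps` are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def rwm(log_density, n_steps=50_000, step=1.0, x0=0.0):
    """Random walk Metropolis with Gaussian proposals."""
    x = x0
    samples = np.empty(n_steps)
    for i in range(n_steps):
        prop = x + step * rng.standard_normal()
        # Accept with probability min(1, pi(prop) / pi(x))
        if np.log(rng.random()) < log_density(prop) - log_density(x):
            x = prop
        samples[i] = x
    return samples

log_pi = lambda x: -0.5 * x**2                   # exact target: N(0, 1)
eps = 1e-3
log_pi_pert = lambda x: -0.5 * (1 + eps) * x**2  # slightly perturbed target

exact = rwm(log_pi)
pert = rwm(log_pi_pert)

# When the perturbation is small, the two chains agree closely in their
# second moment: the perturbed target is N(0, 1/(1 + eps)), variance ~0.999.
print(exact.var(), pert.var())
```

This mirrors the qualitative message of the paper: a small perturbation of the kernel yields samples whose moments are close to those of the unperturbed chain, though the quantitative bounds require the convergence assumptions stated above.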
The impact of a chemical reaction, $A+B \rightarrow C$, on the stability of a miscible radial displacement in a porous medium is established. Our study involves a comprehensive analysis employing both linear stability analysis and nonlinear simulations. Through linear stability analysis, the onset of instability for monotonic as well as non-monotonic viscosity profiles corresponding to the same end-point viscosity is discussed and compared. We divide the $(R_b, R_c)$ phase plane into stable and unstable regions for a wide range of Damköhler numbers ($Da$) and Péclet numbers ($Pe$). Here, $R_b=\ln(\mu_B/\mu_A)$, $R_c=\ln(\mu_C/\mu_A)$ and $\mu_i$ is the viscosity of fluid $i \in \{A, B, C\}$. The stable zone in the $(R_b, R_c)$ phase plane contracts with increasing $Da$ and $Pe$ but never vanishes; it persists even as $Da \rightarrow \infty$. Interestingly, we obtain a $Da$-independent stable region in the neighbourhood of $R_c=R_b$, where no transition in stability occurs despite changes in the reaction rate. The study allows us to understand the transition of stability for varying $Da$ and $Pe$ and for different reactions classified using $R_b$ and $R_c$.
There is limited information on rare spinocerebellar ataxia (SCA) variants, particularly in the Canadian population. This study aimed to describe the demographic and clinical features of uncommon SCA subtypes in Canada and compare them with international data.
Methods:
We conducted a case series and literature review of adult patients with rare SCA subtypes, including SCA5, SCA7, SCA12, SCA14, SCA15, SCA28, SCA34, SCA35 and SCA36. Data were collected from medical centers in Ontario, Alberta and Quebec between January 2000 and February 2021.
Results:
We analyzed 25 patients with rare SCA subtypes, with onset ages ranging from birth to 67 years. Infantile and juvenile-onset cases were observed in SCA5, SCA7, SCA14 and SCA34. Most patients presented with gait ataxia, with no significant differences across groups. Additional common features included saccadic abnormalities (22 of 25), dysarthria (19 of 25) and nystagmus (12 of 22, except in SCA7). Less common findings included dystonia (8 of 25), cognitive impairment (7 of 25), tremor (9 of 25) and parkinsonism (3 of 25).
Conclusion:
Our study highlights the heterogeneity of rare SCA subtypes in Canada. Ongoing longitudinal analysis will improve the understanding, management and screening of these disorders.
Though initially described in 1971 by Van Praagh, transposition of the great arteries with posterior aorta has rarely been reported in the last two decades. Since the arrangement of the great arteries resembles normally related great arteries, careful echocardiographic evaluation is necessary in patients with clinical features of transposition of the great arteries. In the majority of cases with this anatomy, the arterial switch operation can be performed without the need for the Lecompte manoeuvre.
The analysis of insurance and annuity products issued on multiple lives requires statistical models that account for lifetime dependence. This paper presents a Dirichlet process mixture-based approach that allows us to model dependent lifetimes within a group, such as married couples, accounting for individual as well as group-specific covariates. The model is analyzed in a fully Bayesian setting and illustrated by jointly modeling the lifetimes of male–female couples in a portfolio of joint and last survivor annuities of a Canadian life insurer. The inferential approach accounts for right censoring and left truncation, which are common features of data in survival analysis. The model shows improved in-sample and out-of-sample performance compared with traditional approaches assuming independent lifetimes, and offers additional insights into the determinants of the dependence between lifetimes and their impact on joint and last survivor annuity prices.
Fujino gave a proof of the semi-ampleness of the moduli part in the canonical bundle formula in the case when the general fibers are K3 surfaces or abelian varieties. We prove a similar statement when the general fibers are primitive symplectic varieties, answering a question Fujino raised in the same article. Moreover, using the structure theory of varieties with trivial first Chern class, we reduce the question of semi-ampleness for families of K-trivial varieties to the case in which the general fibers satisfy a slightly weaker Calabi–Yau condition.
Entrustable professional activities (EPAs) have gained traction in the medical education field as a means of assessing competencies. Essentially, an EPA is a profession-specific task that a trainee is entrusted to conduct unsupervised, once deemed competent by their supervisor through prior evaluations and discussions. The integration of EPAs into postgraduate assessment strategies enhances the delivery of capability-based curricula. It strategically bridges the theoretical–practical divide and addresses existing issues associated with workplace-based assessments (WPBAs). This article aims to (a) provide an overview of EPAs, (b) review the application of EPAs in postgraduate psychiatry so far, exploring their conceptual framework, implementation, qualities and potential benefits and concerns, and (c) propose a theoretical framework for their integration into the UK psychiatry curriculum.
Classic serotonergic psychedelics are experiencing a clinical revival, which has also revived ethical debates about psychedelic-assisted therapy. A particular issue here is how to prepare and protect patients from the vulnerability that the psychedelic state creates. This article first examines how this vulnerability manifests itself, revealing that it results from an impairment of autonomy: psychedelics diminish decision-making capacity, reduce controllability, and limit resistance to external influences. It then analyzes the strengths and weaknesses of five safety measures proposed in the literature, what aspect of the patient’s vulnerability they seek to reduce, and how they can be optimized. The analysis shows that while preparatory sessions, advance directives, and specific training and oversight are useful, starting with a lower dosage and no therapy is less so. Finally, the article presents a safety measure that has been overlooked in the literature but could be highly effective and feasible: bringing a close person to the psychedelic session.
There is a need for new imaginaries of care and social health for people living with dementia at home. Day programmes are one solution for care in the community that requires further theorisation to ensure an empirical base that is useful for guiding policy. In this article we contribute to the theorising of day programmes by using an ethnographic case study of one woman living with dementia at home using a day programme. Data were collected through observations, interviews and artefact analysis. Peg, whose case story is central in this article, was observed over a period of nine months for a total of 61 hours at the day programme, as well as 16 hours of observation at her home and during two community outings. We use a material semiotic approach to thinking about the day programme as a health ‘technology in practice’ to challenge the taken-for-granted ideas of day programmes as neutral, stable, bounded spaces. The case story of Peg is illustrative of how a day programme and its scripts come into relation with an arrangement of family care and life at home with dementia. At times the configuration of this arrangement works to provide a sort of stabilising distribution of care and space to allow Peg and her family to go on in the day-to-day life with dementia. At other times the arrangement creates limits to the care made possible. We argue that how we conceptualise and study day programmes and their relations to home and the broader care infrastructure matters to the possibilities of care they can enact.
The early months of the COVID pandemic prompted a flood of research aimed at illuminating all facets of the disease and its spread. Scholars sought to create and assess practical interventions that ranged from vaccines to government lockdown policies to workplace practices. To date, there are nearly a half-million publications on the pandemic, and more are in process.
We analyze the process M(t) representing the maximum of the one-dimensional telegraph process X(t) with exponentially distributed upward random times and generally distributed downward random times. The evolution of M(t) is governed by an alternating renewal of two phases: a rising phase R and a constant phase C. During a rising phase, X(t) moves upward, whereas, during a constant phase, it moves upward and downward, continuing to move until it attains the maximal level previously reached. Under some choices of the distribution of the downward times, we are able to determine the distribution of C, which allows us to obtain some bounds for the survival function of M(t). In the particular case of exponential downward random times, we derive an explicit expression for the survival function of M(t). Finally, the moments of the first passage time $\Theta_w$ of the process X(t) through a fixed boundary $w>0$ are analyzed.
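The alternating-phase dynamics described above can be made concrete with a small Monte Carlo sketch. This is not the authors' analysis, only an illustration of the process in the particular case of exponential downward random times: the trajectory moves with velocity $+c$ during exponential upward phases and $-c$ during exponential downward phases, and we track the running maximum. The rates, speed and threshold below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_max(t_end, lam_up=1.0, lam_down=1.0, c=1.0):
    """Running maximum M(t_end) of a telegraph path X(t) started upward,
    with exponential upward and downward phase durations."""
    t, x, m = 0.0, 0.0, 0.0
    up = True  # start in an upward phase
    while t < t_end:
        dur = rng.exponential(1.0 / (lam_up if up else lam_down))
        dur = min(dur, t_end - t)   # truncate the last phase at t_end
        x += (c if up else -c) * dur
        t += dur
        m = max(m, x)
        up = not up
    return m

# Monte Carlo estimate of the survival function P(M(t) > w)
t, w, n = 5.0, 2.0, 20_000
surv = np.mean([simulate_max(t) > w for _ in range(n)])
print(surv)
```

Such a simulation provides a sanity check on the explicit survival function derived for the exponential case, and on the bounds obtained for more general downward-time distributions.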
We study original position arguments in the context of social choice under ignorance. First, we present a general formal framework for such arguments. Next, we provide an axiomatic characterization of social choice rules that can be supported by original position arguments. We illustrate this characterization in terms of various well-known social choice rules, some of which do and some of which do not satisfy the axioms in question. Depending on the perspective one takes, our results can be used to argue against certain rules, against Rawlsian theories of procedural fairness, or in support of richer, multidimensional models of individual choice.