The Ending the HIV Epidemic initiative aims to decrease new HIV infections and promote test-and-treat strategies. Our aims were to establish a baseline of HIV outcomes among newly diagnosed people with HIV (PWH) in Washington, DC (DC), a ‘hotspot’ for the HIV epidemic. We also examined sociodemographic and clinical factors associated with retention in care (RIC), antiretroviral therapy (ART) initiation and viral suppression (VS) among newly diagnosed PWH in the DC Cohort from 2011–2016. Among 455 newly diagnosed participants, 92% were RIC at 12 months; ART was initiated in 65% by 3 months and 91% by 12 months; VS was achieved in at least 17% by 3 months and 82% by 12 months; and 55% of those with VS at 12 months sustained VS for an additional 12 months. AIDS diagnosis was associated with RIC (aOR 2.99; 1.13–2.28), ART initiation by 3 months (aOR 2.58; 1.61–4.12) and VS by 12 months (aOR 4.87; 1.69–14.03). This analysis contributes to our understanding of the HIV treatment dynamics of persons with recently diagnosed HIV infection in a city with a severe HIV epidemic.
In this study, we consider a class of multiple-drawing opposite-reinforcing urns with time-dependent replacement rules. The class has the symmetric property of a Friedman-type urn. We divide the class into a small-increment regime and a large-increment regime. For small-increment schemes, we prove almost-sure convergence and a central limit theorem for the proportion of white balls by stochastic approximation. For large-increment schemes, by assuming the affinity condition, we show almost-sure convergence of the proportion of white balls by martingale theory and present a way to identify the limit distribution of the proportion of white balls.
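As a toy illustration of this class of schemes, the following sketch simulates a simple opposite-reinforcing multiple-drawing urn (for each white ball drawn a black ball is added, and vice versa) and tracks the proportion of white balls. The specific replacement rule and the with-replacement drawing are illustrative assumptions, not the paper's general time-dependent scheme.

```python
import random

def simulate_urn(steps=20000, m=2, seed=0):
    """Simulate an illustrative opposite-reinforcing multiple-drawing urn.

    At each step, m balls are drawn with replacement; for every white
    ball drawn a black ball is added, and for every black ball a white
    one (a Friedman-type symmetric rule). Returns the final proportion
    of white balls.
    """
    rng = random.Random(seed)
    white, black = 1, 1
    for _ in range(steps):
        p = white / (white + black)
        k = sum(1 for _ in range(m) if rng.random() < p)  # white balls drawn
        white += m - k   # opposite reinforcement: add white for each black drawn
        black += k       # ... and black for each white drawn
    return white / (white + black)
```

Because the reinforcement opposes the drawn colour, the proportion is pulled toward the symmetric fixed point 1/2, consistent with the almost-sure convergence the abstract describes for small-increment schemes.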
The introduction of pneumococcal conjugate vaccines (PCV) into the childhood vaccination programme has reduced invasive pneumococcal disease (IPD). Although anticipated from data elsewhere, surveillance in Ireland has confirmed reductions in IPD amongst those ⩾65 years of age due to a decline of PCV serotypes in this age group. Currently, direct protection against IPD in the elderly is focused on immunisation with the 23-valent pneumococcal polysaccharide vaccine (PPV23). However, immunity may not be as effective as with PCV and, furthermore, PPV23 uptake is poor in Ireland. Hence, consideration should be given to providing a PCV to this age group.
Data assimilation of flow measurements is an essential tool for extracting information in fluid dynamics problems. Recent works have shown that physics-informed neural networks (PINNs) enable the reconstruction of unsteady fluid flows, governed by the Navier–Stokes equations, if the network is given enough flow measurements that are appropriately distributed in time and space. In many practical applications, however, experimental measurements involve only time-averaged quantities or their higher-order statistics, which are governed by the under-determined Reynolds-averaged Navier–Stokes (RANS) equations. In this study, we perform PINN-based reconstruction of time-averaged quantities of an unsteady flow from sparse velocity data. The applied technique leverages the time-averaged velocity data to infer unknown closure quantities (the curl of the unsteady RANS forcing), as well as to interpolate the fields from sparse measurements. Furthermore, the method's capabilities are extended to the assimilation of Reynolds stresses, where PINNs successfully interpolate the data to complete the velocity and stress fields and gain insight into the pressure field of the investigated flow.
David Gibson’s (2008) examination of research on conversational interaction highlighted methodological and theoretical gaps in current understanding – particularly around the localized construction of interaction and the reproduction of social structures. This paper extends extant formal models used by group process researchers to explain how exogenous status structures shape local interaction by incorporating insights from qualitative work examining the local production of conversational interaction. Relational events serve as a bridge between conversation analytic understandings of the deep structure of conversation and expectation states formal models of permeation. We propose a theoretical integration of the status organizing process (permeation) and local turn-taking rules (deep structure) as a more complete model of conversational behavior in task groups. We test a formalized construction of this preliminary theory by examining turn-taking using data from 55 task groups whose members vary in gender, authority, and legitimacy of that authority. This integrated model offers substantial improvements in prediction accuracy over using status information alone. We then propose ways to expand the integrated theoretical framework to advance current understandings of action and events in conversation. Finally, we offer suggestions for insights from group processes theories that could be incorporated into network models of interaction outside of this theoretical framework.
The COVID-19 pandemic has increased the popularity of online shopping, and companies are looking for ways to give consumers experiences that online shopping cannot provide, such as touching products and imagining them in use. In this context, the importance of haptic imagery of products showcased online is increasing. This study replicated and extended Peck et al.'s (2013, Journal of Consumer Psychology, 23, 189–196) finding that physical control and psychological ownership mediate the influence of haptic imagery on purchase intention. This study showed that imagining touching a product, compared with not imagining, increased purchase intention through the mediation of physical control and psychological ownership, conceptually replicating Peck et al.'s study. It also examined the moderating effect of product involvement and found none. The findings have practical applications in marketing, such as encouraging consumers to imagine touching a product.
Numerous models have been proposed to generate random graphs that preserve the properties of real-life large-scale networks. Many real networks, however, are better represented by hypergraphs. Few models for generating random hypergraphs exist, and fewer still preserve both a power-law degree distribution and the high modularity that indicates the presence of communities. We present a dynamic preferential attachment hypergraph model that features a partition into communities. We prove that its degree distribution follows a power law, and we give theoretical lower bounds for its modularity. We compare its characteristics with a real-life co-authorship network and show that our model performs well. We believe that our hypergraph model will be an interesting tool for many research domains, allowing real-life phenomena to be reflected more faithfully.
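A minimal sketch of what such a generator might look like, under assumptions of our own (communities chosen uniformly, nodes fixed in advance, hyperedge slots filled by degree-proportional sampling within the chosen community with probability `p_intra`); all names and parameters are hypothetical, not the paper's model.

```python
import random

def generate_hypergraph(n_nodes=300, n_edges=600, edge_size=3,
                        n_communities=4, p_intra=0.9, seed=1):
    """Sketch of a community-aware preferential-attachment hypergraph.

    Each new hyperedge picks a community uniformly, then fills its
    slots either by degree-proportional (preferential) sampling within
    that community (probability p_intra) or across all nodes.
    Returns the list of hyperedges and the community assignment.
    """
    rng = random.Random(seed)
    community = [rng.randrange(n_communities) for _ in range(n_nodes)]
    members = {c: [v for v in range(n_nodes) if community[v] == c]
               for c in range(n_communities)}
    degree = [1] * n_nodes   # +1 smoothing so fresh nodes can be selected
    edges = []
    for _ in range(n_edges):
        c = rng.randrange(n_communities)
        edge = set()
        while len(edge) < edge_size:
            pool = members[c] if rng.random() < p_intra else list(range(n_nodes))
            weights = [degree[v] for v in pool]   # preferential attachment
            edge.add(rng.choices(pool, weights=weights, k=1)[0])
        for v in edge:
            degree[v] += 1
        edges.append(frozenset(edge))
    return edges, community
```

With mostly intra-community edges the resulting hypergraph is modular by construction, and degree-proportional sampling produces the heavy-tailed degrees the abstract refers to.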
In this paper, we examine the contribution of the journal Network Science to the network science discipline. We do so from two perspectives. First, expanding the existing taxonomy of article contributions, we examine trends in theory testing, theory building, and new method development within the journal's articles. We find that the journal demands a high level of theoretical contribution and methodological rigor, and that high levels of theoretical and methodological contribution are significant predictors of article citation rates. Second, we look at the composition of the studies in Network Science and determine that the journal has already established a solid “hard core” for the new discipline.
We consider a two-stage service system with two types of servers, namely subordinates who perform the first-stage service and supervisors who have their own responsibilities in addition to collaborating with the subordinates on the second-stage service. Rewards are earned when first- or second-stage service is completed and when supervisors finish one of their own responsibilities. Costs are incurred when impatient customers abandon without completing the second-stage service. Our problem is to determine how the supervisors should distribute their time between their joint work with the subordinates and their own responsibilities. Under the assumptions that service times at both stages are exponentially distributed and that the customers waiting for second-stage service abandon after an exponential amount of time, we prove that one of two policies will maximize the long-run average profit. Namely, it is optimal for supervisors to start collaborating with subordinates either when subordinates can no longer serve new customers or as soon as there is a customer ready for second-stage service. Furthermore, we show that the optimality condition is a simple threshold on the system parameters. We conclude by proving that pooling supervisors (and their associated subordinates) improves system performance, but with limited returns as more supervisors are pooled.
A stream of research on co-authorship, used as a proxy for scholars' collaborative behavior, focuses on members of a given scientific community defined on a disciplinary and/or national basis, for which co-authorship data must be retrieved. Recent literature has pointed out that international digital libraries cover only part of scholars' scientific production and under-cover the scholars in the community. Bias in retrieving co-authorship data for the community of interest can affect network construction and network measures in several ways, providing a partial picture of real collaboration among scholars. In this contribution, we collected bibliographic records of Italian academic statisticians from an online platform (IRIS) available at most universities. Although it guarantees a high coverage rate of our population and its scientific production, several data quality issues must be addressed. We therefore propose a web scraping procedure based on a semi-automatic tool to retrieve publication metadata, together with data management tools to detect duplicate records and to reconcile authors. Our procedure shows that collaboration is an active and increasing practice among Italian academic statisticians, with some differences according to gender, academic rank, and university location. The heuristic procedure for handling data quality issues in the IRIS platform can serve as a working case report, adaptable to other bibliographic archives with similar characteristics.
For a quadratic Markov branching process (QMBP), we show that the decay parameter is equal to the first eigenvalue of a Sturm–Liouville operator associated with the partial differential equation satisfied by the generating function of the transition probability. The proof is based on the spectral properties of the Sturm–Liouville operator. Both the upper and lower bounds of the decay parameter are given explicitly by means of a version of Hardy's inequality. Two examples are provided to illustrate our results. The Hardy index, an important quantity closely linked to the decay parameter of the QMBP, is investigated in depth and estimated.
In Serbia, modern pork production systems with implemented control measures, including the detection of Trichinella larvae in meat (ISO 18743), have eliminated farmed pork from pigs slaughtered at abattoirs as a source of trichinellosis. Epidemiological data from 2011 to 2020 indicate that the number of human cases and the number of infected domestic pigs have decreased significantly. Over the years, pork was the most frequent source of human infection. Cases generally occurred in small family outbreaks, and infection was linked to consumption of raw or undercooked pork from backyard pigs. In most outbreaks, T. spiralis was the aetiological agent, but in 2016 a large outbreak was caused by consumption of uninspected wild boar meat containing T. britovi larvae. To achieve safe pork, it is important that consumers of pork from animals raised in backyard smallholdings, and of wild game meat, are properly educated about the risks associated with consuming untested meat. Laboratories conducting Trichinella testing should have a functional quality assurance system to ensure the competency of analysts and the accuracy and repeatability of results. Regular participation in proficiency testing is also needed.
Despite the COVID-19 pandemic, influenza remains an important issue, especially in community settings, where outbreaks can be difficult to control and can result in high attack rates. In April 2022, a large A(H3N2) influenza outbreak spread through the largest Italian drug-rehabilitation community. One hundred and eighty-four individuals presented influenza-like symptoms (an attack rate of 26.2%); 56% had previously received the influenza vaccine. Sequence analyses highlighted a genetic drift from the vaccine strain, which may have caused the observed lack of protection.
In this work, we establish a connection between the cumulative residual entropy and the Gini mean difference (GMD). Some relationships between the extropy and the GMD, and the truncated GMD and dynamic versions of the cumulative past extropy are also established. We then show that several entropy and extropy measures discussed here can be brought into the framework of probability weighted moments, which would facilitate finding estimators of these measures.
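As a concrete instance of such a relationship, the following sketch numerically checks a simple special case: for an exponential distribution with rate lam, both the cumulative residual entropy and the Gini mean difference equal 1/lam. This is a standard closed-form check chosen for illustration, not the general result established in the paper.

```python
import math

def cre_exponential(lam, upper=50.0, n=100000):
    """Cumulative residual entropy -integral of S(x)*log(S(x)) dx for
    Exp(lam), computed by trapezoidal integration of the survival
    function S(x) = exp(-lam * x)."""
    h = upper / n
    total = 0.0
    for i in range(n + 1):
        x = i * h
        s = math.exp(-lam * x)
        val = lam * x * s            # equals -S(x) * log(S(x)) here
        w = 0.5 if i in (0, n) else 1.0
        total += w * val
    return total * h

def gmd_exponential(lam):
    """Gini mean difference E|X - Y| for X, Y iid Exp(lam): X - Y is
    Laplace with scale 1/lam, so the closed form is 1/lam."""
    return 1.0 / lam
```

For lam = 2 both quantities come out at 0.5 to within the integration error, illustrating how the two measures can coincide on a specific family.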
We consider the critical Galton–Watson process with overlapping generations stemming from a single founder. Assuming that both the variance of the offspring number and the average generation length are finite, we establish the convergence of the finite-dimensional distributions, conditioned on non-extinction at a remote time of observation. The limiting process is identified as a pure death process coming down from infinity.
This result brings a new perspective on Vatutin’s dichotomy, claiming that in the critical regime of age-dependent reproduction, an extant population either contains a large number of short-living individuals or consists of few long-living individuals.
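The criticality underlying this dichotomy can be illustrated with a toy simulation of a critical Galton–Watson process without overlapping generations (an illustrative simplification, not the age-dependent model studied here): started from a single founder with mean-one offspring, the process dies out almost surely, but only slowly, with survival probability decaying on the order of 2/(variance times n).

```python
import random

def extinct_by(generations, rng):
    """Run a critical Galton-Watson process from a single founder,
    with offspring counts 0, 1, 2 taken with probabilities
    1/4, 1/2, 1/4 (mean exactly 1, variance 1/2).
    Return True if the population dies out within `generations`."""
    z = 1
    for _ in range(generations):
        if z == 0:
            return True
        # total offspring of the z current individuals
        z = sum(rng.choices([0, 1, 2], weights=[1, 2, 1], k=z))
    return z == 0
```

Over many runs, most populations are extinct by generation 50, while the rare surviving ones tend to be large, a discrete-generation echo of the conditioned limits discussed in the abstract.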
SARS-CoV-2 has severely affected capacity in the National Health Service (NHS), and waiting lists are increasing markedly because of downtime of up to 50 min between patient consultations/procedures, imposed to let infectious aerosols clear and so reduce the risk of infection. Ventilation accelerates this aerosol clearance, but retroactively installing built-in mechanical ventilation is often cost-prohibitive. We investigated the effect of using portable air cleaners (PAC), a low-energy and low-cost alternative, to reduce the concentration of aerosols in typical patient consultation/procedure environments. The experimental setup consisted of an aerosol generator, which mimicked a subject infected with SARS-CoV-2, and an aerosol detector, representing a subject who could potentially contract the virus. Experiments on aerosol dispersion and clearing were undertaken in situ in a variety of rooms with two different types of PAC in various combinations and positions. Correct use of PAC can reduce the clearance half-life of aerosols by 82% compared with the same indoor environment without any ventilation, and at a rate broadly equivalent to built-in mechanical ventilation. In addition, the highest level of aerosol concentration measured when using PAC remains at least 46% lower than when no mitigation is used, even if the PAC's operation is impeded by placement under a table. The use of PAC leads to significant reductions in the aerosol concentrations associated with transmission of droplet-based airborne diseases. This could enable NHS departments to reduce the downtime between consultations/procedures.
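Since aerosol concentration under steady clearance decays roughly exponentially, an 82% reduction in clearance half-life translates directly into a clearance rate about 5.6 times higher, because the half-life satisfies t_half = ln 2 / k. A small sketch with an assumed baseline half-life (the 30 min figure is illustrative, not a measurement from the study):

```python
import math

def clearance_rate(half_life):
    """Concentration decays as C(t) = C0 * exp(-k * t); k = ln 2 / t_half."""
    return math.log(2) / half_life

# Illustrative numbers (assumed, not from the study):
t_unventilated = 30.0                       # minutes, assumed baseline half-life
t_with_pac = t_unventilated * (1 - 0.82)    # 82% reduction reported
k_ratio = clearance_rate(t_with_pac) / clearance_rate(t_unventilated)
```

The ratio of rates is 1 / (1 - 0.82), about 5.6, regardless of the assumed baseline, since the ln 2 factor cancels.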
Suppose that m drivers each choose a preferred parking space in a linear car park with n spots. In order, each driver goes to their chosen spot and parks there if possible, and otherwise takes the next available spot if it exists. If all drivers park successfully, the sequence of choices is called a parking function. Classical parking functions correspond to the case $m=n$.
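The parking rule above can be checked directly by simulating the drivers; a short sketch (the function name and the 1-indexed convention are our own):

```python
def is_parking_function(prefs, n):
    """Check whether a sequence of preferred spots (1-indexed) is a
    parking function for a one-way car park with n spots: each driver
    parks at their preference or rolls forward to the next free spot."""
    occupied = [False] * n
    for p in prefs:
        spot = p - 1
        while spot < n and occupied[spot]:
            spot += 1          # roll forward to the next free spot
        if spot == n:
            return False       # driver falls off the end of the car park
        occupied[spot] = True
    return True
```

For example, (2, 1, 1) is a parking function for n = 3 while (3, 3, 1) is not, in line with the classical criterion that the i-th smallest preference be at most i.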
We investigate various probabilistic properties of a uniform parking function. Through a combinatorial construction termed a parking function multi-shuffle, we give a formula for the law of multiple coordinates in the generic situation $m \lesssim n$. We further deduce all possible covariances: between two coordinates, between a coordinate and an unattempted spot, and between two unattempted spots. This asymptotic scenario in the generic situation $m \lesssim n$ is in sharp contrast with that of the special situation $m=n$.
A generalization of parking functions called interval parking functions is also studied, in which each driver is willing to park only in a fixed interval of spots. We construct a family of bijections between interval parking functions with n cars and n spots and edge-labeled spanning trees with $n+1$ vertices and a specified root.
This paper proposes a robust moment selection method that aims to pick the best model even when it is a moment condition model with mixed identification strength, that is, when the moment conditions include moment functions that are local to zero uniformly over the parameter set. We show that the relevant moment selection procedure of Hall et al. (2007, Journal of Econometrics 138, 488–512) is inconsistent in this setting, as it does not explicitly account for the rate of convergence of parameter estimation in the candidate models, which may vary. We introduce a new moment selection procedure based on a criterion that automatically accounts for both the convergence rate of the candidate model's parameter estimate and the entropy of the estimator's asymptotic distribution. The benchmark estimator that we consider is the two-step efficient generalized method of moments estimator, which is known to be efficient in this framework as well. A family of penalization functions is introduced that guarantees the consistency of the selection procedure. The finite-sample performance of the proposed method is assessed through Monte Carlo simulations.