Nursing home residents may be particularly vulnerable to coronavirus disease 2019 (COVID-19). Therefore, a question is when and how often nursing homes should test staff for COVID-19 and how this may change as severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) evolves.
Design:
We developed an agent-based model representing a typical nursing home, COVID-19 spread, and its health and economic outcomes to determine the clinical and economic value of various screening and isolation strategies and how it may change under various circumstances.
Results:
Under winter 2023–2024 SARS-CoV-2 omicron variant conditions, symptom-based antigen testing averted 4.5 COVID-19 cases compared to no testing, saving $191 in direct medical costs. Testing implementation costs far outweighed these savings, resulting in net costs of $990 from the Centers for Medicare & Medicaid Services perspective, $1,545 from the third-party payer perspective, and $57,155 from the societal perspective. Testing did not return sufficient positive health effects to be cost-effective [at a $50,000 per quality-adjusted life-year (QALY) threshold]; its cost per QALY exceeded this threshold in ≥59% of simulation trials. Testing remained cost-ineffective when routinely testing staff and varying face mask compliance, vaccine efficacy, and booster coverage. However, all antigen testing strategies became cost-effective (≤$31,906 per QALY) or cost saving (saving ≤$18,372) when the severe outcome risk was ≥3 times higher than that of current omicron variants.
Conclusions:
SARS-CoV-2 testing costs outweighed benefits under winter 2023–2024 conditions; however, testing became cost-effective with increasingly severe clinical outcomes. Cost-effectiveness can change as the epidemic evolves because it depends on clinical severity and other intervention use. Thus, nursing home administrators and policy makers should monitor and evaluate viral virulence and other interventions over time.
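The cost-effectiveness judgment above rests on the incremental cost-effectiveness ratio (ICER): the extra cost of an intervention divided by the QALYs it gains, compared against a willingness-to-pay threshold. A minimal sketch of that logic, using hypothetical numbers (the abstract reports net costs but not QALYs gained, so the QALY values here are illustrative only, not study results):

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra dollars per QALY gained."""
    return delta_cost / delta_qaly

def is_cost_effective(delta_cost, delta_qaly, threshold=50_000):
    """Cost-effective if the cost per QALY gained is at or below the threshold."""
    return icer(delta_cost, delta_qaly) <= threshold

# Hypothetical example: testing costs $990 more than no testing.
# If it gained 0.01 QALYs, the ICER would be ~$99,000/QALY (not cost-effective);
# if it gained 0.05 QALYs, ~$19,800/QALY (cost-effective at $50,000/QALY).
print(is_cost_effective(990, 0.01))
print(is_cost_effective(990, 0.05))
```

The same threshold comparison explains the reversal the abstract describes: greater clinical severity means more QALYs gained per case averted, shrinking the ICER below the threshold even at unchanged testing costs.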
This study aimed to identify publicly reported access characteristics for episodic primary care in BC and to provide a clinic-level comparison between walk-in clinics and UPCCs.
Background:
Walk-in clinics are non-hospital-based primary care facilities that are designed to operate without appointments and provide increased healthcare access with extended hours. Urgent and Primary Care Centres (UPCCs) were introduced to British Columbia (BC) in 2018 as an additional primary care resource that provided urgent, but not emergent care during extended hours.
Methods:
This cross-sectional study used publicly available data from all walk-in clinics and UPCCs in BC. A structured data collection form was used to record access characteristics from clinic websites, including business hours, weekend availability, attachment to a longitudinal family practice, and provision of virtual services.
Findings:
In total, 268 clinics were included in the analysis (243 walk-in clinics, 25 UPCCs). Of those, 225 walk-in clinics (92.6%) and two UPCCs (8.0%) were attached to a longitudinal family practice. Only 153 walk-in clinics (63%) offered weekend services, compared to 24 UPCCs (96%). Walk-in clinics offered the majority of their service hours (8,968.6 hours; 78.4%) between 08:00 and 17:00, Monday to Friday. UPCCs offered the majority of their service hours (889.3 hours; 53.7%) after 17:00.
Conclusion:
Most walk-in clinics were associated with a longitudinal family practice and provided the majority of clinic services during typical business hours. More research that includes patient characteristics and care outcomes, analyzed at the clinic level, may be useful to support the optimization of episodic primary healthcare delivery.
Individual hosts are often co-infected with multiple parasite species. Evidence from theoretical and empirical studies supports the idea that co-occurring parasites can impact each other and their hosts via synergistic or antagonistic interactions. The fundamental aim of understanding the consequences of co-infection to hosts and parasites requires an understanding of patterns of species co-occurrence within samples of hosts. We censused parasite assemblages in 755 adult, male fathead minnows collected from 7 lakes/ponds in southern Alberta, Canada between 2018 and 2020. Fifteen species of endoparasites infected fathead minnows, 98% of which were co-infected with between 2 and 9 parasite species (mean species richness: 4.4 ± 1.4). Non-random pairwise associations were detected within the overall parasite community. There were particularly strong, positive associations in the occurrences and intensities of the 2 congeneric larval trematodes Ornithodiplostomum sp. and Ornithodiplostomum ptychocheilus that comprised >96% of the 100 000+ parasites counted in the total sample of minnows. Furthermore, the occurrence of Ornithodiplostomum sp. was a strong predictor of the occurrence of O. ptychocheilus, and vice versa. Positive covariation in the intensities of these 2 dominants likely arises from their shared use of physid snails as first intermediate hosts in these waterbodies. These 2 species represent a predictable and non-random component within the complex assemblage of parasites of fathead minnows in this region.
The Kazakh famine of 1930–3 ranks as one of the great crimes of the Stalinist regime. The crisis, which was sparked by Josef Stalin’s programme of forced collectivisation, led to the death of roughly a third of all Kazakhs, believed to be the highest death ratio due to collectivisation of any people in the Soviet Union. More than 1.5 million people perished, of the total population of around 6 million living in the Kazakh Autonomous Soviet Socialist Republic (the Kazakh ASSR, often known as Kazakhstan). Kazakhs, who speak their own Turkic language, became a minority in their own republic. They would not again constitute more than 50 per cent of the population in Kazakhstan until after the Soviet collapse. The Kazakh famine also constitutes one of the largest pastoral famines in modern history. Prior to the crisis, most Kazakhs practiced pastoral nomadism, carrying out seasonal migrations to pasture their animal herds. But those who survived were forced to settle, prompting a painful and far-reaching reorientation of Kazakh culture and identity.
To date, the evidence regarding the effect of bilingualism/multilingualism on short-term memory (STM) and working memory (WM) capacity is inconclusive. This study investigates whether multilingualism has a positive effect on the verbal STM and WM capacity of neurotypical middle-aged and older individuals. Eighty-two L1-Norwegian sequential bilingual/multilingual academics were tested with tasks measuring verbal STM/WM capacity. Degree of bilingualism/multilingualism for each participant was estimated based on a comprehensive questionnaire. Different measures of bilingualism/multilingualism were used. Data on potentially influencing non-linguistic factors were also collected. Correlation and regression analyses showed that multilingualism impacts both verbal STM and verbal WM. In particular, all analyses showed that number of known foreign languages was the strongest predictor of verbal STM and WM capacity. The results are discussed in light of recent studies on the impact of bilingualism on STM/WM and on recent proposals regarding the mechanism underlying so-called bilingual advantage.
A nationally generalisable cohort (n 5770) was used to determine the prevalence of non-timely (early/late) introduction of complementary food and core food groups and associations with maternal sociodemographic and health behaviours in New Zealand (NZ). Variables describing maternal characteristics and infant food introduction were sourced, respectively, from interviews completed antenatally and during late infancy. The NZ Infant Feeding Guidelines were used to define early (≤ 4 months) and late (≥ 7 months) introduction. Associations were examined using multivariable multinomial regression, presented as adjusted relative risk ratios and 95 % confidence intervals (RRR; 95 % CI). Complementary food introduction was early for 40·2 % and late for 3·2 %. The prevalence of early food group introduction was fruit/vegetables (23·8 %), breads/cereals (36·3 %) and iron-rich foods (34·1 %); the prevalence of late introduction was meat/meat alternatives (45·9 %), dairy products (46·2 %) and fruit/vegetables (9·9 %). Compared with infants with timely food introduction, the risk of early food introduction was increased for infants: breastfed < 6 months (2·52; 2·19–2·90), whose mothers were < 30 years old (1·69; 1·46–1·94), had a diploma/trade certificate v. tertiary education (1·39; 1·1–1·70), were of Māori v. European ethnicity (1·40; 1·12–1·75) or smoked during pregnancy (1·88; 1·44–2·46). The risk of late food introduction was decreased for infants breastfed < 6 months (0·47; 0·27–0·80) and increased for infants whose mothers had secondary v. tertiary education (2·04; 1·16–3·60), were of Asian v. European ethnicity (2·22; 1·35–3·63) or did not attend childbirth preparation classes (2·23; 1·24–4·01). Non-timely food introduction, specifically early food introduction, is prevalent in NZ. Interventions to improve the timeliness of food introduction should be ethnic-specific and support longer breast-feeding.
Cross-species evidence suggests that the ability to exert control over a stressor is a key dimension of stress exposure that may sensitize frontostriatal-amygdala circuitry to promote more adaptive responses to subsequent stressors. The present study examined neural correlates of stressor controllability in young adults. Participants (N = 56; Mage = 23.74, range = 18–30 years) completed either the controllable or uncontrollable stress condition of the first of two novel stressor controllability tasks during functional magnetic resonance imaging (fMRI) acquisition. Participants in the uncontrollable stress condition were yoked to age- and sex-matched participants in the controllable stress condition. All participants were subsequently exposed to uncontrollable stress in the second task, which is the focus of fMRI analyses reported here. A whole-brain searchlight classification analysis revealed that patterns of activity in the right dorsal anterior insula (dAI) during subsequent exposure to uncontrollable stress could be used to classify participants' initial exposure to either controllable or uncontrollable stress with a peak of 73% accuracy. Previous experience of exerting control over a stressor may change the computations performed within the right dAI during subsequent stress exposure, shedding further light on the neural underpinnings of stressor controllability.
Using data from a nationally generalisable birth cohort, we aimed to: (i) describe the cohort’s adherence to national evidence-based dietary guidelines using an Infant Feeding Index (IFI) and (ii) assess the IFI’s convergent construct validity, by exploring associations with antenatal maternal socio-demographic and health behaviours and with child overweight/obesity and central adiposity at age 54 months. Data were from the Growing Up in New Zealand cohort (n 6343). The IFI scores ranged from zero to twelve points, with twelve representing full adherence to the guidelines. Overweight/obesity was defined by BMI-for-age (based on the WHO Growth Standards). Central adiposity was defined as waist-to-height ratio > 90th percentile. Associations were tested using multiple linear regression and Poisson regression with robust variance (risk ratios, 95 % CI). Mean IFI score was 8·2 (sd 2·1). Maternal characteristics explained 29·1 % of variation in the IFI score. Maternal age, education and smoking had the strongest independent relationships with IFI scores. Compared with children in the highest IFI tertile, girls in the lowest and middle tertiles were more likely to be overweight/obese (1·46; 1·03–2·06 and 1·56; 1·09–2·23, respectively) and boys in the lowest tertile were more likely to have central adiposity (1·53; 1·02–2·30) at age 54 months. Most infants fell short of meeting national Infant Feeding Guidelines. The associations between IFI score and maternal characteristics, and children’s overweight/obesity/central adiposity, were in the expected directions and confirm the IFI’s convergent construct validity.
Graduate schools provide students opportunities for fieldwork and training in archaeological methods and theory, but they often overlook instruction in field safety and well-being. We suggest that more explicit guidance on how to conduct safe fieldwork will improve the overall success of student-led projects and prepare students to direct safe and successful fieldwork programs as professionals. In this article, we draw on the experiences of current and recent graduate students as well as professors who have overseen graduate fieldwork to outline key considerations in improving field safety and well-being and to offer recommendations for specific training and safety protocols. In devising these considerations and recommendations, we have referenced both domestic and international field projects, as well as those involving community collaboration.
Do economic crises mobilize or depress civic engagement? This paper examines this question by analysing cross-national trends in voluntary association membership in the context of the global financial crisis. A mobilization hypothesis suggests that an economic crisis would increase membership in voluntary associations, as these associations provide citizens with a channel for interest articulation and aggregation, facilitating their response to the crisis. A retreat hypothesis, on the other hand, suggests that an economic crisis would depress voluntary association membership, as people have fewer resources to be involved in these associations at a time of crisis. To test these hypotheses, this paper examines data on voluntary association memberships from the World Values Survey in 14 democratic countries, fielded before and after the global financial crisis hit in 2008. The results support the retreat hypothesis. Following the crisis, there was a decline in voluntary association memberships overall, and countries harder hit by the crisis were more likely to experience declines. There was no evidence of mobilization among those more vulnerable to the crisis. Rather, the profile of those engaged in voluntary associations was similar before and after the crisis, skewed towards those better off in society, including those with higher education levels, higher incomes, and in paid employment.
We implemented universal severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) testing of patients undergoing surgical procedures as a means to conserve personal protective equipment (PPE). The rate of asymptomatic coronavirus disease 2019 (COVID-19) was <0.5%, which suggests that early local public health interventions were successful. Although our protocol was resource intensive, it prevented exposures to healthcare team members.
The flat oyster Ostrea edulis has declined significantly in European waters since the 1850s as a result of anthropogenic activity. Ostrea edulis was designated a UK Biodiversity Action Plan Species and Habitat in 1995, and as a Feature of Conservation Importance (FOCI) within the UK Marine & Coastal Access Act 2009. To promote the recovery of oyster beds, a greater understanding of its abundance and distribution is required. Distribution of O. edulis across the proposed Blackwater, Crouch, Roach and Colne MCZ in Essex was determined between 2008 and 2012. Ostrea edulis were present in four estuary zones, with highest sample abundance in the Blackwater and Ray Sand zones. Size structure of populations varied, with the Ray Sand and Colne zones showing a significant lack of individuals with shell height <39 mm. Ostrea edulis occurred in highest numbers on shell substratum, followed by silty sediments. There were no significant associations between O. edulis abundance or size structure with water column Chl a, suspended solids, oxygen, nitrate or ammonium concentrations, temperature or pH. Highest abundance and most equitable population shell-size distribution for O. edulis were located within, or adjacent to, actively managed aquaculture zones. This suggests that traditional seabed management contributed to the maintenance or recovery of this species of conservation concern. Demonstration that the Essex estuaries were a stronghold for Ostrea edulis in the southern North Sea area led to the designation of the Blackwater, Crouch, Roach and Colne estuaries Marine Conservation Zone in 2013.
To simulate effects of different scenarios of folic acid fortification of food on dietary folate equivalents (DFE) intake in an ethnically diverse sample of pregnant women.
Design
A forty-four-item FFQ was used to evaluate dietary intake of the population. DFE intakes were estimated for different scenarios of food fortification with folic acid: (i) voluntary fortification; (ii) increased voluntary fortification; (iii) simulated bread mandatory fortification; and (iv) simulated grains-and-rice mandatory fortification.
Setting
Ethnically and socio-economically diverse cohort of pregnant women in New Zealand.
Participants
Pregnant women (n 5664) whose children were born in 2009–2010.
Results
Participants identified their ethnicity as European (56·0 %), Asian (14·2 %), Māori (13·2 %), Pacific (12·8 %) or Others (3·8 %). Bread, breakfast cereals and yeast spread were the main food sources of DFE in the two voluntary fortification scenarios. However, for Asian women, green leafy vegetables, bread and breakfast cereals were the main contributors of DFE in these scenarios. In descending order, proportions of the different ethnic groups in the lowest tertile of DFE intake across the four fortification scenarios were: Asian (39–60 %), Others (41–44 %), European (31–37 %), Pacific (23–26 %) and Māori (23–27 %). In comparisons within each ethnic group across fortification scenarios, the only difference observed was higher DFE intake under simulated grains-and-rice mandatory fortification v. the other scenarios.
Conclusions
If grain and rice fortification with folic acid was mandatory in New Zealand, DFE intakes would be more evenly distributed among pregnant women of different ethnicities, potentially reducing ethnic group differences in risk of lower folate intakes.
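The intake estimates above rest on the standard dietary folate equivalents conversion, which weights synthetic folic acid from fortified food more heavily than natural food folate because it is absorbed better. A minimal sketch of that conversion, using the widely cited 1.7 weighting for folic acid consumed with food; the serving values are hypothetical illustrations, not data from this study:

```python
def dfe_micrograms(food_folate_ug, folic_acid_ug):
    """Dietary folate equivalents (ug DFE) for one serving:
    natural food folate counts 1:1, added folic acid is weighted by 1.7."""
    return food_folate_ug + 1.7 * folic_acid_ug

# Hypothetical serving: 100 ug natural folate + 50 ug added folic acid
print(dfe_micrograms(100, 50))  # 185.0 ug DFE
```

Under this conversion, mandatory fortification raises DFE intake even for women whose natural folate intake is unchanged, which is why fortified staples such as bread and grains dominate the scenario comparisons.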
The collectivisation famines of the 1930s are one of the darkest and most contested chapters in Soviet history. Carried out in the name of agricultural modernisation, Stalin's policy of forced collectivisation led to immense human suffering. Somewhere between 5 and 9 million people are believed to have perished in these famines, with the burden falling disproportionately on several major food-producing regions, including Ukraine, Kazakhstan, the Volga Basin and the Don and Kuban regions of the North Caucasus. Those who survived these terrifying events found their lives transformed, and collectivisation and the accompanying famines played a crucial role in integrating the Soviet Union's vast rural population into the institutions of a ‘workers’ state’.
To evaluate the sociodemographic and lifestyle factors associated with insufficient and excessive use of folic acid supplements (FAS) among pregnant women.
Design
A pregnancy cohort to which multinomial logistic regression models were applied to identify factors associated with duration and dose of FAS use.
Setting
The Growing Up in New Zealand child study, which enrolled pregnant women whose children were born in 2009–2010.
Subjects
Pregnant women (n 6822) enrolled into a nationally generalizable cohort.
Results
Ninety-two per cent of pregnant women were not taking FAS according to the national recommendation (4 weeks before until 12 weeks after conception), with 69 % taking insufficient FAS and 57 % extending FAS use past 13 weeks’ gestation. The factors associated with extended use differed from those associated with insufficient use. Consistent with published literature, the relative risks of insufficient use were increased for younger women, those with less education, of non-European ethnicities, unemployed, who smoked cigarettes, whose pregnancy was unplanned or who had older children, or were living in more deprived households. In contrast, the relative risks of extended use were increased for women of higher socio-economic status or for whom this was their first pregnancy and decreased for women of Pacific v. European ethnicity.
Conclusions
In New Zealand, current use of FAS during pregnancy potentially exposes pregnant women and their unborn children to too little or too much folic acid. Further policy development is necessary to reduce current socio-economic inequities in the use of FAS.
Introduced plants threaten biodiversity and ecosystem processes, including carbon (C) and nitrogen (N) cycles, but little is known about the threshold at which such effects occur. We examined the impact of the invasive shrub Amur honeysuckle on soil organic carbon (SOC) and N density at study sites that varied in invasion history. In plots with and without honeysuckle, we measured honeysuckle abundance and size (basal area) and extracted soil cores. SOC and N densities were highest at the site with the longest invasion history and highest invasion intensity (i.e., greatest abundance and basal area of honeysuckle). Basal area of honeysuckle positively affected SOC and N densities likely because of increased litter decomposition and altered microbial communities. Because honeysuckle increases forest net primary productivity (NPP) and SOC, it also may play a role in C sequestration. Our results demonstrate the need to consider the influence of invasion history and intensity when evaluating the potential impact of invasive species.