Globally, there is seasonal variation in tuberculosis (TB) incidence, yet the biological, behavioural and social factors driving TB seasonality differ across countries. Understanding season-specific risk factors particular to the UK could help shape future decision-making for TB control. We conducted a time-series analysis using data from 152,424 UK TB notifications between 2000 and 2018. Notifications were aggregated by year, month, and socio-demographic covariates, and negative binomial regression models were fitted to the aggregate data. For each covariate, we calculated the size of the seasonal effect as the incidence risk ratio (IRR) for the peak versus the trough months within the year, and the timing of the peak, whilst accounting for the overall trend. There was strong evidence for seasonality (p < 0.0001), with an IRR of 1.27 (95% CI 1.23–1.30). The peak was estimated to occur at the beginning of May. Significant differences in seasonal amplitude were identified across age groups, ethnicity, site of disease, latitude and, for those born abroad, time since entry to the UK. The smaller amplitude in older adults, and the greater amplitude among South Asians and people who recently entered the UK, may indicate a role for latent TB reactivation and vitamin D deficiency in driving seasonality.
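For readers unfamiliar with the approach, the following is a minimal sketch (Python, statsmodels) of the kind of seasonal negative binomial model described above, recovering a peak-to-trough IRR and peak timing from a single annual harmonic. The column names and the single-harmonic specification are illustrative assumptions, not the authors' exact model.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fit_seasonal_nb(df: pd.DataFrame):
    """df: one row per calendar month, with columns 'month' (1-12) and 'count'."""
    df = df.copy()
    df["t"] = np.arange(len(df))                        # long-term trend
    df["sin1"] = np.sin(2 * np.pi * df["month"] / 12)   # single annual harmonic
    df["cos1"] = np.cos(2 * np.pi * df["month"] / 12)
    model = smf.negativebinomial("count ~ t + sin1 + cos1", data=df).fit()

    # Size of the seasonal effect: peak-to-trough incidence risk ratio.
    amplitude = np.hypot(model.params["sin1"], model.params["cos1"])
    irr_peak_vs_trough = np.exp(2 * amplitude)

    # Timing of the peak, on the same 1-12 scale as the 'month' variable.
    peak_month = (np.arctan2(model.params["sin1"], model.params["cos1"])
                  * 12 / (2 * np.pi)) % 12
    return model, irr_peak_vs_trough, peak_month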
To determine the reach, adoption, implementation and effectiveness of an intervention to increase children’s vegetable intake in long day care (LDC).
Design:
A 12-week pragmatic cluster randomised controlled trial, informed by the multiphase optimisation strategy (MOST), targeting the mealtime environment and curriculum. Children's vegetable intake and variety were measured at follow-up using a modified Short Food Survey for early childhood education and care and analysed using a two-part mixed model for non-vegetable and vegetable consumers. Outcome measures were based on the RE-AIM framework.
Setting:
Australian LDC centres.
Participants:
Thirty-nine centres, 120 educators and 719 children at follow-up.
Results:
There was no difference between intervention and waitlist control groups in the likelihood of consuming any vegetables, compared with non-vegetable consumers, for intake (OR = 0·70 (95 % CI 0·34–1·43), P = 0·32) or variety (OR = 0·73 (95 % CI 0·40–1·32), P = 0·29). Among vegetable consumers (n 652), there was no difference between groups in vegetable variety (exp(b) = 1·07 (95 % CI 0·88–1·32), P = 0·49) or vegetable intake (exp(b) = 1·06 (95 % CI 0·78–1·43), P = 0·71), with an average of 1·51 (95 % CI 1·20–1·82) and 1·40 (95 % CI 1·08–1·72) serves of vegetables per day in the intervention and control groups, respectively. Intervention educators reported higher skills for promoting vegetables at mealtimes, and greater knowledge and skills for teaching the curriculum, than control educators (all P < 0·001). Intervention fidelity was moderate (n 16/20 and n 15/16 centres used the mealtime environment and curriculum components, respectively), with good acceptability among educators. The intervention reached 307/8556 centres nationally and was adopted by 22 % of eligible centres.
Conclusions:
The pragmatic, self-delivered online intervention positively impacted educators' knowledge and skills and was considered acceptable and feasible. Intervention adaptations, using the MOST cyclic approach, could improve the intervention's impact on children's vegetable intake.
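The 'two-part mixed model' noted in the Design above separates the chance of eating any vegetables from the amount eaten by consumers. The sketch below illustrates that structure only; the column names are hypothetical, and the centre-level random effects of the actual analysis are approximated here with cluster-robust standard errors (part 1) and a random intercept (part 2).

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def two_part_model(df: pd.DataFrame):
    """df: one row per child, with hypothetical columns 'intake_serves',
    'group' (intervention/control) and 'centre' (childcare centre ID)."""
    df = df.copy()
    df["any_veg"] = (df["intake_serves"] > 0).astype(int)

    # Part 1: odds of being a vegetable consumer, clustering on centre.
    part1 = smf.logit("any_veg ~ group", data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df["centre"]})

    # Part 2: intake among consumers only, on the log scale so that
    # exponentiated coefficients are ratios, with a centre random intercept.
    consumers = df[df["any_veg"] == 1].copy()
    consumers["log_intake"] = np.log(consumers["intake_serves"])
    part2 = smf.mixedlm("log_intake ~ group", data=consumers,
                        groups=consumers["centre"]).fit()
    return part1, part2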
Homeless shelter residents and staff may be at higher risk of SARS-CoV-2 infection. However, SARS-CoV-2 infection estimates in this population have been reliant on cross-sectional or outbreak investigation data. We conducted routine surveillance and outbreak testing in 23 homeless shelters in King County, Washington, to estimate the occurrence of laboratory-confirmed SARS-CoV-2 infection and risk factors during 1 January 2020–31 May 2021. Symptom surveys and nasal swabs were collected for SARS-CoV-2 testing by RT-PCR for residents aged ≥3 months and staff. We collected 12,915 specimens from 2,930 unique participants. We identified 4.74 (95% CI 4.00–5.58) SARS-CoV-2 infections per 100 individuals (residents: 4.96, 95% CI 4.12–5.91; staff: 3.86, 95% CI 2.43–5.79). Most infections were asymptomatic at the time of detection (74%) and detected during routine surveillance (73%). Outbreak testing yielded higher test positivity than routine surveillance (2.7% versus 0.9%). Among those infected, residents were less likely to report symptoms than staff. Participants who were vaccinated against seasonal influenza and were current smokers had lower odds of having an infection detected. Active surveillance that includes SARS-CoV-2 testing of all persons is essential in ascertaining the true burden of SARS-CoV-2 infections among residents and staff of congregate settings.
Early environmental experience can have significant effects on an animal's ability to adapt to challenges in later life. Prior experience of specific situations may facilitate the development of behavioural skills that can be applied in similar situations later in life. In addition, exposure to a more complex environment may enhance cognitive development (eg increased synaptic density), which can then speed the acquisition of new behavioural responses when the animal is faced with novel challenges (Grandin 1989).
This paper provides an overview and appraisal of the International Design Engineering Annual (IDEA) challenge - a virtually hosted design hackathon run with the aim of generating a design research dataset that can provide insights into design activities at virtually hosted hackathons. The resulting dataset consists of 200+ prototypes with over 1300 connections providing insights into the products, processes and people involved in the design process. The paper also provides recommendations for future deployments of virtual hackathons for design research.
To prioritise and refine a set of evidence-informed statements into advice messages to promote vegetable liking in early childhood, and to determine applicability for dissemination of advice to relevant audiences.
Design:
A nominal group technique (NGT) workshop and a Delphi survey were conducted to prioritise and achieve consensus (≥70 % agreement) on thirty evidence-informed maternal (perinatal and lactation stage), infant (complementary feeding stage) and early years (family diet stage) vegetable-related advice messages. Messages were validated via triangulation analysis against the strength of evidence from an Umbrella review of strategies to increase children’s vegetable liking, and gaps in advice from a Desktop review of vegetable feeding advice.
Setting:
Australia.
Participants:
A purposeful sample of key stakeholders (NGT workshop, n 8 experts; Delphi survey, n 23 end users).
Results:
Participant consensus identified the most highly ranked priority messages associated with the strategies of ‘in-utero exposure’ (perinatal and lactation, n 56 points) and ‘vegetable variety’ (complementary feeding, n 97 points; family diet, n 139 points). Triangulation revealed two strategies (‘repeated exposure’ and ‘variety’) and their associated advice messages as suitable for policy and practice, twelve for research and four for the food industry.
Conclusions:
Supported by national and state feeding guideline documents and resources, the advice messages relating to ‘repeated exposure’ and ‘variety’ to increase vegetable liking can be communicated to families and caregivers by healthcare practitioners. The food industry provides a vehicle for advice promotion and product development. Further research, where stronger evidence is needed, could further inform strategies for policy and practice, and food industry application.
Due to shortages of N95 respirators during the coronavirus disease 2019 (COVID-19) pandemic, it is necessary to estimate the number of N95s required for healthcare workers (HCWs) to inform manufacturing targets and resource allocation.
Methods:
We developed a model to determine the number of N95 respirators needed for HCWs, both in a single acute-care hospital and across the United States.
Results:
For an acute-care hospital with 400 all-cause monthly admissions, the number of N95 respirators needed to manage COVID-19 patients admitted during a month ranges from 113 (95% interpercentile range [IPR], 50–229) if 0.5% of admissions are COVID-19 patients to 22,101 (95% IPR, 5,904–25,881) if 100% of admissions are COVID-19 patients (assuming single use per respirator, and 10 encounters between HCWs and each COVID-19 patient per day). The number of N95s needed decreases to a range of 22 (95% IPR, 10–43) to 4,445 (95% IPR, 1,975–8,684) if each N95 is used for 5 patient encounters. Varying monthly all-cause admissions to 2,000 requires 6,645–13,404 respirators with a 60% COVID-19 admission prevalence, 10 HCW–patient encounters, and reusing N95s 5–10 times. Nationally, the number of N95 respirators needed over the course of the pandemic ranges from 86 million (95% IPR, 37.1–200.6 million) to 1.6 billion (95% IPR, 0.7–3.6 billion) as 5%–90% of the population is exposed (single-use). This number ranges from 17.4 million (95% IPR, 7.3–41 million) to 312.3 million (95% IPR, 131.5–737.3 million) using each respirator for 5 encounters.
Conclusions:
We quantified the number of N95 respirators needed for a given acute-care hospital and nationally during the COVID-19 pandemic under varying conditions.
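As a rough check on how the figures in the Results above scale, a deterministic back-of-the-envelope version of the calculation is sketched below. The published model is probabilistic (hence the interpercentile ranges); the mean length of stay used here is an assumption for illustration only.

def n95_needed(monthly_admissions: float,
               covid_fraction: float,
               encounters_per_patient_day: float = 10.0,
               mean_los_days: float = 5.0,
               uses_per_respirator: float = 1.0) -> float:
    """Approximate N95 respirators needed per month for one hospital."""
    covid_admissions = monthly_admissions * covid_fraction
    encounters = covid_admissions * mean_los_days * encounters_per_patient_day
    return encounters / uses_per_respirator

# 400 monthly admissions, 100% COVID-19, single-use respirators:
# 400 * 5 * 10 = 20,000 N95s, the same order as the 22,101 reported above.
print(n95_needed(400, 1.0))                           # single use
print(n95_needed(400, 1.0, uses_per_respirator=5))    # ~1/5 as many with reuse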
Optical tracking systems typically trade off between astrometric precision and field of view. In this work, we showcase a networked approach to optical tracking, using very wide field-of-view imagers with relatively low astrometric precision, applied to the scheduled OSIRIS-REx slingshot manoeuvre around Earth on 22 September 2017. As part of a trajectory designed to take OSIRIS-REx to NEO 101955 Bennu, this flyby event was viewed from 13 remote sensors spread across Australia and New Zealand so that the observations could be triangulated. Each observatory in this portable network was constructed to be as lightweight and portable as possible, with hardware based on the successful design of the Desert Fireball Network. Over a 4-h collection window, we gathered 15 439 images of the night sky in the predicted direction of the OSIRIS-REx spacecraft. Using a specially developed streak detection and orbit determination data pipeline, we detected 2 090 line-of-sight observations. Our fitted orbit was determined to be within about 10 km of orbital telemetry along the observed 109 262 km length of the OSIRIS-REx trajectory, demonstrating the impressive capability of a networked approach to Space Surveillance and Tracking.
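The value of simultaneous line-of-sight observations from widely separated stations comes from triangulation: each observation defines a ray from a known site along a measured pointing direction, and the rays from different sites intersect (approximately) at the spacecraft. The sketch below is a generic least-squares ray intersection for illustration, not the authors' orbit determination pipeline.

import numpy as np

def triangulate(stations: np.ndarray, directions: np.ndarray) -> np.ndarray:
    """stations: (N, 3) site positions; directions: (N, 3) pointing vectors.
    Returns the point minimising the summed squared distance to all N rays."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(stations, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector onto the plane normal to d
        A += P
        b += P @ p
    return np.linalg.solve(A, b)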
We critically review the potential involvement of trimethylamine N-oxide (TMAO) as a link between diet, the gut microbiota and CVD. Generated primarily from dietary choline and carnitine by gut bacteria and hepatic flavin-containing mono-oxygenase (FMO) activity, TMAO could promote cardiometabolic disease when chronically elevated. However, control of circulating TMAO is poorly understood, and diet, age, body mass, sex hormones, renal clearance, FMO3 expression and genetic background may explain as little as 25 % of TMAO variance. The basis of elevations with obesity, diabetes, atherosclerosis or CHD is similarly ill-defined, although gut microbiota profiles/remodelling appear critical. Elevated TMAO could promote CVD via inflammation, oxidative stress, scavenger receptor up-regulation, reverse cholesterol transport (RCT) inhibition, and cardiovascular dysfunction. However, the concentrations that influence inflammation, scavenger receptors and RCT (≥100 µM) are only achieved in advanced heart failure or chronic kidney disease (CKD), and greatly exceed the <1–5 µM levels implicated in some TMAO–CVD associations. There is also evidence that CVD risk is insensitive to TMAO variance beyond these levels in omnivores and vegetarians, and that major TMAO sources are cardioprotective. Assessment of the available evidence suggests that modest elevations in TMAO (≤10 µM) are a non-pathogenic consequence of diverse risk factors (ageing, obesity, dyslipidaemia, insulin resistance/diabetes, renal dysfunction), indirectly reflecting CVD risk without participating mechanistically. Nonetheless, TMAO may surpass a pathogenic threshold as a consequence of CVD/CKD, secondarily promoting disease progression. TMAO might thus reflect early CVD risk while providing a prognostic biomarker or secondary target in established disease, although mechanistic contributions to CVD await confirmation.
A classic example of microbiome function is its role in nutrient assimilation in both plants and animals, but other less obvious roles are becoming more apparent, particularly in terms of driving infectious and non-infectious disease outcomes and influencing host behaviour. However, numerous biotic and abiotic factors influence the composition of these communities, and host microbiomes can be susceptible to environmental change. How microbial communities will be altered by, and mitigate, the rapid environmental change we can expect in the next few decades remains to be seen. That said, given the enormous range of functional diversity conferred by microbes, there is currently something of a revolution in microbial bioengineering and biotechnology aimed at addressing real-world problems, including human and wildlife disease and crop and biofuel production. All of these concepts are explored in further detail throughout the book.
Recent evidence suggests that exercise plays a role in cognition and that the posterior cingulate cortex (PCC) can be divided into dorsal and ventral subregions based on distinct connectivity patterns.
Aims
To examine the effect of physical activity and division of the PCC on brain functional connectivity measures in subjective memory complainers (SMC) carrying the epsilon 4 (ε4) allele of apolipoprotein E (APOE).
Method
Participants were 22 SMC carrying the APOE ɛ4 allele (ɛ4+; mean age 72.18 years) and 58 SMC non-carriers (ɛ4–; mean age 72.79 years). Connectivity of four dorsal and ventral seeds was examined. Relationships between PCC connectivity and physical activity measures were explored.
Results
ɛ4+ individuals showed increased connectivity between the dorsal PCC and dorsolateral prefrontal cortex, and the ventral PCC and supplementary motor area (SMA). Greater levels of physical activity correlated with the magnitude of ventral PCC–SMA connectivity.
Conclusions
The results provide the first evidence that ɛ4+ individuals at increased risk of cognitive decline show distinct alterations in dorsal and ventral PCC functional connectivity.
Civilian suicide rates vary by occupation in ways related to occupational stress exposure. Comparable military research finds suicide rates elevated in combat arms occupations. However, no research has evaluated variation in this pattern by deployment history, the indicator of occupation stress widely considered responsible for the recent rise in the military suicide rate.
Method
The joint associations of Army occupation and deployment history in predicting suicides were analysed in an administrative dataset for the 729 337 male enlisted Regular Army soldiers in the US Army between 2004 and 2009.
Results
There were 496 suicides over the study period (22.4/100 000 person-years). Only two occupational categories, both in combat arms, had significantly elevated suicide rates: infantrymen (37.2/100 000 person-years) and combat engineers (38.2/100 000 person-years). However, the suicide rates in these two categories were significantly lower when currently deployed (30.6/100 000 person-years) than never deployed or previously deployed (41.2–39.1/100 000 person-years), whereas the suicide rate of other soldiers was significantly higher when currently deployed and previously deployed (20.2–22.4/100 000 person-years) than never deployed (14.5/100 000 person-years), resulting in the adjusted suicide rate of infantrymen and combat engineers being most elevated when never deployed [odds ratio (OR) 2.9, 95% confidence interval (CI) 2.1–4.1], less so when previously deployed (OR 1.6, 95% CI 1.1–2.1), and not at all when currently deployed (OR 1.2, 95% CI 0.8–1.8). Adjustment for a differential ‘healthy warrior effect’ cannot explain this variation in the relative suicide rates of never-deployed infantrymen and combat engineers by deployment status.
Conclusions
Efforts are needed to elucidate the causal mechanisms underlying this interaction to guide preventive interventions for soldiers at high suicide risk.
The Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS) has found that the proportional elevation in the US Army enlisted soldier suicide rate during deployment (compared with the never-deployed or previously deployed) is significantly higher among women than men, raising the possibility of gender differences in the adverse psychological effects of deployment.
Method
Person-month survival models based on a consolidated administrative database for active duty enlisted Regular Army soldiers in 2004–2009 (n = 975 057) were used to characterize the gender × deployment interaction predicting suicide. Four explanatory hypotheses were explored involving the proportion of females in each soldier's occupation, the proportion of same-gender soldiers in each soldier's unit, whether the soldier reported sexual assault victimization in the previous 12 months, and the soldier's pre-deployment history of treated mental/behavioral disorders.
Results
The suicide rate of currently deployed women (14.0/100 000 person-years) was 3.1–3.5 times the rates of other (i.e. never-deployed/previously deployed) women. The suicide rate of currently deployed men (22.6/100 000 person-years) was 0.9–1.2 times the rates of other men. The adjusted (for time trends, sociodemographics, and Army career variables) female:male odds ratio comparing the suicide rates of currently deployed v. other women v. men was 2.8 (95% confidence interval 1.1–6.8), became 2.4 after excluding soldiers with Direct Combat Arms occupations, and remained elevated (in the range 1.9–2.8) after adjusting for the hypothesized explanatory variables.
Conclusions
These results are valuable in excluding otherwise plausible hypotheses for the elevated suicide rate of deployed women and point to the importance of expanding future research on the psychological challenges of deployment for women.
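The 'person-month survival models' named in the Method above are discrete-time survival analyses; a common implementation is a logistic regression on a file with one row per soldier per month in service. The sketch below shows only that layout and the gender × deployment-status interaction; the column names are hypothetical and the covariate set is far smaller than in the published models.

import statsmodels.formula.api as smf

def fit_person_month_model(person_months):
    """person_months: DataFrame with one row per soldier-month, with
    hypothetical columns 'suicide' (0/1 in that month), 'female' (0/1),
    'deploy_status' ('never'/'current'/'previous') and 'calendar_year'."""
    formula = "suicide ~ female * C(deploy_status) + C(calendar_year)"
    return smf.logit(formula, data=person_months).fit(disp=False)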
We describe the efficacy of enhanced infection control measures, including those recommended in the Centers for Disease Control and Prevention’s 2012 carbapenem-resistant Enterobacteriaceae (CRE) toolkit, to control concurrent outbreaks of carbapenemase-producing Enterobacteriaceae (CPE) and extensively drug-resistant Acinetobacter baumannii (XDR-AB).
Design
Before-after intervention study.
Setting
Fifteen-bed surgical trauma intensive care unit (ICU).
Methods
We investigated the impact of enhanced infection control measures in response to clusters of CPE and XDR-AB infections in an ICU from April 2009 to March 2010. Polymerase chain reaction was used to detect the presence of blaKPC and resistance plasmids in CRE. Pulsed-field gel electrophoresis was performed to assess XDR-AB clonality. Enhanced infection-control measures were implemented in response to ongoing transmission of CPE and a new outbreak of XDR-AB. Efficacy was evaluated by comparing the incidence rate (IR) of CPE and XDR-AB before and after the implementation of these measures.
Results
The IR of CPE for the 12 months before the implementation of enhanced measures was 7.77 cases per 1,000 patient-days, whereas the IR of XDR-AB for the 3 months before implementation was 6.79 cases per 1,000 patient-days. All examined CPE shared endemic blaKPC resistance plasmids, and 6 of the 7 XDR-AB isolates were clonal. Following institution of enhanced infection control measures, the CPE IR decreased to 1.22 cases per 1,000 patient-days (P = .001), and no more cases of XDR-AB were identified.
Conclusions
Use of infection control measures described in the Centers for Disease Control and Prevention’s 2012 CRE toolkit was associated with a reduction in the IR of CPE and an interruption in XDR-AB transmission.
During improved oil recovery (IOR), gas may be introduced into a porous reservoir filled with surfactant solution in order to form foam. A model for the evolution of the resulting foam front known as ‘pressure-driven growth’ is analysed. An asymptotic solution of this model for long times is derived that shows that foam can propagate indefinitely into the reservoir without gravity override. Moreover, ‘pressure-driven growth’ is shown to correspond to a special case of the more general ‘viscous froth’ model. In particular, it is a singular limit of the viscous froth, corresponding to the elimination of a surface tension term, permitting sharp corners and kinks in the predicted shape of the front. Sharp corners tend to develop from concave regions of the front. The principal solution of interest has a convex front, however, so that although this solution itself has no sharp corners (except for some kinks that develop spuriously owing to errors in a numerical scheme), it is found nevertheless to exhibit milder singularities in front curvature, as the long-time asymptotic analytical solution makes clear. Numerical schemes for the evolving front shape which perform robustly (avoiding the development of spurious kinks) are also developed. Generalisations of this solution to geologically heterogeneous reservoirs should exhibit concavities and/or sharp corner singularities as an inherent part of their evolution: propagation of fronts containing such ‘inherent’ singularities can be readily incorporated into these numerical schemes.
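As an illustration of what numerically evolving such a front involves, the sketch below advances a discretised front in which each material point moves along its local normal at a speed inversely proportional to the distance it has already travelled, scaled by the remaining driving pressure at its depth. This (1 - y)/s speed law is a commonly quoted dimensionless statement of pressure-driven growth assumed here for illustration, not taken verbatim from the paper, and the sketch makes no attempt to suppress the spurious kinks discussed above.

import numpy as np

def evolve_front(points, s, dt=1e-4, steps=1000, s_min=1e-3):
    """points: (N, 2) array of (x, y) front positions, with y the dimensionless
    depth below the injection level in [0, 1); s: (N,) path length travelled."""
    pts = points.copy()
    s = np.maximum(s.copy(), s_min)          # avoid the 1/s singularity at start
    for _ in range(steps):
        # Local tangent by centred differences, rotated to give a normal.
        tang = np.gradient(pts, axis=0)
        tang /= np.linalg.norm(tang, axis=1, keepdims=True)
        normal = np.column_stack([tang[:, 1], -tang[:, 0]])
        speed = np.maximum(1.0 - pts[:, 1], 0.0) / s   # (1 - y)/s speed law
        step = (speed * dt)[:, None] * normal
        pts += step
        s += np.linalg.norm(step, axis=1)
    return pts, s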
The US Army suicide rate has increased sharply in recent years. Identifying significant predictors of Army suicides in Army and Department of Defense (DoD) administrative records might help focus prevention efforts and guide intervention content. Previous studies of administrative data, although documenting significant predictors, were based on limited samples and models. A career history perspective is used here to develop more textured models.
Method
The analysis was carried out as part of the Historical Administrative Data Study (HADS) of the Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS). De-identified data were combined across numerous Army and DoD administrative data systems for all Regular Army soldiers on active duty in 2004–2009. Multivariate associations of sociodemographics and Army career variables with suicide were examined in subgroups defined by time in service, rank and deployment history.
Results
Several novel results were found that could have intervention implications. The most notable of these were significantly elevated suicide rates (69.6–80.0 suicides per 100 000 person-years compared with 18.5 suicides per 100 000 person-years in the total Army) among enlisted soldiers deployed either during their first year of service or with less than expected (based on time in service) junior enlisted rank; a substantially greater rise in suicide among women than men during deployment; and a protective effect of marriage against suicide only during deployment.
Conclusions
A career history approach produces several actionable insights missed in less textured analyses of administrative data predictors. Expansion of analyses to a richer set of predictors might help refine understanding of intervention implications.
We present the first sample of diffuse interstellar bands (DIBs) in the nearby galaxy M33. Studying DIBs in other galaxies allows the behaviour of the carriers to be examined under interstellar conditions which can be quite different from those of the Milky Way, and to determine which DIB properties can be used as reliable probes of extragalactic interstellar media. Multi-object spectroscopy of 43 stars in M33 has been performed using Keck/DEIMOS. The stellar spectral types were determined and combined with literature photometry to determine the M33 reddenings E(B-V)M33. Equivalent widths or upper limits have been measured for the λ5780 DIB towards each star. DIBs were detected towards 20 stars, demonstrating that their carriers are abundant in M33. The relationship with reddening is found to be at the upper end of the range observed in the Milky Way. The line of sight towards one star has an unusually strong ratio of DIB equivalent width to E(B-V)M33, and a total of seven DIBs were detected towards this star.
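The equivalent widths quoted for the λ5780 DIB follow the standard definition: the integral of the fractional absorption depth over wavelength in a continuum-normalised spectrum. A short illustration is sketched below; the integration window around 5780 Å is an arbitrary choice for the example.

import numpy as np

def equivalent_width(wavelength, normalised_flux, band=(5775.0, 5786.0)):
    """wavelength in Angstroms, sorted ascending; normalised_flux = flux/continuum.
    Returns the equivalent width (same units as wavelength) over 'band'."""
    m = (wavelength >= band[0]) & (wavelength <= band[1])
    depth = 1.0 - normalised_flux[m]          # fractional absorption depth
    # Trapezoidal integration of the absorption depth over wavelength.
    return np.sum(0.5 * (depth[1:] + depth[:-1]) * np.diff(wavelength[m]))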
We present a preliminary analysis of a set of optical (3800-8800 Å) high resolution (R = 80,000) spectra for 69 diffuse interstellar band targets. We carried out a sensitive search for interstellar features in the wavelength range 8470-8740 Å that will be covered by the upcoming GAIA mission. We also investigate how the λ8620Å DIB strength varies as a function of other interstellar parameters (other DIBs, E(B-V) and atomic and molecular column densities).
We present a road map of several research avenues leading to the identification of the diffuse interstellar band carriers. The proposed programs represent some consensus among the DIB community, and will certainly take many years to complete. However, the scientific payoff will be huge, and will ultimately lead to the solution of the DIB problem.