Flight crews’ capacity to conduct take-off and landing in near-zero-visibility conditions has been partially addressed by advanced surveillance and cockpit display technology. This capability has yet to be realised in the context of manoeuvring aircraft within airport terminal areas. In this paper, the performance and workload benefits of user-centred designed visual and haptic taxi navigational cues, presented via a head-up display (HUD) and active sidestick, respectively, were evaluated in simulated taxiing trials by 12 professional pilots. In addition, the trials sought to examine pilot acceptance of sidestick nose-wheel steering. The HUD navigational cues demonstrated a significant task-specific benefit, reducing centreline deviation during turns and the frequency of major taxiway deviations. In parallel, the visual cues reduced self-reported workload. Pilots’ appraisal of nose-wheel steering by sidestick was positive, and active sidestick cues increased confidence in the multimodal guidance construct. The study presents the first examination of how a multimodal display combining visual and haptic cues could support the safety and efficiency with which pilots conduct a taxi navigation task in low-visibility conditions.
This study aimed to assess the effectiveness of an ENT simulation course for equipping foundation doctors with core ENT skills in preparation for an ENT senior house officer post.
Method
A total of 41 foundation doctors in the East of England participated in our two-part simulation course. Pre- and post-course surveys, consisting of Likert scales and a Dundee Ready Educational Environment Measure, were sent to assess confidence in core ENT skills and acceptability of course format.
Results
Post-simulation, confidence improved in all core ENT skills taught (p < 0.001), along with confidence and preparedness to work as an ENT senior house officer (p < 0.001). Overall course median Dundee Ready Educational Environment Measure score was 48, and 100 per cent of participants would recommend this course to colleagues.
Conclusion
Simulation improves foundation doctors’ confidence in core ENT skills and increases preparedness for working as an ENT senior house officer. Guidance on core ENT skills requirements should be made available to improve uniformity amongst ENT simulation courses.
To determine the incidence of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection among healthcare personnel (HCP) and to assess occupational risks for SARS-CoV-2 infection.
Design:
Prospective cohort of healthcare personnel (HCP) followed for 6 months from May through December 2020.
Setting:
Large academic healthcare system including 4 hospitals and affiliated clinics in Atlanta, Georgia.
Participants:
HCP, including those with and without direct patient-care activities, working during the coronavirus disease 2019 (COVID-19) pandemic.
Methods:
Incident SARS-CoV-2 infections were determined through serologic testing for SARS-CoV-2 IgG at enrollment, at 3 months, and at 6 months. HCP completed monthly surveys regarding occupational activities. Multivariable logistic regression was used to identify occupational factors that increased the risk of SARS-CoV-2 infection.
Results:
Of the 304 evaluable HCP who were seronegative at enrollment, 26 (9%) seroconverted for SARS-CoV-2 IgG by 6 months. Overall, 219 participants (73%) self-identified as White race, 119 (40%) were nurses, and 121 (40%) worked on inpatient medical-surgical floors. In a multivariable analysis, HCP who identified as Black race were more likely to seroconvert than HCP who identified as White (odds ratio, 4.5; 95% confidence interval, 1.3–14.2). Increased risk for SARS-CoV-2 infection was not identified for any occupational activity, including spending >50% of a typical shift at a patient’s bedside, working in a COVID-19 unit, or performing or being present for aerosol-generating procedures (AGPs).
Conclusions:
In our study cohort of HCP working in an academic healthcare system, <10% had evidence of SARS-CoV-2 infection over 6 months. No specific occupational activities were identified as increasing risk for SARS-CoV-2 infection.
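The multivariable logistic regression used in the study above is not reproduced here, but the building block it adjusts, an unadjusted odds ratio with a Woolf (log-scale) 95% confidence interval from a 2×2 exposure-by-outcome table, can be sketched directly. The counts in the usage example are hypothetical, for illustration only, and are not the study data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with Woolf (log-scale) 95% CI for a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) under the Woolf method
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 10 of 50 exposed HCP seroconverted
# vs 16 of 254 unexposed HCP.
or_, lo, hi = odds_ratio_ci(10, 40, 16, 238)
```

In practice, a confidence interval that excludes 1 (as with the race-associated OR of 4.5, CI 1.3–14.2 reported above) indicates a statistically significant association; adjustment for covariates requires the full regression model.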
To describe the cumulative seroprevalence of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) antibodies during the coronavirus disease 2019 (COVID-19) pandemic among employees of a large pediatric healthcare system.
Design, setting, and participants:
Prospective observational cohort study open to adult employees at the Children’s Hospital of Philadelphia, conducted April 20–December 17, 2020.
Methods:
Employees were recruited starting with high-risk exposure groups, utilizing e-mails, flyers, and announcements at virtual town hall meetings. At baseline, 1 month, 2 months, and 6 months, participants reported occupational and community exposures and gave a blood sample for SARS-CoV-2 antibody measurement by enzyme-linked immunosorbent assays (ELISAs). A post hoc Cox proportional hazards regression model was performed to identify factors associated with increased risk for seropositivity.
Results:
In total, 1,740 employees were enrolled. At 6 months, the cumulative seroprevalence was 5.3%, which was below estimated community point seroprevalence. Seroprevalence was 5.8% among employees who provided direct care and was 3.4% among employees who did not perform direct patient care. Most participants who were seropositive at baseline remained positive at follow-up assessments. In a post hoc analysis, direct patient care (hazard ratio [HR], 1.95; 95% confidence interval [CI], 1.03–3.68), Black race (HR, 2.70; 95% CI, 1.24–5.87), and exposure to a confirmed case in a nonhealthcare setting (HR, 4.32; 95% CI, 2.71–6.88) were associated with statistically significant increased risk for seropositivity.
Conclusions:
Employee SARS-CoV-2 seroprevalence rates remained below the point-prevalence rates of the surrounding community. Provision of direct patient care, Black race, and exposure to a confirmed case in a nonhealthcare setting conferred increased risk. These data can inform occupational protection measures to maximize protection of employees within the workplace during future COVID-19 waves or other epidemics.
Among 353 healthcare personnel in a longitudinal cohort in 4 hospitals in Atlanta, Georgia (May–June 2020), 23 (6.5%) had severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) antibodies. Spending >50% of a typical shift at the bedside (OR, 3.4; 95% CI, 1.2–10.5) and Black race (OR, 8.4; 95% CI, 2.7–27.4) were associated with SARS-CoV-2 seropositivity.
Clostridium difficile, the most common cause of hospital-associated diarrhoea in developed countries, presents major public health challenges. The high clinical and economic burden from C. difficile infection (CDI) relates to the high frequency of recurrent infections caused by either the same or different strains of C. difficile. An interval of 8 weeks after index infection is commonly used to classify recurrent CDI episodes. We assessed strains of C. difficile in a sample of patients with recurrent CDI in Western Australia from October 2011 to July 2017. The performance of different intervals between initial and subsequent episodes of CDI was investigated. Of 4612 patients with CDI, 1471 (32%) were identified with recurrence. PCR ribotyping data were available for initial and recurrent episodes for 551 patients. Relapse (recurrence with same ribotype (RT) as index episode) was found in 350 (64%) patients and reinfection (recurrence with new RT) in 201 (36%) patients. Our analysis indicates that 8- and 20-week intervals failed to adequately distinguish reinfection from relapse. In addition, living in a non-metropolitan area modified the effect of age on the risk of relapse. Where molecular epidemiological data are not available, we suggest that applying an 8-week interval to define recurrent CDI requires more consideration.
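The interval-based case definition discussed above can be sketched as a small classifier. The 8-week recurrence window and the relapse/reinfection distinction by ribotype follow the abstract; the function name and the ribotype labels in the example are illustrative assumptions:

```python
def classify_recurrence(index_rt, recur_rt, weeks_since_index, window_weeks=8):
    """Classify a subsequent CDI episode relative to the index episode.

    index_rt / recur_rt: PCR ribotypes of the index and subsequent isolates.
    weeks_since_index:   interval between the two episodes, in weeks.
    window_weeks:        interval used to call an episode recurrent (commonly 8).
    Returns None when the episode falls outside the recurrence window
    (i.e. it would be treated as a new index infection).
    """
    if weeks_since_index > window_weeks:
        return None
    # Same ribotype within the window -> relapse; new ribotype -> reinfection
    return "relapse" if recur_rt == index_rt else "reinfection"

# Illustrative ribotype labels, not study isolates:
classify_recurrence("RT027", "RT027", 4)   # relapse
classify_recurrence("RT027", "RT014", 6)   # reinfection
```

As the abstract notes, any fixed window misclassifies some episodes: a same-ribotype recurrence after the cut-off is counted as a new infection even when it is biologically a relapse, which is why the 8- and 20-week intervals performed poorly without molecular typing.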
Salmonella enterica serovar Wangata (S. Wangata) is an important cause of endemic salmonellosis in Australia, with human infections occurring from undefined sources. This investigation sought to examine possible environmental and zoonotic sources for human infections with S. Wangata in north-eastern New South Wales (NSW), Australia. The investigation adopted a One Health approach and comprised three complementary components: a case–control study examining human risk factors; environmental and animal sampling; and genomic analysis of human, animal and environmental isolates. Forty-eight human S. Wangata cases were interviewed during a 6-month period from November 2016 to April 2017, together with 55 Salmonella Typhimurium (S. Typhimurium) controls and 130 neighbourhood controls. Indirect contact with bats/flying foxes (S. Typhimurium controls (adjusted odds ratio (aOR) 2.63, 95% confidence interval (CI) 1.06–6.48)) (neighbourhood controls (aOR 8.33, 95% CI 2.58–26.83)), wild frogs (aOR 3.65, 95% CI 1.32–10.07) and wild birds (aOR 6.93, 95% CI 2.29–21.00) were statistically associated with illness in multivariable analyses. S. Wangata was detected in dog faeces, wildlife scats and a compost specimen collected from the outdoor environments of cases’ residences. In addition, S. Wangata was detected in the faeces of wild birds and sea turtles in the investigation area. Genomic analysis revealed that S. Wangata isolates were relatively clonal. Our findings suggest that S. Wangata is present in the environment and may have a reservoir in wildlife populations in north-eastern NSW. Further investigation is required to better understand the occurrence of Salmonella in wildlife groups and to identify possible transmission pathways for human infections.
For livestock production systems to play a positive role in global food security, the balance between their benefits and disbenefits to society must be appropriately managed. Based on the evidence provided by field-scale randomised controlled trials around the world, this debate has traditionally centred on the concept of economic-environmental trade-offs, whose existence is theoretically assured when resource allocation is perfect on the farm. Recent research conducted on commercial farms indicates, however, that the economic-environmental nexus is not nearly as straightforward in the real world, with environmental performances of enterprises often positively correlated with their economic profitability. Using high-resolution primary data from the North Wyke Farm Platform, an intensively instrumented farm-scale ruminant research facility located in southwest United Kingdom, this paper proposes a novel, information-driven approach to carry out comprehensive assessments of economic-environmental trade-offs inherent within pasture-based cattle and sheep production systems. The results of a data-mining exercise suggest that a potentially systematic interaction exists between ‘soil health’, ecological surroundings and livestock grazing, whereby a higher level of soil organic carbon (SOC) stock is associated with better animal performance and lower nutrient losses into watercourses, and a higher stocking density with greater botanical diversity and elevated SOC. We contend that a combination of farming system-wide trials and environmental instrumentation provides an ideal setting for enrolling scientifically sound and biologically informative metrics for agricultural sustainability, through which agricultural producers could obtain guidance to manage soils, water, pasture and livestock in an economically and environmentally acceptable manner. Priority areas for future farm-scale research to ensure long-term sustainability are also discussed.
The aim of the present paper is to summarise current and future applications of dietary assessment technologies in nutrition surveys in developed countries. It includes the discussion of key points and highlights of subsequent developments from a panel discussion to address strengths and weaknesses of traditional dietary assessment methods (food records, FFQ, 24 h recalls, diet history with interviewer-assisted data collection) v. new technology-based dietary assessment methods (web-based and mobile device applications). The panel discussion ‘Traditional methods v. new technologies: dilemmas for dietary assessment in population surveys’, was held at the 9th International Conference on Diet and Activity Methods (ICDAM9), Brisbane, September 2015. Despite respondent and researcher burden, traditional methods have been most commonly used in nutrition surveys. However, dietary assessment technologies offer potential advantages including faster data processing and better data quality. This is a fast-moving field and there is evidence of increasing demand for the use of new technologies amongst the general public and researchers. There is a need for research and investment to support efforts being made to facilitate the inclusion of new technologies for rapid, accurate and representative data.
Evidence suggests some overlap between the pathological use of food and drugs, yet how impulsivity compares across these different clinical disorders remains unclear. Substance use disorders are commonly characterized by elevated impulsivity, and impulsivity subtypes may show commonalities and differences in various conditions. We hypothesized that obese subjects with binge-eating disorder (BED) and abstinent alcohol-dependent cohorts would have relatively more impulsive profiles compared to obese subjects without BED. We also predicted decision impulsivity impairment in obesity with and without BED.
Method.
Thirty obese subjects with BED, 30 without BED and 30 abstinent alcohol-dependent subjects and age- and gender-matched controls were tested on delay discounting (preference for a smaller immediate reward over a larger delayed reward), reflection impulsivity (rapid decision making prior to evidence accumulation) and motor response inhibition (action cancellation of a prepotent response).
Results.
All three groups had greater delay discounting relative to healthy volunteers. Both obese subjects without BED and alcohol-dependent subjects had impaired motor response inhibition. Only obese subjects without BED had impaired integration of available information to optimize outcomes over later trials with a cost condition.
Conclusions.
Delay discounting appears to be a common core impairment across disorders of food and drug intake. Unexpectedly, obese subjects without BED showed greater impulsivity than obese subjects with BED. We highlight the dissociability and heterogeneity of impulsivity subtypes and add to the understanding of neurocognitive profiles across disorders involving food and drugs. Our results have therapeutic implications suggesting that disorder-specific patterns of impulsivity could be targeted.
Village chickens have been kept for millennia under the patronage of smallholder farmers. Our study was intended to dissect the signatures of artificial selection and ecological variation in the morphological structures of Ethiopian village chickens. This report was based on visual traits of 798 chickens and concise one-to-one interviews of 399 farmers about their preferences for chicken morphology. Significant population-specific differences in morphological counts were commonly found for rare morphological variants. Most of these were frequently seen in Jarso chickens, while some were unique to Jarso chickens. This might be explained by the effect of location-specific evolutionary forces and differences in breeding histories. High within-population variation in the frequency of morphological counts was observed among these panmictic chicken populations, which largely evolved under uncontrolled mating. Single comb was less preferred by the majority of farmers (93.8 per cent); it was thus present at a low frequency (26.7 per cent). Farmers showed a high preference for yellow shank (42.3 per cent), which was correspondingly observed at high frequency (61.1 per cent). The reported reasons for morphological preferences were visual appeal, market demand, and cultural and religious values. The absence of significant variation in preferences for chicken morphology between communities at the two study sites was attributed to their multifunctional needs.
Silicon nanoparticles (Si NPs) were synthesized by plasma enhanced chemical vapor deposition (PECVD) using silane as a silicon source. Allylamine was used as passivation ligands to form water-soluble Si NPs. Finally, aqueous asymmetric flow field-flow fractionation was used to successfully separate the polydisperse Si NPs into monodisperse Si NP fractions.
β-Glucans have been identified as natural biomolecules with immunomodulatory activity. The first objective of the present study was to compare the effects of purified β-glucans derived from Laminaria digitata, L. hyperborea and Saccharomyces cerevisiae on piglet performance, selected bacterial populations and intestinal volatile fatty acid (VFA) production. The second aim was to compare the gene expression profiles of the markers of pro- and anti-inflammation in both unchallenged and lipopolysaccharide (LPS)-challenged ileal and colonic tissues. β-Glucans were included at 250 mg/kg in the diets. The β-glucans derived from L. hyperborea, L. digitata and S. cerevisiae all reduced the Enterobacteriaceae population (P < 0·05) without influencing the lactobacilli and bifidobacteria populations (P > 0·05) in the ileum and colon. There was a significant interaction between gastrointestinal region and β-glucan source in the expression of cytokine markers, IL-1α (P < 0·001), IL-10 (P < 0·05), TNF-α (P < 0·05) and IL-17A (P < 0·001). β-Glucans did not stimulate any pro- or anti-inflammatory cytokine markers in the ileal epithelial cells. In contrast, the expression of a panel of pro- and anti-inflammatory cytokines (IL-1α, IL-10, TNF-α and IL-17A) was down-regulated in the colon following exposure to β-glucans from all three sources. However, the data suggest that the soluble β-glucans derived from L. digitata may be acting via a different mechanism from the insoluble β-glucans derived from L. hyperborea and S. cerevisiae, as the VFA profile was different in the L. digitata-treated animals. There was an increase in IL-8 gene expression (P < 0·05) in the gastrointestinal tract of the animals exposed to L. digitata following an LPS ex vivo challenge that was not evident in the other two treatment groups.
In conclusion, β-glucans from both seaweed and yeast sources reduce Enterobacteriaceae counts and pro-inflammatory markers in the colon, though the mechanisms of action may be different between the soluble and insoluble fibre sources.
Salmonella Mbandaka was isolated from cattle on three dairy farms. The duration of infection was less than four weeks, and none of the animals became clinically ill. The animals had all consumed a diet containing a vegetable fat supplement contaminated with S. Mbandaka, and this was shown to be the source of the infections. It is significant that a feed containing purely vegetable components was incriminated.