Host–parasite adaptation is crucial for the evolutionary success of a parasite. The nematode Angiostrongylus cantonensis has a heterogenic life cycle involving molluscs as intermediate hosts and rats as definitive hosts. Several mollusc species are susceptible hosts of A. cantonensis, allowing the development of first-stage larvae (L1) into third-stage larvae (L3). Changes in the metabolism of infected molluscs have been demonstrated, disturbing regular routes and inducing host defence mechanisms. This study aimed to identify changes in the proteomic content of Phyllocaulis spp. mucus during A. cantonensis infection. Proteins were extracted from the mucus samples of infected and non-infected slugs and identified using liquid chromatography–tandem mass spectrometry. We found 26 up-regulated and 15 down-regulated proteins in infected slugs compared to non-infected slugs. Protein profiles are promising markers of parasite infection, and a better understanding of proteomic profiles during infection may help inform in vivo studies and promote new tools for the non-invasive identification of infected hosts.
The marketing of unhealthy foods has been implicated in poor diet and rising levels of obesity. Rapid developments in the digital food marketing ecosystem and associated research mean that contemporary review of the evidence is warranted. This preregistered (CRD420212337091) systematic review and meta-analysis aimed to provide an updated synthesis of the evidence for behavioural and health impacts of food marketing on both children and adults, using the 4Ps framework (Promotion, Product, Price, Place). Ten databases were searched from 2014 to 2021 for primary data articles of quantitative or mixed design, reporting on one or more outcomes of interest following food marketing exposure compared with a relevant control. Reviews, abstracts, letters/editorials and qualitative studies were excluded. Eighty-two studies were included in the narrative review and twenty-three in the meta-analyses. Study quality (RoB2/Newcastle–Ottawa scale) was mixed. Studies examined ‘promotion’ (n 55), ‘product’ (n 17), ‘price’ (n 15) and ‘place’ (n 2) (some spanned > 1 category). There is evidence of impacts of food marketing in multiple media and settings on outcomes, including increased purchase intention, purchase requests, purchase, preference, choice, and consumption in children and adults. Meta-analysis demonstrated a significant impact of food marketing on increased choice of unhealthy foods (OR = 2·45 (95 % CI 1·41, 4·27), Z = 3·18, P = 0·002, I2 = 93·1 %) and increased food consumption (standardised mean difference = 0·311 (95 % CI 0·185, 0·437), Z = 4·83, P < 0·001, I2 = 53·0 %). Evidence gaps were identified for the impact of brand-only and outdoor streetscape food marketing, and for data on the extent to which food marketing may contribute to health inequalities which, if available, would support UK and international public health policy development.
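As an arithmetic sanity check (not part of the review itself), the reported Z statistic for the choice meta-analysis can be recovered from the odds ratio and its confidence interval, assuming the interval was computed on the log-odds scale:

```python
import math

def z_from_or_ci(odds_ratio, ci_lo, ci_hi):
    """Recover the Z statistic from an odds ratio and its 95% CI.

    Assumes the CI was constructed on the log-odds scale, so the
    standard error is the CI width in log units divided by 2 * 1.96.
    """
    se = (math.log(ci_hi) - math.log(ci_lo)) / (2 * 1.96)
    return math.log(odds_ratio) / se

# Reported: OR = 2.45 (95% CI 1.41, 4.27), Z = 3.18
print(round(z_from_or_ci(2.45, 1.41, 4.27), 2))  # 3.17, close to the reported 3.18
```

The small discrepancy from 3.18 is expected rounding in the published figures.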
Early life stress (ELS) and a Western diet (WD) promote mood and cardiovascular disorders; however, how these risks interact in disease pathogenesis is unclear. We assessed effects of ELS with or without a subsequent WD on behaviour, cardiometabolic risk factors, and cardiac function/ischaemic tolerance in male mice. Fifty-six new-born male C57BL/6J mice were randomly allocated to a control group (CON) undisturbed before weaning, or to maternal separation (3 h/day) and early (postnatal day 17) weaning (MSEW). Mice consumed standard rodent chow (CON, n = 14; MSEW, n = 15) or WD chow (WD, n = 19; MSEW + WD, n = 19) from week 8 to 24. Fasted blood was sampled, and open field and elevated plus maze (EPM) tests were undertaken, at 7, 15, and 23 weeks of age, with hearts excised at 24 weeks for Langendorff perfusion (evaluating pre- and post-ischaemic function). MSEW alone transiently increased open field activity at 7 weeks; body weight and serum triglycerides at 4 and 7 weeks, respectively; and final blood glucose levels and insulin resistance at 23 weeks. WD increased insulin resistance and body weight gain, the latter potentiated by MSEW. MSEW + WD was anxiogenic, reducing EPM open arm activity vs. WD alone. Although MSEW had modest metabolic effects and did not influence cardiac function or ischaemic tolerance in lean mice, it exacerbated weight gain and anxiogenesis, and improved ischaemic tolerance, in WD-fed animals. MSEW-induced increases in body weight (obesity) in WD-fed animals in the absence of changes in insulin resistance may have protected the hearts of these mice.
Parents’ responses to their children’s negative emotions are a central aspect of emotion socialization that have well-established associations with the development of psychopathology. Yet research is lacking on potential bidirectional associations between parental responses and youth symptoms that may unfold over time. Further, additional research is needed on sociocultural factors that may be related to the trajectories of these constructs. In this study, we examined associations between trajectories of parental responses to negative emotions and adolescent internalizing symptoms and the potential role of youth sex and racial identity. Adolescents and caregivers (N = 256) completed six assessments that spanned adolescent ages 13–18 years. Multivariate growth models revealed that adolescents with higher internalizing symptoms at baseline experienced increasingly non-supportive parental responses over time (punitive and distress responses). By contrast, parental responses did not predict initial levels of or changes in internalizing symptoms. Parents of Black youth reported higher minimization and emotion-focused responses and lower distress responses compared to parents of White youth. We found minimal evidence for sex differences in parental responses. Internalizing symptoms in early adolescence had enduring effects on parental responses to distress, suggesting that adolescents may play an active role in shaping their emotion socialization developmental context.
Aviation passenger screening has been used worldwide to mitigate the translocation risk of SARS-CoV-2. We present a model that evaluates factors in screening strategies used in air travel and assess their relative sensitivity and importance in identifying infectious passengers. We use adapted Monte Carlo simulations to produce hypothetical disease timelines for the Omicron variant of SARS-CoV-2 for travelling passengers. Screening strategy factors assessed include having one or two RT-PCR and/or antigen tests prior to departure and/or post-arrival, and quarantine length and compliance upon arrival. One or more post-arrival tests and high quarantine compliance were the most important factors in reducing pathogen translocation. Screening that combines quarantine and post-arrival testing can shorten the length of quarantine for travellers. With greater time between the first and second post-arrival tests, variability in post-arrival RT-PCR and antigen test sensitivity decreases while mean sensitivity increases. This study provides insight into the role various screening strategy factors have in preventing the translocation of infectious diseases and a flexible framework adaptable to other existing or emerging diseases. Such findings may help in public health policy and decision-making in present and future evidence-based practices for passenger screening and pandemic preparedness.
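The simulation approach described above can be illustrated with a minimal sketch. Every parameter here (incubation distribution, test sensitivities, infectious window, quarantine compliance rate) is an assumption chosen for demonstration, not a value from the study:

```python
import random
from collections import Counter

def screen_traveller(rng):
    """Simulate one passenger through a pre-departure antigen test,
    a post-arrival RT-PCR test, and imperfect quarantine.
    All numeric parameters below are illustrative assumptions."""
    incubation = rng.lognormvariate(1.1, 0.5)   # days from infection to infectiousness
    infection_age = rng.uniform(0, 10)          # days since infection at departure
    # Pre-departure antigen test, assumed 80% sensitive near/after onset
    if infection_age > incubation - 1 and rng.random() < 0.80:
        return "denied_boarding"
    infection_age += 0.5                        # half-day flight
    if not (incubation < infection_age < incubation + 8):
        return "not_infectious_on_arrival"
    if rng.random() < 0.95:                     # post-arrival RT-PCR, 95% sensitive
        return "caught_by_pcr"
    if rng.random() < 0.70:                     # 70% quarantine compliance
        return "isolated_in_quarantine"
    return "missed"

rng = random.Random(42)
outcomes = Counter(screen_traveller(rng) for _ in range(100_000))
risk = outcomes["missed"] / 100_000
print(outcomes)
print(f"fraction of travellers translocating infection: {risk:.4f}")
```

Varying the test timings, sensitivities, and compliance rate in such a model is what allows the relative importance of each screening factor to be ranked.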
A small number of studies have confirmed the advantage of generalised equivalent uniform dose (gEUD) optimisation for some standard clinical scenarios; however, its performance with complicated stereotactic treatments is yet to be explored. Therefore, this study compared two planning optimisation methods, gEUD and Physical dose, in stereotactic treatments for several complex anatomical locations.
Methods:
Thirty patients were selected, ten each for sites of brain, lung and spine. Two stereotactic plans were generated for each case using the gEUD objective and Physical objective cost functions. Within each of the three sites, dosimetric indices for conformity, gradient and homogeneity, along with parameters of monitor units and dose–volume histograms (DVHs), were compared for statistical significance. Additionally, patient-specific quality assurance was conducted using portal dosimetry, and the gamma passing rate between the two plans was evaluated.
Results:
Optimisation was better with a gEUD objective as compared with Physical objective, notably sparing critical organs. Overall, the differences in mean values for six critical organs at risk favoured gEUD-based over Physical-based plans (all six 2-tailed p-values were < 0·0002). Furthermore, all differences in mean values for DVH parameters favoured gEUD-based plans: GTVmean, GTVmax, PTVD100V, homogeneity index, gradient index and monitor unit (treatment time) (each 2-tailed p < 0·05).
Conclusions:
gEUD optimisation in stereotactic treatment plans has a clear and general statistical advantage over Physical dose optimisation.
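For readers unfamiliar with the objective under comparison, the generalised equivalent uniform dose reduces a dose distribution to a single value through a tissue-specific parameter a. A minimal sketch, with illustrative voxel doses (positive doses assumed):

```python
def geud(doses, a):
    """Generalised equivalent uniform dose: ((1/N) * sum(d_i^a))^(1/a).

    The tissue-specific parameter a controls the behaviour: a = 1 gives
    the mean dose, large positive a is pulled toward the maximum dose
    (serial organs at risk), and large negative a toward the minimum
    (targets). Assumes all doses are positive.
    """
    n = len(doses)
    return (sum(d ** a for d in doses) / n) ** (1.0 / a)

doses = [58.0, 60.0, 62.0]  # illustrative voxel doses in Gy
print(geud(doses, 1))       # 60.0, the mean dose
print(geud(doses, 20) > 60.0, geud(doses, -20) < 60.0)  # True True
```

Optimising on this single scalar per structure, rather than on many physical dose-volume points, is what gives the gEUD objective its flexibility in sparing organs at risk.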
In Southeast Greenland, summer melt and high winter snowfall rates give rise to firn aquifers: vast stores of meltwater buried beneath the ice-sheet surface. Previous detailed studies of a single Greenland firn aquifer site suggest that the water drains into crevasses, but this is not known at a regional scale. We develop and use a tool in Ghub, an online gateway of shared datasets, tools and supercomputing resources for glaciology, to identify crevasses from elevation data collected by NASA's Airborne Topographic Mapper across 29,000 km² of Southeast Greenland. We find crevasses within 3 km of the previously mapped downglacier boundary of the firn aquifer at 20 of 25 flightline crossings. Our data suggest that crevasses widen until they reach the downglacier boundary of the firn aquifer, implying that crevasses collect firn-aquifer water, but we did not find this trend with statistical significance. The median crevasse width, 27 m, implies an aspect ratio consistent with the crevasses reaching the bed. Our results support the idea that most water in Southeast Greenland firn aquifers drains through crevasses. Less common fates are discharge at the ice-sheet surface (3 of 25 sites) and refreezing at the aquifer bottom (1 of 25 sites).
The history of maize in Central America and surrounding areas has implications for the slow transition from hunting and gathering to agriculture. The spread of early forms of domesticated maize from southern Mexico across Mesoamerica and into South America has been dated to about 8,700–6,500 years ago on the basis of a handful of studies relying primarily on the analysis of pollen, phytoliths, or starch grains. Recent genomic data from southern Belize have been used to identify Archaic period south-to-north population movements from lower Central America, suggesting this migration pattern as a mechanism that introduced genetically improved maize races from South America. Gradually, maize productivity increased to the point that it was suitable for use as a staple crop. Here we present a summary of paleoecological data that support the late and uneven entry of maize into the Maya area relative to other regions of Central America and identify the Pacific coastal margin as the probable route by which maize spread southward into Panama and South America. We consider some implications of the early appearance of maize for Late Archaic populations in these areas; for example, with respect to the establishment of sedentary village life.
Airway management is a controversial topic in modern Emergency Medical Services (EMS) systems. Among many concerns regarding endotracheal intubation (ETI), unrecognized esophageal intubation and observations of unfavorable neurologic outcomes in some studies raise the question of whether alternative airway techniques should be first-line in EMS airway management protocols. Supraglottic airway devices (SADs) are simpler to use, provide reliable oxygenation and ventilation, and may thus be an alternative first-line airway device for paramedics. In 2019, Alachua County Fire Rescue (ACFR; Alachua, Florida USA) introduced a novel protocol for advanced airway management emphasizing first-line use of a second-generation SAD (i-gel) for patients requiring medication-facilitated airway management (referred to as “rapid sequence airway” [RSA] protocol).
Study Objective:
This was a one-year quality assurance review of care provided under the RSA protocol looking at compliance and first-pass success rate of first-line SAD use.
Methods:
Records were obtained from the agency’s electronic medical record (EMR), searching for the use of the RSA protocol, advanced airway devices, or either ketamine or rocuronium. If available, hospital follow-up data regarding patient condition and emergency department (ED) airway exchange were obtained.
Results:
During the first year, 33 advanced airway attempts were made under the protocol by 23 paramedics. Overall, compliance with the airway device sequence as specified in the protocol was 72.7%. When ETI was used non-compliantly as the first-line airway device, the first-pass success rate was 44.4%, compared with 87.5% when first-line SAD use was adhered to. All prehospital SADs were exchanged in the ED in a delayed fashion, almost exclusively per physician preference alone. In no case was a SAD exchanged for suspected dislodgement (as would be evidenced by loss of capnography).
Conclusion:
First-line use of a SAD was associated with a high first-pass attempt success rate in a real-life cohort of prehospital advanced airway encounters. No SAD required emergent exchange upon hospital arrival.
Automated virtual reality therapies are being developed to increase access to psychological interventions. We assessed the experience with one such therapy of patients diagnosed with psychosis, including satisfaction, side effects, and positive experiences of access to the technology. We tested whether side effects affected therapy.
Methods
In a clinical trial 122 patients diagnosed with psychosis completed baseline measures of psychiatric symptoms, received gameChange VR therapy, and then completed a satisfaction questionnaire, the Oxford-VR Side Effects Checklist, and outcome measures.
Results
79 (65.8%) patients were very satisfied with VR therapy, 37 (30.8%) were mostly satisfied, 3 (2.5%) were indifferent/mildly dissatisfied, and 1 (0.8%) person was quite dissatisfied. The most common side effects were: difficulties concentrating because of thinking about what might be happening in the room (n = 17, 14.2%); lasting headache (n = 10, 8.3%); and the headset causing feelings of panic (n = 9, 7.4%). Side effects formed three factors: difficulties concentrating when wearing a headset, feelings of panic using VR, and worries following VR. The occurrence of side effects was not associated with number of VR sessions, therapy outcomes, or psychiatric symptoms. Difficulties concentrating in VR were associated with slightly lower satisfaction. VR therapy provision and engagement made patients feel: proud (n = 99, 81.8%); valued (n = 97, 80.2%); and optimistic (n = 96, 79.3%).
Conclusions
Patients with psychosis were generally very positive towards the VR therapy, valued having the opportunity to try the technology, and experienced few adverse effects. Side effects did not significantly impact VR therapy. Patient experience of VR is likely to facilitate widespread adoption.
Physical prototyping during early-stage design is typically an iterative process. Commonly, a single prototype is used throughout the process, with its form modified as the design evolves. If the form of the prototype is not captured as each iteration occurs, understanding how specific design changes impact upon the satisfaction of requirements is challenging, particularly retrospectively.
In this paper, two different systems for digitising physical artefacts, structured light scanning (SLS) and photogrammetry (PG), are investigated as means of capturing iterations of physical prototypes. First, a series of test artefacts is presented and procedures for operating each system are developed. Next, the artefacts are digitised using both SLS and PG, and the resulting models are compared against a master model of each artefact. Results indicate that both systems can reconstruct the majority of each artefact's geometry to within 0.1 mm of the master; however, SLS demonstrated superior overall performance in terms of both completion time and model quality. Additionally, the quality of PG models was far more dependent on the effort and expertise of the user than was the case for SLS.
To assess the contribution of different food groups to total salt purchases and to evaluate the estimated reduction in salt purchases if mandatory maximum salt limits in South African legislation were being complied with.
Design:
This study conducted a cross-sectional analysis of purchasing data from Discovery Vitality members. Data were linked to the South African FoodSwitch database to determine the salt content of each food product purchased. Food category and total annual salt purchases were determined by summing salt content (kg) per each unit purchased across a whole year. Reductions in annual salt purchases were estimated by applying legislated maximum limits to product salt content.
Setting:
South Africa.
Participants:
The study utilised purchasing data from 344 161 households, members of Discovery Vitality, collected for a whole year between January and December 2018.
Results:
Vitality members purchased R12·8 billion worth of food products in 2018, representing 9562 products from which 264 583 kg of salt was purchased. The main contributors to salt purchases were bread and bakery products (23·3 %); meat and meat products (19 %); dairy (12·2 %); sauces, dressings, spreads and dips (11·8 %); and convenience foods (8·7 %). The projected total quantity of salt that would be purchased after implementation of the salt legislation was 250 346 kg, a reduction of 5·4 % from 2018 levels.
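The headline reduction follows directly from the two purchase totals reported above:

```python
baseline_kg = 264_583    # total salt purchased in 2018
projected_kg = 250_346   # projected purchases under full legislative compliance
reduction_pct = (baseline_kg - projected_kg) / baseline_kg * 100
print(f"reduction: {reduction_pct:.1f} %")  # reduction: 5.4 %
```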
Conclusions:
A projected reduction in salt purchases of 5·4 % from 2018 levels suggests that meeting the mandatory maximum salt limits in South Africa will make a meaningful contribution to reducing salt purchases.
Global pork production has largely adopted on-farm biosecurity to minimize vectors of disease transmission and protect swine health. Feed and ingredients were not originally thought to be substantial vectors, but recent incidents have demonstrated their ability to harbor disease. The objective of this paper is to review the potential role of swine feed as a disease vector and describe biosecurity measures that have been evaluated as a way of maintaining swine health. Recent research has demonstrated that viruses such as porcine epidemic diarrhea virus and African swine fever virus can survive conditions of transboundary shipment in soybean meal, lysine, and complete feed, and contaminated feed can cause animal illness. Recent research has focused on potential methods of preventing feed-based pathogens from infecting pigs, including prevention of entry to the feed system, mitigation by thermal processing, or decontamination by chemical additives. Strategies have been designed to understand the spread of pathogens throughout the feed manufacturing environment, including potential batch-to-batch carryover, thus reducing transmission risk. In summary, the focus on feed biosecurity in recent years is warranted, but additional research is needed to further understand the risk and identify cost-effective approaches to maintain feed biosecurity as a way of protecting swine health.
In the broad literature on the effects of ingredients, foods and diets on appetite and energy intake (EI), most trials involve a single acute intervention. It is unclear whether these acute results are generally sustained over longer periods. Researchers and regulators therefore lack an objective basis to judge the appropriate duration of efficacy trials in appetite control, to have confidence that acute effects are likely to be maintained. This gap creates uncertainty in requirements and study designs for the substantiation of satiety-enhancing approaches to help in controlling eating behaviour.
Materials and Methods:
A systematic search of literature (Prospero registration number CRD42015023686) identified studies testing both the acute and chronic effects of food-based interventions aimed at reducing appetite or EI. From 9680 unique records identified from titles and abstracts, 178 papers were selected for full screening. Twenty-six trials met the inclusion criteria and provided data sufficient for use in this analysis, and were also scored for risk of bias (RoB) indicators.
Results:
Most of these trials (21/26) measured appetite outcomes and over half (14/26) had objective measures of EI. A significant acute effect of the intervention was retained in 10 of 12 trials for appetite outcomes, and six of nine studies for EI. Initial effects were most likely retained where these were more robust and studies adequately powered. Where the initial, acute effect was not statistically significant, a significant effect was later observed in only two of nine studies for appetite and none of five studies for EI. The main sources of RoB were lack of a priori power calculations and failure to report analyses based on intention-to-treat. Furthermore, 12/26 studies were not adequately powered to detect a meaningful reduction in appetite (~10%).
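To illustrate the power issue flagged above, the usual normal-approximation formula for a two-group comparison gives the per-group sample size as 2(z_α/2 + z_β)²/d² for a standardised effect d. The mapping of a ~10 % appetite reduction onto d = 0.5 below is an illustrative assumption, not a figure taken from the review:

```python
import math

def n_per_group(effect_d):
    """Per-group sample size for a two-sample comparison at two-sided
    alpha = 0.05 and 80% power, via the normal approximation
    n = 2 * (z_a + z_b)^2 / d^2."""
    z_a = 1.959964   # Phi^{-1}(0.975)
    z_b = 0.841621   # Phi^{-1}(0.80)
    return math.ceil(2 * (z_a + z_b) ** 2 / effect_d ** 2)

# If a ~10% reduction in appetite ratings corresponded to a
# standardised effect of d = 0.5 (an illustrative assumption):
print(n_per_group(0.5))  # 63 participants per group
```

Smaller true effects inflate this requirement sharply (it scales with 1/d²), which is consistent with nearly half the included trials being underpowered.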
Discussion:
Maintenance of acute intervention effects on appetite or EI need to be confirmed, but seems likely where the initially observed effects are robust and replicable in adequately powered studies.
In this research, we analyze the economic effects of various crop insurance subsidy and policy scenarios to determine producer insurance choice responses and total premium and subsidy payments, and to study their economic implications for dryland corn, soybean, and winter wheat producers. We rely on the expected utility maximization framework to rank the policy combination sets available to a typical producer, analyzing the impacts of crop insurance subsidy changes and the elimination of certain insurance policies across the three crops. Several scenarios were analyzed across subsidy and policy options and were found to produce noticeably different farmer behavioral responses and economic implications.
Clonal Mycobacterium mucogenicum isolates (determined by molecular typing) were recovered from 19 bronchoscopic specimens from 15 patients. None of these patients had evidence of mycobacterial infection. Laboratory culture materials and bronchoscopes were negative for mycobacteria. This pseudo-outbreak was caused by contaminated ice used to provide bronchoscopic lavage. Control was achieved by transitioning to sterile ice.
Canadian specialist emergency medicine (EM) residency training is undergoing the most significant transformation in its history. This article describes the rationale, process, and redesign of EM competency-based medical education. The rationale for this evolution in residency education includes 1) improved public trust by increasing transparency of the quality and rigour of residency education, 2) improved fiscal accountability to government and institutions regarding specialist EM training, 3) improved assessment systems to replace poor functioning end-of-rotation assessment reports and overemphasis on high-stakes, end-of-training examinations, and 4) tailored learning for residents to address individualized needs. A working group with geographic and stakeholder representation convened over a 2-year period. A consensus process for decision-making was used. Four key design features of the new residency education design include 1) specialty EM-specific outcomes to be achieved in residency; 2) designation of four progressive stages of training, linked to required learning experiences and entrustable professional activities to be achieved at each stage; 3) tailored learning that provides residency programs and learner flexibility to adapt to local resources and learner needs; and 4) programmatic assessment that emphasizes systematic, longitudinal assessments from multiple sources, and sampling sentinel abilities. Required future study includes a program evaluation of this complex education intervention to ensure that intended outcomes are achieved and unintended outcomes are identified.
Starting in 2016, we initiated a pilot tele-antibiotic stewardship program at 2 rural Veterans Affairs medical centers (VAMCs). Antibiotic days of therapy decreased significantly (P < .05) in the acute and long-term care units at both intervention sites, suggesting that tele-stewardship can effectively support antibiotic stewardship practices in rural VAMCs.
Although there are extensive data on clinical psychopathology in youth with suicidal ideation, data are lacking regarding their neurocognitive function.
Aims
To characterise the cognitive profile of youth with suicidal ideation in a community sample and evaluate gender differences and pubertal status effects.
Method
Participants (N = 6151, age 11–21 years, 54.9% females) from the Philadelphia Neurodevelopmental Cohort, a non-help-seeking community sample, underwent detailed clinical evaluation. Cognitive phenotyping included executive functioning, episodic memory, complex reasoning and social cognitive functioning. We compared participants with suicidal ideation (N = 672) and without suicidal ideation (N = 5479). Regression models were employed to evaluate differences in cognitive performance and functional level, with gender and pubertal status as independent variables. Models controlled for lifetime depression or general psychopathology, and for covariates including age and socioeconomic status.
Results
Youth with suicidal ideation showed greater psychopathology, poorer level of function but better overall neurocognitive performance. Greater functional impairment was observed in females with suicidal ideation (suicidal ideation × gender interaction, t = 3.091, P = 0.002). Greater neurocognition was associated with suicidal ideation post-puberty (suicidal ideation × puberty interaction, t = 3.057, P = 0.002). Exploratory analyses of specific neurocognitive domains showed that suicidal ideation-associated cognitive superiority was more prominent in post-pubertal males compared with females (Cohen's d = 0.32 and d = 0.11, respectively) across all cognitive domains.
Conclusions
Suicidal ideation was associated with poorer functioning yet better cognitive performance, especially in post-pubertal males, as measured by a comprehensive cognitive battery. Findings point to gender and pubertal-status specificity in the relationship between suicidal ideation, cognition and function in youth.
Declaration of interest
R.B. serves on the scientific board and reports stock ownership in ‘Taliaz Health’, with no conflict of interest relevant to this work. M.A.O. receives royalties for the commercial use of the Columbia-Suicide Severity Rating Scale from the Research Foundation for Mental Hygiene. Her family owns stock in Bristol-Myers Squibb. All other authors declare no potential conflict of interest.
When dicamba drift occurs, it is not well understood which measurements from exposed soybean plants correlate with damage to soybean offspring; therefore, possible relationships are of great interest. Sixteen drift trials were established in 2014 and 2015 at the Northeast Research and Extension Center in Keiser, AR. A single 8-m-wide by 30- or 60-m-long pass with a high-clearance sprayer was made in each soybean field, resulting in a dicamba drift event. Seeds were collected from plants in each drift trial and planted in the field in 2015 and 2016. Data were subjected to correlation analysis to determine pairwise associations among parent and offspring observations. Auxin-like symptomology in offspring consistent with dicamba, primarily as leaf cupping, appeared in plots at the unifoliate and first trifoliate stages. Auxin-like symptoms were more prevalent in offspring collected from plants treated at later reproductive stages as opposed to early reproductive stages. The highest correlation coefficients occurred when parent plants were treated at the R5 growth stage. Parent mature pod malformation was correlated with offspring emergence (r = −0.37, P = 0.0082), vigor (r = −0.57, P ≤ 0.0001), injury (r = 0.93, P ≤ 0.0001), and percent of plants injured (r = 0.92, P ≤ 0.0001). This research documents that dicamba drift onto soybean during the R1 to R6 growth stages can negatively affect offspring and that occurrence of pod malformation after dicamba drift at the R5 growth stage may be indicative of injury to the offspring.