This study examined whether supplementation with collagen peptides (CP) affects appetite and post-exercise energy intake in healthy active females.
In this randomised, double-blind crossover study, 15 healthy females (23 ± 3 y) consumed 15 g/day of CP or a taste-matched non-energy control (CON) for 7 days. On day 7, participants cycled for 45 min at ∼55% Wmax before consuming the final supplement. Sixty min post-supplementation, an ad libitum meal was provided and energy intake recorded. Subjective appetite sensations were measured daily for 6 days (pre- and 30 min post-supplement), and from pre-exercise (0 min) to 280 min post-exercise on day 7. Blood glucose and hormone concentrations (total ghrelin, glucagon-like peptide-1 (GLP-1), peptide YY (PYY), cholecystokinin (CCK), dipeptidyl peptidase-4 (sDPP4), leptin and insulin) were measured fasted at baseline (day 0), then pre-breakfast (0 min), post-exercise (100 min), post-supplement (115, 130, 145, 160 min) and post-meal (220, 280 min) on day 7.
Ad libitum energy intake was ∼10% (∼41 kcal) lower in the CP trial (P=0.037). There were no differences in gastrointestinal symptoms or subjective appetite sensations throughout the trial (P≥0.412). Total plasma GLP-1 (area under the curve, CON: 6369±2330; CP: 9064±3021 pmol/L; P<0.001) and insulin (+80% at peak) were higher after CP (P<0.001). Plasma ghrelin and leptin were lower in CP (condition effect; P≤0.032). PYY, CCK, sDPP4 and glucose did not differ between CP and CON (P≥0.100).
CP supplementation following exercise increased GLP-1 and insulin concentrations and reduced ad libitum energy intake at a subsequent meal in physically active females.
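The total GLP-1 response above is reported as an area under the curve over the sampling times. A minimal sketch of how such an AUC can be computed with the trapezoidal rule, using the abstract's sampling schedule but invented concentrations (the GLP-1 values below are hypothetical, not the study's data):

```python
# Minimal sketch: total area under the curve (AUC) for a hormone time
# course via the trapezoidal rule. The sampling times mirror the
# abstract's schedule; the GLP-1 concentrations are hypothetical.

def trapezoid_auc(times, values):
    """Total AUC by the trapezoidal rule (units: concentration x time)."""
    if len(times) != len(values) or len(times) < 2:
        raise ValueError("need matched series of length >= 2")
    auc = 0.0
    for i in range(1, len(times)):
        auc += (times[i] - times[i - 1]) * (values[i] + values[i - 1]) / 2.0
    return auc

times = [0, 100, 115, 130, 145, 160, 220, 280]   # min
glp1 = [20, 22, 35, 40, 38, 33, 30, 25]          # pmol/L (hypothetical)
print(trapezoid_auc(times, glp1))                # prints 7747.5
```

Published AUCs may instead be baseline-corrected (incremental AUC), which subtracts the fasting value before integrating; the abstract does not specify which form was used.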
Anaemia is characterised by low haemoglobin (Hb) concentration. Despite being a public health concern in Ethiopia, the role of micronutrients and non-nutritional factors as determinants of Hb concentration has been inadequately explored. This study assessed serum micronutrient and Hb concentrations and a range of non-nutritional factors to evaluate their associations with the risk of anaemia among the Ethiopian population (n 2046). It also explored the mediating effect of Zn on the relation between Se and Hb. Bivariate and multivariate regression analyses were performed to identify the relationships of serum micronutrient concentrations, inflammation biomarkers, nutritional status, presence of parasitic infection and socio-demographic factors with Hb concentration (n 2046). The Sobel–Goodman test was applied to investigate the mediation by Zn of relations between serum Se and Hb. In total, 18·6 % of participants were anaemic, 5·8 % had iron deficiency (ID), 2·6 % had ID anaemia and 0·6 % had tissue ID. Younger age, household head illiteracy and low serum concentrations of ferritin, Co, Cu and folate were associated with anaemia. Serum Se had an indirect effect on Hb that was mediated by Zn, with a significant effect of Se on Zn (P < 0·001) and of Zn on Hb (P < 0·001). The findings of this study suggest the need to design multi-sectorial interventions to address anaemia based on demographic group.
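The Sobel–Goodman mediation test used above reduces, in its basic Sobel form, to a z-statistic on the product of the two path coefficients. A hedged sketch with hypothetical path estimates (a: Se to Zn, b: Zn to Hb); the coefficients and standard errors below are invented, not the study's:

```python
import math

# Hedged sketch of the basic Sobel test for a mediated (indirect) effect.
# a: Se -> Zn path estimate, b: Zn -> Hb path estimate, with their
# standard errors. All numbers below are invented, not the study's.

def sobel_z(a, se_a, b, se_b):
    """Sobel z-statistic for the indirect effect a*b."""
    return (a * b) / math.sqrt(b ** 2 * se_a ** 2 + a ** 2 * se_b ** 2)

def two_sided_p(z):
    """Two-sided normal p-value via the complementary error function."""
    return math.erfc(abs(z) / math.sqrt(2))

z = sobel_z(a=0.40, se_a=0.08, b=0.30, se_b=0.07)
print(round(z, 2), round(two_sided_p(z), 4))   # z about 3.25, p about 0.001
```

The Goodman variants adjust the denominator by adding or subtracting a se_a²·se_b² term; statistical packages implementing the Sobel–Goodman test typically report all three.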
Multiple micronutrient deficiencies are widespread in Ethiopia. However, the distribution of Se and Zn deficiency risks has previously shown evidence of spatially dependent variability, warranting exploration of this aspect for a wider range of micronutrients. Here, blood serum concentrations of Ca, Mg, Co, Cu and Mo were measured (n 3102) on samples from the Ethiopian National Micronutrient Survey. Geostatistical modelling was used to test spatial variation of these micronutrients for women of reproductive age, who represent the largest demographic group surveyed (n 1290). Median serum concentrations were 8·6 mg dl−1 for Ca, 1·9 mg dl−1 for Mg, 0·4 µg l−1 for Co, 98·8 µg dl−1 for Cu and 0·2 µg dl−1 for Mo. The prevalence of Ca, Mg and Co deficiency was 41·6 %, 29·2 % and 15·9 %, respectively; Cu and Mo deficiency prevalence was 7·6 % and 0·3 %, respectively. A higher prevalence of Ca, Cu and Mo deficiency was observed in north-western, Co deficiency in central, and Mg deficiency in north-eastern parts of Ethiopia. Serum Ca, Mg and Mo concentrations showed spatial dependencies up to 140–500 km; however, there was no evidence of spatial correlation for serum Co and Cu concentrations. These new data indicate the scale of multiple mineral micronutrient deficiencies in Ethiopia, and the geographical differences in the prevalence of deficiencies suggest the need for targeted responses when planning nutrition intervention programmes.
To develop an ethnic-specific FFQ to estimate nutrient intakes for South Asians (SA) in New Zealand (NZ) and to assess its test–retest reproducibility.
Design:
Using culturally appropriate methods, the NZFFQ, a validated dietary assessment tool for NZ adults, was modified to include SA food items by analysing foods consumed by SA participants of the Adult Nutrition Survey, in-person audit of ethnic food stores and a web scan of ethnic food store websites in NZ. This was further refined via three focus group discussions, and the resulting New Zealand South Asian Food Frequency Questionnaire (NZSAFFQ) was tested for reproducibility.
Setting:
Auckland and Dunedin, NZ.
Participants:
Twenty-nine and 110 males and females aged 25–59 years of SA ethnicity participated in the focus group discussions and the test–retest, respectively.
Results:
The development phase resulted in a SA-specific FFQ comprising 11 food groups and 180 food items. Test–retest of the NZSAFFQ showed good reproducibility between the two FFQ administrations, 6 months apart. Most reproducibility coefficients were within or higher than the acceptable range of 0·5–0·7. The lowest intraclass correlation coefficients (ICC) were observed for β-carotene (0·47), vitamin B12 (0·50), fructose (0·55), vitamin C (0·57) and selenium (0·58), and the highest ICC were observed for alcohol (0·81), iodine (0·79) and folate (0·77). The ICC for fat ranged from 0·70 for saturated fats to 0·77 for polyunsaturated fats. The ICC for protein and energy were 0·68 and 0·72, respectively.
Conclusions:
The developed FFQ showed good reproducibility for estimating nutrient intakes and warrants validation of the instrument.
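The intraclass correlation coefficients reported above quantify agreement between the two FFQ administrations. Below is a self-contained sketch of a two-way random, absolute-agreement, single-measures ICC (the Shrout-Fleiss ICC(2,1)), one common choice for test–retest data; the abstract does not specify which ICC form was used, and the intake values are invented:

```python
# Sketch of a two-way random, absolute-agreement, single-measures ICC
# (Shrout-Fleiss ICC(2,1)). Each row of `data` is one participant's
# nutrient intake at the two FFQ administrations; the numbers are invented.

def icc2_1(data):
    n = len(data)                      # participants
    k = len(data[0])                   # administrations (here 2)
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(row[j] for row in data) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )

data = [[55, 60], [80, 78], [40, 45], [95, 90], [62, 66]]  # e.g. g/day protein
print(round(icc2_1(data), 3))
```

Because ICC(2,1) measures absolute agreement, a systematic shift between administrations (e.g. every retest value higher by a constant) lowers the coefficient even when the ranking of participants is preserved.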
Compulsory admission procedures for patients with mental disorders vary between countries in Europe. The Ethics Committee of the European Psychiatric Association (EPA) launched a survey on involuntary admission procedures for patients with mental disorders in 40 countries, gathering information from all National Psychiatric Associations that are members of the EPA, in order to develop recommendations for improving involuntary admission processes and promoting voluntary care.
Methods.
The survey focused on legislation of involuntary admissions and key actors involved in the admission procedure as well as most common reasons for involuntary admissions.
Results.
We analyzed the survey's categorical data thematically; the themes highlight that both medical and legal actors are involved in involuntary admission procedures.
Conclusions.
We conclude that legal reasons for compulsory admission should be reworded in order to remove stigmatization of the patient, that raising awareness about involuntary admission procedures and patient rights with both patients and family advocacy groups is paramount, that communication about procedures should be widely available in lay language for the general population, and that training sessions and guidance should be available for legal and medical practitioners. Finally, people working in the field need to be constantly aware of the ethical challenges surrounding compulsory admissions.
To evaluate total usual intakes and biomarkers of micronutrients, overall dietary quality and related health characteristics of US older adults who were overweight or obese compared with a healthy weight.
Design:
Cross-sectional study.
Setting:
Two 24-h dietary recalls, nutritional biomarkers and objective and subjective health characteristic data were analysed from the National Health and Nutrition Examination Survey 2011–2014. We used the National Cancer Institute method to estimate distributions of total usual intakes from foods and dietary supplements for eleven micronutrients of potential concern and the Healthy Eating Index (HEI)-2015 score.
Participants:
Older adults aged ≥60 years (n 2969) were categorised by sex and body weight status, using standard BMI categories. Underweight individuals (n 47) were excluded due to small sample size.
Results:
A greater percentage of obese older adults compared with their healthy-weight counterparts was at risk of inadequate Mg (both sexes), Ca, vitamin B6 and vitamin D (women only) intakes. The proportion of those with serum 25-hydroxyvitamin D < 40 nmol/l was higher in obese (12 %) than in healthy-weight older women (6 %). Mean overall HEI-2015 scores were 8·6 (men) and 7·1 (women) points lower in obese than in healthy-weight older adults. In addition, compared with healthy-weight counterparts, obese older adults were more likely to self-report fair/poor health, use ≥ 5 medications and have limitations in activities of daily living and cardio-metabolic risk factors; and obese older women were more likely to be food-insecure and have depression.
Conclusions:
Our findings suggest that obesity may coexist with micronutrient inadequacy in older adults, especially among women.
Introduction: Lacerations are common in children presenting to the emergency department (ED). Children are often uncooperative when sutures are needed and may require procedural sedation. Few studies have evaluated intranasal (IN) ketamine for procedural sedation in children, with doses from 3 to 9 mg/kg used mostly for dental procedures. In a previous dose escalation trial, DosINK-1, 6 mg/kg was found to be the optimal IN ketamine dose for procedural sedation for sutures in children. In this trial, we aimed to further evaluate the efficacy of this dose. Methods: We conducted a multicentre single-arm clinical trial. A convenience sample of 30 uncooperative children aged 1 to 12 years (10 to 30 kg), with no cardiac or kidney disease, active respiratory infection, or prior administration of opioid or sedative agents, received 6 mg/kg of IN ketamine using an atomizer for their laceration repair with sutures in the ED. The primary outcome was defined as the proportion (95% CI) of patients who achieved adequate procedural sedation, evaluated with the PERC/PECARN consensus criteria. Results: Thirty patients were recruited from April 2018 to November 2019 in 2 pediatric EDs. The median age was 3.2 (interquartile range (IQR), 1.9 to 4.7) years, with a laceration of more than 2 cm in 20 (67%) patients and facial involvement in 21 (70%) cases. Sedation was effective in 18 of 30 children (60%; 95% CI, 45 to 80), suboptimal in 6 patients (20%), in whom the procedure was completed with minimal difficulty, and unsuccessful in the remaining 6 (20%), all without serious adverse events. Similarly, 21/30 (70%) physicians were willing to reuse IN ketamine at the same dose, and 25 parents (83%) would agree to the same sedation in the future. Median time to return to baseline status was 58 min (IQR, 33 to 73). One patient desaturated during the procedure and required transitory oxygen and repositioning.
After the procedure, 1 (3%) patient had headache, 1 (3%) had nausea, and 2 (7%) vomited. Conclusion: A single dose of 6 mg/kg of IN ketamine for laceration repair with sutures in uncooperative children was safe and facilitated the procedure in 60% (95% CI, 45 to 80) of patients; sedation was suboptimal in 20% and unsuccessful in 20% of patients. As with IV ketamine, an additional dose of IN ketamine, available for some children if needed, could increase the proportion of successful sedations. However, the safety and efficacy of repeated doses need to be addressed.
Introduction: Venipuncture is a frequent cause of pain and distress in the pediatric emergency department (ED). Distraction, which can improve patient experience, remains the most studied psychological intervention. Virtual reality (VR) is a method of immersive distraction that can contribute to the multi-modal management of procedural pain and distress. Methods: The main objectives of this study were to determine the feasibility and acceptability of VR distraction for pain management associated with venipunctures and to examine its preliminary effects on pain and distress in the pediatric ED. Children aged 7-17 years requiring a venipuncture in the pediatric ED were recruited. Participants were randomized to either a control group (standard care) or an intervention group (standard care + VR). The principal clinical outcome was the mean level of procedural pain, measured by the verbal numerical rating scale (VNRS). Distress was also measured using the Child Fear Scale (CFS) and the Procedure Behavior Check List (PBCL), and memory of pain using the VNRS. Side effects were documented. Results: A total of 63 patients were recruited. Results showed feasibility and acceptability of VR in the pediatric ED (79% recruitment rate of eligible families, 90% rate of VR game completion, and overall high mean satisfaction levels). There was a significantly higher level of satisfaction among healthcare providers in the intervention group, and 93% of those were willing to use this technology again for the same procedure. Regarding clinical outcomes, no significant difference was observed between groups on procedural pain. Distress evaluated by proxy (10/40 vs 13.2/40, p = 0.007) and memory of pain at 24 hours (2.4 vs 4.2, p = 0.027) were significantly lower in the VR group. Venipuncture was successful on first attempt in 23/31 patients (74%) in the VR group and 15/30 (50%) patients in the control group (p = 0.039).
Five of the 31 patients (16%) in the VR group reported side effects. Conclusion: The addition of VR to standard care is feasible and acceptable for pain and distress management during venipunctures in the pediatric ED. There was no difference in self-reported procedural pain between groups. Levels of procedural distress and memory of pain at 24 hours were lower in the VR group.
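The difference in first-attempt venipuncture success (23/31 vs 15/30) can be illustrated with a two-proportion z-test. This is a normal-approximation sketch only; the study's actual test is not stated, so its reported p-value need not match this approximation:

```python
import math

# Normal-approximation sketch of a two-proportion z-test, illustrating a
# comparison like first-attempt success 23/31 (VR) vs 15/30 (control).
# The study's actual test may differ, so the p-value here need not match.

def two_prop_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test; returns (z, two-sided p)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, math.erfc(abs(z) / math.sqrt(2))

z, p = two_prop_z(23, 31, 15, 30)
print(round(z, 2), round(p, 3))
```

For small samples, an exact test (e.g. Fisher's) or a continuity-corrected chi-square is often preferred over this plain normal approximation.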
Selenium (Se) is an essential element for human health. However, our knowledge of the prevalence of Se deficiency is more limited than for other micronutrients of public health concern such as iodine, iron and zinc, especially in sub-Saharan Africa (SSA). Studies of food systems in SSA, in particular in Malawi, have revealed that human Se deficiency risks are widespread and influenced strongly by geography. Direct evidence of Se deficiency risks includes nationally representative data of Se concentrations in blood plasma and urine as population biomarkers of Se status. Long-range geospatial variation in Se deficiency risks has been linked to soil characteristics and their effects on the Se concentration of food crops. Selenium deficiency risks are also linked to socio-economic status, including access to animal source foods. This review highlights the need for geospatially resolved data on the movement of Se and other micronutrients in food systems spanning the agriculture, nutrition and health disciplinary domains (defined as a GeoNutrition approach). Given that similar drivers of deficiency risks for Se and other micronutrients are likely to occur in other countries in SSA and elsewhere, micronutrient surveillance programmes should be designed accordingly.
In the present study, we aimed to compare anthropometric indicators as predictors of mortality in a community-based setting.
Design:
We conducted a population-based longitudinal study nested in a cluster-randomized trial. We assessed weight, height and mid-upper arm circumference (MUAC) on children 12 months after the trial began and used the trial’s annual census and monitoring visits to assess mortality over 2 years.
Setting:
Niger.
Participants:
Children aged 6–60 months during the study.
Results:
Of 1023 children included in the study at baseline, height-for-age Z-score, weight-for-age Z-score, weight-for-height Z-score and MUAC classified 777 (76·0 %), 630 (61·6 %), 131 (12·9 %) and 80 (7·8 %) children as moderately to severely malnourished, respectively. Over the 2-year study period, 58 children (5·7 %) died. MUAC had the greatest AUC (0·68, 95 % CI 0·61, 0·75) and the strongest association with mortality in this sample (hazard ratio = 2·21, 95 % CI 1·26, 3·89, P = 0·006).
Conclusions:
MUAC appears to be a better predictor of mortality than other anthropometric indicators in this community-based, high-malnutrition setting in Niger.
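The AUC reported for MUAC above is the area under the ROC curve for predicting death. A minimal sketch of the empirical AUC via pairwise comparisons (the Mann-Whitney formulation); the MUAC values below are invented, and since lower MUAC should predict death, scores are negated before scoring:

```python
# Minimal sketch of an empirical ROC AUC via pairwise comparisons
# (the Mann-Whitney formulation). MUAC values (cm) below are invented;
# lower MUAC should predict death, so scores are negated.

def roc_auc(scores_pos, scores_neg):
    """Probability a random positive outscores a random negative (ties = 0.5)."""
    wins = 0.0
    for p in scores_pos:
        for q in scores_neg:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

muac_died = [11.0, 11.5, 12.0]
muac_survived = [12.5, 13.0, 11.8]
auc = roc_auc([-m for m in muac_died], [-m for m in muac_survived])
print(round(auc, 3))   # prints 0.889
```

An AUC of 0.5 corresponds to a non-informative predictor and 1.0 to perfect discrimination, which is why the study's 0.68 for MUAC indicates modest but real predictive value.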
Hurricane Maria caused catastrophic damage in Puerto Rico, increasing the risk for morbidity and mortality in the post-impact period. We aimed to establish a syndromic surveillance system to describe the number and type of visits at 2 emergency health-care settings in the same hospital system in Ponce, Puerto Rico.
Methods:
We implemented a hurricane surveillance system by interviewing patients with a short questionnaire about the reason for visit at a hospital emergency department and associated urgent care clinic in the 6 mo after Hurricane Maria. We then evaluated the system by comparing findings with data from the electronic medical record (EMR) system for the same time period.
Results:
The hurricane surveillance system captured information from 5116 participants across the 2 sites, representing 17% of all visits captured in the EMR for the same period. Most visits were associated with acute illness/symptoms (79%), followed by injury (11%). The hurricane surveillance and EMR data were similar, proportionally, by sex, age, and visit category.
Conclusions:
The hurricane surveillance system provided timely and representative data about the number and type of visits at 2 sites. This system, or an adapted version using available electronic data, should be considered in future disaster settings.
The aim of this study was to describe individuals seeking care for injury at a major emergency department (ED) in southern Puerto Rico in the months after Hurricane Maria on September 20, 2017.
Methods:
After informed consent, we used a modified version of the Natural Disaster Morbidity Surveillance Form to determine why patients were visiting the ED during October 16, 2017–March 28, 2018. We analyzed visits where injury was reported as the primary reason for visit and whether it was hurricane-related.
Results:
Among 5116 patients, 573 (11%) reported injury as the primary reason for a visit. Of these, 10% were hurricane-related visits. The most common types of injuries were abrasions, lacerations, and cuts (43% of all injury visits and 50% of hurricane-related visits). The most common mechanisms of injury were falls, slips, and trips (268, 47%), and being hit by or against an object (88, 15%). Most injury visits occurred during the first 3 months after the hurricane.
Conclusions:
Surveillance after Hurricane Maria identified injury as the reason for a visit for about 1 in 10 patients visiting the ED, providing evidence on the patterns of injuries in the months following a hurricane. Public health and emergency providers can use this information to anticipate health care needs after a disaster.
Human alteration of the planet’s terrestrial landscapes for agriculture, habitation and commerce is reshaping wildlife communities. The threat of land cover change to wildlife is pronounced in regions with rapidly growing human populations. We investigated how species richness and species-specific occurrence of bats changed as a function of land cover and canopy (tree) cover across a rapidly changing region of Florida, USA. Contrary to our predictions, we found negligible effects of agriculture and urban development on the occurrence of all species. In contrast, we found that a remotely sensed metric of canopy cover on a broad scale (25 km2) was a good predictor of the occurrence of eight out of ten species. The occurrence of all smaller bats (vespertilionids) in our study increased with 0–50% increases in canopy cover, while larger bats showed different patterns. Occurrence of Brazilian free-tailed bats (Tadarida brasiliensis) decreased with increasing canopy cover, and Florida bonneted bats (Eumops floridanus) were not influenced by canopy cover. We conclude that remotely sensed measures of canopy cover can provide a more reliable predictor of bat species richness than land-cover types, and efforts to prevent the loss of bat diversity should consider maintaining canopy cover across mosaic landscapes with diverse land-cover types.
The Rockeskyll complex in the north-central part of the Quaternary West Eifel volcanic field encapsulates an association of carbonatite, nephelinite and phonolite. The volcanic complex is dominated by three eruptive centres, which are distinct in their magma chemistry and their mode of emplacement. The Auf Dickel diatreme forms one centre and has erupted the only known carbonatite in the West Eifel, along with a broad range of alkaline rock types. Extrusive carbonatitic volcanism is represented by spheroidal autoliths, which preserve an equilibrium assemblage. The diatreme has also erupted xenoliths of calcite-bearing feldspathoidal syenite, phonolite, and sanidine and clinopyroxene megacrysts, which are interpreted as fragments of a sub-volcanic complex. The carbonate phase of volcanism has several manifestations: extrusive lapilli, recrystallized ashes and calcite-bearing syenites fragmented during diatreme emplacement.
A petrogenetic link between carbonatites and alkali mafic magmas is confirmed from Sr and Nd isotope systematics, and an upper mantle origin for the felsic rocks is suggested. The chemistry and mineralogy of mantle xenoliths erupted throughout the West Eifel indicate enrichment in those elements incompatible in the mantle. In addition, the evidence from trace element signatures and melts trapped as glasses support interaction between depleted mantle and small volume carbonate and felsic melts. This close association between carbonate and felsic melts in the mantle is mirrored in the surface eruptives of Auf Dickel and at numerous alkaline-carbonatite provinces worldwide.
Teaching undergraduate students, mentoring graduate students, and generating publishable research are distinct tasks for many political scientists. This article highlights lessons for merging these activities through experiences from an initiative that sparked a series of collaborative-research projects focused on opinions about crime and punishment in the United States. This article describes three collaborative projects conducted between 2015 and 2017 to demonstrate how to merge undergraduate teaching, graduate training, and producing research. By participating in these projects, students learned about social-scientific research through hands-on experiences designing experiments, collecting and analyzing original data, and reporting empirical findings to a public audience. This approach is an effective way to engage students and generate research that can advance professional goals.
Introduction: Lacerations are common in children presenting to the emergency department (ED). Children are often uncooperative because of pain and distress during repair. Currently, there are wide variations in sedation and analgesia practices when sutures are required. There is growing interest in the intranasal (IN) route for procedural sedation and pain control because of its potential effectiveness and ease of administration. Few studies have evaluated IN ketamine for procedural sedation in children, with reported doses ranging from 3 to 9 mg/kg. The objective is to evaluate the optimal IN ketamine dose for effective and safe procedural sedation for laceration repair in children aged 1 to 12 years. Methods: A dose escalation clinical trial with an initial dose of 3 mg/kg of IN ketamine up to a maximum dose of 9 mg/kg in children 1 to 12 years old, using a 3+3 trial design. For each tested dose, 3 patients are enrolled. Escalation to the next dose is permitted if sedation is unsuccessful in at least one patient without a serious adverse event (SAE). Regression to the prior dose is warranted in the event of two or more SAEs. This process is repeated until effective sedation for 6 patients at two consecutive doses is achieved with a maximum of 1 SAE, or until regression occurs. The primary outcome is the optimal dose for successful procedural sedation as per the PERC/PECARN consensus criteria. Secondary outcomes, namely pain and anxiety levels, parent, patient and provider satisfaction, recovery time, length of stay in the ED, side effects and adverse events, are recorded. Results: Nine patients were recruited from March to December 2017, with a median age of 2.9 years, laceration lengths of 2 to 5 cm, and facial involvement in 55% of cases. Sedation was successful in 1/3, 1/3 and 3/3 of patients at doses of 3, 4 and 5 mg/kg, respectively, without any SAE.
Median times from ketamine administration to return to baseline status and to discharge were 35 and 98 min, respectively. We expect to complete patient recruitment in March 2018. Conclusion: The results from our trial are groundwork for a future dose-finding study. Pending study completion, a multicentre dose validation trial is being set up to further validate the optimal dose from the DosINK-1 trial. IN ketamine has the potential to improve procedural sedation for children by providing an effective IN agent with respiratory stability, without the need for an IV line that is not otherwise required.
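The cohort-level decision rule described in the Methods can be sketched as a small function. This is a simplified illustration that paraphrases the stated escalation rules, not the trial protocol itself:

```python
# Simplified illustration of the cohort-level decision rule described in
# the Methods; this paraphrases the stated rules and is not the protocol.

def cohort_decision(n_sedation_success, n_sae, cohort_size=3):
    """Decide the next dose move after one cohort of 3 patients."""
    if n_sae >= 2:
        return "de-escalate"               # two or more SAEs: regress dose
    if n_sedation_success < cohort_size:
        return "escalate"                  # sedation failed for >= 1 patient, no safety signal
    return "stay"                          # all sedations successful at this dose

print(cohort_decision(1, 0))   # prints escalate
```

Applied to the reported cohorts (1/3 successes at 3 mg/kg and at 4 mg/kg, no SAE), this rule escalates twice, consistent with the trial reaching 5 mg/kg, where all 3 sedations succeeded.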
Field studies were conducted to compare the effectiveness of PRE and POST applications of a prepackaged mixture of flufenacet plus metribuzin with that of diclofop for winter wheat tolerance and control of Italian ryegrass. Additional studies investigated the effectiveness of reduced rates of flufenacet plus metribuzin applied POST to Italian ryegrass when wheat was in the spike stage. All PRE and POST applications of flufenacet plus metribuzin produced similar or greater injury to wheat and more consistent control of Italian ryegrass than PRE or POST applications of diclofop. PRE applications of flufenacet plus metribuzin controlled Italian ryegrass 73 to 77%, whereas POST applications controlled Italian ryegrass 77 to 99%. PRE applications of diclofop controlled Italian ryegrass 57%; POST application controlled Italian ryegrass 78%. Wheat injury from flufenacet plus metribuzin applications varied with application rate, cultivar, and year of application.