Background: Inuit children have been observed to have high rates of macrocephaly, which leads to burdensome travel for medical evaluation, often with no pathology identified. Given reports that WHO growth charts may not reflect all populations, we compared head circumference (HC) measurements in a cohort of Inuit children with the WHO charts. Methods: We extracted HC data from a retrospective cohort study in which, with Inuit partnership, we reviewed medical records of Inuit children born between 2010 and 2013 and residing in Nunavut. We excluded children with preterm birth, documented neurologic/genetic disease, and most congenital anomalies. We compared HC values with the 2007 WHO charts. Results: We analyzed records of 1960 Inuit children (8866 data points). Most data were from ages 0-36 months. At all age points, the cohort had statistically significantly larger HC than the WHO medians. At age 12 months, median HC was 1.3 cm and 1.5 cm larger for male and female Inuit children, respectively. Using the WHO growth curves, macrocephaly was overdiagnosed and microcephaly underdiagnosed. Conclusions: Our results support the observation that Inuit children from Nunavut have larger HCs, and use of the WHO charts may lead to overdiagnosis of macrocephaly and underdiagnosis of microcephaly. Population-specific growth curves for Inuit children should be considered.
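As a rough illustration of the chart-comparison logic in this abstract, the sketch below classifies head circumference against a reference median and SD using the conventional ±2 SD cutoffs, and shows how a population whose true median sits above the reference inflates macrocephaly counts. The reference values, the 1.3 cm shift, and the cohort are all synthetic placeholders, not the 2007 WHO LMS tables or the study's data.

```python
# Minimal sketch: classifying head circumference (HC) against a growth reference.
# The reference median/SD below are illustrative placeholders, NOT the real 2007
# WHO LMS tables, and the sample cohort is synthetic.
import numpy as np

# Illustrative reference for 12-month-old boys (hypothetical numbers).
WHO_MEDIAN_CM = 46.1
WHO_SD_CM = 1.3

def classify_hc(hc_cm, median=WHO_MEDIAN_CM, sd=WHO_SD_CM):
    """Return (z-score, label) using a +/-2 SD cutoff, as in common chart practice."""
    z = (hc_cm - median) / sd
    if z > 2:
        return z, "macrocephaly"
    if z < -2:
        return z, "microcephaly"
    return z, "normal"

# Synthetic cohort whose true median sits ~1.3 cm above the reference median,
# mirroring the shift reported in the abstract.
rng = np.random.default_rng(0)
cohort = rng.normal(loc=WHO_MEDIAN_CM + 1.3, scale=WHO_SD_CM, size=2000)

labels = [classify_hc(hc)[1] for hc in cohort]
print("flagged macrocephalic:", labels.count("macrocephaly"))  # inflated by the shift
print("flagged microcephalic:", labels.count("microcephaly"))  # correspondingly rare
```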
Background: Efgartigimod, a human immunoglobulin (Ig)G1 antibody Fc fragment, blocks the neonatal Fc receptor, reducing IgGs involved in chronic inflammatory demyelinating polyneuropathy (CIDP). The multi-stage, double-blinded, placebo-controlled ADHERE (NCT04281472) and open-label extension ADHERE+ (NCT04280718) trials (interim analysis cutoff: February 16, 2024) assessed efgartigimod PH20 SC in participants with CIDP. Methods: Participants with active CIDP received open-label, weekly efgartigimod PH20 SC 1000 mg during a ≤12-week run-in (stage-A). Responders were randomized (1:1) to efgartigimod or placebo for ≤48 weeks (stage-B). Participants with clinical deterioration in stage-B or who completed ADHERE entered ADHERE+. Week 36 changes from run-in baseline (CFB) in adjusted Inflammatory Neuropathy Cause and Treatment (aINCAT), Inflammatory Rasch-built Overall Disability Scale (I-RODS), and grip strength scores were evaluated. Results: Of 322 stage-A participants, 221 were randomized and treated in stage-B, and 99% entered ADHERE+. Mean CFB (SE) in aINCAT, I-RODS, and grip strength scores were -1.2 (0.15), 8.8 (1.46), and 17.5 (2.02), respectively, at ADHERE+ Week 36 (N=150). Half of the participants with clinical deterioration during ADHERE stage-B restabilized on efgartigimod from ADHERE+ Week 4. Conclusions: Interim results from ADHERE+ indicate long-term effectiveness of efgartigimod PH20 SC in clinical outcomes in participants with CIDP.
Background: Efgartigimod, a human immunoglobulin (Ig)G1 antibody Fc fragment, blocks the neonatal Fc receptor, reducing IgGs involved in chronic inflammatory demyelinating polyneuropathy (CIDP), a rare, progressive, immune-mediated disease that can lead to irreversible disability. The multi-stage, double-blinded, placebo-controlled ADHERE (NCT04281472) trial assessed efgartigimod PH20 SC in participants with CIDP. Methods: Participants with active CIDP received open-label, weekly efgartigimod PH20 SC 1000 mg during a ≤12-week run-in (stage-A). Responders were randomized (1:1) to weekly efgartigimod or placebo for ≤48 weeks (stage-B). This post hoc analysis evaluated changes from run-in baseline (study enrollment) to the last stage-B assessment in Inflammatory Rasch-built Overall Disability Scale (I-RODS) scores and individual I-RODS items. Results: Of 322 participants who entered stage-A, 221 were randomized and treated in stage-B, and 191/221 had data for run-in baseline and post–stage-B timepoints. Mean (SE) I-RODS change at the stage-B last assessment vs run-in baseline was 5.7 (1.88) and -4.9 (1.82) in participants randomized to efgartigimod and placebo, respectively. Improvements of ≥4 points in I-RODS score occurred in 37/97 (38.1%) participants randomized to efgartigimod and 24/92 (26.1%) randomized to placebo. Efgartigimod-treated participants improved ≥1 point in I-RODS items of clinical interest. Conclusions: Participants who received efgartigimod in stage-B experienced improvements in I-RODS score from study enrollment to the stage-B last assessment.
Objectives/Goals: Substantial evidence supports the use of community engagement in CTS. Yet there is a lack of empirical basis for recommending a particular level of community engagement over others. We aimed to identify associations between the level of community involvement and study process outcomes, focusing on procedures to promote enrollment and inclusion. Methods/Study Population: Using manifest content analysis, we analyzed community engagement (CEn) strategies of studies indexed in ClinicalTrials.gov, focusing on studies 1) associated with 20 medical schools located in 8 southern states in the Black Belt, 2) conducted in 2015–2019, and 3) addressing 7 topics: cancer, depression, anxiety, hypertension, substance use disorder, cardiovascular disease, and HIV/AIDS. The data sources were the ClinicalTrials.gov entry and the publication for each study. We categorized each study's level of community involvement, as described in the study protocol, according to the CTSA Consortium Community Engagement Key Function Committee Task Force's Principles of Community Engagement continuum. Outcomes included recruitment and representativeness. Other codes included funder type, study phase, study status, and time to enrollment. Results/Anticipated Results: Of 890 studies that met inclusion criteria, only 493 had published findings. Of these, 286 studies (58%) met enrollment targets. Only 9 studies described any level of CEn (1 outreach, 3 consult, 1 involvement, 3 collaboration, and 1 shared leadership). Time to enrollment for these 9 studies (mean 28.78 months) was shorter than for studies without CEn (mean 37.43 months), although the difference was not statistically significant. CEn studies reached significantly higher enrollment (CEn mean = 2395.11, non-CEn mean = 463.93). Discussion/Significance of Impact: Results demonstrate the substantial effect of CEn on enrollment and inclusion in clinical studies. However, the very small number of studies that reported CEn did not allow comparisons of the level of engagement on these outcomes. Findings also highlight ethical questions surrounding the non-publication of incomplete studies.
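For readers unfamiliar with the comparison being described, the sketch below contrasts mean enrollment between a small CEn group and a much larger non-CEn group. The abstract does not state which statistical test was used; Welch's t-test on synthetic counts is shown only to illustrate the shape of the analysis.

```python
# Minimal sketch of the group comparison described above (CEn vs non-CEn enrollment).
# The test choice and all counts are illustrative assumptions, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
cen_enrollment = rng.poisson(lam=2400, size=9)        # 9 CEn studies (synthetic)
non_cen_enrollment = rng.poisson(lam=460, size=480)   # non-CEn studies (synthetic)

t_stat, p_value = stats.ttest_ind(cen_enrollment, non_cen_enrollment, equal_var=False)
print(f"CEn mean = {cen_enrollment.mean():.1f}, "
      f"non-CEn mean = {non_cen_enrollment.mean():.1f}, p = {p_value:.2g}")
```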
In response to the COVID-19 pandemic, we rapidly implemented a plasma coordination center, within two months, to support transfusion for two outpatient randomized controlled trials. The center design was based on an investigational drug services model and a Food and Drug Administration-compliant database to manage blood product inventory and trial safety.
Methods:
A core investigational team adapted a cloud-based platform to randomize patient assignments and track inventory distribution of control plasma and high-titer COVID-19 convalescent plasma of different blood groups from 29 donor collection centers directly to blood banks serving 26 transfusion sites.
Results:
We performed 1,351 transfusions in 16 months. The transparency of the digital inventory at each site was critical to facilitate qualification, randomization, and overnight shipments of blood group-compatible plasma for transfusions into trial participants. While inventory challenges were heightened with COVID-19 convalescent plasma, the cloud-based system and the flexible approach of the plasma coordination center staff across the blood bank network enabled decentralized procurement and distribution of investigational products to maintain inventory thresholds and overcome local supply chain constraints at the sites.
Conclusion:
The rapid creation of a plasma coordination center for outpatient transfusions is infrequent in the academic setting. Distributing more than 3,100 plasma units to blood banks charged with managing investigational inventory across the U.S. in a decentralized manner posed operational and regulatory challenges while providing opportunities for the plasma coordination center to contribute to research of global importance. This program can serve as a template in subsequent public health emergencies.
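The coordination model described in the Methods combines randomized arm assignment with ABO-compatible plasma allocation from a shared digital inventory. The toy sketch below illustrates those two ideas only; the function names, the inventory structure, and the seeded randomization are assumptions for illustration and bear no relation to the trial's cloud platform, database schema, or FDA-compliant workflow.

```python
# Toy sketch of two ideas in the Methods: assigning a participant to an arm and
# selecting an ABO-compatible plasma unit from a site's inventory (illustrative only).
import random

# Donor ABO group -> recipient groups that can safely receive that plasma.
PLASMA_COMPATIBLE = {
    "AB": {"AB", "A", "B", "O"},  # AB plasma is the universal plasma donor
    "A": {"A", "O"},
    "B": {"B", "O"},
    "O": {"O"},
}

def randomize(participant_id):
    """1:1 assignment to convalescent plasma or control (demo only)."""
    random.seed(participant_id)            # deterministic per participant for the demo
    return random.choice(["convalescent", "control"])

def pick_unit(recipient_group, inventory):
    """Return the ABO group of an in-stock unit compatible with the recipient."""
    for donor_group, count in inventory.items():
        if count > 0 and recipient_group in PLASMA_COMPATIBLE[donor_group]:
            inventory[donor_group] -= 1    # decrement the shared inventory view
            return donor_group
    return None                            # would trigger a resupply request in practice

site_inventory = {"AB": 2, "A": 5, "B": 0, "O": 3}
arm = randomize("participant-0042")
unit = pick_unit("B", site_inventory)
print(arm, unit, site_inventory)
```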
Sepiolite and attapulgite have been found to be common, sometimes the major, clay minerals in calcareous lacustrine deposits on the southern High Plains in West Texas and eastern New Mexico. Deflation debris derived from the basins and calcareous soils developed in the debris and in the lacustrine deposits also often contain either or both minerals. Dolomite is the carbonate commonly associated with sepiolite and calcite has a similar relationship to attapulgite in the lacustrine deposits. Pedogenic formation of sepiolite and attapulgite appears unlikely in the area studied since an association with lacustrine materials was made in a very high percentage of the occurrences.
Sepiolite was found to be highly concentrated in the < 0•2μ fraction. A similar, but less pronounced, distribution was noted for attapulgite. The studies suggest that the minerals have developed authigenically in alkaline lacustrine environments during periods of desiccation. Such an environment, interrupted by more humid periods, would have obtained during dry Pleistocene intervals. Volcanic ash is suggested as the source of the essential silica. The Mg concentration would appear to determine whether sepiolite-dolomite or attapulgite-calcite were formed.
Background: Efgartigimod, a human immunoglobulin G (IgG)1 antibody Fc fragment, blocks the neonatal Fc receptor, decreasing IgG recycling and reducing pathogenic IgG autoantibody levels. ADHERE assessed the efficacy and safety of efgartigimod PH20 subcutaneous (SC; co-formulated with recombinant human hyaluronidase PH20) in chronic inflammatory demyelinating polyneuropathy (CIDP). Methods: ADHERE enrolled participants with CIDP (treatment naive or on standard treatments withdrawn during run-in period) and consisted of open-label Stage A (efgartigimod PH20 SC once weekly [QW]), and randomized (1:1) Stage B (efgartigimod or placebo QW). Primary outcomes were clinical improvement (assessed with aINCAT, I-RODS, or mean grip strength; Stage A) and time to first aINCAT score deterioration (relapse; Stage B). Secondary outcomes included treatment-emergent adverse events (TEAEs) incidence. Results: 322 participants entered Stage A. 214 (66.5%) were considered responders, randomized, and treated in Stage B. Efgartigimod significantly reduced the risk of relapse (HR: 0.394; 95% CI: 0.25–0.61) versus placebo (p=0.000039). Reduced risk of relapse occurred in participants receiving corticosteroids, intravenous or SC immunoglobulin, or no treatment before study entry. Most TEAEs were mild to moderate; 3 deaths occurred, none related to efgartigimod. Conclusions: Participants treated with efgartigimod PH20 SC maintained a clinical response and remained relapse-free longer than those treated with placebo.
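The Stage B primary endpoint above is a time-to-event comparison, the kind typically analysed with Kaplan-Meier curves and a Cox proportional hazards model. The sketch below reproduces that general pattern on synthetic data using the lifelines package; it is not the ADHERE statistical analysis plan, and the arm sizes, censoring rule, and event times are invented.

```python
# Minimal sketch of a time-to-relapse comparison between two arms with a Cox model.
# Data are synthetic; this is not the trial's analysis.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 214
arm = rng.integers(0, 2, size=n)                      # 1 = efgartigimod, 0 = placebo
# Longer expected time to relapse on active treatment (illustrative HR ~0.4).
time_to_relapse = rng.exponential(scale=np.where(arm == 1, 40, 16))
observed = (time_to_relapse <= 48).astype(int)        # censor at 48 weeks
weeks = np.minimum(time_to_relapse, 48)

df = pd.DataFrame({"efgartigimod": arm, "weeks": weeks, "relapse": observed})
cph = CoxPHFitter()
cph.fit(df, duration_col="weeks", event_col="relapse")
print(cph.hazard_ratios_)   # HR < 1 indicates lower relapse risk on efgartigimod
```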
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Despite becoming increasingly represented in academic departments, women scholars face a critical lack of support as they navigate demands pertaining to pregnancy, motherhood, and child caregiving. In addition, cultural norms surrounding how faculty and academic leaders discuss tenure, promotion, and career success have created pressure for women who wish to grow their families and care for their children, leading to questions about whether it is possible for these women to have a family and an academic career. This paper is a call to action for academia to build structures that support professors who are women as they navigate the complexities of pregnancy, the postpartum period, and the caregiving demands of their children. We specifically call on those of us in I-O psychology, management, and related departments to lead the way. In making this call, we first present the realistic, moral, and financial cases for why this issue needs to be at the forefront of discussions surrounding success in the academy. We then discuss how, in the U.S. and elsewhere, an absence of policies supporting women places two groups of academics—department heads (as the leaders of departments who have discretion outside of formal policies to make work better for women) and other faculty members (as potential allies both in the department and within our professional organizations)—in a critical position to enact support and change. We conclude with our boldest call—to make a cultural shift that shatters the assumption that having a family is not compatible with academic success. Combined, we seek to launch a discussion that leads directly to necessary and overdue changes in how women scholars are supported in academia.
To describe the genomic analysis and epidemiologic response related to a slow and prolonged methicillin-resistant Staphylococcus aureus (MRSA) outbreak.
Design:
Prospective observational study.
Setting:
Neonatal intensive care unit (NICU).
Methods:
We conducted an epidemiologic investigation of a NICU MRSA outbreak involving serial baby and staff screening to identify opportunities for decolonization. Whole-genome sequencing was performed on MRSA isolates.
Results:
A NICU with excellent hand hygiene compliance and longstanding minimal healthcare-associated infections experienced an MRSA outbreak involving 15 babies and 6 healthcare personnel (HCP). In total, 12 cases occurred slowly over a 1-year period (mean, 30.7 days apart) followed by 3 additional cases 7 months later. Multiple progressive infection prevention interventions were implemented, including contact precautions and cohorting of MRSA-positive babies, hand hygiene observers, enhanced environmental cleaning, screening of babies and staff, and decolonization of carriers. Only decolonization of HCP found to be persistent carriers of MRSA was successful in stopping transmission and ending the outbreak. Genomic analyses identified bidirectional transmission between babies and HCP during the outbreak.
Conclusions:
In comparison to fast outbreaks, outbreaks that are “slow and sustained” may be more common in units with strong existing infection prevention practices, such that a series of breaches must align to result in a case. We identified a slow outbreak that persisted among staff and babies and was only stopped by identifying and decolonizing persistent MRSA carriage among staff. A repeated decolonization regimen was successful in allowing previously persistent carriers to safely continue work duties.
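Genomic linkage of outbreak isolates, as mentioned in the Results, is often based on pairwise SNP distances between sequenced isolates, with closely related pairs treated as putative transmission links. The sketch below shows that generic calculation on toy sequences; the sequences, names, and threshold are invented and do not reflect this study's pipeline.

```python
# Minimal sketch of one way outbreak isolates are linked from whole-genome data:
# pairwise SNP distances over an alignment, flagging closely related pairs.
from itertools import combinations

aligned = {   # toy core-genome alignments (equal length, synthetic)
    "baby_01": "ATGGCTAACGTT",
    "baby_02": "ATGGCTAACGTA",
    "hcp_A":   "ATGGCTAACGTA",
    "hcp_B":   "TTGGCAATCGTT",
}

def snp_distance(a, b):
    """Count positions where two equal-length aligned sequences differ."""
    return sum(x != y for x, y in zip(a, b))

THRESHOLD = 2   # real MRSA analyses use thresholds on the order of ~15 SNPs
for (name1, seq1), (name2, seq2) in combinations(aligned.items(), 2):
    d = snp_distance(seq1, seq2)
    if d <= THRESHOLD:
        print(f"putative transmission link: {name1} <-> {name2} ({d} SNPs)")
```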
Magnetite is a common mineral in the Paleoproterozoic Stollberg Zn–Pb–Ag plus magnetite ore field (~6.6 Mt of production), which occurs in 1.9 Ga metamorphosed felsic and mafic rocks. Mineralisation at Stollberg consists of magnetite bodies and massive to semi-massive sphalerite–galena and pyrrhotite (with subordinate pyrite, chalcopyrite, arsenopyrite and magnetite) hosted by metavolcanic rocks and skarn. Magnetite occurs in sulfides, skarn, amphibolite and altered metamorphosed rhyolitic ash–siltstone that consists of garnet–biotite, quartz–garnet–pyroxene, gedrite–albite, and sericitic rocks. Magnetite probably formed from hydrothermal ore-bearing fluids (~250–400°C) that replaced limestone and rhyolitic ash–siltstone, and subsequently recrystallised during metamorphism. The composition of magnetite from these rock types was measured using electron microprobe analysis and LA–ICP–MS. Utilisation of discrimination plots (Ca+Al+Mn vs. Ti+V and Ni/(Cr+Mn) vs. Ti+V) and trace-element variation diagrams (median concentrations of Mg, Al, Ti, V, Co, Mn, Zn and Ga) suggests that the composition of magnetite in sulfides from the Stollberg ore field more closely resembles that from skarns found elsewhere than previously published compositions of magnetite in metamorphosed volcanogenic massive sulfide deposits. Although the variation diagrams show that magnetite compositions from various rock types have similar patterns, principal component analyses and element–element variation diagrams indicate that its composition from the same rock type in different sulfide deposits can be distinguished. This suggests that bulk-rock composition also has a strong influence on magnetite composition. Principal component analyses also show that magnetite in sulfides has a distinctive compositional signature, which allows it to be a prospective pathfinder mineral for sulfide deposits in the Stollberg ore field.
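As an illustration of the multivariate step mentioned above, the sketch below runs a principal component analysis on log-transformed, standardized trace-element concentrations for the same eight elements. All values are synthetic placeholders; this is not the Stollberg dataset or the published discrimination scheme.

```python
# Illustrative sketch: PCA of magnetite trace-element concentrations.
# Element list matches the abstract; the data below are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

elements = ["Mg", "Al", "Ti", "V", "Co", "Mn", "Zn", "Ga"]
rng = np.random.default_rng(3)
# Two synthetic "rock type" groups with slightly different element medians.
group_a = rng.lognormal(mean=2.0, sigma=0.3, size=(30, len(elements)))
group_b = rng.lognormal(mean=2.4, sigma=0.3, size=(30, len(elements)))
X = np.vstack([group_a, group_b])

# Log-transform and standardize, a common pre-treatment for trace-element data.
X_scaled = StandardScaler().fit_transform(np.log10(X))
pca = PCA(n_components=2)
scores = pca.fit_transform(X_scaled)
print("explained variance ratio:", pca.explained_variance_ratio_)
print("PC1 separation of group means:",
      scores[:30, 0].mean() - scores[30:, 0].mean())
```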
This study examined struggles to establish autonomy and relatedness with peers in adolescence and early adulthood as predictors of advanced epigenetic aging assessed at age 30. Participants (N = 154; 67 male and 87 female) were observed repeatedly, along with close friends and romantic partners, from ages 13 through 29. Observed difficulty establishing close friendships characterized by mutual autonomy and relatedness from ages 13 to 18, an interview-assessed attachment state of mind lacking autonomy and valuing of attachment at 24, and self-reported difficulties in social integration across adolescence and adulthood were all linked to greater epigenetic age at 30, after accounting for chronological age, gender, race, and income. Analyses assessing the unique and combined effects of these factors, along with lifetime history of cigarette smoking, indicated that each of these factors, except for adult social integration, contributed uniquely to explaining epigenetic age acceleration. Results are interpreted as evidence that the adolescent preoccupation with peer relationships may be highly functional given the relevance of such relationships to long-term physical outcomes.
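Epigenetic age acceleration, the outcome examined above, is commonly operationalized as the residual from regressing epigenetic age on chronological age, with predictors and covariates entered in a second-stage model. The sketch below shows that two-step pattern on synthetic data; the variable names and values are assumptions, not the study's actual model or covariate set.

```python
# Sketch of "epigenetic age acceleration" as the residual of epigenetic age
# regressed on chronological age; covariates (gender, race, income, smoking)
# could be added as extra columns. Values are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 154
chron_age = np.full(n, 30.0) + rng.normal(0, 0.5, n)       # assessed around age 30
risk_score = rng.normal(0, 1, n)                           # e.g., peer-relationship difficulty
epigenetic_age = chron_age + 2.0 * risk_score + rng.normal(0, 2, n)

# Step 1: age acceleration = residual from regressing epigenetic on chronological age.
accel = sm.OLS(epigenetic_age, sm.add_constant(chron_age)).fit().resid

# Step 2: test whether the predictor explains acceleration.
model = sm.OLS(accel, sm.add_constant(risk_score)).fit()
print(model.params, model.pvalues)
```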
In May 2021, the Scientific Advisory Committee on Nutrition (SACN) published a risk assessment on lower carbohydrate diets for adults with type 2 diabetes (T2D)(1). The purpose of the report was to review the evidence on ‘low’-carbohydrate diets compared with the current UK government advice on carbohydrate intake for adults with T2D. However, since there is no agreed and widely utilised definition of a ‘low’-carbohydrate diet, comparisons in the report were between lower and higher carbohydrate diets. SACN’s remit is to assess the risks and benefits of nutrients, dietary patterns, food or food components for health by evaluating scientific evidence and to make dietary recommendations for the UK based on its assessment(2). SACN has a public health focus and only considers evidence in healthy populations unless specifically requested to do otherwise. Since the Committee does not usually make recommendations relating to clinical conditions, a joint working group (WG) was established in 2017 to consider this issue. The WG comprised members of SACN and members nominated by Diabetes UK, the British Dietetic Association, Royal College of Physicians and Royal College of General Practitioners. Representatives from NHS England and NHS Health Improvement, the National Institute for Health and Care Excellence and devolved health departments were also invited to observe the WG. The WG was jointly chaired by SACN and Diabetes UK.
COVID-19 vaccines are likely to be scarce for years to come. Many countries, from India to the U.K., have demonstrated vaccine nationalism. What are the ethical limits to this vaccine nationalism? Neither extreme nationalism nor extreme cosmopolitanism is ethically justifiable. Instead, we propose the fair priority for residents (FPR) framework, in which governments can retain COVID-19 vaccine doses for their residents only to the extent that they are needed to maintain a noncrisis level of mortality while they are implementing reasonable public health interventions. Practically, a noncrisis level of mortality is that experienced during a bad influenza season, which society considers an acceptable background risk. Governments take action to limit mortality from influenza, but there is no emergency that includes severe lockdowns. This “flu-risk standard” is a nonarbitrary and generally accepted heuristic. Mortality above the flu-risk standard justifies greater governmental interventions, including retaining vaccines for a country's own citizens over global need. The precise level of vaccination needed to meet the flu-risk standard will depend upon empirical factors related to the pandemic. This links the ethical principles to the scientific data emerging from the emergency. Thus, the FPR framework recognizes that governments should prioritize procuring vaccines for their country when doing so is necessary to reduce mortality to noncrisis flu-like levels. But after that, a government is obligated to do its part to share vaccines to reduce risks of mortality for people in other countries. We consider and reject objections to the FPR framework based on a country: (1) having developed a vaccine, (2) raising taxes to pay for vaccine research and purchase, (3) wanting to eliminate economic and social burdens, and (4) being ineffective in combating COVID-19 through public health interventions.
This study aimed to determine the incidence of laryngeal penetration and aspiration in elderly patients who underwent supracricoid laryngectomy with cricohyoidoepiglottopexy for laryngeal cancer.
Method
A retrospective analysis of dynamic videofluoroscopic swallowing studies was performed in patients who had received supracricoid laryngectomy with cricohyoidoepiglottopexy as a treatment for laryngeal cancers. Digital analysis of videofluoroscopic swallowing studies included measurements of displacement and timing related to swallowing safety.
Results
Videofluoroscopic swallowing studies from 52 patients were analysed. All participants were male and over 65 years old. Studies were performed five years after surgery. Among the 52 videofluoroscopic swallowing studies, analysis showed that an elevated pharyngeal constriction ratio (ratio greater than 0.0875; odds ratio = 5.2, p = 0.016), reduced pharyngoesophageal sphincter opening time (open for less than 0.6 seconds; odds ratio = 11.6, p = 0.00018) and reduced airway closure time (closed for less than 0.6 seconds; odds ratio = 10.6, p = 0.00057) were significantly associated with aspiration.
Conclusion
Deteriorated pharyngeal constriction, shortened airway closure and reduced pharyngoesophageal sphincter opening time are key factors for predicting laryngeal penetration or aspiration after supracricoid laryngectomy with cricohyoidoepiglottopexy.
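The threshold-based odds ratios reported in the Results above can be understood as arising from a 2×2 table of exposure (measurement beyond the cutoff) by outcome (aspiration). The sketch below computes such an odds ratio from invented counts; the numbers are not the study's data.

```python
# Sketch of a threshold-based odds ratio from a 2x2 table
# (exposure = pharyngeal constriction ratio > 0.0875, outcome = aspiration).
# Counts are invented for illustration only.
import numpy as np
from scipy.stats import fisher_exact

# rows: exposed / not exposed; columns: aspiration / no aspiration (synthetic counts)
table = np.array([[14, 10],
                  [ 6, 22]])

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.1f}, p = {p_value:.3g}")
```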
The Harmonic Scalpel and Ligasure (Covidien) devices are commonly used in head and neck surgery. Parotidectomy is a complex and intricate surgery that requires careful dissection of the facial nerve. This study aimed to compare surgical outcomes in parotidectomy using these haemostatic devices with traditional scalpel and cautery.
Method
A systematic review of the literature was performed with subsequent meta-analysis of seven studies that compared the use of haemostatic devices to traditional scalpel and cautery in parotidectomy. Outcome measures included: temporary facial paresis, operating time, intra-operative blood loss, post-operative drain output and length of hospital stay.
Results
A total of 7 studies representing 675 patients were identified: 372 patients were treated with haemostatic devices, and 303 patients were treated with scalpel and cautery. Statistically significant outcomes favouring the use of haemostatic devices included operating time, intra-operative blood loss and post-operative drain output. Outcome measures that did not favour either treatment included facial nerve paresis and length of hospital stay.
Conclusion
Overall, haemostatic devices were found to reduce operating time, intra-operative blood loss and post-operative drain output.
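A meta-analysis of continuous outcomes such as operating time usually pools study-level mean differences with a random-effects model. The sketch below implements DerSimonian-Laird pooling on seven invented effect sizes; it is illustrative only and does not reproduce the reviewed studies' results.

```python
# Illustrative random-effects (DerSimonian-Laird) pooling of mean differences.
# The seven effect sizes and standard errors are made up.
import numpy as np

effects = np.array([-25.0, -18.0, -32.0, -10.0, -20.0, -15.0, -28.0])  # e.g. minutes saved
se = np.array([8.0, 6.0, 10.0, 7.0, 9.0, 5.0, 11.0])

w_fixed = 1.0 / se**2
pooled_fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)

# DerSimonian-Laird between-study variance (tau^2).
q = np.sum(w_fixed * (effects - pooled_fixed) ** 2)
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (q - (len(effects) - 1)) / c)

w_random = 1.0 / (se**2 + tau2)
pooled = np.sum(w_random * effects) / np.sum(w_random)
pooled_se = np.sqrt(1.0 / np.sum(w_random))
print(f"pooled mean difference = {pooled:.1f} (95% CI "
      f"{pooled - 1.96 * pooled_se:.1f} to {pooled + 1.96 * pooled_se:.1f})")
```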
Substantial progress has been made in the standardization of nomenclature for paediatric and congenital cardiac care. In 1936, Maude Abbott published her Atlas of Congenital Cardiac Disease, which was the first formal attempt to classify congenital heart disease. The International Paediatric and Congenital Cardiac Code (IPCCC) is now utilized worldwide and has most recently become the paediatric and congenital cardiac component of the Eleventh Revision of the International Classification of Diseases (ICD-11). The most recent publication of the IPCCC was in 2017. This manuscript provides an updated 2021 version of the IPCCC.
The International Society for Nomenclature of Paediatric and Congenital Heart Disease (ISNPCHD), in collaboration with the World Health Organization (WHO), developed the paediatric and congenital cardiac nomenclature that is now within the eleventh version of the International Classification of Diseases (ICD-11). This unification of IPCCC and ICD-11 is the IPCCC ICD-11 Nomenclature and is the first time that the clinical nomenclature for paediatric and congenital cardiac care and the administrative nomenclature for paediatric and congenital cardiac care are harmonized. The resultant congenital cardiac component of ICD-11 was increased from 29 congenital cardiac codes in ICD-9 and 73 congenital cardiac codes in ICD-10 to 318 codes submitted by ISNPCHD through 2018 for incorporation into ICD-11. After these 318 terms were incorporated into ICD-11 in 2018, the WHO ICD-11 team added an additional 49 terms, some of which are acceptable legacy terms from ICD-10, while others provide greater granularity than the ISNPCHD thought was originally acceptable. Thus, the total number of paediatric and congenital cardiac terms in ICD-11 is 367. In this manuscript, we describe and review the terminology, hierarchy, and definitions of the IPCCC ICD-11 Nomenclature. This article, therefore, presents a global system of nomenclature for paediatric and congenital cardiac care that unifies clinical and administrative nomenclature.
The members of ISNPCHD realize that the nomenclature published in this manuscript will continue to evolve. The version of the IPCCC that was published in 2017 has evolved and changed, and it is now replaced by this 2021 version. In the future, ISNPCHD will again publish updated versions of IPCCC, as IPCCC continues to evolve.
Energy deficit is common during prolonged periods of strenuous physical activity and limited sleep, but the extent to which appetite suppression contributes is unclear. The aim of this randomised crossover study was to determine the effects of energy balance on appetite and physiological mediators of appetite during a 72-h period of high physical activity energy expenditure (about 9·6 MJ/d (2300 kcal/d)) and limited sleep designed to simulate military operations (SUSOPS). Ten men consumed an energy-balanced diet while sedentary for 1 d (REST) followed by energy-balanced (BAL) and energy-deficient (DEF) controlled diets during SUSOPS. Appetite ratings, gastric emptying time (GET) and appetite-mediating hormone concentrations were measured. Energy balance was positive during BAL (18 (sd 20) %) and negative during DEF (−43 (sd 9) %). Relative to REST, hunger, desire to eat and prospective consumption ratings were all higher during DEF (26 (sd 40) %, 56 (sd 71) %, 28 (sd 34) %, respectively) and lower during BAL (−55 (sd 25) %, −52 (sd 27) %, −54 (sd 21) %, respectively; P_condition < 0·05). Fullness ratings did not differ from REST during DEF, but were 65 (sd 61) % higher during BAL (P_condition < 0·05). Regression analyses predicted hunger and prospective consumption would be reduced and fullness increased if energy balance was maintained during SUSOPS, and energy deficits of ≥25 % would be required to elicit increases in appetite. Between-condition differences in GET and appetite-mediating hormones identified slowed gastric emptying, increased anorexigenic hormone concentrations and decreased fasting acylated ghrelin concentrations as potential mechanisms of appetite suppression. Findings suggest that physiological responses that suppress appetite may deter energy balance from being achieved during prolonged periods of strenuous activity and limited sleep.
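The regression step described above amounts to fitting appetite change against energy balance and solving for the balance at which the predicted change crosses zero. The sketch below does exactly that on synthetic points chosen so the crossing lands near a 25 % deficit; it is not the study's data or model.

```python
# Sketch: fit appetite change (%) against energy balance (%) and find the
# zero crossing (the deficit needed before appetite increases). Data are synthetic.
import numpy as np

energy_balance_pct = np.array([18, 10, 0, -10, -25, -43, -50], dtype=float)
hunger_change_pct = np.array([-52, -41, -31, -19, 1, 22, 29], dtype=float)

slope, intercept = np.polyfit(energy_balance_pct, hunger_change_pct, 1)
zero_crossing = -intercept / slope
print(f"predicted appetite increase once energy balance falls below "
      f"{zero_crossing:.0f}% (i.e., a deficit of about {abs(zero_crossing):.0f}%)")
```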
Although the Peritraumatic Distress Inventory (PDI) and Peritraumatic Dissociative Experiences Questionnaire (PDEQ) are both useful for identifying adults at risk of developing acute and chronic post-traumatic stress disorder (PTSD), they have not been validated in school-aged children. The present study aims to assess the psychometric properties of the PDI and PDEQ in a sample of French-speaking school-aged children.
Methods
One hundred and thirty-three school-aged victims of road traffic accidents were consecutively enrolled in this study via the emergency room. Mean (SD) age was 11.7 (2.2) years, and 56.4% (n = 75) were male. The 13-item self-report PDI (range 0-52) and the 10-item self-report PDEQ (range 10-50) were administered within one week of the accident. Symptoms of PTSD were assessed 1 and 6 months later using the 20-item self-report Child Post-Traumatic Stress Reaction Index (CPTS-RI) (range 0-80).
Results
Mean (SD) PDI and PDEQ scores were 19.1 (10.1) and 21.1 (7.6), respectively, while mean (SD) CPTS-RI scores at 1 and 6 months were 22.6 (12.4) and 20.6 (13.5), respectively. Cronbach's alpha coefficients were 0.8 and 0.77 for the PDI and PDEQ, respectively. The 1-month test-retest correlation coefficient (n = 33) was 0.77 for both measures. The PDI demonstrated a 2-factor structure while the PDEQ displayed a 1-factor structure. As with adults, the two measures were inter-correlated (r = 0.52) and correlated with subsequent PTSD symptoms (r = 0.21–0.56; p < 0.05).
Conclusions
The PDI and PDEQ are reliable and valid in school-aged children, and predict PTSD symptoms.
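Cronbach's alpha, the reliability statistic reported in the Results above, is straightforward to compute from an items-by-respondents matrix. The sketch below does so on randomly generated 13-item responses, so the resulting value is illustrative rather than the study's 0.8.

```python
# Minimal sketch: Cronbach's alpha over an items-by-respondents matrix.
# The 13-item response matrix is randomly generated, not the study's data.
import numpy as np

def cronbach_alpha(items):
    """items: 2D array, shape (n_respondents, n_items), e.g. 13 PDI items scored 0-4."""
    n_items = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return n_items / (n_items - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(5)
latent = rng.normal(size=(133, 1))                       # shared distress factor
responses = np.clip(np.round(2 + latent + rng.normal(scale=0.8, size=(133, 13))), 0, 4)
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```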