Blast injuries can occur by a multitude of mechanisms, including improvised explosive devices (IEDs), military munitions, and accidental detonation of chemical or petroleum stores. These injuries disproportionately affect people in low- and middle-income countries (LMICs), where there are often fewer resources to manage complex injuries and mass-casualty events.
Study Objective:
The aim of this systematic review is to describe the literature on the acute facility-based management of blast injuries in LMICs to aid hospitals and organizations preparing to respond to conflict- and non-conflict-related blast events.
Methods:
A search of Ovid MEDLINE, Scopus, Global Index Medicus, Web of Science, CINAHL, and Cochrane databases was used to identify relevant citations from January 1998 through July 2024. This systematic review was conducted in adherence with PRISMA guidelines. Data were extracted and analyzed descriptively. A meta-analysis calculated the pooled proportions of mortality, hospital admission, intensive care unit (ICU) admission, intubation and mechanical ventilation, and emergency surgery.
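A minimal, hypothetical sketch of how pooled proportions with 95% CIs of the kind reported below are commonly computed is given here: a DerSimonian-Laird random-effects meta-analysis of logit-transformed proportions. The study counts are invented for illustration only and do not come from the included articles.

```python
# Hypothetical sketch: random-effects pooling of proportions (DerSimonian-Laird).
# Event counts below are invented; they are not data from the review.
import numpy as np

def pooled_proportion(events, totals):
    events, totals = np.asarray(events, float), np.asarray(totals, float)
    p = (events + 0.5) / (totals + 1.0)                     # continuity-corrected proportions
    y = np.log(p / (1 - p))                                 # study-level logits
    v = 1 / (events + 0.5) + 1 / (totals - events + 0.5)    # approximate logit variances

    w = 1 / v                                               # fixed-effect weights
    y_fe = (w * y).sum() / w.sum()
    q = (w * (y - y_fe) ** 2).sum()                         # Cochran's Q
    c = w.sum() - (w ** 2).sum() / w.sum()
    tau2 = max(0.0, (q - (len(y) - 1)) / c)                 # between-study variance

    w_re = 1 / (v + tau2)                                   # random-effects weights
    y_re = (w_re * y).sum() / w_re.sum()
    se = np.sqrt(1 / w_re.sum())
    expit = lambda x: 1 / (1 + np.exp(-x))
    return expit(y_re), expit(y_re - 1.96 * se), expit(y_re + 1.96 * se)

# Illustrative data: admissions / presentations in five hypothetical studies
print(pooled_proportion([40, 120, 15, 200, 60], [90, 180, 60, 310, 150]))
```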
Results:
Reviewers screened 3,731 titles and abstracts and 173 full texts. Seventy-five articles from 22 countries were included for analysis. Only 14.7% of included articles came from low-income countries (LICs). Sixty percent of studies were conducted in tertiary care hospitals. The mean proportion of patients who were admitted was 52.1% (95% CI, 0.376 to 0.664). Among all in-patients, 20.0% (95% CI, 0.124 to 0.288) were admitted to an ICU. Overall, 38.0% (95% CI, 0.256 to 0.513) of in-patients underwent emergency surgery and 13.8% (95% CI, 0.023 to 0.315) were intubated. Pooled in-patient mortality was 9.5% (95% CI, 0.046 to 0.156) and total hospital mortality (including emergency department [ED] mortality) was 7.4% (95% CI, 0.034 to 0.124). There were no significant differences in mortality when stratified by country income level or hospital setting.
Conclusion:
Findings from this systematic review can be used to guide preparedness and resource allocation for acute care facilities. Pooled proportions for mortality and other outcomes described in the meta-analysis offer a metric by which future researchers can assess the impact of blast events. Under-representation of LICs and non-tertiary care medical facilities and significant heterogeneity in data reporting among published studies limited the analysis.
Baseline assessment of cognitive performance is common practice under many concussion management protocols and is required for collegiate athletes by the NCAA. The purpose of baseline cognitive assessment is to understand an athlete’s individual uninjured cognitive performance, as opposed to using population normative data. This baseline can then serve as a reference point for recovery after concussion and can inform return-to-play decisions. However, multiple factors, including lack of effort, can contribute to misrepresentation of baseline results, which raises concerns about their reliability during return-to-play decision-making. Measuring effort across a continuum, rather than as a dichotomous variable (good versus poor effort), may provide informative insight into cognitive performance at baseline.
Participants and Methods:
Collegiate athletes (n = 231) completed the Immediate Post-Concussion Assessment and Cognitive Test (ImPACT) as part of their baseline pre-participation concussion evaluation. ImPACT creates composite scores of Verbal Memory, Visual Memory, Visual-Motor Speed, and Reaction Time. Baseline self-reported symptoms and total hours of sleep the night prior to testing are also collected through ImPACT. ImPACT has one embedded indicator within the program to assess effort, and research has identified an additional three embedded indicators. Athletes were also administered one stand-alone performance validity test, either the Medical Symptom Validity Test (n = 130) or the Rey Dot Counting Test (n = 101), to independently measure effort. Effort was estimated across a continuum (zero, one, two, or three or more failed effort indicators) with both stand-alone and embedded effort indicators. We evaluated the relationship between effort, symptoms, self-reported sleep, Reaction Time composite score and Visual-Motor Speed composite score using a linear regression model.
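The following is a hedged sketch, on simulated data rather than the study dataset, of the kind of linear model described above: a composite score regressed on the number of failed effort indicators (treated as a categorical variable), symptom total, and hours of sleep, with an effort-by-sleep term to mirror the reported sleep finding. Variable names and effect sizes are assumptions for illustration.

```python
# Hedged sketch of an effort/sleep linear regression on an ImPACT-style composite.
# All data are simulated; coefficients and variable names are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 231
df = pd.DataFrame({
    "effort_group": rng.choice(["0", "1", "2", "3plus"], size=n),   # failed indicators
    "symptoms": rng.poisson(3, size=n),                             # symptom total
    "sleep_hours": np.round(rng.normal(7, 1.2, size=n), 1),         # sleep night before
})
# Simulated outcome: lower (worse) visual-motor speed with more failed indicators
penalty = df["effort_group"].map({"0": 0, "1": 1, "2": 3, "3plus": 5})
df["visual_motor_speed"] = (40 - penalty + 0.5 * (df["sleep_hours"] - 7)
                            + rng.normal(0, 2, size=n))

model = smf.ols("visual_motor_speed ~ C(effort_group) * sleep_hours + symptoms",
                data=df).fit()
print(model.summary())
```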
Results:
We found that 121 athletes passed all effort indicators, while 39 athletes failed only one effort indicator, 40 athletes failed two effort indicators, and 31 athletes failed three or four (three+) effort indicators. Self-reported symptoms and total hours of sleep were not related to effort, but Reaction Time and Visual-Motor Speed composites were. Specifically, performance on the Visual-Motor Speed composite was significantly worse for athletes who failed two or three+ effort indicators compared to athletes who did not fail any, and performance on the Reaction Time composite was significantly worse only for athletes who failed three+ effort indicators. Additionally, athletes who failed one or more effort indicators and reported less sleep performed worse on both the Visual-Motor Speed and Reaction Time composites, compared to those who reported less sleep and did not fail any effort indicators.
Conclusions:
Athletes who failed one effort indicator did not perform significantly worse on Reaction Time and Visual-Motor Speed composites compared to those who passed all effort indicators. However, 31% of athletes failed two or more effort indicators, and these athletes performed worse on cognitive tests, likely due to factors impacting their ability to put forth good effort. These results suggest that effort is more complex than the previously used dichotomous variable and highlight the importance of using several indicators of effort throughout baseline assessments. In addition, the importance of sleep should be emphasized during baseline assessments, especially when effort is questionable.
The social defeat hypothesis (SDH) suggests that a chronic experience of social defeat increases the likelihood of the development of psychosis. The SDH indicates that a negative experience of exclusion leads to an increase in the baseline activity of the mesolimbic dopamine system (MDS), which in turn leads to the onset of psychosis. Social defeat has previously been modelled in animal studies and the preclinical literature; however, these theories have not been fully tested in human clinical samples. Some studies imply changes in brain structure due to social defeat experiences; however, the research evidence is mixed.
Objectives
This study aims to uncover whether exposure to social defeat (SoDe) has an impact on brain structure. Furthermore, we hope to understand whether these changes are relevant to other mental health disorders.
Methods
698 participants (506 no SoDe, 191 SoDe) aged 15-41 years were recruited from the PRONIA-FP7 study. SoDe was measured with the self-report questionnaires the ‘Bullying Scale’ and ‘The Everyday Discrimination Scale’. T1-weighted structural MRI data were processed, and five two-sample t-test analyses were carried out to compare grey matter volume (GMV) differences in the entire sample and between the four groups.
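As a rough illustration of the core of such a comparison, the sketch below runs a voxel-wise two-sample t-test on simulated GMV maps. Real VBM pipelines (e.g., SPM or CAT12) additionally smooth the data, model covariates, and apply proper multiple-comparison correction; the arrays here are simulated stand-ins, not study data.

```python
# Hedged sketch: voxel-wise two-sample t-test on simulated grey-matter volume maps.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_sode, n_ctrl, n_vox = 191, 506, 10_000              # group sizes from the abstract
gmv_sode = rng.normal(0.45, 0.05, (n_sode, n_vox))    # simulated modulated GMV values
gmv_ctrl = rng.normal(0.45, 0.05, (n_ctrl, n_vox))

t, p = stats.ttest_ind(gmv_sode, gmv_ctrl, axis=0)    # one test per voxel
p_bonf = np.minimum(p * n_vox, 1.0)                   # crude multiple-comparison control
print("voxels surviving p<0.05 (Bonferroni):", int((p_bonf < 0.05).sum()))
```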
Results
The VBM analysis showed significant group interactions in the right thalamus proper, along with differences in left cerebral white matter, when comparing participants who had experienced SoDe with those who had not across all four groups. In the recent-onset psychosis (ROP) subgroup, significant group interactions were found in the left cerebellar white matter, right cerebral white matter, left cerebral white matter, and right thalamus proper.
Conclusions
The findings suggest significant group interactions in the thalamus and cerebral white matter. This is in keeping with some previous research suggesting volumetric changes in the thalamus due to stress and psychosis. Similarly, for white matter there is some evidence of differences due to SoDe and psychosis. However, research in this area is scarce and findings are mixed, so the evidence remains inconclusive. In the ROP group analysis, significant group interactions were present in the cerebellum in relation to SoDe experience. The cerebellum has been implicated in many domains, including social interaction, higher-order cognition, working memory, cognitive flexibility, and psychotic symptoms; given these divergent findings, the role of the cerebellum in SoDe in the ROP population remains an open question. Nonetheless, this large-scale study presents some interesting novel findings and opens the way to a new area of research. Further analysis will explore the relationship between groups on markers of stress (CRP) and neuroinflammation as potential mediators of the environmental effects of SoDe.
The Kessler Psychological Distress Scales (K10 and K6) are used as screening tools to assess psychological distress and are the first-line assessment of need for help in the Headspace services.
Objectives
Thus, we studied the psychometric properties of their German versions in a Swiss community sample to evaluate their potential usefulness to screen for mental disorders or relevant mental problems in low threshold transdiagnostic German-speaking services.
Methods
The sample consisted of 829 citizens of the Swiss canton of Bern aged 19-43 years. The K10/K6 were validated against Mini-International Neuropsychiatric Interview (M.I.N.I.) diagnoses and against questionnaires on health status and quality of life. Receiver Operating Characteristic (ROC) curve analyses were used to test general discriminative ability and to select optimal cut-offs of the K10 and K6 for non-psychotic full-blown and subthreshold mental disorders.
Results
Cronbach’s alphas were 0.81 (K10) and 0.70 (K6). ROC analyses indicated much lower optimal thresholds than earlier suggested: 10 for the K10 and 6 for the K6. At these thresholds, against M.I.N.I. diagnoses, Cohen’s kappa (≤0.173) and correspondence rates (≤58.14%) were insufficient throughout. Values were higher at the earlier suggested thresholds, yet at the cost of sensitivity, which was below 0.5 in all but three cases and below 0.3 in all but six.
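The sketch below illustrates, on simulated data rather than the Bern sample, the two computations reported here: Cronbach's alpha for internal consistency and an ROC-based optimal cut-off. The cut-off criterion shown is Youden's J, one common choice; the abstract does not state which criterion the authors used.

```python
# Hedged sketch: Cronbach's alpha and an ROC cut-off on simulated K10-style data.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(0)
k10_items = rng.integers(1, 6, size=(829, 10))      # simulated K10 item responses (1-5)
total = k10_items.sum(axis=1)                       # K10 total score (10-50)
diagnosis = rng.binomial(1, 0.25, size=829)         # simulated M.I.N.I. diagnosis flag

print("alpha:", round(cronbach_alpha(k10_items), 2))
fpr, tpr, thresholds = roc_curve(diagnosis, total)
best = np.argmax(tpr - fpr)                         # Youden's J = sens + spec - 1
print("AUC:", round(roc_auc_score(diagnosis, total), 2),
      "optimal cut-off:", thresholds[best])
```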
Conclusions
Given their lack of sufficient validity and sensitivity, respectively, our findings suggest that both the K10 and K6 would be of only limited use in a low-threshold transdiagnostic mental health service – comparable to Headspace – for young adults in Switzerland and likely other German-speaking countries.
Minority and older adult patients remain underrepresented in cancer clinical trials (CCTs). The current study sought to examine sociodemographic inequities in CCT interest, eligibility, enrollment, decline motivation, and attrition across two psychosocial CCTs for gynecologic, gastrointestinal, and thoracic cancers.
Methods:
Patients were approached for recruitment to one of two interventions: (1) a randomized controlled trial (RCT) examining effects of a cognitive-behavioral intervention targeting sleep, pain, mood, cytokines, and cortisol following surgery, or (2) a yoga intervention to determine its feasibility, acceptability, and effects on mitigating distress. Prospective RCT participants were queried about interest and screened for eligibility. All eligible patients across trials were offered enrollment. Patients who declined yoga intervention enrollment provided reasons for decline. Sociodemographic predictors of enrollment decisions and attrition were explored.
Results:
No sociodemographic differences in RCT interest were observed, and older patients were more likely to be ineligible. Eligible Hispanic patients across trials were significantly more likely to enroll than non-Hispanic patients. Sociodemographic factors predicted differences in decline motivation. In one trial, individuals originating from more urban areas were more likely to prematurely discontinue participation.
Discussion:
These results corroborate evidence of no significant differences in CCT interest across minority groups, with older adults less likely to fulfill eligibility criteria. While absolute Hispanic enrollment was modest, Hispanic patients were more likely to enroll relative to non-Hispanic patients. Additional sociodemographic trends were noted in decline motivation and geographical prediction of attrition. Further investigation is necessary to better understand inequities, barriers, and best recruitment practices for representative CCTs.
The neural mechanisms contributing to the social problems of pediatric brain tumor survivors (PBTS) are unknown. Face processing is important to social communication, social behavior, and peer acceptance. Research with other populations with social difficulties, namely autism spectrum disorder, suggests atypical brain activation in areas important for face processing. This case-controlled functional magnetic resonance imaging (fMRI) study compared brain activation during face processing in PBTS and typically developing (TD) youth.
Methods:
Participants included 36 age-, gender-, and IQ-matched youth (N = 18 per group). PBTS were at least 5 years from diagnosis and 2 years from the completion of tumor therapy. fMRI data were acquired during a face identity task and a control condition. Groups were compared on activation magnitude within the fusiform gyrus for the faces condition compared to the control condition. Correlational analyses evaluated associations between neuroimaging metrics and indices of social behavior for PBTS participants.
Results:
Both groups demonstrated face-specific activation within the social brain for the faces condition compared to the control condition. PBTS showed significantly decreased activation for faces in the medial portions of the fusiform gyrus bilaterally compared to TD youth, ps ≤ .004. Higher peak activity in the left fusiform gyrus was associated with better socialization (r = .53, p < .05).
Conclusions:
This study offers initial evidence of atypical activation in a key face processing area in PBTS. Such atypical activation may underlie some of the social difficulties of PBTS. Social cognitive neuroscience methodologies may elucidate the neurobiological bases for PBTS social behavior.
We present a broad study of linear, clustered, noble gas puffs irradiated with the frequency-doubled (527 nm) Titan laser at Lawrence Livermore National Laboratory. Pure Ar, Kr, and Xe clustered gas puffs, as well as two mixed-gas puffs consisting of KrAr and XeKrAr gases, make up the targets. Characterization experiments to determine gas-puff density show that varying the experimental parameter gas-delay timing (the delay between gas puff initialization and laser-gas-puff interaction) provides a simple control over the gas-puff density. X-ray emission (>1.4 keV) is studied as a function of gas composition, density, and delay timing. Xe gas puffs produce the strongest peak radiation in the several keV spectral region. The emitted radiation was found to be anisotropic, with smaller X-ray flux observed in the direction perpendicular to both the laser beam propagation and polarization directions. The degree of anisotropy is independent of gas target type but increases with photon energy. X-ray spectroscopic measurements estimate plasma parameters and highlight differences from previous studies. Electron beams with energy in excess of 72 keV are present in the noble gas-puff plasmas, and results indicate that Ar plays a key role in their production. A drastic increase in harder X-ray emissions (X-ray flash effect) and multi-MeV electron-beam generation from Xe gas-puff plasma occurred when the laser beam was focused on the front edge of the linear gas puff.
Basic symptoms, defined as subjectively perceived disturbances in thought, perception and other essential mental processes, have been established as a predictor of psychotic disorders. However, the relationship between basic symptoms and family history of a transdiagnostic range of severe mental illness, including major depressive disorder, bipolar disorder and schizophrenia, has not been examined.
Aims
We sought to test whether non-severe mood disorders and severe mood and psychotic disorders in parents are associated with increased basic symptoms in their biological offspring.
Method
We measured basic symptoms using the Schizophrenia Proneness Instrument – Child and Youth Version in 332 youth aged 8–26 years, including 93 offspring of control parents, 92 offspring of a parent with non-severe mood disorders, and 147 offspring of a parent with severe mood and psychotic disorders. We tested the relationships between parent mental illness and offspring basic symptoms in mixed-effects linear regression models.
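A minimal sketch of such a mixed-effects linear regression is shown below on simulated data: offspring basic-symptom scores regressed on parent diagnosis group, with a random intercept for family to account for siblings. Group labels, effect sizes, and the family structure are assumptions for illustration, not the study data.

```python
# Hedged sketch: mixed-effects model of offspring symptoms by parent diagnosis group.
# Simulated data; grouping variable "family" gives each family a random intercept.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n, n_fam = 332, 200
fam = rng.integers(0, n_fam, size=n)                 # family IDs (siblings share one)
group = rng.choice(["control", "non_severe_mood", "severe_mood_psychotic"], size=n)
fam_effect = rng.normal(0, 0.5, n_fam)[fam]          # shared family-level variation
group_effect = pd.Series(group).map(
    {"control": 0.0, "non_severe_mood": 0.3, "severe_mood_psychotic": 0.7})
score = 1.0 + group_effect.to_numpy() + fam_effect + rng.normal(0, 1, n)

df = pd.DataFrame({"basic_symptoms": score, "group": group, "family": fam})
model = smf.mixedlm("basic_symptoms ~ C(group, Treatment('control'))",
                    df, groups=df["family"]).fit()
print(model.summary())
```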
Results
Offspring of a parent with severe mood and psychotic disorders (B = 0.69, 95% CI 0.22–1.16, P = 0.004) or illness with psychotic features (B = 0.68, 95% CI 0.09–1.27, P = 0.023) had significantly higher basic symptom scores than control offspring. Offspring of a parent with non-severe mood disorders reported intermediate levels of basic symptoms that did not significantly differ from those of control offspring.
Conclusions
Basic symptoms during childhood are a marker of familial risk of psychopathology that is related to severity and is not specific to psychotic illness.
A fine-grained, up to 3-m-thick tephra bed in southwestern Saskatchewan, herein named Duncairn tephra (Dt), is derived from an early Pleistocene eruption in the Jemez Mountains volcanic field of New Mexico, requiring a trajectory of northward tephra dispersal of ~1500 km. An unusually low CaO content in its glass shards rules out a source in the closer Yellowstone and Heise volcanic fields, whereas a Pleistocene tephra bed (LSMt) in the La Sal Mountains of Utah has a very similar glass chemistry to that of the Dt, supporting a more southerly source. Comprehensive characterization of these two distal tephra beds along with samples collected near the Valles caldera in New Mexico, including grain size, mineral assemblage, major- and trace-element composition of glass and minerals, paleomagnetism, and fission-track dating, justifies this correlation. Two glass populations each exist in the Dt and LSMt. The proximal correlative of Dt1 is the Plinian Tsankawi Pumice and co-ignimbritic ash of the first ignimbrite (Qbt1g) of the 1.24 Ma Tshirege Member of the Bandelier Tuff. The correlative of Dt2 and LSMt is the co-ignimbritic ash of Qbt2. Mixing of Dt1 and Dt2 probably occurred during northward transport in a jet stream.
Herbicides used in corn (Zea mays L.), sorghum [Sorghum bicolor (L.) Moench], and soybeans [Glycine max (L.) Merr.] were applied in the spring and their persistence into late summer was determined during 1974 to 1976. Composite soil samples from the top 5 cm of field plots were taken each August and bioassayed in the greenhouse. Bioassay species used were winter wheat (Triticum aestivum L.) and soybeans for herbicides used in corn and sorghum, and winter wheat and white mustard (Brassica hirta Moench) for herbicides used in soybeans. Soil persistence of triazine herbicides caused more injury to winter wheat, soybeans, and white mustard than any other class of herbicides tested. Atrazine [2-chloro-4-(ethylamino)-6-(isopropylamino)-s-triazine] showed the most soil persistence of the five triazines evaluated. At normal field application rates, herbicides other than the triazines showed little injury to the bioassay plants. Soil persistence of herbicides was further reduced when combinations of reduced rates of each herbicide were utilized. Herbicides used for spring applications in corn showed more soil persistence in August than did the herbicides for sorghum, while herbicides for soybeans generally were least persistent. Postemergence herbicide applications resulted in more injury in bioassay species than preplant incorporated or preemergence applications. Persistence of some herbicides will restrict certain options to the grower such as changing crops in case of crop failure, fall planting of winter wheat, double cropping, or certain crop rotations.
All land uses in eastern and southeastern Nebraska were infested to some extent with hemp dogbane (Apocynum cannabinum L.). The highest infestations were observed in oats (Avena sativa L.) and soybeans [Glycine max (L.) Merr.] and the lowest infestations were in alfalfa (Medicago sativa L.), pastures, and winter wheat (Triticum aestivum L.). Yield reductions from hemp dogbane infestations ranged from 0 to 10% in corn (Zea mays L.), 28 to 41% in soybeans, and 37 to 45% in sorghum [Sorghum bicolor (L.) Moench]. Emergence of hemp dogbane from crown roots occurred when the soil temperature was 17 to 19 C, during April in 1977 and 1978. Plants attained the bud stage within 4 to 7 weeks after emergence. Early flower, full bloom, and pod initiation occurred subsequently at about 1 week intervals. Seeds produced were first viable 10 weeks after full bloom. Root activity or regenerative capacity as measured by length and number of new shoots and roots produced at monthly intervals in the germinator showed a cyclic pattern. The highest activity occurred in the spring and late fall and lowest activity in summer and early fall. Protein levels in the roots ranged from 7 to 9% in the fall and spring to 4 to 5% during the summer. Percentage total nonstructural carbohydrates (TNC) ranged from 20 to 31% in lateral roots and 32 to 53% in crown roots, but there was not a consistent cyclic pattern of percentage TNC during the growing season.
Lanolin or lanolin + corn (Zea mays L.) starch rings are often used as barriers on leaves to prevent runoff of foliarly applied 14C-herbicide treatments. A preliminary experiment showed that 64 and 90% of the applied 2,4-D [(2,4-dichlorophenoxy)acetic acid] and 57 and 87% of the applied glyphosate [N-(phosphonomethyl)glycine] was adsorbed to or absorbed into a lanolin and lanolin + starch ring, respectively, during 6 days on a glass slide. Absorption and translocation of 2,4-D in hemp dogbane (Apocynum cannabinum L.) were decreased from 26% to 16 or 17% of the total applied when a lanolin or lanolin + starch ring was used. Glyphosate absorption and translocation increased with the lanolin ring but not with the lanolin + starch ring. Distribution of the translocated 2,4-D and glyphosate was also altered by use of the ring barriers. Results indicate that the lanolin ring should be avoided in 14C-herbicide absorption studies intended to simulate field conditions.
Low recoveries of total applied 14C in translocation studies and erratic control of hemp dogbane (Apocynum cannabinum L.) in the field showed a need for a balance-sheet study of absorption, translocation, and metabolism of 14C-2,4-D [(2,4-dichlorophenoxy) acetic acid] and 14C-glyphosate [N-(phosphonomethyl)glycine]. Total recovery of 14C-herbicides applied to hemp dogbane in the laboratory was 97% for 2,4-D and 105% for glyphosate. Of the 14C recovered after 12 days in the hemp dogbane, 34 to 55% was parent-2,4-D after 2,4-D treatment, and 93 to 96% was parent glyphosate after glyphosate treatment. Only negligible amounts of 14C were lost via volatilization or evolution as 14CO2. A broadcast treatment with unlabeled herbicide did not significantly affect subsequent absorption, translocation, or metabolism of either herbicide. Total herbicide absorbed and translocated out of the treated area of the leaf generally increased during the subsequent 12 days for 2,4-D but only 3 days for glyphosate. A greater percentage of the total applied 2,4-D (31 vs. 14%) and glyphosate (14 vs. 8%) was translocated from upper rather than lower leaves of hemp dogbane, respectively. Higher temperatures (30 vs. 25 C) resulted in greater translocation of glyphosate (39 vs. 18%) but not 2,4-D (35 vs. 39%). Higher light intensities resulted in greater accumulations of 2,4-D into roots and of glyphosate into untreated areas of the treated leaf. Autoradiographs showed that both herbicides moved through hemp dogbane in a typical symplastic pattern and accumulated in roots and new leaves.
The aim of this study was to examine cross-sectionally whether higher cardiorespiratory fitness (CRF) might favorably modify amyloid-β (Aβ)-related decrements in cognition in a cohort of late-middle-aged adults at risk for Alzheimer’s disease (AD). Sixty-nine enrollees in the Wisconsin Registry for Alzheimer’s Prevention participated in this study. They completed a comprehensive neuropsychological exam, underwent 11C Pittsburgh Compound B (PiB)-PET imaging, and performed a graded treadmill exercise test to volitional exhaustion. Peak oxygen consumption (VO2peak) during the exercise test was used as the index of CRF. Forty-five participants also underwent lumbar puncture for collection of cerebrospinal fluid (CSF) samples, from which Aβ42 was immunoassayed. Covariate-adjusted regression analyses were used to test whether the association between Aβ and cognition was modified by CRF. There were significant VO2peak*PiB-PET interactions for Immediate Memory (p=.041) and Verbal Learning & Memory (p=.025). There were also significant VO2peak*CSF Aβ42 interactions for Immediate Memory (p<.001) and Verbal Learning & Memory (p<.001). Specifically, in the context of high Aβ burden, that is, increased PiB-PET binding or reduced CSF Aβ42, individuals with higher CRF exhibited significantly better cognition compared with individuals with lower CRF. In a late-middle-aged, at-risk cohort, higher CRF is associated with a diminution of Aβ-related effects on cognition. These findings suggest that exercise might play an important role in the prevention of AD. (JINS, 2015, 21, 841–850)
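The sketch below illustrates, on simulated data rather than the Wisconsin Registry cohort, the covariate-adjusted moderation analysis described above: cognition regressed on amyloid burden, VO2peak, and their interaction, adjusting for covariates. Variable names, covariates, and effect sizes are assumptions for illustration.

```python
# Hedged sketch: CRF x amyloid moderation of cognition (simulated data only).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 69
df = pd.DataFrame({
    "pib_dvr": rng.normal(1.1, 0.15, size=n),    # simulated PiB-PET amyloid burden
    "vo2peak": rng.normal(28.0, 6.0, size=n),    # simulated VO2peak (mL/kg/min)
    "age": rng.normal(60.0, 5.0, size=n),
    "sex": rng.integers(0, 2, size=n),
})
# Simulated memory score: amyloid-related decrement that shrinks as fitness rises
df["memory_z"] = (-2.0 * df["pib_dvr"] + 0.05 * df["pib_dvr"] * df["vo2peak"]
                  + rng.normal(0, 0.5, size=n))

model = smf.ols("memory_z ~ pib_dvr * vo2peak + age + C(sex)", data=df).fit()
print(model.params[["pib_dvr", "vo2peak", "pib_dvr:vo2peak"]])
```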
Norovirus outbreaks occur frequently in Denmark and it can be difficult to establish whether apparently independent outbreaks have the same origin. Here we report on six outbreaks linked to frozen raspberries, investigated separately over a period of 3 months. Noroviruses from stools were sequence-typed, including extended sequencing of 1138 bp encompassing the hypervariable P2 region of the capsid gene. Norovirus was detected in 27 stool samples. Genotyping showed genotype GI.Pb_GI.6 (polymerase/capsid) with 100% identical sequences. Samples from five outbreaks were furthermore identical over the variable capsid P2 region. In one outbreak at a hospital canteen, frozen raspberries were associated with illness in a cohort investigation (relative risk 6·1, 95% confidence interval 3·2–11). Bags of raspberries suspected to be the source were positive for genogroup I and II noroviruses; one typable virus was genotype GI.6 (capsid). These molecular investigations showed that the apparently independent outbreaks were the result of one contamination event of frozen raspberries. The contaminated raspberries originated from a single producer in Serbia and were originally not considered to belong to the same batch. The outbreaks led to consultations and mutual visits between producers, investigators and authorities. Further, Danish legislation was changed to make heat-treatment of frozen raspberries compulsory in professional catering establishments.
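For readers unfamiliar with the cohort calculation behind a relative risk such as the RR of 6.1 (95% CI 3.2-11) cited above, the sketch below shows the standard 2x2 computation. The exposure counts are invented for illustration and are not the outbreak data.

```python
# Hedged sketch: relative risk with a 95% CI from a 2x2 cohort table (invented counts).
import numpy as np

def relative_risk(a, b, c, d):
    """a,b: ill/not-ill among exposed; c,d: ill/not-ill among unexposed."""
    rr = (a / (a + b)) / (c / (c + d))
    se_log = np.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))   # SE of log(RR)
    lo, hi = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * se_log)
    return rr, lo, hi

print(relative_risk(30, 20, 5, 45))   # illustrative counts only
```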
The cherry scallop shell moth, Hydria prunivorata (Ferguson), is a colonial feeder on black cherry, Prunus serotina Ehrh. Pupae overwinter in the litter and adult emergence occurs from May through September. Eggs are laid on the foliage in pyramidal-shaped masses 23–26 days after adult emergence. Eggs begin to hatch 4 days following oviposition and each of the four larval stages lasts 4–6 days. There is one generation per year in New York. The egg parasite Telenomus sp. is the principal mortality factor occurring in populations that have remained at outbreak levels for 2 or more years. The life stages of H. prunivorata are described and control recommendations discussed. The peach bark beetle, Phloeotribus liminaris (Harris), may kill black cherry trees that are stressed by heavy defoliation.