Over the last 20 years, disasters have increasingly involved children, and pediatric disaster medicine research is growing. However, this research is largely reactive, has not been categorized in terms of the disaster cycle, and is of variable quality. To understand the gaps in the current literature and to highlight areas for future research, we conducted a scoping review of the pediatric disaster medicine literature. This work will help create recommendations for future pediatric disaster medicine research.
Method:
Using a published framework for scoping reviews, we worked with a medical librarian and a multi-institutional team to define the research question, develop eligibility criteria, and identify a search strategy. We conducted a comprehensive Medline search covering 2001-2022, and the retrieved articles were distributed among nine reviewers. Each article was independently screened for inclusion by two reviewers; discrepancies were resolved by a third reviewer.
Inclusion criteria comprised English-language articles, related to any stage of the disaster cycle or to disaster education, that focused on or included pediatric populations and were published in academic, peer-reviewed journals, as well as policies from professional societies.
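As an illustrative aside, not part of the study: agreement between two independent screeners, as described above, is commonly quantified with Cohen's kappa before third-reviewer adjudication. A minimal Python sketch using hypothetical include/exclude decisions:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical include/exclude decisions (1 = include) from two independent
# reviewers over the same batch of screened articles; not study data.
reviewer_a = [1, 0, 0, 1, 1, 0, 1, 0, 0, 1]
reviewer_b = [1, 0, 1, 1, 1, 0, 1, 0, 0, 0]

kappa = cohen_kappa_score(reviewer_a, reviewer_b)
print(f"Cohen's kappa: {kappa:.2f}")

# Discrepant articles (here indices 2 and 9) would go to the third reviewer.
discrepancies = [i for i, (a, b) in enumerate(zip(reviewer_a, reviewer_b)) if a != b]
print("send to third reviewer:", discrepancies)
```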
Results:
A total of 967 pediatric disaster medicine articles were imported for screening, and 35 duplicates were removed. The remaining 932 articles were screened for relevance, and 109 were excluded. Three articles met inclusion criteria in 2000, compared with 66 in 2021. We noticed reactive spikes in the number of articles after major disasters. Most articles focused on preparedness and response, with only a few on recovery, mitigation, and prevention. Most studies used either qualitative or retrospective methodology. Most were single-site studies, and fewer than 10 meta-analyses were published over the 20 years.
Conclusion:
This scoping review describes the trends in and quality of existing pediatric disaster medicine literature. By identifying the gaps in this body of literature, we can better prioritize future research.
Patients presenting to hospital with suspected coronavirus disease 2019 (COVID-19), based on clinical symptoms, are routinely placed in a cohort together until polymerase chain reaction (PCR) test results are available. This procedure leads to delays in transfers to definitive areas and high nosocomial transmission rates. FebriDx is a finger-prick point-of-care test (PoCT) that detects an antiviral host response and has a high negative predictive value for COVID-19. We sought to determine the clinical impact of using FebriDx for COVID-19 triage in the emergency department (ED).
Design:
We undertook a retrospective observational study evaluating the real-world clinical impact of FebriDx as part of an ED COVID-19 triage algorithm.
Setting:
Emergency department of a university teaching hospital.
Patients:
Patients presenting with symptoms suggestive of COVID-19, placed in a cohort in a ‘high-risk’ area, were tested using FebriDx. Patients without a detectable antiviral host response were then moved to a lower-risk area.
Results:
Between September 22, 2020, and January 7, 2021, 1,321 patients were tested using FebriDx, and 1,104 (84%) did not have a detectable antiviral host response. Of these 1,104 patients, 865 (78%) were moved to a lower-risk area within the ED. The median times spent in a high-risk area were 52 minutes (interquartile range [IQR], 34–92) for FebriDx-negative patients and 203 minutes (IQR, 142–255) for FebriDx-positive patients (difference of −134 minutes; 95% CI, −144 to −122; P < .0001). The negative predictive value of FebriDx for the identification of COVID-19 was 96% (661 of 690; 95% CI, 94%–97%).
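As an illustrative aside: the reported negative predictive value and its confidence interval can be recomputed from the stated counts (661 of 690). A minimal Python sketch; the Wilson interval used here is an assumption, since the abstract does not state the paper's interval method:

```python
from statsmodels.stats.proportion import proportion_confint

# Counts from the abstract: 661 PCR-negative results among 690
# FebriDx-negative patients.
tn, n_neg = 661, 690

npv = tn / n_neg
low, high = proportion_confint(tn, n_neg, alpha=0.05, method="wilson")
print(f"NPV = {npv:.1%} (95% CI {low:.0%}-{high:.0%})")
```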
Conclusions:
FebriDx improved the triage of patients with suspected COVID-19 and reduced the time that severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) PCR-negative patients spent in a high-risk area alongside SARS-CoV-2–positive patients.
Migration is an established risk factor for developing a psychotic disorder in countries with a long history of migration. Less is known for countries with only a recent history of migration. This study aimed to determine the risk for developing a psychotic disorder in migrants to the Republic of Ireland.
Methods
We included all presentations of first-episode psychosis over 8.5 years to the DETECT early intervention in psychosis service in the Republic of Ireland (573 individuals aged 18–65, of whom 22% were first-generation migrants). Psychotic disorder diagnoses were made using the Structured Clinical Interview for DSM (SCID). The at-risk population was calculated using census data, and negative binomial regression was used to estimate incidence rate ratios.
Results
The annual crude incidence rate for a first-episode psychotic disorder in the total cohort was 25.62 per 100,000 population at risk. Migrants from Africa had a nearly twofold increased risk of developing a psychotic disorder compared with those born in the Republic of Ireland (IRR = 1.83, 95% CI 1.11–3.02, p = 0.02). In contrast, migrants from certain Asian countries had a reduced risk, specifically those from China, India, the Philippines, Pakistan, Malaysia, Bangladesh and Hong Kong (aIRR = 0.36, 95% CI 0.16–0.81, p = 0.01).
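As an illustrative aside: incidence rate ratios of this kind can be estimated by negative binomial regression with census person-years as an exposure offset. A minimal Python sketch; the counts below are hypothetical, not the study's data:

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical aggregate data: case counts and census person-years at risk
# by region of birth (Ireland-born as reference, then two migrant groups).
cases = np.array([95, 22, 6])
person_years = np.array([400_000, 48_000, 55_000])

# Indicator coding for the two migrant groups against the reference.
X = sm.add_constant(np.array([[0, 0], [1, 0], [0, 1]]))

fit = sm.GLM(cases, X, family=sm.families.NegativeBinomial(),
             exposure=person_years).fit()
print(np.exp(fit.params[1:]))  # incidence rate ratios vs the reference group
```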
Conclusions
Further research into the reasons for this inflated risk in specific migrant groups could produce insights into the aetiology of psychotic disorders. This information should also be used, alongside other data on environmental risk factors that can be determined from census data, to predict the incidence of psychotic disorders and thereby resource services appropriately.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible to laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves. These will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to that of full third-generation detectors at a fraction of the cost. Such sensitivity changes the expected detection rate of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and would potentially allow the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
Despite established clinical associations among major depression (MD), alcohol dependence (AD), and alcohol consumption (AC), the nature of the causal relationship between them is not completely understood. We leveraged genome-wide data from the Psychiatric Genomics Consortium (PGC) and UK Biobank to test for the presence of shared genetic mechanisms and causal relationships among MD, AD, and AC.
Methods
Linkage disequilibrium score regression and Mendelian randomization (MR) were performed using genome-wide data from the PGC (MD: 135 458 cases and 344 901 controls; AD: 10 206 cases and 28 480 controls) and UK Biobank (AC-frequency: 438 308 individuals; AC-quantity: 307 098 individuals).
Results
Positive genetic correlation was observed between MD and AD (r_g(MD−AD) = +0.47, P = 6.6 × 10⁻¹⁰). AC-quantity showed positive genetic correlation with both AD (r_g(AD−AC quantity) = +0.75, P = 1.8 × 10⁻¹⁴) and MD (r_g(MD−AC quantity) = +0.14, P = 2.9 × 10⁻⁷), while AC-frequency showed negative genetic correlation with MD (r_g(MD−AC frequency) = −0.17, P = 1.5 × 10⁻¹⁰) and no significant correlation with AD. MR analyses confirmed the presence of pleiotropy among these four traits. However, the MD-AD results reflect a mediated-pleiotropy mechanism (i.e. a causal relationship), with an effect of MD on AD (beta = 0.28, P = 1.29 × 10⁻⁶). There was no evidence for reverse causation.
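As an illustrative aside: one standard MR estimator is the inverse-variance-weighted (IVW) estimate across genetic instruments. A minimal Python sketch with invented SNP-level effects; the paper's actual MR procedure may differ:

```python
import numpy as np

# Invented instrument-level summary statistics (SNP effects on the exposure
# MD and on the outcome AD); real analyses use genome-wide summary data.
beta_x = np.array([0.10, 0.08, 0.12, 0.09])      # SNP -> exposure effects
beta_y = np.array([0.030, 0.020, 0.035, 0.024])  # SNP -> outcome effects
se_y = np.array([0.010, 0.012, 0.011, 0.010])    # outcome standard errors

# IVW estimate: weighted regression of outcome effects on exposure effects
# through the origin, with weights 1/se_y^2.
w = 1.0 / se_y**2
beta_ivw = np.sum(w * beta_x * beta_y) / np.sum(w * beta_x**2)
se_ivw = np.sqrt(1.0 / np.sum(w * beta_x**2))
print(f"IVW causal estimate: {beta_ivw:.3f} (SE {se_ivw:.3f})")
```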
Conclusion
This study supports a causal role for genetic liability of MD on AD based on genetic datasets including thousands of individuals. Understanding mechanisms underlying MD-AD comorbidity addresses important public health concerns and has the potential to facilitate prevention and intervention efforts.
Low-pressure regional aureoles with steep metamorphic field gradients are critical to understanding progressive metamorphism in high-temperature metasedimentary rocks. Delicately layered pelitic and psammitic metasedimentary rocks at Mt Stafford, central Australia, record a greenschist- to granulite-facies Palaeoproterozoic regional aureole, associated with S-type granite plutons, reflecting metamorphism in the range 500–800 °C at ∼3 kbar. The rocks experienced minimal deformation during metamorphism and partial melting. Partial melting textures evolve progressively along the steep metamorphic field gradient, from the incipient stages of melting marked by cuspate grains with low dihedral angles to melt proportions sufficient to form diatexite with schollen. Phase equilibria modelling in the NCKFMASHTO system for pelitic, semi-pelitic and high- and low-ferromagnesian psammitic samples quantitatively illustrates the dependence of partial melting on rock composition and water content. Pelitic compositions are more fertile than psammitic compositions when the water content in the rocks is low, especially during the early stages of melting. The whole-rock ferromagnesian component additionally influences melt fertility, with ferromagnesian-rich psammite being more fertile than psammite with a lower ferromagnesian component. Subtle variations in free water content can result in marked changes in melt volume but limited variation in melt composition. The distinct melting histories of pelitic and psammitic rocks inferred from field relationships may be partially attributed to differences in the water volume retained to super-solidus conditions. Melt composition is more dependent on the rock composition than on the variation in water content.
Background: Central neuropathic pain syndromes are a result of central nervous system injury, most commonly related to stroke, traumatic spinal cord injury, or multiple sclerosis. These syndromes are distinctly less common than peripheral neuropathic pain, and less is known regarding the underlying pathophysiology, appropriate pharmacotherapy, and long-term outcomes. The objective of this study was to determine the long-term clinical effectiveness of the management of central neuropathic pain relative to peripheral neuropathic pain at tertiary pain centers. Methods: Patients diagnosed with central (n=79) and peripheral (n=710) neuropathic pain were identified for analysis from a prospective observational cohort study of patients with chronic neuropathic pain recruited from seven Canadian tertiary pain centers. Data regarding patient characteristics, analgesic use, and patient-reported outcomes were collected at baseline and 12-month follow-up. The primary outcome measure was the composite of a reduction in average pain intensity and pain interference. Secondary outcome measures included assessments of function, mood, quality of life, catastrophizing, and patient satisfaction. Results: At 12-month follow-up, 13.5% (95% confidence interval [CI], 5.6-25.8) of patients with central neuropathic pain and complete data sets (n=52) achieved a ≥30% reduction in pain, whereas 38.5% (95% CI, 25.3-53.0) achieved a reduction of at least 1 point on the Pain Interference Scale. The proportion of patients with central neuropathic pain achieving both these measures, and thus the primary outcome, was 9.6% (95% CI, 3.2-21.0). Patients with peripheral neuropathic pain and complete data sets (n=463) were more likely to achieve this primary outcome at 12 months (25.3% of patients; 95% CI, 21.4-29.5) (p=0.012). Conclusion: Patients with central neuropathic pain syndromes managed in tertiary care centers were less likely to achieve a meaningful improvement in pain and function compared with patients with peripheral neuropathic pain at 12-month follow-up.
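As an illustrative aside: the reported group comparison (9.6% vs 25.3%, p=0.012) is consistent with a two-proportion z-test on counts back-calculated from the abstract (5 of 52 vs 117 of 463). A minimal Python sketch; the paper's actual test is not stated here:

```python
from statsmodels.stats.proportion import proportions_ztest

# Counts back-calculated from the reported percentages: 5/52 (central)
# vs 117/463 (peripheral) achieving the composite primary outcome.
successes = [5, 117]
totals = [52, 463]

stat, p = proportions_ztest(successes, totals)
print(f"z = {stat:.2f}, p = {p:.3f}")  # p close to the reported 0.012
```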
OBJECTIVES/SPECIFIC AIMS: The current treatment for amyotrophic lateral sclerosis (ALS) includes systemic delivery of neurotrophic factors (NTFs). Although this approach may seem theoretically sound, NTF efficacy within the central nervous system (CNS) is largely limited by the blood-brain barrier. Thus, a cell-based approach, which allows for targeted local delivery of molecular therapies within the CNS, could lead to a paradigm shift in the field. METHODS/STUDY POPULATION: The Windebank and Staff group at Mayo Clinic completed a Phase I dose-escalation safety trial of autologous, adipose-derived mesenchymal stem cells (adMSCs) in an effort to move toward personalized medical treatment of ALS. The adMSCs were injected into the intrathecal space by lumbar puncture in 27 patients, and the results showed an excellent safety profile across a range of doses. The team is moving forward with this idea by using gene-editing technology to develop clinical-grade, genetically modified autologous MSCs. The patient-derived adMSCs are modified at defined “safe-harbor” regions of the human genome through transcription activator-like effector nuclease (TALEN) technology. RESULTS/ANTICIPATED RESULTS: Our results show that electroporating adMSCs with plasmid DNA leads to efficient GFP or TALEN transgene expression but yields low cell survival and a low rate of genetic modification. DISCUSSION/SIGNIFICANCE OF IMPACT: It can be concluded that: (1) TALEN technology may be used to target safe-harbor loci for gene integration to produce therapeutic adMSCs for ALS. (2) Primary barriers to adMSC modification are inefficient TALEN and donor template uptake, low cutting efficiency, and poor cell survival after electroporation. Future directions include optimizing the protocol to obtain 48 base pairs in the homology arms and increasing transfection efficiency.
The development of algorithms for agile science and autonomous exploration has been pursued in contexts ranging from spacecraft to planetary rovers to unmanned aerial vehicles to autonomous underwater vehicles. In situations where time, mission resources and communications are limited and the future state of the operating environment is unknown, the capability of a vehicle to dynamically respond to changing circumstances without human guidance can substantially improve science return. Such capabilities are difficult to achieve in practice, however, because they require intelligent reasoning to utilize limited resources in an inherently uncertain environment. Here we discuss the development, characterization and field performance of two algorithms for autonomously collecting water samples on VALKYRIE (Very deep Autonomous Laser-powered Kilowatt-class Yo-yoing Robotic Ice Explorer), a glacier-penetrating cryobot deployed to the Matanuska Glacier, Alaska (Mission Control location: 61°42′09.3″N 147°37′23.2″W). We show performance on par with human performance across a wide range of mission morphologies using simulated mission data, and demonstrate the effectiveness of the algorithms at autonomously collecting samples with high relative cell concentration during field operation. The development of such algorithms will help enable autonomous science operations in environments where constant real-time human supervision is impractical, such as penetration of ice sheets on Earth and high-priority planetary science targets like Europa.
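The abstract does not describe the samplers' internals, so purely as a generic sketch of autonomous sample selection under a fixed sample budget (not the VALKYRIE algorithm), here is a simple explore-then-trigger threshold rule in Python:

```python
import random

def autonomous_sampler(readings, budget, explore_frac=0.3):
    """Observe an initial fraction of the descent to set a threshold, then
    trigger a sample whenever the noisy cell-concentration proxy exceeds it,
    until the fixed sample budget is exhausted."""
    n_explore = int(len(readings) * explore_frac)
    threshold = max(readings[:n_explore], default=0.0)
    triggers = []
    for i, value in enumerate(readings[n_explore:], start=n_explore):
        if value > threshold and len(triggers) < budget:
            triggers.append(i)
    return triggers

# Simulated descent profile: baseline noise plus two high-concentration layers.
random.seed(1)
profile = [random.gauss(1.0, 0.2) for _ in range(200)]
for layer in (80, 145):
    profile[layer] += 3.0
print(autonomous_sampler(profile, budget=2))  # depths where samples are taken
```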
Background: Painful diabetic neuropathy (PDN) is a frequent complication of diabetes mellitus. Current treatment recommendations are based on short-term trials, generally of ≤3 months’ duration. Limited data are available on the long-term outcomes of this chronic disease. The objective of this study was to determine the long-term clinical effectiveness of the management of chronic PDN at tertiary pain centres. Methods: From a prospective observational cohort study of patients with chronic neuropathic non-cancer pain recruited from seven Canadian tertiary pain centres, 60 patients diagnosed with PDN were identified for analysis. Data were collected according to Initiative on Methods, Measurement, and Pain Assessment in Clinical Trials guidelines including the Brief Pain Inventory. Results: At 12-month follow-up, 37.2% (95% confidence interval [CI], 23.0-53.3) of 43 patients with complete data achieved pain reduction of ≥30%, 51.2% (95% CI, 35.5-66.7) achieved functional improvement with a reduction of ≥1 on the Pain Interference Scale (0-10, Brief Pain Inventory) and 30.2% (95% CI, 17.2-46.1) had achieved both these measures. Symptom management included at least two medication classes in 55.3% and three medication classes in 25.5% (opioids, antidepressants, anticonvulsants). Conclusions: Almost one-third of patients being managed for PDN in a tertiary care setting achieve meaningful improvements in pain and function in the long term. Polypharmacy including analgesic antidepressants and anticonvulsants were the mainstays of effective symptom management.
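As an illustrative aside: the confidence interval for the composite outcome (13 of 43, about 30.2%) can be recomputed in Python; the exact Clopper-Pearson method used below is an assumption, as the paper's interval method is not stated:

```python
from statsmodels.stats.proportion import proportion_confint

# 13 of 43 patients achieved both the pain and interference improvements.
low, high = proportion_confint(13, 43, alpha=0.05, method="beta")  # Clopper-Pearson
print(f"{13/43:.1%} (95% CI {low:.1%}-{high:.1%})")
```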
Space applications have evolved to play a significant role in disaster relief by providing services including remote sensing imagery for mitigation and disaster damage assessments; satellite communication to provide access to medical services; positioning, navigation, and timing services; and data sharing. Common issues identified in past disaster response and relief efforts include lack of communication, delayed ordering of actions (eg, evacuations), and low levels of preparedness by authorities during and after disasters. We briefly summarize the Space for Health (S4H) Team Project, which was prepared during the Space Studies Program 2014 within the International Space University. The S4H Project aimed to improve the way space assets and experiences are used in support of public health during disaster relief efforts. We recommend an integrated solution based on nano-satellites or a balloon communication system, mobile self-contained relief units, portable medical scanning devices, and micro-unmanned vehicles that could revolutionize disaster relief and disrupt different markets. The recommended new system of coordination and communication using space assets to support public health during disaster relief efforts is feasible. Nevertheless, further actions should be taken by governments and organizations in collaboration with the private sector to design, test, and implement this system. (Disaster Med Public Health Preparedness. 2015;9:319-328)
The redshifted 21-cm line of neutral hydrogen (H I), potentially observable at low radio frequencies (~50–200 MHz), should be a powerful probe of the physical conditions of the intergalactic medium during Cosmic Dawn and the Epoch of Reionisation (EoR). The sky-averaged H I signal is expected to be extremely weak (~100 mK) in comparison to the foreground of up to 10⁴ K at the lowest frequencies of interest. The detection of such a weak signal requires an extremely stable, well-characterised system and a good understanding of the foregrounds. Development of a nearly perfectly (~mK accuracy) calibrated total power radiometer system is essential for this type of experiment. We present the BIGHORNS (Broadband Instrument for Global HydrOgen ReioNisation Signal) experiment, which was designed and built to detect the sky-averaged H I signal from the EoR at low radio frequencies. The BIGHORNS system is a mobile total power radiometer, which can be deployed in any remote location in order to collect data free of radio frequency interference (RFI). The system was deployed in remote, radio-quiet locations in Western Australia, and low-RFI sky data have been collected. We present a description of the system, its characteristics, details of data analysis, and calibration. We have identified multiple challenges to achieving the required measurement precision, which triggered two major improvements for the future system.
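To put the ~mK calibration requirement in context, the ideal total-power radiometer equation ΔT = T_sys/√(Bτ) sets the integration time needed for a given sensitivity. A small Python sketch with illustrative numbers (assumptions for the sake of the example, not BIGHORNS specifications):

```python
# Ideal radiometer equation: dT = T_sys / sqrt(bandwidth * tau).
T_sys = 1000.0   # system temperature dominated by the sky foreground [K]
dT = 1e-3        # target sensitivity [K], i.e. ~mK as quoted above
bandwidth = 1e6  # channel bandwidth [Hz]

tau = (T_sys / dT) ** 2 / bandwidth  # required integration time [s]
print(f"integration time: {tau:.3g} s = {tau / 86400:.1f} days")
```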
Accumulating evidence suggests that the Diagnostic and Statistical Manual of Mental Disorders (DSM) diagnostic criteria for cannabis abuse and dependence are best represented by a single underlying factor. However, it remains possible that models with additional factors, latent class models, or hybrid models may better explain the data. Using structured interviews, 626 adult male and female twins provided complete data on symptoms of cannabis abuse and dependence, plus a craving criterion. We compared latent factor analysis, latent class analysis, and factor mixture modeling using normal theory marginal maximum likelihood for ordinal data. Our aim was to derive a parsimonious, best-fitting cannabis use disorder (CUD) phenotype based on DSM-IV criteria and to determine whether DSM-5 craving loads onto a general factor. When compared with latent class and mixture models, factor models provided a better fit to the data. When conditioned on initiation and cannabis use, the associations between criteria for abuse, dependence, withdrawal, and craving were best explained by two correlated latent factors for males and females: a general risk factor for CUD and a factor capturing the symptoms of social and occupational impairment as a consequence of frequent use. Secondary analyses revealed a modest increase in the prevalence of DSM-5 CUD compared with DSM-IV cannabis abuse or dependence. It is concluded that, in addition to a general factor with loadings on cannabis use and symptoms of abuse, dependence, withdrawal, and craving, a second clinically relevant factor, defined by features of social and occupational impairment associated with frequent use, was also found.
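As a loose, continuous-data analogue of the model-comparison idea (the study itself used marginal maximum likelihood for ordinal data), one can compare a one-factor model against a latent-class-style mixture by BIC. A hedged Python sketch on simulated stand-in data, not the twin sample:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.mixture import GaussianMixture

# Simulated stand-in data: rows are respondents, columns are symptom scores.
rng = np.random.default_rng(0)
n, p = 626, 12
X = rng.normal(size=(n, p))

# Latent-class analogue: a two-component Gaussian mixture; BIC is built in.
gm = GaussianMixture(n_components=2, random_state=0).fit(X)
print("mixture BIC:", gm.bic(X))

# One-factor model: BIC computed from the average per-sample log-likelihood.
fa = FactorAnalysis(n_components=1, random_state=0).fit(X)
loglik = fa.score(X) * n      # score() returns mean log-likelihood per sample
k = p + p                     # p loadings + p residual variances (means ignored)
print("factor BIC:", k * np.log(n) - 2 * loglik)
```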