The occurrence of behavioral health emergencies (BHEs) in children is increasing in the United States, and presentations of these patients to Emergency Medical Services (EMS) are rising in parallel. However, detailed evaluations of EMS encounters for pediatric BHEs at the national level have not been reported.
Methods:
This was a secondary analysis of a national convenience sample of EMS electronic patient care records (ePCRs) collected from January 1, 2018 through December 31, 2021. Inclusion criteria were all EMS activations documented as 9-1-1 responses involving patients < 18 years of age with a primary or secondary provider impression of a BHE. Patient demographics, incident characteristics, and clinical variables including administration of sedation medications, use of physical restraint, and transport status were examined overall and by calendar year.
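For illustration, a minimal sketch of this kind of cohort selection and year-by-year tabulation is shown below, assuming a pandas DataFrame of ePCRs; the column names (call_type, age_years, impression_primary, sedation_given, and so on) are hypothetical placeholders, not the actual dataset schema.

import pandas as pd

# Hypothetical ePCR extract; all column names are illustrative placeholders.
epcr = pd.read_csv("epcr_2018_2021.csv")

# Inclusion criteria from the Methods: 9-1-1 responses, patients < 18 years of age,
# and a primary or secondary provider impression of a behavioral health emergency (BHE).
is_911 = epcr["call_type"] == "911 Response"
is_pediatric = epcr["age_years"] < 18
is_bhe = (
    epcr["impression_primary"].str.contains("behavioral", case=False, na=False)
    | epcr["impression_secondary"].str.contains("behavioral", case=False, na=False)
)
bhe = epcr[is_911 & is_pediatric & is_bhe]

# Share of pediatric encounters that are BHEs, plus clinical variables by calendar year
# (assumes 0/1 indicator columns for sedation, restraint, and transport).
print(len(bhe) / (is_911 & is_pediatric).sum())
print(bhe.groupby("year")[["sedation_given", "physical_restraint", "transported"]].mean())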
Results:
A total of 1,079,406 pediatric EMS encounters were present in the dataset, of which 102,014 (9.5%) had behavioral health provider impressions. Just over one-half of BHEs occurred in females (56.2%), and 68.1% occurred in patients aged 14-17 years. Telecommunicators managing the 9-1-1 calls for these events reported non-BHE patient complaints in 34.7%. Patients were transported by EMS 68.9% of the time, while treatment and/or transport by EMS was refused in 12.5%. Prehospital clinicians administered sedation medications in 1.9% of encounters and applied physical restraints in 1.7%. Naloxone was administered for overdose rescue in 1.5% of encounters.
Conclusion:
Approximately one in ten pediatric EMS encounters occurring in the United States involve a BHE, and the majority of pediatric BHEs attended by EMS result in transport of the child. Use of sedation medications and physical restraints by prehospital clinicians in these events is rare. National EMS data from a variety of sources should continue to be examined to monitor trends in EMS encounters for BHEs in children.
Incorporating emerging knowledge into Emergency Medical Service (EMS) competency assessments is critical to reflect current evidence-based out-of-hospital care. However, a standardized approach is needed to incorporate new evidence into EMS competency assessments because of the rapid pace of knowledge generation.
Objective:
The objective was to develop a framework to evaluate and integrate new source material into EMS competency assessments.
Methods:
The National Registry of Emergency Medical Technicians (National Registry) and the Prehospital Guidelines Consortium (PGC) convened a panel of experts. A Delphi method, consisting of virtual meetings and electronic surveys, was used to develop a Table of Evidence matrix that defines sources of EMS evidence. In Round One, participants listed all potential sources of evidence available to inform EMS education. In Round Two, participants categorized these sources into: (a) levels of evidence quality; and (b) type of source material. In Round Three, the panel revised a proposed Table of Evidence. Finally, in Round Four, participants provided recommendations on how each source should be incorporated into competency assessments depending on type and quality. Descriptive statistics were calculated with qualitative analyses conducted by two independent reviewers and a third arbitrator.
Results:
In Round One, 24 sources of evidence were identified. In Round Two, these were classified as high- (n = 4), medium- (n = 15), and low-quality (n = 5) evidence, and then categorized by purpose into sources providing recommendations (n = 10), primary research (n = 7), and educational content (n = 7). In Round Three, the Table of Evidence was revised based on participant feedback. In Round Four, the panel developed a tiered system of evidence integration, from immediate incorporation of high-quality sources to more stringent requirements for lower-quality sources.
Conclusion:
The Table of Evidence provides a framework for the rapid and standardized incorporation of new source material into EMS competency assessments. Future goals are to evaluate the application of the Table of Evidence framework in initial and continued competency assessments.
Paramedics are a vital component of the Emergency Medical Services (EMS) workforce and the United States health care system. The continued provision of high-quality care demands constantly improving education at accredited institutions. To date, only limited characteristics of paramedic education in the United States have been documented and studied in the literature. The objective of this study was to describe the educational infrastructure of accredited paramedic programs in the United States in 2018.
Methods:
This was a retrospective, cross-sectional evaluation of the 2018 paramedic program annual report from the Committee on Accreditation of Educational Programs for the EMS Professions (CoAEMSP; Rowlett, Texas USA). The dataset includes detailed program metrics. Additionally, questions concerning program characteristics, demographics, and resources were asked as part of the evaluation. Resource availability was assessed via the Resource Assessment Matrix (RAM) with a benchmark of 80%. All paramedic programs with students enrolled were included in the analysis. Descriptive statistics were calculated (median [interquartile range, IQR]).
Results:
A total of 677 programs submitted data (100% response rate). Of these, 626 met inclusion criteria, totaling 17,422 students. Annual program enrollment varied greatly, from one to 362 students, with most programs being small (18 students [IQR 12-30]). Program duration was 12 months [IQR 12-16], with approximately 1,174 total hours of instruction [IQR 1069-1304], 19% of which were dedicated to clinical experience. Full-time faculty complements were small (two faculty members [IQR 1-3]), and most programs (80%) had annual operating budgets below USD$500,000. For programs with an annual budget below USD$100,000 (34% of programs), annual enrollment was approximately 14 students [IQR 9-21]. Degrees conferred by programs included certificates (90%), associate degrees (55%), and bachelor’s degrees (2%). Nearly all programs (100%) reported access to simple task trainers, and 84% had invested in advanced simulation manikins. Seventy-eight percent of programs met the RAM benchmark.
Conclusion:
Most paramedic educational programs in the United States have small annual enrollments with low numbers of dedicated faculty and confer certificates and associate degrees. Nearly one-quarter of paramedic educational programs are not adequately resourced. This study is limited by self-reported data to the national accreditation agency. Future work is needed to identify program characteristics that are associated with high performance.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible by laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves, which will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to that of full third-generation detectors at a fraction of the cost. Such sensitivity would increase the expected detection rate of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and would potentially allow the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
Epidemiological studies have reported that the increased risk of developing psychosis in cannabis users is dose related. In addition, experimental research has shown that the active constituent of cannabis responsible for its psychotogenic effect is Delta-9-Tetrahydrocannabinol (THC) (Murray et al, 2007). Recent evidence has suggested an increase in potency (% THC) of the cannabis seized in the UK (Potter et al, 2007).
Hypothesis:
We predicted that first-episode psychosis patients are more likely than controls to use higher-potency cannabis, and to use it more frequently.
Methods:
We collected information on socio-demographic and clinical characteristics and on cannabis use (age at first use, frequency, length of use, type of cannabis used) from a sample of 191 first-episode psychosis patients and 120 matched healthy volunteers. All were recruited as part of the Genetics and Psychosis (GAP) study, which included all patients who presented to the South London and Maudsley Trust.
Results:
There was no significant difference in the lifetime prevalence of cannabis use or in age at first use between cases and controls. However, cases were more likely to be regular users (p = 0.05), to be current users (p = 0.04), and to have smoked cannabis for longer (p = 0.01). Among cannabis users, 86.8% of first-episode psychosis patients preferentially used skunk/sinsemilla compared with 27.7% of controls. Only 13.2% of first-episode psychosis patients chose to use resin/hash compared with 76.3% of controls. The concentration of THC in the skunk/sinsemilla available in South East London ranges between 8.5% and 14% (Potter et al, 2007). Controls (47%) were more likely to use hash (resin), whose average THC concentration is 3.4% (Potter et al, 2007).
Conclusions:
Patients with first-episode psychosis had smoked higher-potency cannabis, for longer and with greater frequency, than healthy controls.
The functional Catechol-O-methyltransferase (COMT Val 108/158 Met) polymorphism has been shown to have an impact on tasks of executive function, memory, and attention and, more recently, on tasks with an affective component. As estrogen may downregulate COMT, we were interested in the effects of gender, COMT genotype, and the interaction between these factors on brain activations during an affective processing task. We used functional MRI to record brain activations from 74 healthy subjects who performed a facial affect recognition task; subjects viewed and identified fearful faces compared with neutral faces. We found a significant effect of gender on brain activations in the left amygdala and right superior temporal gyrus, where females demonstrated greater activations than males. Within these regions, female val/val carriers showed greater activity than met/met carriers, while male met/met carriers showed greater deactivations than val/val carriers. There was no main effect of the COMT polymorphism, gender, or genotype-by-gender interaction on task performance. We propose that the observed effects of gender and COMT genotype on brain activations arise from differences in dopamine levels in these groups, and that the gender differences and gender-by-genotype interaction may be due to the downregulation of COMT by estrogen.
Introduction:
Although oral rehydration therapy is recommended for children with acute gastroenteritis (AGE) with none to some dehydration, intravenous (IV) rehydration is still commonly administered to these children in high-income countries. IV rehydration is associated with pain, anxiety, and emergency department (ED) revisits in children with AGE. A better understanding of the factors associated with IV rehydration is needed to inform knowledge translation strategies.
Methods:
This was a planned secondary analysis of the Pediatric Emergency Research Canada (PERC) and Pediatric Emergency Care Applied Research Network (PECARN) randomized, controlled trials of oral probiotics in children with AGE-associated diarrhea. Eligible children were aged 3-48 months and reported > 3 watery stools in a 24-hour period. The primary outcome was administration of IV rehydration at the index ED visit. We used a mixed-effects logistic regression model to explore univariable and multivariable relationships between IV rehydration and a priori risk factors.
Results:
From the parent study sample of 1848 participants, 1846 had data available for analysis: mean ± SD age 19.1 ± 11.4 months; 45.4% were female. Overall, 70.2% (1292/1840) vomited within 24 hours of the index ED visit, 34.1% (629/1846) received ondansetron in the ED, 13.0% (240/1846) were administered IV rehydration at the index ED visit, and 3.6% (67/1842) were hospitalized. Multivariable predictors of IV rehydration were Clinical Dehydration Scale (CDS) score [compared to none: mild to moderate (OR: 8.1, 95% CI: 5.5-11.8); severe (OR: 45.9, 95% CI: 20.1-104.7), P < 0.001], ondansetron in the ED (OR: 1.8, 95% CI: 1.2-2.6, P = 0.003), previous healthcare visit for the same illness [compared to no prior visit: prior visit with no IV (OR: 1.9, 95% CI: 1.3-2.9); prior visit with IV (OR: 10.5, 95% CI: 3.2-34.8), P < 0.001], and country [compared to Canada: US (OR: 4.1, 95% CI: 2.3-7.4), P < 0.001]. Significantly more participants returned to the ED with symptoms of AGE within 3 days if IV fluids were administered at the index visit [30/224 (13.4%) versus 88/1453 (6.1%), P < 0.001].
Conclusion:
Higher CDS scores, antiemetic use, previous healthcare visits, and country were independent predictors of IV rehydration, which was also associated with increased ED revisits. Knowledge translation focused on optimizing the use of antiemetics (i.e., for those with dehydration) and reducing the geographic variation in IV rehydration use may improve the ED experience and reduce ED revisits.
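As an illustration of the modelling step described above, the sketch below fits a logistic regression for the odds of IV rehydration with statsmodels; the variable names are hypothetical, and the random effect (e.g., a per-site intercept) used in the study's mixed-effects model is omitted here for brevity (it could be added with R's lme4::glmer or statsmodels' BinomialBayesMixedGLM).

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis dataset; variable names are illustrative only.
df = pd.read_csv("age_trial_secondary.csv")

# Simplified fixed-effects-only logistic regression for IV rehydration at the index ED visit.
fit = smf.logit(
    "iv_rehydration ~ C(cds_category) + ondansetron_ed + C(prior_visit) + country",
    data=df,
).fit()

# Odds ratios with 95% confidence intervals, in the form reported in the Results.
odds_ratios = np.exp(fit.params)
conf_int = np.exp(fit.conf_int())
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))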
Public Health England has set a definition for free sugars in the UK in order to estimate intakes of free sugars in the National Diet and Nutrition Survey. This follows the recommendation from the Scientific Advisory Committee on Nutrition in its 2015 report on Carbohydrates and Health that a definition of free sugars should be adopted. The definition of free sugars includes: all added sugars in any form; all sugars naturally present in fruit and vegetable juices, purées and pastes and similar products in which the structure has been broken down; all sugars in drinks (except for dairy-based drinks); and lactose and galactose added as ingredients. The sugars naturally present in milk and dairy products, fresh and most types of processed fruit and vegetables and in cereal grains, nuts and seeds are excluded from the definition.
A range of endophenotypes characterise psychosis; however, there has been limited work on whether and how these endophenotypes are inter-related.
Methods
This multi-centre study includes 8754 participants: 2212 people with a psychotic disorder, 1487 unaffected relatives of probands, and 5055 healthy controls. We investigated cognition [digit span (N = 3127), block design (N = 5491), and the Rey Auditory Verbal Learning Test (N = 3543)], electrophysiology [P300 amplitude and latency (N = 1102)], and neuroanatomy [lateral ventricular volume (N = 1721)]. We used linear regression to assess the interrelationships between endophenotypes.
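A minimal sketch of the pairwise regression approach is given below, assuming a merged participant-level table with hypothetical column names; measures are standardised so that coefficients are comparable across endophenotype pairs and participant groups.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical merged dataset of endophenotype measures; column names are illustrative.
df = pd.read_csv("endophenotypes.csv")

# Standardise each measure (z-scores) so regression coefficients are comparable.
for col in ["p300_amplitude", "p300_latency", "block_design", "digit_span"]:
    df[col] = (df[col] - df[col].mean()) / df[col].std()

# Association between one pair of endophenotypes, fitted separately in each
# participant group (patients, relatives, controls) to compare patterns.
for group, sub in df.groupby("participant_group"):
    fit = smf.ols("p300_amplitude ~ block_design", data=sub).fit()
    coef = fit.params["block_design"]
    lo, hi = fit.conf_int().loc["block_design"]
    print(f"{group}: coef = {coef:.2f} (95% CI {lo:.2f} to {hi:.2f}), p = {fit.pvalues['block_design']:.3f}")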
Results
The P300 amplitude and latency were not associated (regression coef. −0.06, 95% CI −0.12 to 0.01, p = 0.060), and P300 amplitude was positively associated with block design (coef. 0.19, 95% CI 0.10–0.28, p < 0.001). There was no evidence of associations between lateral ventricular volume and the other measures (all p > 0.38). All the cognitive endophenotypes were associated with each other in the expected directions (all p < 0.001). Lastly, the relationships between pairs of endophenotypes were consistent in all three participant groups, differing for some of the cognitive pairings only in the strengths of the relationships.
Conclusions
The P300 amplitude and latency are independent endophenotypes; the former indexes spatial visualisation and working memory, while the latter is hypothesised to index basic processing speed. Individuals with psychotic illnesses, their unaffected relatives, and healthy controls all show similar patterns of associations between endophenotypes, endorsing the theory of a continuum of psychosis liability across the population.
Academic health systems and their investigators are challenged to systematically assure clinical research regulatory compliance. This challenge is heightened in the emerging era of centralized single Institutional Review Boards for multicenter studies, which rely on monitoring programs at each participating site.
Objective
To describe the development, implementation, and outcome measurement of an institution-wide paired training curriculum and internal monitoring program for clinical research regulatory compliance.
Methods
Standard operating procedures (SOPs) were developed to facilitate investigator and research professional adherence to institutional policies, federal guidelines, and international standards. An SOP training curriculum was developed and implemented institution-wide. An internal monitoring program was launched, utilizing risk-based monitoring plans of pre-specified frequency and intensity, assessed upon Institutional Review Board approval of each prospective study. Monitoring plans were executed according to an additional SOP on internal monitoring, with monitoring findings captured in a REDCap database.
Results
We observed few major violations across 3 key domains of clinical research conduct and demonstrated a meaningful decrease in the rates of nonmajor violations in each, over the course of 2 years.
Conclusion
The paired training curriculum and monitoring program is a successful institution-wide clinical research regulatory compliance model that will continue to be refined.
A clean hot-water drill was used to gain access to Subglacial Lake Whillans (SLW) in late January 2013 as part of the Whillans Ice Stream Subglacial Access Research Drilling (WISSARD) project. Over 3 days, we deployed an array of scientific tools through the SLW borehole: a downhole camera, a conductivity–temperature–depth (CTD) probe, a Niskin water sampler, an in situ filtration unit, three different sediment corers, a geothermal probe and a geophysical sensor string. Our observations confirm the existence of a subglacial water reservoir whose presence was previously inferred from satellite altimetry and surface geophysics. Subglacial water is about two orders of magnitude less saline than sea water (0.37–0.41 psu vs 35 psu) and two orders of magnitude more saline than pure drill meltwater (<0.002 psu). It reaches a minimum temperature of −0.55°C, consistent with depression of the freezing point by 7.019 MPa of water pressure. Subglacial water was turbid and remained turbid following filtration through 0.45 µm filters. The recovered sediment cores, which sampled down to 0.8 m below the lake bottom, contained a macroscopically structureless diamicton with shear strength between 2 and 6 kPa. Our main operational recommendation for future subglacial access through water-filled boreholes is to supply enough heat to the top of the borehole to keep it from freezing.
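The quoted pressure-driven freezing-point depression can be checked with the Clausius–Clapeyron relation; the sketch below uses standard textbook properties of water and ice rather than measured borehole values, so it is only an order-of-magnitude consistency check.

# Clausius-Clapeyron estimate: dT/dP = T * (v_water - v_ice) / L_fusion
T0 = 273.15          # K, melting point at atmospheric pressure
v_water = 1.000e-3   # m^3/kg, specific volume of liquid water near 0 degrees C
v_ice = 1.091e-3     # m^3/kg, specific volume of ice Ih
L_fusion = 3.34e5    # J/kg, latent heat of fusion

dT_dP = T0 * (v_water - v_ice) / L_fusion   # K/Pa; negative, so the melting point drops with pressure
delta_T = dT_dP * 7.019e6                   # depression under ~7.019 MPa of water pressure
print(f"dT/dP = {dT_dP * 1e6:.4f} K/MPa, depression = {delta_T:.2f} K")
# Roughly -0.52 K from pressure alone; the small remaining difference from the
# observed -0.55 degrees C is consistent with the slight salinity of the lake water.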
Horsenettle was described quantitatively in two grazed bermudagrass pastures for two years. Six times from April through October, plants were counted, phenologically documented, and tops were harvested for estimation of above-ground phytomass. Roots were sampled for carbohydrate, N fraction, and macronutrient analyses. Top growth, flowering, fruiting, and chemical composition were similar between sites within years, but differed between years. Starch and total non-structural carbohydrates in roots increased following reproductive periods both years. Root concentrations of protein and non-protein N increased and concentrations of P decreased in August and September in one year only. Analysis of the relationship between seasonal trends in carbohydrate fluctuation and pasture management practices suggests that systemic herbicide application in mid-summer may lead to more effective control than application in spring.
In 2008 and 2009, the House of Representatives directed billions of dollars to the auto industry by passing a bailout and the “cash for clunkers” program. Moving beyond corporate influence via campaign contributions, we demonstrate that the presence of auto workers in a district strongly predicts legislative support for both bills. In addition to this critical legislation, we also analyze over 250 bills on which the auto industry either lobbied or took a public position. We find no patterns relating a district's workers or corporate campaign contributions to these votes on broader legislation where other groups, such as environmental advocates or labor unions, are at the table. Instead, the auto industry garners consistent support only on quasi-private, particularistic legislation. Thus, we contend that on particularistic legislation the presence of workers (not just campaign contributions) drives legislative support; however, when legislators expand the scope of conflict, the influence of a single industry is attenuated by other interests.
This study investigated factors that influence the occurrence and persistence of plant DNA in the soil environment in three crop rotations. In each rotation, soil was sampled in May before planting, in July and August while crops were growing, and in October after harvest. Total DNA was recovered from soil samples taken at two different depths in the soil profile and quantified. Three target plant genes (corn CP4 epsps, corn 10-kD Zein, and soybean CP4 epsps) were also quantified in these DNA extracts using species-specific quantitative real-time PCR assays. In general, total plant DNA content in the soil environment was greatest when the crop was growing in the field and decreased rapidly after harvest. Nevertheless, low levels of target plant DNA were often still detectable the following spring. Age of rotation did not influence target DNA quantities found in the soil environment. Data were collected for a combination of 10 location-years, which allowed estimation of the contributions of six factors (time of sampling, year, location, crop, sampling depth, and herbicide) to total and target DNA content in the soil samples. Mean target recombinant DNA content in soil was influenced most strongly by time of sampling and year (85% and 6%, respectively), whereas total soil DNA content was less dynamic and was most strongly influenced by location and year (49% and 25%, respectively). Over the duration of this study, no accumulation of transgenic plant DNA in the soil environment was observed.
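The variance apportionment described above can be illustrated with an ANOVA-style decomposition; the sketch below uses a main-effects OLS model and sums of squares as a simple stand-in for the study's variance component estimation, with hypothetical column names.

import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Hypothetical long-format table of soil qPCR results; column names are illustrative.
df = pd.read_csv("soil_dna_qpcr.csv")

# Main-effects model for target DNA content with the six design factors.
fit = smf.ols(
    "target_dna ~ C(sampling_time) + C(year) + C(location) + C(crop) + C(depth) + C(herbicide)",
    data=df,
).fit()

# Percent of variation attributable to each factor (plus residual), via type II sums of squares.
table = anova_lm(fit, typ=2)
share = 100 * table["sum_sq"] / table["sum_sq"].sum()
print(share.round(1))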
An abattoir-based study was undertaken between January and May 2013 to estimate the prevalence of Salmonella spp. and Yersinia spp. carriage and seroprevalence of antibodies to Toxoplasma gondii and porcine reproductive and respiratory syndrome virus (PRRSv) in UK pigs at slaughter. In total, 626 pigs were sampled at 14 abattoirs that together process 80% of the annual UK pig slaughter throughput. Sampling was weighted by abattoir throughput and sampling dates and pig carcasses were randomly selected. Rectal swabs, blood samples, carcass swabs and the whole caecum, tonsils, heart and tongue were collected. Salmonella spp. was isolated from 30·5% [95% confidence interval (CI) 26·5–34·6] of caecal content samples but only 9·6% (95% CI 7·3–11·9) of carcass swabs, which was significantly lower than in a UK survey in 2006–2007. S. Typhimurium and S. 4,[5],12:i:- were the most commonly isolated serovars, followed by S. Derby and S. Bovismorbificans. The prevalence of Yersinia enterocolitica carriage in tonsils was 28·7% (95% CI 24·8–32·7) whereas carcass contamination was much lower at 1·8% (95% CI 0·7–2·8). The seroprevalence of antibodies to Toxoplasma gondii and PRRSv was 7·4% (95% CI 5·3–9·5) and 58·3% (95% CI 53·1–63·4), respectively. This study provides a comparison to previous abattoir-based prevalence surveys for Salmonella and Yersinia, and the first UK-wide seroprevalence estimates for antibodies to Toxoplasma and PRRSv in pigs at slaughter.
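For reference, prevalence estimates with 95% confidence intervals of the kind quoted above can be computed as in the sketch below; the counts are back-calculated from the reported percentage and overall sample size purely for illustration, and the sketch ignores the survey's weighting by abattoir throughput.

from statsmodels.stats.proportion import proportion_confint

# Illustrative normal-approximation CI for a carriage prevalence (~30.5% of 626 pigs);
# the exact denominator per sample type and the design weighting may differ in the survey.
positives, sampled = 191, 626
prevalence = positives / sampled
low, high = proportion_confint(positives, sampled, alpha=0.05, method="normal")
print(f"{100 * prevalence:.1f}% (95% CI {100 * low:.1f}-{100 * high:.1f})")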
A lingual thyroid is a known cause of oropharyngeal obstruction in the neonate. It can be asymptomatic, or present as stridor, dysphonia, dysphagia or dyspnoea with faltering growth. The therapeutic options include surgical resection.
Case report:
A 6-day-old female neonate, born at 36 weeks gestation, presented with stridulous breathing and poor feeding. Although the cause was initially thought to be laryngomalacia, nasendoscopy revealed a lingual thyroid. The baby had deranged thyroid function, as detected on neonatal screening, but this result was not available until a later date. Despite being symptomatic, the patient was managed medically; thyroxine therapy was associated with resolution of the respiratory symptoms.
Conclusion:
Nasendoscopy provides valuable information about an ectopic thyroid gland. Thyroid replacement therapy may help to suppress the size of the ectopic gland and ultimately prevent an unnecessary surgical procedure.
Objective
To examine the effect of fast-food and full-service restaurant consumption on adults’ energy intake and dietary indicators.
Design
Individual-level fixed-effects regression models were estimated using two different days of dietary intake data.
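A minimal sketch of such an individual fixed-effects estimator is shown below, using the within (demeaning) transformation across each respondent's two recall days; variable names are hypothetical and the full model would include additional day-level covariates.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical person-day file with two 24 h recalls per respondent; names are illustrative.
df = pd.read_csv("nhanes_two_day_recalls.csv")

# Within transformation: demean each variable across a respondent's two recall days,
# so only within-person variation in restaurant consumption identifies the effect.
cols = ["energy_kj", "fast_food_day", "full_service_day"]
demeaned = df[cols] - df.groupby("person_id")[cols].transform("mean")

fit = smf.ols("energy_kj ~ fast_food_day + full_service_day - 1", data=demeaned).fit(
    cov_type="cluster", cov_kwds={"groups": df["person_id"]}
)
print(fit.params)   # change in daily energy intake (kJ) on fast-food / full-service days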
Setting
Parallel to the rising obesity epidemic in the USA, there has been a marked upward trend in total energy intake derived from food away from home.
Subjects
The full sample included 12 528 respondents aged 20–64 years who completed 24 h dietary recall interviews for both day 1 and day 2 in the National Health and Nutrition Examination Survey (NHANES) 2003–2004, 2005–2006, 2007–2008 and 2009–2010.
Results
Fast-food and full-service restaurant consumption, respectively, was associated with an increase in daily total energy intake of 813·75 kJ (194·49 kcal) and 858·04 kJ (205·21 kcal) and with higher intakes of saturated fat (3·48 g and 2·52 g) and Na (296·38 mg and 451·06 mg). Individual characteristics moderated the impacts of restaurant food consumption with adverse impacts on net energy intake being larger for black adults compared with their white and Hispanic counterparts and greater for middle-income v. high-income adults.
Conclusions
Adults’ fast-food and full-service restaurant consumption was associated with higher daily total energy intake and poorer dietary indicators.