Environmental DNA (eDNA) metabarcoding has lagged in parasite biodiversity assessments. We implemented this method to examine parasite diversity in sediment and water from 4 physically connected aquatic habitats in coastal South Carolina, USA, as part of a ParasiteBlitz in April 2023. Sediment was collected using a syringe corer, and water was sampled using active filtration and passive collection. Five amplicon libraries, using primers targeting portions of the mitochondrial COI of platyhelminths and 18S ribosomal RNA genes of nematodes, myxozoans, microsporidians, and protists, successfully yielded parasite sequences. Out of >5.8 million sequences, we identified >1,000 parasite amplicon sequence variants (ASVs) corresponding to ~600 parasite operational taxonomic units, from 6 parasite groups. Most diversity was observed among the microsporidians, whose assay demonstrated the highest fidelity. Actively filtered water samples captured ASVs of all 6 groups, whereas sediment captured only 4, despite yielding 3× as many ASVs. Low DNA yields from passive water samples resulted in fewer, but some unique, ASVs representing 3 parasite groups. The most efficient sampling method varied with respect to parasite group across habitats, and the parasite communities from each habitat were distinct regardless of sampling method. We detected ASVs of 9 named species, 4 of which may represent introductions to the US. The breadth of our results demonstrates the effectiveness and efficiency of eDNA metabarcoding for assessing parasite diversity during short, intensive surveys, and highlights the critical need for more comprehensive sequence databases and the development of primers for those parasite taxa that elude detection using eDNA methods.
Historically, it has been proposed that functional neurological symptoms occur more frequently on the left side of the body due to a distinct body representation and emotional processing of the right hemisphere, yet objective imaging data to support this are lacking. We aimed to investigate whether patients with acute left-sided symptoms (right hemisphere) suspected of having a minor stroke are more likely to show negative diffusion-weighted imaging (DWI) compared to those with right-sided symptoms.
Methods:
Data are from the SpecTRA (Spectrometry for Transient Ischemic Attack Rapid Assessment) multicenter prospective cohort study conducted between 2013 and 2017. Patients with mild persistent unilateral hemiparesis and/or hemisensory symptoms (National Institutes of Health Stroke Scale ≤ 3) and available DWI were included. The primary outcome was the proportion of patients with a negative DWI.
Results:
Of 1731 patients, 584 (30.8%) were included. Of these, 310 (53.1%) patients presented with left-sided symptoms and 274 (46.9%) with right-sided symptoms. Overall, 214 (36.6%) patients had a negative DWI, 126 (58.9%) with left-sided symptoms and 88 (41.1%) with right-sided symptoms: risk ratio (RR) 1.27 (95% CI = 1.02–1.57). Left-sided hemiparesis was associated with negative DWI (RR 1.42 [95% CI = 1.08–1.87]), while left-sided hemisensory symptoms were not (RR 1.11 [95% CI = 0.87–1.41]). There was no effect modification by age or sex on this association (P for interaction = 0.787 and 0.057, respectively).
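The reported risk ratio and interval can be reproduced from the counts stated above. The sketch below uses the standard large-sample log-scale confidence interval for a risk ratio; whether the study used exactly this method is an assumption:

```python
import math

# Counts stated in the abstract: negative DWI by symptom side
neg_left, n_left = 126, 310    # left-sided symptoms
neg_right, n_right = 88, 274   # right-sided symptoms

# Risk ratio of negative DWI, left vs. right
rr = (neg_left / n_left) / (neg_right / n_right)

# 95% CI via the usual log-scale standard error for a risk ratio
se = math.sqrt(1/neg_left - 1/n_left + 1/neg_right - 1/n_right)
lo, hi = (math.exp(math.log(rr) + z * se) for z in (-1.96, 1.96))

print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # RR 1.27 (95% CI 1.02-1.57)
```

Recovering the published RR 1.27 (1.02–1.57) confirms the abstract's arithmetic is internally consistent.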
Conclusions:
Unilateral left-sided neurological symptoms were more frequently associated with negative DWI compared to right-sided symptoms in suspected minor stroke patients. This observation is exploratory, as the final diagnosis in DWI-negative cases was not established.
Tenecteplase has been shown to be non-inferior to alteplase for the treatment of acute ischemic stroke within 4.5 hours of stroke onset. While not formally approved by regulatory authorities, many jurisdictions have transitioned to using tenecteplase for routine stroke treatment because it is simpler to use and has cost advantages.
Methods:
We report a three-phase time-series analysis over 2.5 years and the process for transition from use of alteplase to tenecteplase for the routine treatment of acute ischemic stroke from a system-wide perspective involving an entire province. The transition was planned and implemented centrally. Data were collected in clinical routine, arising from both administrative sources and a prospective stroke registry, and represent real-world outcome data. Data are reported using standard descriptive statistics.
Results:
A total of 1211 patients were treated with intravenous thrombolysis (477 pre-transition using alteplase, 180 transition period using both drugs, 554 post-transition using tenecteplase). Baseline characteristics, adverse events and outcomes were similar between epochs. There were four dosing errors with tenecteplase, including providing the cardiac dose to two patients. There were no instances of major hemorrhage associated with dosing errors.
Discussion:
The transition to using intravenous tenecteplase for stroke treatment was seamless, with outcomes similar to those of intravenous alteplase.
Syncope is common among pediatric patients and is rarely pathologic. The mechanisms for symptoms during exercise are less well understood than the resting mechanisms. Additionally, inert gas rebreathing analysis, a non-invasive examination of hemodynamics including cardiac output, has not previously been studied in youth with neurocardiogenic syncope.
Methods:
This was a retrospective (2017–2023), single-center cohort study in pediatric patients ≤ 21 years with prior peri-exertional syncope evaluated with echocardiography and cardiopulmonary exercise testing with inert gas rebreathing analysis performed on the same day. Patients with and without symptoms during or immediately following exercise were noted.
Results:
Of the 101 patients (15.2 ± 2.3 years; 31% male), there were 22 patients with symptoms during exercise testing or recovery. Resting echocardiography stroke volume correlated with resting (r = 0.53, p < 0.0001) and peak stroke volume (r = 0.32, p = 0.009) by inert gas rebreathing and with peak oxygen pulse (r = 0.61, p < 0.0001). Patients with syncopal symptoms peri-exercise had lower left ventricular end-diastolic volume (Z-score –1.2 ± 1.3 vs. –0.36 ± 1.3, p = 0.01) and end-systolic volume (Z-score –1.0 ± 1.4 vs. −0.1 ± 1.1, p = 0.001) by echocardiography, lower percent predicted peak oxygen pulse during exercise (95.5 ± 14.0 vs. 104.6 ± 18.5%, p = 0.04), and slower post-exercise heart rate recovery (31.0 ± 12.7 vs. 37.8 ± 13.2 bpm, p = 0.03).
Discussion:
Among youth with a history of peri-exertional syncope, those who become syncopal with exercise testing have lower left ventricular volumes at rest, decreased peak oxygen pulse, and slower heart rate recovery after exercise than those who remain asymptomatic. Peak oxygen pulse and resting stroke volume on inert gas rebreathing are associated with stroke volume on echocardiogram.
Animal foods, especially dairy products, eggs and fish, are the main source of iodine in the UK. However, the use of plant-based alternative products (PBAP) is increasing owing to issues of environmental sustainability. We previously measured the iodine content of milk-alternatives(1) but data are lacking on the iodine content of other plant-based products and there is now a greater number of iodine-fortified products. We aimed to compare: (i) the iodine concentration of fortified and unfortified PBAP and (ii) the iodine concentration of PBAP with their animal-product equivalents, including those not previously measured such as egg and fish alternatives.
The iodine concentration of 50 PBAP was analysed in March 2022 at LGC using ICP-MS. The products were selected from a market survey of six UK supermarkets in December 2021. Samples of matrix-matched (e.g. soya/oat) fortified and unfortified alternatives to milk (n = 13 and n = 11), yoghurt (n = 2 and n = 7) and cream (n = 1 and n = 5) were selected for analysis, as well as egg- (n = 1) and fish-alternatives (n = 10). We compared the iodine concentration between PBAPs and data on their animal-product equivalents(2).
The iodine concentration of fortified PBAPs was significantly higher than that of unfortified products; the median iodine concentration of fortified vs. unfortified milk alternatives was 321 vs. 0.84 µg/kg (p<0.001) and of fortified and unfortified yoghurt alternatives was 212 µg/kg vs 3.03 µg/kg (p = 0.04). The fortified cream alternative had a higher iodine concentration than the unfortified alternatives (259 vs. 26.5 µg/kg). The measured iodine concentration of the fortified products differed from that of the product label (both lower and higher); overall, the measured iodine concentration was significantly higher than that stated on the label (mean difference 49.1 µg/kg; p = 0.018).
Compared to the animal-product equivalents, the iodine concentration of unfortified PBAPs was significantly lower for milk (p<0.001) and yoghurt (p<0.001), while there was no difference with fortified versions of milk (p = 0.28) and yoghurt (p = 0.09). The egg alternative had an iodine concentration that was just 0.6% of that of chicken eggs (3.38 vs. 560 µg/kg). Three (30%) of the fish alternatives had kelp/seaweed as ingredients and the median iodine concentration of these products was (non-significantly) higher than those without (126 vs 75 μg/kg; p = 0.83). However, the iodine content of all fish-alternative products was ten times lower than that of fish (median 99 vs. 995 µg/kg; p<0.001).
The majority of PBAP are not fortified with iodine but those that are fortified have a significantly higher iodine concentration than unfortified products and are closer to the value of their animal equivalents. From an iodine perspective, unfortified plant-based alternatives are not suitable replacements and consumers should ensure adequate iodine from other dietary sources. Manufacturers should consider iodine fortification of a greater number of plant-based alternatives.
Seed genebanks must maintain collections of healthy seeds and regenerate accessions before seed viability declines. Seed shelf life is often characterized at the species level; however, large, unexplained variation among genetic lines within a species can and does occur. This variation contributes to unreliable predictions of seed quality decline with storage time. To assess variation of seed longevity and aid in timing regeneration, ten varieties of pea (Pisum sativum L.), chickpea (Cicer arietinum L.) and lentil (Lens culinaris Medikus subsp. culinaris) from the Australian Grains Genebank were stored at moderate temperature (20°C) and moisture (7–11% water, relative humidity [RH] ~30%) and deterioration was assessed by yearly germination tests for 20 years. Decline in germination was fitted to a sigmoidal model and the time corresponding to 50% germination (P50) was used to express seed longevity for each genetic line. The feasibility of using RNA fragmentation to assess seed deterioration was evaluated using the RNA integrity number (RIN) of RNA extracted from seeds stored for 13 and 20 years. Seed lots of legume grains that maintained high survival throughout the 20 years (i.e. they aged slower than other lines) had higher RIN than samples that degraded faster. RIN was lower in embryonic axes compared with cotyledons in the more deteriorated samples, perhaps indicating that axes exhibit symptoms of ageing sooner than cotyledons. Overall, RIN appears to be associated with germination-based longevity indicators for these legumes, indicating that RIN decline can be used to assess ageing rate, which is needed to optimize viability monitoring.
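The sigmoidal germination decline and P50 extraction described above can be sketched with synthetic data. The logistic functional form, parameter values, and grid-search fit below are illustrative assumptions only (a real analysis would use a nonlinear least-squares routine such as scipy.optimize.curve_fit):

```python
import math

def sigmoid(t, p50, k, g0=100.0):
    # Sigmoidal decline in germination: near g0 at t=0, 50% of g0 at t=p50
    return g0 / (1.0 + math.exp(k * (t - p50)))

# Synthetic yearly germination percentages for one hypothetical seed lot
true_p50, true_k = 14.0, 0.4
years = range(21)  # 20 years of annual tests
obs = [sigmoid(t, true_p50, true_k) for t in years]

# Least-squares grid search over (p50, k) — a minimal stand-in for a
# proper nonlinear fit
best = min(
    ((p50 / 10, k / 100) for p50 in range(50, 251) for k in range(5, 101)),
    key=lambda pk: sum((sigmoid(t, *pk) - o) ** 2 for t, o in zip(years, obs)),
)
print(f"estimated P50 = {best[0]:.1f} years")  # recovers P50 = 14.0
```

By construction, germination equals 50% exactly at t = P50, which is what makes P50 a convenient single-number longevity summary per genetic line.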
Diagnosis of acute ischemia typically relies on evidence of ischemic lesions on magnetic resonance imaging (MRI), a limited diagnostic resource. We aimed to determine associations of clinical variables and acute infarcts on MRI in patients with suspected low-risk transient ischemic attack (TIA) and minor stroke and to assess their predictive ability.
Methods:
We conducted a post-hoc analysis of the Diagnosis of Uncertain-Origin Benign Transient Neurological Symptoms (DOUBT) study, a prospective, multicenter cohort study investigating the frequency of acute infarcts in patients with low-risk neurological symptoms. The primary outcome was defined as diffusion-weighted imaging (DWI)-positive lesions on MRI. Logistic regression analysis was performed to evaluate associations of clinical characteristics with MRI-DWI-positivity. Model performance was evaluated by Harrell's c-statistic.
Results:
In 1028 patients, age (Odds Ratio (OR) 1.03, 95% Confidence Interval (CI) 1.01–1.05), motor (OR 2.18, 95% CI 1.27–3.65) or speech symptoms (OR 2.53, 95% CI 1.28–4.80), and no previous identical event (OR 1.75, 95% CI 1.07–2.99) were positively associated with MRI-DWI-positivity. Female sex (OR 0.47, 95% CI 0.32–0.68), dizziness and gait instability (OR 0.34, 95% CI 0.14–0.69), normal exam (OR 0.55, 95% CI 0.35–0.85) and resolved symptoms (OR 0.49, 95% CI 0.30–0.78) were negatively associated. Symptom duration and any additional symptoms/symptom combinations were not associated. Predictive ability of the model was moderate (c-statistic 0.72, 95% CI 0.69–0.77).
Conclusion:
Detailed clinical information is helpful in assessing the risk of ischemia in patients with low-risk neurological events, but a predictive model had only moderate discriminative ability. Patients with clinically suspected low-risk TIA or minor stroke require MRI to confirm the diagnosis of cerebral ischemia.
Background:
Cerebral venous thrombosis (CVT) most commonly affects younger women. Diagnosis may be delayed due to its distinct presentation and demographic profile compared to other stroke types.
Methods:
We examined delays to diagnosis of CVT in the SECRET randomized trial and TOP-SECRET parallel registry. Adults diagnosed with symptomatic CVT within 14 days of symptom onset were included. We examined time to diagnosis, number of health care encounters prior to diagnosis, and associations with demographics, clinical and radiologic features, and functional and patient-reported outcome measures (PROMs) at days 180 and 365.
Results:
Of 103 participants, 68.9% were female; median age was 45 years (IQR 31.0–61.0). Median time from symptom onset to diagnosis was 4 (1–8) days. Diagnosis on first presentation to medical attention was made in 60.2%. The difference in time to diagnosis for single versus multiple presentations was on the order of days (3 [1–7] vs. 5 [2–11.75] days; p = 0.16). Women were likelier to have multiple presentations (OR 2.53; 95% CI 1.00–6.39; p = 0.05) and had longer median times to diagnosis (5 [2–8] days vs. 2 [1–4.5] days; p = 0.005). However, this was not associated with absolute values of, or change in, functional or patient-reported outcome measures at days 180 and 365.
Conclusions:
Diagnosis of CVT was commonly delayed; women were likelier to have multiple presentations. We found no association between delayed diagnosis and outcomes.
Critical CHD is associated with morbidity and mortality, worsened by delayed diagnosis. Paediatric residents are front-line clinicians, yet identification of critical CHD remains challenging. Current exposure to cardiology is limited in paediatric resident education. We evaluated the impact of rapid cycle deliberate practice simulation on paediatric residents’ skills, knowledge, and perceived competence to recognise and manage infants with critical CHD.
Methods:
We conducted a 6-month pilot study. Interns rotating in paediatric cardiology completed a case scenario assessment during weeks 1 and 4 and participated in paired simulations (traditional debrief and rapid cycle deliberate practice) in weeks 2–4. We assessed interns’ skills during the simulation using a checklist of “cannot miss” tasks. In week 4, they completed a retrospective pre-post knowledge-based survey. We analysed the data using summary statistics and mixed effect linear regression.
Results:
A total of 26 interns participated. There was a significant increase in case scenario assessment scores between weeks 1 and 4 (4, interquartile range 3–6 versus 8, interquartile range 6–10; p-value < 0.0001). The percentage of “cannot miss” tasks on the simulation checklist increased from weeks 2 to 3 (73% versus 83%, p-value 0.0263) and from weeks 2 to 4 (73% versus 92%, p-value 0.0025). The retrospective pre-post survey scores also increased (1.67, interquartile range 1.33–2.17 versus 3.83, interquartile range 3.17–4; p-value < 0.0001).
Conclusion:
Rapid cycle deliberate practice simulations resulted in improved recognition and initiation of treatment of simulated infants with critical CHD among paediatric interns. Future studies will include full implementation of the curriculum and knowledge retention work.
Despite advances in incorporating diversity and structural competency into medical education curricula, few such curricula exist for public health research professionals. We developed and implemented a four-part diversity, equity, and inclusion (DEI) training series tailored for academic health research professionals to increase foundational knowledge of core diversity concepts and improve skills.
Methods:
We analyzed close- and open-ended attendee survey data to evaluate within- and between-session changes in DEI knowledge and perceived skills.
Results:
Over the four sessions, workshop attendance ranged from 45 to 82 attendees from our 250-person academic department and represented a mix of staff (64%), faculty (25%), and trainees (11%). Most identified as female (74%), 28% as a member of an underrepresented racial and ethnic minority (URM) group, and 17% as LGBTQI. During all four sessions, attendees increased their level of DEI knowledge, and within sessions two through four, attendees’ perception of DEI skills increased. We observed increased situational DEI awareness as higher proportions of attendees noted disparities in mentoring and opportunities for advancement/promotion. Attendees also increasingly perceived the lack of DEI in the workplace as a problem, although this increase was statistically significant only among URM attendees.
Discussion:
Developing applied curricula yielded measurable improvements in knowledge and skills for a diverse health research department of faculty, staff, and students. Nesting this training within a more extensive program of departmental activities to improve climate and address systematic exclusion likely contributed to the series’ success. Additional research is underway to understand the series’ longer-term impact on applying skills for behavior change.
Primary headache disorder is characterized by recurrent headaches which lack underlying causative pathology or trauma. Primary headache disorder is common and encompasses several subtypes including migraine. Vestibular migraine (VM) is a subtype of migraine that causes vestibular symptoms such as vertigo, difficulties with balance, nausea, and vomiting. Literature indicates subjective and performance-based cognitive problems (executive dysfunction) among migraineurs. This study compared the magnitude of the total effect size across neuropsychological domains to determine if there is a reliable difference in effect sizes between individuals with VM and healthy controls (HC). An additional aim was to meta-analyze neuropsychological outcomes in migraine subtypes (other than VM) in reference to healthy controls.
Participants and Methods:
This study was a part of a larger study examining neuropsychological functioning and impairment in individuals with primary headache disorder and HCs. Standardized search terms were applied in OneSearch and PubMed. The search interval covered articles published from 1986 to May 2021. Analyses used random-effects models. Hedges' g was used as a bias-corrected estimate of effect size. Between-study heterogeneity was assessed using Cochran’s Q and I2. Publication bias was assessed with Duval and Tweedie’s Trim-and-Fill method to identify evidence of missing studies.
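The random-effects machinery named above (pooled effect size, Cochran's Q, I², between-study variance) can be sketched with a DerSimonian–Laird estimator. The per-study g values and variances below are hypothetical illustrations, not data from this review:

```python
import math

# Hypothetical study-level Hedges' g values and sampling variances
g = [-1.2, -0.8, -1.0]
v = [0.15, 0.12, 0.18]

w = [1 / vi for vi in v]                           # fixed-effect weights
g_fe = sum(wi * gi for wi, gi in zip(w, g)) / sum(w)

# Cochran's Q and I² quantify between-study heterogeneity
Q = sum(wi * (gi - g_fe) ** 2 for wi, gi in zip(w, g))
df = len(g) - 1
I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0

# DerSimonian–Laird estimate of between-study variance tau²
C = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - df) / C)

# Random-effects pooled estimate re-weights by 1/(v + tau²)
w_re = [1 / (vi + tau2) for vi in v]
g_re = sum(wi * gi for wi, gi in zip(w_re, g)) / sum(w_re)
se_re = math.sqrt(1 / sum(w_re))
print(f"pooled g = {g_re:.2f} ± {1.96 * se_re:.2f}, Q = {Q:.2f}, I2 = {I2:.1f}%")
# pooled g = -0.98 ± 0.43, Q = 0.59, I2 = 0.0%
```

With Q below its degrees of freedom, tau² collapses to zero and the random-effects estimate coincides with the fixed-effect one; heterogeneous inputs would instead inflate the pooled standard error.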
Results:
The initial omnibus literature search yielded 6692 studies. Three studies (n=151 VM and 150 HC) met our inclusion criteria of having a VM group and reported neuropsychological performance. The VM group demonstrated significantly worse performance overall when compared to HCs (k=3, g=-0.99, p<0.001; Q=4.41, I2=54.66) with a large effect size. Within-domain effects of VM were: Executive Functioning=-0.99 (Q=0.62, I2=0), Screener=-1.15 (Q=3.29, I2=69.59), and Visuospatial/Construction=-1.47 (Q=0.001, I2=0.00). Compared to chronic migraine (k=3, g=-0.59, p<0.001; Q=0.68, I2=0.00) and migraine without aura (k=23, g=-0.39, p<0.001; Q=109.70, I2=79.95), VM was the only migraine subgroup to display a large effect size. The trim-and-fill procedure estimated zero VM studies to be missing due to publication bias (adjusted g=-0.99, Q=4.41).
Conclusions:
This initial attempt at a meta-analysis of cognitive deficits in VM was hampered by a lack of studies in this area. Based on our initial findings, individuals with VM demonstrated overall worse performances on neuropsychological tests compared to HCs, with the greatest level of impairment seen in visuospatial/construction. Additionally, VM resulted in a large effect size while other migraine subtypes yielded small to moderate effect sizes. Despite the small sample of studies, the overall effect across neuropsychological performance was generally stable (i.e., low between-study heterogeneity). Given that VM accounts for 7% of patients seen in vertigo clinics and 9% of all migraine patients, our results suggest that neuropsychological impairment in VM deserves significantly more study.
We compared the individual-level risk of hospital-onset infections with multidrug-resistant organisms (MDROs) in hospitalized patients prior to and during the coronavirus disease 2019 (COVID-19) pandemic. We also quantified the effects of COVID-19 diagnoses and intrahospital COVID-19 burden on subsequent MDRO infection risk.
Design:
Multicenter, retrospective, cohort study.
Setting:
Patient admission and clinical data were collected from 4 hospitals in the St. Louis area.
Patients:
Data were collected for patients admitted between January 2017 and August 2020, discharged no later than September 2020, and hospitalized ≥48 hours.
Methods:
Mixed-effects logistic regression models were fit to the data to estimate patients’ individual-level risk of infection with MDRO pathogens of interest during hospitalization. Adjusted odds ratios were derived from regression models to quantify the effects of the COVID-19 period, COVID-19 diagnosis, and hospital-level COVID-19 burden on individual-level hospital-onset MDRO infection probabilities.
Results:
We calculated adjusted odds ratios for COVID-19–era hospital-onset Acinetobacter spp., P. aeruginosa, and Enterobacteriaceae infections: probabilities increased 2.64 (95% confidence interval [CI], 1.22–5.73) times, 1.44 (95% CI, 1.03–2.02) times, and 1.25 (95% CI, 1.00–1.58) times relative to the prepandemic period, respectively. COVID-19 patients were 4.18 (95% CI, 1.98–8.81) times more likely to acquire hospital-onset MDRO S. aureus infections.
Conclusions:
Our results support the growing body of evidence indicating that the COVID-19 pandemic has increased hospital-onset MDRO infections.
The pace and trajectory of global and local environmental changes are jeopardizing our health in numerous ways, among them exacerbating the risk of disease emergence and spread in both the community and the healthcare setting via healthcare-associated infections (HAIs). Factors such as climate change, widespread land alteration, and biodiversity loss underlie changing human–animal–environment interactions that drive disease vectors, pathogen spillover, and cross-species transmission of zoonoses. Climate change–associated extreme weather events also threaten critical healthcare infrastructure, infection prevention and control (IPC) efforts, and treatment continuity, adding stress to already strained systems and creating new areas of vulnerability. These dynamics increase the likelihood of developing antimicrobial resistance (AMR), vulnerability to HAIs, and high-consequence hospital-based disease transmission. Using a One Health approach to both human and animal health systems, we can become climate smart by re-examining impacts on and relationships with the environment. We can then work collaboratively to reduce and respond to the growing threat and burden of infectious diseases.
Family-centered rounding has emerged as the gold standard for inpatient paediatrics rounds due to its association with improved family and staff satisfaction and reduction of harmful errors. Little is known about family-centered rounding in subspecialty paediatric settings, including paediatric acute care cardiology.
In this qualitative, single centre study, we conducted semi-structured interviews with providers and caregivers eliciting their attitudes toward family-centered rounding. An a priori recruitment approach was used to optimise diversity in reflected opinions. A brief demographic survey was completed by participants. We completed thematic analysis of transcribed interviews using grounded theory.
In total, 38 interviews representing the views of 48 individuals (11 providers, 37 caregivers) were completed. Three themes emerged: rounds as a moment of mutual accountability, caregivers’ empathy for providers, and providers’ objections to family-centered rounding. Providers’ objections were further categorised into themes of assumptions about caregivers, caregiver choices during rounds, and risk for exacerbation of bias and inequity.
Caregivers and providers in the paediatric acute care cardiology setting echoed some previously described attitudes toward family-centered rounding. Many of the challenges surrounding family-centered rounding might be addressed through access to training for caregivers and providers alike. Hospitals should invest in systems to facilitate family-centered rounding if they choose to implement this model of care, as the current state risks eroding the provider–caregiver relationship.
This study aimed to determine if pre-operative radiological scoring can reliably predict intra-operative difficulty and final cochlear electrode position in patients with advanced otosclerosis.
Method
In a retrospective cohort study, advanced otosclerosis patients who underwent cochlear implantation (n = 48, 52 ears) were compared with a larger cohort of post-lingually deaf adult patients (n = 1414) with bilateral hearing loss and normal cochlear anatomy. Pre-operative imaging for advanced otosclerosis patients and final electrode position were scored and correlated with intra-operative difficulty and speech outcomes.
Results
Advanced otosclerosis patients benefited significantly from cochlear implantation. Mean duration of deafness was longer in the advanced otosclerosis group (19.5 vs 14.3 years; p < 0.05).
Conclusion
Anatomical changes in advanced otosclerosis can result in increased difficulty of surgery. Evidence of pre-operative cochlear luminal changes was associated with difficult intra-operative insertion and a final non-scala tympani position. Nearly all electrodes implanted in the advanced otosclerosis cohort were peri-modiolar. There were no reports of facial nerve stimulation.
The cornerstone of obesity treatment is behavioural weight management, resulting in significant improvements in cardio-metabolic and psychosocial health. However, there is ongoing concern that dietary interventions used for weight management may precipitate the development of eating disorders. Systematic reviews demonstrate that, while for most participants medically supervised obesity treatment improves risk scores related to eating disorders, a subset of people who undergo obesity treatment may have poor outcomes for eating disorders. This review summarises the background and rationale for the formation of the Eating Disorders In weight-related Therapy (EDIT) Collaboration. The EDIT Collaboration will explore the complex risk factor interactions that precede changes to eating disorder risk following weight management. In this review, we also outline the programme of work and design of studies for the EDIT Collaboration, including expected knowledge gains. The EDIT studies explore risk factors and the interactions between them using individual-level data from international weight management trials. Combining all available data on eating disorder risk from weight management trials will allow sufficient sample size to interrogate our hypothesis: that individuals undertaking weight management interventions will vary in their eating disorder risk profile, on the basis of personal characteristics and intervention strategies available to them. The collaboration includes the integration of health consumers in project development and translation. An important knowledge gain from this project is a comprehensive understanding of the impact of weight management interventions on eating disorder risk.
Understanding spatial variation in origination and extinction can help to unravel the mechanisms underlying macroevolutionary patterns. Although methods have been developed for estimating global origination and extinction rates from the fossil record, no framework exists for applying these methods to restricted spatial regions. Here, we test the efficacy of three metrics for regional analysis, using simulated fossil occurrences. These metrics are then applied to the marine invertebrate record of the Permian and Triassic to examine variation in extinction and origination rates across latitudes. Extinction and origination rates were generally uniform across latitudes for these time intervals, including during the Capitanian and Permian–Triassic mass extinctions. The small magnitude of this variation, combined with the possibility of its attribution to sampling bias, cautions against linking any observed differences to contrasting evolutionary dynamics. Our results indicate that origination and extinction levels were more variable across clades than across latitudes.
Little is known about environmental factors that may influence associations between genetic liability to suicidality and suicidal behavior.
Methods
This study examined whether a suicidality polygenic risk score (PRS) derived from a large genome-wide association study (N = 122,935) was associated with suicide attempts in a population-based sample of European-American US military veterans (N = 1664; 92.5% male), and whether cumulative lifetime trauma exposure moderated this association.
Results
Eighty-five veterans (weighted 6.3%) reported a history of suicide attempt. After adjusting for sociodemographic and psychiatric characteristics, suicidality PRS was associated with lifetime suicide attempt (odds ratio 2.65; 95% CI 1.37–5.11). A significant suicidality PRS-by-trauma exposure interaction emerged, such that veterans with higher levels of suicidality PRS and greater trauma burden had the highest probability of lifetime suicide attempt (16.6%), whereas the probability of attempts was substantially lower among those with high suicidality PRS and low trauma exposure (1.4%). The PRS-by-trauma interaction effect was enriched for genes implicated in cellular and developmental processes, and nervous system development, with variants annotated to the DAB2 and SPNS2 genes, which are implicated in inflammatory processes. Drug repurposing analyses revealed upregulation of suicide gene-sets in the context of medrysone, a drug targeting chronic inflammation, and clofibrate, a triglyceride-lowering agent.
Conclusion
Results suggest that genetic liability to suicidality is associated with increased risk of suicide attempt among veterans, particularly in the presence of high levels of cumulative trauma exposure. Additional research is warranted to investigate whether incorporation of genomic information may improve suicide prediction models.
Why have some states adopted policies expanding ballot access while others have restricted access to the ballot? Since the 1990s, some states have adopted policies restricting access to the ballot, such as requiring identification. At the same time, states have been adopting a variety of registration reforms that lower the barriers to registration and voting. Using an original, 45-state dataset, we examine state innovation within the policy domain of electoral reforms in US states. We find that reforms have an independent, and sometimes negative, effect on state innovation in electoral reforms. Next, we use dyad analysis to examine the spread of a single policy: automatic voter registration. We find that the propensity to innovate both within and across states makes the spread of automatic voter registration more likely. Our paper contributes to the broader understanding of why states adopt electoral reforms.
Pain following surgery for cardiac disease is ubiquitous, and optimal management is important. Despite this, there is large practice variation. To address this, the Paediatric Acute Care Cardiology Collaborative undertook the effort to create this clinical practice guideline.
Methods:
A panel of experts consisting of paediatric cardiologists, advanced practice practitioners, pharmacists, a paediatric cardiothoracic surgeon, and a paediatric cardiac anaesthesiologist was convened. The literature was searched for relevant articles and Collaborative sites submitted centre-specific protocols for postoperative pain management. Using the modified Delphi technique, recommendations were generated and put through iterative Delphi rounds to achieve consensus.
Results:
Sixty recommendations achieved consensus and are included in this guideline. They address guideline use, pain assessment, general considerations, preoperative considerations, intraoperative considerations, regional anaesthesia, opioids, opioid-sparing, non-opioid medications, non-pharmaceutical pain management, and discharge considerations.
Conclusions:
Postoperative pain among children following cardiac surgery is currently an area of significant practice variability despite a large body of literature and the presence of centre-specific protocols. Central to the recommendations included in this guideline is the concept that ideal pain management begins with preoperative counselling and continues through to patient discharge. Overall, the quality of evidence supporting recommendations is low. There is ongoing need for research in this area, particularly in paediatric populations.