Preliminary evidence suggests that a ketogenic diet may be effective for bipolar disorder.
Aims
To assess the impact of a ketogenic diet in bipolar disorder on clinical, metabolic and magnetic resonance spectroscopy outcomes.
Method
Euthymic individuals with bipolar disorder (N = 27) were recruited to a 6- to 8-week single-arm open pilot study of a modified ketogenic diet. Clinical, metabolic and MRS measures were assessed before and after the intervention.
Results
Of 27 recruited participants, 26 began and 20 completed the ketogenic diet. For participants completing the intervention, mean body weight fell by 4.2 kg (P < 0.001), mean body mass index fell by 1.5 kg/m2 (P < 0.001) and mean systolic blood pressure fell by 7.4 mmHg (P = 0.041). Mean baseline and follow-up assessment scores remained within the euthymic range, with no statistically significant changes on the Affective Lability Scale-18, Beck Depression Inventory or Young Mania Rating Scale. In participants providing reliable daily ecological momentary assessment data (n = 14), there was a positive correlation between daily ketone levels and self-rated mood (r = 0.21, P < 0.001) and energy (r = 0.19, P < 0.001), and an inverse correlation between ketone levels and both impulsivity (r = −0.30, P < 0.001) and anxiety (r = −0.19, P < 0.001). On MRS, brain glutamate plus glutamine concentration decreased by 11.6% in the anterior cingulate cortex (P = 0.025) and by 13.6% in the posterior cingulate cortex (P < 0.001).
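As a minimal sketch of how such daily ketone-symptom correlations could be computed, the snippet below pools participant-day observations; the file and column names are hypothetical, and the published analysis may have used a repeated-measures model rather than simple pooled correlations.

```python
# Minimal sketch: correlating daily ketone levels with daily self-ratings.
# File and column names are hypothetical illustrations, not study materials.
import pandas as pd
from scipy.stats import pearsonr

ema = pd.read_csv("ema_daily.csv")  # one row per participant-day (hypothetical)

for outcome in ["mood", "energy", "impulsivity", "anxiety"]:
    paired = ema[["ketone_mmol", outcome]].dropna()
    r, p = pearsonr(paired["ketone_mmol"], paired[outcome])
    print(f"ketones vs {outcome}: r = {r:.2f}, P = {p:.3g}")
```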
Conclusions
These findings suggest that a ketogenic diet may be clinically useful in bipolar disorder, for both mental health and metabolic outcomes. Replication and randomised controlled trials are now warranted.
Machine learning could be used to predict binge behaviors in daily life and thereby inform treatment development for bulimia nervosa (BN) and alcohol use disorder (AUD). This study therefore evaluates person-specific and pooled prediction models for binge eating (BE), alcohol use, and binge drinking (BD) in daily life, and identifies the most important predictors.
Methods
A total of 120 patients (BN: 50; AUD: 51; BN/AUD: 19) participated in an experience sampling study in which, over a period of 12 months, they reported on their eating and drinking behaviors as well as on several other emotional, behavioral, and contextual factors in daily life. The study had a burst-measurement design, with assessments occurring eight times a day on Thursdays, Fridays, and Saturdays in seven bursts of three weeks. Person-specific and pooled models were then fit with elastic net regularized regression and evaluated with cross-validation. From these models, the variables whose estimates ranked in the top 10% were identified.
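As a hedged sketch of the pooled-model step (elastic net regularized regression evaluated with cross-validated AUC), the following uses scikit-learn; the dataset layout and feature names are hypothetical stand-ins, not the study's variables.

```python
# Sketch: elastic net regularized logistic regression with cross-validated
# AUC, mirroring the pooled-model pipeline described above. Data layout and
# feature names are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

df = pd.read_csv("esm_observations.csv")          # hypothetical EMA data
X = df[["craving", "hour_of_day", "negative_affect", "alone"]]
y = df["binge_eating"]                            # 0/1 outcome

model = LogisticRegression(
    penalty="elasticnet", solver="saga", l1_ratio=0.5, C=1.0, max_iter=5000
)
aucs = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"median cross-validated AUC = {np.median(aucs):.2f}")
```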
Results
The person-specific models had a median AUC of 0.61, 0.80, and 0.85 for BE, alcohol use, and BD, respectively, while the pooled models had median AUCs of 0.70, 0.90, and 0.93. The most important predictors across the behaviors were craving and time of day. However, predictors concerning social context and affect differed among BE, alcohol use, and BD.
Conclusions
Pooled models outperformed person-specific models and the models for alcohol use and BD outperformed those for BE. Future studies should explore how the performance of these models can be improved and how they can be used to deliver interventions in daily life.
Recent evidence from case reports suggests that a ketogenic diet may be effective for bipolar disorder. However, no clinical trials have been conducted to date.
Aims
To assess the recruitment and feasibility of a ketogenic diet intervention in bipolar disorder.
Method
Euthymic individuals with bipolar disorder were recruited to a 6–8 week trial of a modified ketogenic diet, and a range of clinical, economic and functional outcome measures were assessed. Study registration number: ISRCTN61613198.
Results
Of 27 recruited participants, 26 commenced and 20 completed the modified ketogenic diet for 6–8 weeks. The outcomes data-set was 95% complete for each of the daily ketone measures, daily glucose measures and daily ecological momentary assessment of symptoms during the intervention period. Mean daily blood ketone readings were 1.3 mmol/L (s.d. = 0.77, median = 1.1), and 91% of all readings indicated ketosis, suggesting a high degree of adherence to the diet. Over 91% of daily blood glucose readings were within the normal range, with 9% indicating mild hypoglycaemia. Eleven minor adverse events were recorded, including fatigue, constipation, drowsiness and hunger. One serious adverse event was reported (euglycaemic ketoacidosis in a participant taking SGLT2-inhibitor medication).
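A minimal sketch of how the adherence summary above could be computed from the daily readings, assuming the common ≥0.5 mmol/L ketosis threshold (the study's own cut-off may differ) and a hypothetical data file:

```python
# Sketch: summarizing daily capillary ketone readings. The >= 0.5 mmol/L
# ketosis threshold is a common convention and an assumption here.
import pandas as pd

ketones = pd.read_csv("daily_ketones.csv")["ketone_mmol"]  # hypothetical file
print(f"mean = {ketones.mean():.1f} mmol/L, s.d. = {ketones.std():.2f}, "
      f"median = {ketones.median():.1f}")
print(f"% of readings in ketosis: {(ketones >= 0.5).mean() * 100:.0f}%")
```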
Conclusions
The recruitment and retention of euthymic individuals with bipolar disorder to a 6–8 week ketogenic diet intervention was feasible, with high completion rates for outcome measures. The majority of participants reached and maintained ketosis, and adverse events were generally mild and modifiable. A future randomised controlled trial is now warranted.
Background: Mountain biking (MTB) is an increasingly popular sport that has been associated with serious spinal injuries, which can have devastating effects on patients and significant impacts on healthcare resources. Herein, we characterized the occurrence of these MTB spinal injuries over a 15-year period and analyzed the affiliated acute-care hospital costs. Methods: Patients seen at Vancouver General Hospital for MTB spinal injuries between 2008 and 2022 were retrospectively reviewed. Demographics, injury details, treatments, outcomes, and resource requirements for acute hospitalization were collected. The Canadian Institute for Health Information was referenced for cost analysis. Results: Over the 15 years of analysis, 149 MTB spinal injuries occurred. The majority of patients (87.2%) were male. Fifty-nine injuries (39.6%) were associated with spinal cord injury; most of these were in the cervical spine (72.3%), and the most common severity was AIS Grade A (36.1%). In total, 102 patients (68.5%) required spine surgery, 26 (17.4%) required intensive care, and 34 (22.8%) required inpatient rehabilitation. Mean length of stay was 13.5 days, and acute admission costs for the healthcare system averaged $35,251 (95% CI $27,080-$43,424). Conclusions: MTB spinal injuries are associated with significant medical, personal, and financial burden. As injury prevention remains paramount, further investigation of the roles of education and safety measures is recommended.
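For illustration, a t-based 95% confidence interval for a mean cost of the kind reported above can be computed as below; the per-patient cost array is hypothetical, and the authors' exact method is not stated in the abstract.

```python
# Sketch: t-based 95% confidence interval for a mean admission cost.
# 'admission_costs.csv' is a hypothetical per-patient cost file.
import numpy as np
from scipy import stats

costs = np.loadtxt("admission_costs.csv")  # hypothetical cost data
mean = costs.mean()
sem = stats.sem(costs)                     # standard error of the mean
lo, hi = stats.t.interval(0.95, df=len(costs) - 1, loc=mean, scale=sem)
print(f"mean ${mean:,.0f} (95% CI ${lo:,.0f}-${hi:,.0f})")
```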
Many triage algorithms exist for use in mass-casualty incidents (MCIs) involving pediatric patients. Most of these algorithms have not been validated for reliability across users.
Study Objective:
Investigators sought to compare inter-rater reliability (IRR) and agreement among five MCI algorithms used in the pediatric population.
Methods:
A dataset of 253 pediatric (<14 years of age) trauma activations from a Level I trauma center was used to obtain prehospital information and demographics. Three raters were trained on five MCI triage algorithms: Simple Triage and Rapid Treatment (START) and JumpSTART, as appropriate for age (combined as J-START); Sort, Assess, Life-Saving Interventions, Treatment (SALT); Pediatric Triage Tape (PTT); CareFlight (CF); and Sacco Triage Method (STM). Patient outcomes were collected but not available to raters. Each rater triaged the full set of patients into Green, Yellow, Red, or Black categories with each of the five MCI algorithms. The IRR was reported as weighted kappa scores with 95% confidence intervals (CI). Descriptive statistics were used to describe inter-rater and inter-MCI algorithm agreement.
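A brief, hedged example of the weighted kappa statistic used for IRR, computed with scikit-learn on illustrative (not real) triage assignments:

```python
# Hedged example: weighted kappa between two raters' triage assignments.
# Ordinal coding and assignments are placeholders, not study data.
from sklearn.metrics import cohen_kappa_score

order = {"Green": 0, "Yellow": 1, "Red": 2, "Black": 3}  # ordinal coding
rater1 = ["Green", "Yellow", "Red", "Green"]             # placeholder data
rater2 = ["Green", "Red", "Red", "Yellow"]

kappa = cohen_kappa_score(
    [order[c] for c in rater1], [order[c] for c in rater2], weights="linear"
)
print(f"weighted kappa = {kappa:.2f}")
```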
Results:
Of the 253 patients, 247 had complete triage assignments across the five algorithms and were included in the study. The IRR was excellent for a majority of the algorithms; J-START and CF had the highest reliability, with weighted kappa values of 0.94 or higher (95% CI for the overall weighted kappa, 0.90-1.00). The greatest variability was in SALT among Green and Yellow patients. Overall, J-START and CF had the highest inter-rater and inter-MCI algorithm agreement.
Conclusion:
The IRR was excellent for a majority of the algorithms. The SALT algorithm, which contains subjective components, had the lowest IRR when applied to this dataset of pediatric trauma patients. Both J-START and CF demonstrated the best overall reliability and agreement.
Objective:
To describe the cumulative seroprevalence of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) antibodies during the coronavirus disease 2019 (COVID-19) pandemic among employees of a large pediatric healthcare system.
Design, setting, and participants:
Prospective observational cohort study open to adult employees at the Children’s Hospital of Philadelphia, conducted April 20–December 17, 2020.
Methods:
Employees were recruited starting with high-risk exposure groups, utilizing e-mails, flyers, and announcements at virtual town hall meetings. At baseline, 1 month, 2 months, and 6 months, participants reported occupational and community exposures and gave a blood sample for SARS-CoV-2 antibody measurement by enzyme-linked immunosorbent assays (ELISAs). A post hoc Cox proportional hazards regression model was performed to identify factors associated with increased risk for seropositivity.
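One way to fit the post hoc Cox proportional hazards model described above is with the lifelines package; the sketch below is an illustration with hypothetical column names, and the study's own implementation may differ.

```python
# Sketch: Cox proportional hazards model for time to seropositivity using
# lifelines. Column names are hypothetical, not the study's variables.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("employee_followup.csv")  # hypothetical per-employee data
cph = CoxPHFitter()
cph.fit(
    df[["followup_days", "seroconverted", "direct_care", "black_race",
        "community_exposure"]],
    duration_col="followup_days",
    event_col="seroconverted",
)
cph.print_summary()  # hazard ratios (exp(coef)) with 95% CIs
```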
Results:
In total, 1,740 employees were enrolled. At 6 months, the cumulative seroprevalence was 5.3%, which was below estimated community point seroprevalence. Seroprevalence was 5.8% among employees who provided direct care and was 3.4% among employees who did not perform direct patient care. Most participants who were seropositive at baseline remained positive at follow-up assessments. In a post hoc analysis, direct patient care (hazard ratio [HR], 1.95; 95% confidence interval [CI], 1.03–3.68), Black race (HR, 2.70; 95% CI, 1.24–5.87), and exposure to a confirmed case in a nonhealthcare setting (HR, 4.32; 95% CI, 2.71–6.88) were associated with statistically significant increased risk for seropositivity.
Conclusions:
Employee SARS-CoV-2 seroprevalence rates remained below the point-prevalence rates of the surrounding community. Provision of direct patient care, Black race, and exposure to a confirmed case in a nonhealthcare setting conferred increased risk. These data can inform occupational protection measures to maximize protection of employees within the workplace during future COVID-19 waves or other epidemics.
We present an overview of the Middle Ages Galaxy Properties with Integral Field Spectroscopy (MAGPI) survey, a Large Program on the European Southern Observatory Very Large Telescope. MAGPI is designed to study the physical drivers of galaxy transformation at a lookback time of 3–4 Gyr, during which the dynamical, morphological, and chemical properties of galaxies are predicted to evolve significantly. The survey uses new medium-deep adaptive optics aided Multi-Unit Spectroscopic Explorer (MUSE) observations of fields selected from the Galaxy and Mass Assembly (GAMA) survey, providing a wealth of publicly available ancillary multi-wavelength data. With these data, MAGPI will map the kinematic and chemical properties of stars and ionised gas for a sample of 60 massive (${>}7 \times 10^{10} {\mathrm{M}}_\odot$) central galaxies at $0.25 < z < 0.35$ in a representative range of environments (isolated, groups and clusters). The spatial resolution delivered by MUSE with Ground Layer Adaptive Optics ($0.6-0.8$ arcsec FWHM) will facilitate a direct comparison with Integral Field Spectroscopy surveys of the nearby Universe, such as SAMI and MaNGA, and at higher redshifts using adaptive optics, for example, SINS. In addition to the primary (central) galaxy sample, MAGPI will deliver resolved and unresolved spectra for as many as 150 satellite galaxies at $0.25 < z < 0.35$, as well as hundreds of emission-line sources at $z < 6$. This paper outlines the science goals, survey design, and observing strategy of MAGPI. We also present a first look at the MAGPI data, and the theoretical framework to which MAGPI data will be compared using the current generation of cosmological hydrodynamical simulations including EAGLE, Magneticum, HORIZON-AGN, and IllustrisTNG. Our results show that cosmological hydrodynamical simulations make discrepant predictions in the spatially resolved properties of galaxies at $z\approx 0.3$. MAGPI observations will place new constraints and allow for tangible improvements in galaxy formation theory.
Stem cells give rise to the entirety of cells within an organ. Maintaining stem cell identity and coordinately regulating stem cell divisions is crucial for proper development. In plants, mobile proteins, such as WUSCHEL-RELATED HOMEOBOX 5 (WOX5) and SHORTROOT (SHR), regulate divisions in the root stem cell niche. However, how these proteins coordinately function to establish systemic behaviour is not well understood. We propose a non-cell autonomous role for WOX5 in the cortex endodermis initial (CEI) and identify a regulator, ANGUSTIFOLIA (AN3)/GRF-INTERACTING FACTOR 1, that coordinates CEI divisions. Here, we show with a multi-scale hybrid model integrating ordinary differential equations (ODEs) and agent-based modeling that quiescent center (QC) and CEI divisions have different dynamics. Specifically, by combining continuous models to describe regulatory networks and agent-based rules, we model systemic behaviour, which led us to predict cell-type-specific expression dynamics of SHR, SCARECROW, WOX5, AN3 and CYCLIND6;1, and experimentally validate CEI cell divisions. Taken together, our results show an interdependency between CEI and QC divisions.
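To make the hybrid-modeling idea concrete, the toy sketch below couples a one-variable ODE (solved continuously) to a discrete agent rule that triggers division above a threshold. The network, parameters and threshold are all invented for illustration and are far simpler than the published multi-scale model.

```python
# Toy illustration of a hybrid ODE/agent-based scheme: a continuous
# regulator concentration (ODE) drives a discrete division rule (agent).
# All values here are invented placeholders.
from scipy.integrate import solve_ivp

def regulator(t, x, k_prod=1.0, k_deg=0.5):
    # dx/dt = production - degradation (placeholder regulatory network)
    return k_prod - k_deg * x[0]

state, divisions = [0.0], []
for step in range(10):                 # discrete agent-based time steps
    sol = solve_ivp(regulator, (0, 1.0), state)
    state = [sol.y[0, -1]]
    if state[0] > 1.5:                 # agent rule: divide above threshold
        divisions.append(step)
        state = [state[0] / 2]         # regulator split between daughters
print("division events at steps:", divisions)
```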
In April 2019, the U.S. Fish and Wildlife Service (USFWS) released its recovery plan for the jaguar Panthera onca after several decades of discussion, litigation and controversy about the status of the species in the USA. The USFWS estimated that potential habitat, south of the Interstate-10 highway in Arizona and New Mexico, had a carrying capacity of c. six jaguars, and so focused its recovery programme on areas south of the USA–Mexico border. Here we present a systematic review of the modelling and assessment efforts over the last 25 years, with a focus on areas north of Interstate-10 in Arizona and New Mexico, outside the recovery unit considered by the USFWS. Despite differences in data inputs, methods, and analytical extent, the nine previous studies found support for potential suitable jaguar habitat in the central mountain ranges of Arizona and New Mexico. Applying slightly modified versions of the USFWS model and recalculating an Arizona-focused model over both states provided additional confirmation. Extending the area of consideration also substantially raised the carrying capacity of habitats in Arizona and New Mexico, from six to 90 or 151 adult jaguars, using the modified USFWS models. This review demonstrates the crucial ways in which choosing the extent of analysis influences the conclusions of a conservation plan. More importantly, it opens a new opportunity for jaguar conservation in North America that could help address threats from habitat losses, climate change and border infrastructure.
We summarize some of the past year's most important findings within climate change-related research. New research has improved our understanding of Earth's sensitivity to carbon dioxide, found that permafrost thaw could release more carbon than expected, and shown that carbon uptake in tropical ecosystems is weakening. Adverse impacts on human society include increasing water shortages and impacts on mental health. Options for solutions emerge from rethinking economic models, rights-based litigation, strengthened governance systems and a new social contract. The disruption caused by COVID-19 could be seized as an opportunity for positive change, directing economic stimulus towards sustainable investments.
Technical summary
A synthesis is made of ten fields within climate science where there have been significant advances since mid-2019, through an expert elicitation process with broad disciplinary scope. Findings include: (1) a better understanding of equilibrium climate sensitivity; (2) abrupt thaw as an accelerator of carbon release from permafrost; (3) changes to global and regional land carbon sinks; (4) impacts of climate change on water crises, including equity perspectives; (5) adverse effects on mental health from climate change; (6) immediate effects on climate of the COVID-19 pandemic and requirements for recovery packages to deliver on the Paris Agreement; (7) suggested long-term changes to governance and a social contract to address climate change, learning from the current pandemic; (8) an updated positive cost–benefit ratio and new perspectives on the potential for green growth in both the short and long term; (9) urban electrification as a strategy to move towards low-carbon energy systems; and (10) rights-based litigation as an increasingly important method to address climate change, with recent clarifications on the legal standing and representation of future generations.
Social media summary
Stronger permafrost thaw, COVID-19 effects and growing mental health impacts among highlights of latest climate science.
Associations of socioenvironmental features like urbanicity and neighborhood deprivation with psychosis are well-established. An enduring question, however, is whether these associations are causal. Genetic confounding could occur due to downward mobility of individuals at high genetic risk for psychiatric problems into disadvantaged environments.
Methods
We examined correlations of five indices of genetic risk [polygenic risk scores (PRS) for schizophrenia and depression, maternal psychotic symptoms, family psychiatric history, and zygosity-based latent genetic risk] with multiple area-, neighborhood-, and family-level risks during upbringing. Data were from the Environmental Risk (E-Risk) Longitudinal Twin Study, a nationally-representative cohort of 2232 British twins born in 1994–1995 and followed to age 18 (93% retention). Socioenvironmental risks included urbanicity, air pollution, neighborhood deprivation, neighborhood crime, neighborhood disorder, social cohesion, residential mobility, family poverty, and a cumulative environmental risk scale. At age 18, participants were privately interviewed about psychotic experiences.
Results
Higher genetic risk on all indices was associated with riskier environments during upbringing. For example, participants with higher schizophrenia PRS (OR = 1.19, 95% CI = 1.06–1.33), depression PRS (OR = 1.20, 95% CI = 1.08–1.34), family history (OR = 1.25, 95% CI = 1.11–1.40), and latent genetic risk (OR = 1.21, 95% CI = 1.07–1.38) had accumulated more socioenvironmental risks for schizophrenia by age 18. However, associations between socioenvironmental risks and psychotic experiences mostly remained significant after covariate adjustment for genetic risk.
Conclusion
Genetic risk is correlated with socioenvironmental risk for schizophrenia during upbringing, but the associations between socioenvironmental risk and adolescent psychotic experiences appear, at present, to exist above and beyond this gene-environment correlation.
Coronavirus disease 2019 (COVID-19) has resulted in a global pandemic, and there are limited data on effective therapies. Bacillus Calmette–Guérin (BCG) vaccine, a live-attenuated strain derived from an isolate of Mycobacterium bovis and originally designed to prevent tuberculosis, has shown some efficacy against infection with unrelated pathogens. In this study, we reviewed 120 consecutive adult patients (≥18 years old) with COVID-19 at a major federally qualified health centre in Rhode Island, United States, from 19 March to 29 April 2020. Median age was 39.5 years (interquartile range, 27.0–50.0), 30% were male and 87.5% were Latino/Hispanic. Eighty-two patients (68.3%) had received BCG vaccination. Individuals with BCG vaccination were less likely to require hospital admission during the disease course (3.7% vs. 15.8%, P = 0.019). This association remained unchanged after adjusting for demographics and comorbidities (P = 0.017) using multivariate regression analysis. Our findings suggest that BCG vaccination may have potential to protect against more severe COVID-19.
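The unadjusted admission comparison can be reconstructed as a 2x2 contingency test. In the sketch below the counts are back-calculated from the percentages reported above, and the use of Fisher's exact test is an assumption; the abstract does not name the test applied.

```python
# Sketch: the unadjusted admission comparison as a 2x2 contingency test.
# Counts back-calculated from the reported percentages (3/82 vaccinated
# vs. 6/38 unvaccinated admitted); choice of test is an assumption.
from scipy.stats import fisher_exact

#            admitted  not admitted
table = [[3, 79],    # BCG-vaccinated (n = 82)
         [6, 32]]    # not vaccinated (n = 38)
odds_ratio, p = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, P = {p:.3f}")
```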
The timely identification and treatment of psychosis are increasingly the focus of early intervention, with research targeting the initial high-risk period in the months following first-episode hospitalization. However, ongoing psychiatric treatment and service utilization over the initial years following a first episode, once symptoms have been stabilized, have received less research attention.
Objectives
To model the variables predicting adolescents' continued service utilization with psychiatrists following first-episode psychosis, and to examine associated temporal patterns in continued psychiatric service utilization.
Methods
This study utilized a cohort design to assess adolescents (age 14.4 ± 2.5 years) discharged following their index hospitalization for first-episode psychosis. Bivariate analyses were conducted on predictor variables associated with psychiatric service utilization. All significant predictor variables were included in a logistic regression model.
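A hedged sketch of the logistic regression step, with odds ratios obtained by exponentiating coefficients; the variable names and data file are hypothetical.

```python
# Sketch: logistic regression of continued service utilization on the
# significant bivariate predictors; odds ratios = exponentiated
# coefficients. Variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("first_episode_cohort.csv")  # hypothetical cohort data
fit = smf.logit(
    "continued_utilization ~ schizophrenia_spectrum + relative_depression"
    " + months_since_discharge",
    data=df,
).fit()
print(np.exp(fit.params))      # odds ratios
print(np.exp(fit.conf_int()))  # 95% CIs for the odds ratios
```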
Results
Variables that were significantly associated with psychiatric service utilization included: diagnosis with a schizophrenia spectrum disorder rather than major mood disorder with psychotic features (OR = 24.0; P = 0.02), a first-degree relative with depression (OR = 0.12; P = 0.05), and months since last psychiatric inpatient discharge (OR = 0.92; P = 0.02). Further examination of time since last hospitalization found that all adolescents continued service utilization up to 18 months post-discharge.
Conclusions
Key findings highlight the importance of early diagnosis, suggest that a first-degree relative with depression may negatively influence the adolescent's ongoing service utilization, and indicate that 18 months post-discharge may be a critical time to review current treatment strategies and collaborate with youth and families to ensure that services continue to meet their needs.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
The Sort, Assess, Life-saving Interventions, Treatment and/or Transport (SALT) mass-casualty incident (MCI) algorithm is unique in that it includes two subjective questions during the triage process: “Is the victim likely to survive given the resources?” and “Is the injury minor?”
Hypothesis/Problem:
Given this subjectivity, it was hypothesized that as casualties increase, the inter-rater reliability (IRR) of the tool would decline, due to an increase in the number of patients triaged as Minor and Expectant.
Methods:
A pre-collected dataset of pediatric trauma patients age <14 years from a single Level 1 trauma center was used to generate “patients.” Three trained raters triaged each patient using SALT as if they were in each of the following scenarios: 10, 100, and 1,000 victim MCIs. Cohen’s kappa test was used to evaluate IRR between the raters in each of the scenarios.
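For illustration, Cohen's kappa for each rater pair within one scenario can be computed as below; the triage assignments shown are placeholders, not study data.

```python
# Illustration: pairwise Cohen's kappa among three raters within one
# scenario. Assignments are placeholders, not study data.
from itertools import combinations
from sklearn.metrics import cohen_kappa_score

ratings = {
    "rater1": ["Minor", "Delayed", "Immediate", "Expectant"],
    "rater2": ["Minor", "Immediate", "Immediate", "Minor"],
    "rater3": ["Delayed", "Delayed", "Immediate", "Expectant"],
}
for a, b in combinations(ratings, 2):
    kappa = cohen_kappa_score(ratings[a], ratings[b])
    print(f"{a} vs {b}: kappa = {kappa:.2f}")
```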
Results:
A total of 247 patients were available for triage. The kappas were consistently “poor” to “fair”: 0.37 to 0.59 in the 10-victim scenario; 0.13 to 0.36 in the 100-victim scenario; and 0.05 to 0.36 in the 1,000-victim scenario. The percentage of subjects triaged as Minor increased with the estimated number of victims: a 27.8% increase from the 10- to the 100-victim scenario and a further 7.0% increase from the 100- to the 1,000-victim scenario. Expectant triage categorization remained stable as victim numbers increased.
Conclusion:
Overall, SALT demonstrated poor IRR in this study of increasing casualty counts while triaging pediatric patients. Increased casualty counts in the scenarios did lead to increased Minor but not Expectant categorizations.
Recent investigations suggest that cerebrovascular reactivity (CVR) is impaired in Alzheimer’s disease (AD) and may underpin part of the disease’s neurovascular component. However, our understanding of the relationship between the magnitude of CVR, the speed of the cerebrovascular response, and the progression of AD is still limited. This is especially true in patients with mild cognitive impairment (MCI), which is recognized as an intermediate stage between normal aging and dementia. The purpose of this study was to investigate AD and MCI patients by mapping repeatable and accurate measures of cerebrovascular function, namely the magnitude and speed of the cerebrovascular response (τ) to a vasoactive stimulus, in key predilection sites for vascular dysfunction in AD.
Methods:
Thirty-three subjects (age range: 52–83 years, 20 males) were prospectively recruited. CVR and τ were assessed using blood oxygen level-dependent MRI during a standardized carbon dioxide stimulus. Temporal and parietal cortical regions of interest (ROIs) were generated from anatomical images using the FreeSurfer image analysis suite.
Results:
Of 33 subjects recruited, 3 individuals were excluded, leaving 30 subjects for analysis, consisting of 6 individuals with early AD, 11 individuals with MCI, and 13 older healthy controls (HCs). τ was found to be significantly higher in the AD group compared to the HC group in both the temporal (p = 0.03) and parietal cortex (p = 0.01) following a one-way ANCOVA correcting for age and microangiopathy scoring and a Bonferroni post-hoc correction.
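A sketch of how a one-way ANCOVA of this kind can be fit as an ordinary least squares model, with hypothetical column names; the published analysis may differ in software and detail.

```python
# Sketch: one-way ANCOVA on tau with age and microangiopathy score as
# covariates, fit as an OLS model. Column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("cvr_measures.csv")   # hypothetical per-subject tau values
fit = smf.ols("tau ~ C(group) + age + microangiopathy", data=df).fit()
print(sm.stats.anova_lm(fit, typ=2))   # group effect adjusted for covariates
# Pairwise group contrasts would then be Bonferroni-corrected, e.g. by
# multiplying each pairwise P-value by the number of comparisons.
```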
Conclusion:
The study findings suggest that AD is associated with a slowing of the cerebrovascular response in the temporal and parietal cortices.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
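As background to the heritability estimates mentioned above, the classical Falconer decomposition gives a quick approximation from MZ and DZ twin correlations: h² = 2(r_MZ − r_DZ), c² = 2r_DZ − r_MZ, e² = 1 − r_MZ. CODATwins analyses rely on full variance-component models, so the snippet below is only the textbook illustration, with hypothetical correlations.

```python
# Textbook illustration only: Falconer decomposition of twin correlations
# into additive genetic (A), shared (C) and unique (E) environment.
def falconer(r_mz, r_dz):
    h2 = 2 * (r_mz - r_dz)  # additive genetic variance (A)
    c2 = 2 * r_dz - r_mz    # shared environment (C)
    e2 = 1 - r_mz           # unique environment (E)
    return h2, c2, e2

# Hypothetical twin correlations for height:
h2, c2, e2 = falconer(r_mz=0.90, r_dz=0.50)
print(f"h2 = {h2:.2f}, c2 = {c2:.2f}, e2 = {e2:.2f}")  # 0.80, 0.10, 0.10
```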
Background: Buprenorphine/naloxone (bup/nal), a combination of a partial opioid agonist and an opioid antagonist, is a recommended first-line treatment for opioid use disorder (OUD). Emergency departments (EDs) are a key point of contact with the healthcare system for patients living with OUD. Aim Statement: We implemented a multi-disciplinary quality improvement project to screen patients for OUD, initiate bup/nal for eligible individuals, and provide rapid next-business-day walk-in referrals to addiction clinics in the community. Measures & Design: From May to September 2018, our team worked with three ED sites and three addiction clinics to pilot the program. Implementation involved alignment with regulatory requirements, physician education, coordination with pharmacy to ensure in-ED medication access, and nurse education. The project is supported by a full-time project manager, data analyst, operations leaders, physician champions, provincial pharmacy, and the Emergency Strategic Clinical Network leadership team. For our pilot, our evaluation objective was to determine the degree to which our initiation and referral pathway was being utilized. We used administrative data to track the number of patients given bup/nal in the ED, their demographics, and whether they continued to fill bup/nal prescriptions 30 days after their ED visit. Addiction clinics reported both the number of patients referred to them and the number of patients attending their referral. Evaluation/Results: Administrative data show 568 opioid-related visits to the ED pilot sites during the pilot phase. Bup/nal was given to 60 unique patients during 66 unique visits. There were 32 (53%) male patients and 28 (47%) female patients. Median patient age was 34 years (range: 21 to 79). ED visits where bup/nal was given had a median length of stay of 6 hours 57 minutes (IQR: 6 hours 20 minutes) and Canadian Triage Acuity Scores as follows: Level 1, 1 (2%); Level 2, 21 (32%); Level 3, 32 (48%); Level 4, 11 (17%); Level 5, 1 (2%). Of these visits, 51 (77%) led to discharge, and 24 (47%) of the discharged patients given bup/nal in the ED continued to fill bup/nal prescriptions 30 days after their index visit. EDs also referred 37 patients with OUD to the three community clinics, and 16 of those individuals (43%) attended their first follow-up appointment. Discussion/Impact: Our pilot project demonstrates that, with dedicated resources and broad institutional support, ED patients with OUD can be appropriately initiated on bup/nal and referred to community care.
The last 12 years have seen the evolution of a new funding regime under the supervision of the Pensions Regulator. Over this period, there has been significant turbulence in financial markets, including record low interest rates. This paper takes a critical look at the development of funding approaches and methodologies over this period. It analyses the Pensions Regulator guidance and how scheme specific actuarial methods have emerged since the move away from the Minimum Funding Requirement in 2001 and the introduction of the Scheme Specific Funding Requirements in 2005. It asks whether these new methodologies have been successful from the perspective of members, trustees, employers and shareholders. At a time when actuarial valuation methodologies have faced considerable criticism, this paper aims to propose a pension funding methodology which is fit for purpose and also reflects the latest guidance from the Pensions Regulator on integrated risk management.
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
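A hedged sketch of the linear regression comparison described above (education years regressed on zygosity, stratified by sex, with DZ as the reference category); the column names and data file are hypothetical.

```python
# Sketch: education years regressed on zygosity (DZ reference), by sex.
# Column names and the data file are hypothetical illustrations.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("twin_education.csv")  # hypothetical pooled dataset
for sex, sub in df.groupby("sex"):
    fit = smf.ols("education_years ~ C(zygosity, Treatment('DZ'))",
                  data=sub).fit()
    print(sex, fit.params.round(2).to_dict())  # MZ-DZ difference in years
    print(fit.conf_int())                      # 95% CIs
```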
Objectives: Treatments for childhood brain tumors (BT) confer substantial risks to neurological development and contribute to neuropsychological deficits in young adulthood. Evidence suggests that individuals who experience more significant neurological insult may lack insight into their neurocognitive limitations. The present study compared survivor, mother, and performance-based estimates of executive functioning (EF), and their associations with treatment intensity history, in a subsample of young adult survivors of childhood BTs. Methods: Thirty-four survivors (52.9% female), aged 18 to 30 years (M = 23.5, SD = 3.4), 16.1 years post-diagnosis (SD = 5.9), were administered self-report and performance-based EF measures. Mothers also rated survivor EF skills. Survivors were classified by treatment intensity history into Minimal, Average/Moderate, or Intensive/Most-Intensive groups. Discrepancies among survivor, mother, and performance-based EF estimates were compared. Results: Survivor-reported and performance-based measures were not correlated, although significant associations were found between mother-reported and performance measures. Survivors in the Intensive/Most-Intensive treatment group evidenced the greatest score discrepancies, reporting less executive dysfunction relative to mother reports, F(2,31) = 7.81, p < .01, and performance-based measures, F(14,50) = 2.54, p < .05. Conversely, survivors in the Minimal treatment group reported greater EF difficulties relative to mothers, t(8) = 2.82, p < .05, but not relative to performance-based estimates (ps > .05). Conclusions: There may be a lack of agreement among survivor, mother, and performance-based estimates of EF skills in young adult survivors of childhood BT, and these discrepancies may be associated with treatment intensity history. Neuropsychologists should use a multi-method, multi-reporter approach to assessment of EF in this population. Providers also should be aware of these discrepancies as they may be a barrier to intervention efforts. (JINS, 2016, 22, 900–910)
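One of the discrepancy analyses summarized above can be illustrated with a paired t-test; the data file and column names below are hypothetical, not the study's materials.

```python
# Hedged illustration: paired t-test between survivor- and mother-reported
# EF scores. Data file and column names are hypothetical.
import pandas as pd
from scipy.stats import ttest_rel

df = pd.read_csv("ef_ratings.csv")  # one row per survivor-mother dyad
t, p = ttest_rel(df["survivor_ef"], df["mother_ef"])
print(f"t = {t:.2f}, p = {p:.3f}")
```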