Prior trials suggest that intravenous racemic ketamine is highly effective for treatment-resistant depression (TRD), but phase 3 trials of racemic ketamine are needed.
Aims
To assess the acute efficacy and safety of a 4-week course of subcutaneous racemic ketamine in participants with TRD. Trial registration: ACTRN12616001096448 at www.anzctr.org.au.
Method
This phase 3, double-blind, randomised, active-controlled multicentre trial was conducted at seven mood disorders centres in Australia and New Zealand. Participants received twice-weekly subcutaneous racemic ketamine or midazolam for 4 weeks. Initially, the trial tested fixed-dose ketamine 0.5 mg/kg versus midazolam 0.025 mg/kg (cohort 1). After a Data Safety Monitoring Board recommendation, dosing was revised to flexible-dose ketamine 0.5–0.9 mg/kg or midazolam 0.025–0.045 mg/kg, with response-guided dosing increments (cohort 2). The primary outcome was remission (Montgomery–Åsberg Depression Rating Scale score ≤10) at the end of week 4.
Results
The final analysis (those who received at least one treatment) comprised 68 participants in cohort 1 (fixed dose) and 106 in cohort 2 (flexible dose). Ketamine was more efficacious than midazolam in cohort 2 (remission rate 19.6% v. 2.0%; OR = 12.1, 95% CI 2.1–69.2, P = 0.005) but not in cohort 1 (remission rate 6.3% v. 8.8%; OR = 1.3, 95% CI 0.2–8.2, P = 0.76). Ketamine was well tolerated. Acute adverse effects (psychotomimetic effects, blood pressure increases) resolved within 2 h.
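As a quick arithmetic check, the cohort 2 odds ratio can be reproduced from the reported remission rates alone. The sketch below uses only the two proportions (the abstract does not give group sizes), so the point estimate differs slightly from the published OR = 12.1 because the rates themselves are rounded.

```python
# Reproduce the cohort 2 odds ratio from the reported remission rates
# (19.6% ketamine vs. 2.0% midazolam). Rates only; group sizes unknown.
ketamine_rate = 0.196
midazolam_rate = 0.020

def odds(p):
    """Convert a proportion to odds."""
    return p / (1 - p)

odds_ratio = odds(ketamine_rate) / odds(midazolam_rate)
print(round(odds_ratio, 1))  # ~11.9, close to the reported OR = 12.1
```

The small discrepancy from 12.1 is expected: the published OR was computed from the raw counts, while the rates quoted in the abstract are rounded to one decimal place.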
Conclusions
Adequately dosed subcutaneous racemic ketamine was efficacious and safe in treating TRD over a 4-week treatment period. The subcutaneous route is practical and feasible.
The dynamics of interfaces in slow diffusion equations with strong absorption are studied. Asymptotic methods are used to give descriptions of the behaviour local to a comprehensive range of possible singular events that can occur in any evolution. These events are: when an interface changes its direction of propagation (reversing and anti-reversing), when an interface detaches from an absorbing obstacle (detaching), when two interfaces are formed by film rupture (touchdown) and when the solution undergoes extinction. Our account of extinction and self-similar reversing and anti-reversing is built upon previous work; results on non-self-similar reversing and anti-reversing and on the various types of detachment and touchdown are developed from scratch. In all cases, verification of the asymptotic results against numerical solutions to the full PDE is provided. Self-similar solutions, both of the full equation and of its asymptotic limits, play a central role in the analysis.
Background: Pain is a common symptom in adult-onset idiopathic dystonia (AOID). An appropriate tool to understand this symptom is needed to improve the care of patients with AOID. We developed a rating instrument for pain in AOID and validated it in cervical dystonia (CD). Methods: The Pain in Dystonia Scale (PIDS) was developed and validated in three phases: (1) international experts and participants generated and evaluated the preliminary items for content validity; (2) the PIDS was drafted and revised, followed by cognitive interviews to ensure suitability for self-administration; and (3) the clinimetric properties of the final PIDS were assessed in 85 participants. Results: The PIDS evaluates pain severity (by body part), functional impact and external modulating factors. It showed high test-retest reliability for the total score (0.9, p<0.001), intraclass correlation coefficients higher than 0.7 for all items and high internal consistency (Cronbach's alpha 0.9). Convergent validity analysis revealed a strong correlation between the PIDS severity score and the TWSTRS pain subscale (0.8, p<0.001), the Brief Pain Inventory short form (0.7, p<0.001) and the impact of pain on daily functioning (0.7, p<0.001). Conclusions: The PIDS is the first questionnaire developed specifically to evaluate pain in patients with AOID, with high-level clinimetric properties in people with CD.
This paper documents trends over the last two decades in retirement behavior and retirement income choices of participants in TIAA, a large and mature defined contribution plan. From 2000 to 2018, the average age at which TIAA participants stopped contributing to their accounts, which is a lower bound on their retirement age, rose by 1.2 years for female and 2.0 years for male participants. There is considerable variation in the elapsed time between the last contribution to and the first income draw from plan accounts. Only 40% of participants take an initial income payment within 48 months of their last contribution. Later retirement and lags between retirement and the first retirement income payout led to a growing fraction of participants reaching the required minimum distribution (RMD) age before starting income draws. Between 2000 and 2018, the fraction of first-time income recipients who took no income until their RMD rose from 10% to 52%, while the fraction of these recipients who selected a life-contingent annuitized payout stream declined from 61% to 18%. Among those who began receiving income before age 70, annuitization rates were significantly higher than among those who did so at older ages. Aggregating across all income-receiving beneficiaries at TIAA, not just new income recipients, the proportion with a life annuity as part of their payout strategy fell from 52% in 2008 to 31% in 2018. By comparison, the proportion of all income recipients taking an RMD payment rose from 16% to 29%. About one-fifth of retirees received more than one type of income; the most common pairing was an RMD and a life annuity. In the later years of our sample, the RMD was becoming the de facto default distribution option for newly retired TIAA participants.
Adolescence is a time of heightened vulnerability for both peer victimization (PV) and internalizing symptoms. While the positive association between them is well established, there is little understanding of the mechanisms underpinning this relationship. To address this gap, the current study aimed to investigate sleep hygiene and school night sleep duration as individual and sequential mediators of the relationship between PV and both depressive and social anxiety symptoms during pre- to mid-adolescence. The study drew upon a community sample of 528 Australian youth aged 10–12 years at baseline (Mage = 11.19, SD = .55; 51.1% boys) and data were collected over five annual measurement occasions. Direct and indirect longitudinal and bidirectional associations were examined using cross-lagged panel analysis. There was no evidence of sequential mediation through both sleep hygiene and sleep duration to depression and social anxiety. Instead, the findings show that sleep hygiene mediated the prospective association between PV and both depressive and social anxiety symptoms, and between PV and sleep duration. Overall, sleep hygiene represents a modifiable transdiagnostic factor that can be targeted to break the cycle of PV, inadequate sleep, and internalizing symptoms.
The impact of the coronavirus disease 2019 (COVID-19) pandemic on mental health is still being unravelled. It is important to identify which individuals are at greatest risk of worsening symptoms. This study aimed to examine changes in depression, anxiety and post-traumatic stress disorder (PTSD) symptoms using prospective and retrospective assessments of symptom change, and to identify and examine the effects of key risk factors.
Method
Online questionnaires were administered to 34 465 individuals (aged 16 years or above) in April/May 2020 in the UK, recruited from existing cohorts or via social media. Around one-third (n = 12 718) of included participants had prior diagnoses of depression or anxiety and had completed pre-pandemic mental health assessments (between September 2018 and February 2020), allowing prospective investigation of symptom change.
Results
Prospective symptom analyses showed small decreases in depression (PHQ-9: −0.43 points) and anxiety (Generalised Anxiety Disorder scale, 7 items (GAD-7): −0.33 points) and a small increase in PTSD symptoms (PCL-6: 0.22 points). Conversely, retrospective symptom analyses demonstrated significant large increases (PHQ-9: 2.40; GAD-7: 1.97), with 55% of participants reporting worsened mental health since the beginning of the pandemic on a global change rating. Across both prospective and retrospective measures of symptom change, worsening depression, anxiety and PTSD symptoms were associated with prior mental health diagnoses, female gender, young age and unemployed/student status.
Conclusions
We highlight the effect of prior mental health diagnoses on worsening mental health during the pandemic and confirm previously reported sociodemographic risk factors. Discrepancies between prospective and retrospective measures of changes in mental health may be related to recall bias-related underestimation of prior symptom severity.
Current dam discharge patterns in Noxon Rapids Reservoir reduce concentration and exposure times (CETs) of herbicides used for aquatic plant management. Herbicide applications during periods of low dam discharge may increase herbicide CETs and improve efficacy. Applications of rhodamine WT dye were monitored under peak (736 to 765 m3 s−1) and minimum (1.4 to 2.8 m3 s−1) dam discharge patterns to quantify water-exchange processes. Whole-plot dye half-life under minimal discharge was 33 h, a 15-fold increase compared with the dye treatment during peak discharge. Triclopyr concentrations measured during minimum discharge within the treated plot ranged from 214 ± 25 µg L−1 at 0 h after treatment (HAT) to 1,243 ± 36 µg L−1 at 48 HAT. Endothall concentrations measured during minimum discharge in the same plot ranged from 164 ± 78 µg L−1 at 0 HAT to 2,195 ± 1,043 µg L−1 at 48 HAT. Eurasian watermilfoil (Myriophyllum spicatum L.) occurrence in the treatment plot was 66%, 8%, and 14% during pretreatment, 5 wk after treatment (WAT), and 52 WAT, respectively. Myriophyllum spicatum occurrence in the nontreated plot was 68%, 71%, and 83% during pretreatment, 5 WAT, and 52 WAT, respectively. Curlyleaf pondweed (Potamogeton crispus L.) occurrence in the treatment plot was 29%, 0%, and 97% during pretreatment, 5 WAT, and 52 WAT, respectively. Potamogeton crispus occurrence in the nontreated plot increased from 24% at 0 WAT to 83% at 52 WAT. Native species richness declined from 3.3 species per point to 2.1 in the treatment plot in the year of treatment but returned to pretreatment numbers by 52 WAT. Native species richness did not change during the study in the nontreated reference plot. Herbicide applications during periods of low flow can increase CETs and improve control, whereas applications during times of high water flow shorten CETs and could reduce treatment efficacy.
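The reported 15-fold change in dye half-life maps directly onto a first-order decay rate, assuming dye dissipation is approximately exponential (an assumption of this sketch, not a claim from the study):

```python
import math

# First-order (exponential) dissipation: C(t) = C0 * exp(-k t),
# so the rate constant is k = ln(2) / t_half.
t_half_min_discharge = 33.0          # h, whole-plot half-life under minimum discharge
t_half_peak_discharge = 33.0 / 15.0  # h, implied by the reported 15-fold increase

k_min = math.log(2) / t_half_min_discharge
k_peak = math.log(2) / t_half_peak_discharge

print(round(t_half_peak_discharge, 1))  # ~2.2 h under peak discharge
print(round(k_peak / k_min, 1))         # rate constants differ by the same 15-fold factor
```

In other words, herbicide in the treated plot was being flushed roughly fifteen times faster under peak discharge, which is why low-flow application windows extend exposure times.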
Reinforcement learning has previously been applied to the problem of controlling a perched landing manoeuvre for a custom sweep-wing aircraft. Previous work showed that the use of domain randomisation to train with atmospheric disturbances improved the real-world performance of the controllers, leading to increased reward. This paper builds on the previous project, investigating enhancements and modifications to the learning process to further improve performance, and reduce final state error. These changes include modifying the observation by adding information about the airspeed to the standard aircraft state vector, employing further domain randomisation of the simulator, optimising the underlying RL algorithm and network structure, and changing to a continuous action space. Simulated investigations identified hyperparameter optimisation as achieving the most significant increase in reward performance. Several test cases were explored to identify the best combination of enhancements. Flight testing was performed, comparing a baseline model against some of the best performing test cases from simulation. Generally, test cases that performed better than the baseline in simulation also performed better in the real world. However, flight tests also identified limitations with the current numerical model. For some models, the chosen policy performs well in simulation yet stalls prematurely in reality, a problem known as the reality gap.
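The domain-randomisation idea described above can be illustrated with a minimal sketch (hypothetical code, not the authors'; all parameter names and ranges are invented): each training episode draws fresh disturbance settings so the policy never overfits a single simulated atmosphere.

```python
import random

# Illustrative per-episode domain randomisation for an atmospheric
# disturbance model. Names and ranges are invented for this sketch.
def sample_episode_conditions(rng):
    return {
        "mean_wind_mps": rng.uniform(0.0, 6.0),       # hypothetical range
        "gust_amplitude_mps": rng.uniform(0.0, 2.0),  # hypothetical range
        "sensor_noise_std": rng.uniform(0.0, 0.05),   # hypothetical range
    }

rng = random.Random(42)
conditions = [sample_episode_conditions(rng) for _ in range(3)]
for c in conditions:
    print(c)  # each training episode would use a different draw
```

Widening these sampling ranges is one way to narrow the reality gap noted at the end of the abstract, at the cost of a harder training problem.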
Primary care providers (PCPs) are expected to help patients with obesity to lose weight through behavior change counseling and patient-centered use of available weight management resources. Yet, many PCPs face knowledge gaps and clinical time constraints that hinder their ability to successfully support patients’ weight loss. Fortunately, a small and growing number of physicians are now certified in obesity medicine through the American Board of Obesity Medicine (ABOM) and can provide personalized and effective obesity treatment to individual patients. Little is known, however, about how to extend the expertise of ABOM-certified physicians to support PCPs and their many patients with obesity.
Aim:
To develop and pilot test an innovative care model – the Weight Navigation Program (WNP) – to integrate ABOM-certified physicians into primary care settings and to enhance the delivery of personalized, effective obesity care.
Methods:
Quality improvement program with an embedded, 12-month, single-arm pilot study. Patients with obesity and ≥1 weight-related co-morbidity may be referred to the WNP by PCPs. All patients seen within the WNP during the first 12 months of clinical operations will be compared to a matched cohort of patients from another primary care site. We will recruit a subset of WNP patients (n = 30) to participate in a remote weight monitoring pilot program, which will include surveys at 0, 6, and 12 months, qualitative interviews at 0 and 6 months, and use of an electronic health record (EHR)-based text messaging program for remote weight monitoring.
Discussion:
Obesity is a complex chronic condition that requires evidence-based, personalized, and longitudinal care. To deliver such care in general practice, the WNP leverages the expertise of ABOM-certified physicians, health system and community weight management resources, and EHR-based population health management tools. The WNP is an innovative model with the potential to be implemented, scaled, and sustained in diverse primary care settings.
Poor mental health is a state of psychological distress that is influenced by lifestyle factors such as sleep, diet and physical activity. Compulsivity is a transdiagnostic phenotype cutting across a range of mental illnesses, including obsessive–compulsive disorder and substance-related and addictive disorders, and is also influenced by lifestyle. Yet how lifestyle relates to compulsivity is presently unknown, and understanding this relationship is important for gaining insight into individual differences in mental health. We assessed (a) the relationships between compulsivity and diet quality, sleep quality and physical activity, and (b) whether psychological distress statistically contributes to these relationships.
Methods
We collected harmonized data on compulsivity, psychological distress, and lifestyle from two independent samples (Australian n = 880 and US n = 829). We used mediation analyses to investigate bidirectional relationships between compulsivity and lifestyle factors, and the role of psychological distress.
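The core mediation logic can be sketched with simulated data (illustrative only; these are not the study's models or data): the indirect effect of a predictor X on an outcome Y through a mediator M is the product of the X→M slope (a) and the M→Y slope controlling for X (b).

```python
import random

# Simulate data with a known indirect effect a*b = 0.5 * 0.4 = 0.20.
random.seed(0)
n = 1000
x = [random.gauss(0, 1) for _ in range(n)]
m = [0.5 * xi + random.gauss(0, 1) for xi in x]                          # a = 0.5
y = [0.4 * mi + 0.2 * xi + random.gauss(0, 1) for xi, mi in zip(x, m)]   # b = 0.4

def cov(u, v):
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    return sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v)) / len(u)

# a: slope of the simple regression m ~ x
a = cov(x, m) / cov(x, x)

# b: slope on m in the regression y ~ m + x (2x2 normal equations)
smm, sxx, smx = cov(m, m), cov(x, x), cov(m, x)
smy, sxy = cov(m, y), cov(x, y)
det = smm * sxx - smx * smx
b = (smy * sxx - sxy * smx) / det

indirect = a * b
print(round(indirect, 2))  # near the true indirect effect of 0.20
```

Real mediation analyses add significance testing (e.g. bootstrapped confidence intervals for a*b), but the decomposition itself is as above.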
Results
Higher compulsivity was significantly related to poorer diet and sleep. Psychological distress statistically mediated the relationship between poorer sleep quality and higher compulsivity, and partially statistically mediated the relationship between poorer diet and higher compulsivity.
Conclusions
Lifestyle interventions in compulsivity may target psychological distress in the first instance, followed by sleep and diet quality. As psychological distress links aspects of lifestyle and compulsivity, focusing on mitigating and managing distress may offer a useful therapeutic approach to improve physical and mental health. Future research may focus on the specific sleep and diet patterns which may alter compulsivity over time to inform lifestyle targets for prevention and treatment of functionally impairing compulsive behaviors.
This paper presents the current state of mathematical modelling of the electrochemical behaviour of lithium-ion batteries (LIBs) as they are charged and discharged. It reviews the models developed by Newman and co-workers, both in the cases of dilute and moderately concentrated electrolytes and indicates the modelling assumptions required for their development. Particular attention is paid to the interface conditions imposed between the electrolyte and the active electrode material; necessary conditions are derived for one of these, the Butler–Volmer relation, in order to ensure physically realistic solutions. Insight into the origin of the differences between various models found in the literature is revealed by considering formulations obtained by using different measures of the electric potential. Materials commonly used for electrodes in LIBs are considered and the various mathematical models used to describe lithium transport in them discussed. The problem of upscaling from models of behaviour at the single electrode particle scale to the cell scale is addressed using homogenisation techniques resulting in the pseudo-2D model commonly used to describe charge transport and discharge behaviour in lithium-ion cells. Numerical solution to this model is discussed and illustrative results for a common device are computed.
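For reference, the Butler–Volmer relation mentioned above has the standard form (conventional notation; the paper's own symbols may differ):

```latex
j = j_0 \left[ \exp\!\left( \frac{\alpha_a F \eta}{RT} \right)
             - \exp\!\left( -\frac{\alpha_c F \eta}{RT} \right) \right]
```

where j_0 is the exchange current density, η the surface overpotential, α_a and α_c the anodic and cathodic transfer coefficients, F Faraday's constant, R the gas constant and T the temperature. The conditions derived in the paper constrain this interface relation so that solutions remain physically realistic.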
This project will work closely with existing service partners involved in street-level services and focus on testing and evaluating three approaches to street-level intervention for youth who are homeless and who have severe or moderate mental illness. Youth will be asked to choose their preferred service approach:
1. Housing First-related initiatives, focused on interventions designed to move youth into appropriate and available housing and on ongoing housing supports;
2. Treatment First initiatives, providing mental health/addiction supports and treatment solutions; and
3. simultaneous attention to both: Housing and Treatment Together.
Our primary objective is to understand the service delivery preferences of homeless youth and understand the outcomes of these choices. Our research questions include:
1. Which approaches to service are chosen by youth?
2. What are the differences and similarities between groups choosing each approach?
3. What are the critical ingredients needed to effectively implement services for homeless youth from the perspectives of youth, families and service providers?
Focus groups with staff and family members will be held to assist in understanding the nature of each service approach, changes that evolve within services, and facilitators and barriers to service delivery. This work will be important in determining which approach is chosen by youth and why. Evaluating the outcomes associated with each choice will provide valuable information about the service options chosen by youth. This will assist in identifying weaknesses in the services offered and inform further development of treatment options that youth will accept.
There is increasing evidence for a neurobiological basis of antisocial personality disorder (ASPD), including genetic liability, aberrant serotonergic function, neuropsychological deficits, and structural and functional brain abnormalities. However, few functional brain imaging studies have been conducted using tasks of clinically relevant functions such as impulse control and reinforcement processing. Here we report a study investigating the neural basis of behavioural inhibition and reward sensitivity in ASPD using functional magnetic resonance imaging (fMRI).
Methods
17 medication-free male individuals with DSM-IV ASPD and 14 healthy controls were included. All subjects were screened for Axis I pathology and substance misuse. Scanner tasks included two block-design tasks: a Go/No-Go task and a reward task. Scanning was carried out on a 1.5T Philips system. Whole-brain coverage was achieved using 40 axial slices with 3.5 mm spacing and a TR of 5 s. Data were analysed in SPM5 using random-effects models.
Results
Results of the Go/No-Go task confirmed brain activation previously described in the processing of impulse inhibition, namely in the orbitofrontal and dorsolateral prefrontal cortex and the anterior cingulate, and these activations were enhanced in the ASPD group. The reward task was associated with BOLD response changes in the reward network in both groups. However, these BOLD responses were reduced in the ASPD group, particularly in prefrontal areas.
Conclusions
Our results further support the notion of prefrontal dysfunction in ASPD. However, contrary to previous studies suggesting “hypofrontality” in this disorder, we found task specific increased and decreased BOLD responses.
Gene × environment (G × E) interactions in eating pathology have been increasingly investigated; however, studies have been limited in sample size by the difficulty of obtaining genetic data.
Objective
To synthesize existing G × E research in the eating disorders (ED) field and provide a clear picture of the current state of knowledge with analyses of larger samples.
Method
Complete data from seven studies investigating community (n = 1750, 64.5% female) and clinical (n = 426, 100% female) populations, identified via systematic review, were included. Data were combined to perform five analyses: 5-HTTLPR × Traumatic Life Events (0–17 events) to predict ED status (n = 909), 5-HTTLPR × Sexual and Physical Abuse (n = 1097) to predict bulimic symptoms, 5-HTTLPR × Depression to predict bulimic symptoms (n = 1256), and 5-HTTLPR × Impulsivity to predict disordered eating (n = 1149).
Results
The low function (s) allele of 5-HTTLPR interacted with number of traumatic life events (P < .01) and sexual and physical abuse (P < .05) to predict increased likelihood of an ED in females but not males (Fig. 1). No other G × E interactions were significant, possibly due to the medium to low compatibility between datasets (Fig. 1).
Conclusion
Early promising results suggest that increased knowledge of G × E interactions could be achieved if studies increased uniformity of measuring ED and environmental variables, allowing for continued collaboration to overcome the restrictions of obtaining genetic samples.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Feed represents a substantial proportion of production costs in the dairy industry and is a useful target for improving overall system efficiency and sustainability. The objective of this study was to develop methodology to estimate the economic value for a feed efficiency trait and the associated methane production relevant to Canada. The approach quantifies the level of economic savings achieved by selecting animals that convert consumed feed into product while minimizing the feed energy used for inefficient metabolism, maintenance and digestion. We define a selection criterion trait called Feed Performance (FP) as a 1 kg increase in more efficiently used feed in a first parity lactating cow. The impact of a change in this trait on the total lifetime value of more efficiently used feed via correlated selection responses in other life stages is then quantified. The resulting improved conversion of feed was also applied to determine the resulting reduction in output of emissions (and their relative value based on a national emissions value) under an assumption of constant methane yield, where methane yield is defined as kg methane/kg dry matter intake (DMI). Overall, increasing the FP estimated breeding value by one unit (i.e. 1 kg of more efficiently converted DMI during the cow’s first lactation) translates to a total lifetime saving of 3.23 kg in DMI and 0.055 kg in methane with the economic values of CAD $0.82 and CAD $0.07, respectively. Therefore, the estimated total economic value for FP is CAD $0.89/unit. The proposed model is robust and could also be applied to determine the economic value for feed efficiency traits within a selection index in other production systems and countries.
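The abstract's bottom-line figure can be checked with simple arithmetic; the per-kg prices below are derived from the reported totals rather than stated in the abstract.

```python
# Reproduce the lifetime economic value of a one-unit increase in the
# Feed Performance (FP) EBV from the components reported in the abstract.
dmi_saved_kg = 3.23        # lifetime DMI saving, kg
dmi_value_cad = 0.82       # CAD value of that feed saving
methane_saved_kg = 0.055   # lifetime methane reduction, kg
methane_value_cad = 0.07   # CAD value of that reduction

total_value = dmi_value_cad + methane_value_cad
print(round(total_value, 2))  # CAD $0.89 per unit of FP, as reported

# Implied per-kg values (derived here, not stated in the abstract):
print(round(dmi_value_cad / dmi_saved_kg, 2))          # ~CAD $0.25 per kg DMI
print(round(methane_value_cad / methane_saved_kg, 2))  # ~CAD $1.27 per kg CH4
```

The same decomposition (feed saving plus emissions value, each priced separately) is what makes the model portable to other production systems and national emissions prices.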
We evaluated the impact of an electronic health record based 72-hour antimicrobial time-out (ATO) on antimicrobial utilization. We observed that 6 hours after the ATO, 21% of empiric antimicrobials were discontinued or de-escalated. There was a significant reduction in the duration of antimicrobial therapy but no impact on overall antimicrobial usage metrics.
Introduction: Although use of point of care ultrasound (PoCUS) protocols for patients with undifferentiated hypotension in the Emergency Department (ED) is widespread, our previously reported SHoC-ED study showed no clear survival or length of stay benefit for patients assessed with PoCUS. In this analysis, we examine whether the use of PoCUS changed fluid administration and rates of other emergency interventions between patients with different shock types. The primary comparison was between cardiogenic and non-cardiogenic shock types. Methods: A post-hoc analysis was completed on the database from an RCT of 273 patients who presented to the ED with undifferentiated hypotension (SBP <100 or shock index > 1) and who had been randomized to receive standard care with or without PoCUS in 6 centres in Canada and South Africa. PoCUS-trained physicians performed scans after initial assessment. Shock categories and diagnoses recorded at 60 minutes after ED presentation were used to allocate patients into subcategories of shock for analysis of treatment. We analyzed actual care delivered, including initial IV fluid bolus volumes (mL), rates of inotrope use and major procedures. Standard statistical tests were employed. Sample size was powered at 0.80 (α:0.05) for a moderate difference. Results: Although there were expected differences in the mean fluid bolus volume between patients with non-cardiogenic and cardiogenic shock, there was no difference in fluid bolus volume between the control and PoCUS groups (non-cardiogenic control 1878 mL (95% CI 1550 – 2206 mL) vs. non-cardiogenic PoCUS 1687 mL (1458 – 1916 mL); and cardiogenic control 768 mL (194 – 1341 mL) vs. cardiogenic PoCUS 981 mL (341 – 1620 mL)). Likewise, there were no differences in rates of inotrope administration or major procedures for any of the subcategories of shock between the control group and PoCUS group patients. The most common subcategory of shock was distributive.
Conclusion: Despite differences in care delivered by subcategory of shock, we did not find any significant difference in actual care delivered between patients who were examined using PoCUS and those who were not. This may help to explain the previously reported lack of outcome difference between groups.
Introduction: Point of care ultrasound has been reported to improve diagnosis in non-traumatic hypotensive ED patients. We compared the diagnostic performance of physicians with and without PoCUS in undifferentiated hypotensive patients as part of an international prospective randomized controlled study. The primary outcome was diagnostic performance of PoCUS for cardiogenic vs. non-cardiogenic shock. Methods: SHoC-ED recruited hypotensive patients (SBP < 100 mmHg or shock index > 1) in 6 centres in Canada and South Africa. We describe previously unreported secondary outcomes relating to diagnostic accuracy. Patients were randomized to standard clinical assessment (no PoCUS) or PoCUS groups. PoCUS-trained physicians performed scans after initial assessment. Demographics, clinical details and findings were collected prospectively. Initial and secondary diagnoses, including shock category, were recorded at 0 and 60 minutes. Final diagnosis was determined by independent blinded chart review. Standard statistical tests were employed. Sample size was powered at 0.80 (α:0.05) for a moderate difference. Results: 273 patients were enrolled, with follow-up for the primary outcome completed for 270. Baseline demographics and perceived category of shock were similar between groups. 11% of patients were determined to have cardiogenic shock. PoCUS had a sensitivity of 80.0% (95% CI 54.8 to 93.0%), specificity 95.5% (90.0 to 98.1%), LR+ve 17.9 (7.34 to 43.8), LR-ve 0.21 (0.08 to 0.58), diagnostic OR 85.6 (18.2 to 403.6) and accuracy 93.7% (88.0 to 97.2%) for cardiogenic shock. Standard assessment without PoCUS had a sensitivity of 91.7% (64.6 to 98.5%), specificity 93.8% (87.8 to 97.0%), LR+ve 14.8 (7.1 to 30.9), LR-ve 0.09 (0.01 to 0.58), diagnostic OR 166.6 (18.7 to 1481) and accuracy 93.6% (87.8 to 97.2%). There was no significant difference in sensitivity (-11.7% (-37.8 to 18.3%)) or specificity (1.73% (-4.67 to 8.29%)).
Diagnostic performance was also similar between other shock subcategories. Conclusion: As reported in other studies, PoCUS-based assessment performed well diagnostically in undifferentiated hypotensive patients, especially as a rule-in test. However, performance was similar to standard (non-PoCUS) assessment, which was excellent in this study.
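The diagnostic metrics reported above all derive from a 2×2 table. The sketch below uses illustrative counts chosen to approximately match the published PoCUS figures; the abstract does not report the underlying table, so these counts are assumptions, not the study data.

```python
# Standard diagnostic metrics from a 2x2 table. Counts are illustrative
# guesses chosen to approximate the reported PoCUS results
# (sensitivity 80.0%, specificity 95.5%); not the actual study table.
tp, fn = 12, 3    # cardiogenic shock: test positive / test negative (assumed)
fp, tn = 6, 127   # non-cardiogenic: test positive / test negative (assumed)

sens = tp / (tp + fn)
spec = tn / (tn + fp)
lr_pos = sens / (1 - spec)        # positive likelihood ratio
lr_neg = (1 - sens) / spec        # negative likelihood ratio
dor = lr_pos / lr_neg             # diagnostic odds ratio

print(round(sens, 3), round(spec, 3))      # 0.8 0.955
print(round(lr_pos, 1), round(lr_neg, 2))  # 17.7 0.21
print(round(dor, 1))                       # ~84.7, near the reported DOR of 85.6
```

A positive likelihood ratio near 18 is what makes PoCUS a strong rule-in test for cardiogenic shock, as the conclusion notes.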