There has been rapidly growing interest in understanding the pharmaceutical and clinical properties of psychedelic and dissociative drugs, with a particular focus on ketamine. This compound, long known for its anesthetic and dissociative properties, has garnered attention due to its potential to rapidly alleviate symptoms of depression, especially in individuals with treatment-resistant depression (TRD) or acute suicidal ideation or behavior. However, while ketamine’s psychopharmacological effects are increasingly well documented, the specific patterns of its neural impact are still being mapped, and basic questions remain about its effects on functional activation in both clinical and healthy populations.
Objectives
This meta-analysis seeks to contribute to the evolving landscape of neuroscience research on dissociative drugs such as ketamine by comprehensively examining the effects of acute ketamine administration on neural activation, as measured by functional magnetic resonance imaging (fMRI), in healthy participants.
Methods
We conducted a meta-analysis of existing fMRI activation studies of ketamine using multilevel kernel density analysis (MKDA). Following a comprehensive PubMed search, we quantitatively synthesized all published primary fMRI whole-brain activation studies of the effects of ketamine in healthy subjects with no overlapping samples (N = 18 studies). This approach also incorporated ensemble thresholding (α = 0.05–0.0001) to minimize cluster-size detection bias and Monte Carlo simulations to correct for multiple comparisons.
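For illustration, the following is a minimal, hypothetical sketch of Monte Carlo cluster-extent correction in the spirit of MKDA-style thresholding. The mask dimensions, kernel width, focus count, and density threshold are all invented for the example and are not parameters from this meta-analysis.

```python
# Minimal, hypothetical sketch of Monte Carlo cluster-extent FWE correction in
# the spirit of MKDA thresholding. Mask shape, kernel width, focus count, and
# the density threshold are invented for illustration, not study parameters.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
mask_shape = (40, 48, 40)        # coarse stand-in for a brain mask
n_foci, n_sims, alpha = 250, 1000, 0.05
threshold = 0.005                # arbitrary density threshold

def density_map(foci_idx):
    """Convolve randomly placed activation foci with a Gaussian kernel."""
    vol = np.zeros(mask_shape)
    vol[tuple(foci_idx.T)] = 1.0
    return ndimage.gaussian_filter(vol, sigma=2.0)

def max_cluster_size(density):
    """Largest suprathreshold cluster, in voxels."""
    labels, n = ndimage.label(density > threshold)
    return 0 if n == 0 else np.bincount(labels.ravel())[1:].max()

# Null distribution: foci scattered uniformly over the volume
null_max = np.array([
    max_cluster_size(density_map(rng.integers(0, mask_shape, size=(n_foci, 3))))
    for _ in range(n_sims)
])
critical_extent = np.quantile(null_max, 1 - alpha)
print(f"clusters larger than {critical_extent:.0f} voxels survive FWE at {alpha}")
```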
Results
Our meta-analysis revealed statistically significant (p < 0.05–0.0001; FWE-corrected) alterations in neural activation in multiple cortical and subcortical regions following the administration of ketamine to healthy participants (N = 306).
Conclusions
These results offer valuable insights into the functional neuroanatomical effects of acute ketamine administration. These findings may also inform the development of therapeutic applications of ketamine for various psychiatric and neurological conditions. Future studies should investigate the neural effects of ketamine administration, including both short-term and long-term effects, in clinical populations and their relation to clinical and functional improvements.
Surface ozone is an air pollutant that contributes to hundreds of thousands of premature deaths annually. Accurate short-term ozone forecasts may allow improved policy actions to reduce the risk to human health. However, forecasting surface ozone is a difficult problem as its concentrations are controlled by a number of physical and chemical processes that act on varying timescales. We implement a state-of-the-art transformer-based model, the temporal fusion transformer, trained on observational data from three European countries. In four-day forecasts of daily maximum 8-hour ozone (DMA8), our novel approach is highly skillful (MAE = 4.9 ppb, coefficient of determination $ {\mathrm{R}}^2=0.81 $) and generalizes well to data from 13 other European countries unseen during training (MAE = 5.0 ppb, $ {\mathrm{R}}^2=0.78 $). The model outperforms other machine learning models on our data (ridge regression, random forests, and long short-term memory networks) and compares favorably to the performance of other published deep learning architectures tested on different data. Furthermore, we illustrate that the model pays attention to physical variables known to control ozone concentrations and that the attention mechanism allows the model to use the most relevant days of past ozone concentrations to make accurate forecasts on test data. The skillful performance of the model, particularly in generalizing to unseen European countries, suggests that machine learning methods may provide a computationally cheap approach for accurate air quality forecasting across Europe.
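As a point of reference, the two skill scores quoted above can be computed as follows; the arrays here are synthetic stand-ins, not the study's ozone observations or transformer forecasts.

```python
# Synthetic stand-in for the skill metrics quoted above (MAE in ppb and R^2)
# over a four-day DMA8 forecast horizon; not the study's data or model output.
import numpy as np
from sklearn.metrics import mean_absolute_error, r2_score

rng = np.random.default_rng(42)
y_true = rng.uniform(20, 80, size=(1000, 4))           # observed DMA8, ppb
y_pred = y_true + rng.normal(0, 6, size=y_true.shape)  # imperfect forecasts

mae = mean_absolute_error(y_true.ravel(), y_pred.ravel())
r2 = r2_score(y_true.ravel(), y_pred.ravel())
print(f"MAE = {mae:.1f} ppb, R^2 = {r2:.2f}")
```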
People who inject drugs are at risk of acute bacterial and fungal injecting-related infections. There is evidence that the incidence of hospitalizations for injecting-related infections is increasing in several countries, but little is known at the individual level. We aimed to examine injecting-related infections in a linked longitudinal cohort of people who inject drugs in Melbourne, Australia. A retrospective descriptive analysis was conducted to estimate the prevalence and incidence of injecting-related infections using administrative emergency department and hospital separation datasets linked to the SuperMIX cohort, from 2008 to 2018. Over the study period, 33% (95% CI: 31–36%) of participants presented to an emergency department with an injecting-related infection and 27% (95% CI: 25–30%) were admitted to hospital. Of 1,044 emergency department presentations and 740 hospital separations, skin and soft tissue infections were the most common (88% and 76%, respectively). From 2008 to 2018, there was a substantial increase in emergency department presentations and hospital separations with any injecting-related infection, from 48 to 135 per 1,000 person-years and from 18 to 102 per 1,000 person-years, respectively. The results emphasize that injecting-related infections are increasing, and that new models of care are needed to help prevent, and facilitate early detection of, superficial infections to avoid potentially life-threatening severe infections.
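For readers unfamiliar with the rate unit used above, here is a toy example of an incidence rate expressed per 1,000 person-years; all numbers are hypothetical, not SuperMIX linkage data.

```python
# Toy example of the rate unit used above: events per 1,000 person-years,
# with person-time accumulated from individual follow-up. Numbers are
# hypothetical, not SuperMIX linkage data.
follow_up_years = [2.5, 4.0, 1.2, 3.8, 0.9]  # per-participant follow-up
events = [1, 3, 0, 2, 1]                     # ED presentations per participant

rate = sum(events) / sum(follow_up_years) * 1000
print(f"{rate:.0f} presentations per 1,000 person-years")
```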
Recruiting persons with dementia for clinical trials can be challenging. Building on a guide initially developed to assist primary-care-based memory clinics in their efforts to support research, a key stakeholder working group meeting was held to develop a standardized research recruitment process, with input from patients, care partners, researchers, and clinicians. Discussions in this half-day facilitated meeting focused on the wishes and needs of patients and care partners, policies and procedures for researchers, information provided to patients, and considerations for memory clinics. Patients and care partners valued the opportunity to contribute to science and provided important insights on how best to facilitate recruitment. Discussions regarding proposed processes and procedures for research recruitment highlighted the need for a new, patient-driven approach. Accordingly, a “Memory Clinic Research Match” program was co-designed with key stakeholders; it has the potential to overcome existing barriers and to increase recruitment for dementia-related research.
The COVID-19 pandemic and mitigation measures are likely to have a marked effect on mental health. It is important to use longitudinal data to improve inferences.
Aims
To quantify the prevalence of depression, anxiety and mental well-being before and during the COVID-19 pandemic, and to identify groups at risk of depression and/or anxiety during the pandemic.
Method
Data were from the Avon Longitudinal Study of Parents and Children (ALSPAC) index generation (n = 2850, mean age 28 years) and parent generation (n = 3720, mean age 59 years), and Generation Scotland (n = 4233, mean age 59 years). Depression was measured with the Short Mood and Feelings Questionnaire in ALSPAC and the Patient Health Questionnaire-9 in Generation Scotland. Anxiety and mental well-being were measured with the Generalised Anxiety Disorder Assessment-7 and the Short Warwick Edinburgh Mental Wellbeing Scale, respectively.
Results
Depression during the pandemic was similar to pre-pandemic levels in the ALSPAC index generation, but the proportion experiencing anxiety had almost doubled, to 24% (95% CI 23–26%) from a pre-pandemic level of 13% (95% CI 12–14%). In both studies, anxiety and depression during the pandemic were greater in younger members, women, those with pre-existing mental/physical health conditions and individuals in socioeconomic adversity, even when controlling for pre-pandemic anxiety and depression.
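As a sketch of how prevalence estimates of this form are computed, assuming made-up counts rather than ALSPAC or Generation Scotland data:

```python
# Sketch of a prevalence estimate with a Wilson 95% CI, matching the style of
# the figures above (e.g. "24% (95% CI 23-26%)"). Counts are made up, not
# ALSPAC or Generation Scotland data.
from statsmodels.stats.proportion import proportion_confint

k, n = 684, 2850   # hypothetical: 684 of 2850 screened positive for anxiety
lo, hi = proportion_confint(k, n, alpha=0.05, method="wilson")
print(f"prevalence = {k / n:.0%} (95% CI {lo:.0%}-{hi:.0%})")
```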
Conclusions
These results provide evidence for increased anxiety in young people that is coincident with the pandemic. Specific groups are at elevated risk of depression and anxiety during the COVID-19 pandemic. These findings are important for planning current mental health provision and for anticipating long-term impacts beyond this pandemic.
Studies suggest that alcohol consumption and alcohol use disorders have distinct genetic backgrounds.
Methods
We examined whether polygenic risk scores (PRS) for consumption and problem subscales of the Alcohol Use Disorders Identification Test (AUDIT-C, AUDIT-P) in the UK Biobank (UKB; N = 121 630) correlate with alcohol outcomes in four independent samples: an ascertained cohort, the Collaborative Study on the Genetics of Alcoholism (COGA; N = 6850), and population-based cohorts: Avon Longitudinal Study of Parents and Children (ALSPAC; N = 5911), Generation Scotland (GS; N = 17 461), and an independent subset of UKB (N = 245 947). Regression models and survival analyses tested whether the PRS were associated with the alcohol-related outcomes.
Results
In COGA, AUDIT-P PRS was associated with alcohol dependence, AUD symptom count, and maximum drinks (R² = 0.47–0.68%, p = 2.0 × 10⁻⁸–1.0 × 10⁻¹⁰), and with increased likelihood of onset of alcohol dependence (hazard ratio = 1.15, p = 4.7 × 10⁻⁸); AUDIT-C PRS was not an independent predictor of any phenotype. In ALSPAC, the AUDIT-C PRS was associated with alcohol dependence (R² = 0.96%, p = 4.8 × 10⁻⁶). In GS, AUDIT-C PRS was a better predictor of weekly alcohol use (R² = 0.27%, p = 5.5 × 10⁻¹¹), while AUDIT-P PRS was more associated with problem drinking (R² = 0.40%, p = 9.0 × 10⁻⁷). Lastly, AUDIT-P PRS was associated with ICD-based alcohol-related disorders in the UKB subset (R² = 0.18%, p < 2.0 × 10⁻¹⁶).
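A minimal sketch of the kind of PRS association test reported above, i.e., the incremental variance explained (ΔR²) when a polygenic score is added to a covariates-only model; the data are simulated and the effect sizes arbitrary.

```python
# Illustrative sketch of a PRS association test: incremental variance
# explained (delta R^2) when a polygenic score is added to a covariates-only
# linear model. All data here are simulated, not cohort data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000
covars = rng.normal(size=(n, 4))   # e.g. age, sex, ancestry PCs (stand-ins)
prs = rng.normal(size=n)
pheno = covars @ np.array([0.2, -0.1, 0.05, 0.0]) + 0.08 * prs + rng.normal(size=n)

base = sm.OLS(pheno, sm.add_constant(covars)).fit()
full = sm.OLS(pheno, sm.add_constant(np.column_stack([covars, prs]))).fit()
print(f"delta R^2 = {full.rsquared - base.rsquared:.4f}, "
      f"PRS p = {full.pvalues[-1]:.2e}")
```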
Conclusions
AUDIT-P PRS was associated with a range of alcohol-related phenotypes across population-based and ascertained cohorts, while AUDIT-C PRS showed less utility in the ascertained cohort. We show that AUDIT-P is genetically correlated with both use and misuse and demonstrate the influence of ascertainment schemes on PRS analyses.
Background: Measurement of cognitive behavioural therapy (CBT) competency is often resource intensive. A popular emerging alternative to independent observers’ ratings is using other perspectives for rating competency. Aims: This pilot study compared ratings of CBT competency from four perspectives (patient, therapist, supervisor and independent observer) using the Cognitive Therapy Scale (CTS). Method: Patients (n = 12, 75% female, mean age 30.5 years) and therapists (n = 5, all female, mean age 26.6 years) completed the CTS after therapy sessions, and the clinical supervisor and independent observers rated recordings of the same sessions. Results: Analyses of variance revealed that therapist average CTS competency ratings did not differ from supervisor ratings, and supervisor ratings did not differ from independent observer ratings; however, therapist ratings were higher than independent observer ratings, and patient ratings were higher than those of all other raters. Conclusions: Raters differed in competency ratings. Implications for potential use and adaptation of CBT competency measurement methods to enhance training and implementation are discussed.
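As a toy illustration of the rater comparison, here is a one-way analysis of variance across the four perspectives. The study involved repeated ratings of the same sessions, so treat this independent-groups version as a simplification; all ratings are fabricated.

```python
# Toy one-way ANOVA across four rater perspectives. Ratings are fabricated;
# the study used the Cognitive Therapy Scale (CTS) on real sessions and a
# repeated-measures design, which this simplification ignores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
patient    = rng.normal(50, 5, 12)   # patients rated highest in the study
therapist  = rng.normal(44, 5, 12)
supervisor = rng.normal(42, 5, 12)
observer   = rng.normal(40, 5, 12)

f, p = stats.f_oneway(patient, therapist, supervisor, observer)
print(f"F = {f:.2f}, p = {p:.3f}")
```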
Field experiments were conducted in 1991, 1992, and 1993 to evaluate the dissipation and carryover potential of atrazine from starch-encapsulated (SE) and commercial formulations (CF). Formulation was not a significant factor in atrazine dissipation at any application rate. The dissipation time required to reach one-half of the original concentration (DT50) was measured for each formulation. The atrazine DT50 combined over all years (1991 to 1993) and rates (1.1 to 3.4 kg ai ha−1) was 7 wk for the CF, 10.3 wk for the SE large particles (0.85 to 1.4 mm), and 8.2 wk for the SE small particles (0.425 to 0.85 mm). Oat injury in the spring of 1992 from all rates of both SE formulations applied in 1991 was greater than that from the CF. The increased oat injury from the SE formulations was attributed to more atrazine being present in the top 0 to 8 cm of soil than for the CF. Despite significant oat injury from the 1991 application, no injury was observed on soybeans planted in 1992. Soybeans planted in 1993 and 1994 also showed no injury from the respective applications. These findings suggest that the potential for atrazine carryover from starch-encapsulated formulations was not greater than that from the commercial formulation.
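A worked sketch of how a DT50 is typically derived, assuming first-order dissipation C(t) = C0·exp(−kt) so that DT50 = ln(2)/k; the residue values below are invented, not the field data behind the 7 to 10.3 wk figures.

```python
# Worked sketch of a DT50: fit a first-order decay C(t) = C0 * exp(-k t) to
# residue data, then DT50 = ln(2) / k. Sample points below are invented.
import numpy as np
from scipy.optimize import curve_fit

weeks = np.array([0, 2, 4, 8, 12, 16], dtype=float)
conc = np.array([100, 82, 67, 45, 30, 20], dtype=float)  # % of applied

def first_order(t, c0, k):
    return c0 * np.exp(-k * t)

(c0, k), _ = curve_fit(first_order, weeks, conc, p0=(100, 0.1))
dt50 = np.log(2) / k
print(f"k = {k:.3f} per wk, DT50 = {dt50:.1f} wk")
```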
The objective of this study was to evaluate growth and seed production of giant foxtail under different N sources (NO3 and NH4) and N fertilizer application rates. Nitrate and NH4 fertilizers plus a nitrification inhibitor were applied at 56, 112, and 225 kg N ha−1 under field conditions, and in the greenhouse four N rates (1, 5, 10, and 25 mM N) were applied in three NO3 : NH4 ratios (100 : 0, 50 : 50, 0 : 100). Growth of giant foxtail was affected by N rate under both greenhouse and field conditions. In 1993, abundant rainfall in May and June allowed rapid and earlier uptake of N by giant foxtail, resulting in larger plants with greater N accumulation and higher numbers of heads and seeds than in 1994. Total dry weight increased with increasing N rate; however, seed production reached a maximum at approximately 150 kg N ha−1. Nitrogen translocation efficiency decreased with increasing N rate. Giant foxtail did not show any preference for N form; however, seed production was reduced when the high N rate was applied as NH4 compared with NO3. These results suggest that NH4 fertilizer applications with a long-term nitrification inhibitor could reduce the seed production of giant foxtail and its contribution to the soil seedbank for subsequent growing seasons.
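As an illustration of locating such a maximum, one can fit a quadratic to seed production versus N rate and take the vertex at −b/(2a); the response values here are hypothetical, chosen only to peak near the reported rate.

```python
# Hedged illustration of locating the N rate that maximizes seed production:
# fit a quadratic to seed counts vs. N rate and take the vertex at -b/(2a).
# Response values are hypothetical; the abstract reports a maximum near
# 150 kg N/ha.
import numpy as np

n_rate = np.array([0, 56, 112, 150, 225], dtype=float)  # kg N/ha
seeds = np.array([2.0, 5.5, 8.0, 8.6, 7.2])             # relative units

a, b, c = np.polyfit(n_rate, seeds, 2)
print(f"seed production peaks near {-b / (2 * a):.0f} kg N/ha")
```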
Field experiments were conducted in 1990, 1991, and 1992 to evaluate and compare the efficacy of commercial herbicide formulations with starch-encapsulated granules containing one, two, or three herbicides. Atrazine in combination with alachlor or metolachlor composed the two-component granules, and the addition of dicamba to both composed the three-component granules. All starch-encapsulated formulations were produced by twin-screw extrusion technology and evaluated in two granule sizes, 0.5 to 1.4 mm (14 to 20 mesh) and 0.43 to 0.85 mm (20 to 40 mesh). Active ingredient rates were selected for the existing soil conditions, and combination granules contained active ingredients proportional to premixed commercial formulations available or suggested for tank mixes. Evaluations were performed under conventional, chisel, and no-tillage systems. PRE and PPI applications of the starch-encapsulated two-component (atrazine-alachlor) granules in 1990 gave excellent control of giant foxtail (except at the low rate with the large granule size), redroot pigweed, and common lambsquarters, but control of velvetleaf was fair to poor. Smaller granules were generally more effective for controlling weeds than larger granules. In 1991, starch-encapsulated two-component (atrazine-metolachlor) granules applied both PRE and PPI in conventional, chisel, and no-till systems gave results similar to 1990, with the small granules more effective on velvetleaf. The addition of dicamba to form three-component starch-encapsulated granules in 1992 resulted in control of velvetleaf, ivyleaf morningglory, and jimsonweed statistically equal to commercial formulations except in one case of no-till corn. In our experiments, herbicide formulation (granular vs. commercial) had no significant effect on corn yield in 28 out of 31 treatments. These data indicate that if the experimental three-component starch-encapsulated formulations of corn herbicides used in these studies were optimized, they could become as efficacious as commercial formulations presently on the market. This is the first report of research containing data on two- and three-component starch-encapsulated granular formulations.
Experiments were conducted in controlled-environment chambers to evaluate the effects of temperature and soil water content on the time for commercial (CF) and starch-encapsulated (SE) atrazine formulations to dissipate to one-half the original concentration (T50). SE samples were also analyzed for the amount of atrazine remaining within the starch particles (percent encapsulation). The dissipation of CF atrazine was affected by changes in both temperature and soil water content. SE atrazine dissipation was influenced more by changes in soil water content than by temperature. Independent of soil water content, there was no atrazine dissipation from any formulation at 15 C. The T50 for CF atrazine at 20% soil water content was 53.4 and 29.9 d at 25 and 35 C, respectively. At 20% soil water content, all SE treatments gave a T50 greater than 60 d. The percent starch encapsulation at 20% soil water content was greater than or equal to 55.8% and 30.4% for SE large and SE small particles, respectively. This high level of encapsulated atrazine accounts for the reduced SE dissipation observed at 20% soil water content. At 40% soil water content, the dissipation of CF and SE small atrazine did not differ at either 25 or 35 C. Compared with the CF, the SE large formulation extended the T50 by 7.4 and 6.7 d at 25 and 35 C, respectively. At 40% soil water content, no encapsulated atrazine remained in the SE formulations 60 d after treatment (DAT).
Risk factors for alcohol problems (AP) include biological and environmental factors that are relevant across development. The pathways through which these factors are related, and how they lead to AP, are optimally considered in the context of a comprehensive developmental model.
Method
Using data from a prospectively assessed, population-based UK cohort, we constructed a structural equation model that integrated risk factors reflecting individual, family and peer/community-level constructs across childhood, adolescence and young adulthood. These variables were used to predict AP at the age of 20 years.
Results
The final model explained over 30% of the variance in liability to age 20 years AP. Most prominent in the model was an externalizing pathway to AP, with conduct problems, sensation seeking, AP at age 17.5 years and illicit substance use acting as robust predictors. In conjunction with these individual-level risk factors, familial AP, peer relationships and low parental monitoring also predicted AP. Internalizing problems were less consistently associated with AP. Some risk factors previously identified were not associated with AP in the context of this comprehensive model.
Conclusions
The etiology of young adult AP is complex, influenced by risk factors that manifest across development. The most prominent pathway to AP is via externalizing and related behaviors. These findings underscore the importance of jointly assessing both biologically influenced and environmental risk factors for AP in a developmental context.
A clearer understanding of the basis for the association between cannabis use and psychotic experiences (PEs) is required. Our aim was to examine the extent to which associations between cannabis and cigarette use and PEs are due to confounding.
Method.
A cohort study of 1756 adolescents with data on cannabis use, cigarette use and PEs.
Results.
Cannabis use and cigarette use at age 16 were both associated, to a similar degree, with PEs at age 18 [odds ratio (OR) 1.48, 95% confidence interval (CI) 1.18–1.86 for cannabis and OR 1.61, 95% CI 1.31–1.98 for cigarettes]. Adjustment for cigarette smoking frequency (OR 1.27, 95% CI 0.91–1.76) or other illicit drug use (OR 1.25, 95% CI 0.91–1.73) substantially attenuated the relationship between cannabis and PEs. The attenuation was smaller when cannabis use was adjusted for in the cigarette–PE association (OR 1.42, 95% CI 1.05–1.92). However, almost all of the participants used cannabis with tobacco, including those who classed themselves as non-cigarette smokers.
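For reference, here is a sketch of an odds ratio with a Wald 95% CI from a 2×2 table, the statistic used throughout these results; the cell counts are invented.

```python
# Sketch of an odds ratio with a Wald 95% CI from a 2x2 exposure-outcome
# table. Cell counts here are invented, not cohort data.
import numpy as np

a, b = 120, 280   # exposed: cases, non-cases
c, d = 90, 310    # unexposed: cases, non-cases

or_ = (a * d) / (b * c)
se = np.sqrt(1/a + 1/b + 1/c + 1/d)          # SE of log(OR)
lo, hi = np.exp(np.log(or_) + np.array([-1.96, 1.96]) * se)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```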
Conclusions.
Teasing out the effects of cannabis from tobacco is highly complex and may not have been dealt with adequately in studies to date, including this one. Complementary methods are required to robustly examine the independent effects of cannabis, tobacco and other illicit drugs on PEs.
People who inject drugs (PWID) are vulnerable to infections and injuries at injection sites. Factors associated with reporting symptoms of these, seeking related advice, and hospital admission were examined. PWID were recruited in Birmingham, Bristol and Leeds using respondent-driven sampling (N = 855). During the preceding year, 48% reported having redness, swelling and tenderness (RST), 19% an abscess, and 10% an open wound at an injection site. Overall, 54% reported ⩾1 symptom, with 45% of these seeking medical advice (mainly from emergency departments and general practitioners). Advice was often sought ⩾5 days after the symptom first appeared (by 44% of those seeking advice about an abscess, 45% about an open wound, and 35% about RST); the majority received antibiotics. Overall, 9·5% reported hospital admission during the preceding year. Lifetime diagnoses of septicaemia and endocarditis were reported by 8·8% and 2·9%, respectively. Interventions are needed to reduce morbidity, healthcare burden and delays in accessing treatment.
In developed countries, the majority of hepatitis C virus (HCV) infections occur in injecting drug users (IDUs), with prevalence in IDUs often high but with wide geographical differences within countries. Estimates of local prevalence are needed for planning services for IDUs, but it is not practical to conduct HCV seroprevalence surveys in all areas. In this study, survey data from IDUs attending specialist services were collected at 52 of 149 sites in England between 2006 and 2008. Spatially correlated random-effects models were used to estimate HCV prevalence for all sites, using auxiliary data to aid prediction. Estimates ranged from 14% to 82%, with larger cities, London and the North West having the highest HCV prevalence. The methods used generated robust estimates for each area, with a well-identified spatial pattern that improved predictions. Such models may be of use in other areas of study where surveillance data are sparse.
We describe the largest outbreak of hepatitis B virus infection reported to date in the UK. Between July 2001 and December 2005, 237 cases were identified in Avon, South West England. The likely route of transmission was injecting drug use in 44% (104/237) of cases and heterosexual intercourse in 30% (71/237). A case-control study in injectors showed that injecting crack cocaine [adjusted odds ratio (aOR) 23·8, 95% confidence interval (CI) 3·04–186, P<0·001] and sharing injecting paraphernalia in the year before diagnosis (aOR 16·67, 95% CI 1·78–100, P=0·010) were strongly associated with acute hepatitis B. Among non-IDUs, the number of sexual partners was high and consistent condom use was low compared with a national sample. We describe the control measures implemented in response to the outbreak. This outbreak has highlighted the problems associated with the low uptake of the national hepatitis B vaccination policy targeting high-risk groups, the difficulty of identifying those at risk of acquiring hepatitis B infection through heterosexual sex, and injecting crack cocaine as a risk factor for hepatitis B.
In recent years, infection caused by Salmonella serotype Enteritidis (SE) phage type 4 has spread through Europe but has been uncommon in the USA. The first recognized outbreak of this strain in the USA occurred in a Chinese restaurant in El Paso, Texas, in April 1993; no source was identified. In September 1993, a second outbreak caused by SE phage type 4 was associated with the same restaurant. To determine the cause of the second outbreak, we compared food exposures of the 19 patients with those of two control groups. Egg rolls were the only item significantly associated with illness in both analyses (first control group: odds ratio [OR] 8·2, 95% confidence interval [CI] 2·3–31·6; second control group: OR 13·1, 95% CI 2·1–97·0). Retrospective analysis of the April outbreak also implicated egg rolls (OR 32·4, 95% CI 9·1–126·6). Egg roll batter was made from pooled shell eggs and was left at room temperature throughout the day. These two outbreaks of SE phage type 4 likely could have been prevented by using pasteurized eggs and safe food-preparation practices.
In 1990, a Salmonella enteritidis (SE) outbreak occurred in a restaurant chain in Pennsylvania. To determine its cause(s), we conducted a case-control study and a cohort study at one restaurant, and a survey of restaurants. Egg dishes were associated with illness (P = 0.03). Guests from one hotel eating at the restaurant had a diarrhoeal attack rate of 14%, 4.7-fold higher than among those not eating there (P = 0.04). There were no differences in egg handling between affected and unaffected restaurants. Eggs supplied to affected restaurants were medium grade AA eggs from a single farm, and were reportedly refrigerated during distribution. Human and hen SE isolates were phage type 8 and had similar plasmid profiles and antibiograms. We estimate the prevalence of infected eggs during the outbreak to have been as high as 1 in 12. Typical restaurant egg-handling practices and refrigeration during distribution appear to be insufficient by themselves to prevent similar outbreaks.