Following acquired brain injury (ABI), individuals often experience anxiety and/or depressive symptoms. BrainACT is an adapted form of Acceptance and Commitment Therapy (ACT) tailored to this target group. The current study is a trial-based health-economic evaluation comparing BrainACT to a psychoeducation and relaxation control treatment.
Methods
An economic evaluation from a societal perspective was conducted in the Netherlands alongside a multicenter randomized controlled two-armed parallel trial including 72 participants. A cost-utility and cost-effectiveness analysis was conducted where incremental costs, quality-adjusted life-years (QALYs), and anxiety/depression (Hospital Anxiety and Depression Scale (HADS) score) were collected and presented over a 1-year follow-up period. Bootstrapping, scenario, and subgroup analyses were performed to test the robustness of the results.
Results
The BrainACT arm reported non-significantly lower total costs (incremental difference of €−4,881; bootstrap interval €−12,139 to €2,330) combined with significantly decreased anxiety/depression (HADS) scores (3.2; bootstrap interval 0.7–5.7). However, total QALYs were non-significantly lower (−0.008; bootstrap interval −0.060 to 0.042) for BrainACT. The probability of the intervention being cost-effective was 86 percent at a willingness-to-accept threshold of €50,000/QALY. The scenario and subgroup analyses confirmed the robustness of the results.
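The probability-of-cost-effectiveness figure reported above can be estimated in outline with a non-parametric bootstrap of the incremental net monetary benefit. The sketch below is a minimal illustration, assuming per-patient cost and QALY vectors for each arm; the function name and the data in the usage note are invented, not the trial's:

```python
import random

def prob_cost_effective(costs_a, qalys_a, costs_b, qalys_b, wtp, n_boot=2000, seed=1):
    """Bootstrap the probability that arm A is cost-effective versus arm B.

    Resamples patients with replacement within each arm and counts how often
    the incremental net monetary benefit is positive:
        NMB = wtp * (mean QALY difference) - (mean cost difference)
    """
    rng = random.Random(seed)
    n_a, n_b = len(costs_a), len(costs_b)
    wins = 0
    for _ in range(n_boot):
        ia = [rng.randrange(n_a) for _ in range(n_a)]
        ib = [rng.randrange(n_b) for _ in range(n_b)]
        d_cost = sum(costs_a[i] for i in ia) / n_a - sum(costs_b[i] for i in ib) / n_b
        d_qaly = sum(qalys_a[i] for i in ia) / n_a - sum(qalys_b[i] for i in ib) / n_b
        if wtp * d_qaly - d_cost > 0:
            wins += 1
    return wins / n_boot
```

With an arm that is cheaper at equal QALYs, every resample yields a positive net monetary benefit, so the estimated probability is 1.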
Conclusion
BrainACT may be a more cost-effective alternative to a psychoeducation and relaxation intervention for anxiety and/or depressive symptoms following ABI. Despite limitations, BrainACT appears to be a promising addition to treatment options in the Netherlands. Further research is needed to validate these findings, and consideration should be given to implementing BrainACT in Dutch clinical settings with ongoing monitoring.
With wide-field phased array feed technology, the Australian Square Kilometre Array Pathfinder (ASKAP) is ideally suited to search for seemingly rare radio transient sources that are difficult to discover with previous-generation narrow-field telescopes. The Commensal Real-time ASKAP Fast Transient (CRAFT) Survey Science Project has developed instrumentation to continuously search for fast radio transients (duration $\lesssim$ 1 s) with ASKAP, with a particular focus on finding and localising fast radio bursts (FRBs). Since 2018, the CRAFT survey has been searching for FRBs and other fast transients by incoherently adding the intensities received by individual ASKAP antennas, and then correcting for the impact of frequency dispersion on these short-duration signals in the resultant incoherent sum (ICS) in real time. This low-latency detection enables the triggering of voltage buffers, which facilitates the localisation of the transient source and the study of spectro-polarimetric properties at high time resolution. Here we report the sample of 43 FRBs discovered in this CRAFT/ICS survey to date. This includes 22 FRBs that had not previously been reported: 16 FRBs localised by ASKAP to $\lesssim 1$ arcsec and 6 FRBs localised to $\sim 10$ arcmin. Of the new arcsecond-localised FRBs, we have identified and characterised host galaxies (and measured redshifts) for 11. The median of all 30 measured host redshifts from the survey to date is $z=0.23$. We summarise results from the searches, in particular those contributing to our understanding of the burst progenitors and emission mechanisms, and on the use of bursts as probes of intervening media. We conclude by foreshadowing future FRB surveys with ASKAP using a coherent detection system that is currently being commissioned. This will increase the burst detection rate by a factor of approximately ten and will also increase the distance to which ASKAP can localise FRBs.
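The real-time ICS search described above rests on correcting each frequency channel for the cold-plasma dispersion delay before summing. A minimal sketch of incoherent dedispersion, assuming a dynamic spectrum that has already been incoherently summed over antennas; the structure and parameter names are illustrative, not the CRAFT pipeline's:

```python
def dedisperse(dynspec, freqs_ghz, dm, dt_s):
    """Incoherently dedisperse a dynamic spectrum at a trial DM.

    dynspec   : list of per-channel intensity time series,
                dynspec[i][j] = channel i, time sample j
    freqs_ghz : centre frequency of each channel in GHz
    dm        : trial dispersion measure in pc cm^-3
    dt_s      : sampling interval in seconds

    Each channel is advanced by the dispersion delay relative to the
    highest frequency, then the channels are summed.
    """
    K = 4.148808e-3  # dispersion constant in s GHz^2 pc^-1 cm^3
    f_ref = max(freqs_ghz)
    n = len(dynspec[0])
    out = [0.0] * n
    for chan, f in zip(dynspec, freqs_ghz):
        shift = int(round(K * dm * (f ** -2 - f_ref ** -2) / dt_s))
        for j in range(n):
            src = j + shift
            if 0 <= src < n:
                out[j] += chan[src]
    return out
```

A search pipeline would repeat this over a grid of trial DMs and flag time samples where the dedispersed sum exceeds a signal-to-noise threshold.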
We examine the energy distribution of the fast radio burst (FRB) population using a well-defined sample of 63 FRBs from the Australian Square Kilometre Array Pathfinder (ASKAP) radio telescope, 28 of which are localised to a host galaxy. We apply the luminosity-volume ($V/V_{\mathrm{max}}$) test to examine the distribution of these transient sources, accounting for cosmological and instrumental effects, and determine the energy distribution for the sampled population over the redshift range $0.01 \lesssim z \lesssim 1.02$. We find the distribution between $10^{23}$ and $10^{26}$ J Hz$^{-1}$ to be consistent with both a pure power-law with differential slope $\gamma=-1.96 \pm 0.15$, and a Schechter function with $\gamma = -1.82 \pm 0.12$ and downturn energy $E_\mathrm{max} \sim 6.3 \, \times 10^{25}$ J Hz$^{-1}$. We identify systematic effects which currently limit our ability to probe the luminosity function outside this range and give a prescription for their treatment. Finally, we find that with the current dataset, we are unable to distinguish between the evolutionary and spectral models considered in this work.
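In the Euclidean, non-evolving limit the $V/V_{\mathrm{max}}$ statistic reduces to a simple function of fluence relative to the detection limit (the analysis above additionally accounts for cosmological and instrumental effects). A sketch under that simplifying assumption, with invented values in the test rather than survey data:

```python
def v_on_vmax(fluence, fluence_limit):
    """Euclidean V/Vmax for a single burst.

    V is proportional to d^3 for inferred distance d; Vmax to dmax^3, where
    dmax is the distance at which the burst would fall to the detection
    limit. Since fluence scales as d^-2, (d/dmax)^2 = F_lim/F, so
    V/Vmax = (F_lim/F)^(3/2).
    """
    return (fluence_limit / fluence) ** 1.5

def mean_v_on_vmax(fluences, fluence_limit):
    """Sample mean; ~0.5 for a uniform, non-evolving Euclidean population."""
    vals = [v_on_vmax(f, fluence_limit) for f in fluences]
    return sum(vals) / len(vals)
```

A mean significantly above or below 0.5 indicates departure from a uniform Euclidean population, which is what motivates the cosmological corrections applied in the paper.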
Mandatory folic acid fortification of enriched grains has reduced neural tube defect prevalence in several countries. We examined salt as an additional vehicle for folic acid fortification. The primary objective was to examine the change in serum folate concentration after 1 month of consumption of fortified iodised salt with folic acid (FISFA) among women of reproductive age. The secondary objectives were to examine (1) the feasibility of implementing FISFA intervention and (2) the acceptability of FISFA.
Design:
We conducted a pre–post intervention study (January–April 2023). Participants received a FISFA saltshaker with the study salt (1 g of sodium chloride salt fortified with 100 mcg of folic acid) to use instead of regular table salt for 1 month. Serum folate was measured using the Elecsys Folate-III immunoassay method at baseline and 1-month endpoint. Change in serum folate was assessed using a two-tailed Wilcoxon signed rank test for paired samples.
Setting:
Metropolitan city, Southern USA.
Participants:
Non-pregnant, 18–40-year-old women who lived alone/with a partner.
Results:
Thirty-two eligible women consented to participate, including eleven non-Hispanic-White, eleven non-Hispanic-Black and ten Hispanic. Post-intervention, there was a significant increase in median serum folate concentration of 1·40 nmol/l (IQR 0·74–2·05; P < 0·001) from 24·08 nmol/l to 25·96 nmol/l in an analytical sample of n 29. An increase was seen in 28/29 (93 %) participants. Feasibility: 100 % study consent and compliance. FISFA acceptability: 25 d average use; 1·28 g average daily intake; 96·7 % and 90 % reported taste and colour of FISFA as highly acceptable, respectively.
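The paired pre–post comparison described in the Methods can be sketched as follows; the folate values below are invented for illustration and are not the study's data:

```python
from scipy.stats import wilcoxon

# Hypothetical paired serum folate values (nmol/l), one pair per participant.
baseline = [22.1, 24.8, 23.5, 25.0, 24.2, 23.0, 26.1, 24.6, 22.9, 25.4]
endpoint = [23.9, 26.0, 25.1, 26.3, 25.8, 24.1, 27.9, 26.2, 24.0, 27.1]

# Two-tailed Wilcoxon signed-rank test on the paired differences,
# as used for the change in serum folate at the 1-month endpoint.
stat, p = wilcoxon(endpoint, baseline, alternative="two-sided")
```

Because every endpoint value exceeds its baseline in this toy sample, the test yields a small p-value, mirroring the direction of the reported result.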
Conclusions:
FISFA is an effective approach to increasing serum folate concentrations among women of reproductive age. Findings should be replicated in a larger study.
Central line-associated bloodstream infection (CLABSI) is one of the most prevalent pediatric healthcare-associated infections and is used to benchmark hospital performance. Pediatric patients have increased in acuity and complexity over time. Existing approaches to risk adjustment do not control for individual patient characteristics, which are strong predictors of CLABSI risk and vary over time. Our objective was to develop a risk adjustment model for CLABSI in hospitalized children and compare observed to expected rates over time.
Design and Setting:
We conducted a prospective cohort study using electronic health record data at a quaternary Children’s Hospital.
Patients:
We included hospitalized children with central catheters.
Methods:
Risk factors identified from published literature were considered for inclusion in multivariable modeling based on association with CLABSI risk in bivariable analysis and expert input. We calculated observed and expected (risk model-adjusted) annual CLABSI rates.
Results:
Among 16,411 patients with 520,209 line days, 633 patients experienced 796 CLABSIs. The final model included age, behavioral health condition, non-English speaking, oncology service, port catheter type, catheter dwell time, lymphatic condition, total parenteral nutrition, and number of organ systems requiring ICU level care. For every organ system receiving ICU level care the odds ratio for CLABSI was 1.24 (95% CI 1.12–1.37). Although not statistically different, observed rates were lower than expected rates for later years.
Conclusions:
Failure to adjust for patient factors, particularly acuity and complexity of disease, may miss clinically significant differences in CLABSI rates, and may lead to inaccurate interpretation of the impact of quality improvement efforts.
Dilute aqueous solutions of quinoline were contacted with Na-montmorillonite to elucidate the sorption process of the neutral and protonated species. Sorption occurs via a combination of ion exchange and molecular adsorption and yields S-type isotherms. Exchange between the quinolinium ion (QH+) and Na can be described by means of Vanselow selectivity coefficients and a thermodynamic exchange constant (Kex). Due to the apparent adsorption of the neutral species at high mole fractions (x) of the solid phase, the thermodynamic standard state was defined as 0.5 mole fraction. The selectivity at pH ~4.95 of the QH+ species over Na (at XQH+ = 0.5) was determined to be Kv = 340. At pH ≥ 5.5 surface mole fractions of 0.5 could not be obtained without adsorption of the neutral species. This study suggests that at dilute solution concentrations quinoline is sorbed preferentially as the cation even at pHs ≫ pKa. A critical surface-solution concentration is apparently necessary for adsorption of the neutral species.
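For homovalent QH+/Na+ exchange, the Vanselow selectivity coefficient takes the standard textbook form (with X the adsorbed-phase mole fractions and square brackets denoting solution activities, or concentrations in dilute solution; this is the general definition, not a value from this study):

```latex
K_V \;=\; \frac{X_{\mathrm{QH^+}}\,[\mathrm{Na^+}]}{X_{\mathrm{Na^+}}\,[\mathrm{QH^+}]}
```

The reported Kv = 340 at XQH+ = 0.5 then expresses how strongly the clay prefers the quinolinium cation over sodium at equal surface occupancy.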
Forecasts play a central role in decision-making under uncertainty. After a brief review of the general issues, this article considers ways of using high-dimensional data in forecasting. We consider selecting variables from a known active set, known knowns, using Lasso and One Covariate at a time Multiple Testing, and approximating unobserved latent factors, known unknowns, by various means. This combines both sparse and dense approaches to forecasting. We demonstrate the various issues involved in variable selection in a high-dimensional setting with an application to forecasting UK inflation at different horizons over the period 2020q1–2023q1. This application shows both the power of parsimonious models and the importance of allowing for global variables.
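The one-covariate-at-a-time idea can be sketched in a few lines: each candidate regressor is tested individually against a critical value and kept if significant. This is a simplified single-stage illustration; the full OCMT procedure uses a multiple-testing-adjusted critical value and iterates over stages, and the data in the test are synthetic:

```python
import math

def ocmt_select(y, X, crit=2.58):
    """One-covariate-at-a-time selection sketch.

    Regresses y on each column of X separately (with intercept) and keeps
    columns whose absolute t-statistic exceeds `crit`.
    X is a list of rows; returns the selected column indices.
    """
    n = len(y)
    keep = []
    for j in range(len(X[0])):
        x = [row[j] for row in X]
        mx, my = sum(x) / n, sum(y) / n
        sxx = sum((v - mx) ** 2 for v in x)
        sxy = sum((x[i] - mx) * (y[i] - my) for i in range(n))
        beta = sxy / sxx
        resid = [y[i] - my - beta * (x[i] - mx) for i in range(n)]
        s2 = sum(r * r for r in resid) / (n - 2)
        t = beta / math.sqrt(s2 / sxx)
        if abs(t) > crit:
            keep.append(j)
    return keep
```

On a synthetic target driven by one column plus alternating noise, only the informative column survives the cut, which is the parsimony the application illustrates.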
Former professional American football players have a high relative risk for neurodegenerative diseases like chronic traumatic encephalopathy (CTE). Interpreting low cognitive test scores in this population is occasionally complicated by performance on validity testing. Neuroimaging biomarkers may help inform whether a neurodegenerative disease is present in these situations. We report three cases of retired professional American football players who completed comprehensive neuropsychological testing, but “failed” performance validity tests, and underwent multimodal neuroimaging (structural MRI, Aβ-PET, and tau-PET).
Participants and Methods:
Three cases were identified from the Focused Neuroimaging for the Neurodegenerative Disease Chronic Traumatic Encephalopathy (FIND-CTE) study, an ongoing multimodal imaging study of retired National Football League players with complaints of progressive cognitive decline conducted at Boston University and the UCSF Memory and Aging Center. Participants were relatively young (age range 55-65), had 16 or more years of education, and two identified as Black/African American. Raw neuropsychological test scores were converted to demographically-adjusted z-scores. Testing included standalone (Test of Memory Malingering; TOMM) and embedded (reliable digit span, RDS) performance validity measures. Validity cutoffs were TOMM Trial 2 < 45 and RDS < 7. Structural MRIs were interpreted by trained neurologists. Aβ-PET with Florbetapir was used to quantify cortical Aβ deposition as global Centiloids (0 = mean cortical signal for a young, cognitively normal, Aβ-negative individual in their 20s; 100 = mean cortical signal for a patient with mild-to-moderate Alzheimer’s disease dementia). Tau-PET was performed with MK-6240 and first quantified as a standardized uptake value ratio (SUVR) map. The SUVR map was then converted to a w-score map representing signal intensity relative to a sample of demographically-matched healthy controls.
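The w-score construction described above amounts to expressing a patient's SUVR in control-sample standard deviations and reading off a normal percentile. A simplified sketch (in practice w-scores are residual-based, adjusting for covariates such as age via a regression fit in the controls; the values in the test are invented):

```python
import math

def w_score(value, control_values):
    """Patient value expressed in control-sample standard deviations."""
    n = len(control_values)
    mean = sum(control_values) / n
    sd = (sum((v - mean) ** 2 for v in control_values) / (n - 1)) ** 0.5
    return (value - mean) / sd

def percentile(w):
    """Normal-CDF percentile for a w-score (e.g. w = 1.90 -> ~97th)."""
    return 100 * 0.5 * (1 + math.erf(w / math.sqrt(2)))
```

This mapping is why a w-score of 1.90 in the Results corresponds to roughly the 97th percentile relative to controls.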
Results:
All three performed in the average range on a word reading-based estimate of premorbid intellect. Contribution of Alzheimer’s disease pathology was ruled out in each case based on Centiloid quantifications < 0. All cases scored below cutoff on TOMM Trial 2 (Case #1=43, Case #2=42, Case #3=19) and Case #3 also scored well below the RDS cutoff (2). Each case had multiple cognitive scores below expectations (z < -2.0), most consistently in the memory, executive function, and processing speed domains. For Case #1, MRI revealed mild atrophy in dorsal fronto-parietal and medial temporal lobe (MTL) regions and mild periventricular white matter disease. Tau-PET showed MTL tau burden modestly elevated relative to controls (regional w-score=0.59, 72nd percentile). For Case #2, MRI revealed cortical atrophy, mild hippocampal atrophy, and a microhemorrhage, with no evidence of meaningful tau-PET signal. For Case #3, MRI showed cortical atrophy and severe white matter disease, and tau-PET revealed significantly elevated MTL tau burden relative to controls (w-score=1.90, 97th percentile) as well as focal high signal in the dorsal frontal lobe (overall frontal region w-score=0.64, 74th percentile).
Conclusions:
Low scores on performance validity tests complicate the interpretation of the severity of cognitive deficits, but do not negate the presence of true cognitive impairment or an underlying neurodegenerative disease. In the rapidly developing era of biomarkers, neuroimaging tools can supplement neuropsychological testing to help inform whether cognitive or behavioral changes are related to a neurodegenerative disease.
Severe infections and psychiatric disorders have a large impact on both society and the individual. Studies investigating these conditions and the links between them are therefore important. Most past studies have focused on binary phenotypes of particular infections or overall infection, thereby losing some information regarding susceptibility to infection as reflected in the number of specific infection types, or sites, which we term infection load. In this study we found that infection load was associated with increased risk for attention-deficit/hyperactivity disorder, autism spectrum disorder, bipolar disorder, depression, schizophrenia and overall psychiatric diagnosis. We obtained a modest but significant heritability for infection load (h2 = 0.0221), and a high degree of genetic correlation between it and overall psychiatric diagnosis (rg = 0.4298). We also found evidence supporting a genetic causality for overall infection on overall psychiatric diagnosis. Our genome-wide association study for infection load identified 138 suggestive associations. Our study provides further evidence for genetic links between susceptibility to infection and psychiatric disorders, and suggests that a higher infection load may have a cumulative association with psychiatric disorders, beyond what has been described for individual infections.
There are indications that problematic alcohol use may negatively impact the course of major depressive disorder (MDD). However, most studies on alcohol use and adverse MDD outcomes are conducted amongst MDD populations with (severe) alcohol use disorder in psychiatric treatment settings. Therefore, it remains unclear whether these results can be generalised to the general population. In light of this, we examined the longitudinal relationship between alcohol use and MDD persistence after a 3-year follow-up amongst people with MDD from the general population.
Methods
Data were derived from the Netherlands Mental Health Survey and Incidence Study-2 (NEMESIS-2), a psychiatric epidemiological prospective study comprising four waves amongst the adult Dutch general population (n = 6,646). The study sample (n = 642) consisted of those with 12-month MDD who participated at the follow-up wave. The outcome was 12-month MDD persistence after the 3-year follow-up, which was assessed via the Composite International Diagnostic Interview version 3.0. Weekly alcohol consumption was operationalised as non-drinking (0 drinks), low-risk drinking (⩽7 drinks; reference), at-risk drinking (women 8–13 drinks, men 8–20 drinks) and high-risk drinking (women ⩾14, men ⩾21 drinks). We performed univariate and multiple logistic regression analyses, which were adjusted for various socio-demographic and health-related factors.
Results
The majority (67.4%) of the MDD sample were female, while the mean age was 47.1 years. Amongst these, 23.8% were non-drinkers, 52.0% were low-risk drinkers and 14.3% and 9.4% were at-risk and high-risk drinkers, respectively. Around one-quarter of the sample (23.6%) met the criteria for a persistent MDD after 3-year follow-up. No statistically significant association was found between alcohol use and MDD persistence, either for the crude model or the adjusted models. In comparison to low-risk drinking, the full adjusted model showed no statistically significant associations between MDD persistence and non-drinking (odds ratio (OR) = 1.15, p = 0.620), at-risk drinking (OR = 1.25, p = 0.423), or high-risk drinking (OR = 0.74, p = 0.501).
Conclusions
Contrary to our expectations, our findings showed that alcohol use was not a predictor of MDD persistence after 3-year follow-up amongst people with MDD from the general population.
To evaluate the construct validity of the NIH Toolbox Cognitive Battery (NIH TB-CB) in the healthy oldest-old (85+ years old).
Method:
Our sample from the McKnight Brain Aging Registry consists of 179 individuals, 85 to 99 years of age, screened for memory, neurological, and psychiatric disorders. Using previous research methods on a sample of 85+ year-old adults, we conducted confirmatory factor analyses on models of NIH TB-CB and same-domain standard neuropsychological measures. We hypothesized the five-factor model (Reading, Vocabulary, Memory, Working Memory, and Executive/Speed) would have the best fit, consistent with younger populations. We assessed convergent and discriminant validity. We also evaluated demographic and computer use predictors of NIH TB-CB composite scores.
Results:
Findings suggest the six-factor model (Vocabulary, Reading, Memory, Working Memory, Executive, and Speed) had a better fit than alternative models. NIH TB-CB tests had good convergent and discriminant validity, though tests in the executive functioning domain had high inter-correlations with other cognitive domains. Computer use was strongly associated with higher NIH TB-CB overall and fluid cognition composite scores.
Conclusion:
The NIH TB-CB is a valid assessment for the oldest-old samples, with relatively weak validity in the domain of executive functioning. Computer use’s impact on composite scores could be due to the executive demands of learning to use a tablet. Strong relationships of executive function with other cognitive domains could be due to cognitive dedifferentiation. Overall, the NIH TB-CB could be useful for testing cognition in the oldest-old and the impact of aging on cognition in older populations.
This book provides a timely and novel contribution to the debate surrounding the role of 'evidence' in specific public policy areas. It explores the creation, dissemination and use of evidence within the areas of healthcare, education, criminal justice, social care, welfare, housing, transport and urban renewal.
Psychiatric disorders are highly polygenic and show patterns of partner resemblance. Partner resemblance has direct population-level genetic implications if it is caused by assortative mating, but not if it is caused by convergence or social homogamy. Using genetics may help distinguish these different mechanisms. Here, we investigated whether partner resemblance for schizophrenia and bipolar disorder is influenced by assortative mating using polygenic risk scores (PRSs).
Methods
PRSs from The Danish High-Risk and Resilience Study—VIA 7 were compared between parents in three subsamples: population-based control parent pairs (N=198), parent pairs where at least one parent had schizophrenia (N=193), and parent pairs where at least one parent had bipolar disorder (N=115).
Results
The PRS for schizophrenia was predictive of schizophrenia in the full sample and showed a significant correlation between parent pairs (r=0.121, p=0.0440), indicative of assortative mating. The PRS for bipolar disorder was also correlated between parent pairs (r=0.162, p=0.0067), but it was not predictive of bipolar disorder in the full sample, limiting the interpretation.
Conclusions
Our study provides genetic evidence for assortative mating for schizophrenia, with important implications for our understanding of the genetics of schizophrenia.
Healthcare workers (HCWs) not adhering to physical distancing recommendations is a risk factor for acquisition of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). The study objective was to assess the impact of interventions to improve HCW physical distancing on actual distance between HCWs in a real-life setting.
Methods:
HCWs voluntarily wore proximity beacons to measure the number and intensity of physical distancing interactions between each other in a pediatric intensive care unit. We compared interactions before and after implementing a bundle of interventions including changes to the layout of workstations, cognitive aids, and individual feedback from wearable proximity beacons.
Results:
Overall, we recorded 10,788 interactions within 6 feet (∼2 m) and lasting >5 seconds. The number of HCWs wearing beacons fluctuated daily and increased over the study period. On average, 13 beacons were worn daily (32% of possible staff; range, 2–32 per day). We recorded 3,218 interactions before the interventions and 7,570 interactions after the interventions began. Using regression analysis accounting for the maximum number of potential interactions if all staff had worn beacons on a given day, there was a 1% decline in the number of interactions per possible interaction in the postintervention period (incident rate ratio, 0.99; 95% confidence interval, 0.98–1.00; P = .02), with fewer interactions occurring at nursing stations, in workrooms, and during morning rounds.
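The denominator in that analysis, the maximum number of potential interactions given how many beacons were worn on a day, is a pairwise count. A minimal sketch of the crude rate comparison (the study itself used regression to adjust for daily beacon counts; the numbers in the test are illustrative):

```python
def possible_pairs(n_beacons):
    """Number of distinct HCW pairs that could interact on a given day."""
    return n_beacons * (n_beacons - 1) // 2

def interaction_rate(n_interactions, beacons_per_day):
    """Interactions per possible pairwise interaction over a period."""
    denom = sum(possible_pairs(k) for k in beacons_per_day)
    return n_interactions / denom

def rate_ratio(post_rate, pre_rate):
    """Crude incident rate ratio; below 1 means fewer interactions post."""
    return post_rate / pre_rate
```

Normalising by possible pairs is what makes pre- and post-intervention periods comparable when beacon uptake changes over time.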
Conclusions:
Using quantitative data from wearable proximity beacons, we found an overall small decline in interactions within 6 feet between HCWs in a busy intensive care unit after a multifaceted bundle of interventions was implemented to improve physical distancing.
Non-alcoholic fatty liver disease (NAFLD) is an increasing cause of chronic liver disease that accompanies obesity and the metabolic syndrome. Excess fructose consumption can initiate or exacerbate NAFLD in part due to a consequence of impaired hepatic fructose metabolism. Preclinical data emphasized that fructose-induced altered gut microbiome, increased gut permeability, and endotoxemia play an important role in NAFLD, but human studies are sparse. The present study aimed to determine if two weeks of excess fructose consumption significantly alters gut microbiota or permeability in humans.
Methods:
We performed a pilot double-blind, cross-over, metabolic unit study in 10 subjects with obesity (body mass index [BMI] 30–40 kg/m²). Each arm provided 75 grams of either fructose or glucose added to subjects’ individual diets for 14 days, substituted isocalorically for complex carbohydrates, with a 19-day wash-out period between arms. Fructose intake in the fructose arm averaged 20.1% of calories. Outcome measures included fecal microbiota distribution, fecal metabolites, intestinal permeability, markers of endotoxemia, and plasma metabolites.
Results:
Routine blood, uric acid, liver function, and lipid measurements were unaffected by the fructose intervention. The fecal microbiome (including Akkermansia muciniphila), fecal metabolites, gut permeability, indices of endotoxemia, gut damage or inflammation, and plasma metabolites were essentially unchanged by either intervention.
Conclusions:
In contrast to rodent preclinical findings, excess fructose did not alter the gut microbiome, metabolome, or permeability, nor induce endotoxemia, in humans with obesity fed fructose for 14 days in amounts known to enhance NAFLD.
There is evidence that environmental and genetic risk factors for schizophrenia spectrum disorders are transdiagnostic and mediated in part through a generic pathway of affective dysregulation.
Methods
We analysed to what degree the impact of schizophrenia polygenic risk (PRS-SZ) and childhood adversity (CA) on psychosis outcomes was contingent on co-presence of affective dysregulation, defined as significant depressive symptoms, in (i) NEMESIS-2 (n = 6646), a representative general population sample, interviewed four times over nine years, and (ii) EUGEI (n = 4068), a sample of patients with schizophrenia spectrum disorder, the siblings of these patients, and controls.
Results
The impact of PRS-SZ on psychosis showed significant dependence on co-presence of affective dysregulation in NEMESIS-2 [relative excess risk due to interaction (RERI): 1.01, p = 0.037] and in EUGEI (RERI = 3.39, p = 0.048). This was particularly evident for delusional ideation (NEMESIS-2: RERI = 1.74, p = 0.003; EUGEI: RERI = 4.16, p = 0.019) and not for hallucinatory experiences (NEMESIS-2: RERI = 0.65, p = 0.284; EUGEI: RERI = −0.37, p = 0.547). A similar and stronger pattern of results was evident for CA (RERI delusions and hallucinations: NEMESIS-2: 3.02, p < 0.001; EUGEI: 6.44, p < 0.001; RERI delusional ideation: NEMESIS-2: 3.79, p < 0.001; EUGEI: 5.43, p = 0.001; RERI hallucinatory experiences: NEMESIS-2: 2.46, p < 0.001; EUGEI: 0.54, p = 0.465).
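RERI, the interaction measure reported throughout these results, quantifies departure from additivity of relative risks. A minimal sketch of the standard definition (the risk-ratio values in the test are invented, not the study's estimates):

```python
def reri(rr_both, rr_g_only, rr_e_only):
    """Relative excess risk due to interaction on the additive scale.

    rr_both   : relative risk with both exposures (e.g. high PRS-SZ plus
                affective dysregulation) versus neither
    rr_g_only : relative risk with the genetic exposure alone
    rr_e_only : relative risk with the environmental exposure alone
    RERI = RR11 - RR10 - RR01 + 1; values above 0 indicate positive
    additive interaction (synergy between the exposures).
    """
    return rr_both - rr_g_only - rr_e_only + 1.0
```

Under pure additivity of excess risks, RERI is exactly zero, which is why positive values are read as the exposures acting synergistically.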
Conclusions
The results, and internal replication, suggest that the effects of known genetic and non-genetic risk factors for psychosis are mediated in part through an affective pathway, from which early states of delusional meaning may arise.
The concentration of radiocarbon (14C) differs between ocean and atmosphere. Radiocarbon determinations from samples which obtained their 14C in the marine environment therefore need a marine-specific calibration curve and cannot be calibrated directly against the atmospheric-based IntCal20 curve. This paper presents Marine20, an update to the internationally agreed marine radiocarbon age calibration curve that provides a non-polar global-average marine record of radiocarbon from 0–55 cal kBP and serves as a baseline for regional oceanic variation. Marine20 is intended for calibration of marine radiocarbon samples from non-polar regions; it is not suitable for calibration in polar regions where variability in sea ice extent, ocean upwelling and air-sea gas exchange may have caused larger changes to concentrations of marine radiocarbon. The Marine20 curve is based upon 500 simulations with an ocean/atmosphere/biosphere box-model of the global carbon cycle that has been forced by posterior realizations of our Northern Hemispheric atmospheric IntCal20 14C curve and reconstructed changes in CO2 obtained from ice core data. These forcings enable us to incorporate carbon cycle dynamics and temporal changes in the atmospheric 14C level. The box-model simulations of the global-average marine radiocarbon reservoir age are similar to those of a more complex three-dimensional ocean general circulation model. However, simplicity and speed of the box model allow us to use a Monte Carlo approach to rigorously propagate the uncertainty in both the historic concentration of atmospheric 14C and other key parameters of the carbon cycle through to our final Marine20 calibration curve. This robust propagation of uncertainty is fundamental to providing reliable precision for the radiocarbon age calibration of marine based samples. 
We make a first step towards deconvolving the contributions of different processes to the total uncertainty; discuss the main differences of Marine20 from the previous age calibration curve Marine13; and identify the limitations of our approach together with key areas for further work. The updated values for ΔR, the regional marine radiocarbon reservoir age corrections required to calibrate against Marine20, can be found at the data base http://calib.org/marine/.
Radiocarbon (14C) ages cannot provide absolutely dated chronologies for archaeological or paleoenvironmental studies directly but must be converted to calendar age equivalents using a calibration curve compensating for fluctuations in atmospheric 14C concentration. Although calibration curves are constructed from independently dated archives, they invariably require revision as new data become available and our understanding of the Earth system improves. In this volume the international 14C calibration curves for both the Northern and Southern Hemispheres, as well as for the ocean surface layer, have been updated to include a wealth of new data and extended to 55,000 cal BP. Based on tree rings, IntCal20 now extends as a fully atmospheric record to ca. 13,900 cal BP. For the older part of the timescale, IntCal20 comprises statistically integrated evidence from floating tree-ring chronologies, lacustrine and marine sediments, speleothems, and corals. We utilized improved evaluation of the timescales and location variable 14C offsets from the atmosphere (reservoir age, dead carbon fraction) for each dataset. New statistical methods have refined the structure of the calibration curves while maintaining a robust treatment of uncertainties in the 14C ages, the calendar ages and other corrections. The inclusion of modeled marine reservoir ages derived from a three-dimensional ocean circulation model has allowed us to apply more appropriate reservoir corrections to the marine 14C data rather than the previous use of constant regional offsets from the atmosphere. Here we provide an overview of the new and revised datasets and the associated methods used for the construction of the IntCal20 curve and explore potential regional offsets for tree-ring data. 
We discuss the main differences with respect to the previous calibration curve, IntCal13, and some of the implications for archaeology and geosciences ranging from the recent past to the time of the extinction of the Neanderthals.
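Single-sample calibration against a curve such as IntCal20 can be sketched as evaluating, at each calendar age on the curve's grid, the Gaussian likelihood of the measured radiocarbon age with measurement and curve uncertainties combined. The toy three-point curve in the test is illustrative only, not IntCal20 values:

```python
import math

def calibrate(c14_age, c14_err, cal_ages, curve_mu, curve_sig):
    """Sketch of single-sample radiocarbon calibration.

    cal_ages  : grid of calendar ages (cal BP); indexes the output
    curve_mu  : calibration-curve 14C age at each grid point
    curve_sig : calibration-curve 1-sigma at each grid point

    Returns a normalised probability for each calendar age on the grid,
    assuming Gaussian errors on both the measurement and the curve.
    """
    like = []
    for mu, sig in zip(curve_mu, curve_sig):
        var = c14_err ** 2 + sig ** 2
        like.append(math.exp(-0.5 * (c14_age - mu) ** 2 / var) / math.sqrt(var))
    total = sum(like)
    return [p / total for p in like]
```

Real calibration software additionally handles curve wiggles that make the posterior multimodal, which is why calibrated ages are reported as intervals rather than point estimates.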
To evaluate the influence of cannabis in the long-term follow-up in patients with a first psychotic episode, comparing those who have never used cannabis with (a) those who used cannabis before the first psychotic episode but stopped it during the follow-up, and (b) those who used cannabis both before and after the first psychotic episode.
Method
Patients were followed from the first psychotic admission. They were assessed at 1, 3 and 5 years, obtaining information about functional outcome and positive and negative symptoms. At the 8th year, functional outcome was evaluated. Patients were classified into 3 groups: 40 who never used cannabis (NU), 27 who used cannabis and stopped during follow-up (CUS), and 25 who continued use during follow-up (CU).
Results
At baseline, there were no differences in either functional outcome or negative symptoms. The CUS group improved in functional outcome during the follow-up (p < 0.001), while the CU and NU groups did not show any significant change (p = 0.466 and p = 0.370, respectively). The CUS group also showed a significant decreasing trend in negative symptoms (p = 0.012), whereas no significant results were observed for the other two groups (p = 0.069 and p = 0.226, respectively). All groups improved in positive symptoms during follow-up.
Conclusions
Although cannabis use has deleterious effects, stopping it after the first psychotic episode produces a clear improvement in the long-term follow-up.