The health benefits of the long-chain omega-3 polyunsaturated fatty acids (PUFA), eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA) have been known for over 50 years and underpin the UK population recommendation to consume >450 mg EPA + DHA per day. These recommendations, last revised in 2004, are based mainly on epidemiological evidence. Much research has been conducted in the interim. Most randomised controlled trials (RCT) use doses of EPA + DHA of 840 mg/d or more. For anti-inflammatory, triacylglycerol-lowering and anti-hypertensive effects, >1.5 g EPA + DHA per day is needed. Cognitive benefits are also likely to require these higher intakes. Farmed salmon now contains considerably less EPA + DHA relative to farmed fish of 20 years ago, meaning one portion per week will no longer provide the equivalent of 450 mg EPA + DHA per day. Oily fish alone can only provide a fraction of the EPA + DHA required to meet global needs. Furthermore, there is low global oily fish consumption, with typical intakes of <200 mg EPA + DHA per day, and limited intakes in vegans and vegetarians. Therefore, there is an urgent need for affordable, acceptable, alternative EPA + DHA sources, including vegan/vegetarian friendly options, such as bio-enriched poultry, red meat and milk products; fortified foods; enriched oilseeds (for example, genetically modified Camelina sativa); algae and algal oils; and approaches which enhance endogenous EPA/DHA synthesis. In this narrative review, we suggest that current EPA + DHA intake recommendations are too low, consider EPA/DHA from a holistic health-sustainability perspective and identify research, policy and knowledge mobilisation areas which need attention.
Epidemiologic research suggests that youth cannabis use is associated with psychotic disorders. However, current evidence is based heavily on 20th-century data when cannabis was substantially less potent than today.
Methods
We linked population-based survey data from 2009 to 2012 with records of health services covered under universal healthcare in Ontario, Canada, up to 2018. The cohort included respondents aged 12–24 years at baseline with no prior psychotic disorder (N = 11 363). The primary outcome was days to first hospitalization, ED visit, or outpatient visit related to a psychotic disorder according to validated diagnostic codes. Due to non-proportional hazards, we estimated age-specific hazard ratios during adolescence (12–19 years) and young adulthood (20–33 years). Sensitivity analyses explored alternative model conditions including restricting the outcome to hospitalizations and ED visits to increase specificity.
Results
Compared to no cannabis use, cannabis use was significantly associated with psychotic disorders during adolescence (aHR = 11.2; 95% CI 4.6–27.3), but not during young adulthood (aHR = 1.3; 95% CI 0.6–2.6). When we restricted the outcome to hospitalizations and ED visits only, the strength of association increased markedly during adolescence (aHR = 26.7; 95% CI 7.7–92.8) but did not change meaningfully during young adulthood (aHR = 1.8; 95% CI 0.6–5.4).
Conclusions
This study provides new evidence of a strong but age-dependent association between cannabis use and risk of psychotic disorder, consistent with the neurodevelopmental theory that adolescence is a vulnerable time to use cannabis. The strength of association during adolescence was notably greater than in previous studies, possibly reflecting the recent rise in cannabis potency.
Survey researchers testing the effectiveness of arguments for or against policies traditionally employ between-subjects designs. In doing so, they lose statistical power and the ability to precisely estimate public attitudes. We explore the efficacy of an approach often used to address these limitations: the repeated measures within-subjects (RMWS) design. This study tests the competing hypotheses that (1) the RMWS will yield smaller effects due to respondents' desire to maintain consistency (the “opinion anchor” hypothesis), and (2) the RMWS will yield larger effects because the researcher provides respondents with the opportunity to update their attitudes (the “opportunity to revise” hypothesis). Using two survey experiments, we find evidence for the opportunity to revise hypothesis, and discuss the implications for future survey research.
Prior trials suggest that intravenous racemic ketamine is highly effective for treatment-resistant depression (TRD), but phase 3 trials of racemic ketamine are needed.
Aims
To assess the acute efficacy and safety of a 4-week course of subcutaneous racemic ketamine in participants with TRD. Trial registration: ACTRN12616001096448 at www.anzctr.org.au.
Method
This phase 3, double-blind, randomised, active-controlled multicentre trial was conducted at seven mood disorders centres in Australia and New Zealand. Participants received twice-weekly subcutaneous racemic ketamine or midazolam for 4 weeks. Initially, the trial tested fixed-dose ketamine 0.5 mg/kg versus midazolam 0.025 mg/kg (cohort 1). Dosing was revised, after a Data Safety Monitoring Board recommendation, to flexible-dose ketamine 0.5–0.9 mg/kg or midazolam 0.025–0.045 mg/kg, with response-guided dosing increments (cohort 2). The primary outcome was remission (Montgomery-Åsberg Depression Rating Scale score ≤10) at the end of week 4.
Results
The final analysis (those who received at least one treatment) comprised 68 participants in cohort 1 (fixed-dose) and 106 in cohort 2 (flexible-dose). Ketamine was more efficacious than midazolam in cohort 2 (remission rate 19.6% v. 2.0%; OR = 12.1, 95% CI 2.1–69.2, P = 0.005), but not different in cohort 1 (remission rate 6.3% v. 8.8%; OR = 1.3, 95% CI 0.2–8.2, P = 0.76). Ketamine was well tolerated. Acute adverse effects (psychotomimetic, blood pressure increases) resolved within 2 h.
Conclusions
Adequately dosed subcutaneous racemic ketamine was efficacious and safe in treating TRD over a 4-week treatment period. The subcutaneous route is practical and feasible.
Bipolar disorder (BD) is a potentially chronic mental disorder marked by recurrent depressive and manic episodes, circadian rhythm disruption, and changes in energetic metabolism. “Metabolic jet lag” refers to a state of shift in circadian patterns of energy homeostasis, affecting neuroendocrine, immune, and adipose tissue function, expressed through behavioral changes such as irregularities in sleep and appetite. Risk factors include genetic variation, mitochondrial dysfunction, lifestyle factors, poor gut microbiome health and abnormalities in hunger, satiety, and hedonistic function. Evidence suggests metabolic jet lag is a core component of BD pathophysiology, as individuals with BD frequently exhibit irregular eating rhythms and circadian desynchronization of their energetic metabolism, which is associated with unfavorable clinical outcomes. Although current diagnostic criteria lack any assessment of eating rhythms, technological advancements including mobile phone applications and ecological momentary assessment allow for the reliable tracking of biological rhythms. Overall, methodological refinement of metabolic jet lag assessment will increase knowledge in this field and stimulate the development of interventions targeting metabolic rhythms, such as time-restricted eating.
OBJECTIVES/GOALS: Identification of COVID-19 patients at risk for deterioration following discharge from the emergency department (ED) remains a clinical challenge. Our objective was to develop a prediction model that identifies COVID-19 patients at risk for return and hospital admission within 30 days of ED discharge. METHODS/STUDY POPULATION: We performed a retrospective cohort study of discharged adult ED patients (n = 7,529) with SARS-CoV-2 infection from 116 unique hospitals contributing to the national REgistry of suspected COVID-19 in EmeRgency care (RECOVER). The primary outcome was return hospital admission within 30 days. Models were developed using Classification and Regression Tree (CART), Gradient Boosted Machine (GBM), Random Forest (RF), and least absolute shrinkage and selection operator (LASSO) approaches. RESULTS/ANTICIPATED RESULTS: Among COVID-19 patients discharged from the ED on their index encounter, 571 (7.6%) returned for hospital admission within 30 days. The machine learning (ML) models (GBM, RF, and LASSO) performed similarly. The RF model yielded a test AUC of 0.74 (95% confidence interval [CI] 0.71–0.78) with a sensitivity of 0.46 (0.39–0.54) and specificity of 0.84 (0.82–0.85). Predictive variables common to all ML models included lowest oxygen saturation, temperature, and history of hypertension, diabetes, hyperlipidemia, or obesity. DISCUSSION/SIGNIFICANCE: A predictive model identifying adult ED patients with COVID-19 at risk for return hospital admission within 30 days is feasible. Ensemble/bootstrapped classification methods outperformed the single-tree CART method. Future efforts may focus on the application of ML models in the hospital setting to optimize allocation of follow-up resources.
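As a minimal sketch of how the discrimination metrics reported above (AUC, sensitivity, specificity) are computed from a model's predicted risks: the AUC is the rank-based (Mann-Whitney) probability that a randomly chosen returning patient receives a higher predicted risk than a randomly chosen non-returning patient, and sensitivity/specificity follow from a risk threshold. The scores and labels below are made-up illustrative values, not RECOVER data.

```python
# Illustrative only: how AUC, sensitivity and specificity are derived from
# predicted risks. Scores/labels are invented, not the study's data.

def auc(scores, labels):
    """Rank-based (Mann-Whitney) AUC: probability that a random positive
    case outscores a random negative case; ties count one half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def sens_spec(scores, labels, threshold):
    """Sensitivity and specificity when flagging scores >= threshold."""
    tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= threshold)
    fn = sum(1 for s, y in zip(scores, labels) if y == 1 and s < threshold)
    tn = sum(1 for s, y in zip(scores, labels) if y == 0 and s < threshold)
    fp = sum(1 for s, y in zip(scores, labels) if y == 0 and s >= threshold)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical predicted risks and observed 30-day return admissions (1 = returned)
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2, 0.6, 0.1]
labels = [1,   1,   0,   1,   0,   0,   0,   0]
print(auc(scores, labels))            # 13/15 ≈ 0.867 on this toy data
print(sens_spec(scores, labels, 0.5))
```

In practice the published confidence intervals would come from bootstrapping these statistics on a held-out test set.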
We interviewed 1,208 healthcare workers with positive SARS-CoV-2 tests between October 2020 and June 2021 to determine likely exposure sources. Overall, 689 (57.0%) had community exposures (479 from household members), 76 (6.3%) had hospital exposures (64 from other employees including 49 despite masking), 11 (0.9%) had community and hospital exposures, and 432 (35.8%) had no identifiable source of exposure.
A classic example of microbiome function is its role in nutrient assimilation in both plants and animals, but other less obvious roles are becoming more apparent, particularly in terms of driving infectious and non-infectious disease outcomes and influencing host behaviour. However, numerous biotic and abiotic factors influence the composition of these communities, and host microbiomes can be susceptible to environmental change. How microbial communities will be altered by, and mitigate, the rapid environmental change we can expect in the next few decades remains to be seen. That said, given the enormous range of functional diversity conferred by microbes, there is currently something of a revolution in microbial bioengineering and biotechnology to address real-world problems including human and wildlife disease and crop and biofuel production. All of these concepts are explored in further detail throughout the book.
Extracorporeal cardiopulmonary resuscitation in refractory cardiac arrest (ECPR) is an emerging resuscitative therapy that has shown promising results for selected patients who may not otherwise survive. We sought to identify the characteristics of cardiac arrest patients presenting to our institution to begin assessing the feasibility of an ECPR program.
Methods
This retrospective health records review included patients aged 18–75 years old presenting to our academic teaching hospital campuses with refractory nontraumatic out-of-hospital or in-emergency department (ED) cardiac arrest over a 2-year period. Based on a scoping review of the literature, both “liberal” and “restrictive” ECPR criteria were defined and applied to our cohort.
Results
A total of 179 patients met inclusion criteria. Median age was 60 years, and patients were predominantly male (72.6%). The initial rhythm was ventricular tachycardia/ventricular fibrillation in 49.2%. The majority of arrests were witnessed (69.3%), with immediate bystander CPR performed on 53.1% and an additional 12% receiving CPR within 10 minutes of collapse. Median prehospital time was 40 minutes (interquartile range, 31–53.3). Two-thirds of patients (65.9%) were identified as having a reversible cause of arrest, and favorable premorbid status was identified in nearly three quarters (74.3%). Our two sets of ECPR inclusion criteria revealed that 33 and 5 patients (liberal and restrictive criteria, respectively) would have been candidates for ECPR.
Conclusion
At our institution, we estimate between 6% and 40% of ED refractory cardiac arrest patients would be candidates for ECPR. These findings suggest that the implementation of an ECPR program should be explored.
Electoral boundaries are an integral part of election administration. District boundaries delineate which legislative election voters are eligible to participate in, and precinct boundaries identify, in many localities, where voters cast in-person ballots on Election Day. Election officials are tasked with resolving a tremendously large number of intersections of registered voters with overlapping electoral boundaries. Any large-scale data project is susceptible to errors, and this task is no exception. In two recent close elections, these errors were consequential to the outcome. To address this problem, we describe a method to audit the assignment of registered voters to districts. We apply the methodology to Florida’s voter registration file to identify thousands of registered voters assigned to the wrong state House district, many of which local election officials have verified and rectified. We discuss how election officials can best use this technique to detect registered voters assigned to the wrong electoral boundary.
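The core geometric check behind such an audit can be sketched as a point-in-polygon test: geocode each registered voter's address, then verify that the point falls inside the boundary of the district the voter file assigns them to. The ray-casting routine, district polygons, and voter rows below are hypothetical illustrations, not the Florida data or the authors' exact method (which would use official shapefiles and geocoded addresses).

```python
# Illustrative sketch of auditing voter-to-district assignment.
# Districts, coordinates and voter IDs below are invented.

def point_in_polygon(x, y, polygon):
    """Ray-casting test: count crossings of a rightward ray from (x, y)."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge spans the horizontal line at y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical state House districts as polygons (two unit squares)
districts = {
    "HD-1": [(0, 0), (1, 0), (1, 1), (0, 1)],
    "HD-2": [(1, 0), (2, 0), (2, 1), (1, 1)],
}

# Hypothetical voter file rows: (voter_id, x, y, assigned_district)
voters = [("A", 0.5, 0.5, "HD-1"), ("B", 1.5, 0.5, "HD-1")]

# Flag voters whose geocoded location is outside their assigned district
misassigned = [vid for vid, x, y, d in voters
               if not point_in_polygon(x, y, districts[d])]
print(misassigned)  # ['B'] — B sits in HD-2 but is assigned to HD-1
```

Flagged records would then go to local election officials for verification against the authoritative boundary files, since geocoding error, not misassignment, can also explain a mismatch.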
In his book Taking Rights Seriously, Prof. Dworkin argues that all elements of society — citizens, legislature, and courts — ought to be ‘taking rights seriously’ in reaching decisions both about the actions and design of public institutions. This has two aspects. First, in accord with traditional rights-centred views, there are certain ways of treating individuals that rights rule out. Formally, rights are ‘political trumps held by individuals’; they deny society certain kinds of access to collective goals (xi). Materially, rights can be seen as an expression of equal concern for each citizen as a human being; to deny a right is to ‘insult’ a person by failing to treat him as an equal of other persons, that is, as deserving the same basic respect as others. Secondly, and in contrast to the traditional natural rights view, Dworkin presents a particular way of reasoning about rights that does not presuppose their existence as Platonic entities; instead, rights are discovered in the process of constructing a theory which plausibly accounts for both the letter and the spirit of our society's institutional actions.
Seven half-day regional listening sessions were held between December 2016 and April 2017 with groups of diverse stakeholders on the issues and potential solutions for herbicide-resistance management. The objective of the listening sessions was to connect with stakeholders and hear their challenges and recommendations for addressing herbicide resistance. The coordinating team hired Strategic Conservation Solutions, LLC, to facilitate all the sessions. They and the coordinating team used in-person meetings, teleconferences, and email to communicate and coordinate the activities leading up to each regional listening session. The agenda was the same across all sessions and included small-group discussions followed by reporting to the full group for discussion. The planning process was the same across all the sessions, although the selection of venue, time of day, and stakeholder participants differed to accommodate the differences among regions. The listening-session format required a great deal of work and flexibility on the part of the coordinating team and regional coordinators. Overall, the participant evaluations from the sessions were positive, with participants expressing appreciation that they were asked for their thoughts on the subject of herbicide resistance. This paper details the methods and processes used to conduct these regional listening sessions and provides an assessment of the strengths and limitations of those processes.
Herbicide resistance is ‘wicked’ in nature; therefore, results of the many educational efforts to encourage diversification of weed control practices in the United States have been mixed. It is clear that we do not sufficiently understand the totality of the grassroots obstacles, concerns, challenges, and specific solutions needed for varied crop production systems. Weed management issues and solutions vary with such variables as management styles, regions, cropping systems, and available or affordable technologies. Therefore, to help the weed science community better understand the needs and ideas of those directly dealing with herbicide resistance, seven half-day regional listening sessions were held across the United States between December 2016 and April 2017 with groups of diverse stakeholders on the issues and potential solutions for herbicide resistance management. The major goals of the sessions were to gain an understanding of stakeholders and their goals and concerns related to herbicide resistance management, to become familiar with regional differences, and to identify decision maker needs to address herbicide resistance. The messages shared by listening-session participants could be summarized by six themes: we need new herbicides; there is no need for more regulation; there is a need for more education, especially for others who were not present; diversity is hard; the agricultural economy makes it difficult to make changes; and we are aware of herbicide resistance but are managing it. The authors concluded that more work is needed to bring a community-wide, interdisciplinary approach to understanding the complexity of managing weeds within the context of the whole farm operation and for communicating the need to address herbicide resistance.
This research aims to explore the submerged landscapes of the Pilbara of Western Australia, using predictive archaeological modelling, airborne LiDAR, marine acoustics, coring and diver survey. It includes excavation and geophysical investigation of a submerged shell midden in Denmark to establish guidelines for the underwater discovery of such sites elsewhere.
Thirty-two distal tephra layers that are interbedded in Quaternary loess at 13 sites in the Channeled Scabland and Palouse were sampled as part of a regional study of the stratigraphy and chronology of dominantly windblown sediments on the Columbia Plateau. An electron microprobe was used to determine the elemental composition of volcanic glass in all of the samples and also to determine the composition of ilmenite in 14 of them. Two of the distal tephra layers correlate with Glacier Peak eruptions (11,200 yr B.P.), five with Mount St. Helens tephra set S (13,000 yr B.P.), and nine with Mount St. Helens tephra set C (ca. 36,000 yr B.P.) based on analysis of glass and ilmenite in reference pumices from Glacier Peak, Mount St. Helens, Mount Mazama, Mount Rainier, and Mount Jefferson, on the calculation of similarity coefficients for comparisons of both glass and ilmenite reference compositions with those of distal tephras, and on considerations of stratigraphic position. The composition of glass and ilmenite and the stratigraphic position of one distal tephra layer in the loess suggest that it is from an eruption of Mount St. Helens at least several thousand years older than the set C eruptions. Glass composition and stratigraphic position of a distal tephra at another site in loess suggested a possible correlation with some layers of the Pumice Castle eruptive sequence at Mount Mazama (ca. 70,000 yr B.P.), but similarity coefficients on ilmenite of only 45 and 48 fail to support the correlation and show why multiple correlation methods should be used. Similarity coefficients higher than 96 for both glass and ilmenite establish a correlation with Mount St. Helens layer Cw for distal layers in two widely separated sites. These layers are in sedimentary successions that are closely associated with giant floods in the Channeled Scabland. The 36,000 yr B.P. radiocarbon age of the Mount St. Helens set C establishes a minimum limiting date for an episode of flooding that predates the widespread late Wisconsin floods. A correlation of distal tephra layers at two other sites in the Scabland and Palouse establishes a chronostratigraphic link to a still-older episode of flooding within the Brunhes Normal Polarity Chron. Six distal tephra layers in pre-late Quaternary loess that are not correlated with known or dated eruptions have compositions and distinctive stratigraphic positions relative to magnetic reversal boundaries that make them key markers for future work.
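The similarity coefficients above can be sketched as follows, assuming the Borchardt-style coefficient widely used in tephrochronology: the mean, over oxides measured in both samples, of the smaller concentration divided by the larger, reported here on a 0–100 scale to match the abstract's values (45, 48, >96). The oxide compositions below are invented illustrative numbers, not the actual Mount St. Helens or Mazama reference data.

```python
# Illustrative sketch of a tephra similarity coefficient (Borchardt-style,
# assumed): mean of min/max ratios over oxides shared by two analyses.
# Compositions below are made-up weight-percent values.

def similarity_coefficient(sample_a, sample_b):
    """Mean of min(a, b) / max(a, b) over oxides present in both samples.
    Identical compositions give 1.0; dissimilar ones approach 0."""
    shared = [ox for ox in sample_a
              if ox in sample_b and sample_a[ox] > 0 and sample_b[ox] > 0]
    ratios = [min(sample_a[ox], sample_b[ox]) / max(sample_a[ox], sample_b[ox])
              for ox in shared]
    return sum(ratios) / len(ratios)

# Hypothetical glass compositions (oxide wt%)
distal = {"SiO2": 76.1, "Al2O3": 13.0, "FeO": 1.2, "CaO": 1.4, "K2O": 2.6}
reference = {"SiO2": 76.5, "Al2O3": 12.8, "FeO": 1.3, "CaO": 1.3, "K2O": 2.7}

sc = similarity_coefficient(distal, reference)
print(round(100 * sc, 1))  # close compositions score in the mid-90s here
```

On this logic, a coefficient in the 40s (as for the tentative Pumice Castle match) signals compositions differing by roughly a factor of two in many oxides, which is why the ilmenite values of 45 and 48 argue against that correlation.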