Bipolar depression remains difficult to treat, and people often experience ongoing residual symptoms, decreased functioning and impaired quality of life. Adjunctive therapies targeting novel pathways can provide wider treatment options and improve clinical outcomes. Garcinia mangostana Linn. (mangosteen) pericarp has serotonergic, antioxidant, anti-inflammatory and neurogenic properties of relevance to the mechanisms of bipolar depression.
Aims
The current 28-week randomised, multisite, double-blind, placebo-controlled trial investigated mangosteen pericarp extract as an adjunct to treatment-as-usual for treatment of bipolar depression.
Method
This trial was prospectively registered on the Australia New Zealand Clinical Trials Registry (no. ACTRN12616000028404). Participants aged 18 years and older with a diagnosis of bipolar I or II and with at least moderate depressive symptoms were eligible for the study. A total of 1016 participants were initially approached or volunteered for the study, of whom 712 did not progress to screening, with an additional 152 screened out. Seventy participants were randomly allocated to mangosteen and 82 to a placebo control. Fifty participants in the mangosteen and 64 participants in the placebo condition completed the treatment period and were analysed.
Results
Results indicated limited support for the primary hypothesis of superior depression symptom reduction following 24 weeks of treatment. Although overall changes in depressive symptoms did not substantially differ between conditions over the course of the trial, we observed significantly greater improvements from baseline to 24 weeks in the mangosteen condition than in the control condition for mood symptoms, clinical impressions of bipolar severity and social functioning. These differences were attenuated at the week-28 post-discontinuation assessment.
Conclusions
Adjunctive mangosteen pericarp treatment appeared to have limited efficacy for mood and functional symptoms associated with bipolar disorder, but not for manic symptoms or quality of life, suggesting a novel therapeutic approach that should be verified by replication.
To use the validated Online Quality Assessment Tool (OQAT) to assess the quality of online nutrition information.
Setting:
The social networking platform formerly known as Twitter (now X).
Design:
Utilising the Twitter search application programming interface (API; v1·1), all tweets that included the word ‘nutrition’, along with associated metadata, were collected on seven randomly selected days in 2021. Tweets were screened, those without a URL were removed and the remainder were grouped by retweet status. Articles (shared via URL) were assessed using the OQAT, and quality levels were assigned (low, satisfactory, high). Differences between retweeted and non-retweeted data were assessed by the Mann–Whitney U test. The Cochran–Mantel–Haenszel test was used to compare information quality by source.
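As a rough illustration of the retweeted versus non-retweeted comparison described above (not the authors' actual analysis code; the file and column names below are hypothetical), the Mann–Whitney U test can be run in Python as follows:

```python
# Illustrative sketch only: hypothetical file and column names, assuming each
# assessed article has an OQAT score and a retweet flag already recorded.
import pandas as pd
from scipy.stats import mannwhitneyu

articles = pd.read_csv("oqat_scores.csv")   # hypothetical: one row per assessed article

retweeted = articles.loc[articles["retweeted"] == 1, "oqat_score"]
non_retweeted = articles.loc[articles["retweeted"] == 0, "oqat_score"]

# Two-sided Mann-Whitney U test comparing quality scores by retweet status
u_stat, p_value = mannwhitneyu(retweeted, non_retweeted, alternative="two-sided")
print(f"U = {u_stat:.0f}, p = {p_value:.3f}")
```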
Results:
In total, 10 573 URLs were collected from 18 230 tweets. After screening for relevance, 1005 articles were assessed (9568 were out of scope), sourced from professional blogs (n 354), news outlets (n 213), companies (n 166), personal blogs (n 120), NGOs (n 60), magazines (n 55), universities (n 19) and government (n 18). Rasch measures indicated the quality levels: 0–3·48, poor; 3·49–6·3, satisfactory; and 6·4–10, high quality. Personal and company-authored blogs were more likely to rank as poor quality. There was a significant difference in the quality of retweeted (n 267, mean rank 461·6) and non-retweeted articles (n 738, mean rank 518·0), U = 87 475, P = 0·006, but no significant effect of information source on quality.
Conclusions:
Lower-quality nutrition articles were more likely to be retweeted. Caution is required when using or sharing articles, particularly from companies and personal blogs, which tend to be lower-quality sources of nutritional information.
Older adults have low levels of mental health literacy relating to anxiety, which may contribute to delaying or not seeking help. Lifestyle interventions, including physical activity (PA), have increasing evidence supporting their effectiveness in reducing anxiety. The COVID-19 pandemic also highlighted the potential for technology to facilitate healthcare provision. This study aimed to investigate older adults' perspectives on their understanding of anxiety, the possible use of PA interventions to reduce anxiety, and whether technology could help this process.
Methods:
The INDIGO trial evaluated a PA intervention for participants aged 60 years and above at risk of cognitive decline and not meeting PA guidelines. Twenty-nine of the INDIGO trial completers, including some with anxiety and/or cognitive symptoms, attended this long-term follow-up study, which included semi-structured qualitative interviews. Transcripts were analyzed thematically.
Results:
Understanding of anxiety was quite diverse amongst participants. Some participants were able to describe anxiety as involving worry, uncertainty and fear, as well as relating it to physical manifestations and feeling out of control. Others had less understanding of the concept of anxiety or found it confusing. Participants generally believed that PA could potentially reduce anxiety and thought that this could occur through a “mindfulness” and/or “physiological” process. Technology use was a more controversial topic, with some participants quite clearly expressing a dislike or distrust of technology, or reporting limited access to, or literacy with, technology. Participants who were supportive of using technology described that it could help with motivation, information provision and health monitoring. Wearable activity monitors were described favorably, with online platforms and portable devices also being options.
Conclusion:
Our results highlight the importance of providing information and education about anxiety to older adults. This may increase awareness of anxiety and reduce delayed help-seeking or failure to seek help at all. Findings also emphasize the need for clinicians to support understanding of anxiety in the older adults they see and to provide information and education where needed. It is likely that PA interventions to reduce anxiety, with the option of a supported technology component, will be acceptable to most older adults.
We assessed susceptibility patterns to newer antimicrobial agents among clinical carbapenem-resistant Klebsiella pneumoniae (CRKP) isolates from patients in long-term acute-care hospitals (LTACHs) from 2014 to 2015. Meropenem-vaborbactam and imipenem-relebactam nonsusceptibility were observed among 9.9% and 9.1% of isolates, respectively. Nonsusceptibility to ceftazidime-avibactam (1.1%) and plazomicin (0.8%) was uncommon.
Despite extensive paleoenvironmental research on the postglacial history of the Kenai Peninsula, Alaska, uncertainties remain regarding the region's deglaciation, vegetation development, and past hydroclimate. To elucidate this complex environmental history, we present new proxy datasets from Hidden and Kelly lakes, located in the eastern Kenai lowlands at the foot of the Kenai Mountains, including sedimentological properties (magnetic susceptibility, organic matter, grain size, and biogenic silica), pollen and macrofossils, diatom assemblages, and diatom oxygen isotopes. We use a simple hydrologic and isotope mass balance model to constrain interpretations of the diatom oxygen isotope data. Results reveal that glacier ice retreated from Hidden Lake's headwaters by ca. 13.1 cal ka BP, and that groundwater was an important component of Kelly Lake's hydrologic budget in the Early Holocene. As the forest developed and the climate became wetter in the Middle to Late Holocene, Kelly Lake reached or exceeded its modern level. In the last ca. 75 years, rising temperature caused rapid changes in biogenic silica content and diatom oxygen isotope values. Our findings demonstrate the utility of mass balance modeling to constrain interpretations of paleolimnologic oxygen isotope data, and that groundwater can exert a strong influence on lake water isotopes, potentially confounding interpretations of regional climate.
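For readers unfamiliar with such models, the following is a minimal sketch of the steady-state water and isotope mass balance on which lake models of this kind are typically built (generic notation; not necessarily the authors' exact formulation):

```latex
% Generic steady-state lake water and isotope mass balance (illustrative notation,
% not necessarily the authors' exact formulation).
% I = total inflow (surface + groundwater), Q = liquid outflow, E = evaporation;
% \delta_I, \delta_L, \delta_E are the isotopic compositions (e.g. \delta^{18}O)
% of inflow, lake water and evaporated vapour.
\[
\begin{aligned}
\frac{dV}{dt} &= I - Q - E = 0, \\
\frac{d(V\,\delta_L)}{dt} &= I\,\delta_I - Q\,\delta_L - E\,\delta_E = 0
\quad\Longrightarrow\quad
\delta_L = \frac{I\,\delta_I - E\,\delta_E}{Q}.
\end{aligned}
\]
```

Since Q = I − E at steady state, the evaporation-to-inflow ratio and the composition of the inflow (including its groundwater fraction) jointly determine δL; this is how such a balance can be used to gauge groundwater influence on lake-water, and hence diatom, oxygen isotope values.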
Cross-species evidence suggests that the ability to exert control over a stressor is a key dimension of stress exposure that may sensitize frontostriatal-amygdala circuitry to promote more adaptive responses to subsequent stressors. The present study examined neural correlates of stressor controllability in young adults. Participants (N = 56; mean age = 23.74, range = 18–30 years) completed either the controllable or uncontrollable stress condition of the first of two novel stressor controllability tasks during functional magnetic resonance imaging (fMRI) acquisition. Participants in the uncontrollable stress condition were yoked to age- and sex-matched participants in the controllable stress condition. All participants were subsequently exposed to uncontrollable stress in the second task, which is the focus of fMRI analyses reported here. A whole-brain searchlight classification analysis revealed that patterns of activity in the right dorsal anterior insula (dAI) during subsequent exposure to uncontrollable stress could be used to classify participants' initial exposure to either controllable or uncontrollable stress with a peak of 73% accuracy. Previous experience of exerting control over a stressor may change the computations performed within the right dAI during subsequent stress exposure, shedding further light on the neural underpinnings of stressor controllability.
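For readers unfamiliar with this kind of multivariate pattern analysis, the sketch below shows the general idea of cross-validated classification of prior stress condition from activity patterns, using scikit-learn on synthetic data; it is not the authors' searchlight pipeline, which scans such a classifier across the whole brain.

```python
# Illustrative only: cross-validated classification of prior stressor condition
# (controllable vs. uncontrollable) from activity patterns in a single region.
# Synthetic data; the published analysis used a whole-brain searchlight.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
n_subjects, n_voxels = 56, 200                 # 56 participants, hypothetical region size
X = rng.normal(size=(n_subjects, n_voxels))    # one activity pattern per participant
y = np.repeat([0, 1], n_subjects // 2)         # 0 = controllable, 1 = uncontrollable first task

clf = SVC(kernel="linear")
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
accuracy = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
print(f"mean cross-validated accuracy: {accuracy.mean():.2f}")
```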
The J. Derek Bewley Career Lectures presented at the triennial meetings of the International Society of Seed Science support early-career seed scientists by providing retrospective views, from those late in their careers, of lessons learned and future implications. Ambition, ability, inspiration, foresight, hard work and opportunity are obvious career requirements. The importance of mentoring and teamwork combined with the clear communication of results, understanding and ideas are emphasized. The role of illustration in research, and its dissemination, is outlined: illustration can support hypothesis development, testing and communication. Climate change may perturb the production of high-quality seed affecting conservation as well as agriculture, horticulture and forestry. An illustrative synthesis of the current understanding of temporal aspects of the effects of seed production environment on seed quality (assessed by subsequent seed storage longevity) is provided for wheat (Triticum aestivum L.) and rice (Oryza sativa L.). Seed science research can contribute to complex global challenges such as future food supplies from seed-propagated crops in our changing climate whilst conserving biological diversity (through seed ecology and technologies such as ex situ plant genetic resources conservation by long-term seed storage in genebanks), but only if that research can be – and then is – applied.
Anticholinergic medications block cholinergic transmission. The central effects of anticholinergic drugs can be particularly marked in patients with dementia. Furthermore, anticholinergics antagonise the effects of cholinesterase inhibitors, the main dementia treatment.
Objectives
This study aimed to assess anticholinergic drug prescribing among dementia patients before and after admission to UK acute hospitals.
Methods
352 patients with dementia were included from 17 hospitals in the UK. All were admitted to surgical, medical or Care of the Elderly wards in 2019. Information about patients’ prescriptions was recorded on a standardised form. An evidence-based online calculator was used to calculate the anticholinergic drug burden of each patient. The correlation between anticholinergic burden scores on admission and at discharge was tested with Spearman’s rank correlation.
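As a minimal illustration of the admission-versus-discharge comparison (hypothetical file and column names; not the study's actual code), paired burden scores can be correlated with scipy:

```python
# Illustrative sketch with hypothetical column names: paired anticholinergic
# burden scores on admission and at discharge for the same patients.
import pandas as pd
from scipy.stats import spearmanr

scores = pd.read_csv("acb_scores.csv")   # hypothetical file: one row per patient
rho, p_value = spearmanr(scores["burden_admission"], scores["burden_discharge"])
print(f"Spearman rho = {rho:.3f}, p = {p_value:.1e}")
```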
Results
Table 1 shows patient demographics. On admission, 37.8% of patients had an anticholinergic burden score ≥1 and 5.68% a score ≥3. At discharge, 43.2% of patients had an anticholinergic burden score ≥1 and 9.1% a score ≥3. The increase was statistically significant (rho = 0.688; p = 2.2 × 10⁻¹⁶). The most common group of anticholinergic medications prescribed at discharge was psychotropics (see Figure 1). Among patients prescribed cholinesterase inhibitors, 44.9% were also taking anticholinergic medications.
Conclusions
This multicentre cross-sectional study found that people with dementia are frequently prescribed anticholinergic drugs, even if also taking cholinesterase inhibitors, and are significantly more likely to be discharged with a higher anticholinergic drug burden than on admission to hospital.
Conflict of interest
This project was planned and executed by the authors on behalf of SPARC (Student Psychiatry Audit and Research Collaborative). We thank the National Student Association of Medical Research for allowing us use of the Enketo platform. Judith Harrison was su
Lusala (Dioscorea hirtiflora Benth. subsp. pedicellata Milne-Redh.) is an important wild edible tuber foraged widely from natural forests in Southern Zambia, but at risk from overharvesting and deforestation. Its propagation was investigated in glasshouse studies to explore potential domestication and future in situ and ex situ genetic resources conservation. Almost all tubers planted with visible shoot buds produced vines, with no effect of tuber size on vine emergence or tuber yield. Few tubers without visible shoot buds at planting produced vines; those that did not produce vines instead re-tuberized. The progeny provided good vine emergence and similar tuber yield, with vines from tubers produced by re-tuberization being more vigorous. Re-tuberization in the absence of vine emergence also occurred in other experiments. Minisetts cut from the proximal end of tubers provided better vine emergence (with more from 20-mm than 10-mm-long sections) and greater tuber yield than mid- or distal minisetts. Nodal stem cuttings rooted well, vined, and provided small tubers. This study shows that lusala can be propagated successfully from tubers, minisetts, nodal vine cuttings, or mini-tubers from nodal vine cuttings, for genetic resources conservation and/or domestication. Domestication is likely to be hampered by the long period required for vines to emerge and establish. More sustainable foraging, including re-planting in natural forests, is recommended to balance consumption of lusala in the region and promote its long-term conservation.
We reviewed the current state of research on the applications of transcranial magnetic stimulation (TMS) and repetitive TMS (rTMS) in understanding the pathophysiology of ADHD and in its treatment.
Objectives
To assess how TMS has furthered our knowledge of neurobiological models of ADHD and to consider further research. To look at possible applications of rTMS in the management of ADHD and to evaluate the current state of research.
Methods
Literature review using an online search.
Results
The investigative studies are small in number but show some promising results. TMS adds weight to the theory of a hypofunctional dopaminergic circuit involved in ADHD pathophysiology. The two treatment studies using rTMS show some benefit in the treatment of ADHD, such as brief improvement in attention. These studies, however, are very preliminary, small in number and suffer from methodological difficulties.
Conclusions
TMS has provided some useful information about the likely pathophysiology of ADHD, and results show that it is a safe and effective way to investigate and treat this condition. Much more research is needed to investigate the potential applications of this technology.
Mechanistic models (MMs) have served as causal pathway analysis and ‘decision-support’ tools within animal production systems for decades. Such models quantitatively define how a biological system works based on causal relationships and use that cumulative biological knowledge to generate predictions and recommendations (in practice) and generate/evaluate hypotheses (in research). Their limitations revolve around obtaining sufficiently accurate inputs, user training and accuracy/precision of predictions on-farm. The new wave in digitalization technologies may negate some of these challenges. New data-driven (DD) modelling methods such as machine learning (ML) and deep learning (DL) examine patterns in data to produce accurate predictions (forecasting, classification of animals, etc.). The deluge of sensor data and new self-learning modelling techniques may address some of the limitations of traditional MM approaches – access to input data (e.g. sensors) and on-farm calibration. However, most of these new methods lack transparency in the reasoning behind predictions, in contrast to MM that have historically been used to translate knowledge into wisdom. The objective of this paper is to propose means to hybridize these two seemingly divergent methodologies to advance the models we use in animal production systems and support movement towards truly knowledge-based precision agriculture. In order to identify potential niches for models in animal production of the future, a cross-species (dairy, swine and poultry) examination of the current state of the art in MM and new DD methodologies (ML, DL analytics) is undertaken. We hypothesize that there are several ways via which synergy may be achieved to advance both our predictive capabilities and system understanding, being: (1) building and utilizing data streams (e.g. intake, rumination behaviour, rumen sensors, activity sensors, environmental sensors, cameras and near IR) to apply MM in real-time and/or with new resolution and capabilities; (2) hybridization of MM and DD approaches where, for example, a ML framework is augmented by MM-generated parameters or predicted outcomes and (3) hybridization of the MM and DD approaches, where biological bounds are placed on parameters within a MM framework, and the DD system parameterizes the MM for individual animals, farms or other such clusters of data. As animal systems modellers, we should expand our toolbox to explore new DD approaches and big data to find opportunities to increase understanding of biological systems, find new patterns in data and move the field towards intelligent, knowledge-based precision agriculture systems.
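As a toy illustration of option (3) above, where a data-driven layer parameterizes a mechanistic model within biological bounds, the following Python sketch uses invented sensor features, bounds and a deliberately simple growth relation; none of these specifics come from the paper.

```python
# Toy hybrid model: a data-driven regressor estimates an animal-specific
# parameter, which is clipped to biological bounds and fed to a simple
# mechanistic growth model. All names, bounds and data are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Mechanistic part: daily gain from intake above an assumed maintenance requirement
def mechanistic_gain(intake_kg_dm, efficiency):
    maintenance = 2.0                                   # kg DM/day, illustrative value
    return efficiency * np.maximum(intake_kg_dm - maintenance, 0.0)

# Data-driven part: learn an animal-specific efficiency from (simulated) sensor features
rng = np.random.default_rng(1)
features = rng.normal(size=(200, 4))                    # e.g. activity, rumination, temperature, parity
true_efficiency = 0.25 + 0.05 * features[:, 0]          # synthetic training target
ml_model = RandomForestRegressor(random_state=1).fit(features, true_efficiency)

# Hybrid prediction for new animals, with biological bounds on the learned parameter
new_features = rng.normal(size=(5, 4))
efficiency_hat = np.clip(ml_model.predict(new_features), 0.10, 0.40)   # assumed bounds
intake = rng.uniform(8, 14, size=5)                     # kg DM/day from intake sensors (simulated)
print(mechanistic_gain(intake, efficiency_hat))
```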
Drought and high temperature each damage rice (Oryza sativa L.) crops. Their effect during seed development and maturation on subsequent seed quality development was investigated in Japonica (cv. Gleva) and Indica (cv. Aeron 1) plants grown in controlled environments subjected to drought (irrigation ended) and/or brief high temperature (HT; 3 days at 40/30°C). Ending irrigation early in cv. Gleva (7 or 14 days after anthesis, DAA) resulted in earlier plant senescence, more rapid decline in seed moisture content, more rapid seed quality development initially, but substantial decline later in planta in the ability of seeds to germinate normally. Subsequent seed storage longevity amongst later harvests was greatest with no drought because with drought it declined from 16 or 22 DAA onwards in planta, 9 or 8 days after irrigation ended, respectively. Later drought (14 or 28 DAA) also reduced seed longevity at harvest maturity (42 DAA). Well-irrigated plants provided poorer longevity the earlier during seed development they were exposed to HT (greatest at anthesis and histodifferentiation; no effect during seed maturation). Combining drought and HT damaged seed quality more than each stress alone, and more so in the Japonica cv. Gleva than the Indica cv. Aeron 1. Overall, the earlier plant drought occurred the greater the damage to subsequent seed quality; seed quality was most vulnerable to damage from plant drought and HT at anthesis and histodifferentiation; and seed quality of the Indica rice was more resilient to damage from these stresses than the Japonica.
This chapter presents reflections on next-generation ethical issues by four deans at the University of Southern California: Public Policy, Medicine, Business, and Engineering. Each of the deans was asked to reflect on some of the important ethical issues that they believe we face today or that we will face in the near future. Their responses follow.
The long-standing hypothesis that seed quality improves during seed filling, is greatest at the end of seed filling, and declines thereafter (because seed deterioration was assumed to begin then), provided a template for research in seed quality development. It was rejected by investigations where seed quality was shown to improve throughout both seed development and maturation until harvest maturity, before seed deterioration was first observed. Several other temporal patterns of seed quality development and decline have also been reported. These are portrayed and compared. The assessment suggests that the original hypothesis was too simple, because it combined several component hypotheses: (a) the seed improvement (only) phase ends before seed deterioration (only) commences; (b) there is only a brief single point in time during seed development and maturation when, in all circumstances, seed quality is maximal; (c) the seed quality improvement phase coincides perfectly with seed filling, with deterioration only post-seed filling. It is concluded that the search for the single point of maximum seed quality was a false quest because (a) seed improvement and deterioration may cycle (sequentially if not simultaneously) during seed development and maturation; (b) the relative sensitivity of the rates of improvement and deterioration to environment may differ; (c) the period of maximum quality may be brief or extended. Hence, when maximum quality is first attained, and for how long it is maintained, during seed development and maturation varies with genotype and environment. This is pertinent to quality seed production in current and future climates as it will be affected by climate change and a likelihood of more frequent coincidence of brief periods of extreme temperatures with highly sensitive phases of seed development and maturation. This is a possible tipping point for food security and for ecological diversity.
England has recently started a new paediatric influenza vaccine programme using a live-attenuated influenza vaccine (LAIV). There is uncertainty over how well the vaccine protects against more severe end-points. A test-negative case–control study was used to estimate vaccine effectiveness (VE) against laboratory-confirmed influenza hospitalisation in vaccine-eligible children aged 2–16 years in England in the 2015–2016 season, using a national sentinel laboratory surveillance system. Logistic regression was used to estimate the VE with adjustment for sex, risk group, age group, region, ethnicity, deprivation and month of sample collection. A total of 977 individuals were included in the study (348 cases and 629 controls). The overall adjusted VE for all study ages and vaccine types was 33.4% (95% confidence interval (CI) 2.3–54.6) after adjusting for age group, sex, index of multiple deprivation, ethnicity, region, sample month and risk group. Risk group was shown to be an important confounder. The adjusted VE against all influenza types was 41.9% (95% CI 7.3–63.6) for the live-attenuated vaccine and 28.8% (95% CI −31.1 to 61.3) for the inactivated vaccine. The study provides evidence of the effectiveness of influenza vaccination in preventing hospitalisation due to laboratory-confirmed influenza in children in 2015–2016 and continues to support the rollout of the LAIV childhood programme.
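As a generic illustration of how VE is derived in a test-negative design (hypothetical variable names; not the study's code), the adjusted odds ratio for vaccination among cases versus test-negative controls is estimated by logistic regression and converted as VE = (1 − OR) × 100%:

```python
# Illustrative test-negative VE calculation with hypothetical column names.
# case = 1 if influenza-positive, 0 if test-negative; covariates as listed in the text.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

data = pd.read_csv("tnd_children.csv")   # hypothetical dataset
model = smf.logit(
    "case ~ vaccinated + C(age_group) + C(sex) + C(region) + C(risk_group) + C(sample_month)",
    data=data,
).fit()

odds_ratio = np.exp(model.params["vaccinated"])
ci_low, ci_high = np.exp(model.conf_int().loc["vaccinated"])
ve = (1 - odds_ratio) * 100
print(f"adjusted VE = {ve:.1f}% (95% CI {(1 - ci_high) * 100:.1f} to {(1 - ci_low) * 100:.1f})")
```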
Effective communication is a critical part of managing an emergency. During an emergency, the ways in which health agencies normally communicate warnings may not reach all of the intended audience. Not all communities are the same, and households within communities are diverse. Because different communities prefer different communication methods, community leaders and emergency planners need to know their communities’ preferred methods for seeking information about an emergency. This descriptive report explores findings from previous community assessments that have collected information on communication preferences, including television (TV), social media, and word-of-mouth (WoM) delivery methods. Data were analyzed from 12 Community Assessments for Public Health Emergency Response (CASPERs) conducted from 2014 to 2017 that included questions regarding primary and trusted communication sources. A CASPER is a rapid needs assessment designed to gather household-based information from a community. In 75.0% of the CASPERs, households reported TV as their primary source of information for specific emergency events (range = 24.0%-83.1%). The proportion of households reporting social media as their primary source of information differed widely across CASPERs (3.2%-41.8%). In five of the CASPERs, nearly one-half of households reported WoM as their primary source of information. These CASPERs were conducted in response to a specific emergency (ie, chemical spill, harmful algal bloom, hurricane, and flood). The CASPERs conducted as part of a preparedness activity had lower percentages of households reporting WoM as their primary source of information (8.3%-10.4%). The findings in this report demonstrate the need for emergency plans to include hybrid communication models, combining traditional methods with newer technologies to reach the broadest audience. Although TV was the most commonly reported preferred source of information, segments of the population relied on social media and WoM messaging. By using multiple methods for risk communication, emergency planners are more likely to reach the whole community and engage vulnerable populations that might not have access to, trust in, or understanding of traditional news sources. Multiple communication channels that include user-generated content, such as social media and WoM, can increase the timeliness of messaging and provide community members with message confirmation from sources they trust, encouraging them to take protective public health actions.
Wolkin AF, Schnall AH, Nakata NK, Ellis EM. Getting the Message Out: Social Media and Word-of-Mouth as Effective Communication Methods during Emergencies. Prehosp Disaster Med. 2019;34(1):89–94.