Previous observational studies suggested that vitamin D may control the absorption of iron (Fe) by inhibition of hepcidin, but the causal relevance of these associations is uncertain. Using placebo-controlled randomisation, we assessed the effects of supplementation with vitamin D on biochemical markers of Fe status and erythropoiesis in community-dwelling older people living in the UK. The BEST-D trial, designed to establish the optimum dose of vitamin D3 for future trials, had 305 participants, aged 65 years or older, randomly allocated to 4000 IU vitamin D3 (n 102), 2000 IU vitamin D3 (n 102) or matching placebo (n 101). We estimated the effect of vitamin D allocation on plasma levels of hepcidin, soluble transferrin receptor (sTfR), ferritin, Fe, transferrin, transferrin saturation (TSAT%) and the sTfR–ferritin index. Despite increases in 25-hydroxyvitamin D, neither dose had significant effects on biochemical markers of Fe status or erythropoiesis. Geometric mean concentrations were similar in the vitamin D3 arms v. placebo for hepcidin (20·7 [se 0·90] v. 20·5 [1·21] ng/ml), sTfR (0·69 [0·010] v. 0·70 [0·015] µg/ml), ferritin (97·1 [2·81] v. 97·8 [4·10] µg/l) and sTfR–ferritin ratio (0·36 [0·006] v. 0·36 [0·009]), respectively, while arithmetic mean levels were similar for Fe (16·7 [0·38] v. 17·3 [0·54] µmol/l), transferrin (2·56 [0·014] v. 2·60 [0·021] g/dl) and TSAT% (26·5 [0·60] v. 27·5 [0·85]). The proportions of participants with ferritin < 15 µg/l and TSAT < 16% were unaltered by vitamin D3, suggesting that 12 months of daily supplementation with moderately high doses of vitamin D3 is unlikely to alter the Fe status of older adults.
Seed genebanks must maintain collections of healthy seeds and regenerate accessions before seed viability declines. Seed shelf life is often characterized at the species level; however, large, unexplained variation among genetic lines within a species can and does occur. This variation contributes to unreliable predictions of seed quality decline with storage time. To assess variation in seed longevity and aid in timing regeneration, ten varieties of pea (Pisum sativum L.), chickpea (Cicer arietinum L.) and lentil (Lens culinaris Medikus subsp. culinaris) from the Australian Grains Genebank were stored at moderate temperature (20°C) and moisture (7–11% water, relative humidity [RH] ~30%), and deterioration was assessed by yearly germination tests for 20 years. The decline in germination was fitted to a sigmoidal model, and the time corresponding to 50% germination (P50) was used to express seed longevity for each genetic line. The feasibility of using RNA fragmentation to assess declining seed health was measured using the RNA integrity number (RIN) of RNA extracted from seeds that had been stored for 13 and 20 years. Seed lots of legume grains that maintained high survival throughout the 20 years (i.e. they aged more slowly than other lines) had higher RIN than samples that degraded faster. RIN was lower in embryonic axes than in cotyledons in the more deteriorated samples, perhaps indicating that axes exhibit symptoms of ageing sooner than cotyledons. Overall, RIN appears to be associated with germination-based longevity indicators for these legumes, indicating that RIN decline can be used to assess ageing rate, which is needed to optimize viability monitoring.
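The abstract does not give the exact sigmoidal form used, so the sketch below assumes a simple logistic decline and fits it to hypothetical yearly germination data, purely to illustrate how P50 (the storage time at which germination falls to 50%) can be estimated for a genetic line.

```python
# Illustrative sketch only: the sigmoidal form (logistic) and the germination
# data are assumptions, not the study's model or measurements.
import numpy as np
from scipy.optimize import curve_fit

def sigmoid_decline(t, p50, slope):
    """Germination (%) declining logistically with storage time t (years)."""
    return 100.0 / (1.0 + np.exp(slope * (t - p50)))

# Hypothetical yearly germination test results for one genetic line (years 0-20).
years = np.arange(0, 21)
germination = np.array([98, 97, 97, 96, 95, 94, 93, 91, 88, 84, 78,
                        70, 60, 49, 38, 28, 20, 14, 9, 6, 4], dtype=float)

# Fit the curve; p50 is the storage time at which germination falls to 50%.
(p50, slope), _ = curve_fit(sigmoid_decline, years, germination, p0=(12.0, 0.5))
print(f"Estimated P50: {p50:.1f} years (slope {slope:.2f} per year)")
```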
Scholars, pundits, and politicians use opinion surveys to study citizen beliefs about political facts, such as the current unemployment rate, and more conspiratorial beliefs, such as whether Barack Obama was born abroad. Many studies, however, ignore acquiescence-response bias, the tendency for survey respondents to endorse any assertion made in a survey question regardless of content. Using new surveys that field questions asked in recent scholarship, we show that acquiescence bias inflates estimated incidence of conspiratorial beliefs and political misperceptions in the United States and China by up to 50%. Acquiescence bias is disproportionately prevalent among more ideological respondents, inflating correlations between political ideology, such as conservatism, and endorsement of conspiracies or misperception of facts. We propose and demonstrate two methods to correct for acquiescence bias.
Little is known about environmental factors that may influence associations between genetic liability to suicidality and suicidal behavior.
Methods
This study examined whether a suicidality polygenic risk score (PRS) derived from a large genome-wide association study (N = 122,935) was associated with suicide attempts in a population-based sample of European-American US military veterans (N = 1664; 92.5% male), and whether cumulative lifetime trauma exposure moderated this association.
Results
Eighty-five veterans (weighted 6.3%) reported a history of suicide attempt. After adjusting for sociodemographic and psychiatric characteristics, suicidality PRS was associated with lifetime suicide attempt (odds ratio 2.65; 95% CI 1.37–5.11). A significant suicidality PRS-by-trauma exposure interaction emerged, such that veterans with higher levels of suicidality PRS and greater trauma burden had the highest probability of lifetime suicide attempt (16.6%), whereas the probability of attempts was substantially lower among those with high suicidality PRS and low trauma exposure (1.4%). The PRS-by-trauma interaction effect was enriched for genes implicated in cellular and developmental processes, and nervous system development, with variants annotated to the DAB2 and SPNS2 genes, which are implicated in inflammatory processes. Drug repurposing analyses revealed upregulation of suicide gene-sets in the context of medrysone, a drug targeting chronic inflammation, and clofibrate, a triacylglyceride-lowering agent.
Conclusion
Results suggest that genetic liability to suicidality is associated with increased risk of suicide attempt among veterans, particularly in the presence of high levels of cumulative trauma exposure. Additional research is warranted to investigate whether incorporation of genomic information may improve suicide prediction models.
Patients presenting to hospital with suspected coronavirus disease 2019 (COVID-19), based on clinical symptoms, are routinely placed in a cohort together until polymerase chain reaction (PCR) test results are available. This procedure leads to delays in transfers to definitive areas and high nosocomial transmission rates. FebriDx is a finger-prick point-of-care test (PoCT) that detects an antiviral host response and has a high negative predictive value for COVID-19. We sought to determine the clinical impact of using FebriDx for COVID-19 triage in the emergency department (ED).
Design:
We undertook a retrospective observational study evaluating the real-world clinical impact of FebriDx as part of an ED COVID-19 triage algorithm.
Setting:
Emergency department of a university teaching hospital.
Patients:
Patients presenting with symptoms suggestive of COVID-19, placed in a cohort in a ‘high-risk’ area, were tested using FebriDx. Patients without a detectable antiviral host response were then moved to a lower-risk area.
Results:
Between September 22, 2020, and January 7, 2021, 1,321 patients were tested using FebriDx, and 1,104 (84%) did not have a detectable antiviral host response. Among 1,104 patients, 865 (78%) were moved to a lower-risk area within the ED. The median times spent in a high-risk area were 52 minutes (interquartile range [IQR], 34–92) for FebriDx-negative patients and 203 minutes (IQR, 142–255) for FebriDx-positive patients (difference of −134 minutes; 95% CI, −144 to −122; P < .0001). The negative predictive value of FebriDx for the identification of COVID-19 was 96% (661 of 690; 95% CI, 94%–97%).
Conclusions:
FebriDx improved the triage of patients with suspected COVID-19 and reduced the time that severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) PCR-negative patients spent in a high-risk area alongside SARS-CoV-2–positive patients.
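As an illustrative check of the negative predictive value reported above (661 of 690), the sketch below recomputes the proportion and a 95% confidence interval; the Wilson score interval is an assumption, since the abstract does not state which interval method was used.

```python
# Minimal sketch: reproduce the reported NPV of 96% (95% CI 94%-97%) from 661/690.
# The Wilson score interval is assumed; the abstract does not name the method used.
from math import sqrt

def wilson_interval(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    centre = p + z**2 / (2 * n)
    margin = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    denom = 1 + z**2 / n
    return (centre - margin) / denom, (centre + margin) / denom

npv = 661 / 690
lo, hi = wilson_interval(661, 690)
print(f"NPV = {npv:.1%} (95% CI {lo:.1%}-{hi:.1%})")  # ~95.8% (94.0%-97.1%)
```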
The coronavirus disease-2019 (COVID-19) pandemic has caused myriad health, social, and economic stressors. To date, however, no known study has examined changes in mental health during the pandemic in the U.S. military veteran population.
Methods
Data were analyzed from the 2019–2020 National Health and Resilience in Veterans Study, a nationally representative, prospective cohort survey of 3078 veterans. Pre-to-peri-pandemic changes in psychiatric symptoms were evaluated, as well as pre-pandemic risk and protective factors and pandemic-related correlates of increased psychiatric distress.
Results
The prevalence of generalized anxiety disorder (GAD) positive screens increased from pre- to peri-pandemic (7.1% to 9.4%; p < 0.001) and was driven by an increase among veterans aged 45–64 years (8.2% to 13.5%; p < 0.001), but the prevalence of major depressive disorder and posttraumatic stress disorder positive screens remained stable. Using a continuous measure of psychiatric distress, an estimated 13.2% of veterans reported a clinically meaningful pre-to-peri-pandemic increase in distress (mean = 1.1 standard deviations). Veterans with a larger pre-pandemic social network size and secure attachment style were less likely to experience increased distress, whereas veterans reporting more pre-pandemic loneliness were more likely to experience increased distress. Concerns about pandemic-related social losses, mental health effects of COVID-19, and housing stability during the pandemic were associated with increased distress, over and above pre-pandemic factors.
Conclusions
Although most U.S. veterans showed resilience to mental health problems nearly 1 year into the pandemic, the prevalence of GAD positive screens increased, particularly among middle-aged veterans, and one in seven veterans experienced increased distress. Clinical implications of these findings are discussed.
Analysis of human remains and a copper band found in the center of a Late Archaic (ca. 5000–3000 cal BP) shell ring demonstrates an exchange network between the Great Lakes and the coastal southeast United States. Similarities in mortuary practices suggest that the movement of objects between these two regions was more direct and unmediated than archaeologists previously assumed based on “down-the-line” models of exchange. These findings challenge prevalent notions that view preagricultural Native American communities as relatively isolated from one another and suggest instead that wide social networks spanned much of North America thousands of years before the advent of domestication.
To examine the feasibility of using social media to assess the consumer nutrition environment by comparing sentiment expressed in Yelp reviews with information obtained from a direct observation audit instrument for grocery stores.
Design
Trained raters used the Nutrition Environment Measures Survey in Stores (NEMS-S) in 100 grocery stores from July 2015 to March 2016. Yelp reviews were available for sixty-nine of these stores and were retrieved in February 2017 using the Yelp Application Program Interface. A sentiment analysis was conducted to quantify the perceptions of the consumer nutrition environment in the review text. Pearson correlation coefficients (ρ) were used to compare NEMS-S scores with Yelp review text on food availability, quality, price and shopping experience.
Setting
Detroit, Michigan, USA.
Participants
None.
Results
Yelp reviews contained more comments about food availability and the overall shopping experience than food price and food quality. Negative sentiment about food prices in Yelp review text and the number of dollar signs on Yelp were positively correlated with observed food prices in stores (ρ=0·413 and 0·462, respectively). Stores with greater food availability were rated as more expensive on Yelp. Other aspects of the food store environment (e.g. overall quality and shopping experience) were captured only in Yelp.
Conclusions
While Yelp cannot replace in-person audits for collecting detailed information on the availability, quality and cost of specific food items, Yelp holds promise as a cost-effective means to gather information on the overall cost, quality and experience of food stores, which may be relevant for nutrition outcomes.
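The abstract does not detail the sentiment-scoring pipeline, so the sketch below only illustrates the general comparison described above: correlating a store-level audit score with a sentiment score derived from review text. The store values and variable names are hypothetical, not the study's data.

```python
# Illustrative sketch only: hypothetical store-level values, not the study's data
# or its actual sentiment pipeline. Shows the kind of comparison described above:
# correlating in-store audit scores with sentiment derived from Yelp review text.
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-store values for five stores.
nems_price_score = np.array([3.0, 5.0, 6.5, 8.0, 9.5])      # audit-based price measure
yelp_price_sentiment = np.array([-0.1, 0.2, 0.3, 0.5, 0.6])  # negative-to-positive sentiment

r, p_value = pearsonr(nems_price_score, yelp_price_sentiment)
print(f"Pearson correlation: {r:.3f} (p = {p_value:.3f})")
```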
In five field experiments from 1986 to 1988, herbicides were evaluated alone and in combinations for weed control in water-seeded rice. Combinations of bensulfuron with either molinate or thiobencarb, applied into the paddy water at the 2-leaf stage of rice, controlled all broadleaf and sedge weeds and 92% or more of early watergrass. These combinations were equivalent to a commercial standard of molinate at the 2-leaf stage followed by a separate application of bentazon to the drained paddy at midtillering.
The functional relationships between rainfall intensities and amounts, and the washoff of dicamba and 3,6-DCSA from turfgrass foliage were determined. Dicamba was applied to Kentucky bluegrass field plots and the turfgrass was subjected to 2 to 58 mm of simulated rainfall 18 to 48 h later. Rainfall was applied at an average intensity of 20.6 or 39.9 mm h⁻¹. The 39.9 mm h⁻¹ intensity reduced dicamba washoff by 10% for a given amount of rainfall. Washoff of 3,6-DCSA was independent of rainfall intensity. When averaged over intensities, washoff of dicamba was best described by the equation y = 1 − 0.341x^0.187, and 3,6-DCSA washoff by the equation y = exp(−0.210x), where x represents millimeters of rainfall and y, the proportion of compound remaining on the foliage after rainfall.
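Because the two fitted washoff equations are reported explicitly, they can be evaluated directly; the short sketch below computes the proportion of dicamba and 3,6-DCSA remaining on foliage over the range of rainfall amounts studied (2 to 58 mm).

```python
# Evaluate the fitted washoff equations reported above: x is rainfall (mm) and
# y is the proportion of applied compound remaining on the foliage after rainfall.
import numpy as np

def dicamba_remaining(x_mm):
    """y = 1 - 0.341 * x**0.187, averaged over rainfall intensities."""
    return 1.0 - 0.341 * np.power(x_mm, 0.187)

def dcsa_remaining(x_mm):
    """y = exp(-0.210 * x) for 3,6-DCSA (independent of rainfall intensity)."""
    return np.exp(-0.210 * x_mm)

for rain in (2.0, 10.0, 25.0, 58.0):
    print(f"{rain:4.0f} mm rain: dicamba {dicamba_remaining(rain):.2f}, "
          f"3,6-DCSA {dcsa_remaining(rain):.3f} remaining")
```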
The Tasmanian Cenozoic macrofossil record is relatively rich, and changes that have occurred in the vegetation of the region are becoming increasingly well understood. The record is essentially one of rainforest elements, especially in the Paleogene, but taxa that are now common in sclerophyllous heathlands and woodlands are increasingly prevalent in Quaternary sediments.
Extant Tasmanian rainforest is renowned for its beauty, and botanists have long recognised its marked taxonomic and structural similarity to other southern hemisphere ‘cool temperate’ forests of New Zealand and Chile. These are generally dominated by Nothofagus trees, their boughs laden with lichens and verdant shrouds of bryophytes. Other links are often made by phytogeographers to similar forests in high altitude regions of northern New South Wales and the much more species-rich vegetation of the generally montane regions of New Guinea and New Caledonia where Nothofagus also grows. A striking aspect of these forests is the presence of a variety of conifers, principally Podocarpaceae, but also Cupressaceae and Araucariaceae. In Tasmania the Araucariaceae are extinct, but the region is unique in the southern hemisphere in having a genus of Taxodiaceae, Athrotaxis. Athrotaxis spp. are often associated with Australia's only winter deciduous plant, Nothofagus gunnii, in montane regions of the island. The macrofossil record shows conclusively that the current diversity of Tasmania's woody rainforest flora is very much lower than at any other time during the Cenozoic. It confirms that there are strong floristic links to regions as widespread as eastern and southwestern mainland Australia, southern South America, New Zealand and New Guinea. In fact, Tasmanian Paleogene floras contain a wealth of taxa that are closely related to plants now confined to these regions.
Apart from the relatively large tracts of rainforest in Tasmania, closed forest lacking eucalypts is now confined to small patches along the east coast of Australia. In contrast to mainland Australia, Tasmania is relatively mountainous and has a well-developed woody alpine vegetation, dominated by shrubs of the Asteraceae, Epacridaceae, Myrtaceae and Proteaceae.
There is now expert consensus that directly observing the work of trainee therapists, rather than relying upon self-report of sessions, is critical to providing the accurate feedback required to attain a range of competencies. Despite this consensus, however, and broadly positive attitudes towards video review among supervisees, video feedback methods remain under-utilized in clinical supervision. This paper outlines some of the weaknesses that affect feedback based solely on self-report methods, before introducing some of the specific benefits that video feedback methods can offer the training and supervision context. It is argued that video feedback methods fit seamlessly into CBT supervision, providing direct, accessible, effective, efficient and accurate observation of the learning situation, and optimizing the chances for accurate self-reflection and the planning of further improvements in performance. To demonstrate the utility of video feedback techniques in CBT supervision, two specific video feedback techniques are introduced and described: the Give-me-5 technique and the I-spy technique. Case examples of CBT supervision using the two techniques are provided and explored, and guidance is given on the supervision contexts in which each of the two techniques is suitable, individually and in tandem. Finally, best practice guidelines for the use of video feedback techniques in supervision are outlined.
The incidence of recreational water-associated outbreaks in the United States has significantly increased, driven, at least in part, by outbreaks both caused by Cryptosporidium and associated with treated recreational water venues. Because of the parasite's extreme chlorine tolerance, transmission can occur even in well-maintained treated recreational water venues (e.g. pools), and a focal cryptosporidiosis outbreak can evolve into a community-wide outbreak associated with multiple recreational water venues and settings (e.g. childcare facilities). In August 2004 in Auglaize County, Ohio, multiple cryptosporidiosis cases were identified and anecdotally linked to pool A. Within 5 days of the first case being reported, pool A was hyperchlorinated to achieve 99·9% Cryptosporidium inactivation. Eleven days later, a case-control study was launched to epidemiologically ascertain the outbreak source. A total of 150 confirmed and probable cases were identified; the temporal distribution of illness onset was peaked, indicating a point-source exposure. Cryptosporidiosis was significantly associated with swimming in pool A (matched odds ratio 121·7, 95% confidence interval 27·4–∞) but not with any other venue or setting. The findings of this investigation suggest that proactive implementation of control measures, when increased Cryptosporidium transmission is detected but before an outbreak source is epidemiologically ascertained, might prevent a focal cryptosporidiosis outbreak from evolving into a community-wide outbreak.
Accuracy of intracranial magnetic resonance angiography (MRA) and reliability of interpretation are not well established compared to conventional selective catheter angiography. The purpose of this study was to determine the accuracy of MRA in evaluation of intracranial vessels in acute stroke and transient ischemic attack (TIA) patients.
Methods:
Twenty-nine patients (seven females, 22 males; median age 53) with acute stroke or TIA were enrolled into the study. All patients underwent both MRA using a 3 T clinical magnet and conventional angiography within 48 hours. Median time between MRA and angiography was 263 minutes. Conventional angiography preceded MRA in 15 cases. Fourteen patients received thrombolysis during MRA or angiography. National Institutes of Health Stroke Scale scores were obtained prior to the MR exam. One neuroradiologist rated all conventional angiograms, which were used as gold standard. Five observers, blinded to conventional angiography results and all clinical information except symptom side, rated the MR angiograms. Kappa statistics were used to assess reliability; contingency tables were used to assess accuracy of non-enhanced and enhanced MRA.
Results:
Two hundred and fifty-two intracranial vessels were assessed. Agreement between raters was good for both non-enhanced (κ = 0.50) and gadolinium-enhanced (κ = 0.46) images. A total of 26 vessels were occluded on conventional angiography (digital subtraction angiography, DSA). Overall, the non-enhanced MRA showed a sensitivity of 84.2% (95% CI 60.4–96.6) and a specificity of 84.6% (95% CI 78.6–89.4). The enhanced MRA showed a sensitivity of 69.2% (95% CI 38.6–90.9) and a specificity of 73.6% (95% CI 65.5–80.7).
Conclusions:
Magnetic resonance angiography is a good non-invasive screening tool for assessing intracranial vessel status in acute ischemic stroke. Angiography remains the gold standard for definitive assessment of the intracranial circulation.
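The abstract above reports kappa values pooled across five observers; as a minimal illustration of the statistic, the sketch below computes Cohen's kappa for two hypothetical raters classifying vessels as occluded or patent. The ratings and the two-rater simplification are assumptions for illustration, not the study's data or analysis.

```python
# Minimal sketch of the kappa statistic used to assess inter-rater reliability.
# Hypothetical two-rater example; the study compared five observers.
import numpy as np

def cohens_kappa(r1, r2):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    p_observed = np.mean(r1 == r2)
    labels = np.union1d(r1, r2)
    p_chance = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in labels)
    return (p_observed - p_chance) / (1.0 - p_chance)

# Hypothetical ratings for 12 vessels (1 = occluded, 0 = patent).
rater_a = [0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 1, 0]
rater_b = [0, 0, 1, 0, 0, 0, 0, 1, 0, 1, 1, 0]
print(f"Cohen's kappa: {cohens_kappa(rater_a, rater_b):.2f}")
```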
Longitudinal, patient-level data on resource use and costs after an ischemic stroke are lacking in Canada. The objectives of this analysis were to calculate costs for the first year post-stroke and determine the impact of disability on costs.
Methodology:
The Economic Burden of Ischemic Stroke (BURST) Study was a one-year prospective study with a cohort of ischemic stroke patients recruited at 12 Canadian stroke centres. Clinical history, disability, health preference and resource utilization information was collected at discharge, three months, six months and one year. Resources included direct medical costs (2009 CAN$) such as emergency services, hospitalizations, rehabilitation, physician services, diagnostics, medications, allied health professional services, homecare, medical/assistive devices, changes to residence and paid caregivers, as well as indirect costs. Results were stratified by disability measured at discharge using the modified Rankin Scale (mRS): non-disabling stroke (mRS 0-2) and disabling stroke (mRS 3-5).
Results:
We enrolled 232 ischemic stroke patients (age 69.4 ± 15.4 years; 51.3% male) and 113 (48.7%) were disabled at hospital discharge. The average annual cost was $74,353; $107,883 for disabling strokes and $48,339 for non-disabling strokes.
Conclusions:
An average annual cost for ischemic stroke was calculated; a disabling stroke was associated with a two-fold increase in costs compared to a non-disabling stroke. Costs during the hospitalization-to-three-months phase were the largest contributor to the annual cost. A “back of the envelope” calculation using 38,000 stroke admissions and the average annual cost yields $2.8 billion as the burden of ischemic stroke.
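The “back of the envelope” figure follows directly from the reported inputs; the one-line check below multiplies 38,000 admissions by the $74,353 average annual cost, giving roughly $2.8 billion.

```python
# Back-of-the-envelope check of the reported burden estimate:
# 38,000 ischemic stroke admissions x $74,353 average annual cost per patient.
admissions = 38_000
avg_annual_cost = 74_353          # 2009 CAN$, from the cohort above
total = admissions * avg_annual_cost
print(f"Estimated annual burden: ${total:,.0f} (~${total / 1e9:.1f} billion)")
```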