Knowledge of the status of ecosystems is vital for developing and implementing conservation strategies. This is particularly relevant to the Arctic, where the need for biodiversity conservation and monitoring has long been recognised, but where limited local capacity and logistic barriers make surveys challenging. This paper demonstrates how long-term monitoring programmes outside the Arctic can contribute to the development of composite trend indicators, using monitoring of the annual abundance and population-level reproduction of migratory Arctic-breeding waterbirds on their temperate non-breeding areas. Using data from the UK and the Netherlands, countries with year-round waterbird monitoring schemes that support substantial shares of Arctic-breeding waterbird populations, we present example multi-species abundance and productivity indicators for the migratory pathways used by different biogeographical populations of Arctic-breeding wildfowl and wader species in the East Atlantic Flyway. These composite trend indicators show that long-term increases in population size have slowed markedly in recent years and, in several cases, have given way to declines over at least the last decade. These results constitute a proof of concept. Some other non-Arctic countries located on the flyways of Arctic-breeding waterbirds also monitor abundance and breeding success annually, and we advocate that future development of "Arctic waterbird indicators" be as inclusive of data as possible, both to derive the most robust outputs and to help account for current changes in non-breeding waterbird distributions. Incorporating non-Arctic datasets into assessments of the status of Arctic biodiversity is highly desirable because logistic constraints limit effective population-scale monitoring within the Arctic region itself; such datasets enable, in effect, "monitoring at a distance".
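As an illustration of how such a composite trend indicator can be assembled, the sketch below combines species-level annual abundance indices into a single multi-species index by taking their geometric mean across species, one common approach to composite biodiversity indicators; the species values shown are hypothetical and are not drawn from the monitoring schemes described above.

```python
import numpy as np

def composite_indicator(species_indices):
    """Geometric mean across species of annual abundance indices
    (each species scaled to 1.0 in a common base year)."""
    log_idx = np.log(np.asarray(species_indices, dtype=float))  # shape: species x years
    return np.exp(log_idx.mean(axis=0))

# Hypothetical indices for three species over five years (base year = 1.0).
species_indices = [
    [1.00, 1.10, 1.15, 1.12, 1.05],
    [1.00, 0.95, 0.90, 0.85, 0.80],
    [1.00, 1.02, 1.04, 1.03, 1.00],
]
print(composite_indicator(species_indices))
```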
Many companion kittens entering shelters are fostered by volunteer community members during the sensitive period for socialisation (~2 to 9 weeks of age), when early experiences are critical to behavioural development. Using a mixed-method survey, we explored current fostering practices relevant to kitten behavioural development and welfare. Foster caretaker participants (n = 487) described their fostering practices and reported providing kittens with a majority of recommended socialisation experiences, such as handling and exposure to various toys and exploratory items. In open-ended text responses, foster caretakers described how they adapted socialisation practices for fearful kittens and the supports and challenges they perceived to impact their ability to properly socialise kittens. Some non-recommended techniques (e.g. flooding) were reported for socialising fearful kittens; the odds of reporting non-recommended techniques were lower for participants scoring higher on the agreeableness personality trait and higher for participants whose fostering practices had been impacted by the COVID-19 pandemic. Foster caretakers reported feeling supported through shelter-supplied resources, personal knowledge, external support, and access to socialisation opportunities; however, they faced personal (e.g. time constraints), shelter-specific (e.g. lack of shelter support), and kitten-specific challenges (e.g. kitten illness). This study highlights the perspectives of foster caretakers as they relate to optimal socialisation, behavioural development, and welfare. To identify opportunities for improvement, it is important to investigate the socialisation guidelines provided to foster caretakers, with the ultimate goal of enhancing kitten behavioural development for improved welfare, long-term adoption, and caretaker satisfaction.
Digital Mental Health Interventions (DMHIs) that meet the definition of a medical device are regulated in the UK by the Medicines and Healthcare products Regulatory Agency (MHRA). The MHRA uses procedures originally developed for pharmaceuticals to assess the safety of DMHIs. There is recognition that this may not be ideal, as is evident from an ongoing consultation on reform led by the MHRA and the National Institute for Health and Care Excellence.
Aims
The aim of this study was to generate expert consensus on how the medical regulatory method used for assessing safety could best be adapted for DMHIs.
Method
An online Delphi study containing three rounds was conducted with an international panel of 20 experts with experience/knowledge in the field of UK digital mental health.
Results
Sixty-four items were generated, of which 41 achieved consensus (64%). Consensus emerged around ten recommendations, falling into five main themes: Enhancing the quality of adverse events data in DMHIs; Re-defining serious adverse events for DMHIs; Reassessing short-term symptom deterioration in psychological interventions as a therapeutic risk; Maximising the benefit of the Yellow Card Scheme; and Developing a harmonised approach for assessing the safety of psychological interventions in general.
Conclusion
The implementation of the recommendations provided by this consensus could improve the assessment of the safety of DMHIs, making it more effective at detecting and mitigating risk.
The 2020 update of the Canadian Stroke Best Practice Recommendations (CSBPR) for the Secondary Prevention of Stroke includes current evidence-based recommendations and expert opinions intended for use by clinicians across a broad range of settings. They provide guidance for the prevention of ischemic stroke recurrence through the identification and management of modifiable vascular risk factors. Recommendations address triage, diagnostic testing, lifestyle behaviors, vaping, hypertension, hyperlipidemia, diabetes, atrial fibrillation, other cardiac conditions, antiplatelet and anticoagulant therapies, and carotid and vertebral artery disease. This update of the previous 2017 guideline contains several new or revised recommendations. Recommendations regarding triage and initial assessment of acute transient ischemic attack (TIA) and minor stroke have been simplified, and selected aspects of the etiological stroke workup are revised. Updated treatment recommendations based on new evidence have been made for dual antiplatelet therapy for TIA and minor stroke; anticoagulant therapy for atrial fibrillation; embolic strokes of undetermined source; low-density lipoprotein lowering; hypertriglyceridemia; diabetes treatment; and patent foramen ovale management. A new section has been added to provide practical guidance regarding temporary interruption of antithrombotic therapy for surgical procedures. Cancer-associated ischemic stroke is addressed. A section on virtual care delivery of secondary stroke prevention services is included to highlight a shifting paradigm of care delivery made more urgent by the global pandemic. In addition, where appropriate, sex differences as they pertain to treatments have been addressed. The CSBPR include supporting materials such as implementation resources to facilitate the adoption of evidence into practice and performance measures to enable monitoring of uptake and effectiveness of recommendations.
The Subglacial Antarctic Lakes Scientific Access (SALSA) Project accessed Mercer Subglacial Lake using environmentally clean hot-water drilling to examine interactions among ice, water, sediment, rock, microbes and carbon reservoirs within the lake water column and underlying sediments. A ~0.4 m diameter borehole was melted through 1087 m of ice and maintained over ~10 days, allowing observation of ice properties and collection of water and sediment with various tools. Over this period, SALSA collected: 60 L of lake water and 10 L of deep borehole water; microbes >0.2 μm in diameter from in situ filtration of ~100 L of lake water; 10 multicores 0.32–0.49 m long; gravity cores 1.0 and 1.76 m long; three conductivity–temperature–depth profiles of borehole and lake water; five discrete-depth current meter measurements in the lake; and images of ice, the lake water–ice interface and lake sediments. Temperature and conductivity data showed the hydrodynamic character of water mixing between the borehole and lake after entry. Models simulating melting of the ~6 m thick basal accreted ice layer imply that debris fall-out through the ~15 m water column to the lake sediments from borehole melting had little effect on the stratigraphy of surficial sediment cores.
In 2018, the Clostridium difficile LabID event methodology changed so that hospitals performing 2-step testing, nucleic acid amplification test (NAAT) plus enzyme immunoassay (EIA), had their risk adjustment modified to that of EIA-based tests, and only positive final tests (eg, EIA) were counted in the numerator. We report the immediate impact of this methodological change at 3 Milwaukee hospitals.
The role that vitamin D plays in pulmonary function remains uncertain. Epidemiological studies have reported mixed findings for the association between serum 25-hydroxyvitamin D (25(OH)D) and pulmonary function. We conducted the largest cross-sectional meta-analysis of the 25(OH)D–pulmonary function association to date, based on nine European ancestry (EA) cohorts (n 22 838) and five African ancestry (AA) cohorts (n 4290) in the Cohorts for Heart and Aging Research in Genomic Epidemiology Consortium. Data were analysed using linear models by cohort and ancestry. Effect modification by smoking status (current/former/never) was tested. Results were combined using fixed-effects meta-analysis. Mean serum 25(OH)D was 68 (sd 29) nmol/l for EA and 49 (sd 21) nmol/l for AA. For each 1 nmol/l higher 25(OH)D, forced expiratory volume in the 1st second (FEV1) was higher by 1·1 ml in EA (95 % CI 0·9, 1·3; P<0·0001) and 1·8 ml (95 % CI 1·1, 2·5; P<0·0001) in AA (P for race difference=0·06), and forced vital capacity (FVC) was higher by 1·3 ml in EA (95 % CI 1·0, 1·6; P<0·0001) and 1·5 ml (95 % CI 0·8, 2·3; P=0·0001) in AA (P for race difference=0·56). Among EA, the 25(OH)D–FVC association was stronger in smokers: per 1 nmol/l higher 25(OH)D, FVC was higher by 1·7 ml (95 % CI 1·1, 2·3) for current smokers and 1·7 ml (95 % CI 1·2, 2·1) for former smokers, compared with 0·8 ml (95 % CI 0·4, 1·2) for never smokers. In summary, the 25(OH)D associations with FEV1 and FVC were positive in both ancestries. In EA, a stronger association was observed for smokers compared with never smokers, which supports the importance of vitamin D in vulnerable populations.
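For readers unfamiliar with the pooling step, the following sketch shows a standard inverse-variance fixed-effects meta-analysis of per-cohort regression coefficients; the cohort estimates used here are invented for illustration and do not reproduce the study's data.

```python
import numpy as np

def fixed_effects_meta(estimates, standard_errors):
    """Inverse-variance-weighted (fixed-effects) pooling of per-cohort
    regression coefficients, returning the pooled estimate, its standard
    error and a 95 % confidence interval."""
    est = np.asarray(estimates, dtype=float)
    se = np.asarray(standard_errors, dtype=float)
    w = 1.0 / se ** 2                        # inverse-variance weights
    pooled = np.sum(w * est) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
    return pooled, pooled_se, ci

# Hypothetical cohort-level slopes (ml FEV1 per nmol/l 25(OH)D) and standard errors.
print(fixed_effects_meta([1.2, 0.9, 1.4], [0.3, 0.2, 0.5]))
```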
There is growing interest in the effects of intermittent fasting on energy balance. This study aimed to compare appetite, energy intake and food reward responses to an energy deficit induced either by 24-h food restriction or by an equivalent deficit achieved through exercise in healthy males. In all, twelve healthy lean males (21·5 (sd 0·5) years old; BMI: 22·5 (sd 1·7) kg/m2) participated in this study. Body composition, aerobic capacity, food preferences and energy intake were assessed. They randomly completed three conditions: (i) no depletion (CON); (ii) a full 24-h energy restriction (Def-EI); and (iii) an exercise condition (Def-EX). Ad libitum energy intake and food reward were assessed at the end of each session. Appetite feelings were assessed regularly. Ad libitum energy intake was higher on Def-EI (7330 (sd 2975) kJ (1752 (sd 711) kcal)) compared with that on CON (5301 (sd 1205) kJ (1267 (sd 288) kcal)) (P<0·05), with no difference between CON and Def-EX (6238 (sd 1741) kJ (1491 (sd 416) kcal)) (P=0·38) or between Def-EX and Def-EI (P=0·22). There was no difference in the percentage of energy ingested from each macronutrient. Hunger was lower on CON and Def-EX compared with Def-EI (P<0·001). Satiety was higher on CON and Def-EI compared with that on Def-EX (P<0·001). There was a significant condition × time interaction for food choice fat bias (P=0·04), showing a greater preference for high-fat v. low-fat food during Def-EI and Def-EX. While 24-h fasting leads to increased energy intake at the following test meal (without a difference in total daily energy intake), an increased hunger profile and a decreased post-meal food choice fat bias, such nutritional responses are not observed after a similar deficit induced by exercise.
Chylothorax after paediatric cardiac surgery incurs significant morbidity; however, a detailed understanding that does not rely on single-centre or administrative data is lacking. We described the present clinical epidemiology of postoperative chylothorax and evaluated variation in rates among centres, using a multicentre cohort of patients treated in cardiac ICUs.
Methods
This was a retrospective cohort study using prospectively collected clinical data from the Pediatric Cardiac Critical Care Consortium registry. All postoperative paediatric cardiac surgical patients admitted from October, 2013 to September, 2015 were included. Risk factors for chylothorax and association with outcomes were evaluated using multivariable logistic or linear regression models, as appropriate, accounting for within-centre clustering using generalised estimating equations.
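A minimal sketch of the kind of model described above, assuming a tidy analysis table with one row per hospitalisation, is given below; it fits a logistic model with generalised estimating equations and an exchangeable working correlation to account for within-centre clustering (using statsmodels). All variable names and values are hypothetical and are not taken from the registry.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical analysis table: one row per surgical hospitalisation, with a
# binary chylothorax outcome, example risk factors and the treating centre.
df = pd.DataFrame({
    "chylothorax": [0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0],
    "age_lt_1yr":  [1, 1, 0, 0, 1, 1, 1, 0, 0, 1, 0, 1],
    "bypass_mins": [90, 210, 60, 75, 180, 150, 50, 200, 110, 95, 170, 220],
    "centre":      ["A", "A", "A", "B", "B", "B", "C", "C", "C", "D", "D", "D"],
})

# Logistic model fitted by generalised estimating equations, with an
# exchangeable working correlation to account for clustering within centres.
model = smf.gee(
    "chylothorax ~ age_lt_1yr + bypass_mins",
    groups="centre",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()
print(result.summary())
```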
Results
A total of 4864 surgical hospitalisations from 15 centres were included. Chylothorax occurred in 3.8% (n=185) of hospitalisations. Case-mix-adjusted chylothorax rates varied from 1.5 to 7.6% and were not associated with centre volume. Independent risk factors for chylothorax included age <1 year, non-Caucasian race, single-ventricle physiology, extracardiac anomalies, longer cardiopulmonary bypass time, and thrombosis associated with an upper-extremity central venous line (all p<0.05). Chylothorax was associated with significantly longer duration of postoperative mechanical ventilation, cardiac ICU and hospital length of stay, and higher in-hospital mortality (all p<0.001).
Conclusions
Chylothorax after cardiac surgery in children is associated with significant morbidity and mortality. A five-fold variation in chylothorax rates was observed across centres. Future investigations should identify centres most adept at preventing and managing chylothorax and disseminate best practices.
To investigate relationships between mortality and circulating 25-hydroxyvitamin D (25(OH)D), 25-hydroxycholecalciferol (25(OH)D3) and 25-hydroxyergocalciferol (25(OH)D2).
Design
Case–cohort study within the Melbourne Collaborative Cohort Study (MCCS). We measured 25(OH)D2 and 25(OH)D3 in archived dried blood spots by LC–MS/MS. Cox regression was used to estimate mortality hazard ratios (HR), with adjustment for confounders.
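To make the modelling step concrete, here is a minimal sketch of a Cox proportional-hazards fit (using the lifelines package) on invented per-participant data; the actual MCCS analysis adjusted for confounders and used a case–cohort weighting scheme not shown here.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical per-participant data: follow-up time (years), death indicator,
# total 25(OH)D concentration (nmol/l) and one example confounder (age).
df = pd.DataFrame({
    "time_years": [13.2, 8.5, 15.0, 4.1, 12.7, 9.9, 14.3, 6.6],
    "died":       [0,    1,   0,    1,   0,    1,   0,    1],
    "vitd_nmol":  [72,   41,  88,   60,  35,   50,  90,   55],
    "age_years":  [52,   66,  48,   58,  69,   63,  45,   61],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_years", event_col="died")
# The coefficient for vitd_nmol is per 1 nmol/l; multiply by 25 before
# exponentiating to express the hazard ratio per 25 nmol/l increment.
print(cph.summary)
```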
Setting
General community.
Subjects
The MCCS included 29 206 participants who, at recruitment in 1990–1994, were aged 40–69 years, had dried blood spots collected and had no history of cancer. For the present study we selected participants who died by 31 December 2007 (n 2410) and a random sample (sub-cohort, n 2996).
Results
The HRs per 25 nmol/l increment in the concentrations of 25(OH)D and 25(OH)D3 were 0·86 (95 % CI 0·78, 0·96; P=0·007) and 0·85 (95 % CI 0·77, 0·95; P=0·003), respectively. Of 5108 participants, sixty-three (1·2 %) had detectable 25(OH)D2; their mean 25(OH)D concentration was 11·9 (95 % CI 7·3, 16·6) nmol/l higher than that of participants without detectable 25(OH)D2 (P<0·001). The HR for detectable 25(OH)D2 was 1·80 (95 % CI 1·09, 2·97; P=0·023); for those with detectable 25(OH)D2, the HR per 25 nmol/l increment in 25(OH)D was 1·06 (95 % CI 0·87, 1·29; P interaction=0·02). HRs were similar for participants who reported being in good, very good or excellent health four years after recruitment.
Conclusions
Total 25(OH)D and 25(OH)D3 concentrations were inversely associated with mortality. The finding that the inverse association for 25(OH)D was restricted to those with no detectable 25(OH)D2 requires confirmation in populations with higher exposure to ergocalciferol.
The optimal perioperative feeding strategies for neonates with CHD are unknown. In the present study, we describe the current feeding practices across a multi-institutional cohort.
Methods
Inclusion criteria for this study were as follows: all neonates undergoing cardiac surgery admitted to the cardiac ICU for ⩾24 hours preoperatively between October, 2013 and July, 2014 in the Pediatric Cardiac Critical Care Consortium registry.
Results
The cohort included 251 patients from eight centres. The most common diagnoses included the following: hypoplastic left heart syndrome (17%), coarctation/aortic arch hypoplasia (18%), and transposition of the great arteries (22%); 14% of the patients were <37 weeks of gestational age. The median total hospital length of stay was 21 days (interquartile range (IQR) 14–35) and overall mortality was 8%. Preoperative feeding occurred in 133 (53%) patients. The overall preoperative feeding rates across centres ranged from 29 to 79%. Postoperative feeds started on median day 2 (IQR 1–4); for patients with hypoplastic left heart syndrome, postoperative feeds started on median day 4. Postoperative feeds were initiated in 89 (35%) patients before extubation (range across centres: 21–61%). The median cardiac ICU discharge feeding volume was 108 cc/kg/day, varying across centres. The mean discharge weight was 280 g above birth weight, ranging from +100 to +430 g across centres. A total of 110 (44%) patients had discharge feeding tubes, ranging from 6 to 80% across centres, and 40/110 patients had gastrostomy/enterostomy tubes placed. In addition, eight (3.2%) patients developed necrotising enterocolitis – three preoperatively and five postoperatively.
Conclusion
In this cohort, neonatal feeding practices and outcomes appear to vary across diagnostic groups and institutions. Only half of the patients received preoperative enteral nutrition; almost half had discharge feeding tubes. Multi-institutional collaboration is necessary to determine feeding strategies associated with best clinical outcomes.
High-protein diets are an effective means for weight loss (WL), but the mechanisms are unclear. One hypothesis relates to the release of gut hormones by either protein or amino acids (AA). The present study involved overweight and obese male volunteers (n 18, mean BMI 36·8 kg/m2) who consumed a maintenance diet for 7 d followed by fully randomised 10 d treatments with three iso-energetic WL diets, i.e. with either normal protein (NP, 15 % of energy) or high protein (HP, 30 %) or with a combination of protein and free AA, each 15 % of energy (NPAA). Psychometric ratings of appetite were recorded hourly. On day 10, plasma samples were taken at 30 min intervals over two consecutive 5 h periods (covering post-breakfast and post-lunch) and analysed for AA, glucose and hormones (insulin, total glucose-dependent insulinotropic peptide, active ghrelin and total peptide YY (PYY)) plus leucine kinetics (first 5 h only). Composite hunger was 16 % lower for the HP diet than for the NP diet (P< 0·01) in the 5 h period after both meals. Plasma essential AA concentrations were greatest within 60 min of each meal for the NPAA diet, but remained elevated for 3–5 h after the HP diet. The three WL diets showed no difference for either fasting concentrations or the postprandial net incremental AUC (net AUCi) for insulin, ghrelin or PYY. No strong correlations were observed between composite hunger scores and net AUCi for either AA or gut peptides. Regulation of hunger may involve subtle interactions, and a range of signals may need to be integrated to produce the overall response.
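As a small worked example of the net incremental AUC (net AUCi) used above, the sketch below integrates the deviation of a hormone concentration from its pre-meal baseline with the trapezoidal rule, keeping negative segments (hence "net"); the sampling times and PYY values are hypothetical, not study data.

```python
import numpy as np

def net_incremental_auc(times_min, concentrations, baseline=None):
    """Trapezoidal area of (concentration - baseline) over time.
    Segments below baseline contribute negatively, giving the *net*
    incremental AUC rather than the positive-only incremental AUC."""
    t = np.asarray(times_min, dtype=float)
    c = np.asarray(concentrations, dtype=float)
    if baseline is None:
        baseline = c[0]              # use the pre-meal (time 0) sample
    d = c - baseline
    return float(np.sum((d[1:] + d[:-1]) / 2.0 * np.diff(t)))

# Hypothetical plasma PYY (pmol/l) sampled at 30 min intervals after a meal.
times = [0, 30, 60, 90, 120, 150, 180]
pyy = [18.0, 25.0, 31.0, 28.0, 24.0, 21.0, 17.0]
print(net_incremental_auc(times, pyy))
```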
The Australian Imaging, Biomarkers and Lifestyle (AIBL) Flagship Study of Ageing is a prospective study of 1,112 individuals (211 with Alzheimer's disease (AD), 133 with mild cognitive impairment (MCI), and 768 healthy controls (HCs)). Here we report diagnostic and cognitive findings at the first (18-month) follow-up of the cohort. The first aim was to compute rates of transition from HC to MCI, and MCI to AD. The second aim was to characterize the cognitive profiles of individuals who transitioned to a more severe disease stage compared with those who did not.
Methods:
Eighteen months after baseline, participants underwent comprehensive cognitive testing and diagnostic review, provided an 80 ml blood sample, and completed health and lifestyle questionnaires. A subgroup also underwent amyloid PET and MRI neuroimaging.
Results:
The diagnostic status of 89.9% of the cohort was determined (972 were reassessed, 28 had died, and 112 did not return for reassessment). The 18-month cohort comprised 692 HCs, 82 MCI cases, 197 AD patients, and one Parkinson's disease dementia case. The transition rate from HC to MCI was 2.5%, and cognitive decline in HCs who transitioned to MCI was greatest in the memory and naming domains compared with HCs who remained stable. The transition rate from MCI to AD was 30.5%.
Conclusion:
There was a high retention rate after 18 months. Rates of transition from healthy aging to MCI, and MCI to AD, were consistent with established estimates. Follow-up of this cohort over longer periods will elucidate robust predictors of future cognitive decline.
Previous work has shown that hunger and food intake are lower in individuals on high-protein (HP) diets when combined with low carbohydrate (LC) intakes rather than with moderate carbohydrate (MC) intakes and where a more ketogenic state occurs. The aim of the present study was to investigate whether the difference between HPLC and HPMC diets was associated with changes in glucose and ketone body metabolism, particularly within key areas of the brain involved in appetite control. A total of twelve men, mean BMI 34·9 kg/m2, took part in a randomised cross-over trial, with two 4-week periods when isoenergetic fixed-intake diets (8·3 MJ/d) were given, with 30 % of the energy being given as protein and either (1) a very LC (22 g/d; HPLC) or (2) a MC (182 g/d; HPMC) intake. An 18F-fluorodeoxyglucose positron emission tomography scan of the brain was conducted at the end of each dietary intervention period, following an overnight fast (n 4) or 4 h after consumption of a test meal (n 8). On the next day, whole-body ketone and glucose metabolism was quantified using [1,2,3,4-13C]acetoacetate, [2,4-13C]3-hydroxybutyrate and [6,6-2H2]glucose. The composite hunger score was 14 % lower (P= 0·013) for the HPLC dietary intervention than for the HPMC diet. Whole-body ketone flux was approximately 4-fold greater for the HPLC dietary intervention than for the HPMC diet (P< 0·001). The 9-fold difference in carbohydrate intakes between the HPLC and HPMC dietary interventions led to a 5 % lower supply of glucose to the brain. Despite this, the uptake of glucose by the fifty-four regions of the brain analysed remained similar for the two dietary interventions. In conclusion, differences in the composite hunger score observed for the two dietary interventions are not associated with the use of alternative fuels by the brain.
Background: Research suggests that core schemas are important in both the development and maintenance of psychosis. Aims: The aim of the study was to investigate and compare core schemas in four groups along the continuum of psychosis and examine the relationships between schemas and positive psychotic symptomatology. Method: A measure of core schemas was distributed to 20 individuals experiencing first-episode psychosis (FEP), 113 individuals with “at risk mental states” (ARMS), 28 participants forming a help-seeking clinical group (HSC), and 30 non-help-seeking individuals who endorsed some psychotic-like experiences (NH). Results: The clinical groups scored significantly higher than the NH group for negative beliefs about self and about others. No significant effects of group on positive beliefs about others were found. For positive beliefs about the self, the NH group scored significantly higher than the clinical groups. Furthermore, negative beliefs about self and others were related to positive psychotic symptomatology and to distress related to those experiences. Conclusions: Negative evaluations of the self and others appear to be characteristic of the appraisals of people seeking help for psychosis and psychosis-like experiences. The results support the literature suggesting that self-esteem should be a target for intervention. Future research would benefit from including comparison groups of people experiencing chronic psychosis and people who do not have any psychotic-like experiences.