Cardiac surgery-associated acute kidney injury (CS-AKI) and fluid overload (FO) are common among neonates who undergo cardiopulmonary bypass, and increase mortality risk. Current diagnostic criteria may delay diagnosis. Thus, there is a need to identify urine biomarkers that permit earlier and more accurate diagnosis.
Methods:
This single-centre ancillary prospective cohort study describes age- and disease-specific ranges of 14 urine biomarkers at perioperative time points and explores associations with CS-AKI and FO. Neonates (≤28 days) undergoing cardiac surgery were included. Preterm neonates or those who had pre-operative acute kidney injury were excluded. Urine biomarkers were measured pre-operatively, at 0 to <8 hours after surgery, and at 8 to 24 hours after surgery. Exploratory outcomes included CS-AKI, defined by the modified Kidney Disease Improving Global Outcomes criteria, and >10% FO, both measured at 48 hours after surgery.
Results:
Overall, α-glutathione S-transferase, β-2 microglobulin, albumin, cystatin C, neutrophil gelatinase-associated lipocalin, osteopontin, uromodulin, clusterin, and vascular endothelial growth factor concentrations peaked in the early post-operative period; over the sampling period, kidney injury molecule-1 increased and trefoil factor-3 decreased. In the early post-operative period, β-2 microglobulin and α-glutathione S-transferase were higher in neonates who developed CS-AKI; and clusterin, cystatin C, neutrophil gelatinase-associated lipocalin, osteopontin, and α-glutathione S-transferase were higher in neonates who developed FO.
Conclusion:
In a small, single-centre cohort, age- and disease-specific urine biomarker concentrations are described. These data identify typical trends and will inform future studies.
The crystal structure of perfluorononanoic acid (PFNA) was solved via parallel tempering using synchrotron powder diffraction data obtained from the Brockhouse X-ray Diffraction and Scattering (BXDS) Wiggler Lower Energy (WLE) beamline at the Canadian Light Source. PFNA crystallizes in monoclinic space group P21/c (#14) with lattice parameters a = 26.172(1) Å, b = 5.6345(2) Å, c = 10.9501(4) Å, and β = 98.752(2)°. The crystal structure is composed of dimers, with pairs of PFNA molecules connected by hydrogen bonds via the carboxylic acid functional groups. The Rietveld-refined structure was compared to a density functional theory-optimized structure, and the root-mean-square Cartesian difference was larger than normally observed for correct powder structures. The powder data likely exhibited evidence of disorder which was not successfully modeled.
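As a quick consistency check on the reported lattice parameters, the monoclinic unit-cell volume follows from V = a·b·c·sin β. A minimal sketch in Python (not part of the original study; the function name is illustrative):

```python
import math

def monoclinic_volume(a, b, c, beta_deg):
    """Unit-cell volume of a monoclinic lattice: V = a * b * c * sin(beta)."""
    return a * b * c * math.sin(math.radians(beta_deg))

# Reported PFNA lattice parameters (lengths in angstroms, beta in degrees)
V = monoclinic_volume(26.172, 5.6345, 10.9501, 98.752)
print(f"V = {V:.1f} cubic angstroms")  # approximately 1596 cubic angstroms
```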
The rising number of dementia diagnoses and imminent adoption of disease-modifying treatments necessitate innovative approaches to identify individuals at risk, monitor disease course and intervene non-pharmacologically earlier in the disease course. Digital assessments of dementia risk and cognitive function have the potential to outperform traditional in-person assessments in terms of their affordability, accuracy and longitudinal tracking abilities. However, their accessibility and reliability in older adults are unclear.
Aims
To evaluate the usability and reliability of a smartphone assessment of lifestyle and cognitive factors relevant to dementia risk in a group of UK-based older adults.
Method
Cognitively healthy adults (n = 756) recruited through the Dementias Platform UK Great Minds volunteer register completed three assessments of cognitive function and dementia risk over a 3-month period and provided usability feedback on the Five Lives smartphone application (app). We evaluated cognitive test scores for age, gender and higher education effects, normality distributions, test–retest reliability and their relationship with participants’ lifestyle dementia risk factors.
Results
Participants found the app ‘easy to use’, ‘quick to complete’ and ‘enjoyable’. The cognitive tests showed normal or near-to-normal distributions, variable test–retest reliabilities and age-related effects. Only tests of verbal ability showed gender and education effects. The cognitive tests did not correlate with lifestyle dementia risk scores.
Conclusions
The Five Lives assessment demonstrates high usability and reliability among older adults. These findings highlight the potential of digital assessments in dementia research and clinical practice, enabling improved accessibility and better monitoring of cognitive health on a larger scale than traditional in-person assessments.
Human infection with antimicrobial-resistant Campylobacter species is an important public health concern due to the potentially increased severity of illness and risk of death. Our objective was to synthesise the knowledge of factors associated with human infections with antimicrobial-resistant strains of Campylobacter. This scoping review followed systematic methods, including a protocol developed a priori. Comprehensive literature searches were developed in consultation with a research librarian and performed in five primary and three grey literature databases. Criteria for inclusion were analytical and English-language publications investigating human infections with an antimicrobial-resistant (macrolides, tetracyclines, fluoroquinolones, and/or quinolones) Campylobacter that reported factors potentially linked with the infection. Primary and secondary screening were completed by two independent reviewers using DistillerSR®. The search identified 8,527 unique articles, of which 27 were included in the review. Factors were broadly categorised into animal contact, prior antimicrobial use, participant characteristics, food consumption and handling, travel, underlying health conditions, and water consumption/exposure. Important factors linked to an increased risk of infection with a fluoroquinolone-resistant strain included foreign travel and prior antimicrobial use. Identifying consistent risk factors was challenging due to the heterogeneity of results, inconsistent analysis, and the lack of data in low- and middle-income countries, highlighting the need for future research.
Caregivers of patients with primary brain tumor (PBT) describe feeling preoccupied with the inevitability of their loved one's death. However, there are currently no validated instruments to assess death anxiety in caregivers. This study sought to examine (1) the psychometric properties of the Death and Dying Distress Scale (DADDS), adapted for caregivers (DADDS-CG), and (2) the prevalence and correlates of death anxiety in caregivers of patients with PBT.
Methods
Caregivers (N = 67) of patients with PBT completed the DADDS-CG, Patient Health Questionnaire (PHQ-9), Generalized Anxiety Disorder (GAD-7), Fear of Cancer Recurrence (FCR-7), and God Locus of Health Control (GLHC). Caregivers’ sociodemographic information and patients’ medical characteristics were also collected. Preliminary examination of the psychometric properties of the DADDS-CG was conducted using exploratory factor analysis, Cronbach's alpha, and correlations. The prevalence and risk factors of death anxiety were assessed using frequencies, pair-wise comparisons, and correlations.
Results
Factor analysis of the DADDS-CG revealed a two-factor structure consistent with the original DADDS. The DADDS-CG demonstrated excellent internal consistency, convergent validity with the PHQ-9, GAD-7, and FCR-7, and discriminant validity with the GLHC. Over two-thirds of caregivers reported moderate-to-severe symptoms of death anxiety. Death anxiety was highest in women and caregivers of patients with high-grade PBT.
Significance of results
The DADDS-CG demonstrates sound psychometric properties in caregivers of patients with PBT, who report high levels of death anxiety. Further research is needed to support the measure's value in clinical care and research, both in this population and other caregivers, in order to address this unmet psychosocial need.
American Indian and Alaska Native peoples (AI/AN) have a disproportionately high rate of obesity, but little is known about the social determinants of obesity among older AI/AN. Thus, our study assessed social determinants of obesity in AI/AN aged ≥ 50 years.
Design:
We conducted a cross-sectional analysis using multivariate generalised linear mixed models to identify social determinants associated with the risk of being classified as obese (BMI ≥ 30·0 kg/m2). Analyses were conducted for the total study population and stratified by median county poverty level.
Setting:
Indian Health Service (IHS) data for AI/AN who used IHS services in FY2013.
Participants:
In total, 27 696 AI/AN aged ≥ 50 years without diabetes.
Results:
Mean BMI was 29·8 ± 6·6 kg/m2, with 43 % classified as obese. Women were more likely to be obese than men, and younger ages were associated with higher obesity risk. While having Medicaid coverage was associated with lower odds of obesity, private health insurance was associated with higher odds. Living in areas with lower rates of educational attainment and longer drive times to primary care services were associated with higher odds of obesity. Those who lived in a county where a larger percentage of people had low access to a grocery store were significantly less likely to be obese.
Conclusions:
Our findings contribute to the understanding of social determinants of obesity among older AI/AN and highlight the need to investigate AI/AN obesity, including longitudinal studies with a life course perspective to further examine social determinants of obesity in older AI/AN.
We identified quality indicators (QIs) for care during transitions of older persons (≥ 65 years of age). Through systematic literature review, we catalogued QIs related to older persons’ transitions in care among continuing care settings and between continuing care and acute care settings and back. Through two Delphi survey rounds, experts ranked relevance, feasibility, and scientific soundness of QIs. A steering committee reviewed QIs for their feasible capture in Canadian administrative databases. Our search yielded 326 QIs from 53 sources. A final set of 38 feasible indicators to measure in current practice was included. The highest proportions of indicators were for the emergency department (47%) and the Institute of Medicine (IOM) quality domain of effectiveness (39.5%). Most feasible indicators were outcome indicators. Our work highlights a lack of standardized transition QI development in practice, and the limitations of current free-text documentation systems in capturing relevant and consistent data.
Transitions for older persons from long-term care (LTC) to the emergency department (ED) and back can result in adverse events. Effective communication among care settings is required to ensure continuity of care. We implemented a standardized form for improving consistency of documentation during LTC to ED transitions of residents 65 years of age or older, via emergency medical services (EMS), and back. Data on form use and form completion were collected through chart review. Practitioners' perspectives were collected using surveys. The form was used in 90/244 (37%) LTC to ED transitions, with large variation in data element completion. EMS and ED personnel reported improved identification of resident information. LTC personnel preferred usual practice to the new form and twice reported prioritizing form completion before calling 911. To minimize the risk of harmful unintended consequences, communication forms should be implemented as part of broader quality improvement programs, rather than as stand-alone interventions.
The systems ecology paradigm (SEP) emerged in the late 1960s at a time when societies throughout the world were beginning to recognize that our environment and natural resources were being threatened by their activities. Management practices in rangelands, forests, agricultural lands, wetlands, and waterways were inadequate to meet the challenges of deteriorating environments, many of which were caused by the practices themselves. Scientists recognized an immediate need to develop a knowledge base about how ecosystems function. That effort took nearly two decades and concluded in the 1980s with the acceptance that humans were components of ecosystems, not just controllers and manipulators of lands and waters. While ecosystem science was being developed, management options based on ecosystem science were shifting dramatically toward practices supporting sustainability, resilience, ecosystem services, biodiversity, and local to global interconnections of ecosystems. Emerging from the new knowledge about how ecosystems function and the application of the systems ecology approach was the collaboration of scientists, managers, decision-makers, and stakeholders locally and globally. Today's concepts of ecosystem management and related ideas, such as sustainable agriculture, ecosystem health and restoration, consequences of and adaptation to climate change, and many other important local to global challenges are a direct result of the SEP.
To investigate the touch-contact antimicrobial efficacy of novel cold spray surface coatings composed of copper and silver metals, with regard to their rate of microbial elimination.
Design:
Antimicrobial time-kill assay.
Setting:
Laboratory-based study.
Methods:
An adapted time-kill assay was conducted to characterize the antimicrobial efficacy of the developed coatings. A simulated touch-contact pathogenic exposure to Gram-positive Staphylococcus aureus (ATCC 25923), Gram-negative Pseudomonas aeruginosa (ATCC 27853), and the yeast Candida albicans (ATCC 10231), as well as corresponding resistant strains of gentamicin-methicillin–resistant S. aureus (ATCC 33592), azlocillin-carbenicillin–resistant P. aeruginosa (DSM 46316), and a fluconazole-resistant C. albicans strain was undertaken. Linear regression modeling was used to deduce microbial reduction rates.
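The reduction rate from a time-kill assay is typically estimated by regressing log-transformed counts on contact time; the slope of the fit gives the rate in log10 units per minute. A minimal sketch of that regression, using illustrative data rather than the study's measurements:

```python
import numpy as np

# Hypothetical time-kill data: contact time (min) vs. surviving CFU/mL.
times = np.array([0.0, 5.0, 10.0, 15.0])   # minutes of contact
cfu = np.array([1e7, 1e5, 1e3, 1e1])       # colony-forming units per mL

# Fit log10(CFU) against time; the slope is the microbial reduction
# rate (negative because the count decreases over time).
slope, intercept = np.polyfit(times, np.log10(cfu), 1)
print(f"reduction rate: {abs(slope):.2f} log10 CFU per minute")
```

With these illustrative numbers the counts fall two orders of magnitude every five minutes, so the fitted rate is 0.4 log10 CFU per minute.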
Results:
A >7 log reduction in microbial colony forming units was achieved within minutes on surfaces with cold spray coatings compared to a single log bacterial reduction on copper metal sheets within a 3-hour contact period. Copper-coated 3-dimensional (3D) printed acrylonitrile butadiene styrene (ABS) achieved complete microbial elimination against all tested pathogens within a 15-minute exposure period. Similarly, a copper-on-copper coating achieved microbial elimination within 10 minutes, and within 5 minutes with the addition of silver powder as a 5 wt% coating constituent.
Conclusions:
In response to the global need for alternative solutions for infection prevention and control, these effective antimicrobial surface coatings are proposed. A longitudinal study is the next step toward technology integration.
Beginning upwards of half a millennium ago, European sojourners and their settler colonialist inheritors sought to acquire resource assets and eventually the land itself in Mi’kma’ki and the neighbouring homelands of the Wolastoqiyik and the Beothuk/Innu. This area, corresponding broadly in settler terms to Atlantic Canada, has seen a process of European expansion premised on appropriating the wealth, the resources and the bodies of non-European peoples. Historically, it was a process of unique antiquity, beginning with fisheries that predated the turn of the sixteenth century, and one in which Scots took an early and influential role. This volume, focusing primarily on eras following the onset of colonial settlement, offers a series of reappraisals of key developments not only in settler societies themselves but also in relation to African and Indigenous inhabitants. Insofar as the geographical frame of reference is Atlantic Canada, there is of course a sense in which the term is anachronistic. Only with the joining of Newfoundland (formally known from 2001 as Newfoundland and Labrador) to Canada in 1949 did Atlantic Canada become a regional designation for what had previously been distinguished respectively as the Dominion of Newfoundland and the Maritime provinces of the Dominion of Canada. Yet for analytical purposes, the term Atlantic Canada represents a justifiable shorthand for a portion of north-eastern North America that – despite variations in environment and in economic trajectories – shared important elements of both Indigenous and settlement histories.
In nineteenth- and twentieth-century historiographies, influenced by the ‘British and settler scholars’ clustered notably in the institutions described by Tamson Pietsch as ‘settler universities’, imperial expansion and colonial settlement were attributed central roles throughout the post-contact era. Yet Indigenous societies in this part of North America, which had evolved over a period of at least some ten thousand years, were not in reality so easily overshadowed. Contact with non-Indigenous commercial voyagers – English, French, Basque and others – from approximately 1500 onwards did make a difference, but not necessarily an unmanageable difference. Prior to that time, continuity and change were underwritten by factors operating within North America and, generally speaking, within north-eastern North America. Environmental change took forms ranging from the gradual but transformative process of warming that followed the last Ice Age to shorter-term variations that influenced transportation patterns and seasonal characteristics.
Behavioural therapy often involves self-monitoring techniques to increase awareness about mood and stressful events. In turn, emotional self-awareness is likely to decrease symptoms of depression. Self-monitoring also has potential as an early intervention tool for young people, particularly when mobile phones are used as a medium. Previous qualitative research indicates that self-monitoring via mobile phones increases emotional self-awareness, with five categories proposed: awareness, identification, communication, contextualisation and decision-making.
Aims
This RCT investigates the relationships between self-monitoring, emotional self-awareness and depression using an early intervention mobile phone self-monitoring tool with young people at risk of developing depression.
Methods
Young people (between 14 and 24 years of age) identified by their GP as being at risk of depression were recruited in rural and metropolitan Victoria and randomly assigned to either the intervention group (where they monitored their mood, stress and daily activities) or the comparison group (where the questions about mood and stress were excluded). Participants completed baseline and follow-up measures of depression as well as measures of emotional self-awareness.
Results
Results will be presented on the effects of self-monitoring on emotional self-awareness, the effects of self-monitoring on depression, anxiety and stress, and the relationship between emotional self-awareness and depression, anxiety and stress.
Conclusion
Emotional self-awareness as a mediator in the relationship between self-monitoring and depression will be discussed focusing on the relationships between
(i) self-monitoring,
(ii) emotional self-awareness and
(iii) symptoms of depression, anxiety and stress.
Possible avenues for early intervention are suggested.
The mobiletype program is a cell/mobile phone mental health assessment and management tool designed specifically for young people aged 14–24 years to assist in detecting, managing, and treating youth mental health problems. The mobiletype program self-starts 4 times per day, and the patient completes a brief survey of their current mood, stresses, coping, alcohol and cannabis use, exercise, and sleeping and eating patterns. These data are transmitted in real time to a website interface, which collates them and produces individual reports for young people to share with their doctor.
Methods
118 young people identified with mild or greater mental health symptoms were blindly and randomly allocated at the individual level to either the intervention group (mobiletype plus usual care) or the comparison group (abbreviated mobiletype plus usual care), in accordance with CONSORT guidelines. Participants and doctors completed baseline and follow-up questionnaires measuring mental health, patient-doctor relationship, and pathways to care (i.e. referrals, medication, and testing). Participants were followed up at 6 weeks and 6 months.
Results
Results from fixed effects analyses of covariance examining the differences between the experimental and control groups on the main outcome measures, with the baseline values as the covariates will be presented. The extent to which the mobiletype program reduces mental health symptoms, enhances the patient-doctor relationship and assists patients in pathways to care will be explored in detail.
Conclusions
Mobile and other new information and communication technologies have much to offer clinical care in terms of increased efficiency in data collection, increased engagement of participants and overall enhanced care.
Nitrous oxide (N2O) misuse is widespread in the UK. Although it is well-known that it can cause devastating myeloneuropathy, psychiatric presentations are poorly described. There is little understanding of who it affects, how it presents, its mechanism of action and principles of treatment. We begin this article with a case study. We then review the literature to help psychiatrists understand this area and deal with this increasing problem, and make diagnosis and treatment recommendations. We describe a diagnostic pentad of weakness, numbness, paraesthesia, psychosis and cognitive impairment to alert clinicians to the need to urgently treat these patients. Nitrous oxide misuse is a pending neuropsychiatric emergency requiring urgent treatment with vitamin B12 to prevent potentially irreversible neurological and psychiatric symptoms.
In Canada, recreational use of cannabis was legalized in October 2018. This policy change along with recent publications evaluating the efficacy of cannabis for the medical treatment of epilepsy and media awareness about its use have increased the public interest about this agent. The Canadian League Against Epilepsy Medical Therapeutics Committee, along with a multidisciplinary group of experts and Canadian Epilepsy Alliance representatives, has developed a position statement about the use of medical cannabis for epilepsy. This article addresses the current Canadian legal framework, recent publications about its efficacy and safety profile, and our understanding of the clinical issues that should be considered when contemplating cannabis use for medical purposes.
Knowledge of the effects of burial depth and burial duration on seed viability and, consequently, seedbank persistence of Palmer amaranth (Amaranthus palmeri S. Watson) and waterhemp [Amaranthus tuberculatus (Moq.) J. D. Sauer] ecotypes can be used for the development of efficient weed management programs. This is of particular interest, given the great fecundity of both species and, consequently, their high seedbank replenishment potential. Seeds of both species collected from five different locations across the United States were investigated in seven states (sites) with different soil and climatic conditions. Seeds were placed at two depths (0 and 15 cm) for 3 yr. Each year, seeds were retrieved, and seed damage (shrunken, malformed, or broken) plus losses (deteriorated and futile germination) and viability were evaluated. Greater seed damage plus loss averaged across seed origin, burial depth, and year was recorded for lots tested at Illinois (51.3% and 51.8%) followed by Tennessee (40.5% and 45.1%) and Missouri (39.2% and 42%) for A. palmeri and A. tuberculatus, respectively. The site differences for seed persistence were probably due to higher volumetric water content at these sites. Rates of seed demise were directly proportional to burial depth (α=0.001), whereas the percentage of viable seeds recovered after 36 mo on the soil surface ranged from 4.1% to 4.3% compared with 5% to 5.3% at the 15-cm depth for A. palmeri and A. tuberculatus, respectively. Seed viability loss was greater in the seeds placed on the soil surface compared with the buried seeds. The greatest influences on seed viability were burial conditions and time and site-specific soil conditions, more so than geographical location. Thus, management of these weed species should focus on reducing seed shattering, enhancing seed removal from the soil surface, or adjusting tillage systems.
A major challenge in addressing the loss of benefits and services provided by the natural environment is that it can be difficult to find ways for those who benefit from them to pay for their preservation. We examine one such context in Malawi, where erosion from soils disturbed by agriculture affects not only farmers’ incomes, but also damages aquatic habitat and inhibits the storage and hydropower potential of dams downstream. We demonstrate that payments from hydropower producers to farmers to maintain land cover and prevent erosion can have benefits for all parties involved.