The H3N2 canine influenza virus (CIV) emerged from an avian reservoir in Asia and has circulated entirely among dogs for the past 20 years. The virus was first detected outside Asian dog populations in 2015, in North America. Drawing on viral genomic data together with clinical reports and diagnostic testing data, we provide an updated analysis of the evolution and epidemiology of the virus in its canine host. CIV in North American dogs has a complex life history, including local outbreaks, regional lineage die-outs, and repeated reintroductions of the virus (with diverse genotypes) from different regions of Asia. Phylogenetic and Bayesian analyses reveal multiple CIV clades, and viruses from China have seeded recent North American outbreaks, with 2 or 3 introductions in the past 3 years. Genomic epidemiology confirms that within North America the virus spreads very rapidly among dogs in kennels and shelters in different regions but then dies out locally; maintaining the overall epidemic over the long term therefore requires longer-distance dispersal of the virus. With a constant evolutionary rate over 20 years, CIV still appears best adapted to transmission in dense populations and has not gained the properties needed for prolonged circulation among dogs.
Children are a critical part of certain legal trials, such as cases involving child abuse and neglect, and especially in cases of child sexual abuse. It is common for the only evidence in these types of cases to be the statement of the child victims. Children’s decisions about if and when to disclose the abuse are affected by many factors, and delays in disclosure are common. Police, forensic interviewers, prosecutors, and other professionals make decisions about when and how to interview children, the accuracy/credibility of their statements, if a case will move forward, and if and how children will testify in court. In some courtrooms, children are given special accommodations (e.g., testifying through closed-circuit TV or being accompanied by a therapy dog). Decisions about evaluating children in these situations have implications for the safety of children and the protection of innocent adults. Future research and policy implications are discussed.
Several evidence-informed consent practices (ECPs) have been shown to improve informed consent in clinical trials but are not routinely used. These include optimizing consent formatting, using plain language, using validated instruments to assess understanding, and involving legally authorized representatives when appropriate. We hypothesized that participants receiving an implementation science toolkit and a social media push would have increased adoption of ECPs and other outcomes.
Methods:
We conducted a 1-year trial with clinical research professionals in the USA (n = 1284) who had trials open to older adults or focused on Alzheimer’s disease. We randomized participants to receive information on ECPs either through a toolkit with a social media push (intervention) or through an online learning module (active control). Participants completed a baseline survey and a follow-up survey after 1 year. A subset of participants was interviewed (n = 43).
Results:
Participants who engaged more with the toolkit were more likely to have tried to implement an ECP during the trial than participants less engaged with the toolkit or the active control group. However, there were no significant differences in the adoption of ECPs, intention to adopt, or positive attitudes. Participants reported the toolkit and social media push were satisfactory, and participating increased their awareness of ECPs. However, they reported lacking the time needed to engage with the toolkit more fully.
Conclusions:
Using an implementation science approach to increase the use of ECPs was only modestly successful. Data suggest that having institutional review boards recommend or require ECPs may be an effective way to increase their use.
Hospital readmission is unsettling to patients and caregivers, costly to the healthcare system, and may leave patients at additional risk for hospital-acquired infections and other complications. We evaluated the association between comorbidities present during index coronavirus disease 2019 (COVID-19) hospitalization and the risk of 30-day readmission.
Design, setting, and participants:
We used the Premier Healthcare database to perform a retrospective cohort study of COVID-19 hospitalized patients discharged between April 2020 and March 2021 who were followed for 30 days after discharge to capture readmission to the same hospital.
Results:
Among the 331,136 unique patients in the index cohort, 36,827 (11.1%) had at least 1 all-cause readmission within 30 days. Of the readmitted patients, 11,382 (3.4%) were readmitted with COVID-19 as the primary diagnosis. In the multivariable model adjusted for demographics, hospital characteristics, coexisting comorbidities, and COVID-19 severity, each additional comorbidity category was associated with an 18% increase in the odds of all-cause readmission (adjusted odds ratio [aOR], 1.18; 95% confidence interval [CI], 1.17–1.19) and a 10% increase in the odds of readmission with COVID-19 as the primary readmission diagnosis (aOR, 1.10; 95% CI, 1.09–1.11). Lymphoma (aOR, 1.86; 95% CI, 1.58–2.19), renal failure (aOR, 1.32; 95% CI, 1.25–1.40), and chronic lung disease (aOR, 1.29; 95% CI, 1.24–1.34) were most associated with readmission for COVID-19.
Conclusions:
Readmission within 30 days was common among COVID-19 survivors. A better understanding of comorbidities associated with readmission will aid hospital care teams in improving postdischarge care. Additionally, it will assist hospital epidemiologists and quality administrators in planning resources, allocating staff, and managing bed-flow issues to improve patient care and safety.
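The per-category effect reported above can be read multiplicatively on the odds scale. A minimal sketch of that arithmetic (only the aOR of 1.18 comes from the abstract; the function and its use are illustrative):

```python
import math

# The reported aOR of 1.18 per additional comorbidity category
# corresponds to a log-odds regression coefficient of ln(1.18).
beta = math.log(1.18)

def odds_multiplier(n_categories: int) -> float:
    """Multiplicative change in the odds of readmission for n extra categories."""
    return math.exp(beta * n_categories)

# One extra category multiplies the odds by 1.18; three categories
# compound to roughly a 64% increase in the odds.
print(round(odds_multiplier(1), 2))  # 1.18
print(round(odds_multiplier(3), 2))  # 1.64
```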
Participants and research professionals often overestimate how well participants understand and appreciate consent information for clinical trials, and experts often vary in their determinations of participants’ capacity to consent to research. Past research has developed and validated instruments designed to assess participant understanding and appreciation, but the frequency with which they are utilized is unknown.
Methods:
We administered a survey to clinical researchers working with older adults or those at risk of cognitive impairment (N = 1284), supplemented by qualitative interviews (N = 60).
Results:
We found that using a validated assessment of consent is relatively uncommon, being used by only 44% of researchers who had an opportunity. Factors that predicted adoption of validated assessments included not seeing the study sponsor as a barrier, positive attitudes toward assessments, and being confident that they had the resources needed to implement an assessment. The perceived barriers to adopting validated assessments of consent included lack of awareness, lack of knowledge, being unsure of how to administer such an assessment, and the burden associated with implementing this practice.
Conclusions:
Increasing the use of validated assessments of consent will require educating researchers on the practice and emphasizing very practical assessments, and may require Institutional Review Boards (IRBs) or study sponsors to champion the use of assessments.
To determine whether electronically available comorbidities and laboratory values on admission are risk factors for hospital-onset Clostridioides difficile infection (HO-CDI) across multiple institutions and whether they could be used to improve risk adjustment.
Patients:
All patients at least 18 years of age admitted to 3 hospitals in Maryland between January 1, 2016, and January 1, 2018.
Methods:
Comorbid conditions were assigned using the Elixhauser comorbidity index. Multivariable log-binomial regression was conducted for each hospital using significant covariates (P < .10) in a bivariate analysis. Standardized infection ratios (SIRs) were computed using current Centers for Disease Control and Prevention (CDC) risk adjustment methodology and with the addition of Elixhauser score and individual comorbidities.
Results:
At hospital 1, 314 of 48,057 patient admissions (0.65%) had an HO-CDI; 41 of 8,791 patient admissions (0.47%) at community hospital 2 had an HO-CDI; and 75 of 29,211 patient admissions (0.26%) at community hospital 3 had an HO-CDI. In multivariable regression, Elixhauser score was a significant risk factor for HO-CDI at all hospitals when controlling for age, antibiotic use, and antacid use. Abnormal leukocyte level at hospital admission was a significant risk factor at hospital 1 and hospital 2. When Elixhauser score was included in the risk adjustment model, it was statistically significant (P < .01). Compared with the current CDC SIR methodology, the SIR of hospital 1 decreased by 2%, whereas the SIRs of hospitals 2 and 3 increased by 2% and 6%, respectively, but the rankings did not change.
Conclusions:
Electronically available patient comorbidities are important risk factors for HO-CDI and may improve risk-adjustment methodology.
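For readers unfamiliar with SIRs, the statistic is simply observed infections divided by the count predicted from a risk-adjustment model, so adding comorbidity covariates changes the SIR only through the predicted count. A minimal sketch with made-up predicted counts (only the 314 observed HO-CDIs at hospital 1 come from the abstract):

```python
def sir(observed: int, predicted: float) -> float:
    """Standardized infection ratio: observed / model-predicted infections."""
    return observed / predicted

# Hypothetical predicted counts for hospital 1: a risk-adjustment model
# that predicts ~2% more infections lowers the SIR by ~2%, as in the
# reported change when Elixhauser score was added.
current_model = sir(314, 290.0)
with_elixhauser = sir(314, 290.0 * 1.02)
print(round(with_elixhauser / current_model, 2))  # 0.98
```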
Schizophrenia is a complex genetic disorder. The lack of concordance of disease manifestation in monozygotic twins provides clear evidence that environmental factors are key determinants in disease aetiology. In accordance, various environmental modulators such as hormones and vitamins may well contribute to the altered expression pattern of several genes identified in post-mortem brain tissue of patients. We constructed a microarray platform containing 1,808 human genes, of which 142 belong to the vitamin A, thyroid hormone and estrogen metabolic pathways, to specifically address whether these could contribute to a molecular signature of schizophrenia in peripheral lymphocytes. We found that the genes encoding PLP1, UGT1A6, NTRK1, THRAP1, ESR2, TRIP13, MAPK8, TGFB2 and IGFBP4 are differentially expressed in lymphocytes of patients under treatment with clozapine, risperidone or haloperidol. Some of these genes have been previously reported by others to be altered in brain regions of schizophrenic patients, and all are implicated in pathways suggested to be involved in the disease. These observations further support that:
1. studies in peripheral blood lymphocytes may contribute to reveal candidate genes in schizophrenia;
2. transcription modulation of several genes may occur through the mediation of vitamin and hormones, therefore linking genes and environment in schizophrenia.
In order to understand the involvement of these genes in schizophrenia, future studies should investigate whether some of the observed changes are replicated in animal models of the disease, and how antipsychotic treatment interferes with their expression.
Levels of serotonin in the body are regulated by the serotonin transporters (SERT), which are predominantly located on the presynaptic terminals of serotonin-containing neurons. Alterations in the density of SERT have been implicated in the pathophysiology of many neuropsychiatric disorders.
Aim
To evaluate 123-I mZIENT (2(S)-[(S)-2b-carbomethoxy-3b-[3′-((Z)-2-iodoethenyl)phenyl]nortropane), a novel radiopharmaceutical for imaging SERT. The bio-distribution of the radiopharmaceutical in humans was investigated and dosimetry performed.
Methods
The study included three healthy volunteers and three patients receiving SSRIs. Whole-body images were obtained on a gamma camera at 10 minutes and at 1, 2, 3, 6, 24 and 48 hours post administration, and dosimetry was performed. ROIs were drawn over the brain, heart, kidneys, liver, lungs, salivary glands, spleen, thyroid and intestines. Blood was sampled at 5, 15 and 30 minutes and at 1, 2, 3, 6, 24 and 48 hours post administration. Urine was collected at 1, 2, 3, 4, 6, 24 and 48 hours. Brain SPECT images were obtained using a neuroSPECT scanner at 4 hours, evaluated visually and analysed using ROI analysis.
Results
High-quality SPECT images can be obtained after 100–150 MBq of 123-I mZIENT. Regional brain uptake was observed in the midbrain and basal ganglia of healthy volunteers, consistent with the known distribution of SERT. Biodistribution images demonstrated the highest uptake in the lungs, brain, liver and intestines. The effective dose was within the range of other commonly used ligands and is acceptable for clinical imaging.
Conclusion
123-I mZIENT is a promising agent for imaging SERT in humans, with acceptable dosimetry.
Despite their legal protection status, protected areas (PAs) can benefit from priority ranks when ongoing threats to their biodiversity and habitats outpace the financial resources available for their conservation. It is essential to develop methods to prioritize PAs that are not computationally demanding in order to suit stakeholders in developing countries where technical and financial resources are limited. We used expert knowledge-derived biodiversity measures to generate individual and aggregate priority ranks of 98 mostly terrestrial PAs on Madagascar. The five variables used were state of knowledge (SoK), forest loss, forest loss acceleration, PA size and relative species diversity, estimated by using standardized residuals from negative binomial models of SoK regressed onto species diversity. We compared our aggregate ranks generated using unweighted averages and principal component analysis (PCA) applied to each individual variable with those generated via Markov chain (MC) and PageRank algorithms. SoK significantly affected the measure of species diversity and highlighted areas where more research effort was needed. The unweighted- and PCA-derived ranks were strongly correlated, as were the MC and PageRank ranks. However, the former two were weakly correlated with the latter two. We recommend using these methods simultaneously in order to provide decision-makers with the flexibility to prioritize those PAs in need of additional research and conservation efforts.
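The Markov-chain aggregation compared in the study can be sketched compactly: a random walk over protected areas moves from one area to any area that beats it on a randomly chosen variable, and the walk's stationary distribution supplies the aggregate rank. A pure-Python sketch on invented toy data (the scores below are hypothetical, not the study's):

```python
def mc_aggregate_rank(scores):
    """scores[v][i]: value of item i on variable v (higher = higher priority).
    Returns stationary probabilities; larger mass = higher aggregate rank."""
    n_vars, n = len(scores), len(scores[0])
    # Transition matrix: from state i, pick a variable and an item uniformly;
    # move to item j only if j strictly beats i on that variable.
    p = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i != j:
                wins = sum(1 for v in range(n_vars) if scores[v][j] > scores[v][i])
                p[i][j] = wins / (n * n_vars)
        p[i][i] = 1.0 - sum(p[i])
    # Power iteration to approximate the stationary distribution.
    pi = [1.0 / n] * n
    for _ in range(200):
        pi = [sum(pi[i] * p[i][j] for i in range(n)) for j in range(n)]
    return pi

# Three toy protected areas scored on two variables; PA 0 dominates both
# variables and so accumulates the largest stationary mass (top rank).
ranks = mc_aggregate_rank([[3, 2, 1], [3, 1, 2]])
```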
The mammal family Tenrecidae (Afrotheria: Afrosoricida) is endemic to Madagascar. Here we present the conservation priorities for the 31 species of tenrec that were assessed or reassessed in 2015–2016 for the IUCN Red List of Threatened Species. Six species (19.4%) were found to be threatened (4 Vulnerable, 2 Endangered) and one species was categorized as Data Deficient. The primary threat to tenrecs is habitat loss, mostly as a result of slash-and-burn agriculture, but some species are also threatened by hunting and incidental capture in fishing traps. In the longer term, climate change is expected to alter tenrec habitats and ranges. However, the lack of data for most tenrecs on population size, ecology and distribution, together with frequent changes in taxonomy (with many cryptic species being discovered based on genetic analyses) and the poorly understood impact of bushmeat hunting on spiny species (Tenrecinae), hinders conservation planning. Priority conservation actions are presented for Madagascar's tenrecs for the first time since 1990 and focus on conserving forest habitat (especially through improved management of protected areas) and filling essential knowledge gaps. Tenrec research, monitoring and conservation should be integrated into broader sustainable development objectives and programmes targeting higher profile species, such as lemurs, if we are to see an improvement in the conservation status of tenrecs in the near future.
Targeted screening for carbapenem-resistant organisms (CROs), including carbapenem-resistant Enterobacteriaceae (CRE) and carbapenemase-producing organisms (CPOs), remains limited; recent data suggest that existing policies miss many carriers.
Objective:
Our objective was to measure the prevalence of CRO and CPO perirectal colonization at hospital unit admission and to use machine learning methods to predict probability of CRO and/or CPO carriage.
Methods:
We performed an observational cohort study of all patients admitted to the medical intensive care unit (MICU) or solid organ transplant (SOT) unit at The Johns Hopkins Hospital between July 1, 2016 and July 1, 2017. Admission perirectal swabs were screened for CROs and CPOs. More than 125 variables capturing preadmission clinical and demographic characteristics were collected from the electronic medical record (EMR) system. We developed models to predict colonization probabilities using decision tree learning.
Results:
Evaluating 2,878 admission swabs from 2,165 patients, we found that 7.5% and 1.3% of swabs were CRO and CPO positive, respectively. Organism and carbapenemase diversity among CPO isolates was high. Despite including many characteristics commonly associated with CRO/CPO carriage or infection, overall, decision tree models poorly predicted CRO and CPO colonization (C statistics, 0.57 and 0.58, respectively). In subgroup analyses, however, models did accurately identify patients with recent CRO-positive cultures who use proton-pump inhibitors as having a high likelihood of CRO colonization.
Conclusions:
In this inpatient population, CRO carriage was infrequent but was higher than previously published estimates. Despite including many variables associated with CRO/CPO carriage, models poorly predicted colonization status, likely due to significant host and organism heterogeneity.
Timely identification of multidrug-resistant gram-negative infections remains an epidemiological challenge. Statistical models for predicting drug resistance can offer utility where rapid diagnostics are unavailable or resource-impractical. Logistic regression–derived risk scores are common in the healthcare epidemiology literature. Machine learning–derived decision trees are an alternative approach for developing decision support tools. Our group previously reported on a decision tree for predicting ESBL bloodstream infections. Our objective in the current study was to develop a risk score from the same ESBL dataset to compare these 2 methods and to offer general guiding principles for using each approach.
Methods:
Using a dataset of 1,288 patients with Escherichia coli or Klebsiella spp bacteremia, we generated a risk score to predict the likelihood that a bacteremic patient was infected with an ESBL-producer. We evaluated discrimination (original and cross-validated models) using receiver operating characteristic curves and C statistics. We compared risk score and decision tree performance, and we reviewed their practical and methodological attributes.
Results:
In total, 194 patients (15%) had bacteremia caused by an ESBL-producing organism. The clinical risk score included 14 variables, compared with the 5 variables of the decision tree. The positive and negative predictive values of the risk score and decision tree were similar (>90%), but the C statistic of the risk score (0.87) was 10% higher.
Conclusions:
A decision tree and risk score performed similarly for predicting ESBL infection. The decision tree was more user-friendly, with fewer variables for the end user, whereas the risk score offered higher discrimination and greater flexibility for adjusting sensitivity and specificity.
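The C statistic used to compare the two tools is the probability that a randomly chosen ESBL case receives a higher score than a randomly chosen non-case, which can be computed directly from pairwise comparisons. A minimal sketch (the scores and labels below are invented; only the metric itself matches the abstract):

```python
def c_statistic(scores, labels):
    """Concordance (AUC): fraction of case/non-case pairs in which the case
    scores higher, counting ties as half a win."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > q else 0.5 if p == q else 0.0
               for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical risk scores for 4 patients (1 = ESBL case, 0 = non-case):
# 3 of the 4 case/non-case pairs are ranked correctly.
print(c_statistic([9, 7, 4, 2], [1, 0, 1, 0]))  # 0.75
```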
Using samples collected for VRE surveillance, we evaluated unit admission prevalence of carbapenem-resistant Enterobacteriaceae (CRE) perirectal colonization and whether CRE carriers (unknown to staff) were on contact precautions for other indications. CRE colonization at unit admission was infrequent (3.9%). Most CRE carriers were not on contact precautions, representing a reservoir for healthcare-associated CRE transmission.
The procedure for synthesizing proto-imogolite (an acid-soluble hydroxy-aluminium orthosilicate complex) and imogolite (a tubular aluminosilicate mineral) was used to produce ferruginous aluminosilicates over a range of Al/Fe ratios to determine whether Fe3+ can be incorporated in the imogolite structure. Analysis of the synthesized products by transmission electron microscopy, electron diffraction, and IR spectroscopy indicated that, while imogolite was formed in the presence of iron, increased Fe3+ in the systems caused the formation of ferrihydrite and poorly-organized aluminosilicates resembling proto-imogolite allophane. Treatment of these materials with Na-citrate/dithionite/bicarbonate dissolved the ferrihydrite and poorly-organized aluminosilicate, and concentrated products with tubular morphology. Analysis of the structural Fe3+ by ESR spectroscopy suggested that little or no Fe3+ was incorporated in the structure of imogolite, although the less crystalline proto-imogolite allophane may have accommodated some structural Fe3+. A separate iron-rich product, identified as ferrihydrite, was formed at low Al/Fe ratios. Mössbauer spectroscopic analysis of 57Fe3+ doped at very low levels into proto-imogolite and imogolite indicated that the sites of substitution were better defined in the latter. At least part of this Fe3+ may have been incorporated in the structure of boehmite, an impurity formed during synthesis.
The suggestion that patches of basal ice may freeze to the bed of a glacier due to certain regelation effects has been tested in the laboratory by applying high hydrostatic pressures to ice samples at the pressure-melting point. During compression, ice temperatures follow the pressure-melting point closely, but after rapid decompression the ice temperature at first returns only half to three-quarters of the way to the pressure-melting point, after which it appears to warm by thermal conduction from outside the ice sample. If the moving ice at the base of a glacier behaves in the same way as it is exposed to changing pressure fields, frozen patches at bedrock are to be expected.
Records of strain variations in a tunnel beneath Glacier d’Argentière show two types of strain events. The first is a rapid jump or offset in the recorded strain, while the second are strain excursions, initiated by a change in strain over a period up to ten seconds, followed by a gradual recovery to the original strain over some minutes. It is suggested that the offset events are due to nearby stress release due to fracturing of frozen patches of ice at the bedrock while the strain-excursion events show the more distant adjustment of the glacial bed to the former events due to the time lags associated with changes of water-film thickness and regelation heat flow.
One of the most conspicuous phenomena in the Arctic is the fracture of sea ice. It is scarcely possible to travel far without seeing a variety of fracture forms, produced both by natural processes and by human activity.
At strain-rates below about 10⁻⁴ s⁻¹, deformation is dominated by creep, but at higher strain-rates fracture is much more important. One of the reasons for this is the very low fracture toughness of ice. The movements of ice in contact with offshore structures often induce strain-rates well beyond the level at which fracture begins, and so offshore structures will often operate in the fracture regime, and it is fracture processes which will determine the design loads. We consider the different modes of repeated fracture that will occur, and classify them into distinct mechanisms of crushing, spalling, and radial and circumferential cracking. Experimental and field observations are plotted on a deformation mode map. A theoretical treatment of radial cracking confirms that very low loads can propagate cracks to long distances; these loads are small by comparison with those calculated from theoretical models that treat ice as a plastically-deforming continuum.
Suicide is a devastating public health problem and very few biological treatments have been found to be effective for quickly reducing the intensity of suicidal ideation (SI). We have previously shown that a single dose of ketamine, a glutamate N-methyl-d-aspartate (NMDA) receptor antagonist, is associated with a rapid reduction in depressive symptom severity and SI in patients with treatment-resistant depression.
Method.
We conducted a randomized, controlled trial of ketamine in patients with mood and anxiety spectrum disorders who presented with clinically significant SI (n = 24). Patients received a single infusion of ketamine or midazolam (as an active placebo) in addition to standard of care. SI measured using the Beck Scale for Suicidal Ideation (BSI) 24 h post-treatment represented the primary outcome. Secondary outcomes included the Montgomery–Asberg Depression Rating Scale – Suicidal Ideation (MADRS-SI) score at 24 h and additional measures beyond the 24-h time-point.
Results.
The intervention was well tolerated and no dropouts occurred during the primary 7-day assessment period. BSI score was not different between the treatment groups at 24 h (p = 0.32); however, a significant difference emerged at 48 h (p = 0.047). MADRS-SI score was lower in the ketamine group compared to midazolam group at 24 h (p = 0.05). The treatment effect was no longer significant at the end of the 7-day assessment period.
Conclusions.
The current findings provide initial support for the safety and tolerability of ketamine as an intervention for SI in patients who are at elevated risk for suicidal behavior. Larger, well-powered studies are warranted.
Helicobacter pylori infection is a major cause of peptic ulcer and is also associated with chronic gastritis, mucosa-associated lymphoid tissue (MALT) lymphoma, and adenocarcinoma of the stomach. Guidelines have been developed in the United States and Europe (areas with low prevalence) for the diagnosis and management of this infection, including the recommendation to ‘test and treat’ those with dyspepsia. A group of international experts performed a targeted literature review and formulated an expert opinion for evidenced-based benefits and harms for screening and treatment of H. pylori in high-prevalence countries. They concluded that in Arctic countries where H. pylori prevalence exceeds 60%, treatment of persons with H. pylori infection should be limited only to instances where there is strong evidence of direct benefit in reduction of morbidity and mortality, associated peptic ulcer disease and MALT lymphoma and that the test-and-treat strategy may not be beneficial for those with dyspepsia.