People with dementia are more prone to premature nursing home placement after hospitalization because physical and mental deconditioning makes care at home more difficult. This study aimed to evaluate the effect of a post-hospital-discharge transitional care program on reducing nursing home placement in people with dementia.
Methods:
A matched case-control study was conducted between 2018 and 2021. A transitional care program using a case-management approach was developed. Participants enrolled in the program by self-enrolment or by referral from hospitals or NGOs. Community-dwelling people with dementia discharged from hospitals received four weeks of residential care at a dementia care centre, with intensive nursing care, physiotherapy and group activities promoting social engagement, followed by eight weeks of day care rehabilitation activities to improve their mobility and cognitive functioning. They were matched at a 1:5 ratio by age and sex to people with dementia discharged from a convalescent hospital who did not participate in the program. The study outcome was nursing home admission, measured three months (i.e. post-intervention), six months, and nine months after hospital discharge. Multinomial logistic regression was conducted to investigate factors associated with nursing home placement at each measurement time-point.
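A minimal sketch of the 1:5 age- and sex-matching step described above, in Python with pandas; the column names (`age`, `sex`) and the exact age tolerance are illustrative assumptions, not the study's actual procedure.

```python
# Illustrative 1:5 case-control matching on age and sex (assumed columns).
import pandas as pd

def match_controls(cases: pd.DataFrame, pool: pd.DataFrame,
                   ratio: int = 5, age_tol: int = 2) -> pd.DataFrame:
    """For each case, draw up to `ratio` same-sex controls whose age
    differs by at most `age_tol` years, sampling without replacement."""
    available, matched = pool.copy(), []
    for _, case in cases.iterrows():
        eligible = available[
            (available["sex"] == case["sex"])
            & ((available["age"] - case["age"]).abs() <= age_tol)
        ]
        picked = eligible.sample(n=min(ratio, len(eligible)), random_state=0)
        matched.append(picked)
        available = available.drop(picked.index)      # no control reused
    return pd.concat(matched) if matched else pool.iloc[0:0]
```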
Results:
361 hospital admission episodes (n = 67 intervention, n = 294 control) were examined. The regression results showed that participants in the intervention group were significantly less likely than controls to be admitted to a nursing home three months (OR = 0.023, 95% CI: 0.003-0.201, p = .001) and six months (OR = 0.094, 95% CI: 0.025-0.353, p = .001) after hospital discharge, but the intervention effect was not sustained nine months after hospital discharge. Longer hospital length of stay, and hospital admission due to dementia, mental disturbances such as delirium, or mental disorders such as schizophrenia, significantly predicted nursing home admission three months and six months after hospital discharge.
Conclusion:
The transitional care program could help reduce nursing home placement in people with dementia after hospital discharge. To sustain the intervention effect, continued support after the intervention, as well as family caregiver training, would be required.
Many clinical trials leverage real-world data. Typically, these data are manually abstracted from electronic health records (EHRs) and entered into electronic case report forms (CRFs), a time- and labor-intensive process that is also error-prone and may miss information. Automated transfer of data from EHRs to CRFs has the potential to reduce the data abstraction and entry burden as well as improve data quality and safety.
Methods:
We conducted a test of automated EHR-to-CRF data transfer for 40 participants in a clinical trial of hospitalized COVID-19 patients. We determined which coordinator-entered data could be automated from the EHR (coverage) and the frequency with which values from the automated EHR feed exactly matched the values entered by study personnel for the actual study (concordance).
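A minimal sketch of the two metrics defined above, computed over paired per-field values; the data layout (a mapping from (participant, field) keys to values) is an assumption for illustration.

```python
# Illustrative coverage/concordance computation over paired CRF values.
# `manual` maps (participant, field) -> coordinator-entered value;
# `auto` maps the same keys to values from the automated EHR feed.

def coverage_and_concordance(manual: dict, auto: dict) -> tuple[float, float]:
    covered = [k for k in manual if k in auto]        # feed supplied a value
    coverage = len(covered) / len(manual)             # e.g. 10,081/11,952 = 84%
    exact = sum(1 for k in covered if auto[k] == manual[k])
    concordance = exact / len(covered) if covered else float("nan")
    return coverage, concordance
```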
Results:
The automated EHR feed populated 10,081 of 11,952 (84%) coordinator-completed values. For fields where both the automation and study personnel provided data, the values matched exactly 89% of the time. Concordance was highest for daily lab results (94%), which also required the most personnel resources (30 minutes per participant). In a detailed analysis of 196 instances where the personnel-entered and automation-entered values differed, both a study coordinator and a data analyst agreed that 152 (78%) were the result of data entry error.
Conclusions:
An automated EHR feed has the potential to significantly decrease study personnel effort while improving the accuracy of CRF data.
Unmanned aerial vehicle (UAV) swarm coverage is one of the key technologies for multi-UAV cooperation, playing an important role in collaborative investigation, detection, rescue and other applications. Aiming at the coverage optimisation problem of UAVs in a target area, a collaborative visual coverage control method under positioning uncertainty is presented. First, the visual perception area under imprecise localisation, the UAV model and the sensor model are created based on the given task environment. Second, a regional division algorithm for the target task area is designed based on the principle of the Guaranteed Voronoi (GV) diagram. Then a visual area coverage planning algorithm is designed, in which the task area is allocated to the UAVs according to the weight coefficient of each area, and the input control law is adjusted by the expected state information of each UAV, so that the optimal coverage quality value and the maximum coverage of the target area can be achieved. Finally, three task scenarios for regional division and coverage planning are simulated; the results show that the proposed area coverage planning algorithm realises the optimal regional distribution and obtains more than 90% coverage in the different scenarios.
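The paper's Guaranteed-Voronoi construction handles positioning uncertainty; the sketch below shows only the classical Lloyd-style coverage baseline that such methods build on: discretize the task area, assign each point to its nearest UAV (an approximate Voronoi partition), and step each UAV toward the centroid of its cell. The square task area, agent count and sensing radius are illustrative assumptions.

```python
# Lloyd-style coverage iteration (a classical baseline, not the paper's
# Guaranteed-Voronoi variant): agents drift toward the centroids of
# their Voronoi cells over a discretized task area.
import numpy as np

rng = np.random.default_rng(0)
agents = rng.uniform(0, 100, size=(5, 2))        # UAV positions in a 100 m square
grid = np.linspace(0, 100, 120)
xs, ys = np.meshgrid(grid, grid)
pts = np.column_stack([xs.ravel(), ys.ravel()])  # discretized task area

SENSING_RADIUS = 20.0                            # assumed visual footprint (m)

def distances():
    return np.linalg.norm(pts[:, None, :] - agents[None, :, :], axis=2)

for _ in range(50):
    owner = distances().argmin(axis=1)           # nearest agent = Voronoi cell
    for i in range(len(agents)):
        cell = pts[owner == i]
        if len(cell):
            agents[i] += 0.5 * (cell.mean(axis=0) - agents[i])  # step to centroid

covered = (distances().min(axis=1) <= SENSING_RADIUS).mean()
print(f"covered fraction of the area: {covered:.1%}")
```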
Glutamatergic dysfunction has been implicated in sensory integration deficits in schizophrenia, yet how glutamatergic function contributes to the behavioural impairments and neural activity associated with sensory integration remains unknown.
Methods
Fifty schizophrenia patients and 43 healthy controls completed behavioural assessments of sensory integration and underwent magnetic resonance spectroscopy (MRS) to measure anterior cingulate cortex (ACC) glutamate levels. The correlation between glutamate levels and behavioural sensory integration deficits was examined in each group. A subsample of 20 pairs of patients and controls further completed an audiovisual sensory integration functional magnetic resonance imaging (fMRI) task. Blood oxygenation level dependent (BOLD) activation and task-dependent functional connectivity (FC) were assessed from the fMRI data. Full factorial analyses were performed to examine Group-by-Glutamate Level interaction effects on the fMRI measurements (group differences in the correlation between glutamate levels and fMRI measurements) and the correlation between glutamate levels and fMRI measurements within each group.
Results
We found that schizophrenia patients exhibited impaired sensory integration that was positively correlated with ACC glutamate levels. Multimodal analyses showed significant Group-by-Glutamate Level interaction effects on BOLD activation as well as on task-dependent FC in a ‘cortico-subcortical-cortical’ network (including the medial frontal gyrus, precuneus, ACC, middle cingulate gyrus, thalamus and caudate), with positive correlations in patients and negative correlations in controls.
Conclusions
Our findings indicate that ACC glutamate influences neural activity in a large-scale network during sensory integration, but with opposite directionality in schizophrenia patients and healthy people. This implicates a crucial role for the glutamatergic system in sensory integration processing in schizophrenia.
The risk of antipsychotic-associated cardiovascular and metabolic events may differ among countries, and limited real-world evidence has been available comparing the corresponding risks among children and young adults. We, therefore, evaluated the risks of cardiovascular and metabolic events in children and young adults receiving antipsychotics.
Methods
We conducted a multinational self-controlled case series (SCCS) study of patients aged 6–30 years who had both exposure to antipsychotics and a study outcome in four nationwide databases from Taiwan (2004–2012), Korea (2010–2016), Hong Kong (2001–2014) and the UK (1997–2016), which together cover approximately 100 million individuals. We investigated a 90-day pre-exposure window and three exposure windows (1–30 days, 31–90 days and 90+ days of exposure). The outcomes were cardiovascular events (stroke, ischaemic heart disease and acute myocardial infarction) and metabolic events (hypertension, type 2 diabetes mellitus and dyslipidaemia).
Results
We included a total of 48 515 individuals in the SCCS analysis. We found an increased risk of metabolic events only in the window with more than 90 days of exposure, with a pooled IRR of 1.29 (95% CI 1.20–1.38); the pooled IRR was 0.98 (0.90–1.06) for 1–30 days and 0.88 (0.76–1.02) for 31–90 days. We found no association with cardiovascular events in any exposure window: the pooled IRR was 1.86 (0.74–4.64) for 1–30 days, 1.35 (0.74–2.47) for 31–90 days and 1.29 (0.98–1.70) for 90+ days.
Conclusions
Long-term exposure to antipsychotics was associated with an increased risk of metabolic events, but not of cardiovascular events, in children and young adults.
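The pooled IRRs above combine database-specific estimates across the four countries. Below is a minimal sketch of fixed-effect inverse-variance pooling on the log scale, with standard errors back-calculated from reported 95% CIs; whether the study used a fixed- or random-effects model is not stated here, and the database-level inputs are hypothetical.

```python
# Illustrative fixed-effect inverse-variance pooling of IRRs on the
# log scale; each SE is recovered from the estimate's 95% CI.
import math

def pool_irr(estimates):                  # [(irr, ci_lo, ci_hi), ...]
    logs, weights = [], []
    for irr, lo, hi in estimates:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        logs.append(math.log(irr))
        weights.append(1.0 / se ** 2)
    pooled = sum(w * x for w, x in zip(weights, logs)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    # returns (pooled IRR, lower 95% limit, upper 95% limit)
    return tuple(math.exp(pooled + z * se_pooled) for z in (0.0, -1.96, 1.96))

# Hypothetical database-level IRRs for one exposure window:
print(pool_irr([(1.25, 1.10, 1.42), (1.33, 1.15, 1.54)]))
```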
Pooling samples to detect the presence of a virus is an effective and efficient strategy for screening carriers in a large population with a low infection rate, reducing both cost and time. There are a number of pooling test methods, some simple and others more complicated. In such pooling tests, the most important parameter to decide is the pool (group) size, which can be optimised mathematically. Two relatively simple pooling methods are considered. The minimum numbers of tests required by these two methods for a population with a known infection rate are derived and compared. The results are useful for identifying asymptomatic carriers in a short time and for implementing health code systems.
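For Dorfman (two-stage) pooling, one of the simplest schemes of the kind the abstract describes, the expected number of tests per person at pool size s and infection rate p is T(s) = 1/s + 1 − (1 − p)^s, minimized near s ≈ 1/√p for small p. A minimal sketch (the 1% infection rate is illustrative):

```python
# Expected tests per person under Dorfman two-stage pooling, and the
# pool size minimizing it for a given infection rate p.
def tests_per_person(s: int, p: float) -> float:
    return 1 / s + 1 - (1 - p) ** s       # one pooled test + retests if positive

def optimal_pool_size(p: float, s_max: int = 100) -> int:
    return min(range(2, s_max + 1), key=lambda s: tests_per_person(s, p))

p = 0.01                                   # assumed 1% infection rate
s = optimal_pool_size(p)
print(s, round(tests_per_person(s, p), 3)) # pool size 11, ~0.196 tests/person
```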
This SHEA white paper identifies knowledge gaps and challenges in healthcare epidemiology research related to coronavirus disease 2019 (COVID-19), with a focus on core principles of healthcare epidemiology. These gaps, revealed during the worst phases of the COVID-19 pandemic, are described in 10 sections: epidemiology, outbreak investigation, surveillance, isolation precaution practices, personal protective equipment (PPE), environmental contamination and disinfection, drug and supply shortages, antimicrobial stewardship, healthcare personnel (HCP) occupational safety, and return-to-work policies. Each section highlights three critical healthcare epidemiology research questions, with detailed descriptions provided in the supplementary materials. This research agenda calls for translational studies, from laboratory-based basic science research to well-designed, large-scale studies and health outcomes research. Research gaps and challenges related to nursing homes and social disparities are included. Collaboration across disciplines, areas of expertise and geographic locations will be critical.
To describe the infection control preparedness measures undertaken in Hong Kong for coronavirus disease (COVID-19) due to SARS-CoV-2 (previously known as 2019 novel coronavirus) in the first 42 days after the announcement of a cluster of pneumonia in China on December 31, 2019 (day 1).
Methods:
A bundled approach of active and enhanced laboratory surveillance, early airborne infection isolation, rapid molecular diagnostic testing, and contact tracing for healthcare workers (HCWs) with unprotected exposure in the hospitals was implemented. Epidemiological characteristics of confirmed cases, environmental samples, and air samples were collected and analyzed.
Results:
From day 1 to day 42, 42 of 1,275 patients (3.3%) fulfilling active (n = 29) or enhanced laboratory surveillance (n = 13) criteria were confirmed to have SARS-CoV-2 infection. The number of locally acquired cases increased significantly, from 1 of 13 confirmed cases (7.7%, day 22 to day 32) to 27 of 29 confirmed cases (93.1%, day 33 to day 42; P < .001). Among them, 28 patients (66.6%) came from 8 family clusters. Of 413 HCWs caring for these confirmed cases, 11 (2.7%) had unprotected exposure requiring quarantine for 14 days. None of them was infected, and nosocomial transmission of SARS-CoV-2 was not observed. Environmental surveillance was performed in the room of a patient with viral loads of 3.3 × 10⁶ copies/mL in pooled nasopharyngeal and throat swabs and 5.9 × 10⁶ copies/mL in saliva. SARS-CoV-2 was identified in 1 of 13 environmental samples (7.7%) but not in 8 air samples collected at a distance of 10 cm from the patient’s chin, with or without a surgical mask.
Conclusion:
Appropriate hospital infection control measures were able to prevent nosocomial transmission of SARS-CoV-2.
Copy number variations (CNVs), an important source of genetic variation, can affect a wide range of phenotypes through diverse mechanisms. The somatostatin receptor 2 (SSTR2) gene plays important roles in cell proliferation and apoptosis. Recently, this gene was mapped to a CNV region that encompasses quantitative trait loci for economically important cattle traits, including body weight and marbling score. SSTR2 CNV may therefore have phenotypic effects on cattle growth traits. In the current study, the distribution of SSTR2 gene CNVs was investigated in six Chinese cattle breeds (XN, QC, NY, JA, LX and PN); the results showed higher CNV polymorphism in XN, QC and NY cattle. Next, association analysis between growth traits and SSTR2 CNV was performed for XN, QC and NY cattle. In NY cattle, individuals with fewer copies performed better than those with more copies. The effect of SSTR2 CNV on SSTR2 mRNA level was also investigated, but no significant correlation was found in either muscle or adipose tissue of adult NY cattle. The results suggest the potential of SSTR2 CNV as a marker for the molecular breeding of NY cattle.
Choosing Wisely Canada (CWC) is a national initiative designed to encourage patient-clinician discussions about the appropriate, evidence-based use of medical tests, procedures and treatments. The Canadian Association of Emergency Physicians’ (CAEP) CWC working group developed and released ten recommendations relevant to emergency medicine in June 2015 (items 1–5) and October 2016 (items 6–10). In November 2016, the CAEP CWC working group developed a process for updating the recommendations. This process involves: 1) using GRADE to evaluate the quality of evidence, 2) reviewing relevant recommendations on an ad hoc basis as new evidence emerges, and 3) reviewing all recommendations every five years. While the full review of the CWC recommendations will be performed in 2020, a number of high-impact studies published after the initial launch prompted an ad hoc review of three of the ten recommendations prior to that full review. This paper describes the results of the CAEP CWC working group's ad hoc review of these three recommendations in light of the recent publications.
The response of soil microbial communities to soil quality changes is a sensitive indicator of soil ecosystem health. The current work investigated soil microbial communities under different fertilization treatments in a 31-year experiment using the phospholipid fatty acid (PLFA) profile method. The experiment consisted of five fertilization treatments: no fertilizer input (CK), chemical fertilizer alone (MF), rice (Oryza sativa L.) straw residue plus chemical fertilizer (RF), a low manure rate plus chemical fertilizer (LOM), and a high manure rate plus chemical fertilizer (HOM). Soil samples were collected from the plough layer, and the results indicated that PLFA content increased in all fertilization treatments compared with the control. The iC15:0 fatty acids increased significantly in the MF treatment but decreased in RF, LOM and HOM, while aC15:0 fatty acids increased in these three treatments. Principal component (PC) analysis was conducted on the 21 PLFAs detected in all treatments to determine the factors defining soil microbial community structure: the first and second PCs explained 89.8% of the total variance. All unsaturated and cyclopropyl PLFAs except C12:0 and C15:0 were highly weighted on the first PC. The first and second PCs also explained 87.1% of the total variance among the fertilization treatments, with no difference between the RF and HOM treatments on either component. The results indicated that long-term combined application of straw residue or organic manure with chemical fertilizer improved soil microbial community structure more than mineral fertilizer alone in double-cropped paddy fields in Southern China.
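A minimal sketch of the principal-component step described above, run on a samples × PLFA matrix with scikit-learn; the matrix here is a random placeholder, so its explained-variance figures will not match the study's 89.8%.

```python
# Illustrative PCA on a samples x PLFA-markers abundance matrix.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.lognormal(size=(15, 21))      # placeholder: 15 samples, 21 PLFAs

pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(X))
print(pca.explained_variance_ratio_.sum())  # variance captured by PC1 + PC2
# pca.components_ holds each PLFA's loading (weight) on PC1 and PC2,
# analogous to the high first-PC weights reported for unsaturated PLFAs.
```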
Recent studies indicate that the early postnatal period is a critical window for gut microbiota manipulation to optimise immunity and body growth. This study investigated the effects of maternal faecal microbiota, orally administered to neonatal piglets after birth, on growth performance, selected microbial populations, intestinal permeability and the development of the intestinal mucosal immune system. In total, 12 litters of crossbred newborn piglets were selected, and litter size was standardised to 10 piglets. On day 1, the 10 piglets in each litter were randomly allotted to the faecal microbiota transplantation (FMT) or control group. Piglets in the FMT group were orally administered 2 ml of a faecal suspension from their nursing sow per day from 1 to 3 days of age; piglets in the control group received the same dose of a placebo (0.1 M potassium phosphate buffer containing 10% glycerol (vol/vol)). The experiment lasted 21 days. On days 7, 14 and 21, plasma and faecal samples were collected for the analysis of growth-related hormones and cytokines in plasma, and of lipocalin-2, secretory immunoglobulin A (sIgA), selected microbiota and short-chain fatty acids (SCFAs) in faeces. Faecal microbiota transplantation increased the average daily gain of piglets during week 3 and over the whole experimental period. Compared with the control group, the FMT group had increased concentrations of plasma growth hormone and IGF-1 on days 14 and 21. Faecal microbiota transplantation also reduced the incidence of diarrhoea during weeks 1 and 3, and reduced plasma concentrations of zonulin and endotoxin and diamine oxidase activity on days 7 and 14. The populations of Lactobacillus spp. and Faecalibacterium prausnitzii and the concentrations of faecal and plasma acetate, butyrate and total SCFAs were higher in the FMT group than in the control group on day 21. Moreover, the FMT piglets had higher concentrations of plasma transforming growth factor-β and immunoglobulin G, and of faecal sIgA, than the control piglets on day 21. These findings indicate that early intervention with maternal faecal microbiota improves growth performance, decreases intestinal permeability, stimulates sIgA secretion, and modulates gut microbiota composition and metabolism in suckling piglets.
Our objective was to identify predictors of severe acute respiratory infection in hospitalised patients and understand the impact of vaccination and neuraminidase inhibitor administration on severe influenza. We analysed data from a study evaluating influenza vaccine effectiveness in two Michigan hospitals during the 2014–2015 and 2015–2016 influenza seasons. Adults admitted to the hospital with an acute respiratory infection were eligible. Through patient interview and medical record review, we evaluated potential risk factors for severe disease, defined as ICU admission, 30-day readmission, and hospital length of stay (LOS). Two hundred sixteen of 1119 participants had PCR-confirmed influenza. Frailty score, Charlson score and tertile of prior-year healthcare visits were associated with LOS. Charlson score >2 (OR 1.5 (1.0–2.3)) was associated with ICU admission. Highest tertile of prior-year visits (OR 0.3 (0.2–0.7)) was associated with decreased ICU admission. Increasing tertile of visits (OR 1.5 (1.2–1.8)) was associated with 30-day readmission. Frailty and prior-year healthcare visits were associated with 30-day readmission among influenza-positive participants. Neuraminidase inhibitors were associated with decreased LOS among vaccinated participants with influenza A (HR 1.6 (1.0–2.4)). Overall, frailty and lack of prior-year healthcare visits were predictors of disease severity. Neuraminidase inhibitors were associated with reduced severity among vaccine recipients.
Introduction: With the current opioid crisis in Canada, presentations of acute opioid withdrawal (AOW) to emergency departments (EDs) are increasing. Undertreated symptoms may result in relapse, overdose and death. Buprenorphine/naloxone (bup/nal) is a partial opioid agonist/antagonist used to mitigate symptoms of AOW, approved by Health Canada in 2007 for opioid use disorder. It is superior to clonidine and increases follow-up with addiction treatment programs when initiated in the ED. Nevertheless, in our inner-city ED in 2014, bup/nal was rarely prescribed. We aimed to increase ED physician prescribing of bup/nal for AOW by 50% over a 26-month period. Methods: Commencing in 2014, an interprofessional team of ED physicians, nurses (RNs), pharmacists and QI specialists collaborated to improve the care of patients with AOW. PDSA cycles included: (1) a needs assessment of emergency physicians' knowledge and practices in 2014; (2) Grand Rounds and a web-based information sheet in 2015; (3) ED stocking of bup/nal; (4) a convenience order set to standardize AOW management; (5) Grand Rounds in 2016; and (6) peer coaching for RNs, including case-based discussions and pocket-card cognitive aids. The outcome was the number of times bup/nal was prescribed per month by ED physicians between September 2015 and October 2017. Data included the prescriber, with use of the order set as the process measure. The balancing measure was the number of patients referred to the Addiction Medicine Team who subsequently received bup/nal. Results: Bup/nal was prescribed 70 times by ED physicians and 14 times by the Addiction Medicine Team. With each PDSA cycle there was an increase in prescribing, with no significant shifts or trends. Across all physicians, the median number of prescriptions per month was 3, increasing from 2 to 4 prescriptions/month after nursing education; the increase was smaller, from 2 to 3 prescriptions/month, for ED physicians alone. The order set was used 97% of the time. Conclusion: Bup/nal is safe, effective, and increases follow-up with addiction programs for comprehensive assessment and treatment planning. We met our goal of increasing bup/nal prescribing in the ED for AOW by 50%. Moreover, prescribing increased by 100% when patients who received bup/nal after referral to the Addiction Medicine Team were included. The intervention with the greatest impact was RN education, demonstrating that peer coaching and teaching by an interprofessional team are key to changing practice. Unfortunately, overall prescribing remains low, and ED physicians may still be hesitant to prescribe bup/nal, deferring to specialists. It is unclear whether this is due to a low number of patients presenting with AOW, patients with contraindications to bup/nal, or ED physician factors. The next step is an audit of all patients with AOW to determine what percentage of those eligible are treated with bup/nal. A follow-up survey to identify ongoing barriers will inform further PDSA cycles.
Multidrug-resistant organisms (MDROs) are increasingly reported in residential care homes for the elderly (RCHEs). We assessed whether implementation of directly observed hand hygiene (DOHH) by hand hygiene ambassadors can reduce environmental contamination with MDROs.
METHODS
From July to August 2017, a cluster-randomized controlled study was conducted at 10 RCHEs (5 intervention versus 5 nonintervention controls), where DOHH was performed at two-hourly intervals during the daytime, before meals and medication rounds, by one trained nurse in each intervention RCHE. Environmental contamination by MDROs, such as methicillin-resistant Staphylococcus aureus (MRSA), carbapenem-resistant Acinetobacter species (CRA), and extended-spectrum β-lactamase (ESBL)–producing Enterobacteriaceae, was evaluated using specimens collected from communal areas at baseline and then twice weekly. The volume of alcohol-based hand rub (ABHR) consumed per resident per week was also measured.
RESULTS
At baseline, environmental specimens from communal areas of the intervention and nonintervention RCHEs were culture-positive for MRSA in 33 of 100 specimens (33%), CRA in 26 of 100 (26%), and ESBL-producing Enterobacteriaceae in 3 of 100 (3%). Serial monitoring of environmental specimens revealed a significant reduction in MRSA (79 of 600 [13.2%] vs 197 of 600 [32.8%]; P < .001) and CRA (56 of 600 [9.3%] vs 94 of 600 [15.7%]; P = .001) contamination in the intervention arm compared with the nonintervention arm during the study period. The volume of ABHR consumed per resident per week was 3 times higher in the intervention arm than at baseline (59.3 ± 12.9 mL vs 19.7 ± 12.6 mL; P < .001) and was significantly higher than in the nonintervention arm (59.3 ± 12.9 mL vs 23.3 ± 17.2 mL; P = .006).
CONCLUSIONS
Directly observed hand hygiene of residents could reduce environmental contamination by MDROs in RCHEs.
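The arm-level contrasts above are standard two-proportion comparisons; a minimal sketch reproduces the MRSA comparison (79/600 vs 197/600) with a chi-square test in SciPy. The choice of statistic is an assumption, as the abstract does not name the test used.

```python
# Chi-square test for the reported MRSA contamination proportions.
from scipy.stats import chi2_contingency

table = [[79, 600 - 79],      # intervention arm: positive, negative specimens
         [197, 600 - 197]]    # nonintervention arm
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.2g}")   # p < .001, as reported
```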
To investigate the role of local allergic inflammation and Staphylococcus aureus enterotoxins in chronic rhinosinusitis with nasal polyps.
Methods:
This study included 36 patients with chronic rhinosinusitis with nasal polyps and 18 controls. Total immunoglobulin E, eosinophil cationic protein, staphylococcal enterotoxin types A and B specific immunoglobulin E, staphylococcal enterotoxin types A and B, and myeloperoxidase levels were determined.
Results:
Four patients with chronic rhinosinusitis with nasal polyps had a local allergy. All chronic rhinosinusitis with nasal polyps patients tested negative for staphylococcal enterotoxin types A and B specific immunoglobulin E. The chronic rhinosinusitis with nasal polyps group had significantly elevated staphylococcal enterotoxin types A and B levels in the supernatant. Fourteen patients belonged to the eosinophilic chronic rhinosinusitis with nasal polyps group and the others were characterised as having non-eosinophilic chronic rhinosinusitis with nasal polyps.
Conclusion:
Local allergy may play a role in chronic rhinosinusitis with nasal polyps, independent of staphylococcal enterotoxin superantigens. Staphylococcal enterotoxins may be important in the pathogenesis of chronic rhinosinusitis with nasal polyps; however, their role as superantigens was not confirmed in this study. In Chinese subjects, chronic rhinosinusitis with nasal polyps usually manifests as neutrophilic inflammation.
Bacillary dysentery continues to be a major health issue in developing countries, and ambient temperature is a possible environmental determinant. However, evidence about the risk of bacillary dysentery attributable to ambient temperature under climate change scenarios is scarce. We examined the attributable fraction (AF) of temperature-related bacillary dysentery in urban and rural Hefei, China during 2006–2012 and projected its shifting pattern under climate change scenarios using a distributed lag non-linear model. The risk of bacillary dysentery increased with temperature above a threshold (18·4 °C), and the temperature effects appeared to be acute. The proportion of bacillary dysentery attributable to hot temperatures was 18·74% (95% empirical confidence interval (eCI): 8·36–27·44%). An apparent difference in AF was observed between urban and rural areas, varying from 26·87% (95% eCI 16·21–36·68%) in the urban area to −1·90% (95% eCI −25·03 to 16·05%) in the rural area. Under the climate change scenarios alone (1–4 °C rise), the AF from extreme hot temperatures (>31·2 °C) would rise greatly, while the AF from moderate hot temperatures (18·4–31·2 °C) would remain relatively stable. If climate change proceeds, the urban area may be more likely to suffer a rapidly increasing burden of disease from extreme hot temperatures in the absence of effective mitigation and adaptation strategies.
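A minimal sketch of the attributable-fraction logic behind these estimates: given a daily relative risk RR(t) from an exposure-response curve above the 18·4 °C threshold, the heat-attributable cases each day are cases × (RR − 1)/RR, and the AF is their sum divided by total cases. The study fitted a distributed lag non-linear model; the log-linear curve and simulated data below are purely illustrative assumptions.

```python
# Toy attributable-fraction calculation for heat-related cases.
import numpy as np

rng = np.random.default_rng(1)
temp = rng.normal(22, 6, 365)            # daily mean temperature (deg C)
cases = rng.poisson(30, 365)             # daily bacillary dysentery counts

THRESHOLD = 18.4                         # risk threshold from the study

def rr(t):                               # assumed log-linear curve above threshold
    return np.exp(0.03 * np.maximum(t - THRESHOLD, 0.0))

r = rr(temp)
attributable = cases * (r - 1.0) / r     # cases attributable to heat each day
af = attributable.sum() / cases.sum()
print(f"AF from hot temperatures: {af:.1%}")
```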