To evaluate the utility of selective reactive whole-genome sequencing (WGS) in aiding healthcare-associated cluster investigations.
Design:
Mixed-methods quality-improvement study.
Setting:
This study was conducted across 8 acute-care facilities in an integrated health system.
Methods:
We analyzed healthcare-associated coronavirus disease 2019 (COVID-19) clusters between May 2020 and July 2022 for which facility infection prevention and control (IPC) teams selectively requested reactive WGS to aid the epidemiologic investigation. WGS was performed with real-time results provided to IPC teams, including genetic relatedness of sequenced isolates. We conducted structured interviews with IPC teams on the informativeness of WGS for transmission investigation and prevention.
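The abstract does not detail how genetic relatedness was operationalized. As a hedged illustration of the general idea only, the Python sketch below computes pairwise SNP distances between aligned consensus genomes and flags pairs at or below an assumed relatedness threshold; the function names, input format, and 2-SNP cutoff are illustrative assumptions, not the study's pipeline.

```python
# Illustrative only: flag genetically related isolate pairs by pairwise
# SNP distance over aligned consensus genomes (threshold is an assumption).
from itertools import combinations

def snp_distance(seq_a: str, seq_b: str) -> int:
    """Count positions where two aligned genomes differ, ignoring Ns."""
    return sum(1 for a, b in zip(seq_a, seq_b)
               if a != b and a != "N" and b != "N")

def putative_transmission_pairs(genomes: dict[str, str], threshold: int = 2):
    """Return isolate pairs whose SNP distance is at or below the threshold."""
    return [(x, y, d) for x, y in combinations(genomes, 2)
            if (d := snp_distance(genomes[x], genomes[y])) <= threshold]

# Toy 10-base "genomes":
isolates = {"case1": "ACGTACGTAA", "case2": "ACGTACGTAA", "case3": "TTGTACGGAA"}
print(putative_transmission_pairs(isolates))  # [('case1', 'case2', 0)]
```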
Results:
In total, 8 IPC teams requested WGS to aid the investigation of 17 COVID-19 clusters comprising 226 cases and 116 (51%) sequenced isolates. Of these, 16 (94%) clusters had at least 1 WGS-defined transmission event. IPC teams hypothesized transmission pathways in 14 (82%) of 17 clusters and used data visualizations to characterize these pathways in 11 clusters (65%). The teams reported that in 15 clusters (88%), WGS identified a transmission pathway; the WGS-defined pathway was not one that was predicted by epidemiologic investigation in 7 clusters (41%). WGS changed the understanding of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) transmission in 8 clusters (47%) and altered infection prevention interventions in 8 clusters (47%).
Conclusions:
Selectively utilizing reactive WGS helped identify cryptic SARS-CoV-2 transmission pathways and frequently changed the understanding of and response to SARS-CoV-2 outbreaks. Until WGS is widely adopted, a selective reactive WGS approach may be highly impactful in healthcare-associated cluster investigations.
To assess the impact of a 24-hour autocancellation of uncollected Clostridioides difficile samples in reducing reported healthcare-associated infections (HAIs).
Setting:
The study was conducted in 17 hospitals in Pennsylvania.
Interventions:
Clostridioides difficile tests that are not collected within 24 hours are automatically canceled (“autocancel”) through the electronic health record. The intervention took place at 2 facilities (intervention period November 2021–July 2022) and subsequently at 15 additional facilities (April 2022–July 2022). Quality measures included percentage of orders canceled, C. difficile HAI rate, percent positivity of completed tests, and potential adverse outcomes of canceled or delayed testing.
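As a rough sketch of the autocancellation rule described above (a hypothetical stand-in, not the health system's actual EHR logic), the following Python function cancels any pending, uncollected C. difficile order older than 24 hours:

```python
# Hypothetical model of the 24-hour autocancellation rule; the LabOrder
# shape and test name are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class LabOrder:
    test: str
    ordered_at: datetime
    collected: bool = False
    status: str = "pending"

def autocancel_stale_cdiff_orders(orders: list[LabOrder], now: datetime,
                                  cutoff: timedelta = timedelta(hours=24)) -> int:
    """Cancel uncollected C. difficile orders older than the cutoff; return the count."""
    canceled = 0
    for order in orders:
        if (order.test == "C. difficile PCR" and not order.collected
                and order.status == "pending" and now - order.ordered_at > cutoff):
            order.status = "autocanceled"
            canceled += 1
    return canceled
```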
Results:
Of 6,101 orders, 1,090 (17.9%) were automatically canceled after not being collected within 24 hours during the intervention periods. The reported C. difficile HAI rates per 10,000 patient days did not significantly change. These rates were 8.07 in the 6-month preintervention period and 8.77 in the intervention period for facilities A and B combined (incidence rate ratio [IRR], 1.09; 95% CI, 0.88–1.34; P = .43), and 5.23 in the 6-month preintervention period and 5.33 in the intervention period for facilities C–Q combined (IRR, 1.02; 95% CI, 0.79–1.32; P = .87). From the preintervention to the intervention periods, the percent positivity of completed C. difficile tests increased by 1.1 percentage points for facilities A and B and by 1.4 percentage points for facilities C–Q. No adverse outcomes were observed.
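For readers unfamiliar with the incidence rate ratio (IRR) arithmetic reported above, the minimal sketch below computes an IRR with a Wald 95% CI from event counts and patient-days; the example counts are toy values chosen to echo the reported rates, not the study's actual data.

```python
# IRR = (post rate) / (pre rate); a Wald 95% CI uses SE[log(IRR)] = sqrt(1/a + 1/b).
import math

def irr_with_ci(events_pre, days_pre, events_post, days_post, z=1.96):
    irr = (events_post / days_post) / (events_pre / days_pre)
    se_log = math.sqrt(1 / events_pre + 1 / events_post)  # SE of log(IRR)
    lo, hi = (math.exp(math.log(irr) + s * z * se_log) for s in (-1, 1))
    return irr, lo, hi

# e.g. 80 HAIs over 99,000 patient-days pre vs. 87 HAIs over 99,200 post
print(irr_with_ci(80, 99_000, 87, 99_200))  # IRR ≈ 1.09
```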
Conclusions:
The 24-hour autocancellation of uncollected C. difficile orders reduced testing but did not reduce reported HAI rates.
We analyzed the efficacy of a centralized surveillance infection prevention (CSIP) program in a healthcare system on healthcare-associated infection (HAI) rates amid the coronavirus disease 2019 (COVID-19) pandemic. HAI rates were variable in CSIP and non-CSIP facilities. Central-line–associated bloodstream infection (CLABSI), C. difficile infection (CDI), and surgical-site infection (SSI) rates were negatively correlated with COVID-19 intensity in CSIP facilities.
To develop, implement, and evaluate the effectiveness of a unique centralized surveillance infection prevention (CSIP) program.
Design:
Observational quality improvement project.
Setting:
An integrated academic healthcare system.
Intervention:
The CSIP program comprises senior infection preventionists who are responsible for healthcare-associated infection (HAI) surveillance and reporting, allowing local infection preventionists (LIPs) to devote a greater portion of their time to non-surveillance patient safety activities. Four CSIP team members assumed HAI surveillance responsibilities at 8 facilities.
Methods:
We evaluated the effectiveness of the CSIP program using 4 measures: recovery of LIP time, efficiency of surveillance activities by LIPs and CSIP staff, surveys characterizing LIP perception of their effectiveness in HAI reduction, and nursing leaders’ perception of LIP effectiveness.
Results:
The amount of time spent by LIP teams on HAI surveillance was highly variable, whereas CSIP time commitment and efficiency were steady. Post-CSIP implementation, 76.9% of LIPs agreed that they spent adequate time on inpatient units, compared to 15.4% pre-CSIP; LIPs also reported having more time to allot to non-surveillance activities. Nursing leaders reported greater satisfaction with LIP involvement in HAI reduction practices.
Conclusion:
CSIP programs are a little-reported strategy for easing the burden on LIPs by reallocating HAI surveillance. The analyses presented here will help health systems anticipate the benefits of CSIP programs.
Barrett’s oesophagus (BE) is the precursor of oesophageal adenocarcinoma, which has become the most common type of oesophageal cancer in many Western populations. Existing evidence on diet and risk of BE predominantly comes from case–control studies, which are subject to recall bias in the measurement of diet. We aimed to investigate the potential effect of diet, including macronutrients, carotenoids, food groups, specific food items, beverages and dietary scores, on risk of BE in over 20 000 participants of the Melbourne Collaborative Cohort Study. Diet at baseline (1990–1994) was measured using a food frequency questionnaire. The outcome was BE diagnosed between baseline and follow-up (2007–2010). Logistic regression models were used to estimate OR and 95 % CI for diet in relation to risk of BE. Intakes of leafy vegetables and fruit were inversely associated with risk of BE (highest v. lowest quartile: OR = 0·59; CI: 0·38, 0·94; P-trend = 0·02 and OR = 0·58; CI: 0·37, 0·93; P-trend = 0·02, respectively), as were dietary fibre and carotenoids. Stronger associations were observed for foods than for the nutrients they contain. Positive associations were observed for discretionary food (OR = 1·54; CI: 0·97, 2·44; P-trend = 0·04) and total fat intake (OR per 10 g/d = 1·11; CI: 1·00, 1·23), although the association for fat was less robust in sensitivity analyses. No association was observed for meat, protein, dairy products or diet scores. Diet is a potentially modifiable risk factor for BE. Public health and clinical guidelines that incorporate dietary recommendations could contribute to a reduction in risk of BE and, thereby, oesophageal adenocarcinoma.
To evaluate the effectiveness of ultraviolet-C (UV-C) disinfection as an adjunct to standard chlorine-based disinfectant terminal room cleaning in reducing transmission of hospital-acquired multidrug-resistant organisms (MDROs) from a prior room occupant.
Design:
A retrospective cohort study was conducted to compare rates of MDRO transmission by UV-C status from January 1, 2016, through December 31, 2018.
Setting:
Acute-care, single-patient hospital rooms at 6 hospitals within an academic healthcare system in Pennsylvania.
Methods:
Transmission of hospital-acquired MDRO infection was assessed in patients subsequently assigned to a single-patient room of a source occupant with carriage of 1 or more MDROs on or during admission. Acquisition of 5 pathogens was compared between exposed patients in rooms with standard-of-care chlorine-based disinfectant terminal cleaning with or without adjunct UV-C disinfection. Logistic regression analysis was used to estimate the adjusted risk of pathogen transfer with adjunctive use of UV-C disinfection.
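As a hedged sketch of the covariate-adjusted logistic regression described above (synthetic data; the variable names and covariates are assumptions, since the study's covariate list is not reproduced here):

```python
# Fit a logistic regression of pathogen transfer on UV-C exposure plus
# covariates, then read off the adjusted odds ratio for UV-C.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5_000
df = pd.DataFrame({
    "acquired_mdro": rng.integers(0, 2, n),  # outcome: pathogen transfer
    "uvc": rng.integers(0, 2, n),            # exposure: adjunct UV-C used
    "age": rng.normal(60, 15, n),            # illustrative covariates
    "los_days": rng.exponential(5, n),
})

model = smf.logit("acquired_mdro ~ uvc + age + los_days", data=df).fit(disp=False)
odds_ratio = np.exp(model.params["uvc"])     # adjusted OR for UV-C
ci_low, ci_high = np.exp(model.conf_int().loc["uvc"])
print(f"adjusted OR {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```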
Results:
In total, 33,771 exposed patient admissions were evaluated; the source occupants carried 46,688 unique pathogens. Prior to the 33,771 patient admissions, 5,802 rooms (17.2%) were treated with adjunct UV-C disinfection. After adjustment for covariates, exposed patients in rooms treated with adjunct UV-C were at comparable risk of transfer of any pathogen (odds ratio, 1.06; 95% CI, 0.84–1.32; P = .64).
Conclusion:
Our analysis does not support the use of UV-C in addition to post-discharge cleaning with chlorine-based disinfectant to lower the risk of prior room occupant pathogen transfer.
To define conditions in which contact precautions can be safely discontinued for methicillin-resistant Staphylococcus aureus (MRSA) and vancomycin-resistant Enterococcus (VRE).
Design:
Interrupted time series.
Setting:
15 acute-care hospitals.
Participants:
Inpatients.
Intervention:
Contact precautions for endemic MRSA and VRE were discontinued in 12 intervention hospitals and continued at 3 nonintervention hospitals. Rates of MRSA and VRE healthcare-associated infections (HAIs) were collected for 12 months before and after. Trends in HAI rates were analyzed using Poisson regression. To predict conditions when contact precautions may be safely discontinued, selected baseline hospital characteristics and infection prevention practices were correlated with HAI rate changes, stratified by hospital.
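As an illustration of the Poisson rate analysis named above (not the authors' model specification), the sketch below fits a Poisson regression with patient-days as the exposure and reads off a before/after rate ratio; the counts are toy values chosen to echo the reported rates.

```python
# Model HAI counts with patient-days as exposure; exp(coefficient on the
# period indicator) is the rate ratio after vs. before discontinuation.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "hai_count":    [14, 15, 5, 5],     # toy before/after counts per organism
    "patient_days": [100_000] * 4,
    "post":         [0, 1, 0, 1],       # 0 = before, 1 = after discontinuation
    "organism":     ["MRSA", "MRSA", "VRE", "VRE"],
})

model = smf.poisson("hai_count ~ post + organism", data=df,
                    exposure=df["patient_days"]).fit(disp=False)
print(np.exp(model.params["post"]))     # rate ratio, after vs. before
```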
Results:
Aggregated HAI rates from intervention hospitals before and after discontinuation of contact precautions were 0.14 and 0.15 MRSA HAIs per 1,000 patient days (P = .74), 0.05 and 0.05 VRE HAIs per 1,000 patient days (P = .96), and 0.04 and 0.04 MRSA laboratory-identified (LabID) events per 100 admissions (P = .57). No statistically significant rate changes occurred between intervention and nonintervention hospitals. All successful hospitals had low baseline MRSA and VRE HAI rates and high hand hygiene adherence. We observed no correlations between rate changes after discontinuation and the assessed hospital characteristics and infection prevention factors, although rates improved with a higher proportion of semiprivate rooms (P = .04).
Conclusions:
Discontinuing contact precautions for MRSA/VRE did not result in increased HAI rates, suggesting that contact precautions can be safely removed from diverse hospitals, including community hospitals and those with lower proportions of private rooms. Good hand hygiene and low baseline HAI rates may be conditions permissive of safe removal of contact precautions.
To examine associations between diet and risk of developing gastro-oesophageal reflux disease (GERD).
Design:
Prospective cohort with a median follow-up of 15·8 years. Baseline diet was measured using a FFQ. GERD was defined as self-reported current or history of daily heartburn or acid regurgitation beginning at least 2 years after baseline. Sex-specific logistic regressions were performed to estimate OR for GERD associated with diet quality scores and intakes of nutrients, food groups and individual foods and beverages. The effect of substituting saturated fat for monounsaturated or polyunsaturated fat on GERD risk was examined.
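Substitution analyses of this kind are commonly implemented as leave-one-out models; the sketch below shows one such formulation on synthetic data, where omitting saturated fat while adjusting for the other fat classes and total energy lets the polyunsaturated-fat coefficient be read as the effect of substituting it for saturated fat. Variable names are assumptions, and the authors' exact specification may differ.

```python
# Leave-one-out substitution model: saturated fat is deliberately omitted,
# so each included fat coefficient represents isocaloric substitution for it.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2_000
df = pd.DataFrame({
    "gerd": rng.integers(0, 2, n),             # outcome (synthetic)
    "pufa_g": rng.normal(15, 5, n),            # polyunsaturated fat, g/d
    "mufa_g": rng.normal(25, 8, n),            # monounsaturated fat, g/d
    "energy_kj": rng.normal(9_000, 2_000, n),  # total energy intake
})

model = smf.logit("gerd ~ pufa_g + mufa_g + energy_kj", data=df).fit(disp=False)
print(np.exp(model.params["pufa_g"]))  # OR per g PUFA substituted for saturated fat
```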
Setting:
Melbourne, Australia.
Participants:
A cohort of 20 926 participants (62 % women) aged 40–59 years at recruitment between 1990 and 1994.
Results:
For men, total fat intake was associated with increased risk of GERD (OR 1·05 per 5 g/d; 95 % CI 1·01, 1·09; P = 0·016), whereas total carbohydrate (OR 0·89 per 30 g/d; 95 % CI 0·82, 0·98; P = 0·010) and starch intakes (OR 0·84 per 30 g/d; 95 % CI 0·75, 0·94; P = 0·005) were associated with reduced risk. Nutrients were not associated with risk for women. For both sexes, substituting saturated fat for polyunsaturated or monounsaturated fat did not change risk. For both sexes, fish, chicken, cruciferous vegetables and carbonated beverages were associated with increased risk, whereas total fruit and citrus were associated with reduced risk. No association was observed with diet quality scores.
Conclusions:
Diet is a possible risk factor for GERD, but foods considered triggers of GERD symptoms might not necessarily contribute to disease development. Potential differential associations for men and women warrant further investigation.
In Chapter 19, the authors deal with the practices of good teachers of listening. They describe an interventionist study aimed at developing teachers’ understanding of listening pedagogy and practice and present the implications of the study for models of teacher development.
We examined whether and when English-learning 17-month-olds would accommodate Japanese forms as labels for novel objects. In Experiment 1, infants (n = 22) who were habituated to Japanese word–object pairs looked longer at switched test pairs than familiar test pairs, suggesting that they had mapped Japanese word forms to objects. In Experiments 2 (n = 44) and 3 (n = 22), infants were presented with a spoken passage prior to habituation to assess whether experience with a different language would shift their perception of Japanese word forms. Here, infants did not demonstrate learning of Japanese word–object pairs. These findings offer insight into the flexibility of the developing perceptual system. That is, when there is no evidence to the contrary, 17-month-olds will accommodate forms that vary from their typical input but will efficiently constrain their perception when cued to the fact that they are not listening to their native language.
This paper considers research and practice relating to listening in instructed classroom settings, limiting itself to what might be called unidirectional listening (Macaro, Graham & Vanderplank 2007) – in other words, where learners listen to a recording, a TV or radio clip or lecture, but where there is no communication back to the speaker(s). A review of the literature relating to such listening reveals a tendency for papers to highlight two features in their introductory lines: first, the acknowledged importance of listening as a skill underpinning second language (L2) acquisition more broadly, and second, the relative paucity of research into listening compared with the skills of speaking, reading or writing. In the last ten years or so, however, there has been a growth in the number of studies conducted in the field, as evidenced in Vandergrift's review in 2007 and Vanderplank's more recent overview (2013). Consequently, my view is that it is possible to identify from that research certain key principles in relation to listening within instructed settings, particularly regarding listening strategies.
We investigated 16- and 20-month-olds' flexibility in mapping phonotactically illegal words to objects. Using an associative word-learning task, infants were presented with a training phase that either highlighted or did not highlight the referential status of a novel label. Infants were then habituated to two novel objects, each paired with a phonotactically illegal Czech word. When referential cues were provided, 16-, but not 20-month-olds, formed word–object mappings. In the absence of referential cues, infants of both ages failed to map the novel words. These findings illustrate the complex interplay between infants' developing sound system and their word learning abilities.
Research suggests that the way in which cognitive therapy is delivered is an important factor in determining outcomes. We test the hypothesis that the development of a shared problem list, use of case formulation, homework tasks and active intervention strategies act as process variables.
Method
The presence of these components during therapy is taken from therapist notes. The direct and indirect effects of the intervention are estimated by an instrumental variable analysis.
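For readers unfamiliar with instrumental-variable estimation, the sketch below runs a minimal two-stage least-squares analysis on synthetic data, with randomized allocation as the instrument for actual receipt of a therapy component; all names and the simulated effect size are illustrative assumptions, not the study's model.

```python
# Two-stage least squares: stage 1 predicts the endogenous exposure from
# the instrument; stage 2 regresses the outcome on those fitted values.
import numpy as np

rng = np.random.default_rng(1)
n = 500
z = rng.integers(0, 2, n)                        # instrument: randomized allocation
confounder = rng.normal(size=n)
x = (z + (confounder > 0.5)).clip(0, 1)          # endogenous: component received
y = -0.23 * x + confounder + rng.normal(size=n)  # outcome: symptom score

Z = np.column_stack([np.ones(n), z])
x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]   # stage 1 fitted values

X_hat = np.column_stack([np.ones(n), x_hat])
beta = np.linalg.lstsq(X_hat, y, rcond=None)[0]    # stage 2
print(beta[1])  # IV estimate of the component's effect (true value: -0.23)
```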
Results
Significant decreases in the symptom score are found for case formulation (coefficient = –0.23, 95% CI –0.44 to –0.017, P = 0.036) and homework (coefficient = –0.26, 95% CI –0.51 to –0.001, P = 0.049). Improvement with the inclusion of active change strategies is of borderline significance (coefficient = –0.23, 95% CI –0.47 to 0.005, P = 0.056).
Conclusions
There is a greater treatment effect if formulation and homework are involved in therapy. However, high correlation between components means that these may be indicators of overall treatment fidelity.
The difficulties in conducting palliative care research have been widely acknowledged. In order to generate the evidence needed to underpin palliative care provision, collaborative research is considered essential. Prior to formalizing the development of a research network for the state of Victoria, Australia, a preliminary study was undertaken to ascertain interest and recommendations for the design of such a collaboration.
Method:
Three data-collection strategies were used: a cross-sectional questionnaire, interviews, and workshops. The questionnaire was completed by multidisciplinary palliative care specialists from across the state (n = 61); interviews were conducted with senior clinicians and academics (n = 21), followed by two stakeholder workshops (n = 29). The questionnaire was constructed specifically for this study, measuring involvement in and perceptions of palliative care research.
Results:
Both the interview and the questionnaire data demonstrated strong support for a palliative care research network and aided in establishing a research agenda. The stakeholder workshops assisted with strategies for the formation of the Palliative Care Research Network Victoria (PCRNV) and guided the development of the mission and strategic plan.
Significance of results:
The research and efforts to date to establish the PCRNV are encouraging and provide optimism for the evolution of palliative care research in Australia. The international implications are highlighted.
Background: Substantial epidemiological research has shown that psychotic experiences are more common in densely populated areas. Many patients with persecutory delusions find it difficult to enter busy social urban settings. The stress and anxiety caused by being outside lead many patients to remain indoors. We therefore developed a brief CBT intervention, based upon a formulation of the way urban environments cause stress and anxiety, to help patients with paranoid thoughts feel less distressed when outside in busy streets. Aims: The aim was to pilot the new intervention for feasibility and acceptability and to gather preliminary outcome data. Method: Fifteen patients with persecutory delusions in the context of a schizophrenia diagnosis took part. All patients first went outside to test their reactions, received the intervention, and then went outside again. Results: The intervention was considered useful by the patients. There was evidence that going outside after the intervention led to less paranoid responses than the initial exposure, but this was only statistically significant for levels of distress. Conclusions: Initial evidence was obtained that a brief CBT module specifically focused on helping patients with paranoia go outside is feasible, acceptable, and may have clinical benefits. However, it could not be determined from this small feasibility study that any observed improvements were due to the CBT intervention. Challenges in this area and future work required are outlined.
Background: Research suggests that core schemas are important in both the development and maintenance of psychosis. Aims: The aim of the study was to investigate and compare core schemas in four groups along the continuum of psychosis and examine the relationships between schemas and positive psychotic symptomatology. Method: A measure of core schemas was distributed to 20 individuals experiencing first-episode psychosis (FEP), 113 individuals with “at risk mental states” (ARMS), 28 participants forming a help-seeking clinical group (HSC), and 30 non-help-seeking individuals who endorse some psychotic-like experiences (NH). Results: The clinical groups scored significantly higher than the NH group for negative beliefs about self and about others. No significant effects of group on positive beliefs about others were found. For positive beliefs about the self, the NH group scored significantly higher than the clinical groups. Furthermore, negative beliefs about self and others were related to positive psychotic symptomatology and to distress related to those experiences. Conclusions: Negative evaluations of the self and others appear to be characteristic of the appraisals of people seeking help for psychosis and psychosis-like experiences. The results support the literature that suggests that self-esteem should be a target for intervention. Future research would benefit from including comparison groups of people experiencing chronic psychosis and people who do not have any psychotic-like experiences.
Delusions are a key symptom of psychosis and they are frequently distressing and disabling. Existing treatments, both pharmacological and psychological, are only partially effective. It is important to develop new treatment approaches based on theoretically derived and empirically tested processes. Delusions are associated with a reasoning bias: the jumping to conclusions (JTC) bias involves gathering limited information to reach decisions. It is proposed that this bias influences appraisals of psychotic experiences leading to the formation and persistence of delusions. Existing treatments do not influence JTC. A new intensive treatment approach – ‘reasoning training’ – is described. It aims to encourage participants to gather information, consider alternative explanations for events and review the evidence before reaching a decision. Preliminary data suggest that it is possible to change the JTC bias and that this improves belief flexibility and may reduce delusional conviction. The concepts and methods of this new approach have implications for clinical practice.
Background: Dementia research often requires the participation of people with dementia. Obtaining informed consent is problematic when potential participants lack the capacity to provide it. We investigated comfort with proxy consent to research involving older adults deemed incapable of this decision, and examined if comfort varies with the type of proxy and the study's risk-benefit profile.
Methods: We surveyed random samples of five relevant groups (older adults, informal caregivers, physicians, researchers in aging, and Research Ethics Board members) from four Canadian provinces. Respondents were presented with scenarios involving four types of proxies (non-assigned, designated in a healthcare advance directive with or without instructions specific to research participation, and court-appointed). Given a series of risk-benefit profiles, respondents indicated whether they were comfortable with proxy consent to research for each scenario.
Results: Two percent of the respondents felt proxy consent should never be allowed. In all groups, comfort depended far more on the risk-benefit profile associated with the research scenario than on the type of proxy. For research involving little or no risk and potential personal benefits, over 90% of the respondents felt comfortable with substitute consent by a designated or court-appointed proxy, while 80% were at ease with a non-assigned proxy. For studies involving serious risks with potentially greater personal benefits, older adults and informal caregivers were less comfortable with proxy consent.
Conclusions: A large majority of Canadians are comfortable with proxy consent for low-risk research. Further work is needed to establish what kinds of research are considered to be low risk.