In functional magnetic resonance imaging (fMRI), the blood oxygenation level dependent (BOLD) signal is often interpreted as a measure of neural activity. However, because the BOLD signal reflects the complex interplay of neural, vascular, and metabolic processes, such an interpretation is not always valid. There is growing evidence that changes in the baseline neurovascular state can result in significant modulations of the BOLD signal that are independent of changes in neural activity. This paper introduces some of the normalization and calibration methods that have been proposed for making the BOLD signal a more accurate reflection of underlying brain activity for human fMRI studies.
Functional magnetic resonance imaging (fMRI) is a noninvasive method for measuring brain function by correlating temporal changes in local cerebral blood oxygenation with behavioral measures. fMRI is used to study individuals at single time points, across multiple time points (with or without intervention), and to examine the variation of brain function across normal and ill populations. fMRI data may also be collected at multiple sites and pooled into a single analysis. This paper describes how fMRI data are analyzed at each of these levels and the noise sources introduced at each level.
Little is known about the skills involved in clinical formulation. The individual case formulation (ICF) approach, based on functional analysis, employs clinical descriptions that are theory-free and depicts formulations constructed according to a set of basic conventions.
Aims:
We report a test of whether this method could be taught and whether the quality of the resulting diagrams could be reliably rated.
Method:
Participants (n = 40) attended a training course in formulation. A draft rating scale was refined in the course of rating formulation diagrams, and basic inter-rater reliability was established.
Results:
Results of the study support further development of the ICF approach.
Imagery rescripting (ImRs) is a therapy technique that, unlike traditional re-living techniques, focuses less on exposure and verbal challenging of cognitions and instead encourages patients to directly transform the intrusive imagery to change the depicted course of events in a more desired direction. However, a comprehensive account of how and in what circumstances ImRs brings about therapeutic change is required if treatment is to be optimised, and this is yet to be developed. The present study reports on the development of a coding scheme of ImRs psychotherapy elements identified in the literature as potential ImRs mechanisms. The codes were assessed in relation to short-term outcomes of 27 individuals undergoing ImRs for post-traumatic stress disorder. The timing of the change in the image, degree of activation of the new image and associated cognitive, emotional and physiological processes, self-guided rescripting, rescript believability, narrative coherence and cognitive and emotional shift were identified as being related to symptom change, and so are potentially important factors in the rescripting process.
The Critically Endangered Chapman's pygmy chameleon Rhampholeon chapmanorum is endemic to the low elevation rainforest of the Malawi Hills in southern Malawi. Much of this forest has been converted to agriculture and it was uncertain whether chameleon populations have persisted. We used current and historical satellite imagery to identify remaining forest patches and assess deforestation. We then surveyed forest patches for the presence of this chameleon, and assessed its genetic diversity and structure. We estimated that 80% of the forest has been destroyed since 1984, although we found extant populations of the chameleon in each of the patches surveyed. Genetic differentiation between populations was strong, suggesting that gene flow has been impaired. Genetic diversity was not low, but this could be the result of a temporal lag as well as lack of sensitivity in the mitochondrial marker used. Overall, forest loss is assumed to have caused a large demographic decline, with forest fragmentation preventing gene flow.
Imagery rescripting (ImRs) is an experiential therapy technique used to change the content and meaning of intrusive imagery in post-traumatic stress disorder (PTSD) by imagining alternative endings to traumatic events. There is growing evidence that ImRs is an effective treatment for PTSD; however, little is known about how it brings about change.
Aims:
This study aimed to explore the role of mental simulation as a candidate mechanism of action in ImRs, and, specifically, whether well-simulated imagery rescripts are associated with greater change in symptom severity during ImRs.
Method:
Using a single-case experimental design, seven participants receiving cognitive therapy for PTSD were assessed before, during and after sessions of imagery rescripting for one intrusive image. Participants completed continuous symptom severity measures. Sessions were recorded, then coded for goodness of simulation (GOS) as well as additional factors (e.g. rescript believability, vividness).
Results:
Participants were divided into high- and low-responders and coding was compared across groups. Correlational analyses were supported by descriptive analysis of individual sessions. High-responders’ rescripts tended to be rated as better simulated than those of low-responders. Specific factors (e.g. intensity of thoughts/emotions related to original and new imagery elements, level of cognitive and emotional shift and belief in the resultant rescript) were also associated with reductions in symptom severity.
Conclusions:
There was tentative evidence that well-simulated rescripted images tended to be associated with greater reductions in symptom severity of the target image. Clinical implications and avenues for further research are discussed.
Survivor guilt can arise after surviving a trauma in which others die. No studies have systematically investigated psychological treatment for survivor guilt. The present study was a proof-of-concept investigation of treatment of survivor guilt using imagery rescripting. Thirteen participants with post-traumatic stress disorder and self-reported survivor guilt attended two consecutive imagery therapy sessions, to first elaborate and then rescript related imagery. Significant improvements were observed on idiographic process measures of cognitions, emotions and distress related to survivor guilt following the rescripting session. The study provides preliminary evidence that imagery rescripting can be used as an experiential technique to treat survivor guilt.
Interfacility patient movement plays an important role in the dissemination of antimicrobial-resistant organisms throughout healthcare systems. We evaluated how 3 alternative measures of interfacility patient sharing were associated with C. difficile infection incidence in Ontario acute-care facilities.
Design:
The cohort included adult acute-care facility stays of ≥3 days between April 2003 and March 2016. We measured 3 facility-level metrics of patient sharing: general patient importation, incidence-weighted patient importation, and C. difficile case importation. Each of the 3 patient-sharing metrics was examined against the incidence of C. difficile infection in the facility per 1,000 stays, using Poisson regression models.
Results:
The analyzed cohort included 6.70 million stays at risk of C. difficile infection across 120 facilities. Over the 13-year period, we included 62,189 new cases of healthcare-associated C. difficile infection (CDI; incidence, 9.3 per 1,000 stays). After adjustment for facility characteristics, general importation was not strongly associated with C. difficile infection incidence (risk ratio [RR] per doubling, 1.10; 95% confidence interval [CI], 0.97–1.24; proportional change in variance [PCV], −2.0%). Incidence-weighted (RR per doubling, 1.18; 95% CI, 1.06–1.30; PCV, −8.4%) and C. difficile case importation (RR per doubling, 1.43; 95% CI, 1.29–1.58; PCV, −30.1%) were strongly associated with C. difficile infection incidence.
Conclusions:
In this 13-year study of acute-care facilities in Ontario, interfacility variation in C. difficile infection incidence was associated with importation of patients from other high-incidence acute-care facilities or specifically of patients with a recent history of C. difficile infection. Regional infection control strategies should consider the potential impact of importation of patients at high risk of C. difficile shedding from outside facilities.
The goal of this study was to evaluate the ability of semantic (animal naming) and phonemic (FAS) fluency tasks to discriminate between normal aging, amnestic mild cognitive impairment (a-MCI), and Alzheimer’s disease (AD).
Design:
We used binary logistic regressions, multinomial regressions, and discriminant analysis to evaluate the predictive value of semantic and phonemic fluency with respect to specific diagnostic classifications.
Setting:
Outpatient geriatric neuropsychology clinic.
Participants:
232 participants (normal aging = 99, a-MCI = 90, AD = 43; mean age = 65.75 years).
Measurements:
Mini-Mental State Examination (MMSE), Controlled Oral Word Association Test (COWAT).
Results:
Results indicate that semantic and phonemic fluency were significant predictors of diagnostic classification, and semantic fluency explained a greater amount of the discriminant ability of the model.
Conclusions:
These results suggest that verbal fluency, particularly semantic fluency, may be an accurate and efficient tool in screening for early dementia in time-limited medical settings.
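The discriminant-analysis step described in the Design can be illustrated with a small sketch on synthetic fluency scores. The group means, standard deviations, and use of scikit-learn's LDA are assumptions for illustration only, not the study's data or code:

```python
# Hypothetical sketch of the analytic approach (synthetic data, not the
# study's): linear discriminant analysis classifying diagnostic group from
# (semantic, phonemic) fluency scores.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
# Simulated (semantic, phonemic) score means per group; group sizes follow
# the abstract (normal = 99, a-MCI = 90, AD = 43).
means = {"normal": (20, 40), "a-MCI": (14, 34), "AD": (9, 28)}
sizes = {"normal": 99, "a-MCI": 90, "AD": 43}

X = np.vstack([rng.normal(means[g], (4, 8), size=(sizes[g], 2)) for g in means])
y = np.concatenate([[g] * sizes[g] for g in means])

lda = LinearDiscriminantAnalysis().fit(X, y)
# The first discriminant function captures most of the between-group
# separation; its loadings indicate each task's relative contribution.
print("explained variance ratios:", lda.explained_variance_ratio_)
print("training accuracy:", lda.score(X, y))
```

In the synthetic setup above the semantic means are separated by more standard deviations than the phonemic means, which is one way a stronger semantic contribution to the discriminant function (as the study reports) would manifest.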
Clostridium difficile spores play an important role in transmission and can survive in the environment for several months. Optimal methods for measuring environmental C. difficile are unknown. We sought to determine whether increased sample surface area improved detection of C. difficile from environmental samples.
Setting
Samples were collected from 12 patient rooms in a tertiary-care hospital in Toronto, Canada.
Methods
Samples represented small surface-area and large surface-area floor and bedrail pairs from single-bed rooms of patients with low (without prior antibiotics), medium (with prior antibiotics), and high (C. difficile infected) shedding risk. Presence of C. difficile in samples was measured using quantitative polymerase chain reaction (qPCR) with targets on the 16S rRNA and toxin B genes and using enrichment culture.
Results
Of the 48 samples, 64.6% were positive by 16S qPCR (geometric mean, 13.8 spores); 39.6% were positive by toxin B qPCR (geometric mean, 1.9 spores); and 43.8% were positive by enrichment culture. By 16S qPCR, each 10-fold increase in sample surface area yielded 6.6 times (95% CI, 3.2–13) more spores. Floor surfaces yielded 27 times (95% CI, 4.9–181) more spores than bedrails, and rooms of C. difficile–positive patients yielded 11 times (95% CI, 0.55–164) more spores than those of patients without prior antibiotics. Toxin B qPCR and enrichment culture returned analogous findings.
Conclusions
Clostridium difficile spores were identified in most floor and bedrail samples, and increased surface area improved detection. Future research into the role of environmental C. difficile in transmission should favour samples with large surface areas.
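The 16S qPCR result above implies an approximately power-law relationship between sampled area and spore yield: a 6.6-fold increase per 10-fold area increase corresponds to yield ∝ area^b with b = log10(6.6) ≈ 0.82. A quick arithmetic check of that reading (illustrative only, based on the reported point estimate):

```python
# Reading the reported effect as a power law: a 10-fold increase in sampled
# area yields 6.6x more spores, so yield ~ area**b with b = log10(6.6).
import math

b = math.log10(6.6)  # exponent implied by the reported point estimate

def fold_change(area_ratio: float) -> float:
    """Expected fold-change in spore yield for a given fold-change in area."""
    return area_ratio ** b

print(f"b ≈ {b:.2f}")
print(f"2x area  -> {fold_change(2):.2f}x spores")
print(f"10x area -> {fold_change(10):.1f}x spores")
```

An exponent below 1 would mean spores are recovered less than proportionally to area, which is consistent with the study's recommendation that larger-area samples improve detection without yield scaling linearly.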
Antibiotic use varies widely between hospitals, but the influence of antimicrobial stewardship programs (ASPs) on this variability is not known. We aimed to determine the key structural and strategic aspects of ASPs associated with differences in risk-adjusted antibiotic utilization across facilities.
Design
Observational study of acute-care hospitals in Ontario, Canada.
Methods
A survey was sent to hospitals asking about both structural (8 elements) and strategic (32 elements) components of their ASP. Antibiotic use from hospital purchasing data was acquired for January 1 to December 31, 2014. Crude and adjusted defined daily doses per 1,000 patient days, accounting for hospital and aggregate patient characteristics, were calculated across facilities. Rate ratios (RR) of defined daily doses per 1,000 patient days were compared for hospitals with and without each antimicrobial stewardship element of interest.
Results
Of 127 eligible hospitals, 73 (57%) participated in the study. There was a 7-fold range in antibiotic use across these facilities (min, 253 defined daily doses per 1,000 patient days; max, 1,872 defined daily doses per 1,000 patient days). The presence of designated funding or resources for the ASP (RRadjusted, 0.87; 95% CI, 0.75–0.99), prospective audit and feedback (RRadjusted, 0.80; 95% CI, 0.67–0.96), and intravenous-to-oral conversion policies (RRadjusted, 0.79; 95% CI, 0.64–0.99) were associated with lower risk-adjusted antibiotic use.
Conclusions
Wide variability in antibiotic use across hospitals may be partially explained by both structural and strategic ASP elements. The presence of funding and resources, prospective audit and feedback, and intravenous-to-oral conversion should be considered priority elements of a robust ASP.
Cognitive behavioural therapy (CBT) is a highly effective treatment for obsessive compulsive disorder (OCD). Identifying, challenging and monitoring interpretations of intrusions is considered a key element of CBT for OCD, but preliminary research suggests that treatment does not always include identification and modification of misinterpretations. The present investigation explored ‘OCD-expert’ and ‘non-OCD-expert’ clinicians’ views on key elements of CBT for OCD to determine whether identifying and modifying key interpretations were considered important in therapy and whether clinicians who do not have specific expertise in OCD found working with interpretations difficult. Study 1 used a qualitative approach to investigate OCD-expert and non-OCD-expert clinicians’ views on key elements of CBT for OCD. Study 2 used a questionnaire to investigate what non-OCD-expert clinicians viewed as important and difficult aspects of CBT for OCD. Study 1 results showed that OCD-expert and non-OCD-expert clinicians reported working with interpretations as a key element of CBT for OCD. However, OCD-expert clinicians linked interpretations more closely to a formulation and intervention plan and reported using more techniques and questionnaires when working with interpretations compared with non-OCD-expert clinicians. Study 2 results showed that non-OCD-expert clinicians rated interpretations as both important and difficult to work with, but no more important or difficult than other key elements of CBT for OCD. OCD-expert and non-OCD-expert clinicians identify working with interpretations as a key element of CBT for OCD. Non-OCD-expert clinicians may benefit from additional training on formulation tools that help identify, monitor and challenge interpretations of intrusions.
Carbapenem-resistant Enterobacteriaceae (CRE) are a significant clinical and public health concern. Understanding the distribution of CRE colonization and developing a coordinated approach are key components of control efforts. The prevalence of CRE in the District of Columbia is unknown. We sought to determine the CRE colonization prevalence within healthcare facilities (HCFs) in the District of Columbia using a collaborative, regional approach.
DESIGN
Point-prevalence study.
SETTING
This study included 16 HCFs in the District of Columbia: all 8 acute-care hospitals (ACHs), 5 of 19 skilled nursing facilities, 2 (both) long-term acute-care facilities, and 1 (the sole) inpatient rehabilitation facility.
PATIENTS
Inpatients on all units excluding psychiatry and obstetrics-gynecology.
METHODS
CRE identification was performed on perianal swab samples using real-time polymerase chain reaction, culture, and antimicrobial susceptibility testing (AST). Prevalence was calculated by facility and unit type as the number of patients with a positive result divided by the total number tested. Prevalence ratios were compared using the Poisson distribution.
RESULTS
Of 1,022 completed tests, 53 samples tested positive for CRE, yielding a prevalence of 5.2% (95% CI, 3.9%–6.8%). Of 726 tests from ACHs, 36 (5.0%; 95% CI, 3.5%–6.9%) were positive. Of 244 tests from long-term-care facilities, 17 (7.0%; 95% CI, 4.1%–11.2%) were positive. The relative prevalence ratios by facility type were 0.9 (95% CI, 0.5–1.5) and 1.5 (95% CI, 0.9–2.6), respectively. No CRE were identified from the inpatient rehabilitation facility.
CONCLUSION
A baseline CRE prevalence was established, revealing endemicity across healthcare settings in the District of Columbia. Our study establishes a framework for interfacility collaboration to reduce CRE transmission and infection.
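The point prevalences in the Results follow directly from the definition given in the Methods (positives divided by total tested). A quick recomputation from the counts reported above:

```python
# Recomputing the point prevalences reported in the CRE abstract:
# prevalence = positive tests / total tests, per facility type.
counts = {
    "overall":        (53, 1022),
    "acute care":     (36, 726),
    "long-term care": (17, 244),
}
for label, (pos, total) in counts.items():
    print(f"{label}: {pos}/{total} = {pos / total:.1%}")
```

These reproduce the reported 5.2%, 5.0%, and 7.0%; the confidence intervals and prevalence-ratio comparisons in the abstract additionally rely on the Poisson distribution, as stated in the Methods.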
In North America, terrestrial records of biodiversity and climate change that span Marine Oxygen Isotope Stage (MIS) 5 are rare. Where found, they provide insight into how the coupling of the ocean–atmosphere system is manifested in biotic and environmental records and how the biosphere responds to climate change. In 2010–2011, construction at Ziegler Reservoir near Snowmass Village, Colorado (USA) revealed a nearly continuous, lacustrine/wetland sedimentary sequence that preserved evidence of past plant communities between ~140 and 55 ka, including all of MIS 5. At an elevation of 2705 m, the Ziegler Reservoir fossil site also contained thousands of well-preserved bones of late Pleistocene megafauna, including mastodons, mammoths, ground sloths, horses, camels, deer, bison, black bear, coyotes, and bighorn sheep. In addition, the site contained more than 26,000 bones from at least 30 species of small animals including salamanders, otters, muskrats, minks, rabbits, beavers, frogs, lizards, snakes, fish, and birds. The combination of macro- and micro-vertebrates, invertebrates, terrestrial and aquatic plant macrofossils, a detailed pollen record, and a robust, directly dated stratigraphic framework shows that high-elevation ecosystems in the Rocky Mountains of Colorado are climatically sensitive and varied dramatically throughout MIS 5.
Surgical site infections (SSIs) are responsible for significant morbidity and mortality. Preadmission skin antisepsis, while controversial, has gained acceptance as a strategy for reducing the risk of SSI. In this study, we analyze the benefit of an electronic alert system for enhancing compliance to preadmission application of 2% chlorhexidine gluconate (CHG).
DESIGN, SETTING, AND PARTICIPANTS
Following informed consent, 100 healthy volunteers in an academic, tertiary care medical center were randomized to 5 chlorhexidine gluconate (CHG) skin application groups: 1, 2, 3, 4, or 5 consecutive applications. Participants were further randomized into 2 subgroups: with or without electronic alert. Skin surface concentrations of CHG (μg/mL) were analyzed using a colorimetric assay at 5 separate anatomic sites.
INTERVENTION
Preadmission application of 2% chlorhexidine gluconate.
RESULTS
Mean composite skin surface CHG concentrations in volunteer participants receiving the electronic alert (EA) following 1, 2, 3, 4, and 5 applications were 1,040.5, 1,334.4, 1,278.2, 1,643.9, and 1,803.1 µg/mL, respectively, while composite skin surface concentrations in the no-EA group were 913.8, 1,240.0, 1,249.8, 1,194.4, and 1,364.2 µg/mL, respectively (ANOVA, P<.001). Composite ratios (CHG concentration/minimum inhibitory concentration required to inhibit the growth of 90% of organisms [MIC90]) for 1, 2, 3, 4, or 5 applications using the 2% CHG cloth were 208.1, 266.8, 255.6, 328.8, and 360.6, respectively, representing CHG skin concentrations effective against staphylococcal surgical pathogens. The use of an electronic alert system resulted in significant increase in skin concentrations of CHG in the 4- and 5-application groups (P<.04 and P<.007, respectively).
CONCLUSION
The findings of this study suggest an evidence-based standardized process that includes use of an Internet-based electronic alert system to improve patient compliance while maximizing skin surface concentrations effective against methicillin-resistant Staphylococcus aureus (MRSA) and other staphylococcal surgical pathogens.
Infect. Control Hosp. Epidemiol. 2016;37(3):254–259
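The composite ratios in the CHG abstract are the measured skin concentrations divided by a staphylococcal MIC90. The reported concentration/ratio pairs are consistent with an MIC90 of about 5 µg/mL, which is an inference made here, not a value stated in the abstract. A quick recomputation for the electronic-alert arm:

```python
# Recomputing composite ratios = skin CHG concentration / MIC90.
# MIC90 = 5 µg/mL is inferred from the reported concentration/ratio pairs,
# not stated in the abstract.
MIC90 = 5.0  # µg/mL (assumed)

ea_concentrations = [1040.5, 1334.4, 1278.2, 1643.9, 1803.1]  # 1-5 applications
reported_ratios = [208.1, 266.8, 255.6, 328.8, 360.6]

for conc, reported in zip(ea_concentrations, reported_ratios):
    print(f"{conc:7.1f} / {MIC90:.0f} = {conc / MIC90:6.2f} (reported {reported})")
```

Each recomputed ratio matches the reported value to within rounding, supporting the reading that a fixed MIC90 denominator underlies the composite-ratio column.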
Direct stimulation of 23 median, 13 ulnar, and 2 peroneal nerves at the time of surgical exploration was used to locate and characterize the conduction abnormalities in the nerves. The most frequent location of the major conduction abnormalities in the median nerve was in the first 1–2 cm distal to the origin of the carpal tunnel. In the ulnar nerve, the important conduction abnormalities were located most frequently in the segments 1 cm proximal and distal to the medial epicondyle. In the peroneal nerve, the major conduction abnormalities occurred proximal or distal to the entry point of the common peroneal nerve into the peroneus longus muscle.