Creating a sustainable residency research program is necessary to develop the emergency medicine research pipeline, as highlighted by the Society for Academic Emergency Medicine 2024 Consensus Conference. We sought to describe the implementation of a novel, immersive research program for first-year emergency medicine residents. We describe the rationale, curriculum development, implementation process, and lessons learned from a year-long research curriculum for first-year residents. We further evaluated residents’ perceived confidence in research methodology, interest in research, and the importance of their research experience through a 32-item survey. Across two cohorts, 25 first-year residents completed the program. All residents met their scholarly project requirements by the end of their first year. Two conference abstracts and one peer-reviewed manuscript were accepted for publication, and one manuscript is currently under review. Survey responses indicated an increase in residents’ perceived confidence in research methodology, although interpretation is limited by the small sample size. In summary, this novel resident research curriculum demonstrated a standardized, reproducible, and sustainable approach to providing residents with an immersive research program.
To evaluate the impact of changes in the size and characteristics of the hospitalized patient population during the COVID-19 pandemic on the incidence of hospital-associated Clostridioides difficile infection (HA-CDI).
Design:
Interrupted time-series analysis.
Setting:
A 576-bed academic medical center in Portland, Oregon.
Methods:
We established March 23, 2020 as our pandemic onset and included 24 pre-pandemic and 24 pandemic-era 30-day intervals. We built an autoregressive segmented regression model to evaluate immediate and gradual changes in HA-CDI rate during the pandemic while controlling for changes in known CDI risk factors.
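A minimal sketch of the kind of autoregressive segmented regression described here, assuming a hypothetical file of 30-day interval data with illustrative column names; the covariates and AR(1) error structure are assumptions for illustration, not the authors' exact model specification.

```python
# Sketch of an interrupted time-series (segmented) regression with AR(1) errors.
# File name, column names, covariates, and AR order are assumptions, not the study's code.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("hacdi_intervals.csv")  # hypothetical 30-day interval data

# Segmented-regression terms: baseline trend, immediate level change, slope change
df["time"] = range(1, len(df) + 1)              # 1..48 intervals (24 pre, 24 pandemic-era)
df["pandemic"] = (df["time"] > 24).astype(int)  # 0 = pre-pandemic, 1 = pandemic era
df["time_since_onset"] = (df["time"] - 24).clip(lower=0)

X = sm.add_constant(df[["time", "pandemic", "time_since_onset",
                        "abx_days_of_therapy", "mean_elixhauser"]])  # example risk-factor covariates
y = df["hacdi_rate"]

model = sm.GLSAR(y, X, rho=1)              # AR(1) error structure
results = model.iterative_fit(maxiter=10)  # alternate between OLS and rho estimation
print(results.summary())  # 'pandemic' = level change, 'time_since_onset' = slope change
```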
Results:
We observed 4.5 HA-CDI cases per 10,000 patient-days in the two years prior to the pandemic and 4.7 cases per 10,000 patient-days in the first two years of the pandemic. According to our adjusted segmented regression model, there was neither a significant change in HA-CDI rate at the onset of the pandemic (level-change coefficient = 0.70, P = 0.57) nor over time during the pandemic (slope-change coefficient = 0.003, P = 0.97). We observed significant increases in the frequency and intensity of antibiotic use, time at risk, comorbidities, and patient age when comparing the periods before and after the pandemic onset. The frequency of C. difficile testing did not change significantly during the pandemic (P = 0.72).
Conclusions:
Despite large increases in several CDI risk factors, we did not observe the expected corresponding changes in HA-CDI rate during the first two years of the COVID-19 pandemic. We hypothesize that infection prevention measures responding to COVID-19 played a role in CDI prevention.
Clostridioides difficile infection (CDI) research relies upon accurate identification of cases when using electronic health record (EHR) data. We developed and validated a multi-component algorithm to identify hospital-associated CDI using EHR data and determined that the tandem of CDI-specific treatment and laboratory testing has 97% accuracy in identifying HA-CDI cases.
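A minimal sketch of the tandem rule described above, assuming hypothetical encounter-level EHR extracts with a positive C. difficile laboratory flag and a CDI-specific treatment flag; the field names and hospital-day cutoff are illustrative, not the validated algorithm's exact specification.

```python
# Hedged sketch: flag presumptive hospital-associated CDI when a positive C. difficile
# test and CDI-specific treatment co-occur after an assumed hospital-onset cutoff.
# All field names and thresholds are illustrative assumptions.
import pandas as pd

def flag_ha_cdi(encounters: pd.DataFrame, onset_day: int = 4) -> pd.Series:
    """Return a boolean Series marking presumptive HA-CDI encounters."""
    positive_test = encounters["cdiff_test_positive"] & (encounters["test_hospital_day"] >= onset_day)
    cdi_treatment = encounters["cdi_treatment_started"] & (encounters["treatment_hospital_day"] >= onset_day)
    return positive_test & cdi_treatment

# Example usage with a toy frame
toy = pd.DataFrame({
    "cdiff_test_positive": [True, True, False],
    "test_hospital_day": [5, 2, 6],
    "cdi_treatment_started": [True, True, True],
    "treatment_hospital_day": [5, 3, 7],
})
print(flag_ha_cdi(toy))  # only the first row satisfies both components
```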
We compared study characteristics of randomized controlled trials (RCTs) funded by industry (N = 697) with those not funded by industry (N = 835). Industry-funded RCTs published in high-impact journals were more likely to be blinded, to include a placebo, and to post trial results on ClinicalTrials.gov. Our findings emphasize the importance of evaluating the quality of an RCT based on its methodological rigor, not its funder type.
Glauconites in early ankerite concretions, ferroan calcite-cemented sandstones, and uncemented sandstones in the first Wilcox sandstone of the Lockhart Crossing field, Livingston Parish, Louisiana, show a progressive substitution of Fe for octahedral Al with increasing diagenesis. An octahedral Fe content of 0.50 atoms was calculated from glauconite located in early ankeritic concretions. Octahedral Fe averaged 0.60 and 0.90 atoms in later ferroan calcite-cemented sandstone and uncemented sandstone, respectively. Corresponding octahedral Al averages were 1.16, 1.03, and 0.67, respectively. A systematic increase in average interlayer K from 0.49 to 0.54 to 0.61 was also observed, with apparent increases in diagenesis. All element determinations were made with an electron microprobe and recast on an anion equivalent basis to structural formulae based on the O10(OH)2 unit. The clay preserved in the early ankerite concretions was found to be an illite/smectite containing about 20% expandable layers, and the mineral in the glauconite pellets from uncemented areas of the sandstone, an ordered glauconite. “Minus cement” porosities of the sandstone indicate that glauconitization may have taken place at burial depths greater than 0.6 to 1.8 km, but the mechanism for the incorporation of Fe3+ in the glauconite at that depth is not apparent.
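A minimal sketch of recasting microprobe oxide analyses to a structural formula on the O10(OH)2 basis mentioned above (22 negative charges, i.e., 11 oxygen equivalents); the oxide list and example weight percentages are illustrative placeholders, not the study's measured values.

```python
# Hedged sketch: recast electron-microprobe oxide wt% to cations per O10(OH)2 unit
# (11 oxygen equivalents). Example analysis values are purely illustrative.

# oxide: (molecular weight, cations per oxide formula, oxygens per oxide formula)
OXIDES = {
    "SiO2":  (60.08, 1, 2),
    "Al2O3": (101.96, 2, 3),
    "Fe2O3": (159.69, 2, 3),
    "MgO":   (40.30, 1, 1),
    "K2O":   (94.20, 2, 1),
}

def structural_formula(wt_pct: dict, oxygen_basis: float = 11.0) -> dict:
    """Cations per formula unit on an anhydrous oxygen-equivalent basis."""
    cation_moles = {}
    oxygen_total = 0.0
    for oxide, pct in wt_pct.items():
        mw, n_cat, n_ox = OXIDES[oxide]
        moles = pct / mw
        cation_moles[oxide] = moles * n_cat
        oxygen_total += moles * n_ox
    factor = oxygen_basis / oxygen_total
    return {oxide: round(m * factor, 3) for oxide, m in cation_moles.items()}

# Illustrative glauconite-like analysis (wt%), not measured data from the study
example = {"SiO2": 50.0, "Al2O3": 8.0, "Fe2O3": 22.0, "MgO": 3.5, "K2O": 7.5}
print(structural_formula(example))  # Si, Al, Fe3+, Mg, K atoms per O10(OH)2
```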
All very massive early-type galaxies contain supermassive black holes, but are these black holes all sufficiently active to produce detectable radio continuum sources? We have used the 887.5 MHz Rapid ASKAP Continuum Survey DR1 to measure the radio emission from morphological early-type galaxies brighter than $K_S=9.5$ selected from the 2MASS Redshift Survey, HyperLEDA, and RC3. In line with previous studies, we find median radio power increases with infrared luminosity, with $P_{1.4} \propto L_K^{2.2}$, although the scatter about this relation spans several orders of magnitude. All 40 of the $M_K<-25.7$ early-type galaxies in our sample have measured radio flux densities that are more than $2\sigma$ above the background noise, with $1.4\,{\rm GHz}$ radio powers spanning ${\sim} 3 \times 10^{20}$ to ${\sim} 3\times 10^{25}\,{\rm W\,Hz^{-1}}$. Cross-matching our sample with integral field spectroscopy of early-type galaxies reveals that the most powerful radio sources preferentially reside in galaxies with relatively low angular momentum (i.e. slow rotators). While the infrared colours of most galaxies in our early-type sample are consistent with passive galaxies with negligible star formation, with the radio emission produced by active galactic nuclei (AGN) or AGN remnants, very low levels of star formation could power the weakest radio sources with little effect on many other star formation rate tracers.
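A minimal sketch of fitting the quoted power-law scaling between radio power and K-band luminosity in log-log space; the arrays below are synthetic placeholders, not the RACS/2MASS measurements.

```python
# Hedged sketch: fit P_1.4 ∝ L_K^alpha by least squares in log-log space.
# The data arrays are synthetic placeholders, not the survey measurements.
import numpy as np

rng = np.random.default_rng(0)
log_LK = rng.uniform(10.5, 12.0, size=200)               # log10 K-band luminosity (arbitrary units)
log_P14 = 2.2 * log_LK - 3.0 + rng.normal(0, 1.0, 200)   # ~1 dex of scatter about the relation

slope, intercept = np.polyfit(log_LK, log_P14, 1)
print(f"fitted power-law index: {slope:.2f}")  # recovers ~2.2 for this synthetic example
```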
Neurocognitive impairment and quality of life are two important long-term challenges for patients with complex CHD. The impact of re-interventions during adolescence and young adulthood on neurocognition and quality of life is not well understood.
Methods:
In this prospective, longitudinal, multi-institutional study, patients 13–30 years old with severe CHD referred for surgical or transcatheter pulmonary valve replacement were enrolled. Clinical characteristics were collected, and executive function and quality of life were assessed prior to the planned pulmonary re-intervention. These results were compared with normative data and between treatment strategies.
Results:
Among 68 patients enrolled from 2016 to 2020, a nearly equal proportion were referred for surgical and transcatheter pulmonary valve replacement (53% versus 47%). Tetralogy of Fallot was the most common diagnosis (59%) and pulmonary re-intervention indications included stenosis (25%), insufficiency (40%), and mixed disease (35%). There were no substantial differences between patients referred for surgical and transcatheter therapy. Executive functioning deficits were evident in 19–31% of patients and quality of life was universally lower compared to normative sample data. However, measures of executive function and quality of life did not differ between the surgical and transcatheter patients.
Conclusion:
In this patient group, impairments in neurocognitive function and quality of life are common and can be significant. Given similar baseline characteristics, comparing changes in neurocognitive outcomes and quality of life after surgical versus transcatheter pulmonary valve replacement will offer unique insights into how treatment approaches impact these important long-term patient outcomes.
Background: The epidemiology of Clostridioides difficile infection (CDI) is complex, and the COVID-19 pandemic has had extreme impacts on known risk factors such as comorbidity burden and antibiotic prescribing. However, whether these changes have affected the incidence of hospital-associated CDI (HA-CDI) remains unknown. We compared incidence and trends of HA-CDI before and after the pandemic onset, and we assessed the impact of changes in key CDI-related risk factors. Methods: We conducted an interrupted time-series study (March 2018–March 2021) of adult inpatients hospitalized 4 or more days with no known CDI on admission at a 576-bed academic medical center. Our primary outcome was monthly HA-CDI incidence per 10,000 patient-days. We performed segmented linear regression to compare the preinterruption trend in HA-CDI rate to the postinterruption slope and level change. We established a series of 30-day intervals before and after the interruption time point of March 23, 2020, which corresponds with the Oregon stay-at-home executive order. The data included 24 preinterruption time points and 12 postinterruption time points. We also assessed changes in slope and level for known HA-CDI risk factors. Results: We included 34,592 inpatient encounters in the preinterruption period and 10,932 encounters in the postinterruption period. The mean preinterruption HA-CDI rate was 4.07 cases per 10,000 patient-days. After the pandemic onset, the rate was 3.6 per 10,000 patient-days. However, the observed differences in rate (both in terms of slope and level) were not statistically significant (P = .90 for level change; P = .60 for slope change). We observed a significant decrease in admissions per 30 days (1,441 vs 911; level-change P < .0001) and a slight increase in the mean number of Elixhauser comorbidities (1.96 vs 2.07; level-change P = .05). We also observed significant increases in both frequency and intensity of antibiotic use, with increases in average days of therapy per encounter (5.8 vs 7.2; level-change P = .01; slope-change P < .0001) and in antibiotic spectrum index (ASI) points per antibiotic day (4.4 vs 4.9; P < .0001). We observed a consistent downward trend in case-days of CDI colonization pressure per hospital day (preinterruption slope P < .0001), which remained consistent after the pandemic onset (P = .50 for postinterruption slope change) (Fig. 1). Conclusions: Despite significant increases in high-intensity antibiotic use and comorbidity burden, we did not observe significant differences in HA-CDI after the pandemic onset. This may be due to the significant decrease in colonization pressure in the postpandemic period. Further research is required to fully understand the impact of pandemic-related changes on HA-CDI.
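A minimal sketch of computing the two antibiotic-use measures reported above, days of therapy (DOT) per encounter and antibiotic spectrum index (ASI) points per antibiotic-day, from hypothetical administration records; the ASI point values shown are placeholders, not the published index weights.

```python
# Hedged sketch: days of therapy (DOT) per encounter and ASI points per antibiotic-day.
# ASI weights below are illustrative placeholders, not the published index values.
import pandas as pd

ASI_POINTS = {"cefazolin": 3, "ceftriaxone": 5, "meropenem": 9}  # assumed weights

# One row per encounter, antibiotic, and calendar day of administration
admin = pd.DataFrame({
    "encounter_id": [1, 1, 1, 2, 2],
    "antibiotic": ["ceftriaxone", "ceftriaxone", "meropenem", "cefazolin", "cefazolin"],
    "calendar_day": [1, 2, 2, 1, 2],
})

dot_per_encounter = admin.groupby("encounter_id").size()                  # each row is one antibiotic-day
asi_per_abx_day = admin["antibiotic"].map(ASI_POINTS).sum() / len(admin)  # total ASI points / antibiotic-days

print(dot_per_encounter.mean())  # mean DOT per encounter
print(asi_per_abx_day)           # mean ASI points per antibiotic-day
```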
Studies of early fourth-millennium BC Britain have typically focused on the Early Neolithic sites of Wessex and Orkney; what can the investigation of sites located in areas beyond these core regions add? The authors report on excavations (2011–2019) at Dorstone Hill in Herefordshire, which have revealed a remarkable complex of Early Neolithic monuments: three long barrows constructed on the footprints of three timber buildings that had been deliberately burned, plus a nearby causewayed enclosure. A Bayesian chronological model demonstrates the precocious character of many of the site's elements and strengthens the evidence for the role of tombs and houses/halls in the creation and commemoration of foundational social groups in Neolithic Britain.
Companion animals, or ‘pets’, are integral to many people's lives and to their sense of home. However, older people living with companion animals are vulnerable to separation from their animals when moving to a care home. Such separation is often a highly significant loss which, combined with other losses, may reinforce experiences of dislocation. Existing research draws attention to the importance of developing a sense of ‘home’ in a care home through reinforcing and preserving personal connections. However, there is a paucity of research examining the preservation of connections between older people resident in care homes and their animal/s. This study draws on thematic analysis of 29 qualitative interviews with older people living in care homes, relatives, care home staff and other relevant stakeholders. It highlights that retaining existing, often long-term, bonds with companion animals represents important continuities and connections that may contribute to positive adjustment to life in a care home and to creating a sense of home. However, participants highlighted that supporting an older person to move into a care home with their companion animal may be challenged by real or perceived constraints, such as the use of shared space, concerns about the risks posed by animals, and implications for staff. While our study found examples of good practice showing how shared residence between an older person and a companion animal can be achieved in a care home, other examples highlighted that the time, complexity of planning and structures required to accommodate animals were seen as too prohibitive to merit a change of policy and practice. Our research concludes that more attention should be given to the older person–animal bond as an important source of continuity and connection.
The National Disability Insurance Scheme (NDIS) offers opportunity against a historical background of underfunded and fragmented services for people with disability. For people with acquired brain injury (ABI), concerns have been raised about how they access NDIS individualised funded supports. The aim of this research was to explore how community-dwelling individuals with ABI in Queensland navigate the NDIS participant pathway to individualised funded supports.
Methods:
This study used a multiple case study design within a policy implementation framework. Twelve people with ABI, nine family members and eight NDIS-funded and mainstream service providers participated. Data were collected from relevant NDIS documentation, health records and semi-structured interviews with individuals with ABI, family members, and service providers.
Results:
The current study highlighted the complexity of navigating the NDIS participant pathway of access, planning, implementation and review for people with ABI, their families and service providers. The NDIS pathway was impacted by the insurance- and market-based NDIS model itself, time, communication, and the requirement for external supports. Equally, the process was affected by environmental factors, individual person and injury factors, as well as service providers, with a range of outcomes evident at the individual, family and system levels.
Conclusions:
Findings suggest that the NDIS has struggled to make specific allowance for people with ABI and the complexity of their disabilities. Providing people with ABI access to the NDIS Complex Support Needs Pathway may redress many of the difficulties people with ABI experience accessing and using NDIS funded supports.
Fluting is a technological and morphological hallmark of some of the most iconic North American Paleoindian stone points. Through decades of detailed artifact analyses and replication experiments, archaeologists have spent considerable effort reconstructing how flute removals were achieved, and they have explored possible explanations of why fluting was such an important aspect of early point technologies. However, the end of fluting has been less thoroughly researched. In southern North America, fluting is recognized as a diagnostic characteristic of Clovis points dating to approximately 13,000 cal yr BP, the earliest widespread use of fluting. One thousand years later, fluting occurs more variably in Dalton and is no longer useful as a diagnostic indicator. How did fluting change, and why did point makers eventually abandon fluting? In this article, we use traditional 2D measurements, geometric morphometric (GM) analysis of 3D models, and 2D GM of flute cross sections to compare Clovis and Dalton point flute and basal morphologies. The significant differences observed show that fluting in Clovis was highly standardized, suggesting that fluting may have functioned to improve projectile durability. Because Dalton points were used increasingly as knives and other types of tools, maximizing projectile functionality became less important. We propose that fluting in Dalton is a vestigial technological trait retained beyond its original functional usefulness.
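A minimal sketch of the kind of landmark-based comparison that geometric morphometric analysis involves, using ordinary Procrustes superimposition of two 2D flute cross-section outlines; the landmark coordinates are invented for illustration and this is not the authors' analysis pipeline.

```python
# Hedged sketch: Procrustes superimposition of two landmark configurations,
# the core alignment step behind geometric morphometric shape comparison.
# Landmark coordinates are invented purely for illustration.
import numpy as np
from scipy.spatial import procrustes

# Two hypothetical 2D flute cross-section outlines, each with 6 landmarks (x, y)
clovis_like = np.array([[0, 0], [1, 0.2], [2, 0.3], [3, 0.3], [4, 0.2], [5, 0]], dtype=float)
dalton_like = np.array([[0, 0], [1, 0.1], [2, 0.15], [3, 0.2], [4, 0.1], [5, 0]], dtype=float)

# Procrustes removes differences in location, scale, and rotation,
# leaving a disparity value that reflects residual shape difference.
m1, m2, disparity = procrustes(clovis_like, dalton_like)
print(f"Procrustes shape disparity: {disparity:.4f}")
```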
We present the data and initial results from the first pilot survey of the Evolutionary Map of the Universe (EMU), observed at 944 MHz with the Australian Square Kilometre Array Pathfinder (ASKAP) telescope. The survey covers $270\,\mathrm{deg}^2$ of an area covered by the Dark Energy Survey, reaching a depth of 25–30 $\mu\mathrm{Jy\ beam}^{-1}$ rms at a spatial resolution of ${\sim}$11–18 arcsec, resulting in a catalogue of ${\sim}$220 000 sources, of which ${\sim}$180 000 are single-component sources. Here we present the catalogue of single-component sources, together with (where available) optical and infrared cross-identifications, classifications, and redshifts. This survey explores a new region of parameter space compared to previous surveys. Specifically, the EMU Pilot Survey has a high density of sources and a high sensitivity to low-surface-brightness emission. These properties result in the detection of types of sources that were rarely seen in or absent from previous surveys. We present some of these new results here.
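A minimal sketch of the kind of positional cross-identification between a radio catalogue and an optical/infrared catalogue mentioned above, using astropy's nearest-neighbour sky matching; the file names, column names, and matching radius are assumptions, not the EMU pipeline's actual parameters.

```python
# Hedged sketch: nearest-neighbour cross-match of a radio catalogue against an
# optical/infrared catalogue on the sky. File names, column names, and the
# 5-arcsec matching radius are assumptions, not the survey's actual choices.
from astropy.coordinates import SkyCoord
from astropy.table import Table
import astropy.units as u

radio = Table.read("radio_catalogue.fits")      # hypothetical input files
optical = Table.read("optical_catalogue.fits")

radio_coords = SkyCoord(radio["ra"] * u.deg, radio["dec"] * u.deg)
optical_coords = SkyCoord(optical["ra"] * u.deg, optical["dec"] * u.deg)

idx, sep2d, _ = radio_coords.match_to_catalog_sky(optical_coords)
matched = sep2d < 5 * u.arcsec                  # keep plausible counterparts only

radio["optical_id"] = optical["id"][idx]
radio["separation_arcsec"] = sep2d.arcsec
print(f"{matched.sum()} of {len(radio)} radio sources have a counterpart within 5 arcsec")
```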
Ethnohistoric accounts indicate that the people of Australia's Channel Country engaged in activities rarely recorded elsewhere on the continent, including food storage, aquaculture and possible cultivation, yet there has been little archaeological fieldwork to verify these accounts. Here, the authors report on a collaborative research project initiated by the Mithaka people addressing this lack of archaeological investigation. The results show that Mithaka Country has a substantial and diverse archaeological record, including numerous large stone quarries, multiple ritual structures and substantial dwellings. Our archaeological research revealed unknown aspects, such as the scale of Mithaka quarrying, which could stimulate re-evaluation of Aboriginal socio-economic systems in parts of ancient Australia.
Background: Hospital-onset bacteremia and fungemia (HOB) may be a preventable hospital-acquired condition and a potential healthcare quality measure. We developed and evaluated a tool to assess the preventability of HOB and compared it to a more traditional consensus panel approach. Methods: A 10-member healthcare epidemiology expert panel independently rated the preventability of 82 hypothetical HOB case scenarios using a 6-point Likert scale (range, 1 = “Definitely or Almost Certainly Preventable” to 6 = “Definitely or Almost Certainly Not Preventable”). Ratings on the 6-point scale were collapsed into 3 categories: Preventable (1–2), Uncertain (3–4), or Not preventable (5–6). Consensus was defined as concurrence on the same category among ≥70% of expert raters. Cases without consensus were deliberated via teleconference, web-based discussion, and a second round of rating. The proportion meeting consensus, overall and by predefined HOB source attribution, was calculated. A structured HOB preventability rating tool was developed to explicitly account for patient-intrinsic and extrinsic healthcare-related risks (Fig. 1). Two additional physician reviewers independently applied this tool to adjudicate the same 82 case scenarios. The tool was iteratively revised based on reviewer feedback, followed by repeat independent tool-based adjudication. Interrater reliability was evaluated using the kappa statistic. The proportion of cases whose tool-based preventability category matched expert consensus was calculated. Results: After expert panel round 1, consensus criteria were met for 29 cases (35%), which increased to 52 (63%) after round 2. Expert consensus was achieved more frequently for respiratory or surgical site infections than for urinary tract and central-line–associated bloodstream infections (Fig. 2a). Most likely to be rated preventable were vascular catheter infections (64%) and contaminants (100%). For tool-based adjudication, following 2 rounds of rating with interim tool revisions, agreement between the 2 reviewers was 84% for cases overall (κ, 0.76; 95% CI, 0.64–0.88) and 87% for the 52 cases with expert consensus (κ, 0.79; 95% CI, 0.65–0.94). Among cases with expert consensus, the tool-based rating matched expert consensus in 40 of 52 (77%) and 39 of 52 (75%) cases for reviewer 1 and reviewer 2, respectively. The proportion of cases rated “uncertain” was lower among tool-based adjudicated cases with reviewer agreement (15 of 69) than among cases with expert consensus (23 of 52) (Fig. 2b). Conclusions: Healthcare epidemiology experts hold varying perspectives on HOB preventability. Structured tool-based preventability rating had high interreviewer reliability, matched expert consensus in most cases, and rated fewer cases with uncertain preventability compared to expert consensus. This tool is a step toward standardized assessment of preventability in future HOB evaluations.
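A minimal sketch of the interrater-reliability calculation referenced above, computing Cohen's kappa for two reviewers' three-category preventability ratings; the example ratings are invented and do not reproduce the study's data.

```python
# Hedged sketch: Cohen's kappa for two reviewers' categorical preventability ratings.
# The example ratings are invented for illustration only.
from sklearn.metrics import cohen_kappa_score

categories = ["preventable", "uncertain", "not_preventable"]
reviewer_1 = ["preventable", "uncertain", "not_preventable", "preventable", "uncertain"]
reviewer_2 = ["preventable", "preventable", "not_preventable", "preventable", "uncertain"]

kappa = cohen_kappa_score(reviewer_1, reviewer_2, labels=categories)
print(f"Cohen's kappa: {kappa:.2f}")  # 1.0 = perfect agreement, 0 = chance-level agreement
```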
Background: Healthcare-associated infections (HAIs) are a major global threat to patient safety. Systematic surveillance is crucial for understanding HAI rates and antimicrobial resistance trends and to guide infection prevention and control (IPC) activities based on local epidemiology. In India, no standardized national HAI surveillance system was in place before 2017. Methods: Public and private hospitals from across 21 states in India were recruited to participate in an HAI surveillance network. Baseline assessments followed by trainings ensured that basic microbiology and IPC implementation capacity existed at all sites. Standardized surveillance protocols for central-line–associated bloodstream infections (CLABSIs) and catheter-associated urinary tract infections (CAUTIs) were modified from the NHSN for the Indian context. IPC nurses were trained to implement surveillance protocols. Data were reported through a locally developed web portal. Standardized external data quality checks were performed to assure data quality. Results: Between May 2017 and April 2019, 109 ICUs from 37 hospitals (29 public and 8 private) enrolled in the network, of which 33 were teaching hospitals with >500 beds. The network recorded 679,109 patient days, 212,081 central-line days, and 387,092 urinary catheter days. Overall, 4,301 bloodstream infection (BSI) events and 1,402 urinary tract infection (UTI) events were reported. The network CLABSI rate was 9.4 per 1,000 central-line days and the CAUTI rate was 3.4 per 1,000 catheter days. The central-line utilization ratio was 0.31 and the urinary catheter utilization ratio was 0.57. Moreover, 3,542 (73%) of 4,742 pathogens reported from BSIs and 868 (53%) of 1,644 pathogens reported from UTIs were gram negative. Also, 1,680 (26.3%) of all 6,386 pathogens reported were Enterobacteriaceae. Of 1,486 Enterobacteriaceae with complete antibiotic susceptibility testing data reported, 832 (57%) were carbapenem resistant. Of 951 Enterobacteriaceae subjected to colistin broth microdilution testing, 62 (7%) were colistin resistant. The surveillance platform identified 2 separate hospital-level HAI outbreaks; one caused by colistin-resistant K. pneumoniae and another due to Burkholderia cepacia. Phased expansion of surveillance to additional hospitals continues. Conclusions: HAI surveillance was successfully implemented across a national network of diverse hospitals using modified NHSN protocols. Surveillance data are being used to understand HAI burden and trends at the facility and national levels, to inform public policy, and to direct efforts to implement effective hospital IPC activities. This network approach to HAI surveillance may provide lessons to other countries or contexts with limited surveillance capacity.
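A minimal sketch of the standard device-associated rate and utilization-ratio calculations behind the figures quoted above (events per 1,000 device-days and device-days per patient-day). The device-day and patient-day totals are taken from the abstract; the CLABSI event count below is an assumption back-calculated for illustration, since not every reported BSI is device-associated.

```python
# Hedged sketch: NHSN-style device-associated rate and utilization-ratio calculations.
# Device-day and patient-day totals come from the abstract; the event count is assumed.

def infection_rate(events: int, device_days: int) -> float:
    """Events per 1,000 device-days."""
    return 1000 * events / device_days

def utilization_ratio(device_days: int, patient_days: int) -> float:
    """Device-days per patient-day."""
    return device_days / patient_days

patient_days = 679_109
central_line_days = 212_081
urinary_catheter_days = 387_092

print(round(utilization_ratio(central_line_days, patient_days), 2))      # ~0.31, as reported
print(round(utilization_ratio(urinary_catheter_days, patient_days), 2))  # ~0.57, as reported
print(round(infection_rate(1_994, central_line_days), 1))                # ~9.4 with an assumed CLABSI count
```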
Background: Contamination of healthcare workers and patient environments likely plays a role in the spread of antibiotic-resistant organisms. The mechanisms that contribute to the distribution of organisms within and between patient rooms are not well understood, but they may include the movement patterns and patient interactions of healthcare workers. We used an innovative technology for tracking healthcare worker movement and patient interactions in ICUs. Methods: The Kinect system, a device developed by Microsoft, was used to detect the location of a person’s hands and head over time, each represented with 3-dimensional coordinates. The Kinects were deployed in 2 intensive care units (ICUs) at 2 different hospitals, collecting data from 5 rooms in a high-acuity 20-bed cardiovascular ICU (unit 1) and 3 rooms in a 10-bed medical-surgical ICU (unit 2). The length of the Kinect deployment varied by room (range, 15–48 days). The Kinect data were processed to include the date, time, and location of head and hands for all individuals. Based on the coordinates of the bed, we defined events indicating a bed touch, presence within 30 cm (1 foot) of the bed, and presence within 1 m (3 feet) of the bed. The processed Kinect data were then used to generate heat maps showing the density of person locations within a room and to summarize bed touches and time spent in different locations within the room. Results: In total, the Kinect systems captured 2,090 hours of room occupancy by at least 1 person within ~1 m of the bed (Table 1). Approximately half of the time spent within ~1 m of the bed was at the bedside (within ~30 cm). The estimated number of bed touches per hour when within ~1 m was 13–23. More time was spent on one side of the bed, and the favored side varied by room and facility (Fig. 1A, 1B). Additionally, we observed temporal variation in intensity, measured as person-time in the room (Fig. 1C, 1D). Conclusions: High occupancy tends to occur on the far side of the patient bed (away from the door), where the computers are located, and the bed-touch rate is relatively high. These results can be used to help us understand the potential for room contamination, which can contribute to both transmission and infection, and they highlight critical times and locations in the room with potential for focused deep cleaning.
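A minimal sketch of how bed-touch and proximity events might be derived from the tracked 3D hand coordinates described above; the coordinate frame, bed geometry, and distance thresholds are illustrative assumptions, not the study's processing code.

```python
# Hedged sketch: classify tracked hand positions into bed-touch / bedside / near-bed /
# in-room zones using distance to a bed bounding box. Bed geometry and thresholds
# are illustrative assumptions, not the study's actual parameters.
import numpy as np

BED_MIN = np.array([1.0, 0.0, 0.4])   # hypothetical bed bounding box corners (metres)
BED_MAX = np.array([3.0, 1.0, 0.9])

def distance_to_bed(point: np.ndarray) -> float:
    """Euclidean distance from a 3D point to the bed bounding box (0 if touching)."""
    clamped = np.clip(point, BED_MIN, BED_MAX)
    return float(np.linalg.norm(point - clamped))

def classify(point: np.ndarray) -> str:
    d = distance_to_bed(point)
    if d < 0.05:      # within ~5 cm: count as a bed touch
        return "bed_touch"
    if d < 0.30:      # within ~30 cm (~1 foot) of the bed
        return "bedside"
    if d < 1.00:      # within ~1 m (~3 feet) of the bed
        return "near_bed"
    return "in_room"

hand = np.array([3.5, 0.5, 0.8])      # one tracked hand position
print(classify(hand))                  # -> "near_bed" for this example
```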
The coronavirus disease 2019 (COVID-19) pandemic has greatly impacted health-care systems worldwide, leading to an unprecedented rise in demand for health-care resources. In anticipation of an acute strain on established medical facilities in Dallas, Texas, federal officials worked in conjunction with local medical personnel to convert a convention center into a Federal Medical Station capable of caring for patients affected by COVID-19. A 200,000-square-foot event space was designated as a direct patient care area, with surrounding spaces repurposed to house ancillary services. Given the highly transmissible nature of the novel coronavirus, the donning and doffing of personal protective equipment (PPE) was of particular importance for personnel staffing the facility. Furthermore, nationwide shortages in the availability of PPE necessitated the reuse of certain protective materials. This article seeks to delineate the procedures implemented regarding PPE in the setting of a COVID-19 disaster response shelter, including workspace flow, donning and doffing procedures, PPE conservation, and exposure event protocols.
Men sexually interested in children of a specific combination of maturity and sex tend to show some lesser interest in other categories of persons. Patterns of men's sexual interest across erotic targets' categories of maturity and sex have both clinical and basic scientific implications.
Method
We examined the structure of men's sexual interest in adult, pubescent, and prepubescent males and females using multidimensional scaling (MDS) across four datasets, drawing on three large samples and three indicators of sexual interest: phallometric response to erotic stimuli, sexual offense history, and self-reported sexual attraction. The samples were highly enriched for men sexually interested in children and men accused of sexual offenses.
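A minimal sketch of the multidimensional scaling step described above, applied to a small invented dissimilarity matrix over six target categories (adult/pubescent/prepubescent × male/female); the dissimilarities are placeholders, not the study's phallometric, offense-history, or self-report data.

```python
# Hedged sketch: two-dimensional MDS of dissimilarities among six erotic-target
# categories. The dissimilarity matrix is invented for illustration only.
import numpy as np
from sklearn.manifold import MDS

categories = ["adult_F", "pubescent_F", "prepubescent_F",
              "adult_M", "pubescent_M", "prepubescent_M"]

# Symmetric placeholder dissimilarities (larger = less co-occurring interest)
D = np.array([
    [0.0, 0.4, 0.8, 0.9, 1.0, 1.1],
    [0.4, 0.0, 0.4, 1.0, 0.9, 1.0],
    [0.8, 0.4, 0.0, 1.1, 1.0, 0.9],
    [0.9, 1.0, 1.1, 0.0, 0.4, 0.8],
    [1.0, 0.9, 1.0, 0.4, 0.0, 0.4],
    [1.1, 1.0, 0.9, 0.8, 0.4, 0.0],
])

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(D)
for name, (x, y) in zip(categories, coords):
    print(f"{name:16s} dim1={x:+.2f} dim2={y:+.2f}")  # dimensions correspond roughly to sex and maturity
```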
Results
Results supported a two-dimensional MDS solution, with one dimension representing erotic targets' biological sex and the other representing their sexual maturity. The dimension of sexual maturity placed adults and prepubescent children at opposite ends, with pubescent children intermediate. Differences between men's sexual interest in adults and prepubescent children of the same sex were similar in magnitude to differences between their sexual interest in adult men and women. Sexual interest in adult men was no more associated with sexual interest in boys than sexual interest in adult women was with sexual interest in girls.
Conclusions
Erotic targets' sexual maturity and biological sex play important roles in men's preferences, which are predictive of sexual offending. The magnitude of men's preferences for prepubescent children v. adults of their preferred sex is large.