This study investigates screening practices for antimicrobial-resistant organisms (AROs) in seventy-five hospitals participating in the Canadian Nosocomial Infection Surveillance Program (CNISP). Screening practices varied, with widespread MRSA screening, selective screening for carbapenemase-producing organisms (CPOs), and limited screening for vancomycin-resistant Enterococcus (VRE). These findings may help interpret ARO rates within CNISP hospitals and inform screening practices.
Antibiotics are essential to combating infections; however, misuse and overuse have contributed to antimicrobial resistance (AMR). Antimicrobial stewardship programs (ASPs) are a strategy to combat AMR and are mandatory in Canadian hospitals for accreditation. The Canadian Nosocomial Infection Surveillance Program (CNISP) sought to capture a snapshot of ASP practices within the network of Canadian acute care hospitals. Objectives of the survey were to describe the status, practices, and process indicators of ASPs across acute care hospitals participating in CNISP.
Design:
The survey explored the following items related to ASPs: 1) program structure and leadership; 2) human, technical, and financial resources allocated; 3) inventory of interventions carried out and implemented; 4) tracking of antimicrobial use; and 5) educational and promotional components.
Methods:
CNISP developed a 34-item survey in both English and French. The survey was administered to 109 participating CNISP hospitals from June to August 2024, and responses were analyzed descriptively.
Results:
Ninety-seven percent (106/109) of CNISP hospitals responded to the survey. Eighty-four percent (89/106) reported having a formal ASP in place at the time of the study. Ninety percent (80/89) of acute care hospitals with an ASP performed prospective audit and feedback for antibiotic agents and 85% (76/89) had formal surveillance of quantitative antimicrobial use. Additionally, just over 80% (74/89) provided education to their prescribers and other healthcare staff.
Conclusions:
CNISP acute care hospitals employ multiple key aspects of ASP including implementing interventions and monitoring/tracking antimicrobial use. There were acute care hospitals without an ASP, highlighting areas for investigation and improvement.
Posttraumatic stress disorder (PTSD) has been associated with advanced epigenetic age cross-sectionally, but the association between these variables over time is unclear. This study conducted meta-analyses to test whether new-onset PTSD diagnosis and changes in PTSD symptom severity over time were associated with changes in two metrics of epigenetic aging over two time points.
Methods
We conducted meta-analyses of the association between change in PTSD diagnosis and symptom severity and change in epigenetic age acceleration/deceleration (age-adjusted DNA methylation age residuals as per the Horvath and GrimAge metrics) using data from 7 military and civilian cohorts participating in the Psychiatric Genomics Consortium PTSD Epigenetics Workgroup (total N = 1,367).
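The pooling step described above is, at its core, inverse-variance weighting of per-cohort estimates. A minimal fixed-effect sketch is given below; the betas and standard errors are illustrative placeholders, not the cohort estimates from this study, which may also have used random-effects weighting.

```python
# Minimal inverse-variance (fixed-effect) meta-analysis sketch.
# Betas/SEs are hypothetical, not the study's cohort estimates.
import math

def meta_fixed(betas, ses):
    """Combine per-cohort effect estimates by inverse-variance weighting."""
    weights = [1.0 / se**2 for se in ses]
    beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    z = beta / se
    # two-sided p-value from the normal approximation
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return beta, se, p

betas = [0.21, 0.12, 0.18]   # hypothetical per-cohort betas
ses = [0.10, 0.08, 0.12]     # hypothetical per-cohort standard errors
b, se, p = meta_fixed(betas, ses)
print(round(b, 3), round(se, 3), round(p, 4))
```

Cohorts with smaller standard errors receive proportionally more weight, which is why large cohorts dominate the pooled beta.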
Results
Meta-analysis revealed that the interaction between Time 1 (T1) Horvath age residuals and new-onset PTSD over time was significantly associated with Horvath age residuals at T2 (meta β = 0.16, meta p = 0.02, p-adj = 0.03). The interaction between T1 Horvath age residuals and changes in PTSD symptom severity over time was significantly related to Horvath age residuals at T2 (meta β = 0.24, meta p = 0.05). No associations were observed for GrimAge residuals.
Conclusions
Results indicated that individuals who developed new-onset PTSD or showed increased PTSD symptom severity over time evidenced greater epigenetic age acceleration at follow-up than would be expected based on baseline age acceleration. This suggests that PTSD may accelerate biological aging over time and highlights the need for intervention studies to determine if PTSD treatment has a beneficial effect on the aging methylome.
In the last 50 years, the field of paleobiology has undergone a computational revolution that opened multiple new avenues for recording, storing, and analyzing vital data on the history of life on Earth. With these advances, the amount of data available for research has grown, but so too has our responsibility to ensure that our data tools and infrastructures continue to innovate in order to best serve our diverse community. This review focuses on data equity in paleobiology, an aspirational goal, wherein data in all forms are collected, stored, shared and analyzed in a responsible, equitable, and sustainable manner. While there have been many advancements across the last five decades, inequities persist. Our most significant challenges relate to several interconnected factors, including ethical data collection, sustainable infrastructure, socioeconomic biases, and global inequalities. We highlight the ways in which data equity is critical for paleobiology and stress the need for collaborative efforts across the paleobiological community to urgently address these data equity challenges. We also provide recommendations for actions from individuals, teams, academic publishers, and academic societies in order to continue enhancing data equity and ensuring an equitable and sustainable future for our field.
A 15-year-old male presented with vasovagal syncope and troponin leak 4 days after his second COVID-19 vaccine. Based on the initial diagnostic work-up, he was thought to have COVID-19 vaccine-associated myocarditis. His cardiac dysfunction persisted, and further work-up including genetic evaluation and serial MRI studies later confirmed a diagnosis of arrhythmogenic cardiomyopathy. This is a unique case in which the timing and context of presentation led to an incorrect initial diagnosis of vaccine-associated myocarditis. Reports of mild and self-limited myocarditis post-COVID-19 vaccination may cause vaccine hesitancy among the public, so case reports such as this one show the importance of discerning underlying conditions amongst rare COVID-19 vaccination complications.
By coupling long-range polymerase chain reaction, wastewater-based epidemiology, and pathogen sequencing, we show that adenovirus type 41 hexon-sequence lineages, described in children with hepatitis of unknown origin in the United States in 2021, were already circulating within the country in 2019. We also observed other lineages in the wastewater, whose complete genomes have yet to be documented from clinical samples.
Rift propagation, rather than basal melt, drives the destabilization and disintegration of the Thwaites Eastern Ice Shelf. Since 2016, rifts have episodically advanced throughout the central ice-shelf area, with rapid propagation events occurring during austral spring. The ice shelf's speed has increased by ~70% during this period, transitioning from a rate of 1.65 m d⁻¹ in 2019 to 2.85 m d⁻¹ by early 2023 in the central area. The increase in longitudinal strain rates near the grounding zone has led to full-thickness rifts and melange-filled gaps since 2020. A recent sea-ice breakout has accelerated retreat at the western calving front, effectively separating the ice shelf from what remained of its northwestern pinning point. Meanwhile, a distributed set of phase-sensitive radar measurements indicates that the basal melting rate is generally small, likely due to a widespread robust ocean stratification beneath the ice–ocean interface that suppresses basal melt despite the presence of substantial oceanic heat at depth. These observations in combination with damage modeling show that, while ocean forcing is responsible for triggering the current West Antarctic ice retreat, the Thwaites Eastern Ice Shelf is experiencing dynamic feedbacks over decadal timescales that are driving ice-shelf disintegration, now independent of basal melt.
Depression is a common mental health disorder that often starts during adolescence, with potentially important future consequences including ‘Not in Education, Employment or Training’ (NEET) status.
Methods
We took a structured life course modeling approach to examine how depressive symptoms during adolescence might be associated with later NEET status, using a high-quality longitudinal data resource. We considered four plausible life course models: (1) an early adolescent sensitive period model where depressive symptoms in early adolescence are more associated with later NEET status relative to exposure at other stages; (2) a mid adolescent sensitive period model where depressive symptoms during the transition from compulsory education to adult life might be more deleterious regarding NEET status; (3) a late adolescent sensitive period model, meaning that depressive symptoms around the time when most adults have completed their education and started their careers are the most strongly associated with NEET status; and (4) an accumulation of risk model which highlights the importance of chronicity of symptoms.
Results
Our analysis sample included participants with full information on NEET status (N = 3951). The results supported the accumulation of risk model, showing that the odds of NEET status increase by a factor of 1.015 (95% CI 1.012–1.019) for each 1-unit increase in depressive symptoms at any age between 11 and 24 years.
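As a rough illustration of what a per-unit odds ratio of 1.015 implies under an accumulation model, per-unit odds ratios compound multiplicatively across accumulated exposure. The exposure quantities below are hypothetical, not taken from the study.

```python
# Sketch of the accumulation interpretation: a per-unit odds ratio
# compounds multiplicatively across repeated exposure.
# The exposure scenario below is illustrative only.

def compounded_or(per_unit_or, total_units):
    """Total odds ratio implied by accumulating `total_units` of exposure."""
    return per_unit_or ** total_units

# e.g. a hypothetical 10-point higher depressive-symptom score
# sustained across 5 measurement occasions = 50 accumulated units
or_total = compounded_or(1.015, 10 * 5)
print(round(or_total, 2))
```

Even a small per-unit odds ratio can imply a substantively larger total odds ratio when symptoms persist, which is the intuition behind the accumulation of risk model.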
Conclusions
Given the adverse implications of NEET status, our results emphasize the importance of supporting mental health during adolescence and early adulthood, as well as considering specific needs of young people with recurring depressed mood.
To estimate the impact of 20 % flat-rate and tiered sugary drink tax structures on the consumption of sugary drinks, sugar-sweetened beverages and 100 % juice by age, sex and socio-economic position.
Design:
We modelled the impact of price changes – for each tax structure – on the demand for sugary drinks by applying own- and cross-price elasticities to self-reported sugary drink consumption measured using single-day 24-h dietary recalls from the cross-sectional, nationally representative 2015 Canadian Community Health Survey-Nutrition. For both 20 % flat-rate and tiered sugary drink tax scenarios, we used linear regression to estimate differences in mean energy intake and proportion of energy intake from sugary drinks by age, sex, education, food security and income.
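The core of the demand step is simple elasticity arithmetic. A minimal sketch follows, using a hypothetical own-price elasticity and baseline intake rather than values estimated in the study, and assuming full pass-through of the tax to shelf prices.

```python
# Hedged sketch: applying an own-price elasticity to a tax-induced
# price rise. Elasticity and baseline intake are illustrative,
# not the study's estimated values.

def demand_change(elasticity, price_change_pct):
    """Percent change in quantity demanded for a given percent price change."""
    return elasticity * price_change_pct

baseline_kcal = 140.0          # hypothetical mean daily energy from sugary drinks
own_price_elasticity = -1.2    # hypothetical: demand falls 1.2% per 1% price rise
tax_pct = 20.0                 # 20% flat-rate tax, assumed fully passed through

pct_change = demand_change(own_price_elasticity, tax_pct)  # -24.0 %
new_kcal = baseline_kcal * (1 + pct_change / 100)          # ~106.4 kcal/d
print(pct_change, round(new_kcal, 1))
```

The study's full model additionally applied cross-price elasticities, so substitution between beverage categories would shift some of this reduction to other drinks.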
Setting:
Canada.
Participants:
19 742 respondents aged 2 and over.
Results:
In the 20 % flat-rate scenario, we estimated that mean energy intake and the proportion of daily energy intake from sugary drinks on a given day would be reduced by 29 kcal/d (95 % UI: 18, 41) and 1·3 % (95 % UI: 0·8, 1·8), respectively. In the tiered tax scenario, small but meaningful additional reductions were estimated in mean energy intake (40 kcal/d, 95 % UI: 24, 55) and proportion of daily energy intake (1·8 %, 95 % UI: 1·1, 2·5). Both tax structures reduced, but did not eliminate, inequities in mean energy intake from sugary drinks despite larger consumption reductions in children/adolescents, males and individuals with lower education, food security and income.
Conclusions:
Sugary drink taxation, including the additional benefit of taxing 100 % juice, could reduce both overall mean energy intake from sugary drinks and inequities in that intake in Canada.
To evaluate the comparative epidemiology of hospital-onset bloodstream infection (HOBSI) and central line-associated bloodstream infection (CLABSI).
Design and Setting:
Retrospective observational study of HOBSI and CLABSI across a three-hospital healthcare system from 01/01/2017 to 12/31/2021.
Methods:
HOBSIs were identified as any non-commensal positive blood culture event on or after hospital day 3. CLABSIs were identified based on National Healthcare Safety Network (NHSN) criteria. We performed a time-series analysis to assess comparative temporal trends in HOBSI and CLABSI incidence. Using univariable and multivariable regression analyses, we compared demographics, risk factors, and outcomes between non-CLABSI HOBSI and CLABSI, as HOBSI and CLABSI are not mutually exclusive entities.
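For intuition, the incidence measures underlying the comparison above are counts over denominator days. A crude rate per 1,000 denominator days can be sketched as follows; the counts are hypothetical, and the study itself estimated temporal trends with time-series regression rather than crude rates.

```python
# Crude incidence rates from counts and person-time.
# Counts and denominators below are hypothetical, for illustration only.

def rate_per_1000(events, denominator_days):
    """Events per 1,000 denominator days (patient days or line days)."""
    return 1000.0 * events / denominator_days

hobsi_events, patient_days = 180, 150_000   # hypothetical HOBSI counts
clabsi_events, line_days = 45, 60_000       # hypothetical CLABSI counts

hobsi_rate = rate_per_1000(hobsi_events, patient_days)   # per 1,000 patient days
clabsi_rate = rate_per_1000(clabsi_events, line_days)    # per 1,000 central line days
print(hobsi_rate, clabsi_rate)
```

Note that the two metrics use different denominators (patient days versus central line days), so the crude rates are not directly comparable to one another, which is one reason the study modeled their trends separately.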
Results:
HOBSI incidence increased over the study period (IRR 1.006 HOBSI/1,000 patient days; 95% CI 1.001–1.012; P = .03), while no change in CLABSI incidence was observed (IRR .997 CLABSIs/1,000 central line days, 95% CI .992–1.002, P = .22). Differing demographic, microbiologic, and risk factor profiles were observed between CLABSIs and non-CLABSI HOBSIs. Multivariable analysis found lower odds of mortality among patients with CLABSIs when adjusted for covariates that approximate severity of illness (OR .27; 95% CI .11–.64; P < .01).
Conclusions:
HOBSI incidence increased over the study period without a concurrent increase in CLABSI incidence in our study population. Furthermore, risk factor and outcome profiles varied between CLABSI and non-CLABSI HOBSI, suggesting that these metrics differ in important ways worth considering if HOBSI is adopted as a quality metric.
The origins and timing of inpatient room sink contamination with carbapenem-resistant organisms (CROs) are poorly understood.
Methods:
We performed a prospective observational study to describe the timing, rate, and frequency of CRO contamination of in-room handwashing sinks in 2 intensive care units (ICU) in a newly constructed hospital bed tower. Study units, A and B, were opened to patient care in succession. The patients in unit A were moved to a new unit in the same bed tower, unit B. Each unit was similarly designed with 26 rooms and in-room sinks. Microbiological samples were taken every 4 weeks from 3 locations from each study sink: the top of the bowl, the drain cover, and the p-trap. The primary outcome was sink conversion events (SCEs), defined as CRO contamination of a sink in which CRO had not previously been detected.
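The primary outcome can be operationalized as a first-detection count per sink. The sketch below is hypothetical (sink IDs and results are invented, and the real analysis tracked sampling dates and the three sites within each sink), but it captures the definition of an SCE as the first CRO-positive sample for a previously negative sink.

```python
# Hypothetical sketch: counting sink conversion events (SCEs) from
# longitudinal culture results, where an SCE is the first CRO detection
# for a sink with no prior CRO-positive sample.

def count_sces(sink_results):
    """sink_results: dict mapping sink id -> list of booleans
    (True = CRO detected), ordered by sampling date."""
    sces = 0
    for results in sink_results.values():
        if any(results):
            sces += 1  # first detection for this sink is one SCE
    return sces

samples = {
    "A-01": [False, False, True, True],    # converts at sample 3 -> 1 SCE
    "A-02": [False, False, False, False],  # never converts
    "B-01": [True, True, True],            # converts at first sample -> 1 SCE
}
print(count_sces(samples))  # 2
```

Because an SCE is defined by first detection, conversions are cumulative: once a sink converts, it stays counted, which is why contamination in the study units could only grow over time.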
Results:
Sink samples were obtained 22 times from September 2020 to June 2022, giving 1,638 total environmental cultures. In total, 2,814 patients were admitted to study units while sink sampling occurred. We observed 35 SCEs (73%) overall; 9 sinks (41%) in unit A became contaminated with CRO by month 10, and all 26 sinks became contaminated in unit B by month 7. Overall, 299 CRO isolates were recovered; the most common species were Enterobacter cloacae and Pseudomonas aeruginosa.
Conclusion:
CRO contamination of sinks in 2 newly constructed ICUs was rapid and cumulative. Our findings support in-room sinks as reservoirs of CRO and emphasize the need for prevention strategies to mitigate contamination of hands and surfaces from CRO-colonized sinks.
Persistent brain fog is common in adults with Post-Acute Sequelae of SARS-CoV-2 infection (PASC), in whom it causes distress and in many cases interferes with performance of instrumental activities of daily living (IADL) and return-to-work. There are no interventions with rigorous evidence of efficacy for this new, often disabling condition. The purpose of this pilot is to evaluate the efficacy, on a preliminary basis, of a new intervention for this condition termed Constraint-Induced Cognitive therapy (CICT). CICT combines features of two established therapeutic approaches: cognitive speed of processing training (SOPT) developed by the laboratory of K. Ball and the Transfer Package and task-oriented training components of Constraint-Induced Movement therapy developed by the laboratory of E. Taub and G. Uswatte.
Participants and Methods:
Participants were >3 months after recovery from acute COVID-19 symptoms and had substantial brain fog and impairment in IADL. Participants were randomized to CICT immediately or after a 3-month delay. CICT involved 36 hours of outpatient therapy distributed over 4–6 weeks. Sessions had three components: (a) videogame-like training designed to improve how quickly participants process sensory input (SOPT), (b) training on IADLs following shaping principles, and (c) a set of behavioral techniques designed to transfer gains from the treatment setting to daily life, i.e., the Transfer Package. The Transfer Package included (a) negotiating a behavioral contract with participants and one or more family members about the responsibilities of the participants, family members, and treatment team; (b) assigning homework during and after the treatment period; (c) monitoring participants’ out-of-session behavior; (d) supporting problem-solving by participants and family members about barriers to performance of IADL; and (e) making follow-up phone calls. IADL performance, brain fog severity, and cognitive impairment were assessed using validated, trans-diagnostic measures before and after treatment and three months afterwards in the immediate-CICT group and on parallel occasions in the delayed-CICT group (aka waitlist controls).
Results:
To date, five participants were enrolled in the immediate-CICT group and four in the wait-list group. All had mild cognitive impairment, except for one with moderate impairment in the immediate-CICT group. Immediate-CICT participants, on average, had large reductions in brain fog severity on the Mental Clutter Scale (MCS, range = 0 to 10 points, mean change = -3.7, SD = 2.0); wait-list participants had small increases (mean change = 1.0, SD = 1.4). Notably, all five in the immediate-CICT group had clinically meaningful improvements (i.e., changes > 2 points) in performance of IADL outside the treatment setting as measured by the Canadian Occupational Performance Measure (COPM) Performance scale; only one did in the wait-list group. The advantage for the immediate-CICT group was very large on both the MCS and COPM (d’s = 1.7, p’s < .05). In follow-up, immediate-CICT group gains were retained or built upon.
Conclusions:
These preliminary findings warrant confirmation by a large-scale randomized controlled trial. To date, CICT shows high promise as an efficacious therapy for brain fog due to PASC. CICT participants had large, meaningful improvements in IADL performance outside the treatment setting, in addition to large reductions in brain fog severity.
We compared the number of blood-culture events before and after the introduction of a blood-culture algorithm and provider feedback. Secondary objectives were the comparison of blood-culture positivity and negative safety signals before and after the intervention.
Design:
Prospective cohort design.
Setting:
Two surgical intensive care units (ICUs): general and trauma surgery and cardiothoracic surgery.
Patients:
Patients aged ≥18 years and admitted to the ICU at the time of the blood-culture event.
Methods:
We used an interrupted time series to compare rates of blood-culture events (ie, blood-culture events per 1,000 patient days) before and after the algorithm implementation with weekly provider feedback.
Results:
The blood-culture event rate decreased from 100 to 55 blood-culture events per 1,000 patient days in the general surgery and trauma ICU (72% reduction; incidence rate ratio [IRR], 0.38; 95% confidence interval [CI], 0.32–0.46; P < .01) and from 102 to 77 blood-culture events per 1,000 patient days in the cardiothoracic surgery ICU (55% reduction; IRR, 0.45; 95% CI, 0.39–0.52; P < .01). We did not observe any differences in average monthly antibiotic days of therapy, mortality, or readmissions between the pre- and postintervention periods.
Conclusions:
We implemented a blood-culture algorithm with data feedback in 2 surgical ICUs, and we observed significant decreases in the rates of blood-culture events without an increase in negative safety signals, including ICU length of stay, mortality, antibiotic use, or readmissions.
Background: Blood cultures are commonly ordered for patients with low risk of bacteremia. Liberal blood-culture ordering increases the risk of false-positive results, which can lead to increased length of stay, excess antibiotics, and unnecessary diagnostic procedures. We implemented a blood-culture indication algorithm with data feedback and assessed the impact on ordering volume and percent positivity. Methods: We performed a prospective cohort study from February 2022 to November 2022 using historical controls from February 2020 to January 2022. We introduced the blood-culture algorithm (Fig. 1) in 2 adult surgical intensive care units (ICUs). Clinicians reviewed charts of eligible patients with blood cultures weekly to determine whether the blood-culture algorithm was followed. They provided feedback to the unit medical directors weekly. We defined a blood-culture event as ≥1 blood culture within 24 hours. We excluded patients aged <18 years, patients with an absolute neutrophil count <500, and heart and lung transplant recipients at the time of blood-culture review. Results: In total, 7,315 blood-culture events in the preintervention group and 2,506 blood-culture events in the postintervention group met eligibility criteria. The average monthly blood-culture rate decreased from 190 blood cultures per 1,000 patient days to 142 blood cultures per 1,000 patient days (P < .01) after the algorithm was implemented (Fig. 2). The average monthly blood-culture positivity increased from 11.7% to 14.2% (P = .13). Average monthly days of antibiotic therapy (DOT) were lower in the postintervention period than in the preintervention period (2,200 vs 1,940; P < .01; Fig. 3). ICU length of stay did not differ between the pre- and postintervention periods: 10 days (IQR, 5–18) versus 10 days (IQR, 5–17; P = .63). The in-hospital mortality rate was lower during the postintervention period, but the difference was not statistically significant: 9.24% versus 8.34% (P = .17).
The all-cause 30-day mortality was significantly lower during the intervention period: 11.9% versus 9.7% (P < .01). The unplanned 30-day readmission percentage was significantly lower during the intervention period (10.6% vs 7.6%; P < .01). Over the 9-month intervention, we reviewed 916 blood-culture events in 452 unique patients. Overall, 74.6% of blood cultures followed the algorithm. The most common reasons overall for ordering blood cultures were severe sepsis or septic shock (37%), isolated fever and/or leukocytosis (19%), and documenting clearance of bacteremia (15%) (Table 1). The most common indication for inappropriate blood cultures was isolated fever and/or leukocytosis (53%). Conclusions: We introduced a blood-culture algorithm with data feedback in 2 surgical ICUs and observed decreases in blood-culture volume without a negative impact on ICU length of stay or mortality rate.
Background: The Centers for Disease Control and Prevention’s Emerging Infections Program conducts active laboratory- and population-based surveillance for carbapenem-resistant Enterobacterales (CRE) and extended-spectrum beta-lactamase-producing Enterobacterales (ESBL-E). To better understand the U.S. epidemiology of these organisms among children, we determined the incidence of pediatric CRE and ESBL-E cases and described their clinical characteristics. Methods: Surveillance was conducted among children <18 years of age for CRE from 2016–2020 in 10 sites, and for ESBL-E from 2019–2020 in 6 sites. Among catchment-area residents, an incident CRE case was defined as the first isolation of Escherichia coli, Enterobacter cloacae complex, Klebsiella aerogenes, K. oxytoca, or K. pneumoniae in a 30-day period resistant to ≥1 carbapenem from a normally sterile site or urine. An incident ESBL-E case was defined as the first isolation of E. coli, K. pneumoniae, or K. oxytoca in a 30-day period resistant to any third-generation cephalosporin and non-resistant to all carbapenems from a normally sterile site or urine. Case records were reviewed. Results: Among 159 CRE cases, 131 (82.9%) were isolated from urine and 19 (12.0%) from blood; median age was 5 years (IQR 1–10) and 94 (59.1%) were female. Combined CRE incidence rate per 100,000 population by year ranged from 0.47 to 0.87. Among 207 ESBL-E cases, 160 (94.7%) were isolated from urine and 6 (3.6%) from blood; median age was 6 years (IQR 2–15) and 165 (79.7%) were female. Annual ESBL-E incidence rate per 100,000 population was 26.5 in 2019 and 19.63 in 2020. Incidence rates of CRE and ESBL-E were >2-fold higher in infants (children aged <1 year) than in other age groups.
Among those with data available, CRE cases were more likely than ESBL-E cases to have underlying conditions (99/158 [62.7%] versus 59/169 [34.9%], P<0.0001), to have prior healthcare exposures (74/158 [46.8%] versus 38/169 [22.5%], P<0.0001), and to be hospitalized for any reason around the time of their culture collection (75/158 [47.5%] versus 38/169 [22.5%], P<0.0001); median duration of admission was 18 days [IQR 3–103] for CRE versus 10 days [IQR 4–43] for ESBL-E. Urinary tract infection was the most frequent infection for CRE (89/158 [56.3%]) and ESBL-E (125/169 [74.0%]) cases. Conclusion: CRE infections occurred less frequently than ESBL-E infections in U.S. children but were more often associated with healthcare risk factors and hospitalization. Infants had the highest incidence of CRE and ESBL-E. Continued surveillance, infection prevention and control efforts, and antibiotic stewardship outside and within pediatric care are needed.
Background: Candida auris is a frequently drug-resistant yeast that can cause invasive disease and is easily transmitted in healthcare settings. Pediatric cases are rare in the United States, with <10 reported before 2022. In August 2021, the first C. auris case in Las Vegas was identified in an adult. By May 2022, 117 cases were identified across 16 healthcare facilities, including 3 pediatric cases at an acute-care hospital (ACH) with adult cases, representing the first pediatric cluster in the United States. The CDC and Nevada Division of Public and Behavioral Health (NVDPBH) sought to describe these cases and risk factors for C. auris acquisition. Methods: We defined a case as a patient’s first positive C. auris specimen. We reviewed medical records and infection prevention and control (IPC) practices. Environmental sampling was conducted on high-touch surfaces throughout affected adult and pediatric units. Isolate relatedness was assessed using whole-genome sequencing (WGS). Results: All 3 pediatric patients were born at the facility and had congenital heart defects. All were aged <6 months when they developed C. auris bloodstream infections; 2 developed C. auris endocarditis. One patient died. Patients overlapped in the pediatric cardiac intensive care unit; 2 did not leave between birth and C. auris infection. Mobile medical equipment was shared between adult and pediatric patients; lapses in cleaning and disinfection of shared mobile medical equipment and environmental surfaces were observed, presenting opportunities for transmission. Overall, 32 environmental samples were collected, and C. auris was isolated from 2 specimens from an adult unit without current cases. One was a composite sample from an adult patient’s bed handles, railings, tray table and call buttons, and the second was from an adult lift-assistance device.
WGS showed that isolates from adult and pediatric cases and from the environment fell within the same genetic cluster, differing by 2–10 single-nucleotide polymorphisms (SNPs), supporting within-hospital transmission. The pediatric cases varied by 0–3 SNPs; at least 2 were highly related. Conclusions: C. auris was likely introduced to the pediatric population from adults via inadequately cleaned and disinfected mobile medical equipment. We made recommendations to ensure adequate cleaning and disinfection and to implement monitoring and audits. No pediatric cases have been identified since. This investigation demonstrates that transmission can occur between unrelated units and populations, and that robust infection prevention and control practices throughout the facility are critical for reducing C. auris environmental burden and limiting transmission, including to previously unaffected vulnerable populations, like children.
To increase inclusivity, diversity, equity and accessibility in Antarctic science, we must build more positive and inclusive Antarctic field work environments. The International Thwaites Glacier Collaboration (ITGC) has engaged in efforts to contribute to that goal through a variety of activities since 2018, including creating an open-access ‘Field and Ship Best Practices’ guide, engaging in pre-field season team dynamics meetings, and surveying post-field season reflections and experiences. We report specific actions taken by ITGC and their outcomes. We found that strong and supported early career researchers brought new and important perspectives regarding strategies for transforming culture. We discovered that engaged and involved senior leadership was also critical for expanding participation and securing funding to support efforts. Pre-field discussions involving all field team members were particularly helpful for setting expectations, improving sense of belonging, describing field work best practices, and co-creating a positive work culture.
In this article, we consider the role that academics play in the global illicit trade in cultural objects. Academics connect sources to buyers and influence market values by publishing looted and stolen cultural objects (passive facilitation) and by collaborating with market players, including by collecting artifacts themselves (active facilitation). Their actions shape market desire, changing what is targeted for looting, theft, and illicit trading across borders. However, this crucial facilitative role often goes unnoticed or unaddressed in scholarship on collecting, white collar crime, and the illicit market in cultural objects. This article explores the importance of academic facilitation through a case study of the career of Mary Slusser, a renowned American scholar of Nepali art and art history.
Postgraduate education is important in preparing and enhancing health professionals for the practice of disaster and terror medicine. The World Association for Disaster and Emergency Medicine (WADEM) has formulated a standardized international perspective for education and training in disaster medicine and health. Notwithstanding, a gap in competency-based training in disaster and terror medicine continues to be reported internationally, particularly across the Asia Pacific, a known vulnerable region. We report on a new Graduate Diploma in Disaster and Terror Medicine, to be expanded to Master’s level in 2024. The course is delivered mainly online to a multidisciplinary international audience. This paper summarizes the development of the course and outlines the key influences that have contributed to its design.
Method:
A survey of the critical care workforce conducted by the Department of Critical Care at the University of Melbourne in early 2020 identified the need to develop education in disaster and terror medicine. A market and competitor analysis identified a gap in clinician-focused courses offered in Australia and internationally. Based upon these results, a new course was developed to meet these needs.
Results:
Based on the results of the survey and feedback from expert stakeholders, the new postgraduate courses in disaster and terror medicine were developed. They offer both core and elective subjects, utilizing a modular approach with supervised simulation and practical training. The courses incorporate problem-based learning, the principles and practices of online education and advances in simulation-based learning, providing both a public health and clinical lens.
Conclusion:
The nested suite of postgraduate disaster and terror medicine courses at the University of Melbourne is at the forefront of learning within this field and meets the contemporary needs of health professionals who practice disaster and terror medicine.