We present a re-discovery of G278.94+1.35a as possibly one of the largest known Galactic supernova remnants (SNRs), which we name Diprotodon. While previously established as a Galactic SNR, Diprotodon is visible in our new Evolutionary Map of the Universe (EMU) and GaLactic and Extragalactic All-sky MWA (GLEAM) radio continuum images at an angular size of $3{.\!^\circ}33\times3{.\!^\circ}23$, much larger than previously measured. At the previously suggested distance of 2.7 kpc, this implies a diameter of 157$\times$152 pc. This size would qualify Diprotodon as the largest known SNR and pushes our estimates of SNR sizes to the upper limits. We investigate the environment in which the SNR is located and examine various scenarios that might explain such a large and relatively bright SNR appearance. We find that Diprotodon is most likely at a much closer distance of $\sim$1 kpc, implying its diameter is 58$\times$56 pc and it is in the radiative evolutionary phase. We also present a new Fermi-LAT data analysis that confirms the angular extent of the SNR in gamma rays. The origin of the high-energy emission remains somewhat puzzling, and the scenarios we explore reveal new puzzles, given this unexpected and unique observation of a seemingly evolved SNR having a hard GeV spectrum with no breaks. We explore both leptonic and hadronic scenarios, as well as the possibility that the high-energy emission arises from the leftover particle population of a historic pulsar wind nebula.
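The quoted physical sizes follow from the small-angle relation, diameter ≈ distance × angular size in radians. A quick check of the numbers in the abstract (the function name is ours):

```python
import math

def physical_diameter_pc(angular_size_deg: float, distance_pc: float) -> float:
    """Small-angle approximation: diameter = distance * angle (in radians)."""
    return distance_pc * math.radians(angular_size_deg)

# At the previously suggested distance of 2.7 kpc:
major = physical_diameter_pc(3.33, 2700)  # ~157 pc
minor = physical_diameter_pc(3.23, 2700)  # ~152 pc

# At the revised distance of ~1 kpc:
major_near = physical_diameter_pc(3.33, 1000)  # ~58 pc
minor_near = physical_diameter_pc(3.23, 1000)  # ~56 pc
```

Rounding to the nearest parsec reproduces both the 157×152 pc and 58×56 pc figures.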
In response to the COVID-19 pandemic, we rapidly implemented a plasma coordination center within two months to support transfusions for two outpatient randomized controlled trials. The center design was based on an investigational drug services model and a Food and Drug Administration-compliant database to manage blood product inventory and trial safety.
Methods:
A core investigational team adapted a cloud-based platform to randomize patient assignments and track inventory distribution of control plasma and high-titer COVID-19 convalescent plasma of different blood groups from 29 donor collection centers directly to blood banks serving 26 transfusion sites.
Results:
We performed 1,351 transfusions in 16 months. The transparency of the digital inventory at each site was critical to facilitate qualification, randomization, and overnight shipments of blood group-compatible plasma for transfusions into trial participants. While inventory challenges were heightened with COVID-19 convalescent plasma, the cloud-based system and the flexible approach of the plasma coordination center staff across the blood bank network enabled decentralized procurement and distribution of investigational products to maintain inventory thresholds and overcome local supply chain restraints at the sites.
Conclusion:
The rapid creation of a plasma coordination center for outpatient transfusions is infrequent in the academic setting. Distributing more than 3,100 plasma units to blood banks charged with managing investigational inventory across the U.S. in a decentralized manner posed operational and regulatory challenges while providing opportunities for the plasma coordination center to contribute to research of global importance. This program can serve as a template in subsequent public health emergencies.
Control of carbapenem-resistant Acinetobacter baumannii and Pseudomonas aeruginosa spread in healthcare settings begins with timely and accurate laboratory testing practices. Survey results show most Veterans Affairs facilities are performing recommended tests to identify these organisms. Most facilities report sufficient resources to perform testing, though medium-complexity facilities report some perceived barriers.
Background: Carbapenem-resistant Acinetobacter baumannii (CRAB) and Pseudomonas aeruginosa (CRPA) are drug-resistant pathogens causing high mortality rates with limited treatment options. Understanding the incidence of these organisms and laboratory knowledge of testing protocols is important for controlling their spread in healthcare settings. This project assessed how often Veterans Affairs (VA) healthcare facilities identify CRAB and CRPA and the testing practices used. Method: An electronic survey was distributed to 126 VA acute care facilities September-October 2023. The survey focused on CRAB and CRPA incidence, testing and identification, and availability of testing resources. Responses were analyzed by complexity of patients treated at VA facilities (High, Medium, Low) using Fisher’s exact tests. Result: 77 (61.1%) facilities responded, most in urban settings (85.4%). Most respondents were lead or supervisory laboratory technologists (84.2%) from high-complexity facilities (69.0%). Few facilities detected CRAB ≥ once/month (4.4%), with most reporting that they have not seen CRAB at their facility (55.0%). CRPA was detected more frequently: 19% of facilities had isolates ≥ once/month, 29.2% a few times per year, and 26.9% reported never having seen the organism. No differences in CRAB or CRPA incidence were found by facility complexity. Nearly all facilities, regardless of complexity, utilize the recommended methods of MIC or disk diffusion to identify CRAB or CRPA (91.9%), with the remaining facilities reporting that testing is done off-site (7.8%). More high-complexity facilities perform on-site testing compared with low-complexity facilities (32.0% vs 2.7%, p=0.04). 83% of laboratories test for carbapenemase production, with one-fourth using off-site reference labs.
One-fourth of facilities perform additional antibiotic susceptibility testing for CRAB and CRPA isolates, most of which test for susceptibility to combination antibiotics; no differences between complexities were found. Agreement that sufficient laboratory and equipment resources were available was higher in high-complexity than in medium-complexity facilities (70.7% vs 33.3%, p=0.01), but not low-complexity facilities (43.8%). Conclusion: Timely and accurate testing protocols for CRAB and CRPA are important to quickly control spread and reduce associated mortality. This study shows that most VA protocols follow recommended testing and identification guidelines. Interestingly, there was no difference in CRAB or CRPA incidence between facilities providing higher vs lower complexity of care. While high- and low-complexity facilities generally reported sufficient resources for CRAB and CRPA evaluation, some medium-complexity labs, which may feel more compelled than low-complexity labs to bring testing in house, reported that additional resources would be required.
Develop and implement a system in the Veterans Health Administration (VA) to alert local medical center personnel in real time when an acute- or long-term care patient/resident is admitted to their facility with a history of colonization or infection with a multidrug-resistant organism (MDRO) previously identified at any VA facility across the nation.
Methods:
An algorithm was developed to extract clinical microbiology and local facility census data from the VA Corporate Data Warehouse, initially targeting carbapenem-resistant Enterobacterales (CRE) and methicillin-resistant Staphylococcus aureus (MRSA). The algorithm was validated with chart review of CRE cases from 2010-2018, trialed and refined in 24 VA healthcare systems over two years, expanded to other MDROs, and implemented nationwide in April 2022 as “VA Bug Alert” (VABA). Use through August 2023 was assessed.
Results:
VABA performed well for CRE with recall of 96.3%, precision of 99.8%, and F1 score of 98.0%. At the 24 trial sites, feedback was recorded for 1,011 admissions with a history of CRE (130), MRSA (814), or both (67). Among Infection Preventionists and MDRO Prevention Coordinators, 338 (33%) reported being previously unaware of the information, and of these, 271 (80%) reported they would not have otherwise known this information. By fourteen months after nationwide implementation, 113/130 (87%) VA healthcare systems had at least one VABA subscriber.
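The reported F1 score is the harmonic mean of precision and recall, and the three figures above are internally consistent (a quick check; the function name is ours):

```python
def f1_score(precision: float, recall: float) -> float:
    """F1 is the harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# VABA's reported CRE performance: precision 99.8%, recall 96.3%
f1 = f1_score(0.998, 0.963)  # ~0.980, matching the reported 98.0%
```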
Conclusions:
A national system for alerting facilities in real-time of patients admitted with an MDRO history was successfully developed and implemented in VA. Next steps include understanding facilitators and barriers to use and coordination with non-VA facilities nationwide.
In this exploratory study, we investigated designers’ preferred learning media for learning to design for Additive Manufacturing (AM). Using an online survey questionnaire with 201 respondents, we also examined how factors such as years of experience and the categories of products designed influence designers’ choice of learning media. The results show that designers have learned to design for AM largely through experimentation, and they represent a first step towards developing an appropriate Design for Additive Manufacturing knowledge dissemination approach.
Decreasing the time to contact precautions (CP) is critical to carbapenem-resistant Enterobacterales (CRE) prevention. Identifying factors associated with delayed CP can decrease the spread from patients with CRE. In this study, a shorter length of stay was associated with being placed in CP within 3 days.
Chronic traumatic encephalopathy (CTE) is a neurodegenerative disease that can only be diagnosed post-mortem. Revised criteria for the clinical syndrome of CTE, known as traumatic encephalopathy syndrome (TES), include impairments in episodic memory and/or executive function as core clinical features. These criteria were informed by retrospective interviews with next-of-kin, and the presence and rates of objective impairments in memory and executive functions in CTE are unknown. Here, we characterized antemortem neuropsychological test performance in episodic memory and executive functions among deceased contact sport athletes neuropathologically diagnosed with CTE.
Participants and Methods:
The sample included 80 deceased male contact sport athletes from the UNITE brain bank who had autopsy-confirmed CTE (and no other neurodegenerative diseases). Published criteria were used for the autopsy diagnosis of CTE. Neuropsychological test reports (raw scores) were acquired through medical record requests. Raw scores were converted to z-scores using the same age-, sex-, and education-adjusted normative data. Tests of memory included long delay trials from the Rey Complex Figure, CVLT-II, HVLT-R, RBANS, and BVMT-R. Tests of executive functions included Trail Making Test-B (TMT-B), Controlled Oral Word Association Test, WAIS-III Picture Arrangement, and various WAIS-IV subtests. Not all brain donors had the same tests, and the sample sizes vary across tests, with 33 donors having tests from both domains. Twenty-eight had 1 test in memory and 3 had 2+. Eight had 1 test of executive function and 46 had 2+. A z-score of 1.5 standard deviations or more below the normative mean was considered impaired. Interpretation of test performance followed the American Academy of Clinical Neuropsychology guidelines (Guilmette et al., 2020). Bivariate correlations assessed associations between cumulative p-tau burden (summary semiquantitative ratings of p-tau severity across 11 brain regions) and performance on TMT-B (n=34) and CVLT-II (n=14), the most commonly available tests.
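The z-score conversion and impairment threshold described above can be sketched as follows. The normative mean and SD in the example are hypothetical placeholders, not values from the study:

```python
def to_z(raw: float, norm_mean: float, norm_sd: float) -> float:
    """Convert a raw test score to a z-score against normative data."""
    return (raw - norm_mean) / norm_sd

def is_impaired(z: float, cutoff: float = -1.5) -> bool:
    """Impairment defined as 1.5 SD or more below the normative mean."""
    return z <= cutoff

# Hypothetical example: raw score of 27 on a test normed at mean 40, SD 8
z = to_z(27, 40, 8)       # -1.625
impaired = is_impaired(z)  # True: more than 1.5 SD below the mean
```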
Results:
Of the 80 donors (mean age = 59.9, SD = 18.0 years; 13 [16.3%] Black), 72 played football, 4 played ice hockey, and 4 played other contact sports. Most played at the professional level (57, 71.3%). Mean time between neuropsychological testing and death was 3.9 (SD = 4.5) years. The most common reason for testing was dementia-related (43, 53.8%). Mean z-scores fell in the average psychometric range (mean z = -0.52, SD = 1.5, range = -6.0 to 3.0) for executive function and the low average range for memory (mean z = -1.3, SD = 1.1, range = -4.0 to 2.0). Eleven (20.4%) had impairment on 1 test and 3 (5.6%) on 2+ tests of executive functions. The most common impairment was on TMT-B (mean z = -1.77, 13 [38.2%] impaired). For memory, 13 (41.9%) had impairment on 1 test. Of the 14 who had CVLT-II, 7 were impaired (mean z = -1.33). Greater p-tau burden was associated with worse performance on CVLT-II (r = -.653, p = .02), but not TMT-B (r = .187, p > .05).
Conclusions:
This study provides the first evidence for objectively-measured impairments in executive functions and memory in a sample with known, autopsy-confirmed CTE. Furthermore, p-tau burden corresponded to worse memory test performance. Examination of neuropsychological tests from medical records has limitations but can overcome shortcomings of retrospective informant reports to provide insight into the cognitive profiles associated with CTE.
Depression and anxiety are the leading contributors to the global burden of disease among young people, accounting for over a third (34.8%) of years lived with disability. Yet there is limited evidence for interventions that prevent adolescent depression and anxiety in low- and middle-income countries (LMICs), where 90% of adolescents live. This article introduces the ‘Improving Adolescent mentaL health by reducing the Impact of poVErty (ALIVE)’ study, its conceptual framework, objectives, methods and expected outcomes. The aim of the ALIVE study is to develop and pilot-test an intervention that combines poverty reduction with strengthening self-regulation to prevent depression and anxiety among adolescents living in urban poverty in Colombia, Nepal and South Africa.
Methods
This aim will be achieved by addressing four objectives: (1) develop a conceptual framework that identifies the causal mechanisms linking poverty, self-regulation and depression and anxiety; (2) develop a multi-component selective prevention intervention targeting self-regulation and poverty among adolescents at high risk of developing depression or anxiety; (3) adapt and validate instruments to measure incidence of depression and anxiety, mediators and implementation parameters of the prevention intervention; and (4) undertake a four-arm pilot cluster randomised controlled trial to assess the feasibility, acceptability and cost of the selective prevention intervention in the three study sites.
Results
The contributions of this study include the active engagement and participation of adolescents in the research process; a focus on the causal mechanisms of the intervention; building an evidence base for prevention interventions in LMICs; and the use of an interdisciplinary approach.
Conclusions
By developing and evaluating an intervention that addresses multidimensional poverty and self-regulation, ALIVE can make contributions to evidence on the integration of mental health into broader development policy and practice.
To describe antimicrobial therapy used for multidrug-resistant (MDR) Acinetobacter spp. bacteremia in Veterans and impacts on mortality.
Methods:
This was a retrospective cohort study of hospitalized Veterans Affairs patients from 2012 to 2018 with a positive MDR Acinetobacter spp. blood culture who received antimicrobial treatment 2 days prior to through 5 days after the culture date. Only the first culture per patient was used. The association between treatment and patient characteristics was assessed using bivariate analyses. Multivariable logistic regression models examined the relationship between antibiotic regimen and in-hospital, 30-day, and 1-year mortality. Generalized linear models were used to assess cost outcomes.
Results:
MDR Acinetobacter spp. was identified in 184 patients. Most cultures identified were Acinetobacter baumannii (90%), 3% were Acinetobacter lwoffii, and 7% were other Acinetobacter species. Penicillin–β-lactamase inhibitor combinations (51.1%) and carbapenems (51.6%) were the most prescribed antibiotics. In unadjusted analysis, extended-spectrum cephalosporins and penicillin–β-lactamase inhibitor combinations were associated with decreased odds of 30-day mortality, but the associations were not significant after adjustment (adjusted odds ratio (aOR) = 0.47, 95% CI, 0.21–1.05 and aOR = 0.75, 95% CI, 0.37–1.53, respectively). There was no association between combination therapy vs monotherapy and 30-day mortality (aOR = 1.55, 95% CI, 0.72–3.32).
Conclusion:
In hospitalized Veterans with MDR Acinetobacter spp., none of the treatments were shown to be associated with in-hospital, 30-day, and 1-year mortality. Combination therapy was not associated with decreased mortality for MDR Acinetobacter spp. bacteremia.
In England, a range of mental health crisis care models and approaches to organising crisis care systems have been implemented, but characteristics associated with their effectiveness are poorly understood.
Aims
To (a) develop a typology of catchment area mental health crisis care systems and (b) investigate how crisis care service models and system characteristics relate to psychiatric hospital admissions and detentions.
Method
Crisis systems data were obtained from a 2019 English national survey. Latent class analyses were conducted to identify discernible typologies, and mixed-effects negative binomial regression models were fitted to explore associations between crisis care models and admissions and detention rates, obtained from nationally reported data.
Results
No clear typology of catchment area crisis care systems emerged. Regression models suggested that provision of a crisis telephone service within the local crisis system was associated with an 11.6% lower admission rate and a 15.3% lower detention rate. Provision of a crisis cafe was associated with a 7.8% lower admission rate. The provision of a crisis assessment team separate from the crisis resolution and home treatment service was associated with a 12.8% higher admission rate.
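In a negative binomial model, percentage differences like those above are conventionally read off the exponentiated coefficients (incidence rate ratios, IRRs). Assuming the reported figures were derived this way (the function names are ours), the mapping is:

```python
import math

def irr_from_coef(beta: float) -> float:
    """Exponentiate a negative binomial regression coefficient to get the IRR."""
    return math.exp(beta)

def pct_change_from_irr(irr: float) -> float:
    """Percent change in the outcome rate implied by an incidence rate ratio."""
    return (irr - 1.0) * 100.0

# An 11.6% lower admission rate corresponds to an IRR of ~0.884,
# i.e. a coefficient of ln(0.884) ~= -0.123
irr = 1 - 0.116
pct = pct_change_from_irr(irr)  # -11.6
```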
Conclusions
The configuration of crisis care systems varies considerably in England, but we could not derive a typology that convincingly categorised crisis care systems. Our results suggest that a crisis phone line and a crisis cafe may be associated with lower admission rates. However, our findings suggest crisis assessment teams, separate from home treatment teams, may not be associated with reductions in admission and detentions.
Background: Considering the threat of antimicrobial resistance (AMR), Ethiopia implemented strategies to combat AMR, including partnering with the American Society for Microbiology (ASM) to conduct an AMR training program using the Project ECHO learning platform. ECHO AMR was used to virtually connect subject-matter experts (SMEs) with participating sentinel laboratories in remote locations to provide ongoing education and telementoring, and to foster peer-to-peer learning and problem-solving in microbiology. In phase 1, the ASM had primary leadership in conducting sessions and project administration. In phase 2, roles and responsibilities transitioned from the ASM to the Ethiopian Public Health Institute (EPHI) with support from ECHO India. Here we describe the transition process and lessons learned. Methods: From December 2020–2021, biweekly 1-hour sessions were conducted for 8 sentinel laboratories. Each virtual session included a lecture led by an SME, a case presentation by a participating laboratory, open discussion, and feedback via an end-of-session online survey. Following a transition plan, initial ASM-EPHI transition activities included formal administrative and logistical training, including participation in a 3-day Project ECHO immersion program provided by ECHO India. Selected administrative and technical roles and responsibilities, including further developing their own SMEs, were transitioned from ASM to EPHI every 4 sessions. ASM conducted postsession reviews with EPHI and ECHO India to discuss successes and suggested improvements. Results: Leadership of ECHO AMR was fully transitioned to EPHI over 12 months. End-of-session surveys and postsession reviews indicated the transition process was successful, with EPHI staff leading the lectures, session coordination, and facilitation, and positive feedback from session participants.
Challenges included variable sentinel site participation due to competing priorities such as COVID-19 testing and poor internet connectivity during the rainy season. Lessons learned included the need to use a gradual transition strategy with close monitoring, to train facilitators to maintain implementation fidelity (the level of reproducibility in conducting ECHO AMR as in phase 1) and improve participation, and to assess individual learning using pretests and posttests. Recommendations included that ASM remain an external technical advisor to ensure program technical depth and that session facilitators be trained to improve participation in the discussions. Implementation fidelity compared to phase 1 was considered moderate, with the gap primarily due to the need for dedicated release time from laboratory duties to ensure session leadership, coordination, and facilitation. Conclusions: Leadership and laboratory workforce capacity-building responsibility for AMR training was successfully transitioned from ASM to EPHI, promoting self-sufficiency in training, with far-reaching benefits in the global fight against AMR.
Background: Statistically significant decreases in methicillin-resistant Staphylococcus aureus (MRSA) healthcare-associated infections (HAIs) occurred in Veterans Health Administration (VA) facilities from 2007 to 2019 using active surveillance for facility admissions and contact precautions for patients colonized (CPC) or infected (CPI) with MRSA, but the value of these interventions is controversial. Objective: To determine the impact of active surveillance, CPC, and CPI on the prevention of MRSA HAIs, we conducted a prospective cohort study between July 2020 and June 2022 in all 123 acute-care VA medical facilities. In April 2020, all facilities were given the option to suspend any combination of active surveillance, CPC, or CPI to free up laboratory resources for COVID-19 testing and conserve personal protective equipment. We measured MRSA HAIs (cases per 1,000 patient days) in intensive care units (ICUs) and non-ICUs by the infection control policy. Results: During the analysis period, there were 917,591 admissions, 5,225,174 patient days, and 568 MRSA HAIs. Only 20% of facilities continued all 3 MRSA infection control measures in July 2020, but this rate increased to 57% by June 2022. The MRSA HAI rate for all infection sites in non-ICUs was 0.07 (95% CI, 0.05–0.08) for facilities practicing active surveillance plus CPC plus CPI compared to 0.12 (95% CI, 0.08–0.19; P = .01) for those not practicing any of these strategies, and in ICUs the MRSA HAI rates were 0.20 (95% CI, 0.15–0.26) and 0.65 (95% CI, 0.41–0.98; P < .001) for the respective policies. Similar differences were seen when the analyses were restricted to MRSA bloodstream HAIs. Accounting for monthly COVID-19 admissions to facilities over the analysis period using a negative binomial regression model did not change the relationships between facility policy and MRSA HAI rates in the ICUs or non-ICUs.
There was no statistically significant difference in monthly facility urinary catheter-associated infection rates, a nonequivalent dependent variable, in the categories during the analysis period in either ICUs or non-ICUs. Conclusions: In Veterans Affairs medical centers, there were fewer MRSA HAIs when facilities practiced active surveillance and contact precautions for colonized or infected patients during the COVID-19 pandemic. The effect was greater in ICUs than non-ICUs.
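The HAI rates quoted above are cases per 1,000 patient days. As a sanity check on the aggregate numbers reported in the Results (variable names ours):

```python
def rate_per_1000(cases: int, patient_days: int) -> float:
    """Infection rate expressed per 1,000 patient days."""
    return cases / patient_days * 1000

# 568 MRSA HAIs over 5,225,174 patient days across the whole cohort
overall = rate_per_1000(568, 5_225_174)  # ~0.11 per 1,000 patient days
```

The pooled rate of ~0.11 sits, as expected, between the non-ICU (0.07–0.12) and ICU (0.20–0.65) subgroup rates.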
Objectives: To address the importation of multi-drug-resistant organisms (MDROs) when a colonized or infected patient is transferred from another VA facility, the Veterans Health Administration (VHA) launched the Inpatient Pathogen Tracker (IPT) in 2020. IPT tracks MDRO-infected/colonized patients and alerts MDRO Program Coordinators (MPCs) and Infection Preventionists (IPs) when such patients are admitted to their facility, to facilitate rapid identification and isolation of infected/colonized patients. IPT usage was low during the initial rollout (32.5%). The VHA and the CARRIAGE QUERI Program developed targeted implementation strategies to increase utilization of IPT’s second iteration, VA Bug Alert (VABA). Methods: Familiarity with IPT was assessed via a pre-education survey (3/2022). All sites received standard VABA implementation including: 1) adaptation of VABA features based on end-user feedback (completed 4/2022), 2) development and delivery of an educational module regarding the revised tool (completed 4/2022), and 3) internal facilitation from the VHA MDRO Program Office (ongoing) (see Figure for all key timepoints). Intent to register for VABA was assessed via a post-education survey (4-5/2022). Sites (125 eligible) not registered for VABA by 6/1/2022 were randomly assigned to receive one of two conditions from 6/2022–8/2022: continued standard implementation alone or enhanced implementation. Enhanced implementation added the following to standard implementation: 1) audit and feedback reports and 2) external facilitation, including interviews and education about VABA. We compared the number of sites with ≥1 MPC/IP registered for VABA to date between implementation conditions. Results: Pre-education survey. 168 MPC/IPs across 117 sites responded (94% of eligible sites). Among respondents, 25% had used IPT, 35.1% were familiar with but had not used IPT, and 39.9% were unfamiliar with IPT. Post-education survey.
93 MPC/IPs across 80 sites responded (59% of eligible sites). Of these, 81.7% said they planned to register for VABA, 4.3% said they would not register, and 14.0% said they were unsure. Post-6/1/2022 registrations. By 6/1/2022, 71% of sites had ≥1 registered VABA user. Of the 28 unregistered sites eligible for enhanced implementation, 13 were assigned to receive enhanced implementation and 15 to receive continued standard implementation. Eight sites in the enhanced implementation condition (61.5%) registered for VABA; seven standard-implementation-only sites (46.7%) registered. The number of registered sites did not significantly differ by implementation condition (Fisher’s exact p=0.476). Conclusions: Standard and enhanced implementation were equally effective at encouraging VABA registration, suggesting that allocating resources to enhanced implementation may not be necessary.
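The registration comparison (8 of 13 enhanced-implementation sites vs 7 of 15 standard-implementation sites) can be checked with a two-sided Fisher's exact test. A minimal stdlib sketch (the function is ours; `scipy.stats.fisher_exact` gives the same result):

```python
from math import comb

def fisher_exact_two_sided(a: int, b: int, c: int, d: int) -> float:
    """Two-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]]."""
    row1, row2, col1 = a + b, c + d, a + c
    n = row1 + row2
    denom = comb(n, col1)  # total number of tables with these margins

    def p_table(x: int) -> float:
        # Hypergeometric probability of a table with x in the top-left cell
        return comb(row1, x) * comb(row2, col1 - x) / denom

    p_obs = p_table(a)
    # Sum probabilities of all tables as or more extreme than the observed one
    lo, hi = max(0, col1 - row2), min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Registered vs not registered: enhanced 8/5, standard 7/8
p = fisher_exact_two_sided(8, 5, 7, 8)  # ~0.476, matching the reported value
```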
Background: Antimicrobial resistance (AMR) presents a global health threat. Training laboratory technicians to accurately identify and report AMR is critical in low- and middle-income countries (LMICs) to control the spread of AMR. Ethiopia and Kenya implemented a telementoring program, ECHO AMR, via the Project ECHO learning platform to improve laboratory technician capacity to isolate, identify, and report AMR organisms; to perform antimicrobial susceptibility testing (AST); and to develop a community of learning. Between January 2018 and January 2022, biweekly 1-hour sessions were held for 8 and 22 laboratories averaging 19 or 43 participants per session in Ethiopia and Kenya, respectively. Each session included a lecture, a laboratory challenge case presentation, and discussion. An evaluation was conducted to assess perceived strengths and weaknesses of the program and its usefulness in improving bacteriology capacity. Methods: In July–August 2022, semistructured key informant interviews of purposively and randomly selected laboratorians were conducted to understand participant perspectives of ECHO AMR, including session structure and content, changes in laboratory performance, and the virtual learning platform. Eligible participants attended at least one-third of available sessions in Ethiopia (8 of 26 sessions) or Kenya (5 of 16 sessions) during 2021. Key informant interviews were transcribed and systematically reviewed to identify key themes. Results: In total, 22 laboratory technicians participated in the key informant interviews: 12 in Ethiopia and 10 in Kenya. Participants reported that the ECHO AMR session structure was well organized but recommended increasing session duration to allow more time for discussion. Technical content was presented at an appropriate level and was highly rated. However, participants suggested including more subject-matter experts to provide the lectures. 
All participants reported positive change in laboratory practice, including implementation of international standards for AST, better quality control, improved confidence and critical thinking, and increased AMR awareness and reporting. Participants learned well in the virtual environment, with the platform providing wide-ranging geographic interactions to share skills and knowledge among sites without travel. However, there were connectivity issues, competing work priorities during sessions, and a lack of dedicated space for team participation. Conclusions: Laboratory technicians reported that virtual laboratory training was well-received, efficient, and impactful. Participants benefited both individually and collectively, as a laboratory. Suggested improvements included increasing session duration, connectivity support, and including more subject-matter experts to broaden technical content. Further assessment is needed to evaluate the ECHO AMR’s impact on laboratory practices through observation and laboratory data. Virtual programs, requiring less time and resources than traditional in-country trainings, can be optimized and used to share and increase bacteriology knowledge in LMICs.
The recent World Health Organization (WHO) blueprint for dementia research and the Lancet Commission on ending stigma and discrimination in mental health have identified a gap around dementia-related measures of stigma and discrimination that can be used in different cultural, language and regional contexts.
Aims
We aimed to characterise experiences of discrimination, and report initial psychometric properties of a new tool to capture these experiences, among a global sample of people living with dementia.
Method
We analysed data from 704 people living with dementia who took part in a global survey from 33 different countries and territories. Psychometric properties were examined, including internal consistency and construct validity.
Results
A total of 83% of participants reported discrimination in one or more areas of life, and this was similar across WHO Regions. The exploratory factor analysis factor loadings and scree plot supported a unidimensional structure for the Discrimination and Stigma Scale Ultra Short for People Living with Dementia (DISCUS-Dementia). The instrument demonstrated excellent internal consistency, with most of the construct validity hypotheses being confirmed and qualitative responses demonstrating face validity.
Conclusions
Our analyses suggest that the DISCUS-Dementia performs well with a global sample of people living with dementia. This scale can be integrated into large-scale studies to understand factors associated with stigma and discrimination. It can also provide an opportunity for a structured discussion around stigma and discrimination experiences important to people living with dementia, as well as planning psychosocial services and initiatives to reduce stigma and discrimination.
This is the fourth comprehensive assessment of the population status of all wild bird species in Europe. It identifies Species of European Conservation Concern (SPECs) so that action can be taken to improve their status. Species are categorised according to their global extinction risk, the size and trend of their European population and range, and Europe’s global responsibility for them. Of the 546 species assessed, 207 (38%) are SPECs: 74 (14%) of global concern (SPEC 1); 32 (6%) of European concern and concentrated in Europe (SPEC 2); and 101 (18%) of European concern but not concentrated in Europe (SPEC 3). The proportion of SPECs has remained similar (38–43%) across all four assessments since 1994, but the number of SPEC 1 species of global concern has trebled. The 44 species assessed as Non-SPECs in the third assessment (2017) but as SPECs here include multiple waders, raptors and passerines that breed in arctic, boreal or alpine regions, highlighting the growing importance of northern Europe and mountain ecosystems for bird conservation. Conversely, the 62 species assessed as SPECs in 2017 but as Non-SPECs here include various large waterbirds and raptors that are recovering due to conservation action. Since 1994, the number of specially protected species (listed on Annex I of the EU Birds Directive) qualifying as SPECs has fallen by 33%, while the number of huntable (Annex II) species qualifying as SPECs has risen by 56%. The broad patterns identified previously remain evident: 100 species have been classified as SPECs in all four assessments, including numerous farmland and steppe birds, ducks, waders, raptors, seabirds and long-distance migrants. Many of their populations are heavily depleted or continue to decline and/or contract in range. Europe still holds 3.4–5.4 billion breeding birds, but more action to halt and reverse losses is needed.
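The SPEC category figures above are simple shares of the 546 species assessed, rounded to the nearest percent (a quick arithmetic check; variable names ours):

```python
total = 546
spec1, spec2, spec3 = 74, 32, 101   # global concern; European, concentrated; European, not concentrated
spec_total = spec1 + spec2 + spec3  # 207 SPECs in total

def pct(n: int, total: int = total) -> int:
    """Share of assessed species, rounded to the nearest percent."""
    return round(n / total * 100)

shares = [pct(spec_total), pct(spec1), pct(spec2), pct(spec3)]  # [38, 14, 6, 18]
```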