Adults living with obesity have a higher risk of eating disorders and disordered eating behaviours such as binge eating(1,2). However, the prevalence of disordered eating/eating disorders in adults presenting for obesity treatment is unknown, and this information is needed to guide service provision. This systematic review aimed to estimate the prevalence of disordered eating/eating disorders in adults presenting for obesity treatment. Embase, MEDLINE and PsycINFO were searched to March 2024. Eligible studies (k) measured disordered eating/eating disorders in adults with overweight/obesity presenting for obesity treatment and included ≥ 325 participants to ensure a representative sample. Prevalence estimates were synthesised using random-effects meta-analysis. In total, 81 studies were included (n = 92,002; 75.9% female; median (IQR) age 44 (6) years; BMI 45 (11) kg/m²). Most studies were conducted in the United States (k = 44) and Italy (k = 15). Most prevalence data related to binge eating disorder or binge eating severity. The pooled prevalence of binge eating disorder, assessed by clinical interview, was 17% (95% CI: 12–22; 95% prediction interval (PI): 0–42; k = 19; n = 13,447; τ² = 0.01) using DSM-IV criteria and 12% (95% CI: 5–20; 95% PI: 0–40; k = 9; n = 7,680; τ² = 0.01) using DSM-5 criteria. The pooled prevalence of severe binge eating (Binge Eating Scale score > 25) was 12% (95% CI: 8–16; 95% PI: 0–31; k = 18; n = 12,136; τ² = 0.01). For binge eating disorder, measured by clinical interview, prevalence ranged from 14.9 to 27.0% in females (k = 12) and from 4.0 to 24.1% in males (k = 3). For moderate to severe binge eating (Binge Eating Scale score ≥ 18), prevalence ranged from 20.0 to 32.8% in females and from 7.1 to 77.5% in males (k = 2). Three studies reported prevalence by ethnicity.
The prevalence of severe binge eating (Binge Eating Scale scores ≥ 27) was 9.5 to 41.7% in white populations (k = 2), 7.5 to 35.8% in black populations (k = 2), and 5.7% in Hispanic populations (k = 1). One study reported binge eating disorder, assessed by clinical interview, for white, black and Hispanic populations, with prevalences of 15.3%, 11.3% and 11.4%, respectively. Overall, there was high variability in the prevalence of binge eating and binge eating disorder in adults presenting for obesity treatment, with available data indicating that prevalence can range up to 42%. It is important to identify which population-level factors drive this heterogeneity to inform service provision; however, the limited data highlight a significant knowledge gap in the reporting of eating disorders in underrepresented populations, which needs to be addressed.
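The random-effects pooling used in this kind of review can be sketched as follows. This is a simplified DerSimonian–Laird estimator applied to raw proportions; the review's actual estimator and any transformation of proportions are not specified in the abstract, so the input values below are hypothetical illustrations, not the study's data.

```python
import math

def dersimonian_laird(p_hats, ns):
    """Random-effects pooled prevalence (DerSimonian-Laird), a simplified
    sketch on raw proportions. Returns (pooled, tau^2, 95% CI)."""
    # within-study variance of each observed proportion
    v = [p * (1 - p) / n for p, n in zip(p_hats, ns)]
    w = [1 / vi for vi in v]                      # fixed-effect weights
    p_fe = sum(wi * pi for wi, pi in zip(w, p_hats)) / sum(w)
    # Cochran's Q and between-study variance tau^2 (truncated at 0)
    q = sum(wi * (pi - p_fe) ** 2 for wi, pi in zip(w, p_hats))
    k = len(p_hats)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)
    # random-effects weights incorporate both variance components
    w_re = [1 / (vi + tau2) for vi in v]
    p_re = sum(wi * pi for wi, pi in zip(w_re, p_hats)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return p_re, tau2, (p_re - 1.96 * se, p_re + 1.96 * se)

# hypothetical study proportions and sample sizes
pooled, tau2, ci = dersimonian_laird([0.17, 0.12, 0.22, 0.08],
                                     [500, 800, 400, 600])
```

The pooled estimate always lies within the range of the study estimates, and τ² quantifies the between-study heterogeneity reported alongside the prevalence figures above.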
Hong Kong’s 3-year dynamic zero-COVID policy caused prolonged exposure to stringent, pervasive anti-epidemic measures, which posed additional stressors on emotional well-being through pandemic fatigue, beyond the existing fear of the pandemic.
Aims
To investigate how major shifts in the zero-COVID strategy corresponded with changing relationships between emotional well-being, pandemic fatigue from policy adherence and pandemic fear, from the pandemic peak to the adoption of a living-with-COVID policy.
Method
A three-wave repeated cross-sectional study (N = 2266) was conducted on the Chinese working-age population (18–64 years) during the peak outbreak (Wave 1), and subsequent policy shifts towards a living-with-COVID policy during the initial relaxation (Wave 2) and full relaxation (Wave 3) of anti-epidemic measures from March 2022 to March 2023. Non-parametric tests, consisting of robust analysis of covariance tests and quantile regression analysis, were performed.
Results
The severity of all measures declined after Wave 1; however, extreme pandemic fear reported in Wave 2 (n = 38, 7.7%) was associated with worse emotional well-being than at the pandemic peak (Wave 1), an effect that subsided in Wave 3. Pandemic fatigue had a greater negative effect on emotional well-being in Wave 1, whereas pandemic fear was the dominant predictor in Waves 2 and 3.
Conclusions
Pandemic fatigue and pandemic fear together robustly highlight the psychological cost of prolonged pandemic responses, expanding on a framework for monitoring and minimising the unintended mental health ramifications of anti-epidemic policies.
Non-native plants negatively impact ecosystems via a variety of mechanisms, including in forested riparian areas. Japanese knotweed [Polygonum cuspidatum Siebold & Zucc.] and its hybrids (hereafter Polygonum spp.) are widespread throughout North America and can impact the flora and fauna of riparian habitats. Thus, information improving our ability to understand and predict the potential spread and colonization of Polygonum spp. is valuable. One dispersal mechanism is hydrochory (i.e., dispersal by water), including the downstream dispersal of viable stems, which can facilitate rapid invasion within a watershed. We used passive integrated transponder (PIT) telemetry in experimental releases of Polygonum spp. stems to track their downstream transport in a small (second-order) stream in northern New Hampshire, USA, in the summers of 2021 and 2022. A total of 180 (90 each year) Polygonum spp. stems were released at three sites within the stream reach, with 185 (∼98%) being recaptured at least once, for a total of 686 recaptures. Individual relocated stems moved maximum distances of 30 to 875 m downstream in 2021 and 13 to 1,233 m in 2022 during regular flows; however, a high-streamflow event in July 2021 flushed all remaining stems downstream of the study area. Generalized additive mixed models (GAMMs) identified site-specific differences in stem movement rates and a general reduction in movement rates with time elapsed since release. In general, Polygonum spp. stems moved farther downstream in sites with lower channel sinuosity, although other fine-scale habitat factors (e.g., water depth, habitat type, and presence of wood and debris jams) likely influence whether Polygonum spp. stems disperse farther or are retained within the channel. Thus, stream morphology and streamflow are likely to affect where Polygonum spp. stems will be retained and potentially reestablish.
Predictive tools identifying areas of higher probability of hydrochory-based dispersal could help to focus removal efforts when employed or to identify riparian habitats at highest risk for spread.
People with dementia are more prone to premature nursing home placement after hospitalization due to physical and mental deconditioning, which makes care at home more difficult. This study aimed to evaluate the effect of a post-hospital-discharge transitional care program on reducing nursing home placement in people with dementia.
Methods:
A matched case-control study was conducted between 2018 and 2021. A transitional care program using a case management approach was developed. Participants enrolled in the program by self-enrolment or referral from hospitals or NGOs. Community-dwelling people with dementia discharged from hospitals received four weeks of residential care at a dementia care centre, with intensive nursing care, physiotherapy and group activities promoting social engagement, followed by eight weeks of day care rehabilitation activities to improve their mobility and cognitive functioning. They were matched on a 1:5 ratio by age and sex to people with dementia discharged from a convalescent hospital who did not participate in this program, for comparison. The study outcome was nursing home admission, measured three months (i.e. post-intervention), six months and nine months after hospital discharge. Multinomial logistic regression was conducted to investigate factors associated with nursing home placement at each measurement time-point.
Results:
361 hospital admission episodes (n=67 intervention, n=294 control) were examined. The regression results showed that participants in the intervention group were significantly less likely than controls to be admitted to a nursing home three months (OR = 0.023, 95% CI: 0.003-0.201, p = .001) and six months (OR = 0.094, 95% CI: 0.025-0.353, p = .001) after hospital discharge, but the intervention effect was not sustained at nine months after hospital discharge. Longer hospital length of stay, and hospital admission due to dementia, mental disturbances such as delirium, or mental disorders such as schizophrenia, significantly predicted nursing home admission three months and six months after hospital discharge.
Conclusion:
The transitional care program could help reduce nursing home placement in people with dementia after hospital discharge. To sustain the intervention effect, continued support after the intervention, as well as family caregiver training, would be required.
The possibility of ideography is an empirical question. Prior examples of graphic codes do not provide compelling evidence for the infeasibility of ideography, because they fail to satisfy essential cognitive requirements that have only recently been revealed by studies of representational systems in cognitive science. Design criteria derived from cognitive principles suggest how effective graphic codes may be engineered.
Bipolar disorder (BD) is a severe psychiatric disorder characterized by recurrent episodes of depression, hypomania or mania. Using mobile applications to record emotions can help BD patients self-manage and reduce emotional symptoms. Gamification applied in health-management applications can improve frequency of use and satisfaction. Plant-nurturing and horticultural therapy may also increase frequency of use and alleviate depression and anxiety.
Objectives
This study added plant-nurturing to a self-management application and explored users’ experiences.
Methods
A one-group pretest-posttest design with qualitative interviews was used. The analysis included frequency of use, emotional changes, and users’ feedback on the plant-nurturing feature during the first three months and after three months.
Results
A total of 26 participants were included. Frequency of use (number of sessions and proportion of days used) increased, but not significantly. Emotional symptoms showed no significant change. Positive experiences included novelty and interest, while the main negative experience was the slow rate of plant growth (Table 1).

Table 1. Demographics (N = 26)
Age: range 20–62, mean (SD) 37.0 (11.35)
Gender: female, n = 15 (57.7%)
Educational level: university, n = 14 (53.8%)
Employment situation: employed, n = 14 (53.8%)
Conclusions
There was a preliminary increase in use after adding plant-nurturing to the self-management application, but the effect should be examined in further research. More refined gamification elements, engaging users’ other sensory perceptions, could be included in self-management applications in the future. Meanwhile, to improve frequency of use and self-management in BD patients, users’ subjective experiences should be explored in depth.
Most research has explored the attrition rate and predictive factors for smartphone applications for emotion monitoring in patients with bipolar disorder. However, there has been less focus on whether an incentive system can help maintain the retention rate.
Objectives
The aim of our research was to evaluate the efficacy of two different incentive systems in increasing the frequency of use of the Smartphone Mood Relapse Warning application (MRW-APP) (Su et al., 2021) in bipolar patients.
Methods
A one-group pretest-posttest pilot study was conducted. Participants with bipolar disorder (n = 63) recorded their moods and symptoms through MRW-APP for 29 weeks, with an attrition rate of 44%. Two different incentive systems, reward and lottery, were implemented. To determine whether the incentives motivated participants to better adhere to the app, we used Friedman’s test and paired-sample t-tests to analyze participants’ app-using frequency in the corresponding weeks.
Results
There was no significant difference in the participants’ app-using frequency (p>.05) before and after we implemented the first incentive system, reward (n=63). For the second incentive system, lottery (n=41), a significant difference in app-using frequency was also not observed (p>.05) after the intervention. However, for those who had experienced both incentive systems (n=35), there were significant changes in their app-using frequency (p<.05).

Table 1. Demographics (n = 63)
Age (n = 55): mean 36.40, SD 11.10
Onset age (n = 48): mean 25.27, SD 9.35
Gender (n = 54): female, n = 33 (61.1%)
Educational level (n = 54): above undergraduate, n = 40 (74.1%)
SD = standard deviation
Conclusions
This research found that the two incentive systems, reward and lottery, may help increase the frequency of use of the smartphone monitoring app by participants with bipolar disorder. The results of our study can serve as a reference for the development of future mood-monitoring apps, and they also suggest that incentive systems have potential to encourage patients’ adherence to e-healthcare.
Many clinical trials leverage real-world data. Typically, these data are manually abstracted from electronic health records (EHRs) and entered into electronic case report forms (eCRFs), a time- and labor-intensive process that is also error-prone and may miss information. Automated transfer of data from EHRs to eCRFs has the potential to reduce data abstraction and entry burden as well as improve data quality and safety.
Methods:
We conducted a test of automated EHR-to-eCRF data transfer for 40 participants in a clinical trial of hospitalized COVID-19 patients. We determined which coordinator-entered data could be automated from the EHR (coverage) and the frequency with which values from the automated EHR feed exactly matched the values entered by study personnel for the actual study (concordance).
Results:
The automated EHR feed populated 10,081/11,952 (84%) of coordinator-completed values. For fields where both the automation and study personnel provided data, the values matched exactly 89% of the time. Concordance was highest for daily lab results (94%), which also required the most personnel resources (30 minutes per participant). In a detailed analysis of 196 instances where personnel-entered and automated values differed, both a study coordinator and a data analyst agreed that 152 (78%) instances were the result of data entry error.
Conclusions:
An automated EHR feed has the potential to significantly decrease study personnel effort while improving the accuracy of CRF data.
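The coverage and concordance metrics described above can be computed with a small sketch like the following. The field names, dictionary layout and exact-equality matching rule are illustrative assumptions, not the study's actual pipeline.

```python
def coverage_and_concordance(manual, automated):
    """Compare coordinator-entered values with an automated EHR feed.

    manual / automated: dicts mapping field name -> value (None if absent).
    coverage    = fraction of coordinator-entered fields the feed populated
    concordance = fraction of co-populated fields that match exactly
    """
    manual_fields = [f for f, v in manual.items() if v is not None]
    auto_filled = [f for f in manual_fields if automated.get(f) is not None]
    coverage = len(auto_filled) / len(manual_fields)
    matches = [f for f in auto_filled if automated[f] == manual[f]]
    concordance = len(matches) / len(auto_filled)
    return coverage, concordance

# hypothetical CRF fields for one participant
manual = {"age": 54, "sex": "F", "wbc": 7.2, "temp": 38.1}
automated = {"age": 54, "sex": "F", "wbc": 7.4}
cov, con = coverage_and_concordance(manual, automated)
```

In this toy example the feed covers 3 of 4 fields and 2 of those 3 match exactly; in practice a tolerance rule (e.g. for timestamps or rounded labs) might replace exact equality.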
Alcohol consumption is a risk factor for various comorbidities, such as cirrhosis, chronic sclerosing stomatitis, and neuropsychiatric disorders.
Objectives
Our study examined the associations between psychological factors and alcohol addiction in individuals with alcohol use disorder (AUD) in Southern Taiwan.
Methods
Demographic information, as well as suicidal history and sources of stress, was collected from 177 participants. The extent of alcoholism was assessed by the AUDIT questionnaire. Demographic and linear regression analyses were performed with Stata version 12.0 (StataCorp LP, College Station, TX, USA).
Results
Demographic data, suicidal history and the causes of stress of patients, divided by AUDIT scores, are shown in Table 1. Among the 177 participants, 17 (9.6%) had suicidal thoughts, 4 (2.3%) had a suicide plan, 22 (12.5%) had self-injured, and four-fifths of patients lived under pressure. Patients who self-harmed had significantly lower AUDIT scores, by -7.24 (95% CI: -11.49 to -3.00) (Table 2). The AUDIT scores of patients with physical stress, interpersonal difficulties and loneliness were significantly higher, by 6.71 (95% CI: 3.19 to 10.30), 6.14 (95% CI: 2.15 to 10.13) and 5.02 (95% CI: 0.93 to 9.11), respectively (Table 3).
Conclusions
Our findings indicated a negative correlation between alcohol use and self-inflicted injury. However, previous studies that systematically assessed the association between suicide and AUD considered alcohol an important risk factor for suicide, one related to mental health and affected by gender and drinking patterns. Our results may serve as a reference for estimating alcohol-related psychological effects in Taiwan.
Growing demographic pressure has brought an enormous demand for beef, and meeting it depends on the growth and development of Chinese cattle. To find molecular markers associated with growth and development in Chinese cattle, sequencing was used to locate copy number variations (CNVs), bioinformatics analysis was used to predict the function of the ZNF146 gene, real-time fluorescent quantitative polymerase chain reaction (qPCR) was used for CNV genotyping, and one-way analysis of variance was used for association analysis. Earlier sequencing results from our laboratory showed a CNV at Chr 18: 47225201-47229600 (assembly version 5.0.1) in the ZNF146 gene. Bioinformatics analysis predicted that ZNF146 is expressed in liver, skeletal muscle and breast cells and is amplified or overexpressed in pancreatic cancer, where it promotes tumour development; it is therefore predicted that ZNF146 affects the proliferation of muscle cells and, in turn, the growth and development of cattle. Furthermore, qPCR genotyping identified three CNV types of the ZNF146 gene (deletion, normal and duplication). The association analysis showed that ZNF146-CNV was significantly correlated with rump length in Qinchuan cattle, hucklebone width in Jiaxian red cattle and heart girth in Yunling cattle. These results indicate that ZNF146-CNV has a significant effect on growth traits, providing an important candidate molecular marker for the growth and development of Chinese cattle.
Glutamatergic dysfunction has been implicated in sensory integration deficits in schizophrenia, yet how glutamatergic function contributes to behavioural impairments and neural activities of sensory integration remains unknown.
Methods
Fifty schizophrenia patients and 43 healthy controls completed behavioural assessments for sensory integration and underwent magnetic resonance spectroscopy (MRS) for measuring the anterior cingulate cortex (ACC) glutamate levels. The correlation between glutamate levels and behavioural sensory integration deficits was examined in each group. A subsample of 20 pairs of patients and controls further completed an audiovisual sensory integration functional magnetic resonance imaging (fMRI) task. Blood Oxygenation Level Dependent (BOLD) activation and task-dependent functional connectivity (FC) were assessed based on fMRI data. Full factorial analyses were performed to examine the Group-by-Glutamate Level interaction effects on fMRI measurements (group differences in correlation between glutamate levels and fMRI measurements) and the correlation between glutamate levels and fMRI measurements within each group.
Results
We found that schizophrenia patients exhibited impaired sensory integration, which was positively correlated with ACC glutamate levels. Multimodal analyses showed significant Group-by-Glutamate Level interaction effects on BOLD activation as well as on task-dependent FC in a ‘cortico-subcortical-cortical’ network (including the medial frontal gyrus, precuneus, ACC, middle cingulate gyrus, thalamus and caudate), with positive correlations in patients and negative correlations in controls.
Conclusions
Our findings indicate that ACC glutamate influences neural activities in a large-scale network during sensory integration, but the effects have opposite directionality in schizophrenia patients and healthy people. This implicates a crucial role of the glutamatergic system in sensory integration processing in schizophrenia.
This paper presents a preliminary study into the spatial features that can be used to distinguish creativity and efficiency in design layouts, and the distinct patterns of cognitive and metacognitive activity associated with creative design. In a design experiment, a group of 12 architects were handed a design brief. Their drawing activity was recorded, and they were required to externalize their thoughts during the design process. Both the design solutions and verbal comments were analysed and modelled. A separate group of experienced architects used their expert knowledge to assign creativity and efficiency scores to the 12 design solutions. The design solutions were evaluated spatially. Protocol analysis studies, including linkography and macroscopic analysis, were used to discern distinctive patterns in the cognitive and metacognitive activity of the designs with the highest and lowest creativity scores. Entropy models of the linkographs and knowledge graphs were further introduced. Finally, we assessed how creativity and efficiency correlate with experiment variables, cognitive activity, metacognitive activity, the spatial and functional distribution of spaces in the design solutions, and the number and type of design constraints applied through the course of design. Through this investigation, we suggest that expert knowledge can be used to assess creativity and efficiency in designs. Our findings indicate that efficient layouts have distinct spatial features, and that cognitive and metacognitive activity in design that yields a highly creative outcome corresponds to higher frequencies of design moves and more linkages between design moves.
Capacity development is critical to long-term conservation success, yet we lack a robust and rigorous understanding of how well its effects are being evaluated. A comprehensive summary of who is monitoring and evaluating capacity development interventions, what is being evaluated and how, would help in the development of evidence-based guidance to inform design and implementation decisions for future capacity development interventions and evaluations of their effectiveness. We built an evidence map by reviewing peer-reviewed and grey literature published since 2000, to identify case studies evaluating capacity development interventions in biodiversity conservation and natural resource management. We used inductive and deductive approaches to develop a coding strategy for studies that met our criteria, extracting data on the type of capacity development intervention, evaluation methods, data and analysis types, categories of outputs and outcomes assessed, and whether the study had a clear causal model and/or used a systems approach. We found that almost all studies assessed multiple outcome types: most frequent was change in knowledge, followed by behaviour, then attitude. Few studies evaluated conservation outcomes. Less than half included an explicit causal model linking interventions to expected outcomes. Half of the studies considered external factors that could influence the efficacy of the capacity development intervention, and few used an explicit systems approach. We used framework synthesis to situate our evidence map within the broader literature on capacity development evaluation. Our evidence map (including a visual heat map) highlights areas of low and high representation in investment in research on the evaluation of capacity development.
The root-knot nematode, Meloidogyne javanica, is a major problem for the production of Sacha Inchi plants. We examined the effects of 3–4 years of strip intercropping of Sacha Inchi/Chinese leek on the seasonal dynamics of plant and soil traits in tropical China. Results indicated that in the intercropping system, a partial temporal divergence of belowground resource acquisition via niche separation occurred throughout the growing seasons, in addition to the complete spatial separation in plant height between the two crops. Compared with Sacha Inchi monoculture, the increased seed yield per unit area in the intercropping system was mainly attributed to a higher plant survival rate rather than to enhanced traits of healthy plants. Intercropping greatly suppressed M. javanica populations only in the wet season compared with monoculture, which may be associated with the combined effects of direct allelopathy and indigenous microbe-induced suppressiveness. Intercropping did not affect microbial richness or α-diversity in the rhizosphere, except for a decrease in fungal richness. Both bacterial and fungal composition and structure diverged between the monoculture and intercropping systems. The relative abundances of the dominant bacterial genera (Bacillus, Gaiellales, Lactococcus, Massilia, Lysobacter, etc.) differed significantly between the two cropping systems. For fungi, intercropping decreased the relative abundances of Fusarium and Gibberella but increased those of Nectriaceae_unclassified, Chaetomiaceae, Humicola and Mortierella. Overall, Sacha Inchi/Chinese leek intercropping suppressed M. javanica populations and shifted microbial composition (notably decreasing the pathogen-containing genus Fusarium). The increased yield and economic returns of this intercropping system provide useful guidance for effective agricultural management.
The risk of antipsychotic-associated cardiovascular and metabolic events may differ among countries, and limited real-world evidence has been available comparing the corresponding risks among children and young adults. We, therefore, evaluated the risks of cardiovascular and metabolic events in children and young adults receiving antipsychotics.
Methods
We conducted a multinational self-controlled case series (SCCS) study and included patients aged 6–30 years who had both exposure to antipsychotics and study outcomes, drawn from four nationwide databases in Taiwan (2004–2012), Korea (2010–2016), Hong Kong (2001–2014) and the UK (1997–2016), which together cover approximately 100 million individuals. We investigated three antipsychotic exposure windows (1–30, 31–90 and 90+ days of exposure), relative to the 90 days pre-exposure. The outcomes were cardiovascular events (stroke, ischaemic heart disease and acute myocardial infarction) or metabolic events (hypertension, type 2 diabetes mellitus and dyslipidaemia).
Results
We included a total of 48,515 individuals in the SCCS analysis. We found an increased risk of metabolic events only in the window with more than 90 days of exposure, with a pooled IRR of 1.29 (95% CI 1.20–1.38); the pooled IRR was 0.98 (0.90–1.06) for 1–30 days and 0.88 (0.76–1.02) for 31–90 days. We found no association in any exposure window for cardiovascular events: the pooled IRR was 1.86 (0.74–4.64) for 1–30 days, 1.35 (0.74–2.47) for 31–90 days and 1.29 (0.98–1.70) for 90+ days.
Conclusions
Long-term exposure to antipsychotics was associated with an increased risk of metabolic events but did not trigger cardiovascular events in children and young adults.
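One standard way per-database estimates like these are combined into a pooled IRR is inverse-variance weighting on the log scale, with standard errors recovered from the reported 95% CIs. The sketch below is a generic fixed-effect version with hypothetical inputs; it is not necessarily the pooling method the study used.

```python
import math

def pool_irr(irrs, ci_los, ci_his):
    """Fixed-effect inverse-variance pooling of rate ratios on the log scale.

    Standard errors are recovered from symmetric 95% CIs on the log scale:
    se = (ln(hi) - ln(lo)) / (2 * 1.96).
    Returns (pooled IRR, lower 95% bound, upper 95% bound).
    """
    logs = [math.log(r) for r in irrs]
    ses = [(math.log(h) - math.log(l)) / (2 * 1.96)
           for l, h in zip(ci_los, ci_his)]
    w = [1 / s ** 2 for s in ses]                 # inverse-variance weights
    pooled = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
    se = math.sqrt(1 / sum(w))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se),
            math.exp(pooled + 1.96 * se))

# hypothetical IRRs from two databases
irr, lo, hi = pool_irr([1.2, 1.4], [1.0, 1.1], [1.44, 1.78])
```

The pooled IRR lands between the database-specific estimates, weighted toward the more precise one; a random-effects variant would additionally widen the CI for between-database heterogeneity.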
Pooling of samples to detect the presence of a virus is an effective and efficient strategy for screening carriers in a large population with a low infection rate, reducing both cost and time. There are a number of pooling test methods, some simple and others complicated. In such pooling tests, the most important parameter to decide is the pool or group size, which can be optimised mathematically. Two pooling methods are relatively simple. The minimum numbers of tests required under these two methods for a population with a known infection rate are discussed and compared. The results are useful for identifying asymptomatic carriers in a short time and for implementing health code systems.
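For the classic two-stage (Dorfman) pooling scheme, the optimal group size can be found by minimising the expected number of tests per person: one test per pool of size k, plus k individual retests whenever the pool is positive. The sketch below illustrates this optimisation; the two methods the paper itself compares may differ in detail.

```python
def expected_tests_per_person(k, p):
    """Dorfman two-stage pooling with pool size k and prevalence p.

    Each pool of k samples costs 1 test; with probability 1 - (1-p)^k the
    pool is positive and all k members are retested individually, so the
    expected number of tests per person is 1/k + 1 - (1-p)^k.
    """
    return 1 / k + 1 - (1 - p) ** k

def optimal_pool_size(p, k_max=100):
    """Pool size minimising expected tests per person for prevalence p."""
    return min(range(2, k_max + 1),
               key=lambda k: expected_tests_per_person(k, p))

# at 1% prevalence, pools of 11 need fewer than 0.2 tests per person,
# an ~80% saving over individual testing
k = optimal_pool_size(0.01)
cost = expected_tests_per_person(k, 0.01)
```

As the infection rate rises, the optimal pool shrinks and the saving erodes, which is why pooling pays off mainly in low-prevalence screening.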
This SHEA white paper identifies knowledge gaps and challenges in healthcare epidemiology research related to coronavirus disease 2019 (COVID-19), with a focus on core principles of healthcare epidemiology. These gaps, revealed during the worst phases of the COVID-19 pandemic, are described in 10 sections: epidemiology, outbreak investigation, surveillance, isolation precaution practices, personal protective equipment (PPE), environmental contamination and disinfection, drug and supply shortages, antimicrobial stewardship, healthcare personnel (HCP) occupational safety, and return-to-work policies. Each section highlights three critical healthcare epidemiology research questions, with detailed descriptions provided in the supplementary materials. This research agenda calls for translational studies, from laboratory-based basic science research to well-designed, large-scale studies and health outcomes research. Research gaps and challenges related to nursing homes and social disparities are included. Collaborations across disciplines, areas of expertise and diverse geographic locations will be critical.
The volume of evidence from scientific research and wider observation is greater than ever before, but much is inconsistent and scattered in fragments over increasingly diverse sources, making it hard for decision-makers to find, access and interpret all the relevant information on a particular topic, resolve seemingly contradictory results or simply identify where there is a lack of evidence. Evidence synthesis is the process of searching for and summarising a body of research on a specific topic in order to inform decisions, but is often poorly conducted and susceptible to bias. In response to these problems, more rigorous methodologies have been developed and subsequently made available to the conservation and environmental management community by the Collaboration for Environmental Evidence. We explain when and why these methods are appropriate, and how evidence can be synthesised, shared, used as a public good and benefit wider society. We discuss new developments with potential to address barriers to evidence synthesis and communication and how these practices might be mainstreamed in the process of decision-making in conservation.
To describe the infection control preparedness measures undertaken in Hong Kong for coronavirus disease 2019 (COVID-19), caused by SARS-CoV-2 (previously known as 2019 novel coronavirus), in the first 42 days after the announcement of a cluster of pneumonia in China on December 31, 2019 (day 1).
Methods:
A bundled approach of active and enhanced laboratory surveillance, early airborne infection isolation, rapid molecular diagnostic testing, and contact tracing for healthcare workers (HCWs) with unprotected exposure in the hospitals was implemented. Epidemiological characteristics of confirmed cases, environmental samples, and air samples were collected and analyzed.
Results:
From day 1 to day 42, 42 of 1,275 patients (3.3%) fulfilling the criteria for active (n = 29) or enhanced laboratory surveillance (n = 13) were confirmed to have SARS-CoV-2 infection. The number of locally acquired cases increased significantly, from 1 of 13 confirmed cases (7.7%, day 22 to day 32) to 27 of 29 confirmed cases (93.1%, day 33 to day 42; P < .001). Among them, 28 patients (66.6%) came from 8 family clusters. Of 413 HCWs caring for these confirmed cases, 11 (2.7%) had unprotected exposure requiring quarantine for 14 days. None of them was infected, and nosocomial transmission of SARS-CoV-2 was not observed. Environmental surveillance was performed in the room of a patient with viral loads of 3.3 × 10⁶ copies/mL (pooled nasopharyngeal and throat swabs) and 5.9 × 10⁶ copies/mL (saliva). SARS-CoV-2 was identified in 1 of 13 environmental samples (7.7%) but not in 8 air samples collected at a distance of 10 cm from the patient’s chin, with or without a surgical mask being worn.
Conclusion:
Appropriate hospital infection control measures were able to prevent nosocomial transmission of SARS-CoV-2.