Childhood trauma is a well-established risk factor for psychosis, paranoia, and substance use, with cannabis being a modifiable environmental factor that exacerbates these vulnerabilities. This study examines the interplay between childhood trauma, cannabis use, and paranoia using standard tetrahydrocannabinol (THC) units as a comprehensive measure of cannabis exposure.
Methods
Data were derived from the Cannabis&Me study, an observational, cross-sectional, online survey of 4,736 participants. Childhood trauma was assessed using a modified Childhood Trauma Screen Questionnaire, while paranoia was measured via the Green Paranoid Thoughts Scale. Cannabis use was quantified using weekly standard THC units. Structural equation modeling (SEM) was employed to evaluate direct and indirect pathways between trauma, cannabis use, and paranoia.
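The mediated pathway described above (childhood trauma → weekly THC units → paranoia) can be illustrated with a minimal product-of-coefficients sketch in Python; the study itself used full structural equation modeling, and the column names and simulated data below are illustrative assumptions rather than the Cannabis&Me variables.

```python
# Hedged sketch: product-of-coefficients mediation analogous to the SEM pathway
# trauma -> weekly THC units -> paranoia. Variable names are illustrative, not
# the study's actual columns; the paper used full SEM, not this shortcut.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
# Simulated stand-in data (the real Cannabis&Me dataset is not reproduced here).
trauma = rng.normal(size=n)
thc_units = 0.3 * trauma + rng.normal(size=n)                      # a path
paranoia = 0.5 * trauma + 0.2 * thc_units + rng.normal(size=n)     # c' and b paths
df = pd.DataFrame({"trauma": trauma, "thc_units": thc_units, "paranoia": paranoia})

a = smf.ols("thc_units ~ trauma", df).fit().params["trauma"]       # trauma -> cannabis
fit_b = smf.ols("paranoia ~ trauma + thc_units", df).fit()
b = fit_b.params["thc_units"]                                      # cannabis -> paranoia
direct = fit_b.params["trauma"]                                    # direct effect
indirect = a * b                                                   # mediated effect
print(f"direct={direct:.3f}, indirect={indirect:.3f}")
```

In an SEM framework the two regressions are fitted simultaneously, and the indirect effect corresponds to the product of the trauma-to-cannabis and cannabis-to-paranoia paths.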
Results
Childhood trauma was strongly associated with paranoia, particularly emotional and physical abuse (β = 16.10, q < 0.001 and β = 16.40, q < 0.001, respectively). Cannabis use significantly predicted paranoia (β = 0.009, q < 0.001). Interactions emerged between standard THC units and both emotional abuse (β = 0.011, q < 0.001) and household discord (β = 0.011, q < 0.001). SEM revealed a small but significant indirect effect of trauma on paranoia via cannabis use (β = 0.004, p = 0.017).
Conclusions
These findings highlight childhood trauma as a primary driver of paranoia, with cannabis use amplifying its effects. While trauma had a strong direct impact, cannabis played a significant mediating role. Integrating standard THC units into psychiatric research and clinical assessments may enhance risk detection and refine intervention strategies, particularly for childhood trauma-exposed individuals.
We present a financial justification for an outpatient infectious diseases pharmacist, based on cost savings from decreases in length of stay for patients with Staphylococcus aureus infections and additional revenue generated by physicians and pharmacists while following patients discharged on outpatient parenteral antimicrobial therapy.
About 13% of pregnant women with substance use disorder (SUD) receive treatment, and many may encounter challenges in accessing perinatal care, making it critical for this population to receive uninterrupted care during a global pandemic.
Methods
From October 2021 to January 2022, we conducted an online survey of pregnant and postpartum women and interviews with clinicians who provide care to this population. The survey was administered to pregnant and postpartum women who used substances or received SUD treatment during the COVID-19 pandemic.
Results
Two hundred and ten respondents completed the survey. All respondents experienced pandemic-related barriers to routine health care services, including delays in prenatal care and SUD treatment. Disruptions in treatment were due to patient factors (38.2% canceled an appointment) and clinic factors (25.5% had a clinic cancel their appointment). Respondents were generally satisfied with telehealth (M = 3.97, SD = 0.82), though half preferred a combination of in-person and telehealth visits. Clinicians reported that telehealth improved health care access for patients; however, barriers were still observed.
Conclusions
Although strategies were employed to mitigate barriers in care during COVID-19, pregnant and postpartum women who used substances still experienced barriers in receiving consistent care. Telehealth may be a useful adjunct to enhance care access for pregnant and postpartum women during public health crises.
This study presents the black hole accretion history of obscured active galactic nuclei (AGNs) identified from the JWST CEERS survey by Chien et al. (2024) using mid-infrared (MIR) SED fitting. We compute black hole accretion rates (BHARs) to estimate the black hole accretion density (BHAD), $\rho_{L_{\text{disk}}}$, across $0 \lt z \lt 4.25$. MIR luminosity functions (LFs) are also constructed for these sources, modeled with modified Schechter and double power law forms, and the corresponding BHAD, $\rho_{\text{LF}}$, is derived by integrating the LFs weighted by luminosity. Both $\rho_{\text{LF}}$ estimates extend to luminosities as low as $10^7 \, {\rm L}_{\odot}$, two orders of magnitude fainter than pre-JWST studies. Our results show that the BHAD peaks between redshifts 1 and 3, with the peak varying by method and model: $z \simeq$ 1 - 2 for $\rho_{L_{\text{disk}}}$ and the double power law, and $z \simeq$ 2 - 3 for the modified Schechter function. A scenario in which AGN activity peaks before cosmic star formation would challenge existing black hole formation theories, but our present study, based on early JWST observations, provides an initial exploration of this possibility. At $z \sim 3$, $\rho_{\text{LF}}$ appears higher than X-ray estimates, suggesting that MIR observations are more effective at detecting obscured AGNs missed by X-ray observations. However, given the overlapping error bars, this difference remains within the uncertainties and requires confirmation with larger samples. These findings highlight the potential of JWST surveys to enhance our understanding of the co-evolution between galaxies and AGNs.
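As a hedged illustration of how a BHAD-type quantity follows from a luminosity function, the sketch below numerically integrates $L\,\Phi(L)$ over $\log L$ for a double power-law form. The parameter values are placeholders rather than the fitted values from this study, and the final conversion from luminosity density to accretion-rate density (via an assumed radiative efficiency) is not shown.

```python
# Hedged sketch: luminosity density from a double power-law luminosity function.
# Parameter values are placeholders, not the study's fitted values.
import numpy as np
from scipy.integrate import quad

def double_power_law(logL, logL_star=11.0, phi_star=1e-5, alpha=0.5, beta=2.5):
    """Phi(logL) in Mpc^-3 dex^-1 for an assumed double power-law LF."""
    x = 10 ** (logL - logL_star)
    return phi_star / (x ** alpha + x ** beta)

# Integrate L * Phi(logL) d(logL) from 10^7 to 10^13 L_sun.
integrand = lambda logL: 10 ** logL * double_power_law(logL)
lum_density, _ = quad(integrand, 7, 13)
print(f"luminosity density ~ {lum_density:.3e} L_sun Mpc^-3")
```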
The outer solar system is theoretically predicted to harbour an undiscovered planet, often referred to as Planet Nine. Simulations suggest that its gravitational influence could explain the unusual clustering of minor bodies in the Kuiper Belt. However, no observational evidence for Planet Nine has been found so far, as its predicted orbit lies far beyond Neptune, where it reflects only a faint amount of sunlight. This work aims to find Planet Nine candidates by taking advantage of two far-infrared all-sky surveys, IRAS and AKARI. The epochs of these two surveys were separated by 23 years, long enough to detect Planet Nine's $\sim3'$/year orbital motion. We use a dedicated AKARI far-infrared point source list for our Planet Nine search, the AKARI-FIS Monthly Unconfirmed Source List (AKARI-MUSL), which includes sources detected repeatedly on hour timescales but not re-detected after months. AKARI-MUSL is more advantageous than the AKARI Bright Source Catalogue (AKARI-BSC) for detecting moving and faint objects like Planet Nine because its flux detection limit is twice as deep. We search for objects that moved slowly between the IRAS and AKARI detections given in the catalogues. First, we estimated the expected flux and orbital motion of Planet Nine by assuming its mass, distance, and effective temperature to ensure it could be detected by IRAS and AKARI, then applied positional and flux selection criteria to narrow down the number of sources from the catalogues. Next, we produced all possible candidate pairs, each comprising one IRAS source and one AKARI source, whose angular separations were limited to between $42'$ and $69.6'$, corresponding to a heliocentric distance range of 500 – 700 AU and a mass range of 7 – 17 M$_{\oplus}$. Thirteen candidate pairs remained after applying the selection criteria. After image inspection, we found one good candidate, for which the IRAS source is absent from the same coordinates in the AKARI image taken 23 years later and vice versa. However, the AKARI and IRAS detections alone are not enough to determine the full orbit of this candidate, so follow-up observations are needed to determine the Keplerian motion of our Planet Nine candidate.
A terrain and path following control scheme is designed for a ground detection mission of a fixed-wing unmanned aerial vehicle (UAV) considering an attitude constraint. The attitude of the UAV should be maintained within a desired range for efficient exploration; violating this range leads to degradation of mission performance. The proposed controller keeps the attitude of the UAV within the desired range, which alleviates this degradation. The proposed algorithm consists of a guidance law and a nonlinear flight path controller. The guidance law is designed by combining a terrain-following altitude controller and a horizontal path following controller based on the Lyapunov control scheme. The command generated by the guidance law is used as a reference input to be followed by the flight path controller. The flight path controller is designed considering the attitude constraint. In particular, the roll and pitch angles of the UAV are treated as attitude constraints so that these angles remain within the desired range. To design a flight path controller satisfying the attitude constraint, the control system is decomposed into three feedback loops. State-feedback controllers are designed using the sliding mode control scheme for flight path control in the outermost loop as well as for angular rate control in the inner loop. In the second-outer loop, a quadratic programming (QP)-based controller is designed to control the sideslip angle while satisfying the attitude constraint. A control Lyapunov function is adopted to determine the QP constraint for sideslip angle control, and a control barrier function is used to obtain the QP constraint for the attitude constraint. Numerical simulation is performed to demonstrate the effectiveness of the proposed algorithm.
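A minimal sketch of the kind of CLF/CBF quadratic program described above is given below, reduced to a single-integrator roll channel; the paper's actual vehicle dynamics, loop structure, and gains are not reproduced, and names such as phi_max, u_nom, gamma, and alpha are illustrative assumptions.

```python
# Hedged sketch of a CLF/CBF quadratic program on a single-integrator roll
# channel (phi_dot = u). All numbers and names are illustrative assumptions.
import cvxpy as cp

phi = 0.35          # current roll angle [rad]
phi_ref = 0.0       # reference roll from the guidance law
phi_max = 0.4       # attitude constraint bound [rad]
u_nom = -1.0        # nominal command from the outer-loop controller
gamma, alpha = 2.0, 5.0

u = cp.Variable()
delta = cp.Variable(nonneg=True)   # CLF slack keeps the QP feasible

V = 0.5 * (phi - phi_ref) ** 2
h = phi_max ** 2 - phi ** 2
clf = (phi - phi_ref) * u <= -gamma * V + delta     # tracking (control Lyapunov function)
cbf = -2.0 * phi * u >= -alpha * h                  # safety (control barrier function)

prob = cp.Problem(cp.Minimize(cp.square(u - u_nom) + 10.0 * cp.square(delta)),
                  [clf, cbf])
prob.solve()
print("roll-rate command:", u.value)
```

The CLF inequality drives the state toward the reference (softened by the slack variable), while the CBF inequality keeps the roll angle inside the constraint set even when the nominal command would violate it.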
Transcranial direct current stimulation (tDCS) is a promising treatment for major depressive disorder (MDD). This study evaluated its antidepressant and cognitive effects as a safe, effective, home-based therapy for MDD.
Methods
This double-blind, sham-controlled, randomized trial divided participants into low-intensity (1 mA, n = 47), high-intensity (2 mA, n = 49), and sham (n = 45) groups, receiving 42 daily tDCS sessions, including weekends and holidays, targeting the dorsolateral prefrontal cortex for 30 minutes. Assessments were conducted at baseline and weeks 2, 4, and 6. The primary outcome was cognitive improvement assessed by changes in total accuracy on the 2-back test from baseline to week 6. Secondary outcomes included changes in depressive symptoms (HAM-D), anxiety (HAM-A), and quality of life (QLES). Adverse events were monitored. This trial was registered with ClinicalTrials.gov (NCT04709952).
Results
Of the 141 participants (102 [72.3%] women; mean age 35.7 years, standard deviation 12.7), 95 completed the trial. Mean changes in total accuracy scores from baseline to week 6 were compared across the three groups using an F-test, and linear mixed-effects models examined the group-by-time interaction. Results showed no significant differences among groups in cognitive or depressive outcomes at week 6. The active groups experienced more mild adverse events than the sham group but had similar rates of severe adverse events and dropout.
Conclusions
Home-based tDCS for MDD demonstrated no evidence of effectiveness but was safe and well tolerated. Further research is needed to address the technical limitations, evaluate broader cognitive functions, and extend treatment durations to evaluate its therapeutic potential.
People with mental illness often have a concealable stigmatized identity that may be invisible to others. As a result, they are often faced with the dilemma of whether to disclose or conceal their diagnosis and their experiences. However, in order to overcome the social stigma and self-stigma that hinder their recovery, they must establish a network and social support through identity disclosure.
Objectives
This study investigates the effects of clinical characteristics (symptom and social functioning levels), self-stigma, and social support on identity disclosure among people with mental illness.
Methods
The research was conducted with 236 respondents who are currently using community mental health services (male: 51.9%, female: 48.1%; mean age = 47.97 ± 13.24; SPR: 66.8%, other diagnoses: 33.2%).
Results
Most respondents had disclosed their mental illness to health service providers and family, but were least open about their identity to neighbors and co-workers. A regression analysis of predictors of disclosure revealed that only social functioning level and social support had significant predictive power: individuals with higher levels of social functioning and social support disclosed more about their mental illness.
Conclusions
Programs that strengthen social functioning and support networks can be recommended to improve disclosure efficacy.
Human faces generally attract immediate attention. However, it has been found that children with autism spectrum disorder (ASD) tend to allocate relatively less attention to faces. Previous research showed that typically developing (TD) children exhibited an attentional bias to angry faces, regardless of their anxiety levels, but it is unclear whether this applies to children with ASD. Therefore, the present study aims to investigate attentional bias induced by angry and/or happy faces in children with ASD.
Objectives
We explored attentional bias toward angry faces in both TD children and children with ASD. We hypothesize that while TD children will show attentional capture effects in response to angry faces, children with ASD will not exhibit such attentional bias to facial stimuli, irrespective of their emotional content.
Methods
To date, five ASD participants (all male) and 34 TD participants (17 male), aged 6-12, have completed a continuous performance task. In this task, irrelevant distractors (angry or happy faces) appeared and disappeared abruptly, while the orientation of the target changed every 1,250 ms. Participants were asked to respond as quickly and accurately as possible to the orientation of the target. We designated the time when the distractor first appeared as T1, and subsequent time intervals at 1,250 ms increments were labeled as T2, T3, and T4. The time intervals when no distractor was present were labeled as TB (baseline). If the reaction time (RT) at T1 was significantly slower compared to TB, this indicated attentional bias induced by the distractor.
Results
For the RT data, separate repeated measures ANOVAs with 2 (emotion) × 5 (time) factors were conducted for each group. The results revealed a significant main effect of time (F(4, 132) = 17.59, p < .01) and a significant interaction between emotion and time (F(3.27, 107.74) = 4.92, p < .01) only in the TD group. Post hoc t-tests indicated that TD children exhibited significantly slower RTs at T1 compared to TB, but this difference was observed only for angry faces (t(33) = 4.84, p < .01). In contrast, no significant effect was found in children with ASD. In other words, TD children demonstrated attentional bias only when exposed to angry faces, while children with ASD did not exhibit attentional bias to either emotion.
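A hedged sketch of this 2 (emotion) × 5 (time) repeated-measures ANOVA, using statsmodels with simulated reaction times in place of the study's data, is shown below; the column names and the injected slowdown at T1 for angry faces are illustrative assumptions.

```python
# Hedged sketch of a 2 (emotion) x 5 (time) repeated-measures ANOVA per group.
# Column names and simulated reaction times are illustrative, not the study's data.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(1)
subjects = range(34)                       # e.g. the TD group
emotions = ["angry", "happy"]
times = ["TB", "T1", "T2", "T3", "T4"]

rows = []
for s in subjects:
    for e in emotions:
        for t in times:
            slow = 40 if (e == "angry" and t == "T1") else 0   # simulated capture effect
            rows.append({"subject": s, "emotion": e, "time": t,
                         "rt": 550 + slow + rng.normal(scale=30)})
df = pd.DataFrame(rows)

res = AnovaRM(df, depvar="rt", subject="subject", within=["emotion", "time"]).fit()
print(res)
```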
Conclusions
This study aimed to investigate attentional bias to angry faces in both TD and ASD children. The results indicate that TD children exhibited an attentional bias when exposed to angry faces, whereas ASD children did not display such a bias. These findings are consistent with previous research suggesting that TD children tend to show attentional bias towards angry faces, regardless of their anxiety levels. Furthermore, the absence of attentional bias to angry faces in ASD suggests that these children's characteristically reduced attention to faces may contribute to the lack of attentional bias towards angry faces.
Suicide has a complex relationship with several factors, and it is known that identifying high-risk groups and managing crises in advance can help prevent suicide. Moreover, previous studies have shown that people with chronic diseases often suffer from psychological difficulties such as depression and anxiety, which can contribute to suicide. Based on many studies of the relationship between diabetes and depression, 10% of diabetic patients experience major depression, and diabetic patients experience depression twice as often as the general population. However, few studies have examined the relationship between diabetes and suicide risk, and most of them targeted type 1 diabetes only.
Objectives
The objectives of this study were to investigate suicide risk in diabetic patients and to evaluate whether suicide risk varies by the duration of diabetes, using a large population sample in South Korea.
Methods
Using the 2019 Korea National Health and Nutrition Examination Survey data, 6,296 adults (aged 19 years or older) were included. Suicidal ideation, suicidal plans, and suicidal behavior of diabetic patients were compared with those of the general population. After classifying patients by duration of diabetes into ≤ 1 year, 2 to 9 years, and ≥ 10 years, we evaluated the relationship between the duration of diabetes and the risk of suicide.
Results
Diabetic patients had a higher prevalence of suicidal ideation (9.1%, P < 0.001) and suicidal plans (3.6%, P < 0.001) than the general population. After adjusting for potential confounding factors, suicidal plans (aOR = 3.011, 95% CI = 1.392-6.512) remained significantly associated with diabetes. In the group with a diabetes duration of 2 to 9 years, we found an increased risk of suicidal ideation (aOR = 2.068, 95% CI = 1.219-3.510), suicidal plans (aOR = 3.640, 95% CI = 1.592-8.320), and suicidal behavior (aOR = 6.222, 95% CI = 1.759-22.008) after adjusting for covariates. However, an increase in suicide risk was not observed in the ≤ 1 year and ≥ 10 years groups.
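As a hedged illustration, adjusted odds ratios of this kind can be reproduced in form (not in value) with a covariate-adjusted logistic regression; the simulated rows and covariates below stand in for the KNHANES variables and do not reflect the survey's sampling weights.

```python
# Hedged sketch of an adjusted odds ratio (aOR) from logistic regression.
# Variable names and simulated rows are illustrative; KNHANES covariates and
# survey weighting are not reproduced.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 6296
df = pd.DataFrame({
    "diabetes": rng.binomial(1, 0.1, n),
    "age": rng.integers(19, 80, n),
    "sex": rng.binomial(1, 0.5, n),
})
logit_p = -3.5 + 1.1 * df["diabetes"] + 0.01 * df["age"]
df["suicidal_plan"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

fit = smf.logit("suicidal_plan ~ diabetes + age + sex", df).fit(disp=False)
aor = np.exp(fit.params["diabetes"])
ci = np.exp(fit.conf_int().loc["diabetes"])
print(f"aOR={aor:.2f}, 95% CI=({ci[0]:.2f}, {ci[1]:.2f})")
```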
Conclusions
In adults, diabetes is associated with an increased risk of suicide. Suicide risk in diabetic patients shows an inverted U-shaped pattern depending on the duration of diabetes.
People experience various negative emotions when they encounter stressful events, and these negative emotions contribute to the onset of illness. These emotional responses are not limited to just one; a person can experience multiple emotions at once, and the primary emotional reactions can vary depending on the severity and duration of the illness or life events. This is why we created a self-report scale to assess short-term emotional responses, focusing on the current emotional state subjectively experienced by patients.
Objectives
The purpose of this study was to develop an affective response scale (ARS) and examine its validity and reliability.
Methods
We established clusters of affective responses via a literature review and developed preliminary items based on this structure. We conducted expert content validation to converge on the final items, followed by construct validity and reliability analyses.
Results
The research findings indicate that the Affective Response Scale comprises three main dimensions: anxiety, anger, and depression. Content validity results confirmed the validity of most items. The scale was found to be valid in both exploratory and confirmatory factor analyses and was shown to be stable and consistent in the internal reliability analysis.
Conclusions
These results indicate that the ARS is highly reliable and valid, and that it can be utilized as an effective measure of the patient’s emotion and its severity.
Accordingly, the Korean Medication Algorithm Project for Bipolar Disorder (KMAP-BP) working committee, composed of domestic experts, developed Korea's first KMAP-BP in 2002; revised versions were subsequently announced every four years, in 2006, 2010, 2014, and 2018. The treatment strategies of KMAP-BP 2022 that consider safety and tolerability were developed by collecting opinions from domestic bipolar disorder experts.
Objectives
Safety and tolerability of drugs are very important factors in the treatment of bipolar disorder. An expert opinion survey was conducted on treatment strategies in various special clinical situations, such as significant weight gain, characteristic drug side effects, low drug adherence, pregnant women and women of reproductive age, and genetic counseling.
Methods
A written survey about treatment strategies related to safety and tolerability was prepared, focusing on significant weight gain, characteristic drug side effects, low drug adherence, pregnant women and women of reproductive age, and genetic counseling. Ninety-three experts on the review committee completed the survey.
Results
When weight gain occurred during drug treatment, the preferred strategy was to switch to a drug that causes less weight gain, such as lamotrigine, aripiprazole, or ziprasidone. If there was significant weight gain due to the treatment drug, early intervention was preferred. In the case of hyperprolactinemia, changing the medication was the preferred option, and discontinuation was preferred for a benign rash caused by lamotrigine. For improving drug adherence, the preference for long-acting injections increased. Antipsychotics can be used, with great caution, in pregnant women and women of reproductive age.
Conclusions
Treatment strategies in various clinical situations related to safety and tolerability in drug treatment for bipolar disorder were described. It is hoped that they will be useful in practical clinical situations.
It has been several years since the World Health Organization (WHO) advocated shared decision-making (SDM) models when developing treatment plans for individuals with mental illness, emphasizing the importance of actively involving patients in expressing their opinions and sharing treatment-related information. However, few clinicians accept patients' subjective views in clinical practice. Given that patients' subjective beliefs about their symptoms significantly impact treatment satisfaction, prognosis, and adherence, it is essential to assess these perceptions. However, few studies have been conducted to assess patients' subjective beliefs, that is, their mental representations of their disease. Therefore, this study aims to develop an interview that enables the utilization of patients' cognitive representations of their mental illnesses in clinical practice.
Objectives
The primary objective of this study is to develop a semi-structured interview and a self-report scale to evaluate patients' mental representations of their illnesses. The secondary objective is to validate the reliability and validity of these tools as psychological assessments.
Methods
An initial structure for both the semi-structured interview and the self-report scale was established through a literature review of existing illness representation measures. Subsequently, expert panel discussions and further literature reviews were conducted to refine the structure and content of both tools. Content validity for both the interview and the self-report scale was assessed by a panel of nine experts and a group of ten students. Following this, the developed interview tool was subjected to a validity analysis with clinical patients using Messick's six validity criteria (content, substantive, structural, generalizability, external, and consequential).
Results
Content validity index (CVI) values for the overall structure indicated that all subdomains scored above 0.8, demonstrating the appropriateness of the interview tool's five subdomains: symptoms, causes, temporal aspects, impact, and treatment and control. Content validity assessment for individual items revealed that some items within the "causes of the disease" subdomain, specifically stress-related factors, scored below 0.6, prompting item modifications. All other items achieved CVI scores of 0.6 or higher. Face validity assessment yielded favorable results for all items in the self-report scale. All validity criteria were demonstrated to be satisfactory.
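A small sketch of how item-level and scale-level content validity indices of this kind are typically computed is given below; the ratings are invented, and the 3-or-4 relevance cutoff is a common convention rather than necessarily the exact procedure used in this study.

```python
# Hedged sketch of an item-level content validity index (CVI): the proportion
# of expert raters scoring an item as relevant (e.g. 3 or 4 on a 4-point
# scale). Ratings below are illustrative, not the study's data.
import numpy as np

# rows = items, columns = the nine expert raters, values = relevance ratings 1-4
ratings = np.array([
    [4, 4, 3, 4, 4, 3, 4, 4, 4],
    [2, 3, 2, 3, 2, 3, 2, 2, 3],   # a weaker item, e.g. a stress-related cause
])
item_cvi = (ratings >= 3).mean(axis=1)
scale_cvi = item_cvi.mean()        # S-CVI/Ave: average of item-level CVIs
print(item_cvi, scale_cvi)
```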
Conclusions
This study has provided evidence that the developed tools are reliable and valid instruments for measuring patients’ perceptions of their illnesses, offering a trustworthy means to assess these vital cognitive representations in clinical practice.
Structured processes to improve the quality and impact of clinical and translational research are a required element of the Clinical and Translational Sciences Awards (CTSA) program and are central to awardees' strategic management efforts. Quality improvement is often assumed to be an ordinary consequence of evaluation programs, in which standardized metrics are tabulated and reported externally. Yet evaluation programs may not actually be very effective at driving quality improvement: required metrics may lack direct relevance; they provide little incentive to improve in areas of relative strength; and the validity of inter-site comparisons may be limited. In this article, we describe how we convened leaders at our CTSA hub in an iterative planning process to improve the quality of our CTSA program by intentionally focusing on how data collection activities can primarily advance continuous quality improvement (CQI) rather than strictly serve as evaluative tools. We describe our CQI process, which consists of three key components: (1) logic models outlining goals and associated mechanisms; (2) relevant metrics to evaluate performance improvement opportunities; and (3) an interconnected and collaborative CQI framework that defines actions and timelines to enhance performance.
Background: CHAMPION-NMOSD (NCT04201262) is an ongoing global, open-label, phase 3 study evaluating ravulizumab in AQP4+ NMOSD. Methods: Adult patients received an intravenous, weight-based loading dose of ravulizumab on day 1 and a maintenance dose on day 15 and every 8 weeks thereafter. Following a primary treatment period (PTP; up to 2.5 years), patients could enter a long-term extension (LTE). Results: 58 patients completed the PTP; 56 entered and 2 completed the LTE. As of June 16, 2023, median (range) follow-up was 138.4 (11.0-183.1) weeks for ravulizumab (n=58), with 153.9 patient-years. Across the PTP and LTE, no patients had an adjudicated on-trial relapse during ravulizumab treatment. 91.4% (53/58) of patients had a stable or improved Hauser Ambulation Index score, and 91.4% (53/58) had no clinically important worsening in Expanded Disability Status Scale score. The incidence of treatment-emergent adverse events (TEAEs) and serious adverse events was 94.8% and 25.9%, respectively. Most TEAEs were mild to moderate in severity and unrelated to ravulizumab. TEAEs leading to withdrawal from ravulizumab occurred in 1 patient. Conclusions: Ravulizumab demonstrated long-term clinical benefit in the prevention of relapses in AQP4+ NMOSD, with a safety profile consistent with prior analyses.
Background: After a transient ischemic attack (TIA) or minor stroke, the long-term risk of subsequent stroke is uncertain. Methods: Electronic databases were searched for observational studies reporting subsequent stroke during a minimum follow-up of 1 year in patients with TIA or minor stroke. Unpublished data on the number of stroke events and exact person-time at risk contributed by all patients during discrete time intervals of follow-up were requested from the authors of included studies. This information was used to calculate the incidence of stroke in individual studies, and results across studies were pooled using random-effects meta-analysis. Results: Fifteen independent cohorts involving 129,794 patients were included in the analysis. The pooled incidence rate of subsequent stroke per 100 person-years was 6.4 events in the first year and 2.0 events in the second through tenth years, with cumulative incidences of 14% at 5 years and 21% at 10 years. Based on 10 studies with information available on fatal stroke, the pooled case fatality rate of subsequent stroke was 9.5% (95% CI, 5.9-13.8). Conclusions: One in five patients is expected to experience a subsequent stroke within 10 years after a TIA or minor stroke, and roughly one in ten of those subsequent strokes is expected to be fatal.
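A hedged sketch of random-effects pooling of incidence rates on the log scale (DerSimonian-Laird) is shown below; the event counts and person-years are invented for illustration and are not the fifteen cohorts pooled above.

```python
# Hedged sketch: DerSimonian-Laird random-effects pooling of incidence rates on
# the log scale. Event counts and person-years below are invented examples.
import numpy as np

events = np.array([30, 55, 12, 80, 21])          # subsequent strokes, first year
person_years = np.array([500, 900, 250, 1200, 400])

log_rate = np.log(events / person_years)
var = 1.0 / events                                # approximate variance of a log rate
w = 1.0 / var

# Between-study variance (DerSimonian-Laird)
fixed_mean = np.sum(w * log_rate) / np.sum(w)
q = np.sum(w * (log_rate - fixed_mean) ** 2)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - (len(events) - 1)) / c)

w_star = 1.0 / (var + tau2)
pooled = np.sum(w_star * log_rate) / np.sum(w_star)
se = np.sqrt(1.0 / np.sum(w_star))
rate = np.exp(pooled) * 100                       # per 100 person-years
ci = np.exp([pooled - 1.96 * se, pooled + 1.96 * se]) * 100
print(f"pooled rate = {rate:.1f} per 100 PY (95% CI {ci[0]:.1f}-{ci[1]:.1f})")
```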
While past research has suggested that living arrangements are associated with suicide death, no study has examined the impact of sustained living arrangements or of changes in living arrangements. Also, previous survival analysis studies reported only a single hazard ratio (HR), whereas the actual HR may change over time. We aimed to address these limitations using causal inference approaches.
Methods
Multi-point data from a general Japanese population sample were used. Participants reported their living arrangements twice within a 5-year interval. After that, suicide death, non-suicide death, and all-cause mortality were evaluated over 14 years. We used inverse probability weighted pooled logistic regression and cumulative incidence curves to evaluate the association of time-varying living arrangements with suicide death. We also studied non-suicide death and all-cause mortality to contextualize the association. Missing covariate data were handled using random forest imputation.
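A minimal sketch of inverse probability weighting followed by a weighted outcome regression is given below, using simulated data and a single period rather than the study's pooled person-period records; the covariates, column names, and weight model are illustrative assumptions.

```python
# Hedged sketch of inverse probability weighting plus a weighted outcome model.
# Simulated data and a single period only; the study's covariate set and
# person-period structure are not reproduced.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 5000
df = pd.DataFrame({"age": rng.normal(52, 8, n), "sex": rng.binomial(1, 0.5, n)})
df["living_alone"] = rng.binomial(1, 1 / (1 + np.exp(-(-2 + 0.02 * df["age"]))))
df["suicide_death"] = rng.binomial(
    1, 1 / (1 + np.exp(-(-6 + 0.8 * df["living_alone"]))))

# Stabilized inverse probability of exposure weights
ps = smf.logit("living_alone ~ age + sex", df).fit(disp=False).predict(df)
p_marg = df["living_alone"].mean()
df["w"] = np.where(df["living_alone"] == 1, p_marg / ps, (1 - p_marg) / (1 - ps))

# Weighted outcome model (one period shown; the study pooled yearly records)
fit = smf.glm("suicide_death ~ living_alone", df,
              family=sm.families.Binomial(), freq_weights=df["w"]).fit()
print(np.exp(fit.params["living_alone"]))   # weighted odds ratio for living alone
```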
Results
A total of 86,749 participants were analysed, with a mean age (standard deviation) of 51.7 (7.90) at baseline. Of these, 306 died by suicide during the 14-year follow-up. Persistently living alone was associated with an increased risk of suicide death (risk difference [RD]: 1.1%, 95% confidence interval [CI]: 0.3–2.5%; risk ratio [RR]: 4.00, 95% CI: 1.83–7.41), non-suicide death (RD: 7.8%, 95% CI: 5.2–10.5%; RR: 1.56, 95% CI: 1.38–1.74) and all-cause mortality (RD: 8.7%, 95% CI: 6.2–11.3%; RR: 1.60, 95% CI: 1.42–1.79) at the end of the follow-up. The cumulative incidence curve showed that these associations were consistent throughout the follow-up. Across all types of mortality, the increased risk was smaller for those who started to live with someone and those who transitioned to living alone. The results remained robust in sensitivity analyses.
Conclusions
Individuals who persistently live alone have an increased risk of suicide death as well as non-suicide death and all-cause mortality, whereas this impact is weaker for those who change their living arrangements.
Faecal examinations for helminth eggs were performed on 1,869 people from two riverside localities, Vientiane Municipality and Saravane Province, along the Mekong River, Laos. To obtain adult flukes, 42 people positive for small trematode eggs (Opisthorchis viverrini, heterophyid, or lecithodendriid eggs) were treated with a 20–30 mg/kg single dose of praziquantel and purged. Diarrhoeic stools were then collected from 36 people (18 in each area) and searched for helminth parasites using stereomicroscopes. Faecal examinations revealed positive rates for small trematode eggs of 53.3% and 70.8% (average 65.2%) in Vientiane and Saravane Province, respectively. Infections with O. viverrini and six species of intestinal flukes were found, namely Haplorchis taichui, H. pumilio, H. yokogawai, Centrocestus caninus, Prosthodendrium molenkampi, and Phaneropsolus bonnei. The total number of flukes collected and the proportion of fluke species recovered were markedly different in the two localities: in Vientiane, 1,041 O. viverrini (57.8 per person) and 615 other flukes (34.2 per person) were recovered, whereas in Saravane, 395 O. viverrini (21.9 per person) and 155,207 others (8,622.6 per person) were recovered. Five people from Saravane harboured no O. viverrini but numerous heterophyid and/or lecithodendriid flukes. The results indicate that O. viverrini and several species of heterophyid and lecithodendriid flukes are endemic in these two riverside localities, and suggest that the intensity of infection and the relative proportion of fluke species vary by locality along the Mekong River basin.
The release of Echinostoma caproni and Schistosoma mansoni cercariae from experimentally infected Biomphalaria glabrata snails maintained under different laboratory conditions was studied. Infected snails were isolated individually for 1 h in Stender dishes containing 5 ml of artificial spring water, and the number of cercariae released during this time was recorded. Of the numerous conditions tested, the addition of lettuce, the use of water conditioned by B. glabrata snails, and a temperature of 35°C significantly increased the release of E. caproni cercariae. A significant increase in cercarial release of S. mansoni was seen only in cultures fed lettuce. A temperature of 12°C caused a significant decrease in cercarial release of both E. caproni and S. mansoni. Increased snail activity associated with feeding behaviour was probably responsible for the enhanced cercarial sheds observed in this study.
The objective of this study was to describe changes in emergency department volumes after statewide lockdown in a network of hospitals across the United States during the COVID-19 global pandemic.
Methods:
A retrospective study was performed utilizing data on daily volumes across multiple emergency departments from a centralized data warehouse of a private for-profit hospital system during the COVID-19 pandemic. The mean daily volumes of 148 emergency departments across 16 states were evaluated in relation to each state's governmental statewide lockdown orders. Volumes were compared with the same period in the prior year to compute percent changes, and pre-lockdown volumes were compared with post-lockdown volumes. A separate analysis was performed for pediatric ED volumes.
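The percent-change comparisons described above amount to simple arithmetic on mean daily volumes; a brief sketch with invented numbers (not the hospital system's data) is given below.

```python
# Hedged sketch of the percent-change comparisons; daily-volume figures are
# invented and the data warehouse schema is not reproduced.
import pandas as pd

volumes = pd.DataFrame({
    "period": ["2019_same_dates", "pre_lockdown_2020", "post_lockdown_2020"],
    "mean_daily_volume": [210.0, 205.0, 119.5],
}).set_index("period")

def pct_change(new, old):
    """Percent change of `new` relative to `old`."""
    return (new - old) / old * 100

post = volumes.loc["post_lockdown_2020", "mean_daily_volume"]
print(pct_change(post, volumes.loc["2019_same_dates", "mean_daily_volume"]))
print(pct_change(post, volumes.loc["pre_lockdown_2020", "mean_daily_volume"]))
```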
Results:
The 2020 post-lockdown volumes compared to the same 2019 dates revealed a mean percent change of −43.09%. The overall post-lockdown volumes compared to the pre-lockdown volumes had a mean percent change of −45.00%. The pediatric data revealed a greater mean percentage change in volumes of −71.52% (post-lockdown compared to 2019) and −69.03% (post-lockdown compared to pre-lockdown).
Conclusions:
This study found an overall decrease in volumes among 148 emergency departments across 16 states when compared with the same period before the global pandemic.