Healthcare workers (HCWs) were at increased risk for mental health problems during the COVID-19 pandemic, with prior data suggesting women may be particularly vulnerable. Our global mental health study aimed to examine factors associated with gender differences in psychological distress and depressive symptoms among HCWs during COVID-19. Across 22 countries in South America, Europe, Asia and Africa, 32,410 HCWs participated in the COVID-19 HEalth caRe wOrkErS (HEROES) study between March 2020 and February 2021. They completed the General Health Questionnaire-12, the Patient Health Questionnaire-9 and questions about pandemic-relevant exposures. Consistently across countries, women reported more mental health problems than men. Women also reported more COVID-19-relevant stressors, including insufficient personal protective equipment and less support from colleagues, while men reported more contact with COVID-19 patients. At the country level, HCWs in countries with higher gender inequality reported fewer mental health problems. Higher COVID-19 mortality rates were associated with increased psychological distress only among women. Our findings suggest that among HCWs, women may have been disproportionately exposed to COVID-19-relevant stressors at the individual and country level. This highlights the importance of considering gender in emergency response efforts to safeguard women’s well-being and ensure healthcare system preparedness during future public health crises.
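As a hedged illustration of the kind of analysis such a design invites (not the HEROES study's actual code or variables), the sketch below fits a mixed-effects model with a random intercept per country and a gender × country-level mortality interaction on simulated data; all column names and values are assumptions.

```python
# Illustrative sketch only: a mixed-effects model of the kind that could test whether
# country-level COVID-19 mortality relates to psychological distress differently by gender.
# Column names (ghq12, gender, country_mortality, country) are hypothetical; data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_countries, n_per = 22, 200
country = np.repeat(np.arange(n_countries), n_per)
mortality = np.repeat(rng.uniform(0, 1, n_countries), n_per)  # standardized country-level rate
gender = rng.integers(0, 2, n_countries * n_per)              # 1 = woman, 0 = man
# Simulated GHQ-12-style score with a gender x mortality interaction built in
ghq12 = 10 + 2 * gender + 1.5 * gender * mortality + rng.normal(0, 3, n_countries * n_per)

df = pd.DataFrame({"ghq12": ghq12, "gender": gender,
                   "country_mortality": mortality, "country": country})

# Random intercept per country; fixed effects for gender, mortality and their interaction
model = smf.mixedlm("ghq12 ~ gender * country_mortality", df, groups=df["country"])
print(model.fit().summary())
```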
We investigated how well a visual associative learning task discriminates Alzheimer’s disease (AD) dementia from other types of dementia and how it relates to AD pathology.
Methods:
3,599 patients (63.9 ± 8.9 years old, 41% female) from the Amsterdam Dementia Cohort completed two sets of the Visual Association Test (VAT) in a single test session and underwent magnetic resonance imaging. We performed receiver operating characteristic curve analysis to investigate the VAT’s discriminatory ability between AD dementia and other diagnoses and compared it to that of other episodic memory tests. We tested associations between VAT performance and medial temporal lobe atrophy (MTA) and amyloid status (n = 2,769, 77%).
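For readers unfamiliar with this kind of comparison, the following minimal sketch computes areas under the ROC curve for two memory scores against a diagnostic label using simulated data; the variable names and values are illustrative assumptions, not the study's data or code.

```python
# Minimal sketch (not the study's code): comparing how well two memory scores
# separate AD dementia from other diagnoses via ROC AUC. Scores are simulated.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 500
is_ad = rng.integers(0, 2, n)                    # 1 = AD dementia, 0 = other diagnosis
# Lower memory scores in the AD group (simulated)
vat_score = rng.normal(10 - 3 * is_ad, 2, n)     # hypothetical VAT total
other_memory = rng.normal(10 - 2 * is_ad, 2, n)  # hypothetical comparison test

# roc_auc_score expects higher values for the positive class, so negate the memory scores
print("VAT AUC:  ", round(roc_auc_score(is_ad, -vat_score), 2))
print("Other AUC:", round(roc_auc_score(is_ad, -other_memory), 2))
```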
Results:
Patients with AD dementia performed worse on the VAT than all other patients. The VAT discriminated well between AD and other types of dementia (area under the curve range 0.70–0.86), better than other episodic memory tests. Six hundred and forty patients (17.8%) learned all associations on VAT-A but not on VAT-B, and they were more likely to have higher MTA scores (odds ratios ranging from 1.63 for MTA 0.5 to 5.13 for MTA ≥ 3, all p < .001) and to be amyloid positive (odds ratio = 3.38, 95% CI = [2.71, 4.22], p < .001) than patients who learned all associations on both sets.
Conclusions:
Performance on the VAT, especially on a second set administered immediately after the first, discriminates AD from other types of dementia and is associated with MTA and amyloid positivity. The VAT might be a useful, simple tool to assess early episodic memory deficits in the presence of AD pathology.
There are no conclusive findings about the possible protective role of religion in students’ mental health during the COVID-19 pandemic; therefore, more research is needed.
Objectives
The purpose of this study was to assess the relationship between the level of emotional distress and religiosity among students from 7 different countries during the COVID-19 pandemic.
Methods
Data were collected by an online cross-sectional survey distributed among Polish (N = 1196), Bengali (N = 1537), Indian (N = 483), Mexican (N = 231), Egyptian (N = 565), Philippine (N = 2062), and Pakistani (N = 506) students (total N = 6642) from 12th April to 1st June 2021. Respondents were asked several questions regarding their religiosity, measured with the Duke University Religion Index (DUREL), and their emotional distress, measured with the Depression, Anxiety, and Stress Scale-21 (DASS-21).
Results
Egypt, where Islam is the dominant religion, showed the greatest temple attendance (organizational religious activity: M=5.27±1.36) and spirituality (intrinsic religiosity: M=5.27±1.36), p<0.0001. On the other hand, Egyptian students had the lowest emotional distress in all DASS-21 categories (depression: M=4.87±10.17, anxiety: M=4.78±10.13, stress: M=20.76±11.46). The two countries where Christianity is the dominant religion achieved the highest scores for private religious activities (non-organizational religious activity; Mexico: M=3.94±0.94, Poland: M=3.63±1.20; p<0.0001) and experienced moderate levels of depressive symptoms, anxiety, and stress. Students from Mexico presented the lowest church attendance (M=2.46±1.39) and spirituality (M=6.68±3.41) and had the second highest level of depressive symptoms (M=19.13±13.03) and stress (M=20.27±1.98). Philippine students had the highest DASS-21 scores (depression: M=22.77±12.58, anxiety: M=16.07±10.77, stress: M=4.87±10.08), while their level of religiosity reached average values in the whole group. The regression analysis confirmed the importance of the three dimensions of religiosity (organizational religious activity, non-organizational religious activity, intrinsic religiosity) for the well-being of students, except for the relationship between anxiety and private religious activities. The results were: depression: R2=0.0398, F(3.664)=91.764, p<0.0001, SE of E: 12.88; anxiety: R2=0.0124, F(3.664)=27.683, p<0.0001, SE of E: 10.62; stress: R2=0.0350, F(3.664)=80.363, p<0.0001, SE of E: 12.30.
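To make the reported regression concrete, here is a hedged sketch of an ordinary least-squares model of a DASS-21 subscale on the three DUREL dimensions; variable names (ora, nora, ir) and the simulated values are assumptions for illustration, not the study data or code.

```python
# Illustrative sketch only: OLS regression of a DASS-21 subscale on the three DUREL dimensions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 1000
df = pd.DataFrame({
    "ora": rng.integers(1, 7, n),    # organizational religious activity (DUREL item 1)
    "nora": rng.integers(1, 7, n),   # non-organizational religious activity (DUREL item 2)
    "ir": rng.integers(3, 16, n),    # intrinsic religiosity (DUREL items 3-5)
})
# Simulated depression score that decreases with each religiosity dimension
df["depression"] = 20 - 0.5 * df["ora"] - 0.4 * df["nora"] - 0.2 * df["ir"] + rng.normal(0, 10, n)

fit = smf.ols("depression ~ ora + nora + ir", df).fit()
print(fit.summary())                 # reports R^2, the F statistic and coefficients
```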
Conclusions
Higher commitment to organizational religious activity, non-organizational religious activity, and intrinsic religiosity was correlated with lower levels of depressive symptoms, stress, and anxiety among students during the COVID-19 pandemic, although religiosity-related factors explain the level of emotional well-being only to a small extent.
To evaluate quetiapine XR as an adjunct to ongoing antidepressant therapy in patients with major depressive disorder (MDD) showing an inadequate response to antidepressant treatment.
Methods:
Data were analysed from two 6-week, multicentre, double-blind, randomised, placebo-controlled studies (D1448C00006; D1448C00007), prospectively designed to be pooled. Outpatients received adjunctive quetiapine XR 150mg/day (n=309), 300mg/day (n=307) or placebo (n=303). Primary endpoint: change at Week 6 in MADRS total score. Other assessments included MADRS individual item scores, HAM-A total scores, MADRS response and remission, and AE reporting.
Results:
Quetiapine XR 150mg/day and 300mg/day (p<0.001) reduced MADRS total scores versus placebo at Week 6 (-14.5, -14.8, -12.0) and Week 1 (-7.8, -7.3, -5.1). Subgroup analyses showed the therapeutic effect of quetiapine XR was neither limited to nor driven by factors such as gender or antidepressant class (SSRI/SNRI). Quetiapine XR demonstrated consistent improvements in individual MADRS items: 150mg/day and 300mg/day significantly improved 4/10 and 7/10 items, respectively, at Week 6 versus placebo. At Week 6, MADRS response (≥50% decrease in total score) was 53.7% (p=0.063) and 58.3% (p<0.01) versus 46.2%; MADRS remission (total score ≤8) was 35.6% (p<0.01) and 36.5% (p<0.001) versus 24.1% for quetiapine XR 150mg/day, 300mg/day and placebo, respectively. Quetiapine XR 150mg/day and 300mg/day improved HAM-A total scores versus placebo at Week 1 (-4.8 [p<0.001], -4.2 [p<0.01], -3.0) and Week 6 (-8.9 [p<0.01], -9.1 [p<0.001], -7.3). The most common AEs (≥10%) with quetiapine XR were dry mouth, somnolence, sedation, dizziness, fatigue, constipation and headache.
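For clarity on the response and remission definitions used above, the short sketch below computes both rates from per-patient MADRS totals; the scores are invented for illustration and are not trial data.

```python
# Minimal sketch, not trial code: MADRS response (>=50% decrease from baseline)
# and remission (endpoint total score <= 8) computed from hypothetical scores.
import numpy as np

baseline = np.array([30, 28, 35, 26, 32])   # hypothetical MADRS totals at randomization
week6 = np.array([12, 20, 8, 25, 7])        # hypothetical MADRS totals at Week 6

response = (baseline - week6) >= 0.5 * baseline
remission = week6 <= 8

print(f"Response rate:  {response.mean():.1%}")
print(f"Remission rate: {remission.mean():.1%}")
```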
Conclusion:
In patients with MDD and an inadequate response to antidepressant therapy, adjunctive quetiapine XR is effective and generally well tolerated.
The primary objective of this study was to describe the management of patients with psychogenic non-epileptic seizures (PNES). We used the questionnaire developed by the International League Against Epilepsy (ILAE) so that our results could be compared with those of studies conducted abroad. The questionnaire was e-mailed, via the Survey Monkey software, to staff working in the epileptology units of French hospital centres between 2 June 2015 and 8 July 2015, and the responses were collected through the same software. The diagnosis was announced during an interview with the patient in 94.4% of cases and with the family in 79% of cases. During the information interview on PNES, 61.9% of practitioners say that these manifestations may be the sign of repressed or current trauma; 14.4% say that PNES cannot be explained from a medical point of view. After announcing the diagnosis, 60.2% of practitioners offer their patient at least one follow-up appointment. Around 11.9% of respondents no longer provide follow-up for these patients. Regarding therapeutic options, while 3.4% of respondents consider that no treatment is effective, 97.8% recognize the efficacy of individual psychotherapy. Around 33.9% recommend the prescription of antidepressants, and 28% consider hypnosis an effective treatment for PNES. There is wide variation in the management of patients with PNES, and access and referral to psychiatric care facilities, although essential, remain difficult. This survey illustrates the need for coordination between neurologists and psychiatrists in the management of these patients.
The amino acid arginine is a well-known growth hormone (GH) stimulator and GH is an important modulator of linear growth. The aim of the present study was to investigate the effect of dietary arginine on growth velocity in children between 7 and 13 years of age. Data from the Copenhagen School Child Intervention Study during 2001–2 (baseline), and at 3-year and 7-year follow-up, were used. Arginine intake was estimated via a 7 d precoded food diary at baseline and 3-year follow-up. Data were analysed in a multilevel structure in which children were embedded within schools. Random intercepts and slopes were defined to estimate the association between arginine intake and growth velocity, including the following covariates: sex; age; baseline height; energy intake; puberty stage at 7-year follow-up; and intervention/control group. The association between arginine intake and growth velocity was significant for the third and fourth quintiles of arginine intake (2·5–2·8 and 2·8–3·2 g/d, respectively) compared with the first quintile (<2·2 g/d) (P for trend = 0·04). Protein intake (excluding arginine) was significantly associated with growth velocity; however, the association was weaker than that between arginine intake and growth velocity (P for trend = 0·14). The results of the present study suggest a dose-dependent physiological role of habitual protein intake, and specifically arginine intake, in linear growth in normally growing children. However, since the study was conducted in healthy children, we cannot firmly conclude whether arginine supplementation represents a relevant clinical strategy. Further research is needed to investigate whether dietary arginine may represent a nutritional strategy potentially advantageous for the prevention and treatment of short stature.
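The sketch below illustrates, under stated assumptions, the kind of multilevel analysis described above: arginine intake split into quintiles and related to growth velocity with a random intercept per school. The variable names and simulated data are placeholders, not the cohort's data or the authors' code.

```python
# Sketch under stated assumptions: quintiles of arginine intake vs. growth velocity,
# with a random intercept per school. All data below are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n, n_schools = 800, 18
df = pd.DataFrame({
    "school": rng.integers(0, n_schools, n),
    "arginine": rng.normal(2.7, 0.4, n),               # g/d, simulated habitual intake
    "sex": rng.integers(0, 2, n),
    "baseline_height": rng.normal(130, 7, n),          # cm
})
df["quintile"] = pd.qcut(df["arginine"], 5, labels=False)   # 0 = lowest intake quintile
df["growth_velocity"] = (5.5 + 0.1 * df["quintile"]
                         - 0.01 * (df["baseline_height"] - 130)
                         + rng.normal(0, 0.8, n))           # cm/year, simulated

model = smf.mixedlm("growth_velocity ~ C(quintile) + sex + baseline_height",
                    df, groups=df["school"])
print(model.fit().summary())
```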
In this study, the advantage of chloroquine (CQ)-containing liposomes (lipCQ) over free CQ in the chemotherapy of murine malaria (Plasmodium berghei) was demonstrated. The maximum permissible dose per intraperitoneal injection was 0·8 mg for CQ and 10 mg for lipCQ. An increase in the therapeutic and prophylactic efficacy of lipCQ in comparison with free CQ at the 0·8 mg CQ dose level was found. It was possible to obtain 100% efficacy (injection at day 5 after infection; parasitaemia 4–8%) with a single intraperitoneal injection of 6 mg lipCQ. Moreover, the ability to increase the dose of CQ per injection after liposome encapsulation allowed successful treatment of infections with CQ-resistant Plasmodium berghei, which could not be cured by a 7-day course with the maximum tolerable dose of free CQ of 0·8 mg/mouse/day.
The development of cerebral lesions in Plasmodium berghei-infected mice was dependent on the strain of mice and the size of the infectious inoculum. In particular, C57BL/6J mice develop cerebral lesions when infected with low numbers of parasitized erythrocytes; by increasing the number of parasites in the infectious inoculum, the percentage of animals that develop cerebral malaria is decreased. Varying degrees of protection against the development of cerebral malaria can be obtained by several methods of immunization: (1) injection of mice with large numbers of disrupted parasitized erythrocytes 1 or 2 weeks before the challenge infection (protection up to 70%); (2) a 2-day immunizing infection given 9 or 14 days before the challenge infection (protection up to 85%); (3) injection of mice with plasmodial exoantigen preparations 1 week before the challenge infection (variable protection rate, up to 100%). In all mice protected against cerebral malaria, parasitaemia is not affected by the immunizing treatment, indicating that the protective mechanisms against cerebral malaria and against parasitaemia are independent.
The effect of tumour necrosis factor-α (TNF) on malaria-infected mice was studied. C57BL/6J mice infected with Plasmodium berghei K173 exhibited an increased sensitivity to exogenous TNF. Injection of 15 μg TNF was lethal to some of the animals when given 5–7 days after infection, whereas amounts as low as 2·5 μg TNF appeared to be lethal to all mice when given later in the infection (i.e. days 8–10). The pathology in infected mice treated with TNF resembled that found in the brains of infected mice dying of cerebral malaria. Infected mice treated with TNF, however, also developed severe pathological changes in other organs. In contrast, treatment with sublethal amounts of TNF (1·0 μg or less), given on days 8 and 9 after infection, protected mice against the development of cerebral malaria. In addition, infected mice exhibited an enhanced sensitivity to treatment with lipopolysaccharide (LPS). Sublethal amounts of LPS, however, did not prevent mortality as in TNF-treated mice (LPS-treated mice died at about the same time as infected mice that developed cerebral malaria), but no cerebral haemorrhages were found in the majority of LPS-treated, infected animals. Treatment with dexamethasone during infection protected mice against the development of cerebral malaria, but did not suppress their increased sensitivity to exogenous TNF. Treatment of mice with liposome-encapsulated dichloromethylene diphosphonate (lip-Cl2MDP), used to eliminate macrophages (an important source of TNF), prevented the development of cerebral malaria, but only when given before day 5 of infection. Mice protected by treatment with lip-Cl2MDP, however, remained sensitive to LPS on the eighth day of infection.
A considerable proportion of mice lose acquired immunity to Plasmodium berghei during the first pregnancy. Immune parous mice, however, have a better immune status than virgin mice: the risk of loss of immunity during a subsequent pregnancy is greatly reduced, the capacity to clear parasites is enhanced, and the maintenance of immunity is less dependent on certain splenic functions. The establishment of improved immunity is dependent on the presence of proliferating parasites during the second half of pregnancy, when immunosuppression results in recrudescence. Immune reactivity is also improved after a (chemotherapeutically controlled) recrudescent infection provoked by immunosuppressive treatment of immune mice with corticoids or anti-T cell serum, which mimics the situation encountered during pregnancy. Hence, improved immunity after pregnancy is a consequence of the reconfrontation of a suppressed and/or convalescent immune system with proliferating parasites.
Hand harvesting is a major constraint to lentil production in North Africa and West Asia. This study, conducted in north Syria, compared hand harvesting, cutting by mower (double-knife) and cutting with angled blades on two lentil cultivars differing in standing ability, using two sowing methods (broadcast and drilled), each with and without the use of a heavy bar for field levelling, in the 1984/85 season. Seven treatments were selected for testing at five locations in the 1985/86 season, and in the 1986/87 and 1987/88 seasons agronomic comparisons of mowing v. hand harvesting were conducted on five farmers' fields.
Both machine methods of harvesting resulted in significant harvest losses compared with hand harvesting. The angled blades performed well on a ridged broadcast crop, but tended to mix soil with the harvested crop. The loss of straw associated with harvesting by mower was reduced by levelling the seedbed after sowing. The superiority in seed yield of cultivar 78S26002 over the local cultivar increased from 9% when hand harvested to 39% with mowing because of its lower likelihood of lodging. In the 1986/87 and 1987/88 seasons, the seed yield from a hand harvest was 1650 kg/ha compared with 1508 kg/ha following harvest by mower, representing a loss of 8·6% from mechanization. The corresponding straw loss was 16·6% of the mean from a hand harvest of 2140 kg straw/ha. However, the harvest losses from mechanical harvesting by mower were compensated for by the reduced labour costs compared with hand harvesting.
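As a quick arithmetic check of the percentage losses quoted above, the snippet below recomputes them from the reported yields; it is illustrative only and uses the figures exactly as given in the abstract.

```python
# Recomputing the quoted losses from the reported yields (illustrative arithmetic only).
hand_seed, mower_seed = 1650, 1508        # kg/ha, seed yield by harvest method
hand_straw, straw_loss_pct = 2140, 16.6   # kg/ha straw from hand harvest, quoted % loss

seed_loss_pct = (hand_seed - mower_seed) / hand_seed * 100
print(f"Seed yield loss from mowing: {seed_loss_pct:.1f}%")              # ~8.6%
print(f"Straw loss: {hand_straw * straw_loss_pct / 100:.0f} kg/ha")      # ~355 kg/ha
```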
Partial obstruction of endotracheal tubes due to accumulation of secretions and mucus plugs can increase tube resistance and consequently impose an increased resistive load on the patient. This study was performed to determine the changes in the resistance of endotracheal tubes of sizes 7.5, 8.0 and 8.5 mm with different degrees and locations of endotracheal tube narrowing.
Methods
Reductions of 10%, 25%, 50% and 75% in the endotracheal tubes’ cross-sectional areas were created at different sites along the axis of each tube, which was connected to an artificial lung. While ventilating with a constant inspiratory flow, a 1 s end-inspiratory occlusion manoeuvre was applied and the resulting plateau pressure was determined. The resistance was calculated as (peak airway pressure – plateau pressure)/peak inspiratory flow.
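A short worked example of this resistance calculation follows; the pressures and flow are illustrative values, not measurements from the study.

```python
# Worked example of R = (peak airway pressure - plateau pressure) / peak inspiratory flow,
# using illustrative values only.
peak_pressure = 28.0      # cmH2O, peak airway pressure during constant-flow inflation
plateau_pressure = 20.0   # cmH2O, after the 1 s end-inspiratory occlusion
peak_flow = 0.5           # L/s, constant inspiratory flow

resistance = (peak_pressure - plateau_pressure) / peak_flow
print(f"Tube resistance: {resistance:.1f} cmH2O/L/s")   # 16.0 cmH2O/L/s for these values
```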
Results
Significant increases in endotracheal tube resistance were observed as the tube’s cross-sectional area reduction was increased from 25% to 50% and from 50% to 75% for the 7.5 mm endotracheal tube, from 25% to 50% for the 8.0 mm endotracheal tube, and from 50% to 75% for the 8.5 mm endotracheal tube. The changes in endotracheal tube resistance were not affected by the site of the cross-sectional area reduction along the axis of the tube.
Conclusions
For endotracheal tubes of sizes 7.5, 8.0 and 8.5 mm, significant changes in tube resistance were observed when the partial obstruction of the tube exceeded certain critical values. The location of the partial obstruction did not affect the change in endotracheal tube resistance.
This study investigated the significance of serum complement for the transmission-reducing activity (TRA) of field sera from 24 infected Plasmodium falciparum gametocyte carriers (from Cameroon) against cultured NF54 P. falciparum. Laboratory-reared Anopheles stephensi were given infectious blood meals prepared either with serum from a naïve Dutch donor (AB type) or with pair-matched field serum samples, both with and without active complement. The transmission-reducing effects of serum factors and host complement on mosquito infection rate and oocyst intensity were partitioned across the various components involved in the early stages of sporogony. The majority (>80%) of sera tested showed positive antibody titres to Pfs230, the relevant complement-dependent target of transmission-reducing mechanisms. Regardless of the presence of active complement, blood meals with field sera exhibited significantly lower infection rates and oocyst intensities than the control group. Serological reactivity against Pfs230 in capture ELISA was significantly correlated with the reduction of parasite infectivity. Contrary to our expectation, the presence of active complement in the mosquito blood meal did not increase parasite losses, and therefore the magnitude of transmission reduction, by individual immune sera. Our findings on P. falciparum are consistent with previous studies on animal hosts of Plasmodium, indicating that early P. falciparum sporogonic stages may be insensitive to the antibody-dependent pathways of complement in human serum.
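The abstract does not spell out how TRA is summarized; as a hedged illustration, transmission-reducing activity is commonly expressed as the percentage reduction in mean oocyst counts relative to a control feed, as in the sketch below. The formula here is the conventional definition, not necessarily this study's exact calculation, and the counts are invented.

```python
# Hedged sketch: TRA as percentage reduction in mean oocyst counts vs. a control feed.
# Example counts are invented for illustration.
import numpy as np

control_oocysts = np.array([12, 30, 25, 18, 22])   # oocysts per mosquito, control serum feed
test_oocysts = np.array([2, 5, 0, 4, 3])           # oocysts per mosquito, test serum feed

tra = 100 * (1 - test_oocysts.mean() / control_oocysts.mean())
print(f"Transmission-reducing activity: {tra:.0f}%")
```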
Molecular methods for the detection and typing of hepatitis A virus (HAV) strains in sewage were applied to determine their distribution in Cairo and Barcelona. The study revealed different patterns of hepatitis A endemicity in each city. The circulating strains characterized, in both Cairo and Barcelona, were of genotype IB. The effects of a childhood vaccination programme and of the increase in the immigrant population on the overall occurrence of hepatitis A in Barcelona were evaluated. While vaccination contributed to a significant decrease in the number of clinical cases, the large recent immigration flow has probably been responsible for the re-emergence of the disease in the last year of the study, in the form of small outbreaks among the non-vaccinated population.
Background: The presence of (distant) metastases affects the therapy (operation) and prognosis of patients with non–small-cell lung cancer (NSCLC). Fifty percent of the operations are futile due to the presence of a locally advanced tumor or distant metastases. Therefore, more accurate preoperative staging is required with respect to the outcomes (reduction of futile operations) and costs. This study examines current staging procedures and assesses possible situations for incorporating positron emission tomography (PET).
Methods: A retrospective analysis was performed to assess actual clinical practice in the staging of 337 patients with NSCLC in two Dutch hospitals. Subsequently, by combining these data on actual clinical practice with a literature review, a model was developed to determine the influence of PET on staging outcomes and costs. In this model, the accuracy and costs of PET can be varied, as can the extent to which conventional diagnostic tests are substituted by PET.
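To make the structure of such a model concrete, here is a hedged sketch of a very simple decision model in which a staging test detects a fraction of inoperable cases so that only the missed cases go on to a futile operation; every number (prevalence, sensitivity, costs) is a placeholder assumption and not a value from the study.

```python
# Illustrative sketch of a simple staging cost model; all parameters are placeholders.
def expected_cost(n_patients, prevalence_inoperable, sensitivity, cost_workup, cost_futile_op):
    """Expected total cost when the staging strategy detects a fraction `sensitivity`
    of inoperable cases; the missed cases incur the cost of a futile operation."""
    inoperable = n_patients * prevalence_inoperable
    futile_ops = inoperable * (1 - sensitivity)
    return n_patients * cost_workup + futile_ops * cost_futile_op

# Hypothetical comparison: conventional staging vs. adding PET after diagnostic imaging
conventional = expected_cost(337, 0.35, sensitivity=0.60, cost_workup=1500, cost_futile_op=12000)
with_pet = expected_cost(337, 0.35, sensitivity=0.85, cost_workup=1500 + 900, cost_futile_op=12000)
print(f"Conventional staging: {conventional:,.0f} (arbitrary cost units)")
print(f"Staging with PET:     {with_pet:,.0f} (arbitrary cost units)")
```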
Results: Practice variation was found between the two hospitals with regard to the setting in which the diagnostic staging took place (hospitalization, outpatient setting) and the extent of the use of mediastinoscopy. This was reflected in the costs and in the number of (futile) operations.
Conclusion: Hospitalization is the major cost driver in these patients. From a cost viewpoint, evaluating PET in a strategy after diagnostic imaging but prior to invasive staging appears optimal.