The different types of iron oxide phases associated with the surfaces of two suites of kaolins from Georgia, U.S.A., and from the Southwest Peninsula of England, have been identified using electron spin resonance (ESR) spectroscopy combined with magnetic-filtration, thermal, and chemical treatments. It has been shown that the English kaolins are coated with a lepidocrocite-like phase, which is readily removed by de Endredy's method of deferrification, while the Georgia kaolins are coated with a hematite- or goethite-like phase, which is not removed by this treatment. Throughout the course of this study, the effects of the various physical and chemical treatments on the brightness values of the kaolins were examined.
Previous studies by Electron Spin Resonance (ESR) have established the substitution of Fe3+ and Mg2+ in the kaolinite structure. It is shown that Fe2+ can substitute in kaolinite and stabilize defects which are detectable by ESR in a manner identical to Mg2+. The development of methods of preparing a synthetic kaolinite doped with Fe2+ is described in detail. It is shown that the main ESR signals, which occur at g = 2.0 in natural kaolinites and which previously have been interpreted in terms of iron and magnesium, can be attributed to iron alone.
Throughout the COVID-19 pandemic, many areas in the United States experienced healthcare personnel (HCP) shortages tied to a variety of factors. Infection prevention programs, in particular, faced increasing workload demands with little opportunity to delegate tasks to others without specific infectious diseases or infection control expertise. Shortages of clinicians providing inpatient care to critically ill patients during the early phase of the pandemic were multifactorial, largely attributed to increasing demands on hospitals to provide care to patients hospitalized with COVID-19 and furloughs.1 HCP shortages and challenges during later surges, including the Omicron variant-associated surges, were largely attributed to HCP infections and associated work restrictions during isolation periods and the need to care for family members, particularly children, with COVID-19. Additionally, the detrimental physical and mental health impact of COVID-19 on HCP has led to attrition, which further exacerbates shortages.2 Demands increased in post-acute and long-term care (PALTC) settings, which already faced critical staffing challenges, difficulty with recruitment, and high rates of turnover. Although individual healthcare organizations and state and federal governments have taken actions to mitigate recurring shortages, additional work and innovation are needed to develop longer-term solutions to improve healthcare workforce resiliency. The critical role of those with specialized training in infection prevention, including healthcare epidemiologists, was well-demonstrated in pandemic preparedness and response. The COVID-19 pandemic underscored the need to support growth in these fields.3 This commentary outlines the need to develop the US healthcare workforce in preparation for future pandemics.
Throughout history, pandemics and their aftereffects have spurred society to make substantial improvements in healthcare. After the Black Death in 14th century Europe, changes were made to elevate standards of care and nutrition that resulted in improved life expectancy.1 The 1918 influenza pandemic spurred a movement that emphasized public health surveillance and detection of future outbreaks and eventually led to the creation of the World Health Organization Global Influenza Surveillance Network.2 In the present, the COVID-19 pandemic exposed many of the pre-existing problems within the US healthcare system, which included (1) a lack of capacity to manage a large influx of contagious patients while simultaneously maintaining routine and emergency care to non-COVID patients; (2) a “just in time” supply network that led to shortages and competition among hospitals, nursing homes, and other care sites for essential supplies; and (3) longstanding inequities in the distribution of healthcare and the healthcare workforce. The decades-long shift from domestic manufacturing to a reliance on global supply chains has compounded ongoing gaps in preparedness for supplies such as personal protective equipment and ventilators. Inequities in racial and socioeconomic outcomes highlighted during the pandemic have accelerated the call to focus on diversity, equity, and inclusion (DEI) within our communities. The pandemic accelerated cooperation between government entities and the healthcare system, resulting in the swift implementation of mitigation measures and the development of new therapies and vaccines at unprecedented speed, despite our fragmented healthcare delivery system and political divisions.
Still, widespread misinformation or disinformation and political divisions contributed to eroded trust in the public health system and prevented an even uptake of mitigation measures, vaccines and therapeutics, impeding our ability to contain the spread of the virus in this country.3 Ultimately, the lessons of COVID-19 illustrate the need to better prepare for the next pandemic. Rising microbial resistance, emerging and re-emerging pathogens, increased globalization, an aging population, and climate change are all factors that increase the likelihood of another pandemic.4
The Society for Healthcare Epidemiology in America (SHEA) strongly supports modernization of data collection processes and the creation of publicly available data repositories that include a wide variety of data elements and mechanisms for securely storing both cleaned and uncleaned data sets that can be curated as clinical and research needs arise. These elements can be used for clinical research and quality monitoring and to evaluate the impacts of different policies on different outcomes. Achieving these goals will require dedicated, sustained and long-term funding to support data science teams and the creation of central data repositories that include data sets that can be “linked” via a variety of different mechanisms and also data sets that include institutional and state and local policies and procedures. A team-based approach to data science is strongly encouraged and supported to achieve the goal of a sustainable, adaptable national shared data resource.
This article is a clinical guide which discusses the “state-of-the-art” usage of the classic monoamine oxidase inhibitor (MAOI) antidepressants (phenelzine, tranylcypromine, and isocarboxazid) in modern psychiatric practice. The guide is for all clinicians, including those who may not be experienced MAOI prescribers. It discusses indications, drug-drug interactions, side-effect management, and the safety of various augmentation strategies. There is a clear and broad consensus (more than 70 international expert endorsers), based on 6 decades of experience, for the recommendations set out herein. They are based on empirical evidence and expert opinion—this guide is presented as a new specialist-consensus standard. The guide provides practical clinical advice, and is the basis for the rational use of these drugs, particularly because it improves and updates knowledge, and corrects the various misconceptions that have hitherto been prominent in the literature, partly due to insufficient knowledge of pharmacology. The guide suggests that MAOIs should always be considered in cases of treatment-resistant depression (including those melancholic in nature), and prior to electroconvulsive therapy—while taking patient preference into account. In selected cases, they may be considered earlier in the treatment algorithm than has previously been customary, and should not be regarded as drugs of last resort; they may prove decisively effective when many other treatments have failed. The guide clarifies key points on the concomitant use of incorrectly proscribed drugs such as methylphenidate and some tricyclic antidepressants. It also illustrates the straightforward “bridging” methods that may be used to transition simply and safely from other antidepressants to MAOIs.
Response to lithium in patients with bipolar disorder is associated with clinical and transdiagnostic genetic factors. Combining these variables may help clinicians better predict which patients will respond to lithium treatment.
Aims
To use a combination of transdiagnostic genetic and clinical factors to predict lithium response in patients with bipolar disorder.
Method
This study utilised genetic and clinical data (n = 1034) collected as part of the International Consortium on Lithium Genetics (ConLi+Gen) project. Polygenic risk scores (PRS) were computed for schizophrenia and major depressive disorder, and then combined with clinical variables using a cross-validated machine-learning regression approach. Unimodal, multimodal and genetically stratified models were trained and validated using ridge, elastic net and random forest regression on 692 patients with bipolar disorder from ten study sites using leave-site-out cross-validation. All models were then tested on an independent test set of 342 patients. The best performing models were then tested in a classification framework.
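The leave-site-out cross-validation described above, in which each study site is held out in turn while a regularised model is trained on the remaining sites, can be sketched as follows. This is an illustrative toy example with synthetic data and a closed-form ridge solution, not the ConLi+Gen analysis code; the feature counts, site sizes, and regularisation strength are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 6 "sites", each with 50 patients and 5 features
# (imagine clinical variables plus PRS columns); not real study data.
n_per_site, n_sites, n_feat = 50, 6, 5
X = rng.normal(size=(n_sites * n_per_site, n_feat))
true_beta = np.array([0.4, -0.3, 0.2, 0.1, 0.0])
y = X @ true_beta + rng.normal(scale=1.0, size=n_sites * n_per_site)
site = np.repeat(np.arange(n_sites), n_per_site)

def ridge_fit(X, y, lam=1.0):
    # Closed-form ridge solution: (X'X + lam*I)^-1 X'y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Leave-site-out cross-validation: hold out one entire site per fold,
# so performance reflects generalisation to an unseen site.
r2_scores = []
for s in range(n_sites):
    train, test = site != s, site == s
    w = ridge_fit(X[train], y[train])
    pred = X[test] @ w
    ss_res = np.sum((y[test] - pred) ** 2)
    ss_tot = np.sum((y[test] - y[test].mean()) ** 2)
    r2_scores.append(1 - ss_res / ss_tot)

print(f"mean leave-site-out R^2: {np.mean(r2_scores):.3f}")
```

The same loop structure applies unchanged if the ridge fit is swapped for elastic net or a random forest; only the model-fitting call changes.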
Results
The best performing linear model explained 5.1% (P = 0.0001) of variance in lithium response and was composed of clinical variables, PRS variables and interaction terms between them. The best performing non-linear model used only clinical variables and explained 8.1% (P = 0.0001) of variance in lithium response. A priori genomic stratification improved non-linear model performance to 13.7% (P = 0.0001) and improved the binary classification of lithium response. This model stratified patients based on their meta-polygenic loadings for major depressive disorder and schizophrenia and was then trained using clinical data.
Conclusions
Using PRS to first stratify patients genetically and then train machine-learning models with clinical predictors led to large improvements in lithium response prediction. When used with other PRS and biological markers in the future this approach may help inform which patients are most likely to respond to lithium treatment.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
Chinese men who have sex with men (MSM) are at high risk for depression, anxiety and suicide. The estimated prevalence of these problems is essential to guide public health policy, but published results vary. This meta-analysis aimed to estimate the prevalence of depressive symptoms, anxiety symptoms and suicide among Chinese MSM.
Methods
We systematically searched the EMBASE, MEDLINE, PsycINFO, PubMed, CNKI and Wanfang databases, with languages restricted to Chinese and English, for studies published before 10 September 2019 on the prevalence of depressive symptoms, anxiety symptoms, suicidal ideation, suicide plans and suicide attempts among Chinese MSM. Studies that were published in peer-reviewed journals and used validated instruments to assess depression and anxiety were included. The characteristics of studies and the prevalence of depressive symptoms, anxiety symptoms, suicidal ideation, suicide plans and suicide attempts were independently extracted by the authors. Random-effects modelling was used to estimate the pooled rates. Subgroup analysis and univariate meta-regression were conducted to explore potential sources of heterogeneity. This study followed the PRISMA and MOOSE guidelines.
Results
Sixty-seven studies were included. Fifty-two studies reported the prevalence of depressive symptoms, with a combined sample of 37 376 people, of whom 12 887 [43.2%; 95% confidence interval (CI), 38.9–47.5] reported depressive symptoms. Twenty-seven studies reported the prevalence of anxiety symptoms, with a combined sample of 10 531 people, of whom 3187 (32.2%; 95% CI, 28.3–36.6) reported anxiety symptoms. Twenty-three studies reported the prevalence of suicidal ideation, with a combined sample of 15 034 people, of whom 3416 (21.2%; 95% CI, 18.3–24.5) had suicidal ideation. Nine studies reported the prevalence of suicide plans, with a combined sample of 5271 people, of whom 401 (6.2%; 95% CI, 3.9–8.6) had suicide plans. Finally, 19 studies reported the prevalence of suicide attempts, with a combined sample of 27 936 people, of whom 1829 (7.3%; 95% CI, 5.6–9.0) had attempted suicide.
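The random-effects pooling used to combine per-study prevalences can be illustrated with a DerSimonian–Laird estimator on logit-transformed proportions. This is a generic sketch with made-up study counts, not the actual studies or software behind the results above.

```python
import math

# Hypothetical per-study data as (events, sample size) pairs —
# illustrative values only, not the studies pooled in this meta-analysis.
studies = [(120, 300), (85, 250), (200, 420), (60, 180), (150, 400)]

def logit(p):
    return math.log(p / (1 - p))

def inv_logit(x):
    return 1 / (1 + math.exp(-x))

# Per-study logit prevalence; approximate within-study variance 1/e + 1/(n-e).
ys = [logit(e / n) for e, n in studies]
vs = [1 / e + 1 / (n - e) for e, n in studies]

# Fixed-effect weights, heterogeneity statistic Q, and the
# DerSimonian–Laird between-study variance tau^2.
w = [1 / v for v in vs]
ybar = sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)
Q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, ys))
df = len(studies) - 1
C = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - df) / C)

# Random-effects weights and pooled estimate with a 95% CI on the logit
# scale, back-transformed to a proportion.
w_re = [1 / (v + tau2) for v in vs]
mu = sum(wi * yi for wi, yi in zip(w_re, ys)) / sum(w_re)
se = math.sqrt(1 / sum(w_re))
lo, hi = inv_logit(mu - 1.96 * se), inv_logit(mu + 1.96 * se)
print(f"pooled prevalence {inv_logit(mu):.3f} (95% CI {lo:.3f}-{hi:.3f})")
```

Logit transformation keeps the pooled estimate and its confidence limits inside (0, 1), which matters when individual study prevalences are far from 50%.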
Conclusions
The mental health of Chinese MSM is poor compared with the general population. Efforts are warranted to develop interventions to prevent and alleviate mental health problems among this vulnerable population.
Introduction: Cases of anaphylaxis in children are often not appropriately managed by caregivers. We aimed to develop and test the effectiveness of an education tool to help pediatric patients and their families better understand anaphylaxis and its management and to improve current knowledge and adherence to treatment guidelines. Methods: The GEAR (Guidelines and Educational programs based on an Anaphylaxis Registry) is an initiative that recruits children with food-induced anaphylaxis who have visited the ED at the Montreal Children's Hospital and at The Children's Clinic located in Montreal, Quebec. The patients and parents, together, were asked to complete six questions related to the triggers, recognition and management of anaphylaxis at the time of presentation to the allergy clinic. Participants were automatically shown a 5-minute animated video addressing the main knowledge gaps related to the causes and management of anaphylaxis. At the end of the video, participants were redirected to the same six questions to answer again. To test long-term knowledge retention, the questionnaire will be presented again in one year's time. A paired t-test was used to compare the difference between the baseline score and the follow-up score, based on the percentage of correct answers on the questionnaire. Results: From June to November 2019, 95 pediatric patients with diagnosed food-induced anaphylaxis were recruited. The median patient age was 4.5 years (Interquartile Range (IQR): 1.6–7.4) and half were male (51.6%). The mean questionnaire baseline score was 0.77 (77.0%, standard deviation (sd): 0.16) and the mean questionnaire follow-up score was 0.83 (83.0%, sd: 0.17). There was a significant difference between the follow-up score and baseline score (difference: 0.06, 95% CI: 0.04, 0.09). There were no associations of baseline questionnaire scores or change in scores with age or sex.
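The paired t-test used here compares each participant's follow-up score against their own baseline. A minimal sketch with hypothetical score pairs (not the study data) computes the test statistic directly:

```python
import math

# Hypothetical paired questionnaire scores (fraction of correct answers)
# before and after watching the video — illustrative values only.
baseline  = [0.67, 0.83, 0.50, 0.83, 0.67, 1.00, 0.67, 0.83, 0.50, 0.67]
follow_up = [0.83, 0.83, 0.67, 1.00, 0.67, 1.00, 0.83, 0.83, 0.67, 0.83]

# The paired design reduces to a one-sample t-test on the differences.
diffs = [f - b for b, f in zip(baseline, follow_up)]
n = len(diffs)
mean_d = sum(diffs) / n
sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
se_d = sd_d / math.sqrt(n)
t = mean_d / se_d  # compared to a t distribution with n - 1 df

print(f"mean improvement {mean_d:.3f}, t = {t:.2f} on {n - 1} df")
```

Pairing removes between-subject variability from the comparison, which is why the modest 0.06 mean improvement reported above can still be statistically significant.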
Conclusion: Our video teaching method was successful in educating patients and their families to better understand anaphylaxis. The next step is to acquire long-term follow-up scores to determine retention of knowledge.
Long-term forest dynamics plots in the tropics tend to be situated on stable terrain. This study investigated forest dynamics on the north coast of New Guinea, where active subduction zones are uplifting lowland basins and exposing relatively young sediments to rapid weathering. We examined forest dynamics in relation to disturbance history, topography and soil nutrients based on a partial re-census of the 50-ha Wanang Forest Dynamics Plot in Papua New Guinea. The plot is relatively high in cations and phosphorus but low in nitrogen. Soil nutrients and topography accounted for 29% of variation in species composition but only 4% of variation in basal area. There were few areas of high biomass and most of the forest comprised small-diameter stems. Approximately 18% of the forest was less than 30 y old, and the annual tree mortality rate of nearly 4% was higher than in other tropical forests in South-East Asia and the neotropics. These results support the reputation of New Guinea's forests as highly dynamic, with frequent natural disturbance. Empirical documentation of this hypothesis expands our understanding of tropical forest dynamics and suggests that geomorphology might be incorporated into models of global carbon storage, especially in regions of unstable terrain.
We present the glacier-wide summer surface mass balances determined by a detailed hydrological balance (sSMBhydro) and the quantification of the uncertainties of the calculations on the Argentière and Mer de Glace-Leschaux drainage basins, located in the upper Arve watershed (French Alps), over the period 1996–2004. The spatial distribution of precipitation within the study area was adjusted using in situ winter mass-balance measurements. The sSMBhydro performance was assessed via a comparison with the summer surface mass balances based on in situ glaciological observations (sSMBglacio). Our results show that the sSMBhydro has an uncertainty of ± 0.67 m w.e. a−1 at Argentière and ± 0.66 m w.e. a−1 at Mer de Glace-Leschaux. Estimates of the Argentière sSMBhydro values are in good agreement with the sSMBglacio values. These time series show almost the same interannual variability. From the marked difference between the sSMBhydro and sSMBglacio values for the Mer de Glace-Leschaux glacier, we suspect a significant role of groundwater fluxes in the hydrological balance. This study underlines the importance of taking into account the groundwater transfers to represent and predict the hydro-glaciological behaviour of a catchment.
Predicting recurrent Clostridium difficile infection (rCDI) remains difficult.
METHODS
We employed a retrospective cohort design. Granular electronic medical record (EMR) data had been collected from patients hospitalized at 21 Kaiser Permanente Northern California hospitals. The derivation dataset (2007–2013) included data from 9,386 patients who experienced incident CDI (iCDI) and 1,311 who experienced their first CDI recurrence (rCDI). The validation dataset (2014) included data from 1,865 patients who experienced incident CDI and 144 who experienced rCDI. Using multiple techniques, including machine learning, we evaluated more than 150 potential predictors. Our final analyses evaluated 3 models with varying degrees of complexity and 1 previously published model.
RESULTS
Despite having a large multicenter cohort and access to granular EMR data (eg, vital signs, and laboratory test results), none of the models discriminated well (c statistics, 0.591–0.605), had good calibration, or had good explanatory power.
CONCLUSIONS
Our ability to predict rCDI remains limited. Given currently available EMR technology, improvements in prediction will require incorporating new variables because currently available data elements lack adequate explanatory power.
Introduction: Children with moderate cellulitis are often treated with IV antibiotics in the hospital setting, as per recommendations. Previously in our hospital, a protocol using daily IV ceftriaxone with follow-up at the day treatment center (DTC) was used to avoid admission. In 2013, a new protocol was implemented that suggested the use of high-dose (HD) oral cephalexin with follow-up at the DTC for these patients. The aim of this study was to evaluate the safety and efficacy of the HD cephalexin protocol to treat moderate cellulitis in children as outpatients. Methods: A retrospective chart review was conducted. Children were included if they presented to the ED between January 2014 and 2016, were diagnosed with a moderate cellulitis sufficiently severe to require a follow-up at the DTC, and were treated according to the standard of care with the HD oral cephalexin (100 mg/kg/day) protocol. Descriptive statistics for clinical characteristics of patients upon presentation, as well as for treatment characteristics in the ED and DTC, were analyzed. Treatment failure was defined as: need for admission at the time of DTC evaluation, change to IV treatment at the DTC, or return visit to the ED. Outcomes were compared to historic controls treated with IV ceftriaxone at the DTC, where admission was avoided in 80% of cases. Results: During the study period, 682 children with cellulitis were diagnosed in our ED. Of these, 117 patients were treated using the oral HD cephalexin outpatient protocol. The success rate was 89.5% (102/114); 3 patients had an alternative diagnosis at the DTC. Treatment failure was reported in 12 cases; 10 patients (8.8%) required admission, one (0.9%) received IV antibiotics at the DTC, and one (0.9%) had a return visit to the ED without admission or change to the treatment. This compares favourably with the previous study using IV ceftriaxone (success rate of 80%). No severe deep infections were reported or missed; 4 patients required drainage.
The mean number of visits per patient required at the DTC was 1.6. Conclusion: Treatment of moderate cellulitis requiring follow-up at a DTC using an oral outpatient protocol with HD cephalexin is a safe and effective option. By reducing the hospitalization rate and avoiding the need for painful IV insertion, HD cephalexin is a favourable option in the management of moderate cellulitis for pediatric patients when no criteria of toxicity are present.
This review summarizes the results from the INRA (Institut National de la Recherche Agronomique) divergent selection experiment on residual feed intake (RFI) in growing Large White pigs during nine generations of selection. It discusses the remaining challenges and perspectives for the improvement of feed efficiency in growing pigs. The impacts on growing pigs raised under standard conditions and in alternative situations such as heat stress, inflammatory challenges or lactation have been studied. After nine generations of selection, the divergent selection for RFI led to highly significant (P<0.001) line differences for RFI (−165 g/day in the low RFI (LRFI) line compared with the high RFI line) and daily feed intake (−270 g/day). Low responses were observed on growth rate (−12.8 g/day, P<0.05) and body composition (+0.9 mm backfat thickness, P=0.57; −2.64% lean meat content, P<0.001), with a marked response on feed conversion ratio (−0.32 kg feed/kg gain, P<0.001). Reduced ultimate pH and increased lightness of the meat (P<0.001) were observed in LRFI pigs, with minor impact on the sensory quality of the meat. These changes in meat quality were associated with changes in muscular energy metabolism. Reduced maintenance energy requirements (−10% after five generations of selection) and activity (−21% of time standing after six generations of selection) of LRFI pigs greatly contributed to the gain in energy efficiency. However, the impact of selection for RFI on the protein metabolism of the pig remains unclear. Digestibility of energy and nutrients was not affected by selection, neither for pigs fed conventional diets nor for pigs fed high-fibre diets. A significant improvement of digestive efficiency could likely be achieved by selecting pigs on high-fibre diets. No convincing genetic or blood biomarker has been identified to explain the differences in RFI, suggesting that pigs have various ways to achieve an efficient use of feed.
No deleterious impact of the selection on sow reproduction performance was observed. The resource allocation theory states that low RFI may reduce the ability to cope with stressors, via the reduction of a buffer compartment dedicated to responses to stress. None of the experiments focussed on the response of pigs to stress or challenges could confirm this theory. Understanding the relationships between RFI and responses to stress and energy-demanding processes, such as immunity and lactation, remains a major challenge for a better understanding of the underlying biological mechanisms of the trait and for reconciling the experimental results with the resource allocation theory.
A recent outbreak of Q fever was linked to an intensive goat and sheep dairy farm in Victoria, Australia, 2012-2014. Seventeen employees and one family member were confirmed with Q fever over a 28-month period, including two culture-positive cases. The outbreak investigation and management involved a One Health approach with representation from human, animal, environmental and public health. Seroprevalence in non-pregnant milking goats was 15% [95% confidence interval (CI) 7–27]; active infection was confirmed by positive quantitative PCR on several animal specimens. Genotyping of Coxiella burnetii DNA obtained from goat and human specimens was identical by two typing methods. A number of farming practices probably contributed to the outbreak, with similar precipitating factors to the Netherlands outbreak, 2007-2012. Compared to workers in a high-efficiency particulate arrestance (HEPA) filtered factory, administrative staff in an unfiltered adjoining office and those regularly handling goats and kids had 5·49 (95% CI 1·29–23·4) and 5·65 (95% CI 1·09–29·3) times the risk of infection, respectively, suggesting factory workers were protected from windborne spread of organisms. Reduction in the incidence of human cases was achieved through an intensive human vaccination programme plus environmental and biosecurity interventions. Subsequent non-occupational acquisition of Q fever in the spouse of an employee indicates that infection remains endemic in the goat herd, and remains a challenge to manage without source control.
Fourteen Friedreich patients (F group) who had undergone a first electronystagmogram (E.N.G.), reported in 1978, had the same test 12 to 24 months after the first one. In the second study, there were more patients with bilateral hypoactive caloric nystagmus, failure of fixation suppression, ocular dysmetria, irregular pendulum tracking and ocular flutter. These signs are probably most representative of the progression of the disease. Nineteen unaffected relatives of these patients (H group) also had an electronystagmogram, but no special “familial” electronystagmographic pattern could be identified. Irregular ocular pursuit, nearly invariable in the F group but absent in the H group, was one of the most important differences between patients and their relatives.