The heterogeneity of chronic post-COVID neuropsychiatric symptoms (PCNPS), especially after infection by the Omicron strain, has not been adequately explored.
Aims
To explore the clustering pattern of chronic PCNPS in a cohort of patients having their first COVID infection during the ‘Omicron wave’ and discover phenotypes of patients based on their symptoms’ patterns using a pre-registered protocol.
Method
We assessed 1205 eligible subjects in Hong Kong using app-based questionnaires and cognitive tasks.
Results
Partial network analysis of chronic PCNPS in this cohort produced two major symptom clusters (cognitive complaint–fatigue and anxiety–depression) and a minor headache–dizziness cluster, similar to our pre-Omicron cohort. Participants with high numbers of symptoms could be further grouped into two distinct phenotypes: a cognitive complaint–fatigue predominant phenotype and a phenotype with symptoms across multiple clusters. Multiple logistic regression showed that both phenotypes were predicted by the level of pre-infection deprivation (adjusted P = 0.025 and 0.0054, respectively). The severity of acute COVID (adjusted P = 0.023) and the number of pre-existing medical conditions (adjusted P = 0.003) predicted only the cognitive complaint–fatigue predominant phenotype, and past suicidal ideation predicted only the multiple-cluster phenotype (adjusted P < 0.001). Pre-infection vaccination status did not predict either phenotype.
Conclusions
Our findings suggest that we should pursue a phenotype-driven approach with holistic biopsychosocial perspectives in disentangling the heterogeneity under the umbrella of chronic PCNPS. Management of patients complaining of chronic PCNPS should be stratified according to their phenotypes. Clinicians should recognise that depression and anxiety cannot explain all chronic post-COVID cognitive symptoms.
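The partial network approach used in studies like this can be illustrated with a minimal sketch: estimate the precision (inverse covariance) matrix of symptom scores and standardise it into partial correlations, whose stronger edges define symptom clusters. The simulated data and symptom names below are purely illustrative assumptions, not the study's actual method or data.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: 200 respondents x 5 symptom severity scores (names illustrative)
symptoms = ["fatigue", "cog_complaint", "anxiety", "depression", "headache"]
n, k = 200, len(symptoms)
latent = rng.normal(size=(n, 2))                       # two underlying clusters
loadings = np.array([[1.0, 0.0], [0.9, 0.1], [0.1, 1.0], [0.0, 0.9], [0.3, 0.3]])
X = latent @ loadings.T + rng.normal(scale=0.5, size=(n, k))

# Partial correlation: standardise the inverse covariance (precision) matrix
prec = np.linalg.inv(np.cov(X, rowvar=False))
d = np.sqrt(np.diag(prec))
pcorr = -prec / np.outer(d, d)
np.fill_diagonal(pcorr, 1.0)

# Edges above a threshold define the symptom network
for i in range(k):
    for j in range(i + 1, k):
        if abs(pcorr[i, j]) > 0.2:
            print(f"{symptoms[i]} -- {symptoms[j]}: {pcorr[i, j]:.2f}")
```

In practice, network studies typically add regularisation (e.g. a graphical lasso) and community detection to extract clusters; the thresholding here is only a stand-in for those steps.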
Background: Our prior six-year review (n=2165) revealed that 24% of patients undergoing posterior decompression surgeries (laminectomy or discectomy) sought emergency department (ED) care within three months post-surgery. We established an integrated Spine Assessment Clinic (SAC) to enhance patient outcomes and minimize unnecessary ED visits through pre-operative education, targeted quality improvement (QI) interventions, and early post-operative follow-up. Methods: We reviewed 13 months of posterior decompression data (n=205) following SAC implementation. These patients received individualized, comprehensive pre-operative education and follow-up phone calls within 7 days post-surgery. ED visits within 90 days post-surgery were tracked using provincial databases and compared to our pre-SAC implementation data. Results: Of 205 patients, 24 (11.6%) accounted for 34 ED visits within 90 days post-surgery, a significant reduction in the proportion of patients visiting the ED (from 24% to 11.6%) and in overall ED utilization (from 42.1% to 16.6%, accounting for multiple visits by the same patient). Early interventions, including wound monitoring, outpatient bloodwork, and prescription adjustments for pain management, helped mitigate ED visits. Patient satisfaction surveys (n=62) indicated 92% were “highly satisfied” and 100% would recommend the SAC. Conclusions: The SAC reduced ED visits after posterior decompression surgery by over 50% through pre-operative education, focused QI initiatives, and an individualized, proactive approach.
Background: Adverse effects and risks associated with glucocorticoid (GC) treatment are frequently encountered in immune-mediated neuromuscular disorders. However, significant variability exists in the management of these complications. Our aim was to establish international consensus guidance on the management of GC-related complications in neuromuscular disorders. Methods: An international task force of 15 experts was assembled to develop clinical recommendations for managing GC-related complications in neuromuscular patients. The RAND/UCLA Appropriateness Method (RAM) was employed to formulate consensus guidance statements. Initial statements were drafted following a comprehensive literature review and were refined based on anonymous expert feedback, with up to three rounds of email voting to achieve consensus. Results: Consensus was reached on statements addressing general patient care, monitoring during GC therapy, osteoporosis prevention, vaccinations, infection screening, and prophylaxis for Pneumocystis jirovecii pneumonia. A multidisciplinary approach to managing GC-related complications was highlighted as a key recommendation. Conclusions: This is the first consensus guidance in the neurological literature on GC complications; it offers clinicians structured guidance on mitigating and managing common adverse effects associated with both short- and long-term GC use, and provides a foundation for future debate, quality improvement and research in this area.
Widespread disasters can obstruct all external supports and isolate hospitals. This report aimed to extract key preparedness measures from one such hospital in Australia, which was flood-affected and cut off from surrounding supports.
Methods
Nine interviews were conducted with key personnel involved in a flood-affected hospital’s evacuation and field hospital setup, and a narrative analysis of interview transcripts, meeting notes, and published accounts of the evacuation was performed to highlight important preparedness measures for other hospitals.
Results
Findings indicate hospitals should compile a comprehensive list of resources needed to set up a field hospital. The analysis highlighted the importance of effective patient communication and in-transit tracking for safe evacuation, and revealed that staff can be better prepared if trained to expect disruptions and initiate pre-evacuation discharges.
Conclusions
The increase in climate change-driven extreme weather events requires a proportional increase in hospitals’ ability to respond and adapt. This report points to key measures that can prepare hospitals to move their patients to improvised makeshift field facilities if no external support is available.
Preliminary evidence suggests that a ketogenic diet may be effective for bipolar disorder.
Aims
To assess the impact of a ketogenic diet in bipolar disorder on clinical, metabolic and magnetic resonance spectroscopy outcomes.
Method
Euthymic individuals with bipolar disorder (N = 27) were recruited to a 6- to 8-week single-arm open pilot study of a modified ketogenic diet. Clinical, metabolic and MRS measures were assessed before and after the intervention.
Results
Of 27 recruited participants, 26 began and 20 completed the ketogenic diet. For participants completing the intervention, mean body weight fell by 4.2 kg (P < 0.001), mean body mass index fell by 1.5 kg/m² (P < 0.001) and mean systolic blood pressure fell by 7.4 mmHg (P = 0.041). Participants’ average baseline and follow-up assessments remained within the euthymic range, with no statistically significant changes in the Affective Lability Scale-18, Beck Depression Inventory or Young Mania Rating Scale. In participants providing reliable daily ecological momentary assessment data (n = 14), there was a positive correlation between daily ketone levels and self-rated mood (r = 0.21, P < 0.001) and energy (r = 0.19, P < 0.001), and an inverse correlation between ketone levels and both impulsivity (r = −0.30, P < 0.001) and anxiety (r = −0.19, P < 0.001). On MRS, brain glutamate plus glutamine concentration decreased by 11.6% in the anterior cingulate cortex (P = 0.025) and by 13.6% in the posterior cingulate cortex (P < 0.001).
Conclusions
These findings suggest that a ketogenic diet may be clinically useful in bipolar disorder, for both mental health and metabolic outcomes. Replication and randomised controlled trials are now warranted.
Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support the view that controls, MDD patients and BPD patients lie primarily on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
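The core of a polygenic risk score analysis like the one described can be sketched in a few lines: weight each variant's allele dosage by its GWAS effect size, sum across variants, and measure how well the resulting score separates two diagnostic groups. Everything below (effect sizes, genotypes, group labels) is simulated for illustration; it is not the consortium's actual pipeline or data.

```python
import numpy as np

rng = np.random.default_rng(1)
n_snps, n_people = 100, 400
# Toy per-SNP effect sizes (log odds ratios) from a hypothetical case-case GWAS
betas = rng.normal(scale=0.05, size=n_snps)
freqs = rng.uniform(0.1, 0.9, size=n_snps)

def simulate(shift):
    # Genotype dosages 0/1/2; 'shift' nudges risk-allele frequency for one group
    f = np.clip(freqs + shift * np.sign(betas) * 0.05, 0.01, 0.99)
    return rng.binomial(2, f, size=(n_people, n_snps))

group_a = simulate(+1.0)   # toy genotypes enriched for risk alleles
group_b = simulate(0.0)    # toy comparison genotypes

# PRS: weighted sum of risk-allele dosages
prs_a = group_a @ betas
prs_b = group_b @ betas

# Discrimination (AUC) via the rank-based Mann-Whitney statistic
scores = np.concatenate([prs_a, prs_b])
labels = np.concatenate([np.ones(n_people), np.zeros(n_people)])
ranks = scores.argsort().argsort() + 1
auc = (ranks[labels == 1].sum() - n_people * (n_people + 1) / 2) / (n_people ** 2)
print(f"toy AUC distinguishing groups: {auc:.2f}")
```

Real PRS pipelines additionally handle linkage disequilibrium (clumping or shrinkage), P-value thresholds, and ancestry matching; this sketch shows only the scoring and evaluation step.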
Two algorithms based on a latent class model are presented for discovering hierarchical relations that exist among a set of K dichotomous items. The two algorithms, stepwise forward selection and backward elimination, incorporate statistical criteria for selecting (or deleting) 0-1 response pattern vectors to form the subset of the total possible 2^K vectors that uniquely describe the hierarchy. The performances of the algorithms are compared, using computer-constructed data, with those of three competing deterministic approaches based on ordering theory and the calculation of Phi/Phi-max coefficients. The discovery algorithms are also demonstrated on real data sets investigated in the literature.
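A greatly simplified sketch of the forward-selection idea: starting from a minimal set of permitted response patterns, greedily add whichever of the 2^K candidate patterns most improves a penalized log-likelihood under a simple item-level response-error model. The equal mixture weights, fixed error rate and BIC-style penalty below are simplifying assumptions for illustration, not the paper's actual statistical criteria.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(2)
K = 4
# Toy data: a linear hierarchy permits only the "prefix" patterns
# 0000, 1000, 1100, 1110, 1111
true_patterns = [tuple(int(j < m) for j in range(K)) for m in range(K + 1)]

def noisy(p, e=0.1):
    # Flip each item response with probability e
    return tuple(b ^ (rng.random() < e) for b in p)

data = [noisy(true_patterns[rng.integers(len(true_patterns))]) for _ in range(500)]

E = 0.1  # assumed item response-error rate (a simplification)

def loglik(patterns):
    # Mixture over permitted 'true' patterns, equal weights, flip-error model
    ll = 0.0
    for x in data:
        probs = [np.prod([E if xi != pi else 1 - E for xi, pi in zip(x, p)])
                 for p in patterns]
        ll += np.log(sum(probs) / len(patterns))
    return ll

# Greedy forward selection over all 2^K candidates with a BIC-style penalty
candidates = set(product([0, 1], repeat=K))
selected = [(0,) * K, (1,) * K]          # start from the trivial endpoints
penalty = 0.5 * np.log(len(data))
best = loglik(selected) - penalty * len(selected)
improved = True
while improved:
    improved = False
    for c in candidates - set(selected):
        score = loglik(selected + [c]) - penalty * (len(selected) + 1)
        if score > best:
            best, best_c, improved = score, c, True
    if improved:
        selected.append(best_c)
print(sorted(selected))
```

Backward elimination would run the same loop in reverse, starting from all 2^K patterns and deleting the one whose removal most improves the penalized criterion.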
Goodman contributed to the theory of scaling by including a category of intrinsically unscalable respondents in addition to the usual scale-type respondents. However, his formulation permits only error-free responses by respondents from the scale types. This paper presents new scaling models which have the properties that: (1) respondents in the scale types are subject to response errors; (2) a test of significance can be constructed to assist in deciding on the necessity for including an intrinsically unscalable class in the model; and (3) when an intrinsically unscalable class is not needed to explain the data, the model reduces to a probabilistic, rather than to a deterministic, form. Three data sets are analyzed with the new models and are used to illustrate stages of hypothesis testing.
A probabilistic model for the validation of behavioral hierarchies is presented. Estimation is by means of iterative convergence to maximum likelihood estimates, and two approaches to assessing the fit of the model to sample data are discussed. The relation of this general probabilistic model to other more restricted models which have been presented previously is explored and three cases of the general model are applied to exemplary data.
This paper provides a description of a new adaptive testing algorithm based on a latent class modeling framework. The algorithm incorporates a four-stage iterative procedure that conditionally minimizes expected loss in classification of respondents across different content domains. The classification decisions relate to the membership of a person in a category of a latent variable for each of the separate domains considered. The algorithm appears to be particularly effective when latent class membership is related across the various domains of interest, since classification decisions on domains assessed early in the process are used to revise the probabilities for latent class membership on domains for which classification decisions have not yet been made. To assess the effectiveness of the proposed algorithm, a simulation based upon real data was conducted. For this example, the algorithm proved to be relatively efficient, requiring only 40% of the number of items needed under a nonadaptive approach. In addition, the algorithm provided classification decisions which, in 96% of the cases, were consistent with decisions based upon all available items when the maximum acceptable classification error rate was set at 5%.
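The stopping logic such an adaptive algorithm relies on can be illustrated with a single-domain sketch: update the latent class posterior item by item via Bayes' rule and stop as soon as the posterior for one class exceeds 1 minus the acceptable classification error rate. The two-class setup and item parameters below are illustrative assumptions, not the paper's four-stage, multi-domain procedure.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two latent classes ("master"/"nonmaster"); P(correct | class) per item (illustrative)
p_correct = np.array([[0.9, 0.85, 0.8, 0.9, 0.85, 0.8],   # class 0: master
                      [0.3, 0.25, 0.2, 0.3, 0.25, 0.2]])  # class 1: nonmaster

def classify(responses, prior=(0.5, 0.5), max_error=0.05):
    """Administer items sequentially; stop once the posterior for one class
    exceeds 1 - max_error (the acceptable classification error rate)."""
    post = np.array(prior, dtype=float)
    for i, x in enumerate(responses):
        like = np.where(x == 1, p_correct[:, i], 1 - p_correct[:, i])
        post = post * like
        post /= post.sum()
        if post.max() >= 1 - max_error:
            return int(post.argmax()), i + 1  # decision, items used
    return int(post.argmax()), len(responses)

# Simulate a respondent from the master class and classify adaptively
truth = 0
resp = (rng.random(p_correct.shape[1]) < p_correct[truth]).astype(int)
decision, n_items = classify(resp)
print(decision, n_items)
```

In the multi-domain version described in the paper, a decision on one domain would also revise the priors for classes in domains not yet assessed; that cross-domain updating is what drives the reported item savings.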
Palmer amaranth with resistance to dicamba, glufosinate, and protoporphyrinogen oxidase inhibitors has been documented in several southern states. With extensive use of these and other herbicides in South Carolina, a survey was initiated in fall 2020 and repeated in fall 2021 and 2022 to determine the relative response of Palmer amaranth accessions to selected preemergence and postemergence herbicides. A greenhouse screening experiment was conducted in which accessions were treated with three preemergence (atrazine, S-metolachlor, and isoxaflutole) and six postemergence (glyphosate, thifensulfuron-methyl, fomesafen, glufosinate, dicamba, and 2,4-D) herbicides at the 1× and 2× use rates. Herbicides were applied shortly after planting (preemergence) or at the 2- to 4-leaf growth stage (postemergence). Percent survival was evaluated 5 to 14 d after application depending on herbicide activity. Sensitivity to atrazine applied preemergence was reduced in 49 and 33 of 115 accessions at the 1× and 2× rates, respectively. Most accessions (90%) were controlled by isoxaflutole applied preemergence at the 1× rate. Response to S-metolachlor applied preemergence indicated that 34% of the Palmer amaranth accessions survived the 1× rate (>60% survival). Eleven accessions exhibited reduced sensitivity to fomesafen applied postemergence; however, their survival percentages were not different from those of the 0% survivor group. Glyphosate applied postemergence at the 1× rate did not control most accessions (79%). Palmer amaranth response to thifensulfuron-methyl applied postemergence varied across accessions, with only 36% and 28% controlled at the 1× and 2× rates, respectively. All accessions were controlled by 2,4-D, dicamba, or glufosinate applied postemergence. Palmer amaranth accessions from this survey exhibited reduced susceptibility to several herbicides commonly used in agronomic crops in South Carolina.
Therefore, growers should use multiple management tactics to minimize the evolution of herbicide resistance in Palmer amaranth in South Carolina.
Food choice is complex and is heavily influenced by the environment one lives in(1). Pacific Island food environments, including those in Tonga, have changed considerably in recent years, making healthier food choice more challenging(2). A widespread nutrition transition across the region has contributed to an increase in the availability of, and accessibility to, highly processed foods, and to high rates of diet-related non-communicable diseases(3). While system change is needed to support the availability, accessibility and affordability of healthy foods, nutrition education plays an important role in supporting individuals, communities and populations to navigate their rapidly changing food environments and to encourage healthy food choice and behaviour change. Approaches to nutrition education in the Pacific Islands region vary and do not always consider the socio-cultural aspects of the food environment, especially when focusing on fruit and vegetable consumption. This work was driven by an intent to develop contextually appropriate nutrition education plans using a structured process, Design Online(4). However, developing a nutrition education plan requires a critical analysis of the current motivators and facilitators of the behaviour. The scientific literature offers limited information on determinants of food choice within Tonga, and more broadly within the Pacific Islands context. Therefore, the aim of this qualitative study was to explore and document the motivating and facilitating determinants of fruit and vegetable consumption in Tonga. Data collection took place during August and September 2023 on the main island of Tongatapu. Semi-structured interviews (n = 5 men, 3 women) and a focus group (n = 4 women), chosen as the most appropriate methods of engaging with participants, were conducted in Tongan.
Guiding questions were derived from Design Online and organised within the following categories: motivating determinants and facilitating determinants. Interview responses were qualitatively analysed using an inductive content analysis model. Key categories for motivating determinants included health and nutrition knowledge, normal consumption patterns, availability and access, production, financial considerations, and preferences, perceptions and practices. Key categories for facilitating determinants included education, community engagement, environmental factors, food preference, finance, and accessibility. While this work explored motivating and facilitating determinants of fruit and vegetable consumption in Tonga with a small sample, it makes an important contribution to the limited literature. The findings of this study can be used to underpin activities such as the design of nutrition education plans, and provide a foundation for further exploration of determinants of food choice. This study was undertaken on the main island of Tongatapu, but it would be of interest to explore determinants with communities living in the outer islands, and at different time points during the year to reflect seasonality.
Cancer health research relies on large-scale cohorts to derive generalizable results for different populations. While traditional epidemiological cohorts often use costly random sampling or self-motivated, preselected groups, a shift toward health system-based cohorts has emerged. However, such cohorts depend on participants remaining within a single system. Recent consumer engagement models using smartphone-based communication, driving projects, and social media have begun to upend these paradigms.
Methods:
We initiated the Healthy Oregon Project (HOP) to support basic and clinical cancer research. The HOP study employs a novel, cost-effective remote recruitment approach to establish a large-scale cohort for population-based studies. Recruitment leverages a unique email account, the HOP website, and social media platforms to direct smartphone users to the study app, which facilitates saliva sample collection and survey administration. Monthly newsletters further support engagement and outreach to broader communities.
Results:
By the end of 2022, HOP had enrolled approximately 35,000 participants aged 18–100 years (median = 44.2 years), comprising more than 1% of the Oregon adult population. Among those with app access, ∼87% provided consent to genetic screening. The HOP monthly email newsletters have an average open rate of 38%. Efforts continue to improve survey response rates.
Conclusion:
This study underscores the efficacy of remote recruitment approaches in establishing large-scale cohorts for population-based cancer studies. The implementation of the study facilitates the collection of extensive survey and biological data into a repository that can be broadly shared and supports collaborative clinical and translational research.
Schizophrenia (SCZ) is a neuropsychiatric disorder with strong genetic heritability and predicted genetic heterogeneity, but limited knowledge regarding the underlying genetic risk variants. Classification into phenotype-driven subgroups or endophenotypes is expected to facilitate genetic analysis. Here, we report a teen boy with chronic psychosis and cerebellar hypoplasia (CBLH) and analyze data on 16 reported individuals with SCZ or chronic psychosis not otherwise specified associated with cerebellar hypoplasia to look for shared features.
Participants and Methods:
We evaluated an 18-year-old boy with neurodevelopmental deficits from early childhood and onset of hallucinations and other features of SCZ at 10 years who had mild vermis-predominant CBLH on brain imaging. This prompted us to review prior reports of chronic psychosis or SCZ with cerebellar malformations using paired search terms including (1) cerebellar hypoplasia, Dandy-Walker malformation, Dandy-Walker variant, or mega-cisterna magna with (2) psychosis or SCZ. We found reports of 16 affected individuals from 13 reports. We reviewed clinical features focusing on demographic information, prenatal-perinatal history and neuropsychiatric and neurodevelopmental phenotypes, and independently reviewed brain imaging features.
Results:
All 17 individuals had classic psychiatric features of SCZ or chronic psychosis as well as shared neurodevelopmental features not previously highlighted, including a downward shift in IQ of about 20 points, memory impairment, speech-language deficits, attention deficits and sleep disturbances. Brain imaging consistently showed posterior vermis-predominant CBLH with variable cerebellar hemisphere hypoplasia and an enlarged posterior fossa (a.k.a. mega-cisterna magna). None had features of classic Dandy-Walker malformation.
Conclusions:
In 17 individuals with chronic psychosis or SCZ and cerebellar malformation, we found a high frequency of neurodevelopmental disorders, a consistent brain malformation consisting of posterior vermis-predominant (and usually symmetric) CBLH, and no evidence of prenatal risk factors. The consistent phenotype and lack of prenatal risk factors for CBLH leads us to hypothesize that psychosis or schizophrenia associated with vermis predominant CBLH comprises a homogeneous subgroup of individuals with chronic psychosis/schizophrenia that is likely to have an underlying genetic basis. No comprehensive targeted gene panel for CBLH has yet been defined, leading us to recommend trio-based exome sequencing for individuals who present with this combination of features.
Load balancing of constrained healthcare resources has become a critical aspect of assuring access to care during periods of pandemic-related surge. Such surges involve patient influxes, staffing shortages, and limited access to specialty resources. This research focuses on the creation and work of a novel statewide coordination center, the Washington Medical Coordination Center (WMCC), whose primary goal is the load balancing of patients across the healthcare continuum of Washington State.
Methods:
This article discusses the origins, development, and operations of the WMCC including key partners, cooperative agreements, and structure necessary to create a patient load balancing system on a statewide level.
Results:
As of April 21, 2022, the WMCC received 3821 requests from Washington State hospitals. Nearly 90% were received during the pandemic surge. Nearly 75% originated from rural hospitals that are most often limited in their ability to transfer patients when referral centers are also overwhelmed.
Conclusions:
The WMCC served as an effective tool for patient load balancing during the COVID-19 pandemic surge in Washington State, and has been shown to be an equity-enhancing, cost-effective means of managing healthcare surge events across a broad geographic region.
Recent evidence from case reports suggests that a ketogenic diet may be effective for bipolar disorder. However, no clinical trials have been conducted to date.
Aims
To assess the recruitment and feasibility of a ketogenic diet intervention in bipolar disorder.
Method
Euthymic individuals with bipolar disorder were recruited to a 6–8 week trial of a modified ketogenic diet, and a range of clinical, economic and functional outcome measures were assessed. Study registration number: ISRCTN61613198.
Results
Of 27 recruited participants, 26 commenced and 20 completed the modified ketogenic diet for 6–8 weeks. The outcomes data-set was 95% complete for daily ketone measures, 95% complete for daily glucose measures and 95% complete for daily ecological momentary assessment of symptoms during the intervention period. Mean daily blood ketone readings were 1.3 mmol/L (s.d. = 0.77, median = 1.1) during the intervention period, and 91% of all readings indicated ketosis, suggesting a high degree of adherence to the diet. Over 91% of daily blood glucose readings were within normal range, with 9% indicating mild hypoglycaemia. Eleven minor adverse events were recorded, including fatigue, constipation, drowsiness and hunger. One serious adverse event was reported (euglycaemic ketoacidosis in a participant taking SGLT2-inhibitor medication).
Conclusions
The recruitment and retention of euthymic individuals with bipolar disorder to a 6–8 week ketogenic diet intervention was feasible, with high completion rates for outcome measures. The majority of participants reached and maintained ketosis, and adverse events were generally mild and modifiable. A future randomised controlled trial is now warranted.
Cryobanking is a major component of today’s assisted reproductive technologies (ART). As Reproductive Biologists and Cryogenic Specialists, we are not only responsible for the accurate labelling, witnessing and use of cryopreserved specimens (the subject of other chapters in this text) but must also ensure their safe and secure long-term storage. Based on a heightened awareness of actual and experimental tank failures, we will outline and discuss the critical components of effective quality management for cryostorage.
The evolutionary branch from early primates to human beings dates back about 400 000 years (personal communication, Dr Lee Silver, Princeton University), and in that time man has endured an “ice age.” In more recent history, man’s interest in cellular responses to freezing temperatures has been primarily concerned with defense against them. The first cells examined microscopically were sperm, observed by van Leeuwenhoek in 1677; in 1827, Karl Ernst von Baer discovered the egg [1]. However, it was a very different and difficult concept for the public to grasp that human life can start with two microscopic gametes and that these cells can be frozen and survive thawing [2].
Intravenous (IV) anesthetics were first discovered for their clinical utility in 1656 by Sir Christopher Wren, an architect, physicist, and astronomer at the University of Oxford, while using a goosequill to inject opium into a dog to produce sleep [1]. In 1909, Ludwig Burkhardt became the first surgeon to deliberately use IV ether in a 5% solution to sedate patients for head and neck surgery, finding that a higher concentration caused thrombophlebitis and hemolysis, whereas a lower concentration proved too weak a sedative. The first barbiturate, hexobarbital, was introduced in 1932 and had been used in over 10 million cases by 1944. In 1989, the first propofol lipid emulsion formulation was launched in the United States, marking the beginning of the modern age of IV sedation pharmacology [2].