Developing integrated mental health services focused on the needs of children and young people is a key policy goal in England. The THRIVE Framework and its implementation programme, i-THRIVE, are widely used in England. This study examines experiences of staff using i-THRIVE, estimates its effectiveness, and assesses how local system working relationships influence programme success.
Methods
This evaluation uses a quasi-experimental design (10 implementation and 10 comparison sites). Measurements included staff surveys and assessment of ‘THRIVE-like’ features of each site. Additional site-level characteristics were collected from health system reports. The effect of i-THRIVE was evaluated using a four-group propensity-score-weighted difference-in-differences model; the moderating effect of system working relationships was evaluated with a difference-in-difference-in-differences model.
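To illustrate the analytic approach, the sketch below shows a generic propensity-score-weighted difference-in-differences estimate in Python. It is a simplified illustration, not the study's own code: the site covariates, outcome column, and data file are hypothetical, and the four-group weighting scheme is reduced to standard inverse-probability-of-treatment weights.

```python
# Hypothetical sketch of a propensity-score-weighted difference-in-differences
# model; all column names and the data file are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("site_panel.csv")  # assumed: one row per site per period, with
                                    # site_id, treated (0/1), post (0/1), fidelity

# 1. Propensity score: probability of being an i-THRIVE implementation site,
#    estimated from baseline (pre-period) site characteristics.
baseline = df[df["post"] == 0]
ps_model = smf.logit("treated ~ caseload + deprivation + urban", data=baseline).fit()
baseline = baseline.assign(ps=ps_model.predict(baseline))

# 2. Inverse-probability weights (ATE form), carried to both periods of each site.
baseline["ipw"] = np.where(baseline["treated"] == 1,
                           1 / baseline["ps"],
                           1 / (1 - baseline["ps"]))
df = df.merge(baseline[["site_id", "ipw"]], on="site_id")

# 3. Weighted difference-in-differences: the treated:post coefficient is the
#    estimated effect of implementation on 'THRIVE-like' fidelity scores.
did = smf.wls("fidelity ~ treated * post", data=df, weights=df["ipw"]).fit()
print(did.summary())
```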
Results
Implementation site staff were more likely to report using THRIVE and were more knowledgeable about THRIVE principles than comparison site staff. The mean improvement in fidelity scores was 16.7 among i-THRIVE sites and 8.8 among comparison sites; the weighted model did not find a statistically significant difference. However, results show that strong working relationships in the local system significantly enhance the effectiveness of i-THRIVE. Sites with highly effective working relationships showed a notable improvement in ‘THRIVE-like’ features, with an average increase of 16.41 points (95% confidence interval: 1.69 to 31.13, P-value: 0.031) over comparison sites. Sites with ineffective working relationships did not benefit from i-THRIVE (−2.76, 95% confidence interval: −18.25 to 12.73, P-value: 0.708).
Conclusions
The findings underscore the importance of working relationship effectiveness in the successful adoption and implementation of multi-agency health policies like i-THRIVE.
Multiple pregnancy affects 0.9–3.1% of births worldwide. Prevalence rates vary significantly due to differences in dizygotic twinning rates and use of assisted reproduction. Both maternal and fetal/neonatal complications are more common in multiple than in singleton pregnancies, and there are specific problems for the fetuses related to monochorionicity. Multiple pregnancies require specialised and individualised care. Complicated multiple pregnancies should be managed in a tertiary care centre where there is additional expertise, such as the laser ablation needed to treat monochorionic monozygotic pregnancies with conjoined circulations. Cornerstones of management in pregnancy are the need for accurate fetal measurement to optimise dating of gestational age, and documentation of chorionicity. High-level ultrasound expertise is needed. Mothers need frequent assessment to detect hypertension and anaemia, as well as early identification and management of preterm labour.
This Element aims to deepen our understanding of how the fields of multilingualism, second language acquisition and minority language revitalisation have largely overlooked the question of queer sexual identities among speakers of the languages under study. Based on case studies of four languages experiencing differing degrees of minoritisation – Irish, Breton, Catalan and Welsh – it investigates how queer people navigate belonging within the binary of speakers/non-speakers of minoritised languages while also maintaining their queer identities. Furthermore, it analyses how minoritised languages are dealing linguistically with the growing need for 'gender-fair' or 'gender-neutral' language. The marginalisation of queer subjects in these strands of linguistics can be traced to the historical dominance of the Fishmanian model of 'Reversing Language Shift' (RLS), which assumed the importance of the deeply heteronormative model of 'intergenerational transmission' of language as fundamental to language revitalisation contexts.
A set of 68 simple sequence repeat (SSR) markers was selected from existing databases (including Medicago, soybean, cowpea and peanut) for the purpose of exploiting the transferability of SSRs across species and/or genera within the legume family. Primers were tested for cross-species and cross-genus fragment amplification with an array of 24 different legume accessions. Nearly one-third (30.78%) of the SSR primers screened generated reproducible and cross-genus amplicons. One hundred and seventeen cross-species polymorphic amplicons were identified and could be used as DNA markers. These polymorphic markers are now being used for characterization and evaluation of our collected and donated legume germplasm. The transferability of SSRs, mis-/multiple-primings, homologous/heterologous amplifications, single/multiple amplicons and the application of these amplicons as DNA markers are discussed. The transfer of SSR markers across species or across genera can be a very efficient approach for DNA marker development, especially for minor crops.
Passive oxygenation with non-rebreather face mask (NRFM) has been used during cardiac arrest as an alternative to positive pressure ventilation (PPV) with bag-valve-mask (BVM) to minimize chest compression disruptions. A dual-channel pharyngeal oxygen delivery device (PODD) was created to open obstructed upper airways and provide oxygen at the glottic opening. It was hypothesized for this study that the PODD can deliver oxygen as efficiently as BVM or NRFM and oropharyngeal airway (OPA) in a cardiopulmonary resuscitation (CPR) manikin model.
Methods:
Oxygen concentration was measured in test lungs within a resuscitation manikin. These lungs were modified to mimic physiologic volumes, expansion, collapse, and recoil. Automated compressions were administered. Five trials were performed for each of five arms: (1) CPR with 30:2 compression-to-ventilation ratio using BVM with 15 liters per minute (LPM) oxygen; continuous compressions with passive oxygenation using (2) NRFM and OPA with 15 LPM oxygen, (3) PODD with 10 LPM oxygen, (4) PODD with 15 LPM oxygen; and (5) control arm with compressions only.
Results:
Mean peak oxygen concentrations were: (1) 30:2 CPR with BVM 49.3% (SD = 2.6%); (2) NRFM 47.7% (SD = 0.2%); (3) PODD with 10 LPM oxygen 52.3% (SD = 0.4%); (4) PODD with 15 LPM oxygen 62.7% (SD = 0.3%); and (5) control 21% (SD = 0%). Oxygen concentrations rose rapidly and remained steady with passive oxygenation, unlike 30:2 CPR with BVM, which rose after each ventilation and decreased until the next ventilation cycle (sawtooth pattern, mean concentration 40% [SD = 3%]).
Conclusions:
Continuous compressions and passive oxygenation with the PODD resulted in higher lung oxygen concentrations than NRFM and BVM while minimizing CPR interruptions in a manikin model.
Much has been made of the decline in population mental health during the COVID-19 pandemic, but most studies show this merely exacerbated a long-term trend. This has predominantly been attributed to changes in adolescent mental health over the past decade, but there has been little evaluation of whether this post-Millennium cohort was the first to demonstrate such a decline.
Objectives
This study investigates to what extent mental health differs in people born in different decades – i.e., possible birth cohort differences in the mental health of the population over the past two decades. To remove the linear dependency and identify any differences in trends between cohorts, we model mental health for each cohort as a nonlinear smooth function of age in an age-cohort model.
Methods
This analysis draws on 20 annual waves of the Household, Income and Labour Dynamics in Australia (HILDA) survey, a nationally representative household panel that commenced in 2001 with 13,969 participants. The birth cohort of each person was defined by the decade of their birth year (1940s, 1950s, etc.). Mental ill health was assessed with the MHI-5 from the SF-36 in each wave, and with the K10 in alternate waves. We estimate and compare penalized smooth trends in mental health for each cohort, fitted by restricted maximum likelihood (REML) within a generalized additive mixed model (GAMM). Cohort effects are captured by directly estimating the differences between the smooth age trends of adjacent cohorts.
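The cohort comparison might be sketched as follows in Python. This is a deliberately simplified illustration using pygam: it fits a penalized smooth of age to each birth cohort separately and differences the fitted curves on a shared age grid, omitting the random effects and REML estimation of the full GAMM; the data file and column names are hypothetical.

```python
# Hypothetical sketch: penalized smooth age trends per birth cohort, then
# differences between adjacent cohorts on a shared age grid. A simplification
# of the study's REML-fitted GAMM; column names are illustrative assumptions.
import numpy as np
import pandas as pd
from pygam import LinearGAM, s

df = pd.read_csv("hilda_long.csv")  # assumed columns: person_id, age, mhi5, cohort

ages = np.linspace(15, 80, 131).reshape(-1, 1)
curves = {}
for cohort, grp in df.groupby("cohort"):
    gam = LinearGAM(s(0)).fit(grp[["age"]].values, grp["mhi5"].values)
    curves[cohort] = gam.predict(ages)

# Difference between adjacent cohorts at each age (e.g. 1990s minus 1980s).
# In practice, compare only over ages at which both cohorts are actually observed.
cohorts = sorted(curves)
for younger, older in zip(cohorts[1:], cohorts[:-1]):
    diff = curves[younger] - curves[older]
    print(younger, "vs", older, "mean difference:", diff.mean().round(2))
```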
Results
Later cohorts had poorer mental health and higher distress, were more likely to be single and unemployed, and were less likely to be chronically ill or disabled. Mental health was worse for younger age groups in each survey year, and this discrepancy was much greater in more recent surveys, consistent with a birth cohort effect. Millennials (those born in the early 1990s) had lower scores at the same age than earlier generations, and later cohorts did not show the age-related improvement seen in earlier cohorts as they aged. At age 30, the average MHI-5 score of those born in the 1990s was 67, compared with 72.5 and 74 for people born in the 1980s and 1970s, respectively.
Conclusions
The deterioration in mental health over time, which has been reported in large cross-sectional surveys, likely reflects cohort-specific effects related to the experiences of young people born in the Millennial generation and, to a lesser extent, those from the immediately prior cohort born in the 1980s. We need to understand whether later cohorts are less resilient to risk factors similar to those experienced by earlier cohorts, or whether they experience more risks, and/or risks of greater severity, for mental ill health. Such evidence is critical if the deteriorating pattern of mental health is to be arrested.
Irritable bowel syndrome (IBS) is a chronic and relapsing gastrointestinal condition which negatively impacts quality of life (1). Dietary triggers are common and dietary management is central to the IBS treatment pathway, with dietitians being the main education providers for patients (2). The aim of this study was to explore the perceptions of dietitians towards current practices in IBS services in clinical settings across the UK.
Qualitative semi-structured interviews were undertaken to explore current practices, barriers, and facilitators to dietetic practice and expected treatment outcomes. Eligible participants were dietitians specialising in IBS and working in the National Health Service (NHS) in the UK. Interviews were held virtually; audio was recorded and transcribed using intelligent transcription. Data were analysed using template analysis (3).
Thirteen dietitians (n=12 female) specialising in gastroenterology consented to participate in the study. Dietitians were working in various NHS Trusts across the country (Southeast England n=3; Southwest England n=3; Northwest England n=2; Northeast England n=1; West Midlands n=1; Southwest Wales n=1; and Southcentral Scotland n=2). Ten out of 13 dietitians had more than five years of experience in IBS management. Three main themes emerged: 1) Dietetic services as part of IBS referral pathways; 2) Practices in relation to dietetic services; and 3) Implications of services on patients’ expectations and feelings. Each main theme had subthemes to facilitate the description and interpretation of data. Dietitians reported an increasing number of IBS referrals, the need for accurate and timely IBS diagnosis and for specialist IBS dietitians, and the use of digital innovation to facilitate practice and access to dietetic care. The use of the internet as a source of (mis)information by patients and the limited time available for educating patients were identified as potential barriers to dietetic practice. Dietitians follow a patient-centred approach to dietary counselling and recognise the negative implications of IBS-related stigma perceived by patients on their feelings and treatment expectations.
The study identified areas and practices which can facilitate access to dietetic services and patient-centred care in IBS management as outlined in guidelines (4).
OBJECTIVES/GOALS: Super refractory status epilepticus (SRSE) is associated with high mortality, often due to withdrawal of life sustaining therapy (WLST) based on perceived poor neurological prognosis. Factors influencing decision making are underreported and poorly understood. We surveyed clinicians who treat SRSE to identify factors that influence WLST. METHODS/STUDY POPULATION: Health care providers (HCP), including physicians, pharmacists, and advanced practice providers, who treat SRSE answered a 51-question survey on respondent demographics, institutional characteristics and SRSE management that was distributed through professional societies. Respondents described approaches to prognostication and rated the importance of clinical factors in the management of two hypothetical clinical cases, followed by their prediction of recovery potential for the same two cases. Neurointensivist and other HCP responses were compared using descriptive statistics to differentiate group characteristics; a p-value <0.05 was considered significant. Logistic regression models were employed to identify associations between clinician-specific factors and prognostication. RESULTS/ANTICIPATED RESULTS: One hundred and sixty-four respondents were included in the analysis. Compared to other HCPs (neurologists, epileptologists, neurosurgeons, other intensivists; n=122, 74%), neurointensivists (n=42, 26%) were less likely to use prognostic severity scores [odds ratio (OR) 0.30, 95% confidence interval (CI) 0.14–0.68, p=.004] and less likely to prognosticate the likelihood of good functional recovery [OR 0.28, 95% CI 0.13–0.62, p=.002], controlling for potential confounders including professional degree, years of experience, country of practice, and annual volume of SRSE cases. There was, however, significant overlap in factors deemed necessary for determining futility in care escalation. DISCUSSION/SIGNIFICANCE: Neurointensivists value clinical factors similar to those valued by other HCPs when evaluating medical futility in SRSE but are less likely to predict definitive outcomes. Pending final survey results, future studies aimed at understanding why neurointensivists may be less likely to decisively prognosticate (i.e., avoiding nihilism) in SRSE may be warranted.
Submicron-sized (~3–60 nm) powders of Al-substituted magnetite were synthesized in the laboratory by precipitation methods by mixing appropriate molar volumes of FeCl2, FeCl3 and AlCl3 solutions and precipitating with 20% NH4OH. Precipitates were dialyzed for 48 hr to remove excess salts and then freeze-dried. The nominal Al mole fractions [Als = Al/(Al + Fe)] in the initial precipitate ranged from 0.001 to 0.42. Portions of the resulting powders were heated sequentially in air at 400° and 500°C. Powders were examined using X-ray diffraction (XRD), transmission electron microscopy (TEM), and visible and near-IR reflectance spectroscopy. Solubilities were determined in ammonium oxalate (pH = 3) and dithionite-citrate-bicarbonate (DCB) solutions. As determined by XRD, the mineralogy of precipitated powder samples was predominantly magnetite. Powders having Als > 0.20 contained minor goethite and a poorly crystalline iron oxide phase (ferrihydrite?), and powders having Als > 0.25 also contained gibbsite. The color of the magnetites was black throughout the range of Al-substitution. Powders heated to 400°C were reddish brown; Munsell colors ranged from 5R 2/2 to 10R 3/4 for Als from 0.1% to 41.5%, respectively. By XRD, these powders were maghemite, but hematite was also detected by Mössbauer spectroscopy. XRD and Mössbauer data indicate powders heated to 500°C are hematite; their Munsell colors are not noticeably different from those of the corresponding 400°C samples. Mean crystallite dimensions (MCDs) of the magnetite powders increase with the Al mole fraction from ~10 nm for Als = 0.001 to a maximum value of 35 nm for Als = 0.15 and decrease slightly with further increasing Al substitution. Heating magnetite powders to 400°C did not change the MCDs significantly. Heating to 500°C resulted in hematites having MCDs larger than those for the corresponding precursor magnetites for Als < 0.10. The opposite is true for hematites derived from magnetites having Als > 0.10. For hematite powders with Als > 0.05, MCD decreased with increasing Al-substitution. Solubilities of powders in oxalate solutions were independent of Al content and decreased in the order unheated samples (mostly magnetite) > 400°C-heated samples (maghemite + hematite) > 500°C-heated samples (hematite). All powders dissolved completely in DCB. The low crystallinity of the magnetite powders and the presence of ferrous iron are responsible for their relatively high solubility in oxalate solutions.
In normative aging, there is a decline in associative memory that appears to relate to self-reported everyday use of general memory strategies (Guerrero et al., 2021). Self-reported general strategy use is also strongly associated with self-reported memory abilities (Frankenmolen et al., 2017), which, in turn, are weakly associated with objective memory performance (Crumley et al., 2014). Associative memory abilities and strategy use appear to differ by gender, with women outperforming men and using more memory strategies (Hertzog et al., 2019). In this study, we examine how actual performance and self-reported use of specific strategies on an associative memory task relate to each other and to general, everyday strategy use, and whether these differ by gender.
Participants and Methods:
An international sample of older adults (N = 566, 53% female, aged 60–80) was administered a demographic questionnaire and online tasks, including (1) the Multifactorial Memory Questionnaire (MMQ), which measures self-reported memory ability, satisfaction, and everyday strategy use (Troyer & Rich, 2018), and (2) the Face-Name Task, which measures associative memory (Troyer et al., 2011). Participants were also asked about the specific strategies they used to complete the Face-Name Task.
Results:
On the Face-Name Task, participants who reported using more strategies performed better (F(3, 562) = 6.51, p < 0.001, η² = 0.03), with those who reported using three or four strategies performing best (p < .05). There was a significant difference in performance based on the type of strategy used (F(2, 563) = 11.36, p < 0.001, η² = 0.04), with individuals who relied on a “past experiences/knowledge” strategy performing best (p < .01). Women (M = 0.79, SD = 0.19) outperformed men (M = 0.71, SD = 0.20), t(545) = -4.64, p < 0.001, d = -0.39. No gender differences were found in the number (χ²(3, N = 564) = 2.06, p = 0.561) or type (χ²(2, N = 564) = 5.49, p = 0.064) of strategies used on the Face-Name Task. Only participants who reported using no strategies on the Face-Name Task had lower scores on the MMQ everyday strategy use subscale (p < .05). A multiple-regression model was used to investigate the relative contributions of the number of strategies used on the Face-Name Task, MMQ everyday strategy subscale score, gender, age, education, and psychological distress to Face-Name Task performance. The only significant predictors in the model were gender (B = 0.08, t(555) = 4.55, p < 0.001) and use of two or more strategies (B = 0.07, t(555) = 2.82, p = 0.005).
Conclusions:
Reports of greater self-initiated strategy use, and use of a semantic strategy in particular, related to better performance on an associative memory test in older adults. Self-initiated, task-specific strategy use also related to everyday strategy use. The findings extend past work on gender differences to show that women outperform men on an associative memory task but that this is unlikely to be due to self-reported differences in strategy use. The results suggest that self-reported strategy use predicts actual associative memory performance and should be considered in clinical practice.
Debate is ongoing on the efficacy of cognitive behavior therapy (CBT) for myalgic encephalomyelitis or chronic fatigue syndrome (ME/CFS). With an individual patient data (IPD) meta-analysis, we investigated whether the effect of CBT varied by patient characteristics. These included post-exertional malaise (PEM), considered by many to be a central feature of ME/CFS. We searched for randomized controlled trials similar with respect to comparison condition, outcomes and treatment protocol. Moderation of the effect on fatigue severity (Checklist Individual Strength, subscale fatigue severity), functional impairment (Sickness Impact Profile-8) and physical functioning (Short Form-36, subscale physical functioning) was investigated using linear mixed model analyses and interaction tests. The study was registered in PROSPERO (CRD42022358245). Data from eight trials (n = 1298 patients) were pooled. CBT showed beneficial effects on fatigue severity (β = −11.46, 95% CI −15.13 to −7.79; p < 0.001), functional impairment (β = −448.40, 95% CI −625.58 to −271.23; p < 0.001) and physical functioning (β = 9.64, 95% CI 3.30 to 15.98; p < 0.001). The effect of CBT on fatigue severity varied by age (p_interaction = 0.003), functional impairment (p_interaction = 0.045) and physical activity pattern (p_interaction = 0.027). Patients who were younger, reported less functional impairment and had a fluctuating activity pattern benefitted more. The effect on physical functioning varied by self-efficacy (p_interaction = 0.025), with patients with higher self-efficacy benefitting most. No other moderators were found. It can be concluded from this study that CBT for ME/CFS can lead to significant reductions in fatigue, functional impairment and physical limitations. There is no indication that patients meeting different case definitions or reporting additional symptoms benefit less from CBT. Our findings do not support recent guidelines in which evidence from studies not mandating PEM was downgraded.
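A moderation analysis of the kind described above might be sketched as follows in Python using statsmodels: a linear mixed model of fatigue severity with a random intercept per trial and a treatment-by-moderator interaction term, whose coefficient and p-value correspond to the interaction tests reported. The data file and column names are hypothetical, and the single moderator shown (age) stands in for each moderator tested in turn.

```python
# Hypothetical sketch of an IPD moderation test: a linear mixed model with a
# random intercept per trial and a CBT-by-age interaction; names are illustrative.
import pandas as pd
import statsmodels.formula.api as smf

ipd = pd.read_csv("ipd_pooled.csv")  # assumed columns: trial, cbt (0/1), age, cis_fatigue

# Centring the moderator keeps the main CBT effect interpretable at the mean age.
ipd["age_c"] = ipd["age"] - ipd["age"].mean()

model = smf.mixedlm("cis_fatigue ~ cbt * age_c", data=ipd, groups=ipd["trial"])
result = model.fit(reml=True)

# The cbt:age_c coefficient (and its p-value) is the interaction test:
# does the effect of CBT on fatigue severity vary with age?
print(result.summary())
```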
Adverse childhood experiences (ACEs) may be a risk factor for later-life cognitive disorders such as dementia; however, few studies have investigated underlying mechanisms, such as cardiovascular health and depressive symptoms, in a health disparities framework.
Method:
418 community-dwelling adults (50% non-Hispanic Black, 50% non-Hispanic White) aged 55+ from the Michigan Cognitive Aging Project retrospectively reported on nine ACEs. Baseline global cognition was a z-score composite of five factor scores from a comprehensive neuropsychological battery. Depressive symptoms were assessed using the Center for Epidemiologic Studies Depression Scale. Cardiovascular health was operationalized through systolic blood pressure. A mediation model controlling for sociodemographics, childhood health, and childhood socioeconomic status estimated indirect effects of ACEs on global cognition via depressive symptoms and blood pressure. Racial differences were probed via t-tests and stratified models.
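For readers less familiar with this type of analysis, the sketch below illustrates one common way to estimate an indirect effect: the product-of-coefficients approach with a bootstrap confidence interval, written in Python with statsmodels. It omits the covariate adjustment and the parallel blood-pressure path of the full model, and all column names and the data file are hypothetical.

```python
# Hypothetical sketch of a simple mediation analysis (ACEs -> depressive
# symptoms -> global cognition) via product of coefficients with a bootstrap CI.
# The covariates and blood-pressure path of the full model are omitted.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cognitive_aging.csv")  # assumed columns: aces, cesd, cognition

def indirect_effect(data):
    a = smf.ols("cesd ~ aces", data=data).fit().params["aces"]              # path a
    b = smf.ols("cognition ~ cesd + aces", data=data).fit().params["cesd"]  # path b
    return a * b                                                            # a * b

# Percentile bootstrap for the indirect effect.
boot = [indirect_effect(df.sample(frac=1, replace=True, random_state=i))
        for i in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print("indirect effect:", round(indirect_effect(df), 3),
      "95% CI:", (round(lo, 3), round(hi, 3)))
```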
Results:
A negative indirect effect of ACEs on cognition was observed through depressive symptoms [β = −.040, 95% CI (−.067, −.017)], but not blood pressure, for the whole sample. Black participants reported more ACEs (Cohen’s d = .21) and more depressive symptoms (Cohen’s d = .35), and had higher blood pressure (Cohen’s d = .41) and lower cognitive scores (Cohen’s d = 1.35) than White participants. In stratified models, there was a negative indirect effect through depressive symptoms for Black participants [β = −.074, 95% CI (−.128, −.029)] but not for White participants.
Conclusions:
These results highlight the need to consider racially patterned contextual factors across the life course. Such factors could exacerbate the negative impact of ACEs and related mental health consequences and contribute to racial disparities in cognitive aging.
Febrile neutropenia (FN) is a medical emergency with significant morbidity and mortality for oncology patients, requiring comprehensive workup and timely antibiotic administration. We evaluated concordance with locally developed FN guidelines and outcomes of cancer patients admitted to general internal medicine at an academic teaching hospital.
Methods:
We conducted a retrospective observational cohort study of patients admitted between July 1, 2016, and June 30, 2017, for FN. Patients were classified as having low-risk or high-risk FN according to their malignancy and chemotherapy. The primary outcome was the proportion of patients receiving guideline-concordant antibiotics within 48 hours of admission to general internal medicine. Secondary outcomes were the proportion of patients in whom empirical antibiotics were active against the pathogens isolated, the rate of antibiotic-associated adverse events, and in-hospital mortality. We used logistic regression to model the relationship between FN risk and receipt of guideline-concordant antibiotics.
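A hedged sketch of that final analysis step is shown below in Python: a logistic regression of guideline-concordant prescribing on FN risk group and the other factors reported in the results, with coefficients exponentiated to odds ratios. The variable names and data file are hypothetical.

```python
# Hypothetical sketch: logistic regression of guideline-concordant prescribing
# on FN risk group and other covariates, reported as odds ratios with 95% CIs.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

fn = pd.read_csv("fn_cohort.csv")  # assumed columns: concordant (0/1), high_risk (0/1),
                                   # heme_malignancy (0/1), id_involved (0/1)

model = smf.logit("concordant ~ high_risk + heme_malignancy + id_involved", data=fn).fit()

odds_ratios = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI_lower": np.exp(model.conf_int()[0]),
    "CI_upper": np.exp(model.conf_int()[1]),
    "p": model.pvalues,
})
print(odds_ratios)
```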
Results:
Among 100 patients included, 34 (34%) were low-risk FN and 66 (66%) were high-risk. Proportion of guideline-concordant empirical antibiotics was significantly lower among low-risk FN patients than high-risk patients: 12 (35%) of 34 versus 47 (71%) of 66 (P = .001). Empirical antibiotics were active against 17 (94%) of 18 isolated pathogens. The mortality rate was 3%, and 16% of patients experienced antibiotic-associated adverse events. Hematological malignancy and infectious diseases–trained physician involvement were associated with guideline-concordant prescribing, with adjusted odds ratios of 3.76 (95% CI, 1.46–9.70; P = .006) and 3.71 (95% CI, 1.49–9.23; P = .005), respectively.
Conclusions:
Guideline concordance was low compared to published reports. Factors influencing appropriate antimicrobial prescribing in patients with FN warrant further exploration.
Avian endoparasites play important roles in conservation, biodiversity and host evolution. Currently, little is known about the epidemiology of intestinal helminths and protozoans infecting wild birds of Britain and Ireland. This study aimed to determine the rates of parasite prevalence, abundance and infection intensity in wild passerines. Fecal samples (n = 755) from 18 bird families were collected from 13 sites across England, Wales and Ireland from March 2020 to June 2021. A conventional sodium nitrate flotation method allowed morphological identification and abundance estimation of eggs/oocysts. Associations with host family and age were examined alongside spatiotemporal and ecological factors using Bayesian phylogenetically controlled models. Parasites were detected in 20.0% of samples, with corvids and finches having the highest prevalences and intensities, respectively. Syngamus (33%) and Isospora (32%) were the most prevalent genera observed. Parasite prevalence and abundance differed amongst avian families and seasons, while infection intensity varied between families and regions. Prevalence was affected by diet diversity, while abundance differed by host age and habitat diversity. Infection intensity was higher in birds using a wider range of habitats, and doubled in areas with feeders present. The elucidation of these patterns will increase the understanding of parasite fauna in British and Irish birds.
Early surgical intervention in infants with complex CHD results in significant disruptions to their respiratory, gastrointestinal, and nervous systems, which are all instrumental to the development of safe and efficient oral feeding skills. Standardised assessments or treatment protocols are not currently available for this unique population, requiring the clinician to rely on knowledge based on neonatal literature. Clinicians need to be skilled at evaluating and analysing these systems to develop an appropriate treatment plan to improve oral feeding skill and safety, while considering post-operative recovery in the infant with complex CHD. Supporting the family to re-establish their parental role during hospitalisation and upon discharge is critical to reducing parental stress and to oral feeding success.
The invasive vine black swallowwort [Vincetoxicum nigrum (L.) Moench = Cynanchum louiseae Kartesz & Gandhi, Apocynaceae] is difficult to control, and herbicide studies are lacking. This long-lived perennial species is primarily found in high-light environments in natural areas and perennial cropping systems in northeastern North America. We conducted a 3-yr herbicide efficacy study, with or without mowing, in an old-field site infested with V. nigrum in Dutchess County, NY, USA. Experimental plots were either herbicide treated in early July or mowed in early July and subsequently herbicide treated in late August for 2 yr with the potassium salt of glyphosate (2.02 kg ae ha−1), the isopropylamine salt of glyphosate (1.35 kg ae ha−1), or the butoxyethyl ester of triclopyr (1.79 kg ae ha−1). Both glyphosate formulations were effective in reducing V. nigrum aboveground biomass, although they were somewhat less effective in reducing cover or stem densities of V. nigrum plants >10-cm tall after 2 yr compared with untreated plots. Mowing did not always enhance the efficacy of foliar glyphosate applications. Triclopyr, with or without mowing, was generally not effective against V. nigrum in our study. The only significant effect of triclopyr was to increase the cover of grasses in the plots. While annual applications of glyphosate can be useful for management of V. nigrum infestations, higher rates and more frequent applications of triclopyr need to be investigated to determine its usefulness for V. nigrum control.
Data from UK confidential enquiries suggest a declining rate of twin stillbirth in monochorionic (MC) and dichorionic (DC) twin pregnancies with improved outcomes possibly reflecting the establishment of national guidelines for the management of multiple pregnancies. Despite this, twin pregnancies are at greater risk of all pregnancy complications, miscarriage and stillbirth than singleton pregnancies. Monochorionic twins, comprising approximately 20% of twin pregnancies, are at particular risk of fetal loss due to the unique pathological complications of a shared placenta: Twin to Twin Transfusion Syndrome (TTTS), early-onset severe selective growth restriction (sGR) and twin anaemia polycythaemia sequence (TAPS). Furthermore, following single intrauterine fetal demise (sIUFD) surviving monochorionic co-twins are exposed to an increased risk of intrauterine death, neonatal death and neurological disability. This chapter examines single and double fetal loss in DC and MC twin pregnancies, outlining the key facts, and covering the difficult issues and management challenges posed by twin demise.
There is emerging evidence of heterogeneity within treatment-resistant schizophrenia (TRS), with some people not responding to antipsychotic treatment from illness onset and a smaller group becoming treatment-resistant after an initial period of response. It has been suggested that these groups have different aetiologies. Few studies have investigated socio-demographic and clinical differences between early and late onset of TRS.
Objectives
This study aims to investigate socio-demographic and clinical correlates of late-onset TRS.
Methods
Using data from the electronic health records of the South London and Maudsley NHS Foundation Trust, we identified a cohort of people with TRS. Regression analyses were conducted to identify correlates of the length of treatment until TRS. Analysed predictors included gender, age, ethnicity, positive symptom severity, problems with activities of daily living, psychiatric comorbidities, involuntary hospitalisation and treatment with long-acting injectable antipsychotics.
Results
We observed a continuum in the length of treatment until TRS presentation. Having severe hallucinations and delusions at the start of treatment was associated with a shorter duration of treatment until the presentation of TRS.
Conclusions
Our findings do not support a clear-cut categorisation between early and late TRS based on the length of treatment until the onset of treatment resistance. More severe positive symptoms predict an earlier onset of treatment resistance.
Disclosure
DFdF, GKS, EF and IR have received research funding from Janssen and H. Lundbeck A/S. RDH and HS have received research funding from Roche, Pfizer, Janssen and Lundbeck. SES is employed on a grant held by Cardiff University from Takeda Pharmaceutical Comp
OBJECTIVES/GOALS: Whole-genome viral sequencing is vital to inform public health and study evolution. Arboviruses evolve in vectors, reservoir hosts, and humans, and require surveillance at all points. We developed a new rigorous method of sequencing that captures whole viral genomes in field-collected and clinical samples. METHODS/STUDY POPULATION: ClickSeq is a novel method of Next Generation Sequencing (NGS) library synthesis using azido-nucleotides to terminate reverse transcription. The cDNA generated can be ligated to sequencing and indexing primers at room temperature using copper (Cu I) and vitamin C. With this approach, we designed primers located ~250 bp apart along the genomes of the arboviruses Chikungunya 37797, Zika Dakar, Yellow Fever Asibi, Dengue serotype 2, West Nile 385-99, and St. Louis Encephalitis Virus (SLEV) clade II. We tested this method with varying viral titers: lab-infected mosquito pools, field-collected mosquito pools from a Texas West Nile and SLEV outbreak, and patient isolates from a Pakistani CHIKV outbreak. The cDNA was sequenced in the UTMB NGS Core and aligned using bowtie. RESULTS/ANTICIPATED RESULTS: The use of a single protocol to capture whole viral genomes, including UTRs, for multiple viruses from different sample types is ideal for arboviruses. Primers for multiple viruses were pooled and used to sequence mosquito pools. The Tiled ClickSeq method captured whole viral genomes without the need for host depletion. UTRs were captured even when the viral strain used for primer design differed from the resulting strain. Discrete variants were captured in both the hypervariable nsP3 region and the UTR in the patient isolates from the CHIKV outbreak compared to the 2017 outbreak. Texas WNV and SLEV outbreaks are now defined from the 2020 outbreak and can be further tracked to update public health measures and understand viral evolution. DISCUSSION/SIGNIFICANCE: UTRs impact both human and mosquito fitness, leading to further outbreaks. Tiled ClickSeq aims to capture whole viral genomes with a method and cost that can be implemented by public health researchers to understand disease evolution as it happens, informing both public health measures and basic virology about the effects of evolution on arboviruses.