Progressive ventricular remodelling in children with repaired tetralogy of Fallot may or may not result in the need for pulmonary valve replacement. We aimed to model and compare the rates of right and left ventricular adaptation over time, as assessed by cardiac MRI after surgical repair of tetralogy of Fallot, in children who did or did not require pulmonary valve replacement later in adolescence.
Methods:
Single-centre, retrospective cohort study from 2000 to 2020 including patients with tetralogy of Fallot who had complete surgical repair before 24 months of age.
Results:
Of the 214 patients included in this analysis, 142 (66.3%) underwent pulmonary valve replacement during follow-up, at a median age of 12 years (interquartile range 9–15.5). Across 323 cardiac MRI studies from 201 patients, beginning at a median age of 9.4 years (interquartile range 5.9–12.3), the group that later required pulmonary valve replacement had a steeper time-related right ventricular dilation trajectory than the non-pulmonary valve replacement group: right ventricular end-diastolic volume index increased by 19.4 versus 2.8 ml/m2/log2year, P < 0.001, and right ventricular end-systolic volume index increased by 11.9 versus 0.8 ml/m2/log2year, P < 0.001. Left ventricular end-diastolic volume index also increased more quickly in patients who eventually had pulmonary valve replacement, at 7.2 versus 1.5 ml/m2/log2year, P = 0.005, as did indexed left ventricular end-systolic volume, at 3.2 versus –0.4 ml/m2/log2year, P = 0.001.
Conclusion:
Early right and left ventricular dilation over time are identifiable by cardiac MRI in patients destined to require pulmonary valve replacement following tetralogy of Fallot repair.
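The trajectories above are expressed as volume change per doubling of age (ml/m2 per log2 year). As a minimal illustrative sketch only, assuming a linear mixed-effects model fitted to log2-transformed age with simulated data and hypothetical variable names (not the study's dataset or code), such group-specific slopes could be estimated in Python roughly as follows:

# Illustrative sketch: linear mixed-effects model of right ventricular
# end-diastolic volume index (RVEDVi) against log2-transformed age.
# All data below are simulated; pvr_group = 1 marks patients who later
# required pulmonary valve replacement.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_patients, n_scans = 30, 3
patient_id = np.repeat(np.arange(n_patients), n_scans)
pvr_group = np.repeat(rng.integers(0, 2, n_patients), n_scans)
age_years = rng.uniform(4.0, 16.0, n_patients * n_scans)
log2_age = np.log2(age_years)
patient_offset = np.repeat(rng.normal(0.0, 5.0, n_patients), n_scans)

# Simulated volumes: a steeper slope (+19 ml/m2 per log2 year) in the PVR group
rvedvi = (60 + 3 * log2_age + 19 * log2_age * pvr_group
          + patient_offset + rng.normal(0.0, 8.0, n_patients * n_scans))

df = pd.DataFrame({"patient_id": patient_id, "pvr_group": pvr_group,
                   "log2_age": log2_age, "rvedvi": rvedvi})

# Random intercept per patient; the log2_age:pvr_group interaction estimates how
# much steeper the dilation trajectory is in the group that later required PVR.
model = smf.mixedlm("rvedvi ~ log2_age * pvr_group", df, groups=df["patient_id"])
print(model.fit().summary())

In a model of this form, the interaction coefficient plays the role of the between-group difference in ml/m2/log2year slopes reported above.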
This collection gathers thirteen contributions by a number of historians, friends, colleagues and/or students of Jinty’s, who were asked to pick their favourite article by her and say a few words about it for an event held in her memory on 15 January 2025 at King’s College London. We offer this collection in print now for a wider audience not so much because it has any claim to be exhaustive or authoritative, but because taken all together these pieces seemed to add up to a useful retrospective on Jinty’s work, its wider context, and its impact on the field over the decades. We hope that, for those who know her work well already, this may be an opportunity to remember some of her classic (and a few less classic) articles, while at the same time serving as an accessible introduction to her research for anyone who knew her without necessarily knowing about her field, as well as for a new and younger generation of readers.
Despite advances in the development of systemic anticoagulants, few agents are approved for use in patients with mechanical heart valves. Current recommendations for periprocedural and long-term anticoagulation in mechanical heart valves include unfractionated heparin or vitamin K antagonists, and there are some reports of off-label use of low-molecular-weight heparins. Emerging data on parenteral direct thrombin inhibitors, such as bivalirudin, have led to increased use in both extracorporeal membrane oxygenation and ventricular assist devices. We present the case of a paediatric patient with rheumatic heart disease and mechanical aortic and mitral heart valves who had significant bleeding on unfractionated heparin and successfully received prolonged bivalirudin therapy.
Task-sharing holds promise for bridging gaps in access to mental healthcare, yet there remain significant challenges to scaling up task-sharing models. This formative study aimed to develop a digital platform for training non-specialist providers without prior experience in mental healthcare to deliver a brief psychosocial intervention for depression in community settings in Texas. A 5-step development approach was employed, consisting of blueprinting, scripting, video production and digital content creation, uploading digital content to a Learning Management System, and user testing. This resulted in the development of two courses: Foundational Skills, covering the skills needed to become an effective counselor, and Behavioral Activation, covering the skills for addressing adult depression. Twenty-one participants with a range of health-related backgrounds, including 11 with prior training in mental healthcare, completed the training and joined focus group discussions offering qualitative feedback and recommendations for improving the program's usability. Participant feedback centered on the need to make the content more interactive, to include additional engaging features, and to improve the layout and usability of the platform. The next steps will involve evaluating whether the training program develops the skills of non-specialist providers and supporting its uptake and implementation.
Patients with hematological malignancies are at high risk of infections due to both the disease and the associated treatments. The use of immunoglobulin (Ig) to prevent infections is increasing in this population, but its cost effectiveness is unknown. This trial-based economic evaluation aimed to compare the cost effectiveness of prophylactic Ig with prophylactic antibiotics in patients with hematological malignancies.
Methods
The economic evaluation used individual patient data from the RATIONAL feasibility trial, which randomly assigned 63 adults with chronic lymphocytic leukemia, multiple myeloma, or lymphoma to prophylactic Ig or prophylactic antibiotics. The following two analyses were conducted to estimate the cost effectiveness of the two treatments over the 12-month trial period from the perspective of the Australian health system:
(i) a cost-utility analysis (CUA) to assess the incremental cost per quality-adjusted life-year (QALY) gained using data collected with the EuroQol 5D-5L questionnaire; and
(ii) a cost-effectiveness analysis (CEA) to assess the incremental cost per serious infection prevented (grade ≥3) and per infection prevented (any grade).
Results
The total cost per patient was significantly higher in the Ig arm than in the antibiotic arm (difference AUD29,140 [USD19,000]). There were non-significant differences in health outcomes between the treatment arms: patients treated with Ig had fewer QALYs (difference −0.072) and fewer serious infections (difference −0.26) than those given antibiotics, but more infections overall (difference 0.76). The incremental cost-effectiveness results from the CUA indicated that Ig was more costly than antibiotics and associated with fewer QALYs. In the CEA, Ig cost an additional AUD111,262 (USD73,000) per serious infection prevented, but it was more costly than antibiotics and associated with more infections when all infections were included.
Conclusions
These results indicate that, on average, Ig prophylactic treatment may not be cost effective compared with prophylactic antibiotics for the group of patients with hematological malignancies recruited to the RATIONAL feasibility trial. Further research is needed to confirm these findings in a larger population and over the longer term.
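To make the cost-effectiveness arithmetic concrete, a hedged back-of-the-envelope sketch using the published point estimates (the trial's actual analysis, with bootstrapping and covariate adjustment, will differ) could look like this:

# Back-of-the-envelope sketch of the incremental cost-effectiveness arithmetic,
# using the point estimates reported above; differences are Ig minus antibiotics.
delta_cost_aud = 29140                 # incremental cost of Ig (AUD)
delta_qalys = -0.072                   # Ig arm had fewer QALYs
serious_infections_prevented = 0.26    # Ig arm had 0.26 fewer serious infections per patient

# CUA: Ig is more costly and yields fewer QALYs, so it is dominated by antibiotics
# and no meaningful cost-per-QALY ratio applies.
ig_dominated = delta_cost_aud > 0 and delta_qalys < 0

# CEA: incremental cost per serious infection prevented
icer_serious = delta_cost_aud / serious_infections_prevented
print(ig_dominated, round(icer_serious))
# ~AUD112,000 per serious infection prevented, close to the reported AUD111,262;
# the small gap reflects rounding of the published differences.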
When a company becomes insolvent, particularly if it is a large company, this will often mean a large-scale redundancy process. The requirements of the process can be technical, but there is a list of obligations that must be adhered to, and these are set out in the Trade Union and Labour Relations (Consolidation) Act 1992 (TULRCA 1992).
Edited by
Richard Williams, University of South Wales, Verity Kemp, Independent Health Emergency Planning Consultant, Keith Porter, University of Birmingham, Tim Healing, Worshipful Society of Apothecaries of London, John Drury, University of Sussex
It is usual for humans to experience distress in the aftermath of emergencies, incidents, disasters, and disease outbreaks (EIDD). The manifestation, severity, and duration of the experiences that constitute distress depend on many intrinsic and extrinsic factors. Recent research has demonstrated that distress may be more ubiquitous than was previously thought, and that some interventions, even if well meaning, may not be helpful. Amelioration for most people comes with timely, proportionate, and targeted support and the passage of time. Validation of people’s experiences and minimising the medicalisation of distress are important in helping people to return to ordinary social functioning. This chapter looks at distress related to major events, including the scientific principles, impacts, and implications for intervention. The case study draws on the experience of three members of a pre-hospital team and how a challenging case affected them all.
Edited by
Richard Williams, University of South Wales, Verity Kemp, Independent Health Emergency Planning Consultant, Keith Porter, University of Birmingham, Tim Healing, Worshipful Society of Apothecaries of London, John Drury, University of Sussex
There is increasing awareness that working within the field of pre-hospital care can have psychosocial effects on clinicians. This chapter describes a systematic review of current knowledge of the psychosocial consequences of working in pre-hospital care. A considerable amount of research has been conducted, examining in particular whether practitioners develop burnout and psychiatric disorders, especially symptoms of post-traumatic stress and post-traumatic stress disorder (PTSD), as a result of their work. However, most studies did not fully assess whether practitioners developed clinically significant symptoms. Instead, cross-sectional surveys and self-report questionnaires were used, which considerably overestimate the incidence of these problems. The high scores on these questionnaires may instead indicate that practitioners who work in pre-hospital care often suffer considerable stress and distress resulting from daily organisational and operational hassles, a high volume of work, and a lack of resources, and, to a lesser extent than has often been thought, from attending unusual and high-profile incidents.
Obesity is associated with adverse effects on brain health, including increased risk for neurodegenerative diseases. Changes in cerebral metabolism may underlie or precede structural and functional brain changes. While bariatric surgery is known to be effective in inducing weight loss and improving obesity-related medical comorbidities, few studies have examined whether it may be able to improve brain metabolism. In the present study, we examined change in cerebral metabolite concentrations in participants with obesity who underwent bariatric surgery.
Participants and Methods:
Thirty-five patients with obesity (BMI > 35 kg/m2) were recruited from a bariatric surgery candidate nutrition class. They completed single-voxel proton (1H) magnetic resonance spectroscopy at baseline (pre-surgery) and within one year post-surgery. Spectra were obtained from a large medial frontal brain region. Tissue-corrected absolute concentrations for metabolites including choline-containing compounds (Cho), myo-inositol (mI), N-acetylaspartate (NAA), creatine (Cr), and glutamate and glutamine (Glx) were determined using Osprey. Paired t-tests were used to examine within-subject change in metabolite concentrations, and correlations were used to relate these changes to other health-related outcomes, including weight loss and glycemic control.
Results:
Bariatric surgery was associated with a reduction in cerebral Cho (t[34] = -3.79, p < 0.001, d = -0.64) and mI (t[34] = -2.81, p < 0.01, d = -0.47) concentrations. There were no significant changes in NAA, Glx, or Cr concentrations. Reductions in Cho were associated with greater weight loss (r = 0.40, p < 0.05), and reductions in mI were associated with greater reductions in HbA1c (r = 0.44, p < 0.05).
Conclusions:
Participants who underwent bariatric surgery exhibited reductions in cerebral Cho and mI concentrations, which were associated with improvements in weight loss and glycemic control. Given that elevated levels of Cho and mI have been implicated in neuroinflammation, reduction in these metabolites after bariatric surgery may reflect amelioration of obesity-related neuroinflammatory processes. As such, our results provide evidence that bariatric surgery may improve brain health and metabolism in individuals with obesity.
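The within-subject comparison above rests on a paired t-test with an effect size for the pre- to post-surgery change. A minimal sketch with invented choline values (not the study's data) is:

# Minimal sketch: paired t-test and Cohen's d for a pre- vs post-surgery change
# in one metabolite concentration. Values are invented for illustration only.
import numpy as np
from scipy import stats

cho_pre = np.array([2.1, 1.9, 2.0, 1.8, 2.2, 2.0, 1.9, 2.1])
cho_post = np.array([1.9, 1.8, 1.9, 1.7, 2.0, 1.8, 1.8, 2.0])

t_stat, p_value = stats.ttest_rel(cho_post, cho_pre)   # within-subject change
diff = cho_post - cho_pre
cohens_d = diff.mean() / diff.std(ddof=1)              # effect size for paired data

print(f"t({len(diff) - 1}) = {t_stat:.2f}, p = {p_value:.4f}, d = {cohens_d:.2f}")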
Children with neurodevelopmental disorders (NDDs) commonly experience attentional and executive function (EF) difficulties that are negatively associated with academic success, psychosocial functioning, and quality of life. Access to early and consistent interventions is a critical protective factor and there are recommendations to deliver cognitive interventions in schools; however, current cognitive interventions are expensive and/or inaccessible, particularly for those with limited resources and/or in remote communities. The current study evaluated the school-based implementation of two game-based interventions in children with NDDs: 1) a novel neurocognitive attention/EF intervention (Dino Island; DI), and 2) a commercial educational intervention (Adventure Academy; AA). DI is a game-based attention/EF intervention specifically developed for children for delivery in community-based settings.
Participants and Methods:
Thirty-five children with NDDs (ages 5-13 years) and 17 educational assistants (EAs) participated. EAs completed online training to deliver the interventions to assigned students at their respective schools (3x/week, 40-60 minutes/session, 8 weeks, 14 hours in total). We gathered baseline child and EA demographic data, completed pre-intervention EA interviews, and conducted regular fidelity checks throughout the interventions. Implementation data included paper-pencil tracking forms, computerized game analytic data, and online communications.
Results:
Using a mixed methods approach, we evaluated the following implementation outcomes: fidelity, feasibility, acceptability, and barriers. Overall, no meaningful between-group differences were found in EA or child demographics, except for total number of years worked as an EA (M = 17.18 for AA and 9.15 for DI; t(22) = -4.34, p < .01) and EA gender (χ2(1) = 6.11, p < .05). For both groups, EA age was significantly associated with the number of sessions played [DI (r = .847, p < .01), AA (r = .986, p < .05)]. EAs who knew their student better completed longer sessions [DI (r = .646), AA (r = .973), all ps < .05]. The number of years worked as an EA was negatively associated with the total intervention hours for both groups. Qualitative interview data indicated that most EAs found DI valuable and feasible to deliver in their classrooms, whereas more implementation challenges were identified with AA. Barriers common to both groups included technical difficulties (e.g., game access, internet firewalls), environmental barriers (e.g., distractions in surroundings, time of the year), child factors (e.g., lack of motivation, attentional difficulties, frustration), and game-specific factors (e.g., difficulty level progression). Barriers specific to DI included greater challenges in motivating children as a function of difficulty level progression. Furthermore, given the comprehensive nature of training required for delivery, EAs needed a longer time to complete the training for DI. Nevertheless, many EAs in the DI group found the training helpful, with a potential to generalize to other children in the classroom.
Conclusions:
The availability of affordable, accessible, and effective cognitive intervention is important for children with NDDs. We found that delivery of a novel cognitive intervention by EAs was feasible and acceptable, with similarities and differences in implementation facilitators/barriers between the cognitive and commercialized academic intervention. Recommendations regarding strategies for successful school-based implementation of neurocognitive intervention will be elaborated on in the poster.
Individuals living with HIV may experience cognitive difficulties or marked declines known as HIV-Associated Neurocognitive Disorder (HAND). Cognitive difficulties have been associated with worse outcomes for people living with HIV; therefore, accurate cognitive screening and identification is critical. One potentially sensitive but underutilized marker of cognitive impairment is intra-individual variability (IIV). Cognitive IIV is the dispersion of scores across tasks in neuropsychological assessment. In individuals living with HIV, greater cognitive IIV has been associated with cortical atrophy, poorer cognitive functioning, more rapid cognitive decline, and greater difficulties in daily functioning. Studies examining the use of IIV in clinical neuropsychological testing are limited, and few have examined IIV in the context of a single neuropsychological battery designed for culturally diverse or at-risk populations. To address these gaps, this study aimed to examine IIV profiles of individuals living with HIV and/or who inject drugs, using the Neuropsi, a standardized neuropsychological instrument for Spanish-speaking populations.
Participants and Methods:
Spanish-speaking adults residing in Puerto Rico (n=90) who are HIV positive and who inject drugs (HIV+I), HIV negative and who inject drugs (HIV-I), HIV positive and do not inject drugs (HIV+), or healthy controls (HC) completed the Neuropsi battery as part of a larger research protocol. The Neuropsi produces 3 index scores representing the cognitive domains of memory, attention/memory, and attention/executive functioning. Total battery and within-index IIV were calculated by dividing the standard deviation of T-scores by mean performance, resulting in a coefficient of variance (CoV). Group differences in overall test battery mean CoV (OTBMCoV) were investigated. To examine unique profiles of index-specific IIV, a cluster analysis was performed for each group.
Results:
Results of a one-way ANOVA indicated significant between group differences on OTBMCoV (F[3,86]=6.54, p<.001). Post-hoc analyses revealed that HIV+I (M=.55, SE=.07, p=.003), HIV-I (M=.50, SE=.03, p=.001), and HIV+ (M=.48, SE=.02, p=.002) had greater OTBMCoV than the HC group (M=.30, SE=.02). To better understand sources of IIV within each group, cluster analysis of index specific IIV was conducted. For the HIV+ group, 3 distinct clusters were extracted: 1. High IIV in attention/memory and attention/executive functioning (n=3, 8%); 2. Elevated memory IIV (n=21, 52%); 3. Low IIV across all indices (n=16, 40%). For the HIV-I group, 2 distinct clusters were extracted: 1. High IIV across all 3 indices (n=7, 24%) and 2. Low IIV across all 3 indices (n=22, 76%). For the HC group, 3 distinct clusters were extracted: 1. Very low IIV across all 3 indices (n=5, 36%); 2. Elevated memory IIV (n=6, 43%); 3. Elevated attention/executive functioning IIV with very low attention/memory and memory IIV (n=3, 21%). Sample size of the HIV+I group was insufficient to extract clusters.
Conclusions:
Current findings support IIV in the Neuropsi test battery as a clinically sensitive marker of cognitive impairment in Spanish-speaking individuals living with HIV or who inject drugs. Furthermore, the distinct IIV cluster types identified between groups can help to better understand specific sources of variability. Implications for clinical assessment, prognosis, and etiological considerations are discussed.
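The dispersion index used above, the coefficient of variance (CoV), is simply the standard deviation of a participant's T-scores divided by their mean. A minimal sketch with hypothetical subtest scores (not actual Neuropsi output) is:

# Minimal sketch of the intra-individual variability (IIV) index described above:
# CoV = standard deviation of T-scores / mean T-score. Scores are hypothetical.
import numpy as np

def cov(t_scores):
    scores = np.asarray(t_scores, dtype=float)
    return scores.std(ddof=1) / scores.mean()

battery_t_scores = [52, 38, 61, 45, 55, 40, 48, 58]   # all subtests in the battery
memory_t_scores = [52, 38, 61]                        # subtests within one index

print(f"whole-battery CoV = {cov(battery_t_scores):.2f}, "
      f"memory-index CoV = {cov(memory_t_scores):.2f}")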
Injection drug use is a significant public health crisis with adverse health outcomes, including increased risk of human immunodeficiency virus (HIV) infection. Comorbidity of HIV and injection drug use is highly prevalent in the United States and disproportionately elevated in surrounding territories such as Puerto Rico. While both HIV status and injection drug use are independently known to be associated with cognitive deficits, the interaction of these effects remains largely unknown. The aim of this study was to determine how HIV status and injection drug use are related to cognitive functioning in a group of Puerto Rican participants. Additionally, we investigated the degree to which type and frequency of substance use predict cognitive abilities.
Participants and Methods:
Ninety-six Puerto Rican adults completed the Neuropsi Attention and Memory-3rd Edition battery for Spanish-speaking participants. Injection substance use over the previous 12 months was also obtained via clinical interview. Participants were categorized into four groups based on HIV status and injection substance use in the last 30 days (HIV+/injector, HIV+/non-injector, HIV-/injector, HIV-/non-injector). One-way analysis of variance (ANOVA) was conducted to determine differences between groups on each index of the Neuropsi battery (Attention and Executive Function; Memory; Attention and Memory). Multiple linear regression was used to determine whether type and frequency of substance use predicted performance on these indices while considering HIV status.
Results:
The one-way ANOVAs revealed significant differences (p's < 0.01) between the healthy control group and all other groups across all indices. No significant differences were observed between the other groups. Injection drug use, regardless of the substance, was associated with lower combined attention and memory performance relative to injecting less than monthly (monthly: p = 0.04; 2-3x daily: p < 0.01; 4-7x daily: p = 0.02; 8+ times daily: p < 0.01). Both minimal and heavy daily use predicted poorer memory performance (p = 0.02 and p = 0.01, respectively). Heavy heroin use predicted poorer attention and executive functioning (p = 0.04). Heroin use also predicted lower performance on tests of memory when used monthly (p = 0.049) and daily or almost daily (2-6x weekly: p = 0.04; 4-7x daily: p = 0.04). Finally, moderate injection of heroin predicted lower scores on attention and memory (weekly: p = 0.04; 2-6x weekly: p = 0.048). Heavy combined heroin and cocaine use predicted worse memory performance (p = 0.03) and combined attention and memory (p = 0.046). HIV status was not a moderating factor in any circumstance.
Conclusions:
As predicted, residents of Puerto Rico who do not inject substances and are HIV-negative performed better in the domains of memory, attention, and executive function than those living with HIV and/or injecting substances. There was no significant difference among the affected groups in cognitive ability. As expected, daily injection of substances predicted worse performance on tasks of memory. Heavy heroin use predicted worse performance on executive function and memory tasks, while heroin-only and combined heroin and cocaine use predicted worse memory performance. Overall, the type and frequency of substance use are more predictive of cognitive functioning than HIV status.
Aristotle is history’s first great logician and Chrysippus is the second. We know more of Aristotle’s work than Chrysippus’ (whose works have been almost entirely lost), but we have enough at hand to identify the principal achievements of each. Aristotle’s logical particles of the syllogistic were ‘all’, ‘no’, ‘some’, and ‘non-’. Chrysippus’ were ‘if-then’, ‘it is not the case’, and ‘or’. This inclines the modern reader to see in Aristotle’s term-logic a precursor of predicate logic, and in Chrysippus’ logic the precursor of propositional logic. Because space is limited, I shall take the ancient logic of this chapter to be Aristotelian and Chrysippean logics.
Maternal protein restriction is often associated with structural and functional sequelae in offspring, particularly affecting growth and renal-cardiovascular function. However, there is little understanding as to whether hypertension and kidney disease occur because of a primary nephron deficit or whether controlling postnatal growth can result in normal renal-cardiovascular phenotypes. To investigate this, female Sprague-Dawley rats were fed either a low-protein (LP, 8.4% protein) or normal-protein (NP, 19.4% protein) diet prior to mating and until offspring were weaned at postnatal day (PN) 21. Offspring were then fed a non-‘growth’ diet (4.6% fat), which ensured that catch-up growth did not occur. Offspring growth was determined by weight and dual-energy X-ray absorptiometry. Nephron number was determined at PN21 using the disector-fractionator method. Kidney function was measured at PN180 and PN360 using clearance methods. Blood pressure was measured at PN360 using radio-telemetry. Body weight was similar at PN1, but by PN21 LP offspring were 39% smaller than controls (Pdiet < 0.001). This difference was due to proportional changes in lean muscle, fat, and bone content. LP offspring remained smaller than NP offspring until PN360. In LP offspring, nephron number was 26% less in males and 17% less in females than in NP controls (Pdiet < 0.0004). Kidney function was similar across dietary groups and sexes at PN180 and PN360. Blood pressure was similar in LP and NP offspring at PN360. These findings suggest that remaining on a slow growth trajectory after exposure to a suboptimal intrauterine environment does not lead to the development of kidney dysfunction and hypertension.
Three important properties associated with a classification of any group of organisms are diagnosability, monophyly and resolution. In this chapter we explore the interrelationships between these three properties in the context of cryptic taxa, here defined as clades with no obvious diagnostic morphological support. We present the view that fewer than five percent of the nodes on a phylogenetic tree of all flowering plants have morphological diagnostic support; as such, cryptic nodes are much more common than non-cryptic nodes. Because of this, we suggest that ‘cryptic nodes’ is a preferable term to ‘cryptic taxa’, because taxa in the sense of traditional classifications are generally diagnosable. By reference to a global taxonomic study of the genus Ipomoea, we discuss the role of diagnosability at various scales, including major infrageneric clade, genus and species. We demonstrate that the level of diagnosability in Ipomoea is relatively low, making cryptic nodes the rule rather than the exception. We provide several examples of such cryptic nodes, detail how we discovered them, and place them in a wider conceptual framework of diagnosability in angiosperms.
Behaviors typical of body-focused repetitive behavior disorders such as trichotillomania (TTM) and skin-picking disorder (SPD) are often associated with pleasure or relief, and with little or no physical pain, suggesting aberrant pain perception. Conclusive evidence about pain perception and correlates in these conditions is, however, lacking.
Methods
A multisite international study examined pain perception and its physiological correlates in adults with TTM (n = 31), SPD (n = 24), and healthy controls (HCs; n = 26). The cold pressor test was administered, and measurements of pain perception and cardiovascular parameters were taken every 15 seconds. Pain perception, latency to pain tolerance, and cardiovascular parameters, along with their associations with illness severity and comorbid depression, as well as interaction effects (group × time interval), were investigated across groups.
Results
There were no group differences in pain ratings over time (P = .8) or latency to pain tolerance (P = .8). Illness severity was not associated with pain ratings (all P > .05). In terms of diastolic blood pressure (DBP), the main effect of group was statistically significant (P = .01), with post hoc analyses indicating higher mean DBP in TTM (95% confidence intervals [CI], 84.0-93.5) compared to SPD (95% CI, 73.5-84.2; P = .01), and HCs (95% CI, 75.6-86.0; P = .03). Pain perception did not differ between those with and those without depression (TTM: P = .2, SPD: P = .4).
Conclusion
The study findings were mostly negative, suggesting that a general aberration of pain perception is not involved in TTM and SPD. Other underlying drivers of hair-pulling and skin-picking behavior (eg, abnormal reward processing) should be investigated.
Earth is rapidly losing free-living species. Is the same true for parasitic species? To reveal temporal trends in biodiversity, historical data are needed, but often such data do not exist for parasites. Here, parasite communities of the past were reconstructed by identifying parasites in fluid-preserved specimens held in natural history collections. Approximately 2500 macroparasites were counted from 109 English Sole (Parophrys vetulus) collected between 1930 and 2019 in the Salish Sea, Washington, USA. Alpha and beta diversity were measured to determine if and how diversity changed over time. Species richness of parasite infracommunities and community dispersion did not vary over time, but community composition of decadal component communities varied significantly over the study period. Community dissimilarity also varied: prior to the mid-20th century, parasites shifted in abundance in a seemingly stochastic manner and, after this time period, a canalization of community change was observed, where species' abundances began to shift in consistent directions. Further work is needed to elucidate potential drivers of these changes and to determine if these patterns are present in the parasite communities of other fishes of the Salish Sea.
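As a hedged illustration of the diversity measures named above (not the study's code), infracommunity richness (alpha diversity) and Bray-Curtis dissimilarity between two decadal component communities (beta diversity) could be computed as follows:

# Illustrative sketch: alpha diversity as species richness and beta diversity as
# Bray-Curtis dissimilarity between two decadal communities. Counts are invented.
import numpy as np

def richness(counts):
    # Number of parasite species present in a community (or one host's infracommunity)
    return int(np.sum(np.asarray(counts) > 0))

def bray_curtis(x, y):
    # Bray-Curtis dissimilarity: sum of absolute count differences over total counts
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return np.abs(x - y).sum() / (x + y).sum()

decade_1930s = [12, 0, 5, 33, 1]   # pooled counts per parasite species (invented)
decade_2010s = [4, 7, 0, 21, 9]

print(richness(decade_1930s), richness(decade_2010s))
print(f"Bray-Curtis dissimilarity = {bray_curtis(decade_1930s, decade_2010s):.2f}")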