Racial disparities in healthcare have been well documented in the United States. We hypothesised that clinical variables would differ by race among single-ventricle patients across the stages of palliation.
Materials and Methods:
This retrospective single-centre study stratified all single-ventricle patients who reached stage 2 palliation by race (Black or White); other races were excluded. Demographic and clinical characteristics were compared, alongside follow-up survival data. Primary outcomes were progression to Fontan and overall survival.
Results:
Among 526 patients, 325 (61.8%) were White and 201 (38.2%) were Black. Median age at stage 2 palliation was 150 days for White and 165 days for Black patients (p = 0.005), with similar weights. Black patients exhibited higher median cardiopulmonary bypass times (87 vs. 74 minutes, p = 0.001) and a greater frequency of genetic syndromes (30.1% vs. 22.1%, p = 0.044). No significant differences were observed between groups in outcomes from stage 2 to stage 3, pre-stage 3 cardiac catheterisation variables, or perioperative outcomes. Multivariable regression analysis identified hypoplastic pulmonary arteries as a risk factor for mortality after stage 2. Survival analysis showed no difference in survival by race; however, the occurrence of a combined cardiovascular event was significantly higher in Black patients.
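As an illustration of the type of survival comparison reported above, a minimal Python sketch using the lifelines library is given below; the file name and the race, time_to_event, and death columns are hypothetical placeholders, not the study's actual dataset.

import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical per-patient table; column names are illustrative only.
df = pd.read_csv("stage2_cohort.csv")
white = df[df["race"] == "White"]
black = df[df["race"] == "Black"]

# Kaplan-Meier survival curves by race
km = KaplanMeierFitter()
km.fit(white["time_to_event"], event_observed=white["death"], label="White")
ax = km.plot_survival_function()
km.fit(black["time_to_event"], event_observed=black["death"], label="Black")
km.plot_survival_function(ax=ax)

# Log-rank test for a difference in survival between races
result = logrank_test(
    white["time_to_event"], black["time_to_event"],
    event_observed_A=white["death"], event_observed_B=black["death"],
)
print(result.p_value)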
Conclusions:
Significant racial disparities exist among single-ventricle patients regarding the timing of stage 2 palliation, cardiopulmonary bypass duration, and frequency of genetic syndromes. Black race was a risk factor for sub-optimal long-term outcome, although perioperative mortality was similar. These race-related factors warrant further studies to improve our understanding of the impact of race on outcomes.
Hemodynamic collapse in multi-trauma patients with severe traumatic brain injury (TBI) poses both a diagnostic and a therapeutic challenge for prehospital clinicians. Brain injury-associated shock (BIAS), likely resulting from catecholamine storm, can cause both ventricular dysfunction and vasoplegia but may present clinically in a manner similar to hemorrhagic shock. Despite the different treatment strategies these entities require, few studies describe this phenomenon in the early post-injury phase. This retrospective observational study aimed to describe the frequency of shock in isolated TBI among prehospital trauma patients and to compare their clinical characteristics with those of patients with hemorrhagic shock and patients with TBI without shock.
Methods:
All prehospital trauma patients intubated by prehospital medical teams from New South Wales Ambulance Aeromedical Operations (NSWA-AO) with an initial Glasgow Coma Scale (GCS) of 12 or less were investigated. Shock was defined as a pre-intubation systolic blood pressure under 90 mmHg and the administration of blood products or vasopressors. Injuries were classified from in-hospital computed tomography (CT) reports. From this, three study groups were derived: BIAS, hemorrhagic shock, and isolated TBI without shock. Descriptive statistics were then produced for clinical and treatment variables.
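For illustration only, the group definitions above could be applied to a per-patient table along the following lines; the file and column names in this Python sketch are assumptions, not the registry's actual fields.

import pandas as pd

# Hypothetical per-patient table; all column names are illustrative assumptions.
df = pd.read_csv("prehospital_intubations.csv")

eligible = df[df["initial_gcs"] <= 12].copy()
shock = (eligible["sbp_pre_intubation"] < 90) & (
    eligible["blood_products_given"] | eligible["vasopressors_given"]
)

eligible["group"] = "other"
eligible.loc[shock & eligible["isolated_tbi_on_ct"], "group"] = "BIAS"
eligible.loc[shock & eligible["hemorrhagic_injury_on_ct"], "group"] = "hemorrhagic shock"
eligible.loc[~shock & eligible["isolated_tbi_on_ct"], "group"] = "isolated TBI without shock"

print(eligible["group"].value_counts())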
Results:
Of 1,292 intubated patients, 423 had an initial GCS of 12 or less; of these, 24 (5.7%) had shock with an isolated TBI and 39 had hemorrhagic shock. Hemodynamic parameters, including degrees of tachycardia, hypotension, and elevated shock index, were similar amongst these groups. Prehospital clinical interventions, including blood transfusion and total fluids administered, were also similar, suggesting the groups were indistinguishable to prehospital clinicians.
Conclusions:
Hemodynamic compromise in the setting of isolated severe TBI is a rare clinical entity. Current prehospital physiological data available to clinicians do not allow these patients to be easily distinguished from those with hemorrhagic shock.
Approximately 6.5 million Americans ages 65 and older have Alzheimer’s disease and related dementias, a prevalence projected to triple by 2060. While subtle impairment in cognition and instrumental activities of daily living (IADLs) arises in the mild cognitive impairment (MCI) phase, these insidious changes are difficult to detect early given the limitations of existing assessment methods. Traditional IADL assessments, administered infrequently, are less sensitive to early MCI and not conducive to tracking the subtle changes that precede significant declines. Continuous passive monitoring of IADLs using sensors and software in home environments is a promising alternative. The purpose of this study was to determine which remotely monitored IADLs best distinguish between MCI and normal cognition.
Participants and Methods:
Participants were 65 years or older, independently community-dwelling, and had at least one daily medication and home internet access. Clinical assessments were performed at baseline. Electronic pillboxes (MedTracker) and computer software (Worktime) measured daily medication and computer habits using the Oregon Center for Aging and Technology (ORCATECH) platform. The Survey for Memory, Attention, and Reaction Time (SMART; Trail A, Trail B, and Stroop Tests) is a self-administered digital cognitive assessment that was deployed monthly. IADL data were aggregated for each participant at baseline (first 90 days) in each domain, and various features were developed for each domain. The receiver operating characteristic area under the curve (ROC-AUC) was calculated for each feature to assess discrimination between MCI and normal cognition.
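A minimal sketch of this per-feature discrimination analysis is shown below, assuming a baseline feature table with a binary diagnosis label and using scikit-learn's roc_auc_score; the file and column names are hypothetical.

import pandas as pd
from sklearn.metrics import roc_auc_score

# Hypothetical baseline feature table: one row per participant,
# a "diagnosis" column plus one column per IADL/SMART feature.
features = pd.read_csv("baseline_iadl_features.csv")
y = (features["diagnosis"] == "MCI").astype(int)

aucs = {}
for col in features.columns.drop("diagnosis"):
    valid = features[col].notna()
    aucs[col] = roc_auc_score(y[valid], features.loc[valid, col])

print(pd.Series(aucs).sort_values(ascending=False))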
Results:
Traditional IADL Questionnaires.
At baseline, 103 participants (normal n = 59, Mage = 73.6±5.5; MCI n = 44, Mage = 76.0±6.1) completed three functional questionnaires (Functional Activities Questionnaire; Measurement of Everyday Cognition (ECog), both self-report and informant). The Informant ECog demonstrated the highest AUC (72% AUC, p < .001).
Remotely monitored in-home IADLs and self-administered brief online cognitive test performance.
Eighty-four participants had medication data (normal n = 48, Mage = 73.2±5.4; MCI n = 36, Mage = 75.6±6.9). Four features related to pillbox-use frequency (73% AUC) and four features related to pillbox-use time (62% AUC) were developed. The discrepancy between self-reported and actual frequency of use was the most discriminating (67% AUC, p = .03).
Sixty-six participants had computer data (normal n = 38, Mage = 73.6±6.1; MCI n = 28, Mage = 76.6±6.8). Average usage time showed 64% AUC (p = .048), and usage variability showed 60% AUC (p = .18).
One hundred and two participants completed the SMART (normal n = 59, Mage = 73.6±5.5; MCI n = 43, Mage = 75.9±6.2). Eleven features related to survey completion time demonstrated 80% AUC in discriminating cognitive status. Eleven features related to the number of clicks during the survey demonstrated 70% AUC. Lastly, seven mouse movement features demonstrated 71% AUC.
Conclusions:
Combined pillbox-use frequency features and combined features from the self-administered brief online cognitive test (e.g., completion times, mouse cursor movements) have acceptable to excellent ability to discriminate between normal cognition and MCI and are relatively comparable to informant-rated IADL questionnaires. General computer-usage habits demonstrated lower discriminatory ability. Our approach has applied implications for detecting and tracking older adults’ declining cognition and function in real-world contexts.
Subjective cognitive decline (SCD, i.e., perceived cognitive decline without neuropsychological deficits) is associated with Alzheimer’s disease pathology and increased risk for cognitive impairment but is heterogeneous in etiology and has been linked to other factors including personality and depression. Mental wellbeing (i.e., the perception and functioning of social, emotional, and health-related aspects of one’s life) has been associated with subjective memory complaints, but its relationship with other subjective cognitive domains is poorly understood. Further characterizing the relationship between mental wellbeing and SCD could refine understanding of SCD and inform development of interventions that prevent progression to objective cognitive decline. This study aimed to describe the relationship between mental wellbeing and subjective decline in multiple cognitive domains and examine whether this relationship differs between older adults with normal cognition and those with mild cognitive impairment (MCI).
Participants and Methods:
Community-dwelling older adults (normal: n = 58, Mage = 73.7±5.6; MCI: n = 43, Mage = 75.9±6.1) completed the Everyday Cognition scale, a validated self-report measure of SCD, and the RAND-36 Health Survey, a validated self-report measure of health-related quality of life which includes a mental wellbeing subscale. Spearman’s rank correlations were conducted between self-reported mental wellbeing and each self-reported cognitive domain (i.e., memory, language, visuospatial, and executive function) for the Normal Cognition and MCI groups.
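A brief sketch of this correlation analysis follows; the per-participant table and column names are hypothetical, and the Fisher r-to-z comparison of independent correlations is an assumption about how the between-group difference reported in the Results was tested.

import numpy as np
import pandas as pd
from scipy.stats import spearmanr, norm

# Hypothetical per-participant table with group, wellbeing, and ECog domain scores.
df = pd.read_csv("scd_wellbeing.csv")
groups = {"Normal": df[df["group"] == "normal"], "MCI": df[df["group"] == "MCI"]}

# Per-group Spearman correlation between wellbeing and each subjective domain
for name, data in groups.items():
    for domain in ["memory", "language", "visuospatial", "executive"]:
        rho, p = spearmanr(data["wellbeing"], data[domain])
        print(name, domain, round(rho, 2), round(p, 3))

def compare_independent_correlations(r1, n1, r2, n2):
    """Fisher r-to-z test for a difference between two independent correlations."""
    z1, z2 = np.arctanh(r1), np.arctanh(r2)
    se = np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    z = (z1 - z2) / se
    return z, 2 * (1 - norm.cdf(abs(z)))

# e.g., memory correlations between groups; rounded inputs only approximate
# the statistic reported in the abstract.
print(compare_independent_correlations(-0.45, 58, -0.37, 43))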
Results:
Worse mental wellbeing was associated with worse subjective language and executive function in the normal group, rs(56) = -.42, p = .001; rs(56) = -.37, p = .005, but not for the MCI group, rs(41) = -.23, p = .15; rs(41) = -.12, p = .46. Worse mental wellbeing was associated with worse subjective visuospatial function in the MCI group, rs(41) = -.39, p = .009, but not in the normal group, rs(56) = -.11, p = .39. For both groups, worse mental wellbeing was associated with worse subjective memory, rs(56) = -.45, p < .001; rs(41) = -.37, p = .02. While this correlation was greater in the normal group, the difference was not significant (z = 0.38, p = .71).
Conclusions:
These results suggest that perceptions of mental wellbeing are related to perceptions of cognitive decline in multiple domains, and that the specific domains involved differ between normal and MCI groups. The differential associations may mean that perceptions of specific cognitive domains more strongly affect mental wellbeing, or that mental wellbeing more acutely influences perceptions of those domains. The overall observed relationship between SCD and mental wellbeing may have several explanations: the impact of broader health perceptions may extend to cognitive perception, behavioral changes associated with poor wellbeing may reduce subjective cognitive function, or worse subjective cognitive function may lead to negative experiences of wellbeing. Future longitudinal investigation could inform causal inferences. The more limited associations between mental wellbeing and SCD among MCI individuals may point to the role of decreased self-awareness (due to cognitive impairment) precluding detection of subtle changes in cognition or wellbeing. This study highlights the importance of better understanding mental wellbeing in experiences of SCD in both normal and MCI older adults to improve cognitive and mental health outcomes.
Examining the role of arts and culture in regional Australia often focuses on economic aspects within the creative industries. However, this perspective tends to disregard the value of unconventional practices and fails to recognise the influence of regional ecological settings and the well-being advantages experienced by amateur and hobbyist musicians who explore ubiquitous methods of music creation. This article presents the results of a survey conducted among practitioners in regional Australia, exploring their utilisation of creative technology ecosystems. This project marks the first independent, evidence-based study of experimental electronic music practices in regional Australia and how local and digital resource ecosystems support those activities. Spanning the years 2021 and 2022, the study involved interviewing 11 participants from several Australian states. In this article, we share the study’s findings, outlining the diverse range of experimental electronic music practices taking place across regional Australia and how practitioners navigate the opportunities and challenges presented by their local context.
Background:
SARS-CoV-2 viral load decreases over time after illness onset; however, immunocompromised patients may take longer for viral load to decrease or may have more erratic viral-load trajectories. We used strand-specific assay data from admitted patients to evaluate viral-load trajectories after illness onset.
Methods:
We reviewed records of hospitalized patients with a positive SARS-CoV-2 PCR who were tested using the strand-specific SARS-CoV-2 PCR during July 2020–April 2022. At Stanford Healthcare, we use a 2-step real-time reverse-transcription polymerase chain reaction (rRT-PCR) assay specific to the minus strand of the SARS-CoV-2 envelope gene to assess infectivity. Restricting our analysis to each patient’s first strand-specific assay, we used logistic regression models to compare patients with single versus multiple assays. Among patients with multiple tests, we compared those who had an upward trajectory in cycle threshold (Ct) values (a surrogate of decreasing viral load) with those who did not. We analyzed presence of symptoms, immunocompromised state, reason for immunosuppression, and severe COVID-19 leading to ICU care in univariate models and in multivariate models that further adjusted for additional covariates. Significant differences were assessed using logistic regression odds ratios and an α level of 0.05.
Results:
In total, 848 inpatients were included. Among them, 703 were tested only once and 145 were tested 2–6 times. The longest duration of minus-strand detection was 163 days. In univariate analyses, patients with a single minus-strand assay had lower odds of being symptomatic (OR, 0.55), of being immunocompromised (OR, 0.58), and of being admitted to the ICU with severe COVID-19 (OR, 0.49). In the multivariate analysis, being admitted to the ICU with severe COVID-19 was the only variable significantly associated with having >1 test (OR, 2.44). Among patients who had multiple strand-specific SARS-CoV-2 assays, 119 had the expected upward trend in minus-strand Ct values and 26 did not. Being immunocompromised was associated with nonrising minus-strand Ct values (OR, 33.3) when holding all other covariates in the model constant.
Conclusions:
Immunocompromised patients with COVID-19 tend to have active viral replication for longer and to have unexpected viral trajectories compared with immunocompetent patients. Among immunocompromised patients, suspension of transmission-based precautions may require case-by-case evaluation.
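The following is a hedged sketch of the univariate and multivariable odds-ratio analysis described in the Methods above, using statsmodels; the file, variable names, and covariates are illustrative assumptions rather than the study's actual model specification.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-patient table; "single_assay" is a 0/1 indicator of having
# had only one minus-strand assay, and the covariates are illustrative.
df = pd.read_csv("strand_specific_cohort.csv")

# Univariate model: association between immunocompromise and having a single assay
uni = smf.logit("single_assay ~ immunocompromised", data=df).fit()
print(np.exp(uni.params))  # odds ratios

# Multivariable model adjusting for additional covariates
multi = smf.logit(
    "single_assay ~ symptomatic + immunocompromised + severe_covid_icu + age",
    data=df,
).fit()
print(np.exp(multi.params), multi.pvalues)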
To evaluate the potential superiority of donanemab vs. aducanumab on the percentage of participants with amyloid plaque clearance (≤24.1 Centiloids [CL]) at 6 months in patients with early symptomatic Alzheimer's disease (AD) in the phase 3 TRAILBLAZER-ALZ-4 study. The amyloid cascade in AD involves the production and deposition of amyloid beta (Aβ) as an early and necessary event in the pathogenesis of AD.
Methods
Participants (n = 148) were randomized 1:1 to receive donanemab (700 mg IV Q4W [first 3 doses], then 1400 mg IV Q4W [subsequent doses]) or aducanumab (per USPI: 1 mg/kg IV Q4W [first 2 doses], 3 mg/kg IV Q4W [next 2 doses], 6 mg/kg IV Q4W [next 2 doses] and 10 mg/kg IV Q4W [subsequent doses]).
Results
Baseline demographics and characteristics were well-balanced across treatment arms (donanemab [N = 71], aducanumab [N = 69]). Twenty-seven donanemab-treated and 28 aducanumab-treated participants were defined as having intermediate tau.
Upon assessment of florbetapir F18 PET scans at 6 months, 37.9% of donanemab-treated vs. 1.6% of aducanumab-treated participants achieved amyloid clearance (p < 0.001). In the intermediate tau subpopulation, 38.5% of donanemab-treated vs. 3.8% of aducanumab-treated participants achieved amyloid clearance (p = 0.008).
Percent changes in brain amyloid levels were −65.2%±3.9% (baseline: 98.29 ± 27.83 CL) and −17.0%±4.0% (baseline: 102.40 ± 35.49 CL) in the donanemab and aducanumab arms, respectively (p < 0.001). In the intermediate tau subpopulation, percent changes in brain amyloid levels were −63.9%±7.4% (baseline: 104.97 ± 25.68 CL) and −25.4%±7.8% (baseline: 102.23 ± 28.13 CL) in the donanemab and aducanumab arms, respectively (p ≤ 0.001).
Adverse events (AEs) were reported by 62.0% of donanemab-treated and 66.7% of aducanumab-treated participants. There were no serious AEs due to ARIA in the donanemab arm, while one serious AE due to ARIA (1.4%) was reported in the aducanumab arm.
Conclusion
This study provides the first active-comparator data on amyloid plaque clearance in patients with early symptomatic AD. A significantly higher proportion of participants reached amyloid clearance, and amyloid plaque reductions were greater, with donanemab vs. aducanumab at 6 months.
Previously presented at the Clinical Trials on Alzheimer's Disease - 15th Conference, 2022.
Severe acute respiratory coronavirus virus 2 (SARS-CoV-2) real-time reverse-transcription polymerase chain reaction (rRT-PCR) strand-specific assay can be used to identify active SARS-CoV-2 viral replication. We describe the characteristics of 337 hospitalized patients with at least 1 minus-strand SARS-CoV-2 assay performed >20 days after illness onset. This test is a novel tool to identify high-risk hospitalized patients with prolonged SARS-CoV-2 replication.
The first demonstration of laser action in ruby was made in 1960 by T. H. Maiman of Hughes Research Laboratories, USA. Many laboratories worldwide began the search for lasers using different materials, operating at different wavelengths. In the UK, academia, industry and the central laboratories took up the challenge from the earliest days to develop these systems for a broad range of applications. This historical review looks at the contribution the UK has made to the advancement of the technology, the development of systems and components and their exploitation over the last 60 years.
Innovative methods to collect dietary data at multiple times across the year are needed to better understand seasonal or temporal changes in household diets and measure the impact of nutrition-sensitive agricultural programmes in low-income countries. The present study aims to validate a picture-based research tool for participants to self-record their household’s dietary diversity each month in villages of Manyoni District, Tanzania. Pictorial record charts were developed to reflect local food resources. In 113 randomly selected households, the person responsible for food preparation was trained to mark all items consumed by any household member within the home, or prepared for consumption outside the home, for a single recording day. The next day, an interview-based household 24-h food recall (H24HR) was collected for the same period. Separate analyses tested agreement (a) between picture charts and H24HR and (b) between H24HR following chart completion and on an alternative day. Concordance between methods differed between food groups and items but was high to very high for all cereals, vegetables, pulses, legumes and nuts and almost all fruits. Recording of ten items (including non-cultivated fruits and ingredients of mixed dishes) differed significantly between H24HR assessments, all of which were reported by more households in interviews following chart completion. Results suggest potential for visual prompts and the contemporaneous nature of data collection to improve the accuracy of interview-based recall. With adequate investment in developing and implementing context-adapted tools, pictorial charts may also offer an effective standalone method for use at multiple time-points in agricultural programmes.
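One plausible way to run the per-item agreement analyses described above is sketched below; the abstract does not name the exact statistics, so Cohen's kappa and McNemar's test, as well as the file and column names, are assumptions made for illustration.

import pandas as pd
from sklearn.metrics import cohen_kappa_score
from statsmodels.stats.contingency_tables import mcnemar

# Hypothetical long-format table: one row per household x food item, with 0/1
# indicators for whether the item was recorded on the chart and in the recall.
records = pd.read_csv("household_food_records.csv")

for item, grp in records.groupby("food_item"):
    kappa = cohen_kappa_score(grp["chart_reported"], grp["recall_reported"])
    table = (
        pd.crosstab(grp["chart_reported"], grp["recall_reported"])
        .reindex(index=[0, 1], columns=[0, 1], fill_value=0)
    )
    p = mcnemar(table, exact=True).pvalue
    print(item, round(kappa, 2), round(p, 3))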
Although life-story work is an established form of support for people with dementia and their carers, culturally Deaf people who are sign language users have been excluded from this practice. There is no evidence base for the cultural coherence of this approach with Deaf people who sign, nor any prior investigation of the linguistic and cultural adaptation that might be required for life-story work to be effective for sign language users with dementia. Given the lack of empirical work, this conceptual thematic literature review approaches the topic by first investigating the significance of storytelling practices amongst Deaf communities across the lifespan before using the findings to draw out key implications for the development of life-story work with culturally Deaf people who experience dementia and their formal and informal carers (whether Deaf or hearing). The reviewed work is presented in three themes: (a) the cultural positioning of self and others, (b) learning to be Deaf and (c) resistance narratives and narratives of resistance. The article concludes that life-story work has the potential to build on lifelong storying practices by Deaf people, the functions of which have included the (re)forming of cultural identity, the combating of ontological insecurity, knowledge transmission, the resistance of false identity attribution, and the celebration of language and culture.
Fifteen widely separated occurrences of kimberlite and kimberlitic rocks are now known in south-eastern Australia. Those that have been satisfactorily dated isotopically give ages ranging from Permian to Late Jurassic. One occurrence exhibits an intimate spatial association with carbonatite. The classification of these rocks as ‘kimberlitic’ is partly based on their mode of emplacement, and particularly on the presence of crust/mantle inclusions. Compared with African kimberlitic magmas, the southeastern Australian examples have lower incompatible-element contents. These differences are interpreted as representing slightly greater degrees of partial melting of a four-phase lherzolite assemblage at shallower depths (∼ 65 km) than typical African kimberlite magma.
Animal-source foods (ASF) have the potential to enhance the nutritional adequacy of cereal-based diets in low- and middle-income countries, through the provision of high-quality protein and bioavailable micronutrients. The development of guidelines for including ASF in local diets requires an understanding of the nutrient content of available resources. This article reviews food composition tables (FCT) used in sub-Saharan Africa, examining the spectrum of ASF reported and exploring data sources for each reference. Compositional data are shown to be derived from a small number of existing data sets from analyses conducted largely in high-income nations, often many decades previously. There are limitations in using such values, which represent the products of intensively raised animals of commercial breeds, as a reference in resource-poor settings where indigenous breed livestock are commonly reared in low-input production systems, on mineral-deficient soils and not receiving nutritionally balanced feed. The FCT examined also revealed a lack of data on the full spectrum of ASF, including offal and wild foods, which correspond to local food preferences and represent valuable dietary resources in food-deficient settings. Using poultry products as an example, comparisons are made between compositional data from three high-income nations, and potential implications of differences in the published values for micronutrients of public health significance, including Fe, folate and vitamin A, are discussed. It is important that those working on nutritional interventions and on developing dietary recommendations for resource-poor settings understand the limitations of current food composition data and that opportunities to improve existing resources are more actively explored and supported.
In this article we discuss how contemporary computational and electronic music-making practices might be characterised as a post-digital avant-garde. We also discuss how practitioners within the higher education sector can play a role in leading the development of these practices through their research and teaching. A brief overview of twentieth-century avant-garde practices is provided to set the scene before a case for defining a post-digital avant-garde is made. By way of illustration, the authors describe their own post-digital creative practices and then discuss how these integrate into their academic duties. We reflect on themes that run through avant-garde practices and continue into the post-digital. Finally, we describe how these themes inform an undergraduate music technology programme such that it might be shaped to reflect these developments and prepare students for a post-digital future.