Metabolic dietary patterns, including the Empirical Dietary Index for Hyperinsulinaemia (EDIH) and Empirical Dietary Inflammatory Pattern (EDIP), are known to impact multiple chronic diseases, but the role of the colonic microbiome in mediating such relationships is poorly understood. Among 1,610 adults with faecal 16S rRNA data in the TwinsUK cohort, we identified the microbiome profiles for EDIH and EDIP (from food frequency questionnaires) cross-sectionally using elastic net regression. We assessed the association of the dietary pattern-related microbiome profile scores with circulating biomarkers in multivariable-adjusted linear regression. In addition, we used PICRUSt2 to predict biological pathways associated with the enriched microbiome profiles, and further screened pathways for associations with the dietary scores in linear regression analyses. Microbiome profile scores developed with 32 (EDIH) and 15 (EDIP) genera were associated with higher insulin and homeostatic model assessment of insulin resistance. Six genera were associated with both dietary scores: Ruminococcaceae_UCG-008, Lachnospiraceae_UCG-008, Defluviitaleaceae_UCG-011 and Anaeroplasma inversely, and Negativibacillus and Streptococcus positively. Further, pathways in fatty acid biosynthesis, sugar acid degradation, and mevalonate metabolism were associated with insulinaemic and inflammatory diets. Dietary patterns that exert metabolic effects on insulin and inflammation may influence chronic disease risk by modulating gut microbial composition and function.
The expensive-tissue hypothesis (ETH) posited a brain–gut trade-off to explain how humans evolved large, costly brains. Versions of the ETH interrogating gut or other body tissues have been tested in non-human animals, but not humans. We collected brain and body composition data in 70 South Asian women and used structural equation modelling with instrumental variables, an approach that handles threats to causal inference including measurement error, unmeasured confounding and reverse causality. We tested a negative, causal effect of the latent construct ‘nutritional investment in brain tissues’ (MRI-derived brain volumes) on the construct ‘nutritional investment in lean body tissues’ (organ volume and skeletal muscle). We also predicted a negative causal effect of the brain latent on fat mass. We found negative causal estimates for both brain and lean tissue (−0.41; 95% CI −1.13, 0.23) and brain and fat (−0.56; 95% CI −2.46, 2.28). These results, although inconclusive, are consistent with theory and prior evidence of the brain trading off with lean and fat tissues, and they are an important step in assessing empirical evidence for the ETH in humans. Analyses using larger datasets, genetic data and causal modelling are required to build on these findings and expand the evidence base.
Background: Currently there are no disease-modifying treatments for synucleinopathies, including Parkinson’s disease dementia (PDD). Carrying a mutation in the GBA gene (beta-glucocerebrosidase/GCase) is a leading risk factor for synucleinopathies. Raising GCase activity lowers α-synuclein levels in cells and animal models. Ambroxol is a pharmacological chaperone for GCase and can raise GCase levels. Our goal is to test Ambroxol as a disease-modifying treatment in PDD. Methods: We randomized fifty-five individuals with PDD to Ambroxol 1050mg/day, 525mg/day, or placebo for 52 weeks. Primary outcome measures included safety, the Alzheimer’s disease Assessment Scale-cognitive (ADAS-Cog) subscale, and the Clinician’s Global Impression of Change (CGIC). Secondary outcomes included pharmacokinetics, cognitive and motor outcomes, and plasma and CSF biomarkers. Results: Ambroxol was well tolerated. There were 7 serious adverse events (SAEs), none deemed related to Ambroxol. GCase activity was increased in white blood cells by ~1.5 fold. There were no differences between groups on primary outcome measures. Patients receiving high-dose Ambroxol appeared better on the Neuropsychiatric Inventory. GBA carriers appeared to improve on some cognitive tests. pTau 181 was reduced in CSF. Conclusions: Ambroxol was safe and well tolerated in PDD, and may improve biomarkers and cognitive outcomes in GBA1 mutation carriers. ClinicalTrials.gov NCT02914366
HIV and severe wasting are associated with post-discharge mortality and hospital readmission among children with complicated severe acute malnutrition (SAM); however, the reasons remain unclear. We assessed body composition at hospital discharge, stratified by HIV and oedema status, in a cohort of children with complicated SAM in three hospitals in Zambia and Zimbabwe. We measured skinfold thicknesses and bioelectrical impedance analysis (BIA) to investigate whether fat and lean mass were independent predictors of time to death or readmission. Cox proportional hazards models were used to estimate the association between death/readmission and discharge body composition. Mixed effects models were fitted to compare longitudinal changes in body composition over 1 year. At discharge, 284 and 546 children had complete BIA and skinfold measurements, respectively. Low discharge lean and peripheral fat mass were independently associated with death/hospital readmission. Each unit Z-score increase in impedance index and triceps skinfolds was associated with 48 % (adjusted hazard ratio 0·52, 95 % CI (0·30, 0·90)) and 17 % (adjusted hazard ratio 0·83, 95 % CI (0·71, 0·96)) lower hazard of death/readmission, respectively. HIV-positive v. HIV-negative children had lower gains in sum of skinfolds (mean difference −1·49, 95 % CI (−2·01, −0·97)) and impedance index Z-scores (–0·13, 95 % CI (−0·24, −0·01)) over 52 weeks. Children with non-oedematous v. oedematous SAM had lower mean changes in the sum of skinfolds (–1·47, 95 % CI (−1·97, −0·97)) and impedance index Z-scores (–0·23, 95 % CI (−0·36, −0·09)). Risk stratification to identify children at risk for mortality or readmission, and interventions to increase lean and peripheral fat mass, should be considered in the post-discharge care of these children.
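As a hedged illustration of the hazard-ratio arithmetic in the abstract above, the reported adjusted hazard ratios can be converted to the quoted percentage reductions with the standard 1 − HR formula (the HR values are taken from the text; the helper name is my own):

```python
def hazard_reduction_pct(hazard_ratio: float) -> float:
    """Percent reduction in hazard implied by a hazard ratio below 1."""
    return (1.0 - hazard_ratio) * 100.0

# Impedance index: adjusted HR 0.52 per unit Z-score increase
print(hazard_reduction_pct(0.52))  # the reported 48 % lower hazard

# Triceps skinfold: adjusted HR 0.83 per unit Z-score increase
print(hazard_reduction_pct(0.83))  # the reported ~17 % lower hazard
```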
Mental health and psychosocial support (MHPSS) staff in humanitarian settings have limited access to clinical supervision and are at high risk of experiencing burnout. We previously piloted an online, peer-supervision program for MHPSS professionals working with displaced Rohingya (Bangladesh) and Syrian (Turkey and Northwest Syria) communities. Pilot evaluations demonstrated that online, peer-supervision is feasible, low-cost, and acceptable to MHPSS practitioners in humanitarian settings.
Objectives
This project will determine the impact of online supervision on i) the wellbeing and burnout levels of local MHPSS practitioners, and ii) practitioner technical skills to improve beneficiary perceived service satisfaction, acceptability, and appropriateness.
Methods
MHPSS practitioners in two contexts (Bangladesh and Turkey/Northwest Syria) will participate in 90-minute group-based online supervision, fortnightly for six months. Sessions will be run on Zoom and will be co-facilitated by MHPSS practitioners and in-country research assistants. A quasi-experimental multiple-baseline design will enable a quantitative comparison of practitioner and beneficiary outcomes between control periods (12 months) and the intervention. Outcomes to be assessed include the Kessler-6, Harvard Trauma Questionnaire, Copenhagen Burnout Inventory and Client Satisfaction Questionnaire-8.
Results
A total of 80 MHPSS practitioners will complete 24 monthly online assessments from May 2022. Concurrently, 1920 people receiving MHPSS services will be randomly selected for post-session interviews (24 per practitioner).
Conclusions
This study will determine the impact of an online, peer-supervision program for MHPSS practitioners in humanitarian settings. Results from the baseline assessments, pilot evaluation, and theory of change model will be presented.
Over the last 25 years, radiowave detection of neutrino-generated signals, using cold polar ice as the neutrino target, has emerged as perhaps the most promising technique for detection of extragalactic ultra-high energy neutrinos (corresponding to neutrino energies in excess of 0.01 joules, or $10^{17}$ electron volts). During the summer of 2021, and in tandem with the initial deployment of the Radio Neutrino Observatory in Greenland (RNO-G), we conducted radioglaciological measurements at Summit Station, Greenland, to refine our understanding of the ice target. We report the result of one such measurement, the radio-frequency electric field attenuation length $L_\alpha$. We find an approximately linear dependence of $L_\alpha$ on frequency, with the best fit of the average field attenuation for the upper 1500 m of ice: $\langle L_\alpha \rangle = ((1154 \pm 121) - (0.81 \pm 0.14)\,(\nu/{\rm MHz}))\,{\rm m}$ for frequencies $\nu \in [145, 350]$ MHz.
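The fitted attenuation model quoted above can be evaluated directly; a minimal sketch, with the central-value coefficients and validity range taken from the best fit in the abstract (the function name is my own, and the quoted uncertainties on intercept and slope are ignored):

```python
def attenuation_length_m(freq_mhz: float) -> float:
    """Best-fit depth-averaged field attenuation length <L_alpha>, in metres,
    for the upper 1500 m of ice: <L_alpha> = (1154 - 0.81 * (nu/MHz)) m.
    The fit is reported only for frequencies of 145-350 MHz."""
    if not 145.0 <= freq_mhz <= 350.0:
        raise ValueError("fit reported only for 145-350 MHz")
    return 1154.0 - 0.81 * freq_mhz

# Example: at 200 MHz the mean attenuation length is about 992 m.
print(attenuation_length_m(200.0))
```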
Poor transition planning contributes to discontinuity of care at the child–adult mental health service boundary (SB), adversely affecting mental health outcomes in young people (YP). The aim of the study was to determine whether managed transition (MT) improves mental health outcomes of YP reaching the child/adolescent mental health service (CAMHS) boundary compared with usual care (UC).
Methods
A two-arm cluster-randomised trial (ISRCTN83240263 and NCT03013595) with clusters allocated 1:2 between MT and UC. Recruitment took place in 40 CAMHS (eight European countries) between October 2015 and December 2016. Eligible participants were CAMHS service users who were receiving treatment or had a diagnosed mental disorder, had an IQ ⩾ 70 and were within 1 year of reaching the SB. MT was a multi-component intervention that included CAMHS training, systematic identification of YP approaching SB, a structured assessment (Transition Readiness and Appropriateness Measure) and sharing of information between CAMHS and adult mental health services. The primary outcome was HoNOSCA (Health of the Nation Outcome Scale for Children and Adolescents) score 15 months post-entry to the trial.
Results
The mean difference in HoNOSCA scores between the MT and UC arms at 15 months was −1.11 points (95% confidence interval −2.07 to −0.14, p = 0.03). The cost of delivering the intervention was relatively modest (€17–€65 per service user).
Conclusions
MT led to improved mental health of YP after the SB but the magnitude of the effect was small. The intervention can be implemented at low cost and form part of planned and purposeful transitional care.
Among 1,770 healthcare workers serving in high-risk care areas for coronavirus disease 2019 (COVID-19), 39 (2.2%) were seropositive. Exposure to severe acute respiratory coronavirus virus 2 (SARS-CoV-2) in the community was associated with being seropositive. Job or unit type and percentage of time working with COVID-19 patients were not associated with positive antibody tests.
Introduction: For rhythm control of acute atrial flutter (AAFL) in the emergency department (ED), choices include initial drug therapy or initial electrical cardioversion (ECV). We compared the strategies of pharmacological cardioversion followed by ECV if necessary (Drug-Shock), and ECV alone (Shock Only). Methods: We conducted a randomized, blinded, placebo-controlled trial (1:1 allocation) comparing two rhythm control strategies at 11 academic EDs. We included stable adult patients with AAFL, where onset of symptoms was <48 hours. Patients underwent central web-based randomization stratified by site. The Drug-Shock group received an infusion of procainamide (15mg/kg over 30 minutes) followed 30 minutes later, if necessary, by ECV at 200 joules x 3 shocks. The Shock Only group received an infusion of saline followed, if necessary, by ECV x 3 shocks. The primary outcome was conversion to sinus rhythm for ≥30 minutes at any time following onset of infusion. Patients were followed for 14 days. The primary outcome was evaluated on an intention-to-treat basis. Statistical significance was assessed using chi-squared tests and multivariable logistic regression. Results: We randomized 76 patients, and none was lost to follow-up. The Drug-Shock (N = 33) and Shock Only (N = 43) groups were similar for all characteristics including mean age (66.3 vs 63.4 yrs), duration of AAFL (30.1 vs 24.5 hrs), previous AAFL (72.7% vs 69.8%), median CHADS2 score (1 vs 1), and mean initial heart rate (128.9 vs 126.0 bpm). The Drug-Shock and Shock only groups were similar for the primary outcome of conversion (100% vs 93%; absolute difference 7.0%, 95% CI -0.6;14.6; P = 0.25). The multivariable analyses confirmed the similarity of the two strategies (P = 0.19). In the Drug-Shock group 21.2% of patients converted with the infusion. 
There were no statistically significant differences for time to conversion (84.2 vs 97.6 minutes), total ED length of stay (9.4 vs 7.5 hours), disposition home (100% vs 95.3%), and stroke within 14 days (0 vs 0). Premature discontinuation of infusion (usually for transient hypotension) was more common in the Drug-Shock group (9.1% vs 0.0%) but there were no serious adverse events. Conclusion: Both the Drug-Shock and Shock Only strategies were highly effective and safe in allowing AAFL patients to go home in sinus rhythm. IV procainamide alone was effective in only one fifth of patients, much less than for acute AF.
Introduction: Spatial ability has been defined as a skill in representing, transforming, generating and recalling symbolic, non-linguistic information. Two distinct human spatial abilities have been identified: visualization and orientation. A sex difference in spatial abilities favouring males has been documented. A pattern of negative effects with increasing age on spatial abilities has also been demonstrated. Spatial abilities have been correlated to anatomy knowledge assessment using practical examination, three-dimensional synthesis from two-dimensional views, drawing of views, and cross-sections in a systematic review. Spatial abilities have also been correlated to technical skills performance in beginners and intermediate learners in a systematic review. The objective of this study was to conduct a systematic review of the interrelationship between spatial abilities, anatomy knowledge and technical skills. Methods: Search criteria included ‘spatial abilities’, ‘anatomy knowledge’ and ‘technical skills’. Keywords related to these criteria were identified. A literature search was done up to November 9, 2018 in Scopus and in several medical and educational databases on Ovid and EBSCOhost platforms. A bank of citations was obtained and was reviewed independently by two investigators. Citations related to abstracts, literature reviews, theses and books were excluded. Articles related to retained citations were obtained and a final list of articles was established. Methods relating spatial abilities testing, anatomy knowledge assessment and technical skills performance were identified. Results: A series of 385 titles and abstracts was obtained. After duplicates were removed and selection criteria applied, 11 articles were retained, fully reviewed, and subsequently excluded with reasons. Conclusion: No eligible articles were found in a systematic review of the interrelationship between spatial abilities, anatomy knowledge and technical skills.
The outcome of future studies could help to further understand the cognitive process involved in learning a technical skill in Emergency Medicine.
Introduction: An important challenge physicians face when treating acute heart failure (AHF) patients in the emergency department (ED) is deciding whether to admit or discharge, with or without early follow-up. The overall goal of our project was to improve care for AHF patients seen in the ED while avoiding unnecessary hospital admissions. The specific goal was to introduce hospital rapid referral clinics to ensure AHF patients were seen within 7 days of ED discharge. Methods: This prospective before-after study was conducted at two campuses of a large tertiary care hospital, including the EDs and specialty outpatient clinics. We enrolled AHF patients ≥50 years who presented to the ED with shortness of breath (<7 days). The 12-month before (control) period was separated from the 12-month after (intervention) period by a 3-month implementation period. Implementation included creation of rapid access AHF clinics staffed by cardiology and internal medicine, and development of referral procedures. There was extensive in-servicing of all ED staff. The primary outcome measure was hospital admission at the index visit or within 30 days. Secondary outcomes included mortality and actual access to rapid follow-up. We used segmented autoregression analysis of the monthly proportions to determine whether there was a change in admissions coinciding with the introduction of the intervention and estimated a sample size of 700 patients. Results: The patients in the before period (N = 355) and the after period (N = 374) were similar for age (77.8 vs. 78.1 years), arrival by ambulance (48.7% vs 51.1%), comorbidities, current medications, and need for non-invasive ventilation (10.4% vs. 6.7%). Comparing the before to the after periods, we observed a decrease in hospital admissions on index visit (from 57.7% to 42.0%; P < 0.01), as well as in all admissions within 30 days (from 65.1% to 53.5%; P < 0.01).
The autoregression analysis, however, demonstrated a pre-existing trend to fewer admissions and could not attribute this to the intervention (P = 0.91). Attendance at a specialty clinic amongst those discharged increased from 17.8% to 42.1% (P < 0.01), and the median days to clinic decreased from 13 to 6 days (P < 0.01). Thirty-day mortality did not change (4.5% vs. 4.0%; P = 0.76). Conclusion: Implementation of rapid-access dedicated AHF clinics led to considerably increased access to specialist care, much reduced follow-up times, and a possible reduction in hospital admissions. Widespread use of this approach can improve AHF care in Canada.
Introduction: In-hospital cardiac arrest (IHCA) most commonly occurs in non-monitored areas, where we observed a 10min delay before defibrillation (Phase I). Nurses (RNs) and respiratory therapists (RTs) cannot legally use Automated External Defibrillators (AEDs) during IHCA without a medical directive. We sought to evaluate IHCA outcomes following usual implementation (Phase II) vs. a Theory-Based educational program (Phase III) allowing RNs and RTs to use AEDs during IHCA. Methods: We completed a pragmatic before-after study of consecutive IHCA. We used ICD-10 codes to identify potentially eligible cases and included IHCA cases for which resuscitation was attempted. We obtained consensus on all data definitions before initiation of standardized-piloted data extraction by trained investigators. Phase I (Jan.2012-Aug.2013) consisted of baseline data. We implemented the AED medical directive in Phase II (Sept.2013-Aug.2016) using usual implementation strategies. In Phase III (Sept.2016-Dec.2017) we added an educational video informed by key constructs from a Theory of Planned Behavior survey. We report univariate comparisons of Utstein IHCA outcomes using 95% confidence intervals (CI). Results: There were 753 IHCA for which resuscitation was attempted with the following similar characteristics (Phase I n = 195; II n = 372; III n = 186): median age 68, 60.0% male, 79.3% witnessed, 29.7% non-monitored medical ward, 23.9% cardiac cause, 47.9% initial rhythm of pulseless electrical activity and 27.2% ventricular fibrillation/tachycardia (VF/VT). Comparing Phases I, II and III: an AED was used 0 times (0.0%), 21 times (5.6%), 15 times (8.1%); time to 1st rhythm analysis was 6min, 3min, 1min; and time to 1st shock was 10min, 10min and 7min. Comparing Phases I and III: time to 1st shock decreased by 3min (95%CI -7; 1), sustained ROSC increased from 29.7% to 33.3% (AD3.6%; 95%CI -10.8; 17.8), and survival to discharge increased from 24.6% to 25.8% (AD1.2%; 95%CI -7.5; 9.9). 
In the VF/VT subgroup, time to first shock decreased from 9 to 3 min (AD-6min; 95%CI -12; 0) and survival increased from 23.1% to 38.7% (AD15.6%; 95%CI -4.3; 35.4). Conclusion: The implementation of a medical directive allowing for AED use by RNs and RTs successfully improved key outcomes for IHCA victims, particularly following the Theory-Based education video. The expansion of this project to other hospitals and health care professionals could significantly impact survival for VF/VT patients.
Introduction: For rhythm control of acute atrial fibrillation (AAF) in the emergency department (ED), choices include initial drug therapy or initial electrical cardioversion (ECV). We compared the strategies of pharmacological cardioversion followed by ECV if necessary (Drug-Shock), and ECV alone (Shock Only). Methods: We conducted a randomized, blinded, placebo-controlled trial (1:1 allocation) comparing two rhythm control strategies at 11 academic EDs. We included stable adult patients with AAF, where onset of symptoms was <48 hours. Patients underwent central web-based randomization stratified by site. The Drug-Shock group received an infusion of procainamide (15mg/kg over 30 minutes) followed 30 minutes later, if necessary, by ECV at 200 joules x 3 shocks. The Shock Only group received an infusion of saline followed, if necessary, by ECV x 3 shocks. The primary outcome was conversion to sinus rhythm for ≥30 minutes at any time following onset of infusion. Patients were followed for 14 days. The primary outcome was evaluated on an a priori-specified modified intention-to-treat (MITT) basis excluding patients who never received the study infusion (e.g. spontaneous conversion). Data were analyzed using chi-squared tests and logistic regression. Our target sample size was 374 evaluable patients. Results: Of 395 randomized patients, 18 were excluded from the MITT analysis; none were lost to follow-up. The Drug-Shock (N = 198) and Shock Only (N = 180) groups (total = 378) were similar for all characteristics including mean age (60.0 vs 59.5 yrs), duration of AAF (10.1 vs 10.8 hrs), previous AF (67.2% vs 68.3%), median CHADS2 score (0 vs 0), and mean initial heart rate (119.9 vs 118.0 bpm). More patients converted to normal sinus rhythm in the Drug-Shock group (97.0% vs 92.2%; absolute difference 4.8%, 95% CI 0.2-9.9; P = 0.04). The multivariable analyses confirmed the Drug-Shock strategy superiority (P = 0.04).
There were no statistically significant differences for time to conversion (91.4 vs 85.4 minutes), total ED length of stay (7.1 vs 7.7 hours), disposition home (97.0% vs 96.1%), and stroke within 14 days (0 vs 0). Premature discontinuation of infusion was more common in the Drug-Shock group (8.1% vs 0.6%) but there were no serious adverse events. Conclusion: Both the Drug-Shock and Shock Only strategies were highly effective and safe in allowing AAF patients to go home in sinus rhythm. A strategy of initial cardioversion with procainamide was superior to a strategy of immediate ECV.
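As a rough check on the primary-outcome arithmetic above, the absolute risk difference and a Wald confidence interval can be recomputed from counts back-calculated from the reported group sizes and percentages (≈192/198 vs ≈166/180; these counts are my approximation, and the trial's published interval of 0.2–9.9% was presumably computed with a different method, so the result below only roughly matches):

```python
import math

def risk_difference_wald(x1: int, n1: int, x2: int, n2: int, z: float = 1.96):
    """Absolute risk difference p1 - p2 with a Wald 95% confidence interval."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Converter counts approximated from the reported percentages:
# Drug-Shock ~192/198 (97.0%), Shock Only ~166/180 (92.2%)
diff, lo, hi = risk_difference_wald(192, 198, 166, 180)
print(f"difference {diff:.1%}, 95% CI ({lo:.1%}, {hi:.1%})")
```

This reproduces a difference of about 4.7% with a lower bound just above zero, consistent with the reported statistically significant superiority of Drug-Shock.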
Salmonella spp. continue to be a leading cause of foodborne morbidity worldwide. To assess the risk of foodborne disease, current national regulatory schemes focus on prevalence estimates of Salmonella and other pathogens. The role of pathogen quantification as a risk management measure and its impact on public health is not well understood. To address this information gap, a quantitative risk assessment model was developed to evaluate the impact of pathogen enumeration strategies on public health after consumption of contaminated ground turkey in the USA. Public health impact was evaluated by using several dose–response models for high- and low-virulent strains to account for potential under- or overestimation of human health impacts. The model predicted 2705–21 099 illnesses that would result in 93–727 reported cases of salmonellosis. Sensitivity analysis predicted cooking an unthawed product at home as the riskiest consumption scenario and microbial concentration the most influential input on the incidence of human illnesses. Model results indicated that removing ground turkey lots exceeding contamination levels of 1 MPN/g and 1 MPN in 25 g would decrease the median number of illnesses by 86–94% and 99%, respectively. For a single production lot, contamination levels higher than 1 MPN/g would be needed to result in a reported case to public health officials. At contamination levels of 10 MPN/g, there would be a 13% chance of detecting an outbreak, and at 100 MPN/g, the likelihood of detecting an outbreak increases to 41%. Based on these model prediction results, risk management strategies should incorporate pathogen enumeration. This would have a direct impact on illness incidence linking public health outcomes with measurable food safety objectives.
To assess differences in cognitive function and gross brain structure in children seven years after an episode of severe acute malnutrition (SAM), compared with other Malawian children.
Design
Prospective longitudinal cohort assessing school grade achieved and results of five computer-based (CANTAB) tests, covering three cognitive domains. A subset underwent brain MRI scans which were reviewed using a standardized checklist of gross abnormalities and compared with a reference population of Malawian children.
Setting
Blantyre, Malawi.
Participants
Children discharged from SAM treatment in 2006 and 2007 (n 320; median age 9·3 years) were compared with controls: siblings closest in age to the SAM survivors and age/sex-matched community children.
Results
SAM survivors were significantly more likely to be in a lower grade at school than controls (adjusted OR = 0·4; 95 % CI 0·3, 0·6; P < 0·0001) and had consistently poorer scores in all CANTAB cognitive tests. Adjusting for HIV and socio-economic status diminished statistically significant differences. There were no significant differences in odds of brain abnormalities and sinusitis between SAM survivors (n 49) and reference children (OR = 1·11; 95 % CI 0·61, 2·03; P = 0·73).
Conclusions
Despite apparent preservation in gross brain structure, persistent impaired school achievement is likely to be detrimental to individual attainment and economic well-being. Understanding the multifactorial causes of lower school achievement is therefore needed to design interventions for SAM survivors to thrive in adulthood. The cognitive and potential economic implications of SAM need further emphasis to better advocate for SAM prevention and early treatment.
A gradational increase in concentration of CO2 towards the margins of a very fine-grained basalt dyke has led to the development of a pale marginal facies, enriched in carbonates, particularly siderite. Increases in CO2 from about 3 % in the interior of the dyke to 8 % at the margins are accompanied by decreases in SiO2 and Fe2O3, increases in Al2O3, and less significant changes in the other major components. Cu, Co, and Zn change only slightly, and Cr, Ni, and Li remain constant. Petrographic variation is considerable, even in the superficially homogeneous interior of the dyke, in which it ranges from a type containing a titanaugite (analysed) to one devoid of pyroxene, but containing conspicuous opaque minerals. Microprobe analysis for Fe and Ti shows that these comprise: octahedral titaniferous magnetite; rodlets, less than 1 µm thick, of rutile partly altered to, or overgrown by, ilmenite; and sub-opaque patches with a very low Ti/Fe ratio. Plagioclase, An55, is the most abundant and constant crystalline phase in the interior of the dyke, but changes to An20 in the marginal facies. Mineral content of the latter, deduced from optical, chemical, and X-ray data, also includes siderite, serpentine and clay minerals, leucoxene, and apatite. There is no evidence of quartz, sericite, or calcite.
Petrographic evidence shows that variations in concentrations of CO2 and H2O affected phase equilibria from the start of magmatic crystallization. Data on the fO2 required for TiO2 and Fe-oxide phases to co-exist at magmatic temperatures indicate that, initially, the concentrations of CO2 and H2O in the interior of the dyke were higher than the values recorded in analyses of the rocks. From this evidence and the field relationships, it is concluded that the intruding magma was rich in volatiles, which diffused towards the dyke margins and, in part, became trapped as the magma congealed, producing a changed marginal assemblage of minerals. The dyke provides a unique glimpse of influences on a basic magma exerted by what S. J. Shand has aptly termed the ‘fugitive constituents’, the transient effects of which are rarely preserved in the rocks.