Embedding climate resilient development principles in planning, urban design, and architecture means ensuring that transformation of the built environment helps achieve carbon neutrality, effective adaptation, and well-being for people and nature. Planners, urban designers, and architects are called to bridge the domains of research and practice and evolve their agency and capacity, developing methods and tools consistent across spatial scales to ensure the convergence of outcomes towards targets. Shaping change necessitates an innovative action-driven framework with multi-scale analysis of urban climate factors and co-mapping, co-design, and co-evaluation with city stakeholders and communities. This Element provides an analysis of how urban climate factors, system efficiency, form and layout, building envelope and surface materials, and green/blue infrastructure affect key metrics and indicators related to complementary aspects like greenhouse gas emissions, impacts of extreme weather events, spatial and environmental justice, and human comfort. This title is also available as open access on Cambridge Core.
Reading is a complex cognitive process requiring the integration of orthographic, phonological, and semantic information. The visual word form area, located in the ventral occipitotemporal cortex, is critically involved in orthographic decoding, and damage to this region is known to cause alexia. In contrast, the contributions of white matter pathways supporting reading are less well understood.
Method:
We present a unique neurosurgical case: a patient undergoing awake brain surgery for resection of a metastasis in the left occipitotemporal cortex. A tubular retractor was used to access the lesion, and during its insertion the patient underwent careful, continuous neuropsychological testing, including evaluation of reading. fMRI language mapping and diffusion MRI were performed preoperatively. Postoperative neuropsychological testing was completed two weeks after surgery to assess cognitive outcome.
Results:
The patient developed alexia with letter-by-letter reading in real time during insertion of the tubular retractor. Stealth imaging enabled localization of the retractor at the exact onset of the alexia and, correlated with tractography, showed that the retractor was in the vertical occipital fasciculus (VOF).
Conclusions:
We present the first detailed case report linking the VOF to the acute onset of alexia observed intraoperatively during awake brain surgery. We discuss the connectomics of reading and possible contributions of the VOF in reading.
Patients with CHD are at risk for developing necrotising enterocolitis. Currently, no standardised approach for identification, diagnosis, and treatment of necrotising enterocolitis exists, and rates and management strategies vary across centres. We used the Paediatric Cardiac Critical Care Consortium to identify high- and low-performing centres based on necrotising enterocolitis rates and convened a necrotising enterocolitis working group. The aims of the group were to understand why this variability exists, identify risk factors, and create a foundation for a prospective improvement project.
Methods:
Nine centres participated, and collaborative learning sessions were held with multidisciplinary input. REDCap surveys were disseminated to centres to create consensus among site practices and recommendations.
Results:
The following topics were discussed: diagnosis, risk factors, and management. Diagnosis consensus suggests that (1) diagnosis would benefit from a comprehensive scoring tool, and (2) ultrasound may serve as a highly sensitive diagnostic tool for those at high risk in the absence of other radiologic findings of necrotising enterocolitis. Risk factor consensus suggests that (1) those with ductal-dependent systemic blood flow are at highest risk, and (2) vasopressors with splanchnic constriction should be used with caution. Management consensus suggests that (1) breastmilk be used first-line for feeding, (2) feeds resume 24–48 hours after a necrotising enterocolitis rule-out, and (3) surgical decision-making defer to physical examination and laboratory evaluation above radiographic findings.
Conclusion:
Variability exists in diagnosing necrotising enterocolitis and feeding approaches for at-risk patients. Opportunities exist for collaboration to standardise definitions, compare outcomes, identify risk factors, and create consensus on the management of necrotising enterocolitis.
Loneliness is associated with several physical and mental health problems, yet its costs to the healthcare system remain unclear.
Aims
The current study aimed to review the literature on the health and social care impacts of loneliness and to review economic evaluations of loneliness interventions.
Method
We conducted a systematic review of studies published from 2008 to April 2025 by searching five bibliographic databases, grey literature and reference lists of systematic reviews. Studies estimating health and social care costs/expenditure or reporting healthcare resource utilisation were included to assess the impact of loneliness on the health system. Return-on-investment, social-return-on-investment and cost-effectiveness evaluations were included to assess the economic impact of loneliness interventions. We conducted quality appraisal and narrative synthesis of results.
Results
We included 53 studies. Eight estimated the healthcare cost/expenditure of loneliness, 33 reported healthcare resource use and 19 were economic evaluations of interventions. Findings relating to the cost/expenditure of loneliness and service use were inconsistent: some studies reported excess costs/expenditure and service use, whereas others found lower costs/expenditure and service use. Economic evaluation studies indicated that loneliness interventions can be cost-effective, but were not consistently cost-saving or effective in reducing loneliness.
Conclusions
Findings on the impact of loneliness on the healthcare system and economic evaluations of loneliness interventions were varied. Therefore, we cannot derive confident conclusions from this review. To address evidence gaps, future research relating to social care, younger populations, direct healthcare costs of loneliness and randomised controlled trials with long-term follow-ups should be prioritised.
As temperatures globally continue to rise, sporting events such as marathons will take place on warmer days, increasing the risk of exertional heat stroke (EHS).
Methods
The medical librarian developed and executed comprehensive searches in Ovid MEDLINE, Ovid Embase, CINAHL, SPORTDiscus, Scopus, and Web of Science Core Collection using relevant keywords. The results underwent title, abstract, and full-text screening in Covidence, a web-based screening tool, and were analyzed for pertinent data.
Results
A total of 3918 results were retrieved. After duplicate removal and title, abstract, and full text screening, 38 articles remained for inclusion. There were 22 case reports, 12 retrospective reviews, and 4 prospective observational studies. The races included half marathons, marathons, and other long distances. In the case reports and retrospective reviews, the mean environmental temperatures were 21.3°C and 19.8°C, respectively. Discussions emphasized that increasing environmental temperatures result in higher incidences of EHS.
Conclusion
With rising global temperatures from climate change, athletes are at higher risk of EHS. Early ice-water immersion is the best treatment for EHS. Earlier start times and cooling stations for races may reduce the incidence of EHS. Future work should concentrate on establishing EHS prevention and mitigation protocols.
Patients with posttraumatic stress disorder (PTSD) exhibit smaller volumes in commonly reported brain regions, including the amygdala and hippocampus, which are associated with fear and memory processing. In the current study, we conducted a voxel-based morphometry (VBM) meta-analysis using whole-brain statistical maps with neuroimaging data from the ENIGMA-PGC PTSD working group.
Methods
T1-weighted structural neuroimaging scans from 36 cohorts (PTSD n = 1309; controls n = 2198) were processed using a standardized VBM pipeline (ENIGMA-VBM tool). We meta-analyzed the resulting statistical maps for voxel-wise differences in gray matter (GM) and white matter (WM) volumes between PTSD patients and controls, performed subgroup analyses considering the trauma exposure of the controls, and examined associations between regional brain volumes and clinical variables including PTSD (CAPS-4/5, PCL-5) and depression severity (BDI-II, PHQ-9).
Results
PTSD patients exhibited smaller GM volumes across the frontal and temporal lobes and cerebellum, with the most significant effect in the left cerebellum (Hedges' g = 0.22, p_corrected = .001), and smaller cerebellar WM volume (peak Hedges' g = 0.14, p_corrected = .008). We observed similar regional differences when comparing patients to trauma-exposed controls, suggesting these structural abnormalities may be specific to PTSD. Regression analyses revealed that PTSD severity was negatively associated with GM volumes within the cerebellum (p_corrected = .003), while depression severity was negatively associated with GM volumes within the cerebellum and superior frontal gyrus in patients (p_corrected = .001).
Conclusions
PTSD patients exhibited widespread, regional differences in brain volumes where greater regional deficits appeared to reflect more severe symptoms. Our findings add to the growing literature implicating the cerebellum in PTSD psychopathology.
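The group differences above are reported as Hedges' g, which is Cohen's d with a small-sample bias correction. As a minimal illustrative sketch (the group summary statistics below are made up, not ENIGMA data):

```python
import math

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Hedges' g: standardized mean difference with small-sample correction."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    # Approximate bias-correction factor J
    j = 1 - 3 / (4 * (n1 + n2) - 9)
    return j * d

# Hypothetical regional volumes (arbitrary units): controls vs. patients
g = hedges_g(5.10, 0.50, 2198, 5.00, 0.45, 1309)
print(round(g, 3))
```

With samples this large the correction factor is nearly 1, so g is essentially Cohen's d; the correction matters mainly for small cohorts.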
CBRN incidents require specialized hazmat decontamination protocols to prevent secondary contamination and systemic toxicity. While wet decontamination is standard, it can present challenges in cold weather or when resources are limited. Dry decontamination offers an alternative and supportive approach, though its effectiveness across different contaminants remains unclear. This scoping review evaluates the effectiveness, advantages, and limitations of dry decontamination in hazmat incidents.
Methods
A scoping review was conducted using MEDLINE, CINAHL, and other databases. Following the PRISMA-ScR approach, 9 studies were selected from 234 identified articles. The review assessed decontamination techniques, materials, and effectiveness across different contaminants.
Results
Dry decontamination is rapid, resource-efficient, and suitable for immediate use in pre-hospital and hospital settings, especially during mass casualty incidents (MCIs). Dry decontamination is highly effective for liquid contaminants, with blue roll and sterile trauma dressings removing over 80% of contaminants within minutes. However, dry decontamination is less effective for hair and particulate contaminants. Blotting and rubbing techniques significantly enhance decontamination efficiency.
Conclusions
Dry decontamination can be an effective alternative to wet decontamination, particularly for liquid contaminants, and a first-line approach in scenarios where wet decontamination is impractical for logistical or environmental reasons. However, dry decontamination is less effective than wet decontamination for hair and particulate contaminants. Combining dry and wet decontamination has been shown to be more effective. Including dry decontamination as an integral part of CBRN response plans improves the efficacy of decontamination.
The stars of the Milky Way carry the chemical history of our Galaxy in their atmospheres as they journey through its vast expanse. Like barcodes, the chemical fingerprints of stars can be extracted from high-resolution spectroscopy. The fourth data release (DR4) of the Galactic Archaeology with HERMES (GALAH) Survey, based on a decade of observations, provides the chemical abundances of up to 32 elements for 917 588 stars that also have exquisite astrometric data from the Gaia satellite. For the first time, these elements include life-essential nitrogen to complement carbon and oxygen, as well as more measurements of rare-earth elements critical to modern electronics, offering unparalleled insights into the chemical composition of the Milky Way. For this release, we use neural networks to simultaneously fit stellar parameters and abundances across the whole wavelength range, leveraging synthetic grids computed with Spectroscopy Made Easy. These grids account for atomic line formation in non-local thermodynamic equilibrium for 14 elements. In a two-iteration process, we first fit stellar labels to all 1 085 520 spectra, then co-add repeated observations and refine these labels using astrometric data from Gaia and 2MASS photometry, improving the accuracy and precision of stellar parameters and abundances. Our validation thoroughly assesses the reliability of the spectroscopic measurements and highlights key caveats. GALAH DR4 represents another milestone in Galactic archaeology, combining detailed chemical compositions from multiple nucleosynthetic channels with kinematic information and age estimates. The resulting dataset, covering nearly a million stars, opens new avenues for understanding not only the chemical and dynamical history of the Milky Way but also broader questions about the origin of elements and the evolution of planets, stars, and galaxies.
Land degradation is reducing biodiversity and crop yields, and exacerbating the impacts of climate change, throughout the world. Monitoring land degradation is required to determine the effectiveness of land management and restoration practices, and to track progress toward reaching land degradation neutrality (LDN). It is also needed to target investments where they will have the greatest impact. The most useful indicators of land degradation vary among soils and climates. The United Nations Convention to Combat Desertification (UNCCD) selected three widely accepted land degradation indicators for LDN: land cover, net primary production (NPP) and soil carbon stocks. In addition to their non-universal relevance, the use of these indicators has been limited by data availability, especially for carbon. This article presents an alternative monitoring framework based on the definition and ranking of states in a degradation hierarchy. Unique classifications can be defined for different regions and even different landscapes, allowing, for example, perennial cropland to be ranked above a highly degraded grassland. The article concludes with an invitation to discuss the potential value of this approach and how it could be practically implemented at landscape to global scales. The ultimate objective is to support decision-making at the local levels at which land degradation is addressed through improved management and restoration, while providing the information necessary for reporting on progress toward meeting global goals.
Major depressive disorder (MDD) is a serious and often chronic illness that requires early and urgent treatment. Failing to provide effective treatment of MDD can worsen the illness trajectory, negatively impact physical health, and even alter brain structure. Early optimized treatment (EOT) of MDD, with a measurement-based approach to diagnosis, rapid treatment initiation with medication dosage optimization, frequent monitoring, and prompt adjustments in treatment planning when indicated, should proceed with a sense of urgency. In this article, we describe common barriers to providing an EOT approach to treating MDD at each phase of care, along with strategies for navigating these obstacles. Approaching the treatment of MDD with a greater sense of urgency increases the likelihood of symptom reduction in MDD, facilitating full functional recovery and a return to life engagement.
To better understand clinicians’ rationale for ordering testing for C. difficile infection (CDI) for patients receiving laxatives and the impact of the implementation of a clinical decision support (CDS) intervention.
Design:
A mixed-methods case series was performed from March 2, 2017, to December 31, 2018.
Setting:
Yale New Haven Hospital, a 1,541 bed tertiary academic medical center.
Participants:
Hospitalized patients ≥ 18 years old and clinicians who were alerted by the CDS.
Intervention:
CDS was triggered in real-time when a clinician sought to order testing for CDI for a patient who received one or more doses of laxatives within the preceding 24 hours.
Results:
A total of 3,376 CDS alerts were triggered during the 21-month study period from 2,567 unique clinician interactions. Clinicians bypassed the CDS alert 74.5% of the time, more frequently among residents (48.3% bypass vs. 39.9% accept) and advanced practice providers (APPs) (34.9% bypass vs. 30.6% accept) than among attendings (11.3% bypass vs. 22.5% accept). Ordering clinicians noted that increased stool frequency/output (48%), current antibiotic exposure (34%), and instructions by an attending physician to test (28%) were among the most common reasons for overriding the alert and proceeding with testing for CDI.
Conclusions:
Testing for CDI despite patient laxative use was associated with an increased clinician concern for CDI, patient risk for CDI, and attending physician instruction for testing. Attendings frequently accepted CDS guidance while residents and APPs often reinstated CDI test orders, suggesting a need for greater empowerment and discretion when ordering tests.
The International Society for Twin Studies (ISTS) held its 19th scientific congress in Assisi, Italy, from September 26 to 28, 2024. This prestigious event, which was the seventh joint meeting with the World Congress on Twin Pregnancy, brought together researchers from various fields, including psychology, biology and medicine. Representatives from ICOMBO (the International Consortium of Multiple Birth Organisations), which supports multiple-birth families worldwide, were also in attendance. Many twin researchers consider this event to be the highlight of their professional year, as it brings together experts and parents alike to discuss the latest advancements in twin studies.
There is a large literature evaluating the dual process model of cognition, including the biases and heuristics it implies. However, our understanding of what causes effortful thinking remains incomplete. To advance this literature, we focus on what triggers decision-makers to switch from the intuitive process (System 1) to the more deliberative process (System 2). We examine how the framing of incentives (gains versus losses) influences decision processing. To evaluate this, we design experiments based on a task developed to distinguish between intuitive and deliberative thinking. Replicating previous research, we find that losses elicit more cognitive effort. Most importantly, we also find that losses differentially reduce the incidence of intuitive answers, consistent with triggering a shift between these modes of cognition. We find substantial heterogeneity in these effects, with young men being much more responsive to the loss framing. To complement these findings, we provide robustness tests of our results using aggregated data, the imposition of a constraint to hinder the activation of System 2, and an analysis of incorrect, but unintuitive, answers to inform hybrid models of choice.
In June of 2024, Becton Dickinson experienced a blood culture bottle shortage for their BACTEC system, forcing health systems to reduce usage or risk exhausting their supply. Virginia Commonwealth University Health System (VCUHS) in Richmond, VA decided that it was necessary to implement austerity measures to preserve the blood culture bottle supply.
Setting:
VCUHS includes a main campus in Richmond, VA as well as two affiliate hospitals in South Hill, VA (Community Memorial Hospital (CMH)) and Tappahannock Hospital in Tappahannock, VA. It also includes a free-standing Emergency Department in New Kent, VA.
Patients:
Blood cultures from both pediatric and adult patients were included in this study.
Interventions:
VCUHS intervened to decrease blood culture utilization across the entire health system. Interventions included communication of blood culture guidance as well as an electronic health record order designed to guide providers and discourage wasteful ordering.
Results:
Post-implementation analyses showed that interventions reduced overall usage by 35.6% (P < .0001) and by greater than 40% in the Emergency Departments. The impact of these changes in utilization on positivity was analyzed; the overall positivity rate increased post-intervention from 8.8% to 12.1% (P = .0115), and in the ED specifically from 10.2% to 19.5% (P < .0001).
Conclusions:
These findings strongly suggest that some basic stewardship interventions can significantly change blood culture practice in a manner that minimizes the impact on patient care.
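The pre/post positivity comparison above is a two-proportion comparison on a 2×2 table. A hedged pure-Python sketch of a Pearson chi-square test (the counts below are illustrative, not the study's data):

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square (1 df, no continuity correction) for the
    2x2 table [[a, b], [c, d]]; returns (statistic, two-sided p-value)."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # A chi-square variate with 1 df is the square of a standard normal,
    # so its survival function is erfc(sqrt(stat / 2)).
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# Illustrative counts: 10/100 positive pre vs. 20/100 positive post
stat, p = chi2_2x2(10, 90, 20, 80)
print(f"chi2 = {stat:.2f}, p = {p:.3f}")
```

For small expected cell counts, an exact test (e.g. Fisher's) would be preferable to this large-sample approximation.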
The SDMPH 10-year anniversary conference created an opportunity for researchers presenting at a professional association conference to advance their research by seeking consensus on statements using Delphi methodology.
Methods
Conference attendees and SDMPH members who did not attend the conference were identified as Delphi experts. Presenters submitted statements relevant to advancing their research, which the authors edited to fit Delphi statement formatting. Experts rated their agreement with each statement on a 7-point linear numeric scale, and consensus among experts was defined as a standard deviation ≤ 1.
Statements attaining consensus were included in the final report after the first round. Those not attaining consensus moved to the second round, in which experts were shown the mean response of the expert panel alongside their own response, with the opportunity to reconsider their rating. If reconsideration attained consensus, these statements were included in the final report. This process was repeated in a third and final round.
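The consensus rule described above (standard deviation ≤ 1 on the 7-point scale) can be sketched as follows. This is only an illustration: the use of the sample standard deviation is an assumption, and the rating panels are invented, not the study's data.

```python
from statistics import stdev

def reaches_consensus(ratings, threshold=1.0):
    """Consensus if the sample standard deviation of expert ratings
    on the 7-point scale is at or below the threshold."""
    return stdev(ratings) <= threshold

tight = [6, 6, 7, 6, 5, 6, 7]   # experts largely agree
split = [1, 7, 2, 6, 1, 7, 4]   # experts are divided

print(reaches_consensus(tight), reaches_consensus(split))
```

An SD threshold measures spread only; a panel could "agree" tightly around the scale midpoint, so some Delphi designs pair it with a central-tendency criterion.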
Results
Thirty-seven experts agreed to participate in the first round; 35 completed the second round, and 34 completed the third. Overall, 35 statements attained consensus and 3 did not.
Conclusions
A Delphi technique was used to establish expert consensus of statements submitted by the SDMPH conference presenters to guide their future education, research, and training.
Mandatory folic acid fortification of enriched grains has reduced neural tube defect prevalence in several countries. We examined salt as an additional vehicle for folic acid fortification. The primary objective was to examine the change in serum folate concentration after 1 month of consumption of fortified iodised salt with folic acid (FISFA) among women of reproductive age. The secondary objectives were to examine (1) the feasibility of implementing FISFA intervention and (2) the acceptability of FISFA.
Design:
We conducted a pre–post intervention study (January–April 2023). Participants received a FISFA saltshaker with the study salt (1 g of sodium chloride salt fortified with 100 mcg of folic acid) to use instead of regular table salt for 1 month. Serum folate was measured using the Elecsys Folate-III immunoassay method at baseline and 1-month endpoint. Change in serum folate was assessed using a two-tailed Wilcoxon signed rank test for paired samples.
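The paired pre–post analysis above uses a two-tailed Wilcoxon signed-rank test. As an illustrative sketch (not the study's analysis code), the test's large-sample normal approximation can be implemented in pure Python:

```python
import math

def wilcoxon_signed_rank(before, after):
    """Two-tailed Wilcoxon signed-rank test for paired samples, using the
    large-sample normal approximation (zero differences dropped, no tie
    variance correction)."""
    diffs = [b - a for a, b in zip(before, after) if b - a != 0]
    n = len(diffs)
    # Rank absolute differences, averaging ranks for ties
    ordered = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[ordered[j + 1]]) == abs(diffs[ordered[i]]):
            j += 1
        avg = (i + j) / 2 + 1          # average of ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[ordered[k]] = avg
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    mean = n * (n + 1) / 4
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mean) / sd
    p = math.erfc(abs(z) / math.sqrt(2))   # two-tailed
    return w_plus, p
```

For a sample as small as n = 29, an exact permutation null (as statistical packages use) is more appropriate than this normal approximation, but the ranking logic is the same.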
Setting:
Metropolitan city, Southern USA.
Participants:
Non-pregnant, 18–40-year-old women who lived alone/with a partner.
Results:
Thirty-two eligible women consented to participate, including eleven non-Hispanic-White, eleven non-Hispanic-Black and ten Hispanic. Post-intervention, there was a significant increase in median serum folate concentration of 1·40 nmol/l (IQR 0·74–2·05; P < 0·001) from 24·08 nmol/l to 25·96 nmol/l in an analytical sample of n 29. An increase was seen in 28/29 (93 %) participants. Feasibility: 100 % study consent and compliance. FISFA acceptability: 25 d average use; 1·28 g average daily intake; 96·7 % and 90 % reported taste and colour of FISFA as highly acceptable, respectively.
Conclusions:
FISFA is an effective approach to increasing serum folate concentrations among women of reproductive age. Findings should be replicated in a larger study.
This paper provides the methodology used to simulate and control an icosahedral tensegrity structure augmented with movable masses attached to each bar to provide a means of locomotion. The center of mass of the system can be changed by moving the masses along the length of each of the bars that compose the structure. Moving the masses changes the moments created by gravitational force, allowing for the structure to roll. With this methodology in mind, a controller was created to move the masses to the desired locations to cause such a roll. As shown later in this paper, such a methodology, assuming the movable masses have the required mass, allows for full control of the system using a quasi-static controller created specifically for this system. This system has advantages over traditional tensegrity controllers because it retains its shape and is designed for high-shock scenarios.
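The rolling mechanism described above depends on where the system's center of mass lies relative to the supporting face. A minimal sketch of computing the center of mass from the movable-mass positions (the geometry and masses are hypothetical, not the paper's model):

```python
def center_of_mass(bars, ts, m_move, m_bar):
    """bars: list of (p0, p1) 3-D bar endpoints; ts: position of each
    movable mass along its bar as a parameter t in [0, 1]; m_move/m_bar:
    movable-mass and bar masses. Bars are treated as uniform rods."""
    total = 0.0
    acc = [0.0, 0.0, 0.0]
    for (p0, p1), t in zip(bars, ts):
        mid = [(a + b) / 2 for a, b in zip(p0, p1)]        # rod's own COM
        pos = [a + t * (b - a) for a, b in zip(p0, p1)]    # movable mass
        for i in range(3):
            acc[i] += m_bar * mid[i] + m_move * pos[i]
        total += m_bar + m_move
    return [c / total for c in acc]
```

Sliding any t toward a bar endpoint shifts the center of mass; once its ground projection crosses an edge of the support polygon, the gravitational moment tips the structure into a roll, which is what the quasi-static controller exploits.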
Complications following the Fontan procedure include prolonged pleural drainage and readmission for effusions. To address these complications, a post-Fontan management pathway was implemented with primary goals of reducing chest tube duration/reinsertion rates and decreasing hospital length of stay and readmissions.
Methods:
Fontan patients were identified by retrospective chart review (2017–2019) to obtain baseline data for chest tube duration/reinsertion rates, hospital length of stay, and readmission rates for effusion. A post-Fontan management pathway was implemented (2020–2021) utilising post-operative vasopressin, nasal cannula oxygen until chest tube removal, and a discharge regimen of three times daily diuretics, sildenafil, and afterload-reducing medications. Patients were followed to evaluate primary outcomes.
Results:
The pre- and post-pathway groups were similar in single ventricle morphology, demographics, and pre-operative haemodynamics. Forty-three and 36 patients were included in the pre- and post-pathway cohorts, respectively. There were statistically significant reductions in chest tube duration (8 vs. 5 days, p ≤ 0.001), chest tube output on post-operative day 4 (20.4 vs. 9.9 mL/kg/day, p = 0.003), and hospital readmission rates for effusion (13[30%] vs. 3[8%], p = 0.02) compared to baseline. There was an absolute reduction in hospital length of stay (11 vs. 9.5 days, p = 0.052). When combining average cost savings for the Fontan hospitalisations, readmissions for effusion, and cardiac catheterisations within 6 months of Fontan completion, there was a $325,144 total cost savings for 36 patients following pathway implementation.
Conclusion:
Implementation of a post-Fontan management pathway resulted in significant reductions in chest tube duration and output, and readmission rates for effusion in the perioperative period.
We describe a retrospective assessment of practitioner and patient recruitment strategies, patient retention strategies, and rates for five clinical studies conducted in the National Dental Practice-Based Research Network between 2012 and 2019, and practitioner and patient characteristics associated with retention.
Methods:
Similar recruitment strategies were adopted in the studies. The characteristics of the practitioners and patients are described. The proportion of patients who either attended a follow-up (FU) assessment or completed an online assessment was calculated. For studies with multiple FU visits or questionnaire assessments, rates for completing each FU were calculated, as were the rates for completing any and for completing all FU assessments. The associations of practitioner and patient characteristics with all clinic FU visits, and with the completion of all assessments for a study were ascertained.
Results:
Overall, 591 practitioners and 12,159 patients were included. FU rates by patients for any assessment varied from 91% to 96.5%, and rates for participating in all assessments ranged from 68% to 87%. The mean total number of patients each practitioner recruited was 21 (sd = 15); the mean number per study was 13 (sd = 7). For practitioners, practice type and patient enrollment were associated with greater clinic retention, while only race was associated with their patients completing post-visit online assessments. For patients, age was associated with clinic retention, while female gender, age, race, and education were all associated with greater completion of post-visit online assessments.
Conclusion:
The Network efficiently recruited practitioners and patients and achieved high patient retention rates for the five studies.
To characterize the evolution of dioctahedral interstratified clay minerals in the Golden Cross epithermal deposit, New Zealand, hydrothermally altered volcanic rocks containing the sequence smectite through illite-smectite (I-S) to muscovite were examined by optical microscopy, X-ray diffraction (XRD), scanning electron microscopy (SEM), and transmission and analytical electron microscopies (TEM/AEM).
XRD analyses of 30 oriented clay samples show a broad deposit-wide trend of increasing illite content in I-S with increasing depth and proximity to the central vein system. Six representative samples were selected for SEM/TEM study on the basis of petrographic observations and XRD estimates of I-S interstratification. Ca and Na are the dominant interlayer cations in smectite, but as the proportion of illite layers in I-S increases, so do the K content and (IVAl + VIAl)/Si ratio. Layers and packets tend to flatten and form larger arrays, reducing the amount of pore space. Smectite coexists with ordered (R = 1) I-S rather than with randomly interstratified (R = 0) I-S, where R is the Reichweite parameter. The highest alteration rank samples contain discrete packets of mica up to ∼300 Å thick, but a limited chemical and structural gap exists between illite, which is intermediate in composition between common illite and muscovite, and illite-rich I-S. Selected-area electron diffraction (SAED) patterns of mica show that the 1M polytype dominates, rather than the common 2M1 polytype.
Petrographic, SEM, and TEM data imply that all phyllosilicates formed via neoformation directly from fluids. Relatively mature I-S and micas form simultaneously, without progressing through the series of transformations that are commonly assumed to characterize diagenetic sequences during burial metamorphism in mud-dominated basins. Although the overall distribution of clay minerals is consistent with temperature as a controlling variable, local heterogeneities in the distribution of clay minerals were controlled by water/rock ratio, which varied widely owing to different rock types and fracture control.