The transition from breastmilk to solid foods (weaning) is a critical stage in infant development and plays a decisive role in the maturation of the complex microbial community inhabiting the human colon. Diet is a major factor shaping the colonic microbiota, which ferments nutrients reaching the colon unabsorbed by the host to produce a variety of microbial metabolites influencing host physiology(1). Therefore, making adequate dietary choices during weaning can positively modulate the colonic microbiota, ultimately contributing to health in infancy and later life(2). However, our understanding of how complementary foods impact the colonic microbiota of weaning infants is limited. To address this knowledge gap, we employed a metagenome-scale modelling approach to simulate the impact of complementary foods, either combined with breastmilk or with breastmilk and other foods, on the production of organic acids by colonic microbes of weaning infants(3). Complementary foods and combinations of foods with the greatest impact on the in silico microbial production of organic acids were identified. These foods and food combinations were further tested in vitro, individually or in combination with infant formula. Fifty-three food samples were digested using a protocol adapted from INFOGEST to mimic infant digestion and then fermented with faecal inocula from six New Zealand infants (5-11 months old). After 24 h of fermentation, the production of organic acids was measured by gas chromatography. Differences in organic acid production between samples were determined using the Tukey Honestly Significant Difference test to account for multiple comparisons. Microbial composition was characterised by amplicon sequencing of the V3-V4 regions of the bacterial 16S rRNA gene. Taxonomy was assigned using the DADA2 pipeline and the SILVA database (version 138.1). Bioinformatic and statistical analyses were conducted using the R packages phyloseq and ANCOM-BC2, with the Holm-Bonferroni method to adjust for multiple comparisons in differential abundance testing. Blackcurrant and raspberries increased the production of acetate and propionate (Tukey’s test, p<0.05) and the relative abundance of the genus Parabacteroides (Dunnett’s test, adjusted p<0.05) compared with other foods. Raspberries also increased the abundance of the genus Eubacterium (Dunnett’s test, adjusted p<0.05). When combined with infant formula, black beans stood out for increasing the production of butyrate (Tukey’s test, p<0.05) and the relative abundance of the genus Clostridium (Dunnett’s test, adjusted p<0.05). In conclusion, this study provides new evidence on how complementary foods, both individually and in combination with other dietary compounds, influence the colonic microbiota of weaning infants in vitro. Insights generated by this research can help design future clinical trials, ultimately enhancing our understanding of the relationship between human nutrition and colonic microbiota composition and function in post-weaning life.
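As a rough illustration of the pairwise testing step (not the authors' code), the sketch below fits a one-way ANOVA and applies the Tukey HSD test in base R; the data frame `ferments` and its columns are hypothetical stand-ins for the measured organic acid concentrations.

```r
# Minimal sketch, assuming a hypothetical data frame `ferments` with a
# `food` factor and `acetate_mM` (organic acid concentration after 24 h).
fit <- aov(acetate_mM ~ food, data = ferments)
summary(fit)

# Tukey Honestly Significant Difference test for all pairwise food
# comparisons, controlling the family-wise error rate.
TukeyHSD(fit, conf.level = 0.95)
```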
This review highlights the importance of dietary fibre (DF) intake and its interconnection with the gut microbiome and psychological well-being, while also exploring the effects of existing DF interventions on these aspects in adults. The gut microbiota is a complex and diverse ecosystem in which microbial species interact, influencing the human host. DF are heterogeneous, requiring different microbial species to degrade the complex DF structures. Emerging evidence suggests that microbial fermentation of DF produces short-chain fatty acids (SCFA), which may play a role in regulating psychological well-being by affecting neurotransmitter levels, including serotonin. The effectiveness of DF interventions depends on factors such as baseline gut microbiota composition, the dosage and the source of DF consumed. Although the gut microbiota of adults is relatively stable, studies have shown that the abundance of species in the gut microbiota can change within 24 h of an intervention and may return to baseline following the termination of a DF intervention. This review underscores the need for larger and well-powered dietary clinical trials incorporating longitudinal biological sample collections, advanced sequencing and omic techniques (including novel dietary biomarkers and microbial metabolites), validated subjective questionnaires and dietary records. Furthermore, mechanistic studies driven by clinical observations are crucial to understanding gut microbiota function and its underlying biological pathways, informing targeted dietary interventions.
Blast injuries can occur by a multitude of mechanisms, including improvised explosive devices (IEDs), military munitions, and accidental detonation of chemical or petroleum stores. These injuries disproportionately affect people in low- and middle-income countries (LMICs), where there are often fewer resources to manage complex injuries and mass-casualty events.
Study Objective:
The aim of this systematic review is to describe the literature on the acute facility-based management of blast injuries in LMICs to aid hospitals and organizations preparing to respond to conflict- and non-conflict-related blast events.
Methods:
A search of Ovid MEDLINE, Scopus, Global Index Medicus, Web of Science, CINAHL, and Cochrane databases was used to identify relevant citations from January 1998 through July 2024. This systematic review was conducted in adherence with PRISMA guidelines. Data were extracted and analyzed descriptively. A meta-analysis calculated the pooled proportions of mortality, hospital admission, intensive care unit (ICU) admission, intubation and mechanical ventilation, and emergency surgery.
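As a hedged sketch of the pooled-proportion step (not the authors' code), the example below uses the `meta` package; the data frame `studies` and its columns are hypothetical per-study event counts.

```r
# Pooled proportion with 95% CI, assuming the `meta` package and a
# hypothetical data frame `studies` with per-article counts.
library(meta)

pooled <- metaprop(
  event   = events,      # e.g., in-patient deaths per study
  n       = n,           # in-patients per study
  studlab = study,       # study label
  data    = studies,
  sm      = "PLOGIT",    # logit transformation of proportions
  random  = TRUE         # random-effects model to allow heterogeneity
)
summary(pooled)          # pooled proportion with 95% CI
```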
Results:
Reviewers screened 3,731 titles and abstracts and 173 full texts. Seventy-five articles from 22 countries were included for analysis. Only 14.7% of included articles came from low-income countries (LICs). Sixty percent of studies were conducted in tertiary care hospitals. The mean proportion of patients who were admitted was 52.1% (95% CI, 37.6% to 66.4%). Among all in-patients, 20.0% (95% CI, 12.4% to 28.8%) were admitted to an ICU. Overall, 38.0% (95% CI, 25.6% to 51.3%) of in-patients underwent emergency surgery and 13.8% (95% CI, 2.3% to 31.5%) were intubated. Pooled in-patient mortality was 9.5% (95% CI, 4.6% to 15.6%) and total hospital mortality (including emergency department [ED] mortality) was 7.4% (95% CI, 3.4% to 12.4%). There were no significant differences in mortality when stratified by country income level or hospital setting.
Conclusion:
Findings from this systematic review can be used to guide preparedness and resource allocation for acute care facilities. Pooled proportions for mortality and other outcomes described in the meta-analysis offer a metric by which future researchers can assess the impact of blast events. Under-representation of LICs and non-tertiary care medical facilities and significant heterogeneity in data reporting among published studies limited the analysis.
Whole genome sequencing (WGS) can help identify transmission of pathogens causing healthcare-associated infections (HAIs). However, the current gold standard of short-read, Illumina-based WGS is labor and time intensive. Given recent improvements in long-read Oxford Nanopore Technologies (ONT) sequencing, we sought to establish a low-resource approach providing accurate WGS-based pathogen comparison within a time frame allowing for infection prevention and control (IPC) interventions.
Methods:
WGS was prospectively performed on pathogens at increased risk of potential healthcare transmission using the ONT MinION sequencer with R10.4.1 flow cells and Dorado basecaller. Potential transmission was assessed via Ridom SeqSphere+ for core genome multilocus sequence typing and MINTyper for reference-based core genome single nucleotide polymorphisms using previously published cutoff values. The accuracy of our ONT pipeline was determined relative to Illumina.
Results:
Over a six-month period, 242 bacterial isolates from 216 patients were sequenced by a single operator. Compared to the Illumina gold standard, our ONT pipeline achieved a mean identity score of Q60 for assembled genomes, even with a coverage rate as low as 40×. The mean time from initiating DNA extraction to complete analysis was 2 days (IQR 2–3.25 days). We identified five potential transmission clusters comprising 21 isolates (8.7% of sequenced strains). Integrating ONT with epidemiological data, >70% (15/21) of putative transmission cluster isolates originated from patients with potential healthcare transmission links.
Conclusions:
Via a stand-alone ONT pipeline, we detected potentially transmitted HAI pathogens rapidly and accurately, aligning closely with epidemiological data. Our low-resource method has the potential to assist in IPC efforts.
This chapter discusses the Ontogeny Phylogeny Model (OPM), which focuses on the formation and development of second language phonological systems. It proposes an interrelationship between L2 native-like productions, L1 transfer, and universal factors. The model argues that chronologically, and as style becomes increasingly formal, L2 native-like processes increase, L1 transfer processes decrease, and universal processes increase and then decrease. It further claims that the roles of universals and L1 transfer are mediated by markedness and similarity, both of which slow L2 acquisition. Specifically, in similar phenomena L1 transfer processes persist, while in marked phenomena universal processes persist. The OPM also argues that these same principles obtain for learners acquiring more than one L2, monolingual and bilingual acquisition, and L1 attrition. In addition to the chronological stages and variation of the individual learner, the model claims that these relationships hold true for language variation and change, including pidgins and creoles.
Background: Patients with an acute ischemic stroke (AIS) are selected to receive reperfusion therapy using either computed tomography with angiography (CT-CTA) or magnetic resonance imaging (MRI). The aim of this study was to compare CT and MRI as the primary imaging modality for AIS patients undergoing endovascular therapy (EVT). Methods: Data for AIS patients treated between January 2018 and January 2021 were extracted from two prospective multicenter EVT cohorts: the ETIS registry in France (MRI) and the OPTIMISE registry in Canada (CT). Demographics, procedural data, and outcomes were collected. We assessed the association of qualifying imaging (CT vs. MRI) with time metrics and functional outcome. Results: 4059 patients selected by MRI and 1324 patients selected by CT were included in the study. Demographics were similar between the two groups. The median imaging-to-arterial-puncture time was 37 minutes longer in the MRI group. Patients selected by CT had more favorable 90-day functional outcomes (mRS 0-2) than patients selected by MRI (48.5% vs. 44.4%; adjusted OR (aOR) 1.54, 95% CI 1.31 to 1.80, p<0.001). Conclusions: Patients with AIS undergoing EVT who were selected with MRI as opposed to CT had longer imaging-to-arterial-puncture delays and worse functional outcomes at 90 days.
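To illustrate how an adjusted OR of this kind is typically obtained (a sketch, not the registries' analysis code), the example below fits a logistic regression in base R; `stroke` and all covariate names are hypothetical.

```r
# Hypothetical data frame `stroke`: mrs_0_2 (0/1 good outcome), imaging
# (factor: MRI vs. CT), plus illustrative adjustment covariates.
fit <- glm(mrs_0_2 ~ imaging + age + nihss + onset_to_puncture,
           family = binomial, data = stroke)

# Adjusted OR for CT vs. MRI selection with Wald 95% CI
exp(cbind(OR = coef(fit), confint.default(fit)))["imagingCT", ]
```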
Adequate dietary fibre (DF) intake is recommended to relieve constipation and improve gut health(1). It is often assumed that individuals with constipation have relatively low DF intake and do not meet the recommended adequate intake of 25 g/day for females and 30 g/day for males. The 2008/09 New Zealand Adult Nutrition Survey confirmed that mean DF intake was 17.9 g per day for females and 22.8 g per day for males, well below the recommended adequate intake(2). With the continuous shift of dietary patterns over time, we sought to compare the current usual DF intake of two cohorts of New Zealand adults: those with constipation and those without constipation but with relatively low DF intake. We report baseline dietary data from two randomised controlled dietary studies (Kiwifruit Ingestion to Normalise Gut Symptoms (KINGS) (ACTRN12621000621819) and Bread Related Effects on microbiAl Distribution (BREAD) (ACTRN12622000884707)) conducted in Christchurch, New Zealand in 2021 and 2022, respectively. The KINGS study enrolled adults with either functional constipation or constipation-predominant irritable bowel syndrome to consume either two green kiwifruit or maltodextrin for four weeks. The BREAD study was a crossover study that enrolled healthy adults without constipation but with relatively low DF intake (<18 g/day for females, <22 g/day for males) to consume two types of bread with different DF content, each for four weeks separated by a two-week washout period. All participants completed a non-consecutive three-day food diary at baseline. Dietary data were entered into FoodWorks Online Professional (Xyris Software Australia, 2021) to assess mean daily DF intake. Fifty-six adults from the KINGS study (n = 48 females, n = 8 males; mean age ± standard deviation: 42.8 ± 12.6 years) and 56 adults from the BREAD study (n = 33 females, n = 23 males; mean age: 40.4 ± 13.4 years) completed a baseline food diary. In the KINGS study, females with constipation had a mean daily DF intake of 25.0 ± 9.4 g, whilst male participants consumed 26.9 ± 5.0 g per day. In the BREAD study, females without constipation had a mean daily DF intake of 19.4 ± 5.8 g, whereas males had 22.6 ± 8.5 g per day. There was a statistically significant difference in mean daily DF intake between females with constipation and those without (p < 0.001) but not between males (p = 0.19). These two studies found that DF intakes among females with constipation were not as low as previously assumed, as on average they met their adequate intake of 25 g/day. Further data analysis from the KINGS and BREAD studies will reveal the effects of using diet to manage constipation and promote better gut health in these two cohorts of New Zealand adults.
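The abstract does not name the test behind the reported p-values; a between-cohort comparison of this kind could be sketched in R as follows, with both intake vectors hypothetical.

```r
# Welch's two-sample t-test (R's default), assuming hypothetical numeric
# vectors of mean daily DF intake (g/day) for females in each study.
t.test(kings_females_df, bread_females_df)
```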
Rehydration is shown to be a straightforward route to the reconstruction of polyoxometalate-pillared layered double hydroxides. Zn-Al hydrotalcite-like minerals were prepared with Zn/Al ratios of 1 to 5 by coprecipitation at pH 7. Good crystallinity was obtained for samples with Zn/Al ratios above 2. Thermal decomposition was achieved by calcining the samples at 300 to 900 °C. The calcined samples were exposed to decarbonated water, with or without hydrothermal treatment, to evaluate reconstruction of the hydrotalcite-like minerals by rehydration. Restoration of the hydrotalcite-like structure was found to be independent of the Zn/Al ratio for samples calcined between 300 and 400 °C; however, a second phase, aluminum hydroxide or zinc oxide, was generally detected. A spinel phase, formed during calcination at temperatures above 600 °C, inhibited reconstruction of the hydrotalcite-like phase. The rehydrated hydrotalcite-like minerals had Zn/Al ratios close to 2, irrespective of the chemistry of the starting material.
We disagree with Almaatouq et al. that no realistic alternative exists to the “one-at-a-time” paradigm. Seventy years ago, Egon Brunswik introduced representative design, which offers a clear path to commensurability and generality. Almaatouq et al.'s integrative design cannot guarantee the external validity and generalizability of results that are sorely needed, whereas representative design tackles the problem head on.
To investigate the symptoms of SARS-CoV-2 infection, their dynamics, and their discriminatory power for the disease using longitudinal, prospectively collected information reported at the time of occurrence, we analysed data from a large phase 3 UK COVID-19 vaccine clinical trial. The alpha variant was the predominant strain. Participants were assessed for SARS-CoV-2 infection via nasal/throat PCR at recruitment, at vaccination appointments, and when symptomatic. Statistical techniques were implemented to infer estimates representative of the UK population, accounting for multiple symptomatic episodes per individual. An optimal diagnostic model for SARS-CoV-2 infection was derived. The 4-month prevalence of SARS-CoV-2 was 2.1%, increasing to 19.4% (16.0%–22.7%) in participants reporting loss of appetite and 31.9% (27.1%–36.8%) in those with anosmia/ageusia. The model identified anosmia and/or ageusia, fever, congestion, and cough as significantly associated with SARS-CoV-2 infection. Symptom dynamics differed markedly between the two groups: in PCR-positive participants, symptoms started slowly, peaked later, and lasted longer, whereas PCR-negative participants showed a consistent decline, reporting on average fewer than 3 days of symptoms. Anosmia/ageusia peaked late in confirmed SARS-CoV-2 infection (day 12), indicating low discriminatory power for early disease diagnosis.
To examine patterns of cognitive function among a clinical sample of patients seeking treatment for Post-Acute Sequelae of COVID-19 (PASC).
Participants and Methods:
One hundred nineteen patients each completed a baseline neuropsychological evaluation, including a clinical diagnostic interview, cognitive assessments, and a comprehensive battery of self-report questionnaires. Patients had a mean age of 50 years (range: 18 to 74, SD=10.1) and a mean of 15.5 years (SD=2.54) of formal education. Patients were primarily female (74%) and of White/Caucasian race (75%). Hierarchical agglomerative clustering was used to partition the data into groups based on cognitive performance. Euclidean distance was used as the similarity measure for the continuous variables, and within-cluster variance was minimized using Ward’s method. The optimal number of clusters was determined empirically by fitting models with 1 to 15 clusters, with the best solution selected using the silhouette index. All analyses were conducted using NbClust, an R package for determining the relevant number of clusters in a data set.
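A minimal sketch of this clustering step with the NbClust package is shown below (not the authors' script); `cog` is a hypothetical matrix of cognitive scores, one row per patient. Note that the silhouette index is undefined for a single cluster, so the candidate range here starts at 2.

```r
# Hierarchical clustering with Ward's method, Euclidean distance, and
# silhouette-based selection of the number of clusters.
library(NbClust)

nb <- NbClust(cog,
              distance = "euclidean",   # similarity measure for continuous data
              method   = "ward.D2",     # Ward's minimum-variance linkage
              min.nc   = 2, max.nc = 15,
              index    = "silhouette")  # select k by silhouette index
nb$Best.nc         # empirically chosen number of clusters
nb$Best.partition  # cluster assignment per patient
```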
Results:
Clustering yielded two distinct clusters of cognitive performance. Group 1 (n=57) performed worse than Group 2 (n=62) on most cognitive variables (including a brief cognitive screener and tests of attention/working memory, executive function, processing speed, learning and delayed recall). Of note, there were no significant differences between groups on an infection severity scale, hospitalizations/ICU admissions, initial or current COVID-19 symptoms, or prior comorbidities. Groups did not differ in age or gender, but Group 1 had a lower education level than Group 2 (M=14.7, SD=2.45 vs. M=16.2, SD=2.42; p=.001). Group 1 also included a significantly higher proportion of patients from racial/ethnic minority groups than Group 2 (40% vs. 8%; p<.001). No other demographic differences (income, living arrangement, or marital status) were observed. In comparison to Group 2 patients, Group 1 patients self-reported significantly higher levels of anxiety and depression, functional impairment (Functional Activities Questionnaire: M=11.3, SD=8.33 vs. M=7.65, SD=7.97), perceived stress (Perceived Stress Scale: M=24.7, SD=7.90 vs. M=20.3, SD=7.89), insomnia (Insomnia Severity Index: M=16.0, SD=6.50 vs. M=13.1, SD=6.76), and subjective cognitive failures (Cognitive Failures Questionnaire: M=58.8, SD=16.9 vs. M=50.3, SD=18.6; p’s<.05).
Conclusions:
Findings indicate two predominant subtypes of patients seeking treatment for PASC, with one group presenting as more cognitively impaired and reporting greater levels of anxiety, depression, insomnia, perceived stress, functional limitations, and subjective cognitive impairment. Future directions include follow-up assessments with these patients to determine cognitive trajectories over time and tailoring treatment adjuncts to address mood symptoms, insomnia, functional ability, and lifestyle variables. Understanding mechanisms of differences in cognitive and affective symptoms is needed in future work. Limitations of the study were that patients were referred for evaluation based on the complaint of “brain fog” and that the sample was a homogeneous, highly educated, younger group of individuals who experienced a generally mild COVID-19 course.
Non-motor symptoms, such as mild cognitive impairment and dementia, are an overwhelming cause of disability in Parkinson’s disease (PD). While subthalamic nucleus deep brain stimulation (STN DBS) is safe and effective for motor symptoms, declines in verbal fluency after bilateral DBS surgery have been widely replicated. However, little is known about cognitive outcomes following unilateral surgeries.
Participants and Methods:
We enrolled 31 PD patients who underwent unilateral STN-DBS in a randomized, cross-over, double-blind study (SUNDIAL Trial). Targets were chosen based on treatment of the most symptomatic side (n = 17 left hemisphere, n = 14 right hemisphere). All participants completed a neuropsychological battery (FAS/CFL, AVLT, DKEFS Color-Word Test) at baseline, then 2, 4, and 6 months post-surgery. Outcomes included raw scores for verbal fluency, immediate and delayed recall, and DKEFS Color-Word Inhibition trial (Trial 3) completion time. At 2, 4, and 6 months, the neurostimulation type (directional versus ring mode) was randomized for each participant. We compared baseline scores for all cognitive outcome measures using Welch’s two-sample t-tests and used linear mixed effects models to examine longitudinal effects of hemisphere and stimulation on cognition. The test battery was converted to teleneuropsychology administration mid-study because of COVID-19, and administration mode was included as a covariate in all statistical models, along with years of education, baseline cognitive scores, and levodopa-equivalent medication dose at each time point.
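The abstract does not name the modelling software; assuming the lme4 package, one of these longitudinal models might look like the sketch below, with `dbs` and all column names hypothetical.

```r
# Linear mixed effects model: three-way interaction of time, hemisphere,
# and stimulation mode, with the covariates named in the abstract and a
# random intercept per patient.
library(lme4)

fit <- lmer(fluency ~ months * hemisphere * stim_mode +
              tele_admin + education + baseline_fluency + led +
              (1 | subject),
            data = dbs)
summary(fit)
```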
Results:
At baseline, patients who underwent left hemisphere implants scored lower on verbal fluency than those with right hemisphere implants (t(20.66) = -2.49, p = 0.02). There were no significant differences between hemispheres in immediate recall (p = 0.57), delayed recall (p = 0.22), or response inhibition (p = 0.51). Post-operatively, left STN DBS patients experienced significant declines in verbal fluency over the study period (p = 0.02), while patients with right-sided stimulation demonstrated improvements (p < .001). There was no main effect of stimulation parameters (directional versus ring) on verbal fluency, memory, or inhibition, but there was a three-way interaction between time, stimulation parameters, and hemisphere on inhibition, such that left STN DBS patients receiving ring stimulation completed the inhibition trial faster (p = 0.035). After surgery, right STN DBS patients displayed faster inhibition times than patients with left implants (p = 0.015).
Conclusions:
Declines in verbal fluency after bilateral stimulation are the most commonly reported cognitive sequelae of DBS for movement disorders. Here we found group-level declines in verbal fluency after unilateral left STN implants, but not right STN DBS, up to 6 months after surgery. Patients with right hemisphere implants displayed improvements in verbal fluency. Compared with bilateral DBS, unilateral DBS surgery, particularly in the right hemisphere, may represent a modifiable risk factor for verbal fluency declines in patients with Parkinson’s disease.
This study investigated the effects of Lacticaseibacillus rhamnosus HN001 supplementation on the architecture and gene expression in small intestinal tissues of piglets used as a model for human infants. Twenty-four 10-d-old entire male piglets (4·3 (sd 0·59) kg body weight) were fed an infant formula (IF) (control) or IF supplemented with 1·3 × 105 (low dose) or 7·9 × 106 (high dose) colony-forming units HN001 per ml of reconstituted formula (n 8 piglets/treatment). After 24 d, piglets were euthanised. Samples were collected to analyse the histology and gene expression (RNAseq and qPCR) in the jejunal and ileal tissues, blood cytokine concentrations, and blood and faecal calprotectin concentrations. HN001 consumption altered (false discovery rate < 0·05) gene expression (RNAseq) in jejunal tissues but not in ileal tissues. The number of ileal goblet cells and crypt surface area increased quadratically (P < 0·05) as dietary HN001 levels increased, but no increase was observed in the jejunal tissues. Similarly, blood plasma concentrations of IL-10 and calprotectin increased linearly (P < 0·05) as dietary HN001 levels increased. In conclusion, supplementation of IF with HN001 affected small intestinal tissue architecture and gene expression, blood cytokine concentrations, and blood calprotectin concentrations, indicating that HN001 modulated small intestinal tissue maturation and immunity in the piglet model.
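One common way to test the linear and quadratic dose trends reported above is with orthogonal polynomial contrasts on an ordered dose factor; the sketch below (not the authors' code) uses a hypothetical data frame `piglets`.

```r
# Ordered factors in R get polynomial contrasts by default, so the model
# summary reports linear (`dose.L`) and quadratic (`dose.Q`) trend tests.
piglets$dose <- factor(piglets$dose,
                       levels = c("control", "low", "high"),
                       ordered = TRUE)

fit <- lm(goblet_cells ~ dose, data = piglets)
summary(fit)  # dose.L and dose.Q rows test the linear and quadratic trends
```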
The University of Kansas Cancer Center (KU Cancer Center) initiated an engagement program to leverage the lived experience of individuals and families with cancer. KU Cancer Center faculty, staff, and patient partners built an infrastructure to achieve a patient-designed, patient-led, and research-informed engagement program called Patient and Investigator Voices Organizing Together (PIVOT). This special communication offers an engagement roadmap that can be replicated, scaled, and adopted at other cancer centers and academic health systems. PIVOT demonstrates that collaboration among academic leaders, investigators, and people with a lived experience yields a patient-centered, vibrant environment that enriches the research enterprise.
In individuals with first episode psychosis (FEP) and cannabis use disorder (CUD), reducing cannabis use is associated with improved clinical outcomes. Access to evidence-based psychological interventions to decrease cannabis use in FEP clinics is highly variable; E-mental health interventions may help to address this gap. Development of E-interventions for CUD in individuals with FEP is in its incipient phases.
Objectives
To assess preferences for online psychological interventions aiming at decreasing or stopping cannabis use in young adults with psychosis and CUD.
Methods
Individuals aged 18 to 35 years with psychosis and CUD were recruited from seven FEP intervention programs in Canada and responded to an electronic survey between January 2020 and July 2022. We used the Case 2 Best-Worst Scaling methodology, which is grounded in the trade-off utility concept, to collect and analyse data. Participants selected the best or worst option for each of the nine questions corresponding to three distinct domains. For each domain, we used conditional logistic regression and marginal models (i.e., three models in total) to estimate preferences for attributes (e.g., duration, frequency of online intervention sessions) and attribute levels (e.g., 15 minutes, every day).
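To make the analysis concrete, a hedged sketch of one such conditional logistic regression using the survival package is shown below; `bws` and its columns are hypothetical best-worst choice data in long format (one row per alternative shown to a participant).

```r
# Conditional logistic regression, conditioning on each choice task via
# strata(), as is standard for best-worst/discrete-choice data.
library(survival)

fit <- clogit(chosen ~ duration + frequency +  # attribute dummies
                strata(choice_set),            # condition on each choice task
              data = bws)
exp(coef(fit))                                 # odds ratios, as in Table 1
```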
Results
Participants (N=104) showed higher preferences for the following attributes: duration of online sessions; mode of receiving the intervention; method of feedback delivery; and frequency of feedback from clinicians (Table 1). Attribute-level analyses showed higher preferences for participating once a week in short (15-minute) online interventions (Figure 1). Participants valued the autonomy offered by online interventions, which aligns with their preference for completing the intervention outside the clinic with assistance required only once a week (Figure 2). Participants’ preferences were higher for receiving feedback on cannabis consumption from both the application and clinicians, with clinician feedback at a frequency of once a week (Figure 3).
Table 1. Preferences for attributes: results of conditional logistic regression

Attribute                                            Domain   OR      95% CI for OR
Duration of session                                  A        1.62*   1.45; 1.82
Frequency of sessions                                A        0.98    0.87; 1.09
Duration of intervention                             A        ref
Preferred mode of receiving the intervention         B        1.63*   1.46; 1.83
Preferred location for participating                 B        1.07    0.96; 1.20
Frequency of assistance from the clinician           B        ref
Preference for the feedback delivery method          C        1.21*   1.08; 1.36
Frequency of feedback from the treating clinician    C        1.14*   1.02; 1.28
Frequency of feedback from the application           C        ref

Note: * indicates significant odds ratios (OR) and confidence intervals (CI) (shown in boldface in the original table).
Conclusions
Using advanced methodologies to assess preferences, our results can inform the development of highly acceptable E-mental health interventions for decreasing cannabis use in individuals with CUD and FEP.
Clinical trials are constantly evolving in the context of increasingly complex research questions and potentially limited resources. In this review article, we discuss the emergence of “adaptive” clinical trials that allow for the preplanned modification of an ongoing clinical trial based on the accumulating evidence with application across translational research. These modifications may include terminating a trial before completion due to futility or efficacy, re-estimating the needed sample size to ensure adequate power, enriching the target population enrolled in the study, selecting across multiple treatment arms, revising allocation ratios used for randomization, or selecting the most appropriate endpoint. Emerging topics related to borrowing information from historic or supplemental data sources, sequential multiple assignment randomized trials (SMART), master protocol and seamless designs, and phase I dose-finding studies are also presented. Each design element includes a brief overview with an accompanying case study to illustrate the design method in practice. We close with brief discussions relating to the statistical considerations for these contemporary designs.
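To give a flavour of one design element (early termination for futility), the toy base-R simulation below sketches a single preplanned interim look; every threshold and effect size here is illustrative, not a recommendation from the review.

```r
# Toy sketch of a preplanned futility look at half of the target enrolment.
set.seed(1)
interim_futility <- function(n_per_arm = 100, look_frac = 0.5, effect = 0.3) {
  n1  <- floor(n_per_arm * look_frac)
  trt <- rnorm(n1, mean = effect)  # simulated treatment-arm outcomes
  ctl <- rnorm(n1)                 # simulated control-arm outcomes
  # Interim z-statistic; stop early if it falls below a prespecified
  # (illustrative) futility boundary.
  z <- (mean(trt) - mean(ctl)) / sqrt(var(trt) / n1 + var(ctl) / n1)
  if (z < 0.2) return("stop for futility at interim")
  "continue to full enrolment"
}
interim_futility()
```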
Premixed turbulent flames, encountered in power generation and propulsion engines, are an archetype of a randomly advected, self-propagating surface. While such a flame is known to exhibit large-scale intermittent flapping, the possible intermittency of its small-scale fluctuations has been largely disregarded. Here, we experimentally reveal the inner intermittency of a premixed turbulent V-flame, while clearly distinguishing this small-scale feature from large-scale outer intermittency. From temporal measurements of the fluctuations of the flame, we find a frequency spectrum that has a power-law subrange with an exponent close to $-2$, which is shown to follow from Kolmogorov phenomenology. Crucially, however, the moments of the temporal increment of the flame position are found to scale anomalously, with exponents that saturate at higher orders. This signature of small-scale inner intermittency is shown to originate from high-curvature, cusp-like structures on the flame surface, which have significance for modelling the heat release rate and other key properties of premixed turbulent flames.
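To make the scaling claims concrete, the relations below sketch the standard correspondence, under Kolmogorov-type self-similarity (a textbook relation, not reproduced from the paper), between the $-2$ spectrum and non-anomalous increment scaling; inner intermittency then appears as the departure of the measured exponents from the self-similar line.

```latex
% Increment structure functions of the flame position x(t):
\[
  S_p(\tau) \equiv \big\langle |x(t+\tau) - x(t)|^p \big\rangle
  \sim \tau^{\zeta_p},
  \qquad
  E(f) \sim f^{-(1+\zeta_2)} .
\]
% A spectrum E(f) ~ f^{-2} therefore corresponds to \zeta_2 = 1, and exact
% (non-intermittent) self-similarity would give \zeta_p = p/2 for all p.
% Inner intermittency appears as measured \zeta_p falling below p/2 and
% saturating at high orders, consistent with cusp-like flame structures.
```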
A new hibbertopterid eurypterid, Cyrtoctenus bambachi n. sp., is described from the Early Mississippian (Tournaisian) Price Formation of western Virginia. The same unit yields an unidentifiable stylonurine eurypterid. These are the first eurypterids documented from the Mississippian of North America, and only the fourth locality of this age anywhere in the world to yield eurypterids.