Major depressive disorder (MDD) and psychostimulant use disorder (PUD) are common, disabling psychopathologies that pose a major public health burden. They share a common behavioral phenotype: deficits in inhibitory control (IC). However, whether this is underpinned by shared neurobiology remains unclear. In this meta-analytic study, we aimed to define and compare brain functional alterations during IC tasks in MDD and PUD.
Methods
We conducted a systematic literature search on IC task-based functional magnetic resonance imaging studies in MDD and PUD (cocaine or methamphetamine use disorder) in PubMed, Web of Science, and Scopus. We performed a quantitative meta-analysis using seed-based d mapping to define common and distinct neurofunctional abnormalities.
Results
We identified 14 studies comparing IC-related brain activation in a total of 340 MDD patients with 303 healthy controls (HCs), and 11 studies comparing 258 PUD patients with 273 HCs. MDD showed disorder-differentiating hypoactivation during IC tasks in the median cingulate/paracingulate gyri relative to PUD and HC, whereas PUD showed disorder-differentiating hypoactivation relative to MDD and HC in the bilateral inferior parietal lobule. In conjunction analysis, hypoactivation in the right inferior/middle frontal gyrus was common to both MDD and PUD.
Conclusions
The transdiagnostic neurofunctional alterations in prefrontal cognitive control regions may underlie IC deficits shared by MDD and PUD, whereas disorder-differentiating activation abnormalities in midcingulate and parietal regions may account for their distinct features associated with disturbed goal-directed behavior.
The interaction of helminth infections with type 2 diabetes (T2D) has been a major area of research in recent years. This paper therefore systematically reviews the effects of helminth infections on metabolism and immune regulation related to T2D, and the mechanisms through which both direct and indirect effects are mediated. Of special interest is the possible therapeutic role of helminths in T2D management, probably mediated through the modulation of host metabolic pathways and immune responses. This paper discusses the current possibilities for translating helminth therapy from basic laboratory research to clinical application, as well as existing and future challenges. Although preliminary studies suggest the potential of helminth therapy for T2D patients, its safety and efficacy still need to be confirmed by larger-scale clinical studies.
Background: Aneurysmal subarachnoid hemorrhage (aSAH) is a devastating disease process that represents a significant health shock for thousands of patients each year. Return to work outcomes and associated factors require evaluation to counsel patients and identify domains on which to focus clinical efforts. Methods: A systematic review of the literature following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses 2020 guidelines was performed using MEDLINE, EMBASE and Cochrane databases from inception to February 2024. Proportion of patients returning to work was collected from included studies. Odds ratios were pooled from studies evaluating the association between pre-rupture demographic variables, post-rupture clinical variables and return to work following aSAH. Results: Literature search yielded 3861 studies, of which 40 studies were included in the final analysis for a total of 6888 patients. On average, 55% (SD 17%) of all patients returned to work after an aSAH. Female sex (OR for male sex 1.75), high grade aSAH on presentation (OR 0.30), and need for permanent CSF diversion (OR 0.50) were significantly associated with unemployment after aSAH. Conclusions: Female sex, high grade presentation, and permanent CSF diversion are associated with unemployment after aSAH. About half of all patients who experience aSAH return to work.
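The pooling of odds ratios described in the Methods can be sketched with the generic fixed-effect inverse-variance construction; the review itself may use a random-effects variant, and the numbers below are illustrative, not values from the included studies.

```python
# Fixed-effect inverse-variance pooling of odds ratios: weight each study's
# log(OR) by the inverse of its variance, then exponentiate the weighted mean.
# Inputs here are hypothetical example values, not data from the review.
import math

def pool_odds_ratios(ors, ses):
    """ors: study odds ratios; ses: standard errors of the log odds ratios."""
    logs = [math.log(o) for o in ors]
    w = [1.0 / s**2 for s in ses]              # inverse-variance weights
    pooled_log = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
    se_pooled = (1.0 / sum(w)) ** 0.5          # SE of the pooled log(OR)
    return math.exp(pooled_log), se_pooled

or_pooled, se_pooled = pool_odds_ratios([1.5, 2.0, 1.2], [0.3, 0.4, 0.25])
```

The pooled estimate is pulled toward the most precise study (the one with the smallest standard error), which is the defining behaviour of inverse-variance weighting.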
Background: Self-injurious behaviours (SIB) are repetitive, non-accidental movements that result in physical damage inflicted upon oneself, without suicidal intent. SIB are prevalent among children with autism spectrum disorder and can lead to permanent disability or death. Neuromodulation at a locus of neural circuitry implicated in SIB, the nucleus accumbens (NAc), may directly influence these behaviours. Methods: We completed a phase I, open-label clinical trial of deep brain stimulation (DBS) of the NAc in children with severe, treatment-refractory SIB (ClinicalTrials.gov NCT03982888). Participants were monitored for 12 months following NAc-DBS to assess the primary outcomes of safety and feasibility. Secondary outcomes included serial assessments of SIB, ambulatory actigraphy, and changes in brain glucose metabolism induced by DBS. Results: Six children underwent NAc-DBS without any serious adverse events. NAc-DBS resulted in significant reductions in SIB and SIB-associated behaviours across multiple standardized scales, concurrent with clinically meaningful improvements in quality-of-life. Ambulatory actigraphy showed reductions in high-amplitude limb movements and positron emission tomography revealed treatment-induced reductions in metabolic activity within the thalamus, striatum, and temporoinsular cortex. Conclusions: This first-in-children phase 1 clinical trial demonstrates the safety and feasibility of NAc-DBS in children with severe, refractory SIB at high risk of physical injury and death and supports further investigations.
Background: Late-onset Pompe disease (LOPD) is caused by a deficiency of acid α-glucosidase (GAA), leading to progressive muscle and respiratory decline. Cipaglucosidase alfa (cipa), a recombinant human GAA naturally enriched with bis-mannose-6-phosphate, exhibits improved muscle uptake but is limited by inactivation at near-neutral blood pH. Miglustat (mig), an enzyme stabiliser, binds competitively and reversibly to cipa, enhancing its stability and activity. Methods: In dose-finding studies, Gaa-/- mice were treated with cipa (20 mg/kg) ± mig (10 mg/kg; equivalent human dose ~260 mg). Clinical study methodologies have been published (Schoser et al. Lancet Neurol 2021;20:1027–37; Schoser et al. J Neurol 2024;271:2810–23). Results: In Gaa-/- mice, cipa+mig improved muscle glycogen reduction more than cipa alone and grip strength to levels approaching wild-type mice. LOPD patients (n=11) treated with cipa alone showed dose-dependent decreases in hexose tetrasaccharide (Hex4) levels by ~15% from baseline, decreasing another ~10% with added mig (260 mg). In a head-to-head study, cipa+mig had a similar safety profile to alglucosidase alfa. Among 151 patients (three trials), mig-related adverse events occurred in 21 (13.9%), none serious. Conclusions: Mig stabilised cipa in circulation, improving cipa exposure, further reducing Hex4 levels and was well tolerated in clinical studies in patients with LOPD. Sponsored by Amicus Therapeutics, Inc.
Children born very preterm (VPT; ≤32 weeks’ gestation) are at higher risk of developing behavioural problems, encompassing socio-emotional processing and attention, compared to term-born children. This study aimed to examine multi-dimensional predictors of late childhood behavioural and psychiatric outcomes in very preterm children, using longitudinal clinical, environmental, and cognitive measures.
Methods
Participants were 153 VPT children previously enrolled in the Evaluation of Preterm Imaging study who underwent neuropsychological assessments at 18–24 months, 4–7 years and 8–11 years as part of the Brain Immunity and Psychopathology following very Preterm birth (BIPP) study. Predictors of late childhood behavioural and psychiatric outcomes were investigated, including clinical, environmental, cognitive, and behavioural measures in toddlerhood and early childhood. Parallel analysis and exploratory factor analysis were conducted to define outcome variables. A prediction model using elastic-net regularisation and repeated nested cross-validation was applied to evaluate the predictive strength of these variables.
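The elastic-net-with-repeated-nested-cross-validation approach described above can be sketched with scikit-learn; the data below are synthetic, and the predictor structure, penalty grid and fold counts are assumptions for illustration, not the BIPP analysis code.

```python
# Sketch of elastic-net prediction with repeated nested cross-validation:
# the inner loop tunes the penalty, the outer loop estimates out-of-sample R^2.
# Synthetic data stand in for the clinical/cognitive predictors.
import numpy as np
from sklearn.linear_model import ElasticNet
from sklearn.model_selection import GridSearchCV, RepeatedKFold, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(153, 20))                       # 153 children, 20 predictors
y = 0.6 * X[:, 0] - 0.4 * X[:, 1] + rng.normal(scale=1.0, size=153)

inner = GridSearchCV(                                # inner loop: tune penalties
    ElasticNet(max_iter=10_000),
    param_grid={"alpha": [0.01, 0.1, 1.0], "l1_ratio": [0.2, 0.5, 0.8]},
    cv=5,
)
outer = RepeatedKFold(n_splits=5, n_repeats=10, random_state=0)
scores = cross_val_score(inner, X, y, cv=outer, scoring="r2")
cv_r2 = scores.mean()                                # cross-validated R^2
```

Nesting matters here: tuning the penalty and scoring on the same folds would bias the cross-validated R² upward, which the outer loop prevents.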
Results
Factor analysis revealed two key outcome factors in late childhood: externalising and internalising-socio-emotional problems. The strongest predictors of externalising problems were response inhibition, effortful control and internalising symptoms in early childhood (cross-validated R2=.256). The strongest predictors of internalising problems were autism traits and poor cognitive flexibility in early childhood (cross-validated R2=.123). Cross-validation demonstrated robust prediction models, with higher accuracy for externalising symptoms.
Conclusions
Early childhood cognitive and behavioural outcomes predicted late childhood behavioural and psychiatric outcomes in very preterm children. These findings underscore the importance of early interventions targeting cognitive development and behavioural regulation to mitigate long-term psychiatric risks in very preterm children.
This study explored mental workload recognition methods for carrier-based aircraft pilots utilising multi-sensor physiological signal fusion and portable devices. A simulated carrier-based aircraft flight experiment was designed, and subjective mental workload scores and electroencephalogram (EEG) and photoplethysmogram (PPG) signals from six pilot cadets were collected using the NASA Task Load Index (NASA-TLX) and portable devices. The subjective scores of the pilots in three flight phases were used to label the data into three mental workload levels. Features from the physiological signals were extracted, and the interrelations between mental workload and physiological indicators were evaluated. Machine learning and deep learning algorithms were used to classify the pilots’ mental workload. The performances of the single-modal method and multimodal fusion methods were investigated. The results showed that the multimodal fusion methods outperformed the single-modal methods, achieving higher accuracy, precision, recall and F1 score. Among all the classifiers, the random forest classifier with feature-level fusion obtained the best results, with an accuracy of 97.69%, precision of 98.08%, recall of 96.98% and F1 score of 97.44%. The findings of this study demonstrate the effectiveness and feasibility of the proposed method, offering insights into mental workload management and the enhancement of flight safety for carrier-based aircraft pilots.
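Feature-level fusion of the kind that produced the best results above can be sketched as concatenating per-modality feature vectors before a single random forest; the data below are synthetic stand-ins, not the cadets' recordings, and the feature counts are assumptions.

```python
# Sketch of feature-level fusion: EEG- and PPG-derived feature vectors are
# concatenated into one matrix and fed to a single random forest classifier,
# rather than training one model per modality. Data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 300
y = rng.integers(0, 3, size=n)                     # three workload levels
eeg = rng.normal(size=(n, 8)) + 0.8 * y[:, None]   # e.g. band-power features
ppg = rng.normal(size=(n, 4)) + 0.5 * y[:, None]   # e.g. heart-rate features

X = np.hstack([eeg, ppg])                          # feature-level fusion
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
```

Decision-level fusion would instead train one classifier per modality and combine their predictions; the concatenation shown here lets the forest exploit cross-modality feature interactions directly.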
This paper presents an improved signal-processing method based on the Hilbert-Huang transform (HHT), which is applied to the fault feature extraction of the aerospace generator rotating rectifier (AGRR). Initially, the excitation current of the alternating-current (AC) exciter is utilised as measurable information for data collection. Subsequently, the HHT is processed with variational mode decomposition (VMD), followed by the improvement of the variational Hilbert-Huang transform (VHHT) using particle swarm optimisation (PSO) to determine the modal decomposition number and the secondary penalty factor. Finally, the proposed PSO-VHHT method is compared with several other signal processing-based feature extraction methods through both simulated and practical experiment data, and an analysis of the diagnostic performance of these methods is also conducted.
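The Hilbert step at the core of the (V)HHT can be illustrated on a synthetic chirp standing in for one decomposed mode of the exciter excitation current; the VMD and PSO stages described above are not reproduced here.

```python
# Hilbert spectral analysis of a single mode: the analytic signal gives
# instantaneous amplitude and, via the unwrapped phase, instantaneous
# frequency. The chirp below sweeps 50 -> 90 Hz over one second.
import numpy as np
from scipy.signal import hilbert

fs = 1000.0                                        # sampling rate (Hz)
t = np.arange(0.0, 1.0, 1.0 / fs)
mode = np.cos(2 * np.pi * (50 * t + 20 * t**2))    # inst. freq = 50 + 40 t

analytic = hilbert(mode)                           # analytic signal
amplitude = np.abs(analytic)                       # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) / (2 * np.pi) * fs      # instantaneous frequency (Hz)
```

Away from the edge artefacts inherent to the finite Hilbert transform, the estimated instantaneous frequency tracks the analytic value 50 + 40t, which is what makes the transform useful for extracting fault signatures from non-stationary currents.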
Objectives/Goals: Cutaneous lupus erythematosus (CLE) is an inflammatory skin manifestation of lupus. CLE lesions are frequently colonized by Staphylococcus aureus, a microbe known to promote IFN production and inflammation. Here, we investigate whether type I IFN and inflammatory gene signatures in CLE lesions can be modulated with a topical antibiotic treatment. Methods/Study Population: SLE patients with active CLE lesions (n = 12) were recruited and randomized into a week of topical treatment with either 2% mupirocin or petroleum jelly vehicle. Paired samples were collected before and after 7 days of treatment to assess microbial lesional skin responses. Microbial samples from nares and lesional skin were used to determine baseline and posttreatment Staphylococcus abundance and microbial community profiles by 16S rRNA gene sequencing. Inflammatory responses were evaluated by bulk RNA sequencing of lesional skin biopsies. Immunophenotyping of CLE lesions was performed using CIBERSORTx to deconvolute the RNA-seq data into predicted cell populations impacted by treatment. Results/Anticipated Results: We identified 173 differentially expressed genes in CLE lesions after topical mupirocin treatment. Mupirocin treatment decreased the abundance of Staphylococcus associated with CLE lesions without altering the overall diversity of the skin microbiota relative to vehicle. Decreased lesional Staphylococcus burden correlated with decreased IFN pathway signaling and inflammatory gene expression and increased barrier dysfunction. Interestingly, mupirocin treatment lowered skin monocyte levels, and this mupirocin-associated depletion of monocytes correlated with decreased inflammatory gene expression. Discussion/Significance of Impact: Mupirocin treatment decreased lesional Staphylococcus burden and this correlated with decreased IFN signaling and inflammatory gene expression. 
This study suggests a topical antibiotic could be employed to decrease lupus skin inflammation and type I IFN responses by reducing Staphylococcus colonization.
Two potential obstacles stand between the observation of a statistical correlation and the design (and deployment) of an effective intervention: omitted variable bias and reverse causality. Whereas the former has received ample attention, comparably scant focus has been devoted to the latter in the methodological literature. Many existing methods for reverse causality testing commence by postulating a structural model that may suffer from widely recognized issues such as the difficulty of properly setting temporal lags, which are critical to model validity. In this article, we draw upon advances in machine learning, specifically the recently established link between causal direction and the effectiveness of semi-supervised learning algorithms, to develop a novel method for reverse causality testing that circumvents many of the assumptions required by traditional methods. Mathematical analysis and simulation studies were carried out to demonstrate the effectiveness of our method. We also performed tests over a real-world dataset to show how our method may be used to identify causal relationships in practice.
Rationality is a fundamental pillar of Economics. It is however unclear if this assumption holds when decisions are made under stress. To answer this question, we design two laboratory experiments where we exogenously induce physiological stress in participants and test the consistency of their choices with economic rationality. In both experiments we induce stress with the Cold Pressor test and measure economic rationality by the consistency of participants’ choices with the Generalized Axiom of Revealed Preference (GARP). In the first experiment, participants delay the decision-making task for 20 min until the cortisol level peaks. We find significant differences in cortisol levels between the stressed group and the placebo group which, however, do not affect the consistency of choices with GARP. In a second experiment, we study the immediate effect of the stressor on rationality. Overall, results from the second experiment confirm that rationality is not impaired by the stressor. If anything, we observe that compared to the placebo group, participants are more consistent with rationality immediately after the stressor. Our findings provide strong empirical support for the robustness of the economic rationality assumption under physiological stress.
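The GARP consistency measure used above can be checked with the standard revealed-preference construction; the sketch below is the textbook algorithm, not the authors' analysis code.

```python
# GARP check: x_i is directly revealed preferred to x_j when bundle x_j was
# affordable at the prices where x_i was chosen (p_i.x_i >= p_i.x_j). Take the
# transitive closure of this relation; GARP fails if x_i is revealed preferred
# to x_j while x_j is STRICTLY directly revealed preferred to x_i.
import numpy as np

def satisfies_garp(prices, bundles):
    p = np.asarray(prices, dtype=float)    # one price vector per observation
    x = np.asarray(bundles, dtype=float)   # one chosen bundle per observation
    cost = p @ x.T                         # cost[i, j] = p_i . x_j
    spend = np.diag(cost)                  # p_i . x_i
    R = spend[:, None] >= cost             # direct revealed preference
    n = len(p)
    for k in range(n):                     # Floyd-Warshall transitive closure
        R = R | (R[:, [k]] & R[[k], :])
    strict = spend[:, None] > cost         # strict direct revealed preference
    return not np.any(R & strict.T)        # violation: R*[i,j] and strict[j,i]

# Consistent: each bundle was unaffordable when the other was chosen.
ok = satisfies_garp([[1, 2], [2, 1]], [[2, 1], [1, 2]])
# Violation: each bundle strictly revealed preferred to the other.
bad = satisfies_garp([[2, 1], [1, 2]], [[1, 0], [0, 1]])
```

In practice, researchers often report not only this binary verdict but also severity indices such as Afriat's efficiency index; the binary check above is the building block for both.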
The transport process of a relativistic electron beam (REB) in high-density and degenerate plasmas holds significant importance for fast ignition. In this study, we have formulated a comprehensive theoretical model to address this issue, incorporating quantum degeneracy, charged particle collisions and the effects of electromagnetic (EB) fields. We model the fuel as a uniform density region and particularly focus on the effect of quantum degeneracy during the transport of the REB, which leads to the rapid growth of a self-generated EB field and a subsequent, significant self-organized pinching of the REB. Through our newly developed hybrid particle-in-cell simulations, we have observed a two-fold enhancement of the heating efficiency of the REB compared with previous intuitive expectations. This finding provides a promising theoretical framework for exploring the degeneracy effect and the enhanced self-generated EB field in the dense plasma for fast ignition, and is also linked to a wide array of ultra-intense laser-based applications.
This study introduces the prostate cancer linear energy transfer sensitivity index (PCLSI) as a novel method to predict relative biological effectiveness (RBE) in prostate cancer from linear energy transfer (LET) in proton therapy, based on screening for DNA repair mutations.
Materials and Methods:
Five prostate cancer cell lines with DNA repair mutations known to cause sensitivity to LET and DNA repair inhibitors were examined using published data. Relative Du145 LET sensitivity data were leveraged to deduce the LET equivalent of olaparib doses. The PCLSI model was built using three of the prostate cancer cell lines (LNCaP, 22Rv1 and Du145) with DNA mutation frequency from patient cohorts. The PCLSI model was compared against two established RBE models, McNamara and McMahon, for LET-optimized prostate cancer treatment plans.
Results:
The PCLSI model relies on the presence of mutations in eight genes: AR, ATM, BRCA1, BRCA2, CDH1, ETV1, PTEN and TP53, which are most likely to predict increased LET sensitivity and RBE in proton therapy. In the LET-optimized plan, the PCLSI model indicates that prostate cancer cells with these DNA repair mutations are more sensitive to increased LET than the McNamara and McMahon RBE models predict, with expected RBE increases ranging from 11% to 33% at 2 keV/µm.
Conclusions:
The PCLSI model predicts increasing RBE as a function of LET in the presence of certain genetic mutations. The integration of LET-optimized proton therapy and genetic mutation profiling could be a significant step toward the use of individualized medicine to improve outcomes using RBE escalation without the potential toxicity of physical dose escalation.
Integrating the predictive processing framework into our understanding of motivation offers promising avenues for theoretical development, while shedding light on the computational processes underlying motivated behavior. Here we decompose expected free energy into intrinsic value (i.e., epistemic affordance) and extrinsic value (i.e., instrumental affordance) to provide insights into how individuals adapt to and interact with their environment.
In this article, we delve into the optimal scheduling challenge for many-to-many on-orbit services, taking into account variations in target accessibility. The scenario assumes that each servicing satellite is equipped with singular or multiple service capabilities, tasked with providing on-orbit services to multiple targets, each characterised by distinct service requirements. The mission’s primary objective is to determine the optimal service sequence, orbital transfer duration and on-orbit service time for each servicing satellite, with the ultimate goal of minimising the overall cost. We frame the optimal scheduling dilemma as a time-related colored travelling salesman problem (TRCTSP) and propose an enhanced firefly algorithm (EFA) to address it. Finally, experimental results across various scenarios validate the effectiveness and superiority of the proposed algorithm. The principal contribution of this work lies in the modeling and resolution of the many-to-many on-orbit service challenge, considering accessibility variations — a domain that has, until now, remained unexplored.
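The enhanced firefly algorithm itself is problem-specific; the sketch below shows the canonical continuous firefly algorithm that such variants extend, minimising a simple test function. The TRCTSP encoding, constraints and the paper's enhancements are not reproduced, and all parameter values are illustrative.

```python
# Canonical firefly algorithm (continuous form): each firefly moves toward
# brighter (lower-cost) fireflies, with attractiveness decaying with squared
# distance and a random step that is cooled over iterations.
import numpy as np

def firefly_minimise(f, dim, n=25, iters=200,
                     beta0=1.0, gamma=0.01, alpha=0.2, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5.0, 5.0, size=(n, dim))
    light = np.array([f(p) for p in pop])          # lower cost = brighter
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if light[j] < light[i]:            # move i toward brighter j
                    r2 = np.sum((pop[i] - pop[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    pop[i] += beta * (pop[j] - pop[i]) \
                        + alpha * rng.normal(size=dim)
                    light[i] = f(pop[i])
        alpha *= 0.97                              # cool the random step
    best = int(np.argmin(light))
    return pop[best], float(light[best])

x_best, f_best = firefly_minimise(lambda v: float(np.sum(v ** 2)), dim=3)
```

Discrete variants for routing problems like the TRCTSP typically replace the vector move with permutation operators (e.g. swap or insertion moves biased toward brighter solutions) while keeping this attraction-plus-randomisation structure.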
Urban air mobility (UAM), utilising novel transportation tools, is gradually being recognised as a significant means of alleviating ground transportation pressure. Vertiports, which serve as pivotal nodes in UAM, require efficient capacity assessment methods to support appropriate operational strategies and the scientific design of vertiport ground infrastructure. This study proposes a multi-dimensional method for assessing vertiport capacity, considering throughput and quality of service, based on a genetic algorithm (CEGA). The method comprehensively considers constraints such as unmanned aerial vehicle (UAV) safety separation, battery endurance, number of landing vertipads and UAV speed. The experimental results indicate that, for the same limited number of vertipads, a vertiport using the proposed scheduling algorithm has a larger capacity and experiences fewer delays than one using a first-come-first-served (FCFS) algorithm. Different proportions of UAVs significantly affect the quality of service and the degree of operational delay. The weights of vertiport throughput and customer satisfaction represent their relative importance in the objective function of the capacity assessment model; the model performs best when these weights are set to 0.8 and 0.2, respectively. This study provides a novel solution for capacity assessment and operational scheduling of vertiports, laying a foundation for improving the efficiency of UAM operations.
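The weighted objective described above might be written as follows; the function name, normalisations and input values are assumptions for illustration, not the paper's exact model.

```python
# Hypothetical sketch of a weighted capacity objective: 0.8 * normalised
# throughput + 0.2 * normalised satisfaction, with satisfaction modelled as
# inversely related to mean delay. All names and scalings are illustrative.
def capacity_fitness(throughput, max_throughput, mean_delay, max_delay,
                     w_throughput=0.8, w_satisfaction=0.2):
    tp = throughput / max_throughput                       # normalise to [0, 1]
    satisfaction = 1.0 - min(mean_delay / max_delay, 1.0)  # less delay = happier
    return w_throughput * tp + w_satisfaction * satisfaction

score = capacity_fitness(throughput=42, max_throughput=60,
                         mean_delay=3.0, max_delay=10.0)
```

A genetic algorithm would then evolve candidate schedules to maximise this fitness subject to the separation, endurance and vertipad constraints listed in the abstract.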
Centanafadine (CTN) is a potential first-in-class norepinephrine/dopamine/serotonin triple reuptake inhibitor (NDSRI) that has demonstrated efficacy, safety, and tolerability vs placebo (PBO) in adults with ADHD in 2 pivotal phase 3 trials (Adler LA, et al. J Clin Psychopharmacol. 2022;42:429-39).
Methods
Pooled data from 2 double-blind, multicenter, PBO-controlled trials enrolling adults (18–55 years) meeting DSM-5 ADHD criteria were analyzed. Patients were randomized 1:1:1 to CTN sustained release (SR) 200 mg or 400 mg total daily dose (TDD) or matching PBO if Adult ADHD Investigator Symptom Rating Scale (AISRS) score was ≥28 at screening (if not receiving pharmacologic ADHD treatment) or ≥22 at screening and ≥28 at baseline (if receiving treatment). Having had no prior benefit from ≥2 ADHD therapies of different classes, use of prohibited medications, and positive alcohol/drug screens were exclusionary. Studies had 4 periods: (1) screening and washout (≤28 days), (2) single-blind PBO run-in (1 week), (3) double-blind treatment (6 weeks), and (4) follow-up (10 days after last dose). Patients with ≥30% Adult ADHD Self-report Scale (ASRS) improvement from start to end of screening were screen failures; those with ≥30% ASRS improvement from start to end of PBO run-in were terminated early. A mixed model for repeated measures analysis evaluated CTN SR vs PBO based on ADHD treatment history; least squares mean (LSM) change from baseline (BL) in AISRS at day 42 was the outcome of interest.
Results
In total, 859 patients were analyzed (CTN SR 200 mg TDD, n=287; 400 mg TDD, n=287; PBO, n=285). LSM change from BL in AISRS score was significant at day 42 for each CTN SR TDD group (both, P<0.001) in the overall population vs PBO. Among patients with prior stimulant/nonstimulant treatment (n=542), LSM change from BL was significant at day 42 vs PBO in the CTN SR 200 mg (P=0.016) and 400 mg (P=0.008) TDD groups. Although cohort size was limited (n=47), LSM change from baseline with CTN SR 400 mg TDD was significant (P<0.05) from days 14 to 42 in those who took 2 prior stimulant/nonstimulant treatments, with P=0.030 at day 42. In those with no prior stimulant/nonstimulant treatment (n=317), LSM change from BL was significant at day 42 for the CTN SR 200 mg (P=0.007) and 400 mg (P=0.008) TDD groups vs PBO. When analyzed by history of any past stimulant use, LSM change from BL was significant at day 42 for CTN 200 mg (n=179; P=0.013) and 400 mg (n=166; P=0.006), with significance (P<0.05) noted at day 7 (200 mg TDD) and at day 21 (400 mg TDD), remaining significant to day 42.
Conclusions
This pooled analysis suggests that CTN SR treatment is efficacious in adults with ADHD, regardless of prior treatments, an encouraging finding given reported adult ADHD treatment patterns.
Centanafadine (CTN) is a potential first-in-class norepinephrine/dopamine/serotonin triple reuptake inhibitor (NDSRI). The efficacy, safety, and tolerability of CTN sustained release (SR) for adults with ADHD was demonstrated in 2 pivotal phase 3 trials (Adler LA, et al. J Clin Psychopharmacol. 2022;42:429-39).
Methods
Adults (18–55 years) meeting DSM-5 criteria for ADHD enrolled in these double-blind, multicenter, placebo-controlled trials and randomized to treatment if ADHD Investigator Symptom Rating Scale (AISRS) score was ≥28 at screening (if not receiving pharmacologic treatment for ADHD) or ≥22 at screening and ≥28 at baseline (BL) (if receiving treatment). Having had no prior benefit from ≥2 ADHD therapies of 2 different classes, taking prohibited medications, and positive alcohol/drug screen were exclusionary. Trials had 4 periods: (1) screening and washout (≤28 days), (2) single-blind placebo run-in (1 week), (3) double-blind treatment (6 weeks), and (4) follow-up (10 days after last dose). Patients with ≥30% improvement in the Adult ADHD Self-report Scale (ASRS) from start to end of screening were screen failures; those with ≥30% ASRS improvement from start to end of placebo run-in were terminated early. Patients were randomized 1:1:1 to twice-daily CTN SR (200 or 400 mg total daily dose [TDD]) or matching placebo. The 200 mg/d group received CTN SR 200 mg TDD from days 1–42; the 400 mg/d group received 200 mg TDD on days 1–7, and increased to 400 mg TDD on day 8. This analysis assessed CTN SR effects based on median BL AISRS severity score (<38 or ≥38) using a mixed model for repeated measures analysis. Least squares mean (LSM) differences (95% CI) from BL at day 42 were compared between individual CTN SR dose groups and placebo, tested at a 2-sided significance level of 0.05.
Results
In total, 859 patients were randomized (200 mg TDD, n=287; 400 mg TDD, n=287; placebo, n=285). Significant LSM differences on the AISRS were observed vs placebo in the overall population (200 mg TDD and 400 mg TDD, P<0.0001 for each), in the low BL severity (200 mg TDD [P=0.016]; 400 mg TDD [P=0.019]), and in the high BL severity (200 mg TDD [P=0.005]; 400 mg TDD [P=0.003]) populations at day 42. Significant LSM differences vs placebo (P<0.01) began at day 7 (200 mg) and day 14 (400 mg) overall, remaining significant to day 42. Significant LSM differences were observed vs placebo (P<0.05) from day 14 (400 mg TDD) and day 21 (200 mg) in the low severity populations, and from day 21 (400 mg TDD) and day 7 (200 mg TDD) in the high severity population, remaining significant (P<0.05) to day 42.
Conclusions
CTN SR, a potential first-in-class NDSRI, is efficacious for patients with adult ADHD of low or high BL symptom severity, with significant improvements observed vs placebo within the first 3 weeks.
The infinitesimal jackknife, a nonparametric method for estimating standard errors, has been used to obtain standard error estimates in covariance structure analysis. In this article, we adapt it for obtaining standard errors for rotated factor loadings and factor correlations in exploratory factor analysis with sample correlation matrices. Both maximum likelihood estimation and ordinary least squares estimation are considered.
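The infinitesimal jackknife is the limiting case of the delete-one jackknife under vanishing case weights; the finite delete-one version below, applied to a sample correlation, conveys the idea but is not the covariance-structure machinery of the article.

```python
# Delete-one jackknife standard error for a sample correlation: recompute the
# statistic with each observation removed, then scale the spread of the
# leave-one-out replicates. Data are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n = 200
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(scale=1.0, size=n)

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

loo = np.array([corr(np.delete(x, i), np.delete(y, i)) for i in range(n)])
se_jack = np.sqrt((n - 1) / n * np.sum((loo - loo.mean()) ** 2))
```

The infinitesimal version replaces the finite deletions with derivatives of the statistic with respect to case weights, which avoids refitting and is what makes the approach tractable for rotated loadings and factor correlations.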
Considering a dyad as a dynamic system whose current state depends on its past state has allowed researchers to investigate whether and how partners influence each other. Some researchers have also focused on how differences between dyads in their interaction patterns are related to other differences between them. A promising approach in this area is the model that was proposed by Gottman and Murray, which is based on nonlinear coupled difference equations. In this paper, it is shown that their model is a special case of the threshold autoregressive (TAR) model. As a consequence, we can make use of existing knowledge about TAR models with respect to parameter estimation, model alternatives and model selection. We propose a new estimation procedure and perform a simulation study to compare it to the estimation procedure developed by Gottman and Murray. In addition, we include an empirical example based on interaction data of three dyads.
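A two-regime TAR(1) process of the kind discussed above can be simulated and its regime slopes recovered by regime-wise least squares; this illustrates the model class, not the authors' estimation procedure, and the threshold and slopes are arbitrary illustrative values.

```python
# Simulate a threshold autoregressive (TAR) series: the AR(1) slope switches
# when the previous state crosses a threshold, then recover each regime's
# slope by least squares restricted to that regime (threshold known).
import numpy as np

rng = np.random.default_rng(2)
n, tau = 2000, 0.0
phi_low, phi_high = 0.3, 0.8               # AR(1) slope in each regime

x = np.zeros(n)
for t in range(1, n):
    phi = phi_high if x[t - 1] > tau else phi_low
    x[t] = phi * x[t - 1] + rng.normal(scale=0.5)

lag, cur = x[:-1], x[1:]
hi = lag > tau                             # split observations by regime
phi_high_hat = np.sum(lag[hi] * cur[hi]) / np.sum(lag[hi] ** 2)
phi_low_hat = np.sum(lag[~hi] * cur[~hi]) / np.sum(lag[~hi] ** 2)
```

In the dyadic setting, the threshold variable would be the partner's previous state rather than the series' own lag, but the regime-wise least squares logic carries over once the threshold is fixed.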