The turbulent wake behind a flat-back Ahmed body is investigated using stacked stereoscopic particle image velocimetry. The wake is disturbed by a steady jet from the centre of the base and the effects are quantified for key blowing rates. The unactuated wake exhibits bistable dynamics in the horizontal plane that are completely subdued for the optimal blowing case, yielding a base drag reduction of 9 %. The three-dimensional mean wake is reconstructed and used to evaluate the wake mass fluxes whose equilibrium determines the recirculation length. The results for the unactuated wake show that up to 80 % of replenishment fluid flux entering the recirculation bubble from the free-stream flow is provided through the low-pressure side of the base, where the symmetry-breaking shear layer roll-up occurs near the base. For the optimal blowing configuration, where the wake becomes symmetric, the flux of wake replenishment is severely reduced. This flow configuration results in elongated shear layers on all sides, which terminate the bubble with a roll-up of reduced intensity at a further downstream location. The dominant cause of bubble growth and the accompanying drag reduction is attributed to the momentum of the base blowing, and the new regime is referred to as the ‘favourable momentum regime’. Similar trends are observed when the model is at $5^{\circ }$ yaw where a reduction of drag and yaw-induced asymmetry are obtained. Proper orthogonal decomposition of the wake reveals the coherent structures related to the bistable flow and the symmetric wake under optimal blowing coefficient.
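The snapshot proper orthogonal decomposition referred to above is, in essence, a singular value decomposition of mean-subtracted velocity snapshots. The sketch below (Python, with hypothetical array sizes; not the authors' code) illustrates the basic procedure.

```python
import numpy as np

# Minimal snapshot-POD sketch: rows = spatial points (velocity components stacked),
# columns = PIV snapshots. `snapshots` is a hypothetical stand-in for the
# stacked stereoscopic PIV data described in the abstract.
rng = np.random.default_rng(0)
snapshots = rng.standard_normal((5000, 400))         # (n_points, n_snapshots)

mean_flow = snapshots.mean(axis=1, keepdims=True)    # time-averaged wake
fluctuations = snapshots - mean_flow                 # subtract the mean before POD

# Economy-size SVD: columns of U are spatial POD modes, S**2 ~ modal energy
U, S, Vt = np.linalg.svd(fluctuations, full_matrices=False)
energy_fraction = S**2 / np.sum(S**2)
print("energy captured by the first 4 modes:", energy_fraction[:4].sum())

# Temporal coefficient of mode 0, e.g. an antisymmetric "bistable" mode
a0 = S[0] * Vt[0]
```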
Caribbean health research has overwhelmingly employed measures developed elsewhere and rarely includes evaluation of psychometric properties. Established measures are important for research and practice. Particularly, measures of stress and coping are needed. Stressors experienced by Caribbean people are multifactorial, as emerging climate threats interact with existing complex and vulnerable socioeconomic environments. In the early COVID-19 pandemic, our team developed an online survey to assess the well-being of health professions students across university campuses in four Caribbean countries. This survey included the Perceived Stress Scale, 10-item version (PSS-10) and the Brief Resilient Coping Scale (BRCS). The participants were 1,519 health professions students (1,144 females, 372 males). We evaluated the psychometric qualities of the measures, including internal consistency, concurrent validity by correlating both measures, and configural invariance using confirmatory factor analysis (CFA). Both scales had good internal consistency, with omega values of 0.91 for the PSS-10 and 0.81 for the BRCS. CFA suggested a two-factor structure of the PSS-10 and unidimensional structure of the BRCS. These findings support further use of these measures in Caribbean populations. However, the sampling strategy limits generalizability. Further research evaluating these and other measures in the Caribbean is desirable.
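For context, McDonald's omega for a unidimensional scale such as the BRCS can be computed from single-factor loadings and residual variances. The sketch below (Python) uses simulated item responses purely for illustration; it is not the authors' analysis pipeline.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Hypothetical (n_respondents, n_items) item-response matrix standing in for
# a short unidimensional scale such as the BRCS.
rng = np.random.default_rng(1)
latent = rng.standard_normal((1519, 1))
X = latent @ np.ones((1, 4)) + 0.8 * rng.standard_normal((1519, 4))

fa = FactorAnalysis(n_components=1).fit(X)
loadings = fa.components_.ravel()     # lambda_i, one loading per item
uniquenesses = fa.noise_variance_     # theta_i, residual item variances

# omega = (sum lambda)^2 / [(sum lambda)^2 + sum theta]
omega = loadings.sum() ** 2 / (loadings.sum() ** 2 + uniquenesses.sum())
print(f"omega = {omega:.2f}")
```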
Background: Central-line–associated bloodstream infection (CLABSI) rates increased nationally during COVID-19, the drivers of which are still being characterized in the literature. CLABSI rates doubled during the SARS-CoV-2 omicron-variant surge at our rural academic medical center. We sought to identify potential drivers of CLABSIs by comparing period- and patient-specific characteristics of this COVID-19 surge to a historical control period. Methods: We defined the study period as the time of highest COVID-19 burden at our hospital (July 2021–June 2022) and the control period as the previous 2 years (July 2019–June 2021). We compared NHSN CLABSI standardized infection ratios (SIRs), central-line standardized utilization ratios (SURs), completion of practice evaluation tools (PETs) for monitoring of central-line bundle compliance, and proportions of traveling nurses. We performed chart reviews to determine patient-specific characteristics of NHSN CLABSIs during these periods, including demographics, comorbidities, central-line characteristics and care, and microbiology. Results: The CLABSI SIR was significantly higher during the study period than the control period (0.89 vs 0.52; P = .03); the SUR was significantly higher during the study period (1.08 vs 1.02; P < .01); the PET completion per 100 central-line days was significantly lower during the study period (23.0 vs 31.5; P < .01); and the proportion of traveling nurses was significantly higher during the study period (0.20 vs 0.08; P < .01) (Fig. 1). Patients with NHSN CLABSIs during the study period were more likely to have a history of COVID-19 (27% vs 3%; P = .01) and were more likely to receive a higher level of care (60% vs 27%; P = .02). During the study period, more patients had multilumen catheters (87% vs 61%; P = .04). The type of catheter, catheter care (ie, dressing changes and chlorhexidine bathing), catheter duration before CLABSI, and associated microbiology were similar between the study and control periods (Table 1). Conclusions: During the SARS-CoV-2 omicron-variant surge, the increase in CLABSIs at our hospital was significantly associated with increased central-line utilization, decreased PET completion, and increased proportion of traveling nurses. Critical illness and multilumen catheters were significant patient-specific factors that differed between CLABSIs from the study and control periods. We did not observe differences in catheter type, duration, or catheter care. Our study highlights key modifiable risk factors for CLABSI reduction. These findings may be surrogates for other difficult-to-measure challenges related to the culture of safety during a global pandemic, such as staff education related to infection prevention and daily review of central-line necessity.
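For readers unfamiliar with the NHSN metrics compared above, each ratio reduces to simple arithmetic. The counts below are invented for illustration only and are not the hospital's data.

```python
# Hypothetical illustration of the NHSN-style metrics compared in the abstract.
observed_clabsi = 16              # infections observed in a period (made up)
predicted_clabsi = 18.0           # NHSN risk-adjusted prediction (made up)
sir = observed_clabsi / predicted_clabsi            # standardized infection ratio

observed_line_days = 5400         # central-line days used (made up)
predicted_line_days = 5000.0      # NHSN-predicted device utilization (made up)
sur = observed_line_days / predicted_line_days      # standardized utilization ratio

pets_completed = 1242             # practice evaluation tools completed (made up)
pet_per_100_line_days = 100 * pets_completed / observed_line_days

print(f"SIR={sir:.2f}, SUR={sur:.2f}, PET/100 line-days={pet_per_100_line_days:.1f}")
```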
Low- and middle-income countries (LMICs) carry the majority of the disease burden attributed to major depressive disorder (MDD). Despite this, there remains a substantial gap in access to evidence-based treatments for MDD in LMICs such as Pakistan. Measurement-based care (MBC) incorporates the systematic administration of validated outcome measures to guide treatment decision making and is considered a low-cost approach to optimising clinical outcomes for individuals with MDD, but there is a paucity of evidence on the efficacy of MBC in LMICs.
Objectives
This protocol describes a randomized trial that will include Pakistani outpatients with moderate to severe major depression.
Methods
Participants will be randomised to either MBC (guided by schedule) or standard treatment (guided by clinicians' judgement), and will be prescribed paroxetine (10–60 mg/day) or mirtazapine (7.5–45 mg/day) for 24 weeks. Outcomes will be evaluated by raters blind to study protocol and treatment.
Results
The National Bioethics Committee (NBC) of Pakistan has granted full ethics approval. The trial is being conducted and reported in accordance with the CONSORT statement for RCTs.
Conclusions
With increasing evidence from high-income settings supporting the effectiveness of MBC for MDD, it is now necessary to explore its feasibility, utility, and efficacy in low-resource settings. The results of the proposed trial could inform the development of a low-cost and scalable approach to efficiently optimise outcomes for individuals with MDD in Pakistan.
Bipolar disorder (BD) is a source of marked disability, morbidity, and premature death. There is a paucity of research on personalized psychosocial interventions for BD, especially in low-resource settings. A previously published pilot randomized controlled trial (RCT) of a Culturally adapted PsychoEducation (CaPE) intervention for BD in Pakistan reported higher patient satisfaction, enhanced medication adherence, improved knowledge of and attitudes towards BD, and improvements in mood symptom scores and health-related quality of life measures compared with treatment-as-usual (TAU).
Objectives
This protocol describes a larger multicentre RCT to confirm the clinical and cost-effectiveness of CaPE in Pakistan.
Methods
A multicentre, individual, parallel-arm RCT of CaPE in 300 Pakistani adults with BD. Participants over the age of 18, with a diagnosis of bipolar I and II and who are currently euthymic, will be recruited from seven sites including Karachi, Lahore, Multan, Rawalpindi, Peshawar, Hyderabad and Quetta. Time to recurrence will be the primary outcome assessed using the Longitudinal Interval Follow-up Evaluation (LIFE). Secondary measures will include mood symptomatology, quality of life and functioning, adherence to psychotropic medications, and knowledge and attitudes towards BD.
Results
Full ethics approval has been received from the National Bioethics Committee (NBC) of Pakistan and the Centre for Addiction and Mental Health (CAMH), Toronto, Canada. Sixty-five participants have been screened across the seven centres, of whom forty-eight have been randomised.
Conclusions
A successful trial will lead to rapid implementation of CaPE in clinical practice, not only in Pakistan, but also in other low-resource settings including those in high-income countries, to improve clinical outcomes, social and occupational functioning, and quality of life in South Asian and other minority patients with BD.
Up to 90% of adults with an untreated atrial septal defect will be symptomatic by the fourth decade of life, and 30–49% will develop heart failure. Of these patients, 8–10% have pulmonary arterial hypertension, with a female predominance regardless of age. We aimed to demonstrate that fenestrated closure can be safely performed, with improved outcomes, in patients with decompensated heart failure and atrial septal defect-associated pulmonary arterial hypertension.
Methods:
Transcatheter fenestrated atrial septal defect closures (Occlutech GmbH, Jena, Germany) were performed on a compassionate-use basis in 5 consecutive adult patients with atrial septal defect-associated pulmonary arterial hypertension and severe heart failure with prohibitive surgical mortality risk. Changes in systemic oxygen saturation, 6-minute walk test distance, NYHA class, and echocardiographic and haemodynamic parameters were used as outcome measures.
Results:
All patients were female (mean age 48.8 ± 13.5 years) and were followed up for a median of 29 months (maximum 64 months). Significant improvements were observed in the 6-minute walk test and oxygen saturation when comparing day 0 with all subsequent follow-up time points (B = 1.32, SE = 0.28, t(22.7) = -4.77, p = 0.0001), and in the haemodynamic data, including pulmonary vascular resistance and pulmonary pressure (B = –0.60, SE = 0.22, t(40.2) = 2.74, p = 0.009). All patients showed improved right ventricular size and function along with NYHA class. There were no procedure-related complications.
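Repeated-measures estimates of this kind (a fixed effect of follow-up time with patient-level random intercepts) can be obtained with a linear mixed model. The sketch below (Python, statsmodels) uses simulated data; the variable names and values are illustrative and do not reproduce the study's analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated long-format follow-up data: 5 hypothetical patients, 6-minute walk
# distance measured at several follow-up visits.
rng = np.random.default_rng(2)
rows = []
for pid in range(5):
    base = 300 + 30 * rng.standard_normal()           # baseline walk distance (m)
    for months in (0, 6, 12, 24, 36):
        rows.append({"patient": pid, "months": months,
                     "walk_m": base + 1.3 * months + 15 * rng.standard_normal()})
df = pd.DataFrame(rows)

# Random-intercept mixed model: walk distance regressed on follow-up time
fit = smf.mixedlm("walk_m ~ months", df, groups=df["patient"]).fit()
print(fit.params["months"], fit.bse["months"])        # slope estimate and its SE
```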
Conclusion:
Fenestrated atrial septal defect closure is feasible in adults with decompensated heart failure and atrial septal defect-associated pulmonary arterial hypertension. It results in sustained haemodynamic and functional improvement.
For 147 hospital-onset bloodstream infections, we assessed the sensitivity, specificity, positive predictive value, and negative predictive value of the National Healthcare Safety Network surveillance definitions of central-line–associated bloodstream infections against the gold standard of physician review, examining the drivers of discrepancies and related implications for reporting and infection prevention.
The magnitude and azimuth of horizontal ice flow at Camp Century, Greenland, have been measured several times since 1963. Here, we provide a further two independent measurements over the 2017–21 period. Our consensus estimate of horizontal ice flow from four independent satellite-positioning solutions is 3.65 ± 0.13 m a⁻¹ at an azimuth of 236 ± 2°. A portion of the small, but significant, differences in ice velocity and azimuth reported between studies likely results from spatial gradients in ice flow. This highlights the importance of restricting inter-study comparisons of ice flow estimates to measurements surveyed within a horizontal distance of one ice thickness from each other. We suggest that ice flow at Camp Century is stable on seasonal to multi-decadal timescales. The airborne and satellite laser altimetry record indicates an ice thickening trend of 1.1 ± 0.3 cm a⁻¹ since 1994. This thickening trend is qualitatively consistent with previously inferred ongoing millennial-scale ice thickening at Camp Century. The ice flow divide immediately north of Camp Century may now be migrating southward, although the reasons for this divide migration are poorly understood. The Camp Century flowlines presently terminate in the vicinity of Innaqqissorsuup Oqquani Sermeq (Gade Gletsjer) on the Melville Bay coast.
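A horizontal flow estimate of this kind follows from two surveyed positions and the elapsed time between them. The sketch below (Python) uses hypothetical local easting/northing displacements chosen only so that the output is of the same order as the consensus estimate quoted above; it is not the survey data.

```python
import numpy as np

# Hypothetical positions in a local easting/northing frame (metres).
e0, n0 = 0.0, 0.0            # position at epoch t0
e1, n1 = -12.1, -8.2         # position at epoch t1 (made-up displacement)
dt_years = 4.0               # elapsed time between the two epochs

de, dn = e1 - e0, n1 - n0
speed = np.hypot(de, dn) / dt_years                 # horizontal speed, m per year
azimuth = np.degrees(np.arctan2(de, dn)) % 360.0    # degrees clockwise from north

print(f"{speed:.2f} m/yr toward {azimuth:.0f} degrees")
```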
Background: Chordomas are rare malignant skull-base/spine cancers with devastating neurological morbidities and mortality. Unfortunately, no reliable prognostic factors exist to guide treatment decisions. This work identifies DNA methylation-based prognostic chordoma subtypes that are detectable non-invasively in plasma. Methods: Sixty-eight tissue samples underwent DNA methylation profiling, and plasma methylomes were obtained for available paired samples. Immunohistochemical staining and publicly available methylation and gene expression data were utilized for validation. Results: Unsupervised clustering identified two prognostic tissue clusters (log-rank p=0.0062) predicting disease-specific survival independent of clinical factors (multivariable Cox: HR=16.5, 95% CI: 2.8-96, p=0.0018). The poorer-performing cluster showed immune-related pathway promoter hypermethylation and higher immune cell abundance within tumours, which was validated with external RNA-seq data and immunohistochemical staining. The better-performing cluster showed higher tumour cellularity. Similar clusters were seen in external DNA methylation data. Plasma methylome-based models distinguished chordomas from differential diagnoses in independent testing sets (AUROC=0.84, 95% CI: 0.52-1.00). Plasma methylomes were highly correlated with tissue-based signals for both clusters (r=0.69 and 0.67), and leave-one-out models identified the correct cluster in all plasma cases. Conclusions: Prognostic molecular chordoma subgroups are identified, characterized, and validated for the first time. Plasma methylomes can detect and subtype chordomas, which may transform chordoma treatment with personalized approaches tailored to prognosis.
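Leave-one-out classification of plasma samples, as described above, can be set up along the following lines. The sketch (Python, scikit-learn) uses simulated features and labels and a generic logistic-regression classifier; it is not the authors' model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Simulated plasma-methylome features (e.g. binned methylation signal) and a
# binary cluster label per sample; signal is injected so the classes separate.
rng = np.random.default_rng(3)
n_cases, n_features = 20, 100
X = rng.standard_normal((n_cases, n_features))
y = np.array([0] * 10 + [1] * 10)
X[y == 1] += 0.8

# Leave-one-out cross-validated probabilities, then an overall AUROC
probs = cross_val_predict(LogisticRegression(max_iter=1000), X, y,
                          cv=LeaveOneOut(), method="predict_proba")[:, 1]
print("LOO AUROC:", roc_auc_score(y, probs))
```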
Background: Focal spasticity affects up to 1 in 3 residents in long-term care (LTC), with potentially disabling consequences. Data are limited on access to care for patients requiring botulinum toxin (BoNT) treatment in LTC. Methods: This retrospective, observational, real-world study was conducted using the Ontario Drug Benefit claims database. Patients with ≥1 medical claim for BoNT for focal spasticity treatment were selected, and those residing in LTC were further identified. Data were analyzed for the utilization (2000–2019), treatment rate, and time-to-treatment with BoNT in LTC residents (2015–2019). Results: Over a 10-year period, the number of patients receiving BoNT for spasticity increased 7-fold and the proportion of patients residing in LTC versus community increased from 43% (2010) to 52% (2019). Of the LTC residents eligible for BoNT treatment, 33% received BoNT in 2015 compared with 63% in 2019. Injections/patient/year increased from 1.9 (2010) to 3.1 (2017). Following LTC admission, median time to first injection was 2.9 years. Conclusions: In this study, approximately 40% of eligible LTC residents in Ontario were not receiving BoNT treatment, and of those who were, median time to first injection was 2.9 years. Future policy considerations should prioritize uniform access to spasticity standards of care for LTC residents.
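Median time-to-treatment figures of this kind are usually estimated with a Kaplan–Meier approach, treating residents who were never injected during follow-up as censored. The sketch below (Python, lifelines) uses simulated durations, not the Ontario claims data.

```python
import numpy as np
from lifelines import KaplanMeierFitter

# Simulated waits from LTC admission to first BoNT injection (years); residents
# not treated within a 5-year follow-up window are treated as censored.
rng = np.random.default_rng(6)
n = 500
wait_years = rng.exponential(3.0, n)        # hypothetical time to first injection
never_treated = rng.random(n) < 0.15        # a fraction never treated at all
wait_years[never_treated] = np.inf
observed = np.minimum(wait_years, 5.0)      # administrative censoring at 5 years
event = wait_years <= 5.0                   # injection observed within follow-up

kmf = KaplanMeierFitter()
kmf.fit(durations=observed, event_observed=event)
print("median years to first injection:", kmf.median_survival_time_)
```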
Production of cereal crops in sub-Saharan Africa is threatened by parasitic striga weeds and by attack from stemborers and the invasive fall armyworm (FAW), compounded by increasingly hot and dry conditions. A climate-smart push-pull technology (PPT) significantly reduces the effects of these biotic challenges. To further improve the resilience of the system to climate change, better-adapted and more suitable companion plants were identified and integrated into a new version of PPT, termed ‘third generation PPT’. Our study evaluates the field performance and farmer opinions of this new version in comparison with the earlier version, climate-smart PPT, and with farmers’ own practices of growing maize, for the control of stemborers, FAW, and striga weeds. Trials were conducted across five locations in western Kenya for two cropping seasons in 2019, following a one-farm one-replicate completely randomized design. We assessed striga, stemborer, and FAW infestation and the yield performance of the three cropping systems. We also sought the opinions of the hosting farmers through semi-structured questionnaires administered in individual interviews. Both PPT plots recorded significantly (P < 0.05) lower striga count, FAW and stemborer damage, and higher grain yield than plots that followed farmers’ own practices. There was no statistically significant difference between the two PPT plots except for stemborer damage, for which the third generation PPT recorded higher damage than the climate-smart PPT. However, farmers preferred the third generation PPT for important traits possessed by its companion plants in which their counterparts in climate-smart PPT are deficient. The cultivar Xaraes was rated as ‘very good’ for resistance to spider mites, biomass yield, and drought tolerance, while Desmodium incanum was rated ‘very good’ for seed production and drought tolerance. The third generation PPT is based on companion crops that are more resilient to the hot and dry conditions that are increasing rapidly in prevalence with climate change. This version therefore presents a better option for upscaling the technology and meeting the different needs of farmers, especially in arid and semi-arid conditions.
We report results and modelling of an experiment performed at the Target Area West Vulcan laser facility, aimed at investigating laser–plasma interaction in conditions that are of interest for the shock ignition scheme in inertial confinement fusion (ICF), that is, laser intensity higher than $10^{16}\,\mathrm{W}/{\mathrm{cm}}^2$ impinging on a hot ($T>1$ keV), inhomogeneous and long-scalelength pre-formed plasma. Measurements show a significant stimulated Raman scattering (SRS) backscattering ($\sim 4\%{-}20\%$ of laser energy) driven at low plasma densities and no signatures of two-plasmon decay (TPD)/SRS driven at the quarter-critical density region. Results are satisfactorily reproduced by an analytical model accounting for the convective SRS growth in independent laser speckles, in conditions where the reflectivity is dominated by the contribution from the most intense speckles, where SRS becomes saturated. Analytical and kinetic simulations reproduce well the onset of SRS at low plasma densities in a regime strongly affected by non-linear Landau damping and by filamentation of the most intense laser speckles. The absence of TPD/SRS at higher densities is explained by pump depletion and plasma smoothing driven by filamentation. The prevalence of laser coupling in the low-density profile justifies the low temperature measured for hot electrons ($7\!{-}\!12$ keV), which is well reproduced by numerical simulations.
Obese subjects have shown a preference for dietary lipids. A recent collection of evidence has proposed that a variant in the CD36 gene plays a significant role in this pathway. We assessed the association between the orosensory detection of a long-chain fatty acid, i.e. oleic acid (OA), and genetic polymorphism of the lipid taste sensor CD36 in obese and normal-weight subjects. Adult participants were recruited in the fasting condition. They were invited to fat taste perception sessions using emulsions containing OA, according to the three-alternative forced-choice (3-AFC) method. Genomic DNA was used to determine the polymorphism (SNP rs1761667) of the CD36 gene. Obese participants (n 50; BMI 34⋅97 (sd 4⋅02) kg/m²) exhibited a significantly higher oral detection threshold for OA (3⋅056 (sd 3⋅53) mmol/l) than did the normal-weight participants (n 50; BMI 22⋅16 (sd 1⋅81) kg/m²; 1⋅20 (sd 3⋅23) mmol/l; P = 0⋅007). There was a positive correlation between OA detection thresholds and BMI in all subjects, and likewise with body fat percentage (BF%). The AA genotype was more frequent in the obese group than in the normal-weight group. OA detection thresholds were much higher for AA and AG genotypes in obese subjects compared with normal-weight participants. Higher oral detection thresholds for fatty acid taste are related to BMI and BF%, and not always to CD36 genotype.
To examine the factors that are associated with changes in depression in people with type 2 diabetes living in 12 different countries.
Methods
People with type 2 diabetes treated in out-patient settings aged 18–65 years underwent a psychiatric assessment to diagnose major depressive disorder (MDD) at baseline and follow-up. At both time points, participants completed the Patient Health Questionnaire (PHQ-9), the WHO five-item Well-being scale (WHO-5) and the Problem Areas in Diabetes (PAID) scale which measures diabetes-related distress. A composite stress score (CSS) (the occurrence of stressful life events and their reported degree of ‘upset’) between baseline and follow-up was calculated. Demographic data and medical record information were collected. Separate regression analyses were conducted with MDD and PHQ-9 scores as the dependent variables.
Results
In total, there were 120 (7.4%) incident cases of MDD, while 1317 (81.5%) participants remained free of a diagnosis of MDD. Univariate analyses demonstrated that those with MDD were more likely to be female, less likely to be physically active, more likely to have diabetes complications at baseline, and had a higher CSS. Mean scores for the WHO-5, PAID and PHQ-9 were poorer in those with incident MDD compared with those who had never had a diagnosis of MDD. Regression analyses demonstrated that higher PHQ-9 scores, lower WHO-5 scores and greater CSS were significant predictors of incident MDD. Significant predictors of PHQ-9 were baseline PHQ-9 score, WHO-5, PAID and CSS.
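A regression of this kind, with incident MDD as a binary outcome and baseline PHQ-9, WHO-5 and CSS as predictors, might be specified as below. The data frame is simulated and the coefficients are arbitrary; this is an illustrative sketch in Python (statsmodels), not the study's analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated baseline predictors for a cohort of roughly the study's size.
rng = np.random.default_rng(4)
n = 1600
df = pd.DataFrame({
    "phq9": rng.integers(0, 20, n),    # baseline depressive symptoms
    "who5": rng.integers(0, 26, n),    # baseline well-being
    "css":  rng.integers(0, 10, n),    # composite stress score
})
lin_pred = 0.15 * df.phq9 - 0.08 * df.who5 + 0.20 * df.css - 2.5
df["incident_mdd"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-lin_pred))).astype(int)

# Logistic regression of incident MDD on the baseline measures
fit = smf.logit("incident_mdd ~ phq9 + who5 + css", data=df).fit(disp=0)
print(np.exp(fit.params))              # odds ratios per unit increase
```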
Conclusion
This study demonstrates the importance of psychosocial factors, in addition to physiological variables, in the development of depressive symptoms and incident MDD in people with type 2 diabetes. Stressful life events, depressive symptoms and diabetes-related distress all play a significant role, which has implications for practice. A more holistic approach to care, which recognises the interplay of these psychosocial factors, may help to mitigate their impact on diabetes self-management as well as on MDD; thus, early screening for and treatment of symptoms are recommended.
Children and adolescents make up one third of the world's population. Neuropsychiatric disorders are the most prevalent cause of health burden in this age group and are estimated to affect 10 to 20 per cent of children worldwide. Little is known about the prevalence of child psychiatric morbidities and associated risk factors in countries like Pakistan.
Method:
This is a prospective cross-sectional study of 300 participants aged 6 to 18 years. Participants were recruited from a general psychiatric outpatient department in Lahore, Pakistan, over 12 weeks. Information was collected on presenting complaints, possible risk factors and mental health disorders using the Strengths and Difficulties Questionnaire (SDQ).
Results:
Our preliminary results show that the most frequently reported presenting complaints were fits and alterations in consciousness (54%), disturbed behaviour (14%) and depressive symptoms (8.3%). The most frequent mental health disorder on the SDQ was hyperactivity, followed by conduct problems. Emotional problems were least commonly reported. Male participants scored higher for conduct disorder. A history of epilepsy was reported by 35.3% of participants. Low socioeconomic status, low educational achievement and a nuclear family setup were associated with higher rates of illness.
Conclusion:
Our results show that fits and alterations in consciousness were the most frequently reported presenting complaints. Hyperactivity was the most prevalent disorder, followed by conduct problems. The majority of child and adolescent patients attend general practitioners or general psychiatry out-patient departments, where they are managed by practitioners who lack adequate training. Training these clinicians would help make the best use of limited resources.
Multimorbidity may impose an overwhelming burden on patients with psychosis and is affected by gender and age. Our aim is to study the independent role of familial liability to psychosis as a risk factor for multimorbidity.
Methods:
We performed the study within the framework of the Genetic Risk and Outcome of Psychosis (GROUP) project. Overall, we compared 1024 psychotic patients, 994 unaffected siblings and 566 controls on the prevalence of 125 lifetime diseases and 19 self-reported somatic complaints. Multimorbidity was defined as the presence of two or more complaints/diseases in the same individual. Generalized linear mixed models (GLMMs) were used to investigate the effects of gender, age (adolescent, young, older) and familial liability (patients, siblings, controls), and their interactions, on multimorbidity.
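The modelling of multimorbidity (two or more complaints/diseases) on familial liability, gender and age group could be sketched along the following lines. A GEE with exchangeable within-family correlation is used here as a simpler stand-in for the GLMM named above; the data are simulated and the formula terms are illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Simulated subjects nested in families, with group membership, gender and age band.
rng = np.random.default_rng(5)
n = 2584
df = pd.DataFrame({
    "family":   rng.integers(0, 800, n),
    "group":    rng.choice(["patient", "sibling", "control"], n),
    "female":   rng.integers(0, 2, n),
    "age_band": rng.choice(["adolescent", "young", "older"], n),
})
p = np.where(df.group == "patient", 0.45,
             np.where(df.group == "sibling", 0.30, 0.20))
df["multimorbid"] = (rng.random(n) < p).astype(int)   # >= 2 complaints/diseases

# Binomial GEE with exchangeable correlation within families
fit = smf.gee("multimorbid ~ C(group, Treatment('control')) + female + C(age_band)",
              groups="family", data=df, family=sm.families.Binomial(),
              cov_struct=sm.cov_struct.Exchangeable()).fit()
print(np.exp(fit.params))                             # odds ratios vs. controls
```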
Results:
Familial liability had a significant effect on multimorbidity of either complaints or diseases. Patients had a higher prevalence of multimorbidity of complaints compared to siblings (OR 2.20, 95% CI 1.79–2.69, P < 0.001) and to controls (3.05, 2.35–3.96, P < 0.001). For physical health multimorbidity, patients (OR 1.36, 95% CI 1.05–1.75, P = 0.018), but not siblings, had a significantly higher prevalence than controls. Similar findings were observed for multimorbidity of lifetime diseases, including psychiatric diseases. Significant results were observed for complaint and disease multimorbidity across gender and age groups.
Conclusion:
Multimorbidity is a common burden, significantly more prevalent in patients and their unaffected siblings. Familial liability to psychosis showed an independent effect on multimorbidity; gender and age are also important factors determining multimorbidity.
Background: SMA1, a rapidly progressing disease, results in muscle weakness, respiratory failure, hospitalization, and early death. This study highlights the value of onasemnogene abeparvovec (AVXS-101) gene-replacement therapy for SMA1. Methods: Twelve SMA1 patients received a one-time intravenous proposed therapeutic dose of AVXS-101 (CL-101; NCT02122952). Event-free survival (no death/permanent ventilation), pulmonary/nutritional interventions, swallow function, hospitalization rates, CHOP-INTEND, motor milestones, and safety were assessed (2-year follow-up). Results: By study end, all 12 patients survived event-free; 7 did not require non-invasive ventilation; 11 had stable/improved swallowing function (6 exclusively fed by mouth); 11 spoke. On average, patients experienced 1.4 (SD=0.41, range=0–4.8) respiratory hospitalizations/year. The mean proportion of time hospitalized was 4.4% (range=0–18.3%); mean unadjusted rate of hospitalization/year was 2.1 (range=0–7.6), with a mean hospital stay of 6.7 (range=3–12.1) days. CHOP-INTEND increased by 9.8 (SD=3.9) and 15.4 (SD=6.4) points at 1- and 3-months post-treatment. At long-term follow-up, 11 patients sat unassisted, 4 stood with assistance, and 2 walked. Adverse events included elevated serum aminotransferase levels, which were attenuated by prednisolone. Conclusions: AVXS-101 in CL-101 resulted in dramatic survival and motor function improvements. The reduced healthcare utilization in treated infants could decrease cost and alleviate patient, caregiver, and societal burden.
Background: Sotos syndrome is a genetic condition caused by NSD1 alterations, characterized by overgrowth, macrocephaly, dysmorphic features, and learning disability. Approximately half of children with Sotos syndrome develop seizures. We investigated the spectrum of seizure phenotypes in these patients. Methods: Patients were recruited from clinics and by referral from support groups. Those with a clinical or genetic diagnosis of Sotos syndrome and seizures were included. Phenotyping data were collected via structured clinical interview and medical chart review. Results: 25 patients with typical Sotos syndrome features were included. Of 14 tested patients, 64% (n=9) had NSD1 alterations. Most had developmental impairment (80%, n=20) and neuropsychiatric comorbidities (68%, n=17). Seizure onset was variable (2 months to 12 years). Febrile and absence seizures were the most frequent types (64%, n=16), followed by afebrile generalized tonic-clonic (40%, n=10) and atonic (24%, n=6) seizures. Most patients (60%, n=15) had multiple seizure types. The majority (72%, n=18) were controlled on a single antiepileptic, or none; 4% (n=1) remained refractory to antiepileptics. Conclusions: The seizure phenotype in Sotos syndrome most commonly involves febrile convulsions or absence seizures. Afebrile tonic-clonic or atonic seizures may also occur. Seizures are typically well controlled with antiepileptics. The rate of developmental impairment and neuropsychiatric comorbidities is high.
Production of maize in western Kenya is severely constrained by the parasitic weed striga. Although the productivity of maize can be improved through the adoption of improved varieties, adoption of such varieties remains low in the region, as the majority of smallholder farmers still grow unimproved open-pollinated varieties (landraces). The performance of two improved hybrid varieties was evaluated against six landraces in striga-infested soils in western Kenya. The varieties were planted in plots under natural striga infestation, supplemented with pot experiments under artificial infestation. Striga emergence was lower in landraces than in the hybrid varieties in both field and pot experiments. Similarly, the height of maize plants at harvest and grain yields were higher in the landraces than in the hybrids. After three continuous cropping seasons, the striga seedbank density increased two to seven times in all treatments. The seedbank increase was higher with the hybrids and two of the landraces, ‘Rachar’ and ‘Endere’. These results provide insight into the potential role landraces could play in efforts toward an integrated management approach for striga in smallholder cropping systems. They also highlight the need to develop hybrid maize lines with local adaptation to biotic constraints, specifically striga.