We present the serendipitous radio-continuum discovery of a likely Galactic supernova remnant (SNR), G305.4–2.2. This object displays remarkable circular symmetry, making it one of the most circular Galactic SNRs known. Nicknamed Teleios due to its symmetry, it was detected in the new Australian Square Kilometre Array Pathfinder (ASKAP) Evolutionary Map of the Universe (EMU) radio-continuum images with an angular size of 1 320$^{\prime\prime}$$\times$1 260$^{\prime\prime}$ and PA = 0$^\circ$. While there is a hint of possible H$\alpha$ and gamma-ray emission, Teleios is seen exclusively at radio-continuum frequencies. Interestingly, Teleios is not only almost perfectly symmetric, but it also has one of the lowest surface brightnesses among known Galactic SNRs and a steep spectral index of $\alpha$=–0.6$\pm$0.3. Our best estimates from H I studies and the $\Sigma$–D relation place Teleios as a type Ia SNR at a distance of either $\sim$2.2 kpc (near-side) or $\sim$7.7 kpc (far-side). This indicates two possible scenarios: either a young (under 1 000 yr) or a somewhat older SNR (over 10 000 yr). With a corresponding diameter of 14/48 pc, our evolutionary studies place Teleios in either the early or the late Sedov phase, depending on the distance/diameter estimate. However, our modelling also predicts X-ray emission, which we do not see in the present generation of eROSITA images. We also explored a type Iax explosion scenario that would point to a much closer distance of $\lt$1 kpc and a Teleios size of only $\sim$3.3 pc, which would be similar to the only known type Iax remnant, SN 1181. Unfortunately, all examined scenarios have their challenges, and no definitive supernova (SN) origin type can be established at this stage. Remarkably, Teleios has retained its symmetrical shape even as it has expanded to such a diameter, suggesting expansion into a rarefied and isotropic ambient medium. The low radio surface brightness and the lack of pronounced polarisation can be explained by a high level of ambient rotation measure (RM), with the largest RM being observed at Teleios’s centre.
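For readers who want to check the arithmetic, the quoted 14/48 pc diameters follow from the angular size and the two distance estimates via the small-angle relation. The short Python sketch below assumes the geometric mean of the two axes as the representative angular size, which is an assumption on our part, not a method stated above.

```python
import math

# Small-angle relation: physical diameter D = theta [rad] x distance.
# The angular sizes and distances are taken from the abstract; using the
# geometric mean of the two axes is an assumption about how the quoted
# 14/48 pc diameters were derived.
ARCSEC_TO_RAD = math.pi / (180.0 * 3600.0)

theta_arcsec = math.sqrt(1320.0 * 1260.0)      # ~1290" mean angular size
for d_kpc in (2.2, 7.7):                       # near-side and far-side estimates
    diameter_pc = theta_arcsec * ARCSEC_TO_RAD * d_kpc * 1000.0
    print(f"d = {d_kpc} kpc -> D ~ {diameter_pc:.0f} pc")
# Prints roughly 14 pc and 48 pc, matching the diameters quoted above.
```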
The Scientific Advisory Committee on Nutrition (SACN) provides independent advice on nutrition and related health matters to UK government organisations. In keeping with its commitment to openness and transparency, SACN follows a set ‘Framework’ to ensure a prescribed and consistent approach is taken in all its evidence evaluations. Following an update of the SACN Framework in 2020, which addressed some straightforward issues, the SACN Framework subgroup was established in 2021 to consider more complex matters that were not addressed in the 2020 update. The SACN Framework subgroup considered 4 main topics for update: 1) the different types of evidence evaluations produced by SACN, 2) interpretation of statistical data, 3) tools for assessment of study quality, 4) tools to assess the certainty of a body of evidence for exposure-outcome relationships. The Framework subgroup agreed clear definitions and processes for the different types of evidence evaluations produced by SACN and agreed that interpretation of p values should be informed by consideration of study size, power and methodological quality. The subgroup recommended use of the AMSTAR 2 tool for quality assessment of evidence from systematic reviews and use of the GRADE approach to assess the certainty of evidence. The updated Framework was published in January 2023. This was followed by publication of a further update in October 2024. As a ‘living’ document, the Framework will be subject to regular review by the Framework subgroup and continue to evolve in line with best practice.
Accurate redshift measurements are essential for studying the evolution of quasi-stellar objects (QSOs) and their role in cosmic structure formation. While spectroscopic redshifts provide high precision, they are impractical for the vast number of sources detected in large-scale surveys. Photometric redshifts, derived from broadband fluxes, offer an efficient alternative, particularly when combined with machine learning techniques. In this work, we develop and evaluate a neural network model for predicting the redshifts of QSOs in the Dark Energy Spectroscopic Instrument (DESI) Early Data Release spectroscopic catalogue, using photometry from DESI, the Wide-field Infrared Survey Explorer (WISE) and the Galaxy Evolution Explorer (GALEX). We compare the performance of the neural network model against a k-Nearest Neighbours approach, these being the most accurate and least resource-intensive of the methods trialled herein, optimising model parameters and assessing accuracy with standard statistical metrics. Our results show that incorporating ultraviolet photometry from GALEX improves photometric redshift estimates, reducing scatter and catastrophic outliers compared to models trained only on near-infrared and optical bands. The neural network achieves a correlation coefficient with spectroscopic redshift of 0.9187 with a normalised median absolute deviation of 0.197, representing a significant improvement over other methods. Our work combines DESI, WISE and GALEX measurements, providing robust predictions which address the difficulties in predicting the photometric redshifts of QSOs over a large redshift range.
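As a purely illustrative sketch of this kind of workflow (not the authors' model), here is a small scikit-learn neural-network regressor evaluated with a normalised-median-absolute-deviation metric. The feature count, network architecture, and random stand-in data are all assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor

def nmad(z_spec, z_phot):
    """Normalised median absolute deviation of (z_phot - z_spec)/(1 + z_spec),
    one common photo-z scatter convention."""
    dz = (z_phot - z_spec) / (1.0 + z_spec)
    return 1.4826 * np.median(np.abs(dz - np.median(dz)))

# Placeholder photometry: in a real catalogue the columns would be DESI grz,
# WISE W1/W2 and GALEX NUV/FUV magnitudes; random data is used here only to
# show the workflow end to end.
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 7))              # 7 hypothetical broadband magnitudes
z = rng.uniform(0.1, 3.5, size=5000)        # hypothetical spectroscopic redshifts

X_tr, X_te, z_tr, z_te = train_test_split(X, z, test_size=0.2, random_state=0)
scaler = StandardScaler().fit(X_tr)

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
model.fit(scaler.transform(X_tr), z_tr)
z_pred = model.predict(scaler.transform(X_te))

print("Pearson r:", np.corrcoef(z_te, z_pred)[0, 1])
print("sigma_NMAD:", nmad(z_te, z_pred))
```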
A wrist-hand exoskeleton designed to assist individuals with wrist and hand limitations is presented in this paper. The novel design is developed based on specific selection criteria and addresses all of the degrees of freedom (DOF). In the conceptual design phase, design concepts are created and assessed, then screened and scored to determine which concept is the most promising. Performance and possible restrictions are assessed using kinematic and dynamic analysis. The exoskeleton is prototyped in polylactic acid to verify structural integrity and fit. The control strategies investigated include manual control, master-slave control, and electroencephalography (EEG) dataset-based control. Manual control allows direct manipulation, whereas master-slave control uses sensors to map user motions. EEG dataset-based control interprets brain signals for hand opening and closing and drives the open-close motion of the exoskeleton. This study introduces a novel wrist-hand exoskeleton that improves usefulness, modularity, and mobility. The multiple control techniques provide versatility based on user requirements, while the 3D printing process enables personalization and design flexibility.
A novel entomopathogenic nematode (EPN) species, Steinernema tarimense n. sp., was isolated from soil samples collected in a Populus euphratica forest located in Yuli County within the Tarim Basin of Xinjiang, China. Integrated morphological and molecular analyses consistently place S. tarimense n. sp. within the ‘kushidai-clade’. The infective juvenile (IJ) of the new species is characterized by a body length of 674–1010 μm, an excretory pore located 53–80 μm from the anterior end, a nerve ring positioned 85–131 μm from the anterior end, a pharynx base situated 111–162 μm from the anterior end, a tail length of 41–56 μm, and the ratios D% = 42.0–66.6, E% = 116.2–184.4, and H% = 25.5–45.1. The first-generation male of the new species is characterized by a curved spicule length of 61–89 μm, a gubernaculum length of 41–58 μm, and the ratios D% = 36.8–66.2, SW% = 117.0–206.1, and GS% = 54.8–82.0. Additionally, the tail of the first-generation female is conoid with a minute mucron. Phylogenetic analyses of ITS, 28S, and mt12S sequences demonstrated that the three isolates of S. tarimense n. sp. are conspecific and form a sister clade to members of the ‘kushidai-clade’, including S. akhursti, S. anantnagense, S. kushidai, and S. populi. Notably, the IJs of the new species exhibited faster development at 25°C compared to other Steinernema species. This represents the first description of an indigenous EPN species from Xinjiang, suggesting its potential as a novel biocontrol agent against local pests.
In childhood, diets high in sodium and low in potassium contribute to raised blood pressure and cardiovascular disease later in life(1). For New Zealand (NZ) children, bread is a major source of dietary sodium, and fruit, vegetables, and milk are major dietary sources of potassium(2,3). However, iodised salt is mandatory in NZ bread, meaning that reducing the salt, and thus sodium, content could put children at risk of iodine deficiency(4). Our objective was to measure the sodium, potassium, and iodine intake, and the blood pressure (BP), of NZ school children aged 8-13 years. A cross-sectional survey was conducted in five primary schools in Auckland and Dunedin. Primary schools were recruited between July 2022 and February 2023 using purposive sampling. Seventy-five children (n=37 boys, 29 girls, and nine children who did not state their gender) took part. The most common ethnicity was NZ European and Other (n=54 or 72%) followed by Māori (indigenous inhabitants; n=9 or 12%) and Pasifika (n=5 or 7%). The main outcomes were 24-hour sodium and potassium intake, sodium to potassium molar ratio, 24-hour iodine intake, and BP. Sodium, potassium, and iodine intake were assessed using 24-hour urine samples, and BP was assessed using standard methods. Differences by gender were tested using two-sample t-tests and nonparametric Wilcoxon two-sample tests. The mean (SD) 24-hour sodium excretion, potassium excretion, and sodium to potassium molar ratio for children with complete samples (n=59) were 2,420 (1,025) mg, 1,567 (733) mg, and 3.0 (1.6), respectively. The median (25th, 75th percentile) urinary iodine excretion was 88 (61, 122) µg per 24 hours, and the mean (SD) systolic and diastolic blood pressure (n=74) were 105 (10) mmHg and 67 (9) mmHg, respectively. There was a significant difference between boys and girls for iodine (77 (43, 96) vs. 98 (72, 127) µg per 24 hours; p=0.02) but not for the other outcomes. In conclusion, children consumed more sodium and less potassium and iodine than World Health Organization recommendations(5). However, future research should confirm these findings in a nationally representative sample. Evidence-based, equitable interventions and policies with adequate monitoring should be considered to address potentially suboptimal sodium, potassium, and iodine intakes in New Zealand.
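For context on how a sodium-to-potassium molar ratio is formed from mass excretion values, the short calculation below plugs in the group means reported above; the reported mean ratio of 3.0 presumably averages each child's individual ratio, which need not equal the ratio of the group means.

```python
# Converting mass excretion (mg) to molar amounts (mmol) and forming the
# Na:K molar ratio. Atomic masses are standard values; the inputs are the
# group means from the abstract, used here only for illustration (each
# child's ratio would be computed the same way before averaging).
NA_MOLAR_MASS = 22.99   # g/mol
K_MOLAR_MASS = 39.10    # g/mol

sodium_mg, potassium_mg = 2420.0, 1567.0
na_mmol = sodium_mg / NA_MOLAR_MASS
k_mmol = potassium_mg / K_MOLAR_MASS
print(f"Na ≈ {na_mmol:.0f} mmol, K ≈ {k_mmol:.0f} mmol, Na:K ≈ {na_mmol / k_mmol:.2f}")
# ≈ 2.6 for the group means; the reported mean of 3.0 reflects per-child ratios.
```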
Food security constitutes a worldwide concern closely correlated with population growth. By 2050, the global population is expected to reach 9.3 billion(1). The rising population, along with increasing life expectancy and shifts toward Western dietary patterns, is expected to drive higher food demand and contribute to a rise in metabolic conditions(2). In this context, looking for alternative and sustainable food and protein sources is imperative. Pasture legumes including lucerne (Medicago sativa) and red clover (Trifolium pratense) are becoming popular as they can be used as an alternative protein and functional food source. Both crops play an important role in New Zealand’s agriculture. Their seeds can be used in human nutrition as alternative food and protein options; however, the presence of anti-nutritional factors (ANF) and their distinct taste make them less favourable for human consumption. Fermentation can be used as a possible strategy to mitigate these limitations. Lactobacillus fermentation was conducted using Lactobacillus plantarum, Lactobacillus acidophilus and Lactobacillus casei. Proximate composition and mineral content were determined following Association of Official Analytical Chemists (AOAC) methods. Total phenol content (TPC), total flavonoid content (TFC), antioxidant activity (AOA; 2,2-diphenyl-1-picrylhydrazyl (DPPH) and 2,2′-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid) (ABTS) assays) and ANF, including phytic acid and trypsin and chymotrypsin inhibition, were assessed using colourimetric techniques. For the enzyme inhibition assays, enzyme-substrate reactions were performed with sample extracts before measurement. All the experiments were replicated three times, and the results were expressed as mean ± SD. A factorial analysis of variance (ANOVA) was conducted (4 legume seed samples × 3 lactic acid bacteria (LAB) cultures) with Tukey’s post-hoc test for mean comparison at P < 0.05 using IBM SPSS Statistics 29.0. All the legume seeds demonstrated high nutritional content, with crude protein and fibre levels around 40% and 16%, respectively. The seeds were also rich in minerals, particularly magnesium, phosphorus, iron and zinc. In addition, fermentation led to an increase (P < 0.05) in TPC, TFC and antioxidant activity, while significantly reducing ANF. For instance, fermentation increased TPC by 18.8 to 47.1%, TFC by 9.6 to 34.5% and AOA as measured by both DPPH and ABTS. Lactobacillus fermentation has proven to be an effective processing technique to enhance the nutritional value of lucerne and red clover seeds. These findings support the potential of using fermentation to develop novel and sustainable protein sources, contributing to improved dietary quality and nutrition. Moreover, further work to study the effect of fermentation on the nutrient digestibility of lucerne and red clover seeds is warranted.
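The factorial analysis described above was run in IBM SPSS; purely as an illustration of the same 4 × 3 design, here is a minimal Python sketch of a factorial ANOVA with a Tukey post-hoc test. The sample labels and simulated responses are placeholders, not study data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Simulated stand-in for the 4 (seed sample) x 3 (LAB culture) design with
# triplicate measurements; 'tpc' is a placeholder response (e.g. total phenol content)
# and the seed labels are hypothetical.
rng = np.random.default_rng(1)
seeds = ["lucerne_A", "lucerne_B", "red_clover_A", "red_clover_B"]
cultures = ["L_plantarum", "L_acidophilus", "L_casei"]
rows = [
    {"seed": s, "culture": c, "tpc": rng.normal(30 + 2 * i + 3 * j, 1.0)}
    for i, s in enumerate(seeds)
    for j, c in enumerate(cultures)
    for _ in range(3)
]
df = pd.DataFrame(rows)

# Two-way factorial ANOVA with interaction.
model = ols("tpc ~ C(seed) * C(culture)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))

# Tukey HSD for pairwise comparisons among cultures at alpha = 0.05.
print(pairwise_tukeyhsd(df["tpc"], df["culture"], alpha=0.05))
```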
New Zealand and Australian governments rely heavily on voluntary industry initiatives to improve population nutrition, such as voluntary front-of-pack nutrition labelling (Health Star Rating [HSR]), industry-led food advertising standards, and optional food reformulation programmes. Research in both countries has shown that food companies vary considerably in their policies and practices on nutrition(1). We aimed to determine if a tailored nutrition support programme for food companies improved their nutrition policies and practices compared with control companies that were not offered the programme. REFORM was a 24-month, two-country, cluster-randomised controlled trial. A total of 132 major packaged food/drink manufacturers (n=96) and fast-food companies (n=36) were randomly assigned (2:1 ratio) to receive a 12-month tailored support programme or to the control group (no intervention). The intervention group was offered a programme designed and delivered by public health academics comprising regular meetings, tailored company reports, and recommendations and resources to improve product composition (e.g., reducing nutrients of concern through reformulation), nutrition labelling (e.g., adoption of HSR labels), marketing to children (reducing the exposure of children to unhealthy products and brands), and nutrition policy and corporate sustainability reporting. The primary outcome was the nutrient profile (measured using HSR) of company food and drink products at 24 months. Secondary outcomes were the nutrient content (energy, sodium, total sugar, and saturated fat) of company products, display of HSR labels on packaged products, company nutrition-related policies and commitments, and engagement with the intervention. Eighty-eight eligible intervention companies (9,235 products at baseline) were invited to participate, of whom 21 accepted and were enrolled in the REFORM programme (delivered between September 2021 and December 2022). Forty-four companies (3,551 products at baseline) were randomised to the control arm. At 24 months, the model-adjusted mean HSR of intervention company products was 2.58 compared to 2.68 for control companies, with no significant difference between groups (mean difference -0.10, 95% CI -0.40 to 0.21, p-value 0.53). A per-protocol analysis of intervention companies that enrolled in the programme compared to control companies with no major protocol violation also found no significant difference (2.93 vs 2.64, mean difference 0.29, 95% CI -0.13 to 0.72, p-value 0.18). We found no significant differences between the intervention and control groups in any secondary outcome, except for total sugar (g/100g), where the sugar content of intervention company products was higher than that of control companies (12.32 vs 6.98, mean difference 5.34, 95% CI 1.73 to 8.96, p-value 0.004). The per-protocol analysis for sugar did not show a significant difference (10.47 vs 7.44, mean difference 3.03, 95% CI -0.48 to 6.53, p-value 0.09). In conclusion, a 12-month tailored nutrition support programme for food companies did not improve the nutrient profile of company products.
Patients with posttraumatic stress disorder (PTSD) exhibit smaller regional brain volumes in commonly reported regions, including the amygdala and hippocampus, which are associated with fear and memory processing. In the current study, we conducted a voxel-based morphometry (VBM) meta-analysis using whole-brain statistical maps with neuroimaging data from the ENIGMA-PGC PTSD working group.
Methods
T1-weighted structural neuroimaging scans from 36 cohorts (PTSD n = 1309; controls n = 2198) were processed using a standardized VBM pipeline (the ENIGMA-VBM tool). We meta-analyzed the resulting statistical maps for voxel-wise differences in gray matter (GM) and white matter (WM) volumes between PTSD patients and controls, performed subgroup analyses considering the trauma exposure of the controls, and examined associations between regional brain volumes and clinical variables, including PTSD severity (CAPS-4/5, PCL-5) and depression severity (BDI-II, PHQ-9).
Results
PTSD patients exhibited smaller GM volumes across the frontal and temporal lobes and cerebellum, with the most significant effect in the left cerebellum (Hedges’ g = 0.22, corrected p = .001), and smaller cerebellar WM volume (peak Hedges’ g = 0.14, corrected p = .008). We observed similar regional differences when comparing patients to trauma-exposed controls, suggesting these structural abnormalities may be specific to PTSD. Regression analyses revealed that PTSD severity was negatively associated with GM volumes within the cerebellum (corrected p = .003), while depression severity was negatively associated with GM volumes within the cerebellum and superior frontal gyrus in patients (corrected p = .001). A brief worked example of the Hedges’ g calculation follows this abstract.
Conclusions
PTSD patients exhibited widespread regional differences in brain volumes, with greater regional deficits appearing to reflect more severe symptoms. Our findings add to the growing literature implicating the cerebellum in PTSD psychopathology.
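As noted above, here is a brief worked example of the Hedges' g effect size: Cohen's d computed with the pooled standard deviation and multiplied by a small-sample bias correction. The group summaries below are invented, not study values.

```python
import math

def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
    """Hedges' g: Cohen's d with pooled SD and a small-sample bias correction."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / pooled_sd
    correction = 1.0 - 3.0 / (4.0 * (n1 + n2) - 9.0)   # Hedges' correction factor
    return d * correction

# Made-up regional volumes (arbitrary units) for controls vs. PTSD patients,
# with the cohort sizes quoted above used only to show the correction's effect.
print(round(hedges_g(mean1=100.0, sd1=10.0, n1=2198,
                     mean2=98.0, sd2=10.0, n2=1309), 3))   # ~0.2, a small effect
```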
Rachiplusia nu, the first species to evolve resistance to Bt soybean expressing only the Cry1Ac protein, has caused outbreaks on soybean in Brazil, triggering a significant increase in insecticide use on the crop. This threatens one of the most important benefits of adopting Bt soybean cultivars – the reduction of chemical insecticide use. Therefore, this research studied the biology and parasitism capacity of Trichogramma pretiosum at 20, 25, and 30 ± 2 °C on R. nu eggs in order to evaluate the potential of releasing this egg parasitoid in soybean to manage R. nu. The parasitoid exhibited high biological performance on R. nu eggs, as shown by lifetime parasitism of 24.9, 46.4, and 34.4 R. nu eggs at 20, 25, and 30 °C, respectively, and 100% emergence in both the biology and parasitism capacity experiments. The sex ratio was statistically lower at 20 °C (0.4947), but at all studied temperatures the production of female descendants was equal to (sex ratio of 0.4947 at 20 °C) or greater than (0.6666 at 25 °C and 0.6524 at 30 °C) that of males. All other evaluated parameters were similar to previously recorded positive observations for T. pretiosum on other soybean pests, such as Anticarsia gemmatalis and Chrysodeixis includens, against which the parasitoid has already been commercially released in the field as a biocontrol option. Therefore, T. pretiosum might also be released in soybean as an egg parasitoid against R. nu, which needs to be confirmed in future field trials.
The impact of chronic pain and opioid use on cognitive decline and mild cognitive impairment (MCI) is unclear. We investigated these associations in early older adulthood, considering different definitions of chronic pain.
Methods:
Men in the Vietnam Era Twin Study of Aging (VETSA; n = 1,042) underwent cognitive testing and medical history interviews at average ages 56, 62, and 68. Chronic pain was defined using pain intensity and interference ratings from the SF-36 over 2 or 3 waves (categorized as mild versus moderate-to-severe). Opioid use was determined by self-reported medication use. Amnestic and non-amnestic MCI were assessed using the Jak-Bondi approach. Mixed models and Cox proportional hazards models were used to assess associations of pain and opioid use with cognitive decline and risk for MCI.
Results:
Moderate-to-severe, but not mild, chronic pain intensity (β = −.10) and interference (β = −.23) were associated with greater declines in executive function. Moderate-to-severe chronic pain intensity (HR = 1.75) and interference (HR = 3.31) were associated with a higher risk of non-amnestic MCI. Opioid use was associated with a faster decline in verbal fluency (β = −.18) and a higher risk of amnestic MCI (HR = 1.99). There were no significant interactions between chronic pain and opioid use on cognitive decline or MCI risk (all p-values > .05).
Discussion:
Moderate-to-severe chronic pain intensity and interference were related to executive function decline and a greater risk of non-amnestic MCI, while opioid use was related to verbal fluency decline and a greater risk of amnestic MCI. Lowering chronic pain severity while reducing opioid exposure may help clinicians mitigate later cognitive decline and dementia risk. A schematic example of the type of survival analysis behind the hazard ratios above follows this abstract.
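The sketch below illustrates the general form of a Cox proportional hazards analysis, not the study's own code: a fit (using the third-party lifelines package) on simulated data with binary stand-ins for moderate-to-severe pain and opioid use, and effect sizes that are assumptions made for illustration.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Simulated follow-up data: time to MCI (years), an event indicator, and two
# binary covariates standing in for moderate-to-severe pain and opioid use.
rng = np.random.default_rng(42)
n = 1042
pain = rng.integers(0, 2, n)
opioid = rng.integers(0, 2, n)
baseline_hazard = 0.05
hazard = baseline_hazard * np.exp(0.6 * pain + 0.3 * opioid)   # assumed effects
time_to_event = rng.exponential(1.0 / hazard)
censor_time = rng.uniform(5, 12, n)

df = pd.DataFrame({
    "duration": np.minimum(time_to_event, censor_time),
    "event": (time_to_event <= censor_time).astype(int),
    "pain": pain,
    "opioid": opioid,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="event")
cph.print_summary()   # the exp(coef) column gives hazard ratios comparable in form to the HRs above
```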
During the COVID-19 pandemic, the United States Centers for Disease Control and Prevention provided strategies, such as extended use and reuse, to preserve N95 filtering facepiece respirators (FFR). We aimed to assess the prevalence of N95 FFR contamination with SARS-CoV-2 among healthcare personnel (HCP) in the Emergency Department (ED).
Design:
Real-world, prospective, multicenter cohort study. N95 FFR contamination (primary outcome) was measured by real-time quantitative polymerase chain reaction. Multiple logistic regression was used to assess factors associated with contamination.
Setting:
Six academic medical centers.
Participants:
ED HCP who practiced N95 FFR reuse and extended use during the COVID-19 pandemic between April 2021 and July 2022.
Primary exposure:
Total number of COVID-19-positive patients treated.
Results:
Two hundred forty-five N95 FFRs were tested. Forty-four N95 FFRs (18.0%, 95% CI 13.4, 23.3) were contaminated with SARS-CoV-2 RNA. The number of patients seen with COVID-19 was associated with N95 FFR contamination (adjusted odds ratio, 2.3 [95% CI 1.5, 3.6]). Wearing either surgical masks or face shields over FFRs was not associated with FFR contamination, and FFR contamination prevalence remained high when these adjuncts were used (face shields: 25% [16/64]; surgical masks: 22% [23/107]).
Conclusions:
Exposure to patients with known COVID-19 was independently associated with N95 FFR contamination. Face shields and overlying surgical masks were not associated with N95 FFR contamination. N95 FFR reuse and extended use should be avoided due to the increased risk of contact exposure from contaminated FFRs.
Background: Predicting neurological recovery in patients with severe brain injury remains challenging. Continuous EEG monitoring can detect malignant patterns but is resource-intensive, and its role in long-term functional outcome prediction is unclear. This study evaluates the utility of parameterized short-segment EEG, acquired via EEG cap, in predicting neurological recovery. Methods: We analyzed short-segment high-density EEGs from 42 patients in the NET-ICU cohort with acute neurological injury. EEGs were pre-processed into standard clinical formats and parameterized using five visual EEG features associated with outcome prediction. Random Forest Classifier (RFC) models were trained and cross-validated to predict recovery of responsiveness (following 1-2 step commands during or after ICU admission) using (1) EEG features alone and (2) clinician prediction combined with EEG features. Results: EEG-based prediction outperformed clinician bedside assessment (AUC ROC: 0.80 vs. 0.67) under the RFC model. Combining clinician Glasgow Outcome Scale–Extended (GOSE) scores with EEG features improved overall predictive performance (AUC ROC: 0.91). Conclusions: Standardized EEG features obtained using EEG caps can improve the accuracy of neurological recovery predictions in patients with acute severe brain injury. This suggests that automated extraction of background brain signals has the potential to provide clinically meaningful prognostic information in critical care settings, enhancing accessibility and resource efficiency.
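As a rough sketch of the modelling step described above (the feature values, labels, and cross-validation settings are invented, and the NET-ICU pipeline is not reproduced here), a cross-validated Random Forest Classifier with ROC AUC scoring in scikit-learn looks like this:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, StratifiedKFold

# Stand-in data: 42 patients x 5 parameterized EEG features (hypothetical values),
# with a binary label for recovery of responsiveness.
rng = np.random.default_rng(7)
X = rng.normal(size=(42, 5))
y = rng.integers(0, 2, size=42)

rfc = RandomForestClassifier(n_estimators=200, random_state=7)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=7)
auc_scores = cross_val_score(rfc, X, y, cv=cv, scoring="roc_auc")
print("Cross-validated AUC ROC:", auc_scores.mean().round(2))
# Combining clinician prediction with EEG features would simply mean appending
# the clinician score as an additional column of X.
```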
Background: While efgartigimod usage is expected to reduce immunoglobulin (IG) utilization, evidence in clinical practice is limited. Methods: In this retrospective cohort study, patients with generalized myasthenia gravis (gMG) treated with efgartigimod for ≥1 year were identified from US medical/pharmacy claims data (April 2016-January 2024) and data from the My VYVGART Path patient support program (PSP). The number of IG courses during the 1 year before and after efgartigimod initiation (index date) was evaluated. Patients with ≥6 annual IG courses were considered chronic IG users. Myasthenia Gravis Activities of Daily Living (MG-ADL) scores before and after index were obtained from the PSP where available. Descriptive statistics were used without adjustment for covariates. Results: 167 patients with ≥1 IG claim before index were included. Prior to efgartigimod initiation, the majority of patients (62%) received IG chronically. During the 1 year after index, the number of IG courses fell by 95% (pre: 1531, post: 75). Overall, 89% (n=149/167) of patients fully discontinued IG usage. Mean (SD) best follow-up MG-ADL scores were significantly reduced after index (8.0 [4.1] to 2.8 [2.1], P<0.05, n=73/167, 44%). Conclusions: Based on US claims, IG utilization was substantially reduced among patients who continued efgartigimod for ≥1 year, with patients demonstrating a favorable MG-ADL response.
Background: TERT promoter mutation (TPM) is an established biomarker in meningiomas associated with aberrant TERT expression and reduced progression-free survival (PFS). TERT expression, however, has also been observed even in tumours with wildtype TERT promoters (TP-WT). This study aimed to examine TERT expression and clinical outcomes in meningiomas. Methods: TERT expression, TPM status, and TERT promoter methylation in a multi-institutional cohort of meningiomas (n=1241) were assessed through bulk RNA sequencing (n=604), Sanger sequencing of the promoter (n=1095), and methylation profiling (n=1218). A discovery cohort of 380 Toronto meningiomas was used, and 861 samples from external institutions were compiled as a validation cohort. Results: Both TPMs and TERT promoter methylation were associated with increased TERT expression and may represent independent mechanisms of TERT reactivation. TERT expression was detected in 30.4% of meningiomas that lacked TPMs, was associated with higher WHO grades, and corresponded to shorter PFS, independent of grade and even among TP-WT tumours. TERT expression was associated with a shorter PFS, equivalent to that of TERT-negative meningiomas one grade higher. Conclusions: Our findings highlight the prognostic significance of TERT expression in meningiomas, even in the absence of TPMs. Its presence may identify patients who may progress earlier and should be considered in risk stratification models.
Background: RAISE-XT (NCT04225871; Phase 3 study) showed clinically meaningful and sustained improvements in myasthenia gravis (MG)-specific outcomes with zilucoplan, a macrocyclic peptide complement component 5 inhibitor, in patients with acetylcholine receptor autoantibody-positive generalised MG. Methods: Adults self-administered once-daily subcutaneous zilucoplan 0.3mg/kg. This post hoc analysis assessed durability of response to Week 120 in MG-Activities of Daily Living (MG-ADL) and Quantitative MG (QMG) responders at Week 1 of two double-blind studies (NCT03315130, NCT04115293). Responder definitions: improvements of ≥3 points (MG-ADL) or ≥5 points (QMG) (interim data cut: 11 November 2023). Results: 93 patients were randomised to zilucoplan 0.3mg/kg in the double-blind studies; 43.0% (n=40/93) and 33.3% (n=31/93) were MG-ADL and QMG responders, respectively, at Week 1. Week 1 responders spent a median of 98.9% (5.8–99.2) and 99.0% (2.5–99.2) of time in response up to Week 120 for MG-ADL and QMG, respectively. Week 1 non-responders spent a median of 84.6% (0.0–98.3) and 66.7% (0.0–98.9) of time in response up to Week 120 for MG-ADL and QMG, with most responding later in the study. Conclusions: Among early (Week 1) zilucoplan responders, time in response remained high (99%) up to Week 120. These data demonstrate rapid and sustained efficacy with long-term zilucoplan treatment.
Background: Our prior six-year review (n=2165) revealed that 24% of patients undergoing posterior decompression surgeries (laminectomy or discectomy) sought emergency department (ED) care within three months post-surgery. We established an integrated Spine Assessment Clinic (SAC) to enhance patient outcomes and minimize unnecessary ED visits through pre-operative education, targeted quality improvement (QI) interventions, and early post-operative follow-up. Methods: We reviewed 13 months of posterior decompression data (n=205) following SAC implementation. These patients received individualized, comprehensive pre-operative education and follow-up phone calls within 7 days post-surgery. ED visits within 90 days post-surgery were tracked using provincial databases and compared to our pre-SAC implementation data. Results: Of 205 patients, 24 (11.6%) accounted for 34 ED visits within 90 days post-op, a significant reduction in the proportion of patients visiting the ED from 24% to 11.6%; overall ED utilization (counting multiple visits by the same patient) decreased from 42.1% to 16.6%. Early interventions, including wound monitoring, outpatient bloodwork, and prescription adjustments for pain management, helped mitigate ED visits. Patient satisfaction surveys (n=62) indicated 92% were “highly satisfied” and 100% would recommend the SAC. Conclusions: The SAC reduced ED visits after posterior decompression surgery by over 50% through pre-operative education, focused QI initiatives, and an individualized, proactive approach.
Background: Dural tears (DT) are relatively common spine surgery complications, increasing the risk of cerebrospinal fluid leaks, adverse events, and prolonged hospitalization. This study sought to identify DT predictors and compare postoperative outcomes, including adverse events, revision, emergency room (ER) care, and length of stay, between DT and non-DT cohorts. Methods: Retrospective analysis of elective spine surgery patients at a single tertiary centre. Variables included demographics, DT repair techniques, risk factors, post-operative adverse events, ER care within 30 days post-op, and revision. Binary logistic regression was used to analyze risk factors, while hierarchical logistic and linear regressions analyzed postoperative events. Results: 6.6% of patients experienced DTs, with patches used in 40% of repairs. Age was a risk factor for DT (Exp(B)=1.039, CI [1.016, 1.063]), while minimally invasive surgery (MIS) reduced risk (Exp(B)=0.521, CI [0.297, 0.912]). DTs were associated with increased rates of cardiac arrest (Exp(B)=3.966, CI [1.046, 15.033]), urinary retention (Exp(B)=2.408, CI [1.218, 4.759]), revision (Exp(B)=4.574, CI [1.941, 10.779]), and ER visits (Exp(B)=1.975, CI [1.020, 3.826]), as well as longer length of stay (B=3.42, p<0.001). Conclusions: MIS appears to be associated with decreased DT risk. DTs are associated with post-operative cardiac arrest, urinary retention, the need for revision surgery, and ER visits within 30 days post-op.
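For readers less familiar with the Exp(B) notation above: it is the odds ratio, i.e. the exponentiated coefficient from a logistic regression. The minimal example below uses simulated data and assumed effect sizes chosen only to be of the same order as the estimates reported here; it is not the study's analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated elective-spine cohort: age (years), a minimally invasive surgery (MIS)
# flag, and a dural tear (DT) outcome. The coefficients below are assumptions made
# only for illustration.
rng = np.random.default_rng(3)
n = 2000
age = rng.normal(60, 12, n)
mis = rng.integers(0, 2, n)
true_logit = -2.6 + 0.04 * (age - 60) - 0.65 * mis
dt = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(int)

df = pd.DataFrame({"dt": dt, "age": age, "mis": mis})
fit = smf.logit("dt ~ age + mis", data=df).fit(disp=0)

# SPSS's Exp(B) corresponds to the exponentiated coefficients:
print(np.exp(fit.params))        # odds ratios (per year of age; MIS vs. open)
print(np.exp(fit.conf_int()))    # 95% confidence intervals on the odds-ratio scale
```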