Background: The WHO grade of meningioma was updated in 2021 to include homozygous deletions of CDKN2A/B and TERT promoter mutations. Previous work, including the recent cIMPACT-NOW statement, has discussed the potential value of including chromosomal copy number alterations (CNAs) to help refine the current grading system. Methods: Chromosomal copy number profiles were inferred from 1964 meningiomas using DNA methylation. Regularized Cox regression was used to identify CNAs independently associated with post-surgical and post-RT PFS. Outcomes were stratified by WHO grade and novel CNAs to assess their potential value in WHO criteria. Results: Patients with WHO grade 1 tumours and chromosome 1p loss had similar outcomes to those with WHO grade 2 tumours (median PFS 5.83 [95% CI 4.36-Inf] vs 4.48 [4.09-5.18] years). Those with chromosome 1p loss and 1q gain had similar outcomes to WHO grade 3 cases regardless of initial grade (median PFS 2.23 [1.28-Inf] years for WHO grade 1, 1.90 [1.23-2.25] years for WHO grade 2, compared to 2.27 [1.68-3.05] years in WHO grade 3 cases overall). Conclusions: We advocate for the addition of chromosome 1p loss as a criterion for CNS WHO grade 2 meningioma and of 1q gain as a criterion for CNS WHO grade 3.
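As a rough illustration of the kind of analysis described in the Methods, the sketch below fits a penalized Cox model to binary CNA indicators and then stratifies PFS within WHO grade 1 tumours by 1p status. The file name, column names and penalty settings are assumptions for illustration, not the authors' actual pipeline.

```python
# Hedged sketch: regularized Cox regression of PFS on CNA indicators (lifelines).
# File and column names (pfs_years, progressed, chr1p_loss, who_grade, ...) are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

df = pd.read_csv("meningioma_cna_pfs.csv")          # hypothetical: one row per tumour
cna_cols = [c for c in df.columns if c.startswith("chr")]   # binary 0/1 CNA indicators

# Elastic-net-penalized Cox model over all CNA indicators
cph = CoxPHFitter(penalizer=0.1, l1_ratio=0.5)
cph.fit(df[cna_cols + ["pfs_years", "progressed"]],
        duration_col="pfs_years", event_col="progressed")
print(cph.summary[["coef", "exp(coef)", "p"]].sort_values("p"))

# Stratify WHO grade 1 tumours by presence of chromosome 1p loss
kmf = KaplanMeierFitter()
grade1 = df[df.who_grade == 1]
for label, sub in grade1.groupby("chr1p_loss"):
    kmf.fit(sub.pfs_years, sub.progressed, label=f"1p loss = {label}")
    print(label, kmf.median_survival_time_)
```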
Background: Meningiomas exhibit considerable heterogeneity. We previously identified four distinct molecular groups (immunogenic, NF2-wildtype, hypermetabolic, proliferative) which address much of this heterogeneity. Despite their utility, the stochasticity of clustering methods and the requirement for multi-omics data limit the potential for classifying cases in the clinical setting. Methods: Using an international cohort of 1698 meningiomas, we constructed and validated a machine learning-based molecular classifier using DNA methylation alone. Original and newly predicted molecular groups were compared using DNA methylation, RNA sequencing, whole exome sequencing, and clinical outcomes. Results: Group-specific outcomes in the validation cohort were nearly identical to those originally described, with median PFS of 7.4 (4.9-Inf) years in hypermetabolic tumors and 2.5 (2.3-5.3) years in proliferative tumors (not reached in the other groups). Predicted NF2-wildtype cases had no NF2 mutations, and 51.4% had other mutations previously described in this group. RNA pathway analysis revealed upregulation of immune-related pathways in the immunogenic group, metabolic pathways in the hypermetabolic group, and cell-cycle programs in the proliferative group. Bulk deconvolution similarly revealed enrichment of macrophages in immunogenic tumours and neoplastic cells in hypermetabolic/proliferative tumours. Conclusions: Our DNA methylation-based classifier faithfully recapitulates the biology and outcomes of the original molecular groups, allowing for their widespread clinical implementation.
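A minimal sketch of a methylation-only molecular-group classifier is shown below. The random-forest model, file names and train/validation split are illustrative assumptions and may differ from the classifier actually built and validated in the study.

```python
# Hedged sketch: supervised classifier mapping methylation beta values to molecular groups.
# Input arrays are placeholders, not the study's cohort.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

betas = np.load("methylation_betas.npy")    # hypothetical (n_samples, n_probes) beta-value matrix
groups = np.load("molecular_groups.npy")    # labels: immunogenic, NF2-wildtype, hypermetabolic, proliferative

X_train, X_test, y_train, y_test = train_test_split(
    betas, groups, stratify=groups, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=500, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```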
Background: Nipocalimab is a human IgG1 monoclonal antibody targeting FcRn that selectively reduces IgG levels without impacting antigen presentation or T- and B-cell functions. This study describes the effect of nipocalimab on vaccine response. Methods: This open-label, parallel, interventional study randomized participants 1:1 to receive intravenous nipocalimab 30 mg/kg at Week 0 and 15 mg/kg at Weeks 2 and 4 (active) or no drug (control). On Day 3, participants received Tdap and PPSV®23 vaccinations and were followed through Week 16. Results: Twenty-nine participants completed the study and are included (active, n=15; control, n=14). The proportion of participants with a positive anti-tetanus IgG response was comparable between groups at Week 2 and Week 16, but lower at Week 4 (nipocalimab 3/15 [20%] vs control 7/14 [50%]; P=0.089). All maintained anti-tetanus IgG above the protective threshold (0.16 IU/mL) through Week 16. While anti-pneumococcal-capsular-polysaccharide (PCP) IgG levels were lower during nipocalimab treatment, the percent increase from baseline at Week 2 and Week 16 was comparable between groups. Post-vaccination, anti-PCP IgG remained above 50 mg/L and showed a 2-fold increase from baseline throughout the study in both groups. Nipocalimab co-administration with vaccines was safe and well tolerated. Conclusions: These findings suggest that nipocalimab does not impact the development of an adequate IgG response to T-cell–dependent/independent vaccines and that nipocalimab-treated patients can follow recommended vaccination schedules.
We conducted an analysis of a nationwide survey of US physician offices between 2016 and 2019 and calculated annualized prevalence rates of urinary tract infections (UTIs). During the 3-year study period, UTI was the most common infection in US physician offices, accounting for approximately 10 million annualized encounters.
Political scientists regularly rely on a selection-on-observables assumption to identify causal effects of interest. Once a causal effect has been identified in this way, a wide variety of estimators can, in principle, be used to consistently estimate the effect of interest. While these estimators are all justified by appeals to the same causal identification assumptions, they often differ greatly in how they make use of the data at hand. For instance, methods based on regression rely on an explicit model of the outcome variable but do not explicitly model the treatment assignment process, whereas methods based on propensity scores explicitly model the treatment assignment process but do not explicitly model the outcome variable. Understanding the tradeoffs between estimation methods is complicated by these seemingly fundamental differences. In this paper we seek to rectify this problem. We do so by clarifying how most estimators of causal effects that are justified by an appeal to a selection-on-observables assumption are all special cases of a general weighting estimator. We then explain how this commonality provides for diagnostics that allow for meaningful comparisons across estimation methods—even when the methods are seemingly very different. We illustrate these ideas with two applied examples.
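To make the idea of a common weighting representation concrete, the sketch below computes a simple inverse-probability-weighting estimate of the average treatment effect under selection on observables and returns the implied unit-level weights, the kind of quantity weighting-based diagnostics would compare across estimators. The simulated data and function are illustrative, not the paper's own estimator or examples.

```python
# Hedged sketch: inverse-probability weighting under selection on observables.
import numpy as np
import statsmodels.api as sm

def ipw_ate(y, d, X):
    """ATE via weights from a logistic propensity-score model; also returns the weights."""
    ps = sm.Logit(d, sm.add_constant(X)).fit(disp=0).predict(sm.add_constant(X))
    w = d / ps + (1 - d) / (1 - ps)                     # implied weight for each unit
    treated = np.sum(w * d * y) / np.sum(w * d)
    control = np.sum(w * (1 - d) * y) / np.sum(w * (1 - d))
    return treated - control, w                          # weights support cross-method diagnostics

# Simulated example with a true treatment effect of 2
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))
d = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))
y = 2 * d + X @ np.array([1.0, -0.5]) + rng.normal(size=1000)
ate, weights = ipw_ate(y, d, X)
print(round(ate, 2))
```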
Psychiatric symptoms are typically highly inter-correlated at the group level. Collectively, these correlations define the architecture of psychopathology – informing taxonomic and mechanistic models in psychiatry. However, to date, it remains unclear if this architecture differs between etiologically distinct subgroups, despite the core relevance of this understanding for personalized medicine. Here, we introduce a new analytic pipeline to probe group differences in the psychopathology architecture – demonstrated through the comparison of two distinct neurogenetic disorders.
Methods
We use a large questionnaire battery in 300 individuals aged 5–25 years (n = 102 XXY/KS, n = 64 XYY, n = 134 age-matched XY) to characterize the structure of correlations among 53 diverse measures of psychopathology in XXY/KS and XYY syndrome – enabling us to compare the effects of X- versus Y-chromosome dosage on the architecture of psychopathology at multiple, distinctly informative levels.
Results
Behavior correlation matrices describe the architecture of psychopathology in each syndrome. A comparison of matrix rows reveals that social problems and externalizing symptoms are most differentially coupled to other aspects of psychopathology in XXY/KS versus XYY. Clustering the difference between matrices captures coordinated group differences in pairwise coupling between measures of psychopathology: XXY/KS shows greater coherence among externalizing, internalizing, and autism-related features, while XYY syndrome shows greater coherence in dissociality and early neurodevelopmental impairment.
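A minimal sketch of this type of pipeline, assuming hypothetical per-group measure files, is given below: per-group correlation matrices, their difference, a row-wise divergence summary, and hierarchical clustering of the difference matrix. The actual analytic pipeline in the shared code may differ in its statistics and null models.

```python
# Hedged sketch: comparing psychopathology architecture between two groups.
# File names and measure columns are placeholders for the 53 questionnaire measures.
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster

xxy = pd.read_csv("xxy_measures.csv")    # hypothetical: rows = participants, cols = measures
xyy = pd.read_csv("xyy_measures.csv")

R_xxy = xxy.corr()                        # architecture of psychopathology in each group
R_xyy = xyy.corr()
diff = R_xxy - R_xyy                      # group difference in pairwise coupling

# Which measures are most differentially coupled to all other measures?
row_divergence = diff.abs().mean(axis=1).sort_values(ascending=False)
print(row_divergence.head())

# Cluster the difference matrix to find coordinated sets of coupling differences
Z = linkage(diff.values, method="ward")
clusters = fcluster(Z, t=4, criterion="maxclust")
print(dict(zip(diff.index, clusters)))
```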
Conclusions
These methods offer new insights into X- and Y-chromosome dosage effects on behavior, and our shared code can now be applied to other clinical groups of interest – helping to hone mechanistic models and inform the tailoring of care.
Improving neonatal piglet survival is a key driver for improving pig production and enhancing animal welfare. Gestational diabetes is a risk factor for neonatal morbidities in humans, such as hypoglycaemia and respiratory distress(1). There is limited knowledge on the association of gestational diabetes with neonatal survival in commercial pigs. An early study suggested that the diabetic condition of late-gestating sows was positively correlated with first-week newborn piglet mortality(2). Genetic selection in recent decades for heavier birth weight may have increased the prevalence or severity of gestational diabetes in pigs, considering the positive correlation between gestational diabetes and birth weight. We hypothesised that the diabetic condition of late-gestating sows positively correlates with the neonatal piglet mortality rate in sows with modern genetics. Mixed-parity sows (1.5 ± 1.6 parity for mean ± standard deviation (SD); Large White × Landrace) from a commercial piggery in Australia were randomly selected to undergo an oral glucose tolerance test (OGTT) during two seasons (118 sows in winter and 118 sows in summer). On day 109 of gestation, sows were fed 3.0 g dextrose per kg of metabolic body weight after fasting overnight. Tail blood glucose concentrations were measured using a glucometer (Accu-Chek®, Roche Diabetes Care Australia Pty) at −10, 0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 105 and 120 minutes relative to dextrose feeding. The glucose increment (2.5 ± 1.29 mM for mean ± SD) during the OGTT was calculated by subtracting the fasting blood glucose concentration from the maximum concentration. The 24-hour piglet mortality rate (5% ± 8.8% for mean ± SD) was calculated on a litter basis as the ratio between the number of piglets that died during the first 24 hours and the total number born alive. The effects of sow glucose increment, season (winter vs summer), glucose increment × season, number of piglets born alive, and sow parity on the 24-hour piglet mortality rate were analysed using a generalised linear model (SPSS Version 27, IBM SPSS Statistics, Armonk). Results showed that the 24-hour piglet mortality rate was numerically higher in winter than in summer, although the difference was not significant (5.7% vs 4.2%, p = 0.41). The glucose increment of gestating sows was positively correlated with the 24-hour piglet mortality rate during winter but not summer, as evidenced by a trend towards an interaction between glucose increment and season (p = 0.059). The regression coefficient suggested that every extra unit (mM) of glucose increment during the OGTT corresponded to a 1.4% increase in the 24-hour piglet mortality rate in winter. In conclusion, the diabetic condition of late-gestating sows is a risk factor for neonatal piglet mortality in winter. Developing nutritional strategies to mitigate the diabetic condition of late-gestating sows may benefit neonatal piglet survival.
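For illustration, the sketch below reproduces the two derived variables (glucose increment and 24-hour mortality rate) and a generalised linear model with the stated terms. The original analysis was run in SPSS, so the file, column names and model family here are assumptions rather than the authors' exact specification.

```python
# Hedged sketch of the derived variables and GLM described above; column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

sows = pd.read_csv("sow_ogtt.csv")        # hypothetical: one row per sow/litter

# Glucose increment: maximum OGTT concentration minus fasting concentration
glucose_cols = [c for c in sows.columns if c.startswith("glucose_t")]
sows["increment"] = sows[glucose_cols].max(axis=1) - sows["glucose_fasting"]

# 24-hour piglet mortality rate on a litter basis
sows["mort24"] = sows["deaths_24h"] / sows["born_alive"]

# GLM with glucose increment, season, their interaction, litter size and parity
model = smf.glm("mort24 ~ increment * season + born_alive + parity", data=sows).fit()
print(model.summary())
```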
Despite a global decline in suicide rates, the USA has witnessed a concerning rise in suicide mortality over the past two decades.
Aims
This study aims to elucidate the changing patterns of suicide mortality in the USA from 1999 to 2019, with a particular focus on gender and racial differences.
Method
We utilised national mortality data for causes of suicide (X60–X84, Y87.0) from the Centers for Disease Control and Prevention for 1999–2019. An age–period–cohort analysis was conducted to explore the effects of age, period and birth cohort on suicide mortality by gender and race.
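For orientation, the sketch below shows one common way to set up an age-period-cohort model, a Poisson GLM on death counts with a population offset. The study's actual APC method and identification constraints may differ, and the input table here is a placeholder.

```python
# Hedged sketch: age-period-cohort Poisson model for suicide mortality rates.
# Input file and columns (deaths, population, age, period per stratum) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rates = pd.read_csv("suicide_age_period.csv")
rates["cohort"] = rates["period"] - rates["age"]

# Because cohort = period - age, the three effects are not separately identified without
# an additional constraint; treatment coding of the categorical terms is only one choice.
apc = smf.glm("deaths ~ C(age) + C(period) + C(cohort)",
              data=rates,
              family=sm.families.Poisson(),
              offset=np.log(rates["population"])).fit()
print(apc.summary())
```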
Results
Between 1999 and 2019, the suicide rate and the number of suicides in the USA increased by 33% and 62%, respectively. We discerned an emerging peak of suicide among young adult populations even as increases affected nearly all groups. Females have shown increasing period risk, which has exceeded that of males since 2011. Their cohort risk, which slowly increased and exceeded that of males in post-1959 cohorts, exhibited a steep J-shaped pattern, especially among those born after 1977. Although Americans of all races have experienced increased period risk since 2011, it was highest among American Indians and Alaska Natives by the end of the 20-year span. With mortality risk increasing rapidly in all post-1959 cohorts, the risk showed an obvious cliff-shaped pattern among the Asian/Pacific Islander population born after 1989.
Conclusions
The shifting burden of suicide mortality towards younger populations, transcending gender and racial boundaries, underscores the need for the implementation of tailored public health strategies.
Phylogenetic analysis demonstrates that Kuamaia lata, a helmetiid euarthropod from the lower Cambrian (Series 2, Stage 3) Chengjiang Konservat-Lagerstätte, nests robustly within Artiopoda, the euarthropod clade including trilobitomorphs. Microtomography of new specimens of K. lata reveals details of morphology, notably a six-segmented head and raptorial frontal appendages, the latter contrasting with filiform antennae considered to be a diagnostic character of Artiopoda. Phylogenetic analyses demonstrate that a raptorial frontal appendage is a symplesiomorphy for upper stem-group euarthropods, retained across a swathe of tree space, but evolved secondarily in K. lata from an antenna within Artiopoda. The phylogenetic position of K. lata adds support to a six-segmented head being an ancestral state for upper stem- and crown-group euarthropods.
Monitored anesthesia care (MAC) has been increasingly utilized for diagnostic and therapeutic procedures across various non-surgical and surgical settings in the last several decades [1]. Demand is also steadily increasing from many different medical specialties: cardiology for cardioversion, defibrillation, transesophageal echocardiography, pacemaker/defibrillator implantation or removal, cardiac catheterization, and other cardiac monitoring devices; gastroenterology for endoscopic examinations, potential biopsies, and other therapeutic interventions; urology for cystoscopy, etc. [1, 2]. MAC has also been gradually applied to more complex procedures, including endovascular aortic stent placement, transcatheter aortic valve replacement, and even sophisticated procedures like MitraClip. The aims of MAC are to enhance patient comfort and cooperation and to maintain airway patency and hemodynamic stability, thus facilitating efficient and safe completion of the scheduled procedures.
Intravenous pharmacologic sedation is often chosen for surgical and nonsurgical procedures and is administered by an anesthesiologist, nurse anesthetist, or other trained professional. Sedation is described as a continuum, encompassing minimal, moderate, and deep sedation that can be categorized according to the patient’s level of consciousness (Figure 12.1). This categorization is subjective and the different levels of sedation can be achieved through changes in medication choice and dosage. There exist overlapping zones between levels of sedation. In clinical practice, deep sedation and general anesthesia share many of the same features in terms of patient awareness, lack of responsiveness, and risk of airway compromise.
Certain patient populations requiring sedation for procedures present the clinician with challenging decisions regarding their care and management. Some underlying medical disease states, airway abnormalities, or extremes of age require cautious pre-procedural assessment and planning when sedation is required to minimize the incidence of morbidity or mortality. It should be noted that some of these higher-risk patients should only be sedated by trained anesthesia providers. The following commonly encountered conditions are considered high risk and are associated with a higher rate of complications: old age, obesity, chronic obstructive pulmonary disease, coronary artery disease, and chronic renal failure. This chapter discusses important features of these higher-risk patients and practice management when sedation is required. In all cases, appropriate monitoring, prudent selection and dosing of sedative agents, and careful assessment are important to ensure the best outcome for these higher-risk patients.
Perioperative anesthesia care for patients undergoing ophthalmologic procedures is unique and sometimes challenging. Many ophthalmologic procedures can be done with sedation/monitored anesthesia care (MAC) [1]. Intravenous sedatives combined with topical/local/regional anesthesia during eye surgery can alleviate patients’ pain, fear, and anxiety, thus improving outcomes [2]. In this chapter we review current practices and trends in anesthesia services with respect to MAC for ophthalmologic procedures with topical/local/regional anesthesia [1, 2, 3]. The nerve blocks performed for eye surgery determine, to some extent, the techniques and the level of sedation required from the anesthesia service. The traditions of surgical teams and hospitals also affect the choice of sedation technique. The evolution of surgical techniques appears to be driving the increasing use of sedation in eye surgical procedures. Anesthesia care options are also based on the surgeon’s skill, the anesthesia provider’s comfort level, and the patient’s expectations and demands. Regardless, patient safety and perioperative care quality are the key determinants [1, 3, 4].
A distributed cooperative guidance law without numerical singularities is proposed for the simultaneous attack of a stationary target by multiple vehicles with field-of-view (FOV) constraints. Firstly, the vehicle engagement motion model is transformed into a multi-agent model. Then, based on a state-constrained consensus protocol, a coordination control law with FOV constraints is proposed. Finally, the cooperative guidance law is improved to make it more suitable for practical application. Numerical simulations verify the effectiveness and robustness of the proposed guidance law in the presence of acceleration saturation, communication delays and measurement noise.
Objectives: Activities that require active thinking, such as occupations, may influence cognitive function and its change over time. Associations between retirement and dementia risk have been reported; however, the role of retirement age in these associations is unclear. We assessed associations of occupation and retirement age with cognitive decline in the US community-based Atherosclerosis Risk in Communities (ARIC) cohort.
Methods: We included 14,090 ARIC participants, followed for changes in cognition over up to 21 years. Information on current or most recent occupation was collected at ARIC baseline (1987–1989; participants aged 45–64 years) and categorized according to the 1980 US Census protocols and the Nam-Powers-Boyd occupational status score. Follow-up data on retirement were collected during 1999–2007 and classified as retired versus not retired at age 70. Trajectories of global cognitive factor scores from ARIC visit 2 (1990–1992) to visit 5 (2011–2013) were presented, and associations with occupation and age at retirement were studied using generalized estimating equation models, stratified by race and sex, and adjusted for demographics and comorbidities.
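For readers unfamiliar with this model class, the sketch below shows a generalized estimating equation fit for repeated cognitive scores clustered within participants. The ARIC variable names, covariate set and working correlation structure are illustrative assumptions, not the study's exact specification.

```python
# Hedged sketch: GEE model for longitudinal cognitive factor scores.
# Long-format file and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

visits = pd.read_csv("aric_cognition_long.csv")   # hypothetical: one row per participant-visit

gee = smf.gee(
    "cog_factor_score ~ years_since_baseline * occupation_status + age + education + comorbidities",
    groups="participant_id",
    data=visits,
    cov_struct=sm.cov_struct.Exchangeable()).fit()
print(gee.summary())
```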
Results: Mean age (SD) at first cognitive assessment was 57.0 (5.72) years. Higher occupational status and white-collar occupations were significantly associated with higher cognitive function at baseline. Occupation was associated with cognitive decline over 21 years only in women, and the direction of the effect on cognitive function differed between black and white women: in white women, the decline in cognitive function was greater in homemakers and low-status occupations, whereas in black women, less decline was found in homemakers and in low (compared to high) occupational status. Interestingly, retirement on or before age 70 was associated with less 21-year cognitive decline in all race-sex strata, except for black women.
Conclusions: Associations between occupation, retirement age and cognitive function substantially differed by race and sex. Further research should explore reasons for the observed associations and race-sex differences.
In this paper, a brand-new adaptive fault-tolerant non-affine integrated guidance and control method based on reinforcement learning is proposed for a class of skid-to-turn (STT) missiles. Firstly, considering the non-affine characteristics of the missile, a new non-affine integrated guidance and control (NAIGC) design model is constructed. For the NAIGC system, an adaptive expansion integral system is introduced to address the control difficulty brought on by the non-affine form of the control signal. Subsequently, the hyperbolic tangent function and adaptive boundary estimation are utilised to lessen the jitter due to disturbances in the control system and the deviation caused by actuator failures, while taking into account the uncertainty in the NAIGC system. Importantly, an actor-critic structure is introduced into the control framework, where the actor network handles the multiple uncertainties of the subsystem and generates the control input based on the critic's evaluation. Finally, the stability of the NAIGC closed-loop system is demonstrated using Lyapunov theory, and the validity and superiority of the method are verified by numerical simulations.
Aircraft ground taxiing contributes significantly to carbon emissions and engine wear. The electric towing tractor (ETT) addresses these issues by towing the aircraft to the runway end, thereby minimising ground taxiing. As the complexity of ETT towing operations increases, both the towing distance and time increase significantly, and the original method for estimating the number of ETTs is no longer applicable. Given the substantial acquisition cost of ETTs and the need to reduce waste while ensuring operational efficiency, this paper introduces for the first time an ETT quantity estimation model that combines simulation and vehicle scheduling models. The simulation model captures the impact of ETTs on apron operations, taxiing on taxiways, and takeoffs and landings on runways. Key timing points for ETT usage by each aircraft are identified through simulation, forming the basis for determining the minimum number of vehicles required for airport operations using a hard-time-window vehicle scheduling model. To ensure the validity of the model, simulation model verification is conducted. Furthermore, the study explores the influence of vehicle speed and airport scale on the required number of ETTs. The results demonstrate that the simulation model effectively represents real airport operations. ETT speed, airport runway and taxiway configurations, takeoff and landing frequencies, and imbalances during peak periods all impact the required quantity of ETTs. A comprehensive approach considering these factors is necessary to determine the optimal number of ETTs.
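As a simplified illustration of fleet sizing under hard time windows, the sketch below greedily reuses the earliest-free vehicle across towing tasks with fixed start and end times. It ignores repositioning time between tasks and the other operational detail in the paper's scheduling model, so the function and task data are purely illustrative.

```python
# Hedged sketch: minimum fleet size when every towing task has a fixed (hard) time window.
import heapq

def min_fleet(tasks):
    """tasks: list of (start, end) towing jobs in minutes; returns minimum number of ETTs."""
    finish_times = []                              # min-heap of times at which ETTs become free
    for start, end in sorted(tasks):
        if finish_times and finish_times[0] <= start:
            heapq.heapreplace(finish_times, end)   # reuse the earliest-free vehicle
        else:
            heapq.heappush(finish_times, end)      # no free vehicle: add one to the fleet
    return len(finish_times)

print(min_fleet([(0, 30), (10, 40), (35, 60), (50, 80)]))   # -> 2
```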
The Ediacaran Subcommission of the International Commission on Stratigraphy is diligently working toward the goal of subdividing the Ediacaran Period into precise and useful chronostratigraphic units. As emphasized by Xiao and colleagues in 2016, one of the most effective tools in this endeavor will be the use of index fossils. Our special issue serves as a presentation of ongoing research efforts aimed at advancing this task and contains explorations into taxonomy, taphonomy, and the diversity of life during the Ediacaran Period.