Antibiotic overuse leads to bacterial resistance. The biomarker procalcitonin rises in bacterial pneumonia and remains normal in viral respiratory tract infections, so it can help distinguish between these etiologies and thus guide antibiotic use. We aimed to quantify the effect of procalcitonin use on clinical decision-making.
Design:
A retrospective study spanning one year at a tertiary care center, in which 348 patients hospitalized with aspiration pneumonia and 824 with non-aspiration pneumonia were evaluated with regard to procalcitonin use, length of stay (LOS), and antibiotic prescribing practices. Descriptive statistics and univariate analyses were applied to the ensemble data. Subsets of cases were manually reviewed and analyzed with descriptive statistics. P < 0.05 indicated statistical significance.
Results:
Procalcitonin was checked in 21% of both the aspiration and non-aspiration pneumonia cases. In the ensemble analyses, procalcitonin was more likely to be checked during prolonged hospitalizations for aspiration pneumonia. The LOS was statistically the same regardless of procalcitonin results (elevated or normal) in both the aspiration and non-aspiration pneumonia cohorts. The overall use of antibiotics was not affected by the procalcitonin results. After excluding two extreme outliers, the per-person antibiotic cost was not affected by the procalcitonin results. Detailed chart reviews of 33 cases revealed that, for the vast majority, clinicians did not use the procalcitonin results to guide the duration of antibiotic use.
Conclusions:
Despite its promise as a biomarker for antibiotic stewardship, procalcitonin appeared not to be used by clinicians as a decision-making tool in the management of pneumonia.
Before COVID-19, breast cancer patients in the UK typically received 15 radiotherapy (RT) fractions over three weeks. During the pandemic, the adoption of a 5-fraction treatment prescription and more advanced treatment techniques, such as surface-guided RT, changed the duration and number of hospital visits for patients accessing treatment. This work sought to understand how breast cancer patients’ time in the RT department changed between 2018 and 2023.
Methods:
Appointments for CT simulation, mould room, and RT, from January 2018 to December 2023, were extracted from the Mosaiq® Oncology Management System. Appointments lasting between 5 minutes and 5 hours were analysed. Total visit time was calculated from check-in to completion on the quality checklist.
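The visit-time derivation described above can be sketched in a few lines. The record fields and timestamps below are illustrative placeholders, not the actual Mosaiq® export schema:

```python
from datetime import datetime

# Hypothetical appointment records; "check_in" and "completed" stand in
# for the check-in and quality-checklist completion timestamps.
records = [
    {"check_in": "2023-03-06 09:02", "completed": "2023-03-06 09:41"},
    {"check_in": "2023-03-06 13:15", "completed": "2023-03-06 14:32"},
    {"check_in": "2023-03-06 16:40", "completed": "2023-03-06 16:43"},
]

def visit_minutes(rec, fmt="%Y-%m-%d %H:%M"):
    """Total visit time in minutes, from check-in to checklist completion."""
    start = datetime.strptime(rec["check_in"], fmt)
    end = datetime.strptime(rec["completed"], fmt)
    return (end - start).total_seconds() / 60

# Keep only visits lasting between 5 minutes and 5 hours, as in the analysis.
durations = [m for m in map(visit_minutes, records) if 5 <= m <= 300]
print(durations)  # → [39.0, 77.0]
```

The third record (a 3-minute visit) falls below the 5-minute floor and is excluded, mirroring the inclusion window stated in the methods.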
Results:
In total, 29,523 attendances were analysed over 6 years. Average time spent in the department decreased during the pandemic but has since increased 12.4% above pre-COVID-19 levels. Early morning and late afternoon appointments resulted in the shortest visits, with early afternoon appointments leading to the longest visits. On average, patients spend the longest in the department on a Monday, and the least amount of time on a Friday. Friday was the least common day to start a 15-fraction treatment, whereas Tuesday and Friday were equally uncommon for the 5-fraction regime.
Conclusions:
During the COVID-19 pandemic, the number of visits a patient makes for breast cancer RT and related services dropped, and remained lower post-COVID-19, due to fewer treatment fractions being prescribed. Average time spent in the department initially decreased but has since increased beyond pre-COVID-19 levels.
Medicare claims are frequently used to study Clostridioides difficile infection (CDI) epidemiology. However, they lack specimen collection and diagnosis dates to assign location of onset. Algorithms to classify CDI onset location using claims data have been published, but the degree of misclassification is unknown.
Methods:
We linked patients with laboratory-confirmed CDI reported to four Emerging Infections Program (EIP) sites from 2016–2021 to Medicare beneficiaries with fee-for-service Part A/B coverage. We calculated sensitivity of ICD-10-CM codes in claims within ±28 days of EIP specimen collection. CDI was categorized as hospital, long-term care facility, or community-onset using three different Medicare claims-based algorithms based on claim type, ICD-10-CM code position, duration of hospitalization, and ICD-10-CM diagnosis code presence-on-admission indicators. We assessed concordance of EIP case classifications, based on chart review and specimen collection date, with claims case classifications using Cohen’s kappa statistic.
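As a concrete illustration of the concordance measure used here, Cohen’s kappa can be computed from an agreement matrix in a few lines. The 3×3 counts below are invented for illustration and are not the study’s data:

```python
def cohens_kappa(matrix):
    """Cohen's kappa for a square agreement matrix
    (rows: classifier 1, columns: classifier 2)."""
    n = sum(sum(row) for row in matrix)
    k = len(matrix)
    observed = sum(matrix[i][i] for i in range(k)) / n
    row_tot = [sum(row) for row in matrix]
    col_tot = [sum(matrix[i][j] for i in range(k)) for j in range(k)]
    # Chance agreement expected from the marginal totals
    expected = sum(row_tot[i] * col_tot[i] for i in range(k)) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical EIP vs claims classifications:
# rows/columns = hospital-, LTCF-, community-onset
table = [[50, 5, 5],
         [4, 20, 6],
         [6, 4, 30]]
print(round(cohens_kappa(table), 2))  # → 0.64
```

Raw percent agreement ignores agreement expected by chance; kappa corrects for it, which is why the abstract reports both concordance percentages and κ values.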
Results:
Of 12,671 CDI cases eligible for linkage, 9,032 (71%) were linked to a single, unique Medicare beneficiary. Compared to EIP, sensitivity of CDI ICD-10-CM codes was 81%; codes were more likely to be present for hospitalized patients (93.0%) than those who were not (56.2%). Concordance between EIP and Medicare claims algorithms ranged from 68% to 75%, depending on the algorithm used (κ = 0.56–0.66).
Conclusion:
ICD-10-CM codes in Medicare claims data had high sensitivity compared to laboratory-confirmed CDI reported to EIP. Claims-based epidemiologic classification algorithms had moderate concordance with EIP classification of onset location. Misclassification of CDI onset location using Medicare algorithms may bias findings of claims-based CDI studies.
There is growing interest in lifestyle interventions as stand-alone and add-on therapies in mental health care due to their potential benefits for both physical and mental health outcomes. We evaluated lifestyle interventions focusing on physical activity, diet, and sleep in adults with severe mental illness (SMI) and the evidence for their effectiveness. To this end, we conducted a meta-review and searched major electronic databases for articles published prior to 09/2022 and updated our search in 03/2024. We identified 89 relevant systematic reviews and assessed their quality using the SIGN checklist. Based on the findings of our meta-review and on clinical expertise of the authors, we formulated seven recommendations. In brief, evidence supports the application of lifestyle interventions that combine behavioural change techniques, dietary modification, and physical activity to reduce weight and improve cardiovascular health parameters in adults with SMI. Furthermore, physical activity should be used as an adjunct treatment to improve mental health in adults with SMI, including psychotic symptoms and cognition in adults with schizophrenia or depressive symptoms in adults with major depression. To ameliorate sleep quality, cognitive behavioural informed interventions can be considered. Additionally, we provide an overview of key gaps in the current literature. Future studies should integrate both mental and physical health outcomes to reflect the multi-faceted benefits of lifestyle interventions. Moreover, our meta-review highlighted a relative dearth of evidence relating to interventions in adults with bipolar disorder and to nutritional and sleep interventions. Future research could help establish lifestyle interventions as a core component of mental health care.
Herbicides have been placed in global Herbicide Resistance Action Committee (HRAC) herbicide groups based on their sites of action (e.g., acetolactate synthase–inhibiting herbicides are grouped in HRAC Group 2). A major driving force for this classification system is that growers have been encouraged to rotate or mix herbicides from different HRAC groups to delay the evolution of herbicide-resistant weeds, because in theory, all active ingredients within a herbicide group physiologically affect weeds similarly. Although herbicide resistance in weeds has been studied for decades, recent research on the biochemical and molecular basis for resistance has demonstrated that patterns of cross-resistance are usually quite complicated and much more complex than merely stating, for example, that a certain weed population is Group 2-resistant. The objective of this review article is to highlight and describe the intricacies associated with the magnitude of herbicide resistance and cross-resistance patterns that have resulted from myriad target-site and non–target site resistance mechanisms in weeds, as well as environmental and application timing influences. We hope this review will help students, growers, agronomists, ag retailers, regulatory personnel, and research scientists better understand that herbicide resistance in weeds is far more complicated than consideration of HRAC groups alone would suggest. Furthermore, a comprehensive understanding of cross-resistance patterns among weed species and populations may assist in managing herbicide-resistant biotypes in the short term by providing growers with previously unconsidered effective control options. This knowledge may also inform agrochemical company efforts aimed at developing new resistance-breaking chemistries and herbicide mixtures.
However, in the long term, nonchemical management strategies, including cultural, mechanical, and biological weed management tactics, must also be implemented to prevent or delay increasingly problematic issues with weed resistance to current and future herbicides.
The increased risk for psychopathology associated with interpersonal violence exposure (IPV, e.g., physical abuse, sexual assault) is partially mediated by neurobiological alterations in threat-related processes. Evidence supports parsing neural circuitry related to transient and sustained threat, as they appear to be separable processes with distinct neurobiological underpinnings. Although childhood is a sensitive period for neurodevelopment, most prior work has been conducted in adult samples. Further, it is unknown how IPV exposure may impact transient-sustained threat neural interactions. The current study tested the moderating role of IPV exposure on sustained vmPFC-transient amygdala co-activation during an fMRI task in which threat and neutral cues were predictably or unpredictably presented. Analyses were conducted in a sample of 212 community-recruited youth (Mage = 11.77 years, SDage = 2.44; 51.9% male; 56.1% White/Caucasian). IPV-exposed youth evidenced a positive sustained vmPFC-transient amygdala co-activation, while youth with no IPV exposure did not show this association. Consistent with theoretical models, effects were specific to unpredictable, negative trials and to exposure to IPV (i.e., unrelated to non-IPV traumatic experiences). Although preliminary, these findings provide novel insight into how childhood IPV exposure may alter neural circuitry involved in specific facets of threat processing.
SCN2A encodes a voltage-gated sodium channel (designated NaV1.2) vital for generating neuronal action potentials. Pathogenic SCN2A variants are associated with a diverse array of neurodevelopmental disorders featuring neonatal or infantile onset epilepsy, developmental delay, autism, intellectual disability and movement disorders. SCN2A is a high confidence risk gene for autism spectrum disorder and a commonly discovered cause of neonatal onset epilepsy. This remarkable clinical heterogeneity is mirrored by extensive allelic heterogeneity and complex genotype-phenotype relationships partially explained by divergent functional consequences of pathogenic variants. Emerging therapeutic strategies targeted to specific patterns of NaV1.2 dysfunction offer hope to improving the lives of individuals affected by SCN2A-related disorders. This Element provides a review of the clinical features, genetic basis, pathophysiology, pharmacology and treatment of these genetic conditions authored by leading experts in the field and accompanied by perspectives shared by affected families. This title is also available as Open Access on Cambridge Core.
According to International Union for the Conservation of Nature (IUCN) guidelines, all species must be assessed against all criteria during the Red Listing process. For organismal groups that are diverse and understudied, assessors face considerable challenges in assembling evidence due to difficulty in applying definitions of key terms used in the guidelines. Challenges also arise because of uncertainty in population sizes (Criteria A, C, D) and distributions (Criteria A2/3/4c, B). Lichens, which are often small, difficult to identify, or overlooked during biodiversity inventories, are one such group for which specific difficulties arise in applying Red List criteria. Here, we offer approaches and examples that address challenges in completing Red List assessments for lichens in a rapidly changing arena of data availability and analysis strategies. While assessors still contend with far from perfect information about individual species, we propose practical solutions for completing robust assessments given the currently available knowledge of individual lichen life-histories.
The Society of Thoracic Surgeons Congenital Heart Surgery Database is the largest congenital heart surgery database worldwide but does not provide information beyond the primary episode of care. Linkage to hospital electronic health records would capture complications and comorbidities, along with long-term outcomes, for patients with CHD surgeries. The current study explores linkage success between the Society of Thoracic Surgeons Congenital Heart Surgery Database and electronic health record data in North Carolina and Georgia.
Methods:
The Society of Thoracic Surgeons Congenital Heart Surgery Database was linked to hospital electronic health records from four North Carolina congenital heart surgery centers, using indirect identifiers such as date of birth, sex, and admission and discharge dates, from 2008 to 2013. Indirect linkage was performed at the admissions level and compared to two other linkages using a “direct identifier,” the medical record number: (1) linkage between the Society of Thoracic Surgeons Congenital Heart Surgery Database and electronic health records from a subset of patients from one North Carolina institution, and (2) linkage between Society of Thoracic Surgeons data from two Georgia facilities and Georgia’s CHD repository, which also uses direct identifiers for linkage.
Results:
Indirect identifiers successfully linked 79% (3692/4685) of Society of Thoracic Surgeons Congenital Heart Surgery Database admissions across four North Carolina hospitals. Direct linkage techniques successfully matched Society of Thoracic Surgeons Congenital Heart Surgery Database to 90.2% of electronic health records from the North Carolina subsample. Linkage between Society of Thoracic Surgeons and Georgia’s CHD repository was 99.5% (7,544/7,585).
Conclusions:
Linkage methodology was successfully demonstrated between surgical data and hospital-based electronic health records in North Carolina and Georgia, uniting granular procedural details with clinical, developmental, and economic data. Indirect identifiers linked most patients, consistent with similar linkages in adult populations. Future directions include applying these linkage techniques with other data sources and exploring long-term outcomes in linked populations.
Background: Medicare claims are frequently used to study Clostridioides difficile infection (CDI) epidemiology. Categorizing CDI based on location of onset and potential exposure is critical in understanding transmission patterns and prevention strategies. While claims data are well-suited for identifying prior healthcare utilization exposures, they lack specimen collection and diagnosis dates to assign likely location of onset. Algorithms to classify CDI onset and healthcare association using claims data have been published, but the degree of misclassification is unknown. Methods: We linked patients with laboratory-confirmed CDI reported to four Emerging Infections Program (EIP) sites from 2016-2020 to Medicare beneficiaries using residence, birth date, sex, and hospitalization and/or healthcare exposure dates. Uniquely linked patients with fee-for-service Medicare A/B coverage and complete EIP case report forms were included. Patients with a claims CDI diagnosis code within ±28 days of a positive CDI test reported to EIP were categorized as hospital-onset (HO), long-term care facility onset (LTCFO), or community-onset (CO, either healthcare facility-associated [COHCFA] or community-associated [CA]) using a previously published algorithm based on claim type, ICD-10-CM code position, and duration of hospitalization (if applicable). EIP classifies CDI into these categories using positive specimen collection date and other information from chart review (e.g. admit/discharge dates). We assessed concordance of EIP and claims case classifications using Cohen’s kappa. Results: Of 10,002 eligible EIP-identified CDI cases, 7,064 were linked to a unique beneficiary; 3,451 met Medicare A/B fee-for-service coverage inclusion criteria. Of these, 650 (19%) did not have a claims diagnosis code ±28 days of the EIP specimen collection date (Table); 48% (313/650) of those without a claims diagnosis code were categorized by EIP as CA CDI. 
Among those with a CDI diagnosis code, concurrence of claims-based and EIP CDI classification was 68% (κ=0.56). Concurrence was highest for HO and lowest for COHCFA CDI. A substantial number of EIP-classified CO CDIs (30%, Figure) were misclassified as HO using the claims-based algorithm; half of these had a primary ICD-10 diagnosis code of sepsis (226/454; 50%). Conclusions: Evidence of CDI in claims data was found for 81% of EIP-reported CDI cases. Medicare classification algorithms concurred with the EIP classification in 68% of cases. Discordance was most common for community-onset CDI patients, many of whom were hospitalized with a primary diagnosis of sepsis. Misclassification of CO-CDI as HO may bias findings of claims-based CDI studies.
Background: Nursing home (NH) residents are at high risk of COVID-19 from exposure to infected staff and other residents. Understanding SARS-CoV-2 viral RNA kinetics in residents and staff can guide testing, isolation, and return-to-work recommendations. We sought to determine the duration of antigen test and polymerase chain reaction (PCR) positivity in a cohort of NH residents and staff. Methods: We prospectively collected data on SARS-CoV-2 viral kinetics from April 2023 through November 2023. Staff and residents could enroll prospectively or upon a positive test (identified through routine clinical testing, screening, or outbreak response testing). Participating facilities performed routine clinical testing; asymptomatic testing of contacts was performed within 48 hours if an outbreak or known exposure occurred and upon (re-)admission. Enrolled participants who tested positive for SARS-CoV-2 were re-tested daily for 14 days with both nasal antigen and nasal PCR tests. All PCR tests were run by a central lab with the same assay. We conducted a Kaplan-Meier survival analysis on time to first negative test, restricted to participants who initially tested positive (day zero) and had at least one test ≥10 days after initially testing positive with the same test type; a participant could contribute to both the antigen and PCR survival curves. We compared survival curves for staff and residents using the log-rank test. Results: Twenty-four nursing homes in eight states participated; 587 participants (275 residents, 312 staff) enrolled in the evaluation, and participants were tested only through routine clinical or outbreak response testing. Seventy-two participants tested positive for antigen; of these, 63 tested PCR-positive. Residents were antigen- and PCR-positive longer than staff (Figure 1), but this finding was only statistically significant (p=0.006) for duration of PCR positivity.
Five days after the first positive test, 56% of 50 residents and 59% of 22 staff remained antigen-positive; 91% of 44 residents and 79% of 19 staff were PCR-positive. Ten days after the first positive test, 22% of 50 residents and 5% of 22 staff remained antigen-positive; 61% of 44 residents and 21% of 19 staff remained PCR-positive. Conclusions: Most NH residents and staff with SARS-CoV-2 remained antigen- or PCR-positive 5 days after the initial positive test; however, differences between staff and resident test positivity were noted at 10 days. These data can inform recommendations for testing, duration of NH resident isolation, and return to work guidance for staff. Additional viral culture data may strengthen these conclusions.
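The time-to-first-negative analysis described in the methods can be sketched with a minimal Kaplan-Meier estimator. The day counts and censoring flags below are invented for illustration, not the study's data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: day of first negative test (or last test day if censored);
    events: 1 if a negative test was observed, 0 if censored.
    Returns (time, survival probability) pairs at each event time."""
    curve, surv = [], 1.0
    for t in sorted({t for t, e in zip(times, events) if e == 1}):
        at_risk = sum(1 for tt in times if tt >= t)
        negatives = sum(1 for tt, e in zip(times, events) if tt == t and e == 1)
        surv *= 1 - negatives / at_risk  # product-limit update
        curve.append((t, surv))
    return curve

# Hypothetical days to first negative test for eight participants;
# a 0 marks a participant still positive at their last test (censored).
days = [4, 5, 5, 7, 9, 10, 12, 14]
negative = [1, 1, 0, 1, 1, 1, 0, 1]
curve = kaplan_meier(days, negative)
print(curve)
```

A log-rank test comparing staff and resident curves, as reported in the abstract, would be layered on top of estimates like these.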
Disclosure: Stefan Gravenstein: Received consulting and speaker fees from most vaccine manufacturers (Sanofi, Seqirus, Moderna, Merck, Janssen, Pfizer, Novavax, GSK) and has received or expects to receive grant funding from several (Sanofi, Seqirus, Moderna, Pfizer, GSK). Lona Mody: NIH, VA, CDC, Kahn Foundation; Honoraria: UpToDate; Contracted Research: Nano-Vibronix
This paper presents a comprehensive technical overview of the photoinjector laser system of the Linac Coherent Light Source II (LCLS-II), the machine’s first and foremost component. The LCLS-II photoinjector laser system serves as an upgrade to the original LCLS at SLAC National Accelerator Laboratory. This advanced laser system generates high-quality laser beams for the LCLS-II, contributing to the instrument’s unprecedented brightness, precision and flexibility. Our discussion extends to the various subsystems that comprise the photoinjector, including the photocathode laser, laser heater and beam transport systems. Lastly, we draw attention to the ongoing research and development infrastructure underway to enhance the functionality and efficiency of the LCLS-II and similar X-ray free-electron laser facilities around the world, thereby contributing to the future of laser technology and its applications.
OBJECTIVES/GOALS: Prostate cancer treatment is associated with significant genitourinary side effects. There is a critical need for treatment with decreased morbidity. We report the development of a novel treatment paradigm combining irreversible electroporation and lower-dose radiation to provide prostate cancer patients with a less morbid treatment. METHODS/STUDY POPULATION: Intermediate-risk prostate cancer patients will undergo focal irreversible electroporation followed by low-dose, whole-gland radiation therapy. The primary endpoint is freedom from clinically significant cancer on biopsy at 12-month follow-up. Secondary endpoints include safety profile, oncologic efficacy, effectiveness of RT, and need for secondary treatment. This trial (NCT05345444) is currently recruiting patients following an initial feasibility trial. The sample size is calculated to detect an increase in the proportion of patients who are cancer free at 1 year from 0.80 to 0.95. An exact binomial test with a 10% one-sided significance level will have 94.3% power to detect the difference between the null and alternative hypothesis when the sample size is 42. RESULTS/ANTICIPATED RESULTS: This is a clinical trial in progress. DISCUSSION/SIGNIFICANCE: Combined irreversible electroporation (IRE) and lower-dose radiotherapy (RTIRE) may provide prostate cancer patients a treatment with minimal side effects.
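The stated sample-size justification can be reproduced with an exact binomial power calculation using only the standard library. This is an independent sketch of the calculation, not the trial’s statistical code:

```python
from math import comb

def binom_sf(k, n, p):
    """Upper-tail probability P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def exact_binomial_power(n, p0, p1, alpha):
    """Smallest critical count c with P(X >= c | p0) <= alpha,
    and the resulting power P(X >= c | p1)."""
    for c in range(n + 1):
        if binom_sf(c, n, p0) <= alpha:
            return c, binom_sf(c, n, p1)
    return n + 1, 0.0

# n = 42 patients, null proportion 0.80, alternative 0.95,
# one-sided significance level 0.10, as in the trial design.
c, power = exact_binomial_power(42, 0.80, 0.95, 0.10)
print(c, round(power, 3))  # → 38 0.943
```

The resulting rejection rule (declare success if at least 38 of 42 patients are cancer free at 1 year) attains the 94.3% power quoted in the abstract.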
Non-motor symptoms, such as mild cognitive impairment and dementia, are an overwhelming cause of disability in Parkinson’s disease (PD). While subthalamic nucleus deep brain stimulation (STN DBS) is safe and effective for motor symptoms, declines in verbal fluency after bilateral DBS surgery have been widely replicated. However, little is known about cognitive outcomes following unilateral surgeries.
Participants and Methods:
We enrolled 31 PD patients who underwent unilateral STN-DBS in a randomized, cross-over, double-blind study (SUNDIAL Trial). Targets were chosen based on treatment of the most symptomatic side (n = 17 left hemisphere, 14 right hemisphere). All participants completed a neuropsychological battery (FAS/CFL, AVLT, DKEFS Color-Word Test) at baseline, then 2, 4, and 6 months post-surgery. Outcomes include raw scores for verbal fluency, immediate and delayed recall, and DKEFS Color-Word Inhibition trial (Trial 3) completion time. At 2, 4, and 6 months, the neurostimulation type (directional versus ring mode) was randomized for each participant. We compared baseline scores on all cognitive outcome measures using Welch’s two-sample t-tests and used linear mixed effects models to examine longitudinal effects of hemisphere and stimulation on cognition. Mid-study, the test battery was converted to teleneuropsychology administration because of COVID-19; this conversion was included as a covariate in all statistical models, along with years of education, baseline cognitive scores, and levodopa equivalent medication dose at each time point.
Results:
At baseline, patients who underwent left hemisphere implants scored lower on verbal fluency than those with right implants (t(20.66) = -2.49, p = 0.02). There were no significant differences between hemispheres in immediate recall (p = 0.57), delayed recall (p = 0.22), or response inhibition (p = 0.51). Post-operatively, left STN DBS patients experienced significant declines in verbal fluency over the study period (p = 0.02), while patients with right-sided stimulation demonstrated improvements (p < .001). There was no main effect of stimulation parameters (directional versus ring) on verbal fluency, memory, or inhibition, but there was a three-way interaction between time, stimulation parameters, and hemisphere on inhibition, such that left STN DBS patients receiving ring stimulation completed the inhibition trial faster (p = 0.035). After surgery, right STN DBS patients displayed faster inhibition times than patients with left implants (p = 0.015).
Conclusions:
Declines in verbal fluency after bilateral stimulation are the most commonly reported cognitive sequelae of DBS for movement disorders. Here we found group-level declines in verbal fluency after unilateral left STN implants, but not right STN DBS, up to 6 months after surgery. Patients with right hemisphere implants displayed improvements in verbal fluency. Compared to bilateral DBS, unilateral DBS surgery, particularly in the right hemisphere, likely carries a lower risk of verbal fluency decline, making surgical laterality a modifiable risk factor in patients with Parkinson’s disease.
Stable water isotope records of six firn cores retrieved from two adjacent plateaus on the northern Antarctic Peninsula between 2014 and 2016 are presented and investigated for their connections with firn-core glacio-chemical data, meteorological records and modelling results. Average annual accumulation rates of 2500 kg m−2 a−1 largely reduce the modification of isotopic signals in the snowpack by post-depositional processes, allowing excellent signal preservation in space and time. Comparison of firn-core and ECHAM6-wiso modelled δ18O and d-excess records reveals a large agreement on annual and sub-annual scales, suggesting firn-core stable water isotopes to be representative of specific synoptic situations. The six firn cores exhibit highly similar isotopic patterns in the overlapping period (2013), which seem to be related to temporal changes in moisture sources rather than local near-surface air temperatures. Backward trajectories calculated with the HYSPLIT model suggest that prominent δ18O minima in 2013 associated with elevated sea salt concentrations are related to long-range moisture transport dominated by westerly winds during positive SAM phases. In contrast, a broad δ18O maximum in the same year accompanied by increased concentrations of black carbon and mineral dust corresponds to the advection of more locally derived moisture with northerly flow components (South America) when the SAM is negative.
Resuscitated cardiac arrest in a child triggers a comprehensive workup to identify an aetiology and direct management. The presence of a myocardial bridge does not automatically imply causation. Careful determination of the haemodynamic significance of the myocardial bridge is critical to avoid an unnecessary sternotomy and to provide appropriate treatment.
We recently reported on the radio-frequency attenuation length of cold polar ice at Summit Station, Greenland, based on bi-static radar measurements of radio-frequency bedrock echo strengths taken during the summer of 2021. Those data also allow studies of (a) the relative contributions of coherent (such as discrete internal conducting layers with sub-centimeter transverse scale) vs incoherent (e.g. bulk volumetric) scattering, (b) the magnitude of internal layer reflection coefficients, (c) limits on signal propagation velocity asymmetries (‘birefringence’) and (d) limits on signal dispersion in-ice over a bandwidth of ~100 MHz. We find that (1) attenuation lengths approach 1 km in our band, (2) after averaging 10 000 echo triggers, reflected signals observable over the thermal floor (to depths of ~1500 m) are consistent with being entirely coherent, (3) internal layer reflectivities are approximately −60 to −70 dB, (4) birefringent effects for vertically propagating signals are smaller by an order of magnitude relative to South Pole and (5) within our experimental limits, glacial ice is non-dispersive over the frequency band relevant for neutrino detection experiments.
Acute pyelonephritis (AP) epidemiology has been sparsely described. This study aimed to describe the evolution of AP patients hospitalised in France and to identify the factors associated with urinary diversion and fatality, in a cross-sectional study over the 2014–2019 period. Adult patients hospitalised for AP were selected by algorithms of ICD-10 codes (PPV 90.1%) and urinary diversion procedure codes (PPV 100%). 527,671 AP patients were included (76.5% female; mean age 66.1 years; 48.0% Escherichia coli), with 5.9% of hospital deaths. In 2019, the AP incidence was 19.2/10,000, slightly increasing over the period (17.3/10,000 in 2014). 69,313 urinary diversions (13.1%) were performed (fatality rate 6.7%), mainly in males, increasing over the period (11.7% to 14.9%). Urolithiasis (OR [95% CI] = 33.1 [32.3–34.0]), sepsis (1.73 [1.69–1.77]) and a Charlson index ≥3 (1.32 [1.29–1.35]) were significantly associated with urinary diversion, whereas E. coli (0.75 [0.74–0.77]) was negatively associated. The same factors were significantly associated with fatality, plus old age and cancer (2.38 [2.32–2.45]). This nationwide study showed an increase in urolithiasis and identified, for the first time, factors associated with urinary diversion in AP, along with death risk factors, which may aid urologists in clinical decision-making.
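For readers unfamiliar with the effect measures reported above, an odds ratio and its Wald 95% confidence interval can be computed from a 2×2 table as follows. The counts are invented for illustration and do not reproduce the study’s estimates:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% confidence interval from a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) via the delta method
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = exp(log(or_) - z * se)
    hi = exp(log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: urinary diversion (outcome) by urolithiasis (exposure).
or_, lo, hi = odds_ratio_ci(400, 100, 300, 2200)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

An interval that excludes 1, as for urolithiasis and sepsis above, indicates a statistically significant association at the 5% level.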