The Cox duration model serves as the basis for more complex duration models like competing risks, repeated events, and multistate models. These models make a number of assumptions, many of which can be assessed empirically, sometimes for substantive ends. We use Monte Carlo simulations to show that the order in which practitioners assess these assumptions can affect the model's final specification and, ultimately, produce misleading inferences. We focus on three assumptions regarding model specification decisions: proportional hazards (PH), stratified baseline hazards, and stratum-specific covariate effects. Our results suggest that checking the PH assumption before checking for stratum-specific covariate effects tends to produce the correct final specification most frequently. We reexamine a recent study of the timing of GATT/WTO applications to illustrate our points.
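As a concrete illustration of the recommended checking order, here is a minimal sketch using the Python lifelines library with a hypothetical dataset and variable names (durations.csv, x1, group); it is not the authors' own code, only one way to run the PH check before probing stratum-specific effects.

```python
# Sketch of the recommended specification-checking order for a Cox model:
# (1) test proportional hazards first, (2) only then probe stratum-specific
# covariate effects. Data frame and column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("durations.csv")  # hypothetical: duration, event, x1, group

# Step 1: fit a pooled Cox model and test the PH assumption
# via Schoenfeld-residual tests.
cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="event",
        formula="x1 + group")
cph.check_assumptions(df, p_value_threshold=0.05)

# Step 2: if 'group' violates PH, stratify the baseline hazard on it ...
cph_strat = CoxPHFitter()
cph_strat.fit(df, duration_col="duration", event_col="event",
              formula="x1", strata=["group"])

# Step 3: ... and only then check for stratum-specific effects of x1,
# e.g., by adding an x1-by-stratum interaction to the formula.
```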
Improving the quality and conduct of multi-center clinical trials is essential to the generation of generalizable knowledge about the safety and efficacy of healthcare treatments. Despite significant effort and expense, many clinical trials are unsuccessful. The National Center for Advancing Translational Science launched the Trial Innovation Network to address critical roadblocks in multi-center trials by leveraging existing infrastructure and developing operational innovations. We provide an overview of the roadblocks that led to opportunities for operational innovation, our work to develop, define, and map innovations across the network, and how we implemented and disseminated mature innovations.
To determine the impact of an inpatient stewardship intervention targeting fluoroquinolone use on inpatient and postdischarge Clostridioides difficile infection (CDI).
Design:
We used an interrupted time series study design to evaluate the rate of hospital-onset CDI (HO-CDI), postdischarge CDI (PD-CDI) within 12 weeks, and inpatient fluoroquinolone use from 2 years before to 1 year after a stewardship intervention (a minimal sketch of such a segmented regression follows this abstract).
Setting:
An academic healthcare system with 4 hospitals.
Patients:
All inpatients hospitalized between January 2017 and September 2020, excluding those discharged from locations caring for oncology, bone marrow transplant, or solid-organ transplant patients.
Intervention:
Introduction of electronic order sets designed to reduce inpatient fluoroquinolone prescribing.
Results:
Among 163,117 admissions, there were 683 cases of HO-CDI and 1,104 cases of PD-CDI. Against a background 2% month-to-month decline that began in the preintervention period (P < .01), we observed a 21% reduction in fluoroquinolone days of therapy per 1,000 patient days after the intervention (level change, P < .05). HO-CDI rates were stable throughout the study period. In contrast, PD-CDI rates shifted from a stable monthly rate in the preintervention period to a monthly decrease of 2.5% in the postintervention period (P < .01).
Conclusions:
Our systemwide intervention reduced inpatient fluoroquinolone use immediately, but not HO-CDI. However, a downward trend in PD-CDI occurred. Relying on outcome measures limited to the inpatient setting may not reflect the full impact of inpatient stewardship efforts.
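The interrupted time series analysis reported above is commonly implemented as a segmented Poisson regression with a level-change and a trend-change term. The sketch below is a minimal, hypothetical version in Python (statsmodels) with invented column names; it is not the study's actual code.

```python
# Minimal sketch of a segmented (interrupted time series) Poisson regression:
# monthly event counts with an offset for patient-days, a preintervention
# trend, a level change, and a trend change. Column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

monthly = pd.read_csv("monthly_rates.csv")  # hypothetical: events, patient_days, post
monthly["time"] = np.arange(len(monthly))                 # months since study start
monthly["time_post"] = monthly["time"] * monthly["post"]  # slope-change term

model = smf.glm(
    "events ~ time + post + time_post",
    data=monthly,
    family=sm.families.Poisson(),
    offset=np.log(monthly["patient_days"]),
).fit()

# exp(coef) gives rate ratios: 'post' is the level change,
# 'time_post' is the change in monthly trend after the intervention.
print(np.exp(model.params))
```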
Logit and probit (L/P) models are a mainstay of binary time-series cross-sectional (BTSCS) analyses. Researchers include cubic splines or time polynomials to acknowledge the temporal element inherent in these data. However, L/P models cannot easily accommodate three other aspects of the data's temporality: whether covariate effects are conditional on time, whether the process of interest is causally complex, and whether the functional form assumed for time's effect is correct. Failing to account for any of these issues amounts to misspecification bias, threatening the validity of our inferences. We argue that scholars should consider Cox duration models when analyzing BTSCS data, as they create fewer opportunities for such misspecification bias while retaining the ability to assess the same hypotheses as L/P models. We use Monte Carlo simulations to bring new evidence to light showing that Cox models perform at least as well as logit models in a basic BTSCS setting, sometimes better, and perform considerably better in more complex BTSCS situations. In addition, we highlight a new interpretation technique for Cox models, transition probabilities, to make Cox model results more readily interpretable. We use an application from interstate conflict to demonstrate our points.
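The transition-probability quantity highlighted above can be computed from any fitted Cox model as 1 - S(t+1|x)/S(t|x), the probability that a unit surviving to period t experiences the event by t+1. A minimal sketch, assuming hypothetical data and the Python lifelines library rather than the authors' replication code:

```python
# Transition probability from a fitted Cox model:
# P(event in (t, t+1] | survived to t, x) = 1 - S(t+1|x) / S(t|x).
# Data and covariate profile are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

btscs = pd.read_csv("btscs_durations.csv")  # hypothetical: duration, event, x1, x2
cph = CoxPHFitter().fit(btscs, duration_col="duration", event_col="event")

profile = pd.DataFrame({"x1": [1.0], "x2": [0.0]})   # covariate scenario
surv = cph.predict_survival_function(profile, times=range(0, 21))

t = 5
transition_prob = 1 - surv.loc[t + 1].iloc[0] / surv.loc[t].iloc[0]
print(f"P(event in ({t}, {t + 1}] | survived to {t}) = {transition_prob:.3f}")
```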
To determine the usefulness of adjusting antibiotic use (AU) by prevalence of bacterial isolates as an alternative method for risk adjustment beyond hospital characteristics.
Methods:
AU in days of therapy per 1,000 patient days and microbiologic data from 2015 and 2016 were collected from 26 hospitals. The prevalences of Pseudomonas aeruginosa, extended-spectrum β-lactamase (ESBL)–producing bacteria, methicillin-resistant Staphylococcus aureus (MRSA), and vancomycin-resistant enterococci (VRE) were calculated and compared to the average prevalence across all hospitals in the network. This ratio was used to calculate the adjusted AU (a-AU) for various categories of antimicrobials; for example, the a-AU of antipseudomonal β-lactams (APBL) was the AU of APBL divided by (prevalence of P. aeruginosa at that hospital divided by the average prevalence of P. aeruginosa), as in the worked sketch following this abstract. Hospitals were categorized by bed size and ranked by AU and a-AU, and the rankings were compared.
Results:
Most hospitals in 2015 and 2016, respectively, moved ≥2 positions in the ranking using a-AU of APBL (15 of 24, 63%; 22 of 26, 85%), carbapenems (14 of 23, 61%; 22 of 25, 88%), anti-MRSA agents (13 of 23, 57%; 18 of 26, 69%), and anti-VRE agents (18 of 24, 75%; 15 of 26, 58%). Use of a-AU resulted in a shift in quartile of hospital ranking for 50% of APBL agents, 57% of carbapenems, 35% of anti-MRSA agents, and 75% of anti-VRE agents in 2015, and for 50% of APBL agents, 28% of carbapenems, 50% of anti-MRSA agents, and 58% of anti-VRE agents in 2016.
Conclusions:
The a-AU considerably changes how hospitals within a network compare with one another. Adjusting AU by microbiological burden allows a more balanced comparison among hospitals with variable baseline rates of resistant bacteria.
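The adjustment defined in the Methods reduces to a one-line calculation. Here is a worked sketch with hypothetical numbers (the function name and inputs are invented for illustration):

```python
# Worked sketch of the adjusted antibiotic use (a-AU) calculation defined in
# the Methods: a-AU = AU / (hospital prevalence / network-average prevalence).
# All numbers are hypothetical.

def adjusted_au(au: float, hospital_prevalence: float,
                network_avg_prevalence: float) -> float:
    """Adjust days of therapy per 1,000 patient days by relative organism burden."""
    return au / (hospital_prevalence / network_avg_prevalence)

# Hospital A uses 120 DOT/1,000 patient days of antipseudomonal beta-lactams,
# with twice the network-average prevalence of P. aeruginosa (10% vs 5%):
print(adjusted_au(au=120, hospital_prevalence=0.10, network_avg_prevalence=0.05))
# -> 60.0: use looks lower once the heavier P. aeruginosa burden is credited.
```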
To determine the effect of an electronic medical record (EMR) nudge at reducing total and inappropriate orders testing for hospital-onset Clostridioides difficile infection (HO-CDI).
Design:
An interrupted time series analysis of HO-CDI orders 2 years before and 2 years after the implementation of an EMR intervention designed to reduce inappropriate HO-CDI testing. Orders for C. difficile testing were considered inappropriate if the patient had received a laxative or stool softener in the previous 24 hours.
Setting:
Four hospitals in an academic healthcare network.
Patients:
All patients with a C. difficile order after hospital day 3.
Intervention:
Orders for C. difficile testing in patients who had received a laxative or stool softener within the previous 24 hours triggered an EMR alert that defaulted to cancellation of the order (the “nudge”).
Results:
Of the 17,694 HO-CDI orders, 7% were inappropriate (8% preintervention vs 6% postintervention; P < .001). Monthly HO-CDI orders decreased by 21% postintervention (level-change rate ratio [RR], 0.79; 95% confidence interval [CI], 0.73–0.86), and the rate continued to decrease thereafter (postintervention trend change RR, 0.99; 95% CI, 0.98–1.00). The intervention was not associated with a level change in inappropriate HO-CDI orders (RR, 0.80; 95% CI, 0.61–1.05), but the inappropriate order rate decreased over time postintervention (RR, 0.95; 95% CI, 0.93–0.97).
Conclusion:
An EMR nudge to minimize inappropriate ordering for C. difficile was effective at reducing HO-CDI orders, and likely contributed to decreasing the inappropriate HO-CDI order rate after the intervention.
Children with congenital heart disease are at high risk for malnutrition. Standardisation of feeding protocols has shown promise in decreasing some of this risk. Given the limited standardisation between institutions' feeding protocols and the lack of data on protocol adherence, it is important to analyse the efficacy of individual aspects of these protocols.
Methods:
Adherence to and deviation from a feeding protocol in high-risk congenital heart disease patients between December 2015 and March 2017 were analysed. Associations between adherence to and deviation from the protocol and clinical outcomes were also assessed. The primary outcome was change in weight-for-age z score between time intervals (illustrated in the sketch following this abstract).
Results:
Increased adherence to and decreased deviation from individual instructions of the feeding protocol improved patients' change in weight-for-age z score between birth and hospital discharge (p = 0.031). Secondary outcomes, such as markers of clinical severity and nutritional delivery, did not differ statistically between groups with high and low adherence or deviation rates.
Conclusions:
Adherence to, and fewer deviations from, a high-risk feeding protocol are associated with weight gain independent of their influence on nutritional delivery and caloric intake. Future studies assessing the efficacy of feeding protocols should include measures of adherence and deviation rather than being limited to caloric delivery and illness severity.
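For readers unfamiliar with the primary outcome, the sketch below illustrates a change in weight-for-age z score using a simplified normal z score; WHO growth standards actually apply a skewness-adjusted (LMS) transformation, and all reference values and weights here are hypothetical.

```python
# Sketch of the primary outcome: change in weight-for-age z (WAZ) score
# between birth and discharge. Simplified to a plain z score; real growth
# standards use an LMS transformation. All numbers are hypothetical.

def waz(weight_kg: float, ref_median: float, ref_sd: float) -> float:
    """Weight-for-age z score against an age- and sex-specific reference."""
    return (weight_kg - ref_median) / ref_sd

# Birth (reference median 3.3 kg, SD 0.45) vs discharge at ~2 months
# (reference median 5.6 kg, SD 0.70):
delta_waz = waz(4.8, 5.6, 0.70) - waz(3.2, 3.3, 0.45)
print(f"Change in weight-for-age z score: {delta_waz:.2f}")
```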
The use of duration models in political science continues to grow, more than a decade after Box-Steffensmeier and Jones (2004). However, several common misconceptions about the models persist. To improve scholars' use and interpretation of duration models, we point out that they are a type of regression model and therefore follow the same rules as other, more commonly used regression models. In this article, we present four maxims as guidelines. We survey the various duration model interpretation strategies and group them into four categories, an organizational exercise that does not appear elsewhere. We then discuss the strengths and weaknesses of these strategies, noting that all are correct from a technical perspective. However, some strategies make more sense than others for nontechnical reasons, which ultimately informs best practices.
Can rebel organizations in a civil conflict use social media to garner international support? This article argues that the use of social media is a unique form of public diplomacy through which rebels project a favorable image to gain that support. It analyzes the Libyan civil war, during which rebels invested considerable resources in diplomatic efforts to win US support. Using original data, the study finds that rebel public diplomacy via Twitter increases co-operation with the rebels when their message (1) clarifies the type of regime they intend to create and (2) emphasizes the atrocities perpetrated by the government. By providing rebels with an important tool of image projection, social media can affect conflict dynamics in an ever more connected international arena.
Many political processes consist of a series of theoretically meaningful transitions across discrete phases that occur through time. Yet political scientists are often theoretically interested in studying not just individual transitions between phases, but also the duration that subjects spend within phases, as well as the effect of covariates on subjects’ trajectories through the process's multiple phases. We introduce the multistate survival model to political scientists, which is capable of modeling precisely this type of situation. The model is appealing because of its ability to accommodate multiple forms of causal complexity that unfold over time. In particular, we highlight three attractive features of multistate models: transition-specific baseline hazards, transition-specific covariate effects, and the ability to estimate transition probabilities. We provide two applications to illustrate these features.
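One common estimation strategy for such models, though not necessarily the one used in the article, is to fit a separate Cox model to each permitted transition, which directly yields transition-specific baseline hazards and transition-specific covariate effects. A minimal sketch with hypothetical long-format data and the Python lifelines library:

```python
# Sketch of a multistate model estimated transition by transition: one Cox fit
# per permitted state-to-state transition. Long-format data and column names
# are hypothetical: one row per subject-transition at risk.
import pandas as pd
from lifelines import CoxPHFitter

ms = pd.read_csv("multistate_long.csv")
# hypothetical columns: from_state, to_state, duration, event, x1, x2

fits = {}
for (src, dst), risk_set in ms.groupby(["from_state", "to_state"]):
    cph = CoxPHFitter()
    cph.fit(risk_set[["duration", "event", "x1", "x2"]],
            duration_col="duration", event_col="event")
    fits[(src, dst)] = cph  # e.g., fits[(1, 2)] is the 1 -> 2 transition

# Covariate effects can now differ across transitions, and transition
# probabilities can be built up from the transition-specific cumulative hazards.
```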
Survey-based contingent valuation (CV) techniques are commonly used to value the potential effects of a policy change when market-based valuation of those effects is not possible. The results of these analyses are often intended to inform policy decisions, which are made within the context of formal policymaking institutions. These institutions are typically designed to reduce the large number of potential options for addressing any given policy problem to a binary choice between the continuation of current policy and a single, specified alternative. In this research we develop an approach for conducting CV exercises in a manner consistent with the decision structure typically faced by policymakers. The data generated from this approach allow for an estimate of willingness to pay (WTP) for a defined policy alternative, relative to leaving policy unchanged, which we argue is of direct interest to policymakers. We illustrate our approach within the context of policy governing the storage of used nuclear fuel in the United States. We value the policy option of constructing an interim storage facility relative to continuation of current policy, wherein used nuclear fuel is stored on-site at or near commercial nuclear generating plants. We close the paper with a discussion of the implications for future research and the role of CV in the policymaking process.
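The binary, policy-versus-status-quo choice described above is typically analyzed with a binary response model of yes/no votes on a randomly assigned cost; under a linear utility specification, mean WTP is the negative ratio of the intercept to the bid coefficient. A minimal sketch with hypothetical data and variable names, not the authors' estimator:

```python
# Sketch of WTP estimation from referendum-style contingent valuation data:
# probit of yes/no on the assigned cost ('bid'); with linear utility,
# mean WTP = -intercept / bid coefficient. Data and names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

cv = pd.read_csv("cv_responses.csv")  # hypothetical: vote_yes (0/1), bid (dollars)
probit = smf.probit("vote_yes ~ bid", data=cv).fit()

wtp = -probit.params["Intercept"] / probit.params["bid"]
print(f"Estimated mean WTP for the policy alternative: ${wtp:.2f}")
```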
In this paper we illustrate the use of Ultra Soft X-ray Absorption Spectroscopy (USXAS) for the characterization of polymeric materials by highlighting three novel applications of the technique. The surface sensitivity of electron yield (3 nm) and the bulk information available from fluorescence yield USXAS (200 nm) provide unique information on the chemistry of polymer surfaces and interfaces. USXAS is sensitive to both the concentration and the orientation of functional groups in polymers. The systems highlighted here include flame-treated model acrylic automotive coatings, ultra-low surface energy crosslinked fluorocarbon films, and spin-cast polystyrene films. The chemical and surface sensitivity of the technique are emphasized by the ability of USXAS to detect an increase in trigonally coordinated carbon at the surface after treatment with a reducing flame. The sensitivity to functional group orientation at the surface is demonstrated by the characterization of the crosslinked fluorocarbon polymer films; the results show that the pendant fluoroalkyl moieties of these polymers are strongly oriented perpendicular to the film surface. Spin-coated polystyrene films were characterized as a function of molecular weight, film thickness, and casting solvent. The pendant phenyl groups were found to be preferentially oriented toward the surface normal, independent of casting solvent, molecular weight, and film thickness.