Clinical Epidemiology/Clinical Trial
3279 First in Man
- Laura Adang, Francesco Gavazzi, Valentina De Giorgis, Micaela De Simone, Elisa Fazzi, Jessica Galli, Jamie Koh, Julia Kramer-Golinkoff, Simona Orcesi, Kyle Peer, Nicole Ulrick
- Published online by Cambridge University Press: 26 March 2019, p. 45
OBJECTIVES/SPECIFIC AIMS: A mimic of congenital infections and a rare genetic cause of interferon overproduction, Aicardi Goutières Syndrome (AGS) results in significant neurologic disability. AGS is caused by pathogenic changes in the intracellular nucleic acid sensing machinery (TREX1, RNASEH2A, RNASEH2B, RNASEH2C, SAMHD1, ADAR1, and IFIH1). All affected individuals exhibit neurologic impairment, ranging from mild spastic paraparesis to severe tetraparesis and global developmental delay. We hypothesize that genotype influences the heterogeneous developmental trajectory found in AGS. METHODS/STUDY POPULATION: To characterize this spectrum, age and symptoms at presentation and longitudinal developmental skill acquisition were collected from an international cohort of children (n=88) with genetically confirmed AGS. RESULTS/ANTICIPATED RESULTS: We found that individuals present at variable ages, with the largest range in SAMHD1, ADAR, and IFIH1. There are 3 clusters of symptoms at presentation: altered mental status (irritability or lethargy), systemic inflammatory symptoms, and acute neurologic symptoms, with variability across all genotypes. By constructing Kaplan-Meier curves for developmental milestones, we created genotype-based developmental trajectories for the children affected by the 5 most common genotypes: TREX1, IFIH1, SAMHD1, ADAR, and RNASEH2B. Individuals with AGS secondary to TREX1 were the most severely affected and were significantly less likely than the other genotypes to reach milestones, including head control, sitting, and nonspecific mama/dada (p-value <0.005). Individuals affected by SAMHD1, IFIH1, and ADAR collectively attained the most advanced milestones, with 44% of the population achieving a minimum of a single word and 31% able to walk independently. Three retrospective scales were also applied: Gross Motor Function Classification System, Manual Ability Classification Scale, and Communication Function Classification System. 
Within each genotypic cohort, there was pronounced heterogeneity. DISCUSSION/SIGNIFICANCE OF IMPACT: Our results demonstrate the influence of genotype on early development, but also suggest the importance of other unidentified variables. These results underscore the need for deep phenotyping to better characterize subcohorts within the AGS population.
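The milestone curves described above follow the standard Kaplan-Meier construction: attaining a milestone is the "event," and children who have not yet attained it are right-censored at their age at last assessment. The sketch below is illustrative only (the ages and censoring flags are hypothetical, not study data):

```python
# Minimal Kaplan-Meier estimator for milestone attainment.
# times  = age in months at milestone attainment or at censoring
# events = 1 if milestone attained at that age, 0 if right-censored

def kaplan_meier(times, events):
    """Return (time, survival) pairs for the Kaplan-Meier estimator."""
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])
    at_risk = n
    surv = 1.0
    curve = [(0, 1.0)]
    i = 0
    while i < n:
        t = times[order[i]]
        d = 0  # events at time t
        c = 0  # censorings at time t
        while i < n and times[order[i]] == t:
            if events[order[i]]:
                d += 1
            else:
                c += 1
            i += 1
        if d > 0:
            surv *= (at_risk - d) / at_risk  # KM product-limit step
            curve.append((t, surv))
        at_risk -= d + c
    return curve

# Illustrative cohort: ages (months) at head control; 0 = not yet attained
ages   = [3, 4, 4, 6, 9, 12, 12, 18]
events = [1, 1, 0, 1, 1, 0, 1, 1]
curve = kaplan_meier(ages, events)
```

In the study's framing, "survival" at age t is the fraction of children who have not yet attained the milestone, so genotype cohorts can be compared curve against curve.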
3526 Healthy eating, physical activity, sleep and cognitive function in elderly population: Data from National Health and Nutrition Examination Survey 2011-2014
- Magda Shaheen
- Published online by Cambridge University Press: 26 March 2019, pp. 45-46
OBJECTIVES/SPECIFIC AIMS: To examine the relationship between healthy eating, physical activity (PA), sleep problems, and hours of sleep and cognitive function among the elderly population, and the racial/ethnic differences in this relation. METHODS/STUDY POPULATION: We analyzed data from the National Health and Nutrition Examination Survey 2014-2016 for 882 adults aged 60 years and older. Cognitive status was measured by the Digit Symbol Substitution (DSS) exercise score and the Consortium to Establish a Registry for Alzheimer’s Disease (CERAD) total score. Healthy eating index (HEI), PA, sleep problems, and hours of sleep were assessed by questionnaire. The associations between cognitive function and HEI, PA, sleep problems, and hours of sleep were assessed by linear regression after adjusting for age, gender, race/ethnicity, poverty level, lipid profile, fasting glucose level, alcohol, body mass index, stroke, and education. Data were analyzed using Stata 14, accounting for the survey design and sample weights, with p<0.05 considered statistically significant. RESULTS/ANTICIPATED RESULTS: CERAD total score was associated with HEI (adjusted B = 0.07, 95% Confidence Interval (CI) = 0.01-0.13, p = 0.02) and was not associated with physical activity, sleep problems, or hours of sleep (p > 0.05). Animal fluency score was associated only with HEI (adjusted B = 0.05, 95% CI = 0.01-0.09, p = 0.02). DSS score was not associated with HEI, PA, or sleep problems (p > 0.05) but was associated with hours of sleep (p = 0.03). Stratified analysis by race/ethnicity showed that CERAD total score was associated with HEI only in Whites (adjusted B = 0.08, 95% CI = 0.01-0.15, p = 0.02). DISCUSSION/SIGNIFICANCE OF IMPACT: CERAD total score was associated with HEI and not with PA or sleep problems. Promoting healthy eating is important for improving cognition in the elderly population. 
Culturally sensitive and linguistically appropriate programs that involve community and care providers are needed to promote healthy eating for the elderly population.
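The study fit survey-weighted linear models in Stata 14; for a single predictor, the core of such a model reduces to a weighted least-squares slope with the sample weights. A minimal sketch under that simplification (the full NHANES design also adjusts variance estimates for stratification and clustering, which this omits; the data handling is hypothetical):

```python
# Weighted least-squares slope of y on x with sample weights w:
# the weighted analogue of cov(x, y) / var(x).

def weighted_slope(x, y, w):
    """Sample-weighted regression slope of y on x."""
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw  # weighted mean of x
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw  # weighted mean of y
    num = sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, x, y))
    den = sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x))
    return num / den

# e.g. cognitive score regressed on HEI with equal weights
slope = weighted_slope([40, 50, 60, 70], [20, 22, 24, 26], [1, 1, 1, 1])
```

With equal weights this collapses to ordinary least squares; unequal NHANES weights shift the fit toward over-sampled subgroups' population shares.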
3111 Heart Rate Variability as a Predictor of Post-Operative Cognitive Dysfunction in Older Adults
- Deborah Oyeyemi, Miles Berger, Kenneth C. Roberts, Charles M. Giattino, Marty G. Woldorff, Cathleen Colon-Emeric, Michael J. Devinney, Thomas Bunning, Junhong Zhou, Lewis A. Lipsitz, Heather E. Whitson
- Published online by Cambridge University Press: 26 March 2019, p. 46
OBJECTIVES/SPECIFIC AIMS: The objective of this project is to determine whether HRV, collected peri-operatively, is predictive of cognitive decline among older adults who undergo elective surgery/anesthesia. METHODS/STUDY POPULATION: This project is a part of the ongoing INTUIT/PRIME study, which is collecting pre- and post-operative cognitive testing, fMRI imaging, CSF samples, and EEG recordings from 200 older adults (age ≥ 60) undergoing elective non-cardiac/non-neurologic surgery scheduled to last > 2 hours at Duke University Medical Center and Duke Regional Hospital. This project utilizes data from the first 60 INTUIT participants who contributed continuous heart rate data before and during surgery. Participants undergo cognitive testing prior to surgery (baseline) and at 6 weeks after surgery. Our primary dependent variable is the change in the composite score from baseline to 6 weeks. Delirium is assessed in the hospital with the twice-daily 3D-CAM tool, so we will report the proportion of individuals with 6-week cognitive decline who exhibited delirium in the days following surgery. Participants’ electrocardiogram (ECG) recordings are extracted pre- and intraoperatively from B650/B850 patient monitors with VSCapture software. HRV is defined as the variability between successive R-spikes, or inter-beat intervals, on ECG. RESULTS/ANTICIPATED RESULTS: We anticipate that lower intraoperative HRV will be associated with worse cognitive decline at 6 weeks after surgery. As secondary objectives, we will determine whether pre-operative HRV or the change in HRV (from pre-operative to intra-operative measures) is predictive of cognitive decline after surgery. We expect that in-hospital delirium will be detected in a higher proportion of those with 6-week cognitive decline compared to those with stable or improved cognition at 6 weeks. DISCUSSION/SIGNIFICANCE OF IMPACT: HRV may address the present need for pre- and intra-operative cognitive risk stratification in the elderly. 
Physiological indices like HRV have the potential to dramatically change our understanding of cognitive impairment (CI) in older adults undergoing surgery, as they offer an accessible, cost-effective, and non-invasive means by which clinicians, particularly those unfamiliar with the nuances of geriatric and CI/dementia-related care, can monitor patients and refer those at high risk of CI after surgery for early intervention.
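The abstract defines HRV as the variability between successive R-R (inter-beat) intervals but does not name a specific metric; two standard time-domain summaries computable from such intervals are SDNN and RMSSD. The sketch below is illustrative of how either would be derived from the extracted ECG data (the interval values are hypothetical):

```python
# Time-domain HRV summaries from a series of R-R intervals (milliseconds).
import math

def sdnn(rr):
    """SDNN: standard deviation of all R-R intervals (overall variability)."""
    m = sum(rr) / len(rr)
    return math.sqrt(sum((x - m) ** 2 for x in rr) / len(rr))

def rmssd(rr):
    """RMSSD: root mean square of successive R-R differences
    (beat-to-beat, vagally mediated variability)."""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [812, 790, 805, 830, 795]  # illustrative intervals in ms
overall, beat_to_beat = sdnn(rr), rmssd(rr)
```

Lower values of either summary would correspond to the "lower intraoperative HRV" hypothesized above to predict 6-week cognitive decline.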
3525 Improvement in Suicidal Ideation after Repeated Ketamine Infusions: Relationship to Reductions in Symptoms of Posttraumatic Stress Disorder, Depression, and Pain
- Cristina Sophia Albott, Kelvin O. Lim, Miriam K. Forbes, Paul Thuras, Joseph Wels, Susanna Tye, Paulo Shiroma
- Published online by Cambridge University Press: 26 March 2019, p. 46
OBJECTIVES/SPECIFIC AIMS: Given the heightened risk for suicide seen in individuals with PTSD+MDD, this report explored the effect of repeated ketamine infusions on SI in a cohort of veterans. METHODS/STUDY POPULATION: Veterans with PTSD+MDD (n = 15) received six intravenous infusions of 0.5 mg/kg ketamine on a Monday-Wednesday-Friday schedule over a 12-day period. All subjects endorsed SI at baseline. Outcome measures included the Montgomery-Asberg Depression Rating Scale (MADRS) total score, the MADRS suicidal ideation item, the PTSD Checklist for DSM-5 (PCL-5) subscales (intrusion, avoidance, negative alterations in cognition and mood, and marked alterations in arousal and reactivity), and a visual analog scale of pain. Measures were collected immediately before and 24 hours after each infusion. RESULTS/ANTICIPATED RESULTS: Significant improvement in SI was observed 24 hours after the first infusion (Z = 3.21; p = .001) and remained significant at all other post-infusion time points. Improvement in SI at the conclusion of the infusion series was significantly correlated with the PTSD subscales of avoidance (r(12) = .610, p = .021), negative alterations in cognition and mood (r(12) = .786, p = .001), and alterations in arousal and reactivity (r(12) = .729, p = .003), and with pain (r(12) = .591, p = .013), even when controlling for improvement in symptoms of depression. DISCUSSION/SIGNIFICANCE OF IMPACT: The present analysis provides evidence of improvement in SI in a cohort of veterans with PTSD+MDD. Improvements in suicidality were correlated with PTSD symptom subscales and pain, independent of improvement in depression. This report extends the interpersonal theory of suicide as it applies to posttraumatic pathology by demonstrating a significant association between improvements in all subclusters of PTSD, improvement in pain, and improvement in suicidal ideation.
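Correlating two improvement scores "even when controlling for improvement in symptoms of depression" corresponds to a first-order partial correlation. A minimal sketch of that calculation (the abstract does not specify the exact procedure used, so this is one standard implementation, with hypothetical inputs):

```python
# First-order partial correlation: correlation of x and y with the linear
# effect of a third variable z removed.
import math

def pearson(x, y):
    """Plain Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

def partial_corr(x, y, z):
    """r_{xy.z} = (r_xy - r_xz * r_yz) / sqrt((1 - r_xz^2)(1 - r_yz^2))"""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz ** 2) * (1 - ryz ** 2))
```

Here x and y would be, say, change in SI and change in a PCL-5 subscale, with z the change in depression score being controlled for.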
3193 Improving Individual Clinical Outcomes in a Sequential Multiple Assignment Randomized Trial (SMART)
- Hayley M Belli, Andrea B. Troxel
- Published online by Cambridge University Press: 26 March 2019, pp. 46-47
OBJECTIVES/SPECIFIC AIMS: This work develops an algorithm that identifies patients in a Sequential Multiple Assignment Randomized Trial (SMART) who should switch treatments prior to the end of a stage because clinical effectiveness via their current intervention is unlikely. This algorithm uses as inputs patient baseline and interim measurements to assign a probability that a patient should switch or stay on their current intervention. First, the algorithm will be derived assuming both a linear and non-linear patient trajectory. Second, the performance of the algorithm will be assessed using trial data from the Establishing Moderators and Biosignatures of Antidepressant Response in Clinical Care† (EMBARC) study. The primary objective of the algorithm is to switch treatment in patients who will not reach clinical effectiveness by the end of the stage, and the secondary objective is to avoid accidentally switching treatment in patients who will reach clinical effectiveness by the end of the stage. †Trivedi et al. Journal of Psychiatric Research 78 (2016) 11-23 METHODS/STUDY POPULATION: First, the algorithm was derived assuming a linear or non-linear trajectory. Next, performance of the algorithm was assessed using data from the Establishing Moderators and Biosignatures of Antidepressant Response in Clinical Care† (EMBARC) study. This two-stage SMART design measured the effectiveness of sertraline in 242 patients with nonpsychotic Major Depressive Disorder (MDD). The algorithm was applied to baseline and interim measurements from the EMBARC study to predict end-stage Hamilton Depression (HAMD17) scores, the primary outcome of the study. 
True positive rate (TPR) and false positive rate (FPR) were used to measure the primary study objective (switching treatment in patients who will not reach clinical effectiveness by the end of the stage) and the secondary study objective (avoiding accidentally switching treatment in patients who will reach clinical effectiveness by the end of the stage), respectively. TPR and FPR were calculated for the following prediction scenarios: (1) three separate two-point predictions: Baseline and Week 2, Baseline and Week 4, and Baseline and Week 6; and (2) a single three-point prediction: Baseline and Weeks 2 and 6. †Trivedi et al. Journal of Psychiatric Research 78 (2016) 11-23 RESULTS/ANTICIPATED RESULTS: When using two-point prediction, we found TPR to increase and FPR to decrease as the interim measurements approached the end of the stage. We also found TPR to increase when using the three-point prediction, but at the expense of FPR also increasing. Across these scenarios, TPR ranged between 70% and 90%, and FPR ranged between approximately 20% and 50%. DISCUSSION/SIGNIFICANCE OF IMPACT: Although SMART designs ultimately assign patients to more effective treatments, this process can take time and leave a patient (currently on an ineffective treatment) waiting until the end of a stage to try a potentially superior treatment. The algorithm developed here addresses this disadvantage of the SMART design. By introducing a regression and likelihood approach to predict whether a patient should switch or stay on their current treatment, we move closer to the goal of designing rigorous, patient-centered studies. This work has the potential to improve individual clinical outcomes for patients enrolled in pragmatic clinical trials.
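The TPR/FPR evaluation above can be sketched directly: a "positive" is a patient the algorithm flags to switch, and ground truth is whether the patient truly failed to reach clinical effectiveness by the end of the stage. The labels below are illustrative, not EMBARC data:

```python
# TPR = TP / (TP + FN): flagged switches among truly ineffective cases.
# FPR = FP / (FP + TN): flagged switches among cases that would have
# reached effectiveness anyway.

def tpr_fpr(predicted_switch, truly_ineffective):
    pairs = list(zip(predicted_switch, truly_ineffective))
    tp = sum(1 for p, t in pairs if p and t)
    fn = sum(1 for p, t in pairs if not p and t)
    fp = sum(1 for p, t in pairs if p and not t)
    tn = sum(1 for p, t in pairs if not p and not t)
    return tp / (tp + fn), fp / (fp + tn)

flags = [True, True, False, False]   # algorithm says: switch?
truth = [True, False, True, False]   # truly failed to reach effectiveness?
tpr, fpr = tpr_fpr(flags, truth)
```

The trade-off the authors report (adding a third interim point raises TPR but also FPR) is the usual sensitivity/specificity tension of any earlier, more aggressive switching rule.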
3426 Increased Monounsaturated Fat Consumption is Associated with Improved Body Composition in Subjects with Obesity and Heart Failure with Preserved Ejection Fraction
- Hayley Billingsley, Salvatore Carbone, Brando Rotelli, Dinesh Kadariya, Justin M. Canada, Roshanak Markley, Antonio Abbate
- Published online by Cambridge University Press: 26 March 2019, p. 47
OBJECTIVES/SPECIFIC AIMS: We hypothesized that increasing percent calories from MUFA (%MUFA) would be associated with an increased FFM/FM index. METHODS/STUDY POPULATION: Nine consecutive HFpEF patients with obesity participated in a 12-week pilot feasibility trial of UFA supplementation (NCT03310099). Subjects were educated at baseline by a dietitian on UFA-rich foods, including high-MUFA choices such as extra-virgin olive oil, canola oil, and avocados. Participants were given a list of items and corresponding serving sizes and asked to eat at least one serving of these UFA-rich foods per day for 12 weeks. Adherence was encouraged through weekly phone calls by the dietitian. A standardized 5-pass 24-hour dietary recall was performed by a dietitian at baseline and 12 weeks. The recalls were analyzed to establish intake of MUFA in percent calories (%kcals) with Nutrition Data Systems for Research software (NDSR). Body composition, including fat mass percent of body weight (FM%), fat-free mass percent of body weight (FFM%), and the ratio of FFM to FM (FFM/FM index), was measured with bioelectrical impedance analysis (RJL Systems) at baseline and 12 weeks. Statistical analysis was performed with SPSS (version 24.0). The Spearman rank test was used for correlations. Values are expressed as numbers and percentages or as median and interquartile range (IQR). RESULTS/ANTICIPATED RESULTS: Subjects were mostly female (56%) with a median age of 56 (IQR 50-59). Baseline median body mass index (kg/m2) was 36.7 (IQR 36.2-48.0), median FM% was 44.5 (IQR 32.5-53.4), median FFM% was 55.5 (IQR 46.7-67.5), and median FFM/FM index was 1.25 (IQR 0.88-2.1). The only significant change was an increase in %MUFA from 12.4% (IQR 6.9-14.3) at baseline to 21.8% (IQR 17.6-36.9) at 12 weeks (p = 0.008). Increased %MUFA was highly associated with increased FFM% (r = 0.783, p = 0.013) (Figure 1A), decreased FM% (r = −0.783, p = 0.013) (Figure 1B), and increased FFM/FM index (r = 0.800, p = 0.010) (Figure 1C). 
All correlations remained statistically significant after adjustment for changes in energy intake. DISCUSSION/SIGNIFICANCE OF IMPACT: Increasing dietary %MUFA is protective against negative changes in body composition in patients with obesity and HFpEF, independent of changes in caloric intake. Future work should focus on whether the correlations found in this pilot study translate into improved body composition and, ultimately, exercise tolerance and clinical outcomes.
3551 Intermittent Theta Burst Stimulation to Relieve Depression and Executive Function impairment in older adults
- Pilar Cristancho, Jacinda Berger, Lojine Kamel, Manuela Araque, Deanna Barch, Eric Lenze
- Published online by Cambridge University Press: 26 March 2019, pp. 47-48
OBJECTIVES/SPECIFIC AIMS: The objective of the study is to examine the ability of iTBS to improve depression and executive impairment in depressed older adults. If effective, this treatment will have the potential to improve quality of life in LLD. METHODS/STUDY POPULATION: From December 2016 to date, older adults (60-85 years old) in a major depressive episode, with evidence of executive dysfunction (on the NIH Toolbox battery), were enrolled. iTBS protocol: this brief paradigm (3 minutes 9 seconds in duration) was administered on weekdays for four weeks (20 sessions total). Stimulation intensity was set to 120% of the observed motor threshold. Depression primary outcome: change in the Montgomery-Asberg Depression Rating Scale (MADRS) from baseline to the end of the iTBS course. Executive function primary outcome: change in executive measures from the electronic NIH Toolbox cognitive domain battery. Executive secondary outcome: change in scores from baseline to the end of iTBS on the Frontal Systems Behavior Scale (FrSBe); this self-reported instrument measures dysexecutive behavior. Statistical analysis: paired t-tests examined changes in depression and executive variables from baseline to post-iTBS. Pearson correlation examined the association between degree of mood improvement and degree of improvement in executive function. SPSS v24 was used for all analyses. RESULTS/ANTICIPATED RESULTS: We examined 11 subjects. Primary outcomes: patients showed a significant decrease in MADRS scores from baseline (mean (M) = 27.73, standard deviation (SD) = 8.2) to the end of 4 weeks of iTBS (M = 15.91, SD = 10.05; t = 7.4, p < .001). The Flanker Inhibitory Control and Attention test significantly improved from baseline (M = 91.0, SD = 7) to the end of iTBS (M = 98.7, SD = 12.8; t = −2.9, p = .014; higher scores at week 4 denote improvement). 
The List Sorting Working Memory test and the Dimensional Change Card Sort (a measure of cognitive flexibility) improved but did not reach statistical significance. The self-reported executive measure improved from baseline (M = 48.6, SD = 9.4) to post-iTBS (M = 39.4, SD = 8.5; t = 3.8, p = .003; lower scores at week 4 denote improvement). We also examined whether the degree of improvement in depression related to the degree of improvement in executive function. We found positive correlations between change in mood scores with iTBS and change in executive scores with iTBS, with the strongest relationship for working memory (r = 0.34). Tolerability and side effects: common side effects were twitching of facial muscles during stimulation (n = 11), headaches (n = 10), and pain or discomfort at the stimulation site and face (n = 4). One participant withdrew due to intolerance of the stimulation. DISCUSSION/SIGNIFICANCE OF IMPACT: The iTBS paradigm was effective in improving mood and executive function in older adults. Both the psychometric measure and the self-reported executive function measure (indicative of dysexecutive behavior) reflected improvements post-iTBS. Improvement in executive function was correlated with depression improvement. We targeted the dorsolateral prefrontal cortex, which exhibits decreased connectivity with the dorsal anterior cingulate in depressed elderly patients and is key in the orchestration of executive function. Our findings are consistent with the conceptualization of depression as a circuit-level disorder affecting interconnected networks involving mood and cognition. Although we demonstrated potential therapeutic effects, the mechanism of action of iTBS remains unknown. We are presently conducting a randomized controlled trial to examine the effects of iTBS on brain connectivity using functional MRI. 
Results of this study, now underway, will hopefully demonstrate engagement of the TMS target and contribute to a neurocircuitry-based approach to the treatment of geriatric depression.
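The pre/post comparisons above use the paired t-test: the test statistic is the mean of the within-subject differences divided by its standard error, with n − 1 degrees of freedom. A minimal sketch (scores below are illustrative, not study data; the p-value would then come from the t distribution, e.g. via scipy.stats):

```python
# Paired t-test statistic for baseline vs. post-treatment scores.
import math

def paired_t(pre, post):
    """Return (t statistic, degrees of freedom) for paired samples."""
    diffs = [a - b for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    # sample SD of the differences (n - 1 in the denominator)
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    return mean / (sd / math.sqrt(n)), n - 1

# illustrative MADRS-style scores for 4 hypothetical subjects
t_stat, df = paired_t([10, 12, 14, 16], [8, 11, 12, 13])
```

Because each subject serves as their own control, this design removes between-subject variability from the comparison, which matters in a sample of only 11.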
3411 Maximizing the Value of Your Trial Innovation Network Hub Liaison Team
- Charlie Gregor, Ann Melvin, Christopher Goss
- Published online by Cambridge University Press: 26 March 2019, p. 48
OBJECTIVES/SPECIFIC AIMS: The University of Washington (UW) CTSA Hub Liaison Team has directed and facilitated the work required to bring TIN multisite trials to the CTSA hub and its affiliates by: (1) connecting hub and affiliate investigators with the services offered by the Trial and Recruitment Innovation Centers; (2) identifying investigators at academic and non-academic institutions to act as co-investigators on multisite trials; (3) supporting the local and affiliate human research protection programs and investigators throughout the life-cycle of the study; and (4) maximizing CTSA and local study team resources to develop and monitor study-specific volunteer recruitment and retention plans. The UW CTSA TIN Hub Liaison Team has worked to achieve these objectives via the following methods, designed for generalization and dissemination. METHODS/STUDY POPULATION: (1) Providing consultations to investigators interested in the services offered by the Trial and Recruitment Innovation Centers. (2) Identifying hub and affiliate investigators at academic and non-academic institutions through a variety of approaches, including the engagement of existing CTSA hub regional collaboration networks, the use of EHR data from CTSA-developed phenotypes, and targeted “Investigator Engagement Packets.” (3) Ensuring regulatory oversight and compliance, which is challenging in the new age of single IRB review: establishing a flexible reliance office, engaging with the central TIN IRBs, and providing guidance and resources to local study teams ensures investigator confidence in the integrity of the protocol approval and study activity processes. (4) Developing a Recruitment and Retention Plan template and holding recruitment and retention planning meetings with the CTSA study teams engaging in TIN studies. 
RESULTS/ANTICIPATED RESULTS: It is anticipated that the Hub Liaison Team will: (1) contribute to the TIN’s process improvement to bring regionally appropriate studies to the CTSA hub and affiliates; (2) identify ideal investigators to engage both in proposal submission and in co-investigating multisite trials; (3) collect, compare, and improve regulatory and contract approval cycle times; and (4) monitor and support screening, accrual, and retention of study volunteers. DISCUSSION/SIGNIFICANCE OF IMPACT: Due to the low prevalence of disease, the challenges of identifying and randomizing study volunteers, and the urgency of addressing clinical and public health issues, multisite study design is an essential option for NCATS. The Trial Innovation Network is an exciting approach to leveraging local and national resources to provide infrastructure that improves the conduct of multisite clinical and observational trials. The University of Washington CTSA hub has developed and piloted methods to achieve the mission of the TIN, by recruiting investigators and realizing trial objectives, with the hope that these methods can be utilized by other CTSA TIN Hub Liaison Teams.
3092 Measuring Fluid Compartments Before and After Rapid Saline Infusion
- Kevin Lawrence Kelly, Alex R. Carlson, Bradley B. Cierzan, Jennifer Isautier, Wayne L. Miller, Bruce D. Johnson
- Published online by Cambridge University Press: 26 March 2019, pp. 48-49
OBJECTIVES/SPECIFIC AIMS: To evaluate the ability of various techniques to track changes in body fluid volumes before and after a rapid infusion of saline. METHODS/STUDY POPULATION: Eight healthy participants (5M; 3F) completed baseline measurements of 1) total body water using ethanol dilution and bioelectrical impedance analysis (BIA) and 2) blood volume, plasma volume, and red blood cell (RBC) volume using the carbon monoxide rebreathe technique and I-131 albumin dilution. Subsequently, 30 mL saline/kg body weight was administered intravenously over 20 minutes, after which BIA and ethanol dilution were repeated. RESULTS/ANTICIPATED RESULTS: On average, 2.29±0.35 L saline was infused with an average increase in net fluid input-output (I/O) of 1.56±0.29 L. BIA underestimated measured I/O by −3.4±7.9%, while ethanol dilution did not demonstrate a measurable change in total body water. Carbon monoxide rebreathe differed from I-131 albumin dilution measurements of blood, plasma, and RBC volumes by +0.6±2.8%, −5.4±3.6%, and +11.0±4.7%, respectively. DISCUSSION/SIGNIFICANCE OF IMPACT: BIA is capable of tracking modest changes in total body water. Carbon monoxide rebreathe appears to be a viable alternative to the I-131 albumin dilution technique to determine blood volume. Together, these two techniques may be useful in monitoring fluid status in patients with impaired fluid regulation.
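The agreement figures above (e.g. "+0.6±2.8%") are per-participant percent differences between a test technique and a reference, summarized as mean ± SD. A sketch of that summary (values are illustrative, not study measurements):

```python
# Per-participant percent difference of a test method against a reference,
# summarized as mean +/- sample SD across participants.
import math

def percent_diff(test, reference):
    """Signed percent difference of each test value vs. its reference."""
    return [100.0 * (t - r) / r for t, r in zip(test, reference)]

def mean_sd(values):
    n = len(values)
    m = sum(values) / n
    sd = math.sqrt(sum((v - m) ** 2 for v in values) / (n - 1))
    return m, sd

# e.g. blood volume (L) by CO rebreathe vs. I-131 albumin dilution
diffs = percent_diff([5.1, 4.8, 5.6], [5.0, 4.9, 5.5])
summary = mean_sd(diffs)
```

A positive mean indicates the test technique reads systematically high against the reference; the SD reflects participant-to-participant disagreement.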
3459 Modeling Emergency Department Length of Stay of Patients With Substance Use Disorder Using an Accelerated Failure Time Model
- Keshab Subedi, Zugui Zhang, Terry Horton, Claudine Jurkovitz
- Published online by Cambridge University Press: 26 March 2019, p. 49
OBJECTIVES/SPECIFIC AIMS: Emergency department (ED) length of stay (LOS) is one of the important indicators of the quality and efficiency of ED service delivery and is reported to be both a cause and a result of ED crowding. Increased ED LOS is associated with ED crowding, increased service costs, and sometimes poor patient outcomes. Substance abuse is one of the major determinants of morbidity, mortality, and healthcare needs. Substance abuse may confound the healthcare and service needs of patients in the ED irrespective of the primary purpose of their ED visit and may lengthen the ED LOS. The aim of this study was to evaluate the effect of patients’ demographic and clinical characteristics, and of different patient-related activities such as screening, brief intervention, and referral to treatment (SBIRT), on the ED LOS of patients discharged from the ED with a diagnosis of substance abuse. METHODS/STUDY POPULATION: We conducted a retrospective analysis of electronic health record data. The study population included 26,971 patients who visited our hospital ED between 2013 and 2017, had a history of substance abuse, and were discharged from the ED. An accelerated failure time (AFT) model was used to analyze the influence of covariates on ED LOS. The predictor factors in the model included age, gender, ED arrival shift and weekday, diagnosis history of mental health issues and drug use, acuity triage level from 1 to 5 (with 1 being the most severe), whether any lab tests were ordered, SBIRT intervention, and whether the patient was homeless. The AFT model is an alternative to the Cox proportional hazards model; it directly models the log of ED LOS as a function of a vector of covariates. The model expresses the increase or decrease in LOS associated with changes in covariate levels as an acceleration factor or time ratio (TR). RESULTS/ANTICIPATED RESULTS: The overall median ED LOS was 4 hours with an IQR of 4.2 hours. 
The average age of the study population was 39.3 years, 58.6% of the patients were male, and 57% were White; 63.4% had a history of drug use, 43% had a history of mental health issues, and 0.4% were homeless. In the analysis using the AFT model, increased age (per year increase, TR = 1.01, p = 0.008), female sex (TR = 1.044, p < 0.001), SBIRT (TR = 1.525, p < 0.001), history of mental health issues (TR = 1.117, p < 0.001), evening arrival (evening vs. night, TR = 1.04, p = 0.006), history of drug use (drug vs. alcohol only, TR = 1.04, p = 0.001), higher acuity (triage level 1 vs. 5, TR = 2.795, p < 0.001), and homelessness (TR = 1.073, p = 0.021) lengthened the ED LOS. In contrast, weekend arrival (TR = 0.956, p = 0.004) and day shift arrival (day vs. night, TR = 0.958, p = 0.004) shortened the ED LOS. DISCUSSION/SIGNIFICANCE OF IMPACT: We identified gender, age, SBIRT, arrival shift, weekend arrival, mental health status, substance abuse, acuity level, and homelessness to be significant predictors of ED LOS. The fact that SBIRT increased the LOS should be balanced against the advantages of engaging patients in substance use disorder treatment. Understanding the determinants of ED LOS in this population may provide useful information for physicians and patients to better anticipate an individual’s LOS and to help administrators plan ED staffing and other resource mobilization.
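In the log-linear AFT model described above, log(LOS) is modeled as a linear function of the covariates, so each coefficient β maps to a time ratio TR = exp(β): the multiplicative change in expected LOS for a one-unit change in that covariate. A sketch of that interpretation (the coefficients and baseline below are illustrative, not the fitted model):

```python
# Time-ratio interpretation of AFT coefficients:
# log(LOS) = X*beta + error  =>  a one-unit covariate change multiplies
# the expected LOS by exp(beta).
import math

def time_ratio(beta):
    return math.exp(beta)

def predicted_los(baseline_hours, betas, covariates):
    """Multiply a baseline LOS by the time ratio of each active covariate."""
    los = baseline_hours
    for beta, x in zip(betas, covariates):
        los *= math.exp(beta * x)
    return los
```

For example, the reported TR of 1.525 for SBIRT corresponds to β = ln(1.525) ≈ 0.42, i.e. roughly a 52.5% longer expected stay with other covariates held fixed; time ratios for several active covariates multiply together.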
3316 Mycoplasma Induced Rash and Mucositis: How Affected Are the Eyes?
- Ramy Rashad, Swapna S. Shanbhag, James Chodosh, Hajirah N. Saeed
- Published online by Cambridge University Press: 26 March 2019, p. 49
OBJECTIVES/SPECIFIC AIMS: To demonstrate the prevalence of ocular complications in patients with Mycoplasma Induced Rash and Mucositis (MIRM). METHODS/STUDY POPULATION: In this retrospective observational study, we identified all patients in our hospital database who were diagnosed with MIRM. Diagnosis was confirmed by clinical information and positive Mycoplasma pneumoniae serology. Only patients with available records including formal ophthalmology consults were included. Clinical and laboratory data were collected from our electronic medical record system to capture key components of the clinical course. RESULTS/ANTICIPATED RESULTS: A total of 12 patients satisfied all inclusion and exclusion criteria and were included in our study. The average age of included patients was 21.2 ± 14.7 years, and the majority were male (66.7% male vs. 33.3% female). Across all 24 eyes, the only acute ocular findings were conjunctival hyperemia (n=20, 83.3%), meibomitis (n=4, 16.7%), and conjunctival epithelial defects (n=1, 4.2%). None of the patients required or were recommended to receive amniotic membrane transplantation in the acute phase. Only 2 patients were followed in the chronic phase, one of whom showed evidence of meibomitis in both eyes. Otherwise, no other chronic complications were seen in either patient with chronic follow-up. DISCUSSION/SIGNIFICANCE OF IMPACT: Ocular complications of MIRM may be much milder than those found in other bullous and inflammatory conditions such as Stevens-Johnson Syndrome or Toxic Epidermal Necrolysis. Understanding MIRM’s specific sequelae is important for understanding disease manifestation and prognosis, in order to better inform acute and chronic management.
3252 Neuroclinical fingerprint of high-risk psychosis
- Keisha Novak, Roman Kotov, Dan Foti
- Published online by Cambridge University Press: 26 March 2019, pp. 49-50
OBJECTIVES/SPECIFIC AIMS: The study aims to utilize event-related potentials (ERPs) coupled with observable reports of symptoms to comprehensively understand the neurological and symptomatic profile of individuals at risk for developing psychosis. The study is a short-term longitudinal design, which allows for examination of the course as well as the structure of illness. The primary outcome is to map known neuroclinical deficits among individuals with schizophrenia onto a high-risk, non-clinical sample. A secondary aim of the study is to demonstrate prediction of symptom severity over time, measured by a combination of ERPs and clinical symptom scores. METHODS/STUDY POPULATION: Participants are pre-screened for eligibility via telephone interview. This process includes administration of the Community Assessment of Psychotic Experiences (CAPE) and the Mini International Neuropsychiatric Interview (MINI). During an in-person lab assessment, participants provide written informed consent and complete a battery of ERP tasks, semi-structured clinical interviews, and self-report questionnaires that assess the presence and severity of sub-threshold psychotic-like experiences. Six months following the laboratory visit, participants are provided a link to the online questionnaires completed during the laboratory visit in order to reassess presence and severity. RESULTS/ANTICIPATED RESULTS: The target number of participants for this study is 60. We hope to recruit individuals who range in symptom severity as measured by the CAPE. It is of interest to determine the relationship between known deficits in individuals with schizophrenia and individuals exhibiting sub-clinical symptoms of psychosis. Additionally, we plan to examine ERPs and symptoms together as a “profile” of high-risk psychosis, yielding more robust information about this population than any one ERP or symptom measure alone. 
The within-subjects design of this study allows for examination of symptom progression and potential prediction of symptoms based on brain activity. Many studies examine only single ERP components, limiting the ability to draw broader conclusions about general cognitive frameworks across populations. We use a combination of well-validated ERPs (e.g., P300, N400, ERN) with behavioral and symptom data in order to predict variation in symptoms over the course of 6 months. The project takes a novel approach to identifying high-risk profiles based on neurophysiological and behavioral data and uses these profiles as a basis for predicting symptom severity across time. DISCUSSION/SIGNIFICANCE OF IMPACT: Individuals endorsing psychotic-like experiences are at heightened risk for developing a psychotic disorder in the future and have been linked with social, behavioral, and emotional risk factors similar to those of schizophrenia. Subjective data (e.g., self-report, interview) shed light on observable symptom manifestation; however, neural measures can detect relatively subtle deficits in information processing that precede and predict overt symptom onset, making them an important methodological complement. Specifically, the extant literature has shown that quantifiable indices of cognitive deficits may represent a vulnerability to psychosis in high-risk populations and can be measured using ERPs. This study integrates a psychophysiological approach by mapping neural deficits documented in schizophrenia onto a high-risk sample.
We will parse heterogeneity within a high-risk group in order to create innovative profiles and potentially predict variation in the course of symptoms. In other words, high-risk individuals may exhibit a “fingerprint” of physiologic aberration that can serve as a biomarker to identify those at risk even before the onset of observable symptoms.
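For readers unfamiliar with ERP quantification: components such as the P300 are commonly measured as the mean baseline-corrected voltage in a fixed post-stimulus window, averaged over trials. A minimal sketch with synthetic epochs; the sampling rate and window values are illustrative, not this study's parameters:

```python
import numpy as np

def mean_amplitude(epochs, sfreq, t0=-0.2, baseline=(-0.2, 0.0), window=(0.3, 0.5)):
    """Trial-average the epochs, subtract the pre-stimulus baseline, and return
    the mean voltage in the measurement window (a common ERP amplitude metric).

    epochs : array (n_trials, n_samples), voltages in microvolts
    sfreq  : sampling rate in Hz
    t0     : time of the first sample relative to stimulus onset (s)
    """
    erp = epochs.mean(axis=0)                      # average across trials
    times = t0 + np.arange(erp.size) / sfreq       # sample times in seconds
    base = erp[(times >= baseline[0]) & (times < baseline[1])].mean()
    erp = erp - base                               # baseline correction
    mask = (times >= window[0]) & (times < window[1])
    return erp[mask].mean()

# Synthetic data: 50 identical epochs with a 5 uV deflection at 300-500 ms
sfreq = 250
times = -0.2 + np.arange(int(0.9 * sfreq)) / sfreq
signal = np.where((times >= 0.3) & (times < 0.5), 5.0, 0.0)
epochs = np.tile(signal, (50, 1))
print(mean_amplitude(epochs, sfreq))  # → 5.0
```

In practice the same window-mean is computed per participant and entered, alongside symptom scores, into the predictive models described above.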
3134 Organophosphate pesticide exposure during pregnancy, gestational weight gain and long-term postpartum weight retention
- Linda G Kahn, Elise M Philips, Michiel A van den Dries, Romy Gaillard, Susana Santos, Kelly Ferguson, Vincent V W Jaddoe, Leonardo Trasande
-
- Published online by Cambridge University Press: 26 March 2019, p. 50
- Open access
OBJECTIVES/SPECIFIC AIMS: Little is known about the effects of potentially obesogenic endocrine disruptors on excessive gestational weight gain (GWG) and postpartum weight retention (PPWR), both of which increase the risk of adverse pregnancy and postnatal outcomes. We explored associations between prenatal organophosphate (OP) pesticide exposure and increased weight both during and after pregnancy. METHODS/STUDY POPULATION: Three dimethyl (DM) and three diethyl (DE) OP metabolites were measured in spot urine samples collected at <18, 18-25, and >25 gestational weeks among 688 participants in the Generation R Study. Metabolite levels were expressed as molar concentration per gram creatinine and log10-transformed. GWG and PPWR were calculated as the difference between weight at each prenatal/postnatal visit or maximum gestational weight and pre-pregnancy weight. In covariate-adjusted regression models, we assessed associations of metabolite concentrations at each prenatal visit and, where appropriate, averaged across pregnancy with early-to-mid pregnancy, mid-to-late pregnancy, late pregnancy-to-maximum, and total GWG; insufficient and excessive GWG according to Institute of Medicine guidelines; and long-term PPWR at 6 and 10 years postpartum. Based on OP pesticides' lipophilicity and association with hypomethylation, we investigated interactions with pre-pregnancy body mass index, periconceptional folic acid supplementation, and breastfeeding duration. RESULTS/ANTICIPATED RESULTS: A 10-fold increase in late-pregnancy DE metabolite concentration was associated with 1.34 kg [95% confidence interval: 0.55, 2.12] higher late pregnancy-to-maximum GWG. A 10-fold increase in mean DE metabolite concentration across pregnancy was associated with 2.41 kg [0.62, 4.20] lower PPWR at 6 years.
Stratified analysis suggested that the prenatal finding was driven by women with pre-pregnancy BMI ≥25 kg/m2, while the postnatal finding was driven by women with pre-pregnancy BMI <25 kg/m2 and with inadequate folic acid supplementation. We found no associations between OP pesticide metabolites and insufficient or excessive weight gain and no interaction with breastfeeding. DISCUSSION/SIGNIFICANCE OF IMPACT: In this longitudinal analysis, we observed a positive association of OP pesticide metabolites with GWG in late pregnancy among overweight/obese women, potentially reflecting inhibition of OP pesticide detoxification by oxidative stress. Postnatally, under/normal weight women with higher OP pesticide metabolites had lower PPWR, possibly due to better metabolic function and a more healthful diet. These results suggest that there may be a critical period during the late phase of pregnancy when OP pesticide exposure may increase GWG, and this association may be amplified in overweight/obese women. Areas for future research include examination of how the interaction between OP pesticides and polymorphisms of the paraoxonase (PON1) gene, which detoxifies OP pesticides, affects GWG/PPWR; exploration of the interplay among maternal pre-pregnancy BMI, oxidative stress, and PON1 levels; and characterization of the variability of OP pesticide exposure across pregnancy using more frequent repeated urine samples.
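Because the exposures were log10-transformed, each regression coefficient is the change in outcome per 10-fold increase in metabolite concentration; that is how the estimates above should be read. A minimal sketch of the creatinine normalization and that interpretation, using the reported 1.34 kg point estimate (the urine concentrations below are hypothetical):

```python
import math

def log10_metabolite(conc_nmol_per_L, creatinine_g_per_L):
    """Express a urinary metabolite as molar concentration per gram creatinine,
    then log10-transform, as described for the DM/DE metabolites above."""
    return math.log10(conc_nmol_per_L / creatinine_g_per_L)

# With a log10-transformed exposure, the coefficient applies per log10 unit,
# i.e., per 10-fold increase in concentration.
beta = 1.34  # kg GWG per 10-fold increase (reported late-pregnancy DE estimate)
delta = log10_metabolite(500.0, 1.0) - log10_metabolite(50.0, 1.0)  # exactly 1.0
print(round(beta * delta, 2))  # → 1.34 kg predicted difference in GWG
```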
3318 Phosphorus Absorption in Healthy Adults and in Patients with Moderate Chronic Kidney Disease
- Elizabeth Stremke, Gretchen Wiese, Amy Wright, Sharon Moe, Ranjani Moorthi, Kathleen Hill Gallant
-
- Published online by Cambridge University Press: 26 March 2019, p. 51
- Open access
OBJECTIVES/SPECIFIC AIMS: The aim of this study is to compare intestinal phosphorus absorption in healthy adults and patients with moderate-stage chronic kidney disease (CKD) in the context of a controlled feeding study. METHODS/STUDY POPULATION: Participants are 30-75 years old and include 10 healthy subjects and 10 moderate-stage CKD patients. Each subject will be enrolled in a 9-day study period including 7 days of controlled feeding of a 1500 mg phosphorus diet. Following the controlled feeding, two days of absorption tests (oral and IV) will take place, utilizing radioisotopic phosphorus to calculate fractional absorption efficiency. RESULTS/ANTICIPATED RESULTS: Current enrollment has produced 7 matched subject pairs (current n = 14/20). Four of the 7 completed pairs are female and 3 of 7 are black. Preliminary kinetic modeling data from the first enrolled subject, a moderate CKD patient, show a fractional absorption of 0.375. With forthcoming analyses, we expect that this fractional absorption result will not be statistically different from that of this subject's matched pair, nor will each group's average absorption differ from the other's. Additionally, we expect absorption to be maintained even with changes in secondary outcome measures in serum (FGF23, 1,25-dihydroxyvitamin D, parathyroid hormone, and total phosphorus) in CKD patients. DISCUSSION/SIGNIFICANCE OF IMPACT: A lack of statistical difference in fractional phosphorus absorption between groups would support that intestinal phosphorus absorption is inappropriately normal in CKD patients compared to healthy adults, despite evidence of abnormal phosphorus homeostatic mechanisms. Future studies will consider the effect of dietary phosphorus restriction, the most common nutrition intervention in moderate-stage CKD, on fractional absorption efficiency in CKD.
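The abstract does not specify the kinetic calculation, but one standard way to estimate fractional absorption from paired oral and IV tracer tests is the dose-normalized AUC ratio used for absolute bioavailability. A sketch under that assumption, with hypothetical tracer concentration-time curves:

```python
def trapezoid_auc(times, conc):
    """Area under a concentration-time curve by the trapezoidal rule."""
    return sum((t2 - t1) * (c1 + c2) / 2.0
               for (t1, c1), (t2, c2) in zip(zip(times, conc),
                                             zip(times[1:], conc[1:])))

def fractional_absorption(t_oral, c_oral, dose_oral, t_iv, c_iv, dose_iv):
    """F = (AUC_oral/dose_oral) / (AUC_iv/dose_iv): the dose-normalized AUC
    ratio for absolute bioavailability from paired oral and IV tracer data."""
    return ((trapezoid_auc(t_oral, c_oral) / dose_oral)
            / (trapezoid_auc(t_iv, c_iv) / dose_iv))

# Hypothetical tracer curves (arbitrary units); the oral curve is scaled so
# its AUC is 37.5% of the IV AUC, matching the first subject's reported value.
t = [0, 1, 2, 4, 8]
iv = [100, 80, 60, 40, 20]
oral = [x * 0.375 for x in iv]
print(fractional_absorption(t, oral, 1.0, t, iv, 1.0))  # → 0.375
```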
3121 Potentially traumatic events and its outcomes among help-seeking adults in Puerto Rico
- Marie Torres, Alfonso Martinez-Taboas, Coralee Perez-Pedrogo, Marisol Pena-Orellana
-
- Published online by Cambridge University Press: 26 March 2019, p. 51
- Open access
OBJECTIVES/SPECIFIC AIMS: This study aims to evaluate potentially traumatic events (PTEs) and their relationship with posttraumatic stress symptoms (PTSS), posttraumatic growth (PTG), and resilience in a sample of help-seeking individuals in Puerto Rico. METHODS/STUDY POPULATION: This is an analytic, cross-sectional design. Adults receiving health services will participate in the study. Recruited participants will provide informed consent during a visit to a community mental health clinic or community hospital. They will complete a demographic form and four retrospective questionnaires about the study variables. RESULTS/ANTICIPATED RESULTS: We expect that a high rate of PTEs is associated with an increased rate of PTSS, that a high rate of PTSS is associated with an increased rate of PTG, and that a high rate of resilience is associated with low rates of PTSS and PTG. DISCUSSION/SIGNIFICANCE OF IMPACT: This is a first step in the development of effective, clearly targeted interventions, specifically designed to treat negative effects and also to facilitate positive change and resilience after PTE exposure.
3049 Prenatal care as a protective factor for preterm birth and smoking during pregnancy in nulliparous patients: a propensity score analysis
- Alexandra Noel Houston-Ludlam, Alison G. Cahill, Kathleen K. Bucholz, Andrew C. Heath
-
- Published online by Cambridge University Press: 26 March 2019, p. 51
- Open access
OBJECTIVES/SPECIFIC AIMS: Preterm birth rates have been rising in the United States, and reducing preterm birth is a high-priority clinical and public health concern. There are no existing strategies to reduce preterm birth in nulliparous individuals. The present study aims to evaluate prenatal care as a protective factor against preterm birth in this population. METHODS/STUDY POPULATION: Missouri birth record data for child birth years 1993-2016 were used to create a sample of 325,088 singleton births to nulliparous women, themselves born in Missouri 1975-1985. Logistic regressions, stratified by maternal race (White, African-American, Asian, American Indian/Alaskan Native, Other), were used to predict preterm birth (<37 weeks gestational age) as a function of (1) initiation of prenatal care by the end of the first trimester and (2) the Adequacy of Prenatal Care Utilization Index, with sociodemographic covariates of child birth year, maternal age, highest educational level, and marital status (a four-level variable combining married yes/no and partner named on birth record yes/no). Subsequent analyses will use this logistic regression to create a propensity score predicting smoking during pregnancy from birth-record parental sociodemographic characteristics, stratified by maternal race. Primary analyses will focus on the role of prenatal care in predicting smoking during pregnancy and preterm birth risk within propensity score strata. Secondary analyses will consider the role of other risk factors, including maternal pre-pregnancy BMI and maternal DUI history, on preterm birth risk. RESULTS/ANTICIPATED RESULTS: Preliminary logistic regressions predicting preterm birth were analyzed, stratified by maternal race. In White mothers, preterm birth prevalence was 8.2%, and risk was significantly increased by maternal age ≤ 15 or ≥ 31, by being unmarried, and by receiving no prenatal care, yet was unaffected by the timing of prenatal care initiation.
For African-American mothers, preterm birth prevalence was 11.9%, and risk was significantly increased by being unmarried, by not initiating prenatal care by the end of the first trimester, and by receiving no prenatal care. Preliminary samples were too small to support solid inferences for other races. Anticipated results are that, after propensity score matching, earlier initiation of prenatal care will show a modest protective effect on preterm birth, but other characteristics, such as maternal cigarette smoking during pregnancy and DUI status, will show stronger effects in predicting preterm birth risk. DISCUSSION/SIGNIFICANCE OF IMPACT: By evaluating the role of prenatal care initiation and delivery in preterm birth, this work provides an evidence base for prenatal care schedules and for understanding the interplay of sociodemographics, healthcare delivery, and individual characteristics in preterm birth risk, with the potential to reduce negative health outcomes.
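The propensity-score workflow described above (a logistic model for smoking during pregnancy fit from sociodemographic covariates, followed by stratification) can be sketched as follows. This is an illustrative stand-in using a plain gradient-descent logistic fit on toy data, not the study's actual registry analysis, which would use a standard statistics package:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=2000):
    """Plain gradient-descent logistic regression (illustrative only)."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])  # add intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))          # predicted probabilities
        w -= lr * Xb.T @ (p - y) / len(y)          # gradient of the NLL
    return w

def propensity_strata(X, y_treatment, n_strata=5):
    """Model the propensity of 'treatment' (here, smoking during pregnancy)
    from covariates, then assign each subject to a quantile-based stratum."""
    w = fit_logistic(X, y_treatment)
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    ps = 1.0 / (1.0 + np.exp(-Xb @ w))             # propensity scores
    edges = np.quantile(ps, np.linspace(0, 1, n_strata + 1)[1:-1])
    return ps, np.searchsorted(edges, ps)          # stratum index 0..n_strata-1

# Toy data: one covariate that raises the probability of "treatment"
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 1))
y = (rng.random(500) < 1 / (1 + np.exp(-2 * X[:, 0]))).astype(float)
ps, strata = propensity_strata(X, y)
print(strata.min(), strata.max())  # → 0 4
```

Outcomes (e.g., preterm birth) are then compared within each stratum, so that subjects with similar propensities are contrasted with one another.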
3046 Reduced structural and functional connectivity in infants with prenatal opioid exposure
- Stephanie Merhar, Adebayo Braimah, Traci Beiersdorfer, Brenda Poindexter, Nehal Parikh
-
- Published online by Cambridge University Press: 26 March 2019, p. 52
- Open access
OBJECTIVES/SPECIFIC AIMS: This study aims to understand the effects of prenatal opioid exposure on structural and functional connectivity in the neonatal brain. Our central hypothesis is that infants with prenatal opioid exposure will have decreased structural and functional connectivity compared to non-exposed controls. Our overarching goal is to improve neurodevelopmental and behavioral outcomes in infants with prenatal opioid exposure. METHODS/STUDY POPULATION: Infants with prenatal opioid exposure were recruited from 2 birth hospitals in our area. Control infants were recruited from the larger community. Infants underwent MRI at 4-6 weeks of age in the Cincinnati Children's Hospital Imaging Research Center. MRI sequences included 3D structural T1- and T2-weighted imaging, resting-state functional connectivity MRI, and multi-shell DTI (36 directions at b=800 and 68 directions at b=2000). Tract-based spatial statistics (TBSS) was used to identify differences in fractional anisotropy (a measure of white matter integrity) between groups. Group independent component analysis was used to identify differences in resting-state networks between groups. RESULTS/ANTICIPATED RESULTS: There were 5 subjects enrolled in the study with evaluable imaging: 3 infants with prenatal opioid exposure and 2 unexposed controls. Structural MRI was normal in all cases. Infants with prenatal opioid exposure had reduced structural connectivity, as measured by fractional anisotropy (FA), in the genu and splenium of the corpus callosum compared with controls. Infants with prenatal opioid exposure also had significantly reduced within-network functional connectivity strength (z-transformed partial correlation coefficient 0.358 vs 0.199, p = 0.03) in the sensorimotor network compared with controls.
DISCUSSION/SIGNIFICANCE OF IMPACT: In this small pilot study, both structural and functional connectivity were reduced in opioid-exposed infants compared with controls. These data suggest that differences in structural and functional connectivity may underlie the later developmental and behavioral problems seen in opioid-exposed children. These findings must be validated in a larger population with correction for confounding factors such as maternal education.
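Fractional anisotropy, the TBSS measure used above, is computed voxelwise from the three eigenvalues of the diffusion tensor via the standard formula; a brief sketch:

```python
import math

def fractional_anisotropy(l1, l2, l3):
    """FA from the diffusion-tensor eigenvalues (standard formula):
    FA = sqrt(1/2) * sqrt((l1-l2)^2 + (l2-l3)^2 + (l3-l1)^2)
                   / sqrt(l1^2 + l2^2 + l3^2)
    FA is 0 for perfectly isotropic diffusion and approaches 1 when diffusion
    is restricted to one direction, as in coherent white-matter tracts."""
    num = math.sqrt((l1 - l2) ** 2 + (l2 - l3) ** 2 + (l3 - l1) ** 2)
    den = math.sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)
    return math.sqrt(0.5) * num / den

print(round(fractional_anisotropy(1.0, 1.0, 1.0), 3))  # → 0.0 (isotropic)
print(round(fractional_anisotropy(1.7, 0.3, 0.3), 3))  # → 0.799 (anisotropic)
```

Lower FA along a tract, as reported for the corpus callosum here, indicates less directionally coherent diffusion and is interpreted as reduced white-matter integrity.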
3157 Relationship between abnormal nocturnal blood pressure patterns and end-organ damage following heart transplantation.
- Kris Oreschak, Eugene E. Wolfel, Amrut V. Ambardekar, Christina L. Aquilante
-
- Published online by Cambridge University Press: 26 March 2019, p. 52
- Open access
OBJECTIVES/SPECIFIC AIMS: Heart transplant (HTx) recipients are more likely to exhibit abnormal circadian blood pressure (BP) patterns (e.g., lack of a nocturnal dip in BP) compared with the general population. Our goal was to assess the relationship between abnormal circadian BP patterns and end-organ damage in HTx recipients. METHODS/STUDY POPULATION: This retrospective study included 30 patients who were ≥ 6 months post-heart transplant and had 24-hour ambulatory BP data collected during a parent study. Nocturnal BP decline was categorized as: ≥10% decline, dipper; <10% decline, non-dipper. The primary end-organ damage outcomes we plan to analyze are left ventricular hypertrophy (LVH), chronic kidney disease (CKD), and proteinuria. The association between nocturnal BP decline and the primary outcomes will be analyzed using logistic regression. RESULTS/ANTICIPATED RESULTS: The study cohort consists of 83% men and 83% Caucasians (mean age = 57 ± 14 years; mean time post-transplant = 9.0 ± 6.6 years). Systolic and diastolic non-dippers represent 53.3% and 40% of the cohort, respectively. Data are currently being analyzed for the association between nocturnal BP dipping status and LVH, CKD, and proteinuria; these findings will be presented at the conference. DISCUSSION/SIGNIFICANCE OF IMPACT: An understanding of factors, such as abnormal circadian BP patterns, that contribute to the development of end-organ damage following HTx may provide opportunities to improve BP management and prevent adverse complications in this high-risk population.
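The dipper/non-dipper classification above is a simple threshold rule on the percent nocturnal decline in mean BP. A minimal sketch, using illustrative systolic readings rather than the study's ambulatory BP data:

```python
def dipping_status(daytime_sbp, nighttime_sbp):
    """Classify nocturnal blood-pressure dipping from 24-hour ambulatory data,
    using the cutoff in the abstract: >=10% nocturnal decline = dipper."""
    day = sum(daytime_sbp) / len(daytime_sbp)       # mean daytime BP
    night = sum(nighttime_sbp) / len(nighttime_sbp) # mean nighttime BP
    decline_pct = 100.0 * (day - night) / day       # percent nocturnal decline
    label = "dipper" if decline_pct >= 10.0 else "non-dipper"
    return label, round(decline_pct, 1)

print(dipping_status([130, 135, 128], [112, 110, 114]))  # → ('dipper', 14.5)
print(dipping_status([130, 135, 128], [126, 124, 128]))  # → ('non-dipper', 3.8)
```

The same rule is applied separately to systolic and diastolic series, which is why the abstract reports distinct systolic and diastolic non-dipper proportions.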
3547 Relationship between smoking and alcohol use status: variations in candidate genes associated with addiction and successful quitting smoking
- Magda Shaheen, Amira Brown, Deyu Pan, Katrina Schrode
-
- Published online by Cambridge University Press: 26 March 2019, p. 52
- Open access
OBJECTIVES/SPECIFIC AIMS: Previous studies showed that 52% of smokers were unsuccessful in quitting smoking. Smoking among alcoholics is 2-3 times that of the general population, with 50%-80% of alcoholics smoking regularly. Studies have linked several genetic variants to addiction. We examined the relation between successful quitting of smoking, alcohol use, and genetic data for CYP2A6, CYP2B6, DRD2, DRD1, and GABRB1 alleles. METHODS/STUDY POPULATION: We analyzed data from NHANES III 1988-1994 on socioeconomic factors, physical activity, body mass index (BMI), alcohol status, successful quitting of smoking, and genetic data for CYP2A6, CYP2B6, DRD2, DRD1, and GABRB1 alleles. Multivariate logistic regression was used to examine the association between successful quitting and genotypes, adjusting for the other variables. Data were analyzed using SAS version 9.3, accounting for the survey design and weights. RESULTS/ANTICIPATED RESULTS: Of the 2,269 smokers, 57% were current smokers, 35% were heavy drinkers, 24% were both current smokers and heavy drinkers, and 41% had successfully quit smoking. Successful quitting was associated with CYP2A6 (rs28399433-TG) (adjusted odds ratio (AOR) = 3.6, 95% confidence interval (CI) = 1.1-11.9, p = 0.03), CYP2B6 (rs2279343-AA and AG) (AOR = 2.3, 95% CI = 1.5-3.5, p = 0.0003 for AA; AOR = 2.3, 95% CI = 1.2-4.2, p = 0.01 for AG), and DRD1 (rs4532-AA) (AOR = 2.2, 95% CI = 1.01-4.6, p = 0.04). Among heavy drinkers, those with CYP2A6 (rs28399433-TG) and CYP2B6 (rs2279343-AA and AG) were more likely to successfully quit smoking, and those with CYP2A6 (rs5031017-GG) and GABRB1 (rs1442099-CC) were less likely to successfully quit smoking (p<0.05). DISCUSSION/SIGNIFICANCE OF IMPACT: We conclude that while rs28399433-TG and rs2279343-AA and AG positively affected success in quitting smoking, rs5031017-GG and rs1442099-CC negatively affected success in quitting, both overall and specifically among smokers who are heavy drinkers.
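Adjusted odds ratios such as the AORs reported above come from exponentiating logistic-regression coefficients. A minimal sketch of that conversion; the coefficient and standard error below are hypothetical, not taken from this study:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error into an
    odds ratio with a Wald 95% confidence interval:
    OR = exp(beta), CI = exp(beta +/- z*se)."""
    return math.exp(beta), (math.exp(beta - z * se), math.exp(beta + z * se))

# Hypothetical genotype coefficient: beta = 0.83, SE = 0.21
or_, (lo, hi) = odds_ratio_ci(0.83, 0.21)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # → 2.29 1.52 3.46
```

An OR above 1 with a CI excluding 1 (as for the CYP2A6 and CYP2B6 variants above) indicates a genotype associated with greater odds of successful quitting after covariate adjustment.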
3154 Resolution of right atrial congestion before LVAD implantation is associated with improved outcomes
- Gaurav Gulati, Nilay Sutaria, Amanda Vest, David DeNofrio, Masashi Kawabori, Gregory Couper, Michael Kiernan
-
- Published online by Cambridge University Press: 26 March 2019, p. 53
- Open access
OBJECTIVES/SPECIFIC AIMS: Increased right atrial pressure (RAP) is a known predictor of poor outcomes after LVAD implantation. Whether resolution of right heart congestion prior to LVAD implantation is associated with more favorable outcomes is not well understood. METHODS/STUDY POPULATION: We analyzed LVAD recipients at our institution from 1/1/2015 to 2/28/2018, excluding patients bridged to LVAD with ECMO support. Patients with both admission RAP (RAPadmit) and implant RAP (RAPimplant) ≥ 14 mmHg were defined as having persistent congestion, while patients with RAPadmit ≥ 14 mmHg and RAPimplant < 14 mmHg were defined as having resolved congestion. Baseline characteristics between groups were compared using the Chi-square and unpaired t-tests. Time to death or RVAD implantation was compared between groups using Cox proportional hazards models. RESULTS/ANTICIPATED RESULTS: Of 57 LVAD recipients with RAPadmit ≥ 14 mmHg, 14 (25%) had persistent congestion at the time of LVAD implantation. While there were no statistically significant differences between groups, patients with persistent congestion were more likely to be INTERMACS profile 1 (21.4% vs 9.5%), less likely to have a destination therapy device strategy (28.6% vs 34.9%), less likely to have moderate or severe right ventricular (RV) dysfunction (64.3% vs 83.7%), and had similar RAPadmit (20.4 mmHg vs 18.9 mmHg) compared to patients with resolved congestion. Median follow-up was 307 days. Patients with persistent congestion had a higher frequency of death or RVAD implantation compared to those with resolved congestion (50% vs 14%, HR 3.75, 95% CI 1.25-11.25, p=0.02). DISCUSSION/SIGNIFICANCE OF IMPACT: Among patients with elevated RAP at admission, those with persistently elevated RAP at the time of LVAD implantation had worse outcomes than patients who were decongested prior to surgery. These data support optimization of RV filling pressures prior to LVAD surgery.
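The persistent/resolved grouping above is a threshold rule on paired RAP measurements; a minimal sketch with hypothetical patients:

```python
def congestion_group(rap_admit, rap_implant, cutoff=14.0):
    """Apply the grouping rule from the abstract: among patients admitted
    congested (RAP at admission >= 14 mmHg), congestion is 'persistent' if RAP
    at LVAD implant is still >= 14 mmHg and 'resolved' if it has fallen below.
    Patients not congested at admission fall outside the study cohort."""
    if rap_admit < cutoff:
        return "not congested at admission"
    return "persistent" if rap_implant >= cutoff else "resolved"

# Hypothetical (admission RAP, implant RAP) pairs in mmHg
patients = [(20, 18), (19, 10), (22, 14), (12, 9)]
print([congestion_group(a, i) for a, i in patients])
# → ['persistent', 'resolved', 'persistent', 'not congested at admission']
```

Group membership defined this way is then the exposure in the Cox proportional hazards comparison of time to death or RVAD implantation.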