3402 A High-Impact, Structured, Collaborative Approach to Implementing and Utilizing the Research Performance Progress Report (RPPR) for a Clinical and Translational Science Award
- Boris Volkov, Jennifer Cieslak, Rachel Matthes, Christopher Pulley
-
- Journal:
- Journal of Clinical and Translational Science / Volume 3 / Issue s1 / March 2019
- Published online by Cambridge University Press:
- 26 March 2019, pp. 137-138
-
- Article
-
- Open access
-
OBJECTIVES/SPECIFIC AIMS: This presentation will highlight a structured, collaborative approach to implementing and utilizing the RPPR process created at the University of Minnesota CTSI in response to the need to enhance the quality, efficiency, consistency, and utilization of annual program reporting. The approach is in line with NCATS’s strategic objective that encourages all CTS organizations to “disseminate research results and best practices broadly, and promote a culture of openness, sharing and transparency” (NCATS, 2016, p. 19). Program activities that support translational processes and contribute to clinical outcomes are complex, nonlinear, and multidisciplinary (Smith et al., 2017). In this complex context, the meaningful engagement and reflection of program staff and collaborators is essential for all aspects of program planning, implementation, reporting, and dissemination. The University of Minnesota CTSI’s key objectives, goals, and uses of the RPPR are as follows:
- Develop, align, and leverage the RPPR to fulfill the accountability requirements, needs, and expectations of multiple stakeholders: NIH/NCATS, the Internal and External Advisory Boards, the campus/hub, and program staff and collaborators.
- Engage the CTSA staff and collaborators as a team in multiple aspects of program reporting.
- Inform strategic management, continuous improvement, monitoring and evaluation, organizational learning, and dissemination to program stakeholders.
- Translate the reported information into practical, evidence-based issues and strategic questions for leadership discussions and advisory board consultations, actionable work plans, communication to stakeholders, organizational learning, and the translational science knowledge base.
METHODS/STUDY POPULATION: A case study of the programmatic/evaluative and methodological approach/technique development that resulted in a formal, structured, collaborative, transparent process with detailed guidelines, templates, and timelines. The process and content for reporting have been developed via a variety of methods and sources: specific funder (NIH) requirements, Huddle meetings, document/content/database analysis, reflection meetings with component staff, informal conversations, and observations. Preparation for the report began almost one year in advance, including careful analysis of the report requirements and development of user-friendly, detailed guidelines, templates, and examples. The guide templates and worksheets were created as a result of time spent navigating current instructions provided by NIH and NCATS. A timeline/project plan was developed with start and end dates for all of the moving parts, along with identified responsible personnel for each of the tasks. A grid of the grant components and responsible personnel was designed to highlight the matrixed organization of the grant and the need to work across components to create single reports. The RPPR key categories have also been considered for incorporation and tracking in a program activity/customer tracking system for ongoing data management and use. As a complex translational science program, UMN CTSI has multiple initiatives, variables, and metrics to report. The program staff has been deeply engaged in evaluative reflection to identify, prioritize, and incorporate into the RPPR the metrics that are most useful for managing and describing CTSI processes, participation, products, and outcomes. Program components responded differently to the collaborative approach implemented.
The M&E technical assistance was implemented in three different ways: components either completed the M&E RPPR template themselves, with minimal M&E team assistance; responded to comments and information provided by the M&E team as a first step; or requested a significant level of assistance from M&E. Participants/partners in developing and using the RPPR include CTSI program leadership and staff, administration, communication staff, the M&E team, and our collaborators. RESULTS/ANTICIPATED RESULTS: The proposed comprehensive approach to annual program performance reporting shows sound promise to enhance program staff engagement, report utilization, learning, strategic management, self-evaluation capacity, and continuous improvement within a clinical and translational science organization. DISCUSSION/SIGNIFICANCE OF IMPACT: This structured approach’s impact is significant in that it fills a current gap in practice, literature, and methodology and offers a practical example of a “practice that works” for CTR (and other) organizations and programs striving to improve their reporting practices, staff engagement, learning, and program impact. Leveraging and synergizing the RPPR requirements and other complex, data-demanding obligations and needs can help CTS programs move beyond the once-a-year compilation of project accomplishments and challenges to developing and sharing a thoughtful translational science program success story. References: National Center for Advancing Translational Sciences. (2016). NCATS Strategic Plan. NIH. Available at: https://ncats.nih.gov/strategicplan Smith, C., Baveja, R., Grieb, T., & Mashour, G. (2017). Toward a science of translational science. Journal of Clinical and Translational Science, 1(4), 253-255. doi: 10.1017/cts.2017.14
2360 Engaging, capturing, and integrating the voice of the customer and collaborator in a clinical and translational science program
- Boris Volkov, Jennifer Cieslak, Brook Matthiesen
-
- Journal:
- Journal of Clinical and Translational Science / Volume 2 / Issue S1 / June 2018
- Published online by Cambridge University Press:
- 21 November 2018, pp. 69-70
-
- Article
-
- Open access
-
OBJECTIVES/SPECIFIC AIMS: This presentation will highlight the framework, domains, and approaches of the “Engaging the Voice of the CTS Customer and Collaborator System” created at the University of Minnesota Clinical and Translational Science Institute (CTSI) in response to the need to improve stakeholder engagement and the quality, efficiency, consistency, and transparency of clinical and translational work. This system addresses three important results-based accountability measures/questions: “What should we do?”, “How well did we do it?”, and “Is anyone better off?”. According to Woolf (2008), “translational research means different things to different people.” Social networks and systems that support translational processes and outcomes are complex, nonlinear, and multidisciplinary (Smith et al., 2017). In this highly uncertain and fluid context, the input of program stakeholders is paramount to moving translation forward. The NCATS Strategic Plan (2016) directs grantees to engage patients, community members, and nonprofit organizations meaningfully in translational science and all aspects of translational research. Engagement of stakeholders throughout the lifecycle of a translational research project ensures that the project’s processes and outcomes are relevant to and directly address their needs and will be more readily adopted by the community. A “customer” (other terms include beneficiary, collaborator, client, community, consumer, and service user) is a person, organization, or entity that directly benefits from service delivery or a program (Friedman, 2005). Customers can be direct or indirect, primary or secondary, and internal or external.
Our analysis of CTS stakeholders (“Who are our customers/collaborators?”) produced the following list of customers and collaborators: researchers, University departments, the translational science workforce, patients, community members and entities, nonprofit organizations, industry collaborators, NCATS/NIH, CTSA hub partners, and CTSI staff. The “Voice of the Customer” (VOC) is the term used to describe the stated and unstated needs or requirements of the program’s customer; it also names the process used to capture feedback from customers (internal or external) in order to provide them with the best quality of service, support, and/or product. This process is about being proactive and constantly innovative in capturing customers’ changing needs over time. Related to the VOC is the concept of user innovation, which refers to innovations developed by consumers and end users. Experience shows that sometimes the best product or process concept idea comes from a customer (Yang, 2007, p. 20). Capturing and utilizing such ideas is also relevant to the VOC and can be operationalized and implemented as a valuable strategy. The University of Minnesota CTSI’s key objectives, goals, and uses of engaging the VOC and collaborator are as follows: (1) Engage CTSA customers (“relevant stakeholders”) in multiple aspects of translational science and look for opportunities to include their perspective (per NCATS strategic principles). (2) Inform continuous improvement, strategic management, and M&E efforts; the identification of customer needs and wants; comprehensive problem definition and ideation; and new concept development and optimization. (3) Synergize NCATS and partner expectations and campus/hub needs. (4) Translate the VOC into functional and measurable service requirements. METHODS/STUDY POPULATION: A case study of the programmatic and methodological approach/technique development.
The VOC at the UMN CTSI has been captured in a variety of ways: regular and ad hoc surveys, interviews, focus groups, Engagement Studios, a formal call for patient/community ideas and proposals, informal conversations, customer/community membership and participation in the Advisory Boards and Executive Leadership Team meetings, and observations. Our VOC variables and metrics assess customer needs, wants, knowledge, and skills; customer satisfaction with processes and outcomes; and customer ideas for improvement and innovation. The ensuing customer feedback and other data have been used to identify and incorporate the important attributes needed in CTSI processes, products, and dissemination. UMN CTSI partners in engaging and capturing the VOC include our past, current, and potential customers and collaborators, communities, program staff and service providers, program administration, communication staff, the M&E team, and internal and external data collectors. RESULTS/ANTICIPATED RESULTS: The proposed comprehensive approach shows sound promise to enhance customer and collaborator engagement, critical thinking, learning, strategic management, evaluation capacity, and improvement within clinical and translational science organizations. DISCUSSION/SIGNIFICANCE OF IMPACT: This structured approach’s impact is significant in that it fills a current gap in practice, literature, and methodology and offers a practical example of a “practice that works” for CTR (and other) organizations and programs striving to improve their stakeholder engagement and program impact. Leveraging and synergizing the VOC and community engagement approaches can help CTS organizations advance beyond capturing individual project/service experiences to drawing a holistic portrait of an institution-level (and, potentially, a nation-level) translational science program.
References
Friedman M. Trying Hard Is Not Good Enough: How to Produce Measurable Improvements for Customers and Communities. Trafford, 2005.
National Center for Advancing Translational Sciences. NCATS Strategic Plan [Internet], 2016. NIH (https://ncats.nih.gov/strategicplan)
Smith C, et al. Toward a science of translational science. Journal of Clinical and Translational Science 2017; 1: 253–255.
Woolf SH. The meaning of translational research and why it matters. JAMA 2008; 299: 211–213.
Yang K. Voice of the Customer: Capture and Analysis. McGraw-Hill Professional, 2007.
Statewide Validation of Hospital-Reported Central Line–Associated Bloodstream Infections: Oregon, 2009
- John Y. Oh, Margaret C. Cunningham, Zintars G. Beldavs, Jennifer Tujo, Stephen W. Moore, Ann R. Thomas, Paul R. Cieslak
-
- Journal:
- Infection Control & Hospital Epidemiology / Volume 33 / Issue 5 / May 2012
- Published online by Cambridge University Press:
- 02 January 2015, pp. 439-445
- Print publication:
- May 2012
-
- Article
-
Background.
Mandatory reporting of healthcare-associated infections is common, but underreporting by hospitals limits meaningful interpretation.
Objective. To validate mandatory intensive care unit (ICU) central line–associated bloodstream infection (CLABSI) reporting by Oregon hospitals.
Design. Blinded comparison of ICU CLABSI determination by hospitals and health department–based external reviewers, with group adjudication.
Setting. Forty-four Oregon hospitals required by state law to report ICU CLABSIs.
Participants. Seventy-six patients with ICU CLABSIs and a systematic sample of 741 other patients with ICU-related bacteremia episodes.
Methods. External reviewers examined medical records and determined CLABSI status. All cases with CLABSI determinations discordant from hospital reporting were adjudicated through formal discussion with hospital staff, a process novel to validation of CLABSI reporting.
Results. Hospital representatives and external reviewers agreed on CLABSI status in 782 (96%) of 817 bacteremia episodes (κ = 0.77 [95% confidence interval (CI), 0.70–0.84]). Among the 27 episodes identified as CLABSIs by external reviewers but not reported by hospitals, the final status was CLABSI in 16 (59%). The measured sensitivities of hospital ICU CLABSI reporting were 72% (95% CI, 62%–81%) with adjudicated CLABSI determination as the reference standard and 60% (95% CI, 51%–69%) with external review alone as the reference standard (P = .07). Validation increased the statewide ICU CLABSI rate from 1.21 (95% CI, 0.95–1.51) to 1.54 (95% CI, 1.25–1.88) CLABSIs/1,000 central line–days; ICU CLABSI rates increased by more than 1.00 CLABSI/1,000 central line–days in 6 (14%) hospitals.
Conclusions. Validating hospital CLABSI reporting improves the accuracy of hospital-based CLABSI surveillance. Discussing discordant findings improves the quality of validation.
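The agreement and sensitivity statistics quoted in this abstract can be sanity-checked with a short, self-contained sketch. This is not the authors' analysis code: the 782/817 raw-agreement count is taken from the abstract, while the 2×2 kappa table and the Wilson-interval counts below are hypothetical stand-ins, since the abstract does not report the full cell counts.

```python
# Illustrative sketch only (not the authors' code): compute the kinds of
# statistics reported in the abstract: raw agreement, Cohen's kappa for a
# 2x2 rater table, and a Wilson score CI for a sensitivity proportion.
import math

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

def cohens_kappa(a: int, b: int, c: int, d: int) -> float:
    """Cohen's kappa for a 2x2 agreement table:
    a = both raters call CLABSI, b = only rater 1 does,
    c = only rater 2 does, d = both call no CLABSI."""
    n = a + b + c + d
    p_obs = (a + d) / n                                     # observed agreement
    p_exp = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2  # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

# Raw agreement reported in the abstract: 782 of 817 episodes.
print(f"raw agreement: {782 / 817:.0%}")  # prints "raw agreement: 96%"

# Hypothetical cell counts, for illustration only.
print(f"kappa (example table): {cohens_kappa(40, 10, 10, 40):.2f}")
lo, hi = wilson_ci(18, 25)  # e.g. 18 of 25 true CLABSIs reported (72%)
print(f"example sensitivity 72% (95% CI, {lo:.0%}-{hi:.0%})")
```

Note the design point the abstract turns on: sensitivity is computed against a reference standard, so the choice of standard (external review alone vs. review plus adjudication) shifts the estimate, here from 60% to 72%.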
Contributors
-
- By Ashok Agarwal, Carrie Bedient, Nick Brook, Michelle Catenacci, Ying Cheong, Francisco Domínguez, Thomas Elliott, Sandro C. Esteves, Tommaso Falcone, Gabriel de la Fuente, Eugene Galdones, Juan A. Garcia-Velasco, David K. Gardner, Tamara Garrido, Robert B. Gilchrist, Georg Griesinger, Roy Homburg, Jeanine Cieslak Janzen, Mark T. Johnson, Jennifer Kahn, David L. Keefe, Efstratios M Kolibianakis, Laurie J. McKenzie, Nick Macklon, David Meldrum, Ashley R. Mott, Tetsunori Mukaida, Zsolt Peter Nagy, Edurne Novella-Maestre, Chris O’Neill, Chikaharo Oka, Steven F. Palta, Lewis K. Pannell, Antonio Pellicer, Valeria Pugni, Botros R. M. B. Rizk, Christopher B. Rizk, Claude Robert, Denny Sakkas, Hassan N. Sallam, William B. Schoolcraft, Lonnie D. Shea, Carlos Simón, Manuela Simoni, Marc-Andre Sirard, Johan E. J. Smitz, Eric S. Surrey, Jan Tesarik, Raquel Mendoza Tesarik, Jeremy G. Thompson, Andrew J. Watson, Teresa K. Woodruff
- Edited by David K. Gardner, University of Melbourne, Botros R. M. B. Rizk, University of South Alabama, Tommaso Falcone
-
- Book:
- Human Assisted Reproductive Technology
- Published online:
- 16 May 2011
- Print publication:
- 31 March 2011, pp ix-xii
-
- Chapter
Contributors
-
- By Jennifer Alvarez, Ananda B. Amstadter, Metin Başoğlu, David M. Benedek, Charles C. Benight, George A. Bonanno, Evelyn J. Bromet, Richard A. Bryant, Barbara Lopes Cardozo, M. L. Somchai Chakkraband, Claude Chemtob, Roman Cieslak, Lauren M. Conoscenti, Joan M. Cook, Judith Cukor, Carla Kmett Danielson, JoAnn Difede, Charles DiMaggio, Anja J.E. Dirkzwager, Cristiane S. Duarte, Jon D. Elhai, Diane L. Elmore, Yael L.E. Errera, Julian D. Ford, Carol S. Fullerton, Sandro Galea, Freya Goodhew, Neil Greenberg, Lindsay Greene, Linda Grievink, Michael J. Gruber, Sumati Gupta, Johan M. Havenaar, Alesia O. Hawkins, Clare Henn-Haase, Kimberly Eaton Hoagwood, Christina W. Hoven, Sabra S. Inslicht, Krzysztof Kaniasty, Ronald C. Kessler, Rachel Kimerling, Richard V. King, Rolf J. Kleber, Jessica Mass Levitt, Brett T. Litz, Maria Livanou, Katelyn P. Mack, Paula Madrid, Shira Maguen, Paul Maguire, Donald J. Mandell, Charles R. Marmar, Andrea R. Maxwell, Shannon E. McCaslin, Alexander C. McFarlane, Thomas J. Metzler, Summer Nelson, Yuval Neria, Elana Newman, Thomas C. Neylan, Fran H. Norris, Carol S. North, Lawrence A. Palinkas, Benjaporn Panyayong, Maria Petukhova, Betty Pfefferbaum, Marleen Radigan, Beverley Raphael, James Rodriguez, G. James Rubin, Kenneth J. Ruggiero, Ebru Şalcıoğlu, Nancy A. Sampson, Arieh Y. Shalev, Bruce Shapiro, Laura M. Stough, Prawate Tantipiwatanaskul, Warunee Thienkrua, Phebe Tucker, J. Blake Turner, Robert J. Ursano, Bellis van den Berg, Peter G. van der Velden, Frits van Griensven, Miranda Van Hooff, Edward Waldrep, Philip S. Wang, Simon Wessely, Leslie H. Wind, C. Joris Yzermans, Heidi M. Zinzow
- Edited by Yuval Neria, Columbia University, New York, Sandro Galea, University of Michigan, Ann Arbor, Fran H. Norris
-
- Book:
- Mental Health and Disasters
- Published online:
- 07 May 2010
- Print publication:
- 20 July 2009, pp xi-xvi
-
- Chapter