
Systems approach to assessing and improving local human research Institutional Review Board performance

  • John Fontanesi (a1), Anthony Magit (a1), Jennifer J. Ford (a1), Han Nguyen (a1) and Gary S. Firestein (a1) (a2)...

Abstract

Objective

To quantify the interdependency within the regulatory environment governing human subject research, including Institutional Review Boards (IRBs), federally mandated Medicare coverage analysis, and contract negotiations.

Methods

Over 8000 IRB, coverage analysis, and contract applications initiated between 2013 and 2016 were analyzed using traditional and machine learning analytics as part of a quality improvement effort to reduce the time required to authorize the start of human research studies.

Results

Staffing ratios and study characteristics such as the number of arms, source of funding, and number and type of ancillary reviews significantly influenced the timelines. Using key variables, a predictive algorithm identified outlier applications suited to a workflow distinct from the standard process. Improved communication between regulatory units, integration of common functions, and educational outreach improved the regulatory approval process.
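The outlier-routing idea described above can be illustrated with a minimal sketch. This is not the authors' actual algorithm; the data, function name, and Tukey-fence threshold are all hypothetical, chosen only to show how applications with atypical predicted approval times might be flagged for a distinct workflow.

```python
# Hypothetical sketch: flag applications whose approval time is a
# statistical outlier (Tukey's fences, k * IQR beyond the quartiles),
# so they can be routed to a non-standard workflow.
import statistics

def flag_outliers(durations, k=1.5):
    """Return indices of durations lying beyond Tukey's fences."""
    q = statistics.quantiles(durations, n=4)  # Q1, median, Q3
    q1, q3 = q[0], q[2]
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [i for i, d in enumerate(durations) if d < lo or d > hi]

# Hypothetical approval times (days) for ten applications.
days = [30, 35, 28, 40, 33, 31, 210, 29, 36, 34]
print(flag_outliers(days))  # → [6]: the 210-day application stands out
```

In practice the flagged quantity would be a model's predicted timeline based on study characteristics (number of arms, funding source, ancillary reviews) rather than a raw duration, but the routing logic is the same.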

Conclusions

Understanding and improving the interdependencies between IRB, coverage analysis and contract negotiation offices requires a systems approach and might benefit from predictive analytics.


Copyright

This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted reuse, distribution, and reproduction in any medium, provided the original work is properly cited.

Corresponding author

*Address for correspondence: John Fontanesi, Ph.D., School of Medicine, University of California, San Diego, 200 W Arbor Drive (8415), San Diego, CA, USA. (Email: jfontanesi@ucsd.edu)


Supplementary materials

Fontanesi et al. supplementary material 1 (Word, 501 KB)
