Building to learn: Information technology innovations to enable rapid pragmatic evaluation in a learning health system
Geetanjali Rajamani, Genevieve B. Melton, Deborah L. Pestka, Maya Peters, Iva Ninkovic, Elizabeth Lindemann, Timothy J. Beebe, Nathan Shippee, Bradley Benson, Abraham Jacob, Christopher Tignanelli, Nicholas E. Ingraham, Joseph S. Koopmeiners, Michael G. Usher
- Health Information Management
- Public Health, Environmental and Occupational Health
- Health Informatics
Abstract
Background
Learning health systems (LHSs) iteratively generate evidence that can be implemented into practice to improve care and produce generalizable knowledge. Pragmatic clinical trials fit well within LHSs because they combine real‐world data and experiences with a degree of methodological rigor that supports generalizability.
Objectives
We established a pragmatic clinical trial unit (“RapidEval”) to support the development of an LHS. To advance the field of LHS, we sought to characterize the role of health information technology (HIT), including innovative solutions and the challenges that arise, in improving LHS project delivery.
Methods
Between December 2021 and February 2023, eight of 51 applications to the RapidEval program were selected; five projects were implemented, one is in pilot testing, and two are in planning. By gathering data from study investigators, quality and IT staff, and RapidEval staff and leadership, we evaluated pre‐study planning, implementation, analysis, and study closure across all RapidEval initiatives to summarize approaches and identify key innovations and learnings.
Results
Implementation approaches spanned a range of HIT capabilities, including interruptive alerts, clinical decision support integrated into order systems, patient navigators, embedded micro‐education, targeted outpatient hand‐off documentation, and patient communication. Study designs included pre‐post with time‐concordant controls (1), randomized stepped‐wedge (1), cluster randomization across providers (1) and locations (3), and simple patient‐level randomization (2).
Conclusions
Study selection, design, deployment, data collection, and analysis required close collaboration between data analysts, informaticists, and the RapidEval team.