Conover MM, Lowman A, Stürmer T, Layton JB, Martin D, Jonsson Funk M. Usability evaluation of Sentinel's Prospective Routine Observational Monitoring Program Tools (PROMPT). Poster presented at the 33rd International Conference on Pharmacoepidemiology & Therapeutic Risk Management; August 2017; Montreal, Canada. [abstract] Pharmacoepidemiol Drug Saf. 2017 Aug;26(S2):313.
BACKGROUND: The Sentinel System developed tools that assist users with the design and specification of product safety studies in its distributed data network.

OBJECTIVES: We evaluated two Excel-based PROMPT forms/tools: the taxonomy selection tool (TST), which guides study design selection, and the query request form (QRF), which is used to specify a propensity score (PS)-matched analysis. We sought to (1) identify problems users encounter when applying the tools and (2) recommend improvements to increase query quality and reduce inefficiency.

METHODS: We recruited scientists with epidemiology PhDs or advanced clinical degrees and experience in pharmacoepidemiology (PE) and/or surveillance. A moderator guided two-hour sessions in which two users completed a pre-specified task together. Users populated either the TST in one session (using a hypothetical safety signal and literature review) or the QRF in two sessions (using a hypothetical study protocol). Users provided quantitative feedback on the tools via a Likert-scale questionnaire (1 = strongly disagree; 5 = strongly agree). We recorded sessions and post-task guided discussions, then coded qualitative observations and analyzed them for themes using ATLAS.ti.

RESULTS: Eighteen unique users participated, totaling 36 session hours; 14 were employed at academic institutions, 16 had formal PE training, and 5 had clinical degrees. Cross-cutting issues affecting both forms included (1) challenges with the Excel interface, (2) terminology confusion, (3) difficulty understanding the forms' context in the query process, and (4) a need for additional guidance. In the QRF Likert survey, 67% and 58% of subjects responded favorably to the statements “the form did not ask for irrelevant details” (median: 4.0, range 2–5) and “examples were helpful” (4.0, 3–5), respectively. However, fewer than 10% of participants responded favorably to the statements “the form asked for all relevant details” (2.0, 1–3), “guidance to inform sound decisions” (2.5, 1–5), and “appropriateness of PS modeling settings” (2.5, 2–4).
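The Likert summaries above (percent favorable, median, range) can be sketched as follows. The response values here are hypothetical and for illustration only; they are not the study's actual item-level data, which the abstract does not report.

```python
from statistics import median

# Hypothetical Likert responses (1 = strongly disagree ... 5 = strongly agree);
# illustrative only, not the study's actual data.
responses = [4, 5, 3, 4, 2, 4, 5, 4, 3, 4, 2, 5]

# A response of 4 or 5 counts as "favorable".
favorable = sum(1 for r in responses if r >= 4)
pct_favorable = 100 * favorable / len(responses)

print(f"{pct_favorable:.0f}% favorable; median {median(responses)}, "
      f"range {min(responses)}-{max(responses)}")
```

The "favorable = agree or strongly agree" cutoff is a common convention for 5-point Likert items, assumed here since the abstract does not define its favorability threshold.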

CONCLUSIONS: We recommend (1) migrating the forms to a non-Excel interface, (2) increasing the use of visuals to communicate complex design decisions, and (3) restructuring the forms to reflect users' expectations and mental models and to prevent user error via a “path of least resistance” that encourages conservative design decisions. We also recommend that the tools not be decoupled from a comprehensive query process that ensures adequate user support and appropriate consultation with clinical, epidemiology, and database experts.
