
OP21 Involving Clinical Experts In Prioritizing Topics For Health Technology Assessment: A Randomized Controlled Trial

Published online by Cambridge University Press: 12 January 2018


Abstract

INTRODUCTION:

The National Institute for Health Research Health Technology Assessment (NIHR HTA) Programme commissions research to inform health services in the United Kingdom. The Programme prioritizes research ideas drawn from the literature, guidelines, patients, and clinicians to decide which research should be funded. We obtain clinical input on these ideas through (i) committees of clinicians and patients and (ii) written advice sought from multiple clinicians, a refereeing process. Chairs of our committees suggested that the material we sent to clinicians was too extensive and the method of response too burdensome. We set out to determine whether reducing the information provided or the burden of response would improve clinicians' engagement with our processes, and hence improve the quality of advice provided and the research available to health services.

METHODS:

We undertook a factorial randomized controlled trial (University of Southampton Faculty of Medicine Ethics Committee #8192; trial registration ACTRN12614000167662). Each participant was randomized to receive one of two types of material to comment on and one of two means of responding. In the first allocation, participants were randomized in a 1:1 ratio between receiving a 'vignette' (a briefing paper of up to ten pages discussing the possible research, representing usual practice) or a 'commissioning brief' (a single page summarizing the proposed research). In the second allocation, the method of response was randomized between a structured form and free-text email.
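The abstract does not describe the allocation mechanism beyond the two 1:1 ratios. Purely as an illustrative sketch of the 2x2 factorial design, the allocation could look like the following; the simple randomization, fixed seed, and all names here are assumptions for illustration, not the trial's actual procedure.

```python
# Illustrative sketch only: simple 1:1 randomization on each of the two
# factors of the 2x2 factorial design. Any blocking or stratification used
# in the trial is not described in the abstract, so none is modelled here.
import random

MATERIALS = ["vignette", "commissioning_brief"]            # first allocation
RESPONSE_METHODS = ["structured_form", "freeform_email"]   # second allocation

def allocate(expert_id: int, rng: random.Random) -> dict:
    """Independently allocate one clinical expert 1:1 on each factor."""
    return {
        "expert_id": expert_id,
        "material": rng.choice(MATERIALS),
        "response_method": rng.choice(RESPONSE_METHODS),
    }

rng = random.Random(2014)  # fixed seed so the sketch is reproducible
allocations = [allocate(i, rng) for i in range(460)]  # 460 experts were randomized
```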

RESULTS:

We randomized 460 clinical experts, of whom 356 (77.4 percent) responded. Responses were graded for quality on a scale of 0 to 4 (higher scores better), with non-response scored as 0. Analysis of variance (ANOVA) showed that a structured response scored 0.34 points (standard deviation, SD, 0.36) higher than a freeform response (p = 0.02), and that a commissioning brief scored 0.04 points higher than a vignette (p = 0.81).
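As a minimal sketch of how a two-way ANOVA of this kind is typically set up, the code below uses simulated placeholder data (not the trial data); the column names, the additive model without an interaction term, and the use of statsmodels are assumptions for illustration only.

```python
# Minimal sketch of a two-way ANOVA on quality scores, using simulated
# placeholder data rather than the trial data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 460  # number of experts randomized in the trial
df = pd.DataFrame({
    "material": rng.choice(["vignette", "brief"], size=n),
    "response": rng.choice(["structured", "freeform"], size=n),
    # quality graded 0-4, with non-response scored as 0
    "score": rng.integers(0, 5, size=n).astype(float),
})

model = smf.ols("score ~ C(material) + C(response)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # F test for each factor
```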

CONCLUSIONS:

This was the first randomized trial to take place inside the secretariat of the HTA Programme. The difference in quality score between the commissioning brief and vignette allocations was neither statistically significant nor practically important. The difference between the structured and freeform responses was statistically significant and sufficiently large to be important in practice. While the choice of material to share with clinicians appears unimportant, we have shown that it is worth sending experts a structured response form.

Type: Oral Presentations
Copyright: © Cambridge University Press 2018