TRANSCRIPT
Providing Consultancy & Research in Health Economics
Julie Glanville, York Health Economics Consortium, UK
Anna Noel Storr, Cochrane Dementia and Cognitive Improvement Group
Gordon Dooley, Metaxis, UK
Ruth Foxlee, Cochrane Editorial Unit
June 2014
Improving rapid access to reports of RCTs from Embase: innovative methods to enhance the Cochrane
Central Register of Controlled Trials (CENTRAL)
Presentation Overview
Background
Objectives
Methods
Progress so far
Challenges
The future
Background
Health technology assessments, ranging from rapid reviews to the most extensive projects, rely on the efficient identification of research evidence
In particular, evidence from randomised controlled trials (RCTs)
The largest single source of RCTs is the Cochrane Central Register of Controlled Trials (CENTRAL) available as part of The Cochrane Library
The Cochrane Library is a subscription service which may be made available to users via organisational, regional, national or international funding arrangements
The project and its objectives
The Cochrane Collaboration commissioned the Embase update project in March 2013
The project is undertaken by a consortium of three organisations:
the Cochrane Dementia and Cognitive Improvement Group, UK
Metaxis, UK
York Health Economics Consortium, University of York, UK
Objectives
To identify reports of RCTs and controlled clinical trials from Embase for more rapid availability in CENTRAL
Methods, 1
We developed and validated a sensitive search filter to identify reports of RCTs
Using textual analysis of 10,000 Embase RCT records (published 2000–2010) in Simstat and WordStat
Identified terms, phrases and grouped terms within relevant records which could be tested in filters
Pragmatic approach was used to select and test search terms
Following this testing, a second set of 10,000 Embase RCT records from CENTRAL was obtained and the best candidate filter was validated against that set in Ovid Embase
The records missed by the filter during validation testing were then reviewed to understand better why they had been missed
This exercise led to some further changes to the filter
The final filter was then validated on a third new set of 10,000 RCT records from CENTRAL
Progress
The validated search filter identifies reports of RCTs in Embase with over 97% sensitivity
An analysis of the records retrieved has resulted in a tiered record assessment process:
The most obvious RCT reports are fast-tracked into CENTRAL
Animal studies are set to one side for team assessment
The less obvious RCT records are assessed for relevance by a novel use of internet crowdsourcing
“…the practice of obtaining needed services, ideas, or content by soliciting contributions from a large group of people, and especially from an online community, rather than from traditional employees…”
Record screening software written by Metaxis
Between two and six people assess whether a record is really a report of an RCT
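The "between two and six people" assessment rule might be sketched as below. The actual decision logic in the Metaxis screening software is not described in the talk, so the agreement margin and the vote cap used here are assumptions for illustration only:

```python
def classify(votes, agree=2, max_votes=6):
    """Hypothetical consensus rule for crowd screening of one record.

    votes -- ordered list of screener decisions, each 'accept' or 'reject'.
    Returns 'accept' or 'reject' once one side leads by `agree` votes
    within the first `max_votes` assessments, otherwise 'unsure'.
    """
    accepts = rejects = 0
    for i, vote in enumerate(votes[:max_votes], start=1):
        if vote == "accept":
            accepts += 1
        else:
            rejects += 1
        # Decide as soon as the margin of agreement is reached
        if i >= agree and abs(accepts - rejects) >= agree:
            return "accept" if accepts > rejects else "reject"
    return "unsure"  # no consensus after six assessments
```

Under this sketch, two screeners who agree settle a record immediately, while a persistently split record ends up in the "unsure" pile for expert review.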
Embase weeks 14, 15, 16, and 17: April results
Tier 1: 1,469 records go straight into CENTRAL
Tier 2: 8,619 records go to the crowd
Tier 3: approx. 400 conference abstracts and animal studies assessed by the project team
Screening tool
Progress: key metrics
Number needed to read (NNR) = 34
Unsure records are 5% of those screened
Metric                                  Number
Screeners who have created an account   450
Screeners who have completed training   241
Records screened                        49,092
Records accepted                        1,463
Records rejected                        47,474
Records unsure                          2,359
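The NNR and "unsure" figures follow directly from the table above, as a quick check confirms:

```python
# Figures taken from the metrics table
records_screened = 49092
records_accepted = 1463
records_unsure = 2359

# Number needed to read: records a screener reads per accepted RCT report
nnr = records_screened / records_accepted        # ~33.6, reported as 34

# Share of screened records marked 'unsure'
unsure_rate = records_unsure / records_screened  # ~4.8%, reported as ~5%
```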
Progress: accuracy
A sample of records screened by the crowd was re-screened by an expert
The expert (Anna Noel-Storr) acted as the reference standard
RCT or CCT = 416 (+)
Not RCT or CCT = 2,654 (−)
The crowd had a sensitivity of 99.8% and a specificity of 99.8%
Incorporation bias: Anna was not blind to the index test results
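The slide reports the sample sizes and the headline percentages but not the full confusion matrix. The counts below are assumed values (one missed trial, five wrongly accepted non-trials) chosen only to illustrate how sensitivity and specificity of roughly 99.8% arise from a sample of 416 positives and 2,654 negatives:

```python
# Crowd decisions scored against the expert reference standard.
# These confusion counts are assumptions for illustration, not reported data.
true_positives, false_negatives = 415, 1     # of 416 expert-confirmed RCT/CCTs
true_negatives, false_positives = 2649, 5    # of 2,654 expert-confirmed non-trials

# Sensitivity: proportion of true trials the crowd accepted
sensitivity = true_positives / (true_positives + false_negatives)

# Specificity: proportion of non-trials the crowd rejected
specificity = true_negatives / (true_negatives + false_positives)
```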
[Chart: percentage of records marked "unsure" by each of six screeners, 0–12% scale]
Progress: are screeners getting better?
Progress: how long to screen a record?
All screeners: just under 1 minute per record
Screeners who have screened more than 100 records: 42 secs/record
Challenges
Conference abstracts
Animal studies: many animal studies are tagged Human; an animal filter for Embase is being developed
Deciding what is an RCT – guidance for screeners
Motivating the crowd:
‘Certificates’ – exploring how to tell screeners how many records they have processed
More metrics and visuals
Personalised thank-yous
Community building – Facebook and Twitter
Enabling screeners to screen records of interest to them
The future
Ever-improving currency of Embase record availability in CENTRAL
There will be fewer irrelevant and duplicate records
Searchers will be able to identify more RCTs more accurately than previously by a rapid search of CENTRAL
Please visit our project website
http://www.metaxis.com/embasepublic/
Feel free to join the crowd!
http://www.metaxis.com/embase/login.php
http://tinyurl.com/yhec-facebook
http://twitter.com/YHEC1
http://www.minerva-network.com/
Thank you
Telephone: +44 1904 324832
Website: www.yhec.co.uk