A checklist to assess database-hosting platforms for designing and running searches for systematic reviews
Alison Bethel & Morwenna Rogers
PenCLAHRC, University of Exeter Medical School, Exeter, UK
Abstract
Background: Systematic reviews require literature searches that are precise, sensitive and often complex. Database-hosting platforms need to facilitate this type of searching in order to minimise errors and the risk of bias in the results.
Objectives: The main objective of the study was to create a generic checklist of criteria to assess the ability of host platforms to cope with complex searching, for example, for systematic reviews, and to test the checklist against three host platforms (EBSCOhost, OvidSP and ProQuest).
Method: The checklist was developed as usual review work was carried out and through discussion between the two authors. Attributes on the checklist were designated as ‘desirable’ or ‘essential’. The authors tested the checklist independently against three host platforms and graded their performance from 1 (insufficient) to 3 (performs well).
Results: Fifty-five desirable or essential attributes were identified for the checklist. None of the platforms performed well for all of the attributes on the checklist.
Conclusions: Not all database-hosting platforms are designed for complex searching. Librarians and other decision-makers who work in health research settings need to be aware of the different limitations of host platforms for complex searching when they are making purchasing decisions or training others.
Keywords: bibliographic databases; database searching; information retrieval; literature searching; searching; review, systematic
Key Messages
• Librarians who make purchasing decisions should consider subscribing to database-hosting platforms that allow for complex searching, if available. The checklist developed by the authors could be used to assess the suitability of platforms for designing and running complex searches for systematic reviews.
• Database host companies should consider the complex search needs of systematic reviewers when designing or updating their platforms.
• Database owners should be aware that host platforms might not cope well with highly evolved search strategies, such as those for systematic reviews, before they provide a particular vendor with the sole access rights to their database.
• Library and information service professionals should be responsible for determining what tools we use for searching, and for speaking to suppliers and budget holders if these tools are inadequate.
Introduction
Information specialists within systematic review teams perform a number of key roles including scoping searches, designing search strategies, advising on resources, translating and running
Correspondence: Morwenna Rogers, Information Specialist, PenCLAHRC, University of Exeter Medical School, Veysey Building, Salmon Pool Lane, Exeter EX2 4SF, UK. E-mail: morwenna.rogers@exeter.ac.uk
© 2014 The authors. Health Information and Libraries Journal © 2014 Health Libraries Group
Health Information & Libraries Journal, 31, pp. 43–53
DOI: 10.1111/hir.12054
searches across different databases and downloading results for the reviewers. Invariably, the information specialist is responsible for ensuring that all relevant data are retrieved: failure to do so could result in a biased review. There are many reasons why data may be missed, for example, poor indexing of references on databases, failure to search across multiple resources or an insufficient search strategy.1–3 However, the ability of database-hosting platforms to facilitate this type of search technique is seldom cited as a reason for missing data.

This issue is important as some host platforms
have gained sole rights to key health databases. For example, currently, the British Nursing Index is only commercially available through ProQuest, and CINAHL (Cumulative Index to Nursing and Allied Health) is only commercially available via EBSCOhost. These databases form an important source of health-related literature, and they need to be searched comprehensively and exhaustively for systematic reviews, particularly if the research area is in nursing. With only one platform providing access to the literature on these databases, it is vital to ensure that each platform can cope with the complex level of searching needed for systematic reviews.

Search strategies for systematic reviews should
be highly sensitive to capture as much relevant information as possible.4 In addition, a search strategy should be as transparent as possible and documented in a way that enables it to be evaluated and reproduced.5 Evidence indicates that comprehensive search strategies are required for systematic reviews, as basic or intermediate searches could miss key papers.1,6 Consequently, search strategies for systematic reviews are often complex and can be hundreds of lines long, with many combinations of Boolean logic, wildcards and adjacency/proximity instructions.

The challenge faced by information specialists
in producing a search strategy that is balanced between sensitivity and precision is understood.4,5
However, the difficulties faced in translating or running a complex search across different platforms are not well documented.

Previous studies focus either on the performance
of individual databases on different platforms or on how alterations in search strategies translated
across different platforms affect the results.6–10 For example, Kimball7 produced an analysis of three platforms (OvidSP, EBSCO and Engineering Village) for searching one database (GeoRef) but did not examine the specific requirements for complex searching or produce a usable checklist. In addition, there are several studies that examined the results of individual searches carried out on the same database but on different platforms.6,8–10
Younger and Boddy8 assessed the performance of the database AMED across three platforms (DIALOG DataStar, OVID and EBSCOhost) and found the number of hits can vary considerably with basic searches. Sewell9 and Casler10 compared the performance and features of various host platforms for searching CAB Abstracts and AGRICOLA, respectively. Sewell evaluated CAB Direct, EBSCOhost, ISI and OvidSP and found no statistically significant differences in precision or recall. The study concluded that the user population and cost, as well as performance, should be a consideration in the purchasing decision-making process. Casler produced a table of features of five host platforms available at the time, but no overall evaluation of comparative data. Bandyopadhyay6 evaluated the information retrieval of Biological Abstracts on two platforms (SilverPlatter and EBSCOhost) using both novice and complex queries on each one. This study found that more complex searching generated the best results and concluded that database providers were developing user-friendly interfaces without simplifying the underlying search mechanisms, such that unskilled searching would not find the desired information. However, to date, there have been no published studies detailing the development and validation of a checklist for assessing the performance of platforms.

The aim of this study, therefore, was to produce
a valid checklist for assessing the abilities of database host platforms for carrying out complex searches such as those used in a systematic review. Furthermore, this study aimed to assess the performance, using the checklist, of three host platforms for this type of work. It was anticipated that the development of such a checklist could build on previous research highlighting differences between sets of results from databases when run across different platforms. A checklist might also help with
the purchasing decision-making processes carried out in medical libraries and research institutions dependent on these platforms for carrying out complex searches.
Method
Producing the checklist
We searched LISA, LISTA, Medline, EMBASE, Web of Knowledge and ERIC for studies assessing host platform performance of searching capacity. As we found no studies assessing the performance of database platforms using a checklist, we developed our own checklist based on independent suggestions and follow-up discussions between the two authors. Both authors developed their own performance criteria, and these were then compared and merged into one checklist. Features commonly used for basic searching (such as simple search functions, reference lists, automatic search filters, links to full text, etc.) were discounted from the list as these were either not necessary functions for a systematic search or were rarely used for complex searching.

Each criterion was classified as essential (E) or
desirable (D) based on mutual agreement between the authors, and the criteria were categorised according to type. An individual criterion was considered essential if its absence would render a complex search impossible or extremely difficult and time-consuming or would severely affect the results of the search. Results of a search were severely affected if records could not be saved or downloaded, if it was not clear how many records had been retrieved (e.g. if there was automatic de-duplication) or if numbers were not consistent between running identical searches on the same database. Desirable attributes were those which generally made the search process easier or more time-efficient but would not significantly affect the overall performance of the search. The two authors independently rated each criterion, and discrepancies were resolved by discussion (see Table 1).
Assessing the performance of the platforms
Complex searches (more than 30 lines long and featuring extensive use of Boolean logic, proximity terms, combinations of MeSH and free text and multiple field headings) were carried out by the authors on databases via three platforms: EBSCOhost (CINAHL), OvidSP (Medline) and ProQuest (ASSIA). These three platforms were chosen because they were all available via Exeter University Medical School Library and were commonly used by the authors for searching.
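To illustrate the kinds of constructs such searches combine, the following short fragment uses OvidSP-style Medline syntax (exploded MeSH headings, .ti,ab. field codes, adj proximity operators and numbered set combination with the OR/n-n shortcut). The search terms here are invented for illustration and are not taken from the authors' actual strategies:

```
1. exp Attention Deficit Disorder with Hyperactivity/
2. (adhd or hyperkinetic disorder*).ti,ab.
3. or/1-2
4. exp Schools/
5. (school* adj3 (setting* or classroom*)).ti,ab.
6. or/4-5
7. 3 and 6
```

A full systematic review strategy repeats this pattern of controlled-vocabulary and free-text blocks across many concepts, which is why the strategies tested in this study ran to more than 50 lines.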
Testing the platforms
As stated, the three host platforms/databases tested for this study were as follows:
Platform 1: OvidSP/Medline
Platform 2: ProQuest/ASSIA
Platform 3: EBSCOhost/CINAHL
Three more columns were added to the checklist for the testing process covering the availability of the particular function, the grade given by the information specialist (IS) and any explanatory notes (see Appendix 1).

Review topics were selected from the projects
that the authors were currently working on. This meant that the host platforms were tested under the conditions of normal use. The subject areas selected were ‘child-reported health outcome measures’ and ‘ADHD in school settings’. These were broad enough to necessitate the use of various databases located on different platforms, and the searches were designed as part of a real systematic review on these topics. Both strategies were over 50 lines long on each of the three databases and used combinations of controlled vocabulary and free text, field codes, Boolean terms and proximity syntax. The search strategies used are available from the authors on request.

The checklist was used to rate the performance
of the individual platforms against the criteria. Scores from 1 to 3 were assigned on the basis of the performance of each platform as follows:
1 Did not perform the function, or the function was so difficult to find or use that it was deemed ineffective
2 Performed the function, but it was not intuitive or the terminology provided was confusing
3 Performed the function well
The tests were carried out in May 2012 on days where there were no known adverse circumstances such as server downtime or routine upgrades. After being reviewed by the two authors, two sets
Table 1 List of criteria

Searching (functions)
  Essential: Command line searches

Searching (syntax)
  Essential: Boolean terms; Phrase searching; Adjacency terms; Proximity terms; Right truncation; Parenthesis; Combining parentheses within strings with Boolean; Combining parentheses within strings with adjacency; Combining parentheses within strings with proximity; Combining parentheses with single field codes; Combining parentheses with multiple field codes
  Desirable: Single character truncation; Left truncation; Masking within a word; Short cut to combining strings with AND/OR (e.g. OR/1-10)

Field codes
  Essential: Available to use; Ability to combine
  Desirable: Easily accessible

Controlled vocabulary
  Essential: Subject headings e.g. MeSH; Thesaurus available (displayed in hierarchy); Ability to explode headings; Scope note available; Ability to combine controlled vocabulary terms with free text; Ability to choose a narrower term
  Desirable: Ability to choose multiple terms from the thesaurus

Display (Search)
  Essential: Option to view search history while using search screen; Build up searches line-by-line with the number of hits visible for each string; Combine searches (with Boolean)
  Desirable: Ability to edit previous lines of search as it develops; Ability to insert new lines of search into existing search; Ability to move search lines around within search; Renumber searches after deletion; Refine search by update code

Display (Records)
  Essential: Option to change the number of hits viewed per page; Option to view search history on record display screen; Ability to choose records and not lose this choice when moving to the next page
  Desirable: Option to choose fields to display; Can move onto next record when in full record display; Search term highlighted

Downloading
  Essential: Select all results from complete set of records rather than page by page; A wide choice of export/download options
  Desirable: Able to download large numbers of records (500+) in one go

Search History
  Essential: Can save search history; Re-run saved searches
  Desirable: Can share saved searches; Export search history; Edit saved searches

Performance
  Essential: Can handle long and complex searches, >50 lines long; Can handle large numbers of records (>1000); Is compatible with major reference management systems (EndNote and Reference Manager); Compatible with major web browsers (IE, Firefox and Google Chrome)

Other
  Essential: Help facility is easy to locate and informative; Results are consistent; Turn off any deduplication
of results were combined, and an average score was produced.
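The per-criterion arithmetic described above (two independent 1–3 grades averaged into a combined score, with agreement reported as the percentage of criteria on which the grades match) can be sketched as follows; the criterion names and grades are hypothetical, not the study's data:

```python
# Sketch of the grading arithmetic: each reviewer awards a grade from 1 to 3
# per criterion, the combined score is the average of the two grades, and
# agreement is the share of criteria where both grades match.
# Criterion names and grades below are illustrative only.

def combine(rev1, rev2):
    """Average the two reviewers' grades for each criterion."""
    return {c: (rev1[c] + rev2[c]) / 2 for c in rev1}

def percent_agreement(rev1, rev2):
    """Percentage of criteria on which both reviewers awarded the same grade."""
    matches = sum(1 for c in rev1 if rev1[c] == rev2[c])
    return 100 * matches / len(rev1)

rev1 = {"2a Boolean terms": 3, "5d Insert new lines": 1,
        "8a Save search history": 3, "6a Choose fields": 1}
rev2 = {"2a Boolean terms": 3, "5d Insert new lines": 2,
        "8a Save search history": 3, "6a Choose fields": 1}

combined = combine(rev1, rev2)             # "5d Insert new lines" combines to 1.5
agreement = percent_agreement(rev1, rev2)  # 75.0 (3 of 4 grades match)
```

Half-point combined scores (e.g. 1.5) therefore arise exactly where the reviewers disagreed by one grade, which is how Table 2 reports them.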
Results
Producing the checklist
The authors identified 10 basic features of platforms representing the main functions required to carry out and process a search. These features incorporated search syntax, display, processing of results and overall performance. Within these categories, 55 individual criteria were identified as being either essential (37) or desirable (18) for systematic searches.

Table 1 shows the list of criteria as grouped by feature, together with the classification decisions made by the authors.
Agreements/Disagreements
The level of agreement between the grades recorded by the two reviewers for each platform was compared. Agreement was highest for OvidSP, with 85% agreement between the two reviewers on grade, followed by EBSCOhost (67%) and ProQuest (65%) (Table 2).

The complete set of grades assigned by the authors for each platform is given in Appendix 2.

Graph 1. Combined results for the 3 platforms
Interpretation of results
All three host platforms performed poorly (grading of 1.5 or 1) on three of the criteria: the ability to insert new lines of search into the existing search history; the ability to move search lines around within an existing search strategy; and having an option to choose which fields to display in the results screen. It is important to note that we refer to the existing (rather than saved) search strategy here. OvidSP, for example, allows comprehensive editing of strategies once they are saved, but until the search is re-run, the impact of the changes (i.e. the number of hits generated) cannot be seen. This approach can lead to problems with version control as the search needs to be saved under a new name and re-run each time it is edited to see how many hits are generated.

Two of the platforms (ProQuest and EBSCOhost) also performed poorly on other functional characteristics, such as the ability to select all results from the complete set of records rather than page by page and the ability to save a search history (so that the entire strategy can be re-run). In addition, both these platforms performed poorly when handling long and complex searches of 50 lines or more, and with saving and exporting large numbers of records (500+). However, for 17 of the checklist criteria, all three platforms scored 2.5 or 3. Using complex search syntax was a particular strength across all three platforms, as was the use of controlled vocabulary (e.g. MeSH), a wide range of download options and compatibility with major reference management systems. In addition, the help facility was easy to locate and use within the three platforms.
Discussion
Searches for systematic reviews are necessarily complex and involve designing strategies that are often run across multiple databases on different host platforms. Information specialists and researchers need these host platforms to be both intuitive and capable of facilitating complex searching. We have found through experience that owners of host platforms are not always responsive to these needs and focus primarily on functions that allow fast retrieval of references, basic search functions and links to full text articles. It is possible that database providers do not consider the ability to conduct searches for systematic reviews as part of their function or that it is not economically viable for them to do so.
Table 2 Combined results of IS grading of host platforms across all the checklist criteria

Grade*   OvidSP               ProQuest             EBSCOhost
         No. of criteria (%)  No. of criteria (%)  No. of criteria (%)
1        2 (3.6)              11 (20)              12 (21.8)
1.5      3 (5.5)              7 (12.7)             4 (7.3)
2        4 (7.3)              8 (14.5)             9 (16.4)
2.5      4 (7.3)              8 (14.5)             6 (10.9)
3        42 (76.4)            21 (38.2)            24 (43.6)

*Average score awarded (for example, 1.5 represents where the first reviewer awarded grade 1 and the second reviewer awarded grade 2).
In the absence of similar studies, we relied on our own knowledge and experience to produce and refine the list of criteria that we felt were important for the searching performance of database host platforms. It is important to note that each platform has many more functions than those in the checklist; we focused only on those criteria that are essential to carry out complex searching.

The topics that were chosen for testing the platforms (a review of child health outcome measures and ADHD in school settings) were broad enough for relevant information to be located on the three databases chosen (CINAHL/EBSCOhost, Medline/OvidSP and ASSIA/ProQuest). The topics warranted complex searching on all three platforms, with combinations of subject headings and free text, proximity syntax and use of field codes. Therefore, the topics chosen would not have favoured any individual platform. For the purposes of this study, we were not interested in how the alterations in search strategy affected the results of the search but in the actual performance of the platforms for carrying out the tasks identified on the checklist.

The results found that no platform performed
well for all the functions required for a systematic review search and that there was wide variation between the platforms. The lower grading for some of the criteria could be down to individual preference, how the system has been configured by our own institution or a lack of clear understanding about the complex functions of the platforms. In addition, there was not complete agreement between the authors about the performance of the individual platforms against the list of criteria. This shows that the assessment process was also subjective and that having at least two assessors is required to minimise bias.

Furthermore, the grading results are only accurate for the time the searches were run; it is important to note that platforms are always releasing upgrades and making improvements. However, the list of criteria is still relevant, and we anticipate it will be of use to information specialists working in the field of systematic reviews, as well as procurement teams deciding which platform to purchase, and database developers who wish to meet the needs of review teams. We focused on the three major database providers that we could access at the
University of Exeter Medical School; however, there are other hosts as well as stand-alone databases that could also be assessed using the checklist.

The checklist is not intended to be a complete
and final list of all functions we require from host platforms. It could incorporate other useful functions such as the ability to share search strategies between different information specialists within the same team. This would allow searches to be run via different accounts, which would be useful for running update searches, for instance, if the information specialist changes, or for using and amending previous search strategies for closely related topics carried out by different information specialists. Another useful function would be the ability to group related search histories into folders. We intend to update and publish the criteria periodically, bearing in mind that methods and technologies adapt and change over time.

Previous studies in this area had predominantly
looked at the variations in results generated from the same database but hosted on different platforms, and reasons for these variations.6–10 This project bridges the gap between this understanding and the practical assessment of host platforms for complex searches, as would be expected in a systematic review. The resulting checklist could be used by any information professional involved in either carrying out this type of searching or evaluating databases before making a procurement decision.

Future research could investigate more widely
the experience of academics carrying out searches for systematic reviews. It would be useful to investigate whether databases are commonly selected for reviews based on the ease of searching. If so, the results might encourage hosts to develop more review-friendly functions, or owners of databases to consider the broader requirements of systematic review researchers when selecting a host platform for their resource.
Conclusion
There is little, if any, published debate about the limitations of database host providers in facilitating searches for systematic reviews. It is possible therefore that host providers, database owners and information professionals do not consider this to be an important issue when upgrading systems,
choosing hosts or purchasing database packages and instead focus primarily on ease of use, convenience and cost. Systematic reviews are increasingly being carried out by students as part of a higher degree course or dissertation: this is an issue therefore that warrants discussion throughout the academic information profession and not just among information specialists in healthcare research.

Of the three platforms examined, only one
(OvidSP) performed well on the majority of functions required for complex searching. As the other two host providers have sole commercial rights to key health databases, it is imperative that each platform is capable of running complex searches for systematic reviews. This is required to minimise the risk of key databases being searched unsystematically, or worse, being omitted from the review process, and subsequently increasing the risk of possible bias in systematic reviews. Librarians and procurement teams should be aware of the needs of systematic reviewers and information specialists when making decisions about database packages and host platforms.
Funding
This article presents independent research funded by the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC) for the South West Peninsula. The views expressed in this publication are those of the author(s) and not necessarily those of the NHS, the NIHR or the Department of Health in England.
References
1 Brettle, A. J., Long, A. F., Grant, M. J. & Greenhalgh, J. Searching for information on outcomes: do you need to be comprehensive? Quality in Health Care 1998, 7, 163–167.
2 Papaioannou, D., Sutton, A., Carroll, C., Booth, A. & Wong, R. Literature searching for social science systematic reviews: consideration of a range of search techniques. Health Information and Libraries Journal 2010, 27, 114–122.
3 Savoie, I., Helmer, D., Green, C. J. & Kazanjian, A. Beyond Medline: reducing bias through extended systematic review search. International Journal of Technology Assessment in Health Care 2003, 19, 168–178.
4 Higgins, J. P. T. & Green, S. (eds). Cochrane Handbook for Systematic Reviews of Interventions, version 5.1.0 (updated March 2011). The Cochrane Collaboration 2011. Accessible at: http://www.cochrane-handbook.org/
5 Centre for Reviews and Dissemination (CRD). Systematic Reviews: CRD's Guidance for Undertaking Reviews in Healthcare. York, UK: University of York, 2009.
6 Bandyopadhyay, A. Examining Biological Abstracts on two platforms: what do end users need to know? Science & Technology Libraries 2010, 29, 34–52.
7 Kimball, R. The GeoRef database: a detailed comparison and analysis of three platforms. Science & Technology Libraries 2010, 29, 111–129.
8 Younger, P. & Boddy, K. When is a search not a search? A comparison of searching the AMED complementary health database via EBSCOhost, OVID and DIALOG. Health Information and Libraries Journal 2009, 26, 126–135.
9 Sewell, R. R. Comparing four CAB Abstracts platforms from a veterinary medicine perspective. Journal of Electronic Resources in Medical Libraries 2011, 8, 134–149.
10 Casler, C. L., Herring, E., Smith, H., Moberly, H. K., Flood, S. & Perry, V. Comparing AGRICOLA by vendor. Journal of Agricultural & Food Information 2003, 4, 33.
Received 28 January 2013; Accepted 20 November 2013
Appendix 1
Full checklist

No. Criteria E/D Available (Y/N) IS Grade (1–3) Notes

1 Searching (functions)
1a Command line searches E
2 Searching (syntax)
2a Boolean terms E
2b Phrase searching E
2c Adjacency terms E
2d Proximity terms E
2e Right truncation E
2f Left truncation D
2g Single character truncation D
2h Masking within a word D
2i Parenthesis E
2j Combining parentheses within strings with Boolean E
2k Combining parentheses within strings with adjacency E
2l Combining parentheses within strings with proximity E
2m Combining parentheses with single field codes E
2n Combining parentheses with multiple field codes E
2o Short cut to combining strings with AND/OR (e.g. OR/1-10) D
3 Field codes
3a Available to use E
3b Easily accessible D
3c Ability to combine (e.g. ti,ab) E
4 Controlled vocabulary
4a Subject headings e.g. MeSH E
4b Thesaurus available (displayed in hierarchy) E
4c Ability to choose multiple terms from the thesaurus D
4d Ability to combine controlled vocabulary terms with free-text E
4e Ability to explode headings E
4f Ability to choose a narrower term E
4g Scope note available E
5 Display (Search)
5a Option to view search history while using search screen E
5b Build up searches line-by-line with the number of hits visible for each string E
5c Ability to edit previous lines of search as it develops D
5d Ability to insert new lines of search into existing search D
5e Ability to move search lines around within search D
5f Combine searches (with Boolean?) E
5g Renumber searches after deletion D
5h Refine search by update code D
6 Display (Records)
6a Option to choose fields to display D
6b Option to change the number of hits viewed per page E
6c Option to view search history on record display screen E
6d Ability to choose records and not lose this choice when you move onto the next page E
6e Can move onto next record when in full record display D
6f Search term highlighted D
7 Downloading
7a Select all results from complete set of records rather than page-by-page E
7b Able to download large numbers of records (500+) in one go D
7c A wide choice of export/download options E
8 Search History
8a Can save search history E
8b Can share saved searches D
8c Export search history D
8d Edit saved searches D
8e Re-run saved searches E
9 Performance
9a Can handle long and complex searches, >50 lines long E
9b Can handle large numbers of records >1000 E
9c Is compatible with major reference management systems E
9d Compatible with major web browsers: IE, Firefox and Google Chrome E
10 Other
10a Help facility is easy to locate and informative E
10b Results are consistent E
10c Turn off any deduplication E
Appendix 2
Checklist with grading
No. Criteria E/D | OvidSP: Rev 1, Rev 2, Combined | ProQuest: Rev 1, Rev 2, Combined | EBSCOhost: Rev 1, Rev 2, Combined

1 Searching (functions)
1a Command line searches E | 3 3 3 | 3 3 3 | 1 3 2
2 Searching (syntax)
2a Boolean terms E | 3 3 3 | 3 3 3 | 3 3 3
2b Phrase searching E | 3 3 3 | 2 2 2 | 3 3 3
2c Adjacency terms E | 3 3 3 | 2 3 2.5 | 1 1 1
2d Proximity terms E | 3 3 3 | 3 3 3 | 3 3 3
2e Right truncation E | 3 3 3 | 3 3 3 | 3 3 3
2f Left truncation D | 1 1 1 | 3 3 3 | 1 1 1
2g Single character truncation D | 3 3 3 | 3 3 3 | 3 3 3
2h Masking within a word D | 3 3 3 | 3 3 3 | 3 3 3
2i Parenthesis E | 3 3 3 | 3 3 3 | 3 3 3
2j Combining parentheses within strings with Boolean E | 3 3 3 | 3 3 3 | 3 3 3
2k Combining parentheses within strings with adjacency E | 3 3 3 | 3 3 3 | 1 1 1
2l Combining parentheses within strings with proximity E | 3 3 3 | 3 3 3 | 3 3 3
2m Combining parentheses with single field codes E | 3 3 3 | 1 3 2.5 | 3 3 3
2n Combining parentheses with multiple field codes E | 3 3 3 | 2 3 2.5 | 1 1 1
2o Short cut to combining strings with AND/OR (e.g. OR/1-10) D | 3 3 3 | 1 1 1 | 1 3 2
3 Field codes
3a Available to use E | 3 3 3 | 3 3 3 | 3 1 2
3b Easily accessible D | 2 3 2.5 | 3 3 3 | 2 1 1.5
3c Ability to combine (e.g. ti,ab) E | 3 3 3 | 3 3 3 | 1 1 1
4 Controlled vocabulary
4a Subject headings e.g. MeSH E | 3 3 3 | 3 2 2.5 | 2 3 2.5
4b Thesaurus available (displayed in hierarchy) E | 3 3 3 | 1 2 1.5 | 1 3 2
4c Ability to choose multiple terms from the thesaurus D | 3 3 3 | 3 3 3 | 3 3 3
4d Ability to combine controlled vocabulary terms with free-text E | 3 3 3 | 2 2 2 | 3 1 2
4e Ability to explode headings E | 3 3 3 | 3 2 2.5 | 3 3 3
4f Ability to choose a narrower term E | 3 3 3 | 3 2 2.5 | 3 2 2.5
4g Scope note available E | 3 3 3 | 3 1 2 | 2 2 2
5 Display (Search)
5a Option to view search history while using search screen E | 3 3 3 | 3 1 2 | 3 3 3
5b Build up searches line-by-line with the number of hits visible for each string E | 3 3 3 | 1 2 1.5 | 3 3 3
5c Ability to edit previous lines of search as it develops D | 1 2 1.5 | 1 1 1 | 3 3 3
5d Ability to insert new lines of search into existing search D | 1 2 1.5 | 2 1 1.5 | 1 1 1
5e Ability to move search lines around within search D | 1 2 1.5 | 1 1 1 | 1 1 1
5f Combine searches (with Boolean?) E | 2 3 2.5 | 3 2 2.5 | 2 3 2.5
5g Renumber searches after deletion D | 3 3 3 | 1 1 1 | 3 3 3
5h Refine search by update code D | 2 2 2 | 1 1 1 | 1 1 1
6 Display (Records)
6a Option to choose fields to display D | 1 1 1 | 1 1 1 | 1 1 1
6b Option to change the number of hits viewed per page E | 3 3 3 | 3 3 3 | 2 2 2
6c Option to view search history on record display screen E | 3 3 3 | 1 1 1 | 3 3 3
6d Ability to choose records and not lose this choice when you move onto the next page E | 3 3 3 | 3 3 3 | 1 3 2
6e Can move onto next record when in full record display D | 3 3 3 | 3 3 3 | 3 3 3
6f Search term highlighted D | 2 2 2 | 3 3 3 | 3 3 3
7 Downloading
7a Select all results from complete set of records rather than page-by-page E | 3 3 3 | 3 2 2.5 | 1 1 1
7b Able to download large numbers of records (500+) in one go D | 3 3 3 | 3 3 3 | 2 1 1.5
7c A wide choice of export/download options E | 3 3 3 | 3 3 3 | 3 3 3
8 Search History
8a Can save search history E | 3 3 3 | 1 1 1 | 1 1 1
8b Can share saved searches D | 1 3 2 | 1 1 1 | 1 1 1
8c Export search history D | 2 2 2 | 2 2 2 | 3 1 2
8d Edit saved searches D | 2 3 2.5 | 3 1 2 | 1 3 2
8e Re-run saved searches E | 3 3 3 | 1 1 1 | 3 3 3
9 Performance
9a Can handle long and complex searches, >50 lines long E | 3 3 3 | 1 2 1.5 | 2 1 1.5
9b Can handle large numbers of records >1000 E | 3 3 3 | 1 2 1.5 | 2 1 1.5
9c Is compatible with major reference management systems E | 3 3 3 | 3 3 3 | 3 2 2.5
9d Compatible with major web browsers: IE, Firefox and Google Chrome E | 3 3 3 | 2 2 2 | 3 3 3
10 Other
10a Help facility is easy to locate and informative E | 2 3 2.5 | 3 2 2.5 | 2 3 2.5
10b Results are consistent E | 3 3 3 | 1 1 1 | 3 3 3
10c Turn off any deduplication E | 3 3 3 | 2 2 2 | 3 3 3