Getting the Most Out of Your E-Resources: Measuring Success
Todd Carpenter, Managing Director, NISO


Page 1: Getting the Most Out of Your E-Resources: Measuring Success

Getting the Most Out of Your E-Resources:

Measuring Success

Todd Carpenter, Managing Director, NISO

Page 2: Getting the Most Out of Your E-Resources: Measuring Success

March 18, 2008 NELINET Collection Services Conference

Where are we headed this afternoon?

• A bit about NISO
• Overview of usage measurement of e-resources
• COUNTER & SUSHI
• The application of usage data
• Issues and concerns with use data
• A glimpse into the future

Page 3: Getting the Most Out of Your E-Resources: Measuring Success


What is NISO?

• NISO - National Information Standards Organization

• NISO is the only ANSI-accredited organization tasked with the development of standards in the field of Information and Documentation

• Work with publishers, libraries, agents and other systems vendors to develop community consensus

• Develop a wide range of standards
– Paper permanence and steel shelving
– Accessibility issues
– Bibliographic formats and exchange
– Web-based delivery, OpenURL, Metasearch, SUSHI

Page 4: Getting the Most Out of Your E-Resources: Measuring Success


Big Challenges, Modest Resources

• Revenue: $900K, up 20% in 2007
• Primary income: Member dues (60%)
• Other income: Seminars, Publishing (20%)
• New sources of revenue in 2007 - Grants
– Mellon $196K, IMLS $24K (20%)
• Staff: 4 professional full-time
• Virtual staff: 10+ (consultants, partners)
• 83 Voting Members, 25 LSA members as of 2007
• Maintenance Agencies: 12
• Volunteers: 300+ spread out across the world

Page 5: Getting the Most Out of Your E-Resources: Measuring Success


Standards - Why should I care?

• Standards accelerate production, ordering/sales, dissemination, locating, storing and preserving information
• Key standards which NISO has developed and is helping to bring consensus around
– ISSN, OpenURL, Z39.50, NCIP
– In development: DOI, SUSHI, SERU, LEWG
– In planning: Institutional ID, Performance Measures, OpenURL Expansion

Page 6: Getting the Most Out of Your E-Resources: Measuring Success


Meaningless?

Certainly, there’s a lot of data

The difference between meaningless and meaningful data is APPLICATION

Page 7: Getting the Most Out of Your E-Resources: Measuring Success


The Early Days of E-Resources

Thinking back to 1998 - Many open questions

• How do you record traffic?
• What is a hit?
• Is a hit different than a download?
• What about reloading?
• What about images and links?
• What should a report include?
• Eventually, counting different versions of texts
• How often do you need to provide stats?
• Traffic to abstracts, TOCs or other elements?
• Content over multiple pages?

Page 8: Getting the Most Out of Your E-Resources: Measuring Success


Slow development of consensus

ICOLC - Guidelines for Statistical Measurement of Usage of Web-Based Information Resources
Released in 1998, updated in 2000
– Minimum Requirements
– Data elements, timeframe, etc.
– Confidentiality
– Access
– Delivery
– Definitions
– Formats

National Commission on Libraries and Information Science (NCLIS) Electronic Access and Use-Related Measures
Released in 2001

Page 9: Getting the Most Out of Your E-Resources: Measuring Success


Toward Formalization

• ANSI/NISO Z39.7: 2004 - Information Services and Use: Metrics & Statistics for Libraries and Information Providers -- Data Dictionary
ONLINE: www.niso.org/emetrics/current/index.html

• ISO Technical Committee 46 - Information and Documentation, SC 8 - Statistics and Performance Indicators
– ISO 2789: 2006 Information and documentation -- International library statistics
– ISO 11620: 1998 Information and documentation -- Library performance indicators

Page 10: Getting the Most Out of Your E-Resources: Measuring Success


Project COUNTER

COUNTER (Counting Online Usage of NeTworked Electronic Resources)

Formed in 2002
Membership organization (as of 3/15/08):
• Industry Organizations - 13
• Library Consortia - 62
• Libraries - 84
• Publishers - 66

Establishes Codes of Practice on the gathering, compiling and storage of publishing usage data

Page 11: Getting the Most Out of Your E-Resources: Measuring Success


COUNTER Codes of Practice

• Definitions
• Specifications for Usage Reports
– What they should include
– What they should look like
– How and when they should be delivered
• Data processing guidelines
• Auditing (New in 2006)
• Compliance
• Maintenance and development of the Codes of Practice
• Governance of COUNTER

Page 12: Getting the Most Out of Your E-Resources: Measuring Success


COUNTER: Current Codes of Practice

1) Journals and databases
– Release 1 Code of Practice launched January 2003
– Release 2 replaced Release 1 in January 2006
– Release 3 under consideration - to include SUSHI compliance and consortia reporting
– Now a standard widely adopted by publishers and librarians
– 60%+ of Science Citation Index articles now covered

2) Books and reference works
– Code of Practice for Books was launched March 2006
– Relevant usage metrics less clear than for journals
– Different issues than for journals
• Direct comparisons between books less relevant
• Understanding how different categories of book are used is more relevant

Page 13: Getting the Most Out of Your E-Resources: Measuring Success


Journal Report 1 Example
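Under COUNTER Release 2, Journal Report 1 counts the number of successful full-text article requests by month and journal. A simplified sketch of the layout, with invented titles and figures (the publisher and platform columns of a full JR1 are omitted here for space):

Journal                  Print ISSN  Online ISSN  Jan-08  Feb-08  Mar-08  YTD Total  YTD HTML  YTD PDF
Total for all journals        -           -        1,412   1,530   1,655      4,597     1,790    2,807
Journal of Examples      1234-5678   2345-6789       312     287     344        943       361      582
Annals of Placeholders   8765-4321   9876-5432       128     141     160        429       165      264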

Page 14: Getting the Most Out of Your E-Resources: Measuring Success


Need for easier access to usage data

• As a community, we need to reduce the time and effort necessary to collect, format and compile usage data

• “Time for meaningful analysis is compromised by the time required just to gather and record the statistics.”

– Median percentage of time spent on analysis is only 25 percent
– More than half of the time is spent on gathering and formatting
– Average number of hours spent working on usage data is 96 hours, but ranged on the high end up to 1-2 FTEs entirely focused on data
– Libraries use usage reports to help make subscription decisions (94%) and justify expenditures (86%) for their electronic resources

DATA FROM: Gayle Baker, Eleanor J. Read, Vendor Usage Data for Electronic Resources: A Survey of Libraries http://smartech.gatech.edu/handle/1853/13611

Page 15: Getting the Most Out of Your E-Resources: Measuring Success


One example of the need

One university with more than 75 online resources from which they draw usage data

They have an 80-page booklet containing the details of how to access and gather usage data!

How much time does it take, not just to compile and maintain this notebook, but even just to work through it?

Page 16: Getting the Most Out of Your E-Resources: Measuring Success


Briefly: SUSHI

• Need: Simplify and automate the gathering of usage data for librarians
– Librarians spending months gathering data

• Solution
– Server/client system to exchange COUNTER reports
– Easily incorporated into usage systems (on the publisher side) or into an ERM (on the library side)
– Client calls the server and asks for a report; the server runs the report and sends it back
– Data exchange takes place machine to machine

Page 17: Getting the Most Out of Your E-Resources: Measuring Success


[Diagram: the library's SUSHI client, typically built into an ERM, sends a SOAP Request over the Internet to the content provider's SUSHI Server; the server draws on the provider's Usage Data and returns a COUNTER report in the SOAP Response.]

SUSHI is a Web Service which sends an XML request to a content provider to obtain an XML response containing the usage report.

Slide courtesy of Oliver Pesch, EBSCO Information Services, Co-Chair SUSHI
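To make the request/response exchange concrete, here is a minimal client sketch in Python. The endpoint URL, requestor ID and customer ID are placeholders, and the element names follow the general shape of the SUSHI (ANSI/NISO Z39.93) ReportRequest schema; a production client should be generated from the content provider's published WSDL rather than hand-built like this:

# Minimal sketch of a SUSHI exchange: POST a SOAP envelope containing a
# ReportRequest to a provider's SUSHI server and read back the COUNTER report.
# The endpoint, requestor ID and customer ID below are placeholders.
import urllib.request

SUSHI_ENDPOINT = "https://sushi.example.org/SushiService"  # hypothetical endpoint

# SOAP body asking for COUNTER Journal Report 1 covering January 2008.
report_request = """<?xml version="1.0" encoding="UTF-8"?>
<SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
                   xmlns:sushi="http://www.niso.org/schemas/sushi">
  <SOAP-ENV:Body>
    <sushi:ReportRequest>
      <sushi:Requestor>
        <sushi:ID>example-library</sushi:ID>
      </sushi:Requestor>
      <sushi:CustomerReference>
        <sushi:ID>customer-0001</sushi:ID>
      </sushi:CustomerReference>
      <sushi:ReportDefinition Name="JR1" Release="2">
        <sushi:Filters>
          <sushi:UsageDateRange>
            <sushi:Begin>2008-01-01</sushi:Begin>
            <sushi:End>2008-01-31</sushi:End>
          </sushi:UsageDateRange>
        </sushi:Filters>
      </sushi:ReportDefinition>
    </sushi:ReportRequest>
  </SOAP-ENV:Body>
</SOAP-ENV:Envelope>"""

request = urllib.request.Request(
    SUSHI_ENDPOINT,
    data=report_request.encode("utf-8"),
    headers={
        "Content-Type": "text/xml; charset=utf-8",
        "SOAPAction": "GetReportIn",  # exact action value comes from the provider's WSDL
    },
)

# The response is a SOAP envelope whose payload is the COUNTER report XML,
# ready to load into an ERM or parse for local analysis.
with urllib.request.urlopen(request) as response:
    counter_report_xml = response.read().decode("utf-8")

print(counter_report_xml[:500])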

Page 18: Getting the Most Out of Your E-Resources: Measuring Success


SUSHI: Where now, where to?

Passed unanimously by NISO membership in September, 2007

Formally approved as ANSI/NISO Z39.93:2007

Working toward broad ADOPTION
• Ask that it be included in your ERM solution
• Demand your content providers become SUSHI and COUNTER (rev 3) compliant
• Talk to your vendors and your consortia

Page 19: Getting the Most Out of Your E-Resources: Measuring Success


Be careful about the conclusions you draw

Page 20: Getting the Most Out of Your E-Resources: Measuring Success


What is the most used resource?

BioOne’s most viewed article

The nest architecture of the Florida harvester ant, Pogonomyrmex badius

• Walter R. Tschinkel

“Coolest images on the net”

Page 21: Getting the Most Out of Your E-Resources: Measuring Success


Link prefetching affecting data?

• Link prefetching is a browser mechanism which utilizes browser idle time to download or prefetch documents that the user might visit in the near future.
• Based on previous use data, the site provides a set of prefetching hints to the browser, and after the browser is finished loading the page, it begins silently prefetching the specified documents and stores them in its cache.
• When the user visits a prefetched document, it can be served up quickly out of the browser’s cache (a filtering sketch follows below).
– Source: Mozilla.org
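One practical consequence for usage statistics: prefetch requests reach the server whether or not the user ever opens the document. Firefox flags prefetch traffic with an "X-moz: prefetch" request header, so a usage-logging layer can choose to exclude it. A minimal sketch in Python (the function name and rule are illustrative, not a complete filter):

# Sketch: skip counting requests that the browser made only as a speculative prefetch.
# Firefox marks these with an "X-moz: prefetch" header; other signals exist, so treat
# this as an illustration of the idea rather than a complete filtering rule set.
def should_count_request(headers: dict) -> bool:
    """Return True if the request looks like a deliberate user request."""
    normalized = {name.lower(): value.lower() for name, value in headers.items()}
    if normalized.get("x-moz") == "prefetch":
        return False  # fetched speculatively; the user may never view it
    return True

# A prefetch hit is excluded from the usage count; a normal hit is kept.
assert should_count_request({"User-Agent": "Mozilla/5.0"}) is True
assert should_count_request({"X-moz": "prefetch", "User-Agent": "Mozilla/5.0"}) is False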

Page 22: Getting the Most Out of Your E-Resources: Measuring Success


User Interface Issues Affecting Data?

• How a publisher system is designed could affect reported usage
For example: if you have to visit the HTML page to get the PDF

Source: Price & Davis, JASIST, 2006 arxiv.org/pdf/cs/0602060


Page 23: Getting the Most Out of Your E-Resources: Measuring Success


Metasearch affecting data?

• Metasearch engines conduct multiple searches simultaneously
• Retrieve, consolidate and rank results based on algorithmic and semantic analysis
• Provide users with resource selections
• However, in many cases this search-and-retrieve activity results in recorded hits and downloads

Page 24: Getting the Most Out of Your E-Resources: Measuring Success


Applying Usage Data

Data - All facts

Information - Facts within context

Knowledge - Interrelationships among relevant facts

Wisdom - Actionable knowledge

Page 25: Getting the Most Out of Your E-Resources: Measuring Success


Basic Measures

• Cost per use - Are we getting comparative value from this resource? (a worked sketch follows this list)
• Are my systems working?
• Are there barriers to use - why are similar products experiencing different use patterns?
• Expressing value to administration, contributors or government sponsors
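A worked sketch of the first measure, cost per use: divide what was paid for a resource over a period by the COUNTER-reported full-text requests for the same period (all figures below are invented for illustration):

# Sketch: cost per use from an annual subscription price and JR1-style
# full-text request counts. All figures are invented examples.
annual_cost = 4500.00        # subscription price for the year (USD)
fulltext_requests = 1830     # COUNTER JR1 full-text article requests, same year

cost_per_use = annual_cost / fulltext_requests
print(f"Cost per use: ${cost_per_use:.2f}")        # -> Cost per use: $2.46

# Running the same calculation across resources gives the comparative view
# the slide asks about ("are we getting comparative value from this resource?").
resources = {"Resource A": (4500.00, 1830), "Resource B": (2200.00, 310)}
for name, (cost, uses) in resources.items():
    print(f"{name}: ${cost / uses:.2f} per use")   # A: $2.46, B: $7.10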

Page 26: Getting the Most Out of Your E-Resources: Measuring Success


Using Usage as a Quality Measure

• The amount of traffic an item receives is a separate, but valuable, metric for assessing quality

• Citation measures capture only one type of use - scholarly citation, not necessarily quality

• Teaching or clinical use is extremely valuable

Page 27: Getting the Most Out of Your E-Resources: Measuring Success


Comparing usage measures

Development project underway within COUNTER in partnership with UKSG

Goal: Derive a meaningful calculation for assessing quality through COUNTER data

Usage of items / Period of time (restated symbolically below)

Questions relate primarily to the denominator used in the calculation
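Written out, the working ratio on this slide is roughly (a hedged restatement, since the project had not yet fixed what the denominator should count):

\[
\text{Usage Factor} \;=\; \frac{U_p}{D_p}
\]

where \(U_p\) is the total COUNTER-reported usage of a journal's items during period \(p\), and \(D_p\) is the denominator still under discussion (the slide's working sketch simply uses the period of time itself).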

Page 28: Getting the Most Out of Your E-Resources: Measuring Success


Two methods for assessing quality

• Impact Factor
– Established, understood and generally accepted
– Funding agencies and researchers rely on its data
– Limited in the fields of scholarship it covers
– Reflects value of journals to researchers, but not all users
– Over-emphasis on IF distorts the behaviour of authors
– Over-used, mis-used and over-interpreted

• Usage Factor
– Usage-based alternative perspective
– Would cover all online journals
– Would reflect value of journals to all categories of user
– Would be easy to understand

(The Impact Factor formula is sketched below for contrast.)
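For contrast, the established citation-based measure above, the two-year journal Impact Factor, is defined as:

\[
\text{IF}_{y} \;=\; \frac{\text{citations received in year } y \text{ by items published in years } y-1 \text{ and } y-2}{\text{number of citable items published in years } y-1 \text{ and } y-2}
\]

A Usage Factor along the lines sketched on the previous slide would, roughly speaking, substitute COUNTER-reported usage for citations in the numerator; the exact denominator was still an open question at the time.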

Page 29: Getting the Most Out of Your E-Resources: Measuring Success


What’s in store in the future

• MESUR - MEtrics from Scholarly Usage of Resources
• Mellon-funded project to study assessment of the impact of scholarly communication items, and hence of scholars, with metrics that derive from usage data

Data analysis images from Johan Bollen

Page 30: Getting the Most Out of Your E-Resources: Measuring Success


MESUR - Some Examples

• Discerning methods of citation networks

• Describing journal usage comparisons

• Describing potential connectedness measures

• Relatedness of items based on use patterns

• “Readers who viewed this also viewed…”

Image source: www.mesur.org

Page 31: Getting the Most Out of Your E-Resources: Measuring Success


Privacy Concerns

If you have enough data, you can pinpoint individual people

• Say you have domain expertise
• You see that person X looks at this article, then that article, and then this whole series of articles
• You’ll probably be able to figure out who the person is and what they’re working on

Page 32: Getting the Most Out of Your E-Resources: Measuring Success


Other Areas for Development

• Non-journal content, A&I, bibliographies
• Content on multiple platforms
• Different versions existing on the network
• Institutional repository systems
– Limited usage tracking and consistent reporting
• Multimedia content, streaming content
• Mash-ups and multi-feed content
• Research data and visualization tools

Page 33: Getting the Most Out of Your E-Resources: Measuring Success


Thank you!

Todd Carpenter, Managing Director
[email protected]

One North Charles Street, Suite 1905
Baltimore, MD 21201 USA
(301) 654-2512
(410) 685-5278
www.niso.org