Getting (and giving) credit for all that we do


Page 1: Getting (and giving) credit for all that we do

Title: Getting (and giving) credit for all that we do

Melissa Haendel

NISO Research Data Metrics Landscape: An update from the NISO Altmetrics Working Group B: Output Types & Identifiers

11.16.2015

@ontowonka

Page 2: Getting (and giving) credit for all that we do

What *IS* “success”?

Page 3: Getting (and giving) credit for all that we do

https://goo.gl/b60moX

It’s not always what you see

Page 4: Getting (and giving) credit for all that we do

What is attribution???

Page 5: Getting (and giving) credit for all that we do

Over 1000 authors

Page 6: Getting (and giving) credit for all that we do

Project CRediT

http://projectcredit.net

Page 7: Getting (and giving) credit for all that we do

Many contributions don’t lead to authorship

BD2K co-authorship

D. Eichmann, N. Vasilevsky

20% of key personnel are not adequately profiled using publications
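As a minimal, hypothetical sketch of how a figure like this could be computed: compare the set of key personnel named on the award with the set of names appearing in the project's publication author lists. All names and records below are placeholders, not the actual BD2K data.

```python
# Hypothetical sketch: what fraction of key personnel never appear as co-authors?
# Names and publications below are placeholders, not the actual BD2K data.

key_personnel = {"Researcher A", "Researcher B", "Researcher C",
                 "Researcher D", "Researcher E"}

publications = [                     # author lists harvested from project publications
    {"Researcher A", "Researcher B"},
    {"Researcher B", "Researcher C", "Researcher D"},
]

credited = set().union(*publications)
uncredited = key_personnel - credited

print(f"{len(uncredited) / len(key_personnel):.0%} of key personnel "
      f"do not appear as co-authors: {sorted(uncredited)}")
# -> 20% with this toy data; the slide reports ~20% for BD2K key personnel.
```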

Page 8: Getting (and giving) credit for all that we do

Some contributions are anonymous

Data deposition

Image credit: http://disruptiveviews.com/is-your-data-anonymous-or-just-encrypted/

Anonymous review

Page 9: Getting (and giving) credit for all that we do

The Research Life Cycle

EXPERIMENT

CONSULT

PUBLISH

DATA

FUND

Page 10: Getting (and giving) credit for all that we do

The Research Life Cycle

EXPERIMENT

CONSULT

PUBLISH

DATA

FUND

Network

Page 11: Getting (and giving) credit for all that we do

• Measurement instruments
• Continuing education materials
• Cost-effective intervention
• Change in delivery of healthcare services
• Quality measure guidelines
• Gray literature

Evidence of meaningful impact

• New experimental methods, data models, databases, software tools

• New diagnostic criteria
• New standards of care
• Biological materials, animal models
• Consent documents
• Clinical/practice guidelines

https://becker.wustl.edu/impact-assessment http://nucats.northwestern.edu/

Diverse outputs. Diverse impacts. Diverse roles.

Each a critical component of the research process.

Page 12: Getting (and giving) credit for all that we do

EXAMPLE OUTPUTS related to software:

Outputs:
• binary redistribution package (installer)
• algorithm
• data analytic software tool
• analysis scripts
• data cleaning
• APIs
• codebook (for content analysis)
• source code
• software to make metadata for libraries, archives, and museums
• program code (for modeling)
• commentary in code (for open source, code authors and commentators/enhancers/hackers need attribution and can document what they did and why)
• computer language (a syntax to describe a set of operations or activities)
• software patch (a set of changes to code to fix bugs, add features, etc.)
• digital workflow (an automated sequence of programs; steps to an outcome)
• software library (non-stand-alone code that can be incorporated into something larger)
• software application (computer code that accomplishes something)

Roles: catalog, design, develop, test, hacker, bug finder, software developer, software engineer, developer, programmer, system administrator, execute, document, software package maintainer, project manager, database administrator

Attribution workshop results: >500 scholarly products
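A minimal sketch of how workshop outputs and roles like these might be captured as structured attribution records rather than free text; the field names and example values here are illustrative assumptions, not an openRIF or CRediT schema.

```python
# Illustrative only: structured (contributor, role, output) records.
# Field names and values are assumptions, not an openRIF/CRediT schema.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Contribution:
    contributor: str            # person identifier, e.g. an ORCID iD
    role: str                   # e.g. "software developer", "bug finder"
    output: str                 # e.g. "software patch", "analysis scripts"
    output_id: Optional[str]    # identifier for the output, if one exists

records = [
    Contribution("0000-0000-0000-0001", "software developer", "source code",
                 "https://example.org/repo"),
    Contribution("0000-0000-0000-0002", "bug finder", "software patch", None),
]

# Group outputs by role: the kinds of work a publication-only profile would miss.
by_role: dict = {}
for r in records:
    by_role.setdefault(r.role, []).append(r.output)
print(by_role)
```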

Page 13: Getting (and giving) credit for all that we do

Connecting people to their “stuff”

Page 14: Getting (and giving) credit for all that we do

Modeling & implementation

VIVO-ISF: Suite of ontologies that integrates and extends community standards
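To make "modeling & implementation" concrete, here is a tiny RDF sketch of the kind of statement such an ontology suite lets you express: a person linked to a non-publication output. FOAF is a real vocabulary; the ex: namespace and the contributedTo property are placeholders, not actual VIVO-ISF terms. Requires the rdflib package.

```python
# Sketch only: link a person to a dataset in RDF. The ex: terms are
# placeholders, not VIVO-ISF/openRIF properties.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import FOAF, RDF

EX = Namespace("http://example.org/")

g = Graph()
g.bind("foaf", FOAF)
g.bind("ex", EX)

g.add((EX.karen, RDF.type, FOAF.Person))
g.add((EX.karen, FOAF.name, Literal("Karen Example")))
g.add((EX.dataset3, RDF.type, EX.Dataset))
g.add((EX.karen, EX.contributedTo, EX.dataset3))   # relation name is a placeholder

print(g.serialize(format="turtle"))
```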

Page 15: Getting (and giving) credit for all that we do

Credit extends beyond the original contribution

Stacy creates mouse1

Kristi creates mouse2

Karen performs RNA-seq analysis on mouse1 and mouse2 to generate dataset3, which she subsequently curates and analyzes

Karen writes publication pmid:12345 about the results of her analysis

Karen explicitly credits Stacy as an author but not Kristi.

Page 16: Getting (and giving) credit for all that we do

Credit is connected

Credit to Stacy is asserted, but credit to Kristi can be inferred
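A minimal sketch of the inference suggested here: walk the derivation links behind a publication and collect everyone attributed to anything upstream. The edge names are generic placeholders, not VIVO-ISF/openRIF properties; only the toy mouse1/mouse2/dataset3/pmid:12345 example from the previous slide is encoded.

```python
# Sketch only: propagate credit backward along "derived from" links.
# Edge and node names encode the toy example, not a real attribution model.

derived_from = {
    "dataset3": ["mouse1", "mouse2"],    # RNA-seq dataset built from both mice
    "pmid:12345": ["dataset3"],          # publication reports the analysis
}

attributed_to = {
    "mouse1": {"Stacy"},
    "mouse2": {"Kristi"},
    "dataset3": {"Karen"},
    "pmid:12345": {"Karen", "Stacy"},    # authorship as explicitly asserted
}

def inferred_credit(artifact: str) -> set:
    """Everyone attributed to the artifact or to anything it derives from."""
    people = set(attributed_to.get(artifact, set()))
    for source in derived_from.get(artifact, []):
        people |= inferred_credit(source)
    return people

print(inferred_credit("pmid:12345"))
# {'Karen', 'Stacy', 'Kristi'} -- Kristi's contribution surfaces via mouse2.
```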

Page 17: Getting (and giving) credit for all that we do

Introducing openRIF: the Open Research Information Framework

openRIF

SciENcv

eagle-i

VIVO-ISF

Page 18: Getting (and giving) credit for all that we do

Ensuring an openRIF that meets community needs

Data Entry

Discovery

Interoperability

A domain-configurable suite of ontologies to enable interoperability across systems

A community of developers, tools, data providers, and end-users

Page 19: Getting (and giving) credit for all that we do

Developing a computable research ecosystem

Research information is scattered amongst:
• Research networking tools
• Citation databases (e.g., PubMed)
• Award databases (e.g., NIH RePORTER)
• Curated archives (e.g., GenBank)
• Text of the research literature, where it is locked up

Map SciENcv data model to VIVO-ISF/openRIF

Enable bi-directional data exchange

Integrate SciENcv and ORCID data into CTSAsearch: http://research.icts.uiowa.edu/polyglot/

CTSAsearch: David Eichmann
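A minimal sketch of what a bi-directional mapping implies in practice: a field crosswalk applied in either direction. The field names on both sides are illustrative assumptions, not the actual SciENcv data model or VIVO-ISF/openRIF properties.

```python
# Sketch only: a bi-directional field crosswalk between two profile formats.
# Field names are assumptions, not the real SciENcv or openRIF models.

SCIENCV_TO_OPENRIF = {
    "name": "label",
    "orcid": "orcidId",
    "position_title": "preferredTitle",
}
OPENRIF_TO_SCIENCV = {v: k for k, v in SCIENCV_TO_OPENRIF.items()}

def translate(record: dict, mapping: dict) -> dict:
    """Rename fields according to the crosswalk; drop unmapped fields."""
    return {mapping[k]: v for k, v in record.items() if k in mapping}

profile = {"name": "Jane Example", "orcid": "0000-0000-0000-0003"}
as_openrif = translate(profile, SCIENCV_TO_OPENRIF)
round_trip = translate(as_openrif, OPENRIF_TO_SCIENCV)
assert round_trip == profile   # lossless for mapped fields in both directions
```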

Page 20: Getting (and giving) credit for all that we do

Thank you!

Join the FORCE11 Attribution Working Group at: https://www.force11.org/group/attributionwg

Join the openRIF listserv at: http://group.openrif.org

Page 21: Getting (and giving) credit for all that we do

Identifying those scholarly outputs

Identifiers are needed for things that are not publications or documents; we need to get beyond thinking only about DOIs
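As a rough sketch of what "getting beyond DOIs" looks like in code: recognize a few identifier families and sanity-check an ORCID iD. The patterns are simplified; the check-digit routine follows the ISO 7064 11-2 scheme that ORCID documents for its iDs, and the example values are generic.

```python
# Sketch only: simplified identifier recognition plus the ORCID check digit
# (ISO 7064 11-2). Example values are generic/documentation examples.
import re

def identifier_type(value: str) -> str:
    """Very rough classification of a few common scholarly identifier families."""
    if re.match(r"^10\.\d{4,9}/\S+$", value):
        return "doi"
    if re.match(r"^\d{4}-\d{4}-\d{4}-\d{3}[\dX]$", value):
        return "orcid"
    if value.lower().startswith("pmid:"):
        return "pubmed"
    return "unknown"

def orcid_checksum_ok(orcid: str) -> bool:
    """Verify the ISO 7064 11-2 check character on a 16-character ORCID iD."""
    digits = orcid.replace("-", "")
    total = 0
    for ch in digits[:-1]:
        total = (total + int(ch)) * 2
    check = (12 - total % 11) % 11
    return digits[-1] == ("X" if check == 10 else str(check))

print(identifier_type("10.1234/example.5678"))      # doi (hypothetical DOI)
print(identifier_type("pmid:12345"))                # pubmed (the example from slide 15)
print(orcid_checksum_ok("0000-0002-1825-0097"))     # True (ORCID's documentation example iD)
```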