Software Practice 12 breakout - Tracking usage and impact of software


Upload: softwarepractice

Post on 21-Dec-2014



TRANSCRIPT

Page 1: Software Practice 12 breakout - Tracking usage and impact of software

Software reward, citation, attribution

Tracking usage and impact

Neil Chue Hong, Alberto Di Meglio, Josh Greenberg, Juan Lalinde, Kevin Jorissen

Page 2:

Models of attribution

• Traditional notion of citations - authority flows from paper to paper through citation chains
  – Lots of murkiness when it comes to software.
  – Citation is one way of measuring impact, but only one.
• Papers are completed and published before people "use" them, so impact is always downstream
  – Software can be published multiple times.
  – You write a paper so someone else can read it. You only fix bugs in the pre-print.
  – You don't maintain the paper; you publish new work, new papers.
  – We don't check papers for their dependencies and revise them without new work.
• Software is more like a long-term research project which has many versions (akin to results)
• If you create things which are of higher quality, that has to be rewarded.
  – Helping out on forums has huge impact, but the recognition is zero.
  – The reward for the software itself should be more than for the paper that describes it.
  – The impact of software should be even greater than the impact of a single paper because it provides tools for doing many things.

Page 3:

Ways in which we like to be rewarded

• Money
  – Salary
  – Prizes
• Recognition and Respect
  – Academic
  – Peers
  – Public
• Achievement of long-term platform funding
• Promotion and tenure
• Being featured by others
• Being curated
• Chocolate cake

Page 4:

Ways in which we can measure usage and impact

• counting downloads
• counting citations of related papers
• counting direct citations of software
  – the about box should give a very clear citation that can be copied and pasted
• counting numbers of licenses granted
• putting in constraints asking for updates on usage as part of the licenses
• logging usage through checking for updates (e.g. in Zotero)
• web analytics techniques
• statistics from software catalogues, marketplaces, science gateways (e.g. in nanoHUB)
• we want to measure how people are using the software (not just when they are using it)
  – collect statistics manually through site administrators registering services at their sites (could be automatic)
  – citation of software, generating data when it's used (version used, authors, size of usage)
  – number of committers, contributors, participants; vitality of the community
  – surveys, site visits, observation of scientists in their daily routine
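The "logging usage through checking for updates" idea above can be sketched minimally: the tool sends an anonymous ping alongside its routine update check, and the server counts active use per version rather than raw downloads. The payload field names and the aggregation are assumptions for illustration, not any real project's API.

```python
from collections import Counter
from datetime import datetime, timezone

def build_usage_ping(package, version):
    """Build the anonymous payload a tool could send with its
    routine update check (hypothetical field names)."""
    return {
        "package": package,
        "version": version,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

def aggregate_pings(pings):
    """Server side: count active use per version, which says more
    about real usage than a one-off download count."""
    return Counter(p["version"] for p in pings)

# Example: three pings, two versions in use.
pings = [build_usage_ping("mytool", v) for v in ("1.2", "1.2", "1.3")]
print(aggregate_pings(pings))  # Counter({'1.2': 2, '1.3': 1})
```

A real deployment would need an opt-out and care about privacy, which the discussion's licence-based variants also imply.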

Page 5:

Changes to make it easier to track usage and impact of software

• A formal way of tracking - DOIs for software? Software citations.

• Software depositories for reproducible papers (e.g. RunMyCode)

• Better upstream practices, e.g. always using networked code repositories

• A button in the software for "prepare my results and other stuff for publication"
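One low-cost version of the "clear, copy-pasteable citation" and software-DOI ideas above is for the software itself to render its citation on request, e.g. in the about box. All metadata in this sketch (names, title, DOI) is made up for illustration; the formatting convention is an assumption, not a standard.

```python
def format_citation(authors, title, version, year, doi=None):
    """Render a copy-pasteable citation string for an about box.
    All metadata values passed in below are illustrative."""
    author_str = ", ".join(authors)
    cite = f"{author_str} ({year}). {title} (version {version}) [software]."
    if doi:
        # Minting a DOI per release lets the version be tracked directly.
        cite += f" https://doi.org/{doi}"
    return cite

print(format_citation(
    ["Doe, J.", "Roe, R."],
    "Example Analysis Toolkit",
    "2.1",
    2014,
    doi="10.0000/example",
))
```

Because the string embeds the version, citations of the software can be counted per release, which matches the "software is published multiple times" observation earlier.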

Page 6:

What are the biggest issues?

• changing the culture surrounding the value and importance of software when looking at career progression (stopping the self-reinforcing process)

• how do you relatively value someone's contribution, and apportion credit (articulation of roles?)

• do we understand the core community who can judge the value and impact?

• understanding how to cite software so it can be tracked is difficult

Page 7:

Things we'd like to understand

• What's the model of credit for the impact of software on the work it enables (i.e. what lets you rack up points)?
  – 1 point every time a paper cites you, or 50 points if a paper that uses you is cited 50 times?
• Is there one scientific community, or many scientific communities?
  – From which communities do people want to get recognition, and from whom within those communities?
• Are there examples where removing the "hierarchical value/weighting" or hyperdifferentiating (extreme differentiation of roles) models of attribution work well in the world of regular scholarly communication?
• Should there be a differential weighting of the respect that an individual gives (TripAdvisor model vs "wise ones"/Faculty of 1000)?
  – Who is important in the community for giving out "respected" rewards?
• Can we pick a handful of relatively complex pieces of software and ask the people involved in their development to assign relative values to each other's contributions? Does that change over time?
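The two point models floated in the first bullet can be written down to make the contrast concrete. The point values (1 per citing paper; pass-through of downstream citations) come from the discussion; everything else here, including the function names, is an assumption.

```python
def direct_credit(papers_citing_software):
    """Model A: 1 point for every paper that cites the software directly."""
    return len(papers_citing_software)

def transitive_credit(citations_of_each):
    """Model B: credit flows downstream - the software inherits the
    citations of each paper that uses it, so a paper cited 50 times
    contributes 50 points."""
    return sum(citations_of_each)

# Three papers cite the software; they have 50, 3, and 0 citations each.
papers = ["paper-a", "paper-b", "paper-c"]
print(direct_credit(papers))              # 3
print(transitive_credit([50, 3, 0]))      # 53
```

The large gap between the two totals for the same software is exactly the open question: which number better reflects the impact of a tool on the work it enables?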