Software Discovery Index Workshop Report

By Vivien Bonazzi1, Phil Bourne1, Steven Brenner2, Robin Brown3, Ishwar Chandramouliswaran3, Jennifer Couch3, Sean Davis3, Leslie Derr1, Asif Dhar4, Luke Dunlap4, Kevin Eliceiri5, Leigh Finnegan6, Ian Fore3, Melissa Haendel7, Martin Hammitzsch8, Tram Huyen9, Daniel S. Katz10, Mike Kellen11, David Kennedy12, Jennie Larkin1, Jennifer Lin13, Peter Lyster14, Ron Margolis15, Gabor Marth16, Maryann Martone17, Michael McLennan18, Martin Morgan19, Francis Ouellette20, Vinay Pai21, Andreas Prlic17, Will Schroeder22, Michael Sherman23, Heidi Sofia6, James Taylor24, Kaitlin Thaney25, Chris Wellington6, Owen White26

1. National Institutes of Health, Office of the Director 2. University of California, Berkeley 3. National Cancer Institute 4. Deloitte LLP 5. University of Wisconsin - Madison 6. National Human Genome Research Institute 7. Oregon Health & Science University 8. Deutsches GeoForschungsZentrum GFZ 9. National Institute of Allergy and Infectious Diseases 10. National Science Foundation (NSF) 11. Sage Bionetworks 12. University of Massachusetts Medical School 13. Public Library of Science (PLOS) 14. National Institute of General Medical Sciences 15. National Institute of Diabetes and Digestive and Kidney Diseases 16. Boston College 17. University of California, San Diego 18. Purdue University 19. Fred Hutchinson Cancer Research Center 20. Ontario Institute for Cancer Research 21. NIBIB, NIH 22. Kitware, Inc. 23. Stanford University 24. Johns Hopkins University 25. Mozilla Science Lab 26. University of Maryland School of Medicine


The National Institutes of Health (NIH), through the Big Data to Knowledge (BD2K) initiative, held a workshop in May 2014 to explore challenges facing the biomedical research community in locating, citing, and reusing biomedical software. The workshop participants examined these issues and prepared this report summarizing their findings.

The constituents with the potential to benefit from improved software discoverability include software users, developers, journal publishers, and funders. Software developers face challenges in disseminating their software and measuring its adoption. Software users have difficulty identifying the most appropriate software for their work. Journal publishers lack a consistent way to handle software citations or to ensure the reproducibility of published findings. Funding agencies struggle to make informed decisions about which software projects to support, while reviewers have difficulty assessing the relevance and effectiveness of proposed software in the context of data management plans and proposed analyses.

This document summarizes the recommendations generated at that meeting, and we are now requesting comments from the larger community. We have contacted a broad set of constituents representing software users, software developers, NIH staff, electronic repositories, and journal publishers.


Cite this work

Researchers should cite this work as follows:

  • Vivien Bonazzi; Phil Bourne; Steven Brenner; Robin Brown; Ishwar Chandramouliswaran; Jennifer Couch; Sean Davis; Leslie Derr; Asif Dhar; Luke Dunlap; Kevin Eliceiri; Leigh Finnegan; Ian Fore; Melissa Haendel; Martin Hammitzsch; Tram Huyen; Daniel S. Katz; Mike Kellen; David Kennedy; Jennie Larkin; Jennifer Lin; Peter Lyster; Ron Margolis; Gabor Marth; Maryann Martone; Michael McLennan; Martin Morgan; Francis Ouellette; Vinay Pai; Andreas Prlic; Will Schroeder; Michael Sherman; Heidi Sofia; James Taylor; Kaitlin Thaney; Chris Wellington; Owen White (2015), "Software Discovery Index Workshop Report,"
