Research software is a driving force in today's academic ecosystem: most researchers rely on it to do their work, and many write some of their own code. Despite its importance, research software is typically not included in tenure, promotion, and recognition policies and processes. In this article, we invite discussion on how to value research software, integrate it into academic evaluations, and ensure its sustainability. Building on discussions hosted by the US Research Software Sustainability Institute and by the international Research Software Engineering community, we outline a set of possible activities aimed at elevating the role of research software in academic career paths, recognition, and beyond. One is a study of the role of software contributions in academic promotions. Another is to document and share successful academic recognition practices for research software. A third is to create guidance documents for faculty hiring and tenure evaluations. Each of these proposed activities is a building block of a larger effort to create a more equitable, transparent, and dynamic academic ecosystem. We've assembled 44 such ideas as a starting point and posted them as issues on GitHub, and we encourage readers to engage: add potential activities, comment on existing ideas to improve them, flag ongoing or already-completed work in a particular area so that efforts aren't duplicated and collective knowledge grows, or indicate willingness to collaborate on a specific activity. This living list serves as a hub for collective action and thought, with the overall aim of recognizing the value of creating and contributing research software.
Research software is a driving force in contemporary scholarly research, with most researchers relying on it to do their work. For example, a survey of elite UK universities found that 90% of researchers rely on research software in their work (Hettrick et al., 2014). Despite its importance, research software is rarely explicitly included in tenure, promotion, and recognition policies and processes around the world. As the two researchers working on the US Research Software Sustainability Institute (URSSI) Policy project, based at the National Center for Supercomputing Applications, we bring backgrounds in research software and social science. In this article, we seek to open discussions on how to value research software, integrate it into academic evaluations, and ensure its sustainability and impact.
Research software covers a wide gamut: source code files, algorithms, scripts, computational workflows, and executables—all fashioned explicitly for research aims. As research increasingly leans on computational methods, the need to support the production, maintenance, and impact of software through tenure, promotion, and recognition is more urgent than ever. Seeking to address this need raises many questions. How can the impact of research software be measured and extended? How can research software be integrated into academic evaluations? How can the long-term sustainability of research software be ensured? A set of answers to such questions may be found through policy-related activities, including advocacy, organizing, and research. Because no one person or group can carry out all the needed work in this area, we (the authors of this article) have outlined and shared a set of possible activities along these lines that aim at elevating the role of research software in academic career paths, recognition, and beyond.
Building on discussions in the US Research Software Sustainability Institute (URSSI), including a number of workshops and the development of an implementation plan, we've assembled 44 possible activities aimed at elevating the role of research software in academic career paths, recognition, and beyond, and we've posted them as issues on GitHub as an initial step in our current Alfred P. Sloan-funded URSSI Policy project. One of these possible activities is a study to investigate the role of software contributions in academic promotions. Another is to document and share successful recognition practices for research software contributions. A third involves developing guidance documents for faculty hiring and tenure evaluations. Each proposed activity can be viewed as a building block in a larger effort to create a more equitable, transparent, and dynamic academic system. Such initiatives don't require a complete overhaul of the academic reward system; instead, they would help fine-tune tenure, promotion, and recognition to better support today's research environment, which is increasingly underpinned by technology.
The proposed activities are intended to be a living list, where anyone can:
Propose new potential activities
Comment on an existing proposed activity
Indicate that they are working on an activity
Say that someone else has already worked on an activity, and point to their work
Offer to work on an activity in collaboration with others
We have labeled these activities, based on our own judgment, as follows:
Type of activity: Research, Advocacy, Organizing/action, Documentation
Topic: Career path, Impact, Public software, Quality, Maintenance, Diversity and inclusion
Estimated effort: Small (less than one person-year), Medium (1-3 person-years), or Large (more than 3 person-years)
If you are interested in research software and related policy, we hope you will help us maintain and augment this list of activities that we think are needed to advance the research software field. Perhaps an item on the list would be a good project for you to work on, or a good class or independent-study project!
Among the proposed activities that readers are invited to contribute to are:
Scope: Investigate how software contributions factor into academic promotions and tenure at a research organization.
Impact: Aims to influence policy changes by revealing disparities between traditional research outputs and software contributions.
Scope: Document and share examples of successful recognition practices for research software contributions.
Impact: Provides a blueprint for academic institutions to integrate software contributions into their recognition systems.
Scope: Create guidance documents to assist stakeholders in faculty hiring and tenure evaluations.
Impact: Aims to reshape traditional practices, emphasizing the role of software in modern research.
Scope: Create a mailing list to foster community and information exchange among those interested in research software careers, including sharing advice on navigating tenure and promotion systems.
Impact: Aims to build a more robust and informed community in the research software domain.
Scope: Develop case studies revealing the diverse career paths of research software engineers (RSEs) to help demystify tenure, promotion, and recognition for this category of researcher.
Impact: Provides a holistic understanding of the field, helping RSEs steer through the career landscape.
Scope: Conduct a comparative economic study of RSE salaries versus those of other researchers.
Impact: Aims to identify disparities and inform institutional tenure and promotion decision-making and policies.
Scope: Develop sample language that highlights the value of research software skills.
Impact: Bolsters organizational capacity and sets a standard for research software roles to improve tenure, promotion, and recognition processes.
Scope: Help individuals claim software works on platforms like ORCID.
Impact: Empowers individuals to more effectively leverage their contributions during tenure and promotion processes.
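One concrete way to make a software work claimable is to give each release a citable DOI (for example, via an archiving service such as Zenodo) and to ship a `CITATION.cff` file in the repository; registry integrations such as ORCID's search-and-link wizards can then surface the work for its authors to claim. The sketch below follows the Citation File Format 1.2.0 schema, but the project name, URL, DOI, and ORCID iD are all placeholders:

```yaml
# CITATION.cff — machine-readable citation metadata (Citation File Format 1.2.0)
cff-version: 1.2.0
message: "If you use this software, please cite it using these metadata."
title: "Example Analysis Toolkit"                # hypothetical project name
version: "1.4.0"
date-released: "2024-03-01"
doi: "10.5281/zenodo.0000000"                    # placeholder: use the DOI minted for your release
repository-code: "https://github.com/example/analysis-toolkit"  # placeholder URL
authors:
  - family-names: "Researcher"
    given-names: "Alex"
    orcid: "https://orcid.org/0000-0002-1825-0097"  # placeholder iD
```

Because the file lives in the repository itself, the metadata travels with the code, and hosting platforms that understand the format can display a ready-made citation for the work.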
In this video, we detail how you can engage with this set of ideas.
This living list on GitHub serves as a hub for collective action and discussion, aiming to encourage change.
Research software is indispensable in contemporary scholarly research. But let’s not kid ourselves: Shifting tenure, promotion, and recognition systems is difficult. Academic expectations and norms are resistant to change. Progress requires concrete actions such as changing evaluation rubrics and broadening the scope of what is recognized as impactful scholarly contributions. To make such changes, broad-based collaboration is essential. It's not just the researchers who should spring into action; academic institutions and funding bodies must roll up their sleeves too. We are encouraging such changes through participatory workshops with diverse research software stakeholders (for example, we have published blog articles about the workshop we convened at the 2023 IEEE eScience conference, focusing on career paths and diversity and inclusion in research software). Our project builds on URSSI’s established planning, furthering its vision for software sustainability and impact. We are just getting started, but here are some major threads that are under development for the project:
We are conducting a study with international research software funders to understand how, and with what effect, their programs and policies are advancing research software.
We'll offer theory-of-change training, along with supporting templates and guidelines, to help funders refine and evaluate their investments in research software.
We’ll conduct an analysis of the impact of research software in the wider world, beyond academia.
We are collaborating with others to help improve institutional policy for research software. In particular, we are engaging with the joint Research Software Alliance (ReSA) and Research Data Alliance (RDA) Working Group on Policies in Research Organisations for Research Software (or PRO4RS).
Given the weight that research funding carries in academic promotion and recognition, it is a critical piece of this puzzle. Funders’ criteria often drive the directions in which researchers venture. By including software-related criteria in funding eligibility, consistently funding important software contributions, and publicly recognizing software-related contributions to successful research outcomes, research funders can help nudge the academic community toward recognizing computational contributions. Indeed, an increasing number of funders are creating calls specifically for research software. A key recent development in this direction is ADORE.software, the Amsterdam Declaration on Funding Research Software Sustainability. This international declaration establishes foundational guidelines and suggestions for securing the long-term viability of research software. This kind of shift in the research funding landscape can spur innovation and incentivize faculty and administrators to weave software into the academic reward fabric.
It's time to diversify the academic portfolio by incorporating software contributions. The metrics of academic merit need to shift, and universities need to be agile in adapting their evaluation parameters. Yet, it's not just about adding a new column in an evaluation spreadsheet. It's about fostering an inclusive culture that equally values multiple forms of academic output. The endgame is an academic world where software contributions are seen not as a sideline, but as a central pillar of modern scholarship. This is a world where code is peer-reviewed like a journal article, where contributions to GitHub count toward tenure, and where software’s impact is a currency that academia recognizes and respects.
Hettrick, S., Antonioletti, M., Carr, L., Chue Hong, N., Crouch, S., De Roure, D., Emsley, I., Goble, C., Hay, A., Inupakutika, D., Jackson, M., Nenadic, A., Parkinson, T., Parsons, M. I., Pawlik, A., Peru, G., Proeme, A., Robinson, J., & Sufi, S. (2014). UK Research Software Survey 2014 [Data set]. Zenodo. https://doi.org/10.5281/zenodo.14809