
The Other Side of Data Analytics: Risks to Privacy, Equity, and Control

What are the issues surrounding data analytics and how should organizations use data ethically?
Published on Apr 19, 2021


The core activities of senior academic administrators are threatened by algorithms that could effectively dictate how to increase research productivity, climb rankings, or raise graduation rates. If they aspire to shape the agenda of their institutions, rather than just manage them, academic leaders should take (or retake) control.

Much thoughtful work has been published on the privacy issues posed by algorithms and their bias against disadvantaged individuals and communities (for example, in books like Weapons of Math Destruction and Algorithms of Oppression). Yet these concerns do not seem to temper the interest of academic administrators in acquiring and using these tools, despite the absence of safeguards against the risks their use poses. While administrators have been reluctant to discuss adoption, we have recently seen an entire national consortium (in the Netherlands) acquire access to Elsevier’s data analytics tools on research and research productivity as part of a broader contract for access to its journals.

Senior administrators face boards or governments scrutinizing every small move up or down in university rankings, and many are effectively held hostage to performance-based funding, while the journal Impact Factor plays a crucial role in evaluating the work of faculty and researchers. Administrators, on the other hand, face little or no scrutiny for signing up and subscribing to data analytics services. It is not surprising that data analytics and AI look poised to play an even larger role in academia.

It is important, however, to avoid becoming Luddites. A world without data is not a world free of biases – after all, human biases are the root cause of algorithmic bias. In fact, without data it is impossible to counter biased decisions. In short, the goal is to introduce safeguards, not to renounce data, data analytics, and AI.


In 2019, SPARC issued a report1 summarizing an approach aimed both at addressing the nuts and bolts of managing the rising tide of data and algorithms and at designing a long-term, structural solution to the problems we have identified, through three sets of actions.

1. Risk Mitigation

A number of issues are relatively straightforward and admit equally straightforward solutions. These actions can be described as “risk mitigation” activities, because they aim to minimize some basic problems (such as poor coordination or the lack of adequate data policies) posed by the rising collection and use of data and data analytics in the academic community.

In aggregate, risk mitigation actions should not be controversial, as they represent sound management practices. There is room for different cultural approaches, of course, but the overall objective of better data management should not be in dispute. For example, many data policies today are largely technical and defensive, when they should also address how to use data to achieve institutional objectives – shifting their primary focus from preventing unauthorized data access to orienting and prioritizing authorized uses. It is difficult to envision anyone opposing this shift.

2. Strategic Choices

A second category of issues is described in the SPARC report as “strategic choices”. These issues are much more complex to decide, and perfectly reasonable arguments can be made for widely diverging responses. For example, should predictive software be deployed to identify students and faculty at risk of taking violent action, against themselves or others? Should humans screen graduate student applications, at least for a first selection, or should software drive this process?

Intuitively, different institutions may come to decide on any of these points in opposite ways. It is important to ensure that decisions are the end result of a structured and transparent process that involves all the relevant stakeholders in the institution and that is properly managed with the contribution of relevant expertise (such as business, legal, or ethical advice).

3. Community Actions

Finally, some actions are too broad for any one academic institution to take alone, or are more likely to be effective if backed by a larger group of institutions; these can be described as “community actions”. They range from advocating for Open Science and Open Access principles in the appropriate venues to bringing critical infrastructure under community control. Of course, the entire academic community cannot be expected to support each of these actions, so it is reasonable to plan for these activities to aggregate coalitions of willing institutions that share goals and objectives and are culturally compatible. These coalitions will form as networks of individuals and institutions start discussing common views about the desirability of these actions.

Figure 1: Example decision tree


2020 provided many lessons to the academic community. Two, however, stand above all others:

1. The value of Open Science

2. The centrality of equitable access to knowledge and its benefits.

There has been much praise for the “bottom-up moonshot” performed by so many researchers, public health officials, and medical staff around the world to combat the COVID-19 pandemic. An estimated 100,000 articles related to COVID-19 were published in 2020; by one count, there may have been as many as 200,000. In doing so, these individuals often prioritized the rapid dissemination of knowledge over traditional publishing processes: at least 30,000 COVID-19 preprints were posted in 2020.

However, not enough has been done to institutionalize equitable dissemination of scholarly work. We still see many institutions signing transformative deals that are exclusionary by design: they continue to endorse restricted access to articles and data in subscription journals, they limit the capacity to freely text- and data-mine research outputs, or they limit the capacity of disadvantaged institutions and communities to contribute when funding is based on APCs. Dismantling these inequities must be the next task.


Questions you ought to ponder for your organization:

Is there a known, effective solution for your issue?

How many people would be involved in creating a solution to this problem?

How many possible solutions can you think of for the problem at hand?

Do you need to conduct some research in order to better understand possible solutions?

Will your solution to this issue impact people outside of your institution?

Will you need buy-in from people outside of your institution to implement this change?
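The questions above can be read as a rough triage for assigning an issue to one of the three SPARC categories discussed earlier. The sketch below is a hypothetical illustration of that mapping in Python; the field names and the category logic are our own assumptions for illustration, not taken from the SPARC report or its decision tree.

```python
# Hypothetical triage: map answers to the questions above onto the
# three SPARC action categories. The mapping is an illustrative
# assumption, not the report's actual decision tree.
from dataclasses import dataclass

@dataclass
class Issue:
    has_known_solution: bool           # Is there a known, effective solution?
    affects_outside_institution: bool  # Will the solution impact outsiders?
    needs_external_buy_in: bool        # Is buy-in from other institutions needed?

def categorize(issue: Issue) -> str:
    """Return the SPARC action category an issue most plausibly falls into."""
    # Anything requiring coordination beyond one institution is a
    # candidate for a community action.
    if issue.needs_external_buy_in or issue.affects_outside_institution:
        return "community action"
    # In-house issues with a known, effective fix are risk mitigation.
    if issue.has_known_solution:
        return "risk mitigation"
    # In-house issues with no settled answer are strategic choices.
    return "strategic choice"

# Example: a missing data-retention policy has a known fix and stays in-house.
print(categorize(Issue(True, False, False)))   # risk mitigation
```

In practice the answers would come from the structured, stakeholder-driven process described above rather than from three booleans, but the sketch shows how the questions narrow an issue down to one of the three sets of actions.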
