4 February 2020

IFLA Input to a Report on New Information Technologies and Racial Equality

IFLA Submission for the Thematic report on new information technologies, racial equality, and nondiscrimination

IFLA has submitted a response to a call for inputs from the United Nations Special Rapporteur on Contemporary Forms of Racism concerning the impact of new technologies, sharing the library field's experience and highlighting the need to respect the key library value of equitable access to information.

The application and use of new information technologies – particularly data-driven technologies like Artificial Intelligence and Machine Learning – can have a significant impact on racial equality and non-discrimination. Over the last few years, there has been increasing scrutiny of biased or discriminatory outcomes stemming from the application of such technologies, from predictive policing and social benefit allocation to credit scoring and other forms of ‘social ranking’.

IFLA’s contribution focuses on the possible impacts within the online information environment, on access to information at large, and on the library sector itself. Some of the key points the contribution seeks to highlight are the following:

  • Online targeting and profiling could perpetuate inequalities in cases where disadvantaged groups might not have access to some information or opportunities. Targeted online advertisements are often cited as an example to illustrate such concerns.
  • There have been documented cases of algorithms in the online information environment displaying biased or discriminatory results – for example, in the autocomplete suggestions or image search functions of a search engine. The existing corpus of online texts can itself contain racial biases, and these can affect the algorithms of online platforms or other AI/ML systems that use such texts as training data.
  • The concerns over biased training data can also be relevant for libraries in light of the rise of the “collections as data” phenomenon. Cultural heritage collections can be used for computational research, including machine learning – but it is important to be mindful that such collections might contain biases or underrepresent minority narratives, and to work with the communities concerned to remedy any such underrepresentation.
  • Algorithmic literacy is crucial to helping people critically evaluate decisions made by algorithmic or AI systems, navigate an AI-driven information field and know how they can exercise their rights. Several initiatives by libraries and library associations have begun exploring ways to make such literacy programmes available to their communities.

Read the contribution as a PDF.
