New guide to help companies act responsibly when operating digitally

A man in front of a tablet using facial recognition. Photo: Rodrigo Reyes Marin/Zuma/Ritzau Scanpix
Building on our award-winning methodology on human rights impact assessment, the Institute has developed a guide for the private sector operating in the ever-expanding digital ecosystem.

Facial recognition technologies, algorithmic decision-making and many other digital products and services help private companies optimise their business models and make all kinds of processes more efficient. However, these technological breakthroughs also have downsides. For example, when companies sell facial recognition systems to, or launch smart mobility systems in, countries that have no data protection laws protecting their citizens, the result can be severe human rights impacts.

The Danish Institute for Human Rights has developed new practical guidance for businesses and other actors on how to conduct human rights impact assessments of digital activities. Building on our award-winning methodology on human rights impact assessment, the Institute has drawn on its own experience and collaborated with a wide range of external stakeholders, including companies, to create accessible guidance for the private sector and other users of digital technologies.

Necessary steps to mitigate risks

“Making sure that technology serves society is not just the responsibility of companies, but of all stakeholders. I greatly hope that this guidance will help tech companies, large and small, better understand the potential human rights impacts of their products and services, and take the necessary steps to mitigate risks. Doing so requires building relationships and dialogues with different stakeholders, users and others who might be potentially affected. That collaboration is critical, and part of the strength of this guidance comes from the fact that the Danish Institute for Human Rights itself has brought in so many groups and experts from different stakeholder groups in its development," Richard Wingfield, Head of Legal, Global Partners Digital, says.

The assessment of human rights impacts of business activities is considered a key component of the corporate responsibility to respect human rights and of human rights due diligence, as outlined in the UN Guiding Principles on Business and Human Rights. After decades of digitalisation, virtually all major companies are today active in the digital ecosystem. At the same time, an increasing number of actors, from civil society and academia to intergovernmental organisations, are calling for companies to conduct human rights impact assessments of their digital activities.

A challenging labyrinth to navigate

Yet there has been little guidance available for companies that aim to act responsibly and with respect for human rights.

“The guidance is a critical instrument not just because it is firmly grounded in human rights, but because it is tailored to a digital world where human rights harms are insidious and tracking them is a challenging labyrinth to navigate. Ranking Digital Rights pushes companies to be more accountable not just for the harms that emerge from their operating environments, but also for those that stem from the architecture of their own business models. The guidance illuminates many of the points that Ranking Digital Rights has made on human rights due diligence and equips civil society and companies alike with an excellent framework for conducting it,” Jan Rydzak, Company Engagement Lead at Ranking Digital Rights, says.

We welcome input

As human rights impact assessment of digital activities is an emerging field, the Institute also hopes to inspire conversation and dialogue around these impacts. We welcome all input and feedback on how the guidance can be further improved.