The European Commission touts its Digital Services Act as establishing, for the first time, “a common set of rules on intermediaries’ obligations and accountability across the single market”. For good reason: its projected economic impact could range from €12 billion to €23 billion, thanks to increased opportunities for a diversity of market operators. Given the ongoing rise of fact manipulation and disinformation, it could also (and perhaps more importantly) foster greater democratic control and oversight, by protecting citizens’ fundamental rights, promoting broader, more inclusive choice at lower prices and, finally, limiting exposure to illegal content. As such, a tie-in with Europe’s Corporate Sustainability Due Diligence directive would appear obvious. Various client engagements show, however, that data protection remains a tricky consideration for corporate Duty of Care management systems. Here, Ksapa reviews the implications of the Digital Services Act for Human Rights Due Diligence and outlines a way forward for organizations to embed data protection in Corporate Sustainability Due Diligence.
1. Key Aspects of the Digital Services Act
On 20 January 2022, the European Parliament voted in favor of the Digital Services Act. Last-minute amendments included prohibitions on advertising targeting minors, on the use of highly sensitive personal data for the behavioral targeting of anyone, and on manipulating or forcing consent to the use of personal data. The resulting document is subject to a series of 5 interinstitutional negotiations initiated in April 2022.
This new piece of European regulation essentially requires Big Tech platforms – including Facebook, Instagram and YouTube – to assess and manage systemic risks generated by their services, such as advocacy of hatred and the spread of disinformation. These companies will indeed be required to submit to independent yearly audits and to give regulators and third-party researchers – including civil society organizations – access to platform data and insights into their algorithmic black boxes. Accountability being the keyword, the Act calls for the following measures:
- Countering illegal goods, services or content online, through mechanisms enabling users to flag content and for platforms to cooperate with “trusted flaggers”;
- Enforcing traceability obligations for online marketplaces, which will be required to help identify sellers of illegal goods and make reasonable efforts to randomly check whether products or services have been flagged as illegal in official databases;
- Developing effective safeguards for users, with the possibility for them to challenge platforms’ content moderation decisions;
- Banning specific types of online adverts which target children or use special categories of personal data (such as ethnicity, political views or sexual orientation);
- Introducing transparency measures for online platforms – notably on the algorithms they use to provide recommendations;
- Obliging very large platforms and search engines to prevent the misuse of their systems, by taking appropriate risk mitigation measures and submitting to independent audits of the corresponding management systems;
- Fostering access to key data from the largest platforms and search engines to enable researchers to analyze the evolution of online risks;
- Developing an oversight structure at Member State-level, with the support of a new European board for digital services.
The breadth of these requirements explains why the Digital Services Act was hailed as a watershed moment in Internet regulation. Encouragingly, 65 signatories voiced support for rights-based internet regulation through the Investor Alliance for Human Rights. That said, actionable convergences between the Digital Services Act and the Corporate Sustainability Due Diligence directive have yet to be identified… to say nothing of their activation as part of corporate management systems.
2. Actionable Convergences of DSA and CSDD Requirements
Back in 2018, the Commission launched a Flash Eurobarometer on Illegal Content online. The report showed 61% of respondents said they had come across illegal content online and 66.5% did not think the Internet was safe for users. The European Digital Services Act is intended to remove disincentives for companies to take voluntary measures to protect their users from illegal content, goods or services.
At the same time, it pushes businesses to use new, simple and effective mechanisms for flagging illegal content and goods that infringe fundamental rights (such as intellectual property rights) or stem from unfair competition practices. It enhances obligations for marketplaces to apply “know your customer” policies, make reasonable efforts to perform random checks on products sold on their service and adopt new technologies for product traceability.
Looking beyond digital markets, the European Corporate Sustainability Due Diligence directive builds on the United Nations’ Guiding Principles on Business and Human Rights. That means companies should conduct Human Rights due diligence, which includes identifying, preventing, ceasing, mitigating, remediating and publicly accounting for potential or actual adverse impacts. In the specific context of digital services, that involves addressing risks to people tied to how Big Tech boosts revenue, competes and increases attractiveness to investors. That notably includes:
- Large-scale data collection (to train algorithms or sell insights to third-parties);
- Sales of technological products and services supportive of State functions or public service delivery, which can disproportionately target vulnerable communities;
- Hyper-personalization of corporate decision-making tools, which can foster discrimination, limit worker safeguards or make their jobs altogether redundant;
- Provision of technologies to any number of businesses or individuals with little oversight, which could maximize harm to people;
- Mainstreaming models informed by, or informing on personal choices and behaviors without user knowledge or consent.
Civil society organizations argue the Digital Services Act in its current form assumes all corporate risks can be mitigated, without explicitly requiring companies to implement the full range of remediation tools. They also contend its predefined risk list overlooks the socio-economic and even environmental considerations tied to digital platforms’ practices. Combining both sets of regulatory requirements would therefore help businesses and investors anticipate potential grey areas in their compliance programs and lift their respective limitations in scope as well as in implementation measures.
3. Embedding Data Protection in Corporate Due Diligences
The European Corporate Sustainability Due Diligence directive notably targets the prohibition of arbitrary or unlawful interference with a person’s privacy, family, home or correspondence and attacks on their reputation. While the directive does not explicitly cover data protection, we contend it points to the European Declaration on Digital Rights and Principles, which does. Various client engagements show, however, that businesses and investors struggle to include data privacy in their duty of care programs. Given the directive promotes a value chain approach, client usage is bound to be taken into consideration.
With that in mind, Ksapa outlines its 5-step approach for businesses and investors to embed data protection in corporate sustainability due diligence processes:
1. Adopt a Systemic View of Priority issues
The issue of Human Rights is as complex as it is sensitive. Data travels the world over in a matter of seconds and in untold quantities, making it all the more difficult for businesses whose very models increasingly rely on that same data to address its potential Human Rights implications.
That trade-off is precisely why private players are regularly criticized for Human Rights abuses – and consequently required to mitigate their risks. That said, their efforts in that regard are almost automatically met with suspicion – a suspicion fostered by the very same digital services’ capacity to empower and spread information (or lack thereof). Users may for example misuse digital services to commit Human Rights abuses. Consider the impact of a breach in the protection of client information for an international hospitality organization, how a technology multinational’s worker register could enable rogue states to target vulnerable cross-sections, or how a transportation operator’s files in the wrong hands could have massive international security implications.
2. Calibrate Action Plans and Corrective Measures
On the other hand, digital tools are key for a company to proactively and continuously improve its identification of priority Human Rights risks as its activities or operating contexts evolve. They may indeed play an integral role in streamlining comprehensive and regularly updated Human Rights risk assessments related to product or service usage – bearing in mind other societal and human rights benefits.
They certainly help structure mitigative measures to address those risks and track roadmap effectiveness, while documenting issues the company has not yet addressed or cannot control (at least not alone). Digital services such as Big Data analytics or Machine Learning could indeed be calibrated to adopt a rights-based lens, helping organizations exercise foresight on potential or actual risks they may or may not fully resolve. Finally, data-driven insights are the essential component for them to report on their decision-making processes, particularly where Human Rights harms are the most severe.
3. Including and Disseminating Rightsholders’ Perspectives
Corporate due diligences are only as good as the stakeholder dialog they rely on. Businesses and investors have everything to gain in structuring continuous, proactive and open dialog with priority stakeholders throughout their projects’ lifecycles. Stakeholder engagement empowers corporate teams to address the growing diversity of Human Rights considerations brought to bear by the present digital age – across their entire value chain, with the right level of granularity, cognizant of local specificities, all the while adapting measures to their project’s developmental stage. Stakeholder engagement is perhaps even more essential when it comes to assessing the effectiveness of remediation measures and monitoring progress over time, both required by the CSDD.
Enabling digital technologies to achieve their full potential while mitigating underpinning risks means organizations must meaningfully engage civil society and rightsholders, paying particular attention to vulnerable cross-sections in their due diligence. This also helps them exert their leverage over other duty-bearers, for instance through the development of customer associations, formalizing user behavior norms and expectations, collaborative industry standard-setting and constructive negotiations with regulatory and policy-making bodies to prevent Human Rights violations. Think how a client may use a beach resort to conduct illegal trafficking for which the parent company may be held liable. The latter may however use data-powered profiling to pinpoint illegal behavior, help teams put an end to it or engage with local authorities to prevent further abuse.
4. Activating Enforcement Mechanisms and Access to Remedy
Stakeholder engagement is also key for rightsholders to take an active role in corporate decision-making which may directly impact their lives, both in terms of project management and resource allocation. That notably includes having appropriate grievance systems for organizations to expand on strictly top-down grievance management approaches and ensure access to remedy for potential victims. A key issue with digital services is that, while they literally power such platforms (online websites, WhatsApp hotlines…), they may also be called upon to remedy abuses tied to machines and algorithms. With data so pervasive in the fabric of our global economy, it may also mean millions of adversely impacted rightsholders may make legitimate claims. Given the increasingly complex architecture of the modern-day Internet, that also means a variety of players (which the Digital Services Act refers to as …) may be identified as duty-bearers and potentially held liable over harm tied to the interaction of diverse digital products and services.
5. Operating in Full Transparency
While data-powered services create near real-time information, they do not necessarily foster transparency. The European Corporate Sustainability Due Diligence directive, on the other hand, demands organizations make sincere efforts to understand, share feedback and develop solutions. Tackling endemic issues indeed requires collaborative work, even if it means focusing on only part of the issue. However, a logic of continuous improvement will always be more socially acceptable than (perceived) corporate inaction or opacity.
Civil society organizations have raised concerns relative to the Digital Services Act, which the Corporate Sustainability Due Diligence Directive could help alleviate. In an increasingly digitized world, it stands to reason companies will need to comply with both, which implies expanding the conventional scope of issues under consideration. Doing so puts businesses and investors in a better place to anticipate the increasing complexity of regulatory expectations.
Ksapa is a strategic global platform designed with these very questions in mind. Our network of 300+ practitioners across the globe work with business and investors to design frameworks, policies and collaborative initiatives on the ground to bring risk mitigation solutions at scale. Get in touch to discuss the best way for your organization to address multilateral pressures and regulations, ultimately driving real-world impact across your value chain.