Why GDPR Does Not Solve the Issues of Cybersecurity and Electronic Surveillance

The General Data Protection Regulation (GDPR), which came into force in May 2018, is a major milestone in the current digital revolution. GDPR ended a decade in which the triumphant business model was one that flouted the most basic rules of personal data protection. However, in a world increasingly governed by Big Data, Artificial Intelligence and the Internet of Things, corporate responsibility remains enormous. Vigilance is required.

The Snowden case was a useful reminder of the real challenge of protecting personal data. The adoption of a regulatory text such as the GDPR was the result of a long battle between the European Union and the United States, which, driven by divergent economic interests, clashed over two different conceptions of the role of data in product and service innovation.

  • In Europe, there is an old regulatory base that is less and less well adapted to technological convergence and data globalisation, but which nevertheless pushes for respect for personal data. The French data protection authority, the Commission Nationale de l'Informatique et des Libertés (CNIL), is a perfect example.
  • In the United States, the web giants have long prospered through the use and exploitation of personal data. This fully fledged business model has enabled the emergence of global companies such as Google and Facebook. This makes it easier to understand the reluctance of the companies and stakeholders concerned to adopt coercive measures to protect personal data, and the ease with which they go and “pick” the information they consider necessary.

I was able to work directly on the complexity of these issues. In particular, I have been fascinated by the creation of multi-stakeholder initiatives such as the Global Network Initiative, whose objective was to create a platform that would reduce the imbalance of information between stakeholders (telecommunications companies and human rights NGOs in particular) on these personal data protection issues and guard against government interventions that would harm the privacy of personal information. Having worked directly on these issues with European companies such as Deutsche Telekom, Telefonica, Telia or Vodafone, I have also seen the major distortion of competition they faced vis-à-vis the GAFAs when it came to demonstrating responsibility for data protection. Finally, the GDPR provides an egalitarian and competitive legal framework that allows any company, based in the European Union or elsewhere, to develop digital activities under the same rules of the game.

However, the Scale and Complexity of the Digital Transformations Underway Call for Ever Greater Vigilance

But the regulatory framework is undoubtedly already lagging behind. The high-speed development of products and services based on virtual reality, algorithms using artificial intelligence or the Internet of Things leaves the issues of privacy, discriminatory treatment of individuals and even freedom of information practically unaddressed. All the more so in a context of exponential growth in cybercrime and of uncontrolled parallel cryptocurrency spaces. Thus, the digital transformations underway, on which companies rely to innovate, adapt or strengthen their market positions, raise questions for which governments and companies will have to be accountable to their constituents and consumers with ever greater urgency.

Guaranteeing conditions of fairness and transparency in the provision of services
The authors of AI Now’s report point to a “cascading number of scandals” involving AI and algorithmic systems deployed by governments and big tech companies in 2018, ranging from accusations that Facebook helped facilitate genocide in Myanmar, to the revelation that Google was helping to build AI tools for military drones as part of Project Maven, to the Cambridge Analytica scandal.

How, then, can we be transparent about the use of data and the functioning of algorithms while managing the risks associated with disseminating complex information? Two main avenues can be suggested:

  • Periodic quality reviews built into product design, making it possible to test and validate the criteria used to ensure fair treatment of the profiles and contextual data processed by the algorithms (a minimal sketch of such a check follows this list)
  • The inclusion of samples of the target population to test, for example, how the methodology is presented and to validate how these potential users understand and perceive its transparency
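
To make the first avenue concrete, here is a minimal sketch, in Python, of what a periodic automated check on the fair treatment of profiles could look like: it compares positive-outcome rates across user segments and flags the product for review when the gap exceeds a tolerance. The column names (user_segment, approved) and the 5% tolerance are hypothetical illustrations, not drawn from any specific regulation or product.

```python
# Minimal sketch of a periodic fairness check that could be run as part of a
# product-design quality review. All names and thresholds are illustrative.
import pandas as pd


def demographic_parity_gap(df: pd.DataFrame, group_col: str, outcome_col: str) -> float:
    """Largest difference in positive-outcome rates between any two groups."""
    rates = df.groupby(group_col)[outcome_col].mean()
    return float(rates.max() - rates.min())


def periodic_review(decisions: pd.DataFrame, threshold: float = 0.05) -> bool:
    """Return True if the decisions pass the review, False if they need escalation."""
    gap = demographic_parity_gap(decisions, group_col="user_segment", outcome_col="approved")
    if gap > threshold:
        # Flag for the review board instead of silently shipping the model.
        print(f"Fairness review FAILED: parity gap {gap:.1%} exceeds {threshold:.0%}")
        return False
    print(f"Fairness review passed: parity gap {gap:.1%}")
    return True


if __name__ == "__main__":
    # Toy sample of algorithmic decisions, one row per user.
    sample = pd.DataFrame({
        "user_segment": ["A", "A", "A", "B", "B", "B"],
        "approved":     [1,   1,   1,   1,   0,   0],
    })
    periodic_review(sample)
```

Such a check does not replace human review; it simply gives the periodic quality review a repeatable, documented test around which the discussion of fair treatment can be anchored.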

Monitoring induced effects: implementing follow-up due diligence
Digital innovation also raises two major questions associated with the adoption and appropriation of solutions by users. These ethical issues require constant monitoring of induced effects and of how to improve the aggregation and use of data, as well as the algorithms behind the service offer. Follow-up due diligence is essential:

  • Adoption is about how users will actually use the service. Various governments, for example in the former Soviet republics of Central Asia, have used mobile phone location data, filtered to the time and place of anti-government demonstrations, to identify demonstrators and then arrest them at their homes. The adoption of the technological solution by governments to track opponents was obviously very different from the original intention of providing users with a portable means of communication.
  • Appropriation refers to the way in which users come to understand how the service works and manipulate it accordingly. Facebook was originally developed to create a virtual community of friends. Its appropriation by users subsequently also made it a tool for political propaganda that may have influenced various elections.
    There are already many laws governing customer relations in high-risk locations (e.g. export control laws), but what about business relationships that are legal but immoral, or that violate the rules of confidentiality and freedom of expression? Monitoring adoption and appropriation issues is essential to correct and improve digital solutions, and ultimately to ensure not only their acceptability but also their compliance with fundamental rights principles.

Cybersecurity: Engaging employees, users and technical experts as a key requirement
Many scandals have demonstrated how badly cyber security failures, such as the theft of customer data, can affect the functioning and even the survival of any company. Deloitte and Sony were seriously shaken. Ashley Madison, a sulphurous extramarital dating site, obviously never recovered. This constantly changing world once again invites companies operating in the digital space (and the authorities) to maintain an active dialogue with three key communities:

  • Training and monitoring of employee practices helps to control many of the cyber security issues at the heart of many companies. Employees and their vigilance remain both the biggest vulnerability and the strongest link in the security chain. Nothing is easier, for example, than introducing a USB key into a system while working inside the organization.
  • Study of, and dialogue with, user communities and specialized networks helps to keep an eye on adoption and product usage, in order to see how the uses of products and services are likely to create new vulnerabilities that need to be addressed
  • Finally, the continuous dialogue with technical experts, sometimes hackers in another life, obviously remains a major focus of monitoring

Thus, while the GDPR has finally created a more regulated collective space for data processing and privacy, it is clear that the ongoing digital revolution keeps opening new fronts on which the liability of companies providing digital solutions and services will only grow.

Author of several books and resources on business, sustainability and responsibility. Working with top decision makers pursuing transformational changes for their organizations, leaders and industries. Working with executives to improve the resilience and competitiveness of their companies and products in light of their climate and human rights business agendas. Connect with Farid Baddache on Twitter at @Fbaddache.
