The right to informational self-determination: Keep it simple!

By Mathias CELLARIUS, SAP, Data Protection Officer, Head of Data Protection and Privacy

One of SAP’s mantras is “Keep it simple!”. Why simple? Because complexity is always a challenge to master.

For 40 years, SAP has helped businesses run better through world-class software solutions that solve complex problems, freeing them to focus on their core business and strengths. Overly complex business processes can slow down and frustrate an entire organization, and people will start looking for ways to bypass them.

Data protection and privacy laws have been conceived to protect people against the threats of a digital world. Today their purpose is more relevant than ever. While most people enjoy the benefits of the World Wide Web, of being able to connect and communicate anytime and anywhere, there is also a growing uneasiness among individuals about possible misuse of their personal data and the increase in cyber threats. Data protection, thus, should be a no-brainer. Simple.

How is it, then, that data protection professionals often encounter reservation and reluctance when they are doing their jobs? Well, the answer is simple, too: data protection laws are complex. Not only do they limit the use and development of new technologies, they also require a breathtaking level of bureaucracy which, with the possible exception of tax and accounting rules, is nowhere to be seen in any other field of law!

Unfortunately, achieving a well-balanced and well-functioning data protection framework is non-trivial, and finding the right balance between data protection and privacy risks and the benefits stemming from new technologies may very well be one of the biggest challenges of our times.

Data protection rules do not exist in a legal vacuum. There is no question that privacy and the right to data protection are fundamental rights. However, they must be balanced against other fundamental rights, such as the right to liberty and security, the freedom to conduct business, the right to choose an occupation and engage in work, the freedom of expression and the freedom of the arts and sciences, to name but a few. Informational self-determination is a fundamental element of human dignity, but so are the rights to physical well-being and economic prosperity, and we need to excel in research and education if we want to remain meaningful and guarantee our European values for the generations to come. The question is: how do we achieve this?

The former EU data protection law was conceived in the pre-Internet age. While it has proven to be remarkably resilient, and flexible enough to retain relevance even in today's globally networked world, the emergence of new data-driven technologies and business models has put increasing pressure on its underlying principles. The dilemma starts with the definition of personal data. The logic of an expansive scope for 'personal data' is appealing and sounds simple: the broader the definition, the more data comes into scope, the more data is protected. However, if all data that can be linked back to an individual comes under the full scope of data protection laws, no matter how unlikely it is that the link will be made, how much effort making it would require, or how tenuous the link may be, then many beneficial uses of data become questionable. Companies and authorities are faced with the unmanageable reality that, in effect, all data could be considered personal.

A further challenge presents itself with the traditional principles relating to the processing of personal data, as formalized in Art. 5 of the GDPR, when it comes to new data-driven technologies. "Big Data", "Internet of Things" and "machine learning", to name a few of the buzzwords, all have in common that they are based on the processing of large quantities of data. They create new insights by combining and relating data to each other. Of course, this may affect the interests of human beings where data relating to them is concerned. It only seems natural that one should go to the affected individuals and ask for permission first, and then stay within the scope of the permissions granted. However, are the general consent requirement and the principle of purpose limitation effective protections in reality? One may have doubts, given that hardly anyone reads privacy statements and that users happily, and in no time, click "I agree" buttons on websites and in mobile phone apps. On the other hand, businesses that take the legal requirements seriously must invest significant effort and money in recording and maintaining consents. This is not simple! Neither for the users, who click-accept declarations of consent without reading them because they are too complex, nor for the businesses, which have drafted them this way to be legally safe.

The principles laid down in Art. 5 of the General Data Protection Regulation (GDPR) have changed only marginally since 1995, when the GDPR's predecessor, Directive 95/46/EC, was enacted. And even at that time, when they were written into Art. 6 of that Directive, they were not new. In fact, the entire GDPR is an evolution rather than a revolution. Data processing technologies, on the other hand, and the opportunities they provide have changed in a breathtaking manner. The volume of processing activities has multiplied. Along with this development has come broad social acceptance. One does not require a crystal ball to predict that this trend will continue and that data processing will evolve exponentially.

This raises the question whether the concept that has underpinned data protection laws in Europe for 20 years and more, namely to limit data processing and to keep the digital footprint of a person as small as possible, has failed or, on the contrary, helped prevent the worst. I believe that neither is the case. Data protection and privacy are still relevant, more than ever. Now, however, it may be time to reconsider their concepts. If one does not want to go as far as changing from today's general prohibition (the processing of personal data is prohibited unless expressly allowed) to generally allowing the processing of personal data unless expressly prohibited, then at least we should think about introducing statutory permissions that define the boundary conditions for what is socially acceptable.

We should concentrate more on what matters to people and less on what we believe should matter to them!

Conceptual changes may be thoughts for the future, now that the GDPR has been enacted and, according to some sources, is meant to remain relevant for the next 20 years. However, much will now depend on how the new law is put into practice: whether or not the EU and its citizens will be able to participate in the next generation of innovation, which is being driven by data. At the same time, a modern way of interpreting and implementing the regulation should not assume a "one size fits all" approach. Going forward, we should concentrate on what is important: the individual's right to informational self-determination, where it is significantly impacted. Not every processing of data is equally intrusive, not every piece of data is equally sensitive. We need to recognize the importance of context and how it affects potential consequences for users. Trying to eliminate every remote privacy risk may jeopardize valuable data uses in return for small privacy gains.

Several tools and approaches foreseen under the GDPR, including anonymization, privacy impact assessments and privacy by design, can, when properly applied, help reduce or minimize the impact on privacy. Companies can enact technical safeguards, such as pseudonymizing and encrypting data, automated data logging, data analytics restrictions, access management and automated data validation. A legal system that is closely attuned to these additional safeguards will enable organizations to maximize data utility while minimizing privacy risks. If companies set tighter controls on access to such data and provide consumers with meaningful controls, this should be encouraged and merit more liberal legal treatment and lighter obligations.
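To make one of these safeguards concrete, here is a minimal sketch of pseudonymization via keyed hashing. All names and data in it are hypothetical illustrations, not SAP products or GDPR-mandated mechanisms: a direct identifier is replaced by an irreversible pseudonym derived with a secret key, so analysts can still link records belonging to the same person without ever seeing the identifier itself.

```python
import hmac
import hashlib

# Assumption for this sketch: the secret key is managed separately from the
# data (e.g. in a key vault), so the pseudonym cannot be reversed or
# recomputed by anyone who only holds the pseudonymized data set.
SECRET_KEY = b"stored-separately-from-the-data"

def pseudonymize(identifier: str) -> str:
    """Derive a stable, non-reversible pseudonym from a direct identifier."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical record: the analytics team receives only the pseudonym,
# while the e-mail address stays behind in the controlled source system.
record = {"email": "alice@example.com", "purchases": 3}
safe_record = {
    "user_id": pseudonymize(record["email"]),
    "purchases": record["purchases"],
}
```

Because the same identifier always yields the same pseudonym, records can still be joined for analysis; because the key is held elsewhere, re-identification requires an explicit, controllable step, which is the kind of additional safeguard the paragraph above suggests should merit lighter legal treatment.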

SAP has established an entire Data Protection and Privacy (DPP) team consisting of people with a business background, technical experts, auditors and attorneys. The team’s core responsibilities include the shaping of SAP’s data protection policies and standards, providing advice, recommending key compliance measures, monitoring compliance, conducting audits, training of staff and incident response. The DPP team works together with development and operational units across SAP to provide training and advice, and thus help them develop new and improve existing data protection technology.

In addition, making employees aware of what is expected of them in this domain helps build a culture that values the protection of personal data and of the individual human being. Ongoing privacy education and awareness training gives all employees access to the information they need to recognize and properly handle personal information on a day-to-day basis.

The entire SAP workforce worldwide receives a data protection and information security-focused training covering all business and staff units.

We very much support the idea that the European Data Protection Supervisor as well as national data protection authorities recruit staff with more technical (e.g. data scientists) and economic education. It is an excellent way to ensure a balanced approach. Data protection and privacy constitute a fundamental human right. But it must not become a "super" or "über" right that overrules and pushes aside every other benefit for the individual, the economy or society itself.

We must foster an approach with innovation, technology, business and European competitiveness in mind, while putting the necessary checks and balances in place to ensure that society is not left at a crossroads. In this context, we must ensure that the debate does not become too polarized, with each side dismissing the concerns of the other. An automatically negative stance toward new technologies would be detrimental and would only lead to us consuming services and solutions offered from other parts of the world.

As little as we like the idea of having machines make decisions on our behalf, we must ensure we take conscious decisions on the right balance for the future of Europe. We are not suggesting that we should subject key issues around our human individuality and dignity to automated, algorithmic decision-making. Clearly, the individual is at the center of society, and critical decisions must always remain under the control of a human being. However, in a world ruled by economic principles, our European values will only prevail if we manage to translate them into clear, easy-to-follow rules that people understand and accept, and that our businesses can readily implement and comply with, rather than being stalled by bureaucratic burdens and paralyzed by the fear of high fines for non-compliance. Reduce complexity, make it simple and win!