The headline of this article perfectly describes the dilemma we face today: in a data-driven economy, privacy and the protection of data are of utmost importance. However, if we do not grant others (private companies or law enforcement authorities) access to data, we will forgo economic development and at the same time risk an irreparable loss of security.
The free flow of data is also of utmost importance. And we should not wait for the death of our privacy or the death of a strong future. If we do not balance these realities in our laws, our citizens will lose trust in the legislator.
Digitalisation has changed more than any other technological revolution, and it will continue to change our social and work behaviour, our means of communication, future competitiveness, law enforcement, property rights, copyright legislation and the fundamental right to data protection.
The importance of data has changed fundamentally since the eighties. Back then it was about the simple processing of data to easily find, change and archive information. A simple piece of data remained a simple piece of data.
Today, relationships between accumulated data are established and processed in such a way that someone’s whereabouts, interests, feelings and behaviour can be read, and conclusions can be drawn from these analyses.
Thus it is possible to predict how an individual behaves, how he chooses, whether he lives healthily, or what potential or risk lies dormant within him. In other words:
We are on our way to digitalise our souls!
The interconnection of these analytical results together with the personal data of other persons or with environmental, health or social data brings forth completely new insights and business models. In other words, data and its far-reaching analyses have become big business, which generates growth and prosperity.
This technological advancement challenges all previously existing business models, fundamental rights and our security. This “data triangle” behaves like communicating vessels; therefore data must fulfil multiple purposes.
This involves: firstly, the protection of our privacy (fundamental rights); secondly, our economic survival and competitiveness (business competition); and thirdly, an appropriate and necessary fight by intelligence or law enforcement authorities against terrorism and organised crime (security).
Even if somebody rejects this digital development, he or she cannot escape it.
Rejection would be economically fatal, as global economic competition will increasingly be decided by these technological advantages; those who fall behind face economic or personal irrelevance.
On top of the above, artificial intelligence, self-learning algorithms and robotics will open up yet another dimension, not least by creating more data, including large amounts of personal data (Big Data).
The kind of privacy we knew one or two decades ago has no place in the future. It will be shaped by the data that we produce ourselves and that we provide to others to optimise our lives, our businesses and our security.
Combining the challenges of a data-driven economic future, with its easy access to (personal) data, with the protection of privacy and security must be the EU’s task for upcoming legislation in the digital area.
In doing so, the rights of individuals should not be curtailed; rather, we should creatively transform the main criterion (consent) into data sovereignty of the individual.
With more creativity, the EU could become a role model and could prove that it is possible to reconcile the globalised digitalisation of people with the ‘analogue’ framework based on fundamental-rights values.
I am convinced that future-oriented data protection must protect privacy and not only the data. But this will not succeed with consent as the decisive criterion, because the individual will consent wherever he expects an advantage in his everyday life.
Consent does guarantee a kind of free flow of data and a certain access to data, but the accessibility of data itself (for business or security reasons) has so far not been a criterion for legislators, judges or the data protection authorities. That may prove to be a mistake.