03/08/2018 | Scientific Viewpoints

Panel of experts: New German government must be more concrete on data protection

To safeguard employees' privacy rights, the new German government should explicitly rule out covert checks by employers - this is one of the demands of the expert panel "Forum Privatheit". In a newly published policy paper, the association of scientists analyzes the coalition agreement's statements on shaping digitization. According to Prof. Dr. Alexander Roßnagel, a legal scholar from Kassel, further efforts are needed in many areas to protect personal data.

Prof. Dr. Alexander Roßnagel (Image: Sonja Rode)

The coalition agreement between the CDU, CSU and SPD promises a "new departure for Europe," a "new dynamic for Germany" and "new cohesion for our country." To this end, the coalition partners want to initiate extensive modernization. As the political basis of the grand coalition, however, the coalition agreement is a compromise that specifies only what the coalition partners could agree on; much is merely hinted at and remains vague and abstract. The expert panel "Forum Privatheit" has therefore examined what measures are needed to give concrete form to the agreement's stated goals of promoting innovation and protecting data privacy (see the policy paper linked below).

Using the scope of the General Data Protection Regulation for more user protection

According to the agreement, the coalition wants to "enable innovation and new services while maintaining Europe's and Germany's high and globally respected data protection standards." To this end, it intends to closely accompany the EU Commission's evaluation of the General Data Protection Regulation, due in 2020, and to review all regulations for their "future viability and effectiveness." "This is also urgently necessary because the General Data Protection Regulation lacks both future viability and effectiveness," says "Forum Privatheit" spokesman Prof. Dr. Alexander Roßnagel, who heads the Public Law/Law of Technology department at the University of Kassel and is also deputy executive director of the Scientific Center for Information Technology Design (ITeG) there. "It lacks future viability in that it does not regulate any of the foreseeable challenges - such as Big Data, artificial intelligence, self-learning systems, search engines, network platforms, context detection, or the Internet of Things - in a risk-adequate way. If it is to be fit for the future, it must specifically regulate the enormous risks emanating from the digitization of all areas of life. Only if it provides legal certainty against these risks can it be effective." For this reason, the German government should push for risk-appropriate regulations in European data protection law and itself enact such regulations within the scope of Germany's regulatory powers. For the protection of communications data, the German government should support the risk-specific and user-friendly proposals of the EU Commission and the EU Parliament in the Council.

User trust based on effective data protection measures

The user trust necessary for innovation requires data protection: through system design and default settings, better ways to control the flow of one's data, the ability to transfer one's own data to other providers, and the protection of confidentiality through encryption. "These rights must also be enforced against economically powerful providers," says Roßnagel. "Data portability and the interoperability of digital platforms, as well as the modernization of competition law, can also help to strengthen the competitiveness of German and European platform companies. In this context, too, data protection and user rights are tools that promote competition and innovation."

Risk-adequate data protection for employees

The coalition recognizes that digitization offers numerous advantages for companies and employees, but at the same time also poses surveillance risks. To safeguard the privacy rights of employees, the German government should establish risk-adequate data protection regulations for the employment relationship. These include regulations that explicitly exclude surreptitious checks as well as permanent surveillance and the creation of comprehensive movement profiles. When using mobile devices, as much data as possible should remain under the control of employees.

Legal framework for smart cars, smart health and smart cities

The coalition agreement provides for a legal framework for autonomous driving (smart cars) that ensures data protection and data security as well as the highest level of traffic safety. This legal framework must also ensure that data subjects are always adequately informed about what data is being processed and by whom. They should have simple options for consenting to or refusing such data processing, and refusal must not lead to serious disadvantages. "As with smart cars, specific, risk-adequate regulations for the use of technology must also be provided in the context of smart health and smart cities," says Dr. Michael Friedewald, a scientist at the Fraunhofer Institute for Systems and Innovation Research and coordinator of the "Forum Privatheit". "Preserving the freedom of choice of those affected is the right approach." A risk-specific regulation that protects against misuse is equally needed, however, for the large amounts of health data that are collected, transferred (often to non-European countries) and processed as part of voluntary procedures for measuring bodily functions. The rules governing energy and traffic management in smart cities must likewise ensure that they do not give rise to any new and deeper risks to the privacy and self-determination of those affected, especially through behavior and movement profiles.

The measurement of people must be regulated

It is to be welcomed that the coalition agreement wants to extend the prohibitions of discrimination that apply in the "analog world" to the digital world. However, this must not be limited to consumer protection alone. Rather, for the use of algorithms, artificial intelligence and Big Data, as well as for the measurement and cataloging of people in all areas of society and the economy, it must be regulated which measurement criteria and procedures are permissible and which are impermissible because of the risk of discrimination.

The "Forum Privatheit" policy paper on the coalition agreement, "Strengthening data privacy, enabling innovation - How to shape the coalition agreement", offers an analysis of the coalition agreement with regard to digitization and data privacy, as well as recommendations on what concrete measures are needed to achieve the goals that the agreement formulates only in abstract terms. The policy paper is available at https://www.forum-privatheit.de/forum-privatheit-de/publikationen-und-downloads/veroeffentlichungen-des-forums/positionspapiere-policy-paper/PolicyPaper-Koalitionsvertrag.pdf.

In the "Forum Privatheit", which is funded by the German Federal Ministry of Education and Research (BMBF), experts from seven scientific institutions address issues relating to the protection of privacy in an interdisciplinary manner. The project is coordinated by Fraunhofer ISI. Other partners are Fraunhofer SIT, the University of Duisburg-Essen, the Scientific Center for Information Technology Design (ITeG) at the University of Kassel, Eberhard Karls University of Tübingen, Ludwig Maximilian University of Munich, and the Independent Centre for Data Protection Schleswig-Holstein.


Speaker "Forum Privacy":
Prof. Dr. Alexander Roßnagel
University of Kassel
Project Group for Constitutionally Compatible Technology Design (provet)
Scientific Center for Information Technology Design (ITeG)
Tel: 0561/804-3130 or 2874
E-mail: a.rossnagel[at]uni-kassel[dot]de

Forum "Privacy and Self-Determined Life in the Digital World"
www.forum-privatheit.de/forum-privatheit-de/index.php
Twitter: @ForumPrivatheit
