The dictatorship of digitalization

Freedom or innovation: a pessimist's outlook

Florian Melzer, data protection consultant at Patronus Services GmbH
Published in: DiALOG - THE MAGAZINE FOR DIGITAL CHANGE | 2021

It is just two months since the Microsoft Threat Intelligence Center (MSTIC) publicly disclosed the hacking actor Hafnium, which until a few days ago had infected almost 60,000 on-premises Exchange server systems in Germany alone, showing how dependent and vulnerable we are. At almost the same time, representatives* of the American start-up Clearview AI came under criticism for facial recognition software that creates a so-called "faceprint" from publicly accessible photos on Facebook, Twitter, Google and the like. The facial recognition database generated in this way is the largest of its kind in the United States, larger even than that of the FBI. The interest of law enforcement authorities in the potential of this technical innovation is obvious: technologies for rapidly identifying and tracking the movements of individuals simplify their work enormously, and the pool of accessible data is virtually endless once social media can be searched. However, this gives authorities access to information about private individuals who do not live in their jurisdiction but on another continent; European citizens, too, are affected by Clearview's data policy, even though the European General Data Protection Regulation is actually intended to ensure that personal data is not processed outside the EU in circumvention of European data protection standards.


And the data subjects? For them, despite the rules and high fines of the General Data Protection Regulation, there is no guarantee that databases abroad can be inspected or deleted once data has been stored there without consent, nor that they are well secured. The latter was demonstrated recently by the American video surveillance start-up Verkada: the credentials of a so-called "super admin" account circulated publicly on the internet until they gave a hacking group access to over 150,000 video surveillance systems and databases. However politically and economically explosive the potential for controlling behaviour and movement inherent in data sets of this magnitude may be, it ultimately does not make them immune to criminal intent or technical failure.

Elsewhere, far less thought seems to be given to data protection: a look at the People's Republic of China illustrates the extent to which such digital innovations can be implemented as the large-scale automated collection and evaluation of personal data, both in private life and, for example, in the fight against pandemics. The social credit system, already widely rolled out there, literally feeds on data from tax returns, promissory notes, invoices, social factors such as compliance with road traffic regulations or family planning, voluntary work, criminal records, online behaviour and many other sources. The coronavirus warning app Alipay Health Code, developed by an offshoot of the e-commerce giant Alibaba in collaboration with the government, displays your freedom of movement in green, yellow or red, and uses a program routine named "reportInfoAndLocationToPolice" to report the personal and location-related data from your smartphone to the local police. Anyone whose code shows yellow or red must contact the police immediately. The app is now so embedded in everyday life that access to public transport, retail outlets and the like is impossible without a valid QR code scan. Is this what data and information processing will look like in the future? Can our data protection rules keep pace with this rapid development? The legislative project for the EU's ePrivacy Regulation offers a glimpse of the outcome of this race: it was originally supposed to come into force together with the GDPR, and the emphasis is on "supposed to".
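
The mechanism sketched above, a colour code that gates access to public life and a routine that forwards personal and location data to the police, can be pictured in a few lines of Python. The sketch below is purely illustrative: the names (Citizen, scan_qr_code, report_info_and_location_to_police) are hypothetical stand-ins modelled on the behaviour described here, not the actual implementation of the Alipay Health Code.

# Purely illustrative sketch; all names are hypothetical and only mirror the
# behaviour described in the text, not the real Alipay Health Code.
from dataclasses import dataclass
from enum import Enum


class Colour(Enum):
    GREEN = "green"    # free movement
    YELLOW = "yellow"  # restricted: report to the police immediately
    RED = "red"        # restricted: report to the police immediately


@dataclass
class Citizen:
    citizen_id: str
    location: tuple          # (latitude, longitude)
    colour: Colour


def report_info_and_location_to_police(person: Citizen) -> None:
    # Hypothetical stand-in for the routine the article mentions: it forwards
    # personal and location-related data to the local police.
    print(f"[police report] id={person.citizen_id} "
          f"colour={person.colour.value} location={person.location}")


def scan_qr_code(person: Citizen) -> bool:
    # Access to public transport, shops etc. hinges on the colour of the code;
    # every scan also triggers the reporting routine.
    report_info_and_location_to_police(person)
    return person.colour is Colour.GREEN


if __name__ == "__main__":
    passenger = Citizen("0815", (30.2741, 120.1551), Colour.YELLOW)
    print("Access granted:", scan_qr_code(passenger))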


The question of whether the status quo of advancing digitalization is beneficial or detrimental to users' rights to self-determination remains unanswered.


In view of greater usability, user-friendliness and ever better networking through ever newer apps and applications, personal data seems a small price to pay for being able to take part in the new everyday life that digital and technical innovations make possible. On closer inspection, however, the analysis and use of the data that slips away in the process gives others a hold over users that must first be wrested back from them: location data from end devices and self-driving vehicles quickly adds up to a movement profile that records every step with the utmost precision. At the latest when these data sets are supplemented by the new possibilities of video surveillance and automatic facial recognition, the volume of available personal data takes on unimaginable proportions. What if this personal and location-based data is also used to calculate the probability of future crimes and illnesses? Predictive policing, facial recognition and geo-tracking are just some of the many technologies the future holds in store for us. Will advancing digitalization guarantee users basic rights, resources and security in this network, or will it strip them away altogether? And what is a digitized everyday life worth to them if they have to fundamentally change their behaviour and values so as not to stand out negatively in a criminological algorithm, or to be able to participate in social or political life at all?

The question of whether the status quo of advancing digitalization benefits or harms users' rights to self-determination remains open. Trends, however, can already be discerned in a digitalized vision of the future. Does the responsibility for ensuring that this potential loss of control is not exploited ultimately fall to us once again? Should we read privacy policies several times over in future? And what good does it do me to read a privacy policy that in the end only tells me the inevitable? What if the state does not stick to its own rules, as with the constant push for data retention? Innovation or freedom? Does everyone decide that for themselves, or has that decision long since been taken out of our hands?

* The author uses the generic masculine; all references to persons apply equally to both genders.

Florian Melzer is a data protection consultant and external data protection officer. As the "protector" of company assets, Patronus Services GmbH supports its customers securely and consistently in the external archiving of their documents, right through to GDPR-compliant destruction. Data protection is not a secondary concern for Patronus Services GmbH; it also complements our portfolio of services in customer consulting. As an owner-managed team of lawyers, IT specialists and entrepreneurs, we support our customers in their digitalization projects every day. The advantages for our customers are obvious: on the one hand, direct savings potential through external file storage; on the other, the certainty that strict data protection and data security requirements are met.
www.patronus-datenservice.de