Will someone soon be reading along on WhatsApp and Co.?

With its new digital laws, the EU wants to limit the power of large corporations and better protect children and young people. But the planned "chat control" in particular is meeting with criticism: the expectations placed on the technology are unrealistic, critics say, and the collision with data protection too severe.

In recent weeks and months, events in the EU's digital legislation have been coming thick and fast: various drafts and laws are currently at different stages of the legislative process and are set to shape our interactions on the Internet in the future. Among them are the Digital Markets Act (DMA) and the Digital Services Act (DSA), which are meant to regulate major platforms such as Facebook and Twitter and which are broadly welcomed by many experts – even if there are frequent complaints that they were watered down at the last second, partly through the influence of those platforms' lobbying organizations.

The latest initiative in European legislation, by contrast, is drawing considerably more criticism: the EU Commission's draft for the so-called chat control, which provides for private messenger messages to be searched for criminal content.

DMA and DSA: a new Basic Law for the Internet

The DMA primarily targets competition and is intended to ensure that large corporations such as Google, Microsoft and Facebook do not amass too much market power. As a result of the DMA, which is due to be finalized at the end of the year, users should have more "real" freedom of choice in the future and can no longer be forced into providers' ecosystems. For example, a smartphone running Google's Android operating system may no longer automatically come with Google's Chrome browser set as the default. The use of data and the merging of huge data sets are also to be made more difficult and made dependent on the consent of those affected.

The DSA is about how the major platforms can avert harm to society that results, among other things, from their business model. In the future, companies will have to explain how their algorithms influence what users see. In addition, they must react when dangerous content is shared and explain the criteria by which they delete posts.

DSA and DMA are a "huge step towards fairer markets and better online platforms," explains Matthias Kettemann, head of the research program "Regulatory Structures and Rule Formation in Digital Communication Spaces" at the Leibniz Institute for Media Research, Hans Bredow Institute (HBI), in a press briefing by the Science Media Center. "One reads of a new Basic Law for the Internet," he says – and indeed it is a new quality of regulation that addresses current challenges.

"We have seen over the past 15 years that it is not enough to leave the platforms alone," says Kettemann. Corporations such as Facebook and Twitter need more specific requirements on how they deal with content - especially those that are not expressly illegal. This is the biggest and most important step in the DSA: Because legal content can also harm, for example a leaked information that seeps through or hate messages directly before an election - especially if they are reinforced by the recommendation algorithms of social networks. This could have harmful consequences for society, which is why the platforms now have to make a risk assessment and also have to justify the criteria according to which content is tolerated or removed.

What bothers Kettemann, however, is the untransparent process in which the new draft laws were created. "Especially towards the end, we noticed that the procedure became less transparent, that changes were pushed through at the last second," the expert observes. Not all of the changes are bad, "but the fact that four weeks after the relevant decisions we still do not have all the versions available to discuss is poor practice."

In the future, messenger services are to scan for child pornography and signs of abuse

The laws have in fact come about in a comparatively short time by EU standards: from proposal to adoption, everything took place within a single parliamentary term. One regulation, whose draft the EU Commission adopted in mid-May, appears to have been initiated in even more of a hurry. It is entitled "Rules to Prevent and Combat Child Sexual Abuse" and is mostly discussed under the name "chat control".

According to the Commission's plans, the regulation will oblige providers of messenger services such as WhatsApp, Threema or Signal to scan their users' private messages and hand over possible depictions of abuse to the authorities. Tobias Keber, professor of media law and media policy at Stuttgart Media University, misses a clearer idea of how the regulation's requirements are to be technically implemented: "The regulation is very abstract at this point. Depending on how the companies implement it, this can collide with the end-to-end encryption that we fought so hard for with the messengers."

A fundamental problem running through the new digital legislation is that it may collide with other EU goals and laws, for example with data protection, explains Keber. This affects not only chat control but also the DSA, which among other things is meant to protect minors. To do so, however, the platforms have to know the age of their users – and the demand for age verification weakens data protection, explains Keber. "So here we have fault lines that need to be discussed." With chat control in particular, the data protection implications are "immense", warns Keber.

On a collision course with data protection

The Federal Data Protection Commissioner Ulrich Kelber is even clearer: "The Commission's draft is not compatible with our European values and collides with applicable data protection law," he writes on Twitter. The scanning of private messages threatens the principle of confidential communication, and a possible weakening of encryption opens the floodgates to abuse. He will work to ensure that the regulation does not come into force in this form.

For any new measure, its proportionality must be examined. This includes questions such as: Is the measure suitable for solving the problem? Is it necessary? And is it proportionate? Criticism is not limited to the question of whether it is proportionate to weaken data protection; opinions also differ on whether chat control can achieve its goal at all. "Where does child abuse actually take place?" asks media lawyer Tobias Keber. "Is the material mainly distributed via messengers?" Investigators point out that perpetrators are more likely to distribute links on the darknet, which in turn lead to encrypted content on the Internet. Chat control would not capture this at all. Federal Interior Minister Nancy Faeser has, somewhat surprisingly, also changed her opinion on account of this argument and now rejects the Commission's proposal.

The end of end-to-end encryption?

The biggest question is how measures such as chat control are to be technically implemented at all. Either end-to-end encryption must be broken in order to give companies insight into their users' communication – something privacy advocates urgently warn against. Or so-called "client-side scanning" would have to take place, which examines messages directly on users' devices. But experts criticize this approach too. Most recently, a group of the world's most renowned security researchers, including Ron Rivest, Bruce Schneier and Carmela Troncoso, spoke out against it: the method poses "serious security and privacy risks for all of society", the researchers write. It can also fail and be circumvented. In addition, there is a danger that undemocratic regimes, for example, will misuse the technology for their own purposes.
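What "client-side scanning" means in practice can be sketched in a few lines of code. The following Python sketch is purely illustrative – the toy XOR "encryption", the exact-hash detector and all function names are stand-ins, not any messenger's real implementation – but it shows the crucial point: the scan happens on the device before encryption, so the cipher itself remains intact while the confidentiality guarantee does not.

    from hashlib import sha256

    # Illustrative stand-ins only -- not a real messenger API.
    KNOWN_FINGERPRINTS: set[str] = set()  # database of known-material hashes

    def encrypt(plaintext: bytes, key: bytes) -> bytes:
        # Placeholder for real end-to-end encryption (e.g. the Signal protocol).
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

    def send_e2ee(plaintext: bytes, key: bytes) -> bytes:
        # Classic end-to-end encryption: nothing inspects the plaintext
        # between the sender's and the recipient's device.
        return encrypt(plaintext, key)

    def send_with_client_side_scanning(plaintext: bytes, key: bytes) -> bytes:
        # Chat control variant: the app checks the message on the device
        # *before* encrypting it. The encryption itself stays intact,
        # but the plaintext is no longer seen only by sender and recipient.
        if sha256(plaintext).hexdigest() in KNOWN_FINGERPRINTS:
            print("match -- would be reported to the provider or authorities")
        return encrypt(plaintext, key)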

According to the Commission's draft, messenger services must, on the one hand, build a function into their clients that searches for known material – that is, for child pornography photos that are already known to the authorities. "The error rates are already massive," warned Patrick Breyer, Member of the European Parliament for the Pirate Party, in the podcast "Logbuch: Netzpolitik". Even when searching for further copies of an image using a digital fingerprint of a photo (so-called "hashing"), the results are unreliable in 86 percent of cases.
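How such a "digital fingerprint" works – and why it misfires – can be illustrated with a simple average hash. This is a deliberately simplified stand-in: production systems such as Microsoft's PhotoDNA are more elaborate and their details are not public, so the following Python sketch (using the Pillow imaging library) shows only the principle.

    from PIL import Image  # pip install Pillow

    def average_hash(path: str, size: int = 8) -> int:
        # Shrink to 8x8 grayscale and threshold each pixel at the mean:
        # a 64-bit fingerprint that survives re-encoding and resizing.
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (p > mean)
        return bits

    def distance(a: int, b: int) -> int:
        # Number of bits in which two fingerprints differ.
        return bin(a ^ b).count("1")

    # Two files count as "the same image" if only a few bits differ.
    # Crops, mirroring or heavy edits change many bits (missed matches),
    # while unrelated images can land close together (false matches).
    if distance(average_hash("a.jpg"), average_hash("b.jpg")) <= 5:
        print("probable near-duplicate of known material")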

It becomes even more difficult with the second point to which the Commission wants to commit the corporations: the detection of new, previously unknown material and the prevention of "cybergrooming", in which adults approach children who are at risk of sexual abuse. This cannot be implemented with a simple hash comparison; machine learning methods are needed – artificial intelligence that searches for unknown depictions. Algorithms would have to be trained to recognize depictions of child pornography and to raise the alarm when adults try to contact children. "Of course, this can never work reasonably reliably," says Breyer. "A huge error rate will lead to false suspicions and criminal proceedings."

Anyone who deals with machine learning and knows the weaknesses of the technology quickly comes to the same conclusion: this cannot work. Artificial intelligence already has high error rates on much simpler image recognition tasks, and the same applies to recognizing meaningful relationships in texts that go beyond relatively sober messages or e-mails.
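Why large error rates at this scale translate into mostly false suspicions follows from simple base-rate arithmetic. The numbers below are assumptions chosen purely to illustrate the effect – the actual prevalence and error rates are unknown:

    # Toy numbers, for illustration only: assume 1 in 100,000 scanned
    # messages actually contains abuse material, and a detector that
    # catches 90% of real cases at a seemingly tiny 0.1% false-alarm rate.
    prevalence = 1 / 100_000
    hit_rate = 0.90            # assumed share of real cases detected
    false_alarm_rate = 0.001   # assumed share of harmless messages flagged

    real_reports = prevalence * hit_rate
    false_reports = (1 - prevalence) * false_alarm_rate
    precision = real_reports / (real_reports + false_reports)

    print(f"share of reports that concern real cases: {precision:.1%}")
    # -> about 0.9%: roughly 99 out of 100 flagged messages are harmless.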

In addition, Breyer warns that young people, for example, also send pictures of themselves to a partner or friend. If these were filtered out by an AI and forwarded to the services' "underpaid moderators" for review, the young people would face a far greater danger: those moderators could sell the pictures. Moreover, the planned procedure also does a disservice to those children "who are really being abused", says Breyer, because the experience of the Swiss federal police showed that the vast majority of machine-filtered cases were not criminally relevant. "This will flood our police, who will mostly be sorting out completely irrelevant messages." In turn, capacity for undercover investigators to break up real child pornography rings would be lacking.

A one-to-one implementation of the chat control draft is rather unlikely

So is chat control a rash move that will not hold up in the courts in the end anyway? "In an ideal world, the legislator is also committed to fundamental rights," says Keber. In such an ideal world, the Commission would be unlikely to make such a proposal in the first place if it actually conflicted with fundamental rights. If it does, it is ultimately up to the courts to overturn the law.

In the case of chat control, however, Keber also warns against painting too bleak a picture. After all, the draft is still at the very beginning. "A lot can still change here," he says. First of all, the proposal has to pass Parliament and Council. "In two years we will see what is left of it." He does not expect the draft to end up in a law one to one.

What remains is the criticism of the pace and lack of transparency of the current EU digital legislation. The pace is currently high, Kettemann points out; even he cannot manage to read and evaluate everything. How, then, are politicians supposed to cope, who have to make the decisions and would actually need to dig deep into the technical context to do so? "There is no real solution for this," says Kettemann. On the one hand, digital legislation always comes late because the digital world develops rapidly; on the other hand, it must of course be ensured that those responsible have a chance to understand what they are deciding in the first place. "I just warn that we currently have a great deal of development in these areas and definitely have to make sure that rules do not accidentally emerge that experts have not sufficiently examined."

Looking at the expectations the draft places on machine learning technology, one could come to the conclusion that there is indeed still a need for clarification.
