Privacy: Most of the population lacks awareness of risks on the internet

Secure data use requires technical measures, but also risk awareness among the population. Experts discussed this at AnoSiDat.


Participants in the panel discussion at AnoSiDat from left to right: Engelbert Beyer, Head of the "Technology-oriented Research for Innovation" sub-department at the Federal Ministry of Education and Research; Esfandiar Mohammadi, Head of the AnoMed competence cluster and the Privacy and Security working group at the University of Lübeck; Matthias Steffen, CEO of Fuse-AI GmbH; Matthias Marx, a spokesperson for the Chaos Computer Club.

(Image: Thore Suthau)

This article was originally published in German and has been automatically translated.

How can society be made more aware of data protection and potential risks to privacy, and what technologies can be used to protect privacy? Experts discussed these issues at the AnoSiDat anonymization congress. According to the experts, different technical measures are required for different use cases, such as the redaction of court records or the use of mobile phone data. Otherwise, the privacy of citizens could not be adequately protected, and their trust could suffer as a result. The fact that the topic is not easily accessible is problematic, but the population's lack of IT education also plays a role.

The decision to focus on anonymization as a funding priority was made in 2021, as Engelbert Beyer, head of the "Technology-oriented Research for Innovation" subdivision at the Federal Ministry of Education and Research (BMBF), explained during a panel discussion. At that time, the opportunity arose to set such a funding priority. The BMBF has been working on anonymization for many years, but never to the current extent. "Data is the central raw material of the years and decades ahead of us. Data can only be used if health rights are protected," said Beyer.

The issue of anonymization is therefore of central importance. "The Federal Ministry of Education and Research is investing strategically in topics and technologies that are on the cusp of practical implementation. The transfer from science to application is essential to us. We expect the anonymization research network to provide important impetus for the data economy," Beyer told heise online.

According to the panelists, the majority of the population is not aware of the risks of the digital world. One reason is that computer science has only gradually become a compulsory school subject in the federal states; in nine of the 16 states, there are still no compulsory computer science lessons. It is important to sensitize and better inform the population, and this needs to start at school: children should learn how a computer works, said Matthias Marx, one of the spokespersons for the Chaos Computer Club.

Marx reported on data leaks he had recently found and reported, including 3.4 million passenger data records used to assess the risk of payment defaults, and the data of a fitness app. By now, he has become so accustomed to such findings that he is less shocked when basic principles of data protection and data security are disregarded. In addition, data is often collected without a clear purpose.

Cyber criminals are constantly developing new attack models, which is why keeping up with the state of the art is crucial: what counts as adequate protection today will be outdated the day after tomorrow. According to Marx, this is why "an annual review is important", a requirement that Matthias Steffen, head of Fuse-AI GmbH, which develops medical AI products, had previously criticized. Marx acknowledged that such reviews mean extra work. As a medical device provider, Steffen has to regularly check whether the data produced by his systems is adequately protected.

Data can be protected much better than before, as the numerous projects show. The current state of the art is important: "Tomorrow my data may be leaked, and the day after tomorrow it will be deanonymized," says Marx. Technical barriers are therefore essential for data protection in data processing software, as criminals do not adhere to legal requirements. The attacker is not interested in "what barriers we set for the system". However, even the most basic security measures are often lacking, as the current Federal Data Protection Commissioner, Ulrich Kelber, repeatedly emphasizes.

According to Christian Zimmermann, cybersecurity expert at Bosch, people also prefer to work with anonymized data "because there are fewer regulations to comply with". However, many technologies are still in their infancy. According to the head of the AnoMed competence cluster and the Privacy and Security working group at the University of Lübeck, Esfandiar Mohammadi, data cleansing, or the traditional anonymization of data records, is "extremely difficult".
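An illustrative sketch (not from the article, with made-up example data) of why simply dropping names is considered "extremely difficult" to get right: quasi-identifiers such as ZIP code, birth year, and sex can often link a supposedly de-identified record back to a person via an auxiliary public dataset.

```python
# Why naive anonymization fails: names are removed, but quasi-identifiers
# (ZIP code, birth year, sex) still uniquely describe many individuals.
released = [  # "anonymized" medical records, names stripped before release
    {"zip": "23562", "birth_year": 1984, "sex": "F", "diagnosis": "asthma"},
    {"zip": "23562", "birth_year": 1990, "sex": "M", "diagnosis": "flu"},
]

public_register = [  # auxiliary data, e.g. a public register with names
    {"name": "A. Schmidt", "zip": "23562", "birth_year": 1984, "sex": "F"},
]

def reidentify(released_records, register):
    """Link released records to named register entries via quasi-identifiers."""
    hits = []
    for rec in released_records:
        for person in register:
            same_qi = (rec["zip"], rec["birth_year"], rec["sex"]) == (
                person["zip"], person["birth_year"], person["sex"])
            if same_qi:
                hits.append((person["name"], rec["diagnosis"]))
    return hits

print(reidentify(released, public_register))  # → [('A. Schmidt', 'asthma')]
```

This linkage attack is the classic argument for stronger guarantees such as k-anonymity or differential privacy rather than mere removal of direct identifiers.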

The aim is to use data in such a way that personal information remains protected even later on. To achieve this, the methods need to be continuously evaluated and further developed. Machine learning offers many techniques that help protect privacy better. "If society decides that nothing about personal information should be made public, then we must take this seriously," appeals Mohammadi.
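One widely used family of techniques for evaluating data while protecting individuals is differential privacy; whether the panel discussed it specifically is not stated in the article, so the following is only a minimal sketch of the idea: a counting query is answered with calibrated Laplace noise so that any single person's presence barely changes the result.

```python
import math
import random

def dp_count(records, predicate, epsilon):
    """Differentially private count via the Laplace mechanism.

    A counting query has sensitivity 1 (one person changes the count by
    at most 1), so noise drawn from Laplace(0, 1/epsilon) suffices.
    """
    true_count = sum(1 for r in records if predicate(r))
    # Sample Laplace(0, 1/epsilon) by inverse transform sampling.
    # (random.random() is in [0, 1), so u = -0.5 is astronomically unlikely.)
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

random.seed(42)
patients = [{"age": a} for a in (20, 30, 40, 50)]
# Small epsilon = more noise = stronger privacy; large epsilon ≈ true count.
print(dp_count(patients, lambda p: p["age"] >= 30, epsilon=100.0))
```

The trade-off is controlled by epsilon: the noisier the answer, the less any individual record can be inferred from it, which is exactly the kind of evaluation-and-tuning work the cited methods require.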

A university lecturer in the audience took up the cudgels for the "academic world" and pushed back against the CCC: he did not want academics to be labelled data aggressors. In his view, misusing data is "completely alien" to science; after all, doing so carries many risks, such as fines. "The reputation would be ruined", and he does not want that at all. "There is no motivation to steal data." He is tired of all scientists being treated like "potential data thieves". He does not want the data itself; he would be satisfied if the data remained decentrally distributed but was made accessible.

Marx replied that there was no conflict of objectives. "Research must be possible," he asserted. However, he distinguished between different types of data thieves: those who steal data and sell it on the darknet, and "the people from the Chaos Computer Club, who point out the vulnerabilities so that this doesn't happen again in future."

"After the Cambridge Analytica incident, where a university was used and abused to obtain data from Facebook" and then passed that data on to Cambridge Analytica "for purposes that Facebook might not have agreed to", Mohammadi is not willing to accept the claim that all researchers are exemplary. In his view, a lot is already being done for science through various laws, such as the Health Data Use Act and other laws currently being implemented. Protecting data well is becoming increasingly important.

Zimmermann hopes that the projects will also reach market maturity. For him, data involves two security aspects: on the one hand, protecting the data itself (security); on the other, data that could, for example, help prevent accidents in autonomous driving (safety). As in other areas, there is naturally a desire to use data for safety purposes.

(mack)