Digital violence against women: what needs to be done now

Event report

Every single day, women are abused, threatened or slandered on digital social networks. Yet the operators of these platforms are doing very little to make the digital space a safer place for women. The whistleblower Frances Haugen talked to Spiegel journalist Ann-Katrin Müller and Alexandra Geese, Member of the European Parliament, about what needs to be done right now to stop violence on the net.

Podium discussion at the Heinrich-Böll-Stiftung, Berlin, Thursday 4 November 2021


  • Frances Haugen, whistleblower, former data engineer and product manager at Facebook
  • Ann-Katrin Müller, journalist, Der Spiegel
  • Alexandra Geese MEP, Alliance 90/The Greens
  • Moderation: Anna-Lena von Hodenberg, CEO, HateAid

Digital violence is everywhere. Women and girls are incredibly vulnerable to hate messages and sexual harassment, right up to rape and murder threats, on the Internet. A survey conducted by the German advisory organisation HateAid shows that around 52% of women between the ages of 18 and 35 have suffered digital violence at least once. The social platforms, particularly Facebook, bear a large share of the responsibility for this violence, said former Facebook employee and whistleblower Frances Haugen, because the company has effectively promoted this development with its algorithms and its apparent reluctance to sanction abuses and take hate posts down immediately. She accused the group of being too soft on internet hate, which, she explained, has led to increasing numbers of women withdrawing from social media.

How can digital spaces be made safe?

But what needs to happen to make social networks into safer public spaces? Frances Haugen argued that, first and foremost, there should be more staff to investigate detailed complaints reliably and ensure that hate messages, threats and insults are deleted as quickly as possible. Women in non-English-speaking countries are particularly at risk, Haugen explained, because Facebook focuses its efforts against digital violence predominantly on the United States. This is because of the power of the US legal system and its stiff penalties: Facebook has already been hit with punitive fines in the USA. There is practically no other country in the world where Facebook has to worry about this.

Facebook doesn’t follow its own rules

Additionally, she explained, prominent individuals are given almost total licence to break Facebook’s rules. The Facebook accounts of people in the public eye are specifically labelled VIP accounts. Facebook is keen to avoid any negative PR, which means that, in contrast to ordinary users, no action is taken if a VIP account post breaches the rules.

The case of the footballer Neymar shows where this policy can lead. Neymar posted a nude photo on Facebook of a woman who had accused him of rape. It took more than a whole day for the picture to be taken down; in the meantime, the photo was seen 60 million times, according to Frances Haugen. Neymar's Facebook account was not even closed down, contrary to Facebook's own policy. The whistleblower summarised her analysis by saying that Facebook does not stick to its own rules, particularly in its dealings with prominent individuals.

Women are pushed out of the digital world

If women are less active in the digital world because they have to fear violence, this is also a threat to our system of democracy, said Spiegel journalist Ann-Katrin Müller. Politically active women who are digitally active may have cause to fear for their lives. Women with a migration background are particularly likely to be affected by abuse, threats and slander on the net. As a result, many women choose to stop discussing certain subjects in order to keep themselves and their families safe. In this way, women are specifically pushed out of the digital public sphere. Women choosing to withdraw from the digital space must not be seen as the answer, Müller argues, as it robs women of much of the power they have won in their 100-year fight for emancipation. The journalist cannot understand why social networks do not adjust their algorithms to detect hate messages.

Algorithms pushing the polarisation of society

Posts that polarise users are more likely to be shared, and Alexandra Geese MEP sees this as a systemic problem. The major communication platforms do too little to create safe spaces for users, she said. To grab users' attention and keep them on the platforms for as long as possible, algorithms preferentially spread the posts that cause the greatest stir. This promotes the distribution of hate-filled and divisive content. Facebook in particular, she went on to say, is effectively promoting the polarisation of society, in the sense that its algorithm increasingly pushes users towards radicalisation. Frances Haugen expressly shared this observation. Facebook users are directed towards extreme content, she explained. On other sites, such as Twitter, the algorithm alone decides what content is shared. But Facebook is undermining society with the spread of hate, Haugen warned.

Facebook blocks transparency

Facebook is also far less transparent than other platforms, such as Google and Twitter, according to Frances Haugen. To substantiate this point, she referred to a case in which Facebook blocked access to the platform for a research project at New York University (NYU). The researchers had wanted to study which people look at which political advertisements on Facebook, and when. To be clear, Haugen had nothing but praise for most of the people who work for Facebook: the network is full of people who want to connect people and set out to create a better system for that purpose. Unfortunately, she explained, they get very little leeway from their employer to do so.

How can user rights be improved?

How can the rights of users be better balanced against those of the platforms themselves, and how can more transparency be assured? Progress through case law is an important factor, Frances Haugen said, drawing on her own experience at Facebook. Ann-Katrin Müller hopes that improvements may come in response to the increasingly poor reputation of social media. Alexandra Geese made the case for social platforms to be allowed access only to data voluntarily made available by users. The practice whereby algorithms connect voluntary user data to other data, for instance on religious beliefs or sexual orientation, should be prohibited, she argues. But this is precisely the point at which transparency is lacking: users don't know exactly what happens to their data, said Ann-Katrin Müller.

EU wants to hold the networks to account

For all these reasons, Alexandra Geese welcomed the fact that the European Commission is now taking a closer look at the system. Transparency, she believes, is the prerequisite to enable an independent expert panel to conduct a thorough risk assessment. This would require a European competence centre, Geese explained, adding that such a task could not be successfully tackled at a purely national level. The MEP has high hopes for the Digital Services Act. This law, which is currently being negotiated in Brussels, will bolster the rights of users while holding social platforms to greater account.

Too important to fail

She went on to stress how important it is that this legislation succeeds. It is also important that women receive a clear signal that politicians are taking the problem of digital violence seriously and are taking action to stamp it out. The success or failure of this legislation will determine whether, in five years' time, the emphasis is on the needs of users or still predominantly on the business interests of the platform operators. Success here would also benefit countries outside the EU.

Potential for safe and democratic networks?

The law could potentially also help to usher in new, fairer social networks, Alexandra Geese predicted. Frances Haugen also made the case for new, democratic networks: the operators of platforms such as Facebook need to realise that they could be even more profitable if they finally started to take their responsibilities seriously.


This article was first published in German on