EU-News

EU set to adopt “revolutionary” legislation to fight online hate speech

The EU Buildings. Photographer: Jamie Szmitz

By Jamie Szmitz & Johana Kofronova

4/11/2021

The European Parliament is currently debating the Digital Services Act, which is hoped to set the standard for the future of online safety legislation.

Safety online has become a crucial goal for the European Union and will remain one as technology becomes ever more embedded in citizens’ daily lives. The EU has outlined several key plans for how users and businesses will be guaranteed a safe digital environment. It is important for Europe to set global standards and allow other nations to follow suit, as it previously did with the GDPR. Many commissioners take pride in the EU having set such standards and hope the new legislation will do the same. Daniel Braun, Deputy Head of Cabinet of Commissioner Vera Jourova and a prominent figure in the making of the Digital Services Act, said that “if we come with a well thought reflection of the challenges that everyone is facing then it can be a standard setting exercise too.”

The Digital Strategy

Currently, the European Commission has set out the framework for its ‘digital strategy’, which serves as a blueprint for the future of the online space. The main focus of the strategy is to protect users’ digital rights and ensure their digital principles are being met. Key areas the EU has outlined as essential include artificial intelligence, the protection of privacy, maintaining freedom of speech and ensuring that the internet is a fair space for businesses to grow and thrive.

The Digital Services Act

As part of the outlined digital strategy, in December 2020 the European Commission drafted the Digital Services Act alongside the Digital Markets Act. The DSA and DMA package is proposed as a regulation, meaning it is a legislative act that must be applied throughout the EU. While the latter focuses on designating the largest online platforms as gatekeepers and ensuring those companies do not misuse their market power, it is the Digital Services Act that is most relevant to the aims of the digital strategy. The DSA is designed to ensure that all EU users have their rights and principles protected while online. This is achieved by setting out clear rules for the largest social media platforms, demanding that they be transparent as well as holding them responsible for the removal of harmful and illegal content on their platforms. The DSA also provides for ‘trusted flaggers’ who will monitor the largest social media platforms for harmful content and report their findings. As explained by Daniel Braun, “the DSA solidifies and puts into legislation these responsibilities to review notifications to react and to deal with law enforcement if needed.”

As well as focusing on the removal of harmful content, the DSA also ensures more transparency for users. This means that social media platforms like Facebook must expose the workings of their algorithms and share the data they store when requested by the relevant authorities. These proposals come as the controversial findings of the ‘Facebook Files’ are being uncovered, and for this reason the former Facebook employee and whistle-blower Frances Haugen will visit and speak at the European Commission on the 8th of November. Although these findings are important, Daniel Braun has confirmed the EU had already accounted for reports like these, saying Haugen’s visit is “more of a confirmation that what we are doing is right.”

27 member states – 27 opinions?

“All proposals in the commission are made in a collegial way, it’s a product of the whole commission”, says Daniel Braun, about creating the DSA. “I think our proposal is balanced and takes into account all the key interests, but there were different ideas and views when creating that we had to somehow incorporate.”

The process of creating a proposal in the Commission includes debating with stakeholders, representatives from particular sectors and from different member countries. Since online hatred is a problem troubling most of the world, the EU noticed different approaches and views appearing in the discourse. By speeding up the creation of the DSA, which applies across the whole European Union, it can avoid a fragmentation of rules and make the approach unified and therefore more effective.

Online Platforms and the European Commission against illegal hate speech

On 31 May 2016, the European Commission, together with Facebook, Google (including YouTube), Twitter and Microsoft-hosted consumer services (including LinkedIn), signed the Code of Conduct on countering illegal hate speech online. Later on, Instagram, Google+, Snapchat, TikTok and other smaller online platforms joined as well. “This means the Code now covers 96% of the EU market share of online platforms that may be affected by hateful content,” according to a 2019 information note by the European Commission, published before TikTok joined the Code of Conduct.

The aim is to prevent and counter the spread of illegal hate speech online. The Code requires that platforms have “rules and community standards that prohibit hate speech and put in place systems and teams to review content that is reported to violate these standards.” In the first monitoring exercise, published in December 2016, companies removed less than half of the reported content.

Only 40% of notifications were reviewed within the first 24 hours. That has shifted drastically, since the second requirement is that platforms “review the majority of the content flagged within 24 hours and remove or disable access to hate speech content, if necessary”. In the 2021 monitoring, 81% of notifications were assessed within the first 24 hours. The removal rate, however, fell from about 70% in 2020 to 62% in 2021.

The Code of Conduct never aimed for a 100% removal rate. According to Daniel Braun, many of the cases fall into a grey zone. “We wanted to avoid that the platforms would preemptively just remove problematic content, that is why the Code says majority.”

Not only the European Union

The European Union is not the only European institution trying to act against hate speech. The Council of Europe has its own initiatives, such as the No Hate Speech Movement. At the same time, it collaborates on creating legislation with other organisations, from the European Union to the United Nations. Through its recommendations, the Council of Europe can set standards in Europe and around the world.

The Grounds for Removing Online Content

It can be problematic for the European Union to define harmful or illegal content for all member states. Therefore, the DSA and other legislation dealing with illegal content do not take a content-based approach but instead ensure that procedures are in place for whatever member states deem to be illegal content. Despite this, the emphasis on removing harmful content remains. The DSA states that platforms that do not comply with removal obligations can be heavily fined. Racist and xenophobic hate speech became illegal in 2008 as part of a Framework Decision, which has since been adapted to cover online instances.

The EU’s “Historic Chance”

As the DSA continues to be debated in the European Parliament, further changes are being made to ensure the act will stand the test of time as technology progresses. For example, a disagreement between MEPs about whether targeted advertising should be allowed was highlighted in a committee meeting on the 27th of October. Despite how long the procedure is taking, there is still optimism about the act and what it will mean for the EU. Christel Schaldemose MEP told the committee that the act is a “historic chance to regulate online platforms and protect democracy.”

Video on how the adoption of laws works: https://www.canva.com/design/DAEukU6vjtg/d2zBWHGLH9cN_ArBG4zmkg/watch?utm_content=DAEukU6vjtg&utm_campaign=designshare&utm_medium=link&utm_source=homepage_design_menu

Video explaining EU’s Digital Strategy: https://youtu.be/miBoHLhYZGg