A new law has come into force in Germany aimed at regulating social media platforms to ensure they remove hate speech within set periods of receiving complaints — within 24 hours in straightforward cases or within seven days where evaluation of content is more difficult.
The law’s full German name is the Netzwerkdurchsetzungsgesetz, which translates as the ‘Network Enforcement Act’. It’s commonly referred to as NetzDG, an abbreviation of that name.
Fines of up to €50 million can be applied under the law if social media platforms fail to comply, though as Spiegel Online reports there is a transition period for companies to gear up for compliance, which ends on January 1, 2018. However, the ministry in charge has started inspections this month.
Social platform giants such as Facebook, YouTube and Twitter were cast as the initial targets for the law, but Spiegel Online suggests the government is looking to apply it more widely, including to content on networks such as Reddit, Tumblr, Flickr, Vimeo, VK and Gab.
The threshold for complying with the takedown timeframes is being set at a service having more than two million registered users in Germany.

Spiegel Online also reports that the German government intends to assign 50 people to the task of implementing and policing the law.
It also says all social media platforms, regardless of size, must provide a contact person in Germany for user complaints or requests for information from investigators. Such queries will need to be answered within 48 hours or risk penalties, it adds.
One obvious question, though, is how any fines could be applied across international borders if a social media firm has no bricks-and-mortar presence in Germany.
The law does also require social media firms operating in Germany to appoint a contact person in the country. But again, companies based outside Germany may be rather hard to police, unless the government intends to start trying to block access to non-compliant services, which would only invite further controversy.
The German cabinet backed the proposal for the law back in April. At the time, justice minister Heiko Maas said: “Freedom of expression ends where criminal law begins.”
The country has specific hate speech laws which criminalize certain types of speech, such as incitement to racial and religious violence, and the NetzDG law cites sections of the existing German Criminal Code — applying itself specifically to social media platforms.
It’s not alone in Europe, either. The UK has also been active recently in leading a push by several G7 nations against online extremism — with the apparent aim of reducing takedown times for this type of content to an average of just two hours.
Germany has also been pushing for a European Union wide response to tackling the spread of hate speech across online platforms.
And last week the European Commission put out new guidance for social media platforms urging them to be more proactive about removing “illegal content”, including by developing tools to automate the identification of problem content and prevent its re-uploading.
It warned social giants that it might seek to draft a legislative proposal if they do not improve takedown performance within six months.
However, the executive body appears to be seeking to bundle various types of “illegal” content into the same problem bucket, and it quickly drew criticism that it risks encouraging algorithmic censorship by seeking to create one set of rules to apply to, for example, both copyrighted content and terrorist propaganda. That underlines the risks around broad efforts to regulate the types of content that can and can’t be viewed online.
Critics of Germany’s NetzDG law argue it will encourage tech platforms to censor controversial content to avoid the risk of big fines. And while speedy social media takedowns of offensive hate speech might enjoy mainstream backing in Germany, it remains to be seen how the law will operate in practice.
Meanwhile, if overly expansive rules end up being fashioned to try to regulate all sorts of “illegal” content online, that could also result in a wider chilling effect on online expression, and reduced support for broad regulatory efforts.