Terrorist-related content to be removed from Internet platforms within an hour, says Juncker

EPA-EFE/PATRICK SEEGER

President of the European Commission Jean-Claude Juncker delivers the annual State of The European Union speech in the European Parliament in Strasbourg, France, September 12, 2018.


Internet platforms will be required to remove all terrorism-related content within one hour of being notified, according to a proposal presented to the European Parliament by European Commission President Jean-Claude Juncker, who said during his last State of the Union address that those who fail to comply would face stiff sanctions from the bloc.

“The Commission is now proposing new rules so that any terrorist content posted online is removed within one hour because it is in this period of time that it has the greatest impact,” said Juncker in Strasbourg.

Brussels had already asked most major internet providers several months ago to voluntarily monitor and remove terrorist content appearing on online platforms.

The EU’s code of conduct on countering illegal online hate speech was first presented by the EU executive in 2016, with four major internet platforms participating initially – Facebook, Microsoft, Twitter, and YouTube. Snapchat became the seventh major IT platform to join the code in May, teaming up with the other players and the European Commission to fight illegal hate speech online, while the Commission signalled that it would expand the code’s scope to cover terrorist content.

The latest proposal goes further than any previous initiative by making the removal deadline binding and introducing consequences for internet providers that fail to comply.

The EU Member States will be required to follow up on non-compliance with “effective, proportionate and dissuasive” sanctions aimed at eliminating any content associated with terrorist organisations, according to the EU executive.

In the event that a hosting service provider – or a web platform – fails to comply with removal orders, the offending company would be subject to a fine of up to 4% of its total annual turnover for the previous financial year.

“By setting a minimum set of duties of care on hosting service providers, which includes some specific rules and obligations, as well as obligations for the Member States,” the Commission said in its statement, “the proposal intends to increase the effectiveness of current measures to detect, identify and remove terrorist content online without encroaching on fundamental rights, such as freedom of expression and information.”

The Commission argues that a harmonised legal framework will facilitate the provision of online services across the Digital Single Market, ensure a level playing field for all service providers in the EU, and provide a solid legal basis for the detection and removal of terrorist content.

The proposal also introduces legal obligations intended to promote transparency and increase trust among European citizens, particularly internet users, by holding companies accountable and requiring them to remain transparent. It calls for the creation of complaint mechanisms so that users can challenge the removal of their content.

“We must be able to ensure that terrorists will be prosecuted across Europe, across borders, because terrorists know no borders,” said Juncker.

Counter Extremism Project reacts to the European Commission’s regulation proposal

David Ibsen, Executive Director of the Counter Extremism Project, was quick to commend the Commission for taking a positive step towards combating extremist content online, but said the new measures fall short of a clear solution to the growing problem of illegal content online.

“The fact that the Commission is proposing a Regulation, as opposed to a Directive, shows the seriousness of the issue. We, at the Counter Extremism Project, have seen that content is downloaded and consistently re-uploaded across the same platforms it was previously taken down from. This cannot continue. Reliable enforcement and automated technology so that content can be taken down within one hour of upload need to be included in the proposed draft,” said Ibsen.
