EU parliament votes to fine internet firms for not removing extremist content quickly

FILE PHOTO: The building of the European Parliament, designed by Architecture-Studio architects, is seen in Strasbourg, France March 26, 2019. REUTERS/Vincent Kessler/File Photo

By Foo Yun Chee

STRASBOURG (Reuters) – The European parliament voted on Wednesday to fine firms like Facebook, Google and Twitter up to 4 percent of their turnover if they persistently fail to remove extremist content within one hour of being asked to do so by authorities.

The measures have been brought into sharper focus since the live streaming on one of Facebook’s platforms of a lone gunman killing 50 people at two New Zealand mosques in March.

The parliament voted 308 to 204, with 70 abstentions, to back the proposal to tackle the misuse of internet hosting services for “terrorist purposes”.

“Companies that systematically and persistently fail to abide by the law may be sanctioned with up to 4 percent of their global turnover,” it said.

A new European Parliament, to be elected on May 23-26, will finalize the text of the law in negotiations with the European Commission and representatives of EU governments, a process likely to take many months.

“There is clearly a problem with terrorist material circulating unchecked on the internet for too long,” said Daniel Dalton, the parliament’s rapporteur for the proposal.

“This propaganda can be linked to actual terrorist incidents and national authorities must be able to act decisively. Any new legislation must be practical and proportionate if we are to safeguard free speech,” he said.

“It … absolutely cannot lead to a general monitoring of content by the back door.”

EU officials moved to regulate because they believe internet companies are not doing enough under voluntary measures, and because the first hour is the most vital for stemming the viral spread of online content.

Facebook said it removed 1.5 million videos containing footage of the New Zealand attack in the first 24 hours after the shootings.

Three U.N. special rapporteurs for human rights and the EU’s own rights watchdog have expressed worries that the new rules fall short and could be misused.

Companies rely on a mix of automated tools and human moderators to spot and delete extremist content. However, when illegal content is taken down from one platform, it often crops up on another, straining authorities’ ability to police the web.

In response to industry concerns that smaller platforms do not have the same resources to comply as speedily with tougher EU rules, lawmakers said authorities should take into account the size and revenue of companies concerned.

Draft measures call on the bloc’s national governments to put in place the tools to identify extremist content and an appeals procedure. The one-hour rule would apply from the point of notification by national authorities.

Brussels has been at the forefront of a push by regulators worldwide to force tech companies to take greater responsibility for content on their sites.

(Writing By Jan Strupczewski; Editing by Kirsten Donovan)
