(Bloomberg) — Technology companies such as YouTube, Facebook and Twitter are set to face a statutory duty to protect U.K. users against a broad range of harmful content or risk “heavy” fines.
Plans for an industry-funded regulator, which would enforce rules on removing online content that encourages terrorism and child sexual exploitation and abuse, are part of a push by Prime Minister Theresa May’s government to hold the companies accountable. Enforcement powers could include blocking access to sites and imposing liability on individual company managers.
The Department for Digital, Culture, Media and Sport laid out the proposals as it opened a 12-week consultation on the measures on Monday.
“Online companies must start taking responsibility for their platforms, and help restore public trust in this technology,” May said. “For too long these companies have not done enough to protect users, especially children and young people, from harmful content.”
The plans weren’t universally welcomed. The Institute of Economic Affairs, a pro-market research group, labeled them “draconian” and more likely to do harm than good by holding back innovation. Giving the government power to dictate what content is appropriate sets a dangerous precedent, Director General Mark Littlewood said.
Mosque Shootings
The proposed laws will apply to any company that allows users to share or find user-generated content or interact with each other online, such as social-media platforms, file hosting sites, public discussion forums, messaging services and search engines.
Other proposals outlined by the government include:
Ensuring companies respond to user complaints and act on them quickly
Codes of practice, which could include requirements to minimize the spread of misleading or harmful disinformation with fact checkers, particularly during elections
Annual transparency reports on harmful content and companies’ action to address it
A framework to help companies incorporate safety features in new products
A strategy to educate users on how to recognize and deal with malicious behavior online
Damian Collins, a Conservative who chairs the Digital, Culture, Media and Sport Committee, cited the terrorist attack in New Zealand in which 50 Muslims were killed while video of the assault was live-streamed online.
“A regulator should have the power to investigate how content of that atrocity was shared and why more was not done to stop it sooner,” he said.
To contact the reporters on this story: Lucy Meakin in London at [email protected]; Kitty Donaldson in London at [email protected]
To contact the editors responsible for this story: Fergal O’Brien at [email protected], Tony Czuczka
©2019 Bloomberg L.P.