Social media giants are fighting an attempt by the UK’s communications regulator to stop children receiving friend requests from strangers over the internet.
Ofcom has urged technology companies to stop presenting children with lists of suggested users to add, known as “network expansion prompts”. The proposals are designed to tackle “illegal harms” such as grooming under the Online Safety Act.
Snap, the $15bn owner of messaging app Snapchat, has pushed back against the proposals. The Californian company has argued in a response to the regulator’s plans that there are “strong benefits” of such friend recommendation tools for children, including helping to tackle the UK’s “loneliness epidemic”.
Online safety campaigners said Snapchat’s submissions were an effort to “water down” the regulator’s protections for children and called the company’s decision to invoke a loneliness epidemic “cynical”.
Ofcom’s proposals, announced in November, said children should “not be presented with network expansion prompts, or included in network expansion prompts presented to other users”.
Such prompts see social media users presented with lists of potential friends and mutual contacts, including people they may never have met in person.
In its response to the regulator, Snap argued the proposals went too far and that such prompts should be allowed when they had been “designed responsibly and enable connections to people the child is likely to know in real-life (as opposed to strangers)”.
It claimed such digital connections helped to support “feelings of belonging and happiness” in young people and could tackle the UK’s “loneliness epidemic”.
The social network said that “older children” between the ages of 13 and 17 should be “included in network expansion prompts and be presented with network expansion prompts by default”.
It suggested limiting such prompts to instances where there are multiple mutual contacts, or where the child already has the phone number of a potential contact on their smartphone.
Snapchat added: “This ensures that there are clear parameters in place so that the teen is very likely to know the other user in real life (whether another teen or an adult).”
‘Scattergun friend requests’
Online safety campaigners said Snapchat’s proposals were an effort to weaken protections for children.
Andy Burrows, chief executive of the Molly Rose Foundation, said: “Snapchat’s attempt to water down the regulator’s proposals shows the company continues to prioritise its revenue over children’s safety.
“It’s particularly cynical to see Snapchat invoke a so-called ‘loneliness crisis’ as it seeks to continue with high-risk design choices that are seemingly driven far more by a concern for its bottom line than child wellbeing.”
He added Snap’s proposal would “fail to address the risks of teenagers being encouraged to add users they don’t know and seem focused on the platform’s growth rather than minimising the risk of malign or dangerous contact”.
Ofcom has been preparing to enforce its new powers under the Online Safety Act, consulting with industry figures and child safety campaigners.
The full measures come into force in spring next year and will grant the regulator the power to levy fines worth billions of pounds on tech giants that fail to tackle online harms.
Announcing its proposals last year, Ofcom warned that “scattergun friend requests are frequently used by adults looking to groom children”.
Research from the regulator found that three in five secondary school-aged children had been contacted online in a way that made them uncomfortable.
One in six had been sent naked or half-dressed photos or been asked to share ones of themselves.
Dame Melanie Dawes, Ofcom’s chief executive, said such abuse “cannot continue”.
The regulator’s proposed measures included not presenting children with suggested friends lists, excluding them from lists of suggested contacts shown to other users and not making their connections visible to others.
Snapchat suggested children should be offered a prompt to block a connection request from someone they did not appear to know.
The company said it believed the risks of “responsible network expansion prompts” were “relatively low” and that such prompts “should not be disabled for younger users”.
Freedom of information data from the NSPCC published last year revealed there had been 34,000 online grooming offences against children in the last six years.
Where the means of communication involved in the offence was disclosed by police, 26pc of cases – or 3,692 instances – involved Snapchat.
An Ofcom spokesman said it was considering all responses to its consultation.
A Snap spokesman said: “Snapchat helps connect people to their friends and family, and we are committed to the safety of our community. We have extra protections for under 18s to make it very difficult for them to be contacted by people they don’t know.
“This includes keeping friends lists private, and we don’t allow any users to be messaged by someone they haven’t added as a friend or don’t already have in their phone contacts. We continue to roll out enhanced safety measures to support our community, including expanding our in-app notifications to ensure teens are in touch with people they trust and in some cases, preventing friend requests altogether.
“These measures have been recognised by Ofcom as examples of best practice to help protect children, and we continue to work closely with Ofcom to support the implementation of the Online Safety Act.”