Snapchat No.1 Grooming Platform, Says NSPCC
With figures showing an alarming 89 per cent increase in grooming offences over the last six years, the NSPCC has reported that Snapchat tops the list of platforms perpetrators use to target children online.
Record Levels of Grooming
Since “Sexual Communication with a Child” was established as an offence in 2017, online grooming crimes in the UK have hit record levels. The NSPCC has reported on its website that Snapchat was used in almost half (48 per cent) of the known cases of child grooming, making it the most common platform used to target young children and, in the charity’s words, a hub for online predators.
Calling For Stronger Regulations
With over 7,000 offences reported in 2023/24 alone, the NSPCC is now calling for strengthened regulations from Ofcom and tighter UK legislation to hold social media companies accountable for protecting young users from harm.
Why Snapchat?
Snapchat’s popularity among younger audiences, combined with design features that make messages disappear after a set time, creates fertile ground for abusers to operate with little fear of detection. While other platforms, such as WhatsApp, Facebook, and Instagram, were also implicated, Snapchat emerged as the platform of choice, largely because of this ephemeral messaging. The NSPCC’s published list of the platforms most used for child grooming, with percentages, is:
- Snapchat 48 per cent
- WhatsApp (Meta) 12 per cent
- Facebook and Messenger (Meta) 10 per cent
- Instagram (Meta) 6 per cent
- Kik 5 per cent
What is Kik?
For those who haven’t heard of Kik (launched in 2010 and based in Canada), it is a messaging app that allows users to chat via text, share multimedia, and join public group chats based on interests. Its popularity has been boosted by the fact that it doesn’t require users to link their accounts to a phone number, but this same feature, combined with limited verification measures, has raised child-safety concerns about the platform.
Rise In Grooming Offences
The NSPCC’s figures paint a distressing picture of the rise in grooming. For example, in 2023/24, UK police forces recorded 7,062 grooming cases under the “Sexual Communication with a Child” offence. This represents an increase of nearly 90 per cent in six years, with children as young as five among the victims. The data shows that girls were overwhelmingly the targets, accounting for 81 per cent of cases where gender was known, highlighting the vulnerability of young females in online spaces.
Platform First, Then Private Messaging
The NSPCC’s findings suggest that many of these perpetrators initially engage with children on mainstream social media platforms and gaming apps, only to shift communication to private and encrypted messaging services where they can evade detection. As the NSPCC reports: “Perpetrators typically used mainstream and open web platforms as the first point of contact with children. This can include social media chat apps, video games and messaging apps on consoles, dating sites, and chatrooms. Perpetrators then encourage children to continue communication on private and encrypted messaging platforms where abuse can proceed undetected.”
Such methods, therefore, appear to allow predators to groom their targets subtly before escalating to more serious abuse.
Why Snapchat and Similar Platforms Appear To Be Favoured by Predators
Snapchat’s design is undeniably appealing to children and teenagers, thanks to its instant messaging, photo-sharing, and geolocation features. However, these same features also appear to attract those with malicious intent. One key issue is the app’s “disappearing messages” function, which allows messages and images to vanish once viewed or after 24 hours, making it difficult for law enforcement or concerned parents to trace interactions.
Chilling Examples
To illustrate how the disappearing messages and location features in Snapchat may be exacerbating the problem, and to put a human face on the issue, the NSPCC has posted some chilling examples on its website. These include:
- Thomas, who was just 14 when he was groomed by an online predator. “Our first conversation was quite simple. I was just chatting. The only way I can describe it is like having the most supportive person that you could ever meet,” he recalled. However, as the relationship developed, the groomer pressured him into sending explicit images under the threat of exposure.
- Liidia, a 13-year-old member of the NSPCC’s Voice of Online Youth group, points out the risks tied to Snapchat’s disappearing messages and location-sharing features. Liidia is quoted as saying, “Snapchat has disappearing messages, and that makes it easier for people to hide things they shouldn’t be doing.” She adds, “Another problem is that Snapchat has this feature where you can show your location to everyone. If you’re not careful, you might end up showing where you are to people you don’t know, which is super risky.”
Lax Rules Too?
The NSPCC is also critical of the lax rules governing user interactions on Snapchat. According to the charity, children have expressed frustration that reporting inappropriate content or behaviour on the app often leads to insufficient action, leaving them unprotected and further reinforcing the cycle of abuse.
A Call for Proactive Regulation and Tougher Legislation
Following the release of the grooming statistics on its website, NSPCC Chief Executive Sir Peter Wanless has spoken out, urging Ofcom and the UK government to take more robust steps to combat online grooming. “One year since the Online Safety Act became law, and we are still waiting for tech companies to make their platforms safe for children,” he said. “We need ambitious regulation by Ofcom, who must significantly strengthen their current approach to make companies address how their products are being exploited by offenders.”
Proactive Rather Than Reactive Approach Needed
The NSPCC is calling for a shift in approach from reactive to proactive, advocating for regulatory measures that will compel social media platforms to address potential risks within their app designs, rather than merely responding to issues after harm has occurred. The charity is also seeking to extend the Online Safety Act to cover private messaging, giving Ofcom clearer authority to tackle cases on encrypted services such as WhatsApp and Snapchat.
Jess Phillips, the minister for safeguarding and violence against women and girls, echoed these sentiments, urging social media firms to fulfil their responsibilities under the Online Safety Act. “Under the Online Safety Act, they will have to stop this kind of illegal content being shared on their sites, including on private and encrypted messaging services or face significant fines,” Phillips stated.
What Have The Police Said?
Becky Riggs, the National Police Chiefs’ Council lead for child protection, described the situation as “shocking” and called on social media companies to bear responsibility for safeguarding children on their platforms. “It is imperative that the responsibility of safeguarding children online is placed with the companies who create spaces for them, and the regulator strengthens rules that social media platforms must follow,” she stated.
Tech Companies Respond: The Gap Between Policy and Practice
In response to the rising criticism, Snapchat was quoted in a BBC report as saying that it operates a “zero tolerance” policy toward the exploitation of young people, with safeguards designed to detect and block inappropriate behaviour. A Snapchat spokesperson also stated, “If we identify such activity, or it is reported to us, we remove the content, disable the account, take steps to prevent the offender from creating additional accounts, and report them to the authorities.”
WhatsApp Too
Similarly, in response to the NSPCC’s data, WhatsApp has emphasised that it has “robust safety measures” in place, although critics argue that such measures are inadequate when app features themselves create an environment conducive to grooming.
Tech Company Inaction
In a post on X, the NSPCC’s policy manager, Rani Govender, said that tech companies are partly to blame for the shocking online grooming figures: “The scale and significance of these crimes cannot be underestimated. No justification for tech company inaction.”
What’s Next for Social Media Safety in the UK?
The newly implemented Online Safety Act obliges tech companies to take children’s safety seriously. By December, major platforms will be required to publish risk assessments detailing potential illegal activities on their services. Ofcom, the communications regulator responsible for enforcing these rules, is also planning stringent measures that social media firms must follow to curb online grooming (outlined in its draft codes of practice).
The NSPCC has intensified its calls for social media companies to be held accountable, emphasising the urgent need for ongoing safety technology updates to shield young users from predatory behaviour. Joined by parents, policymakers, and youth advocates, the charity is pushing for swift, decisive action to ensure social media platforms provide a secure environment for children, free from exploitation.
What Does This Mean For Your Business?
The situation surrounding Snapchat and similar platforms reflects a broader issue in online safety for children, where technology’s rapid evolution outpaces regulatory oversight. Despite assurances from tech companies regarding their safety measures, the NSPCC’s findings appear to reveal a troubling gap between policy statements and practical outcomes. Snapchat’s design, with features like disappearing messages and location sharing, clearly appeals to young users, yet it also inadvertently gives predators the means to exploit those same elements with relative ease.
The NSPCC’s call for proactive, rather than reactive, regulation therefore reflects the scale of the shift needed to combat this rising wave of online grooming effectively. With record numbers of offences being reported, the charity’s insistence on improved safeguarding features and on holding tech companies accountable takes on renewed urgency. Parents, the police, and policymakers are now demanding that platforms put child safety at the forefront of their design considerations, implementing features that deter grooming rather than facilitate it.
As the Online Safety Act begins to take full effect, with major platforms expected to publish risk assessments by December, the coming months may signal a critical period of change. However, true progress will hinge on whether companies like Snapchat, WhatsApp (Meta), and others meaningfully adapt to prevent abuse on their platforms. With Ofcom promising to exercise its enforcement powers, the pressure is now on tech firms to close the gap between safety promises and actual practice, ensuring that children can navigate social media spaces securely, free from predatory threats.