Yorkshire Coast Radio News

Sexual Grooming Rises in Yorkshire and Humber

Published by Jon Burke at 12:05am 11th September 2019.

Grooming crimes recorded by police in Yorkshire and Humberside have soared by 59% in the last year, data obtained by the NSPCC has revealed.

There were 473 offences of sexual communication with a child recorded in the region in the year to April 2019, compared with 297 in the previous year.

In North Yorkshire the number increased from 14 to 22 over that time, while in Humberside it went up from 77 to 80.

In England and Wales, there were 4,373 offences of sexual communication with a child recorded in the year to April 2019 compared with 3,217 in the previous year. The offence came into force on April 3, 2017, following an NSPCC campaign.

The data obtained from 43 police forces in England and Wales under Freedom of Information laws also revealed that, where age was provided, one in five victims were aged just 11 or younger. 

In England and Wales, recorded instances in which Instagram, which is owned by Facebook, was used more than doubled in 2018/19 compared with the previous year.

Overall in the last two years, Facebook-owned apps (Facebook, Messenger, Instagram, WhatsApp) and Snapchat were used in nearly 75% of the instances where police in Yorkshire & The Humber recorded and provided the communication method. 

The Government has indicated it will publish a draft Online Harms Bill early next year, following the NSPCC’s Wild West Web campaign. The proposals would introduce independent regulation of social networks, with tough sanctions if they fail to keep children safe on their platforms.  

The NSPCC believes it is now crucial that Boris Johnson’s Government makes a public commitment to draw up these Online Harms laws and implement robust regulation for tech firms to force them to protect children as a matter of urgency.

NSPCC Chief Executive, Peter Wanless, said:

“It’s now clearer than ever that Government has no time to lose in getting tough on these tech firms.  

Despite the huge amount of pressure that social networks have come under to put basic protections in place, children are being groomed and abused on their platforms every single day.  These figures are yet more evidence that social networks simply won’t act unless they are forced to by law. The Government needs to stand firm and bring in regulation without delay.” 

One girl was 12 when, while she was staying at a friend’s house, a stranger bombarded her Instagram account with sexual messages and videos.

Her mum told the NSPCC:

“She was quiet and seemed on edge when she came home the next day. I noticed her shaking and knew there was something wrong so encouraged her to tell me what the problem was. 

When she showed me the messages, I just felt sick. It was such a violation and he was so persistent. He knew she was 12, but he kept bombarding her with texts and explicit videos and images. She didn’t even understand what she was looking at. There were pages and pages of messages, he just didn’t give up.

Our children should be safe in their bedrooms, but they’re not. They should be safe from messages from strangers if their accounts are on private, but they’re not.” 

The NSPCC’s Wild West Web campaign is calling for social media regulation to require platforms to take proactive action to identify and prevent grooming on their sites by:

- Using Artificial Intelligence to detect suspicious behaviour

- Sharing data with other platforms to better understand the methods offenders use and flag suspicious accounts

- Turning off friend suggestion algorithms for children and young people, as they make it easier for groomers to identify and target children

- Designing young people's accounts with the highest privacy settings, such as geo-locators off by default, contact details being private and unsearchable, and livestreaming limited to contacts only.

The charity wants to see tough sanctions for tech firms that fail to protect their young users – including steep fines for companies, boardroom bans for directors, and a new criminal offence for platforms that commit gross breaches of the duty of care.
