Government minister to demand Tinder and Grindr explain what they're doing to protect children
Culture secretary ‘shocked’ to discover child sex offenders using dating apps
The culture secretary Jeremy Wright is to question Tinder and Grindr about the measures they use to protect children, after police records showed children are at risk of grooming and sexual exploitation on the dating apps.
The secretary of state for digital, culture, media and sport (DCMS) said he was “truly shocked” to discover the perpetrators of child sex offences had used online dating services.
Mr Wright said: “I will be writing to these companies asking what measures they have in place to keep children safe from harm, including verifying their age.
“If I’m not satisfied with their response, I reserve the right to take further action.”
Police have investigated more than 30 incidents of child rape since 2015 in which victims evaded age checks on dating apps and were then sexually exploited, according to The Sunday Times.
Records obtained through Freedom of Information laws showed 60 further cases of child sex offences via online dating services – including grooming, kidnapping and violent sexual assault.
The youngest victim was eight years old, the newspaper said.
Grindr responded to the findings in a statement: “Any account of sexual abuse or other illegal behaviour is troubling to us as well as a clear violation of our terms of service. Our team is constantly working to improve our digital and human screening tools to prevent and remove improper underage use of our app.”
Last week a man was jailed for two-and-a-half years after spending the night with a 12-year-old girl he had met on a popular adult dating app; he said he had thought she was 19.
Carl Hodgson, 28, invited the child to his flat in Manchester city centre a few days after they first made contact via an app.
He pleaded guilty to causing or inciting a child under 13 to engage in sexual activity, engaging in sexual activity in the presence of a child, distributing an indecent photograph of a child and making indecent photographs of a child.
Tinder said it uses both automated and manual tools to moderate users, including scanning profiles for “red flag” images, and said it also depends on users to report profiles that may belong to a minor.
A spokeswoman said: “We utilise a network of industry-leading automated and manual moderation and review tools, systems and processes – and spend millions of dollars annually – to prevent, monitor and remove minors and other inappropriate behaviour from our app.
“We don’t want minors on Tinder.”
Mr Wright said the latest data on police investigations provided “yet more evidence that online tech firms must do more to protect children”.
It comes after Instagram pledged to ban graphic images of self-harm after health secretary Matt Hancock said social media companies “need to do more” to curb their impact on teenagers’ mental health.
The announcement followed the death of 14-year-old Molly Russell, whose family found she had viewed content on social media linked to anxiety, depression, self-harm and suicide before taking her own life in November 2017.
On Sunday Labour’s shadow health secretary Jonathan Ashworth told Sophy Ridge on Sunday: “It makes me feel physically sick that children can access images that glorify self-harm and suicide.”