A mother-of-two has issued a warning to parents after she was ‘sickened’ to learn her 11-year-old daughter had been sexually groomed by a ‘secret porn community’ on Spotify for months.
The girl, from Stockport, Greater Manchester, was told to upload multiple explicit pictures of herself to the music streaming platform by users – before her account was deleted.
It comes as children’s charities warn that this previously unseen form of grooming shows how predators will exploit any app children use, if it is possible to do so.
A spokesperson for Spotify said it ‘takes the safety of minors on our platform extremely seriously’ and had removed the exploitative content, and terminated the user in question.
Rachel, not her real name, claims her daughter was encouraged to create her own playlists, where other Spotify users could edit the title to enable them to message each other. Users can then upload a ‘custom’ picture to the cover of the playlist.
She says her daughter was sent an email from one user who she’d met on Spotify, who claimed to be a 12-year-old boy and who asked her to send a video of her pleasuring herself.
Another user named ‘I have nudes’ tagged her in a playlist with a message asking her to ‘show a good view’ of her genital area.
The Manchester Evening News reported that numerous playlists on the music streaming service had their titles changed by users who tagged other profiles and asked them to upload indecent pictures.
In one playlist, the name has been changed to ‘important message to all porn posters’ and warns users about an American YouTube channel uncovering the ‘secret porn community on Spotify,’ according to the report.
The M.E.N said that when they searched for words such as ‘nudes’ and ‘porn’ on the platform, various profiles and playlists featuring explicit pictures of female body parts appeared.
Rachel, who has asked for her family’s identity to be concealed, keeps tight control over what online platforms her daughters, aged eight and eleven, have access to.
Neither of them has been allowed to create Snapchat, Facebook, Instagram or TikTok profiles. The only app Rachel, a teacher, allowed her 11-year-old to use unsupervised was Spotify – so she could listen to podcasts before bed.
Rachel said: ‘I am a teacher so I’m probably a bit stricter than other parents but I just wanted to make sure they are safe. They both have iPads but I can control their screen time and what apps they use through my phone.
‘The only app my eldest daughter was able to use after 8pm was Spotify because she’s always fallen asleep to spoken word. She likes to listen to podcasts before she goes to sleep which she plays out loud on the Alexa so I can hear exactly what she’s listening to.
‘She has access to the adult version of Spotify because of the podcasts she listens to but I didn’t think that would be an issue because it’s just a streaming platform.’
Rachel said she and her husband only discovered the extent of what their daughter had been exposed to when she was locked out of her Spotify account shortly after Christmas.
Rachel said: ‘When I started looking at why it wasn’t working I logged into her email account which she doesn’t have access to. It’s only set up so she can have Spotify but I have the login details.
‘I opened the email and lots of them were from Spotify. It said her playlist had been removed for breaching terms and conditions. I thought it was because of the ones she’d been listening to, I didn’t realise she’d made them herself.
‘I saw another email in her inbox, this time from a man’s name I didn’t recognise. I asked her who he was and she said ‘he is one of my friends on Spotify.’ She said she had given him her email address so they could play Minecraft together. She said he was only twelve.
‘I asked her how because you can’t message people on Spotify. She told me you make a new playlist and you put your message in the title and they check the playlist and reply.’
Rachel said at this point she began to panic and started researching reasons why Spotify would remove playlists. She came across an article referring to copyright images being taken down.
Rachel continued: ‘I asked her if she’d been putting up copyright pictures but my daughter said she didn’t understand. I just got that sinking feeling. I asked if she’d been putting up inappropriate pictures and she nodded.
‘We found another email from the man who had asked her to pleasure herself. She didn’t understand and was getting very upset. I felt physically sick. When I searched her name on Spotify I could see her playlist. We could see pictures she’d uploaded on there. They were very explicit.’
Rachel said she put her daughter to bed and immediately called the police’s 101 non-emergency service. She describes feeling ‘shocked’ when the call handler asked her to phone 999.
She also contacted Spotify through the website’s live chat function and alerted them to what had happened, as well as requesting the pictures of her daughter were removed from the platform.
Rachel said: ‘I got through to an actual person really quickly. They said the platform was never intended to be used in this way. We asked if they could remove the images and they said they would pass it on to the team but that they couldn’t help any further.’
Rachel says the explicit photographs have now finally been deleted from Spotify after she reported her daughter’s account and her playlists multiple times.
Rachel said: ‘The police officer who came to our house said she had not heard of Spotify being used like this so she was quite shocked.
‘They said they would try and track down the email address of the man who asked her for a video but if he lives in another country there isn’t much they can do.
‘That’s why I am so determined to spread awareness of this to make other parents aware as we had no idea this could happen. I think in my daughter’s eyes the people she was messaging were not real people.
‘Because she is so young we hadn’t had a proper chat about explicit photographs or anything like that. But I have taught kids under the age of ten who have Googled how sex works. I don’t think education is keeping up with the online world.’
Detective Inspector Michael Jimenez, of GMP’s Stockport division, told MailOnline: ‘An investigation is underway following reports of a child that has been incited to post indecent images on a music streaming site.’
The NSPCC warned Rachel’s story demonstrates the evolving threat to children online.
Richard Collard, Online Safety Regulatory Manager at the NSPCC, told MailOnline: ‘Child sexual abuse is an evolving threat and taking place at a record scale online, and this incident shows how offenders will exploit any app children use if it is possible to do so.
‘We must ensure tech firms do all they can to disrupt online child abuse.
‘The Government’s Online Safety Bill will introduce legislation that will hold platforms responsible for finding and disrupting this activity.
‘A strengthened Bill that holds senior managers accountable for safety would make the UK the global authority for children’s safety online.’
The Government’s Online Safety Bill, which has been hit by numerous delays, is set to be passed this year and promises to strengthen protection for children from harmful online content.
It was due to finish its Commons stages in July but was pulled from the parliamentary timetable to make way for a confidence vote in the former Prime Minister, Boris Johnson. The bill has since been rewritten following a row among Conservative MPs about freedom of speech online.
As a result, measures which would have forced big tech firms to take down legal but harmful material have been axed from the bill – a move criticised by campaigners and the Labour party.
Technology companies will still have to stop children – defined as those under 18 – from seeing content that poses a risk of causing significant harm. Firms will also have to explain how they will check their users’ age.
A Spotify spokesperson said: ‘Spotify takes the safety of minors on our platform extremely seriously, and we do not allow content that promotes, solicits, or facilitates child sexual abuse or exploitation.
‘We have processes and technology in place that allow us to detect and remove any such exploitative material. In this case, we found the imagery in question, terminated the user, and removed the content.’
On the platform’s rule page, a sub-section titled ‘sensitive content’ says sexually explicit content is not allowed and that breaking the rules may result in the content being removed. Repeated or egregious violations may result in accounts being suspended or terminated, it says.
Spotify’s rule page states: ‘We have tons of amazing content on Spotify, but there are certain things that we don’t allow on our platform. Don’t post excessively violent or graphic content, and don’t post sexually explicit content.
‘What to avoid: Content that contains sexually explicit material includes, but may not be limited to: pornography or visual depictions of genitalia or nudity presented for the purpose of sexual gratification, and advocating or glorifying sexual themes related to rape, incest, or bestiality.
‘Please respect Spotify, the owners of the Content, and other users of the Spotify Service. Don’t engage in any activity, post any User Content, or register and/or use a username, which is or includes material that is offensive, abusive, defamatory, pornographic, threatening, or obscene.’
MailOnline has contacted Spotify for further comment.