TikTok and Instagram Reels are the social media platforms known for light-hearted dancing and memes, and for many of us they have brought an element of seemingly harmless fun into the dark days of Covid.
But as Anne-Marie Tomchak reports, are we nonchalantly uploading content of our children having fun and ignoring the darker side of child safety issues that these social platforms present?
The account quickly grew to 17,000 followers and Krystal’s daughter (now 11) excelled in her sport to become an elite gymnast, so things were progressing nicely.
About a year ago a friend suggested that the account should be private because of child safety concerns, so Krystal changed the settings and removed any followers who seemed ‘dodgy’. But it was already too late.
In June 2020 she found out that an image of her daughter had been published on a Russian porn website. “It was a photo of Edie in a pink leotard taken from the days before the Instagram account was private,” says Krystal. “The comments on the photo were shocking and I recognised images of other kids from the dance world on there too.”
Krystal took screenshots and immediately alerted the police about the stolen images. That night she sat down and, over the next 24 hours, removed every single one of the 17,000 followers on her daughter’s page.
“I was hysterical and in tears. I literally just sat there tapping, tapping, tapping. It needed to be done,” she says. “If people really want to follow they can send another request.” Sure enough, the requests started coming back in. Krystal’s new policy was to vet every single account before accepting it. She also decided to go back in regularly to check her follower list, closely inspecting each profile photo and its posts, and looking at who else they follow.
“Some accounts are initially set up to look legit and a part of the gymnast community, but they are really lurkers who just watch and don’t post,” adds Krystal. “You need to keep checking them. I’m constantly reporting and blocking.”
What happened to Krystal and her daughter is something parents have become increasingly aware of, and something experts and activists are vociferously highlighting. And the platforms are beginning to take notice.
Two months after a Glamour UK investigation revealed that child predators were harvesting images of kids on Instagram, the platform has introduced a new child safety feature. It allows users to report content that involves a child. Reported accounts are flagged as a priority to a child safety specialist.
Previously this reporting option existed solely in the nudity category. A spokesperson for Facebook (which owns Instagram) told us: “Now, we’re offering a new option for people to report entire accounts that might endanger children”. The spokesperson added: “We prohibit content and accounts that put young people in danger. We use artificial intelligence technology and reports from our community to keep child exploitation off Instagram.”
India*, an executive assistant in London in her 30s, is one of the people who has been actively lobbying for these changes. She runs the @pd_protect Instagram page, which reports and spotlights alarming content. In the space of four weeks she reported 7,000 accounts to Instagram, and she says the platform has really listened.
“I initially had a handwritten list of six hundred accounts which I reported. Instagram investigated and removed every single one of those. I’m trying to work with them rather than against them,” she says. Having had some success with Instagram, India is now turning her attention to TikTok.
She’s keen to emphasise that she’s not trying to shame parents, but says they need to be more aware. “There are parents of child gymnasts and dancers who are doing the splits and dressed in leotards. Many of them are focused on followers and sponsorship deals. If they limit the number of followers, their sponsorship deals are at risk,” explains India.
“The nature of TikTok is to encourage girls to dance. You can have a private profile but it won’t get as many followers or likes that way. The idea of a following or money being more important than child safety is a subject that makes people uncomfortable.”
“They are not looking at the quality of the followers,” she says before adding that “loads of mums we know are in denial and turning a blind eye because of all the free stuff (like leotards, dance shoes and other products and merchandise) that comes with having a big account.”
Like India, Krystal is now also turning her attention to TikTok particularly given the popularity of dancing on the platform and the younger demographic using it. “It took 6 months before I agreed to let my daughter use TikTok. Some people post some really funny videos on there and some are really creative. But I took it off her phone recently as I didn’t want her watching a particularly upsetting video.”
The ‘For You’ section is similar to the Explore tab on Instagram, where you can see content from creators you don’t necessarily follow; the algorithm learns from your activity and serves up popular content in the belief that you’ll like it.
Why would anyone target children via kids’ content in this way? The most obvious reason is simple: money. The ad revenue generated by video content can be sizeable and, given that nursery rhymes and other kids’ content are among the most-watched things on YouTube, it doesn’t take a genius to figure it out.
But that doesn’t explain how disturbing video content has still managed to creep on to YouTube Kids which is supposed to filter out content that isn’t suitable for children.
These changes, which restricted data collection and personalised advertising on children’s content, were lamented by some creators who knew their income would take a hit. But they were devised after the company was fined almost $200 million for breaking child privacy laws in the US. Here in the UK, a legal case against Google (which owns YouTube) has just begun for allegedly breaching the privacy of under-13s by collecting data without parental consent.
But on TikTok there are videos showing three teenage girls talking about fucking each other’s boyfriends, or a mother describing her daughter as a little c*** who she should’ve strangled at birth. Online child safety expert John Carr believes that TikTok needs to vastly improve its age verification process. “There are some excellent programmes available,” he says, “but the companies are not afraid of the regulators.”
When asked if TikTok’s age verification was robust enough, a TikTok spokesperson said: “We know the industry as a whole needs to do more work on age verification, and we are committed to working with peers, regulators and key stakeholders to find an industry-wide solution to ensure that only those who meet minimum age requirements use platforms like ours. Keeping our users safe is our top priority.”
The prominence of dancing on TikTok is something that parents like Krystal are wary of, or more specifically the type of dancing that’s part of TikTok culture. “I watched the movie Cuties on Netflix recently and as much as it disgusted me, it also reminded me of what you’d see on TikTok.
The dancing is very similar with all of these young girls grinding.” Cuties has attracted criticism and even led some people to cancel their Netflix subscriptions over its sexualised depiction of young girls and stereotypes of black bodies and Muslim women.
A poster for the film showing the scantily clad tween protagonists in provocative poses was widely criticised and led Netflix to issue an apology. The film is a coming-of-age story about an 11-year-old from a conservative Senegalese family who joins a sassy dance troupe to rebel against her background. In an op-ed for the Washington Post, the film’s director and writer Maïmouna Doucouré (who is herself French-Senegalese) said that Cuties was a story about modern girlhood and the confusion that young girls experience during puberty in the digital age.
She was inspired to write the script for the film after speaking to a group of 11-year-old dancers at a community event in Paris who told her that “the sexier a woman is on Instagram or TikTok, the more likes she gets.” “They tried to imitate that sexuality in the belief that it would make them more popular,” wrote Doucouré. “They construct their self-esteem based on social media likes and the number of followers they have.”
But there are lots of resources available online: from guides on what parents need to know about TikTok, to blogs from activists like India with general information about how to keep children safe online. There is also child protection software that can be downloaded onto children’s devices, such as SafeToNet, an app that uses language processing and artificial intelligence to detect warning signs in online language or behaviour. Parents receive an alert if anything in their child’s phone activity raises a red flag, but they cannot read messages, so the child’s privacy is not infringed upon.
Founder Sharon Pursey says it can help guide a child and disrupt potentially harmful conversations. The alert provides an extra layer of support so that parents can initiate a dialogue with their child. The company has also released a range of steps that parents can follow in order to keep their kids safe online such as making sure video phones are not used in the bedroom or bathroom.
“Children are so savvy online,” continues Pursey. “We’ve had cases where kids as young as six are sexting as they learn from their older siblings who are watching porn.”
Supervision, and building knowledge and trust over time (depending on the age of the child), are a key part of the process. Technology and the content consumed online are influencing children in so many ways, informing how they see themselves and the world around them. But none of this is without risks, and there is no way of sugar-coating what those risks are.
Perhaps John Carr said it best: “Families and children are spending more time online during lockdown, parents are busy working from home and this has presented predators with a golden opportunity. Everything is happening more. The pandemic has put the child protection issue on steroids.”