TikTok’s epidemic of underage live sex streaming for tokens and cash is every parent’s worst nightmare come true. Parents are warned not to permit underage access to the platform and to restrict in-app features such as livestreaming to keep their children safe.
Parents also need to monitor their children’s activity on the app and enable all parental controls. This Forbes investigation will open many parents’ eyes. Hopefully parents will take the findings on board and discuss them with their children as a matter of urgency.
“You’re paying my bills,” MJ told the audience, running a finger over her mouth.
“$35 for a flash,” one viewer responded. Another asked how much to send to her Cash App.
As she posed and pursed her lips, her long blonde hair draped over her tight black bralette, some asked MJ to show them her feet.
“I’m 68 and you owe me one,” one attendee told her as more requests piled on.
These exchanges did not take place between adults at a nightclub; they took place on TikTok Live, where MJ, who said she was 14 years old, was broadcasting with friends to 2,000 strangers on a recent Saturday night.
A Forbes review of hundreds of recent TikTok livestreams reveals how viewers regularly use the comments to urge young girls to perform acts that appear to toe the line of child pornography, then reward those who oblige with TikTok gifts, which can be redeemed for money, or with off-platform payments to Venmo, PayPal or Cash App accounts that users list in their TikTok profiles.
It’s “the digital equivalent of going down the street to a strip club filled with 15-year-olds,” says Leah Plunkett, an assistant dean at Harvard Law School and faculty associate at Harvard’s Berkman Klein Center for Internet & Society, focused on youth and media.
Imagine a local joint putting a bunch of minors on a stage before a live adult audience that is actively giving them money to perform whatever G, PG or PG-13 activities they request, she said. “That is sexual exploitation. But that’s exactly what TikTok is doing here.”
The transactions are happening in a public online forum open to viewers almost anywhere on the planet. Some of the demands are explicit — like asking girls to kiss each other, spread their legs or flash the camera — and some harder to detect, masked with euphemisms.
Commenters say “outfit check” to get a complete look at a girl’s body; “pedicure check” to see their feet; “there’s a spider on your wall” to get girls to turn around and show their rears; and “play rock-paper-scissors” to encourage girls to flirt-fight or wrestle with each other.
Phrases like “put your arms up” or “touch the ceiling” are often directed at girls in crop tops so viewers can see their breasts and stomachs. Many simply coax girls to show their tongues and belly buttons or do handstands and splits.
In return, the girls are showered with virtual gifts, like flowers, hearts, ice cream cones and lollipops, that can be converted to cash.
In the U.S., as here in Ireland, TikTok users are supposed to be at least 18 to send or receive gifts through Live, which can be converted into money. Those under 16 are meant to be blocked from hosting livestreams altogether, according to the company’s rules.
“TikTok has robust policies and measures to help protect the safety and well-being of teens,” a company spokesperson said. Those include setting accounts under age 16 to “private” by default and restricting them from using direct messaging. “We immediately revoke access to features if we find accounts that do not meet our age requirements.”
The company said in an email that it also removes content containing sexual activities, or attempting solicitation. It has a zero tolerance policy for child sexual abuse material.
Some of the accounts that hosted livestreams viewed by Forbes were no longer active several weeks later.
The U.S. government and regulators’ concerns about youth-focused apps like TikTok are intensifying. In his first State of the Union address in March, President Joe Biden called for Washington to “hold social media platforms accountable for the national experiment they’re conducting on our children for profit.”
While TikTok has been spared the worst of the criticism that has rained down on Meta, formerly Facebook, the Chinese-owned video app is beginning to draw more scrutiny for the dangers experts say it poses to minors.
In March, a bipartisan group of state attorneys general launched an investigation into TikTok’s alleged harms to underage users, and according to the Financial Times, the Department of Homeland Security is looking into the platform’s handling of child sexual abuse material.
A TikTok spokesperson said “[we] appreciate that the state attorneys general are focusing on the safety of younger users” and the company is cooperating. But TikTok was not aware of the Homeland Security investigation, she said. The department did not respond to multiple requests for comment from Forbes.
In the wake of a congressional probe into how Meta and Instagram may hurt children and teens — an inquiry prompted by revelations by a Meta whistleblower last fall — lawmakers in October for the first time ever hauled in a TikTok executive to testify about its own policies on underage users.
But more than six months since, Congress has made little progress on its pledge to revamp decades-old children’s online privacy laws.
A hotly contested provision of federal law, Section 230 of the Communications Decency Act, shields internet companies from legal liability for hosting and moderating content that users post on their platforms, and could protect TikTok from liability for much of the activity happening on TikTok Live.
In cases in which any livestreams contain child sexual abuse material or sex trafficking, federal criminal laws would apply regardless of Section 230, says Jeff Kosseff, a cybersecurity law professor at the U.S. Naval Academy and author of “The Twenty-Six Words That Created the Internet,” a book on Section 230.
But much of the questionable activity and gifting on TikTok Live falls short of those things.
Former federal prosecutor Mary Graw Leary said “the fundamental problem of Section 230” is that it enables social media platforms to get away with, and even monetize, many of the same harms that could result in a lawsuit for a brick-and-mortar business — simply because they’re happening online.
“If there’s a digital platform, we somehow treat it differently,” she said, “but the great irony is the harms are not only the same; the harms are worse.”
TikTok, where gifts change hands
The wildly popular social media app best known for its lighthearted videos of dance routines is, beneath the surface, a cash cow — one where money and gifts are often sent by adults to minors.
Top legal, law enforcement and children’s safety experts told Forbes that such activity on livestreams can enable predators to groom targets for online or offline sexual abuse and sextortion, warning of the consequences of unfettered access to girls’ bedrooms and bathrooms, where most of the streaming occurs.
“That’s how it starts,” said John Shehan, a vice president at the National Center for Missing & Exploited Children, which shares tips from tech companies about suspected child sexual exploitation on their platforms with law enforcement.
The unusually intimate connection created by TikTok Live can serve as a way for predators to test boundaries and build rapport with possible targets over time, the ultimate goal being to obtain explicit images and videos or potentially meet for sex.
The situation can “quickly go from images of the feet, whether there’s monetary compensation or just the fact that they’re willing to take those images, that move off-platform into other platforms or other environments where they continue to ask for additional photographs, more sexually suggestive, that then very quickly turn into pornographic images,” he said. “And then before you know it, it’s a sextortion case.”
TikTok users spent more than $2 billion in the app in 2021, up $1 billion from the year before, according to data analytics firm Sensor Tower. TikTok declined to comment on how much of that was spent through TikTok Live specifically.
Those watching the real-time broadcasts can buy TikTok coins they can use to purchase and send digital gifts to the hosts of the livestreams. In turn, those “going live” can link their TikTok and bank accounts to redeem those virtual items for real money.
TikTok is far from the only social media platform to enable payments or virtual gifting between users, or to offer live broadcasting features. But its scale and young userbase set TikTok apart from rivals, according to the Stanford Internet Observatory’s chief technology officer David Thiel, making the platform’s problems with live content, and the monetization of it, even more acute.
Almost half of minors in the U.S. use TikTok at least once a day, Thorn, a nonprofit fighting child sexual abuse, found in a 2020 study of 9- to 17-year-olds.
In the last quarter of 2021, TikTok removed more than 15 million accounts suspected to be younger than 13 (the age required to use its flagship platform) and nearly 86 million videos that broke its rules, according to its most recent enforcement report, out this month.
Almost half of the videos taken down during that period were removed for policy violations related to minor safety.
Snap, Instagram and YouTube — other popular destinations for children and teens — have also been criticized for exposing underage users to dangerous or unhealthy situations.
A YouTube spokesperson said users must be at least 13, or have parent or guardian permission, to use the platform and pointed Forbes to its policies prohibiting livestreams, videos and comments that exploit or endanger minors.
Minors say a majority of the potentially harmful interactions they experience online, including sexual ones, happen on those platforms and TikTok, according to Thorn.
These exchanges often generate material that may never go away, as screenshots and recordings spread off mainstream platforms across the internet.
The Internet Watch Foundation says it has found troves of child sexual abuse imagery from livestreams being redistributed on third party websites.
“The challenge is, it goes all over the world after that,” says Peter Gentala, senior legal counsel at the National Center on Sexual Exploitation.
Predators on livestreams may “abuse in the moment, screen capture, then use that for their own purposes afterwards and make other money for it on the Internet, whether it’s dark web or other places where it’s openly traded.”
Lina Nealon, who leads the Center’s corporate accountability efforts, added that having Venmo, PayPal or Cash App accounts listed in a person’s TikTok profile — as many young women on the app do — can sometimes be indicators of trafficking, suggesting to buyers that this individual, or her sexually explicit materials, may be for sale.
TikTok said if it identifies attempts to drive viewers from Live to other platforms for sexual solicitation purposes, it immediately ends the stream and takes enforcement actions on the host’s account.
Teenage quid pro quo
Ella turned the lights down low and propped the phone up in front of her body — just beneath her chin, so her face was hidden — and picked up a pair of scissors.
Very slowly, as Ariana Grande played and nearly 3,000 people looked on, she began snipping pieces of her white t-shirt. Strip by strip, hole by hole, with every cut revealing more of Ella’s chest and black bra.
Commenters clamored for more and digital gifts poured in — a steady stream of roses, fire, whipped coffee and other cartoonish prizes.
“IF U DO THE BLACK PART IM GONNA SEND TIKTOK LIVE 35.000 TIKTOK COINS (400$),” one viewer wrote, urging her to cut off her bra.
And others: “Pop one.” “Now the shorts.” “More midrift yo.” “Keep going baby.” “At 100k she’ll flash.” “Tell me where to come so I can come give you the attention you are actually looking for.”
Though Ella did not state her age on the Live, commenters guessed that she was between 12 and 18 years old.
Some, emphasizing how young she seemed, joked that she was 2.
Despite TikTok’s intended restrictions around livestreaming and gifting for minors, verifying that users are, in fact, old enough to be using certain apps or features remains an unsolved problem across many mainstream social media platforms, TikTok included.
Madison, a 17-year-old from South Carolina, told Forbes that some of her underage friends had been earning $200 a week off the gifts they’d racked up in their TikTok livestreams.
Although she can’t receive Live gifts yet because she signed up using her correct age, Madison told Forbes that this does not spare her from comments by “older men trying to sexualize girls” or from their offers of money.
“CAN WE SEE BROWN SHIRT TOPLESS,” one commenter asked as Madison and her friend in a brown shirt answered questions on a TikTok Live last month. Madison was also asked to “show a bit please” and “stand over cam.”
Even after her Lives end, some viewers have followed up with Madison through Instagram (which she linked to her TikTok) and offered to pay her to speak with them.
“I’m a minor, and I don’t enjoy 40-year-old men saying that to me,” Madison said. “There are probably kids way younger than me in the same situation.”
Madison explained the various euphemisms that commenters use and added that “it took me a while to put together what it was, but I feel if a girl’s 13, and doesn’t really know what that means, then they’re just like, ‘okay.'”
Some content on Live may not cross the line into violating platform rules or state and federal laws, experts say. But they warn that encouraging and financially rewarding minors through the streams — even for doing things that seem relatively innocuous — can often escalate into more exploitative situations.
“A $10 investment with a child for an offender is a fantastic return because it’s a small amount of money, it gets the kid doing something that they probably normally wouldn’t do, and then that’s when the stick comes out — that’s when the actual sextortion begins,” says Austin Berrier, a special agent with Homeland Security Investigations who specializes in livestreamed cyber crimes and child sexual abuse.
Berrier says parents he speaks to are generally not aware of what’s happening on livestreams and that when the money exchanged takes the form of fun pictures, as is the case on TikTok, it makes what’s really going on even easier to miss or dismiss.
“With the platforms where the monetization is through tokens or flowers or stupid little emojis,” he says, “it doesn’t click in a kid’s head, I think, that they’re actually being paid” and “the parents don’t really stop and think, ‘Okay, someone’s paying my kid to dance. No, they’re just getting little flowers and little hearts.’ It allows people to separate that.”
A TikTok spokesperson said the company “has zero tolerance for child sexual abuse material” and “when we find any attempt to post, obtain or distribute CSAM, we remove content, ban accounts and devices, immediately report to NCMEC, and engage with law enforcement as necessary.”
TikTok said it flagged more than 150,000 potentially violative videos to NCMEC last year.
In the absence of new legislation, TikTok and other tech giants have been self-regulating on content moderation. But the industry has made little progress in the burgeoning livestreaming space.
Platforms have shown an inability to police comments that flow rapidly below often temporary livestreams and then disappear, says Thiel of the Stanford Internet Observatory.
While it’s not hard for AI to detect and prevent hate speech, euphemisms and code words like those often used with girls on TikTok Live make real-time, automated enforcement much more of a challenge, he says.
The live struggle
Despite agreement on both sides of the aisle that more must be done to protect kids and teens online, lawmakers remain deeply divided over how platforms should moderate content and how best to amend or repeal Section 230.
TikTok has taken steps to give users more control over comments they receive. The company announced this month it had started testing a tool that lets users “dislike” comments they feel are inappropriate.
Users can also turn off comments for their livestreams, filter out comments that contain certain words and assign a person they trust to help them manage audience comments.
Ella Brown, a college freshman near Kansas City, Missouri, said she struggles with the deluge of comments on TikTok Live — and she’s not even a minor.
“It’s super weird that you can’t even go Live on TikTok without weirdos trying to get pictures out of women,” Brown, 18, told Forbes, adding that she doubts men going Live are asked constantly for an “outfit check.”
“Definitely stay off Live as a younger girl,” she said.
In just one TikTok Live last month, Brown and her roommate brought more than 500 strangers into their house (virtually), were asked to meet up with men in-person to golf (“old guys” her Dad’s age), and were offered as much as $50 for photos of their feet (“I did not understand at all”).
Brown said she wouldn’t take cash for photos of her feet, but others certainly would.
“$20 is $20,” she said. “That’s coffee a few times a week.”
Here at Children of the Digital Age, we provide world-class education to help protect parents and children against online harm.
You can contact us by email at firstname.lastname@example.org or call us on +353 87 1096087 for more information about what we do.
You can also book one of our professional courses, or book a speaker for your next event.