The Dark Side of TikTok: How Sexual Predators Target Vulnerable Users

by Ysabel Bautista

On social media platforms like Facebook and Twitter, the content users see is largely determined by who they follow. TikTok operates differently, surfacing new content without requiring users to follow specific accounts. TikTok’s algorithm tracks user interests based on engagement signals such as likes, comments, shares, and watch time, and curates similar videos on the “For You” feed (Vision of Humanity, n.d.). While this feature contributes to TikTok’s immense popularity, especially among minors, it also leaves users vulnerable to cyber coercion by predators and traffickers. According to the U.S. Department of Homeland Security, the number of TikTok-related child exploitation investigations surged sevenfold between 2019 and 2021, with numbers across most social media platforms doubling again by 2023 (U.S. Department of Homeland Security, 2024).

With over 1 billion monthly active users, TikTok reaches approximately two-thirds of U.S. teenagers (Rodriguez, 2021). TikTok’s policy allows users aged 13 and older to join, and the company claims to ban accounts belonging to anyone younger. TikTok asks for a birth date when users register an account. In the United States, those who say they are under 13 are restricted to a walled-off mode of the app in which they cannot share personal information or videos but can still view recommended videos on their “For You” page. However, there are concerns that some children under 13 lie to get around the age restrictions and that the platform is not obtaining the required consent from those users’ guardians (Zhong & Frenkel, 2020). As of March 2024, the platform reported removing over 21 million accounts suspected of belonging to users under 13; the sheer scale of these removals suggests that many more underage users evade detection and remain active (TikTok, 2024). These figures underscore the risk of child exploitation, as TikTok hosts one of the largest global communities of minors.

One literature review analyzed the linguistic patterns in social media conversations between survivors and perpetrators of online grooming across different platforms, revealing critical insights into the dynamics of these cybercrimes. The authors found that children face an increased risk of becoming victims of cybercrimes, such as sexual exploitation, as their time spent online rises. Across the 42 papers analyzed, online predators most often employed praise and compliments to build trust, and distinct grooming strategies emerged: “fast” groomers focused on commenting on a survivor’s physical appearance, while “slow” groomers emphasized personality traits (Forni et al., 2020). Another review challenges the common stereotype that only “adult strangers” use social media to sexually exploit children: features such as chat rooms and the ability to “hide” shared images within an account have also been exploited by family members and peers known to the survivor, even when their offline contact is limited (Kloess et al., 2014). TikTok’s distinctive features, including algorithm-driven content recommendations and public commenting, amplify this threat by giving predators, whether strangers or people known to the survivor, greater opportunity to target vulnerable users.

One unique feature of TikTok is TikTok Live, where users can stream live, real-time video. On public livestreams, creators can broadcast their activities and location while receiving real-time interactions through comments and virtual “gifts.” These gifts, including hearts, candy, diamonds, and flowers, can be converted into real currency, which creates opportunities for exploitation. TikTok’s policy states that only users over 18 can send or receive gifts that convert to cash, and that users under 16 are barred from hosting livestreams (TikTok, 2024). However, minors may join or create livestreams through accounts that misrepresent their age and remain undetected by TikTok’s safeguards given the sheer volume of accounts in circulation. Once a live video has started, viewers’ comments may incentivize minors to perform inappropriate or sexual dances in exchange for monetary gifts or social approval. “Commenters say ‘outfit check’ to get a complete look at a girl’s body; ‘pedicure check’ to see their feet; ‘there’s a spider on your wall’ to get girls to turn around and show their rears; and ‘play rock-paper-scissors’ to encourage girls to flirt-fight or wrestle with each other. Phrases like ‘put your arms up’ or ‘touch the ceiling’ are often directed at girls in crop tops so viewers can see their breasts and stomachs. And many simply coax girls to show their tongues and belly buttons or do handstands and splits,” according to a Forbes investigation (Levine, April 2022). Once a user views these types of videos, the app’s algorithm recommends more content featuring minors.

TikTok also allows users to “hide” videos in “Only Me” mode, a privacy feature that poses risks when hidden content is shared with predators. Some predators use these private videos to exchange explicit material, which can be reported only by the account owner. A Forbes investigation found that one technique predators use to evade restrictions is sharing the login credentials of accounts registered to individuals over 18, allowing minors to access these private accounts and post explicit, illegal videos while evading TikTok’s monitoring. In addition, traffickers have developed codewords in profiles and comments that slip past TikTok’s security filters, enabling sexual content to spread undetected. These videos circulate faster than TikTok’s moderation can remove them, so even when content is deleted, new videos quickly take its place (Levine, November 2022).

TikTok has taken steps to combat exploitation, though its measures remain imperfect. In July 2023, the platform began publishing biannual Child Sexual Exploitation and Abuse reports. The most recent report, covering July to December 2023, revealed that 595,709 videos were flagged and 312,517 were removed for violating child protection guidelines. While TikTok does not disclose exact figures for youth exploitation, the report shows that 60% of violating videos were removed before being viewed and 83% within one day of posting. For videos with sexually suggestive content involving minors, 40% were removed before any views and 66% within one day (TikTok, 2024). Despite these efforts, the platform struggles to keep up with the rapid circulation of harmful content and the evolving tactics of predators.

Currently, TikTok faces scrutiny from a bipartisan coalition of state attorneys general, who are investigating the platform's ability to protect minors from cyber coercion and exploitation (McKinnon, 2022). Although TikTok’s efforts are steps in the right direction, they may not be sufficient to fully safeguard vulnerable users, especially given the platform's immense scale and the continually adapting strategies of predators.

References:

Forni, G., Pietronigro, A., Tiwana, N., Gandolfi, C., Del Castillo, G., Mosillo, M., & Pellai, A. (2020, May). Little red riding hood in the social forest. Online grooming as a public health issue: A narrative review. Annali di Igiene, 32(3), 305-318. https://doi.org/10.7416/ai.2020.2353

Kloess, J., Beech, A., & Harkins, L. (2014, April). Online child sexual exploitation: Prevalence, process, and offender characteristics. Trauma, Violence, & Abuse, 15(2), 126-139. https://doi.org/10.1177/1524838013511543

Levine, A. (2022, April 27). How TikTok Live Became ‘A Strip Club Filled With 15-Year-Olds.’ Forbes. https://www.forbes.com/sites/alexandralevine/2022/04/27/how-tiktok-live-became-a-strip-club-filled-with-15-year-olds/?sh=7b46648262d7

Levine, A. (2022, November 11). TikTok is failing to stop the spread of child sexual abuse material, experts say. Forbes. https://www.forbes.com/sites/alexandralevine/2022/11/11/tiktok-private-csam-child-sexual-abuse-material/?sh=1fcc4a543ad9

McKinnon, J. D. (2022, March 4). TikTok faces scrutiny in state attorneys general probe of online harms to children. The Wall Street Journal. https://www.wsj.com/articles/tiktok-faces-scrutiny-in-state-attorneys-general-probe-of-online-harms-to-children-11646251698?mod=article_inline

Rodriguez, S. (2021, September 27). TikTok reaches 1 billion monthly users. CNBC. https://www.cnbc.com/2021/09/27/tiktok-reaches-1-billion-monthly-users.html 

TikTok (2024). Community guidelines enforcement report Q1 2024. https://www.tiktok.com/transparency/en-us/community-guidelines-enforcement-2024-1?tc_version=2024 

U.S. Department of Homeland Security (2024, April 17). DHS launches Know2Protect public awareness campaign to combat online child exploitation. https://www.dhs.gov/news/2024/04/17/dhs-launches-know2protecttm-public-awareness-campaign-combat-online-child

Vision of Humanity (n.d.). Why TikTok isn't really a social media app. https://www.visionofhumanity.org/why-tiktok-isnt-really-a-social-media-app/

Zhong, R., & Frenkel, S. (2020, August 14). A Third of TikTok’s U.S. Users May Be 14 or Under, Raising Safety Questions. The New York Times. https://www.nytimes.com/2020/08/14/technology/tiktok-underage-users-ftc.html

