Sex, Drugs and TikTok: How to Keep the Youth Safe on Their Favorite Platform

It needs a mature response over moral panic.
Oct 15, 2021
Photo/s: Dado Ruvic, Reuters

By Isabelle Volpe and Clare Southerton
The Conversation via Reuters Connect

You may have read recently that TikTok allegedly “serves up” sex and drug videos to minors. Media reports have described the video-sharing platform, which is designed predominantly for young people, as an “addiction machine” that promotes harmful content.

In an investigation, reporters at the Wall Street Journal created 31 bot accounts on TikTok, each programmed to interact only with particular themes of content. Many of the bots were registered as being aged 13-15, including one programmed with an interest in “drugs and drug use”, which was ultimately shown 569 videos related to drugs.

The investigation sought to better understand how the app’s algorithm selects videos for users. The workings of these kinds of algorithms are an industry secret, but much has been made of the app’s apparent ability to “know” what users want to see, and whether content creators can game the system and garner more views.

The article concluded that TikTok’s algorithm actively “serves up” drug content to minors, who “may lack the capability to stop watching and don’t have supportive adults around them” to help moderate their opinions. But is this a reasonable conclusion, and if so, should parents be concerned about drug content on TikTok?

The Wall Street Journal article doesn’t provide enough detail to allow us to evaluate the rigour of its methods and the validity of its conclusions. However, there are reasons to suspect the methodology is inherently flawed.

One problem is that a bot designed to engage only with content related to a specific set of interests is not a very realistic model of a typical social media user. Real humans do not have a set list of interests outside which they never stray – they have a diverse range of interests and curiosities.

Anxiety and moral panic around technologies popular with young people are nothing new. Fears about the harmful effects of social media have been around for at least a quarter of a century, from the advent of MySpace in the early 2000s back to even earlier online platforms in the 1990s.

In turn, these fears about harms to children help fuel calls for greater surveillance and censorship. Several countries such as India, Pakistan and the United States have temporarily banned TikTok or considered doing so. Parents have been encouraged to stop their children using it, and the app has been urged to censor drug content entirely.

TikTok offers the perfect recipe for a technopanic. The mysterious workings of its algorithm, and the unprompted way videos are served to users in their “For You” feed, have driven fears about the circulation of improper content that facilitates sexual grooming or disordered eating. This is exacerbated by the fact that the platform is explicitly designed to attract a young user base.

Young people, despite being “digital natives” and highly adept at using technology, are often seen as lacking impulse control and being vulnerable to dangerous influences. Yet their voices are largely left out of these conversations. Despite their expertise in navigating these platforms, young people are spoken about, rather than spoken to.

Instead of assuming young people are inherently deficient in their judgement, taking their experiences and expertise seriously could uncover new ways of looking at old problems. One of this article’s authors (Isabelle Volpe) is investigating this in her ongoing PhD research.

Another problem with the framing of these moral concerns is that not all drug-related content on TikTok necessarily condones drug use. TikTok provides a forum for all sorts of content creators, some of whom openly use drugs and some of whom talk about drug use, its potential harms and risks.

While traditional media coverage and drug education typically focus on criminality, addiction or distress, these framings often do not resonate with young people, which can lead to the intended messages not being taken seriously. By comparison, social media platforms give exposure to a wider range of perspectives on drug use.

Some content creators talk about recovery from addiction (including health professionals describing their work, and people giving first-hand accounts), while some give advice aimed at reducing potential harms to people who take drugs.

It’s also undeniably true that some creators give accounts of the pleasures of recreational drug use. Drug use is complex, and so is the picture painted by drug-related content on TikTok.

It’s understandable parents might view TikTok as a dangerous place. But it’s important to remember any social media platform can feature drug-related content. Parents and carers can help young people navigate these spaces by having open and honest conversations about drugs, so young people feel safe and confident to raise any questions or worries about anything they see online.

TikTok also offers an opportunity to deliver evidence-based health information to people who use drugs or are considering doing so. These audiences are often considered “hard to reach”, partly because of the social stigma of seeking out information about drugs.

An algorithm that can identify people who may benefit from evidence-based information about drugs, and deliver it to them without them explicitly asking for it, could be a powerful tool for public health. Health professionals are already using TikTok as a new and engaging way to share public health messaging, and the platform has introduced “fact-checking” content warnings to combat COVID-related misinformation.

A similar approach could be applied to drug-related content, perhaps directing users to reliable health information. There is no quick fix for the complex problem of misinformation; we have to use a range of strategies to offer reliable information to those who need it.

Banning all drug content from TikTok might be a case of throwing the baby out with the bathwater, by also removing content focused on health information and harm reduction. If we are serious about protecting young people online, we need to be driven by evidence, not fear.

Isabelle Volpe is a PhD Candidate, Drug Policy Modelling Program, University of New South Wales. Clare Southerton is a Postdoctoral Fellow, Vitalities Lab, UNSW.
