TikTok updates Safety Center resources following research on harmful challenges

TikTok has developed a bad reputation for hosting dangerous viral “challenges” on its app, which, at their worst, have led to serious injury or death, as in the case of the blackout challenge that prompted Italian regulators earlier this year to order the social network to remove underage users. More recently, the app has made headlines for challenges encouraging students to hit teachers and destroy school property. With the threat of further regulation looming, TikTok today is sharing the results of its internal research into viral challenges and hoaxes, along with the actions it’s taking in response.

To date, TikTok has often tried to downplay its involvement in viral challenges.

In October, for example, TikTok denied that the “slap a teacher” dare was a TikTok trend. After a child died attempting the blackout challenge, TikTok released a statement saying it had found no evidence of any challenge involving asphyxiation on its platform. It repeated this claim during a recent Senate hearing devoted to minor safety on social platforms. But Sen. Marsha Blackburn (R-TN) told TikTok’s representative that her staff had found the “pass out videos,” along with other disturbing content.

Today, TikTok is releasing the results of its own internal research into harmful challenges and hoaxes.

The company said it launched a global project on this topic a few months ago, which included a survey of more than 10,000 teens, parents and teachers from Argentina, Australia, Brazil, Germany, Italy, Indonesia, Mexico, the U.K., the U.S. and Vietnam. It also commissioned an independent safeguarding agency, Praesidio Safeguarding, to write a report detailing the findings and its recommendations. A panel of 12 leading teen safety experts was additionally asked to review the report and provide input. Finally, TikTok partnered with Dr. Richard Graham, a clinical child psychiatrist specializing in healthy adolescent development, and Dr. Gretchen Brion-Meisels, a behavioral scientist specializing in risk prevention in adolescence, to offer further guidance.

The data the report uncovered is worth examining, as it speaks to why social media can be a breeding ground for harmful content like viral challenges: social platforms are heavily used by young people, and young people have a much larger appetite for risk because of where they are in their psychological development.

As Dr. Graham explained, puberty is an extraordinary period that prepares a child for the transition to adult life. It’s a time of “massive brain development,” he said.

“There is a lot of focus now on understanding why teens do the things they do — because those judgment centers are being revised again in preparation for more complex decision-making and thinking in the future,” he explained.

Dr. Graham said young people’s brains are developing in terms of abstract thinking, the recognition of more complex psychological and emotional states, and more sophisticated consideration of relationships. As all this is underway, their desire to learn about the world around them increases, and this can include a desire, at times, to engage in riskier activities to test themselves or win peer approval. Sometimes these “dangerous” activities are relatively harmless, like watching a scary movie or riding a rollercoaster. But at other times, teens may seek out something they believe will really stretch them, which is what draws them to riskier challenges.

“Sometimes they’re going to…bite off more than they can chew, and have an experience that in a way traumatizes them at least in the short-term, but [the teens’] aspiration is to grow,” he noted.

In addition, viral challenges can appeal to teens’ desire for approval from friends and peers because they generate likes and views. But the way teens assess whether a challenge is safe is flawed: they tend to just watch more videos or ask friends for advice. Parents and teachers, meanwhile, have often been hesitant to discuss challenges and hoaxes for fear of stoking interest in them.

The study found that most teens aren’t participating in the most dangerous challenges. Only 21% of teens globally reported participating in challenges at all, and just 2% had taken part in those considered risky. An even smaller 0.3% had taken part in a challenge they considered “really dangerous.” Most viewed participation in challenges as either neutral (54%) or positive (34%) rather than negative (11%), and 64% said participating had a positive impact on their friendships and relationships.

The research also examined hoax challenges, like Blue Whale or Momo, which propagate the belief that a bad actor is directing children to engage in harmful activities that escalate to self-harm or suicide. Sixty-one percent of teens said they search for more information when they come across a hoax to try to verify whether it’s real, but a lot of confusion tends to remain. Teens suspect people repost hoaxes for likes and views (62% believe this) or because they think the hoax is real (60%). Forty-six percent of teens exposed to hoaxes sought support or advice, a signal that teens would benefit from resources that help them make sense of hoax material.

Though the research indicates there’s much left for social media platforms to do in order to address problems related to user safety, TikTok’s response here is fairly minimal.

The company today announced it’s adding to its Safety Center a new section dedicated to challenges and hoaxes, and it’s revising some of the language used in its warning labels that appear when people search for hoaxes linked to suicide or self-harm, based on the researchers’ suggestions.

Considering the relatively minor nature of these changes — essentially a help doc and some revised text — it’s interesting to note that TikTok livestreamed a presentation about its research to the media for an hour on Monday ahead of today’s announcement, and emailed the full 38-page research report to reporters. In doing so, it seems as if TikTok wants to differentiate itself from Facebook, where damning internal research was kept quiet until whistleblower Frances Haugen came forward with thousands of documents indicating Facebook knew of its problems and failed to take action.

TikTok wants to be viewed as meaningfully engaged in the research process, but ultimately, the changes it’s making don’t solve the problem. At the end of the day, harmful content is a moderation challenge and a problem that stems from the nature of social media itself. Eliminating harmful content would require a system that’s designed from the ground up to not incentivize the shocking or outrageous in exchange for likes and views. That’s not how most social media is built.

TikTok additionally said it will look for sudden increases in violating content, including potentially dangerous behavior, linked to hashtags.

For example, if a hashtag like #FoodChallenge, which is commonly used to share food recipes and cooking inspiration, began to see a spike in content violating TikTok’s policies, the moderation team would be alerted to look for the causes so it could take action.
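TikTok didn’t describe how this detection works under the hood, so what follows is only a minimal sketch of the general idea: track a rolling violation rate per hashtag and alert when it climbs well above the tag’s historical baseline. Every name, threshold, and the simulated data below are illustrative assumptions, not TikTok’s actual system.

```python
import random
from collections import deque

# Hypothetical sketch of hashtag "spike" monitoring. All class names,
# parameters, and thresholds are illustrative assumptions -- TikTok has
# not published details of its actual moderation pipeline.

class HashtagSpikeMonitor:
    def __init__(self, baseline_rate, window_size=1000,
                 spike_multiplier=3.0, min_violations=20):
        # baseline_rate: historical fraction of posts under this hashtag
        # that violated policy (e.g. 0.005 for a benign tag).
        self.baseline_rate = baseline_rate
        self.window = deque(maxlen=window_size)  # rolling window of recent posts
        self.spike_multiplier = spike_multiplier
        self.min_violations = min_violations     # don't alert on tiny samples

    def record_post(self, violates_policy):
        """Record one moderated post; return True if a spike alert fires."""
        self.window.append(violates_policy)
        violations = sum(self.window)
        if violations < self.min_violations:
            return False  # not enough signal yet
        current_rate = violations / len(self.window)
        # Alert when the recent violation rate far exceeds the baseline.
        return current_rate >= self.baseline_rate * self.spike_multiplier


# Simulate a normally benign hashtag whose violation rate jumps
# from 0.5% to 5%, e.g. when a dangerous trend adopts the tag.
monitor = HashtagSpikeMonitor(baseline_rate=0.005)
stream = [random.random() < 0.005 for _ in range(2000)] + \
         [random.random() < 0.05 for _ in range(2000)]
for i, is_violating in enumerate(stream):
    if monitor.record_post(is_violating):
        print(f"Spike alert at post {i}: escalate #FoodChallenge for review")
        break
```

A production system would presumably also normalize for posting volume, time of day, and coordinated evasion, but the core signal is the same: a benign tag’s violation rate suddenly departing from its baseline.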

Or, in other words, TikTok says it will now better moderate content — something users thought the company was doing anyway.

The full research report is below.

Praesidio Report: Exploring Effective Prevention Education Responses to Dangerous Online Challenges (TechCrunch on Scribd)
