Lawmakers confront TikTok, Snapchat and YouTube about eating disorder content

Representatives from TikTok, Snapchat and YouTube testified before the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security today to discuss how to protect kids online. The hearing follows Facebook whistleblower Frances Haugen’s document leaks to the Wall Street Journal, which, among other revelations, exposed Facebook’s knowledge that Instagram is toxic for teenage girls. According to Facebook’s own research, 32% of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.

But as the Senate tries to hold Facebook accountable for its influence on teen girls, lawmakers understand that the problem doesn’t begin and end with Mark Zuckerberg. Though the companies that testified today each have policies prohibiting content that promotes eating disorders, Senators cited accounts from constituents of teenagers who have nonetheless suffered from illnesses like anorexia and bulimia while using these platforms.

“On YouTube, my office created an account as a teenager. We watched a few videos about extreme dieting and eating disorders. They were easy to find,” Senator Richard Blumenthal (D-CT), the subcommittee chair, said in his opening statement. He said the account was then fed related eating disorder content in its recommendations. “There’s no way out of this rabbit hole.”

Blumenthal’s staff also found troubling content on TikTok. The Wall Street Journal conducted a similar investigation into the platform, creating 31 bot accounts registered as users between the ages of 13 and 15. The publication reported that while content glorifying eating disorders is banned on TikTok, the accounts in its investigation were still served several such videos.

Senator Amy Klobuchar (D-MN) confronted Michael Beckerman, TikTok’s Head of Public Policy for the Americas, asking if TikTok has stopped promoting content that glorifies eating disorders, drugs, and violence to teens.

Beckerman noted that he doesn’t agree with the Wall Street Journal’s methodology for that experiment (the users were bots programmed to search for and linger on certain content), but affirmed that TikTok has made improvements to the way users can control its algorithm and see age-appropriate content.

Beckerman said that content related to drugs violates community guidelines, and that 97% of content violating policies about minor safety is removed proactively. These numbers track with TikTok’s recently released transparency report, which covers content removed from the platform between April and June 2021. Per the report, 97.6% of content violating minor safety policies was removed proactively before being reported by users, and 93.9% of those videos were removed at zero views. In the category of “suicide, self-harm and dangerous acts,” which includes content glorifying eating disorders, 94.2% of videos were removed proactively, and 81.8% were removed before receiving any views.

Senator Klobuchar continued by asking Beckerman whether TikTok had conducted any research into how the platform might push content promoting eating disorders to teens, and whether Beckerman had personally asked for any internal studies on eating disorders before testifying. He said no to both questions, but reaffirmed that TikTok works with outside experts on these issues.

Senator Tammy Baldwin (D-WI) asked each company to outline the steps it is taking to remove “content that promotes unhealthy body image and eating disorders and direct users to supportive resources instead.” In particular, Baldwin’s question focused on how the companies address these issues among younger users.

Beckerman reiterated that TikTok “aggressively” removes content that promotes eating disorders and works with outside organizations to support users who might need help. He may have been referring to TikTok’s recent expansion of its mental health resources. Shortly after Instagram was blasted for its harm to teen girls, TikTok rolled out a brief memo in its Safety Center about the impact of eating disorders, developed in collaboration with the National Eating Disorders Association (NEDA). NEDA has a long track record of collaborating with social media platforms, and this year it worked with Pinterest to prohibit ads promoting weight loss.

Beckerman added that TikTok doesn’t allow ads that target people based on weight loss. The app updated its policies in September 2020 to ban ads for fasting apps and weight loss supplements, and to tighten restrictions on ads that promote a negative body image. That update came shortly after Rolling Stone reported that TikTok was advertising fasting apps to teenage girls. Still, TikTok allows ads for weight management products for users 18 and older.

Snapchat’s Vice President of Global Public Policy Jennifer Stout answered Baldwin’s question by saying that content promoting eating disorders violates community guidelines. Snapchat directs users who search terms like “anorexia” or “eating disorder” to expert resources that might be able to help them.

Per Snap’s ad policies, diet and weight loss ads aren’t banned, but certain content in that realm is. Ads can’t promote weight loss supplements, contain exaggerated or unrealistic claims, or show “before and after” pictures related to weight loss.

Leslie Miller, YouTube’s Vice President of Government Affairs and Public Policy, also said that YouTube prohibits content glorifying eating disorders. YouTube’s ad policy says that it allows ads for weight loss as long as the imagery isn’t disturbing.

But TikTok and YouTube’s representatives both pointed out that some users can find solace on social media, for instance in a video about how someone overcame an eating disorder. This content can be uplifting and help teens know that they’re not alone in what they’re experiencing.

Miller claimed that when users search for eating disorder content, YouTube’s algorithms “raise up” content that might offer positive support to someone who is struggling with an eating disorder. She said more than 90% of content that violates guidelines is spotted through automated technology, though human moderators contribute as well.

Toward the end of the hearing, Senator Blumenthal circled back to the points he made in his opening statement: his office created fake TikTok accounts posing as teenage girls and was quickly able to find content that is supposedly banned from the platform.

“How do you explain to parents why TikTok is inundating their kids with these kinds of videos of suicide, self-injury, and eating disorders?” Senator Blumenthal asked.

“I can’t speak to what the examples were from your staff, but I can assure you that’s not the normal experience that teens or people that use TikTok would get,” Beckerman said.

Though the representatives from TikTok, Snapchat and YouTube pointed to their ad policies and content moderation guidelines as evidence that their companies are moving in the right direction, Senators still seemed skeptical of how cooperative the platforms would be as lawmakers work to pass legislation making social media safer for children.

As the hearing closed, Senator Blumenthal observed that he wouldn’t be taking the day’s testimony at face value. “The time for platitudes and bromides is over,” Blumenthal said.
