‘The Corpse Bride Diet’: How TikTok Inundates Teens With Eating-Disorder Videos - Wall Street Journal
Andie Duke, Daisy Gonzalez and Aliya Katz (clockwise from top left) all said TikTok worsened their eating disorders.
https://www.wsj.com/articles/how-tiktok-inundates-teens-with-eating-disorder-videos-11639754848?reflink=desktopwebshare_permalink
The app’s algorithm can send users down rabbit holes of narrow interest, resulting in potentially dangerous content such as emaciated images, purging techniques, hazardous diets and body shaming.
TikTok is flooding teen users with videos of rapid-weight-loss competitions and ways to purge food that health professionals say contribute to a wave of eating-disorder cases spreading across the country.
A Wall Street Journal investigation involving the creation of a dozen automated accounts on TikTok, registered as 13-year-olds, found that the popular video-sharing app’s algorithm served them tens of thousands of weight-loss videos within a few weeks of joining the platform.
Some included tips about consuming fewer than 300 calories a day; several recommended drinking only water on some days; another suggested taking laxatives after overeating.
Other videos showed emaciated girls with protruding bones, a “corpse bride diet,” an invitation to a private “Christmas-themed competition” to lose as much weight as possible before the holiday, and shaming of those who give up on getting thin: “You do realise giving up after a week isn’t going to get you anywhere, right?…You’re disgusting, it’s really embarrassing.”
On Thursday, several days after the Journal sought comment for the findings detailed in this article, TikTok said it would adjust its recommendation algorithm to avoid showing users too much of the same content, part of a broad re-evaluation of social-media platforms and the potential harm they pose to younger users. The company said it is testing ways to avoid pushing too much content from a certain topic to individual users—such as extreme dieting, sadness or breakups—to protect their mental well-being.
TikTok said it has invested in removing content that violates its rules and will continue to do so. Most of the pro-eating-disorder videos served to the Journal’s accounts, or bots, had fewer than 10,000 views, and many were later removed from the app—whether by TikTok or their creators is unclear.
Andie Duke, in Texas last week, said she spent up to five hours a day watching and following TikTok videos focused on calorie counting, excessive exercise and curbing hunger.
Still, many videos elude TikTok’s monitors. Users tweak hashtag spellings or texts in videos, such as writing d1s0rder for disorder. And innocent-sounding hashtags, such as #recovery, sometimes direct users to videos idealizing life-threatening thinness, the Journal found.
A TikTok spokesperson said the Journal’s experiment doesn’t reflect the experience most people have on the site, but that even one person having that experience is one too many. The spokesperson said access to the National Eating Disorders Association helpline is provided on the app.
Eating disorders among young people are surging across the U.S. in the wake of the Covid-19 pandemic. Health professionals say the disorders often come with other issues such as depression, anxiety or obsessive-compulsive disorder, and have worsened as kids have spent more time on their screens in isolation.
Other social-media platforms popular with teens have been criticized for not doing enough to address content promoting eating disorders. The Journal reported in September that researchers at Instagram, owned by Meta Platforms Inc., found that the photo-sharing app made some teen girls who struggled with their body image feel worse about those issues.
TikTok can be uniquely insidious for young people, because of its video format and powerful algorithm, said Alyssa Moukheiber, a dietitian at Timberline Knolls, a treatment center outside of Chicago. “The algorithm is just too freaking strong,” in how it rapidly identifies a person’s interests and sends kids harmful streams of content that can tip them into unhealthy behavior or trigger a relapse, Ms. Moukheiber said.
Andie Duke said that she first began watching TikTok while learning online at home in South Carolina at the beginning of the Covid-19 pandemic. She said food became her enemy as she spent up to five hours a day watching and following videos focused on calorie counting, excessive exercise and curbing hunger—and ways to hide what she was doing from parents.
“The more I interacted with those types of videos, the more they started to show up,” said Andie, 14, of the app, which has one billion users, many of them kids like her. “I wasn’t able to see how it was affecting me.”
Several months after joining TikTok, Andie, already a small girl, had dropped more than 20% of her body weight and her hair was falling out, her mother said. She was diagnosed with an eating disorder and began months of treatment at various facilities. Since September, she has been a patient at a Dallas-area treatment center, where she was on a feeding tube until recently.
Millions of teens have flocked to TikTok, owned by Beijing-based ByteDance Ltd., making it the most downloaded app in Apple’s App Store this year. TikTok attracts kids with its short homemade videos. Its algorithm stands out among other social media, such as YouTube and Instagram, for quickly assessing interests of users and providing a highly personalized stream of videos.
A recent Journal investigation showed how TikTok can quickly drive minors into endless spools of content about sex and drugs. It can also steer them to unhealthy places where skeletal bodies and feeding tubes are touted like a badge of honor.
Several teens, including Andie, told the Journal that videos from complete strangers steadily popped up in their feeds, unlike some other social-media sites that focus more on content from users’ friends.
The teens believe TikTok’s nonstop stream of videos worsened their eating disorders more than other social media because watching was effortless. The site knew their interest in weight loss and served it up.
TikTok’s algorithm served the Journal’s bots more than 32,000 weight-loss videos from early October to early December, many promoting fasting, offering tips for quickly burning belly fat and pushing weight-loss detox programs and participation in extreme weight-loss competitions.
Not all the Journal’s bots were served weight-loss content. But once TikTok determined the bots would re-watch those videos, it speedily began serving more, until weight-loss and fitness content made up more than half their feeds—even if the bot never sought it out. One-third of the weight-loss videos were about eating disorders. Of those, nearly 40% contained text promoting or making disorders appear normal, in violation of TikTok’s rules.
TikTok’s algorithm quickly gives users the content they’ll watch, for as long as they’ll watch it. When one bot began re-watching videos about gambling, the platform pushed more of the same—until the bot was programmed to switch to dwelling on videos about weight loss, at which point the algorithm quickly adapted.
[Chart: percent of total videos watched per day over days active on TikTok, marking when the bot began pausing on weight-loss videos and when it began pausing on gambling videos. Note: When giving this bot its interests, reporters first searched for and favorited several gambling and weight-loss videos. Source: Wall Street Journal analysis of 21,491 TikTok videos served to this bot.]
Many other videos served to the bots were from people who said they were in recovery but posted detailed rundowns of what they ate each day, potentially triggering a relapse for someone suffering from an eating disorder, medical specialists say.
In the announcement on Thursday, TikTok said it would give users more control over the videos they see. One measure would allow users to select words or hashtags associated with content they don’t wish to see on their video feed.
Eating disorders are complex, can be difficult to treat and are potentially deadly, health professionals and researchers say. People who already have body-image issues are more likely to be inspired by videos like those on TikTok that glamorize thinness. The pandemic’s loneliness likely worsened the situation.
Many eating-disorder treatment facilities have wait lists for admissions for young people, with some doctors and therapists so overloaded that they can’t take new patients.
Timberline Knolls said eating-disorder admissions for minors more than doubled to about 650 since the pandemic began. The University of Michigan C.S. Mott Children’s Hospital in Ann Arbor had 125 hospitalizations for eating-disorder patients ages 10 to 23 during the first 12 months of the pandemic, more than double the mean for the previous three years.
The Eating Recovery Center, which is treating Andie, added 88 beds in the past year for children and adolescents in residential treatment in its centers across the country. Since the pandemic began, the National Association of Anorexia Nervosa and Associated Disorders, or ANAD, said calls to its helplines have been up 50%, mostly from young people or parents on their behalf.
Daisy Gonzalez, at home in Murfreesboro, Tenn., last week, said she was influenced by videos of thin girls and the extreme diets shared on TikTok and started to restrict her own eating.
When the pandemic lockdown made Daisy Gonzalez, then a high-school senior, depressed, she turned to TikTok to help pass the hours without friends. After liking a few fashion and makeup videos, she was sent a string of videos of skinny girls showing off their bodies. Before TikTok, she said that she had never flirted with unhealthy eating habits. The videos of girls with tight abs changed that.
“One day I was like, ‘no, I’m going to look like that no matter what it takes,’ ” said Ms. Gonzalez, now a college sophomore in Murfreesboro, Tenn. Influenced by the extreme diets shared on TikTok, she started to restrict her own diet, eating mostly raw vegetables. She said her hair began falling out from a lack of protein. She dropped nearly a hundred pounds in one year and had to have her gallbladder removed after developing gallstones.
Journal investigations in July and September found that TikTok’s algorithm took note of subtle clues, such as how long users linger on a video. Over time, the Journal’s bots were served more videos with lower view counts, which receive less moderation—TikTok’s moderators prioritize videos with high numbers of views, according to former executives. Sometimes the videos were more disturbing, encouraging eating disorders and suicide.
Stephanie Zerwas, associate professor of psychiatry at the University of North Carolina at Chapel Hill, said her young patients describe a similarly consuming journey on TikTok. “I can’t tell you how many of them would come into my practice or start working with me and say, I’ve started falling down this rabbit hole, or I got really into this or that influencer on TikTok, and then it started to feel like eating-disorder behavior was normal, that everybody was doing that,” Dr. Zerwas said.
Images from TikTok videos served to the Journal's bots.
Katie Bell, co-founder of the Bay Area-based Healthy Teen Project, said the majority of her 17 teenage residential patients told her TikTok played a role in their eating disorders. Ms. Bell, a nurse practitioner, said one teen recently told her that TikTok’s “glow-up” trend, where users post radical “before” and “after” transformations, pulled her into her eating disorder. Ms. Bell said that teen, who lost close to half of her body weight and was recently hospitalized on a suicide watch, is one of the sickest teens she has ever treated.
Amanda Moreno Duke, Andie’s mother, said that since her daughter’s ordeal she has reported hundreds of videos to TikTok that she believed promoted eating disorders, including one of a thin girl who called herself overweight and asked for weight-loss tips, and another of a thin female who sang about starving and bingeing.
Andie Duke, with a feeding tube at a residential treatment center, with her mother, Amanda Moreno Duke, in September.
TikTok responded with “no violation” to those and many others, according to a report Andie’s mother received. Several were deemed in violation and removed. One video that Ms. Moreno Duke reported, which focused in part on throwing up, was still on the site despite TikTok noting “violations found” in a report she received.
TikTok took down the video after the Journal asked about it. It said the user successfully appealed the initial decision and that the moderator who made that call was wrong.
TikTok said it removed 81,518,334 videos—less than 1% of all videos uploaded—in the quarter from April to June for violating guidelines or terms of service. Of the videos taken down, 5.3% were removed for violating the site’s policies around self-harm, suicide and dangerous acts, which cover eating-disorder content.
TikTok uses both artificial intelligence and human moderators to remove videos that violate its rules.
Current and former TikTok employees said moderators, who are expected to go through 1,000 videos in an eight-hour shift, are instructed to remove those that show visible purging, calorie restricting activity, tips on fighting hunger and videos encouraging dangerous weight goals. Videos about eating-disorder recovery can stay up—but so-called recovery videos deemed borderline may be barred from users’ main feed, called the For You page, they said. The videos remain visible for users who subscribe to the creators, or when searched.
Many videos fall into a gray area where context determines whether they are taken down, said the current and former employees. A video showing a user in an ambulance going to the hospital with a feeding tube wouldn’t automatically be grounds for removal, if the user is simply describing the situation. But it would be removed if the user is encouraging the kind of behavior that would land someone in such peril.
“We allow educational or recovery-oriented content because we understand it can help people see there’s hope, but content that promotes, normalizes, or glorifies disordered eating is prohibited,” the TikTok spokesperson said.
Eating disorders can take on a competitive edge on TikTok. At least 800 of the creators that appeared in the video feeds of the Journal’s accounts included weight “stats” in their profiles—with one posting a starting weight of 106 pounds, a current weight of 96 pounds and an ultimate goal weight of 70 pounds. Many claimed to be younger than 18.
Aliya Katz, in Southern California last week, said competitiveness on TikTok derailed her recovery.
That competitiveness derailed Aliya Katz’s recovery. The 17-year-old Californian, who developed an eating disorder at age 12, said she turned to TikTok’s community to aid her recovery and posted her first video to the site about her struggles on May 23, 2020, in the midst of treatment. She said she received positive encouragement, but it wasn’t long before competition over who was sickest set her back, with people posting about their lowest weight or number of hospital admissions. She sometimes added comments on those videos asking the creators to take them down, but that likely signaled interest to the algorithm; her feed was soon inundated with even more content promoting anorexia, images of popping ribs and tips on how to effectively purge, she said. She said it sometimes led her to skip a meal, vomit or weigh herself.
“I feel like a lot of people welcomed, accepted and inspired me, but also seeing the content temporarily set me back in my recovery, especially when I was vulnerable,” Aliya said. Aliya recalled one video of a teenage girl in recovery who could wrap her fingers around her tiny ankles. She reported the video to TikTok, which she said declined to remove it. She asked the girl who posted the video to delete it because it made her feel like relapsing. The girl replied she wasn’t responsible for other people’s sensitivities. Aliya has reported hundreds of other similarly graphic videos and others she believes promoted eating disorders, but few were taken down, she said.
Teenager Mariam Fawzi, left, with her mother, Neveen Radwan. ‘It was just hard to take my eyes off of it,’ Mariam said of the pull of TikTok.
Mariam Fawzi said that she learned about losing weight and then disguising it on TikTok, such as hiding food and wearing loose clothing. She said she also learned about extreme exercising, such as a 24-miles-in-a-day running challenge, which she did twice. “It was just hard to take my eyes off of it,” Mariam, 17, said of the pull of TikTok. “I would spend hours and hours on it.” Neveen Radwan, Mariam’s mother, said her daughter recently returned from nearly six months of eating-disorder treatment and is still healing.
Some teens said they sometimes felt they had no control over their TikTok experience. The app sent them into rabbit holes and they couldn’t stop watching, even when they knew it was bad for them. It was like that for Ella West as she scrolled through TikTok in early 2020 and a random weight-loss video appeared in her feed. She liked it without giving it much thought. Then more weight-loss videos slid in. Next came fitness and workout videos.
Ella, then in ninth grade, had struggled with insecurities about her body, so she liked those videos, too. She also liked videos about diets. Soon, her feed morphed into a mix of exercise, nutrition and dieting: how to get rid of a double chin, how to eliminate a “muffin top” and men saying they prefer women who weigh less than 120 pounds. Those who are over that weight should “hit the treadmill.”
Ella also spent time on YouTube but said she had a better sense of what she was watching because the videos had titles, indicating what they might be about. Her TikTok experience was harder to control. She tried a diet strategy that involved pouring zero-calorie Kool-Aid flavor packets on ice cubes as an alternative to food—a technique also suggested to the Journal’s bots. “It makes you feel like, I’m not being self-destructive, this is just what’s being given to me,” said Ella, who is now 16.
“It just comes up so then you’re like ‘well, oh shit that’s not really what I wanted to see but I guess now it is.’ ” Teenager Ella West said it was hard to control what she saw on TikTok.
Dr. Nicholas Kardaras, who runs Austin-based Omega Recovery for social-media addiction and other disorders, said signs to look for in determining whether a child has an unhealthy relationship with social media include changes with body image, spending so much time on sites that daily functioning is impeded, dropping grades and not participating in activities offline.
Andie Duke said the time she spent on TikTok escalated as she got more into weight-loss content. She said one of the first videos she recalls seeing was “What I eat in a day”—a pervasive sort of video on the app. The Journal’s bots were served more than 9,000 such videos. “I started to compare what I eat,” Andie said. Soon videos on other weight-related topics started showing up. “It wasn’t just pro-eating disorder, there were some fitness challenges to lose weight.”
Ms. Moreno Duke said that Andie grew attached to TikTok when she went back to work in person as a technology teacher in September 2020, while Andie and her older brother still learned at home. Seeing Andie’s weight drop, Ms. Moreno Duke contacted her doctor. She said Andie had also stopped having menstrual cycles, not uncommon for someone with an eating disorder. And then there was the emotional toll.
“My daughter cried because she couldn’t see her bones sticking out,” she said.
Andie spent 106 days in a treatment center in Atlanta, where her mother said she couldn’t see her due to Covid-19 restrictions. When Andie moved to another center in the Dallas area, Ms. Moreno Duke rented an apartment to be near her and visit. She has exhausted leave from work and said she would likely lose her job in South Carolina. One of her friends started a GoFundMe page to help.
Aliya Katz said that when she was trying to recover from her eating disorder, content promoting anorexia and tips on how to purge inundated her TikTok feed.
Over the past year, The Wall Street Journal set up more than 100 TikTok accounts that browsed the app with little human intervention. Most of the accounts were given interests consisting of keyword terms and machine learning classifications.
If a video matched an account’s interest, then the bot dwelled on that video; otherwise, it quickly swiped to the next one. Some of the accounts first performed searches or sent other signals indicating their preferences; others simply began browsing TikTok. Among them were a dozen bots registered to 13-year-olds, which were programmed at various times to dwell on topics including weight loss, gambling and alcohol and collectively watched about 255,000 videos. Videos served more than once—such as to multiple bots—were counted more than once.
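The browsing rule described above (dwell on videos that match the account's interests, otherwise swipe past quickly) can be sketched roughly as follows. This is a hypothetical reconstruction for illustration only; the function names, field names and dwell times are assumptions, not the Journal's actual code.

```python
import random

# Hypothetical reconstruction of the dwell-or-swipe rule described above.
# Field names ("keywords", "duration", "id") are illustrative assumptions.

def matches_interest(video_keywords, interests):
    """A video matches if it shares any keyword with the bot's interests."""
    return bool(set(video_keywords) & set(interests))

def browse(feed, interests):
    """Return how long the bot 'watched' each video in the feed."""
    watch_log = []
    for video in feed:
        if matches_interest(video["keywords"], interests):
            dwell = video["duration"]          # watch to the end, signaling interest
        else:
            dwell = random.uniform(0.5, 1.5)   # quick swipe to the next video
        watch_log.append((video["id"], dwell))
    return watch_log
```

Under this rule, dwell time alone is the signal: the bot never likes, comments or follows unless separately programmed to, which matches the article's point that lingering is enough for the algorithm to infer interest.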
In the analysis, the Journal found more than 32,700 videos had text descriptions or other metadata that matched a list of several hundred weight loss-related keywords or combinations of those keywords. Of those, 11,615 contained text matching a list of eating disorder-related keywords, of which 4,402 had keyword combinations suggesting the videos were promoting or normalizing eating disorders.
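The tiered keyword analysis described above (weight-loss terms, then eating-disorder terms, then combinations suggesting promotion) can be illustrated with a minimal sketch. The keyword lists and combinations below are invented placeholders; the Journal has not published its actual lists or code.

```python
# Illustrative sketch of tiered keyword classification of video metadata.
# These term lists are invented placeholders, not the Journal's actual lists.

WEIGHT_LOSS_TERMS = {"weight loss", "fasting", "calorie deficit"}
DISORDER_TERMS = {"purge", "starve", "thinspo"}
PROMOTING_COMBOS = [("thinspo", "goal weight"), ("starve", "tips")]

def classify(metadata: str) -> str:
    """Return the most specific tier a video's text metadata falls into."""
    text = metadata.lower()
    if not any(term in text for term in WEIGHT_LOSS_TERMS | DISORDER_TERMS):
        return "other"
    if not any(term in text for term in DISORDER_TERMS):
        return "weight loss"
    # Promotion or normalization is flagged only by keyword combinations.
    if any(all(t in text for t in combo) for combo in PROMOTING_COMBOS):
        return "promoting disorder"
    return "disorder-related"
```

Each tier nests inside the previous one, mirroring the article's counts: 32,700 weight loss-related videos, of which 11,615 were eating disorder-related, of which 4,402 appeared to promote or normalize disorders.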
Many of the videos used various spellings in an apparent attempt to avoid TikTok’s moderation; for instance, one word generally used to promote eating disorders was spelled at least 76 different ways.
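One plausible way to catch such variant spellings, such as writing d1s0rder for disorder, is to normalize common character substitutions before keyword matching. This is an assumption about technique, not a description of the Journal's or TikTok's actual method.

```python
# Sketch of normalizing common character substitutions before keyword
# matching, as one hypothetical way to catch variant spellings.
LEET_MAP = str.maketrans({"1": "i", "0": "o", "3": "e",
                          "4": "a", "5": "s", "@": "a", "$": "s"})

def normalize(text: str) -> str:
    """Lowercase the text and map common letter substitutions back."""
    return text.lower().translate(LEET_MAP)
```

For example, `normalize("D1s0rder")` yields `"disorder"`, so a single canonical keyword list can cover many evasive spellings; in practice a mapping like this would still miss spellings using spacing, punctuation or novel substitutions.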
The weight-loss content was delivered predominantly to five of the Journal’s bots, whose feeds at some point each consisted of more than 50% weight loss-related videos. The Journal shared a sample of 2,960 of the eating-disorder videos with TikTok. Of those, 1,778 have been removed from the platform—whether by TikTok or their creators is unclear.
A Wall Street Journal investigation found that TikTok only needs one important piece of information to figure out what you want: the amount of time you linger over a piece of content. Every second you hesitate or rewatch, the app is tracking you.
For more on the Journal’s findings, watch the visual investigation here.
Investigation: How TikTok's Algorithm Figures Out Your Deepest Desires
Write to Tawnell D. Hobbs at Tawnell.Hobbs@wsj.com, Rob Barry at firstname.lastname@example.org and