The superpopular app can serve up a stream of anxiety and despair to teens. TikTok says it’s making improvements but now faces a flood of lawsuits after multiple deaths.
By Olivia Carville
April 20, 2023 at 12:01 AM EDT. Updated on April 20, 2023 at 6:27 PM EDT
TikTok’s algorithm doesn’t know Chase Nasca is dead.
More than a year after Nasca killed himself at age 16, his account remains active. Scroll through his For You feed, and you see an endless stream of clips about unrequited love, hopelessness, pain and what many posts glorify as the ultimate escape: suicide.
“Take the pain away. Death is a gift,” says one video pushed to the account this February, days before the first anniversary of Nasca’s death. In another, a male voice says, “I’m going to put a shotgun in my mouth and blow the brains out the back of my head,” and a female voice responds: “Cool.”
The feed looked much the same in the days before Nasca died. On Feb. 13, 2022, it surfaced a video of an oncoming train with the caption “went for a quick lil walk to clear my head.” Five days later, Nasca stopped at the Long Island Rail Road tracks that run through the hamlet of Bayport, New York, about half a mile from his house. He leaned his bike against a fence and stepped onto the track, at a blind curve his parents had warned him about since he was old enough to walk. He sent a message to a friend: “I’m sorry. I can’t take it anymore.” A train rounded the bend, and he was gone.
It’s impossible to know why Nasca ended his life. There are often multiple factors leading to suicide, and he left no note. But two weeks after his death, his mother, Michelle, started searching his social media accounts, desperate for answers. When she opened the TikTok app on his iPad, she found a library of more than 3,000 videos her son had bookmarked, liked, saved or tagged as a favorite. She could see the terms he’d searched for: Batman, basketball, weightlifting, motivational speeches. And she could see what the algorithm had brought him: many videos about depression, hopelessness and death.
Michelle and Dean Nasca at home with a photo of Chase. Photographer: Kylie Corwin for Bloomberg Businessweek
Since TikTok exploded into popular culture in 2018, people have been trying to understand the short-form video platform and its impact on kids. Owned by Chinese internet company ByteDance Ltd., the app reached 1 billion downloads faster than any previous social media product. Its success stems from its stickiness. The algorithm underlying its recommendation engine delivers a carousel of riveting user-created content to keep people staring at their screens. TikTok has become so popular—used by 150 million Americans according to the company—that Silicon Valley rivals are trying to mimic it. And politicians are stoking fears that it could be used as a disinformation tool by the Chinese government. In March, the Biden administration threatened to ban the app—something the Trump administration also threatened to do—if ByteDance doesn’t sell its stake.
As the political debate carries on, researchers and child psychologists are watching with increasing alarm. Surveys of teens have revealed a correlation between social media and depression, self-harm and suicide. Centers for Disease Control and Prevention data show nearly 1 in 4 teens said they’d seriously considered killing themselves in 2021, nearly double the level a decade earlier. The American Psychological Association and other authorities pin the blame partly on social media.
At a congressional hearing in March, a representative brought up Nasca’s death, showing TikTok Chief Executive Officer Shou Chew some of the clips the app had sent the boy and asking if Chew would let his own children watch such content. That same month, Nasca’s parents filed a wrongful death lawsuit in New York state court against TikTok, ByteDance and the railroad.
Picture frame from the Nasca family home, showing school photos of Chase. Photographer: Kylie Corwin for Bloomberg Businessweek
TikTok says it can’t comment on pending litigation, but a spokeswoman, Jamie Favazza, says the company is committed to the safety and well-being of its users, especially teens. “Our hearts break for any family that experiences a tragic loss,” she says. “We strive to provide a positive and enriching experience and will continue our significant investment in safeguarding our platform.”
TikTok’s original recommendation algorithm was designed by a team of engineers in China, working for ByteDance. But while the app was made in China, it’s used almost everywhere except China; it can’t even be downloaded in its homeland. TikTok says its algorithm is now maintained by engineers around the world, with teams based in North America, Europe and Asia contributing. But more than a dozen former employees from the company’s trust and safety team who were interviewed by Bloomberg Businessweek say executives and engineers in Beijing still hold the keys.
The trust and safety team designs features and policies to keep TikTok users safe. The team, which is based in the US, Ireland and Singapore, moderates the billions of videos uploaded to the platform every day and is responsible for safety issues such as content that sexualizes minors and viral challenges that encourage kids to take part in dangerous dares. Team members remove posts that violate standards and create tools to help users filter out harmful material. But the former employees, who spoke on condition of anonymity because they signed nondisclosure agreements, say that they had little influence over the algorithm that drives the For You feed and that their requests for information about how it works were often ignored. They insist that they were set up to fail—asked to enhance the safety of an app whose underpinnings they couldn’t comprehend.
TikTok said in 2021 that it was testing ways to prevent its algorithm from sending multiple videos about topics like extreme dieting or sadness. The next year, the company mentioned in a blog post that “we’ve improved the viewing experience so viewers now see fewer videos about these topics at a time,” noting that it was “still iterating on this.” After Businessweek made inquiries, and before Chew was grilled by Congress, the company said in a press release on March 16 that it had made 15 system updates in the past year, including breaking up repetitive themes within a set of recommended videos.
Screen recordings of Nasca’s account from April show that, at least in some cases, these efforts have fallen short. “I don’t understand why they keep sending him this stuff,” Michelle says. Every time she opens the account, she finds a steady stream of videos about depression, breakups, death and suicide.
She still recalls exactly what the first video she saw after gaining access to her son’s account said: “I’m caught in a life I didn’t ask to be in.” She watched Chase’s For You feed for more than an hour and couldn’t understand why there were no happy or funny videos, which is what she thought TikTok was about. She asked one of Chase’s two older brothers why he’d made his account so dark.
“Chase didn’t do that, Mom,” her son replied. “That’s coming from the algorithm.”
In a world of infinite information, algorithms are rules written into software that help sort out what might be meaningful to a user and what might not. TikTok’s algorithm is trained to track every swipe, like, comment, rewatch and follow and to use that information to select content to keep people engaged. Greater engagement, in turn, increases advertising revenue. The company has fine-tuned its recommendation system to such a degree that users sometimes speculate the app is reading their minds.
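The feedback loop described above can be illustrated with a toy sketch. This is purely hypothetical code, not TikTok’s actual system: the signal names, weights and topic labels are all invented for illustration. The point is only to show how weighting engagement signals per topic can make a feed converge on whatever a user lingers over.

```python
from collections import defaultdict

# Invented weights for illustration -- heavier signals imply
# stronger interest in a topic.
SIGNAL_WEIGHTS = {"view": 1.0, "like": 3.0, "comment": 4.0,
                  "rewatch": 5.0, "follow": 8.0}

class ToyFeedRanker:
    """A minimal engagement-driven recommender sketch."""

    def __init__(self):
        # Inferred interest score per topic, updated on every interaction.
        self.interest = defaultdict(float)

    def record(self, topic, signal):
        # Each view, like, comment, rewatch or follow nudges the
        # user's inferred interest in that topic upward.
        self.interest[topic] += SIGNAL_WEIGHTS.get(signal, 0.0)

    def rank(self, candidates):
        # Candidates are (video_id, topic) pairs; the feed favors
        # topics the user has engaged with most -- the feedback loop
        # that keeps surfacing more of the same.
        return sorted(candidates, key=lambda v: self.interest[v[1]],
                      reverse=True)

ranker = ToyFeedRanker()
ranker.record("sad_quotes", "rewatch")   # lingering on one sad video...
ranker.record("sad_quotes", "like")      # ...and liking another
ranker.record("basketball", "view")      # a single passive view
feed = ranker.rank([("v1", "basketball"),
                    ("v2", "sad_quotes"),
                    ("v3", "cooking")])
print([vid for vid, _ in feed])  # → ['v2', 'v1', 'v3']
```

Even in this crude form, two interactions with one topic are enough to push it to the top of the feed, with no notion of whether the content is good for the viewer.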