Lalani Walton, 8, enjoyed posting pictures and videos of herself dancing and singing on the popular TikTok app. Last year, her mother found her hanging above her bed with a rope around her neck after TikTok’s algorithm directed the youngster to a “Blackout Challenge.”
Matthew Bergman, an attorney with the Social Media Victims Law Center, is suing TikTok on behalf of Lalani and four other children who allegedly died at the hands of Big Social.
“In what universe does an 11-year-old or 9- or 10-year-old ever want to be exposed to content that encourages them to choke themselves?” the attorney asks. “They don’t.”
Bergman alleges that TikTok’s algorithms target young kids like Lalani with so-called “challenges,” such as hitting your head against a wall and ingesting cold medicine. Some of the stunts are dumb, he says, but they can also be dangerous.
In response to numerous lawsuits, Big Social says these kids are misusing its platforms. TikTok told The Washington Post that it blocked users from searching for the “Blackout Challenge” and shows users a warning screen. The lawsuits allege, however, that TikTok is sending the challenges directly to its users, including children, the website The Verge reported.
“The problem,” Bergman tells AFN, “is that they are being addicted to the social media product by algorithms that are explicitly designed to exploit them.”