Baby steps aren't doing enough to protect minors

An advocate who works to hold corporations accountable for profiting from sexual exploitation says a popular social media platform needs to work harder to keep harmful content from reaching the minds of young people.

To protect children, TikTok has taken steps such as barring adults from sending direct messages to minors; in fact, no user under the age of 15 is allowed to receive messages through the platform at all.

Lina Nealon (NCOSE)

But according to Lina Nealon of the National Center on Sexual Exploitation (NCOSE), that does not mean they have created a safe environment for children.

"The Wall Street Journal just came out with an investigative report showing that, in fact, hundreds of videos were shown to a safe account showing 13- to 15-year-olds drugs and sex fetishes, as well as pornography and prostitution links," Nealon relays.

She says TikTok, whose own guidelines prohibit sexual content, needs to moderate that content far more aggressively.

"Our own research has shown that they are not moderating content to an effective extent," the NCOSE spokeswoman shares. "Children 13, 14, and 15 years old are seeing sexual content and violent pornography that includes whips, chains, and torture devices. They're clearly not creating a safe environment for children."

Nealon believes TikTok needs to feel pressure from consumers and that Congress needs to develop legislation to hold TikTok and similar social media platforms accountable.