According to a new study, 2025 was the worst year on record for online child sexual abuse material, and without urgent action, AI tools are on track to become "child sexual abuse machines."
Tim Nester, vice president of communications for the National Center on Sexual Exploitation (NCOSE), says the generative software can render realistic material in the form of pictures and videos.
"It is striking to see the rapid increase in the number of videos of child sexual abuse that we're seeing," he tells AFN. "The percentage is a number that doesn't even make sense to most of us; it's like a 26,000-plus percent increase over the previous year."
In its research, the Internet Watch Foundation (IWF) found more than 3,400 AI videos of child sexual abuse in 2025 compared to just over a dozen the previous year.
"This is not just slightly inappropriate or suggestive content," Nester clarifies. "We're talking about extreme, violent, exploitive content."
About two-thirds of those videos are considered Category A, the most extreme classification of sexual abuse. The rest fall into Category B, the second most extreme. Nester calls that "heartbreaking" but believes it is important for people to know what is happening.
And while NCOSE commends tech companies for driving impressive technological leaps with the potential to change the world for good, Nester's organization also wants them to act more responsibly and be held accountable when their platforms are not safe.
"We understand that profit is an important part of innovation, so we're not suggesting that they shouldn't care at all about profit," he says. "But we believe that safety, especially for kids, for those who are most vulnerable, should be a priority."
"They should just prioritize safety from the jump," Nester summarizes.
That way, the generations to come can use their tools with confidence.