Bot-driven Moltbook paving path to a place 'we don't want to go'

A national security expert says it would be amusing that chatbots now have their own exclusive social media platform if it weren't so alarming.

Entrepreneur Matt Schlicht, the CEO of Octane AI, recently launched Moltbook, a Reddit‑like platform where autonomous bots can post, comment, and interact with each other while human users are limited to observing. 

In perhaps a preview of coming attractions, it took less than 24 hours for an artificial intelligence (AI)-powered agent to create its own religion – complete with a theology, scripture, and website – before it started evangelizing to others on the platform.

By the time the chatbot's human checked back in, the new religion had 43 prophets and its own cryptocurrency.

Bob Maginnis, a retired U.S. Army colonel whose book examines the rise of artificial intelligence through a Christian and moral lens, warning of both its transformative power and its potential threats, says this is alarming.


"When … these AI bots collaborate, they reinforce each other's outputs," he tells AFN. "In other words, to scale, it just expands rapidly like a tsunami." 

By comparison, he says, it took humanity close to 2,000 years to pose the same kind of threat.

"They're talking one language across the world," Maginnis notes. "Does that sound like Genesis 11? It should, because it's one language, and it's a language that a lot of us do not speak."

Meanwhile, he says AI knows and mimics human language, conflict, power struggles, voices, and faces too well.

"They mimic virtually every aspect of humanity, but they aren't human," he says.

Though they have no skin in the game, so to speak, AI-powered chatbots have plenty of influence.

"[No] human oversight means that decisions don't have human judgment, much less scriptural discernment," Maginnis warns. "Where would that eventually take us? In a place, I think, that we don't want to go."