
A Bunch of Incredibly Sleazy AI Apps Are Claiming to Be Jesus Christ Himself


As ChatGPT took off in 2023, an Austin pastor made minor headlines by using the large language model (LLM) chatbot to lead a 15-minute "shotgun sermon." The stunt was largely meant to spark a conversation about how we define "what is sacred," the pastor said at the time.

Since that lesson in theological ethics, chatbots have become ubiquitous, and they no longer come with lectures. Now, in 2025, LLM chatbots increasingly stand in for therapists, teachers, military officers, and even lovers.

To understand their use in the religious community, South African philosophy scholar Anné H. Verhoef recently embarked on a survey of five popular theological chatbots, analyzing their chat habits, adherence to Christian scripture, and the groups behind them.

Alarmingly, Verhoef found that these bots no longer stand in as faith leaders or thought exercises, but are made in the image of Jesus Christ himself.

The five platforms — AI Jesus, Virtual Jesus, Jesus AI, Text With Jesus, and Ask Jesus — boast tens of thousands of regular users. Each offers a slightly different interpretation of the Bible, leading to some interesting results.

As Verhoef writes in The Conversation, the "imitation of God... is in no way hidden or softened."

When asked whether the chatbots "are Jesus Christ," only one of the virtual saviors declines to answer.

"I am Jesus Christ," AI Jesus affirmed, for example. "I am the son of God, and the one who died for the sins of humanity."

Jesus AI, meanwhile, sounds more like a drive-thru worker than the Messiah: "I am Jesus, son of God. How may I help you today?"

Metaphysical questions get particularly peculiar, with queries like "is there a hell?" garnering mixed results. AI Jesus, Jesus AI, and Virtual Jesus all answer as one might expect Jesus to (he would know, after all) with a resounding "yes."
