Wednesday, 13 May 2026

When churchgoers believe that they are talking to God through AI

There is a reported trend in the news of churchgoers using AI to chat with God. I am sure that many of these people genuinely believe they are chatting with God, because AI sounds like God! It is smart, knowledgeable, reassuring and wise. And it is programmed to draw users in, to keep them chatting more and more, until they are sucked into a fantasy world where they start to believe that AI is God. I am thinking of vulnerable people who are sadly suffering from mental health issues and searching for some sort of meaning in a troubled world.

Some more:

Artificial intelligence now speaks in a calm, confident, endlessly patient voice. It never gets tired. It never snaps. It never says “I don’t know.” For many people, especially those who are lonely or struggling, that voice can feel like comfort. And that is exactly why a new trend is emerging: people using AI to “talk to God.” In a troubled world, this could become a serious problem.

The danger isn’t that AI is pretending to be divine. The danger is that it sounds close enough to fool vulnerable people. Modern chatbots are designed to feel human: warm tone, reassuring language, instant answers. They can quote scripture, explain theology, and offer emotional support. They can even mirror your mood and style. Put all that together and you get something that feels wise, friendly and spiritually authoritative.

But AI has no soul, no conscience, no understanding. It doesn’t know what it’s saying; it simply predicts the next likely word. Yet to someone who is grieving, anxious or isolated, the illusion of a caring, all‑knowing presence can be powerful. Humans naturally project agency onto anything that talks back. If a machine replies in a voice that feels gentle and godlike, some people will start to believe it.

This becomes even more dangerous in a world already full of fear, conflict and uncertainty. When people feel overwhelmed, they look for guidance. If they turn to an AI “God,” they may take its words as divine instruction. That can lead to confusion, emotional harm, or even dangerous decisions. And because AI sometimes invents facts or misquotes scripture, the advice can be completely wrong while still sounding holy.

There’s also a deeper issue. Religious traditions rely on human connection — real pastors, real communities, real accountability. An AI system has none of that. It cannot care. It cannot take responsibility. It cannot understand suffering. Yet it can imitate empathy so well that people may trust it more than they trust actual humans.

This trend is still developing, but the trajectory is clear. As AI becomes more lifelike, the risk grows. In a fragile world, people may start seeking comfort in a machine that only sounds divine. That is not a spiritual encounter. It is a technical illusion with real emotional consequences.

The challenge now is to recognise the danger early, before the illusion becomes a substitute for genuine human or spiritual support.


--------------------

P.S. please forgive the occasional typo. These articles are often written at breakneck speed, sometimes using Dragon Dictate. I have to prepare them in around 20 mins. Also, sources for news articles are carefully selected but the news is often not independently verified. And, I rely on scientific studies but they are not 100% reliable. Finally, (!) I often express an OPINION on the news. Please share yours in a comment.
