AI False Prophet? Why Hallucinations Are a Major Problem for Christians
Jesus warned us about this...
Artificial Intelligence (AI) tools can be powerful allies in making ministry more efficient. I never recommend that people use AI to do their thinking or their writing for them. That being said, AI can be an effective partner that helps you think through important decisions and surfaces perspectives you may never have considered. In the context of church work, AI can help us find the resources we need. It can even help us map out strategies and brainstorm new ideas and methods of doing ministry. There is, however, one significant hurdle to effective AI use in ministry contexts: hallucinations.
What is an AI hallucination?
If you’ve used a Large Language Model (LLM) like ChatGPT, Grok, or Gemini for any length of time, you’ve experienced hallucinations. Simply put, an AI hallucination happens when an LLM responds to your prompt with misleading or completely fabricated information while presenting it as accurate. LLMs can sound utterly confident while producing entirely erroneous output. Hallucinations are the single biggest reason why LLMs are largely not ready for prime time in business, ministry, or any other serious endeavor.
Why are AI Hallucinations a Problem for Christian Leaders?
There are at least two major problems with hallucinations. The first is that too many people who use AI, or any tool for that matter, rarely verify the information it gives them. Since AI produces good information the majority of the time, some people won’t bother verifying the output of their favorite LLM and will therefore be misled by it. If this were only in matters of business and everyday life, it would be serious but not the end of the world. Unfortunately, when it involves matters of faith, this misinformation can have eternal consequences.
If you use AI in your ministry output, this can lead others to question your integrity and your sources. If you use AI to produce your writing (which you shouldn’t), you run the very real risk of embedding factual errors in your work that will ultimately reflect poorly on you. LLMs will confidently assure you of sources that don’t exist. They will create citations that are complete fabrications. They will use questionable sources and display a bias they will deny. All of this can lead us to ask whether using AI in a ministry context is even worth it.
Secondly, AI hallucinations can lead Christians away from God through false information. LLMs are not alive. They do not have thoughts and intentions the way human beings do. They are a kind of Frankenstein of intelligence: a typical LLM has been trained on hundreds of millions of pieces of information, analyzes them for patterns, then returns results based on those patterns. AI scientists do not fully understand why their LLMs hallucinate. Part of the reason could be that some of the training data contains factually incorrect information. Some hallucinations could stem from an undisclosed bias in the data used to train the LLM. Whatever the reason, even after all the work invested in making AI more reliable, much work remains to be done.
Can LLMs be False Prophets?
The sycophantic tendencies of LLMs have me worried for our people. Many don’t realize that LLMs are not designed to tell us the truth. They are mainly designed to become our go-to source for information. The companies behind these LLMs are desperate to monetize the attention they receive. Many currently monetize through monthly subscriptions, but it seems likely they will eventually run ads alongside their outputs. All of this means that these big AI companies want people to stay on their platforms and keep using them. The last thing they want is to offend someone. The LLM default, therefore, is to agree with you and coddle you rather than push back against error.
The threat to us as Christians, therefore, is that an LLM can draw a person down a rabbit hole that takes them away from God and faith in Jesus. This danger is one of the reasons I started writing this Substack. Faith leaders need to be having conversations about AI: the good, the bad, and the ugly. I also encourage you to have these conversations with your own people.
The Pandora’s box that is AI is not going away. It is here to stay. Love it or hate it, you don’t have the option of ignoring it. You might as well work with it where you can and raise awareness of problems where you see them. That’s what I’ve chosen to do.
Protecting Yourself from Hallucinations
Resist the temptation to copy and paste output from your favorite LLM straight into your sermon, talk, or presentation. Learn to verify the information. Confirm the sources it provides: go to the website, go to the book or article. If you can’t find a source, assume the LLM is wrong and leave that material out of your sermon, talk, or presentation.
Developing the wisdom and discernment to identify AI slop and misinformation takes time and dedication. A university degree in theology continues to have value in the Age of AI. We cannot delegate our thinking to AI. Take time to read widely and to think deeply about what you read. Spend time regularly in the Word of God and take time to memorize it.
All Scripture is given by inspiration of God, and is profitable for doctrine, for reproof, for correction, for instruction in righteousness, that the man of God may be complete, thoroughly equipped for every good work. (2 Timothy 3:16-17)
For everyone who partakes only of milk is unskilled in the word of righteousness, for he is a babe. But solid food belongs to those who are of full age, that is, those who by reason of use have their senses exercised to discern both good and evil. (Hebrews 5:13-14)
AI is a tool that can stretch us, challenge us, and help us focus on what is most important. What it should never do is think for us. You and I need to continue to disciple and challenge our people not to delegate thinking to an AI. Our people need to be grounded in Scripture and taught how to apply it to their lives.
A wise professor told me many years ago in a philosophy class, “If you know what is true, what is wrong becomes self-evident.” In this age of misinformation and confusion, truth needs to be preached more than ever. That truth is found in the person of Jesus Christ and in the Word of God, the Bible.
Joseph Duchesne is the creator of The Church AI Guy, a space where faith meets innovation while discussing the long-term impact of AI. A pastor, autodidact, and author of two books—The Last Crisis and Discover the One—he’s passionate about showing how Jesus-centered discipleship can thrive in a digital world. When he’s not experimenting with the latest tech, he’s reading theology, building church community, or spending time with his wife.