The Hidden Privacy Crisis Lurking in Your Church’s AI Usage
Are you inadvertently handing over the keys to your private organizational data?
Are you handing over the keys to your organization’s private data? Too many leaders don’t appreciate the hidden dangers lurking behind casual use of AI.
I recently had a conversation with my daughter that was followed by a related advert showing up on her Facebook account. It was kind of creepy, but it got me thinking about privacy, especially in the context of my own use of AI.
AI has the potential to be far creepier than a computer simply listening in on your conversations. Large Language Models (LLMs) like ChatGPT are prediction machines: they are built to predict the next word in a sequence, and, given enough of what you have written, they become remarkably good at predicting where your thinking will go next.
Think of a public LLM as a public library where any book you donate may be photocopied, added to the permanent collection, and even quoted to other visitors. Once you hit ‘enter,’ you have effectively lost control of that information.
Today’s reality is that AI can seem to know you better than you know yourself. It can anticipate your next moves and your motives, based both on what it knows about people in general and on what you have told it in the past. And with many free, public tools, the prompts you feed in may be retained and used to train future versions of the model unless you explicitly opt out.
This has massive privacy implications!
The Privacy Nightmare of Data Harvesting
Confidentiality should be non-negotiable. Whether it is a congregant’s private medical information, sensitive prayer requests, member contact information, or notes taken during a counseling session, none of this information should be made available to a public LLM.
AI can help you manage your workload, organize your thinking, and brainstorm new ideas, but it can also breach the privacy of your members and your organization.
I’ve written previously about AI Companions and the privacy issues they can create.
There are several other ways that you, or someone in your church or organization, could be inadvertently exposing private data to the world.
Summarizing Meetings: AI is now being used to summarize meetings and conversations. It becomes an invisible secretary. You aren’t just getting a summary; you are creating a permanent, searchable record of sensitive church business on a platform you do not own. If the topic is confidential, the AI shouldn’t be invited.
Brainstorming Sermon Illustrations: Pastors and church leaders operate on trust. Their congregants share confidential information with them on the assumption that it will stay confidential. Feeding counseling notes into an LLM is a massive breach of that trust. Never feed personally identifiable information into a prompt just to get a sermon illustration back from the AI.
Raw, Unfiltered Notes: Never upload a raw transcript of a meeting or your unfiltered personal thoughts. Strip out anything that could directly identify individuals in your congregation, or you personally, before the text goes anywhere near a public tool; a simple redaction pass, like the sketch below, is a sensible first step.
Financial / Donor Data: Avoid exposing sensitive, confidential donor information to the cloud and to Big Tech AI platforms. Once that information sits on servers you do not control, you have lost ultimate control over it. Pasting a donor spreadsheet into a free AI to ‘analyze giving trends’ is a catastrophic breach of fiduciary duty: you are taking the private, sacrificial acts of your members and feeding them into a commercial algorithm.
The Right to be Forgotten: In many jurisdictions, people have a legal right to have their data deleted once they have left an organization. If the information you collected isn’t fully under your control, honoring that request can be difficult or impossible.
Staff AI Use Policy: For the organization’s private data to be truly secure, everyone in the organization must follow best practices. If you haven’t done so already, develop an AI Use Policy appropriate for your organization.
You should have no expectation of privacy when using public, free tools.
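If you or your team do need to run notes or transcripts through an AI tool, build in a redaction step first. Below is a minimal, hypothetical Python sketch of that idea: it masks email addresses, phone numbers, and a list of names you supply before anything gets pasted into a prompt. The patterns and names are illustrative only; it is a picture of the principle, not a privacy guarantee, and it is no substitute for reading the text yourself before it leaves your hands.

```python
# Minimal sketch: scrub obvious identifiers from notes before they go
# anywhere near a public AI tool. Patterns and names are examples only.
import re

# Names you want masked (hypothetical examples -- supply your own list).
KNOWN_NAMES = ["Jane Smith", "John Doe"]

EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b")
PHONE = re.compile(r"\b(?:\+?1[\s.-]?)?\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}\b")

def scrub(text: str) -> str:
    """Replace emails, phone numbers, and listed names with placeholders."""
    text = EMAIL.sub("[EMAIL]", text)
    text = PHONE.sub("[PHONE]", text)
    for name in KNOWN_NAMES:
        text = re.sub(re.escape(name), "[NAME]", text, flags=re.IGNORECASE)
    return text

if __name__ == "__main__":
    note = "Jane Smith (jane@example.org, 555-123-4567) asked for prayer about surgery."
    print(scrub(note))
    # -> "[NAME] ([EMAIL], [PHONE]) asked for prayer about surgery."
```

Even a simple step like this builds the habit of asking, before you ever hit ‘enter’: what in this text could identify someone?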
What is the Solution to the AI Privacy Risk?
Avoiding AI altogether is unrealistic. Despite the privacy risks, there are plenty of use cases where AI remains an asset to the church, as long as the risks are understood and the tools are used appropriately.
Education is important and ongoing. Be sure to keep your leaders informed about the risks. Provide periodic training in the use of technology. Don’t just assume that leaders know and understand the risks of improper use of AI.
One of my goals for my Substack is to help you prepare yourself and your people to use AI responsibly and ethically. Used properly, AI is an asset to any organization. It multiplies your efforts and can help you clarify your communication and make it more impactful.
Joseph Duchesne writes to help Christian Leaders navigate the ethical challenges that artificial intelligence poses to the Church today. He is the author of a couple of books, The Last Crisis and Discover the One, both available on Amazon.