A.I. and the Minister

Several years ago I was sitting down with Dan Winkler. This was around 2017, when Dan was speaking here at Graymere. Dan has been preaching longer than I've been alive. I asked him this question, certain I knew what he would say:

"Dan, do you think it's easier to preach now than it was 30 years ago?"

His near-instant response: "No, no, it was much easier 30 years ago."

I was stunned. I thought for sure he would say it was easier now. That we now have a repository called the internet. That there are so many more resources, and relatively easy ways to access them. That you can get ideas and insights from so many more sources. But he didn't say any of that.

"Back then, we only had a few resources. That's the single biggest reason why I think it's so much more difficult to preach today than in 1975. It's so much more difficult to discern."

I think about this conversation a lot. Dan is a well-respected, good preacher who has endured family tragedy that some of us could scarcely imagine. But almost nothing he has said has stuck with me as long as that answer has.


I remember using Google for the first time. It was 26 years ago, I was a junior in high school, and I think my first search was probably to Google myself. Of course, there were no results. My second search was probably something to do with Star Trek or the new Alabama football schedule.

But I remember even thinking at the time: "This is it?"

I didn't like the way Google presented results. I liked the way Yahoo! and the others at the time presented theirs even less. So I stuck with Google, and it got increasingly frustrating.

Very early on, I learned to ignore those top "sponsored" links and go straight to the ones below for the info I wanted. I always disliked that. I also disliked that I couldn't use natural language with it - it would just pull up a page of someone asking the same question I had.

We all trudged along for nearly 20 years after that, just using Google. They made so much money from Search that they started building an operating system and phones to compete with Apple. They rode the coattails of Search for a long time - until the next thing happened.

ChatGPT launched in late 2022, and within two months it had over 100 million users - at the time, the fastest adoption of any consumer application in history. I was one of those who signed up for an account. I haven't used it exclusively since then, but the A.I. landscape is changing the way we work and communicate every day. And it is changing the teaching and preaching coming out of our pulpits.


Let's get one thing straight first: I think it's very important to understand that what the media and everyone generally calls "A.I." is not artificial intelligence at all. It is a learning algorithm - a large language model - more sophisticated than the search algorithms we've been using for three decades. I sometimes refer to it as "super Google." I truly believe it's the middle step between regular search engines and true artificial intelligence. While these tools have fundamentally changed how we work and how productive we are in just the last few years, I also believe that true artificial intelligence, once achieved, will change everything about our society as a whole. It will not merely be some form of super Google - it will be a vast, learning intelligence that we might even need to fear.

As I write this in mid-2025, there are already scientists and researchers sounding the alarm about exactly that. It's quite unnerving.

All that being said, how are we as ministers and preachers and teachers of the Bible supposed to use these amazing tools?

I cannot speak for preachers, as I only preach once or twice a year. But I can imagine the temptation to use this technology the wrong way. Preachers spend hours producing two or three sermons that are supposed to be truthful, factual, and moving - every single week. Every time I preach twice on a Sunday, it makes me more thankful for what my preacher does - but it also reminds me that I would never personally want that job.

As a Bible teacher who teaches three or more times a week, I can tell you that the temptation is certainly there to use these tools to manufacture a class for me, without my having to put in any work beyond prompting the A.I. to do it.

Now don't read that the wrong way. I think ministers of all types should be using A.I. - in the right ways - to assist in class and sermon prep.

For example, 25 years ago we had the same discussion about Google, and about ministers simply finding or copying their sermons from something they found on a search engine - taking what someone else had written on a website as fact without doing the necessary checks and research yourself. The risk is the same with ChatGPT and similar tools - copying an A.I.'s "research" borders on plagiarism. Even if you're okay with that, you now have to address the slippery slope of this question: what is the true, deep theological thinking behind what you're saying in the classroom or the pulpit?

Hopefully every single one of us would scrutinize anything we find on the internet. Hopefully every one of us would test whether things are true according to the Word of God, the one and only measuring stick - but that's just not happening in our culture.

I keep thinking about my conversation with Dan, and I often ask the question now: are we better off using these tools, or not using them at all for fear of teaching the wrong things?

The Bible says plenty about those who teach falsely. In fact, it was one of the central concerns of John's epistles.

Any Generative Pre-trained Transformer (GPT) has been trained on a huge swath of the internet (and many can now search it live); it aggregates its response into something coherent, and then it authoritatively presents that information to you as fact. Most of it is accurate, right?

GPT-4o, one of OpenAI's recent models, scores around 88.7% on the Massive Multitask Language Understanding (MMLU) benchmark. Does that mean that roughly 11 out of every 100 responses are wrong?

And that was based on just one benchmark. Studies have shown that on complicated questions - like those about computer programming - the accuracy rate can drop below 50%.

Theology can be decently complicated sometimes.

Sure, you can steer these models to a certain extent - what most of us loosely call "training" them. Whenever I'm doing some reading for a class, I always try to preface my prompts with, "You are a theologian. Answer this question."
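In practice, the more context the preface carries, the better the answer. A prompt along the lines of: "You are a theologian. Using only the text of Scripture, and citing book, chapter, and verse for everything you say, explain what the New Testament teaches about the role of elders in the local church." That exact wording is only an illustration, not a magic formula - but the specificity matters.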

But what about on more... shall we say... contentious issues?

ChatGPT, at least, delivers - especially when you preface it with the right prompts. On nearly every "contentious" issue I asked it about, it gave concise and accurate answers while also citing sources (a request you can add to any query, and you should).

So if I can train it properly, and it's nearly 100% accurate (it's not), and it can write sermons for me, then why can't I let it do that?

Take this example: NotebookLM. NotebookLM is a utility app developed by Google that runs off the back end of Gemini (their version of ChatGPT). It allows you to upload or link to sources, so you can essentially build your own GPT grounded in only the sources you give it. So I can give it my entire lesson series on Hebrews (18 documents and over 54,000 words) and tell it to write a sermon, in my style, on just one aspect of the book of Hebrews - say, the preeminence of Christ as our High Priest. Based only upon the data I have given it (not using any outside sources), it will then craft me a 2,800-word sermon.
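To give a concrete (and purely illustrative) picture, the instruction I give it reads something like: "Using only the eighteen lesson documents I have uploaded, write a sermon in my voice on the preeminence of Christ as our High Priest, and note which lesson each major point draws from." The sourcing rule, the topic, and the voice are all mine; the tool simply works within them.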

Is that wrong? I know what I taught before is correct. I know the content of the classes I fed into it was well-researched - personally, by me - and even taught in a Bible class over the span of 18 weeks.

My one piece of advice for using these tools would be the same advice I would've given someone 25 years ago using Google to write a sermon: take everything with a grain of salt, and absolutely don't plagiarize.

I've read that many professors at our colleges are going back to handwritten or oral exams because they cannot trust that students won't use ChatGPT in some form to take the test, write the report, or give their own thought-out answer to a critical-thinking question.

Make no mistake - we are not dealing with computer coding, medical diagnoses, or just putting together a term paper here. We are dealing with people's souls.

And these souls are dependent on how the Gospel is preached. How it is taught in our Bible classes, both to our young people and to our adults.

With something as old as the Bible, there is going to be a lot of nuance and careful explaining that needs to be done on many Biblical subjects. That is why accurate preaching is so important, and why our Bible classes are so integral to spiritual development.

Can we use these tools? As with Google 25 years ago, it would probably be foolish for us not to use all the tools we have at our disposal to bring people to Christ. But if we bring people to Christ the wrong way, if we teach the wrong doctrine, if we take something a machine wrote verbatim without checking it, then we are guilty of every repercussion that comes after that.

It is up to us, as it always has been and always will be, to "test all things, hold fast to what is good, and reject every kind of evil" (1 Thess 5:21-22).

ChatGPT wrote 0% of this article, by the way.