In Part 1 of our series on artificial intelligence and the need to bring wisdom to bear on whether, when, and how we should employ it, we learned that AI is a super-powerful but non-neutral tool that simulates/mimics human intelligence by analyzing and recombining inconceivably large amounts of data. Its processing and ordering power makes it appear agentic (having the properties of an agent), but as a tool it lacks true consciousness, morality, and creativity. So it’s crucial that we exercise wisdom in using it. Scripture teaches that true wisdom begins with knowing and embracing what God has called good and relentlessly aligning our lives with it. We must employ this framework of evaluating everything in terms of what God has called good (order, embodied humanity, marriage, family, vocation, worship and the three-dimensional grace of God) in order to discern how to interact wisely with AI. In Part 2 we turn specifically to the question of AI in research and content creation.
AI and Research: The Ultimate Easy Button?
You’re familiar with Google. You type in a question or subject and Google gives you millions of websites that, ostensibly, have something to do with your search, and then you have to sort through them. Large Language Models (LLMs) like ChatGPT, Claude, and Gemini are Google on steroids. Type your query into one of these LLMs and, rather than handing you a list of links to sort through, it draws on the enormous body of text it has ingested (its box), along with live web results in some versions, and then condenses and orders its findings for you. You don’t have to search multiple websites or read multiple books to get the information you want. It’s the ultimate “Easy Button.”
Is this good? Is there a place for easy? We drive pickups and SUVs rather than walk or ride horses because it’s easy. We warm leftovers in the microwave because it’s easy. We use the calculator to reconcile our bank statements because it’s easy. We do laundry in the washing machine because it’s easy. Is this new Easy Button different? Why have we embraced those other Easy Buttons? Aren’t we trying to prioritize other aspects of our vocations such as time with our friends and families or time to learn a valuable life skill?[1]
Can there similarly be a place for time-saving AI in research? It is true that the time saved could free us up to prioritize our vocations (time with family, serving our neighbor, volunteering at church, etc.). But could there be a cost? Consider the microwave. Has it saved us time? Absolutely. But has there been a cost? Has it affected our ability to cook? Has it affected our understanding of food and health? A little sober reflection might lead us to conclude that the time saved through microwaves hasn’t necessarily made us healthier or wiser when it comes to food. In many ways it has cheapened food and the fellowship that traditionally happened around it.
When it comes to research,[2] AI makes it exceedingly easy, but recent evidence shows that our new Easy Button also comes at a cost. A June 2025 special report on AI and learning in The Wall Street Journal compared student learning when research was conducted through Google versus an LLM like ChatGPT. Google required students to open multiple websites and to do the work of synthesizing the information themselves; ChatGPT did the synthesizing for them. The Journal cites one researcher who observed, “Students who use AI tools to complete assignments tend to do better on homework—but worse on tests.” He adds, “They’re getting the right answers, but they’re not learning.” He then warns, “If we don’t teach [younger people] how to synthesize and interpret information themselves, we risk deskilling the ability to learn deeply at all.”[3] David Bourgeois, associate dean at the Crowell School of Business at Biola University and director of the university’s Artificial Intelligence Lab, observes, “We need to make sure that there’s friction in our lives. AI has the potential to try to reduce as much friction as possible, so that you’re living a friction-free life. … But in that case, you’re not learning. You’re not engaging. You’re not growing or maturing.”[4]
Wisdom requires us to ask what the goal of research is. Is it to learn or simply to gather information? If it’s to learn, LLMs like ChatGPT might not be the best tool. The Easy Button might actually prevent learning. If the goal is simply to gather information, LLMs make it easy. But this raises another question: in pillaging the internet and reassembling text according to probability, AI often reassembles falsehoods – grammatically correct falsehoods, but falsehoods nonetheless. Not only does it sometimes cite books and articles that the authors it lists never wrote, and attribute quotes to speakers who never uttered them, but AI can also be duped by propaganda. An increasing amount of content on the internet has been generated by AI, and much of it is patently false. In fact, rogue actors are using AI to generate propaganda pieces at alarming rates to promote their cause or their government. Programmers do their best to craft algorithms to catch such propaganda, but it’s unrealistic to think these algorithms catch it all.
WORLD magazine recently reported the following:
A March 2025 NewsGuard report revealed that leading chatbots like ChatGPT, Grok, and Claude affirmed false Russian propaganda narratives 33% of the time. The bots were drawing information from Moscow propaganda network Pravda, which published 3.6 million disinformation articles through various news sites last year.[5]
It may be, then, that what ChatGPT generates is a logical, grammatically correct mix of truth and propaganda. Wisdom teaches us that easy doesn’t necessarily equal true. Truth, like a home-cooked meal, often takes longer to master, but both are worth it. When it comes to research, AI is most helpful for those who bring knowledge to their queries, who have a working knowledge of the subject they’re researching. This is a critical point to appreciate: we must bring competence to AI. Wisdom must precede it. And developing wisdom is often a slow, arduous process. It requires prolonged thought, synthesizing information from diverse perspectives, filtering these through a Biblical worldview, and arriving at a place of discernment and understanding. AI can quickly collect information into one place and expedite the research process. Learning, however, can’t be rushed. Learning is done best in slow mode, with AI providing a quick assist (like using the microwave as a quick assist when preparing a home-cooked meal).[6]
AI and Content Creation
From silent code, a voice is spun,
It writes and paints, outshines the sun.
A spark of thought without a soul—
A mirror, not a mind, made whole.[7]
What you just read was generated by AI. I asked ChatGPT to write a short poem about AI’s amazing ability to generate content. This is what it produced. It’s both stunning and disturbing. Setting aside the temptation to psychoanalyze ChatGPT (silent code outshining the sun, thinking without a soul, a mirror, not a mind, creating – disturbing indeed!), AI has the power to generate content at astonishing speed. This power is affecting us. It’s well known that students are using AI to generate papers and to complete homework.[8] It’s also true that professionals are using it to cheat in various capacities, generating content with AI and claiming it as their own.
This, obviously, is a line Christians should not cross. Having AI do our homework isn’t too different from having our friend do our homework (other than the fact that AI has no soul and no mind!). Either way we aren’t doing it, and turning the paper/homework/report in with our name on it is lying, not to mention a form of stealing because AI has devoured countless unnamed documents/original pieces (many of them paywalled) and offered no credit (or remuneration) to the authors. To evaluate this powerful tool we must remember and use our framework for wise living (what God considers good).
This powerful tool is also affecting pastors because AI can generate sermons. Most pastors will resist this temptation (and rightly so), but AI can also generate Bible studies and discussion questions and outlines based on a pastor’s sermons or prompts. It can generate articles based on the ideas in the pastor’s sermons/prompts. It can voice clone the pastor and produce his sermon in his voice in a different language. It can even generate a podcast with incredibly human-sounding talking bots discussing the ideas communicated in the sermon. Wisdom requires us to wade through each of these questions individually. Where does AI help us prioritize God’s good and where does it undermine it? Where does it serve the proclamation of the Gospel and where does it change it?
When it comes to sermons and papers, AI can check for grammatical errors (AI is now built into Microsoft Word to do precisely this; in fact, Word can now generate content as well). AI can check papers for cultural insensitivities, unhelpful buzzwords, ill-formed thoughts, or logical errors. It can offer suggestions for improving flow or consistency, or generate an outline or a concise summary of a paper. AI can generate policy statements or convention overtures within the parameters provided. In these ways AI can serve as an assistant in content creation.
Once again, to navigate these questions and the potentialities of this powerful tool, we need to bring wisdom to the conversation—wisdom from beyond AI, and even from beyond ourselves—wisdom from God’s Word. We need to return again to what God has called good. Where does using AI in content creation help us prioritize the good of God and where does it undermine it? Where it helps affirm and further the good, Christians might use it well. Where it undercuts it, Christians should refrain and encourage others to do so as well even as we promote the good. It may be the case, though, that we find ourselves in a situation of competing goods. For instance, AI can generate a Bible study for a busy pastor and allow him the time he needs to visit a sick parishioner. It can, ironically, help create a church’s policy on AI, saving several hours of research and tedious meetings, thereby allowing time for in-person ministry efforts. As before, each scenario needs to be considered individually and honestly. Wisdom requires patience and reflection, a deep understanding of creation and our status as creatures of God, a profound understanding of the good of God, and what makes for human flourishing.
AI is a super-tool with the power to make research super easy, but that may end up being to our own detriment. Easy Buttons have their place, but an Easy Button of this magnitude may end up doing us more harm than good, allowing us to spread our knowledge exceptionally wide, but preventing it from penetrating beneath the surface; making us look smart, while keeping us from being wise. And AI’s ability to generate content is presenting us with ethical dilemmas, pushing the limits of integrity and threatening to curtail human creativity, even while it offers the potential to assist us in promoting and upholding the good of God. Christians must bring competence and wisdom to bear on these questions, continually reminding ourselves of what God has called good and evaluating everything accordingly. We must refuse to compromise on questions of integrity, guard against our sinful tendency toward laziness, and have thoughtful conversations on where and how AI might assist us in our various vocations.
In Part 3 of our series on AI and the need for real wisdom, we will consider “AI and the Life/Career Coach” and “AI and Relationships.” Until then, practice wisdom; pursue the good! - Pastor Jonathan and Rebecca Conner
[1] I think we have to acknowledge that some of us have used the time gained through these Easy Buttons to give ourselves to our media devices, scrolling mindlessly and viewing indiscriminately. In those ways, our Easy Buttons have not redeemed our time for meaningful and edifying ends. In fact, in some cases our Easy Buttons have diminished time with family and friends and eliminated the opportunity for meaningful conversations and growing in virtue.
[2] In this brief article we are considering research for the purposes of learning. We must acknowledge that AI is also being used in medical and scientific research, helpfully harnessing its superpower for pattern recognition, disease treatment, ecological and environmental impact projections, and more.
[3] Jackie Snow, “AI Makes Learning Easy. Maybe Too Easy.” The Wall Street Journal, June 30, 2025.
[4] Quoted in Jason Newell, “How Should Christians Prepare for the AI Revolution?” Biola, Spring 2025, 26. I remember reading a report a while back about the failed attempts to grow trees in a biodome. They kept falling over. Researchers discovered why: there was no wind. Wind puts pressure on the trees, causing them to strengthen their roots and bark. Without the opposing force of the wind, the trees simply grew to a certain height and fell. One wonders if, by creating the ultimate Easy Button, we are eliminating the very opposing/challenging forces we need in our lives to grow strong—whether we will end up curtailing our own growth potential.
[5] Elizabeth Russell, “A Tale of Two Chatbots: Generative AI Is Growing Increasingly Powerful. What Does That Mean for Humanity?” WORLD, July 2025, 86.
[6] Maryanne Wolf, author of Reader, Come Home: The Reading Brain in a Digital World, astutely observes, “Only if we continuously work to develop and use our complex analogical and inferential skills will the neural networks underlying them sustain our capacity to be thoughtful, critical analysts of knowledge, rather than passive consumers of information.” True learning takes what many have called “the quiet eye.” (Wolf, Reader, Come Home, 62.)
[7] I asked ChatGPT to generate this on July 4, 2025.