Zoe
Church & Technology · March 2026 · 9 min read

What Does It Look Like to Equip the Kingdom to Use AI Well?

The church can't afford to sit this one out. Here's a practical framework for how churches and ministries should evaluate, adopt, and shape AI tools faithfully.

Here's something that shouldn't be surprising but still is: the Southern Baptist Convention became the first major denomination in America to pass a formal statement on AI ethics — in June 2023. The Vatican had already been at the table with Microsoft and IBM since February 2020, co-signing something called the Rome Call for AI Ethics. The National Association of Evangelicals signed on too.

Meanwhile, most local churches are still deciding whether to have a "digital ministry" strategy.

I don't say that to be harsh. I say it because the gap between where the conversation is happening and where most of our congregations are is enormous — and it matters. A lot.


The Church Has Always Shaped Culture or Been Shaped by It

The printing press. The radio. Television. The internet. Every major information technology in history triggered the same two responses from the church: a prophetic voice that helped communities engage wisely, and a fearful silence that left people to figure it out alone.

We don't get to make AI go away by ignoring it. It's already in your people's pockets, reshaping how they learn, how they relate, how they form habits of mind. According to the Gospel Coalition's African network, one of the top uses of generative AI in 2025 was therapy and companionship. People are turning to AI chatbots because they're lonely, because the church hasn't created enough on-ramps for real connection, and because AI is available at 2am when anxiety hits hardest.

That's not a technology problem. That's a discipleship gap AI is rushing to fill.

So let me be direct: the church has a responsibility to be at the table when AI is being built, evaluated, and deployed — especially in faith contexts. And that responsibility starts with equipping our own people well.


The "Just Ignore It" Option Is Already Gone

I've talked with pastors who think staying silent on AI is somehow neutral or safe. I want to push back on that.

Silence is a position. When you say nothing, your congregation fills the vacuum with whatever YouTube, Reddit, and their coworkers are saying. And right now, those voices aren't exactly steeped in theology of the image of God.

Barna Group research found that most Christians who have concerns about AI still fall into the "don't know" category — meaning they're not hostile to engagement, they just haven't been given tools for discernment. That's a pastoral opening, not a reason to wait.

The leaders who serve their people best in the next decade won't be the ones who warn loudest against technology. They'll be the ones who understand it well enough to help their congregations use it wisely. There's a difference between prophetic caution and informed engagement.

The Vatican's Antiqua et Nova document, released in January 2025, put it plainly: technological progress is part of God's plan for creation, but people must take responsibility for how it's used. "Like any tool, AI is an extension of human power." The moral weight doesn't belong to the algorithm — it belongs to the humans who build it and the communities who choose to adopt it.

That means us.


The False Binary That's Slowing Us Down

"AI vs. faith."

I hear some version of this framing constantly, and it frustrates me every time. The question is never really "AI or faith." It's always "who built this, what values did they build in, and what does it do to the people who use it?"

A hammer isn't good or evil. A hammer wielded by someone building a shelter for homeless families does something very different from the same hammer in other hands. The question isn't the tool — it's the craftsperson, the intent, and the accountability structure around both.

The SBC's 2023 AI resolution frames it well: believers should engage AI "from a place of eschatological hope rather than uncritical embrace or fearful rejection." That's not a muddled middle — that's actually a sophisticated position. It takes human dignity seriously, holds technology accountable, and doesn't flinch from either the opportunity or the risk.

The binary — AI or faith, technology or tradition — is a distraction. The real work is building and choosing tools that actually reflect our values.


What "Good AI" Looks Like in a Faith Context

If your church or ministry is evaluating an AI tool for discipleship, pastoral care, prayer support, or faith formation, here's a practical framework I'd put to any vendor or product:

1. Is it transparent about what it is?

An AI tool in a faith context should never pretend to be something it's not. It doesn't pray. It doesn't have the Holy Spirit. It doesn't replace pastoral care. The Rome Call for AI Ethics was clear that "each person must be aware when they are interacting with a machine." Any AI product that obscures this or plays up the mystical is either naive or deceptive. Probably both.

2. Does it protect the privacy of your people?

Spiritual conversations are among the most personal things a person shares. When someone texts about their doubts, their marriage struggles, their fear of death — that data deserves the highest protection. Good AI in a faith context means conversations stay private. Full stop. No selling behavioral data. No third-party access. No using confession-level vulnerability to train a model for someone else's product.

3. Is it theologically grounded — or theologically lazy?

There's a difference between AI that helps someone engage Scripture, ask better questions, and connect the dots across their spiritual journey — and AI that just gives you the "Christian answer" without any depth. The Lausanne Movement's AI framework talks about "theological alignment" as a core evaluation criterion. What tradition is informing this tool's outputs? Who did the theological work? Is it accountable to pastoral authority, or just trained on whatever's on the internet?

4. Does it reinforce human community or replace it?

This is probably the most important question. Lifeway Research found that 95% of pastors believe discipleship happens in relationships, not programs. Good AI in a faith context isn't trying to be the pastor, the community, or the Holy Spirit. It's trying to help people stay engaged between Sundays, follow through on commitments, and surface the right questions — so that when they do sit across from a real human being, they're ready. AI that creates dependence on itself is a bad tool. AI that points you toward God and toward your community is doing its job.

5. Is it accountable to pastoral leadership?

Any AI tool deployed in a church context should give ministry leaders visibility into what it's doing — at an aggregate level — without compromising individual privacy. Pastors need to be able to see whether the tool is reinforcing the teaching coming from the pulpit, whether it's doctrinally consistent, and whether it's producing fruit in people's lives. AI with no pastoral oversight isn't ministry tech. It's just tech.


Who Should Be at the Table When AI Is Being Built?

At NRB 2026 in Nashville, a panel of Christian leaders made a point that I keep coming back to. One panelist, Skytland, said: "I think we have a moral, ethical, theological responsibility as Christians to shape technology for good."

I agree. And I'd go further.

If we believe that human beings are made in the image of God — that our worth isn't reducible to productivity or utility — then we have something to say that the secular AI industry desperately needs to hear. The biggest AI labs in the world are building systems that make choices about learning, relationships, mental health, and meaning. They're doing it with engineers and product managers and investors. Very few of them have a theology of personhood.

That doesn't mean they're villains. It means there's a seat at the table that's going mostly empty, and it has our name on it.

The LDS Church has published seven principles for its use of AI, covering spiritual connection, transparency, privacy, and accountability. The NAE's president Walter Kim signed the Rome Call for AI Ethics at the Vatican Summit. The SBC has made AI advocacy a cornerstone of the ERLC's work. These aren't fringe reactions — they're institutional commitments from major faith communities that understand something important: you don't get to complain about the tools if you weren't in the room when they were built.

The window for faith communities to shape the values baked into AI is right now. In five years, the defaults will be set.


The Specific Challenge of AI for Discipleship

Let me make this concrete, because I care about this one personally.

Discipleship has always had a gap. The sermon ends, people go home, and the transformation that started on Sunday gets lost by Tuesday. Life is loud, and good intentions aren't enough to cut through the noise.

AI can help close that gap — but only if it's built right. The difference between AI that genuinely serves spiritual formation and AI that just mimics it is enormous, and most people can't tell them apart yet.

Here's what genuine AI-assisted discipleship looks like, in my view:

  • It asks better questions, not better answers. The goal is to help someone process what God is saying to them — not to tell them what to think.
  • It remembers. Transformation happens when someone helps you connect the dots across days, not just moments. An AI tool that forgets every conversation is just a search engine with better marketing.
  • It defers to pastoral authority. The tool should reinforce what the pastor is teaching, support the church's theological tradition, and surface concerns that need human follow-up.
  • It knows its limits. When someone is in crisis, good AI points toward real people. Always.

This is exactly what we're building with Zoe. Zoe is an SMS-based discipleship tool — it lives in your text messages, asks you what God is saying to you and what you're going to do about it, and remembers your commitments across days so you can actually follow through. It's built for church communities, accountable to pastoral leadership, and privacy-first by design. No app, no login, no surveillance of your spiritual life.

It's a tool. A well-built one, we think. Built by people who believe this stuff matters.


So What Does It Look Like to Equip the Kingdom?

It looks like pastors who understand AI well enough to evaluate tools — not fear them, not uncritically adopt them, but actually evaluate them with a theological framework.

It looks like denominations and parachurch organizations putting people in the room where AI policy is being made — not just reacting to it after the fact.

It looks like ministry leaders asking the right questions before they deploy anything: Is it transparent? Is it privacy-first? Is it theologically grounded? Does it reinforce community? Is it accountable to us?

It looks like the church reclaiming its voice on what it means to be human — what learning is for, what relationships are for, what transformation looks like — in a cultural moment that desperately needs that voice.

And practically? It looks like trying tools that are actually built by people who share your values. Because the alternative — letting tools built without any theological grounding quietly disciple your people — is already happening.

Zoe is built for this.

If your church is looking for an AI discipleship tool that's transparent, privacy-first, theologically grounded, and designed to work alongside pastoral leadership — not replace it — we'd love to have you along for the journey.

Join the Zoe Waitlist