Artificial intelligence is here. Now what?

AI is integrated into the work done in any administrative, research, or social role — including ministry, writes Kate Ott.


The general public’s awareness of artificial intelligence expanded radically when OpenAI released ChatGPT in November 2022, sparking major media coverage. Unfortunately, the moral panic caused by the magic-like sophistication of OpenAI’s models fueled misconceptions about artificial intelligence and led many Christians to disengage from critical and constructive conversations about AI and ethics.

One year later, in November 2023, the Barna Group released a snapshot of the data they collected on “How U.S. Christians Feel About AI & the Church.” Overall, Christians are more pessimistic than non-Christians about AI. More than 50% believe AI is not good for the Christian church, and a slightly higher percentage would be disappointed if their church used AI. For ease of measurement, these survey questions force a false dichotomy: AI is either always good or always bad. But no technology can be characterized that way, not even atomic energy. Instead, the use or design of particular technologies can be assessed for their contributions to moral goods and ethical harms.

AI is already here

We can and must resist polarized responses that succumb to awe or fear. The reality is most of us have been interacting with AI for years through software and services. AI is integrated into the work done in any administrative, research or social role — including ministry. Daily, we might use voice assistants, search engines, social media feeds, email spam filters, predictive text, facial or fingerprint recognition, gaming, and navigation or GPS applications.


We rely on e-commerce platforms and streaming applications to provide us with predictive preferences and reviews by other users. AI is revolutionizing auto, financial, and medical industries, not to mention all workplaces, churches included. If your church has a website or Facebook group, accepts e-donations, uses a Bible study or Bible translation application, not to mention administrative services like email or online shared calendars, your church is using AI.

So what makes Christians comfortable with the personal use of an Alexa smart assistant or a Google search to answer a theological question, or a Bible application to provide commentary and translation, yet makes them recoil at the notion of a ChatGPT-produced sermon, or at Renée, the People Church’s digital assistant, helping newcomers connect to church at their own pace? All these applications rely upon machine learning; some use deep learning and natural language processing. But the feel is different! A voice assistant feels more in the user’s control as a narrow AI, whereas ChatGPT’s far more general capabilities can pass the bar exam, and the People Church’s Renée may seem to be replacing your favorite church administrator. What underlies the concerned Christian response?

Human exceptionalism is a cornerstone of Christian theology. If an AI can express, think and respond like a human, what makes humans special?

Human vs. AI?

For the “Uncovered Dish,” a ministry and leadership podcast, I took the pre-prepared list of questions from the hosts and entered each one into ChatGPT to see how similar or different the responses would be from my own. When it came to explaining things, such as how AI works, the technology’s different capacities and techniques, and how AI might contribute to fields like education and ministry, there was an uncanny similarity. Artificial intelligence was less able to match the more creative ethical responses, often relying on a Scripture quote to “add” religion to a basic answer. While touting its skill at assisting with sermon writing, ChatGPT did include caveats: one, for example, stated it was not meant as a substitute for the pastoral care of a minister.


Yet millions of users turn to different chatbot applications for spiritual and mental health-related concerns. In his 2022 chapter “Beyond the Live and Zoomiverse,” which appeared in the book Ecclesiology for a Digital Church: Theological Reflections on a New Normal, Philip Butler argues, “While conversational AI companions presently lack human-level nuance, platforms such as Woebot, Wysa or Seekr still provide digital space for individuals to turn inward cultivating spirituality and emotional attenuation via the different techniques these companions are trained to employ.” Butler, assistant professor of theology and Black posthuman artificial intelligence systems at Iliff School of Theology, is the creator of Seekr, a Black AI chatbot grounded in spirituality that, unlike other spiritual companions, is built specifically to hold the context of Black users through language and images. Many other chatbots are programmed to default to stereotypically White, female language patterns. Butler’s research demonstrates the effectiveness of therapist and spiritual-companion chatbots for the limited uses they support and the gaps they can fill in care. Developers and users of such applications do not present these AI apps as wholesale replacements for, or rejections of, human-to-human interaction; they are supplemental. They can be especially helpful for older adults, people living with the challenges of neurodiversity, and others with social anxiety.

Throughout history, various technologies have shifted how we live out our faith. Consider the theological consequences of the printing press for the Protestant Reformation and lay access to Bibles in the vernacular. In his book People of the Screen: How Evangelicals Created the Digital Bible and How It Shapes Their Reading of Scripture, John Dyer details how digital Bible applications have radically changed the way Christians, not just those who are evangelical, interact with Scripture, and how the evangelical approach of the designers affects users across the globe. This is one small way that AI and digital technological design have had a significant and often unacknowledged impact across Christian traditions. Like the shifts in biblical engagement from oral proclamation to printed Bibles, AI sometimes replaces and other times supplements past faith practices. In the process, human-to-AI interactions have generated, and will continue to generate, new ways of being and acting in this world, impacting jobs, artistic expression, faith formation, and even our most intimate relationships, which I discuss in my book, Sex, Tech, and Faith: Ethics for a Digital Age.

Many of us have already shifted everyday behaviors because of the influence of AI. I was recently on a family vacation, and everyone was grateful their smartphones worked in another country because they could not imagine how we would navigate the roads or find restaurants. In the not-so-distant past, we used paper maps and phonebooks. Unsurprisingly, everyone on the trip was in favor of AI’s assistance over those past modes, a common reaction when AI has made our lives easier.

But that is not always the case, and much of how AI affects communities and the planet remains hidden. For example, AI requires enormous amounts of energy, and it contributes to the electronic waste and mineral mining that come with computing hardware. In an October 2023 Wired magazine article, Meghan O’Gieblyn, author of God, Human, Animal, Machine, responds to the moral panic about what makes a human a human in the face of AI.

“Each time you fear that you’re losing ground to machines,” she writes, “you are enacting the very concerns and trepidation that make you distinctly human.” For now, self-consciousness, worry, despair and happiness are human – not AI – traits. What else might make humans uniquely different from AI? Perhaps it is our capacity for morality.


Ethical concerns have been part of every stage of AI development, from the conscious or unconscious biases of the designers to flaws in data sets to the impact of users who train the models. These are human points of intervention, and they often are points of human moral failure. This is why it is critical to raise the digital literacy of users, especially Christians who want their faith to shape their technology use. Artificial intelligence is a broad term that encompasses many types of computer systems that mimic intelligent behavior, using a range of techniques and approaches. Most everyday AI is based on machine learning, in which algorithms learn from data in order to make predictions. LLMs – large language models – are a type of machine learning that relies on deep learning and neural networks to process natural language and generate human-like text (related models generate images). Having a basic literacy about the capacities and technologies that fall under the umbrella of AI helps us discern important differences for personal and communal use, as well as advocate for regulation.
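What “algorithms learning from data to make predictions” means can be made concrete with a toy sketch. The example below fits a straight line to a handful of invented numbers (hours studied versus quiz scores; all figures are hypothetical, not from any study) and then predicts a new value. Everyday machine learning differs from this mostly in scale and complexity, not in kind.

```python
# A minimal sketch of machine learning: "learn" a pattern from example
# data (here, by ordinary least squares), then use it to predict.
# All numbers are invented for illustration.

def fit_line(xs, ys):
    """Learn a slope and intercept from paired data points."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# "Training data": hours of study vs. quiz score (hypothetical)
hours = [1, 2, 3, 4, 5]
scores = [52, 55, 61, 64, 68]

slope, intercept = fit_line(hours, scores)

def predict(x):
    """Use the learned pattern on a new, unseen input."""
    return slope * x + intercept

print(round(predict(6), 1))  # predicted score for 6 hours of study
```

The same basic move, generalizing a pattern from past data to a new input, underlies spam filters, predictive text, and recommendation feeds; large language models simply learn vastly more patterns from vastly more data.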

Toward a Christian AI ethic

The diversity of technologies that rely on AI increases the complexity of responding ethically as technologists, regulators and users. In my book, Christian Ethics for a Digital Society, I suggest we need a more flexible, values-based ethical approach when responding to digital advances. Most often, Christians employ a rule-based approach that cannot match the complexity of digital innovation. Those respondents to the Barna study who said the use of AI would damage churches have likely overlooked the many ways churches and religious leaders already use AI software applications.

What might be a better approach than rejecting AI use altogether? We might try to make a safe rule, such as: ministry use of AI should be limited to making everyday administrative tasks easier. But is it morally okay for ministry staff to use auto-generated responses to text messages or AI assistance in replying to emails? In church life, what I thought was a simple email can be an expression of spiritual need, harm or support, as can a text message. Instead of keeping a list of morally acceptable or suspect software applications, might Christians focus on applying their values in all interactions?

AI in our image?

No single AI application or system is perfect, and probably none ever will be, because each is built and shaped by flawed people. The sooner we realize what we have in common with AI, the better off I believe we will be. It will wake us to the need for regulation of the design and production of certain forms of AI. We might begin discussing the features of ethical decision-making that need to be built into AI and how to promote diverse groups of designers and users, so we do not continue to end up with racial, gender and ableist biases (to name a few) in digital technologies. AI is not simply a tool; our use of it shapes us, and we shape it.


Instead of a rules-based approach, a shared values-based approach can be flexible, adaptive, and responsive in ways that match digital technological changes. This approach would not proliferate rules for every context and application (a sheer impossibility), but rather focus on how to live out Christian values and tenets in complex situations, much like Jesus does throughout the Gospels. For example, what does it look like to live out the love commandment as one responds via email, text, or on social media? Sometimes, that might include a quick, autogenerated reply to confirm you received a request, other times it requires a thoughtfully selected .gif to express emotional affect in ways that words do not, and still other times it might require a switch to a different medium — perhaps with a call to the person or a voice message to capture tone and emotion.

AI and Faith, a collective of technologists, theologians, and scholars from the world’s religions, works to center the following human values as shared across the world of faith: human life and dignity, spiritual values transcending materialism, human well-being, human liberty, societal justice and meaningful community. These values center humans, rather than technological advancement or profit. The group uses these values to support the need for beneficial, safe, and transparent AI; fair and unbiased algorithms; safety limits on computational and biological augmentation; equality of access; stewardship of work; appropriate predictive analytics and data ownership; preservation and enhancement of civil liberties; and limits on autonomous weapons. We might add concerns beyond humans, since AI advances require massive amounts of energy with significant environmental impact.

Christians and faith communities need opportunities to cultivate what cultural sociologist Felicia Wu Song, in her book Restless Devices: Recovering Personhood, Presence, and Place in the Digital Age, calls a “realistic and motivating vision of our circumstance that helps us imagine the kind of life we are hoping to live and how it is we can get there.” Such visioning requires that we identify and lead with our faith communities’ core values. We must then work to increase digital literacy: invite experts such as a local computer scientist or professor to discuss ethics and AI, read trusted information from groups like AI and Faith, or ask denominational offices to design an information session on faith and digital ethics.

Being equipped and informed allows us to ask critical questions about whether AI applications and advances are helping us and our communities live out the core values we have identified. The books and articles already mentioned are examples of such conversations. Our collective discernment should lead to individual behavioral interventions, community shifts, and advocacy for regulations. To support such efforts, faith communities need to routinely address the challenges and successes of implementing core faith values on individual and communal levels in the face of rapid digital change.