
ChatGPT is here. There is no going back.

Mentioning AI on college campuses can earn you a monologue on academic integrity, but we can only move forward, writes college chaplain Maggie Alsup.


Working on a college campus, you must be careful about mentioning the use of AI or the purpose of such a tool. If you're not, you may catch a professor reciting their monologue outlining the evils of AI in the academic world. And while there is some validity to their reaction and concerns about this emerging technology, I find it to be just that: a tool.

I think part of what makes AI a challenge for the academic world is that there are no true rules or guides to help navigate this new instrument. Students can use it, and do use it, in ways others might deem harmful to academic integrity. I understand that side. I get the hesitation. We received this tool before we could develop an ethics for its use.

But in my experience, shutting something out or restricting it so tightly that it provokes pushback is never good practice. I try to embrace this tool instead of running away from it or ignoring it.


I am currently reworking my future lesson plans with the help of AI and finding ways to integrate its use alongside traditional coursework. To me, this process is fascinating. There is still a lot to learn about AI and plenty of need for ethical reflection on its use. But this much is clear to me: it can be helpful.

Several months ago, my coworkers and I decided to try ChatGPT. We wanted to see what all the fuss from our faculty colleagues was about. We sat together and thought of questions related to our work. We created the parameters for our topics and entered them all into ChatGPT. What resulted was a wild experience: outlines for emails, basic lesson plans, liturgy for worship, prayers and letters to community partners. The list went on and on. And it was captivating to engage in the process.

The items ChatGPT produced were not perfect. There were grammatical errors. There were some oddly worded phrases. All these things indicated that the product was not something created by a human. And that absence is the key to AI ethics for me.

We are just starting to build an ethical framework for AI in the academic world, and I hope the church is also thinking about such a thing. But the key, to me, is the human element. ChatGPT does a decent job when asked to craft prayers. But if you compare an AI prayer to a Chaplain Maggie prayer, the thing missing would be the heart, the human element.

ChatGPT has been introduced to our lives. There is no going back. We should find ways to integrate it into our work rather than push back or turn from it. It can offer words when you are having a brain freeze or are too tired to think. It can offer a frame for your writing. It isn’t perfect, but it is a tool that we can and should learn how to use — just don’t forget to add your human uniqueness as you go along.


The Presbyterian Outlook is committed to fostering faithful conversations by publishing a diversity of voices. The opinions expressed are the author's and may or may not reflect the opinions and beliefs of the Outlook's editorial staff or the Presbyterian Outlook Foundation. Want to join the conversation? You can write to us or submit your own article here.
