We asked two professors and two students to share their varying perspectives on the use of artificial intelligence in the classroom. Here are their responses.

ChatGPT makes us more productive. Isn’t that why we use it? It allows us to do work quickly and efficiently.
It is the perfect tool for an economic culture that values human beings chiefly for their powers to produce, i.e., to rapidly generate goods, services, or content for consumption.
But it is a profoundly imperfect tool for the work of a liberal arts education, which values very different human capacities: critical thinking and self-understanding, respect for difference and the common good, moral discernment and empathy, creativity, problem-solving, aesthetic appreciation, historical memory, happiness, a sense of justice.
ChatGPT can be used at universities in ways that help cultivate these qualities, but I fear it is more commonly used to skip the processes by which we exercise them.
When we ask ChatGPT to generate answers and opinions for us, we efficiently produce content, but we sidestep the challenges of intellectual and practical reasoning, of social exchange and self-knowledge.
It’s like skipping workouts. Eventually we weaken the muscles we need to navigate the world’s (or even the workplace’s) complex challenges with intelligence and authenticity.
The sum total of our labor at Goshen College ought to generate passionate learners, servant leaders, compassionate peacemakers, global citizens and people of active and reflective faith.
ChatGPT, by contrast, produces content by algorithm. In the process, it reduces education to production, and molds people into mere producers.
— Luke Kreider, professor of religion and sustainability
Over the past year or so, I have been experimenting with ChatGPT, and I immediately saw its incredible potential. As a student, I use it as a research assistant, a basic idea generator and an outline creator.
My job as a student is reduced to doing only the work where my creativity and my interpretation of the information I’m learning or writing about are actually on display. Too often, ChatGPT is considered a writing aid, but I think that is a misunderstanding of the extent of its uses.
ChatGPT is not creative and it is not unique; trying to use the tool in those ways will result in a generic, repetitive and uninteresting paper. This AI tool shines when it is tasked with general organization, framework creation and summarizing.
In the business world, I have found great utility in ChatGPT’s ability to play the role of a knowledgeable assistant. The “Act as if” prompt was incredibly useful in the creation of my small painting business this summer.
I used ChatGPT for everything from understanding the next steps I should take to brainstorming an Instagram bio format. Its uses are just so widespread. With ChatGPT acting as a knowledgeable assistant, I have something to bounce ideas off of and a way to receive basic advice sourced from real people and summarized by ChatGPT.
I’m really looking forward to seeing where ChatGPT and other AI programs can lead in the future of my own career and in the world as a whole. I have a lot of faith in the uses of these technologies and would like to see Goshen College embrace AI instead of resisting it.
I would love to see GC offer a class teaching students how to use ChatGPT as a career tool: learning how to write prompts and increasing general exposure to AI. This new wave of technology will assuredly be a very real part of our future, and GC should embrace it rather than fall behind the curve.
— Micah Shenk, a senior majoring in business
Natural language processing tools fueled by AI technology like ChatGPT and Google Bard are exactly that — tools. Tools can be used to build or to do harm.
Look at agriculture: Fertilizers and manures are tools in the toolbox that can be really useful for managing agroecosystems.
Precision agriculture is the practice of using just enough fertilizer exactly where you need it to achieve optimal growth and avoid losing nutrients to runoff and leaching.
However, these tools can be misused for the detriment of our planet. Fertilizer can be overapplied, causing harmful algae blooms, dead zones in the oceans, and contamination of groundwater leading to disease and death for the very young.
AI is a tool and can be really helpful within the academic context. For example, my GIS students use it to help them write programming code. I have had students use it to help them reorganize their notes to help them study.
AI can be used to efficiently scour the internet to summarize a topic a student is trying to understand for class, and it answers questions in easy-to-understand language. There are so many student-centered examples of what AI can do in academia — just ask it yourself!
That being said, there are plenty of ways students could misuse AI as a tool: simply turning in an AI-generated essay, using it to answer a take-home multiple-choice test, or even having it write an entire term paper. AI is not great at these tasks yet, but it is getting better!
It is not for me to decide whether or not a professor should allow AI to be used in their classroom; context is important, and the types of assignments students are being asked to complete matter in this regard.
But one thing is certain: AI is here to stay, and it will only continue to improve; many students graduating in the next 10 years will find themselves working with AI in the workplace in one way or another.
How do we best prepare our students to work alongside AI in the careers of the future?
— John Mischler, professor of sustainability and environmental education
I think there are two big issues around using ChatGPT or other AI tools: a loss of artistic meaningfulness, and a danger of inaccuracy.
The value of any art derives from the vision or artistic intent behind it and from the skill in execution.
Skill is almost completely obscured by the use of AI, as it can emulate a variety of styles, creating a blend of the user and AI — at best.
At worst, the skill being showcased is entirely the AI’s, derived from training data.
Artistic intent is not necessarily damaged, but can be if the artist is distracted by their tools; it can be very fun to play with ChatGPT.
My other concern stems from the fact that AI tools are, and will continue to be, prone to factual inaccuracies.
If users are sufficiently diligent in their fact-checking, this is not a problem, but I do not think this can be relied on.
— Asa Schiller, a senior majoring in math