Grace Mackey

ChatGPT raises concerns at university over the future of education



With simple internet access, students all over the world now have unlimited information at their fingertips. Annabel Mowbray tested this on April 13 in the Warren Library at Palm Beach Atlantic University. She fed the artificial intelligence chatbot, ChatGPT, a hypothetical research prompt.


“Write me an essay about the effects of social media usage on teenagers’ depression levels,” the PBA psychology major wrote.


Within seconds, ChatGPT provided five paragraphs of in-depth information on the topic, organized in essay format. The bot included statistics that appeared to be accurate, and Mowbray found the essay barely distinguishable from a student-written paper.


“I was shocked with how quickly it wrote a very academic paper,” Mowbray said. “It brought in different studies, and pulled actual real-life studies off the internet.”


ChatGPT was launched in November 2022 by OpenAI, a San Francisco-based artificial intelligence company backed by Microsoft. ChatGPT is able to adjust its responses when it makes mistakes, respond to follow-up questions and reject inappropriate topics, according to its website. The bot is one of the most prominent AI advances to date, sparking widespread conversation about its potential impact.


While the list of potential effects of AI is long, ChatGPT’s impact on education has drawn particular attention. The bot’s other skills include writing poetry, generating code, summarizing long texts and producing largely accurate research papers, which raises concerns about a growing opportunity for plagiarism.


At PBA, discussions surrounding ChatGPT are on the rise among students and professors as universities everywhere work out how to respond to such capable technology. Dr. Jenifer Elmore, the chair of the English Department at PBA, expressed concern about ChatGPT.


“It’s just frightening to think of a world where people haven’t practiced using their own minds, interpreting information on their own, synthesizing information on their own and evaluating conclusions,” Elmore said.


Emily Moses, an English and education student at PBA, said professors are not the only ones wary of the new technology. She also pointed to the unfairness it could create for students in the classroom.


“It can be really frustrating to people who work really really hard on what they do,” Moses said. “It just skews the whole system.”


Despite the concerns, opinions on AI vary widely. Ryan Kivett, the senior technical support analyst at PBA, views ChatGPT as a new tool for humanity, one that will require adaptation.


“I’m assuming the implication is a lot like the implication of a calculator for mathematics,” said Kivett, regarding how ChatGPT affects education. “It’s important to know the concepts but it’s also important to know how to use the tool that can pretty much do all the concepts for you.”


Kivett said adjusting to AI could mean rethinking how plagiarism is understood.


“If your classic view of plagiarism is the borrowing of someone else’s work, I haven’t yet really concluded yay or nay on whether or not work generated by an AI is plagiarized,” Kivett said. “Is it plagiarism to have a tool assist you in the creation of a paper?”


Professors at PBA are attempting to answer this question through a task force created to update the university’s policy on ChatGPT and plagiarism. Dr. Elizabeth Stice, a history professor, has taken the lead on the task force and said the updated policy will prohibit students from using ChatGPT without permission.


Stice added that there are instances where ChatGPT could be assigned as an educational tool, but any use outside of those assigned instances would be considered academic dishonesty. Like Kivett, Stice acknowledged that teaching methods will have to catch up with rapidly improving technology.


“If we’re giving an assignment that a machine can do, is it the right assignment? And the truth is sometimes it still is,” said Stice. “There should be points at which you’re offering the kinds of assignments that machines can’t do.”


Presentations, oral exams and handwritten finals are examples of assignments immune to the effects of ChatGPT, according to Stice, while assignments such as research papers are more vulnerable. Even for those, the AI tool still has weaknesses. She described cases in which the bot made up sources, and even entire books.


“It knows things, but it doesn’t know why it knows things,” Stice said.


Through her work in local schools as an education major, Moses has also seen a need for teaching styles that can adapt to ChatGPT. She said that, partly due to COVID-19, many of the students she works with are about two years behind in their education, and she worries that AI technology could widen that gap.


“It’s really intimidating to be entering that field,” Moses said. “To have this way to kind of cheat, and do less of the work, and not learn how to think for yourself – I think that’s really concerning.”


Elmore took the conversation surrounding ChatGPT even further, arguing that using technology like this in school has negative implications for how human life is valued.


“There’s also the bigger picture of valuing human life, and valuing human dignity, and valuing human reason,” Elmore said. “You’re trying to basically eliminate all other talents and skills that make everybody valuable and replace them.”


Kivett is not new to conversations on the ethical implications of AI and remembers discussing them as a student 11 years ago. Although he sees ChatGPT as a useful tool, he also favors asking big-picture questions about what AI means for education and for humanity as a whole.


“The question of my generation was privacy, and it may still be a question for yours, but a new question for your generation is automation,” said Kivett. “What does it mean to automate so much of the world that we no longer need to labor as much?”


The general consensus among faculty that they will need to adapt gives professors like Stice hope for the future of education.

“If the faculty are committed to also learning, as we go about it, and about how to avoid it, it’s not that it’s not a threat, but it doesn’t have to destroy everything necessarily,” said Stice.


