Dailey believes that professors and students will eventually need to understand that digital tools which generate text, rather than merely collect facts, also fall under the umbrella of sources that can be plagiarized.
Although Dailey acknowledges that this technological growth incites new concerns in the world of academia, she doesn’t find it to be a realm entirely unexplored. “I think we’ve been in a version of this territory for a while already,” Dailey says. “Students who commit plagiarism often borrow material from a ‘somewhere’—a website, for example, that doesn’t have clear authorial attribution. I suspect the definition of plagiarism will expand to include things that produce.”
Eventually, Dailey believes, a student who uses text from ChatGPT will be seen as no different from one who copies and pastes chunks of text from Wikipedia without attribution.
Students’ views on ChatGPT are another issue entirely. There are those, like Cobbs, who can’t imagine putting their name on anything bot-generated, but there are others who see it as just another tool, like spellcheck or even a calculator. For Brown University sophomore Jacob Gelman, ChatGPT exists merely as a convenient research assistant and nothing more.
“Calling the use of ChatGPT to pull reliable sources from the internet ‘cheating’ is absurd. It’s like saying using the internet to conduct research is unethical,” Gelman says. “To me, ChatGPT is the research equivalent of [typing assistant] Grammarly. I use it out of practicality and that’s really all.” Cobbs expressed a similar sentiment, comparing the AI bot to “an online encyclopedia.”
But while students like Gelman use the bot to speed up research, others take advantage of the high-capacity prompt input feature to generate completed works for submission. It might seem obvious what qualifies as cheating here, but different schools across the country offer contrasting takes.
According to Carlee Warfield, chair of Bryn Mawr College’s Student Honor Board, the school considers any use of these AI platforms as plagiarism. The tool’s popularization just calls for greater focus in evaluating the intent behind students’ violations. Warfield explains that students who turn in essays entirely produced by AI are categorically different from those who borrow from online tools without knowledge of standard citations. Because the ChatGPT phenomenon is still new, students’ confusion surrounding the ethics is understandable. And it’s unclear what policies will remain in place once the dust settles—at any school.
In the midst of fundamental change in both the academic and technological spheres, universities are forced to reconsider their definitions of academic integrity to reasonably reflect the circumstances of society. The only problem is, society shows no sign of standing still.
“Villanova’s current academic integrity code will be updated to include language that prohibits the use of these tools to generate text that then students represent as text they generated independently,” Dailey explained. “But I think it’s an evolving thing. And what it can do and what we will then need in order to keep an eye on will also be kind of a moving target.”
Wired, March 26, 2023