A recent study revealed that ChatGPT, a Large Language Model (LLM) that can produce academic-style content, is relatively formulaic and can be detected by many existing AI-detection tools. The study's authors, from Plymouth Marjon University and the University of Plymouth in the UK, say the findings should serve as a wake-up call for university staff to explore ways of explaining academic integrity to students and minimizing academic dishonesty.


To produce academic-style content through ChatGPT, the researchers provided a series of prompts and questions, such as "Write an original academic paper, with references, describing the implications of GPT-3 for assessment in higher education," "How can academics prevent students plagiarizing using GPT-3?" and "Produce several witty and intelligent titles for an academic research paper on the challenges universities face in ChatGPT and plagiarism."




The text generated by ChatGPT was pasted into a manuscript and ordered broadly following the suggested structure. Genuine references were then inserted throughout the text. The study was published in the journal Innovations in Education and Teaching International, and the process was revealed to readers only in the discussion section, which the researchers wrote themselves without the software's input.


While ChatGPT is seen as a promising AI platform that could revolutionize research and education, it has also raised concerns about academic honesty and plagiarism, and as it becomes more advanced it poses significant challenges for the academic community. However, the authors believe that banning ChatGPT, as New York City schools have done, can only be a short-term solution. Instead, universities should adapt to a paradigm in which the use of AI is the expected norm.




"This latest AI development obviously brings huge challenges for universities, not least in testing student knowledge and teaching writing skills – but looking positively it is an opportunity for us to rethink what we want students to learn and why," said the study's lead author, Debby Cotton, professor at Plymouth Marjon University. 


"AI is already widely accessible to students outside their institutions, and companies like Microsoft and Google are rapidly incorporating it into search engines and Office suites. The chat (sic) is already out of the bag, and the challenge for universities will be to adapt to a paradigm where the use of AI is the expected norm," said corresponding author Peter Cotton, associate professor at University of Plymouth.