A growing number of British university students have been caught using artificial intelligence to cheat on their coursework in recent years.
This week, The Guardian released the results of an investigation showing that in the 2023-24 academic year there were more than 7,000 proven cases of academic misconduct involving AI tools. The staggering figure represents a 218.75 per cent rise from the previous academic year.
The data was obtained through a series of freedom of information requests sent to 155 universities in the United Kingdom. Out of those, 131 provided The Guardian with statistics about student misconduct.
Unfortunately, the report offers few details about precisely how the offending students used AI to cheat. Most cases likely involved essay work, with students using large language models like ChatGPT to write their papers for them.
“With the rise of generative AI models, many students are leveraging these technologies for essay writing, coding, and even exam answers,” the Times of AI said in a LinkedIn post regarding the investigation, “raising serious concerns around academic integrity, fairness, and how institutions adapt to this rapidly evolving tech landscape.”
Though AI use is trickier to spot than traditional plagiarism, software has been developed that can determine when a student has relied on artificial intelligence for their work.
Programs like Copyleaks, Turnitin and GPTZero analyze written material to identify patterns typical of AI-generated content. They can provide an assessment of the probability that a given passage was machine-written.
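How these commercial detectors actually score text is proprietary and not described in the report, but one widely discussed heuristic is perplexity: text that a language model finds unusually predictable is more likely to have been machine-generated. The Python sketch below illustrates that idea only; the GPT-2 model choice and the flagging threshold are assumptions for demonstration, not how Copyleaks, Turnitin or GPTZero work.

```python
# Toy illustration of perplexity-based AI-text flagging.
# Requires the `torch` and `transformers` packages.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def perplexity(text: str) -> float:
    """Average perplexity of the text under GPT-2 (lower = more predictable)."""
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        # The model's loss is the mean negative log-likelihood per token.
        loss = model(enc.input_ids, labels=enc.input_ids).loss
    return torch.exp(loss).item()

def flag_possible_ai(text: str, threshold: float = 30.0) -> bool:
    """Flag text whose perplexity falls below an arbitrary, illustrative threshold."""
    return perplexity(text) < threshold

if __name__ == "__main__":
    sample = "Artificial intelligence is transforming higher education."
    print(f"perplexity: {perplexity(sample):.1f}, flagged: {flag_possible_ai(sample)}")
```

In practice, a single perplexity score is a weak signal on its own, which is partly why real detectors report probabilities rather than verdicts.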
“The people who get caught are the ones who are too lazy to even edit the text that they are copying and pasting using AI,” one Reddit Inc (NYSE: RDDT) user commented in a discussion about the assessment.
