An overwhelming 92% of UK university students are now using artificial intelligence (AI) tools to support their studies, according to new research published by the Higher Education Policy Institute (HEPI). The report, which has resurfaced in student forums this week amid end-of-term assessment pressure, suggests that AI is now deeply embedded in the academic lives of undergraduates across the country.
The study, released in February 2025, reveals a dramatic rise in the use of generative AI tools such as ChatGPT, with student adoption increasing from 66% last year to 92% this year. Among those surveyed, 88% said they had used AI tools specifically to assist with assessments, while 18% admitted to submitting work containing content that had been generated by AI and then edited.
Time-saving was the most commonly cited reason for turning to AI, with over half of students saying the tools helped them manage heavy workloads and competing deadlines. Others highlighted the way AI could improve the quality of their assignments or explain difficult academic concepts in more accessible language. Despite these perceived benefits, the report found that many students still had significant concerns.
More than half of respondents said they worried about the risk of being accused of academic misconduct. A similar number said they were not fully confident in the accuracy or reliability of the information produced by AI platforms. Some students expressed concern about depending too heavily on the technology, particularly as universities have started implementing tougher guidance and detection systems.
The findings come amid growing debate over how higher education institutions should respond to the rise of generative AI. While 80% of students reported that their university had a clear policy on AI use, only 36% said they had received any formal training on how to use it responsibly. The lack of structured support has led some students to navigate the tools cautiously, unsure of where the line is drawn between legitimate assistance and plagiarism.
The Office for Students (OfS) and other regulatory bodies have yet to issue sector-wide rules on AI, leaving individual institutions to develop their own approaches. Some universities have moved to redesign assessments to make misuse more difficult, while others have invested in AI detection software and awareness campaigns. But the pace of change continues to outstrip the guidance currently available.
The report also noted disparities in how students engage with AI, with those from more privileged backgrounds and tech-savvy courses more likely to benefit. Concerns have been raised that a digital divide could widen existing inequalities if access to AI tools and training remains uneven across institutions.
HEPI Director Nick Hillman said the findings demonstrate a clear need for universities to invest in student education around AI. “Students are embracing the technology, often for very good reasons, but too many are being left without the knowledge or support they need to use it ethically and effectively,” he said.
As universities prepare for final assessments and summer exams, the pressure is on to clarify what role AI can and should play in modern academic life. For now, students are forging ahead — cautiously, curiously, and in record numbers.