Out-Law News 1 min. read

UK universities ‘must provide clarity’ over student AI use

Photo by Matthew Horwood via Getty Images


Recent research highlights a division between student approaches to artificial intelligence (AI) and emphasises the need for clarity from universities on the way in which these tools should be used, an expert has said.

The Higher Education Policy Institute released the outcomes (16 pages / 468 KB) of a survey analysing the use of AI in higher education. The survey, carried out in association with Kortext, found that of the 1,250 students asked, approximately 53% had used AI to assist with assessment work.

Many of those surveyed asked AI to suggest ideas for research, explain concepts or summarise academic texts. A total of 13% admitted to having used AI tools, such as ChatGPT and Bard, to assist with creating text for coursework, with 5% admitting to submitting that text without editing it personally.

Julian Sladdin, higher education expert at Pinsent Masons, said: “Although many students are taking advantage of available products such as AI Private Tutor, ChatGPT and Bard to assist their studies, there are still many who are not accessing the available tools, which may be due to a lack of access or a lack of clarity from their institution regarding the bounds of acceptable usage.”

The report calls on universities to give all students access to the tools that may benefit learning, urging institutions to ‘level up’ access after the survey findings suggested that students from more privileged backgrounds were more likely to use these tools. The survey also found that students think institutions should provide more opportunities to use AI tools across their courses.

Sladdin acknowledged the findings but added that institutions must improve methods of tracking AI use as students are increasingly exposed to the tools. “I agree with the concern that universities need to be clearer and support students in being able to access generative AI in a way that fosters their academic development and ensures equitable and responsible access,” he said. “However, this work needs to be bolstered by continued investment in more creative methods of assessment and detection software to mitigate the risk of misuse and unfair advantage, or a perception that academic integrity is being undermined.”

Most institutions have not yet radically changed their approach to assessment with AI in mind, the survey suggests.

The report also calls on universities to adopt clearer and more robust policies on acceptable and unacceptable AI usage, and on the ‘red lines’ that need to be drawn around attempts to gain unfair advantage and academic misconduct.
