Briefing note on AI in education and assessment

Developments in the capabilities of, and ease of access to, Artificial Intelligence (AI) tools are receiving extensive media coverage. The focus has largely been on ChatGPT, but there are hundreds of these tools, and Google has more recently released its own conversational AI chatbot, Bard. Similarly, there is extensive discussion across higher education about the impact that AI will have on learning, teaching and research, as well as on society at large. An early area of concern is the impact of AI on academic integrity, as students may use these tools to generate answers, essays and computer code as part of their assessed coursework. Attention, however, is also turning to how AI may be applied in our educational practice and teaching approaches.

This briefing note provides guidance on some of the key questions that we are receiving around AI and assessment. It also details some of the work we have planned over the coming months to help us develop further guidance and policy in this area and explore how we might apply AI in education.

Academic Integrity

University Policy

The University Academic Misconduct by Students Code of Practice (section 2) defines the types of activity that constitute academic misconduct. The unauthorised use of AI systems is included in this list, and whilst Wolfram Alpha and Sudowrite are mentioned as examples, this equally applies to ChatGPT and other AI tools.

Where the use of AI is authorised, students should acknowledge and reference its use as they would other sources that have been referred to and have helped to shape and inform students’ work.

AI Writing Detection

Whilst there are tools that claim to detect AI-generated text, they demonstrate varying levels of reliability. Jisc and the QAA have provided helpful information on these detection tools:

Jisc notes: “AI detectors cannot prove conclusively that text was written by AI.” 

Michael Webb (17/3/2023), AI writing detectors – concepts and considerations, Jisc National Centre for AI

The QAA advises: “Be cautious in your use of tools that claim to detect text generated by AI and advise staff of the institutional position. The output from these tools is unverified and there is evidence that some text generated by AI evades detection. In addition, students may not have given permission to upload their work to these tools or agreed how their data will be stored.”

QAA (31/1/2023), The rise of artificial intelligence software and potential risks for academic integrity: briefing paper for higher education providers

To date, the University has not formally adopted any AI detection tool. Staff may be aware that Turnitin released its AI writing detection capability (AIWDC) on 4th April. The University, along with the majority of UK universities, decided to opt out of this launch. A key factor in this decision was that we were unable to test and evaluate the reliability of the detection tool and were not provided with independently verified data to give us sufficient confidence to implement it at this time. We are planning to evaluate and pilot the Turnitin AIWDC and will share more details on how lecturers can be involved in this activity once plans are confirmed. Colleagues across the UK HE sector will be running similar evaluation exercises, and once we have reviewed all the data we will make a decision on whether to adopt the Turnitin AIWDC.

In the meantime, you should *not* use unauthorised AI detection tools, not least because we do not have student consent to upload their work to third-party sites.

AI in Teaching, Learning and Assessment

Whilst AI detection tools are being developed, they already lag behind the latest AI releases. For example, Turnitin’s AIWDC is trained on output from GPT-3 and GPT-3.5 and is already being viewed as out of date given that GPT-4 is in wide use. The field is widely described as an arms race that the detectors will never win. Consequently, the debate across higher education is shifting to focus on the implications of AI for assessment and programme design, and on the utility of AI to support education and research.

AI tools such as ChatGPT are already being used in many of the professions and areas of business that our graduates will be working in. It is essential, therefore, that our students develop AI literacies together with the critical thinking skills and competencies needed to use and apply AI. Colleagues in the University are already using AI in the classroom in disciplines such as marketing, mirroring its adoption in marketing agencies around the globe. There are likely many others who have started exploring how they might use AI in teaching and assessment.

Enhancement theme project

Over the coming months there will be opportunities to engage with one of our current enhancement theme projects, which is exploring the potential of AI to support teaching and learning practice. We will be sharing details of a series of webinars looking at how AI could be applied across a range of teaching perspectives. There will be opportunities to continue the discussion and experiment with AI beyond the webinars, which we hope will help us to nurture a community of practice in the use of AI. Academic staff, professional services staff and students are all invited to participate in this project, contribute to the development of further guidance on the adoption of AI, and co-create good practice guides on the use of AI in teaching and assessment.

Assessment Design

The Covid pandemic and associated lockdowns have already prompted increasing reflection, review and debate about our assessment practices. The advent of AI has heightened this debate and underscored the importance of assessment design and the need for more authentic assessment. As the University progresses its review of the current assessment policy, there is an opportunity to reflect on our existing assessment practices and develop new guidance that can inform approaches to assessment aligned with our Curriculum Design Principles (CDPs). Considering more authentic approaches to assessment also requires us to think about programme and learning design, and here again our CDPs are key. Our new Education Academy will play an important role in all of this development work.

Advice and Support

We will keep our education community updated on developments and will have further guidance and policy amendments in place ahead of the 23/24 academic session.

In the meantime, if you require specific advice or support on AI in teaching or assessment, please get in touch with Natalie Lafferty (n.t.lafferty@dundee.ac.uk) or Emma Duke-Williams (e.dukewilliams@dundee.ac.uk) in CTIL.
