Tips & Ideas on Embracing AI Tools
Please note: The use of AI within teaching and learning is a rapidly developing area. This information reflects our current thinking, but please check back regularly as we will be updating the information on this page as things develop.
There are many effective ways that students can use AI to support their learning, and AI literacy is a valuable employability skill to develop. However, when assessing students, it is important that we assess what the students themselves are capable of, not the output of AI tools.
Principles for integrating AI into teaching and assessment
Do not get tied into one platform
There are lots of AI platforms out there. For general-purpose use, the University recommends Bing CoPilot Chat Enterprise, because it is part of our Microsoft license and has safeguards to protect the information entered in prompts, provided you are logged in to the service with your University account.
Many of the popular AI platforms are currently offered as preview releases. There is no guarantee that free access will continue indefinitely, so designing activities that will only work on a specific platform can be risky. It can also exclude students who, for whatever reason, do not wish to create a personal account with particular platform providers (e.g. due to privacy concerns). If you are requiring (rather than suggesting) that students create an account with a third-party platform, you may need to complete a Privacy Impact Assessment.
(This does not apply to AI features that have been added to University-provided tools such as the Adobe suite.)
Be mindful of digital poverty issues
Some platforms offer free and paid-for versions. It is important to make sure that your activity can be completed on the free version (and that those who have paid for advanced versions don’t gain too big an advantage) – especially if the activity is assessed. Wherever possible, design activities to work with University-provided platforms, but allow students the flexibility to explore alternatives if they wish.
Model good practice
Privacy and data protection
Anything entered into a generative AI chatbot could be used to train it in the future. Take the opportunity to remind students of the risks of including personal or confidential information in the prompts used with AI models, or in data submitted to them for analysis. In some situations this may include research data.
Checking the output
Demonstrate strategies for verifying that AI output is correct, free from unwanted bias and not the result of “hallucinations”. Remind students that if they use AI output, they are taking responsibility for its accuracy.
Reference accurately
If you use AI tools to prepare your session, remember to cite the model accurately as an example for students.
Refer to the guidance
Remind students to consult the guidance and principles available to them.
Tips
- Microsoft’s Bing CoPilot Chat Enterprise is the University’s recommended AI chatbot. It is based on the same technology as ChatGPT and Dall-E and is available to all staff, students and PGRs. Remind students to check that the green shield is shown, indicating that they are logged in and that the data they submit is protected.
- Always set clear expectations and boundaries around the use of AI – use the standard AI use level descriptors to make clear what AI may and may not be used for, and how students should declare and be transparent about its use. Also make sure students are aware of the Academic Misconduct Regulations.
- Explain how to cite AI use, referring to the Library guides for citing AI-generated text and AI-generated images.
- Teach students how to use prompts effectively, and discuss the benefits, challenges and limitations of these tools (accuracy, bias, transparency, ethical issues, etc.).
Follow this link for an idea on how to have discussions with students about the appropriate use of AI tools.
Ideas
- Ask students to use an AI tool as they would Google for researching topics and finding things out – but, as with Google or other sources, teach students to cast a critical eye over the results. Encourage students to use it as a study partner, asking follow-up questions about the responses it provides.
- Ask students to critique a piece of AI-generated text produced in response to a given prompt: identify its strengths and weaknesses, judge how accurate it is, and discuss the importance of referencing such output.
- Give students four different AI responses to the same prompt and ask them to fact-check the responses and decide which they think is best and why. They could annotate the responses – see this link for a case study on this idea.
- Ask students to use prompts in an AI tool to prepare for an in-class debate.
- Ask students to write an answer to a question without using AI tools, then put the same question into an AI tool and discuss the differences. Use the group discussion to improve the original piece of writing.
- Set a group task in which students are given four pieces of writing, two written by students and two by AI tools – students have to spot which is which and feed back the rationale behind their decisions.
- Here is a useful resource about teaching AI ethics, with ideas for discipline-specific AI discussion points in the classroom.
- Here is another external case study on using AI in assessment, in which the tutor uses AI for an assessment in Data Science for Cultural Heritage. It is also an example of scaffolded assessment.
- This paper from colleagues at the University of Pennsylvania gives examples of prompts that turn ChatGPT into an interactive mentor or coach, helping students to check their knowledge, learn something new or reflect on their learning.
- If you would like to learn more about Embracing AI, have a look at our AI case studies.