
Making use of AI in the Classics classroom

Published online by Cambridge University Press:  16 June 2025

Anya Morrice*
Affiliation:
King’s School Rochester, Rochester, UK
Sarah Deering
Affiliation:
Ipswich School, Ipswich, UK
Alex Kemsley
Affiliation:
Orwell Park School, Ipswich, UK
Sophie Judge
Affiliation:
Royal Hospital School, Ipswich, UK
*
Corresponding author: Anya Morrice; Email: amorrice@kings-rochester.co.uk

Abstract

This article follows a session on ‘Using AI in the Classics Classroom’ delivered at the University of Cambridge Mentors’ Day for the Classics PGCE (Initial Teacher Education programme). The first half of the article provides a brief introduction to how generative AI operates and the impact AI has had on education in the UK. In addition, this section considers the advantages of AI for educators in supporting PGCE students and early career teachers, aiding with planning and resource creation as well as the advantages for pupils. It also sets out practical limitations such as AI hallucinations, biases, database limitations, data protection concerns, and the potential risks of pupils developing reliance on AI usage and how teachers can avoid this. The second half of this article provides guidance and examples of how teachers can use AI to support their workload outside of the classroom and for using AI with pupils in the classroom. This includes advice on how to improve AI prompts, example prompts, and prompt scaffolds, as well as recommendations of AI tools for teachers to use in the classroom.

Type
Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2025. Published by Cambridge University Press on behalf of The Classical Association

Introduction

Earlier this year, my co-writers and I ran a session on using generative artificial intelligence (hereafter ‘AI’) in the Classics classroom at the University of Cambridge Mentors’ Day for Classics PGCE students. We are Classics teachers working in preparatory and secondary schools in England who have developed an interest in exploring the use of AI in teaching practice. The following is a short introduction to AI and suggestions for how to use it in the classroom. The first part of this article provides context about AI: what it is, why it is important to be aware of it, and the opportunities and potential limitations of its (mis)use. If you are already familiar with AI, or are primarily interested in how you can make use of it, then I would recommend you skip to Part 2, which focusses on strategies for using AI to assist with workload and early career teaching, for using AI with pupils in the classroom, and for using AI more effectively. This section also provides example prompts and prompt scaffolds to use with AI tools.

Part 1: An introduction to artificial intelligence

What is generative AI?

While AI has frequently appeared in the news since late 2022, it is not a new technology. AI has existed since the 1960s, and humans have used it for many different applications: automated telephone services, chatbots, computer-based games, and satellite navigation tools. The United Nations International Children’s Emergency Fund (UNICEF) defines AI as ‘machine-based systems that can, given a set of human-defined objectives, make predictions, recommendations, or decisions that influence real or virtual environments’ (UNICEF, 2021). This definition covers a broad spectrum of capabilities. Until recently, publicly available AI systems decided on the most appropriate response or action following human interaction by drawing on a range of predetermined responses or actions. The response could be complex: a navigation system, for example, decides on the best route by drawing on information from maps, road closures, and traffic levels. However, these systems did not generate new responses independent of those that were pre-programmed. The navigation system can only select the best possible route from the pre-existing possibilities, according to the available data and pre-coded rules for interpreting those data; it is not capable of generating an entirely new route.

OpenAI’s release of ChatGPT in November 2022 dramatically changed public awareness of AI and its potential. ChatGPT was the first generative AI to be made publicly available with open access. Unlike previous AI systems, it could simulate human learning to generate new content in response to human prompts. Since its release, the number of AI systems and the scope of their capabilities have grown exponentially. These AI systems learn from data but, rather than directly copying information from them, they use the information to create new content. The data may come from the large collections of human-generated content, such as books and articles, used to train large language models (LLMs) – artificially intelligent systems trained to comprehend and analyse human-produced text (Maslej et al., 2024). Alternatively, more recently created or updated AI systems may use the internet as their source of data; this transition from fixed training databases to the internet occurred in a matter of months.
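For readers who like to see the mechanics, the short sketch below shows roughly how a program can send a prompt to a hosted LLM and receive newly generated text in return. It is a minimal illustration only: it assumes the third-party openai Python client library, an API key stored in an environment variable, and an example model name, none of which are discussed in the original session.

```python
# Minimal sketch: sending a prompt to a hosted large language model.
# Assumes the third-party 'openai' client library is installed and an API key
# is available in the OPENAI_API_KEY environment variable; the model name below
# is an example and will differ by provider and release.
from openai import OpenAI

client = OpenAI()  # reads the API key from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name
    messages=[
        {"role": "system", "content": "You are a helpful assistant for Classics teachers."},
        {"role": "user", "content": (
            "Summarise the plot of Homer's Odyssey in three sentences "
            "suitable for Year 7 pupils."
        )},
    ],
)

# The reply is newly generated text, not a record retrieved from a database.
print(response.choices[0].message.content)
```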

Why is it important for teachers to know about AI?

The development of AI systems has been exponential over the last few years. OpenAI released the first public, free and open-access generative AI, ChatGPT, in November 2022. Within two months, it had exceeded a hundred million users (Hu, 2023). By February 2023, while some universities sought to explore the potential opportunities of AI, 40% of UK universities had banned or were preparing to ban ChatGPT (Wood, 2023). Economists at Goldman Sachs warned of the impact of AI on the workforce, with the potential to replace over 300 million jobs (Hatzius et al., 2023). At the time of writing, AI has become the fastest technology in history to reach widespread societal usage across the globe (Harrison, 2023). Previous technologies spread gradually, allowing knowledge of their use, capabilities, and limitations to develop over time; the rapid spread of AI has not allowed this. The ethics of using AI, its efficacy, and pedagogies for learning about and learning with AI remain nascent.

Despite this, AI has already had, and continues to have, considerable societal and educational impact. The UK Department for Education (DfE) advised schools that ‘[t]he education sector needs to: prepare pupils for changing workplaces; teach pupils how to use emerging technologies, such as generative AI, safely and appropriately’ (DfE, 2023). As more workplaces use AI to automate tasks or augment human capabilities, it becomes increasingly important that pupils learn how to use AI and understand the ethics of doing so: to act as responsible citizens, to make critical judgements about AI-generated content, to develop the AI literacy they may need in the workplace, and to be aware of issues surrounding AI-generated content such as deepfakes in the media (images, video, or audio depicting real or non-existent people with the intention of deceiving others).

In addition, a substantial number of pupils in the UK have begun to use AI for educational purposes. Recent estimates of the proportion of UK secondary school children using AI for education vary from 14% (DfE, 2024) to 25% (Bisoondath, 2024) and even 42% (RM, 2023). While there is considerable variation in these results, this may be explicable by the wording of the survey questions, as some of the surveys asked about specific AI systems or specific contexts of use. Nevertheless, the surveys do suggest that a significant number of school children are using AI. From my own observations, in many of my classes I usually have at least two pupils using, or trying to use, AI on tasks. As more devices are built with integrated AI software, such as Microsoft’s Co-Pilot and Apple’s Apple Intelligence, it will become easier for pupils to access AI, especially as many schools transition to greater use of technology and pupil devices. Many new Windows devices come with Co-Pilot installed, and using AI is becoming as accessible as an internet search.

What are the potential disadvantages of using AI?

Many UK pupils have received little or no guidance from schools on how to use AI safely and responsibly. Consequently, many pupils are unaware of how to use AI to enhance their learning and may use it in ways that are detrimental. If pupils use AI to complete a task for them, they are not accessing the learning intended by that task. While their written communication may appear to be of a high standard, if pupils frequently use AI, their performance on tasks carried out without access to AI could decline if they are not otherwise developing the required skills (Brynjolfsson, 2023; Bartoletti, 2023). This risk of technology hindering learning is not new to AI and has been researched with other digital technologies (Mueller & Oppenheimer, 2014; Zierer, 2019).

Using AI to provide answer scaffolding or support could lead to reliance on it if pupils do not use the AI to develop their skills and the confidence to apply those skills independently. This is akin to giving pupils a scaffold every time but never reducing it and never challenging them to progress to a point where they no longer need it. A recent survey found that many pupils were concerned about their performance in exams when they did not have access to AI (RM, 2023). This does not necessarily mean that pupils are copying and pasting AI-generated content to pass off as their own work. Doubtless some are; but other pupils may be using it because they recognise it can help them learn, yet are unaware of how to use AI effectively so as to reduce their reliance on it. There is a risk that if AI is used merely to replace or automate existing tasks, it will decrease human agency and hinder learning (Bartoletti, 2023; Mueller & Oppenheimer, 2014; Zierer, 2019).

Puentedura’s Substitution, Augmentation, Modification, and Redefinition (SAMR) model sets out four ways in which people can use technology (Puentedura, 2014). Puentedura argues that if technology is merely used to substitute for an existing tool (substitution) or to augment it (substitution with some functional improvement), it does not necessarily lead to any learning gains and may hinder learning, as pupils do not sufficiently develop other skills (Puentedura, 2014; Zierer, 2019). By contrast, using new technologies to transform tasks and what is possible (modification and redefinition) can promote higher-order thinking and provide tasks that are high in challenge but low in threat to pupils (Zierer, 2019). Using new technologies to substitute for or redefine existing capabilities may also mean that users develop different skills: the rise of navigation systems, for example, has led to a decline in map-reading skills. This could potentially exacerbate existing social inequalities if pupils’ attainment suffers through overreliance on the assistance of AI (Brynjolfsson, 2023). This area is not well researched at present, but teachers need to be mindful of which skills their pupils should develop and whether any technology use potentially hinders that development. AI is a technology and, like any other technology, pupils need to learn the skills for using it and an understanding of how to use it responsibly (Shiohira & Holmes, 2023).
While schools devote considerable time to teaching pupils to use other technologies, at present some educators (unintentionally) expect pupils to understand how to use AI, and the ethics of doing so, without providing the same degree of clear guidance.

AI also poses risks for academic integrity. It can be incredibly difficult to detect AI-generated work, as AI can complete most tasks with varying degrees of success. While anti-cheating software such as Turnitin claims to be able to detect AI, it relies on AI systems to do so and cannot detect it reliably, often flagging AI-generated content as human-generated and vice versa (Ahmad, 2024; Oxbridge Editing, 2023). At the time of writing, it is not possible to detect AI usage conclusively by technology alone (at least in primary and secondary education, teachers are likely to be familiar with their pupils’ style of work and prior attainment; there are also other ways to identify AI usage, as I explain in the next section). Moreover, it is impossible to prevent AI usage outside of school and increasingly difficult to prevent pupils from using it within schools. Strategies to prevent AI usage on tasks are also becoming less effective as AI systems become more powerful.

As AI is a learning technology which generates new content, rather than providing a series of pre-existing results as a search engine does, it also produces flawed results. Like a learner, AI can and does make mistakes. It can invent new (factually incorrect) information, a phenomenon known as ‘hallucination’. In addition, AI may not be able to evaluate conflicting sources of data appropriately (Shiohira & Holmes, 2023). Consequently, AI can replicate and exaggerate biases existing in its data set, or generate content on the basis of factually incorrect or completely irrelevant data. On one occasion, my attempt to ask ChatGPT for a model answer to a GCSE Classical Civilisation question on the Roman City Life unit resulted in detailed (but incorrect) instructions for carrying out a completely irrelevant Chemistry experiment. On another occasion, an AI pretending to be an ancient Egyptian provided a history of global carrot consumption when asked about its favourite food. While comical, this can also be confusing and frustrating for teachers and pupils.

Few AI tools are transparent about the origins of their data or cite their sources unless prompted. Users need to fact-check the information provided by AI and be mindful of the potential for bias. Many LLMs are trained on data created before 2021 (Maslej et al., 2024), so these systems do not have access to more recent information. While this may not seem like a significant problem in Classics, it can affect how users interact with AI: if users ask the AI to cite its data or to recommend websites, the links provided may be outdated and no longer work, or may never have existed.

AI systems also raise data protection concerns. Once a user provides an AI with information, the AI will learn from it and draw upon this learning to answer other users (Moares & Previtali, 2024). Many AI systems require users to input personal information to create an account, or ask for access to additional information on a device or in digital accounts, such as contacts or Google Drive files. The issue is that these AIs do not always make their third-party affiliations and data storage locations clear, or explain how they will use these data (i.e. whether they will share them with third parties) (O’Rorke & Talbot Rice, 2024). The difficulty of finding this information means that users may unknowingly put their personal information at risk. While deleting an AI chat or account may give the impression of deleting the data, this is not necessarily the case. In addition, users need to take care not to input personal data or copyrighted material (whether published works or pupils’ work) so as to avoid breaching copyright or data protection legislation such as the UK GDPR. A further issue with copyright is that AI can unintentionally infringe it by creating content that closely resembles existing work.

What are the benefits of using AI?

Despite these limitations, AI provides considerable opportunities for education, such as easing teacher workload. AI is good at completing tasks that are useful but time-consuming, such as creating resources, drafting template letters, or producing adaptive resources. AI can also assist teachers with planning, lesson preparation, and marking. AI tools can take a bullet-point list of feedback for a report or a UCAS statement (see footnote 2) and turn it into prose. These capabilities are useful for teachers with limited time or those who are less experienced, such as early career teachers, and allow teachers to be more efficient as they face increasing demands on their time. AI also facilitates adapting existing resources, such as set texts, to the needs of pupils, enabling teachers to provide greater support than they might otherwise be able to.

At the same time, AI can support pupils as a learning aid. It can guide learners on how to plan tasks and provide non-judgemental answers to questions that pupils may be nervous about asking their teacher or their peers. Surprisingly, some AI tools can also be supportive and encouraging when pupils make mistakes or lack confidence. I have observed interactions where, when pupils told the AI they did not understand what to do, it provided guidance to help them through the task and build their confidence. Another method is to use AI as an alternative to a search engine. While pupils risk finding incorrect information, I have seen pupils use AI to considerable effect when they ask it to recommend websites: AI can act as a filter, providing pupils with a smaller selection of relevant websites from which to start their research and reducing cognitive load. Pupils can also ask AI to provide them with model answers, sentence starters, or examples of persuasive language that they can use to improve their communication. I have observed pupils take agency in their learning by using AI to find ways of improving their work, for example by asking for suggestions on structure, for examples of persuasive language, or for how they could demonstrate one of the success criteria. These pupils showed awareness of the qualities of a good answer but were unsure how to apply them, and the AI supported them by providing a range of examples. For pupils who have English as an additional language, AI can help them access content readily by acting as a translator or by providing glosses or key vocabulary to learn.

Part 2: How to use AI

This next section provides suggestions and ideas for how teachers can use AI and prompt AI tools more effectively. These are strategies which we have used and found helpful with our pupils. It starts with guidance on how to prompt AI more effectively, followed by strategies for teachers to use AI, and considerations and ideas for using AI in the classroom. It then provides a series of scenarios and how AI may be used in each.

Suggestions for prompting AI

While AI can save teachers’ time, it can also be frustrating when it does not produce effective results. The following section gives advice on how to prompt AI more effectively to achieve a greater chance of success. There are several different strategies for prompting AI to produce more effective results. One acronym which I have found helpful is PREPARE (Fitzpatrick et al., 2023):

While this sounds quite long, the full framework does not need to be used for every prompt; using only a few of these elements may produce a more effective prompt. Equally, for model answers or exam questions, the same scaffold can be reused (a brief sketch of a reusable scaffold follows the examples below).

  • Imagine you are teaching a Year 7 class of mixed prior attainment. They are learning about Ancient Egypt. Create a glossary of keywords in alphabetical order providing short definitions for each word. Please ask further questions to help make the glossary better as bullet points.

  • Pretend to be a woman living in Egypt in 1500 BC. Answer questions using language appropriate for the reading age of Year 7s and accessible to pupils who have only recently started learning about ancient Egypt.
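As noted above, the same scaffold can be reused with only the details changed. The sketch below is a hypothetical illustration (not part of the original session) of one way to store a PREPARE-style scaffold once in Python and fill in the role, year group, topic, and task each time; the field names are purely illustrative.

```python
# Hypothetical sketch of a reusable prompt scaffold. The field names are
# illustrative; adapt them to whichever elements of PREPARE you find useful.

def build_prompt(role: str, year_group: str, topic: str, task: str, extras: str = "") -> str:
    """Assemble a classroom prompt from reusable parts."""
    parts = [
        f"Imagine you are {role}.",
        f"You are working with a Year {year_group} class who are learning about {topic}.",
        task,
        extras,
        "Please ask further questions, as bullet points, to help improve the result.",
    ]
    return " ".join(part for part in parts if part)

# The same scaffold reused for two different resources:
print(build_prompt(
    role="a teacher of a class of mixed prior attainment",
    year_group="7",
    topic="Ancient Egypt",
    task="Create a glossary of keywords in alphabetical order with short definitions for each word.",
))
print(build_prompt(
    role="a teacher preparing revision materials",
    year_group="10",
    topic="Roman City Life (OCR GCSE Classical Civilisation)",
    task="Create ten short comprehension questions with answers.",
))
```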

Planning

AI can assist teachers in planning schemes of work and lessons, helping to avoid reinventing the wheel and supporting teachers who are teaching topics in which they lack subject knowledge or confidence. While a teacher could simply ask AI to create an entire scheme of work and lesson plans and follow these, it is more effective to use AI as a knowledgeable planning partner and resource finder. A strength of Perplexity AI is finding the materials most relevant to the prompt and presenting links to these as citations.

If a teacher prompts Perplexity AI to imagine itself as a teacher of Year X creating a scheme of work for Topic Y in Z lessons, it will give suggestions based on existing schemes and resources publicly available on the internet, along with links to these materials. This displays on a single page what already exists, which teachers may wish to use to avoid reinventing the wheel. Equally, AI can suggest topics and activities that provoke the kind of conversation and reflection on planning decisions that would normally happen in conversation with a colleague; since many Classics teachers are the only specialist in their workplace, the AI can also highlight potential issues that a teacher may not have considered. This can be particularly beneficial for PGCE students or early career teachers who may find planning lessons more challenging. The following is a prompt to an AI to act as a lesson-planning coach for a PGCE student:

‘You are a friendly and helpful instructional coach helping teachers plan a lesson.

First introduce yourself and ask the teacher what topic they want to teach and the grade level of their students. Wait for the teacher to respond. Do not move on until the teacher responds.

Next, ask the teacher if students have existing knowledge about the topic or if this is an entirely new topic. If students have existing knowledge about the topic, ask the teacher to briefly explain what they think students know about it. Wait for the teacher to respond. Do not respond for the teacher.

Then ask the teacher what their learning goal is for the lesson – that is, what they would like students to understand or be able to do after the lesson. Wait for a response.

Given all of this information, create a customised lesson plan that includes a variety of teaching techniques and modalities including direct instruction, checking for understanding (including gathering evidence of understanding from a wide sampling of students), discussion, an engaging in-class activity, and an assignment. Explain why you are specifically choosing each.

Ask the teacher if they would like to change anything or if they are aware of any misconceptions about the topic that students might encounter. Wait for a response.

If the teacher wants to change anything or if they list any misconceptions, make the change or adapt the lesson to deal with the misconceptions.

Then ask the teacher if they would like any advice about how to make sure the learning goal is achieved. Wait for a response.

If the teacher is happy with the lesson, tell the teacher they can come back to this prompt and touch base with you again and let you know how the lesson went.’

In this scenario, this AI will prompt the teacher through different considerations when planning the lesson, scaffolding the process. Unlike a planning sheet, teachers can ask the AI questions throughout the process. Figure 1 shows an example of this interaction in a lesson introducing the imperfect tense to Year 7 pupils. It also builds in the opportunity for reflection if the teacher touches base with the AI about how the lesson went. In this scenario, the process of planning is collaborative.

Figure 1. AI lesson planning coach for a student teacher, with responses.

Creating or adapting resources

Creating quality resources can be time-consuming for teachers. However, AI is effective at locating pre-existing resources on the internet, as discussed in the previous section, or at generating new resources suited to a topic. The following prompt starters can be used to create prompts for a variety of different tasks (a short sketch showing how these starters can be filled in and reused follows the list).

  • Give an overview of…

  • Produce a timeline of…

  • Produce a glossary of keywords with short definitions appropriate for pupils in [insert level] about [insert topic]. Organise the list alphabetically.

  • Organise these words [insert list of keywords from scheme of work] alphabetically. Write a short definition of each word in language appropriate for [insert level].

  • Create a missing word activity…

  • Create [insert number] of comprehension/ language questions (in the style of [insert assessment specification and paper if using]) based on the following passage…

  • Create a table of definitions…

  • Create a card-sort of reasons why [insert topic]. Include information for the following factors…

  • Create an acronym…

  • Make flashcards…

  • Create [insert number] of character cards using known historical figures from [insert period] to help pupils explore [insert topic].

  • Give me good connective words for an essay on…

  • Create a writing scaffold/ activity template with sentence starters for [insert task e.g. diary entry] about [insert topic and give further guidance about extent of support or what pupils should focus on].

  • Summarise the following text in bullet points and language appropriate for [insert level]…

  • Rewrite the following text in language appropriate for [insert level]…/ Chunk the following text into short paragraphs…/ Provide a glossary list of key vocabulary to help pupils understand the passage …/ Provide questions to check pupils’ understanding of what they are reading.

  • Create an image of [insert lines of a set text/ key details from these lines].
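Because several of these starters share the same placeholders, they can be filled in and run in bulk. The sketch below is a hypothetical example (not from the original article) that fills two of the starters for a chosen topic and level and saves each AI response to a text file; it assumes the same openai client library and example model name as the earlier sketch, and the output should always be checked for accuracy.

```python
# Hypothetical sketch: filling in prompt starters and saving the AI's output.
# Assumes the 'openai' client library and an API key in OPENAI_API_KEY;
# the model name is an example only. Always check the output for accuracy.
from pathlib import Path
from openai import OpenAI

client = OpenAI()

level = "Year 7"
topic = "the Roman villa"

starters = {
    "glossary": (
        f"Produce a glossary of keywords with short definitions appropriate for pupils "
        f"in {level} about {topic}. Organise the list alphabetically."
    ),
    "timeline": (
        f"Produce a timeline of key developments relating to {topic}, "
        f"in language appropriate for {level}."
    ),
}

for name, prompt in starters.items():
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # example model name
        messages=[{"role": "user", "content": prompt}],
    )
    # Save each generated resource to its own text file for checking and editing.
    Path(f"{name}_{level.replace(' ', '').lower()}.txt").write_text(
        response.choices[0].message.content, encoding="utf-8"
    )
```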

Overview of useful AI tools for teachers

Different AI tools have advantages and weaknesses for resource creation. Here are several AI tools which we have found helpful and a brief explanation of each.

ChatGPT

The basic tool in ChatGPT is a generative AI chatbot that can be prompted to complete a wide variety of text-based tasks; additional tools allow image generation. ChatGPT does not cite sources unless prompted, and the more specific and explicit the prompts, generally the more accurate the results. If using specific mark schemes or specific passages, copy and paste these into the prompt. (Account required. Free and paid versions. The free version allows access to almost all tools, but there may be a daily limit, e.g. creating two images per day).

Co-Pilot

Functions similarly to the chat bot of ChatGPT but inbuilt on more recent Microsoft devices. (Microsoft account required. Free).

Deep AI

Generates images. (No account required. Free).

Diffit

Good for creating resources (e.g. images, text, bullet point summaries, key vocabulary definitions, multiple choice questions, short questions, open-ended questions, and activity templates at specific reading levels) but is based on the USA school system so be careful with differences between year group names and curricula. (Account required. Free and paid versions).

Google Notebook LM

Creates study guides, practice questions, glossaries and podcasts from inputted content. Useful for revision and for additional resources for pupils to use to review topics, though it works better if the specification is included. If the inputted content is sufficiently clear and detailed, it is generally accurate. (Google account required. Free).

MagicSchool AI

A range of different tools for creating and adapting resources. Teachers can specify information or upload documents, e.g. revision guide. Good for creating information texts, questions, multiple choice quizzes, bullet point summaries, and rewriting texts. Similar to Diffit, it is based on the USA school system. (Account required. Free and paid versions. All features are currently accessible on the free version, but there are limits to how many exportable formats are available).

OpenArt.ai

Creates images. The free version has limits on the number of images. (Account required. Free and paid versions).

PerplexityAI

One of the few AIs to automatically cite sources. A chatbot which can be prompted to create text-based resources and (if given clear instructions and specifications) is able to create accurate resources and exam-style questions on the basis of Latin and Ancient Greek passages without requiring a translation. If the specification or past papers are publicly available, it does not require mark schemes or specifications to be uploaded. (Account not required but needed to save chats. Free).

Questionwell

Creates multiple-choice quizzes on the basis of a specific topic or uploaded information (free version has limit of 1000 characters). Quizzes can be exported to most quiz platforms, e.g. Google forms, Quizlet, and Kahoot. (Account required. Free and paid versions).

SchoolAI

Although other AIs offer more features for resource creation, SchoolAI allows teachers to create spaces that pupils can join, without needing an account, to interact with AIs pretending to be historical characters or acting in particular roles, e.g. a research assistant. SchoolAI does not pass any information from spaces to third parties (though it may use the chats to improve the AI in future) or save any details from the pupils’ temporary accounts. Teachers can monitor all pupil interactions from their account. (Account required for teachers. Free).

Teachmate AI

A wide range of different tools for creating and adapting resources. Good for creating adaptive resources, report writing, writing letters, finding sources, creating resources on the basis of YouTube videos, worksheets, timelines, and vocabulary mats. (Account required and subscription only. Useful if your school has a subscription).

Twee.com

Creates activities on the basis of videos. The free version has monthly limits and a limit of 5 minutes per video. (Account required. Free and paid versions).

Some educational software packages, such as Quizlet, Quizizz, and Seneca Learning (see footnote 3), have recently added AI features. At present, these are of variable quality for Classics; while you can input data for them to use, the accuracy and level of the questions can be poor. However, they may become more effective as the AI tools develop and learn, and as more teachers use them for Classical topics.

Model answers

AI can also generate model answers to exam questions (as well as exam-style questions and mark schemes). Both ChatGPT and Perplexity are capable of creating model answers, although, in my experience, Perplexity tends to work best if prompted using PREPARE with very specific roles (often as a ‘pupil’ who did not do well on the question previously but who now wants to listen to any guidance carefully to do really well) and instructions. Here are three different scaffolds which we have used to create model answers using AI.

With Perplexity:

You are a Year 10 pupil who has been studying OCR GCSE Classical Civilisation Myth and Religion topic Roman City Life. You have been learning about the Roman insula and have been set the 8-mark exam question ‘Living in a domus was preferable to living in an insula. How far do you agree? [8]’ Source A is a photograph of the peristyle garden of the House of Menander in Pompeii. Source B is the following quote from Plutarch’s Life of Crassus: “He would buy houses that were on fire, and houses which were next to those on fire, and the owners would let them go at a minimal price owing to their fear and uncertainty. In this way most of Rome came into his possession.” You really want to do well on this question because you struggled with 8-mark questions in your recent mock exam. Your teacher has advised you to write two paragraphs using evidence from a source and your own factual knowledge in each paragraph. You disagree with the question because you think most Romans actually lived in an insula. However, you also want to show that you know why the wealthiest Romans preferred living in a domus and the advantages and disadvantages of insula life compared to life in the domus. You look at the mark scheme as you are writing because you want to achieve as many of the 8 marks as you can. After writing the answer, rate the answer on how many marks you think it is likely to achieve. Then ask any questions in bullet points which are needed to improve your answer.

While this is a very long prompt, most of it is a scaffold that can be reused, with the particular topics, sources, and question updated. If I used this prompt with ChatGPT, I would need to insert a mark scheme and potentially points made in the specification that ought to be included.

With ChatGPT:

Write a model answer to the question [insert question].

Candidates might show knowledge and understanding of [copy and paste a mark scheme here].

Candidates may demonstrate evaluation and analysis through the use of some of the following arguments [copy and paste an example mark scheme here].

The model answer you write should achieve full marks by meeting the following criteria [copy and paste criteria here].

If you are doing this for a set text and it has not translated the quotes, try this:

You now need to integrate a translation of the Latin quotes next to each quote.

Alternatively, it is also possible to construct the prompt in stages. For example,

You are a helpful and supportive coach helping a teacher write model answers. First, introduce yourself and ask the teacher the exam board, qualification, and year group of the pupils. Wait for the teacher to respond.

Next ask the teacher what the question is. Wait for the teacher to respond.

Next ask the teacher what mark scheme they would like you to use. Wait for the teacher to respond.

Next provide a model answer to the teacher using all the above information. Ask the teacher if they would like to make any adjustments.
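Staging a prompt like this relies on the AI remembering earlier turns of the conversation. In a chat interface this happens automatically; the hypothetical sketch below shows the same idea in code, where the running message history is what lets the model ‘wait’ for and build on each of the teacher’s answers. It again assumes the openai client library, an API key in an environment variable, and an example model name.

```python
# Hypothetical sketch: a staged, multi-turn exchange in which the message
# history carries the teacher's answers forward. Assumes the 'openai' client
# library and an API key in OPENAI_API_KEY; the model name is an example.
from openai import OpenAI

client = OpenAI()

messages = [{
    "role": "system",
    "content": (
        "You are a helpful and supportive coach helping a teacher write model answers. "
        "First ask for the exam board, qualification, and year group. Then ask for the "
        "question. Then ask which mark scheme to use. Finally, write a model answer using "
        "everything the teacher has told you and ask whether they would like any adjustments."
    ),
}]

# Each stage appends the model's question and the teacher's typed reply,
# so later turns are generated with the full conversation in view.
for _ in range(3):  # stages: exam board and year group, question, mark scheme
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # example model name
        messages=messages,
    )
    question = response.choices[0].message.content
    print(question)
    messages.append({"role": "assistant", "content": question})
    messages.append({"role": "user", "content": input("Teacher: ")})

# With all three answers in the history, the next turn produces the model answer.
final = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(final.choices[0].message.content)
```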

If you are not satisfied with the answer, then refine the prompt and give the AI clear guidance on what it has done well and should keep and what you want it to correct, e.g. ‘This is a strong answer with a clear argument, but paragraph 2 contains some factual inaccuracies: Penelope did not pass secret notes to the suitors. Please correct this.’ Alternatively, pupils can be given answers containing errors to assess and improve.

Another situation might be that I am teaching a set text and would love an illustration of a few lines, but there is none online. It can be helpful to provide visuals to help pupils comprehend texts, particularly when the language or ideas are difficult to understand; however, suitable illustrations are not always available. AI image generators such as Create provide an opportunity to produce tailored images. The teacher could create the images, or it could be set as a task for pupils with the challenge of creating the most accurate image. This encourages pupils to consider the key details in the passage, and also develops their skills in prompting the AI and evaluating its creation rather than simply accepting the first response. Alternatively, pupils could generate visuals for projects, such as mosaics and wall paintings as part of a project on Roman houses, or visualisations of a mythology-inspired theme park as part of a combined revision lesson and end-of-topic project for a Year 9 class with end-of-year assessments in their chosen GCSE subjects. The latter led to some pupils refining their prompts to create images with more details drawn from the myths used as inspiration.

Pupils using AI in the classroom

As discussed in more detail in the first section, there is currently no software capable of reliably detecting AI usage (including software marketed as having AI-detection capabilities). However, except in higher education, where educators are less likely to be familiar with their students’ style of writing, many of the strategies which teachers use to identify copy-and-paste plagiarism still apply to AI: unusual formatting, footnote numbers, or vocabulary and syntax that are unusual for that pupil. In many AI tools, if the prompts are not sophisticated, the AI will often write with sophisticated vocabulary but very little substantive content. While these signs are not conclusive proof that a pupil has used AI, subsequently asking pupils to explain words or ideas can indicate whether they completed the task themselves.

AI is similar to any technology used in schools: pupils have to learn how to use it effectively and responsibly. Banning AI does not allow pupils the opportunity to be educated about the opportunities and limitations of AI. Rather, it is more effective to have open and honest conversations about AI and set boundaries for how pupils should use it.

The following are a series of conversation starters which I have found helpful when I suspect pupils may have used AI or pupils have asked to use AI for a task:

  • Explain how AI works: AI creates this by using information on the internet. There are good sources and bad sources of information and AI cannot distinguish very well between these and can get things wrong. You need to double check information.

  • Compare AI to pupils: AI is learning like you are. It can make mistakes too.

  • Do you know what that word means? Is copying this helping you understand and progress?

  • Tell me how you are going to use this to make your own work and show your knowledge/ understanding?

  • Why do you want to use AI? Copy and paste the prompt you use and what the AI created to your assignment. Add any changes you needed to make to the prompt.

It may also be worthwhile giving pupils the option to use AI on homework tasks with clear guidance that will encourage pupils to reflect on skill/knowledge learning. This does not need to be an entirely different task but a variation on a pre-existing task.

For example,

  • Either find images of three Greek gods or create AI-generated images of three Greek gods (not Aphrodite). Explain how far each image uses Greek iconography (What does it use? What is missing?).

  • Either create a set of instructions for building a Greek temple or create an AI-generated image of a Greek temple. Identify three ways this is a typical Greek temple and three ways to improve the prompt to generate a better image of a Greek temple.

  • Either translate the passage or use AI to translate the passage then mark and correct the AI translation.

Using AI in the classroom

If teachers wish to have their pupils use AI, it is important to check the school’s policy on AI and the tool’s privacy policy, as some tools seek access to personal information and may share this with third parties. It is also necessary to check the minimum age requirement for the AI tool. Many AIs now have minimum age requirements to comply with legal requirements on data processing and governing-body advice on the use of AI. These tend to be either 13 (Canva), 13 with parent/guardian consent (ChatGPT, Perplexity), or 18 (Anthropic Claude, Co-Pilot, and ChatGPT and Perplexity without parent/guardian consent). SchoolAI does not currently have a minimum age but, unlike other AI tools, pupils do not have accounts, instead accessing a ‘Space’ created by the teacher’s account. Teachers are also able to set the Space parameters and monitor pupil engagement with the AI.

If you are using AI with exam classes, it may be worth familiarising yourself with the latest guidance from the exam board on using AI. The UK Joint Council for Qualifications has recently published and updated guidelines for using AI (JCQ, 2024). It is also important to give pupils clear and specific instructions: while some pupils may be experienced in using AI, they may not have any experience or skill with a specific AI tool. It is also helpful to model responsible AI usage with the pupils.

The following are a series of suggestions for how pupils could use AI in the lesson:

Interviewing a historical character (with a follow-up of assessing the accuracy of the responses). There are some AI systems designed specifically for this, but their free versions are very limited in capability; the free versions of MagicSchool AI, ChatGPT, SchoolAI, and PerplexityAI are more capable. These tools can be directed to use specific websites, copied-and-pasted text, or resources available on the internet to inform their answers (to improve accuracy), or told to respond in language appropriate for the reading level of a class.

Research assistant. Pupils use the AI to find relevant and reading-age-appropriate websites on the internet. Most AIs will usually provide three or four suggestions when prompted which gives pupils a starting point and encourages pupils to read the websites in more depth rather than focusing on the Google extract. Pupils might prompt the AI to ask for sources from specific perspectives or in language appropriate for their reading level with glossed vocabulary. Pupils can also ask the AI to chunk information from the website or explain unfamiliar vocabulary or ideas.

Classroom assistant. Pupils ask the AI questions about the lesson or homework, e.g. unfamiliar vocabulary, ideas in a set text or sources, how to approach questions, and how to break a larger task down into smaller stages. I have also seen pupils (unprompted) ask AI to provide examples of signposting, sentence starters, or persuasive language that they then used in their own writing. While this may be beneficial for pupils who are reluctant to ask the teacher or peers for assistance, it is important to support pupils in developing the skills to do this independently so that they do not become reliant on the AI.

Debate partner. This is a variation on a chalk-talk class discussion. Pupils prepare a debate and conduct it with the AI, prompting the AI to ask questions about their written ‘speech’ and pupils asking questions of the AI’s speech.

Generating visuals. This is useful for project work and reflecting on how to improve prompts to create more accurate visuals. Alternatively, pupils could compete to create the most accurate illustration of a few lines of set text, encouraging pupils to engage and reflect on the content of the set text.

Using AI as a ‘textbook’. In this activity, pupils might use AI to give them a short introduction to a topic, which they can expand on at the end of the lesson to show their learning. Alternatively, pupils could ask the AI to give a summary or biography of a certain length and interrogate the information for factual accuracy and perspective; pupils then try to improve the piece of writing, either immediately or after exploring the topic further. This activity is particularly suited to Classics and History because it mirrors the process of creating narratives from evidence and raises issues of bias and of what historians decide to include or omit from their narratives.

Create exam-style questions or answers. Ask the AI to create exam questions for an exam board then answer the questions or assess model answers and try to improve them. Prompting a top-tier model answer (especially for GCSE Classical Civilisation 8 and 15 markers) is difficult, and this task encourages pupils to reflect on the qualities of answers which achieve high marks.

AIs are continuously evolving, with new tools and capabilities becoming available. Equally, teachers and pupils are discovering and refining strategies for using AI, while researchers are developing a greater understanding of the impact of AI in education and of how AI literacy relates to existing pedagogies, so that we avoid reinventing the wheel. Much like the wider world, AI is in a state of change and, as teachers often encourage pupils to do, it is helpful to keep an open mind and explore new possibilities.

Footnotes

1 PGCE: Postgraduate Certificate in Education. See https://www.educ.cam.ac.uk/courses/pgce/

2 UCAS is the digital portal through which school students make applications for places in Higher Education. If doing this, do not use real names. One strategy is to use a word which you will not otherwise use in the report, for example ‘Koala’, in place of the pupil’s name, and then use Find and Replace All to change this word to the pupil’s name.

References

Ahmad, S. (2024). Can AI Detectors Be Wrong? A Comprehensive Analysis: Explore the Accuracy of AI-Detectors, Their Biases, and the Impact of False Positives on Users, Especially Non-Native English Speakers. [Blog] Phrasly.ai, 7 April 2024. Available at https://phrasly.ai/blog/can-ai-detectors-be-wrong/ (accessed 25 October 2024).
Bartoletti, I. (2023). AI in education: an opportunity riddled with challenges. In Holmes, W. and Porayska-Pomsta, K. (Eds.), The Ethics of Artificial Intelligence in Education Practices, pp. 74–90. London: Routledge.
Bisoondath, A. (2024). Artificially Intelligent: Children’s and Parents’ Views on Generative AI in Education. Internetmatters.org. Available at https://www.internetmatters.org/hub/research/generative-ai-in-education-report/#full-report (accessed 25 October 2024).
Brynjolfsson, E. (2023). AI and education: will the promise be fulfilled? In Araya, D. and Marber, P. (Eds.), Augmented Education in the Global Age: Artificial Intelligence and the Future of Learning and Work, pp. 103–116. London: Taylor and Francis Group.
Department for Education (2023). Generative Artificial Intelligence (AI) in Education. London: Department for Education. Available at https://www.gov.uk/government/publications/generative-artificial-intelligence-in-education/generative-artificial-intelligence-ai-in-education (accessed 25 October 2024).
Department for Education (2024). Generative AI in Education: Educator and Expert Views. London: Department for Education. Available at https://assets.publishing.service.gov.uk/media/65b8cd41b5cb6e000d8bb74e/DfE_GenAI_in_education_-_Educator_and_expert_views_report.pdf (accessed 25 October 2024).
Fitzpatrick, D., Fox, A. and Weinstein, B. (2023). The AI Classroom: The Ultimate Guide to Artificial Intelligence in Education (The Everything Edtech Series). Cincinnati: TeacherGoals Publishing, LLC.
Harrison, C. (2023). Foreword. In Araya, D. and Marber, P. (Eds.), Augmented Education in the Global Age: Artificial Intelligence and the Future of Learning and Work, pp. xxii. London: Taylor and Francis Group.
Hatzius, J., Briggs, J., Kodnani, D. and Pierdomenico, G. (2023). Global Economics Analyst: The Potentially Large Effects of Artificial Intelligence on Economic Growth. Goldman Sachs Economics Research, 26 March 2023. Available at https://www.gspublishing.com/content/research/en/reports/2023/03/27/d64e052b-0f6e-45d7-967b-d7be35fabd16.html (accessed 25 October 2024).
Hu, K. (2023). ChatGPT Sets Record for Fastest-Growing User Base – Analyst Note. Reuters, 2 February 2023. Available at https://www.reuters.com/technology/chatgpt-sets-record-fastest-growing-user-base-analyst-note-2023-02-01/ (accessed 25 October 2024).
Joint Council for Qualifications (JCQ) (2024). AI Use in Assessments: Protecting the Integrity of Qualifications. Available at https://www.jcq.org.uk/wp-content/uploads/2024/07/AI-Use-in-Assessments_Feb24_v6.pdf (accessed 25 October 2024).
Maslej, N., Fattorini, L., Perrault, R., Parli, V., Reuel, E., Brynjolfsson, E., Etchemendy, J., Ligett, K., Lyons, T., Manyika, J., Niebles, J.C., Shoham, Y., Wald, R. and Clark, J. (2024). The AI Index 2024 Annual Report. AI Index Steering Committee, Institute for Human-Centered AI. Available at https://aiindex.stanford.edu/wp-content/uploads/2024/05/HAI_AI-Index-Report-2024.pdf (accessed 25 October 2024).
Moares, H.F. and Previtali, M.B. (2024). Shaping the Future: A Dynamic Taxonomy for AI Privacy Risks. IAPP, 17 January 2024. Available at https://iapp.org/news/a/shaping-the-future-a-dynamic-taxonomy-for-ai-privacy-risks (accessed 25 October 2024).
Mueller, P.A. and Oppenheimer, D.M. (2014). The pen is mightier than the keyboard. Psychological Science, 25(6), pp. 1159–1168. https://doi.org/10.1177/0956797614524581
O’Rorke, O. and Talbot Rice, S. (2024). Artificial Intelligence – An Overview for Schools. Farrer & Co, 26 February 2024. Available at https://www.farrer.co.uk/news-and-insights/artificial-intelligence--an-overview-for-schools/ (accessed 25 October 2024).
Oxbridge Editing (2023). What Do We Know about the Reliability of AI Detectors? Oxbridge Editing, 24 September 2023. Available at https://www.oxbridgeediting.co.uk/blog/what-do-we-know-about-the-reliability-of-ai-detection-tools/ (accessed 25 October 2024).
Puentedura, R.C. (2014). SAMR and Bloom’s Taxonomy: Assembling the Puzzle. Common Sense Education. Available at https://www.commonsense.org/education/articles/samr-and-blooms-taxonomy-assembling-the-puzzle (accessed 25 October 2024).
RM (2023). A Staggering Two Thirds of Secondary School Students Use AI to Do Their Work. RM, 22 June 2023. Available at https://www.rm.com/news/2023/artificial-intelligence-in-education (accessed 25 October 2024).
Shiohira, K. and Holmes, W. (2023). Proceed with caution: the pitfalls and potential of AI and education. In Araya, D. and Marber, P. (Eds.), Augmented Education in the Global Age: Artificial Intelligence and the Future of Learning and Work, pp. 137–156. London: Taylor and Francis Group.
UNICEF (2021). National AI Strategies and Children: Reviewing the Landscape and Identifying Windows of Opportunity. Policy Brief. UNICEF. Available at https://www.unicef.org/innocenti/media/2516/file/UNICEF-Global-Insight-national-AI-strategy-review-policy-brief.pdf (accessed 25 October 2024).
Wood, P. (2023). Oxford and Cambridge Ban ChatGPT over Plagiarism Fears but Other Universities Choose to Embrace AI Bot. The i, 28 February 2023. Available at https://inews.co.uk/news/oxford-cambridge-ban-chatgpt-plagiarism-universities-2178391 (accessed 25 October 2024).
Zierer, K. (2019). Putting Learning before Technology! The Possibilities and Limits of Digitalization. London: Routledge. https://doi.org/10.4324/9780429453243