Use Cases for Generative AI in Education
User Research Report
August 2024

Contents

List of figures 4
List of tables 5
Executive Summary 6
    Education hackathons 6
    Hackathon Outcomes 7
    PoC Development 7
    User Research 8
Introduction 10
    Project Aims 10
    Project structure and phases 10
    Summary of PoC build 13
    Outline of the 3 strands of user research 16
Hackathon Findings 18
    Summary 18
    Summary of key use cases 22
User Research Findings 29
    Introduction 29
    Methods 30
    Results 31
Conclusions 44
    User research conclusions 44
    Suggested areas of future focus 45
    Further work in progress 48
    Recommendations for delivery of future hackathons 49
Annex 1: Summary of pre-Hackathon consultation findings 52
    Overview 52
    Aims 52
    Methods 53
    Findings 55
    Enablers and barriers to AI use 63
    Next steps 64
Annex 2: Full Hackathon findings report (November 2023) 65
    Introduction 65
    Potential barriers to Generative AI adoption 66
    Use cases for Teachers 68
    Use cases for School Leaders and Administrators 77
    Use case for Students 80
Annex 3: AI mini-hackathons project summary 83
    Background to the project 83
    The aims of the hackathon 83
    What did we do? 84
    What did we learn? 84
Annex 4: Membership of steering committee 88
Annex 5: Methods for user research 89
    Methods 89
    Sample and sampling strategy 89
    Materials and Procedure 90
Annex 5: Prompts used in the Hackathons 92
    Generating age-appropriate lesson plans 92
    Generating effective questioning materials 94
    Marking a KS2 essay based on a modified mark scheme and providing personalised feedback 96
    Providing feedback on students' work in a way meaningful to students 97
    Prompt used to generate newsletter articles, using standardised data fields (parent-carer communications) 99
    Prompt used to simplify text for students (lesson materials) 100
    Hallucination in creating end-of-year reports, with the hallucination highlighted in bold italics 102
    Writing end-of-year reports for other teachers based on key information 103
    Prompting the LLM to create a vocabulary list with simple prompting (language learning assistant) 105
    Prompting the LLM to correct a student's work by providing steps to follow and one-shot prompting 105
    Adjusting data literacy policies in line with school values 106
    Providing detailed context for adapting school policy 109
    Creating policy-based scenarios for testing with staff 110
    Generating newsletters based on short user inputs 112
    Mandarin language assistant 114
Annex 6: Key Terms 117
Annex 7: References 118

List of figures

Figure 1: Screenshot of the essay upload user interface 14
Figure 2: Screenshot of the teacher feedback user view 14
Figure 3: Screenshot of the student feedback user view 15
Figure 4: Screenshot of the student task user view 16
Figure 5: Opportunities and risks associated with using GenAI tools for assessment and feedback 32
Figure 6: Needs identified by teachers based on their perceived barriers and facilitators to GenAI adoption in schools 39
Figure 7: Breakdown of survey stakeholders by role 53
Figure 8: Venn diagram to show the overlap in teachers' use case choices for time saving and strengthening practice 56

List of tables

Table 1: 8 use cases selected as having potential to develop to PoC 12
Table 2: Participants in each strand of user research 17
Table 3: Use cases explored during the hackathons 19
Table 4: Summary of user feedback for generating age-appropriate lesson plans 22
Table 5: Summary of user feedback for generating effective questioning ideas 23
Table 6: Summary of user feedback for generating lesson materials 23
Table 7: Summary of user feedback for assigning marks to work submitted by students 24
Table 8: Summary of user feedback for generating personalised formative feedback and addressing student misconceptions 26
Table 9: Summary of user feedback for generating drafts of statutory policies 27
Table 10: Summary of user feedback for generating pupil or class data analysis and synthesis 27
Executive Summary

From September 2023 to March 2024, Faculty AI, the National Institute of Teaching (NIoT) and ImpactEd Group (representing the AI in Schools Initiative) have worked with the Department for Education (DfE) to deliver the Use Cases for Generative Artificial Intelligence in Education project. The project explored potential applications for Generative AI (GenAI) in the education sector as part of a wider effort to transform a teacher's day-to-day work, reducing workload and improving educational outcomes by automating routine tasks. The project team ran 4 work-streams at different stages of the project: pre-hackathon user engagement; delivery of the hackathons; build and testing of the PoC; and user research.

Roles on this project included:
• Faculty AI: project leadership; tool development and testing (the development team)
• National Institute of Teaching: user research; initial consultation to support selection of use cases (the research team)
• ImpactEd Group, AI in Schools Initiative: involvement of teachers, leaders and students in hackathons and user testing.

This report sets out the user research findings of the project and has been published alongside the Technical Report, which details the experimentation and technical development work conducted as part of the project. This is an independent report and is not intended to represent the policy position of the Department.

Education hackathons

In October 2023, 60+ participants came together to deliver the first ever hackathons in education to explore GenAI. This event provided a huge amount of insight into the needs and perspectives of teachers, school leaders and administrators as regards the use of GenAI in education. It also enabled data scientists and engineers from across education, government and the private sector to collaborate to make progress on solving some of the most difficult associated challenges. The impact of these events was clear, and the participants made significant progress with some of the use cases, meaning that they were able to use GenAI tools to help users complete tasks such as drafting content for school websites or creating lesson materials. They also generated important learnings applicable to the future development of tools using GenAI.

“The participants developed a very practical solution that nobody came here thinking about, but the fact you've got teachers, the fact you've got computer scientists, you've got researchers and experts all together, and they've actually created examples of it right here that could be utilised. It's quite astonishing.” Education Secretary, Gillian Keegan

“What the AI was able to do was to be fed examples of existing best practice policies to learn from, and then tailor to the school context, and that potentially shortens the time it takes school leadership to create policies, but also reviews, checks quality, checks for adherence to best practice or legislation. So, I think that's really exciting.” Tom Nixon, Head of Government Practice at Faculty

“ChatGPT initially wasn't producing the outputs that we were hoping for, so we're feeding it even more information. But what that really highlighted for us was the fact that we need these models to be trained in subject disciplines for each individual use case scenario for each individual academy. That could have potentially a great amount of impact on teacher planning time, adapting lessons for each individual student.” Jonathan O'Donnell, Computing Consultant at the Harris Federation

Hackathon Outcomes

Ahead of the hackathons, the project team co-designed a list of use cases for GenAI in education, engaging over 700 stakeholders from the education sector (involving teachers, leaders, administrators, teacher educators, and students), and aiming for a school-led approach in selecting the use cases most useful to schools. These use cases were reviewed and a shortlist of twelve use cases was created. One of the expected outcomes was the classification of the twelve use cases explored during the hackathons into three groups, namely: use cases which were ready to be used in schools; use cases which would first need to be developed into a PoC tool; and use cases which were beyond the models' capabilities, or too risky to deploy.

Although some of the twelve use cases explored in the hackathons had significant potential for impact, there were no cases in which the participants thought that the solution developed was ready for use in schools. There are ways in which GenAI can support teachers in some limited versions of some of the use cases identified, such as coming up with suggestions for lesson activities. However, even the most successful use cases had remaining challenges to be addressed before they could be widely used by teachers or solve the problem entirely, such as the need for access to additional data sources or integration with other tools, and the team then progressed to development of the PoC tool. It is important to note that the hackathons were restricted to the use of GenAI models (mostly GPT-3.5 and GPT-4) and did not explore wider EdTech tools, which may be able to successfully solve some of the use cases.

PoC Development

To support these objectives, the development team developed a PoC GenAI tool that marks students' work and provides personalised feedback and a revision activity. This tool was developed as an experiment and there was no intention that it would be rolled out in schools; instead it was designed to provide an opportunity to investigate a specific application of GenAI, the performance of the tool, and users' responses to it. As such, the learnings from this process, such as the importance of allowing teachers to customise outputs based on their professional judgement, the ability of LLMs to provide feedback, and the potential for GenAI tools to effectively assess model outputs, are expected to be broadly applicable to the development and implementation of any GenAI tool in this educational context.

The development team then spent several months developing a PoC tool focused around the areas of feedback and generating revision activities. This was developed to the point where a user can upload a piece of student work, have the errors automatically detected and linked to the national curriculum, generate feedback for the student and teacher, and generate a series of revision activities tailored to help the student address the errors in their work. Through experimentation with different models, architectures, prompts and reference data, the team drew a number of key insights which are applicable to the development of a broad range of GenAI tools in education: for example, the positive impact on performance of blending deterministic language modelling and GenAI-based approaches, and the importance of structuring reference documents such as the national curriculum to enable Large Language Models (LLMs) to interpret them. These insights are outlined in detail in the accompanying Technical Report.

User Research

Alongside this technical development, the research team completed a programme of user research to understand users' perspectives on the PoC tool and their views on the use of GenAI in education more generally. The aim of this research was to put teachers' views at the forefront of the tool development, and to gain their perspectives on the potential of tools in this area. Although teachers' perspectives were varied, the key themes that emerged included general positivity about the potential impact of GenAI, and many teachers were able to see the potential time-saving and standardising benefits of using GenAI tools for feedback. However, for many this was tempered with concern about the potential risks. For example, some teachers reported that they would not trust a PoC tool to provide feedback, and others were concerned that teachers may become over-reliant on the tool. Teachers also reported a need for time, training, funding, and expert help to increase their knowledge in the use of GenAI in their practice, as well as a need for guidance on how they should be using AI, and how to do so safely.

It is important to note, for both the user research and the technical experimentation work, that the insights and conclusions drawn reflect the current state of GenAI technology and users' adoption of it. This is a rapidly advancing area, and increasing interest in and adoption of AI tools for educational purposes in recent years has resulted in a fast-growing body of research in this area. However, such has been the speed of development that even recent large-scale reviews of the literature (e.g., Ng et al., 2023; Zhai et al., 2021; Zhang & Tur, 2023) are in danger of becoming out of date within a few years. This highlights the importance of up-to-date research, bringing teachers' and students' perspectives to the fore in this fast-changing landscape. For example, in the time since the hackathons, new EdTech tools have come to market that address (or partly address) some of the use cases explored, and a key part of the challenge for schools and educators in the adoption of GenAI is navigating this changing landscape.
Introduction

Project Aims

The central aims of this project were to:
• put teachers' voices at the heart of the future of AI in education,
• broaden the evidence base on the existing strengths and limitations of GenAI tools within educational contexts,
• understand teachers' requirements for AI to meet their needs and effectively enhance their role in improving students' learning,
• disseminate these learnings for the sector, and
• investigate the potential to improve the performance of generative AI models using education-specific datasets.

Project structure and phases

To meet the project objectives outlined above, the project team ran 4 work-streams at different stages of the project: pre-hackathon user engagement (all); delivery of the hackathons (all); build and testing of the PoC (Faculty AI and AI in Schools Initiative); and user research (NIoT).

Pre-hackathon user engagement

The first of these work-streams was the initial user consultation and development of a shortlist of use cases. We used a co-design approach with stakeholders from the education sector, where teachers, school leaders, administrators, teacher educators and students took part in a consultation via surveys and stakeholder group meetings. 710 school-based practitioners took part in a 10-minute online consultation survey, and an additional c.20 stakeholders (teachers, leaders, and secondary school and sixth form students) took part in one of 4 online stakeholder group sessions to discuss the potential uses of AI in schools. This research is summarised in more detail in section 3 of this report, and the full report into the consultation work is provided in Annex 1.

The findings of the consultation, alongside the DfE's Call for Evidence on Generative AI in Education, were then used to develop a list of use cases, or applications of generative AI in education, with the further input of a team of teacher educators with digital expertise. To narrow this to a shortlist of use cases for testing in the hackathons, this larger group was assessed and prioritised based on iterative feedback from DfE as well as consideration of whether:
• LLMs could be expected to perform the given task in line with users' needs,
• the use of LLMs for a given purpose reflected users' priorities and views on the potential efficacy of workload reduction and outcome improvement, and
• there were any safety considerations, or cultural or organisational blockers, that would make using an LLM tool for a given purpose difficult.

The proposed shortlist was reviewed by the project's Steering Group, and 12 use cases, representing potential GenAI applications for teachers, school leaders, administrators and students, were selected for testing in the hackathons. Membership of the project's Steering Group is listed in Annex 3.

Delivery of the hackathons

On 30 and 31 October 2023, the team held the hackathons, bringing together potential users, education policy experts, and data scientists. Participants tested different approaches to:
• tasks such as 'plan a French lesson for Year 9 students on the past tense' for a selection of use cases,
• further engineering their prompts to improve the GPT models' outputs, and
• assessing whether the final outputs produced would be usable in a school environment.

The 60+ participants demonstrated clear enthusiasm for the application of GenAI in their own contexts, and the events provided significant insight into their needs and perspectives. The Education Secretary and the Minister for the School System and Student Finance also hosted a roundtable discussion with teachers, school leaders and education policy experts. In two days of events, participants made significant progress with some of the use cases, as well as finding important learnings applicable to the future development of tools using GenAI. The process and findings of the hackathons are outlined in more detail in section 3 of this report.
Of the 12 use cases explored in the hackathons, 8 were selected as having potential for development to PoC. These 8 use cases were assessed against selected criteria: innovation, learning potential, practicality, feasibility, novelty and strength of evidence.

Table 1: 8 use cases selected as having potential to develop to PoC

Use Case | Description
Lesson plan or activity adaptor | Adapt existing lesson plans to the context required and tailor lesson activities to specific classes.
Feedback and revision activity generator | Review student work and provide them with both feedback and a personalised activity to develop and consolidate their learning.
Question generator | Generate graded, lesson-plan-aligned questions automatically, based on information from sources such as lesson plans, objectives, etc.
SEND support tool | A tool able to support teachers to adapt lesson content to meet the needs of students with Special Educational Needs and Disabilities (SEND).
Essay marker | A tool able to support the marking of English essays in bulk and provide insights for teachers to better understand how performance varies across a class.
Lesson activity generator | A tool able to generate a variety of lesson materials including differentiated activities, quizzes and scripts for a lesson.
Parent and carer communications tool | A tool able to generate communications with parents and carers, for example school newsletters or emails about upcoming school events.
Policy document generator | A tool able to support the generation of school policy based on submitted characteristics about a school, any existing policy documents, and national legislation or guidance.
A candidate for PoC was then selected based on this assessment, combining 2 of the proposed use cases: the essay marker, and the feedback and revision activity generator. The proposed PoC was a tool which reviews the student's work and provides them with both feedback and a personalised activity to develop and consolidate their learning. This was limited to Year 4 literacy work, comparing the student's errors against the national curriculum. The focus on Year 4 work was chosen because this year group is not close to any national assessments, and because their writing was expected to generally be accurate enough to be well suited to processing by an LLM, while being short enough to limit the required processing power where possible.

Although there were clear emerging challenges with GenAI marking, it was also clear during the hackathons that support with giving feedback to students is highly valued by teachers, and that a PoC tool that addressed this would be impactful. Other expected benefits of developing this PoC included:
• there is no existing generalisable tool that performs both the feedback and activity generation functions together based on the English curriculum,
• the feedback and activity generation functions have wider applications than these use cases, and
• the PoC produced would demonstrate a broader application of GenAI in education, with both teachers and students as potential users.

Following the selection of a use case for development to PoC, the project team progressed to the build and testing of the PoC, and the delivery of the user research.
Summary of PoC build

The PoC has been developed to test the potential of the technology to support educators and teachers and facilitate student learning and development. Specifically, the PoC was designed to test GenAI's ability to take a piece of Year 4 writing, assess it against the national curriculum, and produce personalised feedback and a revision activity based on the student's writing.

The PoC was designed to enable two main user journeys: one allows the user to upload their own pupil work, while the other uses a bank of pre-processed essays. These only differ at the start, and so this document describes only the user upload journey, which is more realistic to how an end user would interact with the tool if it were developed to deployment and integrated into their workflow. It is important to reiterate that the objective of this work was to explore the potential of GenAI in this context, rather than to develop a tool to Minimum Viable Product (MVP) or deployment.

The PoC has 4 key features:

1. Student Essay: when a user accesses the tool, they are presented with the essay upload page. This allows them to upload details of the specific task a pupil was set, such as a creative writing task where the pupil was asked to practise their skills in developing suspense in an everyday situation, and the pupil's piece of work. The user copies and pastes these into the relevant input fields and clicks the Submit essay button. The tool analyses the work and, based on guidance from the national curriculum and other materials, identifies areas for improvement. The user can then hover over any of the highlighted areas for more information, including the type of error, the correction, and the Year group associated with the error according to the national curriculum.

Figure 1: Screenshot of the essay upload user interface

2. Teacher Feedback: the user then navigates to the Teacher Feedback tab to view an assessment of the pupil's work intended for the teacher. This generated feedback gives a summary of how well the pupil did in relation to the task, as well as specific details on their spelling, punctuation and grammar based on the errors that the tool has detected.

Figure 2: Screenshot of the teacher feedback user view

3. Student Feedback: navigating to the Student Feedback tab, the user sees feedback intended for the pupil. The language here is encouraging and focuses on how the pupil can improve, rather than presenting a list of errors. The user can share this directly with the pupil or use it as the basis for their own feedback.

Figure 3: Screenshot of the student feedback user view

4. Task Generation: finally, the user can view a selection of formative worksheets or practice exercises for the student to complete, which the tool has generated based on the errors and feedback. Four varieties of worksheet are generated; the first focuses on the most important errors as defined by their order in the national curriculum, and the rest focus specifically on spelling, punctuation and grammar respectively. These could be used directly as they are, or as a first draft for the user to refine.

Figure 4: Screenshot of the student task user view
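Purely as an illustration of the information these 4 features surface to users (detected errors linked to the national curriculum, two kinds of feedback, and four worksheet variants), the sketch below shows one possible way of representing a marked essay. The class and field names are hypothetical assumptions for this example and are not taken from the PoC, whose actual design is described in the Technical Report.

```python
# Hypothetical sketch only: one way the information described above could be structured.
# All names and fields are assumptions, not the PoC's implementation.
from dataclasses import dataclass, field
from enum import Enum


class ErrorType(Enum):
    SPELLING = "spelling"
    PUNCTUATION = "punctuation"
    GRAMMAR = "grammar"


@dataclass
class DetectedError:
    """One highlighted span in the pupil's work, shown to the user on hover."""
    span: str              # the flagged text
    error_type: ErrorType  # type of error
    correction: str        # suggested correction
    curriculum_year: int   # Year group the expectation is associated with


@dataclass
class MarkedEssay:
    task_description: str  # the task the pupil was set
    pupil_work: str        # the pasted piece of work
    errors: list[DetectedError] = field(default_factory=list)
    teacher_feedback: str = ""   # summary intended for the teacher
    student_feedback: str = ""   # encouraging feedback intended for the pupil
    worksheets: dict[str, str] = field(default_factory=dict)
    # e.g. keys: "priority", "spelling", "punctuation", "grammar"
```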
A detailed overview of the PoC tool is provided in section three of the technical report published alongside this report.

Outline of the 3 strands of user research

Alongside the development of the PoC tool, the project team conducted 3 strands of user research: superuser engagement (Faculty AI and AI in Schools Initiative); user testing of the PoC tool (Faculty AI and AI in Schools Initiative); and user research (NIoT). Each of these strands of user research was designed to contribute to the project's overall objectives (2.1, page 6).

Superuser engagement: Co-design of PoC

The superuser engagement involved building a cohort of users who are supportive proponents of using GenAI in educational contexts, with significant amounts of experience as classroom teachers and curriculum leaders. These users were drawn from the AI in Schools Initiative hackathon participants and the stakeholders from the consultation phase of this project. This cohort took part in a series of sessions to help the team to understand users' requirements and priorities when it came to the marking and feedback tool and, once the initial PoC had been developed, to give iterative feedback on the tool.

User testing: Teacher evaluation of PoC performance

The user testing strand involved asking 8 primary school teachers to rate the quality and accuracy of the outputs of the PoC tool for specific examples of student writing. Teachers were sent examples of feedback which the tool had generated and were asked to review and rate these outputs against a variety of criteria, including whether the feedback generated by the tool was accurate and met their general expectations for what feedback to students should look like, such as tone or length. Teachers also explored the tool's functionality by directly interacting with the tool. A detailed summary of the findings of the superuser engagement and the user testing is provided in section 4 of the technical report published alongside this report.

User research: Experiences and perspectives study

In contrast to the two strands above, which focused on teachers' and students' perspectives on the PoC tool and its outputs, the NIoT conducted user research exploring users' experiences and perceptions of AI in broader terms, with a particular focus on how these may vary depending on their school's context and their previous experiences of using AI. Teachers (N=12) and secondary school students (N=9) were recruited from different geographic regions of England and took part in a series of interviews and focus group discussions. This research used the PoC as a platform for discussion to explore the barriers and facilitators to users' adoption of AI for feedback in schools, and to understand teachers' perceptions of the opportunities AI for feedback would offer and the associated risks and challenges. A detailed summary of the findings of this user research is provided in section 4 of this report.

Table 2: Participants in each strand of user research

Group | Number of Participants | Roles
Superuser Engagement | 5 | Secondary school classroom teachers and AI and technology leads
User Testing of the PoC Tool | 8 | Primary school classroom teachers
User Research | 21 | Teachers and secondary school students
Hackathon Findings

Summary

Working with NIoT and DfE, Faculty explored potential applications for GenAI in the education sector as part of a wider effort to reduce teacher workloads and improve educational outcomes by automating routine tasks. As part of this exploratory work, Faculty held two days of Generative AI in Education Hackathons, inviting participants from across the education sector to test a range of education-related use cases for LLMs. During the hackathons, over 60 potential users, data scientists and education experts were brought together to test different approaches to education-based tasks. They engineered prompts to improve the GPT models' outputs and assessed whether the final outputs produced would be usable in a school environment. For example, one task involved planning a French lesson for Year 9 students on the past tense.

Ultimately, the purpose of the hackathons was to generate a set of findings to inform the identification of 3 groups of use cases:
• use cases for GPT models which are currently ready for schools (potentially with some associated guidance/instructions provided),
• use cases which could be good candidates for a PoC tool with some additional tooling/functionality, and
• use cases which are currently not possible for GPT models or too risky to take forward into schools.

Twelve GPT use cases were tested in the Generative AI in Education Hackathon and were then assessed and prioritised based on iterative feedback from key project stakeholders, as well as consideration of the following key elements:
• technical feasibility: whether LLMs were expected to be able to perform the given task in line with users' needs,
• expected impact: whether the use of LLMs for a given purpose reflected users' priorities and their views on potential efficacy in terms of workload reduction and outcome improvement, and
• risk assessment and feasibility: whether there were any safety considerations, or cultural or organisational blockers, that would make an LLM tool for a given purpose difficult to roll out.

For some of the most successful use cases, the participants rated the solutions developed highly in terms of their future potential for impact. However, there were no cases in which the participants thought that the solution developed was currently ready for use in schools. Even for the highest-scoring use cases, there were remaining challenges to be addressed, often related to the quality or consistency of outputs, safety or privacy concerns, and the need for access to additional data sources or integration with other tools, although it is possible that the desired results may have been achieved using different models or existing EdTech tools. As a result, the team prioritised developing a comprehensive PoC model that would provide valuable learnings on approaches to optimising a model for education, rather than focusing on creating guidance.
Table 3: Use cases explored during the hackathons

Category | Use Case | List of suggested tasks provided to hackathon participants
Lesson planning | Generating age-appropriate lesson plans | Plan a French lesson for Year 9 students covering the past tense. Plan a History lesson for Year 1 students comparing the lives of Elizabeth I and Queen Victoria. Plan a Maths lesson for Year 6 students covering Ratio and Proportion. Plan an English lesson for Year 13 students covering the structure of A Midsummer Night's Dream.
Lesson planning | Generating effective questioning ideas | Generate a group of questions across a range of difficulty which I could ask Year 10 students in a GCSE History lesson covering World War 2. Generate a set of questions that I can ask a group of Year 6 students to test whether they have understood the concept of fractions.
Lesson planning | Generating lesson materials | Create two short quizzes that I can use as part of a Year 10 Biology lesson on ecosystems. One quiz should be made of short-answer questions and the other should be made of long-answer questions. Generate a range of independent learning activities of varying difficulty for Year 8 students in a lesson about Lord of the Flies.
Assessment | Assigning marks to work submitted by students | Mark a collection of writing exercises completed by Year 4 students for spelling and grammar.
Assessment | Generating personalised formative feedback and addressing student misconceptions | In the form of short paragraphs, provide personalised feedback on completed Plants worksheets to Year 1 students. Analyse these pieces of Geography work submitted by Year 5 pupils and provide me with a list of common misconceptions. My Year 8 History class often confuse James I and Charles I; suggest strategies to help them remember the differences. Analyse these pieces of Maths work submitted by Year 11 pupils and provide personalised feedback on areas of strength and weakness in the form of short paragraphs. Review these mock A-Level Sociology essays submitted by Year 13 students and suggest specific improvements which would improve the quality of their work.
 | GenAI as a teaching aid | Generate a series of GCSE Physics questions for Year 9 students to practise. Ask each question one-by-one and provide feedback on correct and incorrect answers.
Report writing | Writing end-of-year reports | Generate an end-of-year report for Student X. Student X is high-achieving with no concerns about behaviour. They enjoy English and achieve good marks but struggle with Maths and dislike Science.
SEND | Generating tips for SEND intervention | Generate a list of recommendations for a teacher supporting a pupil in Reception with speech, language and communication needs, with reference to the latest SEND guidance and academic literature, to be reviewed by a SENCO.
Communication assistant | Generating parent-carer communications | Create a letter to parents reminding them that PE lessons are every Monday and that pupils need to remember to bring their kit to school. Create a reminder for parents that a nursery will be closed for a bank holiday. Create an article for the school newsletter about Year 6's recent trip to London Zoo.
Policy generation | Generating drafts of statutory policies | Create a first draft of an explanation of my school's uniform policy for publication on the school website, according to DfE guidance. Create a first draft of an explanation of my school's remote education policy for publication on the school website, according to DfE guidance.
Data analysis | Pupil or class data analysis/synthesis | Generate a report to be sent to Student X's form tutor about their behaviour over the past two terms. In the report, highlight any recurring patterns. Generate a report for a headteacher about Year 7 Maths formative assessment results over a school year.
Language learning | Language learning assistant | Can you help me revise for my GCSE German exam? I need you to test me on vocabulary to do with hobbies. Can we have a conversation in Spanish about travel and holidays in the style of an A-Level speaking exam? I can't remember how to conjugate the French verb avoir, can you explain to me how the conjugation works? Is this sentence right? Le weekend dernier, je suis aller au cafe et j'ai manger les frites. J'aime frites.
Summary of key use cases

Use cases for Teachers

Generating age-appropriate lesson plans

Teachers report spending significant time on the generation of lesson plans, as good planning and structuring are key to ensuring that students make progress and that lessons themselves are effectively delivered. During the hackathons, teachers explored using base GPT models to both plan age-appropriate lessons in Key Stage 3 (KS3) French or A level Computing, and adapt an existing lesson plan for Year 7 History to cater for different ability levels.

Results: The outputs from using base GPT models to generate lesson plans were not rated particularly positively by the teacher participants of the hackathons. They scored the outputs of their testing at an average of 3.7 out of 5 for time saving. However, meeting national standards, improving outcomes and likelihood of use were scored between 2 and 2.3 out of 5. In addition, usability was scored at 1.7 out of 3, emphasising the need for improvement in functionality or additional tooling in future. A PoC with additional functionality, allowing an LLM to refer to any necessary contextual data (which would be synthetic during any development work), could surmount the above drawbacks.

Table 4: Summary of user feedback for generating age-appropriate lesson plans

Use case | Usability | Time saved | Meeting national standards | Improving outcomes | Likely to use
Generating age-appropriate lesson plans | 1.7 | 3.7 | 2.3 | 2.0 | 2.3
Average across all use cases | 2.1 | 3.7 | 3.2 | 3.2 | 3.6

Generating effective questioning ideas

During the hackathons, teachers explored: using base GPT models as a diagnosis agent to understand misconceptions among students; a tool to develop effective questions for teachers to evaluate understanding; and a chatbot to gauge student understanding. Successful approaches included employing an inner monologue technique (requiring base GPT models to outline their reasoning for returned outputs) and prompting base GPT models with a combination of a topic-specific lesson and overarching learning objectives to ensure the questions generated were relevant.
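The prompts actually used on the day are reproduced in Annex 5. As a rough sketch only, a question-generation request combining an inner monologue instruction with a lesson topic and overarching learning objectives might look something like the following; the prompt wording, model name and objectives are illustrative assumptions, not the hackathon prompts.

```python
# Hypothetical sketch of inner-monologue prompting for question generation.
# Prompt wording, topic, objectives and model choice are assumptions for this example.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

LESSON_TOPIC = "Year 10 GCSE History: World War 2"
LEARNING_OBJECTIVES = [
    "Explain the causes of the outbreak of war in 1939",
    "Evaluate the significance of the Battle of Britain",
]

objectives_text = "\n".join(f"- {o}" for o in LEARNING_OBJECTIVES)
prompt = f"""You are helping a teacher prepare questions for: {LESSON_TOPIC}.
Learning objectives:
{objectives_text}

First, think step by step about common misconceptions pupils have on this topic
and write out your reasoning. Then produce 6 questions of increasing difficulty,
each labelled with the learning objective it targets."""

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```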
120、 use case highly in terms of time saving(4.7 out of 5)but moderately in terms of usability(2.3 out of 3),improving outcomes(3.7 out of 5)and likelihood of use(3.3 out 5).Table 5:Summary of user feedback for generating effective questioning ideas Use case Usability Time saved Meeting national standar
121、ds Improving outcomes Likely to use Generating effective questioning ideas 2.3 4.7 4.0 3.7 3.3 Average across all use cases 2.1 3.7 3.2 3.2 3.6 Generating lesson materials During the hackathons,teachers explored ways for GPT models to assist in the generation of lesson materials.Specifically,teacher
122、s tested whether the models could generate homework for students based on a lesson PowerPoint and transcript,adapt text extracts to a lower reading age and develop progressive worksheets to boost exam confidence in English as an Additional Language(EAL)students.Results:Overall,user scoring was posit
123、ive with participants giving the model average scores of 3 out 3 for usability,4 out of 5 for following national standards,4.5 out of 5 for time saved,and 5 out of 5 for improved outcomes.They also rated their likelihood to use such a model for generating lesson materials as 4.5 out of 5.Considerati
124、ons for future development include providing guidance for prompting and delivering an efficient and effective user experience for users.Table 6:Summary of user feedback for generating lesson materials Use case Usability Time saved Meeting national standards Improving outcomes Likely to use Generatin
125、g lesson materials 3.0 4.5 4.0 5.0 4.5 Average across all use cases 2.1 3.7 3.2 3.2 3.6 Assigning marks to work submitted by students In the hackathons,teachers tested a range of prompting approaches to improve GPT models effectiveness and accuracy when presented with Year 4 essays and asked to mark
126、 them,assign grades and provide feedback based on the national curriculum.24 The group applied a variety of approaches such as using more detailed prompts to specify what an output must contain,and applying a modular approach to prompting requesting that the LLM generate an exemplar essay based on a
127、 real-world mark scheme,and use this to benchmark the grading of a pupil essay.In addition,the group was able to improve the models output by reducing its temperature,described by OpenAI as“a parameter that controls the“creativity”or randomness of the text generated”,as well as experimenting with th
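As an illustration of these two ideas (a lowered temperature, and a modular exemplar-then-mark flow), the sketch below shows roughly how such calls could be made with the OpenAI Python SDK. The model name, prompts, placeholders and temperature value are assumptions for the example, not the settings used at the hackathons.

```python
# Illustrative sketch only: a modular, two-step marking flow with a lowered temperature.
# Model name, prompts and temperature are assumptions, not the hackathon settings.
from openai import OpenAI

client = OpenAI()

def ask(prompt: str, temperature: float = 0.2) -> str:
    """Single chat completion with a low temperature to reduce randomness."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
        temperature=temperature,
    )
    return response.choices[0].message.content

mark_scheme = "..."  # a real-world Year 4 writing mark scheme would be pasted here
pupil_essay = "..."  # the pupil's work

# Step 1: ask the model to produce an exemplar essay that meets the mark scheme.
exemplar = ask(f"Using this mark scheme, write an exemplar Year 4 essay:\n{mark_scheme}")

# Step 2: benchmark the pupil's essay against the exemplar and the mark scheme.
feedback = ask(
    "Mark the pupil essay against the mark scheme, using the exemplar as a benchmark.\n"
    f"Mark scheme:\n{mark_scheme}\n\nExemplar:\n{exemplar}\n\nPupil essay:\n{pupil_essay}"
)
print(feedback)
```

Keeping the exemplar step separate also makes it possible to inspect the benchmark before it is used for grading.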
Results: It was evident from the hackathons' user feedback, as well as our previous user engagement, that the development of an accurate AI marker could save time for teachers and schools. However, teachers scored this use case low on meeting national standards (2.5 out of 5) and moderately on usability (1.8 out of 3), time saving (3.3 out of 5) and outcome improvement (3 out of 5). Likelihood of use, however, was scored slightly higher at 3.8 out of 5, suggesting that, should the associated challenges be resolved in future, a tool such as this may be useful, but the bar for accuracy is high.

Table 7: Summary of user feedback for assigning marks to work submitted by students

Use case | Usability | Time saved | Meeting national standards | Improving outcomes | Likely to use
Assigning marks to work submitted by students | 1.8 | 3.3 | 2.5 | 3.0 | 3.8
Average across all use cases | 2.1 | 3.7 | 3.2 | 3.2 | 3.6

Generating personalised formative feedback and addressing student misconceptions

Teachers participating in the hackathons investigated whether GPT models could be useful for generating specific feedback for individual Year 4 students on pieces of English work. The aim was to investigate a scenario in which the models were asked to provide positive feedback, suggesting ways for students to improve their work. The team began by using different prompt structures with GPT-3.5 to elicit specific feedback on each essay (asking for both teacher-facing and student-facing feedback) and extracting this into more structured formats (specifically, JSON files), before testing the viability of using these JSON files to request different kinds of outputs.
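As a purely illustrative example of this kind of structured output, a per-essay record might look something like the sketch below; the field names and values are hypothetical rather than taken from the hackathon outputs.

```python
# Hypothetical example of a structured (JSON) record that per-essay feedback could be
# extracted into before being reused for other outputs. Field names and values are invented.
import json

feedback_record = {
    "essay_id": "pupil_042",
    "teacher_feedback": "Accurate use of past tense; paragraphing needs work.",
    "student_feedback": "Great story! Next time, try starting a new paragraph "
                        "when the scene changes.",
    "errors": [
        {"type": "punctuation", "detail": "missing full stop", "count": 3},
        {"type": "spelling", "detail": "'freind' for 'friend'", "count": 1},
    ],
}

# Saved as a JSON file, the same record can later be passed back to the model to request
# different kinds of outputs (for example a revision activity or a class-level summary).
with open("pupil_042_feedback.json", "w", encoding="utf-8") as f:
    json.dump(feedback_record, f, indent=2)
```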
Results: The group felt that the inaccuracies observed could be mitigated in future with: better quality mark schemes for the model to interpret; more expert input; and more training data for the model to see a range of performance and greater context. Despite these potential areas for improvement, the group expressed doubts that it would be possible to improve accuracy to the level desired by teachers. LLMs therefore may be better used to aggregate or swiftly structure human feedback to aid lesson planning, rather than to produce feedback unaided.

Table 8: Summary of user feedback for generating personalised formative feedback and addressing student misconceptions

Use case | Usability | Time saved | Meeting national standards | Improving outcomes | Likely to use
Generating personalised formative feedback and addressing student misconceptions | 1.8 | 2.3 | 2.0 | 2.0 | 2.3
Average across all use cases | 2.1 | 3.7 | 3.2 | 3.2 | 3.6

For the full breakdown of use cases for teachers, including GenAI as a teaching aid, supporting students with SEND, and writing end-of-year reports, readers should consult Section 3 of the Summary of Hackathon Findings Report (November 2023), included here as Annex 2.

Use cases for School Leaders and Administrators

Generating drafts of statutory school policies

School leaders are often tasked with creating or updating school policies. This is a time-consuming process, with the necessary preparation time impacting staff leave periods and involving multiple meetings across the Senior Leadership Team. School leaders and administrators examined whether GPT models could act as a support tool to generate drafts of statutory school policies for further review before implementation.

Results: School leaders and administrators scored this use case positively across almost all criteria, rating the generation of draft statutory policies between 4 and 5 out of 5 for time saving, meeting established standards, improving outcomes and likelihood of use. Usability was rated at an average of 2 out of 3, indicating users felt that outputs were somewhat usable but would require further adjustment by a human before being trusted.

Table 9: Summary of user feedback for generating drafts of statutory policies

Use case | Usability | Time saved | Meeting national standards | Improving outcomes | Likely to use
Generating drafts of statutory policies | 2.0 | 4.0 | 4.0 | 4.0 | 5.0
Average across all use cases | 2.1 | 3.7 | 3.2 | 3.2 | 3.6

Pupil or class data analysis and synthesis

During the hackathons, school leaders and administrators explored whether GPT models could be used to ingest a synthetic pupil data set (such as dummy IDs, target grades, and reading ages) and analyse the specific assessment data to identify personal and group-level capability gaps. The group tested whether GPT models were able to rank students overall but also perform granular, question-level analysis of mock exam scores. This could enable class teachers to identify patterns and variation in understanding and develop next steps much faster.
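The hackathon groups asked GPT models to perform this analysis, but the same granular breakdown can of course also be computed deterministically. For illustration only, the sketch below uses pandas on an invented synthetic data set to show what question-level analysis means in practice; the column names and values are assumptions for the example.

```python
# Illustrative sketch only: question-level analysis of a synthetic mock exam data set.
# Column names, dummy IDs and scores are invented and bear no relation to hackathon data.
import pandas as pd

# One row per pupil per question.
results = pd.DataFrame({
    "pupil_id": ["P001", "P001", "P002", "P002", "P003", "P003"],
    "question": ["Q1", "Q2", "Q1", "Q2", "Q1", "Q2"],
    "score": [3, 1, 2, 0, 3, 2],
    "max_score": [3, 4, 3, 4, 3, 4],
})

# Question-level view: the share of available marks the class achieved per question.
# Low values flag topics that may need re-teaching.
by_question = results.groupby("question")[["score", "max_score"]].sum()
by_question["facility"] = by_question["score"] / by_question["max_score"]

# Pupil-level view: overall ranking across the paper.
by_pupil = results.groupby("pupil_id")["score"].sum().sort_values(ascending=False)

print(by_question.sort_values("facility"))
print(by_pupil)
```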
Results: Despite the concerns and difficulties encountered during the exploration of this use case, school leaders and administrators scored it moderately overall. They scored this use case at or above 4 out of 5 for likelihood of use and for whether outputs met established national standards, and at an average of 3 out of 5 for both time saving and outcome improvement. Usability was also scored moderately, at 2 out of 3. This indicates that there is still an appetite among this user group to test this use case further with alternative approaches to those considered in the hackathons.

Table 10: Summary of user feedback for generating pupil or class data analysis and synthesis

Use case | Usability | Time saved | Meeting national standards | Improving outcomes | Likely to use
Pupil or class data analysis and synthesis | 2.0 | 3.7 | 4.0 | 3.7 | 4.3
Average across all use cases | 2.1 | 3.7 | 3.2 | 3.2 | 3.6

For the full breakdown of use cases for school leaders and administrators, including generating parent-carer communications, readers should consult Section 4 of the Summary of Hackathon Findings Report (November 2023), included here as Annex 2.

Use case for students

Language learning assistant

During the hackathons, Faculty tested the viability of using GPT models as an assistant for students learning modern foreign languages such as French, Spanish and German. The group of students in attendance explored how GPT models could help them understand grammar concepts, correct their work and practise conversations in their target language.

Insights: Due to time constraints on the day, and to ensure robust safeguarding by avoiding direct contact with students, the student user group did not complete the same post-use-case survey as the teacher and school leader and administrator groups. However, there were still key insights drawn from the student hackathon session. One clear finding was that a useful language learning assistant would need to be able to accomplish several tasks concurrently, as what appears to be a simple request can spread into various areas of competence. For example, in the case where a student asked GPT-3.5 to have a practice conversation in their target language using a certain vocabulary list (Spanish GCSE vocabulary), the conversation evolved into asking the LLM which mistakes they had made, before requesting exercises to strengthen their understanding of the grammar points they struggled with.
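As a rough sketch of what such a multi-task conversation partner involves, the loop below shows one way a vocabulary-constrained practice chat could be set up. The system prompt, vocabulary list and model name are assumptions for the example, not the prompts the students used on the day.

```python
# Illustrative sketch only: a multi-turn conversation-practice loop of the kind the
# students experimented with. Prompt, vocabulary and model name are assumptions.
from openai import OpenAI

client = OpenAI()

vocabulary = ["las vacaciones", "el extranjero", "alojarse", "la playa"]
messages = [
    {
        "role": "system",
        "content": (
            "You are a Spanish conversation partner for a GCSE student. "
            "Keep replies short, stay on the topic of travel and holidays, and try to "
            f"use these words: {', '.join(vocabulary)}. "
            "When asked, list the student's mistakes and suggest a practice exercise."
        ),
    }
]

while True:
    user_turn = input("You: ")
    if user_turn.lower() in {"quit", "exit"}:
        break
    messages.append({"role": "user", "content": user_turn})
    reply = client.chat.completions.create(model="gpt-4", messages=messages)
    assistant_turn = reply.choices[0].message.content
    messages.append({"role": "assistant", "content": assistant_turn})
    print("Assistant:", assistant_turn)
```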
The group also found that hallucinations can undermine users' trust in model outputs. Occasionally, the LLM made mistakes when correcting answers to a list of multiple-choice questions it had created, or hallucinated and corrected non-existent mistakes in student responses to short-answer questions. The more advanced GPT-4 model was able to correct some of these errors and was substantially better at correction than GPT-3.5. The students agreed that GPT models could be usable for practising conversations, but trust in the model remained a major concern, and they indicated that they would be more likely to use ChatGPT as a supplemental tool to the other methods they use for language learning.

For the full breakdown of use cases for students, readers should consult Section 5 of the Summary of Hackathon Findings Report (November 2023), included here as Annex 2.

User Research Findings

Introduction

Teachers' uses for GenAI in education

As noted in section 3 of this report on the hackathons and the interim report on the stakeholder consultation, there is broad scope for the potential use of GenAI in schools. Potential uses range from the facilitation of administrative tasks such as data management and report writing to more creative tasks such as material generation and the production of exemplar essays. While much of the research in this field predicts positive outcomes resulting from GenAI adoption, such as saved time and teacher empowerment (Kim et al., 2020; Wang et al., 2024; Zhang & Tur, 2023), there are also a wide range of concerns about its implementation in the classroom, including issues relating to ethical considerations (Akgun & Greenhow, 2021; Nazaretsky et al., 2022). Many teachers still have limited knowledge about GenAI (Chounta et al., 2022), and may be put off using it in their classrooms if it is not perceived to be useful, useable, and trustworthy (Choi et al., 2023; Department for Education, 2023, 2024).

GenAI for feedback and the current study

Feedback is one of the most powerful stages of the learning process, where teachers support their students to close the gap between where they are and where they should be in their learning (Hattie, 2012). However, marking and providing feedback often take up a significant amount of teachers' time (OECD, 2018). Workload is one of the main reasons why teachers consider leaving the profession (Räsänen et al., 2020), and a national survey in England found that 46% of teachers felt they spent too much time on marking (Department for Education, 2024). Research has also shown that the more time teachers spend on these tasks, the lower their wellbeing tends to be (Jerrim & Sims, 2021), highlighting the importance of finding ways to support teachers with this process.

One potential method that has recently garnered interest is to use GenAI tools to generate feedback on students' work. In recent years, a range of GenAI tools have been released for this purpose, and one estimate suggested this could save teachers around three hours of marking time a week if implemented successfully (Bryant et al., 2020). However, much of the existing research on the efficacy of these tools has focused on higher education (e.g., Lee, 2023), produced mixed results (Aloisi, 2023; Cavalcanti et al., 2021), and highlighted concerns relating to their reliability (e.g., Li et al., 2023). We still have limited knowledge about the potential accuracy or reliability of these tools, or their potential impact on learning and teaching in schools.
Although there is some evidence relating to teachers' broad perspectives on the use of GenAI in education (Department for Education, 2023), there is less comprehensive research covering the practical, pedagogical, psychological, and social considerations raised by teachers and students in relation to GenAI's use for feedback and across education more broadly. This qualitative study therefore used semi-structured interviews (teachers) and focus group discussions (students) to examine teachers' and students' perspectives about GenAI following a trial of a PoC tool designed to support teachers in providing personalised feedback to students. The research focused on the opportunities and risks presented by the use of GenAI tools for feedback, and the facilitators and barriers that could help or hinder teachers' adoption of GenAI tools in schools.

Methods

This study employed a qualitative methodology, involving interviews with teachers and focus group discussions with students, across several schools in England. Full details of the methods used are in Annex 4.

In January and February 2024, teachers (N=12) with varying levels of experience with GenAI trialled a new PoC GenAI feedback tool and took part in a 45-minute online one-to-one interview about their views on GenAI. The PoC tool was developed by Faculty and was designed to provide feedback on the spelling, punctuation, grammar, and vocabulary of Year 4 students' written work, and to produce a personalised revision activity for students that aimed to address a key area for improvement. In addition, secondary school students (N=9) took part in online focus group discussions about their perspectives on GenAI use in school, focusing on teachers' use of GenAI for feedback. We chose to speak to secondary rather than primary students as this was deemed more appropriate for eliciting nuanced discussions on the subject of GenAI through the medium of remote focus group discussions.

Ethical considerations

Ethical approval for the study was granted by the NIoT's ethical review board. All participants shared written agreement prior to the study. Teachers received an information sheet and had opportunities to ask questions, and written agreement was provided by participating teachers and a senior leader at their schools. For students, headteachers provided written agreement for their participation, and parents and carers were informed of the study's aims and methods and were given one week in which to opt out on behalf of their child. At the start of the discussions, students were presented with information about the study by an NIoT researcher and were offered the chance to either withdraw or provide verbal assent to take part. For safeguarding purposes, students were joined in the (physical) room by a member of school staff for the online focus group discussion.
Results

These results are supplemented by the additional user research designed to inform the development of the PoC tool (see section 4). The PoC was not designed for development to deployment, but rather to support learning about how GenAI can be applied to an education context. As a result, the findings from this research were not used to inform subsequent phases of development of the tool, as they usually would be in the development of a product or service. The results are presented in three main sections: teachers' prior experiences with GenAI; opportunities and risks associated with using GenAI for feedback; and barriers and facilitators to GenAI use in education.

Teachers' prior experience with AI

To contextualise their views on these subjects, we begin with a brief overview of the teachers' attitudes towards AI in general. Although very few in the sample self-reported as knowledgeable or confident users of AI, most teachers had experimented with AI in one or more aspects of their professional life. This is broadly in line with survey data suggesting that in November 2023, 42% of teachers had used AI to support their roles (Fletcher-Wood, 2023). In our sample, these activities ranged from AI for report writing (Teacher 8) and idea generation (Teacher 7) to AI for adapting learning tasks for individual needs (Teacher 6) and generating materials (Teacher 3):

“It might sound a bit weird, but I've got lots of PowerPoints with bullet points on them and I don't really want the kids just copying that down. So I've actually asked ChatGPT to basically turn all my bullet points into paragraphs. So the kids can then re-read them back and highlight the key points.” Teacher 3, Secondary

“I said ‘Oh, let's hold on a second, what if we ask ChatGPT maybe to sort of give some ideas?’ So we put the prompts in again, similar to my talk for writing cycle that I've done previously.” Teacher 7, Primary

Although these teachers had at least considered how they might use AI, some initially reported negative feelings towards AI:

“It's probably something I've kind of avoided… Yeah, I am someone that mistrusts technology in that way.” Teacher 5, Secondary

In certain cases, teachers also reported that the negative states of uncertainty and intimidation surrounding AI were precisely what had motivated them to learn more about it:

“One of the reasons why I'm quite interested in AI is because I'm also quite scared of AI and quite intimidated by it. So, I kind of want to understand a little bit, kind of get to grips with it.” Teacher 4, Secondary

“I went on an AI webinar and I spoke with the headteacher and I said, ‘We don't really know anything about this, I find it fascinating.’” Teacher 6, All-through Special School

Despite having limited experience with AI, teachers were generally open to exploring its potential in education. Many also felt some trepidation and lack of certainty about what it would mean going forward.

GenAI for feedback

Although it will not be deployed, teachers were generally very interested in discussing the new PoC tool and what it could mean for teaching.

“It's got my head spinning now” Teacher 7, Primary

Teachers identified several positives about the tool, such as being able to choose a specific focus for the feedback (Teacher 4), but also several areas for improvement. For example, the feedback was deemed “a bit wordy” (Teacher 8), and several teachers questioned how it would cope with handwritten work. Conversations about the PoC tool acted as starting points for exploring teachers' perceptions of the opportunities and risks associated with using GenAI tools for assessment and feedback. They also sparked wider debates about the heart of what it means to be a teacher. These are summarised visually in Figure 5.

Figure 5: Opportunities and risks associated with using GenAI tools for assessment and feedback
185、ty Teachers described two main opportunities that could arise from using GenAI feedback tools.The first was the potential for standardising outputs,thereby reducing subjectivity and inter-teacher discrepancies in judgements,and increasing the consistency of feedback.This was mentioned by a small num
186、ber of teachers in relation to moderation of work:“Last year I went into a room with all the top teachers in city.Theyre all there,all Year 6 teachers,and discussing every single text.No one was quite sure.Is it this level?Is that level?If you want to create,you know,the AI,the formula for that,then
187、 it takes out all the subjectivity.”Teacher 8,Primary“Moderation is hard because youre personally involved.So thats why,if youre a teacher,moderating your own work is completely different You want to fight for that child,so thats why I think if you remove that thats why I think the AI system could b
188、e quite strong.”Teacher 12,Digital Learning Lead across a MAT This suggests that using GenAI to support feedback could help to remove the subjectivity and bias from teachers own assessments.Similarly,one teacher suggested that GenAI could be used to support inexperienced teachers with the uncertaint
189、ies of working with a new year group:“If theres a teacher moving to a new year group,having a tool that could then help them with where the mistakes are what is expected for that particular year group and you kind of lose some of those discrepancies between the teachers as well.”Teacher 1,Primary Ti
190、me saving The second key opportunity was that GenAI feedback tools would potentially save teachers time on marking and that this time could be used for other tasks,which would support the learning of their students:“Its the case of it saving you the time.So,then you can spend that time analysing how
191、 to therefore take their journey to the next step.So the teachers role becomes less of the assessor and more of the next step,the teaching again,it puts the onus back on track,doesnt it?OK,so the computer programs telling me that its assessing you as this,so therefore I am a teacher and actually now
192、 I can teach you.”Teacher 8,Primary 34 This suggests that the time saved on a task such as feedback could be reinvested into the learning process by supporting learners based on the feedback generated and teaching accordingly.Some teachers therefore felt that there were opportunities to save time an
193、d strengthen practice using GenAI for feedback,however,implicit in these hypothetical opportunities was the assumption that the GenAI tools would be accurate and reliable.Risks Teachers also identified a range of risks that they felt should be considered before implementing GenAI for feedback.Teache
194、rs generally discussed these risks in greater depth than they did opportunities.Firstly,and in contrast to the positive potential for time saving,some teachers expressed concerns about whether such tools would save time due to their lack of trust in the tool:“I think I would end up going back over a
195、nd reading their essays anyway to see if the AI was correct.”Teacher 3,Secondary This suggests that in order for the time to be saved,teachers would need to trust the accuracy of the tool,although it remains important to note that this time saving does not remove the need for a human in the loop,and
Changing role of the teacher

Teachers also expressed risks that may exist regardless of the tool's capacity to give precise, appropriate, and accurate feedback. One such concern was that using a GenAI tool for feedback would mean changing the role of the teacher and the learning process in a significant way. This theme recurred across the interviews, with teachers reflecting on the centrality of teachers' involvement in students' work for the learning process:

“I don't think I could let it go in that way because these are my students and I should be the one giving them that feedback.”
Teacher 3, Secondary

“It's our job to know their barriers to learning. It's our job to know how to deliver that feedback in a way that will actually ensure that the progress happens.”
Teacher 11, Primary Special School

“The next step is that discussions around marking through AI and then you just think, well, if the students are writing it through AI and then we're marking it through AI, then the whole thing is pointless. Like what? None of us need to be engaging in this activity at all. It's an empty, hollow exercise. If the students aren't doing the work and we're not doing the work, then what was the point of writing or setting that assignment?”
Teacher 2, Secondary

In this quote, Teacher 2's worries around her own changing role are further compounded by her fears around broader issues arising from students' GenAI use, making the entire feedback process redundant in her eyes.

Overreliance and deskilling teachers

Another risk was that using GenAI for feedback may deskill teachers, especially those new to the profession:

“What are they teachers in the classroom for? It doesn't require any skills whatsoever from that teacher other than some basic ICT skills.”
Teacher 10, Secondary

“I think my concern is you would have a lot of teachers that would just rely on that and they would actually lose their professional judgement… You could have, you know, ECTs that come into school and only use that and that's not gonna develop them professionally.”
Teacher 11, Primary Special School

“You need to be able to identify these errors yourself. You can't just be relying on technology all the time… My fear with that would be that you raise a generation of lazy teachers.”
Teacher 9, Primary

Concerns about teachers becoming over-reliant on such tools were also echoed by others. While some teachers reported that they would be likely to trust such feedback tools, they also reported concerns that both they and their colleagues could become lazy and excessively trusting of the technology, in part due to their unsustainable workloads:

“I think I could get quite too reliant on it if I had really positive experiences at the beginning, you know, because I have four Year 7 classes… that's over 120 Year 7 students. I don't have the time to go through every single test, right? What did they put for this question? What did they put for this? So, I think, you know, anything I can do to save my time I'm all for it.”
Teacher 4, Secondary

“I think there could be a danger, as I said, like teachers getting a bit, you know, lazy. I think that I would get quite reliant on it.”
Teacher 4, Secondary

“There's the temptation, I guess, for somebody to not actually really have a proper look at it and see what they've done well and what they've not done well. So that personal interpretation of what they've done right and what they've done wrong, they might miss that because they might not take the time to actually read a summary of what they've done.”
Teacher 10, Secondary

The concern about losing track of where students are in their learning was shared by many of the teachers:

“I know that obviously the whole point of AI is to take that job away from me, but as a teacher, I think that's quite an important part, and I'd worry that somehow AI would lose the sort of nuance of what the kids should be doing.”
Teacher 3, Secondary
Teacher-student relationships and individual needs

Many teachers expressed a concern that handing feedback over to GenAI would remove an intrinsic aspect of the teacher-student relationship. Some teachers suggested that the acts of work submission, feedback, and response to feedback are parts of an important cycle between student and teacher where the student can open up to their teacher, helping their teacher to better understand them:

“What I find really problematic about using AI to mark a student's work is that there is no relationship in that and so many students want, they want you to read their work because this isn't just about them producing a piece of quality work. If they thought that you were just going to run that through an AI marker, I think their investment in that is gone. They want you to read their work. They want you to know and understand who they are as an individual. They want to impress you often. They want to interest you in who they are.”
Teacher 2, Secondary

“I think, to not even read it as a human would be really detrimental to the relationship.”
Teacher 2, Secondary

“I would lose that kind of rapport, I suppose, to some extent with the kids and that kind of ongoing conversation that is there.”
Teacher 3, Secondary

“I think the downside for me personally is I don't think I'd get to know my students and their quirks as effectively.”
Teacher 5, Secondary

This highlights teachers' perceptions of feedback as being more than an academic exercise, and how using GenAI to provide feedback could result in negative socio-emotional consequences. Additionally, relating to teacher-student relationships, some teachers expressed the importance of knowing one's students both academically and behaviourally when providing feedback.

“You still need to read it all through yourself to see what the AI has put, and whether or not it has focused on what you want it to focus on for that student as an individual learner.”
Teacher 2, Secondary

Teacher 2's concerns indicate a need for teachers' involvement in the process to ensure that outputs are appropriate for the individual learner. However, the perceptions of teachers in special educational settings suggested that even such mitigations may not make written feedback via AI a viable option for many students with special educational needs:

“There are a lot of other issues that we need to take into consideration for our pupils whenever we are doing any learning, and you could have someone who gets really thrown by any negative marks on their paper… or rather than a verbal feedback… or there are people who don't like verbal feedback. So, it's all individualised that way.”
Teacher 6, All-through Special School

“They never read it. They never respond to it. At the moment the feedback needs to be immediate. You know, in the lesson, not after, and especially in a special school because they don't really have the capacity to go back and reflect. And also, if there's something that they found particularly tricky and you're asking them to reflect upon it, you could actually trigger behaviour.”
Teacher 11, Primary Special School

While both describe the importance of individualised and carefully considered feedback, Teacher 11 also suggests that the written feedback produced by GenAI would likely be inaccessible for many of her students. This further demonstrates the perceived importance of having teachers at the heart of the feedback process who understand their students on a social, academic, and behavioural level.
Students' views on using GenAI for feedback

Similar themes came up in the student focus group discussions. Regarding opportunities, some students noted that a GenAI feedback tool could save teachers time that could then be used to cover more content when teaching:

“I think it'd be easier for the teachers just to put it in, so then it's not so time consuming as they don't have to hand mark it… I think more content could be covered instead of just focusing on marking a particular piece of homework.”
Student 3, Focus Group 2

However, there were also concerns among students, with many suggesting that an important part of the feedback process is the teacher's understanding of their students' learning needs:

“AI wouldn't know the way you learn like a teacher would… unless there was a way where you could tell it beforehand.”
Student 2, Focus Group 2

“Artificial intelligence won't ever really be able to match how a teacher has known you through the class.”
Student 4, Focus Group 2

This echoes the teachers' sentiments and shows that students value the personal level of understanding that teachers bring to the feedback process. One student also described the sense of demotivation that may arise from knowing that their work would not be checked:

“And demotivation from that… that no one's ever going to check it. And also, it doesn't show the teacher… It doesn't matter if you're improving or not when you're like learning good stuff like that.”
Student 1, Focus Group 1

While it is possible that the academic subject in question will influence the extent to which GenAI feedback would damage the teacher-student relationship (for example, maths tasks are less likely to facilitate the sharing of personal information than literacy or humanities), a student's motivation to complete their work to a high standard may nonetheless be diminished if they are aware that it won't be read.

Teachers' needs for GenAI uptake in schools

The interviews also included opportunities for teachers to explain potential barriers and facilitators to GenAI adoption in their schools. In many cases, these were two sides of the same coin, so they are analysed together below as Needs rather than discretely. The most commonly cited facilitators that would support the use of GenAI in schools included time, training, expertise, and funding (see figure 6).

Figure 6: Needs identified by teachers based on their perceived barriers and facilitators to GenAI adoption in schools
Shifting attitudes of teachers

Teachers' perceptions were the most commonly cited factor in relation to the implementation of GenAI, and several teachers suggested there would be a degree of scepticism unless it was clear and evidenced how the tools would support their practice:

“It's about convincing us that it's worth putting the time and effort into learning how to use it.”
Teacher 3, Secondary

“I think all staff to some extent, you know, feel a bit of trepidation because it's technology and the children stereotypically understand the technology better than most teachers.”
Teacher 8, Primary

“I think it would have to be made very clear that it is to benefit workload and not add to it.”
Teacher 9, Primary

If teachers' concerns about the time investment required and the efficacy of the tools can be allayed, they may be much more likely to implement them. However, until those needs are met, this culture of hesitance within some schools may delay the adoption of new technology.

Some teachers suggested that individual differences between teachers may influence their eagerness to adopt GenAI in their classrooms. One teacher felt that their age was a factor in their confidence around uptake of GenAI and other technology, and expressed their preference for old school ways of working:

“I think my age probably says it all really. I'm not… I'm not really super tech savvy. The youngsters in my faculty are far more tech savvy than I am. I'm very precious about marking and things and although I don't like it, I feel like I need to do it and I do it and I'm very old school.”
Teacher 3, Secondary

Teachers of a range of ages also picked up the point about teachers' years of experience and age, sharing the view that older teachers may be less interested in GenAI:

“I think also a lot of teachers are quite hesitant for change, to change, so I feel… don't mean to be rude, but maybe that older generation of teachers would be quite thrown by… you know, sort of worried about it. You know, there's a lot of teachers that do struggle with their IT and technology.”
Teacher 4, Secondary

While our data does not offer extensive insights into how teachers with more years of experience feel about GenAI, these quotes suggest that discussions around age-related differences in willingness to adopt new technologies occur within schools. Whether based on truth or stereotypical judgements, these show that some teachers perceive differences in colleagues' readiness to adopt GenAI based on age. One teacher suggested that teachers with more years of experience may be less likely to stay in the profession if forced to implement GenAI technologies:

“I am seeing teachers hitting around 60 and thinking I can't do this anymore… thinking it's not worth me now having to engage with the whole new system of working. I can't do that again.”
Teacher 2, Secondary

Time
There were tangible facilitators to GenAI use identified in the interviews. Teachers widely reported that to effectively implement GenAI tools in school, they and their colleagues would require time set aside to learn about the tools and become confident in their use. Without this time, it was suggested that teachers would likely disengage from the GenAI learning process:

“I think time would be something that teachers are saying… I don't have time for this… and kind of throw it off because of that.”
Teacher 4, Secondary

“The most key element is, if you want us to engage with this, you have to carve out time into our timetables in order to be able to play around with it. Create it. Try it out. And so much of our timetable, I mean, this is the inbuilt catch 22, isn't it? The kind of irony of the situation is so much of our time is so busy already. There is so little time to take on board new things. So, every inset day we are bombarded with more initiatives, more ideas.”
Teacher 2, Secondary

“If you're asking someone to go off and do it themselves and saying there's information, go and look at that yourself… it's not gonna happen… there will have to be a dedicated time set aside because people are just very busy and they're not gonna go off unless they can see… the potential in an allotted time.”
Teacher 6, All-through Special School

This demonstrates the somewhat circular situation that teachers find themselves in, whereby they do not feel that they have enough time to invest in learning how to use tools that are designed to help save them time in the long run.

Training and expertise

Teachers also expressed a need for training, and for someone with expert knowledge to support the transition towards GenAI adoption:

“I feel as long as the right training was in place in terms of how to use it, with it being technology, I think any teacher that's worth their salt would welcome it with open arms because it's seen as a tool to support them.”
Teacher 8, Primary

“We would need someone with a lot more knowledge on which are the best AI tools out there. Which ones lend themselves better to our primary and our primary curriculum and yeah, certain areas… and I know there is training out there, it's just tapping into it and having somebody, I guess, championing it.”
Teacher 1, Primary

Guidance

Some teachers seemed to be at a crossroads with regard to their own sentiments towards adopting GenAI, in part due to uncertainty about what others might think if they knew that teachers were using it:

“So actually, the first time I used it to do my references I, I kept it quite quiet because I didn't know if morally that was the right thing to be doing… whether the people would see that as me obviously not taking the time to obviously write those references individually and bespoke.”
Teacher 10, Secondary

“I don't think parents would like it… I think society is very much like, you know, teachers need to be working really hard. And I think that when you, when you hear AI, your first impression is oh, you know, making your life easier, you know, cheating, you know, that sort of thing because it does have those negative connotations… so I think you'd have to be really careful how you sold it to parents if you were gonna use it in school.”
Teacher 4, Secondary

This suggests that some teachers may find themselves conflicted about whether they should be using GenAI. Moreover, if GenAI tools are to be recommended for use by teachers, guidance will need to clarify exactly how teachers should be using them:

“We haven't had anything on how teachers might use AI.”
Teacher 2, Secondary

“I think the only thing that you would necessarily need from someone like Ofsted is the OK that things are done like that. Kind of like their approval, in a way… because there is always that fear that, you know, the teachers ultimately are the ones that are accountable.”
Teacher 9, Primary

“Schools need a policy of what are the boundaries that we are working in, what's safe practice, so that we know that we're putting our children and children's data, in particular, in a safe environment that we can play and explore. And I think if we don't make sure that is at the forefront of what we do, the danger is that we will have lots of these apps and new things going out and places where we're sharing data.”
Teacher 12, Digital Learning Lead across a MAT

These quotes suggest the need for greater clarity and guidance about the boundaries of GenAI in education and teachers' roles in a GenAI-assisted school system. It is possible that with these clarifications, some of the risks identified above (such as the risk of over-reliance) may be partially mitigated.
Limitations

This qualitative study was conducted with a sample of 12 teachers and 9 students. While we aimed to be comprehensive during these interviews and focus groups, the findings from this small sample should not be generalised across the education sector. Although we attempted to recruit participants from a diverse range of geographic locations within England, we note that all participating schools were rated either Good or Outstanding by Ofsted, and two-thirds had below the national average levels of students eligible for free school meals. Moreover, 8 of the 9 schools were part of Multi Academy Trusts (MATs). As the needs and priorities of schools in different contexts may vary significantly, we recommend that future research explores GenAI adoption in schools deemed to be inadequate or requiring improvement, those in areas of greater deprivation, and those not part of MATs.

Taking these limitations into consideration, the data revealed issues that were important and concerning to teachers and that will likely be transferable to other schools across the sector. Some of these, such as time saving, mistrust in GenAI, and a need for professional development activities, were broadly in line with existing research, but our research also includes perspectives on lesser explored issues, such as the potentially detrimental impacts on the teacher-student relationship that may occur if GenAI tools are used for feedback.

Conclusions

User research conclusions
Teachers' and school leaders' insights reveal that there is not one uniformly adopted stance on the idea of using AI in schools. Most interviewees were positive about some aspects of AI but cautious about others. The teachers and students in our study expressed high levels of interest in AI for feedback and saw a number of important opportunities, but also discussed their concerns and the risks in depth.

Although many users were able to see the potential time-saving and standardising benefits of using AI tools for feedback, they also highlighted considerations that would have to be addressed before any widespread rollout. Some of these issues do not have universal solutions: for example, some teachers reported that they would not trust the tool at all, while others suggested that teachers may become overly trusting and therefore not take the time to check its judgements. Interviewees were concerned that these issues could result in educators losing the ability to make accurate professional judgements, and potentially losing track of where their students are in their learning journeys. A common theme in the interviews was that feedback is more than an academic exercise; it is also social, and taking that task out of the hands of teachers may risk damaging the relationship between teacher and student, which is a key predictor of students' academic development.

When discussing potential barriers and facilitators to AI adoption, interviewees reported a need for time, training, funding, and expert help to increase teachers' own expertise in AI use. Further, teachers reported needing support and guidance to feel that AI use is acceptable, important, and safe. However, regardless of what guidance and professional development opportunities are in place, it is possible that certain teachers may not be swayed towards AI implementation. Further, discussions about willingness to adopt AI often veered towards the subject of age and experience, with some teachers perceiving older teachers as being more reluctant to adopt AI tools. This suggests that, first, we need more research on approaches to develop teachers' confidence in using AI at all stages of their career, and second, it would be helpful to understand further how discussions around AI may play into perceptions of difference, and possibly stereotyping, related to age and other factors among staff.

The methodology we employed facilitated debates about the use of AI and the role of the teacher that might not otherwise have occurred, had there not been practical examples to discuss, such as inviting teachers to trial a PoC AI tool before taking part in an in-depth interview. We therefore recommend that future research in this area consider employing similar methods to capture the developing picture.

The findings both highlight the importance of this critical area and express the need for caution when moving forwards, emphasising both a need for tools that are accurate and fit for purpose, and a greater understanding of the consequences arising from their implementation by teachers in practice. Future research should continue to explore stakeholders' perceptions of hypothetical uses, and should also implement short- and long-term trials of AI in education, measuring academic, behavioural, and psychological outcomes and offering recommendations for the mitigation of any negative consequences.
Suggested areas of future focus

Some of the most significant findings from the work conducted as part of this project related to the degree to which GenAI has the potential to benefit the education sector over the long term, and the enthusiasm of many in education to take advantage of this powerful technology. Further detailed exploration of the potential benefits of GenAI, in its current state as well as following its inevitable further development, will enable government to continue to shape the policy landscape and how the technology is adopted by the sector.

However, it is also clear that there are a range of cultural, logistical and technical constraints that prevent schools and individuals from fully exploiting the potential of GenAI. Following the hackathons, Faculty, NIoT and ImpactEd Group conducted a review of the project's findings up to that point to identify a number of these constraints. The list of potential barriers to the adoption of GenAI in schools has been updated in light of the further findings from the second phase of the project, including the PoC build, the user testing and the user research programme. Linked to these barriers, we have suggested a range of areas of focus for future work that may support the broader take-up of GenAI, as well as mitigating risks and increasing overall impact.

Perceptions of the impact of AI on the role of teachers, and a lack of clear expectations regarding its use

Some teachers raised concerns that using an AI tool for feedback would change the role of the teacher, and that this would affect the learning process in a significant way. Teachers also raised the importance of close human interaction to students' personal development as well as their learning. This concern was not limited to giving feedback but was echoed by teachers and school leaders in discussions of other use cases, including lesson planning and writing student reports. Teachers also raised concerns about whether using AI to support or replace elements of their role is the right thing to do in terms of best educational practice, as well as whether it is morally right. They were also uncertain about what other teachers, parents or students would think if they were found to be using AI, as well as authorities such as their school leaders or Ofsted.

Suggested areas of focus:

• exploration of the role of GenAI in teachers' and school leaders' work, with a special focus on their interactions with students and how these may be affected by the introduction of GenAI;

• development of further guidance for teachers and school leaders on best practice in the use of GenAI, including an assessment of the existing evidence and its limitations; and

• commissioning of related research where gaps are identified in the evidence base.
Challenges of integration with existing systems, and disparities in access to technology

Most of the GenAI use cases in this project would require a regular feed of existing, context-relevant data into an LLM-based service. This would require LLM integration via Application Programming Interfaces (APIs) with the existing systems that capture this data at source, such as information management systems and school databases. This may be more straightforward for some applications than others: for example, enabling access to a bank of lesson plans is likely to be more achievable than enabling access to student personal data, given challenges with privacy, frequent updates, and the systems used to store this type of data. In some cases, commercial data management solutions or APIs developed by the edtech provider can close this gap, but this may not always be possible, or may become prohibitively expensive.
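As a rough illustration of the integration pattern described above, the sketch below pulls context data from a hypothetical school information management system endpoint and passes it to an LLM API. The endpoint, field names, model and prompt are all assumptions for illustration; they do not describe any particular MIS product or the project's PoC.

    # Illustrative sketch only: the MIS endpoint, field names and model are assumed.
    # Shows the general pattern of feeding existing school data into an LLM service.
    import requests
    from openai import OpenAI

    MIS_LESSON_PLAN_URL = "https://mis.example-school.sch.uk/api/lesson-plans"  # hypothetical endpoint

    def fetch_lesson_plan_summaries(year_group: str) -> list[str]:
        # Pull context-relevant data from the school's existing system via its API.
        response = requests.get(MIS_LESSON_PLAN_URL, params={"year_group": year_group}, timeout=30)
        response.raise_for_status()
        return [plan.get("summary", "") for plan in response.json()]

    def draft_lesson_outline(year_group: str, topic: str) -> str:
        context = "\n".join(fetch_lesson_plan_summaries(year_group))
        client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # illustrative model choice
            messages=[
                {"role": "system", "content": "You are a lesson planning assistant for a UK school."},
                {"role": "user", "content": (
                    f"Existing {year_group} lesson plan summaries:\n{context}\n\n"
                    f"Draft an outline for a new lesson on {topic} that fits this scheme of work."
                )},
            ],
        )
        return response.choices[0].message.content

Even in this simplified form, the sketch assumes that the MIS exposes a documented API and that the data being sent is appropriate to share with an external service; as discussed below, neither can be taken for granted.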
Schools' varying digital maturity and student access to technology were also raised as potential barriers to take-up of GenAI. Varying levels of digital maturity across schools would need to be considered, and possibly provisioned for, if such tools were to be adopted on a national scale. For example, in some areas schools' internet bandwidth could come under strain with increased use of LLMs. And for the student-facing use cases, it is important to note that not all students have access to the same devices or internet availability at home, creating a divide between those who are able to spend extra time practising use of GenAI and those who are not. There are examples of edtech tools which students can access through a mobile app or browser, but even these may exclude some students. Schools are better equipped post Covid-19, but access remains varied.

Suggested area of focus:

• exploration of the digital and data infrastructure challenges that schools face in accessing GenAI tools, and how these may differ across different schools or different types of school.

Lack of reliable information on GenAI tools' effectiveness, accuracy, and safety

Confidence in the accuracy of AI tools
During the hackathons, the teachers and leaders expressed reluctance to adopt tools that have not been tried and tested. Specifically in relation to the PoC feedback tool, teachers expressed concerns about whether the tool was accurate. A lack of accuracy would risk negating the potential benefits of the application of GenAI: for example, with the PoC tool, if the feedback given is inaccurate the tool will be unable to deliver the potential benefits around improving marking consistency and saving teachers' time. Even if the tool is accurate, a perceived lack of accuracy on the part of users could result in reduced take-up and equally negate the potential benefits.

Budget constraints and lack of information on value for money

Effective adoption of GenAI may require additional financial resourcing for schools, for example to enable schools to use commercial off-the-shelf GenAI tools which have an associated licensing fee. Depending on the tool, the costs can be prohibitive, and this challenge may be more acute for schools outside of large MATs if they are less able to negotiate due to their smaller size. In addition, teachers and schools' technology leads can find it difficult to access objective assessments of edtech tools relating to their overall efficacy, impact on outcomes, and safety or data privacy questions.

Data protection and intellectual property challenges
Use cases that require LLMs to be trained on, or to use, student-owned or personally identifiable data may require agreement from parents or students, or may not be permissible depending on school policies. These challenges cover both data protection issues, such as ensuring privacy and gaining agreement to the use of personal data, and issues related to intellectual property, including the training of AI models using student-owned data and the agreement required for this. Even for models deployed securely, where student data is kept within the school or MAT's environment, ensuring sufficient protection of students' data can be challenging and complicated to navigate.
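One practical element of keeping student data protected when text is sent to an external LLM service is data minimisation, for example removing obvious personal identifiers before anything leaves the school's environment. The sketch below is a deliberately simple illustration of that idea, not guidance or a complete solution; the function, placeholder labels and patterns are assumptions, and pattern-based redaction alone would not satisfy UK GDPR obligations.

    # Illustrative sketch of simple redaction before text is sent to an external
    # LLM API. Names, placeholders and patterns are assumptions; real deployments
    # would need far more robust handling and a proper data protection assessment.
    import re

    def redact_identifiers(text: str, student_names: list[str]) -> str:
        # Replace known student names with a neutral placeholder.
        for name in student_names:
            text = re.sub(re.escape(name), "[STUDENT]", text, flags=re.IGNORECASE)
        # Remove email addresses as a simple example of pattern-based redaction.
        return re.sub(r"[\w.+-]+@[\w-]+\.[\w.-]+", "[EMAIL]", text)

    example = redact_identifiers(
        "Amina Khan (amina.khan@example.sch.uk) has improved her paragraphing.",
        ["Amina Khan"],
    )
    # example == "[STUDENT] ([EMAIL]) has improved her paragraphing."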
Appropriate and sufficient guidance would need to be provided by the DfE to assist schools and the edtech sector in navigating the UK General Data Protection Regulation (UK GDPR) and the use of AI, and it would need to be updated on a regular basis given the rapid pace of evolution of the technology.

Suggested area of focus:

• exploring options for producing guidance on, or kitemarking, GenAI tools to enable schools to make informed judgements about their safety, data privacy, effectiveness, impact on outcomes and accuracy, including ensuring that schools are clearly informed where the evidence base for a tool is limited or in development.

Strengthening teachers' and school leaders' confidence in using GenAI tools

If teachers and school leaders do not have access to high quality advice and training which helps them to be confident in using GenAI effectively and safely, there is a risk that teachers disengage from incorporating GenAI tools into their practice and lose access to the potential benefits. Teachers may not be aware of the tools available or their benefits, or may perceive them as prohibitively complicated or time-consuming to learn about. Individual differences between teachers, such as their prior experience with other technology, may also affect their confidence and readiness to adopt GenAI tools.

Suggested areas of focus:

• development and testing of training for teachers and school leaders in the use of GenAI, including the potential benefits, how it can be applied in schools, and practical guidance on how to support their school or teachers on the journey to implementing GenAI; and

• building cohorts of technology leads who can support their colleagues and school leadership in implementing GenAI tools in their specific contexts.

Further work in progress
Since the beginning of the Use Cases for Generative AI in Education project, two additional projects have been conducted in response to the interim project findings.

AI Readiness in Schools Data Systems Project

In light of project findings relating to the challenges schools face with integration of GenAI tools with existing systems, and how this is preventing schools and teachers from fully exploiting the potential of GenAI, the AI Readiness in Schools Data Systems Project was launched in January 2024. This project is engaging with several schools and MATs to explore how they could best be readied for GenAI adoption, producing analysis and case studies that can guide further workstreams and guidance for schools and Trusts.

Mini hackathons for schools

Similarly, interim project findings identified teachers' and school leaders' views that a lack of guidance and training around how to use GenAI in their context prevents them from identifying potential applications of this technology and being confident that they can use it appropriately and safely. As a result, a programme of mini hackathons for schools was established, delivering a small number of Proof of Concept (PoC) training sessions to trial approaches to:

• providing an introduction to AI and the key concepts required to apply it in an education context,

• explaining strategies, ideas and tools for teachers to use AI,

• giving an overview of the risks and how they can be mitigated, and

• delivering a mini hackathon, where teachers are provided with a list of tools and invited to explore ways they might use these in their own practice.
Teachers are then invited to share their learnings with the group, and with their peers across their school. Details of this are in annex 3.

Recommendations for delivery of future hackathons

In addition to the specific learnings related to the application of GenAI in education via each use case, there were a number of lessons learnt relevant to the delivery of hackathons, which would be valuable in the design and planning of any similar future events:

User consultation ahead of the hackathons: Engagement with a range of users ahead of the hackathons provided the team with a strong initial understanding of users' needs, their perspectives on the use of GenAI, and the key use cases that were most important to them. This had a clear impact on the success of the hackathons, as the use cases selected for experimentation were closely aligned to users' expectations and their priorities, helping to build buy-in from participants and to ensure that the findings of the events were relevant. The combination of surveys and focus groups was particularly effective, reaching a high number of respondents and enabling ranking of use cases, while also providing detailed insight into users' perspectives.

Allowing flexibility for the use cases selected: While participants were presented with a list of use cases, they were also encouraged by the group facilitators to interpret them in a way which was most relevant to their own specific contexts. This had a number of benefits, including very high levels of user engagement with the process and the solutions developed, ensuring that users were able to give very specific feedback as to the potential effectiveness of a solution, and enabling users to provide real test examples drawn from their own experience.

Engineering work ahead of the hackathons: Before the events, a team of engineers and data scientists worked with the education policy experts to prepare for the hackathons. The key elements of their preparation (aside from ensuring that all participating data scientists and engineers could access the platform used) were the preparation of the environment, including access to the GPT API, provision of example datasets relevant to each use case, and provision of Jupyter notebooks with all essential code to start prompting GPT (an example of what such a starter cell might look like is sketched below). This ensured that the teams could begin their prompt engineering work immediately, with no need to search for or process data, or set up a coding environment.
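For context, a starter cell of the kind described above might look something like the following. This is an assumed, minimal illustration rather than the notebooks actually provided; the model name and prompt are placeholders.

    # Minimal illustrative starter cell (assumed, not the hackathon notebooks).
    # Requires the openai package (v1+) and an OPENAI_API_KEY environment variable.
    from openai import OpenAI

    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "user", "content": "Suggest three retrieval questions on the water cycle for Year 5."},
        ],
    )
    print(response.choices[0].message.content)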
Ability to upload data: In some of the hackathon sessions, the users were keen to make use of their own datasets, for example in the lesson planning use case, where the teachers wanted to use their school's curriculum or lesson planning guidance as reference data for GPT. In some cases this was still possible, such as when the data was available for download from the internet. However, provision for the upload of data, subject of course to considerations such as data privacy and the protection of Intellectual Property (IP), would be beneficial in planning for future hackathons.

Composition of hackathon teams: The teams in the hackathons included a balance of users (teachers, school administrators and school leaders), data scientists, engineers and policy experts. Each of the hackathon groups included one facilitator, one policy expert, 2-3 users and 3-5 data scientists and engineers. This composition was adjusted based on feedback from stakeholders with experience in running government hackathons, and proved to be very productive.

Attempting to solve multiple use cases: Each of the teams outlined above attempted to solve three to four use cases over the two days of the hackathons. This was challenging, as the teams often felt that they did not have sufficient time to fully explore all the different approaches that they had designed for each use case. It may have been more productive to have smaller groups, each focusing on just one or two use cases throughout the two days of the hackathons, potentially giving a higher likelihood of participants being able to solve a use case.

Involvement of maintained schools and smaller MATs: The user participants were drawn from large MATs such as Outwood Grange, Harris and Star, and brought huge value to the hackathon groups given their extensive knowledge of teaching practice and school processes, their judgement as to whether particular solutions would work in their school context, and their enthusiasm for finding solutions to difficult problems. However, had the participants been drawn from other types of school, including maintained schools and smaller MATs (as the participants for the user research were), we could have ensured that their feedback was representative of a broader range of school contexts.

Student participation: Bringing in students as participants was challenging in terms of the logistics needed ahead of the hackathons, with a range of insurance, safeguarding, travel and accompaniment implications. However, the insight gained from the feedback provided by students was very valuable, and although only one use case for students was explored, there were some novel findings that were also relevant to other applications of GenAI (such as the ways in which students use existing AI tools to practise concepts they have learnt in lessons). Some of the practical difficulties of student participation could be mitigated (while still benefiting from the clear value added by students' perspectives) by holding a separate, student-focused hackathon in a school.

Size of the hackathons event: The hackathons were all held in the same location over two days, and although this had some benefits (e.g. reducing travel requirements), it meant that the event itself was large and complex, with over 60 participants and attendees, including members of the press. At certain points it was challenging for the attendees and organisers to maintain focus on the problem-solving required, and the high levels of event management involved took up a significant amount of the project's available resource in the lead-up to the event. Although there were additional benefits to this size of event, such as the press coverage increasing awareness of the potential of GenAI in education, a series of smaller events may better deliver the key aims of the hackathons.

Opportunities for networking and building a cohort of users: Throughout the hackathons, a number of opportunities for networking over coffee, breaks and lunch were built into the timeline. This helped maintain focus during the intense days, but also provided time for the participants and attendees to jointly reflect on the interim findings of the hackathons and to cross-pollinate and spread effective practice between hackathon groups. It also enabled participants to share their experiences and the ongoing initiatives related to the use of AI in schools that they were involved in, leading to the development of a group of super-users who were key in the testing of the eventual PoC once it was developed.
Annex 1: Summary of pre-Hackathon consultation findings

Overview

The Generative AI in Education project is a collaboration between the National Institute of Teaching, Faculty AI, the Department for Education, and the AI in Schools Initiative. The project is exploring how Generative AI could be used to reduce school staff workload, increase practitioners' effectiveness, and improve learning in schools.

Stage 1 of this project was a school-led, co-design phase with stakeholders from the education sector. Teachers, school leaders, administrators, and students took part in a consultation via surveys and stakeholder group meetings. We found that teachers were most keen on the idea of using tools to support them with lesson planning, marking and assessment, and making better use of class and pupil data. Likewise, school leaders were keen on implementing tools that could support them with data analysis, whilst also reporting that they would like support with writing and updating policies, documents, risk assessments, and parental communication. Administrators were keen to use tools to streamline and improve their understanding of data, develop timetables, and draft risk assessments. Finally, in discussion groups, students showed interest in and support for several AI use cases, but expressed concerns about AI replacing teachers and other school staff.

Aims

Stage 1 was a school-led, co-design phase with stakeholders from the education sector. Teachers, school leaders, administrators, and students were consulted via surveys and stakeholder group meetings. The first aim of this phase was to understand how stakeholders felt that AI could support school staff to save time and strengthen practice. The second aim was to gauge students' perceptions of how AI could support their learning. The findings from this phase were used to inform the shortlisting of use cases for the hackathons in stage 2.

This consultation phase of the project engaged stakeholders in a dialogue about the potential uses of AI in schools, seeking their input as key stakeholders and collaborators rather than as research participants. We do not report direct quotes from stakeholders. This initial stage will feed into more focused research into users' perspectives about AI during the user testing phase in stage 3 of the project.

Methods

Initial Generation of Use Cases

In collaboration with education experts, teacher trainers, and teachers, we generated an initial long list of use cases. Use cases were categorised depending on the target user (for example