Department for Education plans to assess AI risk and benefits
The department has begun a consultation looking at the role of generative artificial intelligence in education
The Department for Education has begun a consultation on the use of generative AI. The department aims to explore the opportunities such technology presents for education, as well as understand the concerns of educators and experts in education.
The call for evidence asks schools, colleges and other academic institutions, along with local authorities, about their experiences with ChatGPT and other generative AI systems. The DfE wants to find out about the main challenges in using generative AI and how these can be addressed.
“We would like to understand your experiences of using this technology in education settings in England,” the DfE said. “We would also like to hear your views on where using it could benefit education, and about the risks and challenges of using it.”
The department is also keen to understand the subjects or areas of education that those submitting evidence believe could benefit most from generative AI tools.
In March, the DfE published a report setting out its position on generative AI, which found that although generative AI technologies can produce fluent and convincing responses to user prompts, the content produced can be factually inaccurate.
“Students need foundational knowledge and skills to discern and judge the accuracy and appropriateness of information, so a knowledge-rich curriculum is therefore all the more important. It’s vital that our system of assessment can fairly and robustly assess the skills and knowledge of those being examined,” the DfE stated in the report.
The Joint Council for Qualifications (JCQ) has also assessed the impact of using such systems in exams and formal assessments. It reported that while the potential for students to misuse artificial intelligence is new, most of the ways to prevent its misuse and mitigate the associated risks are not.
The JCQ’s report, AI use in assessments: Protecting the integrity of qualifications, stated that there are already established measures in place to ensure students are aware of the importance of submitting their own independent work for assessment, and for identifying potential malpractice. “AI tools must only be used when the conditions of the assessment permit the use of the internet and where the student is able to demonstrate that the final submission is the product of their own independent work and independent thinking,” said the JCQ.
While the JCQ recognises that AI will be used in business, it has stipulated that any work submitted for formal assessment needs to state that AI tools have been used as a source of information. The JCQ said a student’s acknowledgement must show the name of the AI source used and the date the content was generated.
“The student must retain a copy of the question(s) and computer-generated content for reference and authentication purposes, in a non-editable format (such as a screenshot), and provide a brief explanation of how it has been used. This must be submitted with the work so the teacher/assessor is able to review the work,” the authors of the JCQ report wrote.
Read more about AI in education
- By using AI and analytics, higher education institutions can gauge at scale how their students are faring during the coronavirus pandemic
- CompTIA finds that many firms are in the dark over AI and would benefit from guidance on its advantages