Shaping the future of education: AI as a double-edged tool

It’s been hard to ignore the buzz around generative AI since ChatGPT launched last November. Swiftly followed by other large language model chatbots such as Bard, these tools’ ability to create content, simulate human-like interactions and analyse data has captured the public’s imagination. But what are the pros and cons of using AI in education? Claire Penketh from BCS, The Chartered Institute for IT, reports

Generative AI is in the spotlight in education because of its potential to revolutionise teaching and learning. BCS, The Chartered Institute for IT, as the professional membership body for information technology, is well placed to consider the implications of this latest technological innovation.
It recently surveyed the BCS-backed Computing At School (CAS) network of teachers and found educational establishments unprepared for the impact of AI tools. Overall, teachers felt generative AI should not be banned from the classroom, and instead, pupils should be helped to use tools like ChatGPT appropriately.

Admin burden
As any teacher knows, there’s a large amount of admin in education, and generative AI promises several benefits for staff. BCS recently submitted evidence to a government consultation on this subject, pointing out that generative AI can quickly produce learning materials, worksheets and assessments, and help with lesson plans. It can support struggling students by analysing an individual’s data and adapting teaching materials, pacing and evaluations to suit each pupil’s unique learning style.
For students, round-the-clock access to AI-powered educational platforms and learning materials allows the flexibility to engage with content whenever they want, fostering self-directed learning and accommodating different schedules.
It can also help track performance, allowing teachers to make informed decisions about curriculum adjustments, identify areas for improvement and carry out effective early interventions.
These are all tasks that teachers already do – but the promise is AI could free up teachers from spending so long on administrative tasks, giving them more time for teaching, mentoring and interactions with students.

Ethical concerns and plagiarism
There are risks, though, as using AI also brings enormous ethical and legal considerations. Most existing AI systems have not been trained on data from a sufficiently diverse range of school-age students, which risks introducing bias. There must also be significant controls to protect students’ personal and sensitive data.
When it comes to students’ work, there are several potential pitfalls. There are concerns about plagiarism, with students passing off AI-generated work as their own. There are doubts about whether students who use generative AI will engage with learning as deeply as they do with traditional methods. And there are well-founded worries about the accuracy and sources of the information generative AI uses to produce work – the so-called hallucinations – and about how students reference it as a source.

It’s too early to fully assess the impact of generative AI in the classroom yet. However, BCS has developed a series of recommendations for the government as part of a recent call for evidence on using AI in education.  
The first is training for educators. BCS recently recommended that AI become part of teacher training courses. Schools should teach children how to use AI from age 11, with pupils working with tools like ChatGPT to better understand their strengths and limitations. The scope of the Computer Science GCSE should include a focus on how AI is built and consider its risks and opportunities. BCS also recommends a new digital literacy qualification for young people as an alternative to the GCSE, emphasising AI and other modern digital skills.
The second recommendation is for schools and colleges to train staff to use AI ethically and effectively.
The third recommendation is to consider the importance of human judgement, by retaining and enhancing the role of expert human assessment to maintain academic integrity. Highly trained expert assessors can play an essential part in mitigating the misuse of AI in education: they can examine authenticity, consistency and coherence, and detect nuances that automated systems might miss.

Forensic analysis
Where there are doubts about authenticity, forensic analysis techniques should be used to examine metadata, file properties, or digital footprints for signs of tampering or AI generation.
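As a hypothetical illustration of that metadata-inspection idea, the short Python sketch below reads the author and timestamp fields that a .docx file stores in its docProps/core.xml part (a .docx is a ZIP archive). The sample document, its author name and its timestamps are invented for demonstration; real forensic tooling would go much further.

```python
# Sketch: reading authorship and timestamp metadata from a .docx file.
# A .docx is a ZIP archive; docProps/core.xml holds Dublin Core fields
# such as creator, created and modified.
import io
import zipfile
import xml.etree.ElementTree as ET

DC = "{http://purl.org/dc/elements/1.1/}"
DCTERMS = "{http://purl.org/dc/terms/}"

def core_metadata(docx_bytes: bytes) -> dict:
    """Return author/created/modified fields from a .docx's core.xml."""
    with zipfile.ZipFile(io.BytesIO(docx_bytes)) as zf:
        root = ET.fromstring(zf.read("docProps/core.xml"))
    return {
        "creator": root.findtext(f"{DC}creator"),
        "created": root.findtext(f"{DCTERMS}created"),
        "modified": root.findtext(f"{DCTERMS}modified"),
    }

# Build a minimal stand-in .docx in memory to demonstrate (invented values).
core_xml = (
    '<cp:coreProperties '
    'xmlns:cp="http://schemas.openxmlformats.org/package/2006/'
    'metadata/core-properties" '
    'xmlns:dc="http://purl.org/dc/elements/1.1/" '
    'xmlns:dcterms="http://purl.org/dc/terms/">'
    '<dc:creator>A. Student</dc:creator>'
    '<dcterms:created>2023-06-01T09:00:00Z</dcterms:created>'
    '<dcterms:modified>2023-06-01T09:00:30Z</dcterms:modified>'
    '</cp:coreProperties>'
)
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("docProps/core.xml", core_xml)

meta = core_metadata(buf.getvalue())
print(meta)
```

An essay whose "created" and "modified" stamps sit thirty seconds apart, with no edit history, is exactly the kind of anomaly that would justify a closer look.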
Randomised spot checks should also be carried out, as they can identify anomalies that need deeper investigation.
Rigorous identity verification is also needed to confirm the identity of each person submitting evidence. While this doesn’t directly address the use of tools like ChatGPT, it could protect against more advanced forms of AI-based cheating, such as deepfakes, in the future.
There should also be collaboration between educators, exam boards and technology providers to share information, best practices and emerging techniques for detecting AI-generated evidence. Collaborative networks can enhance the collective ability to identify and address threats.
BCS also calls for teaching and assessment methods to be continuously adapted to address emerging AI-related risks, and for clear policies to be put in place. BCS welcomes guidance from the Joint Council for Qualifications, AI Use in Assessments: Protecting the Integrity of Qualifications, and agrees that, as the guidance states, exam centres must now develop a coordinated approach and policies to address AI risks and misuse. Schools should also update existing plagiarism policies to account for AI; these will only be effective if communicated to learners and staff so everyone understands what is expected of them.

The human touch
Another unintended consequence could be the undermining of the one-to-one connection between teachers and pupils in areas such as personalised learning. The emotional aspects of education, such as mentorship, encouragement and empathetic understanding, could be compromised.
Niel Mclean, BCS head of education, said: “The integration of AI into education presents a unique opportunity for enhancing the learning experience. However, it’s crucial to approach its implementation with a balanced perspective. While AI can certainly offer tailored support to students and facilitate administrative tasks, educators must remain at the heart of the educational process. The emotional connection and mentorship teachers provide are irreplaceable and contribute significantly to a holistic learning journey.”
Generative AI has the potential to revolutionise education, but schools need to approach its integration into learning and assessment cautiously. The role of AI is to augment, not replace, the human touch in education.
In conclusion, educators need to be well-trained, and clear policies must be in place to harness AI’s benefits while safeguarding against pitfalls.

About BCS
BCS is the professional body for information technology. In addition to the Institute’s professional community of 70,000-plus members, BCS activities include ‘Education and Public Benefit’, for example supporting thousands of computing teachers through its peer-to-peer network, Computing At School (CAS). Additionally, its ‘Learning and Development’ division develops and maintains professional standards and certification, and is an Ofqual-regulated awarding body; BCS is also an approved apprenticeship end-point assessment organisation (EPAO). We also accredit computer science courses at 110 universities. BCS’ key areas work together to ‘Make IT Good for Society’ and help ensure that current and future IT professionals are competent, ethical and responsible, whatever their role or area.