A Strategic Approach to the Wild West of AI in Education

The rapid rise of Generative AI tools like ChatGPT has sparked both excitement and apprehension in the education sector. While these tools offer immense potential for enhancing productivity and learning experiences, they also raise concerns around data privacy, accuracy, and the potential for misuse. School staff, already grappling with increasing workloads and the complexities of modern education, must now navigate this new frontier carefully.

Schools, mindful of the risks of unregulated AI and its potential to reshape education as dramatically as Google Search changed the way we collect and recall information, have until recently kept AI out of their classrooms, despite increasing workloads and the challenge of providing equitable learning experiences to all students. Yet with productivity gains now at our fingertips, the allure of ever-improving AI tools such as ChatGPT, Gemini and Gradescope, which promise to free teachers' time for teaching rather than classroom admin, has created a new Wild West. Without formal policies or training in place at most institutions, new risks are emerging that school leaders must address.

The Dangers of Unregulated AI Tools
While innovation is essential for driving success, it must not come at the expense of safeguarding students and protecting data privacy. Copying and pasting data into textboxes on websites that may have lax privacy policies or process data outside of the UK is a quick way to fall foul of UK legislation. Schools need to ensure that the platforms and tools their staff use adhere to regulations and cybersecurity best practices.

Google’s Gemini Education is embedded into your existing Google Workspace for Education suite and maintains the same privacy and security standards preferred by the DfE and Cyber Essentials, keeping your organisation’s staff and student information safe: no data leaves the text documents or spreadsheets where it already resides.

Beyond data and security, the quality and ethics of new AI applications are also a concern. AI tools trained on data from across the internet are known for perpetuating bias and producing inaccurate responses known as ‘hallucinations’, which can be especially harmful in a learning environment. Without additional fact-checking and sanity checks from teachers and other school staff, this defeats the point of using AI for time-saving productivity gains.

Separate from Gemini Education, Google is currently developing NotebookLM, which lets organisations use the power of language models to extract insights from their own documents faster. This new tool can answer questions, summarise facts, explain complex ideas and brainstorm new connections based only on the data provided, keeping research grounded in what you know to be true.

A Path to Responsible Use of AI in Education
One way school leaders can ensure their staff are adequately protecting their own information is to implement a rigorous evaluation strategy. However, with billions being poured into this new industry by the biggest companies in tech, it is a struggle to find the time required to stay on top of new developments. Equally, with time and money already being squeezed, the power of AI lies in freeing school staff to spend more of their time improving learning outcomes, not in spending more of both to maintain their safeguarding and cybersecurity posture.

Gemini Education is designed with a strong emphasis on privacy and security, prioritising data residency and providing robust encryption to safeguard user data at rest and in transit. Strict data minimisation and retention policies, along with constant monitoring, ensure the level of security you expect from Google. Unlike general-purpose AI models, Gemini Education’s seamless integration with Google Workspace focuses on the specific needs of educators while meeting the diverse demands of every role in the 21st-century school.

School leaders who are mindful of the security and safety implications of allowing their staff to use AI should consider a managed pilot programme to ensure proper training is provided, best practices are shared, risks are understood and every benefit is realised. As Google’s #1 Education Premier Partner in the UK and Ireland, Getech has developed the Trailblazers programme to help schools and MATs start their AI journey with Gemini Education in a safe, structured way, with measurable success.

If you are interested in integrating AI into your school responsibly, reach out to Getech.