Category: Artificial Intelligence

TransforMATive Innovation Lab: Data Leaders

I walked into Google’s London office with a simple aim: could we move the conversation on from tools to value? By the end of the morning it was clear that the answer is yes, but only if we are honest about where we are and deliberate about where we are going.

We started with a tour of what is now possible. Gemini continues to mature at pace, from image and video generation to deep research and code on canvas. New workflow features promise to stitch everyday tasks together. That was exciting, but the best moment came when we looked past the feature list and into the architecture and guardrails that make this safe for schools. Enterprise deployment, data residency, sandboxing and clear human approval points. That is where confidence grows.

The highlight for me was a practical AI agent story. A simple HR assistant that answers routine questions, checks policy and prepares actions has already given real hours back each week. Nothing flashy. Just a clear problem, a small pilot and a measurable outcome. It reminded me that transformation is rarely a single leap. It is a series of well chosen steps that build trust and capability.

Across the room we heard the same pressures: real-terms funding constraints, staffing churn and the paradox of doing more with less. The easy response is to chase the next shiny tool. The harder and better response is to design our digital estate with the same seriousness we give to our buildings. Name the architect. Decide what good looks like. Integrate systems. Improve data quality. Measure the experience of staff and pupils, not just the cost line.

We used a value compass to ground our choices. Yes, efficiency matters. So do risk reduction, staff and pupil experience and, for some, new revenue models. When leaders frame decisions through that lens, conversations move from technology to strategy, which is where they belong.

If there was a single word that captured the day it was intent. Hope is not passive. It is choosing the next right step and taking it together. Our next steps are clear. Define the data leadership approach. Audit the digital estate. Pilot one safe AI agent with human approval in the loop. Share what works so we all move faster. My thanks to our speakers and to everyone who gave their time and thinking. The energy in the room was real. Now we turn it into outcomes.

Empowering Education Through AI: Reflections from Our AI in Education Conference

We were honoured to work alongside the brilliant Zaitoon Bukhari from ATC Trust to design and deliver this fantastic event.

Artificial Intelligence is no longer a futuristic concept; it’s here, reshaping the way schools operate, teachers teach, and learners engage. Our recent AI in Education Conference, held in collaboration with ATC Trust, brought together educators, leaders, and innovators from across the sector to explore how AI can be harnessed responsibly, creatively, and effectively in schools and trusts.

The response was overwhelmingly positive. Delegates left feeling inspired, informed, and empowered to take their next steps toward meaningful AI integration. Here’s what they had to say.

Relevance, Quality, and Organisation: Setting a New Standard

Across the board, delegates rated the conference as Excellent or Very Good in every category from the relevance of topics to the quality of speakers, networking opportunities, and overall organisation.

Attendees particularly valued the event’s balance between strategic vision and practical implementation. The sessions offered both high-level insight and hands-on guidance, equipping leaders to begin applying AI tools safely and effectively in their own contexts.

“The conference was excellent, informative, thought-provoking, and brilliantly organised. It gave us the confidence to move forward with AI in our schools.”

Learning, Sharing, and Taking Action

The conference provided a platform for collaboration and reflection. Delegates highlighted the panel discussions, workshops, and networking sessions as standout elements that encouraged sharing of ideas and strategies.

From ethical considerations to policy development, AI audits, and teacher training, participants left with a renewed sense of purpose and clarity about their next steps.

Many reported that they will now:

  • Audit their school’s current AI use
  • Develop or refine AI policies
  • Appoint digital champions to lead AI initiatives
  • Build staff confidence through targeted professional development

“It was so helpful to talk with colleagues about where we are now and where we want to be. The event gave us tools to create a clear strategy for AI in our trust.”

Themes That Resonated Most

While every session received positive feedback, several themes emerged as particularly impactful:

  • Practical implementation of AI in the classroom
  • Ethical and safeguarding considerations
  • AI for administrative efficiency
  • Personalised learning through AI
  • Teacher training and professional development
  • Policy and strategic planning for AI adoption

These themes highlight the education sector’s growing commitment to embedding AI not as a novelty, but as a sustainable, purposeful part of teaching and learning.

Inspiring Confidence and Collaboration

One of the strongest takeaways was the sense of collective optimism that filled the room. Delegates described the event as “a fear-free introduction to AI”: an opportunity to learn, question, and share ideas in a supportive environment.

“The conference created an open space to explore AI with confidence and curiosity. It’s helped us understand how to use AI safely and purposefully.”

By the close of the day, the message was clear: AI in education is not just about technology; it’s about people, pedagogy, and purposeful change.

Looking Ahead

Delegates also shared their hopes for future events, expressing interest in deeper dives into:

  • Ethical leadership in AI
  • Data protection and governance
  • Real-world case studies of successful AI implementation
  • Safeguarding and inclusivity in AI systems

The appetite for continued learning is strong, and it’s clear that educators are eager to shape the future of AI in education together.

Final Reflections

“The AI in Education Conference was an inspiring and empowering experience. The sessions were engaging, the discussions were rich, and the takeaways were immediately actionable. It was the perfect balance of strategy and practice: a must-attend event for any school leader looking to embrace AI with confidence.”

As AI continues to evolve, so too does the educational landscape. Events like this one play a crucial role in helping schools and trusts navigate that journey, ensuring that innovation is always grounded in ethics, inclusion, and impact.

Privacy Notices: Statutory & Best Practice

Last week at the forum we discussed how to construct a privacy notice, considering both the statutorily required inclusions and other useful information. Whilst we discussed privacy notices generally, there was an underlying focus on the changes that may be required for organisations adopting AI systems, and on AI features that have gone live in their current systems.

Setting the Ground Rules: The Importance of Transparency

Privacy notices are how organisations meet the transparency requirements set out in Articles 13 and 14 of the UK GDPR. Being open and upfront about what you do with people’s personal data helps you deal with them in a clear and transparent way. This makes good sense for any organisation and is key to developing trust with individuals.

There is no prescriptive legislative description of how a privacy notice should be set out, although it does need to include the following types of information:

  • The name and contact details of your organisation
  • The contact details of your data protection officer
  • The purposes of the processing
  • The lawful basis for the processing, explaining which lawful basis you are relying on in order to collect and use people’s personal data and/or special category data
  • The legitimate interests for the processing
  • The recipients, or categories of recipients, of the personal data
  • The details of transfers of the personal data to any third countries or international organisations
  • The retention periods for the personal data
  • The rights available to individuals in respect of the processing
  • The right to withdraw consent and how to do so
  • The right to lodge a complaint with a supervisory authority
  • The details of whether individuals are under a statutory or contractual obligation to provide the personal data, and what will happen if they don’t provide that data
  • The details of the existence of automated decision-making, including profiling. This is particularly important when AI is being used for placing pupils in capability-related classes, exam levels and similar decisions which have a significant effect on a pupil.

AI & Privacy Notices: New Challenges

Any AI systems that process personal data must be included in the recipients and international transfers sections at a minimum. If a system is entirely AI-based, you should explain what the system is used for, who the vendor is, and the name of the system. It may be easier, and more user-friendly, to add a separate AI section addressing these systems. If AI features have been added to existing systems, you should expand the section of your notice that refers to that system or processor to explain the feature. This might include, for example, transcription tools in Teams or Google Meet, or grading features in edTech systems.

For any systems used for automated decision-making and/or profiling, there are extra legal provisions to comply with. You should confirm that you use AI-enabled decisions, when you use them and why, and which systems and vendors are involved. It is important to include a “human-in-the-loop” for decisions that have legal or similarly significant effects, as Article 22 gives individuals the right not to be subject to a solely automated decision.

Article 21 of the UK GDPR also gives individuals the right to object to any profiling that you carry out on the basis of legitimate interests or a public task. In these cases, an individual can object on grounds relating to their particular situation. This applies to all systems and not just those which use AI.

If you do not use AI for automated decision-making and/or profiling, it can be useful to state this within your privacy notice, but you would need to be certain that edTech systems aren’t being used in this way in any of your schools. Given that vendors are rushing to introduce AI into their systems, it might not be possible to confidently state this in your privacy notice.
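
To make the “human-in-the-loop” point above a little more concrete, the short Python sketch below shows one way an AI suggestion could be held back until a named member of staff has approved or overridden it. It is purely illustrative: the names used (Recommendation, record_decision, the reviewed_by field) are hypothetical and do not describe any particular product or system.

    from dataclasses import dataclass

    @dataclass
    class Recommendation:
        """An advisory suggestion produced by an AI system (hypothetical example)."""
        pupil_id: str
        suggestion: str
        rationale: str

    def record_decision(rec: Recommendation, reviewed_by: str, approved: bool) -> dict:
        """Record the outcome only once a named member of staff has reviewed it.

        Because a human makes or overrides the final decision, the outcome is not
        a solely automated decision in the Article 22 sense.
        """
        return {
            "pupil_id": rec.pupil_id,
            "ai_suggestion": rec.suggestion,
            "ai_rationale": rec.rationale,
            "reviewed_by": reviewed_by,  # named human approver
            "final_decision": rec.suggestion if approved else "referred back for review",
            "solely_automated": False,   # human approval is the deciding step
        }

    # Example: a teacher reviews the AI suggestion and approves it.
    rec = Recommendation("pupil-123", "higher tier entry", "consistent scores above the threshold")
    print(record_decision(rec, reviewed_by="Class teacher", approved=True))

However your systems actually implement this, the point for the privacy notice is the same: describe when AI-assisted decisions are used and make clear that a human review sits between the suggestion and any decision with legal or similarly significant effects.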

Q&A Session

A great debate emerged during our Q&A session about centralised control versus academy-level autonomy when it comes to privacy notices. Privacy notices are the responsibility of the ‘data controller,’ which in a multi-academy trust (MAT) is the Trust itself, not the individual academies. While there’s nothing stopping a Trust privacy notice from having a section relating to processing at each individual academy, this may be redundant.

The question to consider is what school-specific information would be included that couldn’t already be part of the Trust notice. If this relates to the use of systems, it may be worth adding in a paragraph for a specific school under the relevant section.

It’s also worth splitting your privacy notices into separate documents for different classes of data subjects, as a single notice can become quite large. This could include separate notices for pupils, parents/guardians, staff, governors/trustees, and suppliers/contractors. You might also consider a visitor notice, especially if you have CCTV on site.

Final Thoughts

Privacy notices and the implications of AI are complex topics, and these are just some of the key takeaways from our forum discussion. As we move forward, we’ll continue to explore new challenges. Our next session will be on 14 November at 12:45 pm, where we’ll be diving into the latest on Article 30 Records of Processing Activities, including what’s required and a recommended process for populating them.

I look forward to seeing you there!

Click here to add it to your Google Calendar or download the attached .ics file at the bottom of this blog post.

Thanks again to everyone who joined the session. See you at the next one.

Please feel free to reach out if you would like to find out more about our range of data protection, information governance & AI governance services.

Why AI Literacy Must Be Central to Modern Education

At TransforMATive, we believe that AI literacy isn’t a future aspiration; it’s a present necessity. As multi-academy trusts navigate the complexities of digital transformation, equipping staff and pupils with the skills to understand and engage with artificial intelligence is fast becoming essential. We support MATs to move beyond simply adopting AI tools, helping them embed AI literacy into their curriculum, CPD programmes, and strategic planning, ensuring their communities are confident, informed, and ready to lead in an AI-driven world.

Rethinking teaching and learning in the age of artificial intelligence

Artificial intelligence (AI) is no longer a distant concept; it’s already embedded in classrooms, shaping the curriculum, and redefining what it means to be digitally literate. But as AI capabilities continue to evolve, we must ask: are we equipping learners, and educators, not just to use AI, but to understand and critically engage with it?

A compelling new paper by Kong and colleagues sets out a clear case: AI literacy should be a fundamental component of contemporary education, not just for pupils but for teachers, school leaders, and those shaping strategy across the system.

What Do We Mean by AI Literacy—And Why Is It So Important?

AI literacy goes far beyond knowing how to use tools like ChatGPT. It’s about developing a mindset. A blend of knowledge, skills, and ethical awareness that allows people to use AI responsibly, creatively, and with confidence.

According to Kong et al., AI literacy must sit alongside digital and media literacy in the curriculum. Not everyone needs to be a computer scientist, but every learner should be able to:

  • Understand what AI is and where it shows up in everyday life
  • Question its impact on fairness, bias, and inclusion
  • Use AI tools to solve real-world problems and support innovation

A Force for Inclusion and Lifelong Learning

AI has the potential to personalise learning in powerful ways, tailoring support, pace, and content to individual needs. But unless we teach AI literacy alongside this, we risk deepening existing inequalities.

Kong and colleagues advocate for equity-informed AI education, ensuring all pupils have access to the knowledge and tools to thrive in an AI-driven world, not just the digitally advantaged.

Done well, this isn’t just about skills for today; it’s about preparing young people for a lifetime of learning and work in an AI-enhanced society.

What Can We Do in Schools and Trusts?

The message from Kong et al. is clear: AI is already reshaping the world around us. Education can either keep pace or risk being left behind. By prioritising AI literacy, we empower pupils not only to navigate an AI-rich world but to shape it.

The question isn’t whether we should teach AI literacy; it’s how soon we can embed it meaningfully into teaching and learning.

https://doi.org/10.1080/14703297.2024.2332744

As the role of AI in education continues to grow, now is the time to ensure your trust is not just keeping up, but leading the way. Whether you’re looking to develop a trust-wide approach to AI literacy, up-skill your workforce, or embed AI into your strategic vision, TransforMATive is here to help. If you’d like to explore how we can support your journey, please don’t hesitate to get in touch, we’d love to connect.

AI in Education: Taking Your First (or Next) Steps

Exceed Academies Trust, in partnership with TransforMATive, hosted our first AI in education conference on 28 February 2025 at the University of Leeds. The conference was an opportunity for Exceed to share our AI journey and to help other schools and trusts start theirs, or take their next step on the journey. As an outward-facing trust with a strong moral purpose to contribute to system-wide improvements in education, we committed not only to sharing our journey but also to publishing a wealth of resources and tools to support the wider sector.

The event, held at the University of Leeds, brought together educators, innovators, and system leaders from across the sector for an inspiring day of discovery, challenge, and opportunity.

TransforMATive believe that sharing what works—and what doesn’t—is critical. And this event did exactly that. We heard from schools at the very start of their journey and from those already embedding AI into daily practice with impact. There was a powerful sense of collective purpose: to better understand how artificial intelligence can support Multi-Academy Trusts (MATs) and transform education for staff and students alike.

The interactive nature of the event stood out. Attendees got hands-on with VR and AR headsets, explored new product demonstrations, and even had the chance to engage with neuroimaging technologies such as EEG—showcasing how deep cognitive insights could help personalise learning experiences.

With teacher workload at an all-time high, one of the central questions was: How can AI support those at the front line? Whether through intelligent automation, smarter lesson planning tools, or marking assistance, we saw examples of technology being used not to replace educators, but to amplify their impact and reclaim time for what matters most.

Our sponsors played a vital role in bringing this vision to life. They showcased cutting-edge tools and features, providing a tangible sense of what’s possible right now—and what’s just around the corner.

Above all, what made the day special was the willingness of participants to share, to challenge assumptions, and to reimagine what education could look like in an AI-augmented world.

We’re proud to be part of this movement—and even prouder to help connect the dots for MATs navigating this change. AI is not a distant future; it’s already reshaping how we think about teaching, learning, and leadership.

Let’s keep the conversation going.

From Good to Great: Walking the Journey Together

At TransforMATive, we believe that moving from good to great isn’t just a slogan—it’s a deliberate strategy. As Jim Collins reminds us, “greatness is not a function of circumstance. Greatness, it turns out, is largely a matter of conscious choice.” And for us, that choice is shaped by disciplined people, disciplined thought, and disciplined action.

For Multi-Academy Trusts, this journey isn’t a solo venture. It’s fuelled by long-term vision, robust governance, and the thoughtful enablement of technology—including the emerging power of AI. When these elements align, we see more than organisational growth; we see communities thrive and children flourish.

It was a privilege to join Samira Sadeghi from the Confederation of School Trusts (CST) for her keynote on this very theme. Her reflections brought clarity to the leadership required for transformation—and reinforced the importance of bold, strategic decision-making in education. A heartfelt thank you to Samira for her insight, and to Lorrayne Hughes OBE and the Cumbria Education Trust team for the generous invitation and warm hospitality.

And most importantly, thank you to the pupils whose performances brought such energy and inspiration to the day. Their passion reminds us exactly why this work matters.

From vision to action, from systems to culture, the path from good to great is not a straight line. But it is a shared journey—and one we’re proud to walk with courage, clarity, and care.