SPECIAL FEATURE - May 2026
Anchored Leadership in an AI Tide
David Ingram
Recently, I spent time with school leaders, researchers and practitioners in a session led by Professor Rose Luckin, exploring AI, critical thinking and what genuine AI literacy might look like. The most compelling voices in the room, however, were the students, drawn from across the Dulwich College International family of schools. A particular highlight was their panel - impressive, articulate, and strikingly matter-of-fact in how they described already using AI to enhance, not replace, their learning. For them, AI is an everyday companion: they use it to test ideas, clarify understanding and refine their work, with little of the anxiety that so often colours adult discussion. The future is not something being done to them; it is something they are already actively shaping.

What felt different was the normality. Unlike the introduction of interactive whiteboards or BYOD, this is not a school-led innovation being carefully implemented. Students are already using AI, independently and intuitively. This is a technological shift that is happening with or without us.

That reality sharpens the question of AI literacy. AI literacy is often misunderstood as technical competence or fluency with new tools. In reality, it is far broader and more human: the capability to understand what artificial intelligence is, to question what it produces, and to use it responsibly, ethically and effectively. Crucially, it is about judgement, not mastery.

At its foundation, AI literacy begins with clear mental models. Learners need to understand that AI systems do not “think” or “know”; they identify patterns in data and generate predictions. This helps demystify AI - preventing it from being treated as either magic or authority - and foregrounds its limits: it can be helpful, but it can also be wrong, biased or misleading.
As learners develop, AI literacy becomes increasingly about critical evaluation: questioning how an output was generated, recognising when confidence masks inaccuracy, and verifying information against trusted sources. It also means being alert to bias and representation - who benefits, who is excluded, and why apparently neutral technologies can produce unfair outcomes. In many ways, mature AI literacy looks like strong critical thinking applied to a new domain.

Ethical awareness runs through all of this. AI-literate individuals treat issues of privacy, consent, transparency and fairness as part of everyday use, not as abstract add-ons. They recognise that AI systems shape choices, attention and opportunities, and that responsibility for their use still rests with the human. Agency matters: knowing when to rely on AI and when to push it aside.

A particularly important aspect is cognitive offloading. AI can take on routine or mechanical tasks - summarising information, generating drafts, spotting patterns - reducing cognitive load and increasing efficiency. Used well, this frees human capacity for analysis, creativity, empathy and ethical judgement. Used poorly, it leads to over-reliance, shallow learning and the erosion of core skills. Being AI literate means making conscious choices about when offloading is wise, and when the hard work of thinking must be done by the learner.

All of this sits within a wider tension in school leadership. We are surrounded by bold, seductive visions of the future - personalised AI compressing the school day into a few efficient hours, freeing students to pursue their passions. Personalisation does carry real promise: the ability to adapt learning to individual needs, pace and context in ways traditional models often struggle to achieve. But there are shadow sides, even to ideas we like.
Highly personalised systems risk a kind of sycophancy, where technology - and sometimes people - learn to tell users what they want to hear, constantly affirming preferences rather than productively challenging them. In a learning context, that is dangerous: students may become less accustomed to disagreement, less resilient in the face of critique, and less practised at navigating the frictions of real classrooms and workplaces.

Over-personalisation carries a related risk. If every task, resource and interaction is finely tuned around each learner, at what point do we start to blunt their ability to function in less tailored environments? Classrooms, communities and workplaces require compromise, shared norms and the capacity to work with what is given, not just what is optimised. There is a line - hard to draw, but important to name - beyond which personalisation narrows rather than expands our students’ readiness for the world.

Many of the loudest advocates for these futures have a vested interest in them becoming reality. Their optimism may be genuine - but it is not neutral. We have, of course, been here before. As a child in the 1970s, I remember being told that advances in technology would provide us with unimaginable leisure time. That prediction has not aged well: today’s concerns around wellbeing and purpose suggest quite the opposite.

At the same time, there is an equal danger: the pull of nostalgia. I recognise this tension in myself. I am drawn to what feels grounded and enduring - but those instincts are shaped by my own experiences, not necessarily by our students’ futures. We cannot afford to be seduced by simplified extremes - neither uncritical enthusiasm for AI-driven personalisation, nor reflex resistance in defence of a remembered past. The path forward will be complex, often messy, and sometimes uncomfortable. What matters is that we stay anchored.
That may be the defining leadership challenge of our time. How do we move forward thoughtfully - embracing the genuine power of AI and personalisation, while also protecting the collective, the challenging and the imperfect spaces in which human learning so often thrives - without rushing toward the future, or retreating into the past?

David Ingram is Head of Dulwich College, Singapore.

SPECIAL FEATURE - March 2026
Education for a World in Turmoil
Simon Watson
Remember when we thought it couldn't get any worse than Covid? Believe me, it can. Education for a world in turmoil is the purpose of schools these days. We need to prepare students for the most extreme circumstances, from war to climate breakdown to social fragmentation, all wrapped up in a post-truth world. Students need extraordinary emotional resilience, and the critical skills of a PhD student, to decipher conflicting information from multiple sources. Jigsawing fragments of information into something that vaguely resembles coherence is now a skill for us all - a bit like the work of a carpenter past their prime, whose cut pieces of wood no longer quite align.

SPECIAL FEATURE - January 2024
Appreciative Enquiry: A New Inspection Paradigm
Rob Stokoe
The current inspection paradigm has long since outlived its original purpose. The case for inspection is that it provides an independent, external evaluation and identifies what needs to improve for provision to be good or better.