Keeping the human in the loop: Reassuring educators in the age of AI
In October 2025, we ran a weekend of joint workshops in collaboration with the International Studies Association (ISA). The workshops were convened with an overarching theme of ‘Transforming the International: Scholarship and Solidarity in a World of Inequalities’. They took place in Newcastle, UK, with over 250 participants.
This briefing draws on insights from the BISA-ISA joint Workshop on AI Pedagogies: Practice, Prompts and Problems in Contemporary Higher Education, sponsored by the ASPIRE (Academic Scholarship in Politics and International Relations Education) Network.
As universities scramble to keep up with technological developments in artificial intelligence (AI), it is easy to forget that much of the anxiety about these tools is not really technical but deeply personal. Educators often worry less about whether they will be able to ‘keep up’ with handling the technology, and more about what AI might do to them as teachers. The fear that teaching could be reduced to prompt-writing or content moderation runs deep, surfacing questions of identity and professional purpose.
This unease is understandable. In an academic culture built on expertise and judgment, the idea of outsourcing explanation to a machine feels like a threat to craft and care alike. As one colleague at the Newcastle workshop put it, “AI challenges the identity of the educator…the one who knows, interprets, translates, and gives meaning.”
The fear behind the function
If ontological security refers to a stable sense of self and identity, research on technology adoption shows that resistance often stems from its obverse; in this case, a sense that one’s professional worth is being unsettled. When educators describe AI as “deskilling” or “dehumanising”, they are really describing a loss of agency: the worry that teaching will be done without us rather than by us.
Fortunately, history offers perspective. Every major technological shift – from the introduction of calculators to the arrival of the internet – has sparked similar fears. In each case, the role of the teacher changed, but did not disappear. The educator’s value lies in contextualisation, interpretation, empathy, curation, and the ability to build the bridge between knowledge and understanding – all tasks that AI cannot authentically replace.
Reskilling, not replacement
Reframing AI as a partner in that process requires confidence and guidance. For educators, reskilling does not mean mastering every new tool, but reflecting on which components of those tools are worth integrating into established pedagogical frameworks.
Vygotsky’s (1978) zone of proximal development reminds us that learning happens just beyond a student’s current ability, through guided interaction. AI can extend this zone, providing scaffolding or feedback, but it still relies on human calibration. Kolb’s (1984) experiential learning cycle offers another guide: experience, reflection, conceptualisation, application. AI may generate simulations or examples, but teachers help students reflect and connect these experiences to theory.
Socio-material approaches, such as those developed by Fenwick (2015), further emphasise that knowledge emerges through relationships – between people, tools, and both material and social environments. AI simply adds another actor to that network. The task for educators is to curate those interactions so that technology enhances, rather than dictates, the learning process.
The value of being human
In the rush to adopt new technologies, it is easy to forget why human presence still matters. Students consistently say they learn best when they feel seen and taken seriously. As Davies (2025) reminds us, “most students simply want to be taken seriously, treated honestly and paid attention to,” not treated as a cash-cow or a problem to be solved. AI cannot replicate that sense of being noticed or valued. Social interaction is an inherent part of the higher education experience, which cannot be fully replaced by human-machine interaction.
The educator’s role, therefore, is not diminishing but evolving. It involves helping students to use AI critically and in context: to question its sources, recognise its biases, and appreciate that AI systems are, in the words of Bender et al. (2021), “stochastic parrots” – fluent generators of text that imitate understanding but do not possess it. It also means modelling how to combine technological competence with ethical judgment and intellectual humility.
Towards confidence and care
Updating policies and offering one-off workshops will not suffice to support staff through this transition. Given the pace of technological change, training must be continuous, responsive and tailored to different teaching contexts, mindful of ethics and inclusion, and grounded in real and concrete examples of good practice (rather than just listing prohibitions). Workshops that allow educators to share experiences and test AI tools in a safe environment can rebuild confidence and reduce resistance. Recognising experimentation in promotion or workload models would signal that AI literacy is a valued professional skill, not an optional extra.
Most importantly, universities must communicate clearly that AI will not replace teachers but will reshape the conditions in which teaching takes place. The goal is a partnership between human and machine that enhances learning while preserving academic values.
A closing thought
AI can automate feedback, simulate case studies, search for information, and generate endless variations of exam questions. What it cannot do is care, interpret, empathise, or inspire. These are not sentimental claims; they are the foundations of learning. The human remains indispensable because our task is not to outperform the machine, but to understand what learning is for.
If universities invest in reskilling and reflection, educators can approach AI with confidence rather than fear. Keeping the human in the loop will ensure that the digital university remains a place where intelligence – both artificial and human – develops in dialogue, not competition.
If you’d like to find out more about the joint BISA-ISA workshops, take a look at our summary video on YouTube or visit the workshop website.