A practical roadmap for AI in South African classrooms
By Johan Steyn

Start small, protect children, train teachers, fix assessment, then scale what actually works.

Audio summary: https://youtu.be/JaN3wDfR_CU
Follow me on LinkedIn: https://www.linkedin.com/in/johanosteyn/
The Gauteng Premier has called for the government to back the use of artificial intelligence in schools, including allowing pupils to use AI in their schoolwork.
The instinct is understandable: AI is here, learners are already experimenting with it, and education cannot pretend the world is not changing. But “AI in schools” can quickly become a slogan that hides the hard work. If we do this badly, we widen inequality, undermine assessment integrity, and expose children to new risks. If we do it well, AI could become a practical support layer for teachers and a learning accelerator for pupils. The difference will not be the model. It will be the roadmap.
CONTEXT AND BACKGROUND
South African schooling has two realities at once. In some schools, learners have devices, reliable connectivity, and teachers who can experiment. In many others, basics are still a daily struggle: overcrowding, limited resources, and infrastructure constraints that make any technology rollout uneven by default.
That is why a national push for AI in schools cannot be treated as a single switch to flip. AI is not one tool, and “use” is not one behaviour. There is AI for tutoring, AI for lesson planning, AI for admin, AI for language support, and AI for assessment practice. Each comes with different costs and different risks.
We also need to be honest: learners will use AI whether policy keeps up or not. The question is whether schools guide its use responsibly, or whether we leave it to chance and inequality.
INSIGHT AND ANALYSIS
A workable roadmap starts with the principle of least regret: begin where AI creates clear value with manageable risk.
Step one is teachers first. The biggest leverage is not giving every child an AI app. It is helping teachers save time and improve quality. AI can assist with drafting lesson plans, creating differentiated worksheets, generating examples, simplifying reading passages, and preparing quizzes. But teachers need training, boundaries, and confidence. If teachers feel threatened or overwhelmed, the rollout dies quietly.
Step two is governance before scale. Schools need simple, plain-language rules: what is allowed, what is not, and what requires supervision. This must include privacy and consent, especially for minors. It must also cover data handling: what can be uploaded, what must never be uploaded, and what tools are approved. Without this, AI becomes the new shadow IT in education.
Step three is child safety by design. AI use intersects with bullying, manipulation, and harmful content. Schools need reporting pathways, age-appropriate safeguards, and clear escalation. The message to children must be practical: if something feels wrong, report it; if something seems too good to be true, verify it; never share personal details.
Step four is assessment reform. The most predictable failure mode is treating AI as a cheating problem while keeping old assessment methods. If homework is easily generated, then homework becomes less useful as proof of learning.
Schools must shift towards assessments that show thinking: in-class work, oral explanations, drafts with reflection, process evidence, and project-based tasks where learners defend their reasoning. The goal is not to ban tools, but to test understanding.
Step five is phased learner access. Start with structured use cases: guided tutoring for specific topics, reading support, practice questions, and feedback on writing that focuses on improvement rather than replacement. Build digital literacy into the curriculum: how to ask good questions, how to check accuracy, how to cite sources, and how to recognise when a tool is confidently wrong.
IMPLICATIONS
For government and provincial departments, the smartest move is targeted pilots with clear outcomes. Choose a representative mix of schools, measure impact on literacy and numeracy, track teacher workload, and publish what works and what doesn’t. Avoid big-bang procurement driven by hype. Fund connectivity and devices where needed, but tie spend to learning outcomes, not press releases.
For school leadership, treat AI as a change programme, not an app rollout. Create an internal policy, pick approved tools, train staff, and communicate clearly with parents. Establish an ethics and safety checklist. If a tool needs broad access to data, treat that as a red flag, not a convenience.
For parents, the practical question is not “Is my child using AI?” It is “How are they using it?” Encourage learning behaviours: using AI to practise, to explain, to explore, and to improve. Discourage replacement behaviours: copying, outsourcing thinking, and hiding use.
CLOSING TAKEAWAY
AI in schools is coming. The only real choice is whether it arrives as chaos or as a guided, equitable shift. A practical roadmap is not complicated: start with teacher enablement, lock down governance and child safety, redesign assessment so it measures understanding, and run small pilots that prove value before scaling. If we get the order right, AI could help teachers teach and learners learn. If we get the order wrong, we will simply automate inequality and call it innovation.
Author Bio: Johan Steyn is a prominent AI thought leader, speaker, and author with a deep understanding of artificial intelligence’s impact on business and society. He is passionate about ethical AI development and its role in shaping a better future. Find out more about Johan’s work at https://www.aiforbusiness.net





