
Why Ubuntu needs multilingual AI to work in South African schools

Restorative discipline fails when children can’t express themselves; AI translation can make repair conversations fairer and more human.





Over the last few days, I’ve been helping my sister plan her PhD. She is a seasoned social worker, and she teaches students at a local university. Sitting with her drafts, her research questions, and the ethical complexity of working with children has taught me a lot. It has also sharpened one belief: restorative discipline in schools is only as strong as a child’s ability to be heard. 


In a multilingual country like South Africa, being heard often means being heard in the child’s “language of the heart”: the language they reach for when they are scared, angry, ashamed, or trying to explain what really happened. This is where multilingual AI becomes more than a technology story. It becomes a dignity story, and potentially a child protection story.


CONTEXT AND BACKGROUND

South Africa’s under-resourced primary schools carry a heavy load. Teachers are expected to manage learning gaps, behavioural incidents, trauma spill-over from communities, and language barriers, often without enough psychosocial support. Ubuntu-centred restorative discipline asks for something deeply human: to repair relationships rather than simply punish behaviour, and to restore belonging rather than humiliate a learner.


But restorative conversations depend on nuance. A child who can only speak in short, broken English under stress is not “less truthful”; they are less able to explain their lived experience. When language fails, misunderstandings multiply, emotions escalate, and discipline can become unfair.


At the same time, we have to be honest about the operational and ethical risk of introducing AI into school contexts, especially when children are involved. The promise of multilingual AI is real, but it only becomes a net positive if schools treat it as a safeguarded support tool rather than an uncontrolled “app” that learners and teachers casually feed with personal information.


A recent South African legal briefing on the use of AI in schools highlights exactly this concern: many AI platforms collect or process children’s personal data without proper consent, oversight, or clear POPIA-aligned controls. That warning matters here because any AI-assisted restorative conversation must prioritise confidentiality, data minimisation, and child protection from day one.


INSIGHT AND ANALYSIS

The promising shift is that multilingual AI is finally becoming more practical for African contexts. Google has begun rolling out AI Overviews and AI Mode support for 13 African languages, including Afrikaans, Sesotho, Setswana, and isiZulu, which signals serious investment in African-language AI experiences.

Language capability is also being strengthened “under the hood”. Google’s WAXAL initiative launched an open dataset aimed at improving African speech technology, tackling the data scarcity that has historically left African languages behind in AI.


This matters because the school use case is not abstract. Imagine a teacher trying to de-escalate a conflict, or a social worker trying to understand what a child is communicating about a tense incident. The goal is not to replace the adult. The goal is to remove the language bottleneck so the adult can do their job better. Voice-first, low-friction translation and paraphrasing can help a child explain, help an educator reflect back accurately, and help caregivers engage without shame or misunderstanding.


But Ubuntu is not only about “being understood”. It is also about safety, belonging, and relational repair. That means the design must be responsible. A translation tool that stores sensitive conversations, or encourages surveillance, can do harm. The tool must minimise data, protect confidentiality, and be used as support for human judgement, not as an authority over a child.
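For technically minded readers, “minimise data” can be made concrete. The sketch below, in Python, shows one possible shape for a privacy-first translation helper: names are redacted before any text leaves the device, and nothing is logged or stored afterwards. The translate_text function, the language codes, and the redaction list are hypothetical placeholders for whichever approved tool a school actually adopts, not a reference to any specific product.

import re

# Illustrative sketch only: "translate_text" stands in for an approved translation service.
def redact_names(text, known_names):
    # Data minimisation: strip learner and staff names before any text is sent for translation.
    for name in known_names:
        text = re.sub(re.escape(name), "[name]", text, flags=re.IGNORECASE)
    return text

def assist_conversation(utterance, known_names, translate_text):
    # Redact first, translate second, keep nothing afterwards.
    safe_text = redact_names(utterance, known_names)
    # The source and target languages are placeholders for whatever the school configures.
    translation = translate_text(safe_text, source="zu", target="en")
    # The result is shown to the adult in the room; it is never written to disk or logged.
    return translation

The point of the sketch is the ordering: redaction and deletion are designed in before any translation happens. That ordering, not any particular vendor, is the practical meaning of data minimisation in this context.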


IMPLICATIONS

For school leaders, the first step is not buying software. It is defining the use case clearly: language support for restorative conversations, caregiver communication, and learner reflection, with strict boundaries around privacy. A practical starting point could be a controlled, approved set of multilingual tools, with training for teachers on when to use them, how to verify meaning, and how to document incidents safely.


For policymakers and funders, this is an inclusion lever. The Masakhane Hub has highlighted the need to bridge the AI gap across African languages, including through funding and collaboration; that work should be treated as foundational infrastructure for education and public services.


And for universities and researchers, the work is already moving. SADiLaR has pointed to an “AI in the Pedagogy of African Languages” conference hosted at the University of KwaZulu-Natal, reflecting growing momentum around AI and African-language education.


CLOSING TAKEAWAY

Ubuntu-centred restorative discipline is not a script. It is a relationship practice, built on listening, dignity, and repair. In multilingual South Africa, that practice weakens when children cannot speak in the language that best carries their emotion and meaning. Multilingual AI will not solve poverty, trauma, or under-resourcing. But it can remove a critical barrier to being heard. If we implement it with strong ethics, child protection, and simple governance, it can become a practical bridge between policy ideals and classroom reality, helping adults understand children before they judge them.


Author Bio: Johan Steyn is a prominent AI thought leader, speaker, and author with a deep understanding of artificial intelligence’s impact on business and society. He is passionate about ethical AI development and its role in shaping a better future. Find out more about Johan’s work at https://www.aiforbusiness.net

 
 
 
