Teaching my son to talk to machines, not be controlled by them

Watching my 11-year-old explore an AI phone reminds me that tomorrow’s literacy is not just reading and writing, but learning to converse with algorithms.





I write about issues I believe deserve the reader's attention. While my main work is in artificial intelligence and technology, I also cover politics, education, and the future of our children.


There is something quietly profound about sitting next to my 11-year-old son while he uses a Samsung Galaxy Z Fold phone, its screen opened like a small book, asking questions about life, schoolwork and his hobbies. He is not just tapping and scrolling; he is talking to Google Gemini, experimenting with prompts, refining his questions, and listening carefully to the answers.


As a parent, I realise that one of the most important things I can teach him is not how to operate a gadget, but how to think with it: how to ask better questions, how to challenge the answers, and how to see artificial intelligence as a partner in learning rather than a shortcut or a crutch. This, I believe, is the new digital literacy our children will need.


CONTEXT AND BACKGROUND

Children are growing up in a world where AI is not an abstract concept, but a daily companion in their pockets. For many of us, our first digital literacy was learning how to use a search engine. For our children, it is learning how to have a conversation with an AI assistant. Instead of typing keywords into a browser, they speak naturally, asking for explanations, stories, summaries and advice.


In South Africa and across Africa, this shift happens against a backdrop of uneven education quality, infrastructure challenges and deep inequality. For some children, advanced devices and AI tools will be normal. For many others, reliable internet access is still a luxury. This makes the way we introduce AI to our children even more important.

If those who have access learn to use these tools thoughtfully, they can gain a powerful educational advantage. If they merely use them to copy homework answers or disappear into endless entertainment, we are simply amplifying existing problems with new technology rather than solving them.


INSIGHT AND ANALYSIS

When I watch my son chatting to Gemini, I am struck by how much prompting resembles a form of literacy. To get a useful answer, he has to explain what he really wants, provide enough context, and iterate when the response is not quite right. He is practising clarity of thought and language. He is learning that vague questions produce vague answers, and that better inputs lead to better outputs. That, in itself, is a valuable life skill.


But there is a danger if we mistake fluent responses for unquestionable truth. Large language models can sound confident even when they are wrong. They reflect biases in their training data. They do not know my son, our family or our values. So, alongside teaching him how to prompt, I have to teach him how to doubt. We talk about cross-checking information, about asking “how do you know?” and about understanding that AI is a tool built by humans, not an oracle dropped from the sky.


This is where the deeper form of AI literacy lives: in the ability to use these systems without surrendering judgment. If we do not help our children develop this critical habit, we risk raising a generation that outsources its thinking to the most convenient voice on the screen. The challenge is to raise young people who can harness AI’s strengths while still doing the hard work of reflection, empathy and moral reasoning themselves.


IMPLICATIONS

For parents, the implication is clear: banning AI outright or ignoring it will not protect our children. They will encounter these tools anyway: at school, with friends, or later in the workplace. Our responsibility is to walk alongside them, to model healthy use, and to turn everyday interactions with AI into learning opportunities. Ask them what they are curious about. Help them frame their questions. Discuss the answers together, including where the AI might be wrong or incomplete.


For educators and policymakers, AI literacy should become part of the curriculum, not as a niche technical module, but as a core competency alongside reading and writing. Children need to understand how these systems work at a high level, what they are good at, where they fail, and how to use them ethically. If we ignore this, we leave them unprepared for a world where AI will be woven into almost every profession and public service, including in South Africa’s already strained education and labour markets.


CLOSING TAKEAWAY

Watching my son use a powerful AI-enabled phone does not fill me with fear; it fills me with urgency and responsibility. The devices will only grow more capable. The real question is whether we will grow more intentional in how we guide our children. Teaching them to prompt well is really about teaching them to think clearly, to stay curious, and to remain in charge of the tools they use.


If we want a future where technology amplifies human potential rather than narrowing it, we must raise children who know how to talk to machines without surrendering their own voice. That, I believe, is one of the greatest gifts we can give them.


Author Bio: Johan Steyn is a prominent AI thought leader, speaker, and author with a deep understanding of artificial intelligence’s impact on business and society. He is passionate about ethical AI development and its role in shaping a better future. Find out more about Johan’s work at https://www.aiforbusiness.net
