AI will not fix what you do not understand
- Johan Steyn
Lessons from a healthcare call centre on why leaders must get closer to the work before spending on clever technology.

Audio summary: https://youtu.be/sLxnYJDZcj4
I write about various issues that I want to bring to the reader’s attention. While my main work is in Artificial Intelligence and technology, I also cover politics, education, and the future of our children.
About a year ago, a large South African healthcare company asked me to help them use AI and automation in their contact centre. The numbers were eye-catching: half a million inbound calls every month. As so often happens in these rooms, the conversation jumped almost immediately to chatbots, predictive analytics and real-time agent assistance. Everyone wanted to know which clever tools to buy. I decided to ask a different question: why are you getting so many calls in the first place? The silence that followed was revealing. It was not that the leaders were ignorant or uncaring; they simply had never been close enough to the daily reality of the call centre to see the problem clearly.
CONTEXT AND BACKGROUND
Across sectors such as healthcare, banking and insurance, contact centres are treated as both a burden and an opportunity. They are expensive to run, yet they provide a wealth of customer insight. AI vendors promise to turn this pressure into a competitive advantage: tools that predict intent, surface the right information instantly and coach agents as they speak. In principle, there is nothing wrong with these ambitions. In practice, many organisations reach for AI before they have taken the time to understand what is driving the demand in the first place.
Part of the problem is distance. Senior leaders typically experience their organisations through PowerPoint decks and KPI dashboards, not through headsets and customer queues. They see call volumes and average handling times, but not the confused parent trying to submit a first medical claim at ten o’clock at night. When you are far from the work, it is easy to believe that technology is the missing ingredient, rather than the way your own processes and communication may be generating avoidable complexity.
INSIGHT AND ANALYSIS
In the case of this healthcare provider, once we moved past the AI sales pitch and examined the reality on the ground, a simple pattern emerged. Roughly 80% of the calls were from new customers. Most of them were not asking sophisticated questions about benefits or exclusions. They were first-time members trying to submit a claim and struggling to understand how. The onboarding pack they received was written in dense, legalistic language. The website journey was fragmented and confusing. Instructions were scattered across channels. The call centre was, in effect, acting as an expensive help desk for bad communication.
My team and I did not begin by deploying algorithms. We started with common sense. We worked with frontline staff to understand where customers got stuck. We rewrote the onboarding material in plain, friendly language. We simplified the step-by-step guidance on the website. We created short, clear self-help videos walking customers through their first claim. None of this required cutting-edge AI. It required empathy, listening and a willingness to accept that the real problem sat within the organisation’s own design, not with the customer.
Six months later, call volumes had dropped to around 60% of their previous level. Agents were less overwhelmed, customers were less frustrated, and leaders could now explore AI from a position of greater clarity. The experience confirmed something I see repeatedly: if you do not understand the work, AI becomes a distraction. Technology amplifies whatever is already there. If the underlying process is broken, AI will help you get the wrong outcome faster.
IMPLICATIONS
For leaders, the first implication is practical and uncomfortable: get closer to the work. Spend time in your contact centre or at your service desk. Listen to real customer calls. Ask agents what they see every day that never makes it into the board pack. Very often, the most powerful interventions are low-tech: clearer forms, better scripts, redesigned letters, more intuitive digital journeys. These are not glamorous changes, but they can transform demand patterns long before an AI system is switched on.
The second implication is about how we think. AI should be treated as a means, not an end. The real goals are efficiency, effectiveness and a humane experience for customers and staff. Once you have a grounded understanding of the problem and have addressed the obvious friction points, AI can absolutely help: predicting likely call reasons, providing agents with suggested responses, and automating repetitive follow-ups. But if you skip the foundational work, you risk spending large budgets optimising noise.
CLOSING TAKEAWAY
That healthcare call centre remains one of my favourite stories because it exposes a wider temptation in our AI-obsessed age. It is seductive to believe that the latest model or platform will rescue us from complexity. Yet the most important questions are still human and basic: why is this happening, who is affected, and what small, sensible changes could make their lives easier? AI will not fix what you do not understand. If leaders are willing to leave the comfort of the boardroom, listen carefully to the people doing the work and trust their common sense, they can create conditions where AI genuinely adds value. That is the kind of thoughtful, grounded progress we should want for our organisations, and for the future our children will inherit.
Author Bio: Johan Steyn is a prominent AI thought leader, speaker, and author with a deep understanding of artificial intelligence’s impact on business and society. He is passionate about ethical AI development and its role in shaping a better future. Find out more about Johan’s work at https://www.aiforbusiness.net