The shoemaker’s children in the AI age: why automation vendors must live their own story
- Johan Steyn

When AI and automation vendors neglect their own operations, clients should treat the gap between promise and practice as a serious warning sign.

Audio summary: https://youtu.be/EkVdjwfviSo
I write about issues that interest me and that I want to bring to the reader’s attention. While my main work is in Artificial Intelligence and technology, I also cover politics, education, and the future of our children.
In recent years, I have sat in many boardrooms with local and global firms that promise to revolutionise their clients’ efficiency with AI and automation. The decks are polished, the demos are impressive, and the language is filled with talk of transformation. Yet, once the meeting ends, a different reality emerges. Emails go unanswered for weeks, documents are produced in a chaotic manner, and agreed actions quietly evaporate.
The old proverb about the shoemaker’s children going barefoot takes on a sharper meaning in the AI age. When the people selling automation cannot manage their own follow-up, it is not a charming irony; it is a risk signal that clients can no longer afford to ignore.
CONTEXT AND BACKGROUND
Across South Africa and globally, organisations are under immense pressure to “modernise” through digital transformation. Cloud providers, automation specialists and AI consultancies present themselves as guides into this promised future, offering tools from the likes of AWS, Microsoft and Google as the route to streamlined operations and smarter decisions.
For many businesses, especially in emerging markets, these vendors shape not only technology choices but also how leaders think about efficiency, productivity and the future of work. Yet this growing dependency makes it even more important to ask a simple question: do these firms actually practise what they preach inside their own walls?
INSIGHT AND ANALYSIS
When an automation vendor’s internal behaviour is slow, fragmented and unreliable, it suggests a deeper problem than poor manners. It raises doubts about their ability to implement robust processes, manage change and learn from their own experiments with AI. If a company cannot use its own tools to track commitments, generate timely documentation or coordinate teams, how ready is it to guide a client through similar complexity? In an AI-driven landscape, operational discipline is not a cosmetic detail; it is the foundation on which any serious automation effort must rest.
The danger for clients is that the “magic” of AI can become a kind of smoke and mirrors. Clever demonstrations prove that a tool can work in principle, but they tell us little about whether it works in practice, at scale, over time. The vendors who impress me most are not the ones with the flashiest slides, but those who can calmly show how they use AI to manage their own sales pipeline, generate proposals, support their staff and learn from failures. They are frank about what went wrong, what did not deliver the expected return, and how they adapted. That honesty is worth more than any marketing slogan.
IMPLICATIONS
For boards, executives and public-sector leaders in South Africa and across the continent, the lesson is straightforward: before signing the contract, turn the questions back on the vendor. Ask them to walk you through how they use AI and automation in their own organisation. Where has it genuinely improved response times, reduced manual work or raised quality? Where did it fail, and what did they change as a result? If they cannot answer clearly, or if their own processes look suspiciously manual, that gap should carry significant weight in your decision.
This matters not only for budgets and project outcomes, but for the future we are building for our children. Every failed or superficial digital project drains scarce resources that could have strengthened education, healthcare or infrastructure. Choosing partners who live their own story of responsible, disciplined automation is part of our duty of care: to ensure that the tools we adopt genuinely improve how we work, rather than simply adding another layer of jargon to already stressed systems.
CLOSING TAKEAWAY
The age of AI demands a higher standard of honesty from those who sell transformation. Clients should no longer be satisfied with glossy narratives about efficiency while the vendor’s own operations stumble along in email chaos and missed commitments. The shoemaker’s children cannot remain barefoot without consequence.
By insisting that partners demonstrate how they use AI and automation in their own businesses, and by listening carefully to the lessons they have learned, organisations can cut through the smoke and mirrors. In doing so, we give ourselves a better chance of building digital systems that truly serve people, protect our limited resources, and leave a more coherent, opportunity-rich world for the next generation.
Author Bio: Johan Steyn is a prominent AI thought leader, speaker, and author with a deep understanding of artificial intelligence’s impact on business and society. He is passionate about ethical AI development and its role in shaping a better future. Find out more about Johan’s work at https://www.aiforbusiness.net





