Many Companies Have an AI Strategy. Only One in Four Has Governance to Match It
- Johan Steyn

The world's largest study of corporate responsible AI finds a dangerous gap between what organisations say about artificial intelligence and what they actually have in place to manage it

Video summary: https://youtu.be/UYs7oYVgJtE
A new global report has confirmed, with the weight of nearly three thousand companies assessed across eleven sectors and five regions, what many business leaders have suspected but few have been willing to state plainly: the gap between artificial intelligence ambition and artificial intelligence accountability is wide, systemic, and growing. The findings deserve careful attention from every executive and board director in South Africa.
CONTEXT AND BACKGROUND
The AI Company Data Initiative, a joint programme of the Thomson Reuters Foundation and UNESCO grounded in UNESCO’s Recommendation on the Ethics of AI, has produced what it describes as the world’s largest study assessing corporate AI adoption globally. Covering 2,972 companies across eleven sectors and five regions, the study draws on more than 100,000 data points sourced from publicly available corporate disclosures, including annual reports, ESG reports, governance and responsible AI webpages, cybersecurity and privacy policies, and direct company survey participation. The result is the most comprehensive picture of corporate AI governance currently available, and what it shows is not reassuring.
Artificial intelligence adoption has become near-universal. Research from McKinsey cited in the report found that the share of companies using AI in at least one business function rose to 88 per cent in 2025, up from 78 per cent the year before. Across sectors, organisations are integrating AI into customer-facing services, internal workflows, and decision-making processes at an accelerating pace. The governance infrastructure required to manage that integration responsibly has not kept pace.
INSIGHT AND ANALYSIS
The headline finding is stark. While 43.7 per cent of the companies studied report having an AI strategy or guidelines, only 13 per cent publicly claim to adhere to a formal AI governance framework. Among those companies with an AI strategy, only around 27 per cent also report adherence to a governance framework. As the report states, AI strategies are frequently developed without a corresponding external commitment to recognised governance frameworks, suggesting that many strategies are oriented primarily towards accelerating adoption and capturing value rather than setting out robust governance commitments.
The operational picture is equally concerning. Only 40 per cent of companies report board or committee-level oversight on AI. Only 12.4 per cent have a policy to ensure a human oversees AI systems. Only 2.7 per cent publicly report having a formal AI model registry. And 72 per cent of companies do not report conducting any impact assessment with regard to AI — whether related to data protection, privacy, human rights, or environmental consequences. The report draws a pointed conclusion: organisations tend to describe governance at a conceptual level but more rarely demonstrate how it functions day to day across the lifecycle of deployed systems.
Where companies do align with a formal framework, the EU AI Act dominates — cited by 53 per cent of those that report framework adherence, despite nearly half operating outside the European Union. This is itself a governance signal: South African companies, operating under King V, POPIA, and a draft National AI Policy, have a rapidly maturing regulatory environment of their own. The governance frameworks being built now will need to serve not just current expectations but the more demanding requirements that follow.
I have written previously about South Africa’s tendency to produce AI strategy documents without the implementation architecture to give them meaning. The AICDI findings suggest this is not a uniquely South African problem — but that does not diminish the urgency of addressing it here.
IMPLICATIONS
For South African business leaders, three implications stand out. First, having an AI strategy is not a governance achievement. It is a starting point. The gap between the 43.7 per cent of companies with a strategy and the 13 per cent with genuine framework alignment represents a material governance risk — one that South Africa’s King V code, with its outcomes-driven accountability standard, is now designed to surface and address.
Second, the absence of human oversight mechanisms is not a minor procedural gap. It is a fundamental accountability failure. When AI systems make or influence consequential decisions — in credit, hiring, risk assessment, customer service — and no policy exists to ensure a human oversees those outcomes, the organisation cannot demonstrate accountability for what its AI does. That is precisely the standard King V now demands, and precisely where most organisations are currently falling short.
Third, the data reveal that governance maturity is concentrated in large-cap firms and in technology, communications, and financial services sectors. Energy, materials, and real estate companies lag significantly. South African organisations across all sectors should audit their own AI governance against the AICDI framework, not as a benchmarking exercise, but as a practical tool for identifying where accountability gaps exist before they become regulatory or reputational ones.
CLOSING TAKEAWAY
The AICDI report is the most comprehensive evidence yet that the corporate world has embraced AI as a capability while treating governance as optional. Nearly nine in ten companies have not publicly committed to any named AI governance framework. Only a fraction have the operational controls — impact assessments, model registries, human oversight policies — that responsible deployment requires. For South African boards operating under King V, and for executives who will be asked to demonstrate not just that they adopted governance practices but that those practices produced measurable outcomes, the message is unambiguous. A strategy document is not governance. The question is what comes after the strategy — and for most organisations, the honest answer is not yet enough.
Author Bio: Johan Steyn is a prominent AI thought leader, speaker, and author with a deep understanding of artificial intelligence’s impact on business and society. He served as a working group member contributing recommendations toward South Africa’s national AI strategy, an initiative by the National Advisory Council on Innovation (NACI), the Council for Scientific and Industrial Research (CSIR), the Human Sciences Research Council (HSRC) and the Department of Science and Innovation. He is passionate about ethical AI development and its role in shaping a better future. Find out more about Johan’s work at https://www.aiforbusiness.net