The Danger of Building on Someone Else’s AI
- Johan Steyn

For creators, agencies and start-ups, Sora is a warning that platform dependency can destroy momentum overnight.

Video summary: https://youtu.be/lFrF1Zae9eI
Sign up for my Substack daily AI newsletter here.
See my AI training course portfolio for corporate business leaders here.
Follow me on LinkedIn: https://www.linkedin.com/in/johanosteyn/
The reported shutdown of Sora matters for far more than one flashy AI video app. It is a reminder that in the rush to experiment with generative AI, many creators, agencies and software start-ups are building on technology they do not own and cannot ultimately protect. OpenAI’s decision appears tied to a wider refocusing of the company, with reporting from MyBroadband, Reuters and WIRED all pointing to a shift towards coding tools, enterprise products and a more streamlined product set rather than costly consumer experiments. That is why this story matters now. Sora is not just about OpenAI. It is about what happens when people mistake access for ownership and excitement for permanence.
CONTEXT AND BACKGROUND
Sora arrived as one of the most visually impressive examples of generative AI. It promised to turn simple prompts into cinematic video and briefly became a symbol of how quickly synthetic media was advancing. Yet The Guardian reported that OpenAI “said goodbye” to the app only six months after the launch of its stand-alone version, despite the app having quickly risen to the top of Apple’s App Store and attracted a community of users who treated it as the future of AI creativity.
Recent reporting suggests that the decision was not random. Reuters reported that OpenAI’s move startled Disney and came as the company shifted towards coding tools, corporate customers and broader AI ambitions, while WIRED described it as part of a “focus era” ahead of a possible IPO and a push towards a unified AI assistant rather than a scattered set of experiments. In other words, what looked essential to users may have looked expendable to management.
For South Africa and the rest of Africa, that is especially relevant. Most local businesses cannot afford to build their own foundational AI systems, so they naturally depend on large global platforms. That is understandable, but it also means local innovation can rest on foreign roadmaps, foreign pricing and foreign strategic decisions. The danger is not using these tools. The danger is building too much of your future inside them.
INSIGHT AND ANALYSIS
The deeper lesson is that many AI businesses are not really building assets of their own. They are building wrappers around somebody else’s model, API or interface. That can create speed, novelty and short-term commercial excitement, but it also creates structural fragility. If the platform owner changes its priorities, pulls an API, changes pricing or exits a category altogether, the downstream business can lose momentum overnight.
Sora is a clear example of that tension. Reuters reported that the app required significant computational resources, while The Guardian pointed to criticism around violent and racist videos, deepfakes, copyrighted characters and misinformation. TechCrunch went even further, calling the moment a possible reality check for AI video and for those who had started speaking as though prompt-generated filmmaking was about to replace Hollywood any day now.
This is also why the story matters beyond the tech industry. Our children are growing up in a digital world where popular tools can feel permanent simply because they are widely used. But popularity is not the same as stability. If young creators, students and entrepreneurs build entirely within closed ecosystems, they may become highly productive while remaining deeply dependent. That is not true empowerment. It is convenience with hidden conditions.
IMPLICATIONS
Business leaders should treat this as a strategic warning. Use AI platforms, certainly, but do not make one provider the foundation of your whole proposition. Keep your data portable. Keep your workflows transferable. Build real value through judgment, integration, trust, human relationships and domain expertise, not merely through access to a fashionable model.
Educators and policymakers should make platform literacy part of AI literacy. People need to understand lock-in, continuity risk, export options and who really controls the tools they use. In Africa, where capital is limited and execution matters, these are not abstract concerns. They shape whether digital progress becomes sustainable or simply dependent.
CLOSING TAKEAWAY
The reported end of Sora is not mainly a story about one AI product disappearing. It is a reminder that power in the AI economy still sits with the platforms, not with the people building on top of them. That does not mean creators or businesses should avoid these tools. It means they should use them with much clearer eyes. In the years ahead, resilience, portability and ownership may matter more than novelty, especially in markets like ours. If Africa wants to build a durable digital future, we should learn early that rented capability is never the same as real control.
Author Bio: Johan Steyn is a prominent AI thought leader, speaker, and author with a deep understanding of artificial intelligence’s impact on business and society. He is passionate about ethical AI development and its role in shaping a better future. Find out more about Johan’s work at https://www.aiforbusiness.net