The hidden cost behind “free” apps
- Johan Steyn

- Dec 19, 2025
- 4 min read
Our favourite “free” apps quietly bill us in data instead of money, and most of us have no idea what we have agreed to.

Audio summary: https://youtu.be/YCDidpDq4R8
Follow me on LinkedIn: https://www.linkedin.com/in/johanosteyn/
I write about issues I find important and want to bring to the reader’s attention. While my main work is in Artificial Intelligence and technology, I also cover politics, education, and the future of our children.
Open your phone and count how many of the tools you use every day cost nothing at all: Waze or Google Maps to get around, Gmail for email, a free weather or fitness app, games to pass the time, and perhaps a “free” VPN or password manager. On the surface, it looks like a miracle of modern capitalism: powerful services with a price tag of zero rand.
But there is a hidden invoice: we pay with our location, our contacts, our browsing habits, our movements and sometimes our most intimate details. Almost nobody reads the terms and conditions. We tap “I agree” because we are in a hurry. The problem is that behind that single tap lies an entire business model built on broad permissions, constant profiling and data sharing that goes far beyond what we imagine.
CONTEXT AND BACKGROUND
For years, people have said: “If you are not paying for the product, you are the product.” It is an old line, but it remains uncomfortably accurate. Free digital tools still have to make money. If they are not charging a subscription, they usually rely on targeted advertising, data analytics or selling access to your information. Studies comparing free and paid apps have repeatedly found that free versions tend to ask for broader and more invasive permissions, including precise location, access to contacts, microphones, cameras and sensors that have little to do with the app’s basic function.
Around this sits a vast and largely invisible data industry. Data brokers and analytics firms aggregate information from thousands of apps and websites, combine it with other sources, and build detailed profiles of individuals and communities. These profiles can include where you go, what you buy, which devices you use, your likely income bracket and your interests.
They are then used to target adverts, adjust insurance risk, shape political messaging or simply sold on to whoever is willing to pay. For people in countries like South Africa, much of this happens offshore, in legal and technical grey areas that are very hard to see or challenge.
INSIGHT AND ANALYSIS
The hidden cost begins with permissions. When you install a navigation app, it obviously needs to know where you are right now. But does it also need a continuous log of your movements, even when you are not actively using it? Does a casual mobile game really need access to your precise location or your address book? Many free apps bundle long lists of permissions into one screen, essentially framing it as a choice between “accept everything” and “do not use the app at all”. In theory, we have a choice; in practice, the design nudges us towards surrendering far more information than we realise.
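If you want to see this for yourself, Android exposes every permission an app declares through its package manager. Here is a minimal Kotlin sketch, assuming you are inside an Android app with a Context available; the package name “com.example.freeapp” is a placeholder, not a real app:

```kotlin
import android.content.Context
import android.content.pm.PackageManager

// Prints every permission a given app declares in its manifest,
// so you can judge whether they match what the app actually does.
// "com.example.freeapp" is a placeholder package name.
fun listRequestedPermissions(context: Context, packageName: String = "com.example.freeapp") {
    val info = context.packageManager.getPackageInfo(
        packageName,
        PackageManager.GET_PERMISSIONS
    )
    info.requestedPermissions?.forEach { permission ->
        // e.g. android.permission.ACCESS_FINE_LOCATION, android.permission.READ_CONTACTS
        println(permission)
    }
}
```

For non-developers, much the same information is visible on the phone itself under each app’s entry in Settings, where you can review and revoke individual permissions.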
Once those permissions are granted, the data rarely stays with the original app. Many developers include external tracking and advertising components in their software. When you agree to the app’s terms, you often also grant access to these third parties, whose names you do not recognise and whose policies you have never read. Your location, device identifiers and behavioural data can then flow into a web of advertising networks, analytics platforms and data brokers.
Over time, each small piece starts to form a surprisingly detailed picture of your life: where you live, where you work, who you meet, what you value.
Targeted advertising is the most visible tip of this iceberg. You see a strangely accurate advert and think, “That is creepy.” But the same machinery can have quieter consequences. Driving data from “free” apps can influence insurance decisions. Shopping and browsing patterns can affect which loan offers or job adverts you see. In an AI-driven world, this data also becomes fuel for training models that predict, score and nudge people at scale. Our supposedly free tools become part of a system that categorises us in ways we rarely understand and never explicitly approve.
IMPLICATIONS
The implications go far beyond abstract privacy debates. For ordinary people, this can translate into different prices, different opportunities and different risks based on opaque digital profiles. For vulnerable groups, including children and teenagers, constant tracking can normalise surveillance and expose them to targeted content at moments when they are least able to make informed choices. In a country marked by deep inequality, like South Africa, we should worry about how these invisible data economies might reinforce old divides in new, algorithmic ways.
Policymakers are slowly waking up. Data protection laws such as South Africa’s Protection of Personal Information Act (POPIA) and similar frameworks elsewhere are starting to put pressure on companies to limit excessive data collection and to be more transparent about sharing. But regulation alone will not fix the problem. Businesses that rely on “free” tools must take their responsibilities seriously, asking whether the convenience is worth the ethical and legal risk. Citizens need to become more sceptical. It is not realistic to expect everyone to read every clause, but we can at least look at which permissions an app demands and ask whether there are more privacy-respecting alternatives.
CLOSING TAKEAWAY
I am not arguing that we should all throw away our phones and retreat from the digital world. Many free apps genuinely improve our lives. But we should stop pretending that they come without cost. There is always a hidden invoice, and it is almost always paid in data. The real question is whether we are comfortable with who sends that invoice and what they do with the payment.
As parents, as professionals and as citizens, we need to move from blind acceptance to informed choice. That means learning to question the apps we install, supporting stronger privacy protections, and teaching our children that convenience is never truly free. In the age of AI and data-driven decision-making, understanding the hidden cost behind “free” apps is not paranoia; it is a basic form of digital self-defence.
Author Bio: Johan Steyn is a prominent AI thought leader, speaker, and author with a deep understanding of artificial intelligence’s impact on business and society. He is passionate about ethical AI development and its role in shaping a better future. Find out more about Johan’s work at https://www.aiforbusiness.net