
Meta Is Not Your Friend — And It Never Was

Privacy · Digital Literacy


A reminder that what you post, say, and write — especially in private messages — has consequences you may not see coming.

Let me be upfront: I use Facebook and Instagram for my business. Both are public-facing, and I'll be honest — I hate that I'm on them. But I understand why so many of us feel we have no choice. That's exactly why I think it's worth saying clearly what these platforms are and what they are not.


Meta — the parent company of Facebook, Instagram, and WhatsApp — is not designed to protect you. It is not your community. It is a data extraction machine, and you are the product. Its ultimate goal is to scrape your existence, package your persona, and sell it to the highest bidder. That's not a conspiracy theory. That's the business model.


What just changed — and why it matters


On March 8, 2026, Meta quietly rolled back end-to-end encryption (E2EE) for Instagram direct messages. If you're unfamiliar with the term, here's what it means in plain language: until now, a private conversation between you and a friend was exactly that — private. Only the sender and the recipient could read it. Not Meta, not hackers, not governments.

Security experts widely consider end-to-end encryption the strongest form of protection for private digital communications.
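For the technically curious, the core idea can be sketched in a few lines of Python. This is a toy, not real cryptography — actual E2EE apps use vetted protocols with far stronger parameters — but it shows the one property that matters here: two people can derive a shared secret key without ever transmitting it, so anything relayed through a server in between is unreadable ciphertext to that server.

```python
# Toy illustration of the end-to-end principle: only the two endpoints
# ever hold the shared key; the relaying server sees only ciphertext.
# NOT real cryptography -- the prime is toy-sized and the cipher is a
# demonstration only.
import hashlib
import secrets

P = 2**127 - 1  # a known prime, far too small for real-world use
G = 3

def keypair():
    """Generate a Diffie-Hellman-style private/public pair."""
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

def shared_key(my_priv, their_pub):
    """Both sides derive the same secret without ever sending it."""
    secret = pow(their_pub, my_priv, P)
    return hashlib.sha256(str(secret).encode()).digest()

def xor_stream(key, data):
    """Toy stream cipher: XOR the data with a SHA-256-derived keystream."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Alice and Bob each keep their private key; only public keys travel.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

k_alice = shared_key(a_priv, b_pub)
k_bob = shared_key(b_priv, a_pub)
assert k_alice == k_bob  # same key, derived independently on each side

ciphertext = xor_stream(k_alice, b"meet at noon")  # all the server relays
plaintext = xor_stream(k_bob, ciphertext)          # only Bob can decode it
```

Removing E2EE means the platform itself holds the keys instead — which is exactly what changed.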

That protection is now gone. Meta's stated reason is child safety and support for law enforcement investigations — and on the surface, that sounds reasonable. But the tradeoff is enormous and largely invisible: every conversation you have on Instagram can now be monitored, stored, and fed into increasingly vast databases that build detailed profiles of who you are. Not just today. For generations.


In December 2025, Meta had already announced that interactions with its AI tools — including those inside private conversations — may be used for targeted advertising. Even before that, Meta AI interactions were being used for AI training. The removal of E2EE is the next logical step in that direction.


Fines don't fix anything


You might wonder: don't regulators hold these companies accountable? The short answer is no — not in any meaningful way. Big Tech is regularly fined billions of dollars for privacy violations and anti-competitive practices. But for companies generating tens of billions annually, those penalties are simply a cost of doing business. In 2025 alone, Google, Apple, Meta, and Amazon could have paid off their collective annual fines within weeks using existing cash flow.


The financial incentive to collect your data far outweighs any penalty for doing so. Until that equation changes, it won't stop.


We are in uncharted territory


Humanity has never lived like this before. We now have corporations that can, in effect, listen to nearly every word we write to one another — across decades, at scale, with no meaningful oversight. The accumulation of that power is something we genuinely don't have the frameworks to understand yet. History tells us that concentrated power over information is almost always abused, eventually.


Think about the posts you made ten years ago. Now imagine how those same posts — the opinions, the jokes, the throwaway comments, the private messages — might be interpreted in another ten or twenty years. These platforms are not curating your memories. They are building a permanent, searchable record of your life, indexed to serve their commercial interests.


Ask yourself: will something you post today cause you problems in fifty years? It's not an absurd question. It's the right one.


What you can do


Privacy is not just for people with something to hide. It is a fundamental right — and protecting it makes life better for everyone. A society where every communication is potentially monitored is one where people self-censor, where dissent becomes risky, and where the architecture of trust quietly collapses.


You don't have to leave these platforms tomorrow. But you should be deliberate about how you use them. Share less than you think is harmless. Move sensitive conversations to platforms that still offer genuine E2EE — Signal, for example. And understand that the algorithms running beneath your feed are not designed to enrich your life. They are designed to maximise your time on the platform, because your attention is what's being sold.


Further reading


Stolen Focus — Johann Hari


Investigative journalist Johann Hari spent years going behind closed doors at some of the world's most powerful tech companies. What he found challenges the story these platforms tell about themselves — that they exist to connect communities and enrich lives. In reality, as Hari documents in forensic detail, the product is your attention, and the business is designed to steal as much of it as possible. A sharp, accessible read for anyone who wants to understand what is actually being built around us.


Learning to keep your cards closer to your chest takes practice. It can feel inconvenient in a world built to reward oversharing. But it is one of the most important digital habits you can build — and the limited data we have from the past fifteen years suggests that the cost of not doing so is only going to grow.

Part of an ongoing series on personal privacy, digital habits, and reducing unnecessary exposure in an increasingly surveilled world.

