TikTok Flat-Out Refuses DM Encryption, Plays the Same Old "Child Safety" Tune
"Child safety" has become the Swiss Army knife of surveillance justification. Governments reach for it when they want to mandate age verification. Regulators reach for it when they want backdoors into encrypted apps. And now TikTok is reaching for it to explain why, unlike virtually every other major messaging platform on earth, it refuses to protect your private messages with end-to-end encryption.
In a briefing at its London office, TikTok confirmed to the BBC that it will not implement end-to-end encryption (E2EE) for direct messages. The reasoning: E2EE would prevent safety teams and law enforcement from reading DMs when they "need to." The company framed this as a deliberate, principled stance to protect users, especially younger ones, from harm.
Anything but actually making your platform safer, isn’t that right?
The "Child Safety" Card, Played Whenever Convenient
End-to-end encryption means only the sender and recipient can read a message. No one in between can see its contents: not the platform, not the government, not a hacker who breaches the server.
It is the baseline privacy standard for modern messaging, and every serious platform has adopted it: WhatsApp since 2016, Signal from day one, Apple's iMessage, Google Messages, Facebook Messenger, and even Snapchat for photos and video. Instagram is in the process of rolling it out by default.
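The core property is easy to demonstrate. Below is a toy sketch of the idea behind E2EE, not how any real app implements it: real platforms use vetted protocols like the Signal protocol, whereas this example uses a deliberately insecure, illustration-only Diffie-Hellman exchange and a hand-rolled stream cipher. The point is just the structure: only public values ever cross the wire, so both endpoints derive the same key while a relaying server never can.

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters -- illustration ONLY, not secure.
# Real E2EE uses vetted constructions (e.g. X25519), never hand-rolled crypto.
P = 2**127 - 1   # a Mersenne prime, far too small for real use
G = 3

def keypair():
    priv = secrets.randbelow(P - 2) + 2   # private exponent, never transmitted
    pub = pow(G, priv, P)                 # public value, safe to relay
    return priv, pub

def shared_key(my_priv, their_pub):
    # Both endpoints compute G^(ab) mod P; the server only ever sees
    # the public values, so it cannot derive this secret.
    secret = pow(their_pub, my_priv, P)
    return hashlib.sha256(str(secret).encode()).digest()

def xor_cipher(key, data):
    # Toy stream cipher: keystream = SHA-256(key || counter). XOR is its
    # own inverse, so the same function encrypts and decrypts.
    out = bytearray()
    for i, start in enumerate(range(0, len(data), 32)):
        block = hashlib.sha256(key + i.to_bytes(4, "big")).digest()
        out.extend(b ^ k for b, k in zip(data[start:start + 32], block))
    return bytes(out)

# Alice and Bob each generate a keypair; only public values cross the wire.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

k_alice = shared_key(a_priv, b_pub)
k_bob = shared_key(b_priv, a_pub)
assert k_alice == k_bob   # identical key derived independently at each end

ciphertext = xor_cipher(k_alice, b"meet at noon")
# The platform relaying `ciphertext` holds neither private key and sees
# only opaque bytes. Only Bob can recover the plaintext:
print(xor_cipher(k_bob, ciphertext))
```

What TikTok is choosing instead is the model where the platform itself holds the keys, so "authorized employees" sit inside the conversation by design.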
TikTok, however, is going in completely the opposite direction. Instead of E2EE, it uses standard encryption in transit and at rest, comparable to what Gmail uses, meaning authorized employees can access message content in response to law enforcement requests or user reports. The company called rejecting "privacy absolutism" a feature, not a flaw.
Child protection charities like the NSPCC backed the decision, arguing that encryption creates hiding places for predators. That argument isn't entirely wrong. But it also isn't the whole picture, not by a long shot. Blaming privacy for a lack of child safety, when your platform is built on a whole arsenal of features known to harm children, is nothing short of absurd.
Who Actually Benefits From Unencrypted DMs
The Electronic Frontier Foundation has made the counter-case clearly: E2EE doesn't just protect criminals. It protects journalists, abuse survivors, activists, and anyone whose private conversations could be weaponized against them. The technology isn't the threat. What you do with readable messages is.
And here's where TikTok's ownership history becomes impossible to ignore. ByteDance, TikTok's parent company, is headquartered in China, where end-to-end encryption is effectively banned across domestic platforms. The US entity restructure, with Oracle and others holding an 80% stake, didn't fully resolve concerns about data governance.
Keeping DMs readable means they remain accessible to TikTok employees, to law enforcement on request, and, potentially, to state actors with the right leverage (and there will be plenty of those, count on it). This is exactly the kind of surveillance infrastructure that security researchers have warned about when pushing back against age verification mandates and other "safety" frameworks that quietly normalize access to private data.
Calling this a child safety decision is like a landlord saying they kept a copy of your house key for your own protection.
The Norm Being Quietly Buried
The deeper damage reaches beyond TikTok's own billion-plus users. It's damage to the expectation that private means private.
Every time a platform this large normalizes readable messages, it softens resistance to the same demand everywhere else. Regulators in the UK, EU, and US have spent years pushing for encryption backdoors, largely blocked by public and legal pushback. TikTok just handed them a working proof of concept and dressed it in concern for children.
I'm not saying TikTok invented this tactic. But I am saying they're running it at scale, on one of the most-used platforms among teenagers globally, and calling it safety. If your DMs aren't private on TikTok, that's not a trade-off for your protection. It's a trade-off for someone else's access.
And the children? They remain hardly any safer and just as addicted to algorithms as before. But sure, who cares, right? At least they won’t have to deal with that “devilish” privacy any longer.
Be part of the resistance, quietly.
Get Mysterium VPN

Dominykas is a technical writer on a mission to bring you information that helps you keep your digital privacy and security protected at all times. If there's knowledge that can help keep you safe online, Dominykas will be there to cover it.
