Insecure by Design

Telegram is seen as the gold standard for secure messaging, but the app isn’t as secure as it paints itself to be. Here are 7 reasons to question Telegram’s privacy claims.


Earlier this month, WhatsApp started showing a notice informing users of an update to the app's privacy policy.

The update seemed to indicate that WhatsApp would share user data with its parent company, Facebook.

Never mind that WhatsApp had already been doing this since 2016: the notice raised awareness of the data-sharing scheme and pushed millions to look for alternatives to the messaging app.

Many flocked to Signal, the secure messenger endorsed by Edward Snowden and already used by journalists and activists worldwide (in fact, Signal got so many new users that its servers went down for a few hours on Friday).

But even larger numbers turned to Telegram: over 25 million people registered on the app within 72 hours. The same thing happened when Facebook purchased WhatsApp in 2014, and again in 2019, when a vulnerability was discovered in WhatsApp and thousands of alarmed users, particularly activists, turned to Telegram for safety.

This increased awareness of digital privacy risks is certainly a positive development. But the fact that Telegram is seen as the gold standard for secure messaging is deeply concerning. So here are 7 reasons why Telegram isn't as secure as it paints itself to be.


1. Chats are not end-to-end encrypted by default 

Unlike Signal or WhatsApp, Telegram conversations are not end-to-end encrypted by default. This means that anyone with access to the Telegram servers can read user chats (messages, photos, audio recordings, etc.), whether that's Telegram staff, a hacker who manages to get in, or a government serving the company a subpoena.

We don't know whether Telegram staff has accessed user conversations in the past or handed over data to governments, but we don't need to know: if they want to, they can.

Telegram does support end-to-end encrypted “secret chats”, which keep data encrypted even on Telegram servers.

But users need to go out of their way to start a secret chat, and in the process lose significant functionality, such as cross-device syncing (secret chats started on the phone are not viewable on Telegram desktop, and vice versa).

Because they join Telegram expecting an already-secure messenger, casual users are often not aware that default chats are not fully private. (It's also worth noting that secret chats are not a particularly unique feature: Facebook Messenger and Skype offer something similar.)
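
To make the distinction concrete, here is a minimal sketch of what end-to-end encryption means in practice. It uses the PyNaCl library purely as an illustration (this is not Telegram's actual protocol): each user holds a private key on their own device, and the server in between only ever handles ciphertext it cannot read.

```python
# Conceptual sketch of end-to-end encryption with PyNaCl (pip install pynacl).
# Illustration only; this is not Telegram's actual protocol.
from nacl.public import PrivateKey, Box

# Each user generates a key pair; private keys never leave their devices.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts a message that only Bob can decrypt.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"Meet at the usual place at 8pm")

# The server relays (or stores) only the ciphertext; without Bob's private
# key, neither the server operator nor an intruder can read it.
receiving_box = Box(bob_private, alice_private.public_key)
print(receiving_box.decrypt(ciphertext))  # b'Meet at the usual place at 8pm'
```

In a default Telegram chat, by contrast, the encryption terminates at Telegram's servers, which is why anyone with access to those servers can read the messages they store.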

2. Group chats are not end-to-end encrypted

Just like regular one-on-one chats, group chats are not end-to-end encrypted. But here, Telegram does not offer a secret-chat option at all. So if you are on Telegram and want a truly private group chat, you're out of luck.

3. Full chat histories are stored in the cloud

The entirety of users' chat history is stored on Telegram servers. This means that if someone gets access to your account, they are able to read every message of every conversation you've ever had on the app (except for secret chats and those you have manually deleted). In the words of security researcher The Grugq, “this is a security nightmare”. 

To get access, a hacker would need to intercept the verification code that Telegram sends when a user activates the app on a new device. This is not theoretical; it happens in real life, including in places where criticizing the government can land you in jail. There are ways of protecting yourself against this threat, such as enabling an extra password required to register a new device, but yet again, most users are unaware of those risks.

This is not an issue unique to Telegram: other messengers that use the phone number as an identifier face the same risks. What is unique about Telegram is that it stores all conversations on its servers.

Other apps like WhatsApp or Signal do not store chat history on their servers, leaving it to the user to back up their conversations on their device. Telegram storing chat history in the cloud adds a great deal of convenience (all conversations are available on any device at any time), but presents serious risks that are often not clear to ordinary users.  

4. Metadata collection

Metadata is data about your conversations. It can include who you talk to, when, and for how long; it can identify members of a group chat, their locations, IP addresses, and so on. Even without access to the content of a conversation, metadata can reveal a lot about someone's life; so much so that a former head of the NSA admitted that the US government kills enemies based on metadata alone. When WhatsApp says it shares data about its users with Facebook, the data it's talking about is that metadata.
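
As a purely hypothetical illustration (the field names below are invented, not Telegram's actual data model), a single metadata record logged by a messaging server might look something like this:

```python
# Hypothetical example of a server-side metadata record.
# Field names are illustrative; they do not reflect Telegram's actual schema.
metadata_record = {
    "sender": "+33 6 12 34 56 78",        # phone number used as account identity
    "recipient": "+49 151 2345 6789",
    "timestamp": "2021-01-15T21:04:37Z",  # when the message was sent
    "sender_ip": "203.0.113.42",          # can place the sender in a city, or a building
    "message_size_bytes": 48213,          # hints at a photo rather than a short text
    "group_id": None,                     # or the ID of the group chat involved
}
```

No message content appears anywhere in that record, yet it already says who talked to whom, when, from where, and roughly what kind of message was exchanged.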

Like WhatsApp, Telegram collects metadata about its users, including IP addresses that can reveal the users' location. To many users, that's much better than Facebook holding that data. But others may not want anyone, not even Telegram, to know who they are talking to, when, and from where.

This is all the more regrettable because private messaging can work perfectly well without collecting metadata. In 2016, as part of an FBI investigation, a US court subpoenaed Signal to hand over all the information it had about two of its users. Because Signal is end-to-end encrypted, and therefore has no access to user conversations, all the team could share with the court was metadata, and it did share all the metadata available: the date the users created their Signal accounts, and the date they were last connected. That's all.

Collecting user metadata is a choice apps make, and Telegram made the privacy-invasive choice.

[Image: the entirety of the user metadata Signal handed over to the FBI]

5. More of a social network than a secure messenger

While Telegram calls itself a “messaging app with a focus on security and speed”, over the years, it’s become more of a social network. Telegram supports groups of up to 200,000 members and channels where admins can broadcast messages to millions, similar to a Twitter feed or Facebook page. There is of course nothing wrong with being a social network. The problem is that users are misled to think that Telegram is a particularly secure or private one. 

In August 2019, when millions of Hongkongers turned to Telegram groups to coordinate protests against Chinese interference, a vulnerability was discovered in the app. An attacker was able to identify the phone numbers of those who were part of those Telegram groups, even if those users had set their app preferences to hide their phone number. This meant that if the attacker was the government, it could use the phone numbers to track down the users' real-world identities and detain them for their activism. The issue was quickly fixed, but it showed yet again that many Telegram features were not built with privacy as a primary concern.


And like all social networks, Telegram engages in censorship: it shuts down groups and channels when it comes under public or political pressure, like when the app became a hub of ISIS propaganda or, just last week, when white nationalists turned to Telegram after being booted from all major social networks.

6. A great deal of confusion

Telegram brands itself as a “secure” app and says its chats are “highly encrypted”. But as all of the above shows, it's a little more complicated than that. Secret chats are end-to-end encrypted, but not regular chats. Telegram's website says nothing about groups being end-to-end encrypted (they are not), so many users may mistakenly believe that they are. Voice calls and video calls are end-to-end encrypted. 

To an ordinary user (not a security expert or a hacker), this is all pretty confusing and error-prone. Users who buy into Telegram's branding are likely to feel like anything they do in the app is safe. Others may mistakenly open a regular chat instead of a secret chat, or forget which feature is end-to-end encrypted and which isn't. All in all, either Telegram was not designed to protect user privacy, or that protection was poorly executed.

7. Worthy of our trust?

A big part of digital security is about trust. It's about asking ourselves questions like: Who are we willing to trust with our data and digital identities? Do they have the right motives? What is their history and reputation? Asking ourselves those questions about Telegram, it's hard to recommend the app. 

The app was launched by Russian billionaire Pavel Durov and his mathematician brother Nikolai, the creators of VK, Russia's most popular social network. When VK topped 100 million users in 2011, Durov started attracting the anger of the Kremlin for refusing to censor Putin's political opposition. By 2014, the situation had escalated to the point where Durov was ousted from the company and left Russia. This is when Durov launched Telegram. He said at the time that “the No. 1 reason for me to support and help launch Telegram was to build a means of communication that can’t be accessed by the Russian security agencies”. But Durov's aim to help Russians bypass censorship says little about his commitment to privacy in general.

The first warning sign about Telegram came early on. When designing the app, the Durov brothers ignored cryptography's golden rule, a consensus among security experts: don't create your own encryption scheme. Cryptography is so complex that designing an encryption algorithm from scratch almost inevitably leads to mistakes, so developers should instead use time-tested, recognized algorithms. But Telegram arrogantly decided to create its own homegrown encryption technology, and as expected it was filled with errors (which remained unaddressed for years).
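
Following that golden rule is not hard: audited libraries expose time-tested constructions behind a few lines of code. Here is a minimal sketch using PyNaCl's authenticated secret-key encryption, shown only to illustrate the principle (it is not a reconstruction of any app's actual code):

```python
# A few lines with an audited library (pip install pynacl) give you
# authenticated encryption designed and reviewed by cryptographers,
# which is exactly what "don't roll your own crypto" recommends.
from nacl.secret import SecretBox
from nacl.utils import random

key = random(SecretBox.KEY_SIZE)             # 32-byte secret key
box = SecretBox(key)

ciphertext = box.encrypt(b"attack at dawn")  # nonce generated and prepended for you
assert box.decrypt(ciphertext) == b"attack at dawn"

# Tampering is detected automatically: decrypting a modified ciphertext
# raises nacl.exceptions.CryptoError instead of returning garbage.
```

Inventing a new construction instead means redoing all of that design and review work from scratch, which is exactly where the errors the experts predicted tend to creep in.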

In the following years, despite fixing those initial cryptographic flaws, Telegram did nothing to address the many worrying weaknesses, and continued to paint itself as the most secure and private app out there, often making its case by arguing that Signal cannot be trusted (despite the near-universal endorsement of Signal by cryptographers, security experts, and whistleblowers).

Years later, Durov's appetite to cash in on the app further raised eyebrows. In 2018, Telegram raised a whopping $1.7 billion from investors to launch a cryptocurrency on the platform, the largest amount ever raised for a cryptocurrency launch. The move was expected to generate a whole lot of cash for Telegram's founders. The project was eventually abandoned after a US court ruled that Telegram didn't comply with financial regulations, but the whole saga raised questions about Durov's motives.

Finally, though initially based in Berlin, Telegram has operated out of Dubai since 2017. Durov explained that the company moved to the United Arab Emirates for its fiscally advantageous policies. This was, according to him, out of a belief in small government rather than financial motivation, though the billion-dollar cryptocurrency fundraising was announced just months after the move to the tax-free zone.

Most worrying, the UAE is a deeply repressive state, routinely jailing political dissidents and journalists for criticizing the ruling family. It is a violent player in regional politics, backing dictatorships against democratic movements, waging a cold war against Iran, and fueling proxy wars in Yemen, Libya, and Syria. The UAE is an unabashed enemy of freedom of speech, with a long list of political enemies. Durov claims that Telegram's server infrastructure is distributed around the world and strategically set up to avoid the jurisdiction of any one country, including the UAE. With no means of verifying those claims, we'll have to take his word for it and hope that the billions of conversations stored on Telegram's servers are, indeed, safe.

Conclusion

As the Electronic Frontier Foundation explains, it would be silly, dangerous even, to give blanket recommendations on what communication tool is the most “secure”. Security is subjective and personal. What is it you want to protect—the content of your conversations, who you talk to, your identity? Who do you want to protect it from—Facebook, Google, the US government, the Chinese government, your boss, your stalking ex-lover? Each app has strengths and weaknesses, and different needs call for different apps.

Telegram may well fit some people's security needs, and it may offer social networking features that people are looking for, free of Facebook surveillance. But the app’s self-branding as “secure” and “highly encrypted” is deceptive and puts users at risk. From Signal to Wire or Briar, there are many apps that were designed from the ground up with security in mind. Telegram is simply not one of them.