Telegram doesn’t live up to its promise of being private or secure. Its end-to-end encryption is opt-in, only applies to one-on-one conversations, and uses a controversial home-grown encryption algorithm. The rest of this article outlines some of the fundamentally broken aspects of Telegram.
↫ h3artbl33d
Telegram is not a secure messenger, nor is it a platform you should want to be on. Chats are not encrypted by default, and are stored in plain text on Telegram’s servers. Only chats between two people (not more!) who both happen to be online at the same time can be “encrypted”. Those quotation marks highlight another massive issue with Telegram: its “encryption” – the home-grown MTProto protocol – is non-standard, and countless security researchers have warned against relying on it.
Telegram’s issues go even further than this, though. The application also copies your contacts to its servers and keeps them there, they’ve got a “People nearby” feature that shares location data, and so much more. The linked article does a great job of listing the litany of problems Telegram has, backed up by sources and studies, and these alone should convince anyone to not use Telegram for anything serious.
And that’s even before we talk about Telegram’s utter disinterest in stopping the highly illegal activities that openly take place on its platform, from selling drugs, down to far more shocking and dangerous activities like sharing revenge porn, CSAM, and more. Telegram has a long history of not caring one iota about shuttering groups that share and promote such material, leaving the victims of these heinous crimes out in the cold.
Don’t use Telegram. A much better alternative is Signal, and hell, even WhatsApp, of all things, is a better choice.
The difference in perspective between the last two articles is interesting. This one upholds the right of a person (and of groups of people) to private conversations, yet in the case of the previous Google article, the editorial appears to argue against that same right in the workplace.
To rehash some of the previous thread: my perspective is that everyone is entitled to private conversations, at work and outside of it. And I don’t expect the platform itself to censor the content of my private conversations.
Corporations are not people. Different rules apply.
Thom Holwerda,
Morally, I am in agreement. It’s wrong to treat corporations as people. They are not deserving of human rights. Corporations should only exist to serve society. To the extent that they are failing to do so then we ought to have a recourse. IMHO we are morally justified in laying down corporate laws making sure they do serve society.
However, this view is typically not applied in legal matters, where corporations do get afforded constitutional rights as people.
https://www.alternet.org/2014/07/10-supreme-court-rulings-turned-corporations-people
https://www.npr.org/2014/07/28/335288388/when-did-companies-become-people-excavating-the-legal-evolution
And I think there’s enough corruption to sustain the “corporations are people” position even if it isn’t sensible.
:-/
The corporation itself isn’t a person (I’m not arguing that), but people work for those corporations. Their rights don’t get suspended because of their employment.
If a person is entitled to private communication as a right, they are entitled to it inside and outside of work.
Adurbe
Morally, you don’t have to tell me that, I agree.
This is admittedly US-centric, but here in the US privacy rights actually do get suspended in favor of employers. The constitution only applies to government surveillance (and even that gets ignored). Privacy from employers isn’t a thing in federal law.
Morally, everything about it is wrong, but I’ve heard of companies asking for social media accounts and access to non-public posts. It sucks, but in many states employees do not have a legal right to digital privacy even on private social media accounts outside of work! Those rights, if any, come from the states, and my state doesn’t have social media privacy protections.
https://www.justia.com/employment/employment-laws-50-state-surveys/social-media-privacy-laws-in-the-workplace-50-state-survey/
https://www.nolo.com/legal-encyclopedia/state-laws-on-social-media-password-requests-by-employers.html
Telegram sucks as a company and service, but the problem is:
– for larger group chats (more than a dozen people, say), it works really well. Telegram’s E2EE is a joke, but E2EE matters less in groups of this size – any member can leak the contents anyway – and WhatsApp and Signal are not fit for this purpose. Discord is comparable, but still a bit different.
– their client game is on point. They have native clients for every platform that matters, with no Electron to be seen. They’re fast, pleasant to use, and integrate into the system (on the Mac, for instance, Telegram is the only chat app other than iMessage that integrates with the share sheet).
>”Telegram’s E2EE is a joke”
If Telegram’s E2EE is a joke like you and Thom claim, then why is France imprisoning the founder? You would think their bright spies would have already broken the code and wouldn’t need leverage in order to get Pavel to give up the secret sauce.
France didn’t arrest the founder “to get leverage”. They arrested him precisely _because_ they were able to view communications describing illegal activity – including outright illegal communication, such as transmitting child porn – and because Telegram is refusing to moderate, or deny access to, the purveyors of said illegal activities.
If Telegram _actually_ had implemented strong E2EE for all user communications, and could legitimately claim to be _unable_ to access (and hence moderate) said user comms, Pavel and Telegram’s other executive officers wouldn’t be legally liable in the way that they almost undoubtedly currently are in most jurisdictions.
Kuraegomon,
The article doesn’t really allege that the E2EE was broken. I know virtually nothing about Telegram or its founder, but just as a point of logic: it is possible for law enforcement to know that Telegram was being used to conduct illegal activities without asserting, or even implying, that the E2EE has been broken by said law enforcement. It wouldn’t be the first time governments have requested companies defeat cryptographic controls. Many of us probably remember the San Bernardino shooting case, where the FBI demanded Apple hack a phone to bypass its security measures (IIRC a third party was able to break into the phone without Apple). In the Telegram case, it seems plausible that a court would be willing & able to find a Telegram officer in contempt for refusing to obey its demands. Without more details we can’t really say if the demands were reasonable in terms of the computer science.
They don’t have to break E2EE or even breach Telegram’s servers to know what is going on there; they can simply confiscate or trojan-horse an offender’s device, or infiltrate the group undercover. The real problem with Telegram and its weak encryption & closed-source infra is that the company can impersonate anybody at any time. Given its huge user base and the fact that it is likely controlled by the FSB, this makes it a powerful info-weapon. It also allows mining huge amounts of open chats for statistical intel.
Security is always relative to the threat model.
Moreover, Durov was not simply arrested; he surrendered himself. Whether voluntarily or not is another question, but still.
mbq,
That problem isn’t unique to Telegram, though. It’s a weakness of iMessage too: the clients have E2EE, but how do you think the public keys used for the crypto are distributed? They come from Apple’s centralized directory server, which manages keys transparently. Apple can impersonate users as well. Crypto like GPG is more manual and therefore harder to use, but technically more secure from centralized control.
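The standard mitigation for a malicious key directory is out-of-band fingerprint comparison (what Signal exposes as “safety numbers”). A minimal sketch of the idea, with purely illustrative names rather than any real messenger’s API:

```python
# Sketch of out-of-band key verification: if the directory server swaps
# in its own key, the fingerprints the two users compare will diverge.
import hashlib

def fingerprint(public_key_bytes: bytes) -> str:
    """Derive a short, human-comparable fingerprint from a public key."""
    digest = hashlib.sha256(public_key_bytes).hexdigest()
    # Group into 4-character blocks so users can read them aloud.
    return " ".join(digest[i:i + 4] for i in range(0, 32, 4))

# Alice fetches what the directory server claims is Bob's key...
key_from_server = b"bob-real-public-key"
# ...and compares its fingerprint with the one Bob shows her in person.
assert fingerprint(key_from_server) == fingerprint(b"bob-real-public-key")

# Had the server substituted its own key, the mismatch would expose
# the man-in-the-middle:
substituted_key = b"attacker-public-key"
print(fingerprint(key_from_server) != fingerprint(substituted_key))  # True
```

The catch, of course, is that almost nobody actually performs this comparison, which is why centralized key distribution remains a practical impersonation risk even in E2EE systems.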
While I agree with you about that, I just don’t think those criticisms are unique to Telegram. Rolling your own crypto is frowned upon, but nobody has reported a break of Telegram’s. I also strongly dislike proprietary services, but this isn’t unique to Telegram either.
I know nothing about him, but I’m not sure what the implication is? Does surrendering to the police make him guilty of anything?
>”France didn’t arrest the founder ‘to get leverage’.”
Of course. Because tech CEOs are always detained without any prior criminal indictment while passing through international transit lounges. Happens to Zuck like every other day from what I hear. No, there’s no chance this could POSSIBLY be an attempt to squeeze him for access to the stuff that they don’t already have.
Telegram is an infinite source of “war in Ukraine” news, and that sells clicks.
>”And that’s even before we talk about Telegram’s utter disinterest in stopping the highly illegal activities that openly take place on its platform, from selling drugs, down to far more shocking and dangerous activities like sharing revenge porn, CSAM, and more. Telegram has a long history of not caring one iota about shuttering groups that share and promote such material, leaving the victims of these heinous crimes out in the cold. Don’t use Telegram. A much better alternative is Signal, and hell, even WhatsApp, of all things, is a better choice.”
Signal claims to do end-to-end encryption and has slammed the EU for their recent upload moderation proposals. So you won’t find all your list of vile nasties being moderated there. If you want constant surveillance and moderation and restriction of all communication, then places like China and North Korea have you pretty well covered. Fancy some WeChat to go along with your WhatsApp?
If you want to moderate the content, then either it has to be unencrypted, or the decryption method has to be provided to the authorities – thereby giving the state the means of censorship / law enforcement / spying, depending on your perspective. You can’t really have it all ways.
This. If you provide true E2EE by default, then moderation is simply not a thing. I personally believe that this is the way. Law enforcement and governments simply need to use the plethora of other methods they have at their disposal to track down lawbreakers.
And – more importantly – dissidents and all others whose very lives may depend on having access to strongly-encrypted messaging will have a viable option.
This is not true; you can still moderate in E2EE, mainly by controlling particular users’ access to the system, but also through automated filters on the endpoints (this one is stupid & ineffective, but still). E2EE is also not sufficient to hide the metadata of who is talking to whom at a given time, which is actually the only thing high-level agencies want. Getting the plain text is only interesting for authoritarian and incompetent police-level forces.
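The metadata point is easy to make concrete: even when the payload is sealed end-to-end, the server that routes the message still sees who is talking to whom, when, and how much. A toy envelope (illustrative only, not any real protocol – the XOR pad here is just a stand-in for real encryption):

```python
# Toy illustration: E2EE makes the payload opaque to the server,
# but the routing metadata stays in the clear by necessity --
# the server has to know where to deliver the message.
import time
from secrets import token_bytes

def seal(sender: str, recipient: str, plaintext: bytes) -> dict:
    # Stand-in for real encryption: XOR with a random pad.
    ciphertext = bytes(b ^ k for b, k in zip(plaintext, token_bytes(len(plaintext))))
    return {
        # Visible to the server: who, to whom, when, how much.
        "from": sender,
        "to": recipient,
        "timestamp": time.time(),
        "size": len(ciphertext),
        # Opaque to the server: the actual content.
        "payload": ciphertext.hex(),
    }

envelope = seal("alice", "bob", b"meet at the usual place")
# The server can log the social graph without ever decrypting anything:
print(envelope["from"], "->", envelope["to"], f"({envelope['size']} bytes)")
```

This is exactly the traffic-analysis data ("who, when, how often") that intelligence agencies value, and it survives even perfect content encryption.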
I’m not sure I understand the point you’re making here. In an encrypted message app, yes you can ban a user/number, but nothing would stop them getting another account. And who would implement the ban? Without seeing the content what would the context/justification be?
You could argue that if there is lawful interception, there is no secure encryption. So it might be better to use none at all (like Telegram in groups) and not care.
Are telephone companies responsible if illegal stuff is discussed on their network?
Do they cancel contracts if police tells them to, or is an order from a judge needed?
WhatsApp?
Come on, it’s a US company, so it needs to respect the Patriot Act and the CLOUD Act… so, no security.
I use no messaging service. SMS once in a while, and I participate in forums online.
If you want to be shady, just join the darknet and get a really good bouncer. WhatsApp is a nothingburger in northern Europe anyway.