r/technology Feb 17 '25

Social Media X is blocking links to Signal

https://www.theverge.com/news/613997/x-blocks-signal-me-links-errors
17.4k Upvotes


3.2k

u/kixkato Feb 17 '25

This is the reason why no government or entity should ever be allowed a backdoor into any encryption system.

Next time any government wants to "protect the children" (or insert whatever other generic emotional appeal here) by forcing backdoors into encryption systems, remember the overwhelming good things they do for us.

373

u/[deleted] Feb 17 '25

[deleted]

136

u/josh_the_misanthrope Feb 17 '25

That's why we use open source stuff like Signal, and why you should verify the signatures of compiled binaries if you don't want to compile from source yourself.

While it's not impossible to introduce a weakness in open source, it's a lot more difficult because there are so many eyes on it. It would be like committing a crime in Times Square on New Year's Eve.
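Verifying a real detached GPG signature needs `gpg --verify`; as a minimal sketch of the weaker checksum half of that advice (the file path and published digest here are illustrative, not any real release):

```python
import hashlib
import hmac

def matches_published_digest(path: str, published_hex: str) -> bool:
    """Hash a downloaded binary and compare it against the digest the
    project publishes alongside the release."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large binaries don't have to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    # compare_digest is constant-time; overkill here, but idiomatic.
    return hmac.compare_digest(h.hexdigest(), published_hex.lower())
```

A checksum only proves the download wasn't corrupted or swapped in transit; a signature additionally proves who published it, which is why the signature check matters more.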

54

u/Old-Adhesiveness-156 Feb 17 '25

There are examples of holes being put into open source projects. I bet some are uncaught. Look at the XZ Utils Backdoor as an example of one that was caught, barely.

74

u/Patch86UK Feb 17 '25

It's a basic tenet of security that it's impossible to reduce the risk of a successful attack to zero. A sufficiently determined attacker with access to sufficient resources will always win eventually.

The aim of the game is to make a successful attack as hard as possible. To reduce attack vectors, increase detection rates, and increase the cost to the attacker such that you reduce the pool of viable attackers to as small a group as you can.

If open source development methods mean that a larger proportion of vulnerabilities are caught, then it's doing its job. The fact that you can't possibly guarantee that you've reduced it to zero doesn't negate the value of reducing it at all.

9

u/Old-Adhesiveness-156 Feb 17 '25

Of course. I would actually trust open source over proprietary.

6

u/[deleted] Feb 17 '25

Fascinating story

2

u/[deleted] Feb 18 '25

Holes will always exist. It's a matter of degree. And did you even read the story about xz? Someone infiltrated and bullied their way into having the access that they did. It took years, and because xz is open-source, they failed.

2

u/funkiestj Feb 19 '25

Your chance of catching the XZ Utils backdoor is much higher than your chance of catching a government-mandated secret backdoor inserted into closed source.

Furthermore, if somebody can figure out how to pay the people doing important work like maintaining XZ Utils, the bar for getting a backdoor inserted gets much, much higher. I read the story, and it worked because a person nobody had ever met or seen volunteered to take over the project (everything after that is window dressing).

6

u/TbonerT Feb 17 '25

While it's not impossible to introduce a weakness in open source, it's a lot more difficult because there are so many eyes on it. It would be like committing a crime in Times Square on New Year's Eve.

This isn’t always the case. Many high-profile flaws have been found that were introduced years earlier.

9

u/rpkarma Feb 17 '25

You're right, but they were found! That's more than we can say for a lot of closed source versions.

2

u/RobotsGoneWild Feb 17 '25

PGP encryption is pretty awesome for this stuff. You control your own keys and can decrypt on a computer that has no net connection.

2

u/InVultusSolis Feb 17 '25

That's why we use open source stuff like Signal, and why you should verify the signatures of compiled binaries if you don't want to compile from source yourself.

While it's not impossible to introduce a weakness in open source, it's a lot more difficult because there are so many eyes on it. It would be like committing a crime in Times Square on New Year's Eve.

Honestly, the only way to be absolutely sure is to use a set of operating parameters that make security a total inconvenience and render it nearly unusable. Basically: air-gapped computers, physical hand-off of key material, purpose-designed communications software all written by a trusted party, etc.

Any "app" that you can download can be tampered with at any stage of the supply chain. Even open source apps like Signal. And most people who use Signal aren't compiling their own binaries, and even if they did, Apple does not let you do that at all.

3

u/josh_the_misanthrope Feb 17 '25

Sure, but security isn't an all-or-nothing thing. One end of the spectrum is trusting that a multinational corporation isn't going to get its arm twisted in FISA courts. The current government is being openly hostile towards several groups of people, journalists, etc., and big tech is kissing the ring. Trusting them is a security mistake.

The other end of the spectrum is your comment.

Somewhere in the middle is using software which is difficult to control by traditional methods of governmental pressures. It's difficult to scrub from the internet, difficult to sneak in a backdoor without getting noticed, and makes it enough of a pain in the ass to do either of these things that it serves as a deterrent for the government to even try.

With a company like, say, Apple, the government can apply pressure in a lot of ways to get them to play ball. With something like Signal, if the company gets taken down, the code is still out there. Thousands of people have cloned the repo and have a copy just lying around to re-upload. It can operate independently of its creating company. Developers from several companies are working on the code base, as well as hobbyists and security researchers in a decentralized manner, so sneaking in a backdoor runs the risk of being noticed and becoming a political scandal.

It's not perfect, but you're miles ahead of just blindly putting your trust in a corporation. Corporations will throw your privacy under the bus the second preserving it jeopardizes their profits, and disobeying FISA warrants would do exactly that.

1

u/ScF0400 Feb 18 '25

The problem with that analogy, in this case about government surveillance, is that if no authority will enforce punishment for the crime, the point is moot.

It's basically saying: yes, they attacked an open source project or inserted a backdoor into one, aka committed a crime in Times Square. But in the end they got away with it, because if a government is really that corrupt, what could you do?

If a company like Google found a backdoor, they could potentially sue if they cared enough; that's the analogy of the police coming in and arresting the perpetrator in Times Square. But as an individual contributor to a free and open source project, what are you going to do?

An appropriate example of this is the mass firings happening now. By the time a lawsuit is filed, thousands of people are already out of a job or forced out of the office. Even if you do win that lawsuit (which in this case means finding the person who submitted a backdoor commit), the most you can do is ban them, after which they create a new account and try again.

So I disagree: it's not hard at all to introduce a vulnerability or weakness, as it's a zero-sum game. Even if it's not a backdoor for spying, simply denying access to the project by breaking people's trust, or messing with the code to make the backbone infrastructure unusable, is still a win for malicious adversaries. If a criminal hacks Google, the government and corporations will go after them. A government illegally spying and using backdoors? No one will stop them, because they won't arrest themselves.

1

u/josh_the_misanthrope Feb 18 '25

The point is you hopefully notice the attempt and keep a copy of the code base without the commit so you can build a clean version. It's not to contest the government in court, it's to retain the ability to have secure communication and distribute that codebase to people who need it.

It's not perfect, and we can speculate to infinity for how a project could be compromised, but my point is that it's a lot more difficult to squash open source or tamper with it than it is to FISA a corporation to introduce a backdoor in closed source software without people knowing. The risk of the government doing that is so high that closed source software should be assumed to be insecure from government surveillance by default.

We know they've attempted this over and over. The most well-known backdoor attempt, Dual_EC_DRBG, was done by the NSA paying off a corporation to distribute a shoddy implementation, and it was noticed during the standardization process before it was even widely distributed. In fact, that's generally the NSA's MO: they introduce hardware and software weaknesses into cryptographic algorithms by obliging companies to do so. We know about a ton of weaknesses in OSS precisely because it is public; we have to assume that closed source software has been compromised just as frequently but discovered far less often, by virtue of code obscurity and FISA warrants.

1

u/ScF0400 Feb 19 '25

Very true on the knowing part; at least knowing makes it less likely that people keep using the service and fall victim, even if there are multiple compromises. But therein lies the cyclical conundrum: open source is usually not funded by anyone but the team or individual, plus donors. If you have a 9-4 job and limited funds, then by the time you've reached your fourth compromise and backdoor, people stop using your project, you're out of funds, you're stressed from maintaining the project and may quit, and the shoddy implementation wins. That in turn pushes people to open another open source project that can follow the same cycle. The alternative is for the FOSS community to accept that malicious entities can always infiltrate, whether through direct commits or other tactics, such as secretly modifying AI-generated code to infiltrate projects.

For example, take next-gen quantum-resistant ECC algorithms. If the NSA consistently contributed bad commits that were discovered time and again by one maintainer, how long until that maintainer makes a mistake? After hearing of an ECC backdoor compromise in open source implementations, some other projects might decide not to use or support it anymore. Or worse, they decide to clone it, but they don't know whether their clone is a clean variant or not. This is why supply chain attacks are so effective: even businesses don't have the time, and don't want to invest the resources, to build everything from scratch or read through 9,000 lines of code, so they just pip or npm install.

0

u/Raven-Haired-Witch Feb 17 '25

IIRC (it’s been a few years), the server side of Signal is closed source. The apps are open source, sure, but that doesn’t help you if the server software has been compromised/backdoored.

There certainly are open source encrypted chat platforms, Matrix being the main one I can think of.

2

u/josh_the_misanthrope Feb 17 '25

Well you can ensure that your end to end encryption is secure, so what happens on the server is mostly irrelevant. They can probably know who you're talking to and when, but the contents of what you're saying should be secure.
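That content-vs-metadata split can be shown with a toy cipher (a one-time pad standing in for Signal's actual Double Ratchet; the names and the "server view" fields are purely illustrative):

```python
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # One-time pad XOR: a toy stand-in for real E2E encryption.
    assert len(key) >= len(plaintext)
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

msg = b"meet at noon"
key = secrets.token_bytes(len(msg))      # agreed between the endpoints only
ciphertext = encrypt(key, msg)

# All a relay server can log: metadata, not content.
server_view = {"sender": "alice", "recipient": "bob", "bytes": len(ciphertext)}

assert decrypt(key, ciphertext) == msg   # only a key holder recovers the text
```

The point of the sketch is exactly the comment's claim: the server can see who talked to whom, when, and how much, but never the plaintext.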

57

u/[deleted] Feb 17 '25 edited Feb 17 '25

[deleted]

11

u/KerouacsGirlfriend Feb 17 '25

Not looking forward to the Great De-Anonymization.

2

u/tjsr Feb 17 '25

Multiply that to infinity for all US-owned or US-influenced phone, networking, or security products. And that's just the endpoints, to say nothing of the monitoring of actual network transport infrastructure and everything people think of as "private": VPNs, any sort of crypto blockchain, Tor, etc.

I'm amazed that people don't believe a significant percentage of all chips and networking equipment on the market has, hard-coded into it somewhere, something that looks for a specific frame, packet, or sequence of these, which acts as a kill switch or similar.

14

u/DuckDatum Feb 17 '25

If a company truly cared about security, they'd leak the information that reveals the back door: do it from a VPN, on a random 4chan board, under a throwaway account, then go "discover" the leak to give it some attention. Companies can follow an open source model that implements open source protocols designed and reviewed to ensure no back doors exist.

3

u/[deleted] Feb 17 '25

Don't forget how much overwhelming support FISA renewal received from Congress last year.

2

u/Frekavichk Feb 17 '25

From my understanding a lot of these backdoors are also of the "and it's illegal for you to disclose this" kind - so even a company that advertises encryption and privacy might be compromised and unable to disclose that fact.

Isn't that what the canary thing is supposed to be for?

2

u/qfjp Feb 17 '25

This is what a "canary clause" is for. The company advertises that they don't have a backdoor, until they do, at which point they remove the clause. So they're never advertising that a backdoor exists, but it's still known that they are compromised. "Canary" as in the canary-in-a-coal-mine warning.
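A reader could monitor such a clause mechanically; this is a minimal sketch, where the statement wording, the embedded ISO date, and the 90-day freshness window are all assumptions rather than any real provider's policy:

```python
from datetime import date, timedelta

def canary_is_healthy(statement: str, today: date,
                      max_age_days: int = 90) -> bool:
    """A warrant canary only works if it is re-signed regularly: a
    missing or stale statement IS the warning, never an explicit
    admission. Expects text containing an ISO date, e.g.
    'As of 2025-02-01 we have received no warrants.'"""
    for token in statement.split():
        try:
            signed = date.fromisoformat(token)
        except ValueError:
            continue
        return today - signed <= timedelta(days=max_age_days)
    return False  # no date found: treat the canary as tripped

assert canary_is_healthy("As of 2025-02-01 we have received no warrants.",
                         today=date(2025, 2, 17))
assert not canary_is_healthy("", today=date(2025, 2, 17))
```

The design choice mirrors how canaries are read in practice: silence or staleness is interpreted as compromise, since a gag order can forbid speech but not compel fresh reassurances.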

69

u/Temp_84847399 Feb 17 '25

I've seen people in this very sub advocate for law enforcement to have access to all files on people's computers so they can automatically search for CSAM.

Cue the "What's wrong with that? It will only be used to catch pedos..." responses. SMH

30

u/SwiftySanders Feb 17 '25

Like they haven’t been known to doctor up evidence anyway.

15

u/kixkato Feb 17 '25

https://youtu.be/VPBH1eW28mo?si=elEm2S5TtV2gnF0H

Fantastic quote about that at the end of this video.

23

u/EccentricPayload Feb 17 '25

Almost had it with Telegram til Macron jailed him for it. Daddy always gets his access.

61

u/jfkfnndnd Feb 17 '25

Telegram can read your messages FYI, there is no E2E encryption if you are not in secret chat

-9

u/EccentricPayload Feb 17 '25

Yes, because Durov got arrested and folded

9

u/jfkfnndnd Feb 17 '25

Homie, it never was E2E. It uses SSL, and the traffic is even encrypted in-flight with Telegram keys, but that encryption is between you and Telegram, not between you and the other person.

Other than that, Durov is a very shady character who claimed to have left Russia for good, yet had a whole Telegram department working in the same VK offices for years after that.

1

u/[deleted] Feb 18 '25

Telegram knows everything you say and do by default. Nothing is end-to-end encrypted.

2

u/neuromonkey Feb 17 '25

Fortunately, it's very difficult to stuff the Strong Encryption genie back into the bottle. The government can pressure corporations, but they can't put pressure on algorithms.

2

u/Rex9 Feb 17 '25

Just like the backdoor legislated into all telecom equipment got hacked by the Chinese last year (probably earlier than that). Just like we all screamed about back in the '00s. There's no such thing as a secure back door. Eventually the bad guys will find it. Or a "good guy" will get fed up with his superiors and expose or exploit it.

2

u/hardypart Feb 18 '25

This is such a good point. From now on, whenever someone claims "I don't care, I have nothing to hide," I'll just point at what's currently going on in the US and note that the same thing could happen in any other country, too.

1

u/kixkato Feb 18 '25

You understand my point exactly! Go tell your friends.

1

u/Pongo_Crust Feb 18 '25

There’s no “back door” in Signal. It’s open source, so please point out the back door.

This is smoke from the dudes who want you to use their surveillance platforms…

1

u/kixkato Feb 18 '25

I'm not saying there is one. I've been a fan of signal for years.

But there is large pressure from governments to force a backdoor in. For example: https://arstechnica.com/tech-policy/2025/02/uk-demands-apple-break-encryption-to-allow-govt-spying-worldwide-reports-say/

That's the UK and Apple, but the concept is the same, and it's bad.

1

u/[deleted] Feb 18 '25

Apple's code is closed-source. They could put in a backdoor and nobody would know.

Signal is open-source, so that backdoor would be found very quickly, but Signal would say something publicly before it ever got that far.

All of Signal's code is public on GitHub:

Android - https://github.com/signalapp/Signal-Android

iOS - https://github.com/signalapp/Signal-iOS

Desktop - https://github.com/signalapp/Signal-Desktop

Server - https://github.com/signalapp/Signal-Server

Everything on Signal is end-to-end encrypted by default.

Signal cannot provide any usable data to law enforcement when under subpoena:

https://signal.org/bigbrother/

You can hide your phone number and create a username on Signal:

https://support.signal.org/hc/en-us/articles/6829998083994-Phone-Number-Privacy-and-Usernames-Deeper-Dive

Signal has built in protection when you receive messages from unknown numbers. You can block or delete the message without the sender ever knowing the message went through. Google Messages, WhatsApp, and iMessage have no such protection:

https://support.signal.org/hc/en-us/articles/360007459591-Signal-Profiles-and-Message-Requests

Signal has been extensively audited for years, unlike Telegram, WhatsApp, and Facebook Messenger:

https://community.signalusers.org/t/overview-of-third-party-security-audits/13243

Signal is a 501(c)(3) charity with a Form 990 IRS filing disclosed every year:

https://projects.propublica.org/nonprofits/organizations/824506840

With Signal, your security and privacy are guaranteed by open-source, audited code, and universally praised encryption:

https://support.signal.org/hc/en-us/sections/360001602792-Signal-Messenger-Features

1

u/Pongo_Crust Feb 18 '25

Sorry, I misunderstood your comment. My bad. *tips hat*

1

u/bricious Feb 18 '25

Seeing the current state of children, the government doesn't seem to care about them at all. It's an excuse to be able to spy on anyone, anytime.

1

u/[deleted] Feb 18 '25

Signal in particular will be the most effective weapon against the coming fascism of the Musk/Trump co-presidency.

All of Signal's code is public on GitHub:

Android - https://github.com/signalapp/Signal-Android

iOS - https://github.com/signalapp/Signal-iOS

Desktop - https://github.com/signalapp/Signal-Desktop

Server - https://github.com/signalapp/Signal-Server

Everything on Signal is end-to-end encrypted by default.

Signal cannot provide any usable data to law enforcement when under subpoena:

https://signal.org/bigbrother/

You can hide your phone number and create a username on Signal:

https://support.signal.org/hc/en-us/articles/6829998083994-Phone-Number-Privacy-and-Usernames-Deeper-Dive

Signal has built in protection when you receive messages from unknown numbers. You can block or delete the message without the sender ever knowing the message went through. Google Messages, WhatsApp, and iMessage have no such protection:

https://support.signal.org/hc/en-us/articles/360007459591-Signal-Profiles-and-Message-Requests

Signal has been extensively audited for years, unlike Telegram, WhatsApp, and Facebook Messenger:

https://community.signalusers.org/t/overview-of-third-party-security-audits/13243

Signal is a 501(c)(3) charity with a Form 990 IRS filing disclosed every year:

https://projects.propublica.org/nonprofits/organizations/824506840

With Signal, your security and privacy are guaranteed by open-source, audited code, and universally praised encryption:

https://support.signal.org/hc/en-us/sections/360001602792-Signal-Messenger-Features

1

u/kixkato Feb 18 '25

I truly love Signal for all of these reasons. I worry that it may fall victim to the constant trend of enshittification of things that are good. Let's hope that never happens.

1

u/[deleted] Feb 18 '25

I worry that it may fall victim to the constant trend of enshittification of things that are good.

Unlikely. It's run by a charity, and its funding comes exclusively from user donations.

1

u/jjhunter4 Feb 20 '25

Signal links are working just fine right now on X. I just tried it.

1

u/RelaxPrime Feb 17 '25

Once you start paying attention, you notice that some hypothetical victim is the reason for taking any and all rights away.

1

u/[deleted] Feb 18 '25

Before dictators become dictators, they use undefinable enemies to justify their overreach, e.g.:

War on drugs

Weapons of mass destruction

War on terror

Groomers

They're eating the dogs. They're eating the cats. They're eating the pets. https://www.youtube.com/watch?v=5llMaZ80ErY

0

u/Ecstatic_Potential67 Feb 17 '25

Whatever the case may be, use another, innermost layer of user-defined encryption, and don't share the keys. Do it even with WhatsApp.

1

u/Pongo_Crust Feb 18 '25

WhatsApp is only encrypted using Signal’s protocol

1

u/Ecstatic_Potential67 Feb 18 '25

Do you know reverse engineering, like those cool guys I met at a software exhibition? You need to add another encryption layer. That's it.

0

u/Ecstatic_Potential67 Feb 18 '25

Just use another extra encryption layer.
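An inner layer like that can be sketched as follows. This is a toy (a SHAKE-256 keystream XOR with a prepended nonce); a real inner layer should use an audited AEAD from an established library, not hand-rolled XOR, and the key exchange here is assumed to happen in person:

```python
import hashlib
import secrets

NONCE_LEN = 16

def pre_encrypt(shared_key: bytes, plaintext: bytes) -> bytes:
    """Encrypt before handing the message to the chat app, so the app
    only ever carries ciphertext. Illustration only."""
    nonce = secrets.token_bytes(NONCE_LEN)
    stream = hashlib.shake_256(shared_key + nonce).digest(len(plaintext))
    return nonce + bytes(s ^ p for s, p in zip(stream, plaintext))

def pre_decrypt(shared_key: bytes, blob: bytes) -> bytes:
    nonce, body = blob[:NONCE_LEN], blob[NONCE_LEN:]
    stream = hashlib.shake_256(shared_key + nonce).digest(len(body))
    return bytes(s ^ c for s, c in zip(stream, body))

key = secrets.token_bytes(32)  # agreed in person, never sent over the app
inner = pre_encrypt(key, b"the actual message")
# `inner` (suitably encoded) is what the messaging app would transport.
assert pre_decrypt(key, inner) == b"the actual message"
```

The point of the layering is that even if the app's own encryption is backdoored, whoever taps it still only sees your inner ciphertext.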

1

u/[deleted] Feb 18 '25

WhatsApp is a bad example. They hold the encryption keys on their servers.

Signal is a much better option. The keys are held on users' devices.

All of Signal's code is public on GitHub:

Android - https://github.com/signalapp/Signal-Android

iOS - https://github.com/signalapp/Signal-iOS

Desktop - https://github.com/signalapp/Signal-Desktop

Server - https://github.com/signalapp/Signal-Server

Everything on Signal is end-to-end encrypted by default.

Signal cannot provide any usable data to law enforcement when under subpoena:

https://signal.org/bigbrother/

You can hide your phone number and create a username on Signal:

https://support.signal.org/hc/en-us/articles/6829998083994-Phone-Number-Privacy-and-Usernames-Deeper-Dive

Signal has built in protection when you receive messages from unknown numbers. You can block or delete the message without the sender ever knowing the message went through. Google Messages, WhatsApp, and iMessage have no such protection:

https://support.signal.org/hc/en-us/articles/360007459591-Signal-Profiles-and-Message-Requests

Signal has been extensively audited for years, unlike Telegram, WhatsApp, and Facebook Messenger:

https://community.signalusers.org/t/overview-of-third-party-security-audits/13243

Signal is a 501(c)(3) charity with a Form 990 IRS filing disclosed every year:

https://projects.propublica.org/nonprofits/organizations/824506840

With Signal, your security and privacy are guaranteed by open-source, audited code, and universally praised encryption:

https://support.signal.org/hc/en-us/sections/360001602792-Signal-Messenger-Features