As evidence, the lawsuit cites unnamed “courageous whistleblowers” who allege that WhatsApp and Meta employees can request to view a user’s messages through a simple process, thus bypassing the app’s end-to-end encryption. “A worker need only send a ‘task’ (i.e., request via Meta’s internal system) to a Meta engineer with an explanation that they need access to WhatsApp messages for their job,” the lawsuit claims. “The Meta engineering team will then grant access – often without any scrutiny at all – and the worker’s workstation will then have a new window or widget available that can pull up any WhatsApp user’s messages based on the user’s User ID number, which is unique to a user but identical across all Meta products.”
“Once the Meta worker has this access, they can read users’ messages by opening the widget; no separate decryption step is required,” the 51-page complaint adds. “The WhatsApp messages appear in widgets commingled with widgets containing messages from unencrypted sources. Messages appear almost as soon as they are communicated – essentially, in real-time. Moreover, access is unlimited in temporal scope, with Meta workers able to access messages from the time users first activated their accounts, including those messages users believe they have deleted.” The lawsuit does not provide any technical details to back up the rather sensational claims.



Tox also isn’t that great security-wise. It’s hard to beat Signal when it comes to secure messengers. And Signal is open source, so if it did anything weird with private keys, everyone would know.
Well, no. At least not by default, since you’re running a compiled version of it. Someone could inject code you know nothing about before compilation that, for example, leaked your keys.
One way to be more confident no one has would be reproducible builds that you can recreate yourself and then compare the file fingerprints. But I don’t think that is possible, at least on Android, as Google holds the signing keys for apps.
Signal is also on F-Droid, so it should be verifiable
Signal has reproducible builds, and here are the instructions for verifying them on Android: https://github.com/signalapp/Signal-Android/blob/main/reproducible-builds/README.md
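For anyone curious, the check boils down to: rebuild the APK from the published source, then compare what’s inside your build with the APK that Google Play served you, ignoring the signing metadata (that’s the one part Google adds that you can’t reproduce). Very roughly, something like the sketch below in Python (just an illustration, not Signal’s actual comparison script):

    import hashlib
    import sys
    import zipfile

    def entry_hashes(apk_path):
        # SHA-256 of every file inside the APK, skipping META-INF/:
        # the store signature lives there and is the one part you can't reproduce.
        digests = {}
        with zipfile.ZipFile(apk_path) as apk:
            for name in apk.namelist():
                if name.startswith("META-INF/"):
                    continue
                digests[name] = hashlib.sha256(apk.read(name)).hexdigest()
        return digests

    if __name__ == "__main__":
        my_build, store_build = sys.argv[1], sys.argv[2]
        if entry_hashes(my_build) == entry_hashes(store_build):
            print("Contents match: the store APK was built from the published source.")
        else:
            print("Mismatch: the store APK differs from your own build.")

If the contents match, you don’t really have to trust a site showing the “correct” fingerprint at all; you only have to trust your own build.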
If they have, then good. I wasn’t sure it was doable with Google’s current signing process. Highly unlikely someone has tampered with them, then (it’s far easier to target the site displaying the “correct” fingerprint).
However, my original point still stands. Just because it is open source doesn’t in itself mean that a bad actor can’t tamper with it.
Well, WhatsApp uses Signal. Bad timing.
It only uses some of Signal’s code, not necessarily the out-of-the-box key storage and security.
How?
Unless proof is given, assume troll
Read the article? An app using Signal does not imply that your data stays encrypted from corporations or governments. Your neighbour Joe is not very likely to break already-established SSL, so using Signal feels like someone is trying to sell me a bridge. A false sense of security. In fact, that was probably their goal all along.
WhatsApp is using Signal’s protocol for communication: https://signal.org/blog/whatsapp-complete/
I don’t fully understand everything it entails, but from what I understand, yes, WhatsApp uses the same encryption and message flow that Signal does. But you’re still using Meta’s app, and they can just read the plaintext messages from there.
To my knowledge, under Signal the encryption keys are locally generated and stored, and the traffic flows between endpoints as a closed packet.
This does not seem to be the case here, as the keys are generated and stored outside your equipment and thus can be used by a third party to access the packets.
But I admit I speak heavily burdened by technical ignorance.
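For what it’s worth, “locally generated and stored” means each device creates its own key pair and only ever hands out the public half; the secret needed to decrypt never leaves the phone, so a relay server only sees ciphertext. A toy sketch of that idea with the Python cryptography package (nothing like Signal’s actual double-ratchet protocol, just the basic shape):

    import os

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # Each party generates its key pair on its own device; private keys never leave it.
    alice_private = X25519PrivateKey.generate()
    bob_private = X25519PrivateKey.generate()

    # Only the public halves travel over the network; both sides compute the same secret.
    shared_secret = alice_private.exchange(bob_private.public_key())
    assert shared_secret == bob_private.exchange(alice_private.public_key())

    # Derive a symmetric message key from the shared secret.
    message_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                       info=b"toy-e2ee-demo").derive(shared_secret)

    # Encrypt on the sender's device; a server relaying this sees only the ciphertext.
    nonce = os.urandom(12)
    ciphertext = AESGCM(message_key).encrypt(nonce, b"hello bob", None)
    print(AESGCM(message_key).decrypt(nonce, ciphertext, None))  # b'hello bob'

Which is also why the point above matters: the math doesn’t help if the app that holds the private key decides to ship the plaintext (or the key) somewhere else.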
My understanding is they’re sending a request to your device that then decrypts and uploads messages, not storing the keys outside your device.
That’s incorrect. With WhatsApp, your keys are stored on Meta’s servers (the same as things like iMessage). They can simply decrypt your messages whenever they like, just like being signed in as you. It’s completely invisible to your client.
Ewwwwwwww
Or they can make a copy of the encryption keys at creation. Using the code is very different from using the code unedited, or from using all of it.
Read more than just the title ffs
I did, and nowhere is Signal mentioned in the article.
You state WhatsApp uses Signal. So, again: how?
The article does not describe what encryption it uses; it describes how they’re abusing it. WhatsApp using the Signal protocol is public knowledge.
What I’m trying to say is that a company using Signal for its messaging app does not imply that your data is safe from that company or from governments.
You recommending an app purely because of the Signal protocol, under an article about how an app abuses the Signal protocol, is pretty fucking ironic (a.k.a. bad timing).