
You're describing exactly the problem that key transparency helps to solve.

With this rolled out, the WhatsApp app itself will be able to detect, by default and without any manual verification, whether FB attempts to MITM the connection.

While this doesn't make it technically impossible for Facebook to modify the app and servers, it does make it organizationally almost impossible to do so secretly. Such a move would require the involvement of numerous individuals across multiple teams and would be noticeable to security researchers through changes to the app.
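The "detectable MITM" property above rests on the server committing to an append-only, verifiable log of keys. Here's a minimal sketch of the core mechanism, a Merkle-tree audit proof, modeled on RFC 6962-style Certificate Transparency; WhatsApp's actual key-transparency design layers more on top (verifiable maps, epochs), so treat the record format and names here as illustrative only:

```python
# Sketch of Merkle audit-proof verification (RFC 6962-style hashing).
# Hypothetical toy records; not WhatsApp's actual data format.
import hashlib

def leaf_hash(data: bytes) -> bytes:
    # Domain-separated leaf hash, as in RFC 6962
    return hashlib.sha256(b"\x00" + data).digest()

def node_hash(left: bytes, right: bytes) -> bytes:
    # Domain-separated interior-node hash
    return hashlib.sha256(b"\x01" + left + right).digest()

def root_from_audit_path(leaf: bytes, index: int, path: list[bytes]) -> bytes:
    """Recompute the tree root from a leaf and its audit path (sibling hashes)."""
    h = leaf_hash(leaf)
    for sibling in path:
        if index % 2 == 1:
            h = node_hash(sibling, h)   # we are the right child
        else:
            h = node_hash(h, sibling)   # we are the left child
        index //= 2
    return h

# Build a tiny 4-leaf tree of (user, public key) records.
leaves = [b"alice:pk1", b"bob:pk2", b"carol:pk3", b"dave:pk4"]
hashes = [leaf_hash(l) for l in leaves]
root = node_hash(node_hash(hashes[0], hashes[1]),
                 node_hash(hashes[2], hashes[3]))

# Audit path for leaf 2 ("carol:pk3"): its sibling, then the left subtree hash.
path = [hashes[3], node_hash(hashes[0], hashes[1])]
assert root_from_audit_path(leaves[2], 2, path) == root

# A server that silently swaps Carol's key (a MITM) can no longer
# produce a valid audit path against the published root.
assert root_from_audit_path(b"carol:EVIL", 2, path) != root
```

Since everyone sees the same signed root, substituting a key for one target means either forging a proof (breaking the hash) or forking the log, which monitors can catch by comparing roots.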

This approach is taking off in a bunch of similar problem spaces (web PKI, code signing, etc), so very exciting to see it applied here.

Randomly, and somewhat weirdly, Facebook actually offered one of the first Certificate Transparency monitoring tools, which made it possible to monitor all certificates issued for your domain using a very similar approach: https://www.facebook.com/notes/3497286220327506/



Not really?

I don't see what prevents the app from pushing a decrypted copy of the conversation.

A variant of Skype was caught doing exactly that (we only know about it because they left the server holding the raw logs completely open).

And still, Skype is marketed as very secure/encrypted/blablabla, which is true, but only within the bounds of local regulations.

https://web.archive.org/web/20090210230204/http://www.inform...

The end comment/advice from the US part is even a bit funny: "travelers should assume that all communications are monitored."


You're making my point: some Chinese Skype variant did this, back in 2009, and got caught.

There's just no way, in real life, for Facebook to add what you're describing to one of the most prominent messaging apps in the world without somebody noticing.

I'm not here to tell you that your WhatsApp messages are perfectly secure. If the CIA wants to read your messages, they'll probably just hit you with the wrench rather than leaning on some FB exec. But I do think that transparency logs are deeply under-appreciated for their ability to make undetected mass surveillance dramatically more challenging.


> There's just no way, in real life, for Facebook to add what you're describing to one of the most prominent messaging apps in the world without somebody noticing.

That assumes somebody is digging through each update and its thousands of classes. FFS, the OG Facebook app was already blowing past Android's limits in 2013 [1], and the current WhatsApp app isn't much better - just look at the current APK file:

    2023-04-12 11:38:58 .....      2578508      1171345  classes.dex
    2023-04-12 11:39:04 .....     13312588      6020223  classes2.dex
    2023-04-12 11:39:08 .....      7671448      3310145  classes3.dex
    2023-04-12 11:39:08 .....      2118352       945166  classes4.dex
That's about 25MB of already-dense Dalvik bytecode - probably double that restored to Java class files, and triple to quadruple that as Java source. It's impossible to audit that there is no routine pushing keys to, say, the usual analytics backend they use - and to make it worse, according to APKMirror, they push updates every few days [2].
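For what it's worth, the "25MB" figure can be read straight off the listing above: the first numeric column is the uncompressed size of each classes*.dex, the second is its size as stored inside the zip. A quick sum (a sanity check, not anything from the thread):

```python
# Totals from the APK listing above: uncompressed dex size vs. size in the zip.
uncompressed = [2578508, 13312588, 7671448, 2118352]
stored       = [1171345,  6020223, 3310145,  945166]

print(sum(uncompressed) / 1e6)  # ≈ 25.7 MB of Dalvik bytecode once extracted
print(sum(stored) / 1e6)        # ≈ 11.4 MB as actually shipped in the zip
```

So the bytecode is ~25.7 MB extracted, shipping at ~11.4 MB; dex is already compact, yet still compresses a further ~2.2:1 in the archive.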

Although my biggest question is... it's a fucking messenger app. Why does it produce a larger binary than a full-blown Linux kernel?!

[1] https://engineering.fb.com/2013/03/04/android/under-the-hood...

[2] https://www.apkmirror.com/uploads/?appcategory=whatsapp


>Although my biggest question is... it's a fucking messenger app. Why does it produce a larger binary than a full-blown Linux kernel?!

Because it does so much more than messaging. Also, UI code is generally very verbose.


Also, conversely, the kernel doesn't do that much. Most of the Linux kernel source consists of device drivers, which compile to modules rather than being bundled into vmlinuz, and many of those modules are rarely, if ever, built. The kernel proper really is a pretty small fraction of the complete software bundle that makes a Linux system functional.


I know. But still, that amount of compiled code is insane.


> There's just no way, in real life, for Facebook to add what you're describing to one of the most prominent messaging apps in the world without somebody noticing

Your point moved from "key transparency is the defense" to "someone will notice". But if your defense is the hope that someone notices, you're in for a big surprise. Sometimes things go unnoticed. Look no further than OpenSSL: open source, used by billions, deployed by companies worth as much as small countries, and yet nobody noticed Heartbleed for years.

So I'm very skeptical that a development flag that targets a handful of people in an app like WhatsApp, and is then removed, would be so noticeable that it amounts to a strong defense.


I think you are trying to say "it's never 100% secure", and the parent agrees with you. The parent is just saying "this is making it more secure (but not 100% secure)".


The trick is to push a modified version only to the few clients you want to attack. Use it sparingly and you won't get caught.


Or just hack the phone of those few clients with another attack vector. Doesn't mean that security is entirely useless. It depends on the threat model.


And most of all, don't forget the logfiles on an open server (that was their mistake; otherwise I think they would never have been caught).


There are also tons of ways to exfiltrate data through known channels in ways that are difficult for security researchers to distinguish from otherwise secure app analytics code.

A crash/exception logging system, say, might appear to researchers to anonymize data, but it would be very possible for code to be written that happens to raise a mundane exception when specific users or geofences see specific words on screen, in a way where that list of users/geofences/words could be controlled by non-technical teams. The log message itself doesn't even need to carry sensitive data; its existence alone, when the trigger conditions are known, can be used to carry out a highly targeted attack.
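To make that concrete, here is a hypothetical toy of the channel just described: a crash-reporting hook whose reports carry no sensitive payload at all, yet whose mere presence leaks one bit when a trigger word is rendered. Every name here (`report_nonfatal`, the error code, the trigger list) is invented for illustration and not taken from any real app:

```python
# Hypothetical covert channel via "anonymized" error analytics.
# The report contains no content; its existence IS the signal.
import hashlib

# A remotely-updatable trigger list, stored as hashes so it looks like
# opaque config data even to someone reading the shipped config.
TRIGGER_HASHES = {hashlib.sha256(b"codeword").hexdigest()}

sent_reports = []

def report_nonfatal(error_code: str) -> None:
    # Looks like ordinary, anonymized error analytics: no message text,
    # no user id beyond what the transport already carries.
    sent_reports.append({"error": error_code})

def render_message(text: str) -> str:
    for word in text.split():
        if hashlib.sha256(word.encode()).hexdigest() in TRIGGER_HASHES:
            # A plausible-looking, content-free "layout glitch" report.
            # When the operator knows the trigger conditions, receiving it
            # means: this client just displayed the trigger word.
            report_nonfatal("E_LAYOUT_REFLOW_23")
    return text  # rendering proceeds normally; the user sees nothing odd

render_message("nothing to see here")
render_message("the codeword appears")
print(len(sent_reports))  # 1 - exactly one mundane-looking report fired
```

An auditor reading this code sees an unremarkable non-fatal error report; only someone who controls the trigger list knows what its arrival means.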

Even open-source systems can be vulnerable to this: see e.g. https://github.com/signalapp/Signal-iOS/blob/eaed4da06347a3a... and consider the ways it might be possible for a small group of people at Signal to cause a specific set of messages to be seen as corrupt without raising any flags to the community auditing the code.

Of course, a lack of visibility into runtime errors can lead to vulnerabilities as well. I don't think the solution is for us as a community to advocate removing all error analytics from distributed systems. But we can never forget: all analytics surfaces are attack surfaces.


Exactly. Without Zuck opening the protocol and sanctioning the use of open-source clients, it is not meaningful.


Somehow I think this is still possible. The engineers behind WhatsApp seem very talented, and they may be able to convince Zuck that an open client would increase trust in Meta's brands and increase usage (which could then be used to promote other Meta products).

If they keep the server side closed, that's totally fair, I think.


Or open source alternatives will pick up their work, and the WhatsApp engineers will probably be happy about it.



