Hacker News

I haven't seen the talk, but it sounds plausible to me: Technical people got strong crypto so they didn't worry about legislating for privacy.

We still have this blind spot today: Google and Apple talk about security and privacy, but what they mean by those terms is making it so only they get your data.



> Technical people got strong crypto so they didn't worry about legislating for privacy.

The article debunks this, showing that privacy was a primary concern (e.g. A Cypherpunk's Manifesto) decades ago, and that mass surveillance was already happening even further back.

I think it's fair to say that security has made significantly more progress over the decades than privacy has, but I don't think there is evidence of a causal link. Rather, privacy rights are held back because of other separate factors.


As you point out, decades ago privacy was a widespread social value among those who used the internet. Security through cryptography was likewise a technical value among at least some of the people who designed software for it.

Over time, because security and cryptography were beneficial to business and government, cryptography got steadily increasing technical investment and attention.

On the other hand, since privacy as a social value does not serve business or government needs, it has been steadily de-emphasized and undermined.

Technical people have coped with the progressive erosion of privacy by pointing to cryptography as a way for individuals to uphold their privacy even in the absence of state-protected rights or a civil society which cares. This is the tradeoff being described.


> demonstrating that privacy was a primary concern (e.g. Cypherpunk's Manifesto) decades ago. Also that mass surveillance was already happening even further back.

How does that debunk it? If they were so concerned, why didn't they do anything about it?

One plausible answer: they were mollified by cryptography. Remember when it was revealed that the NSA was sniffing cleartext traffic between Google data centers[0]? In response, rather than campaigning for changes to legislation (requiring warrants for data collection, etc.), the big tech firms just started encrypting their internal traffic. If you're Google and your adversaries are nation state actors and other giant tech firms, that makes a lot of sense.

But as far as user privacy goes, it's pointless: Google is the adversary.

[0] https://theweek.com/articles/457590/why-google-isnt-happy-ab...
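To make the "Google is the adversary" point concrete, here is a minimal sketch, using a toy XOR one-time pad rather than real cryptography, of why transport encryption between data centers stops a network eavesdropper but not the provider itself, which holds the keys and decrypts at its own edge:

```python
# Toy illustration (XOR one-time pad, NOT real crypto): transport
# encryption protects data on the wire, but the party terminating
# the connection still sees plaintext.
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    # Stands in for a real cipher (e.g. TLS) in this sketch.
    return bytes(a ^ b for a, b in zip(data, key))

message = b"user search query"
key = secrets.token_bytes(len(message))

# Transport encryption (e.g. TLS between data centers): an NSA-style
# wiretap on the link sees only ciphertext...
on_the_wire = xor(message, key)
assert on_the_wire != message

# ...but the provider holds the key, decrypts at its edge, and reads
# everything. Encrypting internal traffic changed nothing here.
seen_by_provider = xor(on_the_wire, key)
assert seen_by_provider == message

# End-to-end encryption would be different: only the recipient holds
# the key, so the provider can merely relay ciphertext it cannot read.
relayed_by_provider = on_the_wire
assert relayed_by_provider != message
```

The sketch only illustrates where decryption happens; real deployments use authenticated ciphers and key exchange, not a shared one-time pad.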


I think it's a bit dismissive to claim that "they didn't do anything about it", just because you're not living in a perfect world right now.

As one prominent example, the EFF has been actively campaigning all this time: "The Electronic Frontier Foundation was founded in July of 1990 in response to a basic threat to speech and privacy." A couple of decades later, the Pirate Party movement probably reached its peak. These organizations represent political activism for digital rights and privacy, carried out precisely by the kind of people accused here of "doing nothing".

In a few decades, people will probably look back on this era and ask why we didn't do anything about it either.


Sure, that line of thinking makes sense, but I do not understand the alternative. Are you saying that if we (the users) got new legislation (e.g., requiring warrants), then big tech wouldn't do mass surveillance anymore?


Yes, I think that if there were laws forbidding mass data collection by private companies, or assessing sufficiently high penalties in the case of a breach (such that keeping huge troves of PII became a liability rather than an asset), then big tech firms would largely obey them.


I think they're saying if they couldn't do cryptography they'd push for legislation.



