Privacy (by alaric)
I have a looser attitude towards privacy than most people, but I have begun to reconsider that lately.
Generally, I believed (and still do) that anything I do in public carries pretty much no expectation of privacy. I have no privacy objection to pervasive CCTV, because if I do anything in a public place, somebody could be watching me anyway. The fact that my enemies can now just consult massive archives of CCTV to find me rather than having to get somebody to follow me around isn't, in my view, a huge deal. Indeed, I quite like the idea of sousveillance, having my own recording of what happens around me. It might be inappropriate to be doing that in circumstances that the people around me consider "private", so I'd turn it off for their comfort when it seemed right to do so, but I would still assume that anything I do in the presence of other people is basically recorded to some extent - after all, it's in their memory, at least!
Likewise with monitoring my network traffic at my ISP; I have never had any illusion of privacy there. I encrypt traffic that matters, and accept that the existence and destination/origin of encrypted traffic might be used by my enemies for traffic analysis.
So, I didn't really have any objections to mass surveillance; I had far more objection to the facts that encryption is far from ubiquitous and that information security is not taught in schools. My feeling was that if I can't stop an enemy that doesn't abide by the law (eg, organised criminals) from performing traffic analysis on me, then I can't assume it's private; I can stop them reading my stuff or impersonating me by using public key cryptography, so as long as the law doesn't hinder that, I'm content.
As such, I always wished that Web browsers would just include some kind of unique user ID in the headers, ideally backing it up with a public-key signature of the entire HTTP request. Then we could dispense with session cookies, logins, and even things like OpenID; we'd just authenticate to our browser by supplying the keypair in some browser-dependent way, and then head out onto the secure-single-sign-on Web. There's no loss of privacy compared to the current status quo, in which people are happy to identify themselves to web sites with email addresses anyway, but it'd be a whole lot simpler for users and for developers. And that, basically, is the security model I developed for ARGON.
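To make that concrete, here's a minimal sketch of what the browser's side of such a scheme could look like. The Ed25519 keys, the header names, and the way the request is canonicalised before signing are all illustrative assumptions of mine, not an existing standard (nor the actual ARGON design); the point is just that a keypair plus a signature over the request could replace the session cookie.

```python
# A minimal sketch of the browser's side of the idea: the browser holds a
# keypair on the user's behalf and signs a canonical form of each request,
# so the server gets a verifiable identity instead of a session cookie.
# The header names and canonicalisation below are made up for illustration.
import base64
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ed25519

# The "identity" is just a keypair held by the browser.
identity_key = ed25519.Ed25519PrivateKey.generate()
public_key_b64 = base64.b64encode(
    identity_key.public_key().public_bytes(
        encoding=serialization.Encoding.Raw,
        format=serialization.PublicFormat.Raw,
    )
).decode()

def sign_request(method: str, host: str, path: str, body: bytes = b"") -> dict:
    """Extra headers a browser might attach to identify the user (hypothetical)."""
    canonical = f"{method}\n{host}\n{path}\n".encode() + body
    signature = identity_key.sign(canonical)
    return {
        "X-User-Key": public_key_b64,                               # hypothetical header
        "X-User-Signature": base64.b64encode(signature).decode(),   # hypothetical header
    }

headers = sign_request("GET", "example.org", "/inbox")
```

The server would then treat the public key itself as the user's identifier, much as it treats a verified email address or a session token today.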
However, I am starting to change my mind.
I've always felt that the "hole" in my approach to privacy was that it depended on my own knowledge of security and my enlightened use of encryption; I wanted sufficient education to bring everyone to that level. Encryption tools are generally a bit clunky, but if more people wanted to use them, that would create demand for better tools (or, more pertinently, better integration into the tools they already use). I felt that if we could just get people to encrypt and sign their communications, and encrypt their storage, and use Tor for things where the cost is worth the protection against traffic analysis, everything would be fine.
However, what has made me start to change my mind is the move towards storing one's data on third-party servers. By which I mean, living your life through Facebook, or letting Google store your email and your documents. People are moving away from having a computer full of their stuff, and communicating semi-directly with their peers' computers, towards letting third parties hold all their stuff - often third parties they pay no money to and have no contract with, so they have little or no leverage over them.
It's easy to say that educating people in computer security would make them realise that's a bad idea, but I use many of these services despite not trusting them one bit; I do it because network effects force me to. I could run my own StatusNet server on my own hardware, but instead I use Twitter in order to make it easy for people to communicate with me. I use Facebook because it's the easiest way to keep up with my many peers that do, and sometimes because I am forced to; an organisation I am a member of uses a Facebook group for important announcements. Many people do not publish an email address, but instead require me to contact them through various third-party services.
In effect, we are being forced to hand our information to third parties, and to trust them with it. Variations on these services that store your information on hardware you control exist; so do variations where you actually pay a service provider to store it on their hardware (in exchange for them looking after maintenance, amortising up-front costs, and so on for you, and where they have more incentive to keep your stuff secure, so that you keep trusting them, than to find ways to make money out of it).
But they are not popular, as the big "free" providers have the vast majority of the users, and the value of these services is in all your peers already being on them. Now that worries me.
I'd really like to see more push-back against this. If enough people used decentralised software like Diaspora or ran their own mail systems, then the network effects would benefit those, rather than centralised commercial outfits. Clearly, some large incentive needs to be found to push people over, and an unpleasant transition period endured in which everyone needs to be on both. Eventually, organisations like Facebook, Twitter and Google would find themselves forced to interoperate with the decentralised protocol or lose their place in the market, and would then have to compete on points such as "privacy" when the same ease of use and functionality can be had elsewhere for little cost.
But we need technical measures as well:

- Build sensible public-key infrastructure into the core of applications, including Web browsers.
- Ditch cookies, and replace them with explicit authentication: provide a system of public-key signing of HTTP requests, as I suggested above, but turn it off by default, and have web servers request it with a status code, as is already done for HTTP authentication (not that that is used for web applications, alas) - a sketch of what the server side might look like follows this list.
- Let browsers seamlessly support multiple identities, and when a web site requests identification, let the user choose which identity to use; then colour the border of the Web page according to the identity in use so they don't forget.
- While providing identity management through that (controlled) mechanism, try as hard as possible to remove all other means of identification: don't send headers leaking lots of information about the user-agent and its capabilities and settings, and disallow Javascript from querying that sort of thing.
- Bundle Tor with browsers, so it can be turned on and off with the click of a button, as part of the "private browsing mode" found in many browsers.
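To illustrate the challenge part, here's a rough sketch of the corresponding server side, patterned on the way HTTP authentication challenges already work. The 401 status and the header names are placeholders I've invented for the example, matching the browser-side sketch earlier in this post.

```python
# A rough sketch of the server's side of the same idea, mirroring HTTP
# authentication: challenge with a status code, then verify the signed
# request. The 401 status and header names are stand-ins, not a standard.
import base64
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

def identity_challenge() -> tuple[int, dict]:
    """What the server sends back when a page needs the user to identify themselves."""
    return 401, {"X-Identity-Required": "signature"}  # hypothetical challenge header

def verify_request(method: str, host: str, path: str,
                   headers: dict, body: bytes = b"") -> str | None:
    """Return the caller's public key (their pseudonymous identity) if the signature checks out."""
    try:
        public_key = ed25519.Ed25519PublicKey.from_public_bytes(
            base64.b64decode(headers["X-User-Key"]))
        signature = base64.b64decode(headers["X-User-Signature"])
    except (KeyError, ValueError):
        return None
    canonical = f"{method}\n{host}\n{path}\n".encode() + body
    try:
        public_key.verify(signature, canonical)
    except InvalidSignature:
        return None
    return headers["X-User-Key"]  # the key itself serves as the account identifier
```

Because the public key itself is the identifier, the server in this sketch needs no cookie and no password database, and the user can present a different keypair to each site if they want to keep identities separate.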
I still don't think there's much point in trying to fix this by making information gathering and retention illegal (the recent PRISM scandals suggest that legitimate authorities will find ways to work around limitations on their information gathering, and organised criminals simply won't give a damn anyway); we need better technology that makes us anonymous by default and pseudonymous when we want to be. But there may be some value in legislation helping to break the stranglehold on the social software market held by big centralised organisations!
I'm updating the ARGON security model to work like this (not that that makes a difference to the Real World, mind...)