PGP co-founder Jon Callas on privacy, encryption policy, and ponytails

Technologist and inventor Jon Callas helped found landmark security firms and privacy technologies including PGP, Silent Circle, and Blackphone. During a long and distinguished career, he’s also worked at Digital Equipment Corporation, Apple, and Entrust, and alongside other infosec luminaries including Phil Zimmermann and Bruce Schneier.

Callas – who has a background in cryptography and created several Internet Engineering Task Force standards including OpenPGP – joined the American Civil Liberties Union (ACLU) as a senior technology advisor last year.

The Daily Swig recently caught up with Callas to talk encryption policy and privacy in 2019.


Tell us about your new role with the ACLU. Are you getting involved in any campaigns? What’s your perspective so far?

Jon Callas: I started at the ACLU on December 1. I’ve been there long enough to figure out a number of things. I’ve been working on two fronts. First, those of us who are the technologists (I have a two-year fellowship) provide support for the people doing their own work on lawsuits, investigations, and other things.

A bunch of times there’s research that I will do for them. It’s questions like: “How does this thing actually work? What’s really going on when someone uses Google Home or [similar smart speakers]? What’s going on with facial recognition in different places?”

We can read documents as experts in the same way that our lawyer friends can deal with the cases themselves.

The other part of it is that we are looking at different projects around the world. I’m using surveillance as my umbrella, and the reason for that is that I need to have some sort of theme to say whether I’m going to do something or not.

It’s broad enough that it can cover things like encryption backdoors, location privacy, and use of machine learning in dodgy places. As you might expect, I’ve been looking a lot recently at encryption backdoors and spyware.


What are you finding with respect to encryption backdoors?

JC: We are working with people to come to agreements on why this is not a good idea. It’s designing for surveillance as opposed to designing for security.

[With] every decision you make when you are creating things, if it is going to create more surveillance, you are implicitly lowering security. That’s pretty much the choice that we have in systems design: do we push towards more security and privacy, or do we push for more surveillance?


On this theme, GCHQ recently came up with a proposal for adding an extra party into an end-to-end encrypted chat via a ‘ghost’ feature. What are your thoughts on that?

JC: It’s a very interesting paper. It’s very well written. It makes a bunch of very good points, but it also includes ‘we’re going to let you guys encrypt – we’re just going to be in on the conversation’, which has been described as ‘we don’t want to break your encryption, we just want to break your authentication’.

There have been a number of papers written [in response to the proposal]. What I have been looking at is how you would actually make it work, and it’s something that doesn’t work even within the principles that they’ve proposed.

One of the assumptions that’s in there, for example, is that you’re going to be able to limit this to only the ‘good’ countries. Fundamentally, that’s not going to happen.

You can’t say: “Well only the US, the UK, our friendly allies – we’re the only ones that are going to use this.” Other countries are going to insist that they be allowed to play, too.

You have the obvious – China has been setting up its own standards for everything. They have their own encryption algorithms. They have certificates issued with those encryption algorithms. They are data homing to China – cloud data that Chinese users make use of has to be homed there. This is also a thing the EU has been doing with GDPR.

You’re going to have to give it to them. You’re going to have to give it to the Russians. You’re going to have to give it to Saudi Arabia. There’s no good way to shut any of these countries out.

That gets directly back to the question: “Are we building surveillance into the system, or are we building in security?” The countries that have a reasonably good human rights record and a reasonably good rule of law are relatively small [in number], and they’re not going to be able to keep this sort of tool [from being used by others]. It goes to everybody or none.

There are some other issues as well. One that I’ve been looking at is related to what a couple of EFF (Electronic Frontier Foundation) guys did. They built a threat model for it, and they pointed out there are a number of ways you could detect that this [GCHQ proposal] was operating.

I agree with the assessment and some of the things that they did. It’s a good paper. If you looked at it from a dispassionate point of view, some of the things that they pointed out could be worked around. Nonetheless, you could very likely have a client that you run on your system. You fire up an app. It’s a WhatsApp checker and it tells you who’s in the conversation.

If anybody is concerned about government spying – which, getting back to my previous point, could be any government – then it [a checker app] would tell them who’s in the conversation: here are all these people who are there, and there’s somebody unknown.

Does that mean it’s any good for them anymore? It’s in the [GCHQ] paper that they need it to be silent, that they need it to run dynamically. If you run something – either passively on your network or actively on your endpoint device – that’s effectively a bug sweeper, then is it still useful for them?
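To make that concrete, here is a minimal sketch of such a checker, assuming a hypothetical get_session_members() call that exposes the identity-key fingerprints the service claims are attached to a conversation. No real messenger exposes this today; surfacing it is the whole point of the hypothetical app.

```python
# Hypothetical 'conversation checker': flag any participant the server
# has attached to a session that no user has verified. The
# get_session_members() call is an assumption for illustration only.
from typing import Set


def get_session_members(conversation_id: str) -> Set[str]:
    """Hypothetical: fingerprints of the identity keys the service says
    are currently receiving messages in this conversation."""
    raise NotImplementedError("would come from the client's session state")


def find_ghosts(conversation_id: str, verified_fingerprints: Set[str]) -> Set[str]:
    """Any key the server added that nobody verified is a candidate
    'ghost' listener under a GCHQ-style scheme."""
    return get_session_members(conversation_id) - verified_fingerprints


# Usage: alert the moment an unverified key appears.
# ghosts = find_ghosts("family-chat", {"alice-fp", "bob-fp", "carol-fp"})
# if ghosts:
#     print("Unknown participant(s):", sorted(ghosts))
```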

Ages ago with the Clipper Chip, the break that Matt Blaze found in the chip was that you could fake being a legitimate user.

There was this field called the LEAF – the law enforcement access field. Matt [Blaze] came up with a way that you could forge a LEAF and say that you were playing the law enforcement game when you actually weren’t. That was pretty much the end of the proposal, because they wanted a way to set up encrypted phones, and they didn’t want one that pretended to play their game but wasn’t.
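The arithmetic behind Blaze’s attack is worth spelling out: the LEAF’s integrity check was only 16 bits wide, so a user who can’t compute it honestly can still find a value the device accepts by brute force, in about 2^16 tries. A rough sketch of the idea follows; the checksum function is a stand-in (the real one was classified), since only its 16-bit width matters to the attack.

```python
# Sketch of the Clipper LEAF forgery: the receiving device can only
# verify a 16-bit checksum, so random candidates succeed after roughly
# 2^16 = 65,536 attempts. leaf_checksum() is a stand-in function.
import itertools
import os


def leaf_checksum(body: bytes) -> int:
    """Stand-in for Clipper's classified checksum; the attack depends
    only on the output being 16 bits wide."""
    h = 0x5A5A
    for b in body:
        h = ((h << 5) ^ (h >> 11) ^ b) & 0xFFFF
    return h


def device_accepts(leaf: bytes) -> bool:
    # The device checks the checksum, but it cannot tell whether the
    # escrow fields inside the LEAF are genuine.
    body, claimed = leaf[:-2], int.from_bytes(leaf[-2:], "big")
    return leaf_checksum(body) == claimed


# Forge: fix a bogus checksum and randomize the body until it verifies.
bogus_check = (0).to_bytes(2, "big")
for attempts in itertools.count(1):
    if device_accepts(os.urandom(14) + bogus_check):
        print(f"forged LEAF accepted after {attempts} attempts")
        break
```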

It’s a similar question that I have [with GCHQ’s proposals]. If there were ways to show whether a system built in the grand architectural sense would work – if you could tell who the cheaters are, who was listening in – does that mean that this [the proposal] isn’t necessary, or that it’s not going to work?

I think this is another facet of what we technologists have been saying all along, which is: “This isn’t going to work, and there are a bunch of ways that this is not going to work.”

We have seen the discussions about how the low-level cryptography works, but I am looking at ways the higher-level system wouldn’t work.

Those are the two major things that I am finding. If you are going to allow anyone to play the game, what are the consequences of that? Does that mean that we need to have an international standard, and this spying thing regulated by the ITU [International Telecommunication Union] or somebody like that, because you’re going to need different policies for different parts of the world? Somebody is going to have to put these together and regulate them.

Their proposal says that transparency is important and things should be audited. So you’re turning it into one big international standards and code review system.

And then, if you can detect it running all the time anyway, did you actually do anything, or did you just take NSO Group’s Pegasus [spyware] and build it into the system?

My conclusion is that, yes, it’s going to be detectable. It is a huge engineering effort to govern even who the good countries are and what they could do, and this runs counter to the principles of rule of law, transparency, and proportionality that they say are extraordinarily important. You can’t square this circle. You can’t live up to what we believe are good principles and build this thing, too.


Moving on from the UK, what do you think of the Australian policy of pretty much proscribing encryption?

JC: It’s a bad move for them. They have said they are going to have their own commissions that look at what you would do with this. Just like we all say, this is ridiculous – you can’t ban encryption, because now you’re banning all of security everywhere.

They are going to come up with some way that they would implement this. There’s the “I want to wait to see what they will say”, but there’s also “I don’t see how this works either”.



Are we seeing that history is repeating itself in encryption policy?

JC: I am somewhat frustrated that it’s the same discussion that we’ve been having for 20 years. Nothing has changed, and there are huge parallels with how we got here. The encrypted phones of the late 80s and early 90s were created because of Russian, and then Iron Curtain, spying.

It was those nations spying on the large companies of the time, like IBM and so on, and they wanted a way to have their cake and eat it, too.

As we found out then, there was no way to build a spying system that was also a secrecy system or – to put it another way – to build a spying system that only let the good guys in.


If we’ve been having this discussion for the last 20 years what can happen to break out of it?

JC: I think that counter-regulation is one of the things that is necessary. We are seeing a lot of very good regulations showing up in parts of the world that relate to privacy and security anyway.

I’m a huge fan of GDPR [the EU’s General Data Protection Regulation]. It’s not perfect – we can quibble about it – there are going to be discussions about how it is going to be implemented, and there’s going to be the equivalent of court cases and test cases all up and down the line.

But, if you look at it, the basic principles are completely reasonable. Yes, there are absurd things that you can do on a number of fronts, but I think we’ll get past those. It is mandating that people have a right to privacy.

While it doesn’t mandate secrecy, it mandates control – and it turns out that one of the good ways for a company to have control, and to be able to say, “Yes, I am being a good steward of this data”, is encryption and secure enclaves and all the other sorts of security technology that we are seeing coming up.

Here in the US, California has a privacy law now. Illinois has a biometric privacy law. There’s a lawsuit on that – some parents are suing an amusement park because it took fingerprint data from their kids without permission. The Illinois Supreme Court just unanimously said that, yes, the lawsuit can go on.

So now it looks like we have a law that governs a particularly interesting new form of privacy. We have something going through the court system which will govern how these sorts of things are going to work.


We’ve spoken about the security policies of governments. What about the privacy threats posed by the likes of Facebook and Uber? How can you advocate for privacy in a world where it is routinely given away?

JC: Part of the reason I am at the ACLU is that I think there are a number of things that are better solved through policy than they are through technology. Each of them has a power that the other doesn’t have.

We saw the Californian data breach law of years ago become the inspiration for entirely new sets of security technologies, with an emphasis on reporting and so on. It pushed us in the right direction.

I think that the privacy laws that are coming up in the US and in Europe will also push us in the right direction. I think that we’ll have technologies that will back these things up. Having something in place where we as a society say, “This is how the rules ought to be; you have to do certain sorts of things”, is going to have a huge effect on the technology and on companies.

We are also seeing real competition happen on the basis of privacy.


Can you give an example of that?

JC: There is a lot of that coming from Apple, for example. They are turning privacy into a huge selling point for their devices.

For example, they are going to have a new credit card that isn’t going to collect data on you. We still don’t know all of the details. It’s probably not going to be perfect, but now all of a sudden, we have two huge companies – Apple and Goldman Sachs – that are saying: “We’re not going to make extra money by monetizing your data.”

Even if it’s not perfect – which I don’t expect it to be – it’s a huge step forward. It’s competition. Once they start and someone else says, “I need to compete with that”, then we’ve changed the nature of what people are doing.

At CES [the consumer electronics convention], the Vizio chief exec flat out said: “I can’t sell these TVs as cheaply as I am selling them unless I can get your data.” The obvious follow-on question is: “So how much do I have to pay you? I’ve got my wallet right here.” You know, give me the number.

We are discussing this now – and we are being open about it, which we haven’t been – and that’s really, really good news.

We’re seeing both public policy and the private sector lining up on one side or another of it. This is the most hopeful that I’ve been in years.

One of the things I wanted to do with Blackphone [a privacy focused smartphone] was to prove that people would buy something that is privacy protecting, leading with that as its main feature. And we were successful enough with that to show that there was a market.



Chinese telco equipment maker Huawei has been much in the tech news recently. There are fears that its technology could present a spying threat. However, the telcos themselves see Huawei’s technology as the cheapest and fastest way to roll out 5G. As a technologist, do you think concerns about China’s technology are credible, or exaggerated for political reasons?

JC: I think that there is a huge need to be concerned about Huawei, and also some other companies. We know that there have been things that have gone on in the past.

There are undoubtedly some things that are legitimate that we don’t know about. There’s undoubtedly some political maneuvering going on.

It is bringing to the forefront the question of: how do you know these things?

We want a manufacturer of any equipment to give us full information about what’s going on. Going back to the previous discussion, having the chief exec of Vizio say “my televisions collect data” is the first step.

There have been other discussions going on about supply chain things, as we saw with the recent story about Asus. The question then becomes: “How do we secure the supply chain?” Securing the supply chain is not only technological. It’s also going to be policy driven.
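On the purely technological side, the baseline defense is checking what arrived against something pinned out of band. Here is a minimal sketch, assuming a hypothetical vendor-published digest; notably, the malicious Asus updates reportedly carried legitimate vendor signatures, which is exactly why the technology alone doesn’t close the hole.

```python
# Minimal update-integrity check: compare a download's SHA-256 against
# a digest pinned out of band. PINNED_SHA256 is a hypothetical
# placeholder; a real deployment would obtain it from a channel
# independent of the download itself (e.g. a transparency log).
import hashlib

PINNED_SHA256 = "0" * 64  # hypothetical placeholder digest


def update_is_intact(path: str) -> bool:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest() == PINNED_SHA256
```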

If you look at what’s going on with Huawei, this is a policy-driven discussion going on about the supply chain.


I don’t think the whole thing is unprecedented.

JC: No. If you look at the Bloomberg/Supermicro article – even if things didn’t happen exactly the way that the reporters said they did – we know that something happened. We know, for example, that there were server motherboards that went to one or more companies with malware in their drivers.

That kind of sounds familiar today with the Asus thing.

So even if it wasn’t hardware on the board, even if it was something simple like a [malicious] BIOS on it, we know that something happened.

Even if it was that somewhere in the supply chain, buggy software was put on a server motherboard and nobody found it, we’re back to: how do you secure the supply chain?



Do you think any information that came out as a result of the Snowden leaks has had an effect on encryption policy and, if so, what?

JC: I think it has had a long-lasting effect. What we found out were things that many people had suspected for some time. It has directly affected the discussion that we are having both technologically and from a policy standpoint.

I firmly believe that the major difference between the NSA and other [agencies] is that Snowden revealed what the NSA is doing.

We are seeing echoes of this. It’s very likely that the stuff going on with Huawei involves things we don’t know about, but that’s similar.

We are seeing this in discussions of NSO Group, other spyware, etc. It is no longer crazy to discuss these things.


Do you still have a ponytail? I ask because I was wondering if it’s anything beyond a sheer coincidence that so many infosec experts have ponytails: Bruce Schneier, Mikko Hyppönen, yourself. Is it the cybersecurity equivalent of the capes of superheroes?

JC: I don’t know if it is more than coincidence. There are a lot of us that way. I started growing it because I’d spent many years doing many interesting things to it. My adventurous hairdresser moved away, and I wondered how long it would actually get.

Then I kept it because it was just easier. A ponytail is easy. Wash it, let it air dry, tie it back. The colors came via finding another good, adventurous hairdresser.

And then I decided to go to a conference of fans of the Miss Fisher’s Murder Mysteries, which are set in 1928-29. We all went overboard: “Oh, we should dress in period clothes. Oh, we should submit talks. Oh, we should enter in the clothing competition.”

So I cut it to a period style, and have kept it for the last year. I rather like it this way, and people say it makes me look 15 years younger, which – given that I’m now described as “elder statesman” – is fine with me.