RealTime IT News

Philip Zimmermann, PGP Creator

Philip Zimmermann is the creator of Pretty Good Privacy, an e-mail encryption software package. Originally designed as a human rights tool, PGP was published for free on the Internet in 1991. This made Zimmermann the target of a three-year criminal investigation, because the government said it violated U.S. export restrictions for cryptographic software.

Network Associates Inc. (NAI) acquired his company, PGP Inc., in December 1997, where he stayed on for three years as senior fellow. In August 2002 a new company called PGP Corp. acquired PGP from NAI. Zimmermann now advises the PGP board and is enjoying his status as a consultant, but said he would consider starting another company focused on - what else? - encryption.

Recently, internetnews.com sat down with Zimmermann at the e-Mail Technology Conference in San Francisco to talk about e-mail, the controversial "Do-Not-Spam" list and how technologists can protect privacy and still make money.

Q: Let's start with the future of e-mail. What would you like to see happen?

I think that e-mail should be more encrypted than it is today. We've had a history of trying to get encryption adopted by the masses of e-mail users and it's been difficult because of ease of use issues. Not so much ease of use with the user interface, but ease of use in terms of the difficulties people have in grasping the concepts of "certification" and trust models and things relating to public key infrastructure.

I think it is important that e-mail be mostly encrypted because we enjoyed that kind of privacy with postal mail, with envelopes. It would be a shame to abandon that privacy as we move to the digital world.

Q: What is holding us back from that now?

The cognoscenti understand public key cryptography. They understand why it's important to know whether a public key belongs to the name that is attached to it. But your mom doesn't understand that. And if you want your mom to use encrypted e-mail, you are not going to ask her to learn about these abstractions of public key infrastructure and key certification. So if you want thousands of employees in an enterprise to have their e-mail encrypted or decrypted automatically without knowing about it, the best way to do that is to put it in an e-mail proxy and have a box sitting next to the e-mail server that serves as a proxy to the e-mail server. PGP has a product that does that called PGP Universal.
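The gateway idea he describes can be sketched in a few lines. This is a hypothetical illustration of the decision logic only, not PGP Universal's actual implementation: `gateway_process`, `Message`, and the pluggable `encrypt_fn` are all invented names, and a real gateway would use OpenPGP rather than a stand-in function.

```python
from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    recipient: str
    body: str
    encrypted: bool = False

def gateway_process(msg, keystore, encrypt_fn):
    """Proxy logic between the users and the mail server: if we hold a
    public key for the recipient, encrypt transparently before relaying.
    The sender never sees this step happen."""
    key = keystore.get(msg.recipient)
    if key is not None:
        return Message(msg.sender, msg.recipient, encrypt_fn(key, msg.body), True)
    # No key on file: relay as-is (a real gateway's policy might instead
    # queue the message or look the key up from a directory).
    return msg
```

The point of the design is that key lookup and encryption happen in one box next to the mail server, so no desktop client, and no user, has to understand key certification.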

Q: Is cryptography and encryption software taken for granted these days? Do people expect too much that technology will relieve them of responsibility?

In the case of e-mail encryption, I don't think that it is taken for granted. In fact, quite the opposite. People don't think about it at all. That could sound like people are taking it for granted, but there is a threshold of not thinking about it below which you are not taking it for granted. So we need to raise it above that threshold, where people expect their e-mail to be encrypted and, in fact, it really does get encrypted.

In the case of spam, we are going to have to use technology to beat spam. There has to be some awareness of users not to be taken by fraud. But as far as the delivery of spam is concerned, you can use technology to stop that. One of the most interesting approaches I've seen is technology that tries to impose some kind of cost on sending e-mail. Not cost in dollars, but cost in compute cycles. Notably, the Penny Black proposal, which requires that when an e-mail server wants to send e-mail to another e-mail server, it asks for a cryptographic puzzle from the recipient's e-mail server, which it must solve before the recipient's e-mail server will accept the e-mail. And it takes a couple of seconds in compute cycles to solve that puzzle. So a routine e-mail is not a problem, because you don't mind if your server takes a couple of seconds to send an e-mail. But if you are sending 100 million pieces of e-mail, it is not feasible to spend a couple of seconds of compute cycles on each piece.
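The puzzle he describes is commonly realized as a hashcash-style proof of work. The sketch below is a minimal illustration of that general technique, not the actual Penny Black protocol; the function names, the 16-byte challenge, and the SHA-256 difficulty scheme are all assumptions made for the example.

```python
import hashlib
import os

def make_puzzle() -> bytes:
    """Recipient's server issues a random challenge to the sending server."""
    return os.urandom(16)

def solve_puzzle(challenge: bytes, difficulty_bits: int = 20) -> int:
    """Sender burns CPU searching for a nonce whose SHA-256 hash falls
    below a target. Each extra difficulty bit doubles the expected work,
    so the recipient can tune 'a couple of seconds' of cost."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

def verify(challenge: bytes, nonce: int, difficulty_bits: int = 20) -> bool:
    """Recipient checks the solution with a single cheap hash."""
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))
```

The asymmetry is the point: solving costs millions of hash attempts, verifying costs one, so a server sending one message barely notices, while a spammer sending 100 million would need seconds of compute per recipient.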

As bad as the junk mail that I get is, at least I don't get everybody in the world sending me junk mail. Because you have to spend a tiny amount of money to send some junk mail in the postal system. It should cost a tiny amount of something to restore it to something that resembles the postal system.

Q: What are your feelings about the controversial "Do-Not-Spam" list and the Federal Trade Commission's ruling?

The leading argument they made was that a spammer could use it as a gold mine of e-mail addresses to send spam to. I thought that was an interesting observation. I think it is true if you make just a "Do-Not-Spam" list like we did with the "Do-Not-Call" list for phone numbers. What they should do is create a "Do-Not-Spam" list that is made out of cryptographic hashes of e-mail addresses. Not the e-mail addresses themselves, which could be used by the spammers, but a one-way hash function, like the secure hash algorithm that NIST [National Institute of Standards and Technology] has, that we use in cryptographic systems, and store the hashes on the list. So spammers could easily tell if your e-mail is on the list or not by hashing it and comparing it to what is on the list, but they could not mine the list for new addresses, because the hash cannot be reversed. I think that is a simple mathematical solution to the FTC's leading reason why they don't want to make a "Do-Not-Spam" list.
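His proposal can be demonstrated directly with a standard NIST hash. This is a sketch of the idea under stated assumptions: he names only "the secure hash algorithm" generically, so SHA-256 and the lowercase-normalization step are choices made for the example, and the function names are illustrative.

```python
import hashlib

def email_hash(address: str) -> str:
    """One-way SHA-256 hash of a normalized address. Given the hash,
    there is no feasible way to recover the address."""
    return hashlib.sha256(address.strip().lower().encode("utf-8")).hexdigest()

def build_registry(addresses) -> set:
    """The published 'Do-Not-Spam' list: hashes only, never raw addresses."""
    return {email_hash(a) for a in addresses}

def is_registered(registry: set, address: str) -> bool:
    """A sender who already holds an address can check it against the
    list by hashing it, so the list still works for its purpose."""
    return email_hash(address) in registry
```

A spammer who downloads the registry gets only hex digests; only someone who already knows an address can test whether it is on the list.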

Q: What does the future hold for PGP and its role as a "defender of human rights"?

Visualize two pie charts. The first pie chart is all the e-mail in the world. You find that a very narrow slice of that chart is encrypted. Most e-mail is not encrypted. Now, if you expand that slice of encrypted e-mail into a second pie chart, where all that e-mail is encrypted, you would find that nearly the entire pie chart is PGP. I'm proud of that second pie chart, but not the first one. Why is it, after 13 years of PGP deployment, that we have such a thin slice of e-mail being encrypted? The answer is that the learning curve is too much for technically unsophisticated people. The solution is to find a way so that it is done automatically.

I feel that this is particularly important now in this legal climate, because the Patriot Act is increasing government surveillance capabilities. Reducing legal obstacles to government surveillance . . . reducing judicial oversight, which I just can't imagine any justification for, which is what parts of the Patriot Act do. We should try to repeal parts of the Patriot Act, or at least (exempt) our e-mail and other communication infrastructure, to make it more resistant to that kind of surveillance. This would also have the benefit of protecting our communication infrastructure nationwide from interception by terrorists.

Q: How advanced are the terrorists in your estimation? Are they as intelligent as the people over at M.I.T.?

No. No they're not.

Q: Could they be?

They have people who have knowledge of computers working for them and they use encryption. But I don't think they are as capable of intercepting communications as major governments are.

Q: Talk about the struggles you had in the beginning bringing your technology to light. What can future technologists learn from that?

The struggles that we had in the 1990s to lift the U.S. export controls and to prevent the imposition of domestic controls were fights that we fought and won. And I think that we have a good chance of holding the line on the erosion of privacy. It would be difficult for the government to impose controls on cryptography now because of the tremendous momentum that has built up in the industry around depending on it. Strong cryptography is in every browser with SSL, and there are products in Europe that compete with American products. Our global e-commerce depends on it.

In general, I would like to suggest that people who are going to deploy technology examine the social effects of that technology. Is it going to help privacy and civil liberties, or is it going to hurt them? And try to use that as guidance in choosing which kind of technology to deploy. Try to be socially responsible.

There are ways to make money with technology that are not as corrosive to your privacy. For example, biometrics could be deployed in a way that erodes privacy, or it could be deployed in a way that doesn't. If you store biometric specimens in a central database, I think that would erode privacy. But if you use biometrics locally on a smart card, where the only copy of the biometric specimen lives on the smart card and nowhere else, I don't think that erodes privacy. So when you deploy technology as an engineer, you can choose to design it in certain ways that have a smaller impact on privacy and civil liberties and still make money.