RealTime IT News

Paul Kocher, President, Cryptography Research

Paul Kocher has been thinking about computer security since his undergraduate days at Stanford University more than 15 years ago.

He gained international recognition for his research in cryptography and for designing the security protocols that make up SSL (Secure Sockets Layer) v3.0.

More recently he has led research in developing differential power analysis and designs for securing smart cards and other devices against attacks.

These days, Kocher is president and chief scientist for Cryptography Research, a 10-year-old San Francisco-based company that provides services and applied research to solve some of the world's most complex data security problems.

Internetnews.com sat down with Kocher to talk about his thoughts on the trends in security attacks and how cryptography can help address them.

Q: Security has improved dramatically over the last 10 years. Why does it still seem like a gargantuan task to completely secure a system?

There are trends that show up every year. One of the basic ones is that, with algorithms, you have in many ways a solid foundation for building security systems. But once you get above the algorithm or protocol layer, really, nobody knows what they are doing anymore.

We have no way to make software bug free. We have no way to know if the networks really are secure other than to observe our failures. We have little understanding of how systems can be built that are low risk. There are a lot of parts, but they are very reactive to failures. What most people want are systems that are reliable and don't fail in the first place.

A lot of our research is looking at how to make systems that fail gracefully and recover comfortably.

Q: One thing researchers have noted recently is an increase in attacks against hash functions. What seems to be the problem?

Think of hash functions as the glue that connects all these algorithms together. So if you want to make a digital signature, you use a hash function to convert a large message into something short, or if you want to derive a key as part of a protocol, you would use a hash function in the process.
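To make that "glue" role concrete, here is a minimal Python sketch (not from the interview) of the two uses Kocher describes: compressing a large message into the short digest that a digital signature actually signs, and deriving key material inside a protocol. The inputs, salt and iteration count are illustrative assumptions.

```python
import hashlib

# Hash a large message down to a short, fixed-size digest -- this digest is
# what a digital-signature scheme actually signs.
message = b"A large document ... " * 10_000
digest = hashlib.sha256(message).hexdigest()
print(len(digest))  # 64 hex characters, no matter how big the message is

# Hash-based key derivation inside a protocol: PBKDF2 stretches a shared
# secret and a salt into key material by iterating a hash function.
key = hashlib.pbkdf2_hmac("sha256", b"shared secret", b"protocol salt", 100_000)
print(key.hex())
```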

Over the last year, there has been a lot of new research using a technique called the neutral-bit technique, which cryptographers Eli Biham and Rafi Chen introduced. That technique is a new way of attacking these things.

The major hash functions that were being used have had these big chinks in their armor discovered. The MD5 hash function, which has been around for quite a long time and is still widely used, was shown to be vulnerable to an attack called a "collision." Basically it means that it is possible to manufacture two messages that have the same hash, and that is not supposed to be possible.
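As a rough illustration of what a collision means (this is a sketch, not a published collision pair), it is simply two distinct inputs that hash to the same value. The placeholder bytes below are hypothetical and will not actually collide; real attacks construct carefully crafted block pairs that do.

```python
import hashlib

# A collision means two *different* messages with the same hash. The bytes
# below are placeholders (hypothetical), so this check will report no collision.
m1 = bytes.fromhex("00" * 64)
m2 = bytes.fromhex("ff" * 64)

if m1 != m2 and hashlib.md5(m1).digest() == hashlib.md5(m2).digest():
    print("collision: two distinct messages, one MD5 hash")
else:
    print("no collision for these placeholder inputs")
```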

And then there is a new result announcing that the same kind of attack is possible -- though it would take a fair bit of computing power to do it -- against the SHA-1 [U.S. Secure Hash Algorithm 1] hash function, which is by far the most widely used hash function today.

SHA-1 is in easily 400 to 500 different programs installed in a typical computer. It is in almost everything where you have any security protocol.

Now, this is not something where users need to go out and unplug their computers from the Internet. But the attacks mean there is a process that needs to happen now where we go find new algorithms, and we need to understand them a lot better.

It is part of the natural research, attack and development cycle, and it will be difficult for a lot of product vendors, because there are a lot of legacy systems, huge amounts of code and a huge number of protocols that all have this algorithm hardwired into them.
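One common mitigation for that hardwiring problem is what is often called algorithm agility: carrying an algorithm identifier alongside each digest so the hash function can be replaced later. The Python sketch below is an assumption about how that might look, not a description of any vendor's code.

```python
import hashlib

# "Algorithm agility": store an algorithm identifier next to every digest so
# the hash function can be swapped out later instead of being hardwired.
def make_digest(data: bytes, algorithm: str = "sha256") -> tuple[str, str]:
    """Return (algorithm name, hex digest) so verifiers know which hash was used."""
    return algorithm, hashlib.new(algorithm, data).hexdigest()

def verify_digest(data: bytes, algorithm: str, expected_hex: str) -> bool:
    return hashlib.new(algorithm, data).hexdigest() == expected_hex

alg, d = make_digest(b"legacy record")
assert verify_digest(b"legacy record", alg, d)
print(alg, d)
```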

Q: Will these vendors tackle the problem themselves or is there a third-party solution? Who might be ahead of the game at this point?

Most of the larger companies will have their engineers go through their products to look for what algorithms are used in the code. This is not an easy process, because if you shipped a product 15 years ago, you may not have any engineers who are still paying attention to it. Certainly for new development, this is an issue people are going to be thinking about.
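A first inventory pass often looks something like the hypothetical Python scan below, which simply flags source lines that mention MD5 or SHA-1 so they can be reviewed. The directory name and file pattern are made-up examples, not anything from the interview.

```python
import pathlib
import re

# Crude inventory pass: flag source lines that mention hash algorithms so the
# hardwired ones can be reviewed. The directory and file glob are hypothetical.
PATTERN = re.compile(r"\b(md5|sha-?1)\b", re.IGNORECASE)

def find_hash_references(root: str):
    for path in pathlib.Path(root).rglob("*.[ch]"):
        for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            if PATTERN.search(line):
                yield path, lineno, line.strip()

for path, lineno, line in find_hash_references("./legacy-product"):
    print(f"{path}:{lineno}: {line}")
```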

One challenge with hash functions is that there is not really any single algorithm you can point to and say it is well studied and secure against these attacks. There are some algorithms that are relatively new or not widely used, for which it is not evident that any attack applies.

Q: One term being applied to your research is this notion of "renewable security." Can you help define what it is and where it is used?

Over the last 20 years, most of the work in computer security has focused on making systems that are robust against attack. The question of what you do after a failure has occurred is an area where some systems have been built successfully, but for the most part there is very little research and very little foundation behind them.

If you look at anti-virus software, for example, that is a case of a very reactive method. When it comes to protecting against movie, music or game piracy, there is this notion that you have to build a really strong box, put the stuff into it and hope your box never gets broken. This model turns out to be unrealistic. You can't build cheap, complex systems that never get broken when they are developed and deployed by engineers who don't know anything about security.

One area where we are really working with that model is a system to control piracy. There you've got a lot of really complicated unpredictable threats. Defenses are very hard to make effective. But you have a business model where you can tolerate reasonable amounts of fraud and have some failures. As long as the new movie you are shipping doesn't get pirated as quickly, that is a big win.
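A toy sketch of that renewable model, with entirely hypothetical names and no relation to any real content-protection system: each new release consults an updated revocation list, so device keys known to be compromised stop working for new titles even though past leaks are tolerated.

```python
# Toy model of renewable content protection (hypothetical names throughout):
# compromised device keys are added to a revocation list that ships with each
# new release, so past leaks are tolerated but do not compromise new titles.
revoked_device_keys = {"device-key-0042"}   # learned about after a break

def new_release_accepts(device_key_id: str) -> bool:
    """New content refuses device keys that are known to be compromised."""
    return device_key_id not in revoked_device_keys

print(new_release_accepts("device-key-0042"))  # False: the defense has been renewed
print(new_release_accepts("device-key-0777"))  # True: unaffected device still works
```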

Q: Storage is a hot topic these days. How is cryptology helping keep things safe in the long term?

Security for stored data is a really interesting problem, because you have this requirement that you have security over time, and we are having a hard time making systems that can be secure even temporarily. With a normal protocol, like with a phone call or e-mail, the security is negotiated instantaneously. For stored data you have a lot of complicated requirements, like the fact that you deploy your system long before you use your data. So you will typically save data today and then want to use it in a lot of rich and complicated ways later.

You have these challenges, like how do you put security into your enterprise in a way that the technology you deploy today will have the flexibility you need later but not be so flexible that it is easy to attack.
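One common way to get that later flexibility is envelope encryption, sketched below with the Python "cryptography" package's Fernet recipe (the library choice is an assumption; Kocher names no specific tool): records are encrypted under per-record data keys, and only the small wrapped keys need to be touched when the master key is rotated years later.

```python
from cryptography.fernet import Fernet

# Each record is encrypted under its own data key; the data key is wrapped by
# a master key and stored alongside the record. Rotating the master key later
# only means re-wrapping the small data keys, not re-encrypting the data.
master_key = Fernet.generate_key()
data_key = Fernet.generate_key()

ciphertext = Fernet(data_key).encrypt(b"record saved today")
wrapped_data_key = Fernet(master_key).encrypt(data_key)

# Years later: rotate the master key without touching the stored ciphertext.
new_master_key = Fernet.generate_key()
recovered_data_key = Fernet(master_key).decrypt(wrapped_data_key)
wrapped_data_key = Fernet(new_master_key).encrypt(recovered_data_key)

assert Fernet(recovered_data_key).decrypt(ciphertext) == b"record saved today"
```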

It turns out that there are quite a few different types of architectures people are using for storage today, and there is not a consensus as to which is best. Some people put security at the application layer, which means that you can get good and rich integration. But if you want to handle a suite of programs, you are going to end up with this patchwork of techniques.

Some people put it in the network fabric, so you have an IPsec encryptor next to your storage devices or near your servers. But once the data is on the PC or on the storage device, it is not encrypted. That approach has the big drawback that you are only protecting the communications channel and not protecting the data itself.
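The contrast shows up in a small application-layer sketch (again using Fernet as an assumed library): because the application encrypts records before they reach storage, the bytes stay protected at rest, which a channel-only IPsec encryptor does not give you.

```python
from cryptography.fernet import Fernet

# Application-layer encryption: the application encrypts before writing, so
# whatever lands on the disk or storage array is already ciphertext.
storage_key = Fernet.generate_key()      # in practice, held in a key-management system
cipher = Fernet(storage_key)

record = b"quarterly financials"
stored_blob = cipher.encrypt(record)     # this is what the storage device sees
print(stored_blob != record)             # True: ciphertext at rest

assert cipher.decrypt(stored_blob) == record   # the application decrypts on read
```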

The difficulty of managing data securely is huge, especially if it is data that needs to be protected over a long period of time.

Q: What do you see as the major trends in cryptography and IT security in the next 10 years?

What we are seeing are these targeted attacks, such as someone who is trying to read your CFO's e-mail in order to do some insider trading.

The current countermeasures, like spyware detectors, are reactive in that they observe that thousands of computers have been affected by some attack. But if it is a directed attack, we don't have the tools to handle that today.

Most of the products are designed to protect most of the people most of the time. They are not designed to protect a situation where one particular person is being targeted or where there is a narrowly focused, determined attack.

If you can keep 99.9 percent of your computers virus free, you've done a pretty good job at handling the virus problem. Keeping 99.9 percent of your computers safe is no good if that 0.1 percent is your CFO.