RSA CTO on machine learning, cryptography & quantum computing

Zulfikar Ramzan, CTO of RSA, talks about complexity theory and cryptography, and speculates on the implications of quantum computing developments for cryptography.

Photo by Logan Kirschner.

After finishing his cryptography PhD at the Massachusetts Institute of Technology (MIT) in 2001, Zulfikar Ramzan found many of his colleagues continuing down the academic path. He wanted to sink his teeth into real-world problems, however, and went looking for a company where he could apply his academic knowledge at the same time.

He joined Symantec to work on large data sets and machine learning, then left Symantec for a startup called Immunet, which was bought by Sourcefire, which in turn was bought by Cisco. Feeling “the itch” (as he calls it) again, he left to join a startup that was purchased by Blue Coat, which in turn was acquired by Symantec.

After having traced a full circle through the security vendor space, he received a LinkedIn message asking: “Hey, RSA is looking for a new CTO, would you be interested?” (his first thought was that he was being pranked). Ramzan joined RSA as CTO in 2015. So he has come full circle in another sense as well, working for the company co-founded by Ron Rivest, his PhD advisor at MIT.

Ramzan recalls many of the exciting developments along the way, gives us a tour through computer science theory, and speculates on the implications of quantum computing for cryptography.

We’ve come a long way since the days of RSA-129.

Actually, RSA-129 is what got me interested in RSA, the algorithm. Because it was ’92 or ’93 when they started really making progress on factoring RSA-129. I was just enamored with it because it was an interesting mathematical problem — if somebody were to tell you that there’s a solution to it, you wouldn’t believe it. So that’s why I vividly remember where I was when I saw the email about RSA-129 being cracked. It was a life-changing moment.

It was machine learning that got me more deeply interested in cryptography. Because they’re exactly the opposite fields if you think about it. Machine learning is when you get to see data that goes into and out of a black box — the goal is to figure out how the black box works. Now cryptography says, “I want you to create a black box so that, even if you get to see what goes in and what goes out, you can’t understand how the black box functions.” Mathematically, they’re almost exact duals of each other. I was very interested in the complexity of machine learning; what are some of the problems you cannot possibly solve using machine learning? And it turns out that there is a class of problems that you cannot ever tackle using machine learning.

Does that bring in the P versus NP problem?

P versus NP is about worst-case complexity, whereas cryptography is more concerned with average-case complexity. Now factoring is considered to be hard to solve — but in the worst case. In the average case, it should be pretty easy: half of all numbers are even, so if I pick a random number, I could just guess that two is a factor and I’d be right half the time.
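To make that concrete, here is a minimal Python sketch (mine, not part of the interview) of the point: for a uniformly random integer, simply guessing that two is a factor succeeds about half the time, even though factoring is believed to be hard in the worst case. That is why cryptography restricts attention to specially structured instances, such as products of two large random primes.

```python
import random

def guess_factor_of_random_int(bits: int = 1024) -> bool:
    """Pick a random integer of the given size and 'factor' it by guessing two."""
    n = random.getrandbits(bits)
    return n % 2 == 0  # the guess succeeds whenever n happens to be even

# Roughly half of all integers are even, so the naive guess works about 50% of the time.
trials = 10_000
hits = sum(guess_factor_of_random_int() for _ in range(trials))
print(f"guessed a factor for {hits} of {trials} random integers")
```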

In cryptography, you need to come up with situations where even the average instance is hard to break. Whereas in the P versus NP scenario, what they’re concerned with is worst-case complexity of the problem. 

I was interested in complexity theory and then in machine learning — cryptography is at the intersection of those two fields in many ways. 

Do you think quantum computing is going to break traditional cryptography?

It has been a big open question for a long time. It starts in the early ’90s with the work of Peter Shor, who first showed that you could do things on a quantum computer much more efficiently, specifically around factoring RSA-type moduli. But right now, I think the jury’s still out on whether and how long it’ll take for us to get to the point of what we call quantum supremacy.

And I think there are a lot of challenges on the engineering side of building a quantum computer. I mean, they are hypersensitive — if you modify the temperature by the slightest amount, they no longer work effectively. The error rates are incredibly high, so you have to build in quantum error correction. We’re seeing a lot of advances in this area, but I still think it’s going to be quite a while before we get to the point where traditional cryptography — like RSA and Diffie-Hellman and all of the elliptic-curve-based approaches — is going to become obsolete because of quantum computers.

At the same time, now is a good time to think about what the future looks like beyond traditional public-key cryptography. There’s a lot of work now by the National Institute of Standards and Technology (NIST), for example, to solicit candidate post-quantum cryptographic algorithms. Work based on lattices seems to be the most promising approach.

Even the RSA algorithm took a lot of time after it was conceived before it was used extensively in practice. First you had to think about implementations, which back in the ’70s were non-trivial. It’s not just about understanding the textbook implementation, but actually understanding how to pick the prime numbers, how to pick the right exponents, how to avoid things like timing attacks. Sometimes people may not leak the actual data, but if I can figure out how much time it took to do the computation, that can leak something about the key that’s being used. With any algorithm, it can take a decade or two before we have enough confidence in how it’ll perform in the real world and we’ve worked out all the subtle kinks. We saw that with the RSA algorithm, and I think whatever comes next is going to have the same issue.
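To illustrate the timing-attack remark, here is a small Python sketch (my own illustration, not from the interview): a textbook square-and-multiply exponentiation does an extra multiplication for every 1-bit of the exponent, so an attacker who can measure how long a private-key operation takes learns something about the key even without seeing any data. Hardened implementations counter this with constant-time or blinded exponentiation.

```python
import time

def modexp_square_and_multiply(base: int, exponent: int, modulus: int) -> int:
    """Textbook left-to-right square-and-multiply modular exponentiation."""
    result = 1
    for bit in bin(exponent)[2:]:
        result = (result * result) % modulus    # square on every key bit
        if bit == "1":
            result = (result * base) % modulus  # extra multiply only on 1-bits
    return result

def average_time(exponent: int, modulus: int, samples: int = 200) -> float:
    """Average wall-clock time of exponentiation with a fixed (secret) exponent."""
    start = time.perf_counter()
    for _ in range(samples):
        modexp_square_and_multiply(0xACE1, exponent, modulus)
    return (time.perf_counter() - start) / samples

modulus = 2**521 - 1        # arbitrary large modulus for the demo
sparse_key = 1 << 500       # 501-bit exponent with a single 1-bit
dense_key = (1 << 501) - 1  # 501-bit exponent with all bits set
print("sparse key:", average_time(sparse_key, modulus))
print("dense key: ", average_time(dense_key, modulus))
```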

Continued in Part II.