SAN FRANCISCO--Technology advances at a frighteningly fast rate, which is great for users, but the pace of change makes it ever more difficult for security technology to keep up.
Security is difficult to get right, and that challenge is made more daunting when the systems and devices change constantly. The task of figuring out how to defend a given system grows more complex by the day, something that even some of the pioneers of the security community struggle with.
“The most trustworthy computer I’ve ever owned had two floppy drives. When you were done with it, you powered it down and you could be reasonably sure that nothing foreign happened to it,” Paul Kocher, a cryptographer who helped develop the idea of differential power analysis attacks on cryptosystems, said during the cryptographers’ panel at the RSA Conference here Tuesday.
The same thing certainly can’t be said about today’s computing devices. Modern devices are rarely shut down completely and are subject to an ever-widening array of attacks, many of which were not even contemplated by the designers of software and hardware from just a couple of decades ago. Attacks always get better, and while computing devices and security have improved as well, it hasn’t been an even race. Many of the attacks that are prevalent today take advantage of the complexity of target systems, and complexity is usually the enemy of security.
“Thirty years ago, we had computers that we knew how they worked. That’s not true now. Who knows what any of these computers are doing?” said Whitfield Diffie, one of the pioneers of public-key cryptography.
Part of the problem, the panelists said, is that modern computing relies so much on interconnected systems distributed across the globe. Those systems are often owned and operated by people or organizations with which a given individual has no actual connection or relationship. That requires the individual to trust both the system and the operator of it, a requirement that isn’t really ideal for security.
“Trust isn’t necessarily the right word to use. It implies that I believe something that I haven’t actually verified for myself,” Kocher said. “We can never actually have complete trust in somebody across the internet whose objectives might be unknowable.”
During the panel, which also included Ron Rivest, one of the designers of the RSA algorithm, and Shafi Goldwasser of the Simons Institute for the Theory of Computing, the cryptographers also talked quite a bit about the push in various countries for backdoor access to encrypted communications and devices. Both the UK and Australia have legislation that mandates some form of law enforcement access to encrypted communications, whether through technical or judicial means, and officials from the FBI and other agencies in the United States have been pushing for similar measures.
But security experts in general, and cryptographers specifically, say any backdoor in an encrypted system, regardless of whether it’s intended for law enforcement use, not only weakens the system but also provides another target for attackers. There have been a handful of cases over the years of backdoors being found in cryptosystems, and intelligence agencies are known to have exploited at least some of them. Kocher said the idea of using legal means to force companies to weaken their own products is counterproductive.
“I think if anyone should be going to prison, it’s the developers who put back doors in their products without telling their managers or anyone else,” he said. “I don’t think Australia can do better than the NSA, so I don’t think this is going to end very well for any of us.”