Interview: Steve Grobman, McAfee CTO, shares his views on some burning security questions
Cyber security technology innovator and veteran Steve Grobman shares his views on adversarial artificial intelligence, post-quantum cryptography and security for next-gen tech
One of the few Intel Fellows, Steve Grobman moved to McAfee as its chief technology officer when it was spun out as a standalone pure-play security firm in 2017, bringing with him the knowledge and experience that enable an informed and incisive perspective on most matters relating to cyber security.
A year ago, he warned that bad actors are likely to use adversarial machine learning, which is the science of “confusing [machine] learning and artificial intelligence [AI]”.
“We need to understand how to have multiple machine learning capabilities work together in order to not allow our technologies to succumb to this type of adversarial innovation,” he said in October 2017.
But a year down the line, Grobman says it’s difficult to know to what extent this prediction has come true, despite attention from the security research community.
“We are still working on ways to detect this [type of activity] in the wild, so it is difficult to know exactly how much is actually going on, but we are trying to do a better job of being able to detect if bad actors are using it, and hopefully we will be able to do that shortly,” he told Computer Weekly.
Some security researchers believe the use of offensive AI is unlikely in the near future because skills in applying artificial intelligence are scarce, and those who have them will pursue legitimate, well-paying careers. Grobman, however, points out that the barriers to using the technology have come down.
“In the past, to use AI you had to be an expert in data science and have programming skills. But now we have open source packages like TensorFlow, where if you wanted to build a machine learning model, you could search online for a tutorial that will enable you to do that with literally 15 lines of Python code.”
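Grobman’s point is easy to demonstrate. As a hedged illustration, the sketch below trains a working TensorFlow model in roughly the 15 lines of Python he describes; the synthetic dataset and tiny architecture are assumptions invented for this example, not anything referenced by McAfee.

```python
# A minimal sketch of the kind of tutorial code described above: a working
# machine learning model in roughly 15 lines of Python with TensorFlow.
# The synthetic dataset and tiny architecture are illustrative assumptions.
import numpy as np
import tensorflow as tf

# Toy binary-classification data standing in for any real feature set
x_train = np.random.rand(1000, 20).astype("float32")
y_train = (x_train.sum(axis=1) > 10.0).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(20,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, batch_size=32, verbose=0)
print(model.predict(x_train[:3]))  # class probabilities for three samples
```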
Building AI capabilities
Grobman says this means it is relatively easy to build AI capabilities and start experimenting with them for things such as evading security controls. “I would disagree that all those with these capabilities will go to the good side. I think there will be some who are willing to accept that crime pays quite well.”
This is most likely to be the case in countries where the authorities are less likely to go after cyber criminals, says Grobman, and also depends on the type of crime.
“Ransomware, for example, is typically seen from the perpetrators’ perspective as a ‘victimless crime’, in that if the ransom is paid, the data is restored and no harm is done. They see the ransom payment simply as a transfer of wealth.”
Post-quantum cryptography is fast becoming a popular topic in the cyber security community because of concerns that much of the public key cryptography in use today to protect some of the world’s most sensitive data could be cracked by quantum computers when they become a reality.
“This is a topic that the general public is going to have a hard time understanding, but it is actually quite important that we move our protocols to quantum safe algorithms because when quantum computing for cryptanalysis becomes practical, data that is being encrypted today will be vulnerable to decryption,” he says.
While Grobman recognises that quantum computing is unlikely to be perfected for a couple of years, he is among those members of the cyber security community who believe that the industry needs to be working faster on finding ways to re-tool existing algorithms.
“Part of the challenge is going to be the fact that this process needs to involve governments, industry standards and potentially adversarial countries,” he says. “It is going to take some time, but it is time that we really do not have.”
Post-quantum encryption algorithms
The US National Institute of Standards and Technology (Nist) is currently evaluating post-quantum encryption algorithms submitted by cryptography experts and plans to publish a selection of the best in the next couple of years that will be used as industry standards for the post-quantum era.
“But part of the problem, in my mind, is that the current timeline works at a granularity of years. It will literally be a few years before potential post-quantum algorithms are identified, and we are acting serially: once the algorithms are identified, we will start looking at moving them into protocols, and eventually we will look at moving those protocols into products.”
This slow pace of progress is a cause for concern, according to Grobman, particularly in the light of the timeline for other big initiatives such as internet protocol version 6 (IPv6).
“The IPv6 specification was finalised 20 years ago, and it is still not widely implemented and used,” he says. “I think it is important to have more pragmatists involved in the standards process, because running this process just out of research and academia can result in elegant solutions that miss some of the practical challenges of deploying in real-world environments.”
Grobman believes it is important that the security industry starts thinking about the implementation of post-quantum cryptographic algorithms before we know what the algorithms are going to be.
“If we were building a new airplane and another company was designing the engine, we wouldn’t wait for them to finish the engine design before we start working on the rest of the plane. Instead, we would start building the rest of the plane and bring the two pieces of technology together as soon as they are ready, so we would end up with the finished airplane much faster.”
Read more about quantum computing
- Cryptographic agility is key to post-quantum security.
- Post-quantum security on Airbus’s radar.
- Enterprise data encryption: Preparing for a post-quantum future.
- The time to think about post-quantum cryptography is now.
- Why this quantum computing breakthrough is a security risk.
- Prepare now for quantum computers, QKD and post-quantum encryption.
In the case of post-quantum cryptography, Grobman says the industry should be looking at some of the top contenders among the submitted algorithms and identifying what paradigm changes would be needed in things like transport layer security (TLS), digital certificate management and other forms of encryption key management.
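Grobman does not prescribe an implementation, but one frequently discussed prerequisite for that kind of re-tooling is cryptographic agility: keeping the algorithm choice behind an interface so a post-quantum scheme can be dropped in once standards exist. The sketch below is a hypothetical illustration; the interface, registry and algorithm names are assumptions made for this example, and the key material is stubbed rather than real cryptography.

```python
# Hypothetical sketch of cryptographic agility: callers select a key-exchange
# algorithm by policy name instead of hard-coding one, so a post-quantum
# scheme can be swapped in later. All names here are illustrative, and the
# "key material" is random stub data, not real cryptography.
import os
from abc import ABC, abstractmethod

class KeyExchange(ABC):
    @abstractmethod
    def generate_keypair(self) -> tuple:
        """Return (private_key, public_key) as opaque bytes."""

    @abstractmethod
    def shared_secret(self, private_key: bytes, peer_public: bytes) -> bytes:
        """Derive the session secret from our key and the peer's public key."""

_REGISTRY = {}

def register(name):
    """Class decorator mapping a policy name to an implementation."""
    def wrap(cls):
        _REGISTRY[name] = cls
        return cls
    return wrap

def key_exchange_for(name: str) -> KeyExchange:
    # The algorithm is a configuration decision, not a code change.
    return _REGISTRY[name]()

@register("classical-ecdh")      # today's default; breakable by a quantum attacker
class ClassicalStub(KeyExchange):
    def generate_keypair(self):
        # Stub: a real build would call a vetted cryptographic library here
        return os.urandom(32), os.urandom(32)

    def shared_secret(self, private_key, peer_public):
        return os.urandom(32)    # stub only

@register("pq-candidate")        # placeholder slot for a future NIST selection
class PostQuantumStub(ClassicalStub):
    pass

kex = key_exchange_for("classical-ecdh")   # one string changes when standards land
priv, pub = kex.generate_keypair()
```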
Although Grobman agrees it “wouldn’t hurt” for organisations to increase the length of their encryption keys in the short term, he notes that even fairly long keys will be vulnerable to attack once practical quantum cryptanalysis is achieved, so increasing the key length may provide a “false sense of security”.
Given the doubtful benefit of increasing key lengths, he recommends instead that organisations plan for an “aggressive” re-tooling of the protocols ahead of post-quantum algorithms becoming available.
Reiterating that this process is likely to take several years, Grobman also emphasises the importance of finding algorithms that are not only quantum resistant, but also resistant to classic key cracking methods.
“The nuance of finding secure algorithms, as we have seen time and time again, is difficult,” he says. “MD5, RC4, triple DES: one after the other, vulnerabilities were found. But luckily AES [advanced encryption standard] has so far stood the test of time and is quantum resistant [because it is a symmetric key algorithm].
“It is the public key [asymmetric] cryptography algorithms, such as RSA and elliptic curve cryptography, that are of most concern and that are going to be the biggest challenge in the post-quantum era.”
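The distinction Grobman draws can be made concrete with back-of-the-envelope arithmetic: Grover’s algorithm gives a quantum attacker roughly a quadratic speedup against symmetric keys, halving their effective strength, while Shor’s algorithm recovers RSA and elliptic curve keys in polynomial time whatever their length. The sketch below is a rough illustration using the commonly cited NIST strength equivalences, not McAfee figures.

```python
# Back-of-the-envelope illustration of why longer asymmetric keys can give a
# "false sense of security". Grover's algorithm roughly halves the effective
# strength of symmetric keys; Shor's algorithm breaks RSA/ECC outright.
# Classical strengths below are the commonly cited NIST equivalences.
ALGORITHMS = [
    # (name, kind, approximate classical security in bits)
    ("AES-128",   "symmetric",  128),
    ("AES-256",   "symmetric",  256),
    ("RSA-2048",  "asymmetric", 112),
    ("RSA-3072",  "asymmetric", 128),
    ("ECC P-256", "asymmetric", 128),
]

def post_quantum_strength(kind: str, classical_bits: int) -> int:
    if kind == "symmetric":
        return classical_bits // 2   # Grover: quadratic search speedup
    return 0                         # Shor: polynomial-time key recovery

for name, kind, bits in ALGORITHMS:
    pq = post_quantum_strength(kind, bits)
    print(f"{name:9} classical ~{bits:3} bits -> post-quantum ~{pq:3} bits")
```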
Learning lessons
In developing new algorithms, Grobman says the lessons learned from vulnerabilities in algorithms in the past need to be part of the checklist.
“We are in a better position now to think about long-term computing implications because of past experience and, unlike those who came up with earlier encryption algorithms in a pre-internet era, we have an awareness and understanding of things like remote exploits and the need to protect data that is going all over the world, flowing through many points of potential interception.”
However, Grobman notes that almost every cyber security problem the world has seen stems from an assumption that turned out to be incorrect.
“If you look at the Spectre and Meltdown issues at the beginning of 2018, the assumption was that as long as a processor does not expose the result of a denied memory access to the operating system, there would be no way to leak the data. That assumption led to all sorts of optimisations that left microarchitectural remnants, which could ultimately be exploited in data-leaking side-channel attacks.
“Buffer overflows back in the ’90s are another classic example,” he says. “Nobody assumed that overwriting a data buffer would be an avenue that would enable the data it was overwritten with to be executed.”
“That’s why I love defining next-generation technology with a diverse set of perspectives, where engineers and architects can challenge assumptions and ask hard questions. If the only way you are implementing security is through the people who are actually coming up with the technology, it is likely to have elements that are overlooked, which is why it is important to have a rigorous review by multiple parties.”
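Spectre itself cannot be reproduced in a few lines, but the class of broken assumption Grobman describes can be illustrated with a much simpler, deliberately contrived side channel: a comparison routine that exits early leaks information through timing alone. The secret, guesses and measurement loop below are invented for this example, and real-world timing measurements are far noisier than this sketch suggests.

```python
# A deliberately simplified side-channel illustration (not Spectre itself):
# an early-exit comparison leaks how much of a guess matches through timing.
# The secret and the guesses here are invented for the example.
import time

SECRET = b"hunter2!"

def insecure_equals(a: bytes, b: bytes) -> bool:
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:       # early exit: runtime grows with the matching prefix
            return False
    return True

def time_guess(guess: bytes, reps: int = 200_000) -> float:
    start = time.perf_counter()
    for _ in range(reps):
        insecure_equals(guess, SECRET)
    return time.perf_counter() - start

# A longer matching prefix takes measurably longer, so an attacker can
# recover the secret byte by byte without ever seeing it directly.
for guess in (b"xxxxxxxx", b"hxxxxxxx", b"huntexxx"):
    print(guess, f"{time_guess(guess):.3f}s")
```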
Traditional computing
Returning to the topic of post-quantum encryption, Grobman says it is a flawed assumption that quantum computers will be necessary to protect against quantum cryptanalysis.
“We can understand the way that quantum could be used from a cryptanalysis perspective and then figure out how to build strong keys and strong algorithms that will be quantum resistant,” he says, adding that quantum computing will not necessarily replace traditional computing.
“Quantum computers will be very good at certain types of workloads, but very bad at many other types of workloads. It will be effective at solving some very specialised types of problems, with the closest analogy being the difference between CPUs [central processing units] and GPUs [graphics processing units].
“They are both important elements of computing, and GPGPUs [general purpose GPUs] are very good at certain workloads, but there are some workloads that run better on a high-performance CPU. Likewise, I expect the application of quantum computing to be very specialised.”