Abandon formal methods and treat PCs like lab rats, Daniel Gruss tells Black Hat Asia attendees
Recent years have seen a growing number of software-based attacks on increasingly complex hardware.
Treat this as the new normal, Daniel Gruss, a member of one of three teams that uncovered the Meltdown vulnerability, said during a keynote presentation on Friday at the Black Hat Asia security conference.
Gruss, an assistant professor at Graz University of Technology in Austria, explained that the complexity of a computer system built from smaller sub-systems exceeds the sum of the complexities of those sub-systems.
That’s because interactions between sub-systems create additional complexity, according to the micro-architecture security specialist.
“We are putting a layer upon layer that depend on each other and make assumptions about each other, but it’s not clear whether these assumptions hold,” according to Gruss.
The resulting complexity exceeds what can be analysed using formal methods or a mindset where the operation of computers is treated like a form of applied mathematics.
Computing systems are reaching biological levels of complexity, Daniel Gruss tells Black Hat Asia 2020
Evolve or die
Computer science must instead evolve to adopt methodologies derived from natural sciences, such as biology. This makes sense, in part, because the complexity of computer systems is already approaching that of a biological organism such as a mouse.
With humans now capable of replicating the intricacy of nature itself, computer scientists need to adopt a different approach, according to Gruss.
“Computers are artificial, but we have to study them like nature because they have exceeded the complexity of something that we can study another way,” he said.
Natural science methodology involves posing a question, forming a hypothesis, making a prediction based on that hypothesis, then testing the prediction and analysing the results.
The internet and associated technologies have changed what is required of a secure system. Ransomware, for instance, is a much bigger problem than might otherwise be the case because of the “network of networks”.
And, on a smaller scale, physical addresses in computer memory need to be treated as secret because of the ‘Rowhammer’ attack.
“The required security properties change over time,” according to Gruss.
Gruss argued that the number of vulnerabilities in closed and open source systems is roughly comparable. If open source systems had any advantage, it was in shorter patching times.
Rather than switching to different software development methodologies, Gruss saw more promise in returning to less complex systems. The generations of Intel cores that followed the Pentium 4, for example, were based on the less complex architecture of the Pentium 3.
Such changes offer only temporary respite from the push towards more complex systems, with developments such as smart cities and the IoT – to say nothing of increased data centre sophistication – driving increased complexity and power consumption.
As systems become ever more complex, more and more effort will have to be expended just to maintain current security levels, Gruss concluded.
Thirty years ago, the job of penetration tester didn’t really exist. In the future, we’ll need more security analysts doing a wider variety of jobs, he added.
Cache from chaos
Gruss was rather busy at Black Hat Asia, which this year took place entirely online as a virtual event.
On Thursday, the academic also took part in a team presentation entitled “Page Cache Attacks: Microarchitectural Attacks on Flawless Hardware”.
The talk covered how the operating system’s page cache can be exploited by attackers to spy on keystrokes, bypass security defences, and more.
Both the technical talk and (in particular) the keynote were peppered with memes and accompanied by a background laughter track which, if nothing else, made the presentations stand out from the crowd.