Last week, GCN’s CyberEye published “Attacks on open source call for better software design,” which hyperbolically declared 2014 an Annus Horribilis for open source in government. Their final recommendation is for everyone to focus on good software development practices and automatic static analysis. That’s fine, as far as it goes. The line of reasoning, unfortunately, leaves a number of tortured tropes and misconceptions about open source and security in its wake. That’s like catnip for me, so here we go.
We open with the recent unpleasantness at the Drupal project. The SQL injection vulnerability, while serious, isn’t unusual. It’s actually the most common vulnerability in the world. What made the exploit newsworthy was the very short amount of time between disclosure and widespread exploitation: “if timely patches weren’t applied, then the Drupal security team outlined a lengthy process required to restore a website to health.” Basically, you had seven hours to fix it before evil robots descended on your servers.
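Since SQL injection comes up so often, it's worth a quick illustration. This is a minimal, self-contained sketch (the table, data, and attacker input are invented for the example) showing why parameterized queries are the standard defense:

```python
import sqlite3

# Hypothetical user table in an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

attacker_input = "nobody' OR '1'='1"

# Vulnerable: string interpolation lets the input rewrite the query itself.
vulnerable = conn.execute(
    f"SELECT name FROM users WHERE name = '{attacker_input}'"
).fetchall()
print(vulnerable)  # returns every row, not just 'nobody'

# Safe: a parameterized query treats the input as data, never as SQL.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (attacker_input,)
).fetchall()
print(safe)  # returns nothing; no user is literally named that
```

The fix has been well understood for decades, which is exactly why its continued prevalence says more about software management than about any particular codebase.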
This isn’t an open source problem, it’s a software management problem.
Vulnerabilities are expected. WhiteHat Security says that 86% of sites, both open source and proprietary, have flaws like Drupal’s. Yes, we can hope for better developers and better engineering processes, and static analysis can help identify problems early, but it’s far more important to have plans and policies in place to quickly identify these problems, verify the remedy, and apply it.
In that context, GCN’s next claim is even sillier: “…open source is, by nature, open to the widest range of bad guys who could compromise it.” There’s a good reason this went unattributed. World-class security experts like Bruce Schneier have declared, many times, that it’s false on its face. GCN itself ran a piece on DHS’s work with Coverity on measuring defect densities in open source projects, which proves the point. As an argument, it is an exhausted busker, roused from its stupor to blearily tap-dance whenever open source is mentioned in the same breath as security. Anyway, how you feel about open source doesn’t affect the Drupal vulnerability one bit. What mattered was the speed of the response, not the origin of the flaw.
It’s worth spending a moment on the importance of commercial support in situations like this. Open source communities can do wonderful things, and the dynamics of their collaboration go a long way toward solving the collective action problem. The trick with security vulnerabilities, though, is that unlike a filesystem bug or a kernel panic, they cause no pain until they strike. This makes it very difficult for a project to identify and remediate problems unless they have skilled, vigilant developers looking for problems. The best and cheapest way to get that kind of attention is to pay someone. That’s what you’re buying when you use a commercially supported open source project.
Commercial support also means prompt delivery of the fix through trusted channels. One reason the Drupal problem was so gnarly is that the exploit covered its tracks by pretending to have fixed itself. So people would log into Drupal, check for the flaw and find nothing, even if they hadn’t applied the fix. Very nasty business. You can avoid this problem by using trusted package management tools and only applying fixes digitally signed by your provider.
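Real package managers verify asymmetric signatures with tooling like GPG or RPM, which is stronger than what fits in a short sketch. As a simplified stand-in for the underlying idea, here is how comparing a payload against a published digest catches tampering (the patch bytes and digest here are invented for illustration):

```python
import hashlib

def verify_digest(payload: bytes, expected_sha256: str) -> bool:
    """Return True only if the payload hashes to the published digest."""
    return hashlib.sha256(payload).hexdigest() == expected_sha256

# Hypothetical patch and the digest a trusted provider would publish.
patch = b"-- hypothetical patch contents --"
published = hashlib.sha256(patch).hexdigest()

print(verify_digest(patch, published))                 # genuine patch
print(verify_digest(patch + b"tampered", published))   # altered in transit
```

A bare hash only proves integrity; a digital signature adds proof of *who* published the fix, which is why the trusted channel matters as much as the checksum.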
The concept of secure software management practices isn’t new. In 2003, GAO told the House Subcommittee on Technology Information Policy that over 80% of known vulnerabilities are attributed to misconfiguration and missing patches. Today, we have tools like SCAP and DHS’ Continuous Diagnostics and Mitigation (CDM) program, which let agencies identify security risks on an ongoing basis. These risks can be prioritized, so staff can mitigate the biggest problems first. So we can give the government credit for doing the right thing.
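The prioritization step can be as simple as sorting findings by severity. A hypothetical sketch, using the real CVSS v2 base scores of the three vulnerabilities this article discusses (the hostnames and data structure are invented):

```python
# Hypothetical scan findings: each entry pairs a host with a known CVE
# and its CVSS v2 base score.
findings = [
    {"host": "web01", "cve": "CVE-2014-3704", "cvss": 7.5},   # Drupal SQLi
    {"host": "db02",  "cve": "CVE-2014-6271", "cvss": 10.0},  # Shellshock
    {"host": "app03", "cve": "CVE-2014-0160", "cvss": 5.0},   # Heartbleed
]

# Triage: worst first, so staff mitigate the biggest problems first.
for f in sorted(findings, key=lambda f: f["cvss"], reverse=True):
    print(f"{f['cvss']:>4}  {f['cve']}  {f['host']}")
```

Real CDM deployments layer asset criticality and exposure on top of raw scores, but the principle is the same: continuous measurement feeding an ordered work queue.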
Back to GCN’s argument: we’re now laden with the Drupal story and this silly claim about open source security. This lets GCN lump the whole business together with infamous vulnerabilities like Heartbleed and Shellshock, creating the impression of one big mess it calls an “Annus Horribilis.”
The history of the phrase “Annus Horribilis” is interesting. It’s actually a reference to John Dryden’s 1667 poem “Annus Mirabilis,” which is about the Great Fire of London in 1666. Dryden says that for all the suffering, 1666 was a “Year of Miracles” because the fire could have been much, much worse.
So it is with open source in 2014. We can see how effective the open source community has been in responding to these vulnerabilities, without diminishing the seriousness of the flaws themselves. There are, of course, more vulnerabilities to come: no matter how well-trained, no matter how carefully scrutinized, we will have software with bugs. What matters is that gainfully employed open source developers, as well as volunteers, quickly developed patches and that millions of people were able to fix the problem in a matter of hours. That’s a miracle.
Imagine a flaw like Heartbleed hiding in proprietary software. Will it be discovered? Will it be quickly patched? We don’t know, because the only people who can identify and fix the problem are employed by the company that wrote it. They can be smart, well-trained, highly skilled, and devoted to the best practices of software development, and they still couldn’t muster the number of eyeballs commanded by a high-functioning open source community. I’ll submit VMware’s response to Shellshock as evidence.
GCN ends its fever dream with the perils of software reuse, attributing lurking danger to the fact that 90% of software is assembled, rather than written. Let’s assume GCN isn’t suggesting that everyone should write stuff from scratch, since that’s even more dangerous. If the point is that it’s dangerous to rely on others, that’s actually true. You want software that you know has been tested, well-reviewed, and can be trusted to respond quickly to problems. Sometimes that’s a commercial vendor, sometimes that’s a thriving community. Either way, you want to perform some due diligence before you start downloading software off the Internet.
All’s well that ends well, since GCN takes the tangled mess of arguments and comes to the right conclusion. NIST’s SP 800-160 guidance on software development spends plenty of time talking about how software can be safely developed and managed. That includes 121 pages about trust, process, and continuous improvement – and not a single mention of open source.
Many, many thanks to David Egts and Shawn Wells, who both contributed to this.