The major patching efforts triggered by the Heartbleed, Shellshock and POODLE flaws last year highlight the impact of critical vulnerabilities in third-party code. The flaws affected software running on servers, desktop computers, mobile devices and hardware appliances, touching millions of consumers and businesses.
However, these highly publicised vulnerabilities were not isolated incidents. Similar flaws have been found in libraries such as OpenSSL, LibTIFF, libpng, OpenJPEG, FFmpeg, Libav and countless others, and these have made their way into thousands of products over the years.
Among the reasons why these bugs end up in finished products is a belief by developers that the third-party code they choose to integrate is secure because it has already been used by many others.
The shallow bugs myth
"There is a myth that open source is secure because everyone can review it; more eyes reviewing it making all bugs shallow," said Jake Kouns, CISO of Risk Based Security, a firm that specialises in tracking vulnerabilities. "The reality is that while everyone could look at the code, they don't and accountability for quality is deferred. Developers and companies that consume third party libraries are not allocating their own resources to security test 'someone else's code.' Right or wrong, everyone seems to think that someone else will find the vulnerabilities and what is published is secure."
In practice, many open source projects -- even those producing code critical to the Internet's infrastructure -- are poorly funded and understaffed, lacking the resources to pay for professional code audits or the manpower to undertake major rewrites of old code.
OpenSSL is a prominent example of such a case, but far from the only one. After the critical Heartbleed bug was announced in April, it was revealed that the OpenSSL project had only one full-time developer and that the project was being primarily funded through contract-based work that other team members did in their spare time for companies in need of SSL/TLS expertise.
The developers of OpenBSD criticised OpenSSL for maintaining old code for platforms that few people care about and decided to fork the project to create a cleaner version of the library dubbed LibreSSL.
The flaws in open source libraries are often the result of one or more of these reasons: old code or low code maturity, insufficient auditing or fuzzing -- a process of finding vulnerabilities by automatically feeding unexpected input to applications -- and too few maintainers, said Carsten Eiram, the chief research officer of Risk Based Security. "We see that many vulnerabilities being found in these libraries are by researchers simply running some of the latest fuzzers against them, so it's often something the maintainers or companies using said libraries could do themselves. Software vendors are quick to implement libraries into their products, but rarely audit or even fuzz these first or help maintaining them."
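Fuzzing, as described above, amounts to mutating valid inputs and watching the target for unexpected failures. The sketch below illustrates the idea with a hypothetical stand-in parser rather than a real library decoder; the `parse_image` function and the TIFF-like seed are illustrative assumptions, not any actual library's API.

```python
import random

# Hypothetical stand-in for a library entry point such as a TIFF decoder;
# a real harness would call the actual parser under test instead.
def parse_image(data: bytes) -> None:
    if len(data) < 4 or data[:2] not in (b"II", b"MM"):
        raise ValueError("not a TIFF header")
    # ... real decoding logic would go here ...

def mutate(seed: bytes, rng: random.Random) -> bytes:
    """Flip a handful of random bytes in a known-valid sample input."""
    data = bytearray(seed)
    for _ in range(rng.randint(1, 8)):
        data[rng.randrange(len(data))] = rng.randrange(256)
    return bytes(data)

def fuzz(seed: bytes, iterations: int = 1000) -> list:
    """Feed mutated inputs to the parser and record anything that fails
    with an unexpected exception type -- a potential bug."""
    rng = random.Random(1234)
    crashes = []
    for i in range(iterations):
        sample = mutate(seed, rng)
        try:
            parse_image(sample)
        except ValueError:
            pass  # expected rejection of malformed input
        except Exception as exc:  # anything else is worth investigating
            crashes.append((i, sample, exc))
    return crashes

if __name__ == "__main__":
    seed = b"II*\x00" + b"\x00" * 60  # minimal TIFF-like seed input
    print(f"{len(fuzz(seed))} unexpected failures")
```

Production fuzzers such as AFL or libFuzzer add coverage feedback and crash triage on top of this basic mutate-and-run loop, but the loop itself is the part Eiram suggests vendors could run against the libraries they ship.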
It's all marketing
The Heartbleed, Shellshock and POODLE vulnerabilities raised a lot of interest among software developers and system administrators, partly because of the attention the flaws received in the media. Some vendors are still identifying products affected by these flaws and are releasing fixes for them, months after they were first announced.
Eiram believes that the primary reason why these vulnerabilities stood out was not their impact, but the way in which they were advertised by their finders -- with fancy names and logos. The sad truth is that flaws of similar importance are regularly found in widespread libraries, but manage to fly under the radar and are rarely patched by the software vendors who use them.
"A lot of vulnerabilities -- 18 -- have been addressed in OpenSSL since Heartbleed, and we haven't remotely seen the same attention to issuing fixes quickly -- or at all -- from the vendors," Eiram said. "We see fixes of varying severity to libraries on almost a daily basis, but rarely see vendors bundling these libraries in their products issue fixes, even though we know these libraries are heavily used."
One example of that is a vulnerability discovered in 2006 by Tavis Ormandy, a security researcher who now works at Google. The flaw was among several that affected LibTIFF and were fixed in a new release at the time. It was tracked as CVE-2006-3459 in the Common Vulnerabilities and Exposures database.
"In 2010, a vulnerability was fixed in Adobe Reader, which turned out to be one of the vulnerabilities covered by CVE-2006-3459," Eiram said. "For four years, a vulnerable and outdated version of LibTIFF had been bundled with Adobe Reader, and it was even proven to be exploitable."
Adobe Systems has since become one of the software vendors taking the threat of flaws in third-party components seriously, Eiram said. "They've made major improvements to their process of tracking and addressing vulnerabilities in the third-party libraries and components used in their products."
Another one of those vendors is Google. Aside from just keeping track of vulnerabilities in the third-party code it uses, the company's researchers are actively searching for flaws in that code.
"We've seen two of Google's prolific researchers, Gynvael Coldwind and Mateusz Jurczyk, finding more than 1,000 issues in FFmpeg and Libav, which is used in Chrome, and they're currently looking at other libraries like FreeType too," Eiram said. "OpenJPEG also seems to receive some scrutiny by Google at the moment, which is used in PDFium that in turn is used in Chrome. Obviously, Google also put a lot of effort into securing WebKit, when they started using that as the engine for Chrome."
Making such contributions helps to improve the code maturity of those libraries for everyone, and is something that all software vendors should do.
If vendors at least fuzzed the libraries they use and helped fix the issues found in the process, it would make a significant difference, Eiram said -- much more so than the bug bounty programmes for critical Internet software, like those run by HackerOne or Google, which so far have done little to draw researchers toward finding vulnerabilities in libraries, he said.
An accurate bill of materials
Unfortunately we're a long way from that happening: many software developers fail to even keep track of which third-party components they use and where, let alone of the vulnerabilities later found and patched in those components.
Veracode, a security firm that runs a cloud-based vulnerability scanning service, found that third-party and open source components introduce an average of 24 known vulnerabilities into each application and that in the case of some enterprises, 40 percent of the applications they use have one or more critical vulnerabilities introduced by components.
"Most companies learned lessons from trying to patch Heartbleed and Shellshock," said Chris Eng, vice president of research at Veracode. "One of the challenges was that it involved not only patching servers, but patching vulnerable hardware and software products. Answering the question 'Which of my products rely on a vulnerable version of OpenSSL?' was difficult for many organisations due to lack of visibility into the composition of their software products."
"Having an accurate 'bill of materials,' so to speak, for all software projects is critical to the patching effort," Eng said. "This has always been true, but Heartbleed and Shellshock both amplified the issue thanks to the ubiquity of both OpenSSL and Bash."
For system administrators the situation is even more complicated, because they rely on software vendors for fixes, and vendors' responses to flaws in third-party components vary greatly -- from fairly quick to nonexistent.
"We do sense that the software industry has recognised the threat and is trying to deal with it -- at least many major companies -- but properly mapping libraries used in the code and tracking and triaging vulnerabilities in these requires significant resources," Eiram said.
Be prepared for more
If there's one thing that Heartbleed and Shellshock changed, it is that researchers now have a model for advertising the flaws they find so they reach a wider audience. Even though many in the security industry disagree with this approach, because it tends to hype the risks, it does seem to put pressure on vendors to act. It also attracts the attention of even more researchers, leading to increased scrutiny of some libraries, even if only for short periods.
"Researchers looking to find the most impactful vulnerabilities will naturally be drawn to software that is widespread and baked into a wide range of products," Eng said. "I think this will continue, because many researchers are motivated -- at least in part -- by the media attention that comes with discovering high-profile bugs."
From a business perspective, being forced to unexpectedly reallocate resources planned for other work in order to deal with flaws like Heartbleed -- identifying affected products and issuing patches -- is a burden for software vendors. If faced with highly publicised flaws on a regular basis, vendors may be pushed to adopt more proactive strategies: tracking, patching and even finding vulnerabilities in third-party components themselves as a matter of course.
"It does seem like more and more software companies are discussing the challenge of dealing with third party libraries and components, and have recognised how these are an Achilles heel," Eiram said. "The vast majority of them still need to implement a proper policy and approach for dealing with this challenge."
Companies should map which of their products contain third-party libraries, define policies for who may add such components to products and how, weigh a library's security track record -- by consulting the various vulnerability databases -- before adopting it, and either build an in-house vulnerability tracking solution or subscribe to a commercial one with strong library coverage.
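The core of such tracking is cross-referencing a product's bill of materials against a feed of known advisories. The sketch below shows the idea under heavy simplification: the component names, versions and advisory mappings are illustrative, versions are kept purely numeric (real schemes, such as OpenSSL's letter suffixes, need richer comparison), and a real system would query a live database such as the NVD rather than a hard-coded dictionary.

```python
# Illustrative "affected at or below this version" advisory data --
# not a real vulnerability database.
KNOWN_VULNS = {
    ("libtiff", "3.8.2"): "CVE-2006-3459",
    ("openssl", "1.0.1"): "CVE-2014-0160",  # Heartbleed
}

def parse_version(v: str) -> tuple:
    """Turn a dotted numeric version string into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def audit(bill_of_materials: dict) -> list:
    """Return advisories for any bundled component at or below a known
    vulnerable version. The manifest maps library name -> version used."""
    findings = []
    for (lib, vuln_version), advisory in KNOWN_VULNS.items():
        used = bill_of_materials.get(lib)
        if used and parse_version(used) <= parse_version(vuln_version):
            findings.append((lib, used, advisory))
    return findings

if __name__ == "__main__":
    # Hypothetical manifest for one product build
    bom = {"libtiff": "3.7.0", "openssl": "1.0.2", "zlib": "1.2.8"}
    for lib, version, advisory in audit(bom):
        print(f"{lib} {version} is affected by {advisory}")
```

Run against the sample manifest, the audit flags the outdated LibTIFF build -- exactly the kind of four-years-stale bundled copy Eiram described in the Adobe Reader case -- while the patched OpenSSL passes.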
"Longer term we may see a shift, but developers may still view Heartbleed and Shellshock as isolated events rather than a trend," Eng said. "On the other hand, automated solutions are now making it easier for enterprises to identify components with known vulnerabilities in their application portfolios, so I think we'll see the proactive approach becoming a best practice over time."
On the enterprise side, "Organisations need to recognise that bugs of the magnitude we've seen in 2014 will continue into 2015 and put the processes in place to quickly identify where they are vulnerable, have a sound procedure to prioritise the issues and efficient processes to patch systems when bugs are discovered to reduce risk of exploitation," said Gavin Millard, technical director for EMEA region at Tenable Network Security. "When the next 'Bug du Jour' hits, the rapid response to deal with it needs to be a well oiled, tested and efficient machine."