What’s in your code? Why you need a software bill of materials

Writing secure applications doesn't mean simply checking the code you've written to make sure there are no logic errors or coding mistakes. Attackers increasingly target vulnerabilities in third-party libraries, so you have to check the safety of all the dependencies and components, too.

In manufacturing, companies create a bill of materials, listing in detail all the items included when building a product so that buyers know exactly what they're buying. Processed food packaging, for example, typically tells you what's inside so that you can make an informed buying decision.

When it comes to software, untangling the code to know what libraries are in use and which dependencies exist is hard. It's a challenge most IT teams don't have the time or resources to unravel.

"You don't want to purchase spoiled food, buy a car with defective air bags, or have a relative receive a defective pacemaker," says Derek Weeks, vice president and devops advocate at Sonatype, a software supply chain automation provider. Yet we surprisingly don't demand the same rules for software.

Tell me what's inside

At the very least, a software bill of materials should describe the components included in the application, the version and build of the components in use, and the license types for each component.
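
As a rough sketch (not any particular SBOM standard such as SPDX or CycloneDX), a minimal bill-of-materials record covering those three pieces of information might look like the following; the type and field names are illustrative assumptions, not an established schema.

```python
from dataclasses import dataclass

@dataclass
class SbomComponent:
    """One line item in a software bill of materials (illustrative only)."""
    name: str      # component name, e.g. "openssl"
    version: str   # exact version in use, e.g. "1.0.1f"
    build: str     # build identifier or commit hash, if known
    license: str   # license type, e.g. "Apache-2.0"

@dataclass
class SoftwareBillOfMaterials:
    """The full list of components shipped with one application."""
    application: str
    components: list[SbomComponent]
```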

To take one example, IT administrators would have had a far easier time back in April 2014 when the Heartbleed vulnerability was initially disclosed if they'd had a bill of materials on hand for every application running in their environment. Instead of testing every application to determine whether OpenSSL was included, IT could have checked the list and known right away which ones depended on the vulnerable version and needed action.
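
A minimal sketch of that kind of spot check, reusing the illustrative SoftwareBillOfMaterials type above: given the bills of materials for every application in the environment, flag the ones that list a Heartbleed-affected OpenSSL build.

```python
# OpenSSL versions affected by Heartbleed (CVE-2014-0160): 1.0.1 through 1.0.1f.
VULNERABLE_OPENSSL = {"1.0.1", "1.0.1a", "1.0.1b", "1.0.1c", "1.0.1d", "1.0.1e", "1.0.1f"}

def apps_needing_action(sboms: list[SoftwareBillOfMaterials]) -> list[str]:
    """Return the applications whose bill of materials lists a vulnerable OpenSSL."""
    affected = []
    for sbom in sboms:
        for component in sbom.components:
            if component.name == "openssl" and component.version in VULNERABLE_OPENSSL:
                affected.append(sbom.application)
                break
    return affected
```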

Other nice-to-have information includes the locations in the source code where each component is called, the list of tools used to build the application, and the relevant build scripts.

Today's developers rely heavily on open source and other third-party components, and an estimated 80 to 90 percent of an application may consist of code written by someone else. According to statistics collected by Sonatype, the average application has 106 components. It doesn't matter that a problem lies in one of those components rather than in code written in-house: the organization is responsible for the entire software supply chain and is on the hook if a vulnerability in a library results in a security incident.

Black boxes

When organizations buy software -- either commercial or open source -- they have only limited visibility into which components are in use. Especially diligent teams may look at the code to see which libraries are included, but libraries can call other components, and those dependency chains easily go more than two levels deep.

"People aren't even sure what they're using, especially when libraries call other libraries that they don't even know about," says Mark Curphey, CEO of software security company Sourceclear.

As many as one in 16 components used by development teams has a known security defect, according to Sonatype's 2016 State of the Software Supply Chain report. It's the equivalent of being told 6 percent of the parts used in building a car were defective, but nobody knew which part or who supplied it, Weeks says. A car owner would not accept that answer, nor should software owners.

Some software buyers are taking a stand. Both Exxon and the Mayo Clinic, for example, require software suppliers to provide a software bill of materials in order to discover potential security and licensing problems or whether the application is using an outdated version of a library.

When such problems are found, an administrator can ask the supplier to rebuild the application with the newer version. While waiting for the updated software, IT has the opportunity to put in temporary mitigations to protect the application from attackers looking to exploit the vulnerability. A software bill of materials also helps administrators perform spot checks of applications and code whenever a vulnerability is disclosed or a core library, such as OpenSSL, releases a new version.

The fact that a component has no known bugs at the moment is not an argument for its safety. Some components may be at the latest available version yet several years old. If administrators and developers have the right information, they can decide whether they want to risk using an application containing an old, possibly unsupported, component.
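
One way to surface that risk, sketched below under the assumption that each component record carries its latest release date (a field not every bill of materials will have): flag anything whose most recent release is older than a chosen threshold, even with no CVE filed against it.

```python
from datetime import date

def stale_components(components, today=None, max_age_years=3):
    """Yield (name, version, age_in_years) for components whose latest release is old."""
    today = today or date.today()
    for c in components:
        age_years = (today - c["released"]).days / 365.25
        if age_years > max_age_years:
            yield c["name"], c["version"], round(age_years, 1)

# Example: a component at its "latest" version that has not seen a release since 2012.
components = [{"name": "legacy-parser", "version": "2.4", "released": date(2012, 6, 1)}]
for name, version, age in stale_components(components):
    print(f"{name} {version} last released {age} years ago -- still supported?")
```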

Similar, but different programs

Understanding what components are being used is not only an open source software problem. Several efforts are underway to establish certification and testing laboratories focused on the security of software. Unlike the bill of materials, which helps software owners stay on top of maintenance and updates, these efforts focus on assisting buyers with the purchase decisions.

Underwriters Laboratories rolled out its voluntary Cybersecurity Assurance Program (UL CAP) earlier this year for internet of things and critical infrastructure vendors to assess the security vulnerabilities and weaknesses in their products against a set of security standards. UL CAP can be used as a procurement tool for buyers of critical infrastructure and IoT equipment. ICSA Labs has a similar IoT Certification Testing program that tests IoT devices on how they handle alerting/logging, cryptography, authentication, communications, physical security, and platform security. An ICSA Labs certification means that the product underwent a testing program and that the vulnerabilities and weaknesses found were fixed.

The Online Trust Alliance has an IoT Trust Framework, a set of specifications IoT manufacturers should follow in order to build security and privacy -- such as unique passwords, encrypted traffic, and patching mechanisms -- into their connected devices. The framework will eventually become a global certification program, but for the moment it serves more as guidance on what to do correctly.

At this year's Black Hat conference, Peiter Zatko, the famous hacker known as Mudge, and Sarah Zatko unveiled a Consumer Reports-style ratings system, Cyber Independent Testing Lab, to measure the relative security and difficulty of exploitation for various applications. CITL's methodology includes looking for known bad functions and how often the application uses them, as well as comparing how frequently good functions are called as opposed to the bad ones.
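
A toy version of that kind of ratio, assuming you can already extract the list of library functions an application calls (from symbols or disassembly); the "bad" and "good" sets here are a common illustrative pairing of unbounded C string functions and their bounds-checked counterparts, not CITL's actual lists.

```python
BAD_FUNCTIONS = {"strcpy", "strcat", "sprintf", "gets"}        # historically risky calls
GOOD_FUNCTIONS = {"strncpy", "strncat", "snprintf", "fgets"}   # bounds-checked variants

def hygiene_ratio(called_functions: list[str]) -> float:
    """Fraction of relevant calls that use the safer variants (1.0 is best)."""
    bad = sum(1 for f in called_functions if f in BAD_FUNCTIONS)
    good = sum(1 for f in called_functions if f in GOOD_FUNCTIONS)
    total = bad + good
    return good / total if total else 1.0

print(hygiene_ratio(["strcpy", "snprintf", "fgets", "gets"]))  # 0.5
```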

"We as security practitioners tend to focus on exploitability, but as a consumer of a product, they're almost always going to say disruptability is what bothers them," Zatko said during his presentation. The plan is to release large-scale fuzzing results by the end of 2017.

Track ingredients for better security

Attackers have shifted their focus upstream to the components because targeting a library vulnerability gives them more victims than focusing on a single application. The Java deserialization flaw in Apache Commons Collections is a good example of how such flaws can be missed. An administrator may think there's nothing to worry about because the organization doesn't use JBoss, not realizing that another application they rely on may be using the vulnerable collections code and is therefore susceptible.

A software bill of materials helps administrators gain visibility into the components used in applications and discover potential security and licensing problems. More important, administrators can use the list to spot-check applications and code from suppliers to obtain an accurate view of potential vulnerabilities and weaknesses, as well as roll out patches in a timely manner.

