November 14, 2022 By Steven Weaver 4 min read

IBM Cloud Continuous Delivery is now introducing automated support for SLSA Level 1.

In the past few years, security hacks against enterprise software applications have made news worldwide, with exploits like those that impacted SolarWinds becoming widespread and notorious.

In fact, in 2021, more than three in five companies were targeted by software supply chain attacks, according to a recent survey. Additionally, a new global study of 1,000 CIOs found that 82% say their organizations are vulnerable to cyberattacks targeting software supply chains.

Any software can introduce vulnerabilities into a supply chain at any step of the process, including third-party and open-source packages:

Figure 1: Supply chain attack vectors.

As a system grows more complex, it’s critical to have checks and best practices already in place that guarantee artifact integrity and ensure the source code you’re relying on is the code you’re actually using. Without solid foundations and a plan for the system as it grows, it’s difficult to focus your efforts against tomorrow’s hack, breach or compromise.

Emerging standards

As software supply chain attacks become more prevalent, organizations across the world have started to identify best practices to counter security breaches and have established guidelines and controls that can be put in place to address attacks.

IBM is a leading contributor, for example, to the Open Source Security Foundation, but government agencies like the National Institute of Standards and Technology (NIST) and open source foundations like SLSA (Supply-Chain Levels for Software Artifacts) are also publishing guidance on practices and procedures to address security vulnerabilities.

What is SLSA?

Supply-Chain Levels for Software Artifacts (SLSA)—pronounced “salsa”—is a security framework. SLSA is a checklist of standards and controls to prevent tampering, improve integrity and secure packages and infrastructure in your projects, businesses or enterprises. SLSA takes you from “safe enough” to being as resilient as possible, at any link in the chain. It is organized into a series of levels that provide increasing integrity guarantees. This gives you confidence that software hasn’t been tampered with and can be securely traced back to its source.

Figure 2: SLSA levels.

IBM Cloud Continuous Delivery now enables SLSA Level 1 support

Since early last year, IBM Cloud Continuous Delivery has provided a reference implementation of NIST Configuration Management controls as a service that you can configure in a few clicks by using toolchain templates. The workflows for Continuous Integration, Continuous Deployment and Continuous Compliance build, scan, test and deploy your cloud-native applications while ensuring security and compliance goals are met and evidence is retained for any future audits. The workflows can be customized to leverage other enterprise tools or implement custom policies.

IBM Cloud Continuous Delivery is now introducing automated support for SLSA Level 1. To achieve SLSA Level 1, the build process must be fully scripted and automated, and it must generate provenance. Provenance is metadata about how an artifact was built, including the build process, top-level source and dependencies. Knowing the provenance allows software consumers to make risk-based security decisions. Provenance at SLSA Level 1 does not protect against tampering, but it offers a basic level of code source identification and can aid in vulnerability management.
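To make the idea of provenance concrete, here is a minimal sketch of a SLSA v0.2 provenance statement assembled by hand. The field names follow the public in-toto Statement and SLSA provenance specifications; the builder ID, build type and artifact name are hypothetical placeholders, not values produced by IBM Cloud Continuous Delivery.

```python
# Sketch: build a minimal SLSA v0.2 provenance statement for one artifact.
# The builder id, buildType and artifact name below are illustrative only.
import hashlib
import json

def provenance_statement(artifact_name: str, artifact_bytes: bytes) -> dict:
    """Return a minimal in-toto statement carrying SLSA provenance."""
    digest = hashlib.sha256(artifact_bytes).hexdigest()
    return {
        "_type": "https://in-toto.io/Statement/v0.1",
        "subject": [{"name": artifact_name, "digest": {"sha256": digest}}],
        "predicateType": "https://slsa.dev/provenance/v0.2",
        "predicate": {
            "builder": {"id": "https://example.com/my-build-system"},   # hypothetical
            "buildType": "https://example.com/tekton-pipeline-run",     # hypothetical
            "materials": [],  # top-level source and dependencies are listed here
        },
    }

stmt = provenance_statement("registry.example.com/app:1.0", b"image bytes")
print(json.dumps(stmt, indent=2))
```

Even at Level 1, a consumer can verify that the `subject` digest matches the artifact they downloaded, which is the basic source-identification benefit described above.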

Implementing SLSA Level 1

Figure 3: SLSA provenance.

A software system is made up of inputs that execute steps to deliver a resulting artifact (e.g., a container image or a module). Inputs typically come from two sources:

  1. System inputs, such as the build environment, including the worker system and the system configuration, which are typically generated automatically by the build system.
  2. Inputs that come from external sources, such as the build definition source (repositories or configuration providers), triggering inputs (event payloads), user-defined parameters or other materials like container images or modules used in the build.

These inputs are evaluated to produce the build configuration (buildConfig), whose instructions are then executed to create the subject (the resulting artifact). Together, these inputs and outputs constitute the provenance of the build and should be captured in a secure evidence repository.
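The mapping from the two input categories above into a provenance document can be sketched as follows. This is an illustrative SLSA v0.2 predicate, not the exact schema emitted by IBM Cloud Continuous Delivery; all URIs, digests and step strings are hypothetical placeholders.

```python
# Sketch: where each input category lands in a SLSA v0.2 provenance predicate.
# System inputs go under invocation.environment; external inputs go under
# invocation.configSource / invocation.parameters and materials; the evaluated
# build instructions go under buildConfig. All values here are placeholders.
predicate = {
    "invocation": {
        # 2. external inputs: build definition source and triggering inputs
        "configSource": {
            "uri": "git+https://example.com/org/pipeline-definitions",  # hypothetical
            "digest": {"sha1": "0" * 40},                               # placeholder
            "entryPoint": "build-and-push",
        },
        "parameters": {"image-tag": "1.0"},  # user-defined parameters
        # 1. system inputs generated automatically by the build system
        "environment": {"worker": "private-worker-01", "arch": "amd64"},
    },
    # build instructions evaluated from the inputs
    "buildConfig": {"steps": ["clone source", "build image", "push image"]},
    # other external materials consumed by the build
    "materials": [
        {"uri": "git+https://example.com/org/app-source", "digest": {"sha1": "0" * 40}},
    ],
}
print(sorted(predicate))
```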

In IBM Cloud Continuous Delivery, the build configuration comes from Tekton pipeline run information and is provided by the Private Worker Agent. Workers are the entities that run the tasks in delivery pipelines. For more information on private workers, see Working with Delivery Pipeline Private Workers.

Materials and subjects are injected by the user. 

To generate a provenance document in a Continuous Delivery Tekton pipeline run, slsa-provenance-statement attestations are emitted into the pipeline step logs. The resulting Provenance Record is made available from the Download link of a pipeline run. To obtain provenance documents, download the pipeline run from the Actions menu on the Pipeline run details page. The provenance documents are part of the returned .zip file. For additional examples, check the documentation.
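Once the .zip file is downloaded, the provenance documents inside it can be pulled out programmatically. The sketch below assumes the provenance entries are JSON files whose names contain “provenance”; the exact filenames in the archive are not specified here, so check the downloaded archive and the documentation and adjust the filter accordingly.

```python
# Hedged sketch: extract provenance documents from a downloaded pipeline-run
# .zip. The "provenance" name filter is an assumption, not a documented
# IBM Cloud filename convention.
import io
import json
import zipfile

def extract_provenance(zip_bytes: bytes) -> list:
    """Return every JSON document in the archive whose name mentions provenance."""
    docs = []
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        for name in zf.namelist():
            if "provenance" in name.lower():
                docs.append(json.loads(zf.read(name)))
    return docs

# Usage: a stand-in archive shows the flow end to end.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr(
        "slsa-provenance-statement.json",
        json.dumps({"predicateType": "https://slsa.dev/provenance/v0.2"}),
    )
docs = extract_provenance(buf.getvalue())
print(len(docs))  # 1
```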

Next steps

As securing the software supply chain becomes more critical for organizations delivering enterprise applications, IBM continues to provide additional capabilities in the DevSecOps toolchain templates available on IBM Cloud.

Get started for free.

If you’d like to share your feedback with us, you can reach out to the IBM Cloud Continuous Delivery development team by joining us on Slack.
