October 10, 2019 By Jerh O'Connor 3 min read

We are delighted to announce the addition of the industry-leading Tekton capability to IBM Cloud Continuous Delivery pipelines.

With this new type of Continuous Delivery Pipeline, you define your pipeline as code using Tekton resource YAML stored in a Git repository. This lets you version and share your pipeline definitions while allowing you to configure, run, and view pipeline output in the familiar IBM Cloud Continuous Delivery DevOps experience.
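
To give a flavor of what "pipeline as code" looks like, here is a minimal sketch of a Tekton task and pipeline that you might keep in your Git repository. The names, image, and API version are illustrative and depend on the Tekton release you target.

```yaml
# Minimal, illustrative Tekton definitions (names and API version are
# examples only; adjust them to the Tekton release your pipeline uses).
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: say-hello
spec:
  steps:
    - name: echo
      image: alpine
      script: |
        echo "Hello from Tekton"
---
apiVersion: tekton.dev/v1beta1
kind: Pipeline
metadata:
  name: hello-pipeline
spec:
  tasks:
    - name: hello
      taskRef:
        name: say-hello
```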

Tekton Pipelines is an open source project, born out of Knative Build, that you can use to configure and run continuous integration/continuous delivery (CI/CD) pipelines within a Kubernetes cluster.

Tekton Pipelines are cloud native and run on Kubernetes using custom resource definitions specialized for executing CI/CD workflows. The Tekton Pipelines project is new and evolving and has support and active commitment from leading technology companies, including IBM, Red Hat, Google, and CloudBees. 
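
Because everything is expressed as Kubernetes resources, a run of the pipeline is itself just another resource. For example, a minimal PipelineRun (again, an illustrative sketch) that executes the pipeline above could look like this:

```yaml
# Creating this resource on the cluster (for example, with kubectl apply)
# asks Tekton to execute the referenced pipeline.
apiVersion: tekton.dev/v1beta1
kind: PipelineRun
metadata:
  name: hello-pipeline-run-1
spec:
  pipelineRef:
    name: hello-pipeline
```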

For a closer look at Tekton, see our video “What is Tekton?”

In addition to the benefits of Tekton, this new capability in IBM Continuous Delivery provides the following unique features:

  • A dashboard to easily view in-flight and completed Pipeline Run information
  • Manual and Git triggering support
  • Integration with the rich Open Toolchain ecosystem of tool integrations

How to get started

Getting started with CD Tekton-enabled pipelines requires a little setup. You will need the following:

  • An IBM Cloud Continuous Delivery toolchain that includes a Tekton Pipeline
  • A Git repository containing your Tekton pipeline definition YAML
  • A Delivery Pipeline Private Worker, running on your own Kubernetes cluster, to execute the pipeline

Once you have these pieces in place, you need to configure the Tekton Pipeline before you can run it.

Click the Tekton Pipeline tile on your toolchain dashboard to open the configuration screen, where notifications show what remains to be configured before you can use this new pipeline technology.

Select the required values in the Definitions tab, select your private worker in the Worker tab, and click Save.

You now need to set some trigger mappings via the Triggers tab so you can run the pipeline.

The simplest trigger is a manual trigger, where you kick off a run yourself from the dashboard. You can also create Git triggers that fire on Git push and pull request events as needed.
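
For context, the open source Tekton Triggers project models this kind of event-driven run with resources such as a TriggerTemplate, which turns incoming event parameters into a new PipelineRun. The sketch below is purely illustrative of that concept (API version and names are assumptions), not a description of how the managed service wires up its Git triggers internally:

```yaml
# Illustrative TriggerTemplate: an incoming Git event supplies a revision
# that is templated into a new PipelineRun. Assumes the pipeline declares
# a matching "revision" parameter.
apiVersion: triggers.tekton.dev/v1beta1
kind: TriggerTemplate
metadata:
  name: git-push-template
spec:
  params:
    - name: revision
      description: The Git commit to build
  resourcetemplates:
    - apiVersion: tekton.dev/v1beta1
      kind: PipelineRun
      metadata:
        generateName: hello-pipeline-run-
      spec:
        pipelineRef:
          name: hello-pipeline
        params:
          - name: revision
            value: $(tt.params.revision)
```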

Via the Environment Properties tab, CD Tekton-enabled pipelines also let you externalize reusable properties and give you a simple, secure way to store sensitive information such as API keys.
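
One common pattern is for such a property to surface in your definitions as a pipeline or task parameter. As a purely hypothetical sketch (the property name, image, and task are illustrative rather than taken from the product documentation), a task consuming an API key could look like this:

```yaml
# Hypothetical task consuming an externalized "apikey" property as a
# parameter. Avoid echoing secret values to the build log.
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: ibmcloud-login
spec:
  params:
    - name: apikey
      description: IBM Cloud API key used to log in
  steps:
    - name: login
      image: ibmcom/pipeline-base-image   # illustrative image that includes the ibmcloud CLI
      script: |
        ibmcloud login --apikey "$(params.apikey)" -r us-south
```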

Once setup is complete, the new Tekton Dashboard automatically updates to show information on in-flight and completed Pipeline Runs, including their status, and lets you cancel or delete runs.

In all cases, you can dig deeper into the logs and see the output of each Tekton Task and Step in the selected Pipeline run.

Caution: The Tekton Pipelines project is still in alpha and being actively developed. It is very likely that you will need to make changes to your pipeline definitions as the project continues to evolve.

Watch the demo

Watch this short companion video for a demo of using our sample Tekton pipeline in IBM Cloud.

Contact

Now you can try it out yourself. For more information, see our documentation. If you have questions, engage our team via Slack: register here and join the discussion in the #ask-your-question channel on our public Cloud DevOps @ IBM Slack.

We’d love to hear your feedback.

Learn more about Tekton by reading “Tekton: A Modern Approach to Continuous Delivery.”

 
