March 25, 2020 By Powell Quiring 3 min read

As a developer or administrator, learn how to use Tekton pipelines for CI/CD on IBM Cloud.

IBM has been providing tools for DevOps and CI/CD for decades. One component of IBM Cloud DevOps is the Delivery Pipeline, which has recently been expanded to include Tekton. The open source Tekton Pipelines project is new and evolving, with support and active commitment from leading technology companies, including IBM, Red Hat, Google, and CloudBees.

See the following video for an overview of Tekton:

This multi-part tutorial will guide you through using this interesting technology. For a great overview, check out “Build and Deliver Using Tekton-Enabled Pipelines.”

Before you begin

Navigate to the tekton-toolchain GitHub repository and make a fork. I will use mine (https://github.com/powellquiring/tekton-toolchain) during the tutorial. When you see mine referenced, substitute your fork.

Note: If you see warnings or errors that the Continuous Delivery service is required, you can either live with the warnings or create a Continuous Delivery service (Lite plan) for the region.

Pipelines of pipelines of pipelines…

Warning: the word pipeline is overloaded. A Delivery Pipeline is an IBM tool in the IBM Toolchain. The Tekton Pipeline is the open source project, and one of the Tekton Pipeline resources is named Pipeline. (Wow.)

Configure a simple Tekton Pipeline

This first blog post will explain how to configure a simple Tekton Pipeline. It will be added to a Delivery Pipeline in a DevOps Toolchain.

Note: Public workers are only available in the Dallas region.

Visit the IBM Cloud in your browser. In the hamburger menu in the upper left, choose DevOps.

  • Verify that Toolchains is the selected panel on the left:
    • Set the Resource Group as desired (default is fine).
    • Leave Cloud Foundry Org unselected.
    • Set the Location to Dallas (Dallas is the only region with public workers).
  • Click Create a toolchain and then Build your own toolchain. Verify:
    • Select Region: Dallas
    • Select a resource group: default (or your desired resource group)
  • Click Add tool and choose GitHub; select your fork of the tekton-toolchain repository.
  • Click Add tool again and choose Delivery Pipeline:
    • Pipeline name: whatever you like
    • Pipeline type: Tekton

As you can see, a Toolchain is a landing page that holds integrated tools. You have integrated two tools: GitHub and a Delivery Pipeline.

Click the Delivery Pipeline to open it.

  • The Definitions panel on the left is selected. Definitions identify the set of Tekton files that contribute to this Delivery Pipeline. Add a definition for the lab1-simple path in your fork.
  • Click the Worker panel. The default works great; for me it was: (Beta) IBM Managed workers in DALLAS
  • Click the Triggers panel:
    • Add trigger
    • Manual Trigger
    • EventListener: the-listener
  • Save and then Close, and you are back to the Delivery Pipeline. Click Run Pipeline > Manual Trigger.

This demonstrates some cool stuff. All of the Tekton files in the GitHub lab1-simple path were added to the Delivery Pipeline. In this case, it was just the tekton.yaml file, but all files with a .yaml or .yml extension are read by the Delivery Pipeline.

Open the lab1-simple/tekton.yaml file. The name of the trigger in the drop-down menu is the same as the name of the EventListener:

apiVersion: tekton.dev/v1alpha1
kind: EventListener
metadata:
  name: the-listener
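
How does the listener start the pipeline? In Tekton Triggers, an EventListener normally points at a TriggerTemplate that creates a PipelineRun. A minimal sketch of that wiring is shown below; the names the-template and the-pipeline-run are illustrative and may differ from what is actually in lab1-simple/tekton.yaml:

apiVersion: tekton.dev/v1alpha1
kind: TriggerTemplate
metadata:
  name: the-template           # hypothetical name, for illustration only
spec:
  resourcetemplates:
    - apiVersion: tekton.dev/v1alpha1
      kind: PipelineRun
      metadata:
        name: the-pipeline-run # hypothetical name
      spec:
        pipelineRef:
          name: pipeline       # the Pipeline defined in the same file
---
apiVersion: tekton.dev/v1alpha1
kind: EventListener
metadata:
  name: the-listener
spec:
  triggers:
    - template:
        name: the-template     # run the TriggerTemplate above when fired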

Click the log entry that was created and you will see xx1, which matches the task name in the pipeline:

apiVersion: tekton.dev/v1alpha1
kind: Pipeline
metadata:
  name: pipeline
spec:
  tasks:
    - name: xx1
      taskRef:
        name: the-task # references the Task defined below

You can see the task steps that were executed. No surprises. The steps are commands executed in the associated container and give you a feel for the environment. The ubuntu container image was pulled from Docker Hub (hub.docker.com).

apiVersion: tekton.dev/v1alpha1
kind: Task
metadata:
  name: the-task
spec:
  steps:
    - name: echo
      image: ubuntu
      command:
        - echo
      args:
        - "01 version"
    - name: lslslash
      image: ubuntu
      command:
        - ls
      args: # inferred from the step name; the repository file may differ
        - "-l"
        - "/"

Note that the last few steps show that the file system is preserved between steps. Experiment some more to verify that the working directory is not maintained across tasks.
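
If you want to try this yourself, a task like the following (hypothetical, not part of lab1-simple) writes a file in one step and reads it back in the next. Because all steps of a task share the /workspace file system, the second step succeeds:

apiVersion: tekton.dev/v1alpha1
kind: Task
metadata:
  name: filesystem-check # hypothetical task for experimentation
spec:
  steps:
    - name: write
      image: ubuntu
      command:
        - bash
      args:
        - -c
        - echo hello from the first step > /workspace/note.txt
    - name: read
      image: ubuntu
      command:
        - cat
      args:
        - /workspace/note.txt # the file written by the previous step is still there

Run the same cat step in a different task and the file is not there, because each task runs in its own pod with its own file system.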

Learn more

There is a vibrant, open source community working on the Tekton Pipeline project, and this is a great chance to join the fun. Tekton has been integrated into the IBM Cloud DevOps environment, and you can leverage IBM Cloud so that you can focus on your business instead of your infrastructure.

This is Part 1 of a multi-part tutorial series—more Tekton posts are coming to explain topics like variables, workspaces, and sharing.

Report a problem or ask for help

Get help fast directly from the IBM Cloud development teams by joining us on Slack.
