April 21, 2020 By Powell Quiring 5 min read

As a developer or administrator, learn how to use Tekton pipelines for CI/CD in IBM Cloud.

IBM has been providing tools for DevOps and CI/CD for decades. One component of IBM Cloud DevOps is the Delivery Pipeline, which has recently been expanded to include Tekton pipelines. The Tekton Pipelines project is new, open source, and still evolving. The project has support and active commitment from leading technology companies, including IBM, Red Hat, Google, and CloudBees.

This is part two of a multi-part tutorial that will guide you through the Tekton Pipelines technology. See Part 1 of this series for background.

Before you begin

This second part of the multi-part series focuses on parameters and Secrets. If you have not already done so, step through the initialization portion of the previous post to create the Tekton Delivery Pipeline. In addition, I’m using my GitHub fork, powellquiring, but you should use your own fork.

Configure the Tekton Pipeline

  • Open the DevOps Toolchain resource created in Part 1.
  • Open the Delivery Pipeline.
  • Open Configure Pipeline.
  • Select the Definitions panel and edit it to resemble the following. If you are continuing from the previous post, simply replace the Path with lab2-parameters:
  • Select the Triggers panel and add manual triggers for all EventListeners. Name each trigger the same as the EventListener name. This will result in the following:
    • task-default-variable
    • pipeline-supplied-variable
    • user-defined-variable
    • user-defined-secret-variable

Click Close to return to the Delivery Pipeline dashboard.

Define a parameter for a Task step

The Task below has a parameter specification that declares var with a default value of VALUE:

apiVersion: tekton.dev/v1alpha1
kind: Task
metadata:
  name: the-var-task
spec:
  inputs:
    params:
      - name: var
        description: var example in task
        default: VALUE
  steps:
    - name: echoenvvar
      image: ubuntu
      env:
        - name: VAR
          value: $(inputs.params.var)
      command:
        - "/bin/bash"
      args:
        - "-c"
        - "echo 01 lab2 env VAR: $VAR"
    - name: echovar
      image: ubuntu
      command:
        - /bin/echo
      args:
        - $(inputs.params.var)
    - name: shellscript
      image: ubuntu
      env:
        - name: VAR
          value: $(inputs.params.var)
      command: ["/bin/bash", "-c"]
      args:
        - |
          echo this looks just like a shell script and the '$ ( inputs.params.var ) ' is subbed in here: '$(inputs.params.var)'
          env | grep VAR
          echo done with shellscript

Click Run Pipeline and choose task-default-variable. When it completes, click on the pipeline run results to see the following output:

[pdv : echoenvvar]
01 lab2 env VAR: VALUE

[pdv : echovar]
VALUE

[pdv : shellscript]
this looks just like a shell script and the $ ( inputs.params.var )  is subbed in here: VALUE
VAR=VALUE
done with shellscript

WARNING: The Tekton parameter specification $(inputs.params.var) looks like a bash shell variable, but it is not one. Tekton performs the parameter substitution before the container is invoked, so it happens even inside quoted strings, as the shellscript step above shows.

Define parameters from a Pipeline to a Task

But how do I get parameters into the Task? The Pipeline supplies them:

apiVersion: tekton.dev/v1alpha1
kind: Pipeline
metadata:
  name: pipeline-supplied-variable
spec:
  tasks:
    - name: psv
      params:
        - name: var
          value: PIPELINE_SUPPLIED
      taskRef:
        name: the-var-task

The psv Pipeline Task supplies parameters to the same the-var-task defined above. Click Run Pipeline, choose pipeline-supplied-variable, and check out the results.
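
For context, the pipeline-supplied-variable manual trigger works because the lab definitions include a TriggerTemplate whose resource template creates a PipelineRun referencing this Pipeline. The sketch below illustrates the pattern; the TriggerTemplate name here is an assumption for illustration, and the definitions in lab2-parameters may differ slightly:

apiVersion: tekton.dev/v1alpha1
kind: TriggerTemplate
metadata:
  # assumed name for illustration; check the lab2-parameters definitions
  name: trigger-pipeline-supplied-variable
spec:
  resourcetemplates:
    # a PipelineRun is created each time the trigger fires
    - apiVersion: tekton.dev/v1alpha1
      kind: PipelineRun
      metadata:
        name: pipelinerun-$(uid)
      spec:
        pipelineRef:
          name: pipeline-supplied-variable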

Define parameters from a user to a task

How do I get values from a user clicking in the Delivery Pipeline console UI into the Pipeline? A parameter specification is declared in the TriggerTemplate. The PipelineRun parameter $(params.var) is expanded when the PipelineRun is created, just like it was expanded above within the Task. In our example, this happens when the Run Pipeline button is clicked, invoking the EventListener and the associated TriggerTemplate:

apiVersion: tekton.dev/v1alpha1
kind: TriggerTemplate
metadata:
  name: trigger-user-supplied-variable
spec:
  params:
    - name: var
      description: var example
  resourcetemplates:
    - apiVersion: tekton.dev/v1alpha1
      kind: PipelineRun
      metadata:
        name: pipelinerun-$(uid)
      spec:
        pipelineRef:
          name: pipeline-input-parameter-variable
        params:
          - name: var
            value: $(params.var)

Similarly, the Pipeline has a parameter specification and the Pipeline Task is enhanced with a parameter expansion:

apiVersion: tekton.dev/v1alpha1
kind: Pipeline
metadata:
  name: pipeline-input-parameter-variable
spec:
  params:
    - name: var
      description: var example in pipeline
  tasks:
    - name: pipv
      params:
        - name: var
          value: $(params.var)
      taskRef:
        name: the-var-task
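
For the Run Pipeline button to offer the user-defined-variable trigger, an EventListener must reference the TriggerTemplate above. The lab definitions provide one; a minimal sketch, assuming the listener is named to match the manual trigger you created in the Triggers panel, looks roughly like this (the exact shape in your definitions may also include a binding):

apiVersion: tekton.dev/v1alpha1
kind: EventListener
metadata:
  # assumed to match the manual trigger name from the Triggers panel
  name: user-defined-variable
spec:
  triggers:
    - template:
        name: trigger-user-supplied-variable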

To see this in action in the GUI, you will create an environment property:

  • Click on Configure Pipeline.
  • Click Environment Properties and add a Text property named var with a value like “defined in environment properties”.
  • Click Save and click Close.

Now click Run Pipeline with the manual trigger user-defined-variable. The environment properties that have a name matching the TriggerTemplate parameter specification will be expanded. In our case, the var environment property will be expanded in the PipelineRun.

Check the output and verify the parameters were passed through correctly.
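
Assuming the log prefixes follow the same pattern as the earlier run (the Pipeline Task is named pipv this time), the relevant lines should resemble the following:

[pipv : echoenvvar]
01 lab2 env VAR: defined in environment properties

[pipv : echovar]
defined in environment properties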

Secure Property

If the Environment Property is created as a Secure property, its value will not be displayed in the logs. All occurrences of the string in the logs, and in the copies stored on IBM’s servers, will be replaced by a string of asterisks.

In the Environment Properties, delete the var Text property, add var again as a Secure property, and notice the change in the output.

Secrets

A Kubernetes Secret object can be created by the TriggerTemplate. This can be handy if one member of your team owns the Secret and a different member of the team is responsible for running the Pipeline. Below, we’ll create a Secret object named secret-object. The apikey parameter below identifies a property in the Environment Properties panel of the Pipeline configuration. The Secret object holds {key: value} pairs; in this case, {secret_key: $(params.apikey)}.

apiVersion: tekton.dev/v1alpha1
kind: TriggerTemplate
metadata:
  name: trigger-user-supplied-secret-variable
spec:
  params:
    - name: apikey
      description: the ibmcloud api key
  resourcetemplates:
    - apiVersion: v1
      kind: Secret
      metadata:
        name: secret-object
      type: Opaque
      stringData:
        secret_key: $(params.apikey)

Note that the parameter is not passed through the PipelineRun into the Pipeline. Instead, the Task can pull the value from the Secret object:

  • secret-object: the name of the Secret resource
  • secret_key: the key of the {key: value} pair

kind: Task
spec:
  steps:
    - name: echoenvvar
      env:
        - name: API_KEY
          valueFrom:
            secretKeyRef:
              name: secret-object
              key: secret_key

To see this in action in the GUI, follow these steps:

  • Click on Configure Pipeline.
  • Click on Environment Properties.
  • Add a Secure property apikey with a value like “veryprivate.”
  • Click Save and then Close.

Now click Run Pipeline with the manual trigger user-defined-secret-variable and, when it completes, click on the PipelineRun results to verify the output.

Learn more

There is a vibrant, open source community working on the Tekton Pipeline project, and this is a great chance to join the fun. Tekton has been integrated into the IBM DevOps environment, and you can leverage the IBM Cloud so that you can work on your business instead of your infrastructure. 

This is Part 2 of a multi-part tutorial series—more Tekton posts are coming to explain topics like workspaces and sharing.

Report a problem or ask for help

Get help fast directly from the IBM Cloud development teams by joining us on Slack.
