What is Knative?

Knative enables serverless workloads to run on Kubernetes clusters. It makes building and orchestrating containers with Kubernetes faster and easier.

Knative (pronounced Kay-NAY-tive) is an extension of the Kubernetes container orchestration platform. It provides tools and utilities that make building, deploying and managing containerized applications within Kubernetes a simpler and more "native-to-Kubernetes" experience (hence the name, "K" for "Kubernetes" + "native").

Like Kubernetes, Knative is open source software. Google initially developed it in collaboration with IBM®, Pivotal, Red Hat®, SAP and nearly 50 other companies. Today, the Cloud Native Computing Foundation (CNCF) hosts the Knative open source project.

Why Kubernetes needs Knative

Kubernetes automates and schedules the deployment, management and scaling of containers—lightweight, executable application components that combine source code with all the operating system (OS) libraries and dependencies required to run the code in any environment. 

Containers allow application components to share the resources of a single instance of an OS, in much the same way that virtual machines (VMs) allow applications to share the resources of a single physical computer. Smaller and more resource-efficient than VMs, and better suited to the incremental release cycles of Agile and DevOps development methodologies, containers have become the de facto compute units of modern cloud-native applications. Companies that use containers report other benefits as well, including improved application quality and greater levels of innovation.


As cloud-native development becomes more popular and containers proliferate across an organization, Kubernetes’ container orchestration capabilities, which include scheduling, load balancing, health monitoring and more, make that proliferation much easier to manage. However, Kubernetes is a complex tool that requires developers to perform or template many repetitive tasks, such as pulling application source code from repositories, building and provisioning a container image around the code and configuring network connections outside of Kubernetes with different tools. Incorporating Kubernetes-managed containers into an automated continuous integration/continuous delivery (CI/CD) pipeline also requires special tools and custom coding.

Knative eliminates this complexity with tools that automate these tasks from within Kubernetes. A developer can define the container's contents and configuration in a single YAML manifest file, and Knative does the rest, creating the container and performing the network programming to set up a route, ingress, load balancing and more. Knative also offers a command line interface, the Knative CLI (`kn`), that allows developers to access Knative features without editing YAML files.
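As a rough illustration, a minimal Knative Service manifest might look like the following sketch. The service name and environment variable are placeholders; the image shown is one of the public Knative sample images:

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello                # hypothetical service name
spec:
  template:
    spec:
      containers:
        - image: gcr.io/knative-samples/helloworld-go   # Knative sample image
          env:
            - name: TARGET   # read by the sample app at request time
              value: "World"
```

Applied with `kubectl apply -f`, this one manifest yields a running, routable, autoscaled service. The equivalent CLI invocation is typically a single command, for example `kn service create hello --image gcr.io/knative-samples/helloworld-go --env TARGET=World`.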

Making containers serverless

Serverless computing is a cloud-native execution model that makes applications easier to develop and more cost-effective. The serverless computing model

  • provisions computing resources on demand, scaling transparently based on requests and scaling to zero when requests are no longer made;

  • offloads all infrastructure management tasks—scaling, scheduling, patching, provisioning, and more—to the cloud provider, allowing developers to focus their time and effort on development and innovation; and

  • enables cloud customers to pay only for used resources; they never pay for idle capacity.

On its own, Kubernetes can't run serverless applications without specialized software that integrates it with a specific cloud provider's serverless platform. Knative enables any container to run as a serverless workload on any Kubernetes cluster, whether the container is built around a serverless function or other application code (for example, microservices), by abstracting away the underlying infrastructure and handling the network routing, event triggers and autoscaling.
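The scale-to-zero behavior described above is controlled per service. As a hedged sketch, annotations on a Knative Service's revision template set the autoscaling bounds (recent Knative documentation uses the hyphenated keys shown here; older releases used the camelCase forms `minScale`/`maxScale`):

```yaml
# Fragment of a Knative Service spec; autoscaling bounds apply per revision.
spec:
  template:
    metadata:
      annotations:
        autoscaling.knative.dev/min-scale: "0"    # allow scaling to zero when idle
        autoscaling.knative.dev/max-scale: "100"  # cap the number of instances
```

With `min-scale` at zero, idle services consume no compute; Knative buffers the first incoming request while it starts a new instance.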

How Knative works: Knative components

Knative sits on top of Kubernetes and adds three main components, or primitives: Build, Serving and Eventing.

Build


The Knative Build component automates turning source code into a container. This process typically involves multiple steps, including:

  • Pulling source code from a code repository, such as GitHub
  • Installing the underlying dependencies, such as environment variables and software libraries, that the code needs to run
  • Building container images
  • Putting container images into a registry where Kubernetes and other developers can find them.

Knative uses Kubernetes APIs and other tools for its Build process. A developer can create a single manifest (typically a YAML file) that specifies all the variables—location of the source code, required dependencies and more—and Knative uses the manifest to automate the container build. 
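The original Knative Build API expressed such a manifest as a Build resource (in newer Knative releases this role has largely passed to Tekton pipelines). A sketch, with a hypothetical repository URL and registry:

```yaml
apiVersion: build.knative.dev/v1alpha1
kind: Build
metadata:
  name: example-build
spec:
  source:
    git:
      url: https://github.com/example-org/example-app.git  # hypothetical repo
      revision: main
  steps:
    - name: build-and-push
      image: gcr.io/kaniko-project/executor                # builds the image in-cluster
      args: ["--destination=registry.example.com/example-app:latest"]  # hypothetical registry
```

The `source` block covers the "pull from a repository" step, and each entry under `steps` runs as a container, here building the image and pushing it to a registry in one step.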

Serving


The Serving component deploys and runs containers as scalable Knative services. Serving provides the following essential capabilities:

  • Configuration defines and maintains the state of a service. It also provides version management: Each modification to the configuration creates a new version of the service, and previous versions are saved.

  • Intelligent service routing lets developers route traffic to different versions of the service. Suppose you've created a new service version but want to deploy it to a subset of users before migrating all users. Intelligent service routing lets you route a percentage of user requests to the new service and the rest to a previous version; as you become more confident in the new service, you can route more traffic to it.

  • Autoscaling. Knative can scale services up into thousands of instances; it can also scale them down to zero—that is, no instances of the container at all—which is critical for supporting serverless applications.
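The configuration and routing capabilities above come together in the `traffic` block of a Service manifest. A hedged sketch of a 90/10 split between an existing revision and a new one (all names and the image are hypothetical):

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: my-service                 # hypothetical service name
spec:
  template:
    metadata:
      name: my-service-v2          # this configuration change creates a new revision
    spec:
      containers:
        - image: registry.example.com/my-app:v2   # hypothetical image
  traffic:
    - revisionName: my-service-v1  # previous revision keeps most of the traffic
      percent: 90
    - revisionName: my-service-v2  # new revision receives a 10% canary share
      percent: 10
```

Raising the new revision's `percent` over time completes the rollout; setting it back to zero rolls back, since Knative keeps the earlier revision available.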

Knative Serving borrows intelligent service routing from Istio, another application in the Kubernetes ecosystem. An open source service mesh for Kubernetes, Istio also provides authentication for service requests, automatic traffic encryption for secure communication between services, detailed metrics about microservices and serverless function operations and other tools for developers and administrators to optimize infrastructure. For more details on how Knative uses Istio, read “Istio and Knative: Extending Kubernetes for a New Developer Experience."

Eventing


The Eventing component of Knative enables events to trigger container-based services and functions. Knative queues and delivers those events to the appropriate containers, so there's no need to write scripts or implement middleware for that functionality. Knative also handles channels, which are queues of events that developers can choose from, and the bus, a messaging platform that delivers events to containers. It also enables developers to set up feeds, which connect an event to an action for their containers to perform.

Knative Event sources make it easier for developers to create connections to third-party event producers. Knative eventing automatically creates the connection to the event producer and routes the generated events. There's no need to figure out how to do it programmatically—Knative does all the work.
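As one concrete illustration, Knative Eventing ships a PingSource that emits an event on a cron schedule and routes it to a sink. A sketch, with hypothetical resource names:

```yaml
apiVersion: sources.knative.dev/v1
kind: PingSource
metadata:
  name: heartbeat                   # hypothetical source name
spec:
  schedule: "*/5 * * * *"           # cron syntax: fire every five minutes
  contentType: "application/json"
  data: '{"message": "ping"}'       # payload delivered with each event
  sink:
    ref:                            # deliver events to this Knative Service
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: event-handler           # hypothetical consumer service
```

The developer declares only the producer and the sink; Knative wires up the delivery, and the consuming service scales up from zero when events arrive.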

Knative use cases

To recap, Knative supports several use cases for Kubernetes users who want to simplify containerized app development or take their use of containers to the next level.

Streamlining Kubernetes. By eliminating repetitive build and configuration tasks, Knative makes developers working with Kubernetes more productive. Any development team struggling to manage a growing number of Kubernetes clusters is an ideal candidate for Knative.

Accelerating the journey to serverless. Serverless environments can be daunting to set up and manage manually. Knative enables organizations to set up serverless workloads quickly. As far as the developers are concerned, they’re just building a container—it’s Knative that runs it as a serverless function behind the scenes.

Supporting Agile and DevOps lifecycles. By enabling developers to create new containers and container versions more quickly, Knative makes it easier to deploy containerized applications in small, fast, iterative steps as part of an Agile or DevOps workflow. Knative services integrate easily into automated CI/CD pipelines without requiring special software or custom programming.

Smoothing new feature rollouts. Rolling out new releases to customers can expose software issues that might affect business processes. Knative's configuration and routing let developers expose new container revisions to a subset of the user base, then gradually expand that audience over time, or quickly roll back to an older version if issues arise.

Keeping developers focused on coding and innovation. DevOps might empower developers to administer their own environments. But at the end of the day, developers want to focus on building bug-free software and innovative new features, not on configuring message bus queues for event triggering or managing container scalability. Knative lets developers spend more time doing what they do best.
