
Published: 10 June 2024
Contributors: Stephanie Susnjara, Ian Smalley

What is serverless computing?

Serverless computing is an application development and execution model that enables developers to build and run application code without provisioning or managing servers or back-end infrastructure.

Serverless does not mean "no servers." Despite the name, the servers in serverless computing are managed by a cloud service provider (CSP). "Serverless" describes the developer's experience with those servers: the developer never sees, manages or interacts with them in any way.

With serverless computing, developers can focus on writing front-end application code and business logic. All they need to do is write their application code and deploy it to containers managed by a CSP.

The cloud provider handles the rest—provisioning the cloud infrastructure required to run the code and scaling the infrastructure up and down on demand as needed—and is also responsible for all routine infrastructure management and maintenance, such as operating system updates and patches, security management, capacity planning, system monitoring and more.

Moreover, developers never pay for idle capacity with serverless. The cloud provider spins up and provisions the required computing resources on demand when the code executes and spins them back down again (called "scaling to zero") when execution stops. Billing starts when execution starts and ends when execution stops; typically, pricing is based on execution time and resources required.

Along with infrastructure as a service (IaaS), platform as a service (PaaS), function as a service (FaaS) and software as a service (SaaS), serverless has become a leading cloud service offering. According to a report from SkyQuest Technology, the global serverless architecture market size was valued at USD 8.01 billion in 2022 and is expected to grow from USD 9.84 billion in 2023 to USD 50.86 billion by 2031.¹ Today, every leading cloud service provider offers a serverless platform, including Amazon Web Services (AWS Lambda), Microsoft Azure (Azure Functions), Google Cloud (Google Cloud Functions) and IBM Cloud® (IBM Cloud Code Engine).

Together, serverless computing, microservices and containers form a triumvirate of technologies at the core of cloud-native application development.



The origins of serverless

Serverless originated in 2008 when Google released Google App Engine (GAE), a platform for developing and hosting web applications in Google-managed data centers. With GAE, a software developer could create and launch software on Google's cloud without worrying about server management tasks such as patching or load balancing, which Google handled.

The term "serverless" first appeared in a tech article by cloud computing specialist Ken Fromm in 2012.² In 2014, Amazon introduced AWS Lambda, the first serverless platform. Named after functions from lambda calculus and programming, AWS Lambda, a FaaS model, helped the serverless computing framework gain mass-market appeal and rapid adoption among software developers by enabling them to execute code in response to events without the need for server management. In 2016, Microsoft and Google launched their own serverless platforms, Azure Functions and Google Cloud Functions.

Other major players in today's serverless platform market include IBM Cloud® Code Engine, Oracle Cloud Infrastructure (OCI) Functions, Cloudflare Workers and Alibaba Function Compute.

The serverless ecosystem
Serverless and FaaS

Serverless is more than function as a service (FaaS)—the cloud computing service that enables developers to run code or containers in response to specific events or requests without specifying or managing the infrastructure required to run the code.

FaaS is the compute model central to serverless, and the two terms are often used interchangeably. Compared to FaaS, serverless is an entire stack of services that can respond to specific events or requests and scale to zero when no longer in use—and for which provisioning, management and billing are handled by the cloud provider and invisible to developers.
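For illustration, here is a minimal sketch of what a FaaS-style function can look like. The (event, context) handler signature and the response shape follow a convention used by several FaaS platforms, but the exact parameter names and return format vary by provider.

```python
import json

def handler(event, context):
    # The platform invokes this function in response to an event (an HTTP
    # request, a queue message, a file upload), provisions compute for the
    # duration of the call and scales back to zero when traffic stops.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```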

In addition to FaaS, these services include databases and storage, application programming interface (API) gateways and event-driven architecture.

Serverless databases and storage

Databases (SQL and NoSQL) and storage (particularly object storage) are the foundation of the data layer. A serverless approach to these technologies involves transitioning away from provisioning "instances" with defined capacity, connection and query limits and moving toward models that scale linearly with demand in both infrastructure and pricing.

API gateways

API gateways act as proxies to web application actions and provide HTTP method routing, client IDs and secrets, rate limiting, CORS support, API usage viewing, response log viewing and API sharing policies.

Serverless and event-driven architecture

Serverless architectures work well for event-driven and stream-processing workloads, most notably those built on the open-source Apache Kafka event-streaming platform.

Automated serverless functions are stateless and designed to handle individual events. These functions have become an essential part of event-driven architecture (EDA)—a software design model built around the publication, capture, processing and storage of events. In an EDA framework, event producers (for example, microservices, APIs, IoT devices) send real-time event notifications to event consumers, activating specific processing routines. For example, when Netflix releases a new original series, multiple EDA services wait on standby for the release notification, which triggers a cascade of updates to inform users. Many other companies based on user-facing web and mobile applications (for example, Uber, DoorDash, Instacart) rely on event-driven architecture.
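As a hypothetical sketch of the consumer side of an EDA, the function below reacts to a single release event. The event field names and the downstream notification call are illustrative, not any particular platform's API.

```python
def on_new_release(event, context):
    # Stateless consumer: each invocation handles exactly one event published
    # by an upstream producer (for example, a catalog microservice).
    title = event["title"]
    regions = event.get("regions", [])
    for region in regions:
        notify_subscribers(region, title)
    return {"processed": True, "title": title}

def notify_subscribers(region, title):
    # Hypothetical downstream call; in practice this might publish to another
    # topic or call a push-notification service.
    print(f"Notifying {region} subscribers that '{title}' is available")
```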

Serverless versus PaaS, containers and VMs

Because serverless, platform as a service (PaaS), containers and virtual machines (VMs) all play a critical role in the cloud application development and compute ecosystem, it's helpful to measure how serverless compares to the others across some key attributes.

  • Provisioning time: Measured in milliseconds for serverless versus minutes to hours for the other models.
  • Administrative burden: None for serverless, compared to a continuum from light to medium to heavy for PaaS, containers and VMs, respectively.
  • Maintenance: Serverless architectures are managed 100% by CSPs. The same is true for PaaS, but containers and VMs require significant maintenance, including updating and managing operating systems, container images, connections and so on.
  • Scaling: Autoscaling, including autoscaling to zero, is instant and inherent for serverless. The other models offer automatic but slower scaling that requires careful tuning of autoscaling rules, and none of them scale to zero.
  • Capacity planning: None is needed for serverless. The other models require a mix of automatic scalability and capacity planning.
  • Statelessness: Inherent for serverless, which means scalability is never a problem; state is maintained in an external service or resource. PaaS, containers and VMs can use HTTP, keep an open socket or connection for long periods and store state in memory between calls.
  • High availability (HA) and disaster recovery (DR): Serverless offers both high availability and disaster recovery with no extra effort or extra cost. The other models require extra cost and management effort. Infrastructure can be restarted automatically with VMs and containers.
  • Resource utilization: Serverless is 100% efficient because there is no idle capacity—it is invoked only upon request. All other models feature at least some degree of idle capacity.
  • Billing and savings: Serverless is metered in units of 100 milliseconds. PaaS, containers and VMs are typically metered by the hour or the minute; a rough cost comparison follows this list.
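To make the billing difference concrete, here is a back-of-the-envelope comparison. All prices and workload figures are illustrative assumptions, not any provider's actual rates.

```python
# Illustrative cost comparison only; prices and workload figures are assumed.
PRICE_PER_GB_SECOND = 0.0000166   # assumed serverless rate, USD
VM_PRICE_PER_HOUR = 0.10          # assumed small-VM rate, USD

invocations_per_day = 50_000
duration_ms = 120                  # actual run time per invocation
memory_gb = 0.25

# Serverless: each invocation is rounded up to the next 100 ms billing unit.
billed_seconds = ((duration_ms + 99) // 100) * 100 / 1000
serverless_daily = (invocations_per_day * billed_seconds
                    * memory_gb * PRICE_PER_GB_SECOND)

# VM: billed for all 24 hours, including idle time between requests.
vm_daily = 24 * VM_PRICE_PER_HOUR

print(f"Serverless: ${serverless_daily:.2f}/day  VM: ${vm_daily:.2f}/day")
```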
Serverless, Kubernetes and Knative

Kubernetes is an open-source container orchestration platform that automates container deployment, management and scaling. This automation dramatically simplifies the development of containerized applications.

Serverless applications are often deployed in containers. However, Kubernetes on its own can't run serverless apps without specialized software that integrates Kubernetes with a specific cloud provider's serverless platform.

Knative is an open-source extension to Kubernetes that provides a serverless framework. It enables any container to run as a serverless workload on any cloud platform that runs Kubernetes, whether the container is built around a serverless function or some other application code (for example, microservices). Knative works by abstracting away the code and handling the network routing, event triggers and autoscaling for serverless execution.

Knative is transparent to developers. They build a container by using Kubernetes, and Knative does the rest, running the container as a serverless workload.
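As a minimal sketch, any container that serves HTTP can be run this way. The only Knative-specific assumption below is that the platform passes the listening port through the PORT environment variable; the rest is ordinary application code that could be packaged into a container image.

```python
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Ordinary request handling; nothing here is serverless-specific.
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Hello from a container running as a serverless workload\n")

if __name__ == "__main__":
    # Knative injects the port to listen on via the PORT environment variable.
    port = int(os.environ.get("PORT", "8080"))
    HTTPServer(("", port), Handler).serve_forever()
```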

Pros and cons of serverless
Pros

Serverless computing offers individual developers and enterprise development teams many technical and business benefits:

  • Improved developer productivity: As noted, serverless enables development teams to focus on writing code, not managing infrastructure. It gives developers more time to innovate and optimize their front-end application functions and business logic.
  • Pay for execution only: The meter starts when the request is made and ends when execution finishes. Compare this to the IaaS compute model, where customers pay for the physical servers, VMs and other resources required to run applications, from when they provision those resources until they explicitly decommission them.
  • Develop in any language: Serverless is a polyglot environment that enables developers to code in any language or framework (Java, Python, JavaScript, Node.js) that they're comfortable with.
  • Streamlined development or DevOps cycles: Serverless simplifies deployment and, in a larger sense, simplifies DevOps because developers don't spend time defining the infrastructure required to integrate, test, deliver and deploy code builds into production.
  • Cost-effective performance: For specific workloads (for example, embarrassingly parallel processing, stream processing, specific data processing tasks), serverless computing can be both faster and more cost-effective than other forms of compute.
  • Reduce latency: In a serverless environment, code can run closer to the end user, decreasing latency.
  • Usage visibility: Serverless platforms provide near-total visibility into system and user times and can aggregate usage information systematically.
Cons

While serverless has many advantages, it's essential to consider some disadvantages:

  • Less control: In a serverless setting, an organization hands server control over to a third-party CSP, thus relinquishing the management of hardware and execution environments.
  • Vendor lock-in: Each service provider offers unique serverless capabilities and features that are incompatible with other vendors.
  • Slow startup: Also known as "cold start," slow startup can affect the performance and responsiveness of serverless applications, particularly in real-time demand environments. 
  • Complex testing and debugging: Debugging can be more complicated with a serverless computing model as developers lack visibility into back-end processes.
  • Higher cost for running long applications: Serverless execution models are not designed to execute code for extended periods. Therefore, long-running processes can cost more than traditional dedicated server or VM environments.
Serverless and security

While CSPs provide security measures to manage serverless applications, the client is also responsible for securing the application code and data per a shared responsibility model. Cloud-based security measures for serverless include automated security policies and solutions like security information and event management (SIEM), identity and access management (IAM) and threat detection and response.

Following DevSecOps practices helps development teams secure serverless technologies. DevSecOps, which stands for "development, security and operations," is an application development practice that automates the integration of security practices at every phase of the software development lifecycle, from initial design through integration, testing, delivery and deployment.

Serverless and sustainability

Compared with traditional on-premises data center environments, a serverless computing model can help organizations reduce energy consumption and lower the carbon footprint of their IT operations.

Moreover, a serverless model lets companies optimize emissions through resource efficiency: they pay for and use only the resources they need, which results in less energy wasted on idle or excess processes.

Serverless use cases

Given its unique attributes and benefits, serverless architecture works best for use cases involving microservices, mobile back-ends and data and event stream processing.

Serverless and microservices

The most common use case of serverless today is supporting microservices architectures. The microservices model is focused on creating small services that do a single job and communicate with one another by using APIs. While microservices can also be built and operated through either PaaS or containers, serverless gained significant momentum given its attributes around small bits of code, inherent and automatic scaling, rapid provisioning and a pricing model that never charges for idle capacity.

API backends

Any action (or function) in a serverless platform can be turned into an HTTP endpoint ready to be consumed by web clients. When enabled for the web, these actions are called web actions. Once you have web actions, you can assemble them into a full-featured API with an API gateway that brings more security, OAuth³ support, rate limiting and custom domain support.
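As a sketch of a web action behind an API gateway, the function below could back a route such as GET /users/{id}. The parameter names, data and signature are illustrative; actual web-action interfaces differ by platform.

```python
import json

# Hypothetical in-memory data; a real action would query a serverless database.
USERS = {"42": {"id": "42", "name": "Ada"}}

def get_user(params):
    # An API gateway might map GET /users/{id} to this action, layering
    # authentication, rate limiting and a custom domain in front of it.
    user = USERS.get(params.get("id", ""))
    if user is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(user),
    }
```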

Open Liberty InstantOn (CRIU)

Open Liberty InstantOn⁴ takes a novel approach to supporting rapid startup for serverless applications. With InstantOn, you can take a checkpoint of your running Java application process during the application build and then restore that checkpoint in production. The restore is fast (in the low hundreds of milliseconds), which makes it ideal for serverless. Because InstantOn is a checkpoint of your existing application, its behavior after restore is identical, including the same excellent throughput performance. This process enables organizations to adopt serverless for new cloud-native applications and provides the opportunity to bring serverless to existing enterprise applications.

Data processing

Serverless is well suited to working with structured text, audio, image and video data around tasks such as data enrichment, transformation, validation and cleansing. Developers can also use it for PDF processing, audio normalization, image processing (rotation, sharpening, noise reduction, thumbnail generation), optical character recognition (OCR) and video transcoding.
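A sketch of one such task, thumbnail generation, is shown below. It assumes the Pillow imaging library is packaged with the function; the trigger (for example, an object-storage upload event) and how the image bytes arrive are illustrative.

```python
import io
from PIL import Image  # assumes the Pillow library is available in the runtime

def make_thumbnail(image_bytes, size=(128, 128)):
    # Typically triggered when a new image lands in object storage; the
    # platform passes the object (or a reference to it) to the function.
    img = Image.open(io.BytesIO(image_bytes)).convert("RGB")
    img.thumbnail(size)                      # resize in place, keeping aspect ratio
    out = io.BytesIO()
    img.save(out, format="JPEG")
    return out.getvalue()                    # bytes to write back to storage
```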

Massively parallel compute and "map" operations

Any embarrassingly parallel task is a good use case for a serverless runtime, with each parallelizable task resulting in one action invocation. Sample tasks include data search and processing (specifically of cloud object storage), MapReduce operations, web scraping, business process automation, hyperparameter tuning, Monte Carlo simulations and genome processing.
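To sketch the fan-out pattern, the Monte Carlo example below runs each chunk of samples as an independent task. Locally it uses a process pool; on a serverless platform, each chunk would instead be one function invocation, as described above.

```python
import random
from concurrent.futures import ProcessPoolExecutor

def estimate_pi_chunk(samples):
    # One embarrassingly parallel task: count random points that fall inside
    # the unit quarter-circle. Each task is independent, so it maps naturally
    # to a single serverless invocation.
    return sum(1 for _ in range(samples)
               if random.random() ** 2 + random.random() ** 2 <= 1.0)

if __name__ == "__main__":
    chunks, samples_per_chunk = 16, 100_000
    with ProcessPoolExecutor() as pool:
        hits = sum(pool.map(estimate_pi_chunk, [samples_per_chunk] * chunks))
    print("pi ≈", 4 * hits / (chunks * samples_per_chunk))
```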

Stream processing workload

Combining managed Apache Kafka with FaaS and database or storage offers a robust foundation for real-time buildouts of data pipelines and streaming apps. These architectures are ideally suited for working with all sorts of data stream ingestions (for validation, cleansing, enrichment and transformation), including IoT sensor data, application log data, financial market data and business data streams (from other data sources).
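Below is a sketch of a FaaS handler wired to a managed Kafka topic: the platform invokes it with a batch of records, the function validates and enriches them, and a hypothetical store_readings call stands in for writing to a serverless database or object storage. The event shape and record field names are illustrative.

```python
import json

def handle_sensor_batch(event, context):
    # Assumed event shape: a batch of Kafka records delivered by the platform.
    cleaned = []
    for record in event.get("records", []):
        reading = json.loads(record["value"])
        # Validation and cleansing: drop malformed or out-of-range readings.
        if "device_id" not in reading or not -50 <= reading.get("temp_c", 999) <= 150:
            continue
        # Enrichment: tag each reading with its source topic and partition.
        reading["source"] = f'{record["topic"]}/{record["partition"]}'
        cleaned.append(reading)
    store_readings(cleaned)
    return {"accepted": len(cleaned)}

def store_readings(readings):
    # Hypothetical sink; in practice this would write to a serverless
    # database or object storage service.
    print(f"Stored {len(readings)} readings")
```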

AI and serverless

Serverless provides the automated scalability needed to run artificial intelligence (AI) and machine learning (ML) workloads, ensuring optimal performance and accelerating innovation.

Hybrid cloud and serverless

Serverless computing supports a hybrid cloud strategy by providing the agility, flexibility and scalability needed to accommodate fluctuating workloads across on-prem, public cloud, private cloud and edge environments.

Common applications for serverless

Serverless supports many of today's most common applications, including customer relationship management (CRM), high-performance computing (HPC), big data analytics, business process automation, video streaming, gaming, telemedicine, digital commerce, chatbot creation and more.

Tutorials: Get started with serverless

You can expand your serverless computing skills with these tutorials:

  • Experience Liberty InstantOn (CRIU): Unlock new value for your organization with IBM WebSphere® Liberty by optimizing your cloud deployment and improving operational efficiency. Liberty InstantOn showcases a better way to achieve serverless performance without compromise.
  • Run batch jobs: Learn how to run a batch job by using the Code Engine console. A job runs one or more instances of your executable code. Unlike applications that handle HTTP requests, jobs are designed to run once and exit.
Solutions
IBM Cloud Code Engine

Run your container, application code or batch job on a fully managed container runtime.

Explore IBM Cloud Code Engine
IBM Cloud Pak® for Applications

Whether it's deployment, building new cloud-native applications, refactoring or replatforming existing applications, Cloud Pak for Applications (CP4Apps) has it covered.

Explore IBM Cloud Pak for Applications
IBM Cloud Satellite®

Deploy and run apps consistently across on-premises, edge computing and public cloud environments from any cloud vendor.

Explore IBM Cloud Satellite
IBM Cloudant®

The data layer for hyperscale, resilient, globally available applications, based on open-source Apache CouchDB.

Explore IBM Cloudant
Instana® serverless monitoring

Monitor your serverless applications and services to optimize performance, availability and security.

Explore Instana serverless monitoring
Resources
Serverless in the enterprise, 2021

Gain insights into the real-world opportunities and challenges of serverless computing.

Intro to IBM Cloud Code Engine

Go beyond functions to run all your containerized workloads—including web apps, microservices and batch jobs—on this fully managed serverless platform.

Enjoy your cloud again

Take a closer look at IBM Cloud Code Engine and the benefits that it offers.

Take the next step

Serverless computing offers a simpler, more cost-effective way of building and operating applications in the cloud. Learn about IBM Cloud Code Engine, a pay-as-you-use serverless platform that lets developers deploy serverless applications and workflows with no Kubernetes skills needed.

Explore Cloud Code Engine
Footnotes

(All links reside outside ibm.com.)

1. Global Serverless Architecture Market Report, SkyQuest, March 2024.

2. Why The Future Of Software And Apps Is Serverless, ReadWrite, October 2012.

3. About OAuth 2.0, OAuth.

4. Faster startup for containerized applications with Open Liberty InstantOn, Open Liberty.