Serverless computing is an application development and execution model that enables developers to build and run application code without provisioning or managing servers or back-end infrastructure.
Serverless does not mean "no servers." The name notwithstanding, servers in serverless computing are managed by a cloud service provider (CSP). Serverless describes the developer's experience with those servers: they are invisible to the developer, who never sees, manages or interacts with them.
With serverless computing, developers can focus on writing front-end application code and business logic. All they need to do is write their code and deploy it to containers managed by a CSP.
The cloud provider handles the rest—provisioning the cloud infrastructure required to run the code and scaling the infrastructure up and down on demand as needed—and is also responsible for all routine infrastructure management and maintenance, such as operating system updates and patches, security management, capacity planning, system monitoring and more.
Moreover, developers never pay for idle capacity with serverless. The cloud provider spins up and provisions the required computing resources on demand when the code executes and spins them back down again—called "scaling to zero"—when execution stops. The billing starts when execution starts and ends when execution stops; typically, pricing is based on execution time and resources required.
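To make the pay-per-use model concrete, here is a minimal sketch in Python of how a monthly bill for a single function might be estimated from its memory allocation, execution time and invocation count. The rates are hypothetical, not any provider's actual pricing:

```python
# Hypothetical per-GB-second and per-request rates -- not any provider's real pricing.
PRICE_PER_GB_SECOND = 0.0000167   # USD per GB-second (assumed)
PRICE_PER_REQUEST = 0.0000002     # USD per invocation (assumed)

def estimate_monthly_cost(memory_gb: float, avg_duration_s: float, invocations: int) -> float:
    """Estimate the monthly bill for one serverless function."""
    compute_cost = memory_gb * avg_duration_s * invocations * PRICE_PER_GB_SECOND
    request_cost = invocations * PRICE_PER_REQUEST
    return compute_cost + request_cost

# Example: a 512 MB function averaging 200 ms per run, invoked 3 million times a month.
print(f"${estimate_monthly_cost(0.5, 0.2, 3_000_000):.2f}")
```

Because idle time is never billed, the estimate depends only on actual executions, not on how long the function sits deployed.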
Along with infrastructure as a service (IaaS), platform as a service (PaaS), function as a service (FaaS) and software as a service (SaaS), serverless has become a leading cloud service offering. According to a report from SkyQuest Technology, the global serverless architecture market was valued at USD 8.01 billion in 2022 and is expected to grow from USD 9.84 billion in 2023 to USD 50.86 billion by 2031 [1]. Today, every leading cloud service provider offers a serverless platform, including Amazon Web Services (AWS Lambda), Microsoft Azure (Azure Functions), Google Cloud (Google Cloud Functions) and IBM Cloud® (IBM Cloud Code Engine).
Together, serverless computing, microservices and containers form a triumvirate of technologies at the core of cloud-native application development.
Serverless originated in 2008 when Google released Google App Engine (GAE), a platform for developing and hosting web applications in Google-managed data centers. With GAE, a software developer could create and launch software on Google's cloud without worrying about server management tasks such as patching or load balancing, which Google handled.
The term "serverless" first appeared in a tech article by cloud computing specialist Ken Fromm in 2012 [2]. In 2014, Amazon introduced AWS Lambda, the first serverless platform. Named after functions from lambda calculus and programming, AWS Lambda, a FaaS model, helped the serverless computing framework gain mass-market appeal and rapid adoption among software developers by enabling them to execute code in response to events without the need for server management. In 2016, Microsoft and Google followed with their own serverless platforms, Azure Functions and Google Cloud Functions.
Other major players in today's serverless platform market include IBM Cloud® Code Engine, Oracle Cloud Infrastructure (OCI) Functions, Cloudflare Workers and Alibaba Function Compute.
Serverless is more than function as a service (FaaS)—the cloud computing service that enables developers to run code or containers in response to specific events or requests without specifying or managing the infrastructure required to run the code.
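For illustration, a FaaS function is usually little more than a handler that the platform invokes once per event. The sketch below assumes the AWS Lambda-style Python handler signature; the event shape and order-total logic are purely illustrative:

```python
# A minimal FaaS-style handler (AWS Lambda Python signature assumed).
# The platform provisions a runtime on demand, calls this function once per
# event, then scales the capacity back to zero when invocations stop.
def lambda_handler(event, context):
    order = event.get("order", {})
    total = sum(item["price"] * item["quantity"] for item in order.get("items", []))
    return {"order_id": order.get("id"), "total": round(total, 2)}
```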
FaaS is the compute model central to serverless, and the two terms are often used interchangeably. Compared to FaaS, serverless is an entire stack of services that can respond to specific events or requests and scale to zero when no longer in use—and for which provisioning, management and billing are handled by the cloud provider and invisible to developers.
In addition to FaaS, these services include databases and storage, application programming interface (API) gateways and event-driven architecture.
Databases (SQL and NoSQL) and storage (particularly object storage) are the foundation of the data layer. A serverless approach to these technologies involves transitioning away from provisioning "instances" with defined capacity, connection and query limits and moving toward models that scale linearly with demand in both infrastructure and pricing.
API gateways act as proxies to web application actions and provide HTTP method routing, client IDs and secrets, rate limiting, CORS support, API usage monitoring, response log viewing and API sharing policies.
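As a rough illustration of those responsibilities, the sketch below models a hypothetical gateway route table in Python; the field names are illustrative only, not any specific gateway's configuration schema:

```python
# Hypothetical API gateway routes mapping HTTP methods and paths to
# serverless functions, with per-route policies (illustrative field names only).
routes = {
    ("GET", "/orders/{id}"): {
        "function": "get_order",          # serverless action to invoke
        "rate_limit_per_minute": 100,     # throttling policy
        "cors_allowed_origins": ["https://shop.example.com"],
        "require_client_secret": True,    # client ID/secret enforcement
    },
    ("POST", "/orders"): {
        "function": "create_order",
        "rate_limit_per_minute": 20,
        "cors_allowed_origins": ["https://shop.example.com"],
        "require_client_secret": True,
    },
}
```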
Serverless architectures work well for event-driven and stream-processing workloads, most notably those built on the open-source Apache Kafka event streaming platform.
Automated serverless functions are stateless and designed to handle individual events. These functions have become an essential part of event-driven architecture (EDA)—a software design model built around the publication, capture, processing and storage of events. In an EDA framework, event producers (for example, microservices, APIs, IoT devices) send real-time event notifications to event consumers, activating specific processing routines. For example, when Netflix releases a new original series, multiple EDA services wait on standby for the release notification, which triggers a cascade of updates to inform users. Many other companies based on user-facing web and mobile applications (for example, Uber, DoorDash, Instacart) rely on event-driven architecture.
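Here is a minimal in-process sketch of that producer/consumer pattern, with a hypothetical "series released" event standing in for a real message broker and each handler standing in for a serverless function:

```python
from collections import defaultdict

# Minimal in-process event bus illustrating event-driven architecture.
# In production the bus would be a managed broker (for example, Apache Kafka),
# and each consumer would be a serverless function triggered by the event.
subscribers = defaultdict(list)

def subscribe(event_type, handler):
    subscribers[event_type].append(handler)

def publish(event_type, payload):
    for handler in subscribers[event_type]:
        handler(payload)   # each call models one stateless function invocation

# Hypothetical consumers reacting to a "series.released" event.
subscribe("series.released", lambda e: print(f"Update home page rows for {e['title']}"))
subscribe("series.released", lambda e: print(f"Send push notifications for {e['title']}"))

publish("series.released", {"title": "Example Original Series"})
```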
Because serverless, platform as a service (PaaS), containers and virtual machines (VMs) all play a critical role in the cloud application development and compute ecosystem, it's helpful to measure how serverless compares to the others across some key attributes.
Kubernetes is an open-source container orchestration platform that automates container deployment, management and scaling. This automation dramatically simplifies the development of containerized applications.
Serverless applications are often deployed in containers. However, Kubernetes cannot run serverless apps on its own; it needs specialized software that integrates Kubernetes with a specific cloud provider's serverless platform.
Knative is an open-source extension to Kubernetes that provides a serverless framework. It enables any container to run as a serverless workload on any cloud platform that runs Kubernetes, whether the container is built around a serverless function or some other application code (for example, microservices). Knative works by abstracting away the code and handling the network routing, event triggers and autoscaling for serverless execution.
Knative is transparent to developers. They build a container by using Kubernetes, and Knative does the rest, running the container as a serverless workload.
Serverless computing offers individual developers and enterprise development teams many technical and business benefits, including greater developer productivity, pay-for-use pricing and automatic scalability.
While serverless has many advantages, it's essential to consider some disadvantages as well.
While CSPs provide security measures to manage serverless applications, the client is also responsible for securing the application code and data per a shared responsibility model. Cloud-based security measures for serverless include automated security policies and solutions like security information and event management (SIEM), identity and access management (IAM) and threat detection and response.
Following DevSecOps practices helps development teams secure serverless technologies. DevSecOps, which stands for "development, security and operations," is an application development practice that automates the integration of security and security practices at every phase of the software development lifecycle, from initial design through integration, testing, delivery and deployment.
Compared with traditional on-premises data center environments, a serverless computing model can help organizations reduce energy consumption and lower the carbon footprint of their IT operations.
Moreover, a serverless model allows companies to optimize emissions through resource efficiency by paying for and using only the resources they need. This results in less energy wasted on idle or excess processes.
Given its unique attributes and benefits, serverless architecture works best for use cases involving microservices, mobile back-ends and data and event stream processing.
The most common use case of serverless today is supporting microservices architectures. The microservices model is focused on creating small services that do a single job and communicate with one another by using APIs. While microservices can also be built and operated through either PaaS or containers, serverless gained significant momentum given its attributes around small bits of code, inherent and automatic scaling, rapid provisioning and a pricing model that never charges for idle capacity.
Any action (or function) in a serverless platform can be turned into an HTTP endpoint ready to be consumed by web clients. When enabled for the web, these actions are called web actions. Once you have web actions, you can assemble them into a full-featured API with an API gateway that brings more security, OAuth [3] support, rate limiting and custom domain support.
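As an illustration, a web action typically just returns an HTTP-shaped response that the platform or gateway serves to web clients. The response fields below follow a common convention and are an assumption, not any specific platform's contract:

```python
import json

# A hypothetical web action: the platform exposes this function as an HTTP
# endpoint, and an API gateway in front of it can add OAuth, rate limiting
# and custom domain support.
def get_greeting(params):
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"greeting": f"Hello, {name}!"}),
    }
```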
Open Liberty InstantOn [4] takes a novel approach to supporting rapid startup for serverless applications. With InstantOn, you can take a checkpoint of your running Java application process during the application build and then restore that checkpoint in production. The restore is fast (in the low hundreds of milliseconds), which makes it ideal for serverless. Because InstantOn is a checkpoint of your existing application, its behavior after restore is identical, including the same excellent throughput performance. This process enables organizations to adopt serverless for new cloud-native applications and provides the opportunity to bring serverless to existing enterprise applications.
Serverless is well suited to working with structured text, audio, image and video data around tasks such as data enrichment, transformation, validation and cleansing. Developers can also use it for PDF processing, audio normalization, image processing (rotation, sharpening, noise reduction, thumbnail generation), optical character recognition (OCR) and video transcoding.
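For example, a thumbnail-generation step might look like the following sketch, which uses the Pillow imaging library; reading from and writing to object storage is omitted, and the function name is hypothetical:

```python
from io import BytesIO
from PIL import Image  # Pillow imaging library

# Hypothetical serverless image-processing step: given raw image bytes
# (for example, fetched from object storage), produce a JPEG thumbnail.
def make_thumbnail(image_bytes: bytes, max_size=(128, 128)) -> bytes:
    img = Image.open(BytesIO(image_bytes))
    img.thumbnail(max_size)               # resize in place, preserving aspect ratio
    out = BytesIO()
    img.convert("RGB").save(out, format="JPEG")
    return out.getvalue()
```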
Any embarrassingly parallel task is a good use case for a serverless runtime, with each parallelizable task resulting in one action invocation. Sample tasks include data search and processing (specifically of cloud object storage), MapReduce operations, web scraping, business process automation, hyperparameter tuning, Monte Carlo simulations and genome processing.
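As a sketch of the one-invocation-per-task pattern, the hypothetical Monte Carlo worker below estimates π over a single chunk of samples; a driver would fan out many such invocations in parallel and combine the partial results afterwards:

```python
import random

# One chunk of an embarrassingly parallel Monte Carlo job. Each serverless
# invocation handles a single chunk; a driver fans out many invocations in
# parallel and averages the partial estimates afterwards.
def handle(event, context=None):
    samples = event.get("samples", 100_000)
    hits = sum(
        1 for _ in range(samples)
        if random.random() ** 2 + random.random() ** 2 <= 1.0
    )
    return {"samples": samples, "pi_estimate": 4.0 * hits / samples}

# Local usage example (in production each call would be a separate invocation):
print(handle({"samples": 50_000}))
```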
Combining managed Apache Kafka with FaaS and database or storage offers a robust foundation for real-time buildouts of data pipelines and streaming apps. These architectures are ideally suited for working with all sorts of data stream ingestions (for validation, cleansing, enrichment and transformation), including IoT sensor data, application log data, financial market data and business data streams (from other data sources).
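A minimal sketch of the ingestion side follows, assuming the kafka-python client and hypothetical topic, broker and enrichment details; in a managed setup the same validate-and-enrich logic would typically run as a FaaS function triggered per message or per batch:

```python
import json
from kafka import KafkaConsumer  # kafka-python client (assumed dependency)

# Consume a hypothetical sensor-data topic and apply a simple
# validate -> enrich -> forward step per message.
consumer = KafkaConsumer(
    "iot.sensor.readings",                       # hypothetical topic
    bootstrap_servers="localhost:9092",          # hypothetical broker
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

def enrich(reading: dict) -> dict:
    reading["temperature_f"] = reading["temperature_c"] * 9 / 5 + 32
    return reading

for message in consumer:
    reading = message.value
    if "temperature_c" not in reading:           # validation/cleansing
        continue
    print(enrich(reading))                       # forward to storage or a database
```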
Serverless provides the automated scalability needed to run artificial intelligence (AI) and machine learning (ML) workloads, ensuring optimal performance and accelerating innovation.
Serverless computing supports a hybrid cloud strategy by providing the agility, flexibility and scalability needed to accommodate fluctuating workloads across on-prem, public cloud, private cloud and edge environments.
Serverless supports many of today's most common applications, including customer relationship management (CRM), high-performance computing (HPC), big data analytics, business process automation, video streaming, gaming, telemedicine, digital commerce, chatbot creation and more.
[1] Global Serverless Architecture Market Report, SkyQuest, March 2024
[2] Why The Future Of Software And Apps Is Serverless, ReadWrite, October 2012
[3] About OAuth 2.0, OAuth
[4] Faster startup for containerized applications with Open Liberty InstantOn, Open Liberty