Event-driven architecture
Event-driven architecture (EDA) is a software design model built around the publication, capture, processing and storage of events.
It enables teams to identify system events (any change or action that occurs within the system) and respond to them in real time or near-real time.
The profusion of EDAs across cloud-native environments represents a significant shift away from traditional computational architectures—which focus on stockpiling static data in repositories like data lakes (as in service-oriented architectures)—toward a dynamic approach that tracks data as it traverses an architecture. Data is still valuable in an event-driven system, but EDAs emphasize the timely reaction to events, recognizing that the value of an event might diminish as time progresses.
In an event-driven architecture, event producers (like microservices, APIs and IoT devices) send real-time event notifications to event consumers that then activate specific processing routines. For instance, when Netflix releases a new original series, multiple EDA services wait on standby for the release notification, which triggers a cascade of updates to inform users.
One of the key advantages of an event-driven architecture is the decoupled relationship between front-end and back-end components, which allows systems to share information without knowing about each other. Producers can send events without knowing which consumer will receive them, and consumers can receive events without sending requests to producers. In other words, EDAs enable systems to work independently and process events asynchronously.
Modern, forward-thinking enterprises have vast digital footprints, and the real-time functionality of an event-driven system allows businesses to maintain operational readiness without idling and respond quickly to event broadcasts. As such, EDAs help enterprises automate a range of organizational processes—from optimizing supply chains to proactively identifying quality issues—and ultimately improve both their top and bottom lines.
Learn how IBM Event Automation can help you put events to work by enabling business and IT users to detect situations, act in real time, automate decisions and maximize revenue potential.
Event streaming revolves around the unbounded, sequential and real-time flow of data records called “events,” foundational data structures that record any occurrence or change in a system or environment. Examples of such changes include a user adding an item to their shopping cart on an e-commerce site, a password reset request or an application state change. In practice, nearly every data point in a system can be represented as an event, and a “stream” (also called a data stream or streaming data) is the continuous delivery of those events.
Each event typically comprises a key that identifies the event or the entity it pertains to, a value that holds the actual data of the event, a timestamp that indicates when the event occurred or was recorded, and sometimes metadata about the event source, schema version or other attributes. Events can either carry state data (the item purchased, its price and a delivery address, for instance) or serve as identifiers (a shipping notification, for example).
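As a rough sketch only, that structure might be modeled in Python as follows; the Event class and its field names are illustrative assumptions, not a standard schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any

@dataclass(frozen=True)
class Event:
    """A generic event record: key, value, timestamp and optional metadata."""
    key: str                      # identifies the event or the entity it pertains to
    value: dict[str, Any]         # the actual data carried by the event
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    metadata: dict[str, str] = field(default_factory=dict)  # source, schema version, etc.

# A state-carrying event: an item added to a shopping cart.
cart_event = Event(
    key="cart-1234",
    value={"action": "item_added", "sku": "A-100", "price": 19.99},
    metadata={"source": "web-storefront", "schema_version": "1"},
)
print(cart_event)
```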
With the help of specialized stream processing engines, events can undergo a few different processes within a stream. “Aggregations” perform data calculations, like means, sums and standard deviations. “Ingestion” adds streaming data to databases. Analytics processing uses patterns in streaming data to predict future events, and enrichment processing combines data points with other data sources to provide context and create meaning.
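For illustration, here is a minimal Python sketch of two of those operations, enrichment and aggregation, applied to a small in-memory stream of purchase events. The data and catalog are invented for the example; a real deployment would use a stream processing engine rather than plain Python lists.

```python
from statistics import mean

# A toy stream of purchase events (in practice, an unbounded feed).
purchases = [
    {"user": "u1", "sku": "A-100", "amount": 19.99},
    {"user": "u2", "sku": "B-200", "amount": 5.49},
    {"user": "u1", "sku": "C-300", "amount": 42.00},
]

# Enrichment: join each event with a reference data source to add context.
catalog = {"A-100": "headphones", "B-200": "charging cable", "C-300": "keyboard"}

def enrich(event):
    return {**event, "product_name": catalog.get(event["sku"], "unknown")}

enriched = [enrich(e) for e in purchases]

# Aggregation: compute sums and means over the events seen so far.
total_revenue = sum(e["amount"] for e in enriched)
average_order = mean(e["amount"] for e in enriched)

print(f"Total revenue: {total_revenue:.2f}, average order value: {average_order:.2f}")
```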
Events are often tied to business operations or user navigation processes and typically trigger another action, process or series of events. Take online banking, as one example. When a user clicks “transfer” to send money from one bank account to another, the funds are withdrawn from the sender’s account and added to the recipient’s bank account, email or SMS notifications are sent to either (or both) parties, and if necessary, security and fraud prevention protocols are deployed.
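As a hedged sketch of that cascade (the event names, threshold and handler below are hypothetical, not a real banking API), a single “transfer” event can be translated into the downstream events it triggers:

```python
def handle_transfer(transfer: dict) -> list[dict]:
    """Derive the downstream events triggered by a single 'transfer' event."""
    events = [
        {"type": "funds_withdrawn", "account": transfer["from_account"], "amount": transfer["amount"]},
        {"type": "funds_deposited", "account": transfer["to_account"], "amount": transfer["amount"]},
        {"type": "notification_sent", "channel": "sms", "account": transfer["from_account"]},
    ]
    # A large transfer might also trigger a fraud-review event.
    if transfer["amount"] > 10_000:
        events.append({"type": "fraud_review_requested", "transfer_id": transfer["id"]})
    return events

downstream = handle_transfer(
    {"id": "t-42", "from_account": "acct-1", "to_account": "acct-2", "amount": 12_500}
)
for event in downstream:
    print(event["type"])
```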
In addition to events, EDAs rely on three primary components to move event data through the architecture.
In an EDA, event-driven applications act as producers or consumers (and sometimes both).
When an app or service performs an action that another app or service might want to know about, it publishes a new event—a record of that action or change—that another service can consume and process to perform other actions.
The event producer then transmits the event, in the form of a message, to a broker or another type of event router, which maintains the event’s chronological order relative to other events. An event consumer ingests the message, either in real time (as it occurs) or at a later point that suits it, and processes the message to trigger another action, workflow or event of its own.
In a simple example, a banking service might transmit a “deposit” event, which another bank service would consume and respond to by writing a deposit to the customer’s statement.
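A minimal sketch of that flow, using nothing more than an in-memory queue to stand in for the broker (the service names are illustrative), shows how the producer and consumer stay decoupled and communicate asynchronously:

```python
import queue
import threading

# A stand-in for an event broker: an ordered, thread-safe message queue.
broker = queue.Queue()

def banking_service():
    """Event producer: publishes a 'deposit' event without knowing who consumes it."""
    broker.put({"type": "deposit", "customer": "c-001", "amount": 250.00})

def statement_service():
    """Event consumer: reads events from the broker and updates the statement."""
    event = broker.get()
    print(f"Writing {event['type']} of {event['amount']} to statement for {event['customer']}")
    broker.task_done()

producer = threading.Thread(target=banking_service)
consumer = threading.Thread(target=statement_service)
consumer.start()   # the consumer waits for events to arrive
producer.start()   # the producer publishes on its own schedule
producer.join()
consumer.join()
```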
But event-driven integrations can also trigger real-time responses based on complex analyses of huge volumes of data, like when a customer clicks on a product on an e-commerce site and the system generates instant product recommendations based on other customers’ purchases.
Event-driven architecture maximizes the potential of cloud-native applications and enables powerful app technologies, like real-time analytics and decision support. Overall, EDAs replace the traditional “request/response” architecture, in which one app must request specific information from another app and wait for a reply before moving on to the next task.
However, “EDA” is an umbrella term that covers several architectural patterns, each useful for different purposes:
In the pub/sub model—defined by a one-to-many dependency between objects and a decoupled, asynchronous relationship between publisher (event producer) and consumer—the publisher does not need to know about the subscribers. It just publishes the event to a shared event channel where subscribers listen and react to the event independently, in real time.
Typically, a message broker (router) handles the transmission of event messages between publishers and subscribers. The broker receives each event message, transforms it (if necessary), maintains its order relative to other messages, makes it available to subscribers for consumption and then deletes it once consumed (so it can’t be consumed again).
Pub/sub messaging patterns are ideal for businesses with large codebases and for broadcasting information to multiple consumers (for notification systems and real-time data feeds, for instance).
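The following toy Python broker sketches that one-to-many relationship; the class, channel and handler names are illustrative, and a production system would use a dedicated message broker rather than in-process callbacks.

```python
from collections import defaultdict

class PubSubBroker:
    """A toy broker: delivers each event to every subscriber of a channel, then discards it."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, channel, handler):
        self.subscribers[channel].append(handler)

    def publish(self, channel, event):
        # Fan out to every subscriber; the broker keeps no copy afterwards.
        for handler in self.subscribers[channel]:
            handler(event)

broker = PubSubBroker()
broker.subscribe("new-release", lambda e: print(f"Email service: notify users about {e['title']}"))
broker.subscribe("new-release", lambda e: print(f"Home page service: feature {e['title']}"))

# The publisher knows nothing about the subscribers above.
broker.publish("new-release", {"title": "A new original series"})
```

The publisher calls publish without knowing how many subscribers exist; each subscriber reacts independently, and the broker retains nothing once the event has been delivered.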
Like pub/sub, event streaming decouples publishers and consumers to enable asynchronous communication. However, in the event streaming model, event consumers don’t need subscriptions to the streams; rather, producers publish streams of events to a broker log, and consumers can step into each stream at any point and consume only the events they want to consume (instead of receiving and consuming every published event).
Unlike pub/sub, however, event streaming brokers retain the events even after the consumers have received them.
Because consumers can process events at any time after they are published, event streaming records are persistent. This means that they are maintained for a configurable amount of time (anywhere from fractions of a second to forever). Consumers can access the stream at any time to read recent messages, batch process a series of messages from the last time they accessed the stream or reference relevant messages from the recent past.
Event streaming technologies (like Apache Kafka, Amazon Web Services (AWS) Kinesis and IBM Event Automation) also support two delivery models: the pull model (where the broker sends data only when consumers indicate they’re ready to receive events) and the push model (where the broker’s business logic dictates which consumers receive events).
Event streaming patterns are most useful for apps that need both real-time event updates and access to past events (fraud detection systems for financial institutions, for instance).
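As one hedged example, the sketch below uses the kafka-python client against a hypothetical “payments” topic on a broker assumed to be running at localhost:9092. It shows a consumer joining the stream with auto_offset_reset="earliest" so it can replay events that were published and retained before it connected.

```python
# pip install kafka-python
import json
from kafka import KafkaConsumer, KafkaProducer

# Producer appends events to the broker's retained log for the "payments" topic.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("payments", {"type": "deposit", "amount": 250.00})
producer.flush()

# Consumer can join at any time and read retained events from the earliest offset.
consumer = KafkaConsumer(
    "payments",
    bootstrap_servers="localhost:9092",
    group_id="fraud-detection",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    consumer_timeout_ms=5000,  # stop iterating if no new events arrive
)
for message in consumer:
    print(message.offset, message.value)
```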
In addition to the two primary EDA architectural patterns, three design patterns govern how events are processed once they reach the subscriber: simple event processing, complex event processing (CEP) and event stream processing (ESP).
All three processing patterns (among others) can be used within both publish/subscribe and event streaming architectural patterns, but ESP is (naturally) most common in the event streaming architectural pattern.
EDAs can be useful for businesses operating in myriad sectors, but they’re especially valuable for businesses with large, complex IT environments.
Enterprises attempting to integrate systems running in different tech stacks, for example, can use the decoupling features of event-driven architectures to keep event data system-agnostic and improve interoperability. Multinational corporations can use an EDA to coordinate systems across accounts and regions, facilitating independent scaling of different parts of the architecture. And for businesses running systems that each process different parts of an event, the fan-out features of EDAs can push events to each consumer—without needing new code—for parallel processing across the system.
In e-commerce, an event-driven architecture can filter and route events in real time to make sure they go only to subscribers interested in their data. New purchases go directly to order processing consumers, order issues are routed directly to customer service channels, and so forth.
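A simple content-based routing table illustrates the idea; the event types and consumer names below are invented for the example.

```python
# Route each event only to the consumers interested in its type (illustrative names).
routes = {
    "purchase_completed": ["order-processing"],
    "order_issue_reported": ["customer-service"],
    "item_viewed": ["recommendations", "analytics"],
}

def route(event):
    """Return the consumers that should receive this event, based on its type."""
    return routes.get(event["type"], [])

for event in [
    {"type": "purchase_completed", "order_id": "o-1"},
    {"type": "order_issue_reported", "order_id": "o-2"},
]:
    for consumer in route(event):
        print(f"Delivering {event['type']} to {consumer}")
```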
Event-driven architectures are becoming essential for keeping today’s businesses in step with the market and moving them into the future. In fact, 26% of organizations are planning to adopt EDAs to meet business needs, on top of the nearly 37% of companies that already have.1 And the EDA software industry is expected to double in size, to more than USD 16.5 billion in revenue, by 2027.2
Thousands of business events flow through every part of an organization each day, and these events provide a wealth of information about what’s happening across a business at any moment in time. However, without the proper technology, many businesses aren’t able to process and use this data to make informed decisions about their customers, products or business.
That’s where EDA platforms become invaluable, enabling high-throughput, low-latency process automation and providing advanced tools for a range of use cases, including:
Use an EDA to get real-time views of transactional data flowing across your business. Combine historical analyses with live spending patterns to develop more detailed profiles and quickly spot opportunities to engage with prospective customers.
Use event-driven architectures to monitor changes in stock levels across business channels in real time, so you can automate and optimize shipping volume based on which high-profit items or top-sellers are running low.
Evaluate real-time usage and activity patterns, along with historic trends, to detect new anomalies and issue suspicious activity alerts as soon as aberrations arise.
Better understand customer behavior by combining in-shop and online activity, and generate informed, real-time offers designed to increase customer spending.
Draw insights from real-time equipment and product data to detect risk factors and quality issues promptly, helping your facility anticipate and address potential malfunctions or breakdowns.
Detect real-time price fluctuations for the materials that impact your business’s bottom line. Keep costs low by quickly renegotiating to the best available price to maximize potential revenue.
Event-driven architectures put business events to work by enabling users to detect emerging situations, act in real time, automate decision-making and maximize revenue potential. EDA can also help enterprises sustain and accelerate growth, by delivering:
EDAs enable systems to scale up by adding more instances of services to handle increased workloads.
EDAs enable components to communicate asynchronously; producers publish event messages on their own schedule, without waiting for consumers to receive them (or even knowing if consumers received them), simplifying both integration and the user experience.
Services can be added, removed or modified independently, facilitating agile development and deployment practices.
EDAs decouple components in both time and synchronization: event producers and consumers interact through events (rather than direct API calls), which decreases dependencies and increases overall system resilience.
EDA is inherently designed for real-time processing and response, enabling teams to respond more proactively and facilitating smarter actions and automations.
IBM® Event Automation is a fully composable solution that enables businesses to accelerate their event-driven efforts, wherever they are on their journey. The event streams, event endpoint management and event processing capabilities help lay the foundation of an event-driven architecture for unlocking the value of events.
Build smart applications that can react to events as they happen. Handle mission-critical workloads through enhanced system connectivity, rich deployment and operations options, and event-driven architecture expertise.
By leveraging AI for real-time event processing, businesses can connect the dots between disparate events to detect and respond to new trends, threats and opportunities.
Organizations that become more event-driven are able to better differentiate themselves from competitors and ultimately impact their top and bottom lines.
Stream processing is at the core of real-time data. It allows your business to ingest continuous data streams as they happen and bring them to the forefront for analysis, enabling you to keep up with constant changes.
Learn how the AsyncAPI specification can describe and document Kafka topics.
Data management is the practice of ingesting, processing, securing and storing an organization’s data, where it is then utilized for strategic decision-making to improve business outcomes.
1 IDC TechBrief: Future of Industry Ecosystems — Event-Driven Architecture (link resides outside of ibm.com), IDC, 30 June 2023.
2 IDC PlanScape: Developing an EDA Strategy for the Future of Industry Ecosystems (link resides outside of ibm.com), IDC, 29 September 2023.