Today, we are excited to announce metric streaming, a new capability that lets you push metrics from IBM Cloud Monitoring to a Kafka service such as IBM Cloud Event Streams.

IBM Cloud Monitoring is a cloud-native, container-intelligence management system that you can include as part of your IBM Cloud architecture to gain operational visibility into the performance and health of your applications, services and platforms. With this feature, you can export a defined set of metrics to the external tool of your choice for deeper analysis through easy integration with the IBM Cloud Event Streams service. Additionally, if you need to store metrics longer than the default 15-month retention period, you can use metric streaming to export data to external storage for historical purposes.

How do I start using metric streaming?

All IBM Cloud Monitoring instances currently have the metric streaming functionality enabled. To configure metric data streaming for your instance, open Settings as an Administrator and navigate to the Metric Data Streaming section.

To start using metric streaming, you will first need to provision a Kafka service instance, including an available topic and corresponding authentication information that will be used to configure the new integration. 

In the Metric Data Streaming tab, click Add Integration to create a new configuration.

Once you have defined this information, use the Test Connection button to verify the connection between IBM Cloud Monitoring and your Kafka endpoint. 

Additionally, you will need to decide which metrics to export by specifying a scope in PromQL format; you can also choose to include all metrics and labels. Data granularity is available at either 10 seconds or one minute.
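For example, a scope might look like the following. This is purely a hypothetical sketch: the metric name and the `kube_cluster_name` label value are placeholders you would replace with metrics and labels from your own environment.

```
sysdig_container_cpu_cores_used{kube_cluster_name="production"}
```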

Integration with IBM Event Streams

First, create a new instance of the IBM Event Streams service, preferably in the same region as the IBM Cloud Monitoring (Sysdig) instance from which you are exporting data.

Next, create a new topic for the data export with your desired partition and data retention configuration.
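If you prefer to script this step rather than use the Event Streams console, the topic can also be created with a standard Kafka admin client. The following is a minimal sketch using the confluent-kafka Python package; the broker address, credentials, topic name, partition count and retention value are all placeholders to replace with your own.

```python
from confluent_kafka.admin import AdminClient, NewTopic

# Connection details come from your Event Streams service credential (placeholders here)
admin = AdminClient({
    "bootstrap.servers": "broker-0.example.eventstreams.cloud.ibm.com:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "token",
    "sasl.password": "<api-key>",
})

# One partition and 24-hour retention, purely as an example configuration
topic = NewTopic("monitoring-metrics", num_partitions=1, config={"retention.ms": "86400000"})

# create_topics() returns a dict of futures keyed by topic name
futures = admin.create_topics([topic])
futures["monitoring-metrics"].result()  # raises an exception if topic creation failed
print("Topic created")
```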

Once your topic has been created, access Service credentials and create one with the Writer role (the minimum requirement).

From the newly created service credential, extract the following fields for your configuration (a short parsing sketch follows the list): 

  • The Kafka Brokers to include as “Brokers” in the configuration
  • The username and password that your IBM Cloud Monitoring instance will use for authentication
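As an illustration, the snippet below pulls those values out of a downloaded service-credential JSON file. The file name and the exact field names (`kafka_brokers_sasl`, `user`, `password`) reflect a typical Event Streams credential but are assumptions here and may differ for your instance.

```python
import json

# Load the service credential you created for the topic (file name is a placeholder)
with open("event-streams-credential.json") as f:
    cred = json.load(f)

# Broker list goes into the "Brokers" field of the metric streaming configuration
brokers = ",".join(cred["kafka_brokers_sasl"])

# Username and password used by IBM Cloud Monitoring to authenticate against Kafka
username = cred["user"]      # typically "token"
password = cred["password"]  # typically the API key

print("Brokers:", brokers)
print("Username:", username)
```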

Once the configuration has been created in your IBM Cloud Monitoring instance, you will see metrics flowing to your Kafka topic.
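To double-check that data is arriving, you can read a few records straight from the topic. The sketch below uses the confluent-kafka Python consumer; the broker address, credentials, topic name and consumer group are placeholders, reusing the values from your service credential.

```python
from confluent_kafka import Consumer

# Placeholder connection details; reuse the brokers and credentials from your service credential
consumer = Consumer({
    "bootstrap.servers": "broker-0.example.eventstreams.cloud.ibm.com:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "token",
    "sasl.password": "<api-key>",
    "group.id": "metric-streaming-check",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["monitoring-metrics"])

# Poll for a handful of messages to confirm metrics are flowing
for _ in range(10):
    msg = consumer.poll(5.0)
    if msg is None or msg.error():
        continue
    print(msg.value().decode("utf-8"))

consumer.close()
```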

Learn more

IBM Cloud Monitoring is powered by Sysdig Monitor, which leverages system calls enriched with cloud and Kubernetes context, along with Prometheus metrics, to help you resolve issues faster. It’s this robust data that gives you maximum visibility to ensure application availability, performance and fast problem resolution. With extensive out-of-the-box dashboards, easy-to-use alerts and access to notification channels, you can start quickly and scale simply to get more done. However, if you already have a tool of choice that you love, you can still take advantage of the data that Sysdig Monitor provides via the newly released metric streaming support, and then slice, dice and analyze it wherever you want.

Visit our docs for additional information on how to get started with the service, and feel free to contact me directly with any specific questions or concerns you may have.
