Machine Learning for IBM z/OS

Accelerate your business insights at scale with transactional AI on IBM z/OS


IBM Machine Learning for z/OS® (MLz), formerly IBM Watson® Machine Learning for z/OS, is a transactional artificial intelligence (AI) solution that runs natively on IBM Z®. It provides a web user interface (UI), various APIs and a web administration dashboard with a powerful suite of easy-to-use tools for model development and deployment, user management and system administration.

Easily import, deploy and monitor models to achieve value from every transaction and drive new outcomes for your enterprise while maintaining operational SLAs.

For greater flexibility, Machine Learning for z/OS is available in two editions:

  • IBM Machine Learning for IBM z/OS Enterprise Edition – a full-lifecycle, end-to-end AI platform with enterprise AI features such as native CICS® and IMS scoring interfaces, Python and Spark scoring services, ONNX and Deep Learning Compiler support, and trustworthy AI features such as explainability.
  • IBM Machine Learning for IBM z/OS Core Edition – a lightweight version of MLz providing essential REST API-based services for machine learning operations, including online scoring on IBM Z (illustrated in the sketch below).

Either edition of IBM Machine Learning for IBM z/OS can run as a stand-alone solution or be infused into your enterprise AI capability as a scalable platform.
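For orientation only, the sketch below shows what invoking a REST-based online scoring service can look like from a client application. The endpoint path, payload fields and authentication header are placeholders, not the documented MLz API; consult the product documentation for the actual interface.

```python
# Illustrative sketch only: endpoint, payload shape and auth are hypothetical.
import requests

SCORING_URL = "https://mlz.example.com/scoring/models/fraud-model/score"  # placeholder URL

# Hypothetical transaction features sent for online scoring.
payload = {"features": [[1250.00, 3, 0.7, 1]]}

resp = requests.post(
    SCORING_URL,
    json=payload,
    headers={"Authorization": "Bearer <token>"},  # placeholder credential
    timeout=5,
)
resp.raise_for_status()
print(resp.json())  # e.g. a score or probability returned by the deployed model
```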

Release 3.2: Machine Learning for IBM z/OS
Unlock the power of Trustworthy AI with MLz v3.2

Join us to discover cutting-edge features and real-world applications in our exclusive webinar session.

What's new

Visualized explanations of AI inferences can be natively accessed in MLz


Benefits

AI at speed

Use the unprecedented power of IBM z16™ and the Telum™ AIU with the Machine Learning for z/OS software solution to deliver transactional AI capability. Process up to 228,000 z/OS CICS credit card transactions per second with 6 ms response time, each with an in-transaction fraud detection inference operation that uses a deep learning model.1

AI at scale

Colocate applications with inferencing requests to help minimize delays caused by network latency. This delivers up to 20x lower response time and up to 19x higher throughput versus sending the same inferencing requests to an x86 cloud server with 60 ms average network latency.2

Trustworthy AI 

Use trustworthy AI capabilities such as explainability, and monitor your models in real time for drift, so you can develop and deploy transactional AI models on z/OS for mission-critical transactions and workloads with confidence.
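For intuition only, drift monitoring in general compares the live distribution of a scoring feature against its training-time baseline. The sketch below uses the population stability index (PSI), a common generic technique; it is not a description of MLz's internal implementation, and the threshold and data are illustrative.

```python
# Generic drift-detection illustration (not MLz internals): the population
# stability index (PSI) compares a feature's live distribution with its
# training baseline; values above roughly 0.2 are often treated as drift.
import numpy as np

def population_stability_index(baseline, live, bins=10):
    edges = np.histogram_bin_edges(baseline, bins=bins)
    edges[0], edges[-1] = -np.inf, np.inf  # catch live values outside the baseline range
    base_pct = np.histogram(baseline, bins=edges)[0] / len(baseline)
    live_pct = np.histogram(live, bins=edges)[0] / len(live)
    # Avoid log(0) for empty bins.
    base_pct = np.clip(base_pct, 1e-6, None)
    live_pct = np.clip(live_pct, 1e-6, None)
    return float(np.sum((live_pct - base_pct) * np.log(live_pct / base_pct)))

rng = np.random.default_rng(0)
baseline = rng.normal(loc=0.0, scale=1.0, size=10_000)  # training-time feature values
live = rng.normal(loc=0.4, scale=1.0, size=10_000)       # shifted production values
print(population_stability_index(baseline, live))        # noticeably above 0 => drift
```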

Compare editions
Enterprise Edition – an enhanced edition that delivers improved scoring performance, new versions of the Spark and Python machine learning runtimes, a GUI-guided configuration tool and more.

Core Edition – a lightweight version of MLz providing essential REST API-based services for machine learning operations, including online scoring on IBM Z.

  • Configuration experience: Enterprise Edition – guided UI; Core Edition – scripts or z/OSMF workflow
  • Repository database: both editions – Db2® for IBM z/OS, or built-in (Derby for z/OS)
  • Scoring engine: Enterprise Edition – Spark, Python, PMML, IBM Snap ML, Watson Core Time Series; Core Edition – Spark, PMML, IBM Snap ML, Watson Core Time Series
  • Inference interface: Enterprise Edition – in-transaction scoring with native interfaces for CICS and IMS, plus a RESTful interface; Core Edition – RESTful interface
  • Model lifecycle management: Enterprise Edition – guided UI and RESTful services; Core Edition – RESTful services
  • Supported AI model formats: Enterprise Edition – Spark, Python, PMML, ONNX; Core Edition – Spark, PMML
  • z16 on-chip AI acceleration: Enterprise Edition – ONNX and IBM Snap ML models; Core Edition – IBM Snap ML models
  • AI model training tool: Enterprise Edition – integrated JupyterHub
  • Trustworthy AI: Enterprise Edition – explainability and drift detection

Try it at no cost
IBM Machine Learning for z/OS Online Scoring Community Edition

Try this lightweight, no-cost option to experience IBM Machine Learning for z/OS; it enables in-transaction scoring for deep learning models. This capability can deliver significant AI value in critical business areas such as fraud detection, customer churn, loan approval and operational performance. Embed deep learning models in your transactional applications on IBM Z, particularly when milliseconds matter.
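As background, deep learning models are commonly exported to the ONNX format before being imported into an ONNX-capable scoring engine. The sketch below uses PyTorch's standard torch.onnx.export API with a hypothetical fraud-scoring network; the feature count is made up, and the MLz-specific import and deployment steps are not shown here.

```python
# Illustrative only: export a small, hypothetical fraud-scoring network to ONNX
# so it can be imported into an ONNX-capable scoring engine.
import torch
import torch.nn as nn

class FraudScorer(nn.Module):
    def __init__(self, n_features: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 32),
            nn.ReLU(),
            nn.Linear(32, 1),
            nn.Sigmoid(),  # probability that the transaction is fraudulent
        )

    def forward(self, x):
        return self.net(x)

model = FraudScorer()
model.eval()

# A dummy batch defines the input shape recorded in the ONNX graph.
dummy = torch.randn(1, 16)
torch.onnx.export(
    model,
    dummy,
    "fraud_scorer.onnx",
    input_names=["transaction_features"],
    output_names=["fraud_probability"],
    dynamic_axes={"transaction_features": {0: "batch"}},
    opset_version=13,
)
```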

Technical details

Machine Learning for z/OS uses both IBM proprietary and open source technologies and has the following hardware and software prerequisites:

  • z16™, z15®, z14, z13® or zEnterprise® EC12 system
  • z/OS 3.1, 2.5 or 2.4
  • IBM 64-bit SDK for z/OS Java™ Technology Edition version 8 SR7, 11.0.17 or later
  • IBM WebSphere Application Server for z/OS Liberty version 22.0.0.9 or later
  • Db2 12 for z/OS or later, required only if you choose Db2 for z/OS as the repository metadata database
Enterprise Edition prerequisites | Core Edition prerequisites
Related products

IBM Z Anomaly Analytics

Identify operational issues and avoid costly incidents by detecting anomalies in both log and metric data.

Python AI Toolkit for IBM z/OS

Access a library of relevant open source software to support today's AI and ML workloads.

IBM z/OS

Leverage a security-rich and scalable operating system for running mission-critical applications.

IBM Db2 13 for z/OS

Enhance availability, security and resiliency while improving performance and business results.

IBM Db2 Analytics Accelerator for z/OS

Get high-speed data analysis for real-time insight under the control and security of IBM Z.

IBM Db2 AI for z/OS

Learn how AI enhances usability, improves operational performance and maintains the health of IBM Db2 systems.

Next steps

Explore Machine Learning for IBM z/OS. Schedule a no-cost 30-minute meeting with an IBM Z representative.

Try it at no cost
Footnotes

1. DISCLAIMER: The performance result is extrapolated from IBM internal tests running a CICS credit card transaction workload with inference operations on an IBM z16. A z/OS V2R4 LPAR configured with 6 CPs and 256 GB of memory was used. Inferencing was done with Machine Learning for z/OS 2.4 running on WebSphere Application Server Liberty 21.0.0.12, using a synthetic credit card fraud detection model (https://github.com/IBM/ai-on-z-fraud-detection) and the Integrated Accelerator for AI. Server-side batching was enabled on Machine Learning for z/OS with a size of 8 inference operations. The benchmark was run with 48 threads performing inference operations. Results represent a fully configured IBM z16 with 200 CPs and 40 TB storage. Results might vary.

2. DISCLAIMER: Performance results are based on an IBM internal CICS OLTP credit card workload with in-transaction fraud detection running on IBM z16. Measurements were done with and without the Integrated Accelerator for AI. A z/OS V2R4 LPAR configured with 12 CPs, 24 zIIPs and 256 GB of memory was used. Inferencing was done with Machine Learning for z/OS 2.4 running on WebSphere Application Server Liberty 21.0.0.12, using a synthetic credit card fraud detection model (https://github.com/IBM/ai-on-z-fraud-detection). Server-side batching was enabled on Machine Learning for z/OS with a size of 8 inference operations. Results might vary.
DISCLAIMER: Performance results are based on an IBM internal CICS OLTP credit card workload with in-transaction fraud detection running on IBM z16. Measurements were done with and without the Integrated Accelerator for AI. A z/OS V2R4 LPAR configured with 12 CPs, 24 zIIPs and 256 GB of memory was used. Inferencing was done with Machine Learning for z/OS 2.4 running on WebSphere Application Server Liberty 21.0.0.12, using a synthetic credit card fraud detection model (https://github.com/IBM/ai-on-z-fraud-detection). Server-side batching was enabled on Machine Learning for z/OS with a size of 8 inference operations. Results might vary.