November 9, 2020 By Kazuaki Ishizaki 4 min read

Does AI Model Lifecycle Management matter?

Artificial intelligence (AI) is becoming ubiquitous in many areas, from the edge to the enterprise. So, how do you use AI? Do you just feed data to a predictor? The answer is “no.”

In fact, infusing AI into a business means collecting data, preparing it, training a model, deploying that model, and running the predictor. The pipeline is longer than one might expect, with several interconnected elements (shown in Figure 1, as described in the Google article “MLOps: Continuous delivery and automation pipelines in machine learning”):

Figure 1: Elements for machine learning systems.

In the enterprise, the critical role of AI demands a well-defined, robust methodology and platform; a business may even fail if they are not up to par. For example, if a fraud-detection model makes bad decisions, the business is directly harmed. Across the long AI pipeline, response time, quality, fairness, explainability, and other concerns must be managed as parts of the whole lifecycle; they cannot be managed in isolation.

Therefore, what we call “AI Model Lifecycle Management” manages this complicated AI pipeline and helps ensure the necessary results in the enterprise. We will detail AI Model Lifecycle Management in a series of blog entries, and we will show how IBM Cloud Pak® for Data can support it.

We expect these blog entries to be of interest to the following people:

  • Data science and AI leaders: To better understand how to increase returns on data science and AI investments.
  • Data scientists: To better appreciate how data science activities can leverage and integrate with DevOps tools and processes, and to more deeply understand IBM’s strategy for end-to-end AI Model Lifecycle Management.
  • DevOps engineers: To better understand the AI development process, its associated complexities, and how it can integrate with DevOps.

What is AI Model Lifecycle Management?

Let us think about what is necessary for AI Model Lifecycle Management. The first requirement is a set of components covering the whole pipeline. The document “The AI Ladder – Demystifying AI Challenges” explains how to introduce AI into the enterprise and clearly outlines four steps in the pipeline:

  • Collect: Make data simple and accessible.
  • Organize: Create a business-ready analytics foundation.
  • Analyze: Build and scale AI with trust and transparency.
  • Infuse: Operationalize AI throughout a business.

Figure 2 shows these four steps with their relationships and operational examples:

Figure 2: AI Model Lifecycle.
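As a concrete (toy) illustration, the four steps above can be thought of as stages of a single pipeline. The sketch below is plain Python with a made-up threshold “model”; the function names simply mirror the Collect/Organize/Analyze/Infuse steps and are not taken from any IBM product.

```python
# Illustrative sketch of the four AI Ladder steps as pipeline stages.
# The "model" here is a toy threshold classifier, not a real training procedure.

def collect():
    # Collect: make data simple and accessible (here, an in-memory sample).
    return [(0.2, 0), (0.4, 0), (0.6, 1), (0.9, 1), (None, 1)]

def organize(records):
    # Organize: create a business-ready foundation (drop incomplete rows).
    return [(x, y) for x, y in records if x is not None]

def analyze(dataset):
    # Analyze: "train" a model -- pick a threshold separating the classes.
    positives = [x for x, y in dataset if y == 1]
    negatives = [x for x, y in dataset if y == 0]
    threshold = (min(positives) + max(negatives)) / 2
    return lambda x: 1 if x >= threshold else 0

def infuse(model, x):
    # Infuse: operationalize the model by serving predictions.
    return model(x)

model = analyze(organize(collect()))
print(infuse(model, 0.8))
```

The point of the sketch is the shape, not the model: each stage consumes the previous stage's output, which is why the whole chain has to be managed as one lifecycle.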

Another requirement is data governance of the whole pipeline. Quality is essential in the enterprise, and explainability and fairness are growing increasingly important. Across the whole pipeline, data governance for AI Model Lifecycle Management should monitor quality, fairness, and explainability and feed the results back into earlier stages.
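As one illustration, such governance checks can be expressed as metrics computed over logged predictions. The sketch below is a toy example, not an IBM tool: the metric functions, the 0.8 thresholds (the fairness floor follows the common “four-fifths” rule of thumb), and the sample data are all assumptions for illustration.

```python
# Toy governance monitor: compute a quality metric and a fairness metric
# over logged predictions, and flag whichever one falls below its floor.

def accuracy(pairs):
    # pairs: (predicted_label, true_label)
    return sum(p == t for p, t in pairs) / len(pairs)

def disparate_impact(outcomes, groups):
    # Ratio of favorable-outcome rates between groups; 1.0 means parity.
    rate = {}
    for g in set(groups):
        selected = [o for o, gg in zip(outcomes, groups) if gg == g]
        rate[g] = sum(selected) / len(selected)
    lo, hi = min(rate.values()), max(rate.values())
    return lo / hi if hi else 1.0

preds  = [1, 0, 1, 1, 0, 1, 0, 1]
truth  = [1, 0, 1, 0, 0, 1, 0, 1]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]

# A failing check is the feedback signal: it should trigger retraining
# or a review of the data in the earlier pipeline stages.
print("quality  ok:", accuracy(list(zip(preds, truth))) >= 0.8)
print("fairness ok:", disparate_impact(preds, groups) >= 0.8)
```

On this sample data the quality check passes while the fairness check fails, which is exactly the kind of signal the governance layer should surface back to the team.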

How tools help AI Model Lifecycle Management

As we have seen, AI Model Lifecycle Management is not easy, and managing it by hand does not scale. The supporting tools should therefore have the following features to be effective in a cloud:

  • Ease of model training and deployment
  • Model deployment and training at scale
  • Monitoring data governance, quality, and compliance
  • Visualization of the whole pipeline
  • Rich connectors to data sources

One example of such a tool is IBM Cloud Pak for Data, a multicloud data and AI platform with end-to-end tools for enterprise-grade AI Model Lifecycle Management (often called ModelOps). It helps organizations improve the overall throughput of their data science activities and achieve faster time to value from their AI initiatives. Cloud Pak for Data includes the following key capabilities:

  • Model development and training tools, including AutoAI and no-code, drag-and-drop capabilities, plus support for a rich set of commonly used open source libraries and frameworks.
  • Model deployment tools to scale deployed models in production for modern apps and meet performance requirements.
  • Model monitoring and management tools to deliver trusted AI.
  • Data virtualization capabilities to significantly increase the AI throughput of data science teams by helping data scientists efficiently access the broad set of data sources of an enterprise across a hybrid multicloud environment, without having to copy data.
  • DataOps to meet data governance, quality, and compliance requirements.
  • Complete data services, with a rich set of data connectors and scalable multicloud data integration capabilities to enable efficient extract, transform, and load (ETL) operations from a variety of data sources.
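To make the last capability concrete, an extract, transform, load flow can be sketched as a small round trip. The example below uses only the Python standard library; the schema and in-memory “source” and “sink” are invented for illustration and have nothing to do with Cloud Pak for Data's actual connectors.

```python
# Illustrative ETL round trip: extract rows from a CSV source, normalize
# and filter them, and load the result into a target sink.
import csv
import io

def extract(source):
    # Extract: read rows from a CSV "connector" (here, an in-memory file).
    return list(csv.DictReader(source))

def transform(rows):
    # Transform: normalize types and drop records with missing amounts.
    return [
        {"id": int(r["id"]), "amount": float(r["amount"])}
        for r in rows
        if r["amount"] not in ("", None)
    ]

def load(rows, sink):
    # Load: write the cleaned rows to the target "warehouse".
    writer = csv.DictWriter(sink, fieldnames=["id", "amount"])
    writer.writeheader()
    writer.writerows(rows)

source = io.StringIO("id,amount\n1,9.50\n2,\n3,12.25\n")
sink = io.StringIO()
load(transform(extract(source)), sink)
print(sink.getvalue())
```

In a real deployment, the extract and load steps would be backed by the platform's data connectors rather than in-memory buffers; the three-stage shape is the same.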

Stay tuned for the next blog entries

Future blog entries will detail each phase of AI Model Lifecycle Management. This post will be updated with links as they become available. For a deeper dive into the subject, see our white paper.
