Governing assets in AI use cases

Create an AI use case to track and govern AI assets from request through production. Factsheets capture details about the asset for each stage of the AI lifecycle to help you meet governance and compliance goals.

Service requirement: The required service is not available by default. An administrator must install the watsonx.governance service or the AI Factsheets service on the IBM Cloud Pak for Data platform. To determine whether the service is installed, open the Services catalog and check whether the service is enabled.

Governance details for tracked assets are collected in an AI use case and stored in an inventory. A typical flow to request and track an asset is as follows:

  1. A business user (the model owner) identifies a need for an AI solution and creates an AI use case to request a new model or prompt template. If the Governance console is enabled and configured, the use case is created in the Governance console and its facts are synced with AI Factsheets.
  2. When the request is saved, the AI use case is created in the inventory and tracking begins. Initially, the use case is in the Draft state because no assets accompany the request yet. Optionally, you can implement an approval process. A manual process can be as simple as sending a link to an approver and getting sign-off by email before passing the use case to a data scientist or prompt engineer. For a more structured sign-off, add an attachment group to the use case, which provides a placeholder for uploading a formal approval document as part of your governance plan.
  3. When the AI use case is fully defined and approved, the owner moves it to the Awaiting development status and sends the link to the assigned data scientist or prompt engineer.
  4. When the assigned developer creates a model or prompt template for the use case, they associate the asset with an approach in the use case. An approach is one path for solving the problem in the use case. A use case can include multiple approaches.
  5. Once tracking is enabled, details for tracked assets are captured in factsheets and stored in the use case. For example, the factsheet for a machine learning model includes training data and creation details. The details for a prompt template include prompt parameters and prompt variables as well as information about the foundation model used, such as name and publisher.
  6. As the asset advances in the lifecycle, the AI use case and the factsheets capture all updates, including deployments and evaluation results. For example, when a prompt template is tracked in the use case, the factsheet records all of the basic creation data. When a validator evaluates the prompt with test data, the associated facts from the validation are added to the factsheet. As the prompt template is pushed to production and configured for monitoring, all associated lifecycle facts are added to the factsheet, creating a complete record.
  7. The use case owner continues to update the status to reflect the current state in the lifecycle.
  8. Validators and other stakeholders can review AI use cases to ensure compliance with corporate protocols and to view and certify model progress from development to production. Stakeholders can generate reports from the use case to archive details or to satisfy compliance requirements.
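The flow above can be pictured as a small state machine plus a factsheet that accumulates facts at each stage. The sketch below is purely illustrative: the Draft and Awaiting development statuses and the kinds of facts recorded come from this workflow, but the `AIUseCase` and `Factsheet` classes, their fields and methods, and the remaining status names are hypothetical and are not part of the watsonx.governance or AI Factsheets APIs.

```python
from dataclasses import dataclass, field
from enum import Enum

class Status(Enum):
    """Use-case statuses; only Draft and Awaiting development appear in the
    workflow above, the rest are illustrative assumptions."""
    DRAFT = "Draft"
    AWAITING_DEVELOPMENT = "Awaiting development"
    IN_PRODUCTION = "In production"

@dataclass
class Factsheet:
    """Accumulates facts for one tracked asset across lifecycle stages."""
    asset_name: str
    facts: dict = field(default_factory=dict)

    def record(self, stage: str, details: dict) -> None:
        # Each stage (creation, evaluation, deployment) adds its facts.
        self.facts.setdefault(stage, {}).update(details)

@dataclass
class AIUseCase:
    """A governed AI use case holding approaches and their tracked assets."""
    name: str
    status: Status = Status.DRAFT
    approaches: dict = field(default_factory=dict)  # approach -> factsheets

    def add_asset(self, approach: str, sheet: Factsheet) -> None:
        # Associating an asset with an approach starts tracking it.
        self.approaches.setdefault(approach, []).append(sheet)

    def advance(self, status: Status) -> None:
        # The use-case owner updates the status as the lifecycle progresses.
        self.status = status

# Walk the flow: request -> development -> evaluation -> production.
use_case = AIUseCase("Claims triage assistant")
use_case.advance(Status.AWAITING_DEVELOPMENT)

sheet = Factsheet("claims-prompt-template")
sheet.record("creation", {"foundation_model": "example-llm",
                          "publisher": "ExampleCo"})
use_case.add_asset("Prompt-based approach", sheet)

sheet.record("evaluation", {"test_data": "claims-validation-set"})
sheet.record("deployment", {"environment": "production", "monitoring": True})
use_case.advance(Status.IN_PRODUCTION)

print(use_case.status.value)   # In production
print(sorted(sheet.facts))     # ['creation', 'deployment', 'evaluation']
```

The key design point mirrored here is that the factsheet is append-only per stage: creation, evaluation, and deployment facts accumulate on the same record, producing the complete lifecycle history described in step 6.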

Get started with AI use cases

Set up or work with AI use cases:

Parent topic: Governing AI assets