Deploying Maximo AI Service
You can use Maximo® AI Service as SaaS or on-premises. To deploy Maximo AI Service on-premises, you initiate the deployment by using the CLI or Ansible® collection, connect Maximo AI Service to Maximo Manage by using system properties, and then, if required, import the Maximo AI Service certificate.
Before you begin
Before you begin, review the entire process for enabling AI features. For more information, see Maximo AI Service and AI features in Maximo Manage.
- Install or set up an account for watsonx™.ai. You can use watsonx.ai on-premises or SaaS. You can use an existing instance or install or sign up for a new instance.
  - For more information about installing watsonx.ai on-premises, see Installing watsonx.ai on-premises for Maximo AI Service. The cluster that watsonx.ai is installed in requires a GPU.
  - For watsonx.ai SaaS, no installation is required, but you must set up an account for watsonx.ai. For more information, see Step 1: Sign up for watsonx. You must complete only step 1.
- At a minimum, for a single user and tenant, you must have three primary nodes that have eight CPUs with 32 GB memory each and six secondary nodes that have four CPUs with 32 GB memory each. Additional users, tenants, and workloads require more resources.
- Red Hat® OpenShift® 4.16 or later
- Suite License Service. You can use an existing instance or install a new instance as part of the deployment for Maximo AI Service.
- IBM® Data Reporter Operator. You can use an existing instance or install a new instance as part of the deployment for Maximo AI Service.
- One of the following object storage providers:
  - MinIO. You can use an existing instance or install a new instance as part of the deployment for Maximo AI Service. Install MinIO in the same cluster as Maximo AI Service.
  - Amazon Web Services S3. The buckets that are used for Maximo AI Service must have unique names, and all typical dependencies of Amazon Web Services S3 must be deployed.
- IBM Db2®. You can use an existing instance or install a new instance as part of the deployment for Maximo AI Service.
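The minimum node sizing in the prerequisites can be sanity-checked with quick arithmetic; this sketch only totals the figures stated above (single user and tenant) and is not a sizing tool:

```python
# Totals the minimum cluster capacity stated in the prerequisites:
# 3 primary nodes (8 CPUs, 32 GB each) + 6 secondary nodes (4 CPUs, 32 GB each).
primary = {"count": 3, "cpu": 8, "mem_gb": 32}
secondary = {"count": 6, "cpu": 4, "mem_gb": 32}

total_cpu = primary["count"] * primary["cpu"] + secondary["count"] * secondary["cpu"]
total_mem_gb = (
    primary["count"] * primary["mem_gb"] + secondary["count"] * secondary["mem_gb"]
)

print(f"Minimum capacity: {total_cpu} CPUs, {total_mem_gb} GB memory")
# Minimum capacity: 48 CPUs, 288 GB memory
```

Additional users, tenants, and workloads require capacity beyond this baseline.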
About this task
As part of the deployment, you use system properties to establish a connection from Maximo Manage to Maximo AI Service. This connection enables Maximo Manage to communicate with the services or runtimes that are hosted within the Maximo AI Service cluster.
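In Maximo Manage, system properties are name/value pairs, so the connection amounts to storing the AI service endpoint and credential under the expected property names. The property names and values below are placeholders for illustration only; use the property names documented for your release:

```python
# Hypothetical sketch: the property names and values here are placeholders,
# not the actual Maximo Manage system property names.
from urllib.parse import urlparse

ai_service_properties = {
    "mxe.aiservice.url": "https://aiservice.example.com",  # placeholder endpoint
    "mxe.aiservice.apikey": "<api-key>",                   # placeholder credential
}

# A basic check you might run before saving: the endpoint must be HTTPS,
# because the connection may also require importing the service certificate.
endpoint = urlparse(ai_service_properties["mxe.aiservice.url"])
assert endpoint.scheme == "https", "AI service endpoint should use HTTPS"
print("endpoint host:", endpoint.hostname)
```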
The following steps describe how to deploy Maximo AI Service on-premises. For more information about Maximo AI Service SaaS, see Enabling AI features by using Maximo AI Service SaaS.
Procedure
What to do next
If Maximo AI Service is available and running, you can start setting up your AI configurations. Each configuration represents a feature that you want to enable, for example, problem code recommendations or the AI assistant.
If another status or an error is displayed, you can access more details in the Red Hat OpenShift logs. For more information about troubleshooting, see Troubleshooting Maximo AI Service and AI features.
| Feature | Model template | Model | Steps |
|---|---|---|---|
| Problem code recommendations for work orders | pcc | Granite 3.0 8B Instruct | Enabling recommended problem codes for Work orders |
| Field value recommendations | mcc | Granite 3.2 8B Instruct | Enabling field value recommendations |
| AI assistant | nl2oslc | Granite 3.2 8B Instruct | Enabling the assistant |
| Locating similar work orders | similarity | Granite 3.0 8B Instruct | Enabling locating of similar work orders |
| AI recommendations for asset boundary and failure list in Reliability Strategies | fmea | Granite 3.2 8B Instruct | Enabling AI recommendations in Reliability Strategies |
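If you script your configuration checks, the feature-to-model mapping in the table can be captured as a small lookup keyed by model template. This is only a restatement of the table for convenience, not an API of Maximo AI Service:

```python
# Feature-to-model mapping, restated from the configuration table.
# Keys are model templates; values are (feature, model) pairs.
MODEL_TEMPLATES = {
    "pcc": ("Problem code recommendations for work orders",
            "Granite 3.0 8B Instruct"),
    "mcc": ("Field value recommendations",
            "Granite 3.2 8B Instruct"),
    "nl2oslc": ("AI assistant",
                "Granite 3.2 8B Instruct"),
    "similarity": ("Locating similar work orders",
                   "Granite 3.0 8B Instruct"),
    "fmea": ("AI recommendations for asset boundary and failure list "
             "in Reliability Strategies",
             "Granite 3.2 8B Instruct"),
}

feature, model = MODEL_TEMPLATES["pcc"]
print(f"{feature} -> {model}")
# Problem code recommendations for work orders -> Granite 3.0 8B Instruct
```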