Known limitations

Before you use IBM Cloud Pak® for Business Automation, make sure that you are aware of the known limitations.

For the most up-to-date information, see the support page Cloud Pak for Business Automation Known Limitations, which is regularly updated.

The following sections provide the known limitations by Cloud Pak capability.

Image tags cannot start with a zero

If you use image tags in the custom resources to identify different versions of Cloud Pak for Business Automation container images, do not start the tag with a "0" (zero). The "0" is removed from the tag by the operators and the image cannot be pulled as a result. Image tags can include lowercase and uppercase letters, digits, underscores ( _ ), periods ( . ), and dashes ( - ). For more information, see Digests versus image tags.
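The tag rules above can be expressed as a short validation check. This is an illustrative sketch only (the is_valid_image_tag helper is hypothetical, not part of the product); it also assumes a tag does not start with a period or a dash:

```python
import re

# Allowed characters: letters, digits, underscores, periods, and dashes.
# The first character must not be a zero (the operators strip it); this
# sketch assumes the tag starts with a letter, nonzero digit, or underscore.
TAG_PATTERN = re.compile(r"^[A-Za-z1-9_][A-Za-z0-9_.-]*$")

def is_valid_image_tag(tag: str) -> bool:
    """Return True if the tag is safe to use in the custom resource."""
    return bool(TAG_PATTERN.match(tag))
```

For example, is_valid_image_tag("23.0.1") returns True, while is_valid_image_tag("0.1.2") returns False because the tag starts with a zero.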

Okta and Azure AD integration

Table 1. Limitations of SSO third-party IdPs
Limitation Description
Automation Document Processing: Not supported.
Business Automation Workflow: Case Management solutions and applications are not supported.
Business Automation Workflow Content: Processes cannot be triggered if documents are added from the Administrative Console for Content Engine (ACCE). Note that processes can be launched when documents are added from the Navigator BAW desktop.
Business Automation Workflow External Services
  • External Services for REST Services and Web Services cannot be created.
  • External Workflow cannot be created.
  • Process applications that are published by Open API cannot be consumed by an Automation Service.

Multi-zone region (MZR) storage support on Red Hat OpenShift Kubernetes Service (ROKS)

Table 2. Limitations of Portworx on ROKS
Limitation Description
When one zone is unavailable, recovery might take up to a minute. If all worker nodes in a single zone are shut down or unavailable, it can take up to a minute to access the Cloud Pak applications and services. For example, ACCE accessed from CPE can take a minute to respond.

Connection issues of Identity Management (IM) to LDAP(s)

Table 3. Limitations of Identity Management (IM) foundational service
Limitation Description
IM does not update LDAP certificates automatically. When the CP4BA operator configures an LDAP connection to IM, the certificates are added to the platform-auth-ldaps-ca-cert secret.
Problem

If the LDAP certificates are changed or expire, IM cannot refresh them automatically, which results in SSL errors in the platform-auth-service-** pod.

CWPKI0823E: SSL HANDSHAKE FAILURE: The signer might need to be added to local trust store 
[/opt/ibm/wlp/output/defaultServer/resources/security/key.jks], located in SSL configuration alias [defaultSSLConfig]. 
The extended error message from the SSL handshake exception is: [PKIX path building failed: 
sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target].
Workaround

Manually import the new or changed LDAP certificates to IM. For more information, see Configuring LDAP over SSL.

LDAP failover

LDAP failover is not supported. It is not possible to configure multiple LDAP servers or LDAP failover in the custom resource (CR) file template.

IBM Automation Document Processing

Table 4. Limitations of Document Processing
Limitation Description
Issues with versioning when importing a project in Document Processing Designer with the merge or overwrite options.

Importing a project with the overwrite option in Document Processing Designer only supports importing projects from the previous version. For example, in 23.0.1, you must only import projects that were exported from version 22.0.2.

If you have exported projects from older releases, you must periodically update the archive files with the latest release:
  1. Import your archive from an older release into the latest release, with the overwrite options.
  2. Reexport the project as a new archive file. For example, an archive file that you import into 23.0.1 must have been exported from 22.0.2. If you initially exported from 22.0.1, first import the archive into 22.0.2 with the overwrite option, export it again from 22.0.2, and then import that new archive file into 23.0.1.
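The release-by-release upgrade path described above can be sketched as a small helper that lists the hops an old archive must make, one release at a time (the release list and the migration_hops function are illustrative only, not part of the product):

```python
RELEASES = ["21.0.3", "22.0.1", "22.0.2", "23.0.1"]  # illustrative ordering

def migration_hops(exported_from, target, releases=RELEASES):
    """Return the releases that the archive must be imported into (with
    the overwrite option) and re-exported from, in order."""
    start = releases.index(exported_from)
    end = releases.index(target)
    if end <= start:
        raise ValueError("target must be a later release than the export")
    return releases[start + 1:end + 1]
```

For example, an archive exported from 22.0.1 with a 23.0.1 target must pass through 22.0.2 first: migration_hops("22.0.1", "23.0.1") returns ["22.0.2", "23.0.1"].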
Merging projects from different releases is not recommended. When you import a project with the merge option across releases, you must first migrate the older release archive file to the latest release:
  1. From the latest release, import your older version archive file with the overwrite option. All document classes are imported and your project contains all new system types from the latest release.
  2. After you successfully import, export the project again.
  3. Import the archive file that you just created into your latest release project.

When you import a project with the merge option, the current project is merged with the project from the exported archive file. Only nonconflicting document types are imported into the project, and if there are conflicting document types, they are skipped during import. If you need to import a document type that is in conflict, you must first delete that document type from the project before importing the archive file.

After you import a project, retrain both the classification and extraction models to avoid issues caused by document class changes in the merged project, and to make sure that the models are trained and built with the latest features.
An egress is needed to use webhooks with external custom applications. If an external custom application uses the webhook feature, you must set up a custom egress for the Document Processing engine so that notifications can be sent outside of the Red Hat OpenShift Container Platform cluster where the engine is deployed. For more information, see Creating an external egress for Document Processing engine when an external application uses webhooks.
In a starter deployment, simultaneous uploading of multiple large batches might fail with an Uploading Error status.
Workaround

You must view the batch contents, remove the failed documents, and upload these documents again.

In some situations, the documents continue to be processed by the server and might transition out of the Uploading Error status. When a batch has the Uploading Error status, but none of the documents are in error, you can recover the batch by either adding or removing a document in the batch. The batch status is updated and processing can resume.

Accessibility of the Verify client graphical user interface.

A user who uses the Firefox browser to reach the Verify client user interface and then tabs into it to access the various zones cannot tab out again.

You reach the Verify client user interface in different ways, depending on your application. For example, for the single document application, you tab into the content list on the start page, select a document from the list by pressing the Tab or Arrow keys, tab to the context menu icon (the three dots), select the icon by pressing the space bar or the Enter key, and finally press the Arrow keys to select Review document.

Data standardization: uniqueness You cannot reuse an existing data definition for composite fields, nor create a data definition with a name that is already used for another data definition.

When standardizing your data, you can associate a data definition with a field or a document type. These data definitions are used when deploying the project to a runtime environment. For simple fields, you can either create a data definition or reuse an existing one. For composite fields, you cannot reuse an existing data definition; you can only create a new one.

If you attempt to create a data definition with the same name as an existing one, you get a uniqueness error.

Deleting and re-creating a project
If you want to delete and re-create a Document Processing project to start over, you might encounter errors after re-creating the project. This occurs because the re-created project is out of sync with the Git repository.
Workaround

To avoid errors, follow these steps to delete and re-create a project:

  1. In Business Automation Studio, delete your project.
  2. Go to the remote Git server that is connected to your Document Processing Designer and delete the project repository for the project that you deleted in step 1.
  3. In Business Automation Studio, create your project again with the same name as the previous project.

For more information, see Saving an ADP Project fails with status code 500 service error.

Fit to height option does not fit to height properly. When you view a document in a single or batch document processing application, if you rotate the document clockwise or counterclockwise and select the Fit to height option, the size does not change. The limitation applies when fixing classification or data extraction issues, and in portrait view in the modal viewer dialog.
The ViewONE service is not load-balanced. Because of a limitation when editing documents to fix classification issues in batch document processing applications, the icp4adeploy-viewone-svc service session affinity is configured as ClientIP, and sessions must be sticky.
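For reference, ClientIP session affinity on a Kubernetes Service is expressed as follows. This is a generic sketch of the standard Kubernetes Service schema; the actual icp4adeploy-viewone-svc service is managed by the operator and is not meant to be edited by hand:

```yaml
apiVersion: v1
kind: Service
metadata:
  name: icp4adeploy-viewone-svc   # managed by the operator
spec:
  sessionAffinity: ClientIP       # requests from one client IP stay on one pod
  sessionAffinityConfig:
    clientIP:
      timeoutSeconds: 10800       # Kubernetes default sticky-session timeout
```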
Postal mail address and Address information field types
  • Extraction of multiple addresses within a single block is not supported. You cannot define your own composite address field types because you cannot define multiple postal mail address subfields for a single address block.
  • You cannot upgrade previous address field types (such as address block or US mail address) to the new Postal mail address and Address information field types. Address field types that you defined in earlier versions remain the same.
    Workaround
    If you want to use the new address functionality, you must deploy a new key class based on the new address field types (Postal mail address and Address information).
Microsoft Word documents
  • In a FIPS-enabled deployment, you cannot upload or process Microsoft Word documents that have the .doc or .docx format.
  • If you observe that the viewer hangs when a multipage Microsoft Word document is loaded on the classification or data extraction issue page, check the log in your browser's development tools console. An error message that is similar to the following one means that the connection between the viewone service and the Application Engine expired.

    Strict-Transport-Security: The connection to the site is untrustworthy, so the specified header was ignored. v1files x1~1> JS Thread 6 Jun 2022, 08:41:52, UTC-7 (000005615/000002156): finishLoading viewone_ajax-0.js:4856:26 java.lang.ArrayIndexOutOfBoundsException: Invalid page number -1 viewone_ajax-0.js:4856:26

    Workaround
    Refresh the page and reload the document.
  • Data highlighting and Click N Key coordinates for DOC and DOCX documents might not capture highlighted text correctly when fixing data extraction issues in the runtime application.
    Workaround
    Use the keyboard to correct any values that encounter this issue.
Support of NVIDIA CUDA drivers 11.2: To use a FIPS-compliant TensorFlow version, NVIDIA CUDA drivers 11.2 are required. However, IBM Cloud® Public (ROKS) GPU does not support CUDA 11.2 because it uses Red Hat® Enterprise Linux® (RHEL) 7. The current version of the NVIDIA Operator on RHEL 7 is 1.5.2. It cannot be upgraded to the latest version (1.6.x), which is needed for the CUDA 11.2 drivers, because 1.6.x does not support RHEL 7. The NVIDIA Operator that is installed from the Operator Hub does not run on RHEL 7 GPU Bare Metal servers because only the 1.6.x version is available to deploy.
Data extraction from tables is not fully supported.
  • While data extraction from simple tables is fully supported, some limitations exist for complex tables, for example when you extract data from the summary section of tables, or if some watermarks, text, or other interfering elements exist in your documents. Complex tables are not fully supported. For the full list of supported tables, and examples of the types of tables that contain limitations, see Best practices on table extraction.
Some checkboxes are not detected. Some types of checkboxes are not detected, for example if they are too small or improperly shaped. For more information and examples of non-detected checkboxes, see Limitations for checkbox detection.
Problems accessing the application configurator after the initial configuration
Problem

When you create an application in Business Automation Studio, an application configurator displays where you enter configuration parameters for the application.

However, after the application is created, you cannot access the same configurator. As a result, you cannot change the configuration settings the same way.

Workaround
To reconfigure your settings after the application is created:
  1. In Business Automation Studio, open your application.
  2. Select Application project settings from the top left drop-down list.
  3. Click the Environment variables tab.
  4. Edit the following environment variables where applicable:
    • evObjectStoreName (the CPE current object store)
    • evDbaProjectId (the CPE's currently deployed project ID)
    • evRootFolder (the CPE folder, only for the Document Processing template)
  5. Click Finish editing to save changes. Click Preview.
  6. Export and import your application to the Application Engine.
SystemT Extractor accuracy: The SystemT extractors that are included with IBM Automation Document Processing are intended as samples to demonstrate the capabilities of the feature. They are not tuned to any specific document format and might not provide high recognition rates for some document types. If you plan to use SystemT extractors in a production environment, build your own SystemT extractors, which can be trained specifically for the documents that you are processing.

IBM Automation Decision Services

For more information, see Known limitations.

IBM Automation Workstream Services

For more information, see Workstream limitations.

IBM Business Automation Insights

Table 5. Limitations to IBM Business Automation Insights
Limitation Description
Alerts
Business Performance Center
You cannot create an alert for a period KPI if it contains a group. If you want to create an alert for a period KPI, go to the Monitoring tab and remove the Group by keyword. Then, go to the Thresholds tab to create one or more alerts.
Kibana
In the Kibana graphical user interface, the Alerting menu item for monitoring your data and automatically sending alert notifications is present but the underlying feature is not enabled.
Business Performance Center
Aggregations
Because Business Performance Center uses Elasticsearch as its database, approximation problems can be found with aggregations of numbers greater than 2^53 (that is, about 9 * 10^15). See the Limits for long values section of the Aggregations page of the Elasticsearch documentation.
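The 2^53 boundary is a property of double-precision floating point, which underlies the approximation: above it, consecutive integers can no longer be distinguished, as this short sketch shows:

```python
LIMIT = 2 ** 53  # largest range in which every integer is exactly a double

# Below the limit, distinct integers stay distinct as floats.
assert float(LIMIT - 1) != float(LIMIT)

# At the limit, adding 1 is lost to rounding, so aggregated values
# above about 9 * 10^15 become approximate.
assert float(LIMIT) == float(LIMIT + 1)
```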
Fixed date time range
If, in a fixed date time range, you modify only the date or time, your changes are not saved.
Data tables
When any chart is displayed as a data table, only the first 1000 rows are shown.
Data permissions
You can set up data permissions by monitoring source or by team. If you encounter an error when you set a permission by source, try setting the same permission by team.
No Business Automation Insights support for IBM Automation Document Processing: The integration between IBM Automation Document Processing (ADP) and Business Automation Insights is not supported. When you deploy or configure the IBM Cloud Pak for Business Automation platform, select the Business Automation Insights component together with patterns that Business Automation Insights supports, such as workflow (Business Automation Workflow) or decisions (Operational Decision Manager), not only with document-processing (IBM Automation Document Processing).
Flink jobs might fail to resume after a crash. After a Flink job failure or a machine restart, the Flink cluster might not be able to restart the Flink job automatically. For a successful recovery, restart Business Automation Insights. For instructions, see Troubleshooting Flink jobs.
Case event emitter (ICM) You can configure a connection to only one target object store. The Case event Emitter does not support multiple target object stores.
Elasticsearch indices

Defining a high number of fields in an Elasticsearch index might lead to a so-called mappings explosion, which can cause out-of-memory errors and situations that are difficult to recover from. The maximum number of fields in Elasticsearch indices that are created by IBM Business Automation Insights is set to 1000. Field and object mappings, and field aliases, count toward this limit. Ensure that the documents that are stored in Elasticsearch indices do not cause this limit to be reached.
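To estimate how close a mapping is to the 1000-field limit, the fields can be counted recursively. This is a rough, hypothetical sketch (real Elasticsearch counting also includes multi-fields and field aliases, which this helper ignores):

```python
def count_mapped_fields(properties):
    """Count fields in an Elasticsearch 'properties' mapping; an object
    field counts once itself, plus once per nested sub-field."""
    total = 0
    for definition in properties.values():
        total += 1
        nested = definition.get("properties")
        if nested:  # an object field with its own sub-fields
            total += count_mapped_fields(nested)
    return total
```

For example, a mapping with a user object that contains name and age, plus a top-level timestamp field, counts as 4 fields toward the limit.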

Event formats are documented in Reference for event emission.

For Operational Decision Manager, you can configure event processing to avoid the risk of mappings explosion. See Operational Decision Manager event processing walkthrough.

In the BPEL Tasks dashboard, the User tasks currently not completed widget does not display any results. The search that is used by the widget does not return any results because it uses an incorrect filter for the task state.

To avoid this issue, edit the filter in the User tasks currently waiting to be processed search. Set the state filter to accept one of the following values: TASK_CREATED, TASK_STARTED, TASK_CLAIM_CANCELLED, TASK_CLAIMED.

Historical Data Playback REST API: The API plays back data only from closed processes (completed or terminated). Active processes are not handled.
In Case dashboards, elapsed time calculations in the Average elapsed time of completed activities and Average elapsed time of completed cases widgets do not include late events. Events that are emitted after a case or activity completes are ignored. However, by setting the bai_configuration.icm.process_events_after_completion parameter to true, you can make the Case Flink job process events that are generated on a case after the case is closed. The start and end times remain unchanged; therefore, the duration is the same, but the properties are updated based on the events that were generated after completion.
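In the custom resource, the parameter path that is given above suggests the following placement (a minimal sketch; surrounding keys in the CR are omitted, and the exact nesting should be checked against your CR template):

```yaml
bai_configuration:
  icm:
    # Process events that are generated on a case after the case closes.
    process_events_after_completion: true
```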

IBM Business Automation Navigator

Table 6. Limitations of IBM Business Automation Navigator
Limitation Description
Resiliency issues can cause lapses in the availability of Workplace after a few weeks. This behavior might be caused by issues with the Content Platform Engine (cpe) pod. Use the following mitigation steps:
  • Ensure that cpe is deployed in a highly available setup, with at least two replicas.
  • Monitor the cpe pod and restart if issues occur.
Task Manager is not supported when configuring with System for Cross-domain Identity Management (SCIM). Task Manager requires an LDAP registry for user authorization. It is not supported in a deployment that is configured with SCIM.
After you update the schema name in the CR YAML, the schema name is not updated in the system.properties file and the Business Automation Navigator pod uses the old schema name. You need to manually delete the system.properties file and restart the Business Automation Navigator pod so that it uses the new schema name.

IBM Business Automation Application and IBM Business Automation Studio

Table 7. Limitations of Business Automation Application and Business Automation Studio
Limitation Description
Process applications from Business Automation Workflow do not appear in Application Designer. Sometimes the app resources of the Workflow server do not appear in Studio when you deploy Workflow server instances, Studio, and Resource Registry in the same custom resource YAML file.

If you deployed Business Automation Studio with the Business Automation Workflow server in the same custom resource YAML file, and you do not see process applications from Business Automation Workflow server in Business Automation Studio, restart the Business Automation Workflow server pod.

The Business Automation Workflow toolkit and configurators might not get imported properly. When you install both Business Automation Workflow on containers and Business Automation Studio together, the Business Automation Workflow toolkit and configurators might not get imported properly. If you don't see the Workflow Services toolkit, the Start Process Configurator, or the Call Service Configurator, manually import the .twx files by downloading them from the Contributions table inside the Resource Registry section of the Administration page of Business Automation Studio.
Kubernetes known issue #68211 (modified subpath configmap mount fails when container restarts). Business Automation Studio related pods go into a CrashLoopBackOff state during a restart of the Docker service on a worker node.

If you use the kubectl get pods command to check the pods when a pod is in the CrashLoopBackOff state, you get the following error message:

Warning Failed 3m kubelet, <IP_ADDRESS> Error: failed to start container: Error response from daemon: OCI runtime create failed: container_linux.go:348: starting container process caused "process_linux.go:402: container init caused \" rootfs_linux.go:58: mounting

To recover a pod, delete it in the OpenShift® console and create a new pod.

To use Application Engine with Db2® for High Availability and Disaster Recovery (HADR), you must have an alternative server available when Application Engine starts.

Application Engine depends on the automatic client reroute (ACR) of the Db2 HADR server to fail over to a standby database server. You must have a successful initial connection to that server when Application Engine starts.

IBM Resource Registry can get out of sync.

If you have more than one etcd server and the data gets out of sync between the servers, you must scale to one node and then scale back to multiple nodes to synchronize Resource Registry.

After you create the Resource Registry, you must keep the replica size.

Because of the design of etcd, changing the replica size can cause data loss. If you must set the replica size, set it to an odd number. If you reduce the pod size, the pods are deleted one by one to prevent data loss and the possibility that the cluster gets out of sync.
  • If you update the Resource Registry admin secret to change the username or password, first delete the instance_name-dba-rr-random_value pods to cause Resource Registry to enable the updates. Alternatively, you can enable the update manually with etcd commands.
  • If you update the Resource Registry configurations in the icp4acluster custom resource instance, the update might not affect the Resource Registry pod directly. It affects the newly created pods when you increase the number of replicas.
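The odd-number advice follows from etcd's quorum arithmetic: a cluster of n members needs n // 2 + 1 members for quorum, so it tolerates (n - 1) // 2 failures, and an even member count tolerates no more failures than the next-lower odd count:

```python
def etcd_fault_tolerance(members: int) -> int:
    """Number of member failures an etcd cluster survives while
    keeping a quorum of (members // 2) + 1."""
    return (members - 1) // 2
```

Three replicas tolerate one failure; four replicas still tolerate only one, so the extra even member adds risk without adding resilience.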

After you deploy Business Automation Studio or Application Engine, you cannot change the Business Automation Studio or Application Engine admin user.

Make sure that you set the admin user to a sustainable username at installation time.

Because of a Node.js server limitation, Application Engine trusts only root CAs.

If an external service is used and signed with another root CA, you must add the root CA as trusted instead of the service certificate.
  • The certificate can be self-signed, or signed by a well-known root CA.
  • If you are using a depth zero self-signed certificate, it must be listed as a trusted certificate.
  • If you are using a certificate that is signed by a self-signed root CA, the self-signed CA must be in the trusted list. Using a leaf certificate in the trusted list is not supported.
  • If you are adding the root CA of two or more external services to the Application Engine trust list, you can't use the same common name for those root CAs.

IBM FileNet® Content Manager

Table 8. Limitations of IBM FileNet Content Manager
Limitation Description
A smaller number of indexing batches than configured leads to a noticeable degradation in the overall indexing throughput rate. Because obsolete Virtual Servers in the Global Configuration Database (GCD) are not automatically cleaned up, that Virtual Server count can be higher than the actual number of CE instances with dispatching enabled. That inflated number results in a smaller number of concurrent batches per CSS server, negatively affecting indexing performance.

For more information and to resolve the issue, see Content Platform Engine uneven CBR indexing workload and indexing degradation.

Downloading just Jace.jar from the Client API Download area of ACCE fails, and only the text "3.5.0.0 (20211028_0742) x86_64" is returned. When an application, such as ACCE, is accessed through Zen, the application and certain operator-controlled elements in Kubernetes need additional logic to support embedded or "self-referential" URLs. The single file download in the Client API Download area of ACCE uses self-referential URLs, and the additional logic is missing. To avoid the self-referential URLs, download the whole Client API package that contains the wanted file instead of an individual file, and then extract the individual file from the package.
Queries to retrieve group hierarchies by using the SCIM Directory Provider might fail if one of the groups in the hierarchy contains a space or another character that is not valid in an HTTP URL. The problem can occur when you search for users or groups and the search tries to retrieve the groups that a group belongs to. If one of the groups in this chain contains a space or another character that is not valid in an HTTP URL, the search might fail.
LDAP to SCIM attribute mapping might not be correct. The default LDAP to SCIM attribute mapping that IM uses might not be correct. In particular, TDS/SDS LDAP might have incorrect mappings for the group attributes objectClass and members. To learn how to review and change this mapping, see Updating SCIM LDAP attributes mapping.
When you use the SCIM Directory Provider to query for a user or group with no search attribute, all users and groups are returned rather than none. Queries without a search pattern are treated as a wildcard rather than as a restriction that returns nothing.

IBM Business Automation Workflow and IBM Workflow Process Service

For IBM Business Automation Workflow, see IBM Business Automation Workflow known limitations.
For Workplace, see Workplace limitations.