Configuring the runtime to use Machine Learning with Operational Decision Manager

You must configure an external Liberty server with both Machine Learning for z/OS and Operational Decision Manager before you can use the two products together.

Before you begin

Before you begin this task, install Operational Decision Manager on an external Liberty server. See Configuring topology 6: Operational Decision Manager on the Liberty server. During those steps, you submit two JCL jobs:
  • HBRWLPC configures the XML files for a Liberty server
  • HBRWLPD places the XML files into your Liberty server's user directory

About this task

In this task, you follow the Operational Decision Manager and Machine Learning for z/OS documentation to install both products into an external Liberty server.

Procedure

  1. Verify that you have a working Liberty environment with WOLA by completing the following steps:
    1. Submit HBRDPLOY in the Liberty working data set *.WLP.SHBRJCL(HBRDPLOY) to deploy the miniloan sample rule app.
    2. Submit HBRMINW in the Liberty working data set *.WLP.SHBRJCL(HBRMINW) to run the sample rule app.
    3. Check that the output of the sample is correct.
  2. Edit the db2Type2.xml and db2Type4.xml files of your Liberty server. In both files, change <feature>jdbc-4.0</feature> to <feature>jdbc-4.1</feature>.
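    For example, in both files the feature element changes as follows. Any surrounding elements stay unchanged; only the jdbc feature version is updated:
    <!-- before -->
    <feature>jdbc-4.0</feature>
    <!-- after -->
    <feature>jdbc-4.1</feature>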
  3. Follow the steps in the Machine Learning for z/OS documentation to install MLz into the Liberty that already contains Operational Decision Manager.
    Tip: Use Option 3: Integrate a machine learning scoring service with an external WLP server. This option installs Machine Learning for z/OS into your Liberty server while preserving the Operational Decision Manager application.
  4. If your Liberty XML configuration files already contain definitions for your server HTTP (and optionally HTTPS) ports, remove those definitions (an example of such a definition follows this step).
    The Machine Learning for z/OS configuration process adds the following port settings to the Liberty server.xml file:
    <httpEndpoint httpsPort="-1" httpPort="-1" id="defaultHttpEndpoint"/>
    <httpEndpoint httpsPort="-1" httpPort="-1" sslOptionsRef="MLzScoringSSLOptions" host="*" id="MLZAdminHttpEndpoint"/>
    <httpEndpoint httpsPort="-1" httpPort="24018" sslOptionsRef="MLzScoringSSLOptions" host="*" id="MLZHttpEndpoint"/>
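    A pre-existing port definition that you would remove typically looks like the following example. The id and port values shown here are illustrative only and will differ in your configuration:
    <httpEndpoint id="defaultHttpEndpoint" host="*" httpPort="9080" httpsPort="9443"/>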
  5. If you require the Rule Execution Server console application to be hosted in the Liberty server, for example to use it with Decision Runner, you must change the configuration to use a custom set of features instead of the default features that the Machine Learning for z/OS configuration includes (a sketch of the resulting server.xml follows these substeps):
    1. Remove the following line from your server.xml file: <feature>webProfile-7.0</feature>
    2. In the server.xml file, after the line </featureManager>, insert the following line at the top of the list of include elements: <include location="MlzContent.xml" />
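    After these two changes, the feature and include area of your server.xml might look like the following sketch. The comments are placeholders for whatever your configuration already contains:
    <featureManager>
        <!-- webProfile-7.0 has been removed from this list -->
        <!-- your existing feature elements remain here -->
    </featureManager>
    <include location="MlzContent.xml" />
    <!-- any other include elements follow -->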
  6. Restart your Liberty server and wait for Machine Learning for z/OS to start.
    This can take up to 30 minutes.
  7. Connect your Machine Learning for z/OS installation to the administration dashboard and install the Churn sample:
    1. Define your Liberty server to the Machine Learning for z/OS administration dashboard as a scoring service.
    2. Create a deployment of the Churn sample. The deployment sends the model to your Liberty server's scoring service.
  8. Test your model within the Machine Learning for z/OS administration dashboard by selecting Test API from the ACTIONS menu for the deployment.
  9. Ensure that Operational Decision Manager is still working by resubmitting the HBRMINW job.

Results

You now have Operational Decision Manager and Machine Learning for z/OS installed in a single Liberty server and ready to work together.