Integrating 3rd-party ML engines
External machine learning service engines, such as Microsoft Azure ML Studio, Microsoft Azure ML Service, and Amazon SageMaker, can be integrated with Watson OpenScale for model evaluations.
You can use the following methods to integrate 3rd-party engines:
- Engine binding level: Ability to list deployments and get deployment details.
- Deployment subscription level: You must score the subscribed deployment in a compatible format, such as the IBM Watson Machine Learning format, and receive the output in the same format so that Watson OpenScale can configure your model evaluations to process both the input and the output.
- Payload logging: Each input and output of the deployed model that is triggered by a user's application must be stored in a payload store. The payload records follow the same specification that is described for the deployment subscription level, as shown in the sketch that follows.
Watson OpenScale uses those records to calculate bias, explanations, and other metrics. Watson OpenScale cannot automatically store transactions that run on the user site, outside the Watson OpenScale system. This restriction is one of the ways that Watson OpenScale safeguards proprietary information. Use the Watson OpenScale REST API or Python SDK to work with secure data.
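As an illustration, the following minimal sketch stores one scoring request and response as a payload record with the Watson OpenScale Python SDK. The API key, data set ID, and field names are placeholders, and the package paths and client calls reflect typical SDK usage that can differ between SDK versions; treat this as an example of the compatible request and response format rather than a definitive implementation.

```python
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator
from ibm_watson_openscale import APIClient
from ibm_watson_openscale.supporting_classes.payload_record import PayloadRecord

# Placeholder credentials and IDs -- replace with values from your instance.
authenticator = IAMAuthenticator(apikey="<your-api-key>")
wos_client = APIClient(authenticator=authenticator)

# Scoring input and output in the IBM Watson Machine Learning format:
# a list of field names plus one row of values per scored record.
scoring_request = {
    "fields": ["AGE", "INCOME", "GENDER"],
    "values": [[42, 52000, "F"]],
}
scoring_response = {
    "fields": ["prediction", "probability"],
    "values": [["approved", [0.82, 0.18]]],
}

# Store the request/response pair as a payload record so that
# Watson OpenScale can evaluate the transaction.
wos_client.data_sets.store_records(
    data_set_id="<payload-logging-data-set-id>",
    request_body=[
        PayloadRecord(
            request=scoring_request,
            response=scoring_response,
            response_time=120,  # milliseconds, optional
        )
    ],
)
```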
Steps to implement this solution
- Learn about custom machine learning engines.
- Set up payload logging.
- Set up a custom machine learning engine by using one of the Custom machine learning engine examples; a minimal scoring endpoint sketch follows this list.
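To show roughly what a custom engine must expose, the following hypothetical Flask endpoint accepts scoring input in the fields/values format described above and returns its output in the same compatible format. The route path, field names, and placeholder predictor are assumptions for the example; a real custom engine must also provide the deployment listing and deployment details described at the engine binding level, and any web framework that serves an HTTPS scoring endpoint in this format can play the same role.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)


def my_model_predict(record):
    # Placeholder scoring logic -- substitute your real model here.
    score = 0.8 if record.get("INCOME", 0) > 50000 else 0.3
    label = "approved" if score > 0.5 else "denied"
    return {"label": label, "probability": [score, 1 - score]}


@app.route("/v1/deployments/credit_model/online", methods=["POST"])
def score():
    # Input arrives as a list of field names and rows of values.
    payload = request.get_json()
    fields = payload["fields"]
    rows = payload["values"]

    predictions = [my_model_predict(dict(zip(fields, row))) for row in rows]

    # Return the output in the same compatible format so that
    # Watson OpenScale can process it.
    return jsonify({
        "fields": ["prediction", "probability"],
        "values": [[p["label"], p["probability"]] for p in predictions],
    })
```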
Parent topic: Supported machine learning engines, frameworks, and models