Important:

IBM Cloud Pak® for Data Version 4.8 will reach end of support (EOS) on 31 July 2025. For more information, see the Discontinuance of service announcement for IBM Cloud Pak for Data Version 4.X.
Upgrade to IBM Software Hub Version 5.1 before IBM Cloud Pak for Data Version 4.8 reaches end of support. For more information, see Upgrading from IBM Cloud Pak for Data Version 4.8 to IBM Software Hub Version 5.1.

Managing hardware specifications for deployments

When you deploy certain assets in Watson Machine Learning, you can choose the type, size, and power of the hardware configuration that matches your computing needs.

Creating hardware specifications for deployments

You can create hardware specifications for your deployments in one of the following ways:
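Regardless of how a hardware specification is created, you can inspect the available specifications programmatically. The following is a minimal, non-authoritative sketch that assumes the ibm_watson_machine_learning Python client and placeholder values for the cluster URL, credentials, and deployment space ID; adjust these to your environment.

  from ibm_watson_machine_learning import APIClient

  # Placeholder Cloud Pak for Data credentials -- replace with values for your cluster.
  credentials = {
      "url": "https://<cluster-url>",
      "username": "<username>",
      "apikey": "<api-key>",
      "instance_id": "openshift",
      "version": "4.8",
  }

  client = APIClient(credentials)
  client.set.default_space("<space-id>")  # placeholder deployment space ID

  # List the available hardware specifications (XXS, XS, S, M, L, XL, GPU sizes, ...).
  client.hardware_specifications.list()

  # Look up the ID of a specific configuration by name.
  hw_spec_id = client.hardware_specifications.get_id_by_name("S")
  print(hw_spec_id)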

Deployment types that require hardware specifications

You can select a hardware specification for all batch deployment types. For online deployments, you can select a specific hardware specification if you're deploying any of the following (see the sketch after this list):

  • Python Functions
  • TensorFlow models
  • Models with custom software specifications
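
For example, the following sketch selects a hardware configuration when creating an online deployment of a Python function. It is illustrative only and assumes an APIClient that is already connected to a deployment space (as in the earlier sketch) and a placeholder function_id for a previously stored Python function.

  # Assumes `client` is an APIClient connected to a deployment space
  # and `function_id` is the ID of a stored Python function (placeholder).
  online_meta = {
      client.deployments.ConfigurationMetaNames.NAME: "function-online-deployment",
      client.deployments.ConfigurationMetaNames.ONLINE: {},
      # Select the hardware configuration by name, for example S (2 CPU and 8 GB RAM).
      client.deployments.ConfigurationMetaNames.HARDWARE_SPEC: {"name": "S"},
  }

  deployment = client.deployments.create(artifact_uid=function_id, meta_props=online_meta)
  deployment_id = client.deployments.get_id(deployment)
  print(deployment_id)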

Hardware configurations available for deploying assets

  • XXS: 1x2 = 1 CPU and 2 GB RAM
  • XS: 1x4 = 1 CPU and 4 GB RAM
  • S: 2x8 = 2 CPU and 8 GB RAM
  • M: 4x16 = 4 CPU and 16 GB RAM
  • L: 8x32 = 8 CPU and 32 GB RAM
  • XL: 16x64 = 16 CPU and 64 GB RAM

You can use the XXS and XS configurations to deploy the following asset types (a batch example follows the list):

  • Python functions
  • Python scripts
  • R scripts
  • Shiny apps
  • Models based on custom libraries and custom images
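
As an illustration (a minimal sketch, not the only supported path), you could select the XXS configuration by ID when you create a batch deployment of a Python script. It assumes the same connected client and a placeholder script_id for a previously stored script asset.

  # Assumes `client` is connected to a deployment space and `script_id` is the ID of a
  # stored Python script asset (placeholder).
  xxs_id = client.hardware_specifications.get_id_by_name("XXS")  # 1 CPU and 2 GB RAM

  batch_meta = {
      client.deployments.ConfigurationMetaNames.NAME: "script-batch-deployment",
      client.deployments.ConfigurationMetaNames.BATCH: {},
      # Reference the hardware specification by ID instead of by name.
      client.deployments.ConfigurationMetaNames.HARDWARE_SPEC: {"id": xxs_id},
  }

  script_deployment = client.deployments.create(artifact_uid=script_id, meta_props=batch_meta)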

For Decision Optimization deployments, you can use these hardware specifications (an example follows the list):

  • S
  • M
  • L
  • XL
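
You can pair one of these sizes with a node count when you deploy a Decision Optimization model in batch mode. The following sketch is illustrative only and assumes the same connected client and a placeholder do_model_id for a stored Decision Optimization model.

  # Assumes `client` is connected to a deployment space and `do_model_id` is the ID of a
  # stored Decision Optimization model (placeholder).
  do_meta = {
      client.deployments.ConfigurationMetaNames.NAME: "do-batch-deployment",
      client.deployments.ConfigurationMetaNames.BATCH: {},
      # Size S (2 CPU and 8 GB RAM) on one node; num_nodes sets how many nodes the deployment can use.
      client.deployments.ConfigurationMetaNames.HARDWARE_SPEC: {"name": "S", "num_nodes": 1},
  }

  do_deployment = client.deployments.create(artifact_uid=do_model_id, meta_props=do_meta)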

Hardware specifications for GPU inferencing

Beginning with Cloud Pak for Data Version 4.8.5, you can select GPU hardware specifications for CUDA software specifications from the user interface on the x86 platform when you create a deployment. For more information, see Configuring MIG support in Red Hat OpenShift.

Use the following predefined hardware specifications for GPU inferencing (an example follows the list):

  • GPUx1: 1 GPU, 1 CPU, and 4 GB RAM
  • GPUx2: 2 GPU, 2 CPU, and 8 GB RAM
  • GPUx3: 3 GPU, 2 CPU, and 12 GB RAM
  • GPUx4: 4 GPU, 2 CPU, and 16 GB RAM
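
For instance, the following hedged sketch selects one of these GPU definitions when deploying a model online. It assumes the same connected client, a placeholder model_id for a model stored with a CUDA-enabled software specification, and that the GPUx1 name resolves to the predefined specification on your cluster.

  # Assumes `client` is connected to a deployment space and `model_id` is the ID of a stored
  # model that uses a CUDA software specification (placeholder).
  gpu_meta = {
      client.deployments.ConfigurationMetaNames.NAME: "gpu-online-deployment",
      client.deployments.ConfigurationMetaNames.ONLINE: {},
      # GPUx1: 1 GPU, 1 CPU, and 4 GB RAM (name assumed to match the predefined specification).
      client.deployments.ConfigurationMetaNames.HARDWARE_SPEC: {"name": "GPUx1"},
  }

  gpu_deployment = client.deployments.create(artifact_uid=model_id, meta_props=gpu_meta)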

Parent topic: Managing predictive deployments