Part 2 of a four-part series. In this post, you’ll learn how to configure watsonx Assistant to process simple Q&A for your chatbot, using the sample from this Git project. The sample is a bank’s virtual agent chatbot, and its dialog flows are pre-defined.
- Part 1: Set up required apps and services.
- Part 2: Configure watsonx Assistant to process simple Q&A.
- Part 3: Create a simple flow on Node-RED and integrate with watsonx Assistant.
- Part 4: Configure Slack and Node-RED integration.
Step 1. Understanding watsonx Assistant
watsonx Assistant is IBM’s AI product that lets you build, train, and deploy conversational interactions into any application, device, or channel. You train your assistant to recognize the “intents” and “entities” in a user’s input, and you define a “dialog” flow that incorporates those intents.
As you add training data, a natural language classifier is automatically added to the skill. The classifier model is trained to understand the types of requests that you teach your assistant to listen for and respond to.
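As a rough mental model, intent detection maps a user utterance to the best-matching trained intent. The sketch below illustrates the idea with simple word overlap; the real watsonx Assistant classifier is a trained NLP model, and the second intent and its examples here are made up for illustration:

```python
# Toy illustration of intent matching via word overlap.
# Real watsonx Assistant trains a statistical classifier on your examples;
# this only mimics the idea of matching an input against trained intents.

TRAINING = {
    "#Business_Information-Contact_Us": [
        "can i email you",
        "what are your contact details",
    ],
    # Hypothetical second intent, added for contrast:
    "#Customer_Care-Appointments": [
        "book an appointment",
        "schedule a meeting with an advisor",
    ],
}

def classify(utterance: str) -> str:
    """Return the intent whose examples share the most words with the input."""
    words = set(utterance.lower().split())
    scores = {
        intent: max(len(words & set(ex.split())) for ex in examples)
        for intent, examples in TRAINING.items()
    }
    return max(scores, key=scores.get)

print(classify("What is your email contact"))
# -> #Business_Information-Contact_Us
```

The more (and more varied) examples you provide per intent, the better the real classifier generalizes to inputs it has never seen.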
In this series, we are not going to define complicated dialog flows, but if you are interested, refer to the official documentation.
Step 2. Set up watsonx Assistant
In this part, we are going to use this sample from a Git project. We are not going to deploy the sample app here; instead, we will import its intents, entities, and dialog flows.
- Go to the Git project and download the source code via Clone or download.
- Open watsonx Assistant from the IBM Cloud dashboard.
- Open Skills and click Create skill.
- Specify the location of the workspace JSON file in your local copy of the app project and click Import: <project_root>/training/bank_simple_workspace.json
- By default, you have an assistant called My first assistant, associated with a default skill called My first skill. Open it from the Assistants menu on the left, then use Swap skill to switch to the imported Banking_Simple_DO_NOT_DELETE skill.
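If you prefer scripting to the web console, you can inspect the workspace file and import it with IBM’s Python SDK (`ibm-watson`). The sketch below uses a hand-made minimal dict standing in for the real file; the API key, version date, and service URL in the commented SDK call are placeholders you would replace with your own values:

```python
import json  # used by the commented-out file load below

# Sketch: inspect the workspace before importing it. A Watson Assistant v1
# workspace JSON has (at least) these top-level keys; this dict is a
# hand-made stand-in for the sample project's file.
workspace = {
    "name": "Banking_Simple_DO_NOT_DELETE",
    "intents": [
        {"intent": "Business_Information-Contact_Us",
         "examples": [{"text": "Can I email you?"}]},
    ],
    "entities": [
        {"entity": "contact_type",
         "values": [{"value": "address", "synonyms": []}]},
    ],
    "dialog_nodes": [],
}

# In practice, load it from your local copy of the sample project:
# with open("training/bank_simple_workspace.json") as f:
#     workspace = json.load(f)

print(workspace["name"], len(workspace["intents"]))

# Uncomment to import via the SDK instead of the console
# (pip install ibm-watson; values below are placeholders):
# from ibm_watson import AssistantV1
# from ibm_cloud_sdk_core.authenticators import IAMAuthenticator
# assistant = AssistantV1(version="2021-06-14",
#                         authenticator=IAMAuthenticator("YOUR_APIKEY"))
# assistant.set_service_url("https://api.us-south.assistant.watson.cloud.ibm.com")
# assistant.create_workspace(**workspace)
```

Either route produces the same result: a skill populated with the sample’s intents, entities, and dialog nodes.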
Step 3. Understanding the dialog (optional)
It is worth seeing how dialog flows, intents, and entities are defined, for future reference. For instance, an intent named #Business_Information-Contact_Us will be triggered by inputs like the following:
- Can I email you?
- Can I talk to someone?
- I need an SMS number customer service
- What are your contact details?
Entities act like attributes of intents. In this case, the entity @contact_type refines the user’s intent, acting as a subcategory of #Business_Information-Contact_Us with the following values:
- address
- call
- SMS
Now, the dialog defines how Watson behaves depending on inputs. In this example, when #Business_Information-Contact_Us is triggered, Watson checks which @contact_type the user prefers and responds accordingly.
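Conceptually, each dialog node fires when its condition (an intent, an entity value, or both) matches, and nodes are evaluated top-down so more specific nodes come first. A simplified Python sketch of that routing, with made-up response text rather than the sample workspace’s actual wording:

```python
# Simplified sketch of dialog-node routing. Each node pairs a condition on the
# detected intent/entity with a response. Responses are illustrative only.

DIALOG_NODES = [
    {"intent": "#Business_Information-Contact_Us",
     "entity": ("@contact_type", "call"),
     "response": "You can call us at 1-800-000-0000."},
    {"intent": "#Business_Information-Contact_Us",
     "entity": ("@contact_type", "address"),
     "response": "Our branch address is 123 Main St."},
    # Fallback node (entity=None) must come last: it matches any entity.
    {"intent": "#Business_Information-Contact_Us",
     "entity": None,
     "response": "How would you like to contact us: call, SMS, or address?"},
]

def respond(intent, entity=None):
    """Return the response of the first node whose condition matches."""
    for node in DIALOG_NODES:
        if node["intent"] == intent and node["entity"] in (entity, None):
            return node["response"]
    return "Sorry, I didn't understand that."

print(respond("#Business_Information-Contact_Us", ("@contact_type", "call")))
# -> You can call us at 1-800-000-0000.
```

Note the ordering: because the fallback node matches regardless of entity, it only fires when no entity was detected, which mirrors how the sample dialog asks a clarifying question when @contact_type is missing.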
Step 4. Try it
Let’s try it and see how your assistant behaves. Click Try it in the top right corner and test that it works as expected. In the following image, Watson detected #Business_Information-Contact_Us and @contact_type correctly and responded as expected.
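The same check can be done programmatically: a v1 /message call returns the detected intents and entities alongside the dialog’s response. The payload below is a hand-made example of that response shape, not real service output, and the commented SDK call assumes an `assistant` client and workspace ID you would supply yourself:

```python
# Sketch: the "Try it" test, done programmatically. This dict imitates the
# shape of a Watson Assistant v1 /message response; values are illustrative.
response = {
    "intents": [
        {"intent": "Business_Information-Contact_Us", "confidence": 0.97},
    ],
    "entities": [
        {"entity": "contact_type", "value": "call"},
    ],
    "output": {"text": ["You can call us at 1-800-000-0000."]},
}

# With credentials, you would obtain it from the service instead:
# response = assistant.message(
#     workspace_id="YOUR_WORKSPACE_ID",
#     input={"text": "Can I talk to someone?"},
# ).get_result()

top_intent = response["intents"][0]["intent"]
print(top_intent, response["entities"][0]["value"])
# -> Business_Information-Contact_Us call
```

Checking the top intent and detected entity like this is a handy way to smoke-test a skill after importing or retraining it.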
What’s next?
You’ve now configured your assistant to handle sample Q&A. In the next parts of this blog series, we are going to integrate it with Node-RED. You can, of course, train the assistant however you want and connect it to your own app in the future.
Alternatively, you can add a direct integration by clicking Add integration in the assistant menu.
Disclaimer
IBM is not liable for any damages arising in contract, tort or otherwise from the use of or inability to use this post or any material contained within. All sample code is provided as-is and IBM does not support customization. Do not use the code in production.