I have configured an Assistant with some dialog skills. The utterances are passed through the Node.js backend of my application. A requirement for the project is to use Discovery, but in the form of a search skill.
My question is: When a request is routed to a search skill, is the result returned from Discovery the same way as from a dialog skill, i.e. Discovery -> Watson Assistant -> Node.js backend?
(We didn't configure the billing plan yet, that's why I'm asking this basic question.)
When you use Watson Discovery as part of a Watson Assistant search skill, access to Discovery is through Assistant. In terms of the underlying programming, take a look at the Watson Assistant API overview in the docs and at the actual V2 API for Watson Assistant. The search integration is configured in Watson Assistant and determines how the results are displayed to the user as the chatbot's answer.
So, from the user's point of view, everything is done through Watson Assistant.
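As a sketch of what the Node.js backend sees: per the documented V2 API, search skill results arrive inside the normal message response, in `output.generic`, as an entry with `response_type: "search"`. The concrete titles and bodies below are made up for illustration.

```javascript
// Sketch: extracting search skill results from a Watson Assistant V2
// message response. The payload shape follows the documented V2 API;
// the field values here are invented sample data.

function extractSearchResults(messageResponse) {
  const generic = (messageResponse.output && messageResponse.output.generic) || [];
  return generic
    .filter((item) => item.response_type === 'search')
    .flatMap((item) => item.results || [])
    .map((result) => ({ title: result.title, body: result.body }));
}

// Example response as the backend might receive it from the V2
// message call (hypothetical content):
const response = {
  output: {
    generic: [
      {
        response_type: 'search',
        header: 'I found this information:',
        results: [
          { title: 'Return policy', body: 'Items can be returned within 30 days.' },
        ],
      },
    ],
  },
};

console.log(extractSearchResults(response)); // one result: 'Return policy'
```

The point is that the backend never talks to Discovery itself; it only parses the Assistant response, exactly as it would for a dialog skill answer.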
I'm creating a bot using IBM Watson Assistant. I am trying to use a webhook, but don't know the format of the POST request JSON/HTML which is sent to the webhook.
My case study is a shop where users can pre-order. I want to send the order details to my back-end server and give the user a reference number for the pre-order. I found nothing in the documentation about what POST request format is sent from IBM Watson Assistant and in what format the response should be returned.
I know IBM Watson Assistant does not require a particular response format. It allows the developer to manipulate the response as the developer wants.
IBM Watson Assistant has a documented API. There is the recommended V2 Assistant API, which can be used to create a session and then send messages. The older V1 Assistant API has more functions and reaches deeper into the system. Both APIs can be used to write chatbots.
If you mean a webhook in the sense of the Watson Assistant feature to reach out from a dialog node to an external service, the process is the following:
in the global configuration, you define the URL and the headers
for a dialog node, you enable webhooks, then define the key/value pairs that are sent as the payload. They can differ by dialog node.
Typically, the expected result is JSON data because it is the easiest to process.
This IBM Cloud Solution tutorial on building a Slack bot with Watson Assistant uses webhooks to call out to a Db2 database. The code is available in a GitHub repo.
We have integrated IBM Watson Assistant skill/workspace with a Facebook page using the Watson features. We did this using an integrated approach from Virtual Assistants tab.
We are able to get the response in Facebook Messenger from Watson skill/workspace FAQS. Now we want to add a few more questions to skill/workspace and get the response from a database.
We know that we can use IBM Cloud Functions to get DB data and respond back with the data, but Cloud Functions action types (web_action and cloud_function / server) incur a cost; hence we are looking for another approach.
We have our own APIs developed for the DB and want to use those in Watson Assistant dialog node actions. Please let us know how we can add them in actions and get a response from the API without using a client application or Cloud Functions.
Note: we haven't developed any application for this chatbot, we directly integrated Watson skill/workspace with the Facebook page and trying to call API calls wherever we require them from the dialogue nodes.
As you can see, IBM Watson Assistant allows you to invoke three different types of actions from a dialog node:
client,
server (cloud_function),
web_action.
Because for cloud_function and web_action the action is hosted as a Cloud Function on IBM Cloud, the computing resources are charged. For the type client, your app would handle the API call, and the charges depend on where your app is hosted. Thus, there are always costs.
What you could do is write a wrapper function that is deployed as web_action or cloud_function. That way, not much computing resource is needed and the charges would be minimal. But again, independent of the action type, there are always costs (maybe not charges) - one way or another...
I am using IBM Watson Assistant to create a chatbot, and have also created a Watson Discovery collection in the project.
I need help understanding how the dialog works to fetch a response from the Discovery collection when an intent along with entities is detected in the "Try it" section.
Do we have to define something in the response section, or is there something similar to a text response?
Is the Discovery response only available in the app we are working on?
There are a couple of options to link up an IBM Watson Assistant chatbot with IBM Watson Discovery.
The first and oldest is to have the application interact with Watson Assistant and, depending on the flow, context, and response, send a request to Watson Discovery. Basically, the integration is done in the application layer.
The second option is to use server or client dialog actions in Assistant to directly call into Discovery. See my blog on a barebone news chatbot and the related code on GitHub on how to implement such an action. My example uses client actions (basically let the app handle it again), but server actions are similar. This IBM Cloud solution tutorial covers server actions for a database-driven bot.
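For reference, such an action is declared in the dialog node's JSON in an `actions` array. The sketch below shows the general shape for a client action; the action name, parameters, and result variable are invented for a Discovery news query.

```javascript
// Shape of a client action as declared in a dialog node's JSON editor.
// Watson Assistant hands this back to the client app, which performs
// the Discovery query and stores the result in the named variable.
// The name, parameters, and result_variable values are examples only.

const dialogNodeActions = [
  {
    name: 'searchNews',           // action the client app should run
    type: 'client',               // handled by the app, not by Cloud Functions
    parameters: {
      query: '<? input.text ?>',  // pass the user's utterance along
      count: 3,
    },
    result_variable: 'context.news_results',
  },
];

console.log(dialogNodeActions[0].type); // client
```

For a server action, `type` would be `cloud_function` or `web_action` instead, and Assistant itself would invoke the hosted function.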
The newest option and currently in beta is to use the direct integration of Assistant and Discovery. See "Building a search skill" for an introduction into this direct linkage between IBM Watson Assistant and IBM Watson Discovery.
Options 2 (server action) and 3 should work from "Try it" in the tooling; the others do not, because they rely on app-based coordination.
As usual in IT, there are different ways to achieve the goal, the choice is yours... ;-)
I have created a chatbot with IBM Watson Assistant. But currently I have hardcoded all values in the dialog
e.g : When some user will ask "Who created Computer ?" then in the dialog flow I have written "XYZ created computer".
But suppose the user will ask about some other person and that value is not hardcoded in the dialogs on IBM Watson Assistant then is there any way by which I can provide Google search results?
You can make programmatic calls from within IBM Watson Assistant dialog nodes. Both server-side actions (IBM Cloud Functions) and client-side calls (within the app) are supported. That way, you can react to such queries as described and could call a search engine, database, or something else.
This IBM Cloud solution tutorial on how to build a database-driven Slackbot uses server-side actions to interact with a Db2 database. Instead of calling out to a database to fetch data, in your example you would send a request to Google search.
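To give an idea of that callout, here is a sketch of a server-side action that targets a web search instead of Db2. I use the Google Custom Search JSON API endpoint as an example; the API key, search engine ID, and response handling are placeholders you would supply yourself.

```javascript
// Sketch: server-side action (Cloud Functions style main(params)) that
// builds a request to the Google Custom Search JSON API instead of a
// database. apiKey and searchEngineId (cx) are placeholders.

function buildSearchUrl(params) {
  const query = encodeURIComponent(params.question || '');
  return 'https://www.googleapis.com/customsearch/v1' +
    `?key=${params.apiKey}&cx=${params.searchEngineId}&q=${query}`;
}

function main(params) {
  // In a real action you would fetch buildSearchUrl(params) with
  // https.get and map the returned items to a chatbot answer; here we
  // just return the URL the action would call.
  return { url: buildSearchUrl(params) };
}
```

The dialog node would then pick the answer out of the action's result variable, just as the Slackbot tutorial does with the database response.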
I saw that you tagged it "facebook-apps". If you use Botkit middleware to build an integration with Facebook Messenger, then check out this blog on how to enable actions in the Botkit Middleware for Watson Assistant.
I'm only familiar with Watson Assistant because I've done some work with it, but I've never used Watson Assistant Solutions.
Is it a new API or an app on top of Watson Assistant? When should I use one or the other?
They are both IBM Cloud/SaaS services. You can learn more about them here:
Watson Assistant (formerly Conversation Service)
Build multi-turn natural language dialog.
Catalog of already configured customer service and skill workspaces.
Insights into conversations to improve your training
Built-in system entities to ease conversation design
Doc, Boilerplate apps, REST API, SDK and skill Workspaces
Publish your Watson Assistant AI bot to channels such as Slack, Facebook Messenger, or Twilio.
Watson Assistant Solutions
An AI assistant framework, services, and starters that you use to build your own branded, personal, and proactive multi-modal personal assistant. Optimized for use with any IoT device.
Watson Assistant for Industry
Out of the box skills: Weather, Greeting, Conversational Essentials, General Knowledge and IFTTT. Includes Watson Assistant (Formerly Watson Conversation Service) workspaces.
Device-enabled, allowing users to communicate by voice via the Audio Gateway
Contextual skill routing service via intent, location, language, and previous skill context
Doc, SDK and Skill Boilerplate to build new custom skills
Watson Assistant for Hospitality
Includes Watson Assistant for Industry
Out of the box hospitality skills for Service Requests, Venue Information, Reservations, Command and control of devices & more
Watson Assistant for Automotive
Includes Watson Assistant for Industry
Out of the box automotive skills: IoT for Automotive to understand car data, Points of Interest, Navigation, and Car Manual
Business Console for administrating skills
Well, IBM has its IBM Cloud offering that delivers infrastructure, platform, functions and more as a service (IaaS, PaaS, FaaS). Let's take a look at both of the Watson Assistant offerings:
Watson Assistant (formerly Conversation) is one of the platform services and features cognitive chatbot capabilities.
IBM also has a new product in its overall portfolio which is named Watson Assistant, the AI assistant for business. It is a software as a service (SaaS). There are industry-specific solutions (health, automotive, hospitality, ...) that are based on the capabilities of Watson Assistant (see 1), but feature industry insights, branding and integration with customer environments.
To make it programming-related: If you want to develop your own chatbot or chatbot-based solution from scratch, go with 1) and its API and SDKs (Node.js, Python, ...). If you want to start on a higher level with pre-built assets, investigate 2).