Ready to use intents & dialogs for Chatbot - ibm-cloud

I'm using IBM Watson Conversation to build a bot. I'm looking for ready-to-use dialogs & intents covering the most common conversational statements,
like: Welcome, Good Morning, Aha, Good Evening, How are you, Who are you, etc.
When I used api.ai from Google, there was a default WELCOME intent, and it was AMAZING. I'm looking for something similar.

In the Conversation service, you get a default "welcome" message node and another node for "anything_else", which is executed when no other intent matches the user query. These two nodes are created for you the moment you open the Dialog tab of the service for the first time.
This gives you a skeleton of how you can add new nodes as you need them. Currently there aren't any other intents that the Conversation service provides by default, which also makes sense in some way, as everyone's needs might be different.
The service does, however, provide some default entities called "system entities". These are common entities like a person's name, location, currency, etc. that might be used in almost any chatbot scenario. By default they are disabled, but you can turn them on from the "Entities" tab under system entities. For better dialog design, I recommend checking the documentation.
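For reference, the default "welcome" node fires on the very first (empty) message of a conversation. A minimal sketch using the Watson Developer Cloud Python SDK; the credentials, workspace ID, and version date are placeholders, and newer SDK releases may expose this under a different service name or parameter names:

```python
from watson_developer_cloud import ConversationV1

conversation = ConversationV1(
    username='YOUR_USERNAME',      # placeholder credentials
    password='YOUR_PASSWORD',
    version='2017-05-26')          # placeholder version date

# An initial message with empty input triggers the default "welcome" node;
# any later input that matches nothing falls through to "anything_else".
response = conversation.message(
    workspace_id='YOUR_WORKSPACE_ID',
    input={'text': ''})

print(response['output']['text'])  # e.g. ['Hello. How can I help you?']
```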

Related

IBM Watson chat bot

I am currently working on a Watson chat bot, with the aim of creating a virtual assistant for customers that will hopefully be capable of handling their requests. Within a node, I ask the customer a question like "Can you provide me the serial number?". What I want Watson to do is save that serial number as a variable, so that I can respond to the customer with something like "Okay, so the number you provided is "that number", do you confirm?". I would be very happy if someone could help me out with this. How do I set up a variable capable of storing the customer's input?
Thank you in advance!
I recommend starting with this tutorial on building a database-driven Slack chatbot. The source code is available on GitHub.
The tutorial shows how to gather event data. The typical way of getting data from a user is to save the relevant parts in so-called context variables. You can access those variables from within dialog nodes, print their values in responses, or pass them to other code for backend processing (for example, to a database system).
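As a rough illustration, assume a dialog node that stores the user's reply with a context entry like "serial_number": "<? input.text ?>" and echoes it back as $serial_number in its response; the variable name is made up for this example. The client then only has to pass the returned context object back on every turn. A sketch with the Watson Developer Cloud Python SDK:

```python
from watson_developer_cloud import ConversationV1

conversation = ConversationV1(
    username='YOUR_USERNAME', password='YOUR_PASSWORD', version='2017-05-26')
workspace_id = 'YOUR_WORKSPACE_ID'

context = {}

# Turn 1: the bot asks "Can you provide me the serial number?"
response = conversation.message(workspace_id=workspace_id,
                                input={'text': 'I need help with my device'},
                                context=context)
context = response['context']   # always send the returned context back

# Turn 2: the user answers; the dialog node copies input.text into the
# context variable, so it stays available on every following turn.
response = conversation.message(workspace_id=workspace_id,
                                input={'text': 'SN-12345'},
                                context=context)
context = response['context']

print(context.get('serial_number'))   # -> 'SN-12345'
print(response['output']['text'])     # e.g. ['Okay, so the number you provided is SN-12345, do you confirm?']
```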

Using "speechBiasingHints" with Dialogflow Webhook

First time posting, so feel free to give me feedback if I could improve something about this post... Now on to my question.
I am currently developing a Google Action, the Action will allow the user to define important events, such as Bob's Birthday or Fred's Graduation, and save data about said events. Later, the user will be able to ask for info about the event and get it returned back to them.
I am using the Dialogflow API with "Inline Editor" fulfillment to keep it as simple as possible for right now. The problem I am running into is this: the event has an entity type of @sys.any, so anything the user says is accepted as valid input. However, I would like some way to bias towards events I already have stored for the user, so that they are more likely to find the event they are looking for.
I found another answer on here discussing speech biasing (What is meant by speech bias and how to use speechBiasHints in google-actions appResponse), which defined speech biasing as the ability to "influence the speech to text recognition", which is exactly what I believe I want. While that answer provided sample code, it was for the Actions SDK, not the Dialogflow SDK, which I am using.
Can anyone provide an example of how to fill the "speechBiasingHints" section of the ExpectedInput response of the Conversation Webhook using the Dialogflow Webhook?
Note: This is for a student project, and I'm new to developing Google Actions and still very much learning about everything that is capable with Google Actions. Any feedback or suggestions are very welcome.
The question you link to does quite a few things differently from the approach you're taking. The Actions SDK provides more low-level control, but it doesn't have the Natural Language Processing (NLP) capabilities that Dialogflow provides.
Dialogflow handles biasing a little differently, through the use of entities, so you don't need to control the speech biasing directly; Dialogflow can handle that for you, to some extent.
Since each user may have different event names, you'll probably want to use a user entity, which is an entity you define and then populate on a user-by-user basis through Dialogflow's API. In your sample phrases, you can then use this entity name instead of @sys.any, or create another set of phrases that use this entity in addition.
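I have not tested this against your agent, but in the Dialogflow V2 API the per-user population is typically done through session entity types (the successor to the V1 "user entities"). A hedged sketch with the google-cloud-dialogflow Python client; the project ID, session ID, and the "event" entity type name are assumptions, and the exact client method names may differ between library versions:

```python
import dialogflow_v2 as dialogflow

project_id = 'my-project'           # assumption: your GCP project
session_id = 'some-user-session'    # assumption: one session per user
entity_type_display_name = 'event'  # assumption: a developer entity named "event"

client = dialogflow.SessionEntityTypesClient()

session = 'projects/{}/agent/sessions/{}'.format(project_id, session_id)
name = '{}/entityTypes/{}'.format(session, entity_type_display_name)

# The user's stored events become the preferred values for this session.
entities = [
    dialogflow.types.EntityType.Entity(
        value="Bob's Birthday", synonyms=["Bob's Birthday", "Bobs birthday"]),
    dialogflow.types.EntityType.Entity(
        value="Fred's Graduation", synonyms=["Fred's Graduation", "Freds graduation"]),
]

session_entity_type = dialogflow.types.SessionEntityType(
    name=name,
    entity_override_mode=dialogflow.enums.SessionEntityType
        .EntityOverrideMode.ENTITY_OVERRIDE_MODE_OVERRIDE,
    entities=entities)

response = client.create_session_entity_type(session, session_entity_type)
print('Created session entity type: {}'.format(response.name))
```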

Watson: Dialogs. Are they required?

We are working on a micro-service that interacts with Watson.
I was presented with the following argument: "There is no need to use dialogs in a Conversation project on Watson. Declaring the intents and entities is just enough to get the work done"
Based on the documentation, I have the impression that using dialogs is a requirement in order to train Watson correctly on how to interpret the combination of intents and entities. Plus, in the Dialog section, you have the chat that allows you to make corrections.
Is there a way that I can confirm that Dialogs are or are not a requirement?
If you plan to just use intents and entities programmatically then you don’t need dialog.
You will need to create one blank dialog node with a condition of true. This is to avoid SpEL errors relating to no node being found.
From a programming point of view (and ignoring Conversation for a minute), if you need to take action on intents or entities, or change context variables, the recommendation is to do that in dialog. This way your code is not split across two systems, which makes it easier to maintain.
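For the programmatic route, here is a sketch of what that looks like with the Watson Developer Cloud Python SDK. With only intents, entities, and a single catch-all node (condition true) defined, your own code inspects the classified intents and detected entities and decides what to do; the intent and entity names in the comments are illustrative:

```python
from watson_developer_cloud import ConversationV1

conversation = ConversationV1(
    username='YOUR_USERNAME', password='YOUR_PASSWORD', version='2017-05-26')

response = conversation.message(
    workspace_id='YOUR_WORKSPACE_ID',
    input={'text': 'I want to pay my electricity bill'},
    alternate_intents=True)   # also return lower-confidence intents

# The single "true" node keeps the service happy; the real work happens here.
for intent in response['intents']:
    print(intent['intent'], intent['confidence'])   # e.g. pay_bill 0.97

for entity in response['entities']:
    print(entity['entity'], entity['value'])        # e.g. utility electricity
```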
In the phrase above, the author probably means that you only need to create #intents and #entities for your Conversation workspace and define the purpose of your bot. This is true up to a point; it depends on what you want your bot to do, because after that you can just create your dialog flow!
The Dialog section is where you create your dialog flow, and it is absolutely needed when you want to create a conversation flow, e.g. a chatbot.
A workspace contains the following types of artifacts:
Intents: An intent represents the purpose of a user's input, such as a question about business locations or a bill payment. You define an intent for each type of user request you want your application to support. In the tool, the name of an intent is always prefixed with the # character. To train the workspace to recognize your intents, you supply lots of examples of user input and indicate which intents they map to.
Entities: An entity represents a term or object that is relevant to your intents and that provides a specific context for an intent. For example, an entity might represent a city where the user wants to find a business location, or the amount of a bill payment. In the tool, the name of an entity is always prefixed with the @ character. To train the workspace to recognize your entities, you list the possible values for each entity and synonyms that users might enter.
Dialog: A dialog is a branching conversation flow that defines how your application responds when it recognizes the defined intents and entities. You use the dialog builder in the tool to create conversations with users, providing responses based on the intents and entities that you recognize in their input.
EDIT:
As @Simon O'Doherty said, if your purpose is just to use the intents and entities programmatically, then you don't need the Dialog. His answer is complete.
See the documentation for Building a Dialog.
Yes, intents and entities may seem to be just enough for you, and you can generate the answer programmatically based on them. But you should keep in mind that Dialog does not mean Response. I know it's hard to find this stated clearly in the Watson documentation.
If you need access to context variables in the next node or dialog, you should define them in slots. Without defining dialogs for each intent, context variables will not be accessible or carried over into the next dialog.
You need the Dialog portion of the Watson Conversation service if you want to respond to user queries.
The intents and entities are the understanding piece, and the Dialog portion is the response side of the conversation.

Create a chatbot for stock market using IBM Watson

I would like to create a stock bot which can hold a basic conversation and give me stock prices within the conversation.
To get stock prices I am using the Yahoo Finance API.
For the basic conversation I am using the IBM Watson Conversation API.
I have also used the IBM NLU (Natural Language Understanding) API to recognize company names asked in different ways, but I am not getting the expected results.
For example, if I ask
"What is the price of INFY?"
then it should give me the correct answer, and the company name should be filtered out so that my action can pass INFY to the Yahoo Finance API. This should also work if I change the format of the question.
Below is the flow chart setup I made in the Node-RED panel of IBM Bluemix.
Could you help me find the exact APIs and flow that would help me achieve my goal?
This is a pretty big one, but at least some first impression comments...
The Watson Conversation Service is already integrated with an NLU component: the Intents and Entities tabs. Company names can be extracted from the input text with the use of entities and entity synonyms. The drawback is that you need to list all the possible variants of what a company name can look like, but on the other hand, the entity specification can be imported into Conversation through a CSV file.
In general, the integration of the Watson Conversation service with 3rd-party services needs to be done outside the Conversation service, as it, as of now, does not explicitly support calling 3rd-party APIs, so the Node.js solution here seems a sound one. What you need to specify is how the integration of WCS and the 3rd-party services will look. The general pipeline could look like this:
user inputs text to the system
text goes to Watson Conversation Service
the intent and company name is extracted in WCS
WCS sends the text output and sets a special variable in the node output field, such as "stocks": "Google", that tells the Node.js component sitting after the Conversation service to find the stock market value of Google and include it in the output text (see the sketch after this list)
Now, back to your solution: it might also make sense to have a dedicated NLC service used only to extract the company names in the system. However, I would use this only if it turned out that, for example, entities in the WCS service are not robust enough to capture the companies properly (my feeling here is that for this particular use case, entities with synonyms might work fine).
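To make the pipeline concrete, here is a rough orchestration sketch in Python rather than Node-RED. It assumes a dialog node that sets a context variable such as "stocks" to the detected company symbol; the Yahoo Finance endpoint and response fields shown are assumptions to be replaced by whatever quote API you actually use:

```python
import requests
from watson_developer_cloud import ConversationV1

conversation = ConversationV1(
    username='YOUR_USERNAME', password='YOUR_PASSWORD', version='2017-05-26')
workspace_id = 'YOUR_WORKSPACE_ID'

def handle_turn(user_text, context):
    """Send one user utterance through WCS, then enrich the reply with a quote."""
    response = conversation.message(workspace_id=workspace_id,
                                    input={'text': user_text},
                                    context=context)
    reply = ' '.join(response['output']['text'])

    # Step 4 of the pipeline: the dialog node sets e.g. "stocks": "INFY".
    symbol = response['context'].get('stocks')
    if symbol:
        quote = requests.get(
            'https://query1.finance.yahoo.com/v7/finance/quote',  # assumed endpoint
            params={'symbols': symbol}).json()
        price = quote['quoteResponse']['result'][0]['regularMarketPrice']
        reply += ' Current price of {}: {}'.format(symbol, price)

    return reply, response['context']
```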

async autocomplete service

Call me crazy, but I'm looking for a service that will deliver autocomplete functionality similar to Google, Twitter, etc. After searching around for 20 min I thought to ask the geniuses here. Ideas?
I don't mind paying, but it would be great if it were free. Also, is there a top-notch NLP service that I can submit strings to and get back states, cities, currencies, company names, establishments, etc.? Basically I need to take unstructured data (a generic search string) and pull out key information with relevant metadata.
Big challenge, I know.
Sharing solutions I found after further research.
https://github.com/haochi/jquery.googleSuggest
http://shreyaschand.com/blog/2013/01/03/google-autocomplete-api/
If you don't want to implement it yourself, you can use a service called 'Autocomplete as a Service', which is written specifically for these purposes. You can access it here: www.aaas.io.
You can add metadata with each record, and it returns the metadata along with the matching results. Do check out the demo on the home page. It has a very simple API written specifically for autocomplete search.
It supports large datasets, and you can apply filters while searching.
Its usage is simple: add your data and use the API URL as the autocomplete data source.
Disclaimer: I am the founder of it. I will be happy to provide this service to you.