In the IBM Watson Assistant service, I have created my intents and entities, and even defined my dialog with slots. When testing the bot, it performs very poorly at recognizing entities, especially when there are multiple system entities in an utterance. An example would be:
Patient was born on 1st of July 1945 and was injured on 2nd June 2017.
How can I manually label entities in test utterances?
IBM Watson Assistant is deprecating the #sys-person entity for some reason. I use it a lot in my slots to capture the name.
How can I create an entity that would do the same thing to replace #sys-person?
If you are using a supported language, use annotation-based entities.
The tutorial on how to create a database-driven Slackbot uses that method for locations (#sys-location is deprecated, too). Load the provided skill and see how it is done.
Basically, you create an entity and then go to the intent examples and tag the parts that identify a person. Watson Assistant then learns that you expect a person entity in specific sentences and sentence positions. You can fine-tune it by running some dialogs and correcting falsely identified or missing person entity values.
I use the same technique as a replacement for #sys-location, and it works for me in slots. Even "San Francisco" is recognized as one location. I added it as a sample. You can tag entities across intents.
If you don't want to go that route, the only solution I am aware of is to define an entity, e.g., my-person, with many examples and use that.
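If you go the dictionary-entity route, you can also create the entity programmatically instead of through the tooling. Here is a minimal sketch, assuming the ibm-watson Python SDK (the Conversation-era V1 API is exposed there as AssistantV1); API_KEY, SERVICE_URL, WORKSPACE_ID, and the sample names are placeholders:

```python
# A minimal sketch, assuming the ibm-watson Python SDK and an existing
# workspace; API_KEY, SERVICE_URL, and WORKSPACE_ID are placeholders.
from ibm_watson import AssistantV1
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

assistant = AssistantV1(version='2021-06-14',
                        authenticator=IAMAuthenticator('API_KEY'))
assistant.set_service_url('SERVICE_URL')

# Create a dictionary-based my-person entity with a few sample values.
# fuzzy_match helps catch misspelled names.
assistant.create_entity(
    workspace_id='WORKSPACE_ID',
    entity='my-person',
    fuzzy_match=True,
    values=[
        {'value': 'John Smith', 'synonyms': ['John', 'Mr. Smith']},
        {'value': 'Mary Jones', 'synonyms': ['Mary', 'Ms. Jones']},
    ]
)
```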
My Google Action delivers information to college students. For example: Who is the Title IX Coordinator?
To answer this question, we need to know which college the student attends. There are 2,700+ colleges in the U.S. Many have the same name or similar-sounding names.
So, #college-name is an entity in Dialogflow. Is there a way to import all 2,700+ college names into Dialogflow as potential values for #college-name?
Also, is there a way to use a listbox with Dialogflow / Actions on Google on Google Assistant to ensure the correct college is identified?
Dialogflow has the ability to import entities from a file in either CSV or JSON format.
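For reference, the CSV entity format is one row per reference value followed by its synonyms (the reference value is conventionally repeated as its own first synonym). A small illustrative sample with made-up college names:

```
"Harvard University","Harvard University","Harvard"
"Boston College","Boston College","BC"
"Boston University","Boston University","BU"
```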
There isn't a listbox visual widget, although you can use a List of the similar names or Suggestion Chips to narrow down the search.
In addition to importing entity values from a file you can also push them to Dialogflow programmatically via the Dialogflow REST API. This API manages the agent itself and is thus different from the Dialogflow Webhook, which calls your fulfillment service.
The specific endpoint you would use to update entity values is projects.agent.entityTypes.entities. Dialogflow also offers SDKs for Python, Node.js, and other languages. This is probably the best option if you have a large number of values, as it allows you to set up a pipeline from your data source to Dialogflow and schedule it to update the entity on a regular basis (e.g., with an AWS Lambda function or a cron job that runs once a day).
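As a sketch of what that pipeline step could look like, assuming the google-cloud-dialogflow Python package and an existing #college-name entity type (PROJECT_ID, ENTITY_TYPE_ID, and the college list are placeholders for your real data source):

```python
# A minimal sketch, assuming the google-cloud-dialogflow package (v2 API);
# PROJECT_ID and ENTITY_TYPE_ID are placeholders, and the college list
# stands in for your real data source.
from google.cloud import dialogflow_v2 as dialogflow

client = dialogflow.EntityTypesClient()
parent = client.entity_type_path('PROJECT_ID', 'ENTITY_TYPE_ID')

colleges = ['Harvard University', 'Boston College', 'Boston University']
entities = [
    dialogflow.EntityType.Entity(value=name, synonyms=[name])
    for name in colleges
]

# Push the values into the entity type; schedule this (e.g., nightly)
# to keep the agent in sync with your data source.
client.batch_create_entities(parent=parent, entities=entities)
```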
Can we configure our Bluemix chatbot application with multiple Conversation workspaces? If yes, how can we call a particular conversation workspace based on the questions the user asks in the chatbot?
This can be done by your application. A scenario could be:
- The user input is analyzed by your app, or by sending it to the Natural Language Classifier or Natural Language Understanding service.
- Based on the results of the analysis, your app would then send the input to the specific workspace.
- Calls into a conversation workspace are stateless, but carry an ID for the individual conversation (chat) and metadata about where in the dialog you are.
- That info could be used later to jump back to where the user was in the dialog for a workspace.
IMHO, that technique could be used to support multiple spoken languages or to separate different, more complex subjects into individual workspaces. Take a look at the architecture diagram in the documentation for the general idea.
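A minimal routing sketch along these lines, assuming the ibm-watson Python SDK; the classify() helper, the workspace IDs, and the credentials are hypothetical placeholders:

```python
# A minimal routing sketch, assuming the ibm-watson Python SDK; the
# classify() helper, workspace IDs, and credentials are placeholders.
from ibm_watson import AssistantV1
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

assistant = AssistantV1(version='2021-06-14',
                        authenticator=IAMAuthenticator('API_KEY'))
assistant.set_service_url('SERVICE_URL')

WORKSPACES = {
    'banking': 'BANKING_WORKSPACE_ID',
    'travel': 'TRAVEL_WORKSPACE_ID',
}

def classify(text):
    # Placeholder: call NLC/NLU here, or apply simple keyword rules.
    return 'travel' if 'flight' in text.lower() else 'banking'

def route(text, contexts):
    topic = classify(text)
    # Calls are stateless, so pass back the stored context for this
    # workspace to resume the dialog where the user left off.
    response = assistant.message(
        workspace_id=WORKSPACES[topic],
        input={'text': text},
        context=contexts.get(topic)
    ).get_result()
    contexts[topic] = response['context']
    return response['output']['text']
```

Keeping one stored context per workspace is what makes it possible to jump back to wherever the user was in each dialog.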
I would like to create a stock bot which can have a basic conversation and give me stock prices in the conversation.
To get stock prices I am using the Yahoo Finance API.
For the basic conversation I am using the IBM Watson Conversation API. I have also used the IBM NLU (Natural Language Understanding) API to verify different company names asked in different ways, but I am not getting the expected results.
For example, if I ask
"What is the price of INFY?"
then it should give me the correct answer, and the company name should be extracted so that my action can pass INFY to the Yahoo Finance API. This should also work if I change the format of the question.
Below is the flow chart setup I made in the Node-RED panel of Bluemix (IBM).
Could you help me find out the exact APIs and flow that would help me achieve my goal?
This is a pretty big question, but here are at least some first-impression comments...
Watson Conversation Service is already integrated with an NLU component - the Intents and Entities tabs. The company names could be extracted from the input text using entities and entity synonyms. The drawback here is that you need to list all the possible variants of what a company name can look like, but on the other hand, the entity specification can be imported into Conversation through a CSV file.
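For illustration, the Conversation entity CSV import format is one row per value: the entity name, the value, then its synonyms. A small made-up sample for a company entity:

```
company,Google,google inc,alphabet
company,Infosys,INFY,infosys ltd
company,IBM,international business machines,big blue
```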
In general, the integration of the Watson Conversation service with 3rd-party services needs to be done outside the Conversation service - as of now, it does not explicitly support calling 3rd-party APIs - so the Node.js solution here seems a sound one. What you need to specify is how the integration of WCS and the 3rd-party services will look. The general pipeline could look like:
- The user inputs text to the system.
- The text goes to the Watson Conversation Service.
- The intent and company name are extracted in WCS.
- WCS sends the text output and sets a special variable in the node output field, such as "stocks": "Google", which tells the Node.js component that sits after the Conversation service to look up the stock market value of Google and include it in the output text (see the sketch below).
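A minimal sketch of that last step, assuming the ibm-watson Python SDK; get_stock_price() and the "stocks" output-variable convention are hypothetical placeholders you would adapt to your Node-RED flow:

```python
# A minimal sketch of the component after WCS, assuming the ibm-watson
# Python SDK; get_stock_price() and the "stocks" output variable are
# placeholders to adapt to your Node-RED flow.
from ibm_watson import AssistantV1
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

assistant = AssistantV1(version='2021-06-14',
                        authenticator=IAMAuthenticator('API_KEY'))
assistant.set_service_url('SERVICE_URL')

def get_stock_price(symbol):
    # Placeholder: call whichever quote API you use (e.g., a Yahoo
    # Finance wrapper) and return the latest price.
    raise NotImplementedError

def handle_input(text, context=None):
    response = assistant.message(
        workspace_id='WORKSPACE_ID',
        input={'text': text},
        context=context
    ).get_result()
    reply = ' '.join(response['output']['text'])
    # The dialog sets e.g. "stocks": "INFY" in the output field when a
    # quote is requested; look the price up and append it to the reply.
    symbol = response['output'].get('stocks')
    if symbol:
        reply += ' {} is trading at {}.'.format(symbol, get_stock_price(symbol))
    return reply, response['context']
```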
Now, back to your solution: it might also make sense to have a dedicated NLC service that is used only to extract company names in the system. However, I would use this only if it turned out that, e.g., the entities in the WCS service are not robust enough to capture the companies properly (my feeling here is that for this particular use case, entities with synonyms might work fine).
I want to train the Watson Conversation service without using the toolkit; I want the chatbot to be trained by code.
I want to develop a system through which the administrator of a web page can edit or create intents and entities, so that I do not have to be the one making edits when something needs to change. IBM Watson Virtual Agent is similar to what I want to create.
You can create your own tooling or integration into conversation using the Workspace API.
https://www.ibm.com/watson/developercloud/conversation/api/v1/#workspaces
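As a minimal sketch of what that looks like with the ibm-watson Python SDK (the workspace ID, credentials, and example intent are placeholders):

```python
# A minimal sketch of code-driven training via the Workspace API,
# assuming the ibm-watson Python SDK; IDs, credentials, and the example
# intent are placeholders.
from ibm_watson import AssistantV1
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

assistant = AssistantV1(version='2021-06-14',
                        authenticator=IAMAuthenticator('API_KEY'))
assistant.set_service_url('SERVICE_URL')

# Create a new intent from examples collected in your admin UI.
assistant.create_intent(
    workspace_id='WORKSPACE_ID',
    intent='greetings',
    examples=[{'text': 'hello'}, {'text': 'hi there'}, {'text': 'good morning'}]
)

# Replace its examples later when the administrator edits them.
assistant.update_intent(
    workspace_id='WORKSPACE_ID',
    intent='greetings',
    new_examples=[{'text': 'hello'}, {'text': 'hey'}, {'text': 'good day'}]
)
```

The service retrains the workspace automatically after such updates, so the administrator's edits take effect without your involvement.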