Training data for Conversation Enhanced Watson Application - ibm-cloud

Looking at the Retrieve and Rank Web UI bound to the conversation-enhanced application
https://github.com/watson-developer-cloud/conversation-enhanced
no questions have been uploaded for training, though there is a trainingdata.csv file.
I would like to understand how trainingdata.csv was constructed.
Thank you!

That training data was created manually rather than through the UI, using the approach described in https://www.ibm.com/watson/developercloud/doc/retrieve-rank/training_data.shtml (it was prepared before the tooling was available).
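In that documented approach, each row of the ground-truth CSV pairs a question with one or more answer IDs and relevance labels. A minimal sketch of building such a file (the questions, answer IDs, and relevance values here are invented for illustration):

```python
import csv
import io

# Hypothetical ground truth: each question maps to (answer_id, relevance)
# judgements, where a higher relevance number means a better answer.
ground_truth = [
    ("How do I reset my password?", [("answer_101", 3), ("answer_207", 1)]),
    ("Where can I view my bill?", [("answer_055", 4)]),
]

def build_training_csv(entries):
    """Write rows of: question, answer_id, relevance, answer_id, relevance, ..."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    for question, judgements in entries:
        row = [question]
        for answer_id, relevance in judgements:
            row.extend([answer_id, relevance])
        writer.writerow(row)
    return buf.getvalue()
```

The resulting string can be saved as trainingdata.csv and uploaded for ranker training.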

Related

ChatBot with conditional response flow - Rasa Open Source

I'm working on a Rasa (Open Source) project, I need to represent the diagram flow in a chatbot.
The main problem is following the conditional flow, as the user can say yes or no and change the course of the conversation.
I would like to know how I could build a chatbot that contemplates all the possibilities represented in the diagram and the others that are outside it, using Rasa.
In other words, a chatbot that responds to the user according to their previous response.
(flowchart image)
A "solution" I found was to create a story for each possible path, but that is infeasible given the number of stories (there are 9 other diagrams like this one).
Regarding the flow and how you can map your designed diagrams into Rasa: you can try to use one universal story/rule to make your structure more modular. Find the parts of the flow in your diagram that are repeated in the other diagrams and build your stories/rules out of them, so they can be reused across flows. Rasa also supports checkpoints, which let you manage your stories in a more controlled way.
To get users' responses and act on them accordingly, you need to deploy Rasa forms and actions in your stories/rules to extract the entities you want from users and work with them.

How to train IBM Watson Assistant to answer from a specific dataset (say a eBook)?

I am a newbie to IBM Watson. I went through videos on creating a virtual assistant/chatbot where we define intents/entities and answer accordingly. This works fine when I have a limited number of intents/entities. But say I have an eBook and I want to train Watson to answer from this eBook. How do I achieve this? Any high-level approach or direction would be really helpful.
There are different approaches.
You could use the integrated search skill, which provides a link to Watson Discovery. You would upload your eBook to Watson Discovery, which indexes it.
Another approach is to use a database or something else as the backend. Based on the input, which identifies the search term and scopes which eBook to search, the answer is retrieved from the backend database. This tutorial features a Db2 database from which Watson Assistant retrieves the answer. A similar approach is taken in this sample, which shows how to retrieve excerpts from Wikipedia.
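Conceptually, the backend approach boils down to a lookup keyed by the extracted search term and the eBook scope. A toy stand-in for the database (the table contents and function name are invented for illustration; a real setup would query Db2 or Discovery):

```python
# Pretend database: (search term, ebook) -> answer text.
ANSWERS = {
    ("warranty", "user-guide"): "The warranty period is 24 months.",
    ("reset", "user-guide"): "Hold the power button for 10 seconds.",
}

def lookup_answer(term, book):
    """What the webhook would do: resolve the term within the scoped eBook."""
    return ANSWERS.get((term.lower(), book), "Sorry, nothing found for that.")
```

Watson Assistant would extract `term` and `book` from the user's input and call a webhook that performs this lookup against the real backend.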

IBM Watson chat bot

I am currently working on a Watson chatbot with the aim of creating a virtual assistant for customers that will hopefully be capable of handling their requests. Within a node I ask the customer a question like "Can you provide me the serial number?". I want Watson to save that serial number as a variable so that I can respond to the customer with something like "Okay, so the number you provided is <that number>, do you confirm?". How can I integrate a variable capable of storing the customer's input? I would be very happy if someone could help me out with this.
Thank you in advance!
I recommend starting with this tutorial on building a database-driven Slack chatbot. The source code is available on GitHub.
The tutorial shows how to gather event data. The typical way of getting data from a user is to save the relevant parts in so-called context variables. You can access those variables from within dialog nodes, print their values in responses, or pass them to other code for backend processing (e.g., to a database system).
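The turn-by-turn flow can be simulated in plain Python (this is an illustration of the context-variable idea, not the Watson Assistant API; the serial-number pattern is an invented example):

```python
import re

def handle_turn(user_text, context):
    """Save a serial-number-like token from the user's message into the
    context dict, then echo it back for confirmation; otherwise re-ask."""
    match = re.search(r"\b[A-Z0-9]{6,}\b", user_text)
    if match:
        context["serial_number"] = match.group(0)
        return (f"Okay, so the number you provided is "
                f"{context['serial_number']}, do you confirm?")
    return "Can you provide me the serial number?"
```

In Watson Assistant itself, the dialog node would store the value in a context variable and the response text would reference it (e.g., via the `$variable` syntax in the response).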

Does the Recommendation service allow enriching an existing model with new data?

We are able to provide an initial training model and ask for recommendations. When asking for recommendations we can provide new usage events. Are these persisted at all into the model? Do they manipulate the model at all?
Is there another way the data is supposed to be updated or do we need to retrain a new model every time we want to enrich the model?
https://azure.microsoft.com/en-us/services/cognitive-services/recommendations/
EDIT:
We are trying to use the "Recommendations Solution Template" which deploys a solution to Azure and provides a swagger endpoint for working with the model (https://gallery.cortanaintelligence.com/Tutorial/Recommendations-Solution)
It appears the Cognitive Services API is much richer than this. Can the swagger version's models be updated?
After more experience with this I discovered a few things as of August 21st, 2017:
While not intuitive for the uninitiated, new data requires training a new model in order to be persisted into it.
This gives you a form of model versioning: if a new model doesn't perform as well, you can switch recommendations back to an earlier model.
The recommended method appears to be to batch usage data and create new builds of the model on an interval.
The APIs do allow passing in recent usage data so that it can be accounted for at scoring time; it's just not persisted.
The "upload usage events" call in the Cognitive Services API does not seem to work; uploading the new usage data via a file does appear to work.
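The batch-and-rebuild pattern described above can be sketched as follows. This is a generic buffering sketch, not the real Cognitive Services API; the upload-and-build step is left as a callback you would wire to the actual file upload and build endpoints.

```python
class UsageBatcher:
    """Buffer usage events; when the batch is full, hand it to a
    rebuild callback (e.g., upload as a file and trigger a new build)."""

    def __init__(self, batch_size, rebuild):
        self.batch_size = batch_size
        self.rebuild = rebuild      # callback receiving the batched events
        self.pending = []
        self.builds = 0

    def record(self, event):
        self.pending.append(event)
        if len(self.pending) >= self.batch_size:
            self.rebuild(list(self.pending))  # persist via a new model build
            self.pending.clear()
            self.builds += 1
```

In production you would trigger on a time interval as well as on batch size, and keep the previous build available for the versioning/rollback behavior noted above.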
The Recommended Solutions Template vs. The Cognitive Services API
It appears the Recommended Solutions Template is a packaged version of the SAR (Smart Adaptive Recommendations) model inside the Cognitive Services API that is optimized for ease of use.
I'm presuming that for other popular recommendation models like FBT, the Cognitive Services API should be used, as the deployable template only allows one model type.
Additional note on the Preview Status of the API
It seems Microsoft is deprecating the datamart as of February and directing people to this preview API instead. It therefore seems reasonable to presume this preview is highly likely to graduate from preview rather than be killed.

How can I Use LogiXML Analysis Grid in ASP.Net?

Hope you all are doing well. I am a beginner in BI reporting. I need to know how I can use the LogiXML Analysis Grid in an ASP.NET application with my own data.
Hope to get replies from you.
Thanks and Regards
Can you be more specific as to how you would like to use the Analysis Grid within your ASP.NET application?
The Analysis Grid is a super-element within the LogiXML Logi Info product that allows you to connect to a data source and display an interactive grid within a dynamic HTML output. For details, you may want to visit some of the online docs:
Working with Analysis Grids
http://www.logixml.com/devnet/rdPage.aspx?rdReport=Article&dnDocID=1037&dnProd=&IdeDisplayStatus=Collapsed
If you need details about connecting to your data sources, you can create any number of different types of data connections; here is some additional documentation on building connections:
Introducing Data Connections
http://www.logixml.com/devnet/rdPage.aspx?rdReport=Article&dnDocID=1146&dnProd=&IdeDisplayStatus=Collapsed
To integrate with your current ASP.NET application, you may be asking about security integration with single sign-on, so that you can pass credentials from your existing application into the Analysis Grid report. Here are some docs about that:
Introducing Logi Security
http://www.logixml.com/devnet/rdPage.aspx?rdReport=Article&dnDocID=1205&dnProd=&IdeDisplayStatus=Collapsed
Working with Logi Secure Key
http://www.logixml.com/devnet/rdPage.aspx?rdReport=Article&dnDocID=1126&dnProd=&IdeDisplayStatus=Collapsed
Hope this information helps.