IBM Watson Assistant - store slot data in custom defined entity? - ibm-cloud

I am using slots in one of the nodes in my IBM Watson Assistant dialog. The issue is that none of the system-provided entities for slots (sys-person, sys-percentage, sys-number, ...) fits my need.
I need a company name to be saved. So I created my own entity named
@companyName and added a pattern. When the user enters a value it is recognized by the entity pattern, but the data is not saved to the entity.
How can I save the answer the user gives to that question in my entity @companyName?
---

You don't want to save the entity, but its value:
You would need to check for @companyName.value and save it to the context variable $companyName. See the slot usage tips in the documentation for IBM Watson Assistant.
This tutorial showing a database-driven chatbot uses patterns to capture data, and its code is available. Examine it for some coding examples.

I figured it out!
All I had to do was append .literal after the Check for value @companyName in the second screenshot.
The slots part now looks like the sketch below, and we have the entered text inside the variable $companyName.
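For a pattern-based entity, @companyName alone only tells you that the pattern matched, while @companyName.literal captures the exact text the user typed. A rough sketch of the resulting slot as it would appear in the workspace JSON (node names and the prompt are illustrative, and a real export keeps all dialog nodes in one flat list rather than nesting children):

{
  "type": "slot",
  "variable": "$companyName",
  "dialog_node": "slot_company",
  "parent": "node_collect_company",
  "children": [
    {
      "type": "event_handler",
      "event_name": "focus",
      "output": { "text": { "values": ["What is the name of your company?"] } },
      "dialog_node": "handler_prompt",
      "parent": "slot_company"
    },
    {
      "type": "event_handler",
      "event_name": "input",
      "conditions": "@companyName",
      "context": { "companyName": "@companyName.literal" },
      "dialog_node": "handler_save",
      "parent": "slot_company"
    }
  ]
}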
I found it in the official documentation, more precisely here:
https://cloud.ibm.com/docs/services/assistant/dialog-slots.html#dialog-slots
(under step 4: "Add a slot for each unit of required information. For each slot, specify these details")


HERE Flutter SDK (Explore): How to retrieve details (e.g. contacts, fuelStation, evChargingPool)?

I was trying out the Flutter examples for the HERE Flutter SDK (Explore). One use case would involve retrieving details about the found places. I checked the Place class and, based on the nested Details class, I expected that I could retrieve detail information about, e.g., evChargingPools and the included charging stations. Other detail information (contacts, opening times, ...) would also be of interest.
Unfortunately, and independent of the type of query I used (TextQuery, PlaceIdQuery, CategoryQuery), I could only retrieve the categories as part of the details node.
I am currently using the freemium version. I checked the documentation and, to my knowledge, it does not state under which constraints (excluding, of course, the case where the data does not exist) this detail information is included, or how to retrieve it.
Your CategoryQuery has to include the fuel station category IDs.
https://www.developer.here.com/documentation/flutter-sdk-explore/4.13.2.0/dev_guide/topics/search.html#search-for-places-categories
So initialize a SearchEngine and a CategoryQuery with the area and a PlaceCategory with id 7600.
Source for PlaceCategories: https://developer.here.com/documentation/geocoding-search-api/dev_guide/topics-places/places-category-system-full.html#400---transport
Sorry that I can't provide a source code example (my app is written in Swift, so no Flutter), but I can provide you a Postman request for it :)
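In the same spirit, here is a minimal Swift sketch of that request against the HERE Browse REST endpoint; treat the coordinates, the category IDs (700-7600-0116 for petrol stations, 700-7600-0322 for EV charging stations, as read from the category list linked above) and the API key as placeholders:

import Foundation

// Browse places around a point, filtered to one category.
var components = URLComponents(string: "https://browse.search.hereapi.com/v1/browse")!
components.queryItems = [
    URLQueryItem(name: "at", value: "52.5308,13.3847"),        // search center as lat,lng
    URLQueryItem(name: "categories", value: "700-7600-0322"),  // EV charging stations
    URLQueryItem(name: "limit", value: "5"),
    URLQueryItem(name: "apiKey", value: "YOUR_API_KEY")        // your HERE credentials
]

URLSession.shared.dataTask(with: components.url!) { data, _, error in
    guard let data = data, error == nil else { return }
    // The response carries an "items" array; each item holds the place
    // details (contacts, opening hours, ...) when HERE has that data.
    print(String(data: data, encoding: .utf8) ?? "")
}.resume()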

Google Actions CLI 3.1.0 version and actions.intent.TEXT

I want to be able to talk with Google Assistant, but connect the Actions project directly to an NLP service I already have running on my server. In other words, NOT use Dialogflow.
All the following examples show how to do this.
With Rasa
https://blog.rasa.com/going-beyond-hey-google-building-a-rasa-powered-google-assistant/
With LUIS
https://www.grokkingandroid.com/using-the-actions-sdk/
https://dzone.com/articles/using-the-actions-sdk-for-google-assistant-develop
With Watson
https://www.youtube.com/watch?v=no0R0bSkHXc
They use the actions.intent.MAIN as the invocation and actions.intent.TEXT for all other utterances from the talker.
This is what I need. I don’t want to create a load of intents, with utterance phrases, inside the Action because I just want all the phrases spoken by the talker to be passed to my server, and for my NLP service to deal with them.
So I set up a new Actions project, installed the Actions CLI, and then spent three days trying all possible combinations without success, because all these examples use gactions CLI 2.1.3 and Google has now moved on to gactions CLI 3.1.0.
Not only have the commands changed, but so have the file formats and structure.
It appears there is also a new Google Actions Console, and actions.intent.TEXT is no longer available.
My Action is connected to my server via webhook, but I cannot figure out how to get actions.intent.TEXT included and working.
Everything I find, even here
Publishing Actions on google without Dialogflow
is from before the version update and follows the same pattern.
Can anyone point to an up-to-date, v3.1.0, discussion, tutorial or example about how to send all talker phrases through to an NLP that isn’t dialogflow, or has Google closed that avenue?
Is it possible to somehow go back and use the 2.1 CLI, either with the new Console or by reverting the Console? (I have both CLI versions; I can see how different their commands are.)
Is it possible to go back and use 2.1?
There is no way to go back to AoG 2. You probably don't want to anyway - the newer features are only available with v3.
Can I use my own NLP with v3?
Yes, although it isn't as obvious, and there are some changes in semantics.
As an overview, what you'll need to do is:
Create a Type that can accept "Free form text". I usually call this type "Any".
In the console, it looks something like this:
Create a Custom Intent that has a single parameter of this Any Type and at least one training phrase that captures everything for this parameter. (So you should add one training phrase, highlight the entire phrase, and set it for the parameter. Sometimes I also add additional phrases that include words that I don't want to capture.) I usually call the Intent "matchAny" and the parameter "any".
In the console, it could be something like this:
Finally, you'll have a Scene that you transition to from the Main invocation. When it matches the "matchAny" Intent, it should call your webhook with a handler name. Your webhook will be called with the "any" parameter set to the user utterance. (Note that the JSON has also changed.)
Again, the console might have it looking something like this:
That seems like a lot of work. Isn't there just some way to do all that from the command line?
Yes. You can do all of that in the configuration files that the CLI accesses and then upload it. (You can then also use the console to review the configuration, if necessary, to make sure they're configured as you expect. You can shift back and forth between them as appropriate.)
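As a sketch, assuming the names used above ("Any", "matchAny", "any") plus a made-up handler name "handleAny" and scene name "ConversationScene", the relevant files in the project could look roughly like this (paths and details are illustrative, not a definitive layout):

# sdk/custom/types/any.yaml - a type that accepts free-form text
freeText: {}

# sdk/custom/intents/matchAny.yaml - one parameter that swallows the whole phrase
parameters:
- name: any
  type:
    name: any
trainingPhrases:
- ($any 'hello world' auto=false)

# sdk/custom/global/actions.intent.MAIN.yaml - hand the invocation off to a scene
transitionToScene: ConversationScene

# sdk/custom/scenes/ConversationScene.yaml - route every match to the webhook
intentEvents:
- intent: matchAny
  handler:
    webhookHandler: handleAny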
Google also has a GitHub repository that contains most of the files pre-configured for this sort of setup.
You will need to update the configuration from the repository to handle the webhook correctly (it includes code that illustrates what is happening using the inline code editor) and to add your project ID.

Adding a block feature using Swift and Parse Server

So I'm dealing with the App Store rejection "Guideline 1.2 - Safety - User Generated Content", and one of the features Apple wants me to implement is "a mechanism for users to block abusive users". I'm not sure exactly how to put this into code, but I know what I have to do, which is:
Step 1. Create a class in Parse for blocked users, e.g. "Blocked"
Step 2. Create columns of type [String]: blockedBy & username (the user who is blocked)
Step 3. Query only those users that the current user has not blocked
Step 4. Add a button that sends a PFObject to block a user
If someone can help, I can provide info from my project. It would be much appreciated because I've been struggling with this for weeks.
Check out the tutorial at https://www.raywenderlich.com/98831/parse-tutorial-getting-started-web-backends. It is a great help: it shows how to set up the local environment and MongoDB, how to create a class within the local Parse environment run from the terminal, how to query, and so on. Let me know if you have more specific questions I can help resolve.
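A minimal Swift sketch of the four steps from the question, using the Parse iOS SDK (the class "Blocked" and the columns "blockedBy"/"username" follow steps 1 and 2, though each row here stores a single blocker/blocked pair as plain strings; treat it as a starting point rather than production code):

import Parse

// Step 4: block a user by saving a "Blocked" record (wire this to a button).
func block(username: String) {
    let blocked = PFObject(className: "Blocked")
    blocked["blockedBy"] = PFUser.current()?.username
    blocked["username"] = username
    blocked.saveInBackground { succeeded, error in
        if let error = error {
            print("Block failed: \(error.localizedDescription)")
        }
    }
}

// Step 3: query users, excluding anyone the current user has blocked.
func loadVisibleUsers(completion: @escaping ([PFUser]) -> Void) {
    let blockedQuery = PFQuery(className: "Blocked")
    blockedQuery.whereKey("blockedBy", equalTo: PFUser.current()?.username ?? "")

    let userQuery = PFUser.query()!
    // Keep only users whose username does not appear in a matching "Blocked" row.
    userQuery.whereKey("username", doesNotMatchKey: "username", inQuery: blockedQuery)
    userQuery.findObjectsInBackground { objects, error in
        completion(objects as? [PFUser] ?? [])
    }
}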

IBM Watson Conversation: Intent is not triggering the dialog, all user inputs lead to anything_else

I'm a complete newb in this field, so I started with the tutorial in the documentation. This is the page where I'm having the problem.
I'm able to create the welcome dialog, and changes in the anything_else dialog are reflected fine. But when I create a new dialog and set the trigger to the intent, it does not work. I've tried typing in the name, copy-pasting it, clicking on (create new condition) and not. I've tried different browsers and restarted the project too.
There is a dropdown on the text that I type in the bot panel that says "choose intent name" and gives the list of the two intents I've made. When I try clicking on the intent it's supposed to be in, I get this error:
Unable to change the intent. Error: Unique Violation: The value "good morning" already exists
I don't know what I'm doing wrong, and googling it gives no results! Commenters on different websites seem to have no problem with it either.
Any help is appreciated!
Edit: JSON as downloaded from the project menu:
{"name":"Car tutorial","created":"2017-04-30T16:42:55.215Z","intents":[{"intent":"greeting","created":"2017-04-30T17:50:08.575Z","updated":"2017-04-30T17:50:08.575Z","examples":[{"text":"Good afternoon","created":"2017-04-30T17:50:08.575Z","updated":"2017-04-30T17:50:08.575Z"},{"text":"Good evening","created":"2017-04-30T17:50:08.575Z","updated":"2017-04-30T17:50:08.575Z"},{"text":"Good morning","created":"2017-04-30T17:50:08.575Z","updated":"2017-04-30T17:50:08.575Z"},{"text":"Hello","created":"2017-04-30T17:50:08.575Z","updated":"2017-04-30T17:50:08.575Z"},{"text":"Hi","created":"2017-04-30T17:50:08.575Z","updated":"2017-04-30T17:50:08.575Z"}],"description":null},{"intent":"turn_on","created":"2017-04-30T17:49:26.312Z","updated":"2017-04-30T17:49:26.312Z","examples":[{"text":"Air on please","created":"2017-04-30T17:49:26.312Z","updated":"2017-04-30T17:49:26.312Z"},{"text":"I need lights","created":"2017-04-30T17:49:26.312Z","updated":"2017-04-30T17:49:26.312Z"},{"text":"Listen to some music","created":"2017-04-30T17:49:26.312Z","updated":"2017-04-30T17:49:26.312Z"},{"text":"Play some tunes","created":"2017-04-30T17:49:26.312Z","updated":"2017-04-30T17:49:26.312Z"},{"text":"Turn on the headlights","created":"2017-04-30T17:49:26.312Z","updated":"2017-04-30T17:49:26.312Z"}],"description":null}],"updated":"2017-04-30T17:56:05.345Z","entities":[{"entity":"appliance","values":[{"value":"air conditioning","created":"2017-04-30T17:51:41.232Z","updated":"2017-04-30T17:51:41.232Z","metadata":null,"synonyms":["air"]},{"value":"headlights","created":"2017-04-30T17:51:41.232Z","updated":"2017-04-30T17:51:41.232Z","metadata":null,"synonyms":["lights"]},{"value":"music","created":"2017-04-30T17:51:41.232Z","updated":"2017-04-30T17:51:41.232Z","metadata":null,"synonyms":["radio"]}],"created":"2017-04-30T17:51:41.232Z","updated":"2017-04-30T17:51:41.232Z","metadata":null,"description":null},{"entity":"genre","values":[{"value":"classical","created":"2017-04-30T17:52:56.711Z","updated":"2017-04-30T17:52:56.711Z","metadata":null,"synonyms":["symphonic"]},{"value":"rhythm and blues","created":"2017-04-30T17:52:56.711Z","updated":"2017-04-30T17:52:56.711Z","metadata":null,"synonyms":["r&b"]},{"value":"rock","created":"2017-04-30T17:52:56.711Z","updated":"2017-04-30T17:52:56.711Z","metadata":null,"synonyms":["pop"]}],"created":"2017-04-30T17:52:56.711Z","updated":"2017-04-30T17:52:56.711Z","metadata":null,"description":null}],"language":"en","metadata":null,"description":"","dialog_nodes":[{"go_to":null,"output":{"text":{"values":["Hi! What can I do for you?"],"selection_policy":"sequential"}},"parent":null,"context":null,"created":"2017-04-30T17:55:24.851Z","updated":"2017-04-30T17:55:48.868Z","metadata":null,"conditions":"#greeting","description":null,"dialog_node":"node_3_1493574922154","previous_sibling":"node_1_1493574794528"},{"go_to":null,"output":{"text":{"values":["I'm sorry, I don't understand. 
Please try again."],"selection_policy":"sequential"}},"parent":null,"context":null,"created":"2017-04-30T17:53:39.404Z","updated":"2017-04-30T17:56:05.345Z","metadata":null,"conditions":"anything_else","description":null,"dialog_node":"node_5_1493574962852","previous_sibling":"node_3_1493574922154"},{"go_to":null,"output":{"text":{"values":["Welcome to the car demo!"],"selection_policy":"sequential"}},"parent":null,"context":null,"created":"2017-04-30T17:53:17.084Z","updated":"2017-04-30T17:53:57.004Z","metadata":null,"conditions":"welcome","description":null,"dialog_node":"node_1_1493574794528","previous_sibling":null}],"workspace_id":"2b6d0fdd-c04f-40e0-9310-0bba1ad38cef","counterexamples":[]}
Based on the attached JSON, your problem is that Watson Conversation is stuck in the training process (if the pink message shows for a long time, it is stuck).
I cannot explain why it happens, but I know the solution: just use the "conversation_start" condition instead of the "welcome" trigger.
Most probably, you placed the "anything_else" node above your intent node.
anything_else should be the last node, because Watson evaluates the nodes sequentially and goes into the first one whose condition matches.
Be sure that you have the same order:
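For reference, the intended top-level order (which matches the previous_sibling links in the JSON above) is:

welcome         - greets the user when the conversation starts
#greeting       - answers the greeting intent
anything_else   - the fallback; must stay last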

CRM 2016 Plugin - Registering a New Step

I am very new to MS CRM, so I am requesting help. I am using Office 365, i.e. an MS CRM Online organisation.
I have written a plugin which should fire when a user uploads an image to an Account entity; the plugin then stores the image as an attachment in Notes.
The logic works fine when I test it from a console application.
I have registered the plugin and believe it will work fine here too. The only problem is that I am unable to register a new step for the plugin.
The problem is in Filtering Attributes: I am unable to get the entityimage attribute, even if I select/check All Attributes.
Please suggest how I should proceed.
In this scenario you can write the plugin on the "Create" message of the "Annotation" entity. The Create message does not have any filtering attributes.
Since you wrote and tested the logic in a console application, while converting it to a plugin make sure you check that the created note contains data in its "FileName" and "DocumentBody" attributes. You can also check whether the note is created against an "Account" entity. These two conditions will narrow your scope to notes created against accounts that have an attachment. You will get the above-mentioned attributes from the plugin execution context.