Is there a Rasa story log? - chatbot

In Rasa, my chatbot is not following the story well. Is there a way to collect data about a conversation and see why it chose the story step it chose?

You can add the --debug flag to any Rasa CLI command (e.g., rasa shell, rasa run) to get logs that show how the intent of each user message is classified and which actions the policies predict. I would also recommend writing test stories and checking whether any of them fail.
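For example, a minimal sketch (assuming a trained model in your project and test stories in the default tests/ directory):

# chat with the bot while logging intent classification and policy predictions
rasa shell --debug

# run the test stories and report any story prediction failures
rasa test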

Related

Watch GH Actions workflow output in CLI?

I am trying to execute a GitHub workflow via the CLI, on a particular branch. The documentation for this is available here: https://cli.github.com/manual/gh_workflow_run
Is it possible to somehow get the same kind of logs you get with the browser UI in the terminal? I would like to programmatically interact with it.
If it's possible with their REST API, then that is even better.
https://docs.github.com/en/rest/reference/actions#create-a-workflow-dispatch-event As per the docs, it just returns a Status: 204 No Content.
It's possible to trigger a workflow remotely using the GitHub API with the workflow dispatch event endpoint you mentioned.
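For reference, a minimal sketch of that dispatch call with curl (OWNER, REPO, WORKFLOW_ID, and the branch are placeholders; the token needs access to the repository's workflows):

# returns 204 No Content on success, as noted above
curl -X POST \
  -H "Accept: application/vnd.github+json" \
  -H "Authorization: Bearer $GITHUB_TOKEN" \
  https://api.github.com/repos/OWNER/REPO/actions/workflows/WORKFLOW_ID/dispatches \
  -d '{"ref":"main"}'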
The GitHub CLI has various commands for workflows (run, list, view, enable, or disable). You can find more information in the official documentation.
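For example, to trigger a run on a particular branch (the workflow file name and branch are placeholders):

gh workflow list
gh workflow run build.yml --ref my-branch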
To get the logs from the GitHub CLI, as explained in its documentation, you can use commands such as:
gh run view run-id --log
Note that if you don't specify run-id, GitHub CLI returns an interactive menu for you to choose a recent run, and then returns another interactive menu for you to choose a job from the run.
You can also use the --job flag to specify a job ID. Replace job-id with the ID of the job that you want to view logs for.
gh run view --job job-id --log
You can use grep to search the log. For example, this command will return all log entries that contain the word error.
gh run view --job job-id --log | grep error
To filter the logs for any failed steps, use --log-failed instead of --log.
gh run view --job job-id --log-failed
Note that you can get a workflow run ID from the GitHub API as well.
Therefore, since you should already have the job ID from the workflow file, it should be possible to start a workflow with a dispatch event through the API, then get the workflow run ID from the workflow runs list API, and use the GitHub CLI command(s) in a loop to get the logs.
It's not pretty, but gathering all those steps in a script should work as a workaround.
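For example, a rough sketch of such a script (the workflow file and branch are placeholders, and the sleep simply gives the dispatched run time to appear in the run list):

#!/usr/bin/env bash
set -euo pipefail

WORKFLOW=build.yml   # placeholder workflow file name
BRANCH=my-branch     # placeholder branch

# trigger the workflow on the given branch
gh workflow run "$WORKFLOW" --ref "$BRANCH"

# wait for the new run to show up, then grab its ID
sleep 10
RUN_ID=$(gh run list --workflow "$WORKFLOW" --branch "$BRANCH" --limit 1 \
  --json databaseId --jq '.[0].databaseId')

# stream progress until the run finishes, then print the full log
gh run watch "$RUN_ID"
gh run view "$RUN_ID" --log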

IBM Watson Assistant export logs

I used the script below to export logs from Watson successfully a few weeks ago. However, I couldn't use the same script to export logs in the past few days.
I received error message "no logs found" after sending the request.
May I know if anyone else has encountered this problem?
https://github.com/watson-developer-cloud/community/commit/b91891c5379ecc62b1ddcded34f6e4a1d58d6e1c
If you receive a result like "no logs found" and no error message, the request itself worked. Depending on your IBM Watson Assistant service plan, there are different limits on how long logs are retained. Thus, you might have received logs the first time you checked, but when checking again some days later with no new activity, there may no longer be any log data to return.
The API docs for Watson Assistant have details on listing log events, with code samples for Python, Node.js, Java, Go, and more languages. Check that your script follows the Python example.
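For example, a minimal curl sketch of that list-logs call (the service URL, workspace ID, API key, and version date are placeholders for your own instance values):

# list recent log events for one workspace/skill
curl -u "apikey:$ASSISTANT_APIKEY" \
  "$ASSISTANT_URL/v1/workspaces/$WORKSPACE_ID/logs?version=2021-06-14&page_limit=100"

An empty logs array in the response (rather than an HTTP error) means the request worked but no log events are currently retained for that workspace.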

How can command output be accessed through the UDEPLOY REST API?

I am using https://host/cli/componentProcessRequest/info/ to get information about the execution details of a component process request, but this gives only basic information, not details like logs.
In this process we execute a shell script. I want to get the shell output log through the REST API. Is there any way I can achieve this?

Can I re-run a Power Automate flow instance from history?

Is there any way to find and re-run an earlier instance of a Power Automate workflow programmatically?
I can do this manually: download the .csv file containing the instances, search the Trigger output column for the one I want, get the ID, copy-paste the run URL, and click resubmit.
I tried with Power Automate itself:
The built-in Flow Management connector only supports finding a specific flow by name, and does not even expose the run history.
PowerShell:
With the PowerApps module installed, I can list the instances with
Get-FlowRun -FlowName {flow name}
But I don't see the same properties as in the exported .csv file, and there's also no Run-Flow cmdlet that would let me run it.
So, I am a little stuck here; could someone please help me out?
We cannot yet programmatically resubmit a Flow run from the history with PowerShell or any other API method.
But we can avoid some manual work by using the workflow() function in a Flow Compose step to automate the composition of the Flow run history URL. Read more:
https://xxx.flow.microsoft.com/manage/environments/07aa1562-fea6-4583-8d76-9a8e67cbf298/flows/141e89fb-af2d-47ac-be25-f9176e64e9a0/runs/08586722084717816659969428791CU12?backUrl=%2Fflows%2F141e89fb-af2d-47ac-be25-f9176e64e9a0%2Fdetails&runStatus=Failed
There are 3 GUIDs that I need to find so that I can build up the flow history URL.
The first GUID is my environment name (07aa1562-fea6-4583-8d76-9a8e67cbf298), then the flow name (141e89fb-af2d-47ac-be25-f9176e64e9a0), and finally the run ID (08586722084717816659969428791CU12).
There is a cmdlet in the Microsoft 365 CLI to resubmit a flow run:
m365 flow run resubmit --environment flowEnvironmentID --flow flowGUID --name flowRunID --confirm
You can also resubmit a flow run using the Power Automate REST API:
https://api.flow.microsoft.com/providers/Microsoft.ProcessSimple/environments/{FlowEnvironment}/flows/{FlowGUID}/triggers/manual/histories/{FlowRunID}/resubmit?api-version=2016-11-01
For the Power Automate REST API, you will have to pass an authorization token.
For more information, go through the following post
https://ashiqf.com/2021/05/09/resubmit-your-failed-power-automate-flow-runs-automatically-using-m365-cli-and-rest-api/
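For example, a rough sketch combining the two approaches (the environment, flow, and run IDs are placeholders, the option names may differ slightly between CLI versions, and the bearer token must be valid for the Flow service):

# resubmit via the Microsoft 365 CLI
m365 login
m365 flow run resubmit --environment <flowEnvironmentID> --flow <flowGUID> --name <flowRunID> --confirm

# or call the REST API directly
curl -X POST \
  -H "Authorization: Bearer $FLOW_TOKEN" \
  "https://api.flow.microsoft.com/providers/Microsoft.ProcessSimple/environments/<FlowEnvironment>/flows/<FlowGUID>/triggers/manual/histories/<FlowRunID>/resubmit?api-version=2016-11-01"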

How do I configure and access logs in Java and Docker based OpenWhisk actions?

Looking at OpenWhisk samples, it seems like JavaScript-based actions can use console.log() to report log information that would be collected and accessible through the activation API.
However, it is not clear how to report logs from Java-based or Docker-based actions.
Logs for OpenWhisk actions are taken from stdout and stderr of the action. That mechanism applies to all runtimes.
The Docker-based approach assumes that the stdout of the program you run is the result of the action. It therefore takes stdout and tries to JSON-parse it, so whatever is written there becomes the action's result rather than its logs. Currently there is no way to write logs in a Docker-based action.
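For a Java-based action, anything written to System.out/System.err should therefore appear in the activation record. A minimal sketch of retrieving it with the wsk CLI (myJavaAction and the activation ID are placeholders):

# invoke the action; wsk prints the resulting activation ID
wsk action invoke myJavaAction

# list the most recent activation and show its logs
wsk activation list --limit 1
wsk activation logs <activation-id>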