google actions: conversation content order with rich response - actions-on-google

I am able to use the SimpleResponse, BasicCard, List, and other such rich responses. Can the following be supported?
a. speech only + BasicCard + SimpleResponse
If I build a response such as:
conv.ask('<speak> ... </speak>');
conv.ask(new BasicCard({
  // card fields ...
}));
conv.ask(new SimpleResponse({
  speech: '...',
  text: '...',
}));
I notice that on display devices (phones), the content of the speak tag appears as text too. Is there a way to avoid that?
Next, the text of the SimpleResponse appears before the card. Is there a way to ensure it appears after the card?
Currently, for the first problem, I am forced to use a SimpleResponse with a short text (like "Hi"), and for the second problem, I have put the text as the card text and removed the SimpleResponse.
But I would like to know if there is a better way. Thanks!

First of all: as stated in the reference docs for the Node.js library, the first item in your response should always be a SimpleResponse, and a SimpleResponse always shows text, whether it's a short text that you define or the transcription of its speech property. But I like that you're putting in a short text instead, to avoid showing the user verbatim what your Action says.
Second: in my experience, the order of the responses isn't shown accurately in the simulator. I've tested your case in a dummy Action, and while the simulator shows the Final Response (which is last in my code) before the card, my phone shows them in the correct order.
(Simulator and smartphone screenshots.)
Test it on a device and see if the error persists. I don't currently have my Google Home with me, but test on it as well if you can.
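For reference, here is roughly the shape of what I tested, as a minimal sketch (the intent name and strings are placeholders; this assumes the actions-on-google v2 Node.js library):
const { BasicCard, SimpleResponse } = require('actions-on-google');

app.intent('my_intent', (conv) => {
  // The first item must be a SimpleResponse; keep its displayed text short.
  conv.ask(new SimpleResponse({
    speech: '<speak>Here is everything I found for you.</speak>',
    text: 'Hi!',
  }));
  conv.ask(new BasicCard({
    title: 'Results',
    text: 'Card text goes here.',
  }));
  // The Final Response comes last in the code; my phone renders it after the card.
  conv.close(new SimpleResponse({
    speech: 'That is all for now.',
    text: 'That is all for now.',
  }));
});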

For your first problem: if you want to use SSML tags, you have to use a SimpleResponse; that's how it's meant to work. In other words, your first problem is not a problem :)
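For example, something like this sketch keeps the SSML and the on-screen text separate (the strings are placeholders):
conv.ask(new SimpleResponse({
  // The SSML is spoken; the plain text below is what screens display instead.
  speech: '<speak>Here is what I found.<break time="300ms"/>Take a look.</speak>',
  text: 'Here is what I found.',
}));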

Related

google actions: reprompts not showing

I am trying to get the system to prompt the user if they are silent or have not entered any response. This is using the Actions SDK.
As per the documentation (https://developers.google.com/actions/assistant/reprompts), I set the conversation object in the JSON as:
"inDialogIntents": [
{
"name": "actions.intent.NO_INPUT"
}
]
Then, in the functions code I have the following:
app.intent('no_input', conv => {
  conv.ask('Hello');
});
Yet there has been no response even after waiting for a few minutes. I even tried
app.intent(actions.intent.NO_INPUT, conv => {
  conv.ask('Hello');
});
but the code has not been called. Can someone share what needs to be done to get this working? Thanks.
Here's a more detailed version of my comment:
First of all, smartphones DO NOT have no-input support: they close the mic automatically when the user doesn't say anything, and they make this visually clear. So if you're testing on a smartphone, that's the reason you're not seeing your reprompts.
As for testing the no-input prompts, it can be rather hard to do on a Google Home. Maybe you don't have access to one, or you don't want to wait awkwardly staring at your device. For these reasons we have a "No Input" button in the Simulator:
You can use this button to simulate a No Input prompt. If that still doesn't solve your problem, then you can assume there's something wrong with your code.
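If the button does produce a no-input event and your fulfillment still isn't called, compare your handler against this rough sketch (assuming the actions-on-google v2 library for the Actions SDK; the prompt texts are placeholders). It registers the raw intent name as a string and reads the reprompt arguments described in the docs:
app.intent('actions.intent.NO_INPUT', (conv) => {
  // REPROMPT_COUNT tells you how many no-input events have occurred so far.
  const repromptCount = parseInt(conv.arguments.get('REPROMPT_COUNT'), 10);
  if (repromptCount === 0) {
    conv.ask('Sorry, I did not hear that. Could you say it again?');
  } else if (repromptCount === 1) {
    conv.ask('Are you still there?');
  } else if (conv.arguments.get('IS_FINAL_REPROMPT')) {
    // The final reprompt has to end the conversation.
    conv.close('Okay, let us try this another time.');
  }
});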

google actions sdk: how to include an audio when displaying a basic card response

When responding to a user query using the Actions SDK, I am able to create a basic card using:
conv.ask(new BasicCard({
  text: 'Text with card display',
  title: 'Title:',
  display: 'CROPPED',
}));
However, if I wish to provide the user with some audio (different from the display text), how do I do it?
I tried adding conv.ask('<speak>' + 'Hello' + '</speak>'); but it throws an error:
MalformedResponse
expected_inputs[0].input_prompt.rich_initial_prompt.items[0].simple_response: 'display_text' must be set or 'ssml' must have a valid display rendering.
What is the best way to include audio in a Google Actions project? Thanks.
If you want to play the audio in the background, I'd suggest using SSML; but if your actual goal is just to deliver the audio to the user (like a podcast or something), you can use a Media Response.
If, however, you want the text displayed on a device with a screen to be different from the text that's spoken, you could add a Simple Response (which has the option to add different text and speech).
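As a rough sketch of the Media Response route (assuming the actions-on-google v2 library; the URL and strings are placeholders, and a SimpleResponse still has to come first):
const { MediaObject } = require('actions-on-google');

// A simple response must precede the media item in the same turn.
conv.ask('Here is the episode.');
conv.ask(new MediaObject({
  name: 'Episode title',
  description: 'A short description of the audio',
  url: 'https://example.com/audio.mp3',
}));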
The Basic Card does not have audio attached to it. As the name suggests, it is a visual cue rather than an audible one. It is meant to supplement the text that is spoken and displayed - not replace it.
While you can create a SimpleResponse that has different text that is spoken vs displayed, you should make sure that both responses are substantially the same. You can use a SimpleResponse with something like this:
conv.ask(new SimpleResponse({
  speech: '<speak>Here are the results, showing our sunny recreational facilities. <audio src="https://actions.google.com/sounds/v1/animals/cicada_chirp.ogg">And the sounds of nature.</audio></speak>',
  text: 'Here is the result',
}));

point-of-sale-api iOS callback from FileMaker Go

I'm close to getting my homegrown POS app to work with Square, but I'm missing something simple and can't seem to turn up an answer. I'm using FileMaker Go as the app, but I don't think that that is relevant to my current proof-of-concept issue. It may be relevant to other issues later (callbacks).
In my point-of-sale-api settings, I have:
com.filemaker.go.17
for the Bundle ID, and
create-workflow
for the iOS App URL Schemes, which seems to be the only form that Square allows me to save. Any prefixed value such as shortcuts://create-workflow gives an error with no description (I'm hoping that Square will trigger a Workflow as a test in this POC).
I'm hoping to just trigger Safari or Workflow/Shortcuts with the callback, as FileMaker Go doesn't directly accept the callback response without a helper application - which I'll eventually try.
Any thoughts on what I'm missing?
Thanks tons!

XCUITest not recognizing alert

I'm new to XCUITest and have come across a problem where it's not recognizing an alert.
I used the recorder to get the commands, but when I try to play it back, it fails with an error saying:
No matches found for Find: Descendants matching type Alert from input...
let app = XCUIApplication()
app.navigationBars["Spree.HomeWebView"].children(matching: .button).element(boundBy: 1).tap()
app.alerts["Select a Saved Password to Use With “Spree-DEBUG”"].buttons["Not Now"].tap()
I thought it might be a problem with the double quotes in the string, but when I tried the following:
app.alerts["Select a Saved Password to Use With \“Spree-DEBUG\”"].buttons["Not Now"].tap()
It said Invalid escape sequence in the literal.
Since you want to query a button that is inside an alert, you can omit the alert's subscript part of the query entirely. Just like:
app.alerts.buttons["Not Now"].tap()
This works because you probably don't have more than one alert shown at the same time, so querying all alerts should give you the one you need, right? :)
Finally, I found a real solution. Basically, iOS 11-12 works fine with an interruption handler, but iOS 13 doesn't if you uninstall the app during automation. So, as others have suggested, you need to go through the springboard to handle alerts that are not part of the app. In summary:
Handle in-app interruptions with addUIInterruptionMonitor.
Query in-app alerts with app.alerts.
Handle system alerts through the springboard (XCUIApplication(bundleIdentifier: "com.apple.springboard")).

appAPI.notifier.show sometimes not work

I have a problem with Crossrider code.
I want to display a notifier with this code:
appAPI.notifier.show({
  'name': 'my-notification-name',
  'title': 'Title',
  'body': body_popUp,
  'theme': 'facebook',
  'position': 'bottom-left',
  'close': false,
  'sticky': true,
  'fadeAfter': 1,
  'width': '700px',
  'closeWhenClicked': false
});
but it sometimes works and sometimes doesn't.
Do you have any idea why? Do I have to write any instructions before calling .show()?
Thanks in advance, Mattia
You don't show what body_popUp is set to, but assuming it's valid HTML and the code is placed in the extension.js file, it looks fine.
In general, note that the notification is smart and only appears when it detects user movement in the browser. This algorithm is used as a way to ensure that the notification is seen, as it assumes the user is looking when activity is detected.
[Disclosure: I am a Crossrider employee]