How to invoke Google Assistant via Button - actions-on-google

Hi all,
We want to trigger Google Assistant with custom actions via a button rather than via voice input (the hotword).
For example, we would normally invoke Google Assistant by saying "Hey Google, show me the weather." In our product, however, we want to press one specific button and have it send that sentence to Google Assistant directly.
However, we can't find any API that supports this requirement. We have also heard that Google plans to support a hardware-key method, since Samsung has made a good experience of this on the S8.
Can anyone help us close this gap?
Thank you!

You could use an Action link without any additional parameters to trigger the MAIN intent, or specify the custom intent you'd like to trigger.
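An Action link is just a URL, so your button handler can simply open it. Here is a minimal sketch of what such a link looks like; the uid and the intent name are placeholders (copy your real base link from the Actions console):

```typescript
// Minimal sketch of an Action link. The uid and intent name are placeholders;
// copy your real base link from the Actions console.
const baseLink =
  'https://assistant.google.com/services/invoke/uid/0000001234567890';

// With no extra parameters, the link triggers the Action's MAIN (welcome) intent.
const mainLink = baseLink;

// Adding an intent query parameter deep-links into a specific custom intent.
const weatherLink = `${baseLink}?intent=${encodeURIComponent('Show Weather')}`;

// Your hardware button's handler would simply open one of these URLs on a
// device where the Assistant is available.
console.log(mainLink, weatherLink);
```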

Related

How to display info on the default screen of a Google Nest hub

I am trying to display info from a Google Sheet on the "lock screen" (not sure what to call it) of a Google Nest Hub.
I want the info to be displayed all the time and to take advantage of this screen that is always on.
Basically, it would be a to-do list. I don't understand why I need to invoke an app or talk to my device when the screen is always on, displaying the weather, the time, and a background picture.
WHAT I TRIED SO FAR:
I have searched the general documentation for Google Assistant (https://developers.google.com/assistant).
I don't see any documentation about this, or any app available (yet?) that has this feature.
Thanks for any help/suggestion.
The platform does not provide a way for third-party apps to add content to the homescreen.

Have Google Home trigger smart device

I'm developing a smart device that needs to respond to a trigger and take an action. I'm having some trouble, however, determining what will host the code that fires the trigger. Google Home appears to have time-based events, but I can't seem to find anything that can trigger an event based on something like the weather. IFTTT seems like a natural fit, but having customers install IFTTT and then find my applet is a bit cumbersome. I could have my server monitor the condition and fire the trigger, but ideally the trigger would be generated on-premises.
So my question: does anyone have a good suggestion for where to host code that fires a trigger that is sent to a smart device?
*First-time poster, so forgive me for any lack of formalities.
Automations on Google Home are available for triggering actions, but they might not cover all the use cases you described. You can create your own system that changes the states of the devices based on your conditions, then report them to Google via Report State.
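For the Report State part, here is a minimal sketch of what that call can look like from a Node.js backend, assuming the googleapis client library, a service-account key with HomeGraph access, and a hypothetical device id 'my-smart-device-1':

```typescript
// Minimal sketch, assuming the `googleapis` npm package, a service-account
// key file (path is a placeholder) with HomeGraph access, and a hypothetical
// device id 'my-smart-device-1' exposing the OnOff trait.
import {google} from 'googleapis';

async function reportState(agentUserId: string, on: boolean): Promise<void> {
  const auth = new google.auth.GoogleAuth({
    keyFile: 'service-account.json', // placeholder path to your key
    scopes: ['https://www.googleapis.com/auth/homegraph'],
  });
  const homegraph = google.homegraph({version: 'v1', auth});

  await homegraph.devices.reportStateAndNotification({
    requestBody: {
      requestId: Date.now().toString(), // any unique id for this report
      agentUserId,                      // the user id you returned at SYNC time
      payload: {
        devices: {
          states: {
            'my-smart-device-1': {on},  // hypothetical device id and its new state
          },
        },
      },
    },
  });
}

// Example: your own condition monitor (on-prem or server-side) decides the
// weather condition was met, switches the device, then reports the new state.
reportState('user-123', true).catch(console.error);
```

Your own condition monitor decides when the state changes; Report State just keeps Google's Home Graph in sync so the Assistant and the Home app reflect it.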

detect/prevent keyboard input on Google Action

I am trying to develop a game that needs to have voice input only. If the user enters text, it will ruin the game experience. Is there any way to prevent the user from using keyboard input on a Google Action?
This depends on what you are using for your Actions on Google development. If you are using Dialogflow or the legacy Actions SDK, you can check the input type of the request. If a request matches INPUT_TYPE_KEYBOARD, you can ignore it in your webhook.
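As a rough sketch of the Dialogflow case, using the actions-on-google Node.js client library (the intent name 'play_turn' is made up for illustration), the check can look like this:

```typescript
// Minimal sketch of a Dialogflow webhook using the actions-on-google
// Node.js client library. The intent name 'play_turn' is hypothetical;
// conv.input.type reports how the last user input was given
// ('VOICE', 'KEYBOARD', 'TOUCH', ...).
import {dialogflow} from 'actions-on-google';

const app = dialogflow();

app.intent('play_turn', (conv) => {
  if (conv.input.type === 'KEYBOARD') {
    // Typed input: don't process the turn, just re-prompt for voice.
    conv.ask('This game is voice only. Please say your answer out loud.');
    return;
  }
  conv.ask('Got it! Next round.');
});

// `app` is a standard request handler; wire it up to your hosting
// environment (for example, Cloud Functions for Firebase or Express).
export {app};
```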
For the new version of the Actions SDK with Actions Builder, I cannot find any mention of this type in the documentation or the SDK overview, so I'm not sure whether that would also work in the new version.
Another way to do this would be to prevent your Action from being deployed to surfaces that have a keyboard. You can do this in the settings of your Action, but it means the Action will not be available on phones and other such devices, even for users who would want to use voice.

Is it possible to implement Hands free call using Dialogflow for Google Home

I have a requirement to develop hands-free calling using Dialogflow, with or without third-party apps. Is it possible to implement business services with a custom skill for hands-free calls from Google Home? Does Google Home provide this kind of permission?
If there is a way, could I have a sample/example related to the above requirement?
Thanks in advance.
As far as I know, Google (and I don't speak for them) does not publish an API for making calls the way the Home devices do. The best you can do is put a link on a button on a basic card, or provide a suggestion link that references the "tel:" scheme.
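As a rough sketch with the actions-on-google client library (the intent name and phone number below are made up), such a card could look like this; whether tapping the button actually starts a call depends on the surface, so treat it as a best-effort workaround:

```typescript
// Minimal sketch using the actions-on-google client library. The intent
// name and phone number are placeholders; the tel: link is only useful on
// surfaces that can place calls (i.e. phones), not on a Google Home speaker.
import {dialogflow, BasicCard, Button} from 'actions-on-google';

const app = dialogflow();

app.intent('call_support', (conv) => {
  conv.ask('You can reach our team by phone:');
  conv.ask(
    new BasicCard({
      title: 'Call our support line',
      text: 'Tap the button on a phone to start the call.',
      buttons: new Button({
        title: 'Call now',
        url: 'tel:+15551234567', // placeholder number
      }),
    })
  );
});

export {app};
```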

How to make Siri perform a specific action in my App?

I want to integrate Siri into my app and have it perform specific actions, i.e.:
1- open specific view
2- send feedback
I searched for this functionality but found no useful answers.
Has anyone tried using Siri this way?
Thanks.
Siri right now only works for certain 'domains' (see https://developer.apple.com/sirikit/).
If your app is in one of these domains, you can trigger certain app functions through Siri. Depending on what kind of app you are working on, you could theoretically use the Messaging domain to let users send you feedback.
You won't be able to use Siri to navigate through your app by voice, but you might be able to trigger certain features.