Google Assistant Explicit Intents without App name - actions-on-google

I would like to make my Google Assistant (Google Home and Android smartphone) a little bit smarter by adding simple small-talk intents and, last but not least, useful "OK Google, do whatever" or "OK Google, tell me when ..." intents.
For now I only own an Echo Dot with Alexa, and I really dislike their concept of skills because of the strict invocations. I have read somewhere that Google gets around this nightmare by using implicit invocation. However, what I have done so far is not even close to good.
With implicit invocation, Google Assistant can find the correct action by searching for intents. This is good, and I can add a simple phrase that Google detects correctly. However, instead of invoking that intent, Google asks me whether it should ask appname to do so.
Of course this is not really an option if we want to make digital assistants smarter, since it not only destroys any kind of smartness but also prevents us (at least me) from writing useful actions at all, because they would be annoying both to develop and to use. Assistants should be able to react to specific phrases and intents instead of requiring the user to specify the app. This makes it impossible to create simple intents like "Say goodnight" or "Ask my girlfriend when she will be here".
My question is not only whether this is currently possible, but also what we can expect regarding this problem in the future. Is there any good news? Or do we have to wait until we can help the existing assistants evolve their real power?

You can add custom trigger phrases that will open or deep link into your skill.
You do this with query patterns in action.json; see the sketch below.
Action.Json Query Pattern (Google Doc)
But the number of patterns is limited, and I am not sure whether you can completely avoid Google asking things like "Should I really open it?" or announcing "I am opening it now".
You may also still have to say "OK Google" to make it start listening at all.
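For illustration, a minimal sketch of what the relevant part of such an action package (action.json, legacy Actions SDK format) might look like; all names, phrases, and the fulfillment URL are placeholders of my own:

```json
{
  "actions": [
    {
      "description": "Main entry point",
      "name": "MAIN",
      "fulfillment": { "conversationName": "my_conversation" },
      "intent": {
        "name": "actions.intent.MAIN",
        "trigger": { "queryPatterns": ["talk to My Test App"] }
      }
    },
    {
      "description": "Deep link for saying goodnight",
      "name": "GOODNIGHT",
      "fulfillment": { "conversationName": "my_conversation" },
      "intent": {
        "name": "com.example.intents.GOODNIGHT",
        "trigger": { "queryPatterns": ["say goodnight", "wish me a good night"] }
      }
    }
  ],
  "conversations": {
    "my_conversation": {
      "name": "my_conversation",
      "url": "https://example.com/fulfillment"
    }
  }
}
```

The GOODNIGHT entry's query patterns are what would let a phrase like "say goodnight" deep link into the action, subject to the confirmation behavior described above.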

Nick Felker's answer is better than mine. To expand on it a bit:
In the Google Home app on your phone tap the hamburger menu icon (three horizontal parallel lines) in the upper left, then go to "More settings", then "Shortcuts" (near the bottom), then press the little blue "+" button in the lower right to set up your custom shortcut.
Another option for extremely simple intents ("Say goodnight", for example) is to use IFTTT, which has lots of integrations out of the box, as well as the ability to pass the message along to a webhook that you could write yourself (see the sketch below). Important caveat: IFTTT isn't "smart" itself, so that first layer of integration only does simple string matching (and I mean simple; it seems to be case-sensitive).
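A minimal sketch of such a self-hosted webhook in Node.js/TypeScript with Express; the route path and the phrase field are my own assumptions, and IFTTT's Webhooks action would be configured to POST a matching JSON body:

```ts
import express from "express";

const app = express();
app.use(express.json());

// IFTTT's Webhooks action would POST something like { "phrase": "say goodnight" } here.
app.post("/assistant-hook", (req, res) => {
  const phrase: string = req.body?.phrase ?? "";
  console.log(`Received phrase from IFTTT: ${phrase}`);
  // ...react to the phrase: log it, forward it, toggle a device, etc.
  res.sendStatus(200);
});

app.listen(3000, () => console.log("Webhook listening on port 3000"));
```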

Related

Shall I mention the deep links in the sample invocation in the directory page?

I am making an action for Google Assistant. I have created some deep links, so should I mention them in the sample invocations in the directory information? I also want users to see the first welcome screen (the default welcome intent) the first time, and not jump directly to the other intents. I cannot decide what to do. Can someone please help me with this?
Showing them in your examples lets people know that the feature is possible.
If you really don't want to allow it when they first start, you can intercept the "deep link" at invocation time and send back a reply that welcomes them first and explains things, then either let them continue right away or have them do it in a future conversation (see the sketch below).
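A minimal sketch of that interception with the actions-on-google Node.js client (v2) for Dialogflow; the intent names and the seenWelcome flag are placeholders of my own:

```ts
import { dialogflow } from "actions-on-google";

const app = dialogflow();

app.intent("Default Welcome Intent", (conv) => {
  conv.ask('Welcome! You can say "order pizza" or "check my order".');
});

// Deep-linked intent: first-time users get the welcome explanation instead of
// jumping straight into the task; returning users go straight through.
app.intent("Order Pizza", (conv) => {
  const storage = conv.user.storage as { seenWelcome?: boolean };
  if (!storage.seenWelcome) {
    storage.seenWelcome = true;
    conv.ask('Welcome! I can order pizza for you. Say "order pizza" again whenever you are ready.');
    return;
  }
  conv.ask("Sure. What toppings would you like?");
});

// Mount this handler on an Express app or deploy it as a Cloud Function.
export { app };
```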

How to keep Google Assistant Behavior but also trigger IFTTT

I know you can make custom Google Assistant triggers that will invoke IFTTT. But I want to make a custom trigger that will do something but /also/ keep the default Google Assistant behavior. Is there a way to do this?
Description of my actual goal: I speak German as much as possible at home with my daughter. But there are times where I don't know a word, so I can say "OK Google, what is $word in German?" and it will speak it to me. This is very useful.
Then I manually add that word to my vocabulary list to study it.
I would like to write my own Python/Node microservice that will receive the word and generate flashcards (do a lookup on Linguee for sample sentences, for example) in my study program automatically.
But I would also like to keep the Google Assistant behavior that reads the translation back to me on my phone.
So is there a way to accomplish this? Basically, instead of the trigger replacing the default Google Assistant behavior, I'd like it to keep that behavior and also perform a second one (issue a POST request to a custom URL).
Thank you.
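As a rough sketch of the kind of Node.js/TypeScript microservice described above, assuming something like an IFTTT Webhooks action forwards the word; the route, payload shape, and flashcard file format are all my own assumptions:

```ts
import express from "express";
import { appendFile } from "node:fs/promises";

const app = express();
app.use(express.json());

// Whatever forwards the query (e.g. an IFTTT Webhooks action) would POST
// a body like { "word": "der Schraubenzieher" } here.
app.post("/vocab", async (req, res) => {
  const word: string = req.body?.word ?? "";
  if (!word) {
    res.status(400).send("missing word");
    return;
  }
  // Placeholder for a dictionary/sample-sentence lookup before saving.
  const card = `${word}\t(translation / sample sentence goes here)\n`;
  await appendFile("flashcards.tsv", card); // TSV is easy to import into most flashcard apps
  res.sendStatus(200);
});

app.listen(3000, () => console.log("Vocabulary webhook listening on port 3000"));
```

Whether this can run alongside the default Assistant behavior is exactly the open question above; the sketch only covers the receiving end.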

Make Google Home Action work with "Hey Google, INTENT" instead of "Hey Google, ask ACTION to INTENT" possible?

Right now my Action for Google via Dialogflow only works if I say:
Hey Google, ask ACTION to INTENT
I want to remove the ask ACTION to part, so I can just say:
Hey Google, INTENT
My Action is basically a "Turn on device". I can say things like:
Hey Google, ask home to turn on TV
Hey Google, ask home to turn on fan
and so on. Is this possible? I know for Alexa they're called Home Automation Skills, but they're apparently really tricky to set up.
There are two (sorta three) answers that address your question in different ways.
First - there is no way, programmatically, to remove the ask ACTION to part. This would be like asking if there was a way to remove the hostname from a URL.
However, you (as a user) can set up a shortcut so that when you say "Hey Google, turn on the TV", this actually gets interpreted as "Hey Google, ask some action name to turn on the TV". To do this:
Go into your Google Home app.
Open the Menu -> More Settings -> Shortcuts
Second - as #shortQuestion suggested, you could rely on implicit invocation to do what you want. To pull this off, you set up the various phrases that should trigger your Action and hope that Google notices them and suggests your Action as something the user can do. There is no way, however, to force Google to pick your Action for a particular phrase; Google's pick may change over time, and it may just suggest your Action instead of immediately invoking it. This is sorta like trying to play the SEO game with Google's search engine.
But... what you're asking to do is something that is more along the lines of a Smart Home action. I wouldn't call it "tricky" to create a Smart Home action, but you cannot do it with Dialogflow, and it requires you to create and set up a server that manages (and ultimately controls) the devices in question (see the sketch below).
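To give a sense of what that server involves, here is a rough sketch using the actions-on-google Node.js client's smarthome() handler. The device IDs, names, and agentUserId are placeholders, and a real integration also needs onQuery, account linking, and actual device-control logic:

```ts
import { smarthome } from "actions-on-google";

const app = smarthome();

// SYNC: report which devices this user has. Everything below is illustrative.
app.onSync((body) => ({
  requestId: body.requestId,
  payload: {
    agentUserId: "user-123",
    devices: [
      {
        id: "tv-1",
        type: "action.devices.types.TV",
        traits: ["action.devices.traits.OnOff"],
        name: { name: "TV", defaultNames: ["TV"], nicknames: ["television"] },
        willReportState: false,
      },
    ],
  },
}));

// EXECUTE: this is where "Hey Google, turn on the TV" ends up; call your own
// device API here instead of hard-coding success.
app.onExecute((body) => {
  const command = body.inputs[0].payload.commands[0];
  const ids = command.devices.map((device) => device.id);
  return {
    requestId: body.requestId,
    payload: {
      commands: [{ ids, status: "SUCCESS", states: { on: true, online: true } }],
    },
  };
});

// onQuery (reporting current device state) is also required but omitted for brevity.
export { app };
```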
I just found this in the Invocation and Discovery docs: you are able to do that! There are three kinds of invocation:
Invocation name, e.g. "Talk to Dr.A"
Deep link invocation, e.g. "find recipes"
Discovery (MOST IMPORTANT)
In some cases, some of an intent's query patterns can trigger your action, even if users don't use your invocation name.
This is not programmatically possible on the Google Assistant.
The only way to do this is by setting a shortcut in your Assistant. You could set "INTENT" as a shortcut for "ask ACTION to INTENT".
Go to the "Action discovery and updates" section of the actions on google console, and configure some implicit invocations for the public to discover the functionality within your assistant without explicitly invocating your bot by name.

Why would an invocation name for an AoG app be ignored?

I have an Actions on Google app in testing. Most of the time, when I say "OK Google, talk to 'my app name here'", my app runs. Sometimes it does not, and Google passes the question to Google Search. Then, on my phone I will get search results in the Google app; in the simulator I will see a message like "blah blah blah not supported in simulation".
I have had the question up since last week on the official Google+ "support" page, with only a single reply, from a person who I think is just another developer, asking whether the screenshots were real or not.
[Screenshots: a successful invocation, and an unsuccessful invocation handled by search. The screenshots were captured, not drawn.]
Does anyone here have an idea why search is run and what I can do about it if anything?
This is a hobby project of mine, to be sure, but if I were trying to speech-enable a device, it seems to me that this might be a showstopper and a reason to go with another vendor. No?
Just from those screenshots, my first thought is "how is 'visor' pronounced?" And how could it sound like you're mispronouncing it? If it doesn't recognize the "visor" part as matching the pronunciation you think it should be getting, even if the displayed word is the same, it might be passing it along to search to handle.
Remember - this is English. What is written out isn't necessarily what it sounds like. And the system is trying to match what you say and not what it is written as.
One thing you can do is to listen to the recordings Google has of your invocation attempts. Try and figure out if the successful ones sound different from the ones that failed.

Ideas for an overlay for really long webpages for mobile phones

I like reading FAQs from GameFaqs.com and I'd like to read them on my iPhone. The problem is that most game FAQs are really, really long, which makes scrolling through them a big pain. Flicking through screen after screen of text takes a very long time.
I am not a web developer, but would there be a way to somehow create an overlay through which we can access an FAQ and that gives us commands like "search", "go to page", "next page" and "prev page"?
I'm unclear on what kind of solution you're looking for. Presumably programming of some sort, since this is Stack Overflow. :-)
You could certainly write a native iPhone app that does this. You could also write a proxy-type web service that does it. Presumably those are obvious enough that they are not what you want.
You could do this with bookmarklets; there are existing examples that already include searching within the page (a trivial sketch is below).
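As a trivial illustration of the bookmarklet idea (my own sketch, not one of those existing examples), each line below is saved as the URL of a bookmark and tapped while viewing the FAQ:

```js
// "Next page": scroll down by one viewport.
javascript:(function(){window.scrollBy(0, window.innerHeight);})();

// "Search": prompt for a term and jump to its first occurrence
// (window.find is non-standard but works in Safari/WebKit).
javascript:(function(){var t=prompt('Search for:');if(t&&window.find){window.find(t);}})();
```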
There has actually already been something made that fixes this problem. It's called QuickScroll, and it's for jailbroken iPhones. Link here: http://moreinfo.thebigboss.org/moreinfo/quickscroll.php
Just search for "quickscroll" in Cydia. It's free, so go check it out!