Voice biometric in Google Assistant - actions-on-google

For one of our projects, we want to implement a voice biometric solution in a Google Assistant app to protect users' PHI. We need guidance on the following:
Can we add voice biometrics to the invocation phrase? I know that Google does Voice Match, but we want to implement our own voice biometric authentication so that only authorized users can access specific skills. How can we do this, or access and control the audio stream during invocation?
How can we get the audio stream once the Assistant app has been invoked? We want to add passive voice biometrics to our Google Assistant app.
Please let me know if any details are available. Thank you.
Note: The above information is needed for the Google Home / Google Assistant device channel.

For privacy and security reasons, at no point can you get access to the voice stream or a recording of the user's voice when using Actions on Google or through a Google Home device.
If you were building your own device and using the Assistant SDK, you would have access to the stream (since you have access to the hardware directly).
I'm not sure what you mean by "voice biometric" vs "voice match", but Voice Match, combined with account linking, lets you enforce that only specific users can access your action.
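For illustration, here is a minimal sketch of how that check typically appears in a Dialogflow fulfillment webhook, assuming the actions-on-google Node.js client library and Google Sign-In account linking; the OAuth client ID and intent names are placeholders, not anything from the question:

```typescript
// Minimal sketch, assuming a Dialogflow fulfillment webhook built with the
// actions-on-google Node.js client library and Google Sign-In account linking.
// The OAuth client ID and intent names are placeholders.
import { dialogflow, SignIn } from 'actions-on-google';

const app = dialogflow({ clientId: 'YOUR_OAUTH_CLIENT_ID.apps.googleusercontent.com' });

app.intent('Default Welcome Intent', (conv) => {
  // Voice Match: a recognized household voice arrives as VERIFIED,
  // an unrecognized voice arrives as GUEST.
  if (conv.user.verification !== 'VERIFIED') {
    conv.close('Sorry, I did not recognize your voice.');
    return;
  }
  conv.ask(new SignIn('To access your records'));
});

// Dialogflow intent mapped to the actions_intent_SIGN_IN event.
app.intent('Sign In Result', (conv, params, signin: any) => {
  const profile = conv.user.profile.payload;
  if (signin.status === 'OK' && profile) {
    conv.ask(`Thanks, ${profile.given_name}. How can I help?`);
  } else {
    conv.close('You need to link your account before using this action.');
  }
});

export { app };
```

Voice Match is what causes conv.user.verification to come back as VERIFIED for recognized household members; account linking then ties that verified voice to a concrete account before anything sensitive is exposed.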

Related

Register user biometric fingerprint in Flutter

Can we register a biometric fingerprint on the phone in Flutter? I searched on Google and found the local_auth Flutter plugin, but it can only list the available biometrics and authenticate with a fingerprint; what I need is to register (enroll) biometric fingerprints on the device.
Third-party apps do not have the capacity to register/add biometric material to devices, no matter what platform you are using -- Flutter, etc. Here is how the flow works in general.
1. The user gets a new phone (purchased, gifted, found, etc.) that supports biometric authentication.
2. The user goes to Settings and enrolls a biometric template (e.g. enrolls their fingerprint as a way of unlocking the device). In general, this is the only way to register/enroll a fingerprint/face/iris/etc.
3. Your app wants users to authenticate using biometrics and so implements something similar to what's described here or here.
4. Now inside your app, when the user taps to authenticate(), your app never actually sees any biometric material. Biometric material is kept in a secure location so that third-party apps cannot access it. What your app gets is an acknowledgement from the framework that the fingerprint/face/iris trying to authenticate into your app is indeed enrolled on the device. Check out the blog posts I mentioned for more details.
You can do this. Just take a look at this:
Fingerprint Authentication in Flutter
You also need to set the corresponding permissions in the Android manifest file.

Set a voice password before opening any app via Google Assistant

I'm currently working on a vault-like app that only lets you open another app with your voice or fingerprint via the Google Assistant. With a fingerprint I can probably do it, but how can I do it by voice? I would need to store the voice somewhere and then compare it later. Please help me.
For starters - you can't check a user's voice on the Assistant. Google doesn't send you the audio of what the user has said. All you get is a text transcript.
However... what you want is already implemented by Google. Users who have set up voice recognition on their Google Home devices will be recognized by the Assistant when they say "ok google" or "hey google", and you will receive a unique userID for those users. You'll get the same ID the next time they talk to you. If the Google Home doesn't recognize the voice, it won't provide the ID.
There are caveats with the voice recognition, and voices can be recorded and played back, so this isn't a foolproof security method, but from what you have described, it sounds like Google already does what you want.
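As a rough sketch of keying data to a recognized voice (assuming the actions-on-google Node.js client library and Node 16+ for crypto.randomUUID; the intent name is a placeholder), a webhook can store its own ID in user storage, which only persists for verified users:

```typescript
// Minimal sketch, assuming the actions-on-google Node.js client library.
// It keys data to a recognized (voice-matched) user by saving an ID in
// user storage; storage only persists for verified users.
import { dialogflow } from 'actions-on-google';
import { randomUUID } from 'crypto';

const app = dialogflow<{}, { userId?: string }>();

app.intent('Open Vault', (conv) => {
  if (conv.user.verification !== 'VERIFIED') {
    // Unrecognized voice: don't unlock anything.
    conv.close('Sorry, I did not recognize your voice.');
    return;
  }
  // Reuse the ID from a previous conversation, or mint one on first contact.
  const userId = conv.user.storage.userId ?? (conv.user.storage.userId = randomUUID());
  conv.ask(`Welcome back, user ${userId.slice(0, 8)}. The vault is unlocked.`);
});

export { app };
```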

Is it possible to test Google Smart Home in the Google Console during development?

I am experimenting with Google Smart Home. My end goal is to receive home control events, such as "Turn on the lights in the living room", in my Dialogflow fulfillment service.
I am wondering if it is possible to develop and test Google Smart Home without actual devices. That is, it would be great if I could set up a HomeGraph configuration via the browser and verify Google Smart Home actions via the console (console.actions.google.com).
Is this possible and practical?
No, currently it isn't possible to test smart home devices with the web-based Simulator. You will need to add the device using the Google Home app on your phone and can then test it using the phone's Assistant.
As an aside, you also can't develop it using Dialogflow directly - you'll need to use the Actions SDK. (You can relay requests to Dialogflow if you want - but since you just get events and not conversation, it really doesn't help you much.)
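To illustrate the "events, not conversation" point: a smart home fulfillment only answers SYNC/QUERY/EXECUTE intents. Below is a minimal sketch assuming the actions-on-google Node.js client library's smarthome() entry point; the single light is a made-up virtual device, which is also one way to exercise the flow without physical hardware once it is added through the Google Home app:

```typescript
// Minimal sketch, assuming the actions-on-google Node.js client library's
// smarthome() entry point. The device is a made-up virtual light.
import { smarthome } from 'actions-on-google';

const app = smarthome();

// SYNC: report which devices this user has.
app.onSync((body) => ({
  requestId: body.requestId,
  payload: {
    agentUserId: 'virtual-user-1', // placeholder account identifier
    devices: [{
      id: 'light-1',
      type: 'action.devices.types.LIGHT',
      traits: ['action.devices.traits.OnOff'],
      name: { name: 'Living room light' },
      willReportState: false,
    }],
  },
}));

// QUERY: report the current state of the virtual device.
app.onQuery((body) => ({
  requestId: body.requestId,
  payload: { devices: { 'light-1': { on: true, online: true, status: 'SUCCESS' } } },
}));

// EXECUTE: apply a command ("turn on the lights...") and report the new state.
app.onExecute((body) => ({
  requestId: body.requestId,
  payload: {
    commands: [{ ids: ['light-1'], status: 'SUCCESS', states: { on: true, online: true } }],
  },
}));

export { app };
```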

Restrict a Google Assistant action from being used on iPhone

I've deployed a Google Assistant action using Node.js and Dialogflow, which tells the user their current location using the Google Maps API, but it doesn't seem to work on iPhone, as the iPhone doesn't provide the GPS location to the action. Can I make it work on iPhone, or restrict it from being used on iPhone? Here is the link to the action's directory page: Find my Location
There's no specific way to restrict access to a specific type of phone. Although you can check the surface capabilities, you cannot disambiguate between Android and iPhones.
Does the Google Assistant app on the iPhone have any location access? If you ask 'Where am I?' will it give the answer?
What are you getting after you make the location permission request? Do you get a permission denied response, or an undefined location?
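For reference, a minimal sketch of the location-permission flow being asked about, assuming a Dialogflow fulfillment webhook built with the actions-on-google Node.js client library; the intent names are placeholders, with the second one mapped to the actions_intent_PERMISSION event:

```typescript
// Minimal sketch, assuming a Dialogflow fulfillment webhook built with the
// actions-on-google Node.js client library. Intent names are placeholders.
import { dialogflow, Permission } from 'actions-on-google';

const app = dialogflow();

app.intent('Find My Location', (conv) => {
  // Ask for the device's precise location; on phones this requires user consent.
  conv.ask(new Permission({
    context: 'To find where you are',
    permissions: 'DEVICE_PRECISE_LOCATION',
  }));
});

// Dialogflow intent mapped to the actions_intent_PERMISSION event.
app.intent('Location Permission Result', (conv, params, granted: boolean) => {
  if (!granted || !conv.device.location?.coordinates) {
    // Distinguish "permission denied" from "permission granted but no fix".
    conv.close('I could not get your location on this device.');
    return;
  }
  const { latitude, longitude } = conv.device.location.coordinates;
  conv.close(`You appear to be near latitude ${latitude}, longitude ${longitude}.`);
});

export { app };
```

Checking whether the failure is a denied permission or an empty conv.device.location helps answer the diagnostic questions above.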

Can Google Assistant ask a specific question hourly?

I want to create an app that asks me "What are you doing?" every hour through a voice notification.
There are a few ways I can respond: "Working", "Playing", etc.
Then it can log the answer on my calendar. If this is possible, what are the correct terms used to describe this, so I can search for documentation?
I would like to build this with Actions on Google (https://developers.google.com/actions/) and Google Home.
You can try the push notifications feature to trigger the desired intent, where you will prompt the user for the type of activity he/she is doing.
The user can respond, and then your webhook can take care of logging the activity to your calendar using the Calendar API (assuming you are referring to Google Calendar).
Note that this feature is currently in developer preview and will not work in a production environment. Also, it is only intended for the Google Assistant on mobile devices; Google Home devices do not support any push notification features at the moment.
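A rough sketch of the two halves of that flow (the user's opt-in, then an hourly push), assuming the actions-on-google and google-auth-library Node.js packages; the intent names, the service-account path, and the scheduler that calls sendHourlyPrompt are placeholders:

```typescript
// Minimal sketch, assuming the actions-on-google Node.js client library for the
// opt-in step and google-auth-library plus the Actions API for the hourly send.
// Intent names, the service-account path, and the scheduler are placeholders.
import { dialogflow, UpdatePermission } from 'actions-on-google';
import { JWT } from 'google-auth-library';

const app = dialogflow();

// Step 1: ask the user to opt in to push notifications for a specific intent.
app.intent('Setup Hourly Check', (conv) => {
  conv.ask(new UpdatePermission({ intent: 'Activity Check' }));
});

// Dialogflow intent mapped to the actions_intent_PERMISSION event.
app.intent('Setup Hourly Check Result', (conv, params, granted: boolean) => {
  if (granted) {
    // Persist conv.arguments.get('UPDATES_USER_ID') so the job below can target this user.
    conv.close("Great, I'll check in with you every hour.");
  } else {
    conv.close('Okay, no hourly check-ins.');
  }
});

// Step 2: an hourly job (e.g. Cloud Scheduler) calls this to push the question.
async function sendHourlyPrompt(updatesUserId: string) {
  const client = new JWT({
    keyFile: 'service-account.json', // placeholder path to your service-account key
    scopes: ['https://www.googleapis.com/auth/actions.fulfillment.conversation'],
  });
  await client.request({
    url: 'https://actions.googleapis.com/v2/conversations:send',
    method: 'POST',
    data: {
      customPushMessage: {
        userNotification: { title: 'What are you doing?' },
        target: { userId: updatesUserId, intent: 'Activity Check', locale: 'en-US' },
      },
    },
  });
}

export { app, sendHourlyPrompt };
```

The triggered 'Activity Check' intent can then collect the "Working" / "Playing" reply and hand it to the Calendar API in the webhook.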