reference link: https://developers.google.com/assistant/sdk/guides/service/python/embed/setup
1. Following the reference link, I got light control working on a Raspberry Pi.
2. When I run pushtotalk.py, I can turn testLight on/off via the Raspberry Pi's microphone.
3. When I try to control testLight on/off from the Google Assistant app on my phone, it tells me "testLight does not support remote control".
4. When I power off the Raspberry Pi board and try to control testLight from the Google Assistant app, it still says "testLight does not support remote control".
So it seems testLight is always treated as offline, even though it is displayed in the Google Assistant app.
I need some help; I don't know what else I should configure or do to make it show as online.
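For context, the on/off handling in pushtotalk.py follows the device handler pattern from the guide; a rough sketch of what that looks like (the GPIO pin and device ID below are placeholders, not my exact values):

```python
import logging

import RPi.GPIO as GPIO  # drives the LED wired to the Pi

import device_helpers  # ships alongside pushtotalk.py in the SDK samples

# Placeholder ID; the real value comes from device registration.
device_id = 'my-device-id'

# Illustrative GPIO pin for the test light.
GPIO.setmode(GPIO.BCM)
GPIO.setup(25, GPIO.OUT, initial=GPIO.LOW)

device_handler = device_helpers.DeviceRequestHandler(device_id)


@device_handler.command('action.devices.commands.OnOff')
def onoff(on):
    # Invoked when the Assistant resolves "turn testLight on/off".
    if on:
        logging.info('Turning device on')
        GPIO.output(25, 1)
    else:
        logging.info('Turning device off')
        GPIO.output(25, 0)
```

Local voice control through the Pi's microphone works with this handler; only control from the phone's Assistant app fails.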
Related
I have installed a PWA on my Android device and I want to remotely debug it in Chrome on my desktop. When I connect to my device in Chrome DevTools, all I see are the tabs open in Chrome on my Android device. I do not see the instance of the PWA that has been installed ("added to homescreen"). Is there any way to debug the installed instance?
You have to open your developer tools using Ctrl+Shift+I
Then go to More Tools -> Remote Devices
In Remote Devices you can see your available online devices (an online device can be an emulator or an attached mobile device) and click on one; you can also Add Rule if you want to reach localhost from the mobile device.
Click the Inspect button available on the right side.
Now a virtual view of the device is created, and all logs coming from the device are displayed in the Console tab.
I was having the same problem. It turns out that PWAs that were open before you connected the remote debugger will not show up. Simply close the app and start it after connecting the debugger.
I built an action and it works well through the Home Simulator but I am not ready to make it available publicly yet. Is it possible for me to put it on my Google Home device and also share it with a few coworkers? If so, how? Thanks in advance!
If you are developing your agent in api.ai, you can go to the "Integrations" section and enable "Google Home". There you enable Google Home and go to its settings, where you can authorize api.ai and enable the preview. Now you should be able to start your agent by saying "start <your agent name from the dialog before>". However, this is restricted to your own Google account; you cannot share it in other ways yet.
Check my screenshot: I called my project "Playground", and after clicking on "Preview" I was able to start my agent in the web simulator by typing "start playground".
I'm trying to turn on kiosk mode for a non-managed Chromebox following the official instructions, but when I'm on the chrome://extensions page in developer mode there is no Add kiosk application option.
Did this get removed at some point, leaving kiosk mode available only to managed devices?
Edit
The kiosk mode app I was using was a simple one that I wrote and had loaded via the "unpacked extension" box on the extensions page.
Then I published it restricted to test accounts, added the account on the Chromebox as a tester, and loaded it from the Chrome Web Store that way. The effect is the same: I still don't see the kiosk application options.
Double Edit
I published it unlisted and installed it. The app installs and works. I have "kiosk_enabled" : true in my manifest and I still don't see any kiosk mode option.
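For reference, the manifest is about as minimal as a Chrome App manifest can be; roughly this (name and version are placeholders):

```json
{
  "name": "Test Kiosk App",
  "version": "1.0",
  "manifest_version": 2,
  "kiosk_enabled": true,
  "app": {
    "background": {
      "scripts": ["background.js"]
    }
  }
}
```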
Got the Manage Kiosk Applications button to show up by:
Removing other users. The first user on the machine is designated the owner. I was trying for kiosk mode from the second user.
Restarting the device
After the reboot the Manage Kiosk Applications button was enabled. The solution is hinted at in this bug from 2014: https://code.google.com/p/chromium/issues/detail?id=385943
So I'm reading this link and it says we can use Chrome for remote debugging of an app, which seems great, but they don't explain how to do it. When I click the link they provide, which has some Android documentation, I just see Java code. As a non-Java developer, I wonder how I can use the Chrome remote debugger with the Ionic framework.
To access remote debugging of a webview on your phone, plug your phone into your PC and open Chrome. Then type chrome://inspect into the URL bar. Open your app on your phone and it should show your device on the inspect dashboard.
In the Samsung Smart TV menu there is an option to "Start receiving Smart TV logs". It's "OFF" by default.
When I clicked it, I received a prompt to "Check the Console View". I opened the Console view and ran the app on the emulator, but I couldn't see any logs there.
I know that when the emulator is launched, a separate window showing all the alert(".."); logs is also launched.
I want to know how to use of this option of viewing logs via Console View. I'm new to Eclipse and Smart TV SDK. Is there anything that I'm missing?
How is this different from the logs that are already being shown with the emulator?
The console log is used for debugging on real devices.
The emulator already has its own debugger console window, so the emulator does not send anything to Eclipse.
If you want to work with real devices, this feature is very useful. Do an app sync on the TV from your workstation and enable the log receiver. When your synced apps run on the real device (TV/BDP), the alerts from the application will be sent to Eclipse's console window.
The app will send its logs back to an active Eclipse console on the system from which it downloaded the app.
I'm working with Eclipse on Windows, so I gave my PC a static IP address and installed the Apache 2.2 web server. After uploading my app, I enable the console and open the Console view as you did. Then I start my app, and I see all the log information in the console.
I find this log information essential, because some services return an error in the emulator but actually execute successfully on the TV. Many of these services interact directly with the TV hardware, and there is no other way to debug them.