Why can't I get a MediaStream from back camera when front camera is in use on Android Chrome? - android-webview

What I'm trying to do here is to get a MediaStream for the environment-facing (back) camera while another MediaStream for the user-facing (front) camera is active.
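Roughly, the flow looks like this (a minimal sketch; the exact constraints I use are along these lines):

// inside an async function; the front camera is opened first
const frontStream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: "user" }
});

// while frontStream is still live, request the back camera
const backStream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: { exact: "environment" } }
});
// on the Android browsers below, this second call rejects;
// on desktop Chrome it resolves with a second stream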
This happens on Android Chrome 94 and Android MS Edge 93, so I suspect it's a WebView-related issue, though I'm not sure. I tested this on a Xiaomi phone and on two different Samsung tablets.
If I stop the track from the front stream before creating the new stream, I don't have any issue at all.
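That is, a sketch of the sequence that does work:

// stopping the front camera's tracks first lets the back camera open
frontStream.getTracks().forEach(track => track.stop());

const backStream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: { exact: "environment" } }
});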
It's really strange, because the same piece of code works perfectly in Chrome on PC (Windows); I haven't tried macOS or iOS at the moment.
The thing is that stopping the first track before creating the second stream gives the user a pretty bad experience (a blank screen for a while), which I'd like to avoid.
Also, there could be some use cases (e.g. picture-in-picture) in which using both cameras at once is necessary... I can't see why this shouldn't work.
Do you guys have any suggestion?
Thanks

Related

How to control camera retake in Ziggeo for Ionic 3 framework

How do I control camera retakes in Ziggeo for Ionic 3? Ziggeo takes the user to the device's camera app, and depending on the device, the user can do any number of retakes. Is it possible to stop the camera retakes, or to return the user to the Ionic app as soon as they finish recording (press the stop recording button)?
I tried to find this in the Ziggeo documentation but didn't succeed.
Let me first mention that I work at Ziggeo. Now, with that being said, let's get cracking :)
When the camera is requested on desktop systems, the browser talks to the OS and the OS talks to the drivers. The drivers talk to the camera and provide the video data. On mobile devices this is slightly different.
The mobile browser will ask the system, which will reply by activating the camera app. The camera app differs between system versions and between systems themselves, but in general it refuses to honor any parameters sent to it from the browser.
This is why you might see the option to retake the recording on mobile devices.
Ziggeo's purpose, however, is to provide ways to use the camera and mic in many scenarios. As such, there is a way to skip the native app and use a different way of recording videos.
This is accomplished by adding the webrtc_on_mobile parameter when you are creating your app.
var ziggeoApp = new ZiggeoApi.V2.Application({
    token: "APPLICATION_TOKEN",
    webrtc_streaming_if_necessary: true, // only fall back to WebRTC streaming when required (e.g. on iOS)
    webrtc_on_mobile: true               // record in the browser via WebRTC instead of the native camera app
});
Now, the above is just the HTML version of it. Ionic is a bit different: currently this is not possible there, however it will be possible in the next update.
Edit 2020:
To support iOS, webrtc_streaming_if_necessary: true was created. This is because the WebRTC implementation on those systems is built for streaming, not standard WebRTC. By using it, you make sure that you are not using WebRTC streaming unless it is actually necessary to do so.
I've added the way you would use it to the code above.
You can always check and find the latest on the header building page on Ziggeo here: https://ziggeo.com/docs/sdks/javascript/browser-integration/header

Actions on Google app not working on phone surface simulator

I created a basic AoG app using Dialogflow and it seems to work as I expected in the simulator when using the Speaker surface. Yet when I switch the surface to Phone, it seems to fail and return "Sorry, this action is not available in simulation" every time. Using it on my phone just returns some search results.
First try is with the Speaker surface and the second with Phone
I'll try on my Google Home when I get back from work.
I had a similar issue when testing on my phone. I solved it by disabling and then re-enabling testing on the device. In the AoG console, click the icon of a smartphone in front of a computer (beside the speaker icon) in the top right corner. It's weird, but when I did this it worked on my phone; maybe you will get the same results.

AirConsole phone controller screen not loading

I've set up a project in Unity using AirConsole. I was able to put together a simple game pretty quickly, but I've since run into issues. I was initially developing on a Mac and then switched to my work computer which is a Dell PC. On my Mac I was able to test my game using the simulator in Unity. Since I switched to my PC I have been unable to connect from my phone to the webserver in the debug simulator. The phone gets through the "Enter the Code" screen, the developer console makes a connection noise, but then my phone just sits at the "Loading..." screen.
I've reverted back to the Pong sample project to confirm that my code is not the problem. I've tested the phone using both Chrome and the AirConsole app. I've tried temporarily disabling my firewall in case my firewall was blocking the ports. If I run the simulator in Browser Start Mode = "Virtual Controllers" the controllers on my PC load fine, but I'm still not able to connect with my phone. I'm using AirConsole v1.3.0.b which was still the most recent version when I posted this.
Steps to resolve this issue:
Start the game using the simulator.
Make sure the virtual controllers of the simulator load your controller.
If that is the case, try to load the controller.html directly in the phone's browser. Assuming you run your game in the simulator using http://www.airconsole.com/simulator/#http://192.168.0.2:7834/ then take the part behind the # and append controller.html; in this example you would load http://192.168.0.2:7834/controller.html on your smartphone. Note that this controller won't be functional; it's just to test whether the page that AirConsole loads internally works. If this does not work, you are either not on the same network, or your network has client isolation activated (this needs to be deactivated). To resolve this, check: Using Airconsole on a standard University/Corporate network for dev
If controller.html does load, you probably have JavaScript errors before the AirConsole constructor is called. Use Chrome to remotely debug the phone's browser: https://developers.google.com/web/tools/chrome-devtools/debug/remote-debugging/remote-debugging?hl=en
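As a sanity check, a bare-bones controller page that does nothing except call the constructor could look like the sketch below (the API script path/version is an assumption; take the real one from the AirConsole docs). If even this page leaves the phone stuck, the problem is page delivery or an early script error, not your controller logic.

<!DOCTYPE html>
<html>
  <head>
    <!-- script path/version is illustrative; use the one from the AirConsole docs -->
    <script src="https://www.airconsole.com/api/airconsole-1.3.0.js"></script>
  </head>
  <body>
    <script>
      // constructing AirConsole performs the handshake that should
      // clear the "Loading..." screen on the phone
      var airconsole = new AirConsole();
    </script>
  </body>
</html>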
Update: here is a detailed guide for Unity
https://developers.airconsole.com/#!/guides/unity-ngrok

Unity camera not moving

I'm trying to get started with Google Tango for Unity by following this tutorial: https://developers.google.com/project-tango/apis/unity/unity-prefab-motion-tracking
But when I build and run my project, the gyroscope doesn't seem to work and the camera doesn't respond.
I'm using Unity 4.6.7.
Any suggestions?
All of the samples have worked for me without issue in Unity 5.1.3f1 Personal.
You could try attaching adb logcat and watching for errors during execution:
Find adb.exe in the Android SDK directory (under platform-tools).
From a command prompt, run adb logcat -c to clear the log buffer, then run adb logcat again without -c and watch for errors.
In Unity, the gyroscope on some Android devices is disabled by default (assuming the device has one). So you might need to enable it when the app starts up to make sure it is being used, via Input.gyro.enabled = true;
It might also be that the app is for whatever reason not asking for permissions, which then results in the device automatically denying the app access to the gyroscope and camera.
There might also be some other logic that interferes with or alters the gyro data in some way. It's very unlikely, but possible, so it might be worth looking into if all other suggestions fail.
Hope you find a solution soon.

Record iPhone screen and user's actions

In order to do usability testing, I'd like to record an iPhone's display along with every user action. I can't modify the application itself; however, jailbreaking the phone wouldn't be a problem.
Ideally I'd like to get a full resolution video of the screen display with an overlay showing touch events on top of it.
For now, the best solution I've found is using a video-out cable and recording its output, but with this solution I'd need an external camera to capture what the user was doing, and it wouldn't be very precise.
Other ideas?
The app Display Recorder, found in the BigBoss repo (Cydia), works very well for this.
I have tried MirrorOp (requires jailbreak) and AirSquirrels' Reflector (no JB required) for usability testing. Both work very well, but neither captures touch feedback. You can use a second camera or a "hug the notebook" approach.