audio volume from external device - yocto

I am working on a device with an i.MX6 running Yocto Linux (Krogoth). I use USB headsets and they work fine. But now I have a headset where the volume can be changed with two buttons on the headset itself. On my Windows notebook the volume control in the software changes when I push one of the buttons. I also tested it with Ubuntu 20 and I see the control move in the ALSA mixer when a button is pushed.
On my device I see no reaction. The sound volume changes in the headset when a button is pushed, but in the ALSA mixer of the device nothing happens. That is a big problem for my use case. Does anybody know this problem and can help me? Do I need a special module or anything else?
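Note: the volume buttons on such headsets are typically reported as HID consumer-control key events (KEY_VOLUMEUP / KEY_VOLUMEDOWN) rather than as an ALSA mixer change, so one way to narrow this down is to check whether those key events reach the device at all and, if they do, translate them into mixer changes in userspace. The following is a minimal sketch under assumptions, not a confirmed fix: it assumes python3-evdev and alsa-utils (amixer) are available on the image, that the headset buttons appear as an evdev input device, and that the mixer control to drive is called "Master" on card 0 (both names are placeholders to adjust for the actual hardware).

```python
#!/usr/bin/env python3
# Sketch only: forward USB headset volume keys to an ALSA mixer control.
# Assumptions: python3-evdev and alsa-utils (amixer) are installed, the
# headset buttons show up as an evdev input device, and the mixer control
# is "Master" on card 0 - both are placeholders, adjust as needed.
import subprocess

from evdev import InputDevice, ecodes, list_devices


def find_volume_device():
    # Pick the first input device that reports both volume keys.
    for path in list_devices():
        dev = InputDevice(path)
        keys = dev.capabilities().get(ecodes.EV_KEY, [])
        if ecodes.KEY_VOLUMEUP in keys and ecodes.KEY_VOLUMEDOWN in keys:
            return dev
    raise SystemExit("no input device with volume keys found - "
                     "check whether usbhid/evdev picked up the headset")


def main():
    dev = find_volume_device()
    print("listening on", dev.name)
    for event in dev.read_loop():
        # Only react to key-press events (value 1); ignore release/repeat.
        if event.type != ecodes.EV_KEY or event.value != 1:
            continue
        if event.code == ecodes.KEY_VOLUMEUP:
            subprocess.run(["amixer", "-c", "0", "set", "Master", "5%+"])
        elif event.code == ecodes.KEY_VOLUMEDOWN:
            subprocess.run(["amixer", "-c", "0", "set", "Master", "5%-"])


if __name__ == "__main__":
    main()
```

If no input device with volume keys shows up at all, that points at missing kernel support (HID/evdev) on the image rather than at ALSA itself.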

Related

Vuforia Ground Plane Detection doesn't work when using a webcam (IP webcam using Iriun Webcam on an Android phone)

I thought of using Vuforia as it allows testing with a webcam. So I downloaded Iriun Webcam (an IP webcam app for Android) and successfully got the video stream inside the Unity Editor when I press "Play".
I created a minimal AR example, where my app would detect a plane and a tap would place an object.
My issue is that when I build the APK and test it on my phone it works perfectly, but when I run it using the Play button inside the editor, the video stream is captured but it won't do any AR stuff (like the plane detection it was supposed to do).
Please help me with any possible reason there could be, as I couldn't find any such issue reported by anyone else.
As mentioned here https://library.vuforia.com/unity-extension/vuforia-play-mode-unity, Ground Plane is not supported when using a webcam in Unity Play Mode. Ground Plane relies on a robust device tracker, and this is not available when using a webcam. However, to help development, it is possible to record a sequence on a device and then use it in Play Mode: https://library.vuforia.com/platform-support/recording-and-playback. The other option is to emulate the Ground Plane behavior using an Image Target, as discussed here: https://library.vuforia.com/ground-plane/introduction-ground-plane-unity.

How to use Joystick Emulator in Unity

I'm developing a WebGL game that needs to be controlled by a joystick.
But I don't have a joystick device.
I have no idea which joystick emulator is best for Unity or how to use it.
You could maybe connect a PlayStation/Xbox controller via a USB cable to your computer. Then you could use this website to check whether your controller works on your computer: https://gamepad-tester.com
The different parts of the controller shown on that page should react to what you do with the controller in real life.

How do I test the Air Tap gesture in Hololens Emulator?

I am building an Augmented Reality app for HoloLens. I need to test the Air Tap gesture without deploying to the HoloLens.
Is there any method to test the Air Tap functionality through the HoloLens Emulator or the Holographic Emulation provided by Unity3D?
This will depend on your emulation environment:
HoloLens Emulator (from Microsoft, allows multiple input devices)
Air tap gesture - right-click the mouse, press the Enter key on your keyboard, or use the A button on an Xbox controller.
Windows Holographic Emulation (from Unity, requires a game controller)
Perform a tap gesture with a virtual hand - left and right trigger buttons, or the A button.
Adding to the other answer: if you don't have a HoloLens this will not help, but if you do, it will speed up development significantly.
On your HoloLens, download the Holographic Remoting app. Once it is downloaded, open and run it and you will see the IP address of your HoloLens.
Now go to the Unity editor -> Window tab -> Holographic Emulation. This will open a new window. For Emulation Mode select Remote Device. Under Remote Machine enter the IP address. You can tinker with the other settings as you see fit.
Now, with the application running on your HoloLens, click Connect. Once you are connected the Unity editor will tell you, and the HoloLens should go blank. Now when you press Play in the editor, the app will run on the HoloLens.
This does not download the app to your HoloLens, and it stops running when you stop the editor.
If you are working in the Unity editor:
Shift + LMB (left mouse button) simulates an air tap with the left hand;
Space + LMB simulates an air tap with the right hand.
Other useful points:
Gaze can be controlled with the mouse by keeping RMB held down, and with the Q/W/E/A/S/D keys. (I know the question was only about the air tap, but they kind of come together.)
If you need to test typing some data with the standard keyboard, there is no need to simulate an air tap - just click with the mouse.

Detecting when device in cardboard headset in Unity

I'm building a Unity Cardboard app, and would like to detect when the device is in the headset. The NFC in theory has this data, but it does not seem to be exposed in the API. I would like to have the app automatically enter VR mode when in the headset, without the user needing to toggle in and out of a VR mode.
Basically, I want Cardboard.vrModeEnabled to be automatically updated when you enter or exit a headset.
Is this possible? Thanks!
It used to be in the (non-Unity) SDK but was deprecated, for several reasons. For one, the NFC sensors on phones are placed in different places, so the detection was not uniformly reliable. For another, using the sensor this way drains the battery quickly.
There are a lot of Cardboard models on the market, and a lot of them don't come with an NFC tag, so I wouldn't count on it.
The best approach for me is to start in VR mode; when the user touches the screen, disable VR mode, and then go back to VR mode 10 seconds after the last touch.

windows-ce usb touch driver does not get enumerated

I added a USB touch screen driver to my custom Windows CE 6.0 OS image. The touch screen works only when another USB device (USB flash drive, USB mouse, etc.) is present in the system, and it does not work when the USB touch screen is the only device connected. After troubleshooting, I discovered that the USB touch device does not get enumerated. Any help resolving this issue is highly appreciated.