I need to connect an Android device to a HoloLens for 'spectator' viewing, as in the sample 'Build 2019 Demo' code. However, when I launch the Android and HoloLens builds and enter the HoloLens IP address on the Android side, all I see is an 'ArUco' code. No video and no 3D content are visible on the Android device.
I should add that I have painstakingly compared the sample project to my own and cannot find any relevant differences.
Naturally, I want to get past this screen code, but I am also confused about where this ArUco code fits in the application flow, as it isn't part of any scene (or flow) that I am aware of.
The ArUco code appears in order to localize the two devices. If everything is compiled correctly, the HoloLens will start using its camera to detect the ArUco code. Once it's detected, the ArUco code will be dismissed and content will be positioned consistently across the two devices. It may be that you are missing the OpenCV native plugins required for ArUco marker detection. Instructions on how to build those plugins can be found here. You specifically need an x86 version of SpectatorView.OpenCV.dll for ArUco detection to work on a HoloLens 1 device.
The Build2019 sample uses Azure Spatial Anchors rather than ArUco markers for localization. If you want to use Azure Spatial Anchors, go to Spectator View -> Edit Settings and add a SpatialAnchorsCoordinateLocalizationInitializer to the prefab. You also need to declare, in the SpatialLocalizationInitializationSettings, a Prioritized Initializer that references this SpatialAnchorsCoordinateLocalizationInitializer. This causes the SpatialAnchorsCoordinateLocalizationInitializer to be used instead of the default ArUco localization initializer.
If you have already configured all of that, it may be that the SpatialAnchorsLocalizer isn't registering as available on your Android or HoloLens device. You need to add the SPATIALALIGNMENT_ASA preprocessor directive to both your Android and WSA Player Settings so that the SpatialAnchorsLocalizer declares itself as supported.
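If you prefer to script that last step, an editor utility along these lines should work (a hedged sketch: the class name and menu path are made up for illustration, and it assumes a Unity version where the BuildTargetGroup-based PlayerSettings API is current):

    using UnityEditor;

    // Editor-only helper: adds the SPATIALALIGNMENT_ASA scripting define to the
    // Android and UWP (WSA) build target groups so the SpatialAnchorsLocalizer
    // can report itself as supported. The same can be done by hand under
    // Project Settings > Player > Other Settings > Scripting Define Symbols.
    public static class AddAsaDefine
    {
        private const string Define = "SPATIALALIGNMENT_ASA";

        [MenuItem("Tools/Add SPATIALALIGNMENT_ASA Define")] // illustrative menu path
        public static void Add()
        {
            AddTo(BuildTargetGroup.Android);
            AddTo(BuildTargetGroup.WSA);
        }

        private static void AddTo(BuildTargetGroup group)
        {
            string defines = PlayerSettings.GetScriptingDefineSymbolsForGroup(group);
            if (!defines.Contains(Define))
            {
                defines = string.IsNullOrEmpty(defines) ? Define : defines + ";" + Define;
                PlayerSettings.SetScriptingDefineSymbolsForGroup(group, defines);
            }
        }
    }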
I followed this tutorial https://learn.microsoft.com/en-us/training/modules/learn-mrtk-tutorials/1-1-introduction step by step. But when I run it on the HoloLens, I don't see anything.
The build configuration is as in the tutorial, the HoloLens 2 is in Developer mode and has been paired before. The tools are as listed in https://learn.microsoft.com/de-de/windows/mixed-reality/develop/install-the-tools?tabs=unity
I am deploying the app over Wi-Fi.
When the app starts on the HoloLens, the spatial mesh appears for a moment and I am asked whether the app is allowed to use my microphone.
The "Build with Unity" splash screen (or similar) does not appear, and no objects do either. I see "nothing".
What could be the problem?
I use:
Unity 2021.1.20f
MRTK v1.0.2206 Preview (from the Microsoft Download Center)
It turned out I was using the wrong Unity version.
I am now using Unity 2020.3.4 and I can see and move the objects.
I am currently working on an application for the HoloLens 1 using Unity and MRTK.
I have been unable to get the air tap or any other input working in my application. The ring pointer for air tap does not appear in the application, even though it works in the Unity Play Mode input simulation and in other applications on the device. I tried the MRTK examples, and even those did not work on the device.
I also had to add the Tracked Pose Driver from the Player Settings to get the camera working properly, but I have not figured out how to get the application to accept gesture input.
I also tried the solution listed here: Why is 'air tap' gesture not working on HoloLens1 in my Unity/MRTK app?
but that did not work.
I will appreciate any guidance on solving this problem.
I finally got my project working after a lot of trying.
It turned out that I had unknowingly configured the project to use the XR SDK pipeline in Unity 2019, and thus had to configure MRTK accordingly. Thanks to Kevleigh for helping me with the issue here: https://github.com/microsoft/MixedRealityToolkit-Unity/issues/7850
While this did work for my example project, I couldn't get the main project to work with the same settings even after installing all the plugins.
Finally, I switched to Unity 2018 and got the project working with MRTK 2.3 and the default configuration. So while Unity 2019 did not work for me, you can get it working with the XR SDK as suggested in the link above.
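Independent of the pipeline configuration, a quick way to check whether tap events reach the app at all is to register a global pointer handler, which receives pointer events even without focus or colliders. This is a minimal sketch against the MRTK 2.x input API; the class name is just illustrative:

    using Microsoft.MixedReality.Toolkit;
    using Microsoft.MixedReality.Toolkit.Input;
    using UnityEngine;

    // Logs every pointer event it receives, regardless of which object has focus.
    public class TapProbe : MonoBehaviour, IMixedRealityPointerHandler
    {
        private void OnEnable() =>
            CoreServices.InputSystem?.RegisterHandler<IMixedRealityPointerHandler>(this);

        private void OnDisable() =>
            CoreServices.InputSystem?.UnregisterHandler<IMixedRealityPointerHandler>(this);

        public void OnPointerDown(MixedRealityPointerEventData eventData) =>
            Debug.Log("Pointer down: " + eventData.Pointer.PointerName);

        public void OnPointerUp(MixedRealityPointerEventData eventData) =>
            Debug.Log("Pointer up: " + eventData.Pointer.PointerName);

        public void OnPointerClicked(MixedRealityPointerEventData eventData) =>
            Debug.Log("Air tap / click: " + eventData.Pointer.PointerName);

        public void OnPointerDragged(MixedRealityPointerEventData eventData) { }
    }

If the log shows clicks in Play Mode but not on the device, the problem is in the build/pipeline configuration rather than in the scene.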
I have a Xiaomi Redmi Note 8 Pro, and it doesn't seem to support ARCore. I found some ways to work around this, but they appear to be quite complicated and not very safe.
My question is:
What other tools would you recommend if I want to create an app in Unity that also needs to use the GPS module, maybe an altimeter, and of course the camera (AR stuff)?
I have heard that Vuforia might do the trick, and I have also read something about AR Foundation from Unity. But it looks to me as if, depending on the chosen deployment target, they both use ARCore or ARKit underneath (even Vuforia).
Can anyone clear this up for me?
I suggest you don't mess with your device; it doesn't support ARCore for a good reason. You could try the Android Studio emulator, but alas, when I tried it, the APK generated from Unity couldn't be installed on the emulator due to some architecture mismatch.
If you want to use Unity anyway, I suggest Vuforia. It works on most modern devices and doesn't even need a device for testing: just hit Unity's Play Mode and you can test from your PC (you need a webcam).
Vuforia Engine provides a simulator mode in the Game view that you can activate by pressing the Play button. You can use this feature to evaluate and rapidly prototype your scene(s) without having to deploy to a device. (Source: https://library.vuforia.com/articles/Training/getting-started-with-vuforia-in-unity.html)
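For example, a standard Image Target reacts to detection through a trackable event handler, and the same script works in Play Mode with a webcam. This is a minimal sketch against the Vuforia 8.x Unity API (newer releases have renamed parts of this):

    using UnityEngine;
    using Vuforia;

    // Attach to an ImageTarget; logs when the target is found or lost.
    public class TargetLogger : MonoBehaviour, ITrackableEventHandler
    {
        private TrackableBehaviour trackable;

        private void Start()
        {
            trackable = GetComponent<TrackableBehaviour>();
            if (trackable != null)
                trackable.RegisterTrackableEventHandler(this);
        }

        public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                            TrackableBehaviour.Status newStatus)
        {
            bool found = newStatus == TrackableBehaviour.Status.DETECTED ||
                         newStatus == TrackableBehaviour.Status.TRACKED ||
                         newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;
            Debug.Log(trackable.TrackableName + (found ? " found" : " lost"));
        }
    }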
For Unity with AR Foundation, you can't test on your PC the way you can with Vuforia; you need an ARCore/ARKit-capable device.
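If you do go the AR Foundation route, you can at least detect at runtime whether the device supports ARCore/ARKit instead of failing silently. A minimal sketch using the documented ARSession availability check:

    using System.Collections;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Checks AR support on startup; ARSessionState.Unsupported means the
    // device cannot run ARCore/ARKit at all.
    public class ArSupportCheck : MonoBehaviour
    {
        private IEnumerator Start()
        {
            if (ARSession.state == ARSessionState.None ||
                ARSession.state == ARSessionState.CheckingAvailability)
            {
                yield return ARSession.CheckAvailability();
            }
            Debug.Log("AR session state: " + ARSession.state);
        }
    }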
Lastly, if you want AR with GPS modules (although this is not with Unity), check out AR.js: https://github.com/AR-js-org/AR.js
I'm developing an app for the HoloLens 1 in Unity, and it runs perfectly fine on the device when using Holographic Remoting. However, whenever I build and deploy the application through Visual Studio, it only launches in 2D mode on the HoloLens (as a flat "window" you can position in space). What settings control this behaviour?
Unity version is 2019.1.4f1,
Visual Studio is 2017 Community Edition,
I'm on Windows 10.
Developer mode is turned on on both the HoloLens and my desktop. "Virtual Reality Supported" is ticked in Unity, the Windows Mixed Reality SDK is added to the list, and the Build Settings are set to x86 / D3D Project.
I tried replacing my scene with one of the examples from the MRTK, but to no avail. Strangely enough, if I make a clean new project containing nothing but the MRTK example, it does deploy properly, so something in my project must be interfering. I just can't figure out what.
Expected behaviour is that the application launches in "room scale" mode, i.e. all other applications disappear and the objects in my scene can be viewed in 3D.
EDIT: This has been marked as a possible duplicate. The answers given there do not solve my problem, however. I already made sure that "Virtual Reality Supported" is ticked in the XR settings and the SDK is added to the list. I don't think I have a Windows Insider preview, but since I was able to deploy perfectly fine with a fresh project I don't think that's really the problem...
It appears Vuforia was causing the issues. I got it to deploy in 3D with 'Vuforia Augmented Reality Supported' ticked and the following settings in VuforiaConfiguration:
Camera Device Mode: MODE_OPTIMIZED_SPEED
Device Type: Digital Eyewear
Device Config: Hololens
Video Background DISABLED
Device Tracker DISABLED
Furthermore, 'Vuforia' must not be added to the list of Virtual Reality SDKs in XR Settings.
Note that I have not tried all subsets of these settings individually; some of them might have no impact whatsoever (except for the last one: I am quite certain that adding that SDK will force the app into 2D mode).
Also note that I haven't verified that Vuforia actually works correctly on the HoloLens, just that I can deploy the app in 3D mode with it enabled, given the above settings. Can someone confirm whether Vuforia is even supported by MRTK v2?
EDIT: Apparently the problem is also caused by ticking "WSA Holographic Remoting Supported" in the XR Settings, so be sure to disable that.
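As a quick sanity check for which mode the app actually launched in, you can log the loaded XR device at startup. This is a minimal sketch for the legacy XR pipeline in Unity 2019; my understanding is that an immersive launch on HoloLens reports a non-empty device name (e.g. "WindowsMR"), while an empty name means the app came up as a flat 2D window:

    using UnityEngine;
    using UnityEngine.XR;

    // Logs whether a VR device was initialized when the app started.
    public class XrModeLogger : MonoBehaviour
    {
        private void Start()
        {
            Debug.Log("XR enabled: " + XRSettings.enabled +
                      ", loaded device: '" + XRSettings.loadedDeviceName + "'");
        }
    }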
I'm experimenting with Vuforia Fusion to track an object and then use its new extended tracking system (which should use ARCore/ARKit if available) to keep tracking the detected object in space.
The problem is that no matter which settings I choose, and even if I place the ARCore .aar plugin inside the Plugins > Android folder in the Unity editor (as described in the Vuforia guidelines for integrating ARCore), the tracking is always very inaccurate compared with a plain ARCore implementation without Vuforia.
I'm wondering whether Vuforia Fusion really uses ARCore because, as I said before, even if I don't put the ARCore .aar plugin inside the Plugins > Android folder, Vuforia Fusion still reports that I'm using PLATFORM_SENSOR_FUSION (i.e. ARCore).
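For reference, the active Fusion provider can be queried once Vuforia has started; this is a minimal sketch against the Vuforia 8.x Unity API:

    using UnityEngine;
    using Vuforia;

    // Logs which Fusion provider Vuforia selected on this device.
    // PLATFORM_SENSOR_FUSION indicates the platform tracker (ARCore/ARKit).
    public class FusionProviderLogger : MonoBehaviour
    {
        private void Start() =>
            VuforiaARController.Instance.RegisterVuforiaStartedCallback(OnVuforiaStarted);

        private void OnVuforiaStarted() =>
            Debug.Log("Active Fusion provider: " +
                      VuforiaRuntimeUtilities.GetActiveFusionProvider());
    }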
Has anyone experienced the same behaviour? Or did you get different results?
Thanks in advance and have a nice day.