MRTK Not Recognizing Hand Tracking Input - unity3d

I am currently using Unity with Microsoft's Mixed Reality Toolkit to create a simple game that places a square that can be moved using hand gestures (air-tapped and dragged). So far I have attached the ManipulationHandler and NearInteractionGrabbable components to the square game object, as in the tutorial here.
When running the project in the Unity player, the square can be manipulated with the GGVPointer. However, when I deploy to the HoloLens, no gesture input is recognized.
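For reference, the equivalent setup in code looks roughly like this (a sketch assuming the MRTK 2.x component namespaces; the script name is mine):

using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Sketch of the same setup done in code (MRTK 2.x).
public class GrabbableCubeSetup : MonoBehaviour
{
    void Start()
    {
        // CreatePrimitive already adds a BoxCollider, which both components need.
        var cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.AddComponent<NearInteractionGrabbable>(); // enables articulated-hand grabs
        cube.AddComponent<ManipulationHandler>();      // enables move/rotate via gestures
    }
}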
Here is my object hierarchy and the contents of the cube in the Inspector:
Current development environment:
MRTK 2.2 & 2.3
Unity 2019.3.0f6 & 2019.3.9f1
Deployment targets:
HoloLens 1
HoloLens 1 and 2 emulators
Regarding the solution posted here: after attaching the script from that answer and debugging it in the HoloLens emulator, I receive the error:
Exception thrown at 0x00007FF80850A839 (KernelBase.dll) in Hololens Test Project.exe: 0x40080202: WinRT transform error (parameters: 0x000000008000000B, 0x0000000080070490, 0x0000000000000014, 0x000000506E7FDA50).

I had the same problem. After you change something on the MixedRealityToolkit by copying/cloning the profiles, some options like "Input System" get disabled, so you just have to re-enable them in the cloned profile.
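If you want to verify this at runtime, here is a minimal diagnostic sketch (assuming MRTK 2.x; the script name is mine) that logs whether the active profile still has the input system enabled:

using Microsoft.MixedReality.Toolkit;
using UnityEngine;

// Diagnostic sketch for MRTK 2.x: logs whether the cloned profile still
// has the input system enabled and whether the service actually resolved.
public class InputSystemCheck : MonoBehaviour
{
    void Start()
    {
        var profile = MixedRealityToolkit.Instance.ActiveProfile;
        Debug.Log("Input System enabled in profile: " + profile.IsInputSystemEnabled);
        Debug.Log("Input system service resolved: " + (CoreServices.InputSystem != null));
    }
}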

Related

How to provide Intrinsics Parameters to Vuforia with a Custom Driver to remove distortion?

I'm working on a project where I track an object using a Model Target and display its 3D model in a virtual environment for interaction. The user wears a VR headset and views the environment with the 3D model. The camera is placed in front of the user, so I aligned the camera and headset reference frames to align the real object with its 3D model.
However, the alignment is affected by the camera lens distortion. When the object is in the center of the image (principal point), the alignment is correct, but as it moves toward the edges, the distortion causes a misalignment.
To resolve this, I gathered the camera's intrinsic parameters and tried to build a Custom Driver (as suggested here: https://library.vuforia.com/platform-support/external-camera-calibration).
To build the Driver, I edited the RefDriverImplData.h and RefDriverImpl.cpp files in the Driver Template sample (10-13-3), adding my camera's intrinsic parameters. I also edited the Driver.h file in the UWP SDK (10-13-3) to add the camera parameters and built the Driver Template sample, as described for the File Driver sample (https://library.vuforia.com/platform-support/building-and-using-file-driver-sample). Then, since I'm using Unity (2021.3.15f1), I added the compiled binaries to Assets/Plugins/WindowsStoreApps/x64 and a script for delayed initialization with the custom driver.
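For context, the delayed-initialization script looks roughly like this (a sketch assuming the Vuforia 10.x Unity API, with "Delayed Initialization" checked in VuforiaConfiguration; the driver DLL name is a placeholder):

using System;
using UnityEngine;
using Vuforia;

// Sketch: initialize Vuforia manually with a custom driver.
// "DriverTemplate.dll" is a placeholder; it must match the binary
// placed under Assets/Plugins/WindowsStoreApps/x64.
public class CustomDriverInitializer : MonoBehaviour
{
    void Start()
    {
        VuforiaApplication.Instance.Initialize("DriverTemplate.dll", IntPtr.Zero);
    }
}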
When I run the project in Play Mode, I get two error messages (attached to this post) and one on screen. I also tried building the project, but it crashes without any error message.
Is it enough to provide the intrinsic parameters in the Driver Template sample files and Driver.h to make it work with the camera? And is this the correct way to provide intrinsic parameters to Vuforia to get undistorted images before tracking?
Error Log:
1.
Failed to create Vuforia engine: Vuforia driver library load error
UnityEngine.Debug:LogError (object)
Vuforia.Internal.Utility.UnityLogger:LogError (string)
Vuforia.Internal.Utility.Log:Error (string)
Vuforia.Internal.Core.Engine:InitOnCameraReady ()
Vuforia.WebCam:HandleFirstWebCamFrame ()
Vuforia.WebCam:<Init>b__33_0 (bool)
Vuforia.Internal.Utility.VuforiaCoroutineUtility/<RunCoroutineWithTimeout>d__1:MoveNext ()
UnityEngine.SetupCoroutine:InvokeMoveNext (System.Collections.IEnumerator,intptr)
2.
Vuforia Engine initialization failed: INIT_VUFORIA_DRIVER_FAILED
INIT VUFORIA DRIVER FAILED
Failed to initialize Vuforia Engine.
UnityEngine.Debug:LogError (object)
DefaultInitializationErrorHandler:SetErrorCode (Vuforia.VuforiaInitError) (at Library/PackageCache/com.ptc.vuforia.engine#ad7ad60b4246/Vuforia/Scripts/DefaultInitializationErrorHandler.cs:157)
DefaultInitializationErrorHandler:OnVuforiaInitializationError (Vuforia.VuforiaInitError) (at Library/PackageCache/com.ptc.vuforia.engine#ad7ad60b4246/Vuforia/Scripts/DefaultInitializationErrorHandler.cs:24)
System.Delegate:DynamicInvoke (object[])
Vuforia.Utility.ExtensionMethods.DelegateHelper:InvokeDelegate (System.Delegate,object[])
Vuforia.Utility.ExtensionMethods.DelegateHelper:InvokeWithExceptionHandling<Vuforia.VuforiaInitError> (System.Action`1<Vuforia.VuforiaInitError>,Vuforia.VuforiaInitError)
Vuforia.VuforiaApplication:VuforiaInitialized (Vuforia.VuforiaInitError)
System.Delegate:DynamicInvoke (object[])
Vuforia.Utility.ExtensionMethods.DelegateHelper:InvokeDelegate (System.Delegate,object[])
Vuforia.Utility.ExtensionMethods.DelegateHelper:InvokeWithExceptionHandling<Vuforia.VuforiaInitError> (System.Action`1<Vuforia.VuforiaInitError>,Vuforia.VuforiaInitError)
Vuforia.Internal.Core.Engine:InitOnCameraReady ()
Vuforia.WebCam:HandleFirstWebCamFrame ()
Vuforia.WebCam:<Init>b__33_0 (bool)
Vuforia.Internal.Utility.VuforiaCoroutineUtility/<RunCoroutineWithTimeout>d__1:MoveNext ()
UnityEngine.SetupCoroutine:InvokeMoveNext (System.Collections.IEnumerator,intptr)
Note that, per the description here (https://library.vuforia.com/platform-support/creating-custom-driver#cameracallback-class), your Driver implementation has to provide both the image buffers and the camera intrinsics. When a Driver is enabled, Vuforia no longer accesses the system camera, so it is not possible to provide only intrinsics via the Driver.

holographic remoting can not capture gestures at unity

I use the Holographic Remoting Player to stream a Unity UWP program to the HoloLens. I can see the Unity image on the HoloLens, and I can move around to change the field of view, but gestures are not detected in the release build of the Unity program.
That is, gestures cannot be captured in the release build, but they can be captured when using the built-in connector in the Unity editor.
using UnityEngine.XR.WSA.Input;

GestureRecognizer gestureRecognizer;

void Start()
{
    gestureRecognizer = new GestureRecognizer();
    gestureRecognizer.Tapped += RecognizerTapped;
    gestureRecognizer.SetRecognizableGestures(GestureSettings.Tap | GestureSettings.DoubleTap | GestureSettings.Hold | GestureSettings.ManipulationTranslate);
    gestureRecognizer.StartCapturingGestures();
    InteractionManager.InteractionSourcePressed += InteractionSourcePressed;
}

void RecognizerTapped(TappedEventArgs args)
{
    // debugWindows is my own debug overlay
    debugWindows.AddMessage("tap " + args.tapCount);
}

void InteractionSourcePressed(InteractionSourcePressedEventArgs args)
{
    debugWindows.AddMessage("pressed");
}
That is my test code, and my custom connector is copied from the Unity docs (sketched below).
When I use the custom connector anywhere (release or editor), "pressed" is printed, but "tap" and the other GestureRecognizer delegates are never fired.
When I use the built-in connector in the editor, both "pressed" and "tap" are printed.
I have run it on many Unity versions (2018.2, 2018.4, 2019.1-2019.3) with the same result. Am I missing any key settings?
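For reference, here is roughly what the custom connector from the Unity docs looks like (a sketch using the legacy XR remoting API; the IP address and script name are placeholders):

using System.Collections;
using UnityEngine;
using UnityEngine.XR;
using UnityEngine.XR.WSA;

public class RemotingConnector : MonoBehaviour
{
    public string remoteMachineIP = "192.168.0.2"; // placeholder address
    bool loading;

    void Start()
    {
        HolographicRemoting.Connect(remoteMachineIP);
    }

    void Update()
    {
        // Once the streamer reports a connection, load the WindowsMR device.
        if (!loading && HolographicRemoting.ConnectionState == HolographicStreamerConnectionState.Connected)
        {
            loading = true;
            StartCoroutine(LoadDevice("WindowsMR"));
        }
    }

    IEnumerator LoadDevice(string device)
    {
        XRSettings.LoadDeviceByName(device);
        yield return null; // device loads on the next frame
        XRSettings.enabled = true;
    }
}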
We believe it is a Unity-side issue, and there is a similar case in Unity's IssueTracker that has not been fixed: HOLOLENS GESTURE INTERACTIONS NOT WORKING WHEN USING THE HOLOGRAPHIC REMOTING FROM UWP APP.
You can try resubmitting your question to the IssueTracker for help.

How to use vuforia in unity with sound

I have a problem using Vuforia with Unity: I have 10 videos in the database, and when I start the app the sound from the videos starts playing immediately, even when there are no cards to read from.
Check that your AudioSource components are on the target object, so they get disabled along with it.
If they are and you still have the problem, add custom code so that the audio is muted in OnTrackingLost and unmuted in OnTrackingFound.
The class you are looking for is DefaultTrackableEventHandler; you can inherit from it and add that code via overrides on your target, or add the code directly in that class using GetComponent.
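A minimal sketch of that approach (assuming a Vuforia version where DefaultTrackableEventHandler exposes OnTrackingFound/OnTrackingLost as protected virtual methods; the class name is mine):

using UnityEngine;

// Mutes all AudioSources under the target while it is not being tracked.
public class MuteWhenLostHandler : DefaultTrackableEventHandler
{
    protected override void OnTrackingFound()
    {
        base.OnTrackingFound();
        SetMuted(false); // target visible: unmute
    }

    protected override void OnTrackingLost()
    {
        base.OnTrackingLost();
        SetMuted(true); // target lost: mute
    }

    void SetMuted(bool muted)
    {
        foreach (var source in GetComponentsInChildren<AudioSource>(true))
            source.mute = muted;
    }
}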

Can I run ARCore Preview 1 App on Preview 2 release?

I've built an app which runs on the ARCore Preview 1 package in Unity. I know Google has made major changes in Preview 2.
My question is: what changes will I have to make to run my ARCore Preview 1 app on Preview 2?
Take a look at the code in the Preview 2 sample app(s) and update your code accordingly. For example, here is the new code for properly instantiating an object into the AR scene:
if (Session.Raycast(touch.position.x, touch.position.y, raycastFilter, out hit))
{
    var andyObject = Instantiate(AndyAndroidPrefab, hit.Pose.position,
        hit.Pose.rotation);

    // Create an anchor to allow ARCore to track the hitpoint
    // as understanding of the physical world evolves.
    var anchor = hit.Trackable.CreateAnchor(hit.Pose);

    // Andy should look at the camera but still be flush with the plane.
    andyObject.transform.LookAt(FirstPersonCamera.transform);
    andyObject.transform.rotation = Quaternion.Euler(0.0f,
        andyObject.transform.rotation.eulerAngles.y,
        andyObject.transform.rotation.z);

    // Make Andy model a child of the anchor.
    andyObject.transform.parent = anchor.transform;
}
Common changes:
Preview 1 used the Tango Core service; in Preview 2 it was replaced by the ARCore service.
Automatic screen rotation is now handled.
Some classes were altered; the main changes are the following.
For users:
AR Stickers were introduced.
For developers:
A new C API for use with the Android NDK that complements the existing Java, Unity, and Unreal SDKs;
Functionality that lets AR apps pause and resume AR sessions, for example to let a user return to an AR app after taking a phone call;
Improved accuracy and runtime efficiency across the anchor, plane-finding, and point cloud APIs.
I have updated my app from Preview 1 to Preview 2, and it's not a lot of work. There were minor API changes, like the ones for hit flags, Pose.position, etc. It would be impractical to post the whole change log here. I suggest you follow the steps below:
Replace the old SDK with the new one in the Unity project.
Then check for errors in your default editor (VS, VS Code, or MonoDevelop).
Look up the relevant APIs in the AR developer docs.
It's not such a cumbersome job; it took me some 5-10 minutes to upgrade.
Cheers!

Can't play music from FMODStudio in Unity3D

I created an audio project with FMOD Studio, and I want to connect it to a Unity3D project. I have a first-person character, and I added the FMOD Studio component to the FPSController, but no music comes out.
The Intensity is not the volume, just a parameter to "mix" different ambiences.
What can be wrong?
Found it: another component has to be added to the FPSCharacter itself: the FMOD Studio Listener. With this, it works.
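As a sketch, the listener can also be added from code (assuming the standard FMOD for Unity integration; the script name is mine):

using FMODUnity;
using UnityEngine;

// Ensures an FMOD Studio Listener exists on this object (normally the
// player/camera); without a listener, 3D events have no audible reference point.
public class EnsureStudioListener : MonoBehaviour
{
    void Awake()
    {
        if (GetComponent<StudioListener>() == null)
            gameObject.AddComponent<StudioListener>();
    }
}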