I'm trying to set up Oculus Mixed Reality Capture to capture footage of someone playing my VR game. I have got it to work with other MRC-enabled games like Beat Saber, but I don't understand how to enable MRC in my own game. I followed the Oculus guide and also had a look at the Unreal documentation, but it seems to be outdated.
From what I understand, all I need to do is:
1. Have the Oculus VR plugin enabled
2. Get the Oculus MR settings (in a blueprint)
3. Set IsCasting to true
[Screenshot: Oculus MR Blueprint nodes]
But I just get these errors.
Blueprint Runtime Error: "Accessed None trying to read property CallFunc_GetOculusMRSettings_ReturnValue". Blueprint: BP_BilpakkGameMode Function: Execute Ubergraph BP Bilpakk Game Mode Graph: EventGraph Node: Load from Ini
Blueprint Runtime Error: "Accessed None trying to read property CallFunc_GetOculusMRSettings_ReturnValue". Blueprint: BP_BilpakkGameMode Function: Execute Ubergraph BP Bilpakk Game Mode Graph: EventGraph Node: Set Is Casting
Blueprint Runtime Error: "Accessed None trying to read property CallFunc_GetOculusMRSettings_ReturnValue". Blueprint: BP_BilpakkGameMode Function: Execute Ubergraph BP Bilpakk Game Mode Graph: EventGraph Node: Save to Ini
Also, I put an IsMrcEnabled node in and it returns false.
[Screenshot: IsMrcEnabled node]
Lastly, I tried downloading the Mixed Reality Sample from GitHub, but I couldn't get that to work either; I got the same errors.
I'm using Unreal Engine 4.26.2, Oculus VR plugin 1.51.0, and an Oculus Quest 2.
This is my first time trying to find an API. If I'm successful, I will then need to figure out how to use it.
I want to run a VR immersive experience on an Oculus Quest that a person wears while sitting in a motion simulator, and I want the movement of a boat in the VR headset to correspond with the movement of the chair.
The software for the chair I have is called Actuate motion v1.0.8.
On their website it says they have a "C API" (for which I can't find any online documentation), but they also mention they have a Unity plugin you can use for your game. I would use their plugin before attempting their API.
Good morning,
I am trying to install the Mixed Reality Toolkit for HoloLens 1. I need to do spatial mapping in Unity, and I would like to use a "Spatial Mapping" prefab which should be available after configuring Unity with the MRTK. Unfortunately, I don't see the prefab. I enabled the "SpatialPerception" capability in the Player settings and simply imported the "Microsoft Mixed Reality Toolkit Foundation" package into my project from the MRTK tool. How can I access the Spatial Mapping prefab, please?
Thank you.
To use spatial mapping in the app, we should enable the Spatial Awareness system in the MixedRealityToolkit profile and register spatial observers to provide mesh data. There is no Spatial Mapping "prefab" in MRTK. Here is a step-by-step guide showing how to do that: Spatial awareness getting started
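Once the system is enabled in the profile, you can also reach the registered mesh observer from code, for example to toggle how the mesh is drawn at runtime. A minimal sketch (the class name is just an example):

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.SpatialAwareness;
using UnityEngine;

public class SpatialMeshToggler : MonoBehaviour
{
    void Start()
    {
        // The Spatial Awareness system must be enabled in the MixedRealityToolkit profile.
        var spatialAwareness = CoreServices.SpatialAwarenessSystem;
        if (spatialAwareness == null)
        {
            Debug.LogWarning("Spatial Awareness system is not enabled in the MRTK profile.");
            return;
        }

        // Look up the registered mesh observer and make the mesh visible.
        var access = spatialAwareness as IMixedRealityDataProviderAccess;
        var observer = access?.GetDataProvider<IMixedRealitySpatialAwarenessMeshObserver>();
        if (observer != null)
        {
            observer.DisplayOption = SpatialAwarenessMeshDisplayOptions.Visible;
        }
    }
}
```

Attach it to any GameObject in a scene that already has the MixedRealityToolkit object configured; there is no prefab to drop in.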
We are using Vuforia for image tracking with HoloLens and the Unity engine. Vuforia works fine. We are also using Azure Spatial Anchors to fix the location of objects. However, anchors do not seem to work with Vuforia. It appears that Vuforia captures camera events and does not pass them on to Azure Spatial Anchors, maybe?
Is there a way to get both technologies working at the same time?
The major issue is that Vuforia occupies the camera pipeline.
You may stop Vuforia, switch to ASA, and then switch back.
Or you may use pictures and time stamps with ASA.
Please read this page:
https://library.vuforia.com/platform-support/working-camera-unity
It may help you get the camera frame. You could then transfer the frames to a service hosted on a Linux server, together with Spatial Anchors: https://github.com/microsoft/azure_spatial_anchors_ros
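As a rough sketch of the first suggestion (stop Vuforia, run ASA, switch back), assuming the ASA Unity SDK's SpatialAnchorManager and Vuforia's VuforiaBehaviour are both in the scene; the class and method names here are just placeholders:

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.SpatialAnchors.Unity; // ASA Unity SDK (SpatialAnchorManager)
using UnityEngine;
using Vuforia;

public class CameraHandoff : MonoBehaviour
{
    [SerializeField] private SpatialAnchorManager anchorManager;

    // Pause Vuforia so it releases the camera, run an ASA session, then hand the camera back.
    public async Task LocateWithAsaAsync()
    {
        VuforiaBehaviour.Instance.enabled = false;  // stop Vuforia's camera pipeline

        await anchorManager.StartSessionAsync();    // ASA now receives the camera frames
        // ... create or locate your anchors here ...
        anchorManager.StopSession();
        anchorManager.DestroySession();             // release the camera again

        VuforiaBehaviour.Instance.enabled = true;   // hand the camera back to Vuforia
    }
}
```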
My multiplayer VR project is based on Unity3D, Oculus Integration v1.39 and PUN2 running on Oculus Quest. I'm using the standard teleport script provided in the Oculus library. PhotonAvatarView is what I use to keep avatar position/rotation in sync across clients.
Unfortunately, when a player teleports to a new location, the other player doesn't detect any change in the remote avatar. It seems like PhotonAvatarView doesn't see the change in the user's location, which is really strange. How can I fix it?
PhotonAvatarView code is available at this URL: https://doc.photonengine.com/en-us/pun/current/demos-and-tutorials/oculusavatarsdk
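One workaround I'm considering is to broadcast the new pose with an RPC right after each local teleport, roughly like the sketch below (the class and method names are placeholders, not part of PhotonAvatarView), but I'd prefer a solution where PhotonAvatarView picks up the teleport on its own:

```csharp
using Photon.Pun;
using UnityEngine;

public class TeleportSync : MonoBehaviourPun
{
    // Called from the teleport script after the local rig has been moved.
    public void OnTeleported(Vector3 newPosition, Quaternion newRotation)
    {
        if (photonView.IsMine)
        {
            photonView.RPC(nameof(RemoteTeleport), RpcTarget.Others, newPosition, newRotation);
        }
    }

    [PunRPC]
    private void RemoteTeleport(Vector3 position, Quaternion rotation)
    {
        // Snap the remote representation to the teleported pose.
        transform.SetPositionAndRotation(position, rotation);
    }
}
```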
I'm a sound designer working on VR for mobile phones, prototyping on a Galaxy S8.
We use Unity and FMOD, and thus the GVR plugins (now Resonance Audio).
It is known that GVR bypasses group buses in FMOD to give more precise control over the spatialisation of each source.
Now I have a problem, as I'm definitely not a dev, so my coding skills are not that great.
I simply want to automate the volume of certain FMOD events and make them fade out over roughly 10-15 seconds at the end of a scene. So I feel like I need automation either on the GVR Source Gain of each track, OR on the master volume of each event.
I added a parameter in FMOD, so from code in Unity I want to tell FMOD to move smoothly from one value to another, thus fading out the volume.
Issue: the parameter appears in the inspector in Unity, but I can't access or control it.
I can tick the box for the said parameter, but it automatically unticks as soon as I start the scene, and I don't know what to type to control the value.
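For context, this is roughly what I imagine the Unity side needs to look like; the parameter name "Fade", the field names, and the fade time are placeholders, and I'm not sure it's the right approach:

```csharp
using System.Collections;
using FMODUnity;
using UnityEngine;

public class EventFadeOut : MonoBehaviour
{
    [SerializeField] private StudioEventEmitter emitter; // the emitter playing the event
    [SerializeField] private float fadeSeconds = 12f;

    public void StartFadeOut()
    {
        StartCoroutine(FadeRoutine());
    }

    private IEnumerator FadeRoutine()
    {
        float elapsed = 0f;
        while (elapsed < fadeSeconds)
        {
            elapsed += Time.deltaTime;
            float level = 1f - Mathf.Clamp01(elapsed / fadeSeconds);

            // Drive the custom parameter added in FMOD Studio.
            emitter.SetParameter("Fade", level);

            // Alternative: scale the event instance's own master volume, which
            // still works even though Resonance/GVR bypasses the group buses.
            // emitter.EventInstance.setVolume(level);

            yield return null;
        }
    }
}
```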
I have some devs on the team with me who will help, but we're kind of in a rush, so I'm trying to find solutions myself.
TL;DR: how do I automate parameter values of the GVR plugins, or of an FMOD event's master bus (not the session master bus), from code in Unity, given that GVR / Resonance Audio bypasses group buses?
If anyone has had the same kind of issue or knows how to sort it out... I'll be more than grateful.
Regards,
Guillaume