I used to disable Guardian in my physical VR escape room to let the players move without limits.
For about a week now, with Guardian disabled in Developer Hub and on the Quest itself, I see the players drifting: all the objects move a little bit and lose their initial position.
I tried disabling it with Meta Developer Hub and with SideQuest too.
How can I solve this? This problem is destroying the user experience.
I don't know what else to try.
Until a week ago there was no such problem...
I've posted a game on Google Actions named "Jeu du pendu".
When it is launched with the invocation phrase ("Parler avec jeu du pendu" / "Talk to the hangman game") it works normally.
But when it is launched via a tap in the list of games, the conversation ends immediately and the game does not start.
The same problem occurs on a Nest Hub Max and on a smartphone.
On the Hub Max you just hear two beeps. On the smartphone there is a message: "Jeu du pendu a quitté la conversation" / "Jeu du pendu has left the conversation".
There are no crash reports of this event on the Analytics page, nor in the API logs.
I reached out to the Google support team, but they said they can't help with that.
Does anyone have an idea of what could be missing?
Thanks in advance.
Finally I got the solution.
After searching a little more, I saw a built-in invocation named 'PLAY_GAME'.
By default it redirects to the 'End Conversation' action.
I changed it to redirect to the first game Action, like the Main Invocation (the invocation phrase) does, and republished the game like that a few days ago.
I just tried it now and it works!
It seems there is no way to test this launch method in the Test console.
Thanks for your help.
I am having trouble implementing a shared experience using Azure Spatial Anchors. I followed the Microsoft tutorial (https://learn.microsoft.com/en-us/windows/mixed-reality/develop/unity/tutorials/mr-learning-sharing-05) and it works locally on one HoloLens 2. But when I try to share an anchor from one HoloLens 2 and retrieve it on another HoloLens 2 in the same room, the watcher can't seem to find it. No error is thrown, and the last debug messages I get are "Watcher created" and "Looking for Azure anchor... please wait..." (debug window screenshot). I tried going to different rooms multiple times, but that didn't help.
I'm using:
Unity 2020.3.12 LTS,
MRTK 2.7.2,
ASA 2.10.0-preview1,
OpenXR 1.0
From telemetry, it seems that your HL2 devices are not sending any visual data to the service to locate the anchor, which would explain the behavior you're seeing.
Please create a new issue at the ASA samples repository, and we can investigate the problem further:
https://github.com/Azure/azure-spatial-anchors-samples/issues
I am trying to synchronize a VR scene using GunDB.
To experiment with it, I put some data into GunDB.
But I got this warning (storage warning screenshot).
I use IndexedDB, and I can keep going by hitting 'Allow'.
But I'm wondering why it uses so much storage!
setInterval(putLocation, Math.ceil(1000 / 50)); // about every 20 ms, i.e. roughly 50 updates per second

function putLocation() {
  obj.get('attributes').get('position').put(object.attributes.position);
}
It keeps writing to the same node (object.attributes.position) at that interval.
Please let me know how I could fix it.
Thank you.
@huhsame 1.2 GB for VR scene data? That seems suspicious.
By any chance, is this in Safari?
Safari has a known bug (@go1dfish found this) where it creates runaway storage accumulation (with or without GUN) that gets triggered if its file descriptor is left open too long.
Could you see if the same thing happens in Chrome? If it does, then it is a GUN bug.
If it is just Safari: we added code that resets/reopens Safari's IndexedDB instance every 15 seconds, and we have had success so far with that approach.
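Roughly, that 15-second reset looks like the following. This is only a minimal sketch of the idea, not GUN's actual internals; the database name is an assumption and may differ in your setup.

// Periodically close and reopen the IndexedDB connection so Safari's
// file descriptor is never held open for too long (names are illustrative).
let db = null;

function openDB() {
  const req = indexedDB.open('radata'); // assumed store name; adjust to match your GUN setup
  req.onsuccess = () => { db = req.result; };
  req.onerror = (e) => { console.error('IndexedDB open failed', e); };
}

setInterval(() => {
  if (db) { db.close(); db = null; }
  openDB();
}, 15 * 1000);

openDB();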
However, clearly, either Safari has changed something, or that workaround is no longer viable, so we'll need to figure something new out.
I understand Safari is very important because of iOS; it is just unfortunate that Safari lags behind on several very serious and important fronts (WebRTC, IndexedDB, and WebM). There is only so much our team can do to work around these bugs until Safari is more standards compliant. But where we can work around them, we will.
@marknadal
Thanks for your answer, and sorry I am late.
I tested it on Chrome and Safari after clearing storage.
After 30 minutes, Chrome had used about 1 MB.
On Safari, I could not find the usage panel, but there was no warning pop-up like the previous one.
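(For reference, where a usage panel is not available, storage usage can also be checked programmatically with the standard StorageManager API; a minimal sketch, and support varies by browser:)

// Log roughly how much storage this origin is currently using, where supported.
if (navigator.storage && navigator.storage.estimate) {
  navigator.storage.estimate().then(({ usage, quota }) => {
    console.log('Using about ' + (usage / 1e6).toFixed(1) + ' MB of ' + (quota / 1e6).toFixed(0) + ' MB quota');
  });
}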
I guess it was the total data since September, when I started this experiment, but that's just my guess.
And I am still wondering: should users delete data in web storage regularly?
Please answer me.
I have tried to get a response on GitHub, but with no activity on that issue there, I will ask here.
I have been following the documentation and I am stuck: I have added the WwiseResonanceAudioRoom mixer effect to the bus in Wwise, and I do not see anything in its properties. I am not sure if I am supposed to. Right after that part, the documentation says: "Note: If room properties are not configured, the room effects bus outputs silence." I was wondering if this was the case, and yes, it outputs silence. I even switched the effect to see if it would just pass audio, and it does, just not with the Room effect, so at least I know my routing is correct.
So this leads to my actual question: how do you configure the plugin? I know there is some documentation, but there is not a single tutorial or step-by-step guide for us non-code-savvy audio folk. I have spent the better half of my week trying to figure this out because, frankly, for the time being this is the only audio spatialization plugin that features audio occlusion, obstruction, and propagation within Wwise.
Any help is appreciated,
Thank you.
I had Room Effects with Resonance Audio working in another project last year, under its former name, GVR. There are no properties on the Room Effect itself. These effect settings and properties reside in the Unity Resonance prefabs.
I presume you've followed the Room Effects tutorial here:
https://developers.google.com/resonance-audio/develop/wwise/getting-started
Then what you need to do is add the Room Effect assets to your Unity project. The assets are found in the Resonance Audio zip package, next to the authoring and SDK files. Unzip the Unity package into your project, add a Room Effect to your scene, and you should be able to see the properties in the Inspector of the room object.
Figured it out thanks to Egil Sandfeld, here: https://github.com/resonance-audio/resonance-audio-wwise-sdk/issues/2#issuecomment-367225550
To elaborate: I already had the SDKs in place, but I went ahead and replaced them anyway, and it worked!
I have followed the Multiplayer Shootout showcase (https://docs.unrealengine.com/latest/INT/Resources/Showcases/BlueprintMultiplayer/index.html) and tried to replicate the sessions part for my own project. I can create a (LAN) session, see other sessions, and join one. My problem is that, for some reason, the correct map opens, but the actors do not replicate. If I simply open the map for 2 players, they replicate without any issues. Is there something else I should do to enable replication when using sessions? Thank you!
After further testing, the program gave me the following error: LogNet: Warning: Travel Failure: [LoadMapFailure]: Failed to load package '/Game/Maps/UEDPIE_2_Arena2'. I found out this was because I was testing the game sessions inside the editor. After testing it in Standalone Game mode, everything worked fine.