How do you get AVAudioEnvironmentNode.listenerPosition to update? - swift

I am trying to use an AVAudioEnvironmentNode() with AVAudioEngine and ARKit.
I am not using ARSCNView or any of the other ARViews, this is just a plain ARSession.
I have a sound source -> AVAudioEnvironmentNode -> AVAudioEngine.mainOut.
I understand how to set the position of the sound source. I am trying to figure out how to move the audio listener, because I want to walk around the sound source in space.
The Apple documentation says that to update the node's listener position you use:
AVAudioEnvironmentNode.listenerPosition = AVAudio3DPoint(newX, newY, newZ)
When I pass the ARCamera's forward and up vectors to the node, the orientation seems to change fine. However, when I try to change the position, I do not hear any difference, and when I print a debug of listenerPosition, the output stays at the zero origin, even though the camera's position is moving.
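Roughly, this is what my per-frame update looks like (a simplified sketch; the engine and node wiring are abbreviated and the helper name updateListener(from:) is mine):

import ARKit
import AVFoundation

let engine = AVAudioEngine()
let environment = AVAudioEnvironmentNode()

// Called once per ARSession frame (e.g. from session(_:didUpdate:)).
func updateListener(from frame: ARFrame) {
    let t = frame.camera.transform  // column-major 4x4 in world space

    // Orientation: the camera looks down its local -Z axis, up is +Y.
    let forward = AVAudio3DVector(x: -t.columns.2.x, y: -t.columns.2.y, z: -t.columns.2.z)
    let up = AVAudio3DVector(x: t.columns.1.x, y: t.columns.1.y, z: t.columns.1.z)
    environment.listenerVectorOrientation = AVAudio3DVectorOrientation(forward: forward, up: up)

    // Position: the translation column of the camera transform.
    environment.listenerPosition = AVAudio3DPoint(x: t.columns.3.x, y: t.columns.3.y, z: t.columns.3.z)

    // This keeps printing (0.0, 0.0, 0.0) even while the camera moves.
    print(environment.listenerPosition)
}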
Is there something I have to do to make the AVAudioEnvironmentNode accept a new listener position?
Thanks.

How to Match a reference point in 2 different AR scenes by position and rotation?

Here are some details about my project:
I have 2 scenes: "new scan" and "load scan". In the "new scan" scene I instantiate a 3d cube and make all the other points relative to it. This is my reference point. Then I instantiate some more points and finally save all the data to the device (my phone).
Next, in "load scan" I load the scene again and instantiate the cube in the exact same world position. For now, I managed to set the right position for each point but the axis is rotated because I start the scene from a different real-world location and different phone rotation.
Based on the cubes which are instantiated in the same place, I need to match the rotation and the position of the scene so the points will appear in the same relative position as the first cube.
Note: one can assume that the cube will be instantiated with the user facing the desired direction. But do NOT assume that the user starts the "load scan" scene facing the same direction as in the "new scan" scene (which affects the whole scene rotation).
Here is a visualization of the problem:
Image of New Scan:
Image of Load Scan:
Thanks
If you want to make sure that the cube will appear in the same position/rotation in every AR session you have several options:
Use an Image marker
Use ARWorldMap (iOS exclusive)
Use a cloud tracking solution (Google Cloud Anchors / Azure Spatial Anchors)
Of course you can also try to make the user place the cube correctly themselves, or redesign your app to work without these restrictions.
So I've found a solution, though it is not the ultimate one:
First, make a class with a public static field so we can pass the value between other scripts and scenes. Something like this:
public static class SceneStage
{
    public static int ResetScene = 0;
}
Now, every time the camera turns on, check the SceneStage.ResetScene state. If the value == 0, don't do anything; otherwise ask the user to stand facing the desired direction and then press a button, which calls the function ResetScene:
private void ResetScene(int _scene)
{
    var xrManagerSettings = UnityEngine.XR.Management.XRGeneralSettings.Instance.Manager;
    xrManagerSettings.DeinitializeLoader();   // shut down XR so the AR session state is discarded
    SceneManager.LoadScene(_scene);           // reload the current scene
    xrManagerSettings.InitializeLoaderSync(); // start XR again after the reload
}
Here I send the scene build index to the function with:
ResetScene(SceneManager.GetActiveScene().buildIndex);
So basically the flow is this: the first time we open the scene (when SceneStage.ResetScene == 1), change the value to 0 and reset the scene. The second time, do nothing, but when we leave the scene set the value back to 1 so the next scene will reset too (because the AR pose driver is still tracking the environment).

Why Is This Not Framerate Independent When Adding an Impulse Force (Unreal Engine)

I am facing an issue. I want to push an object in the direction/axis the player is moving towards. I can do that using GetLastInputVector multiplied by the force value, but it is framerate dependent: the output velocity is different when testing at around 10 FPS versus 1000 (uncapped) FPS. How can I make the force unaffected by framerate while still applying it along the axis the player is moving on?
According to the documentation of add impulse: https://docs.unrealengine.com/4.27/en-US/BlueprintAPI/Pawn/Components/CharacterMovement/AddImpulse/
"If you want to continually apply forces each frame, use AddForce()"
AddImpulse is good for something like kicking a ball, where it happens only once and the framerate does not matter.
AddForce is better for something like having your character push an object, as it is designed to be applied every frame.
In your case it seems you should use AddForce.
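To see why, here is a small illustration (plain Swift, not Unreal code; the numbers and the simple Euler integration are just assumptions for the demo): an impulse changes velocity directly, so applying one every frame scales with the number of frames, while a force is multiplied by the frame's delta time, so one simulated second gives the same result at any framerate.

struct Body {
    var velocity: Double = 0
    let mass: Double = 1
}

// An impulse changes velocity immediately: dv = impulse / mass.
func applyImpulse(_ impulse: Double, to body: inout Body) {
    body.velocity += impulse / body.mass
}

// A force is integrated over the frame: dv = (force / mass) * dt.
func applyForce(_ force: Double, to body: inout Body, dt: Double) {
    body.velocity += (force / body.mass) * dt
}

// Simulate one second at two different framerates.
for fps in [10.0, 1000.0] {
    let dt = 1.0 / fps
    var impulseBody = Body()
    var forceBody = Body()
    for _ in 0..<Int(fps) {
        applyImpulse(5, to: &impulseBody)      // per-frame impulse: result depends on frame count
        applyForce(5, to: &forceBody, dt: dt)  // per-frame force: result is the same at any fps
    }
    print("\(Int(fps)) FPS -> impulse: \(impulseBody.velocity), force: \(forceBody.velocity)")
}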

Get Orientation of SCNNode Swift [duplicate]

I am working on a basic racing game using Apple's SceneKit and am running into issues simulating a car. Using the SCNPhysicsVehicle behavior, I am able to properly set up the car and drive it using the documented methods.
However, I am unable to get the car's position. It seems logical that the SCNPhysicsVehicle would move the SCNNodes that contain the chassis and the wheels as the car moves but the SCNNodes remain at their original position and orientation. Strangely enough, the chassis' SCNPhysicsBody's velocity remains accurate throughout the simulation so I can assume that the car's position is based off of the SCNPhysicsBody and not the SCNNode. Unfortunately, there is no documented method that I have found to get an SCNPhysicsBody's position.
Getting a car's position should be trivial and is essential to create a racing game but I can't seem to find any way of getting it. Any thoughts or suggestions would be appreciated.
Scene Kit automatically updates the position of the node that owns an SCNPhysicsBody based on the physics simulation, so SCNNode.position is the right property to look for.
The catch is that there are actually two versions of that node in play. The one you typically access is called the "model" node. It reflects the target values for properties you set, even if you set those properties through an animation. The presentationNode reflects the state of the node currently being rendered — if an animation is in progress, the node's properties have intermediate values, not the target values of the animation.
Actions, physics, constraints, and any scene graph changes you make inside update/render loop methods directly target the "presentation" version of your scene graph. So, to read node properties that have been set by the physics simulation, get the presentationNode for the node you're interested in (the node that owns the vehicle's chassisBody physics body), then read the presentation node's position (or other properties).
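For example, something along these lines (a minimal sketch; vehicleNode is assumed to be the node that owns the chassis physics body, and depending on your Swift version the property is spelled presentation or presentationNode):

import SceneKit

class RaceSceneController: NSObject, SCNSceneRendererDelegate {
    var vehicleNode: SCNNode!   // the node that owns the vehicle's chassisBody

    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        // Read the values the physics simulation is actually rendering.
        let simulatedPosition = vehicleNode.presentation.position
        let simulatedOrientation = vehicleNode.presentation.orientation
        // Use these for a chase camera, lap checkpoints, HUD, etc.
        print("car at \(simulatedPosition), orientation \(simulatedOrientation)")
    }
}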
I had the same problem with my player node.
I move it with applyForce (to handle collision detection).
But when I check the node's position after some movement, it has not moved (the presentation node holds the actual position, as rickster wrote in his answer).
I managed to keep scnNode.position updated from the renderer loop.
You have to set your node's position from the presentationNode position:
node.position = node.presentationNode.position
Put this in renderer(_:updateAtTime:) and your node's position will stay in sync with whatever the physics simulation does to the physicsBody.

Unreal Engine 4 - Add offset to character movement

I just started using Unreal Engine (yesterday) and I need to simulate a drunk character using Blueprints.
I'm using two camera shakes (one for standing still and one for walking), but I want to add some "displacement" to the character when he's walking.
Basically I want to define a random float to be added to X axis location in order to make character wobble smoothly.
It will be acceptable even if there's a way to make the character move along with the camera when it's shaking.
What I have tried so far is using AddActorLocalOffset and a timeline to lerp between the actor's location and the actor's location plus the offset, but both look very choppy to me.
Maybe it's a noob question, but as I said, I'm very new to this and need it for a quick job.
Any suggestion?
Thanks
If you are targeting a physically correct model, you should use AddForce (UE docs). But this approach would require implementing a "drunk animation" in which your character modifies its movement animation to "compensate" for the force by stepping aside, etc.
Another (much simpler) approach is to use AddMovementInput; an example can be seen here: UE Answers. In this case, you basically simulate the player's input by adding a small amount of sideways input here and there.
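Not Unreal Blueprint/C++, but as a sketch of the math that avoids the choppiness (the names and numbers here are just assumptions): pick a random target offset, then move toward it a little every frame, scaled by delta time, and pick a new target once you reach it. Feeding the result into AddMovementInput (or the actor offset) each tick gives a smooth wobble.

import Foundation

struct DrunkWobble {
    var currentOffset: Double = 0
    private var targetOffset: Double = 0
    let maxOffset: Double = 30    // maximum sideways displacement (assumed units)
    let drift: Double = 2.0       // how quickly we drift toward the target

    // Call once per frame with the frame's delta time; returns the sideways offset to apply.
    mutating func update(deltaTime: Double) -> Double {
        if abs(targetOffset - currentOffset) < 0.5 {
            // Close enough to the old target: pick a new random one.
            targetOffset = Double.random(in: -maxOffset...maxOffset)
        }
        // Framerate-independent interpolation toward the target.
        currentOffset += (targetOffset - currentOffset) * min(1.0, drift * deltaTime)
        return currentOffset
    }
}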

Setting up a power meter in cocos2d

I am a straight noob. Everyone else says it, but I'm dead serious.
My question is: what is the best way to make a power meter that moves an object? Meaning, how do I set it up so that the longer the player holds, the more power they get? Also, how would I incorporate physics?
What I'd like to accomplish is to have the player hold onto something, so that when he taps and holds on the screen he powers up, and when he lets go he throws the object a certain distance.
Checking whether a touch sequence is in progress is rather easy: you just have to override two methods in your scene class, one that informs you whenever a touch sequence begins and one that tells you when the touch has ended. A source code example is described in the linked page. After that, I think you need a gauge to show how much power has been gathered so far. The easiest way is to use a texture with the full-power gauge in it, set it as the sprite's texture, and then reveal it little by little as the power goes up, as in the code below:
// create the gauge with zero power
CCSprite *s = [CCSprite spriteWithTexture:[[CCTextureCache sharedTextureCache] addImage:@"gauge.png"] rect:CGRectMake(0, 0, 0, 10)];
// then, whenever the power changes, call this to reveal more of the gauge
[s setTextureRect:CGRectMake(0, 0, power, 10)];
Note that in my code I am using a 100x10 texture (power is something between 0 and 100, and the texture height is 10, which is the last parameter of both CGRectMake calls).
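And for the hold-to-charge part, the logic is engine-independent. Here it is as a small Swift sketch (not cocos2d API; the chargeRate and maxPower values are just assumptions): record the time when the touch begins, and when it ends convert the hold duration into a power value that you feed into the gauge's texture rect and into the impulse/velocity of the thrown object.

import Foundation

final class PowerMeter {
    private var touchStartTime: TimeInterval?
    let chargeRate: Double = 50.0   // power gained per second of holding (assumed)
    let maxPower: Double = 100.0    // matches the 100-pixel-wide gauge texture

    // Call from your touch-began handler.
    func touchBegan(at time: TimeInterval) {
        touchStartTime = time
    }

    // Call from your touch-ended handler; returns power in 0...maxPower.
    func touchEnded(at time: TimeInterval) -> Double {
        guard let start = touchStartTime else { return 0 }
        touchStartTime = nil
        return min(maxPower, (time - start) * chargeRate)
    }
}

// Usage: set the gauge's texture rect width to the returned power,
// and scale the throw impulse by the same value.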