How to get audio listener position in unreal blueprint? - unreal-engine4

I am calculating the relative speed between cars and my audio listener to achieve a Doppler effect, so I need the vector from the audio listener to the object that is making the sound. In C++ code, I can use "virtual void APlayerController::GetAudioListenerPosition(FVector& OutLocation, FVector& OutFrontDir, FVector& OutRightDir) const" to get the audio listener position of a player controller, which is its view point by default. But it is not a UFUNCTION, so how can I get this from Blueprint without C++ code? In Blueprint I found a function named "Get Closest Listener Location". What is it used for? Does it return the same thing as APlayerController::GetAudioListenerPosition even after I have called "Set Audio Listener Override"?
I am using Unreal Engine 5.
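For reference, if a small C++ wrapper ever becomes an option, the engine call can be exposed to Blueprint from a custom player controller. This is only a sketch (the class name AMyPlayerController is made up), and since it forwards to the same code path the audio engine uses, it should also reflect a listener override:

```cpp
// MyPlayerController.h -- hypothetical wrapper class exposing the listener position to Blueprint.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/PlayerController.h"
#include "MyPlayerController.generated.h"

UCLASS()
class AMyPlayerController : public APlayerController
{
    GENERATED_BODY()

public:
    // Forward the C++-only API so Blueprint can read the listener transform.
    UFUNCTION(BlueprintPure, Category = "Audio")
    void GetListenerPosition(FVector& OutLocation, FVector& OutFrontDir, FVector& OutRightDir) const
    {
        GetAudioListenerPosition(OutLocation, OutFrontDir, OutRightDir);
    }
};
```

With this in place, the Doppler vector in Blueprint is simply the sound emitter's location minus OutLocation.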

Related

Unity Input System event triggered without input

I am using the "new" Unity Input System.
I have a start scene with a menu, and from there I can load the game scene, which has a TouchManager. The touch manager lets me control the player and visualize the force applied to the player.
To visualize the force I have a game object that starts disabled; the touch manager enables the visualizer only while there is touch input.
The issue is that at the start of the game scene the touch visualizer is already enabled, although after the first touch it works perfectly.
With debugging I can see that the event that signals "touch" is fired (without any touch), but the event that signals the release isn't.
There is nothing particularly relevant in the code as far as I can tell, so I'll briefly summarize it (scripts in order of execution):
GameManager:
Awake() the game is set to pause;
Start() the game is set to "unpause";
TouchManager:
Pause(bool isPause) input events are subscribed and unsubscribed;
Move is the input that is causing the issue.
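The subscription pattern described above might look roughly like this (a sketch only; the action name, the PlayerInput reference, and the handler bodies are assumptions, not the asker's actual code):

```csharp
// Hypothetical reconstruction of the TouchManager described in the question.
using UnityEngine;
using UnityEngine.InputSystem;

public class TouchManager : MonoBehaviour
{
    [SerializeField] private PlayerInput playerInput;
    [SerializeField] private GameObject forceVisualizer;
    private InputAction moveAction;

    private void Awake()
    {
        moveAction = playerInput.actions["Move"];
    }

    // Called by the GameManager on pause/unpause.
    public void Pause(bool isPause)
    {
        if (isPause)
        {
            moveAction.started -= OnMoveStarted;
            moveAction.canceled -= OnMoveCanceled;
        }
        else
        {
            moveAction.started += OnMoveStarted;
            moveAction.canceled += OnMoveCanceled;
        }
    }

    private void OnMoveStarted(InputAction.CallbackContext ctx) => forceVisualizer.SetActive(true);
    private void OnMoveCanceled(InputAction.CallbackContext ctx) => forceVisualizer.SetActive(false);
}
```

If the Move action is already "in progress" when Pause(false) subscribes the handlers (for example, state left over from the menu scene), started can fire immediately without a real touch, which would match the symptom described.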
I tried disabling the visualizer, but it has to be done in Update, since the event that enables it is triggered after the Start method and I don't know how or when the phantom touch event fires. I would also rather fix the issue at the source instead of covering it up.
Any ideas?

Realtime Sharing of Pointer/Cursor of Each Player in Game Room Using PUN2/MRTK2 on Hololens 2

Is there a way to share each player's raycast pointer/cursor with the other players in realtime? I'm using PUN2 and MRTK2 and Hololens 2.
Approaches I've tried so far:
I tried a naïve approach of modifying the MRTK-provided ShellHandRayPointer to contain a PhotonView and then using that on the MixedRealityToolkit object within the scene, but that seems to have no effect.
I've also tried creating a cursor prefab based on MRTK's CursorFocus, adding a Surface Magnetism component (tracking the hands), and instantiating this prefab for each player in PUN2's OnJoinedRoom callback. After the instantiation call, I move the object to a non-rendered layer for the local player, with the goal of hiding it locally while letting it show up for other players. This hides the object as expected when only one player is in the room, but when a second player joins, the first player then sees a cursor that tracks their own hand movement, which seems unexpected to me (of note: I'm using one HoloLens 2 headset, with a computer acting as the second player). Perhaps this "crossed" behavior is due to the Surface Magnetism component?
Thanks!
Step-by-step images of how I modified the ShellHandRayPointer with a PhotonView and then reattached to the MRTK system:
scene:
MRTK system:
MRTK system part 2: reference to cloned ShellHandRayPointer:
my cloned ShellHandRayPointer part 1:
PhotonView components expanded on the cloned ShellHandRayPointer:
Regarding how to share objects in Photon in real time: as far as I know, real-time shared objects in Photon need to be instantiated by PhotonNetwork.Instantiate, but the ShellHandRayPointer in MRTK is instantiated by the input system.
You can create a copy of the ShellHandRayPointer, map the position and rotation of the real ShellHandRayPointer to the copy at runtime, and share this copy through Photon in real time.
The position and rotation of the ShellHandRayPointer can be obtained from MixedRealityToolkit.InputSystem.DetectedInputSources, or you can use Unity's methods to get the GameObject directly.
For the cursor, you can use the same approach: create a copy of the cursor and map its position and rotation.
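A minimal sketch of that mapping idea (all names here are assumptions; the copy itself must still be created with PhotonNetwork.Instantiate and carry a PhotonView plus a PhotonTransformView so the transform replicates):

```csharp
// Hypothetical mirror component placed on the networked copy of the pointer.
using UnityEngine;
using Photon.Pun;

public class PointerMirror : MonoBehaviourPun
{
    // Assigned on the owning client to the real ShellHandRayPointer's transform;
    // left null on remote clients, which only receive the replicated transform.
    public Transform localPointer;

    private void LateUpdate()
    {
        // Only the owner drives the copy; PhotonTransformView replicates
        // position/rotation to everyone else.
        if (photonView.IsMine && localPointer != null)
        {
            transform.SetPositionAndRotation(localPointer.position, localPointer.rotation);
        }
    }
}
```

After PhotonNetwork.Instantiate in OnJoinedRoom, the local player finds its own ShellHandRayPointer instance and assigns it to localPointer; hiding the copy locally then works without affecting what remote players see.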

How to mimic HoloLens 2 hand tracking wIth Windows Mixed Reality controllers [MRTK2]?

The HoloLens 2 features hand tracking and the ability to reach out and poke UI elements. With Unity and the Mixed Reality Toolkit V2, the input for hand-tracked near interactions (i.e. poking) comes from the PokePointer class, which generates events for GameObjects that have BaseNearInteractionTouchable components.
My question is how can we get the same PokePointer events from virtual reality controllers such as the Windows Mixed Reality controllers? This would make it possible to prototype on the desktop using a VR headset and even directly use the same near interactions of the Mixed Reality Toolkit within VR applications.
Can the PokePointer component be attached to a hand GameObject that is a controller model? Or is there a better way to do this through the MRTK profiles system?
Actually, it's possible to add a poke pointer and a grab pointer to a VR device. In fact, adding basic functionality without visualization can be done without even writing any code!
Getting the existing grab and poke pointers to work with VR
Open your current pointer configuration profile by selecting the MixedRealityToolkit object in the scene view, going to the inspector window, then navigating to Input -> Pointers.
Under pointer options, set the controller type for the PokePointer and the Grab Pointer to include your VR Controller type (in my case, it was Windows Mixed Reality, though you may wish to use OpenVR)
The poke pointer is configured to follow the Index Finger Pose, which does not exist for VR controllers. So you will need to open the PokePointer.prefab file and, in the inspector, under Poke Pointer -> Pose Action, set the value to "Pointer Pose".
Hit play. The grab pointer will be slightly below and to the right of the motion controller gizmo, and the poke pointer will appear right at the origin.
Bonus: Improving the grab and poke pointers by using custom pointers
You can greatly improve the pointers by using custom pointers instead of the default ones. For example, you can:
have the poke pointer offset from the gizmo origin by setting the PokePointer's raycastOrigin field to a custom transform
add visuals to actually show where the pointers are
I've created an example that demonstrates a custom grab and poke pointer which visualizes the grab and poke locations, and also offsets the poke position to be more convenient. You can download a unitypackage of the sample here, or just clone the mrtktips repository and look at the VRGrabPokePointers scene.
Note: to get the visuals to actually show up, use the following script (pointers currently disable all renderers on startup to avoid flickering).
using UnityEngine;

public class EnableRenderers : MonoBehaviour
{
    void Start()
    {
        foreach (var renderer in GetComponentsInChildren<Renderer>())
        {
            renderer.enabled = true;
        }
    }
}
You can see an example of a custom MRTK profile and pointer profile in the example here, and also in the VRGrabPokePointersUnity scene.

How can I use two Audio Listeners?

The problem is that I have a character controller for the player with a camera, and that camera has an Audio Listener.
But I also have another camera, the Main Camera, which also has an Audio Listener.
The Main Camera uses a Cinemachine Brain and virtual cameras.
If I disable the Audio Listener on the Main Camera, the character in my cutscene will walk to a door, but when the door opens there is no sound of it opening.
And if I disable the player controller camera's Audio Listener, then when I move my player to a door, there is no sound when the player enters it.
I need both to work: while the cutscene character is walking through a door and the door is opening, the player can still walk around.
Screenshot of the player controller camera and the audio listener:
And this is the Main Camera Audio Listener screenshot:
So now, when running the game, the character medea_m_arrebola walks through a door by animation and there is a sound of the door opening and closing.
This is part of a cutscene that works in the background; the cutscene camera is not enabled yet, but you can hear the audio.
Later I will switch between the cameras to show parts of the cutscene.
But the FPSController (Player) is also active now, and the player can move around; when he walks through a door, the door opens, but there is no sound of it.
And if I enable both Audio Listeners, I get a warning message in the editor console saying that more than two audio listeners are enabled, etc.
This sounds like a design issue to me. Unity can only handle one AudioListener at a time. You basically have to construct your cutscene system to work with what Unity offers, or find some kind of workaround for your specific case.
You could try enabling/disabling your AudioListeners on the fly, or maybe use AudioSources around your player dedicated to directional audio input while in a cutscene (like a surround-sound setup made of empty objects). That way you could simulate two AudioListeners. The best solution would be to rework your system so that one AudioListener serves both inputs.
Maybe try a workaround first, but if it does not work 100% as intended, do the rework. It's worth it in the long run.
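The "enable/disable on the fly" idea can be sketched like this (field names are assumptions; call it from whatever starts and ends the cutscene):

```csharp
// Hypothetical switcher that keeps exactly one AudioListener enabled at a time.
using UnityEngine;

public class ListenerSwitcher : MonoBehaviour
{
    public AudioListener playerListener;    // on the player controller camera
    public AudioListener cutsceneListener;  // on the Main Camera

    public void SetCutsceneActive(bool cutscene)
    {
        // Unity warns if more than one AudioListener is enabled, so toggle as a pair.
        playerListener.enabled = !cutscene;
        cutsceneListener.enabled = cutscene;
    }
}
```

Note this swaps the listener rather than running two at once, so door sounds near the currently inactive camera will be heard from the active listener's position.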

Playing sound within a certain radius unity

I am looking for some way of making it so that the player can only hear the sounds as they get close to an object. In other words, a sound that only plays within a certain radius, or proximity sound, if this makes any sense.
If anyone knows of a way to do this in unity, any advice would be helpful!
Thanks,
Callum
You can use Unity's built-in 3D audio features to achieve that effect. If you attach an AudioSource to your desired object (the source of the sound) and assign the 3D audio sample you'd like to use as its AudioClip, you can edit its min/max distance and rolloff values to set the hearing radius. Note that in this case you will want the AudioListener attached to the player object, since the distance calculation is done between the source and the listener.
Alternatively, you could write a small script that casts a sphere to find whether the transform of the audio source is inside it, and change the volume etc. accordingly. Of course, you need to add colliders to your game objects in that case. See: http://docs.unity3d.com/ScriptReference/Physics.SphereCast.html
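A minimal sketch of the first approach, assuming the AudioSource on the object already has a looping clip assigned (the radius values are just examples):

```csharp
// Configure an AudioSource as a proximity sound: audible only within maxDistance.
using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class ProximitySound : MonoBehaviour
{
    public float minDistance = 1f;   // full volume inside this radius
    public float maxDistance = 10f;  // inaudible beyond this radius

    private void Start()
    {
        var source = GetComponent<AudioSource>();
        source.spatialBlend = 1f;                     // fully 3D, so distance matters
        source.rolloffMode = AudioRolloffMode.Linear; // volume reaches zero at maxDistance
        source.minDistance = minDistance;
        source.maxDistance = maxDistance;
        source.loop = true;
        source.Play();
    }
}
```

With the default logarithmic rolloff the sound never quite reaches zero, which is why the linear mode is used here for a hard hearing radius.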