Unity Input System event triggered without input

Using the "new" Unity Input System.
I have a start scene with a menu, and from there I can load the game scene, which contains a TouchManager. The TouchManager lets me control the player and visualize the force applied to the player.
To visualize the force I have a game object that starts disabled; the TouchManager enables the visualizer only while there is touch input.
The issue is that at the start of the game scene the touch visualizer is enabled, although after the first touch it starts working perfectly.
With debugging I can see that the event that signals "touch" is fired (without any touch), but the event that signals the release isn't.
Regarding the code, there is nothing particularly relevant as far as I am aware, so I'll briefly explain (scripts in order of execution):
GameManager:
Awake(): the game is set to pause;
Start(): the game is set to unpause;
TouchManager:
Pause(bool isPause): input events are subscribed and unsubscribed.
Move is the Input System action that is causing the issue.
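A minimal sketch of the subscribe/unsubscribe pattern described above (the class, action map, and field names are assumptions for illustration, not the actual project code):

using UnityEngine;
using UnityEngine.InputSystem;

public class TouchManager : MonoBehaviour
{
    [SerializeField] private GameObject forceVisualizer; // starts disabled in the scene
    private PlayerInputActions inputActions;             // assumed generated actions wrapper

    private void Awake()
    {
        inputActions = new PlayerInputActions();
    }

    public void Pause(bool isPause)
    {
        if (isPause)
        {
            // Unsubscribe and disable while paused.
            inputActions.Player.Move.started -= OnMoveStarted;
            inputActions.Player.Move.canceled -= OnMoveCanceled;
            inputActions.Player.Disable();
        }
        else
        {
            // Subscribe and enable on unpause.
            inputActions.Player.Move.started += OnMoveStarted;
            inputActions.Player.Move.canceled += OnMoveCanceled;
            inputActions.Player.Enable();
        }
    }

    // started fires on touch, canceled on release; the visualizer mirrors that state.
    private void OnMoveStarted(InputAction.CallbackContext ctx) => forceVisualizer.SetActive(true);
    private void OnMoveCanceled(InputAction.CallbackContext ctx) => forceVisualizer.SetActive(false);
}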
I tried disabling the visualizer, but it has to be done in Update, since the event that enables it is triggered after the Start method and I don't know how or when the touch event is triggered. I would also like to fix the issue at the source instead of covering it up.
Any ideas?

Related

Unity MRTK with HoloLens 2: How to detect whether a hand touches a GameObject in code?

I have been working with Unity 2020.3 LTS, the Windows XR Plugin, and the amazing MRTK 2.7.0 to port an existing application to HoloLens 2.
In this application I have a scene with several GameObjects in it, and I need to detect whether a hand touches a GameObject (either with the index-finger-tip near interaction or the pinch-gesture far interaction). The important part here is that this detection needs to happen in a central script in the scene (i.e. maybe have the hand as an object in the code) and not from the view of the touched GameObject itself.
I have successfully implemented the latter using this example with the two code examples below on that page, but the touched GameObject itself firing events via a listener does not work well with my use case. I need to detect the touch from the hand's perspective, so to speak.
I have searched the web and the Microsoft MRTK documentation several times for this and unfortunately I could not find anything remotely helpful. For head-gaze the documentation has a super simple code example that works beautifully: Head-gaze in Unity. I need the exact same thing for detecting when a hand touches a GameObject.
Eventually I will also need the same thing for eye-tracking when looking at a GameObject, but I have not looked into this yet and right now the hand interaction is giving me headaches. I hope someone can help me with this. Thanks in advance :).
but the touched GameObject itself firing events via a listener does not work well with my use case.
Why does the event not work? Could you provide more detail about it?
In addition to NearInteractionTouchable, have you tried the Interactable component? It is usually attached to the touched GameObject and fires its event receivers when it catches input actions. In the event receiver (in the component UI), you can add any function attached to any object as the listener, such as a central script in the scene. It should be an effortless way to meet your request. For more information please see: Event
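A rough sketch of wiring a central listener to Interactable components from code (assumes MRTK 2.7; the registration class and log message are illustrative):

using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

public class CentralInteractableListener : MonoBehaviour
{
    void Start()
    {
        // Register one central callback on every Interactable in the scene.
        foreach (Interactable interactable in FindObjectsOfType<Interactable>())
        {
            Interactable current = interactable;
            current.OnClick.AddListener(() => Debug.Log("Touched " + current.gameObject.name));
        }
    }
}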
After some additional fiddling around I was able to get it to work the way I want/need with the Touch Code Example. The solution was to create an empty GameObject variable in the central script that is continuously checked for null. A touch on a GameObject binds that object to the checked variable for as long as it is touched and sets it back to null once it is no longer touched. This allows the central script to work with the touched GameObject as long as it is touched.
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Attached to each touchable GameObject (the surrounding class and the
// centralScript/pointerHandler fields are omitted in the original snippet).
void Start()
{
    centralScript = GameObject.Find("Scripts").GetComponent<CentralScript>();

    // Make this object touchable by articulated hands, routed as pointer events.
    NearInteractionTouchableVolume touchable = gameObject.AddComponent<NearInteractionTouchableVolume>();
    touchable.EventsToReceive = TouchableEventType.Pointer;

    // While touched, register this object with the central script; clear on release.
    pointerHandler = gameObject.AddComponent<PointerHandler>();
    pointerHandler.OnPointerDown.AddListener((e) =>
    {
        centralScript.handTouchGameObject = gameObject;
    });
    pointerHandler.OnPointerUp.AddListener((e) =>
    {
        centralScript.handTouchGameObject = null;
    });
}
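On the central-script side, the polling described above might look like this (a minimal sketch; CentralScript and handTouchGameObject match the snippet above, everything else is assumed):

using UnityEngine;

public class CentralScript : MonoBehaviour
{
    // Set by a touched object's PointerHandler; null while nothing is touched.
    public GameObject handTouchGameObject;

    void Update()
    {
        if (handTouchGameObject != null)
        {
            // Work with the currently touched object from the central script.
            Debug.Log("Hand is touching " + handTouchGameObject.name);
        }
    }
}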

Kudan in Unity: how to stop or reset markerless tracking?

I am creating an application with Kudan where a photograph (a 2D sprite) appears via markerless tracking. Based on the sample project I've successfully made adjustments so that the 2D plane is always perpendicular to the camera and placed on the screen in the position I want. Really wonderful!
But I am unable to figure out how to restart/reset the tracking via a script. I can always force the tracking to restart by blocking the camera or shaking the phone, but I want to do it via a button. It is exactly the same behavior described in the "ArbiTrack Basics" guide for Android and iOS, but I am unable to reproduce it in Unity. To what script should I send a stop-tracking command in order to restart the tracking instance (exactly the same effect as blocking the camera when running one of the sample Unity projects in Markerless Mode)?
The situation is described here for Android coding: https://wiki.kudan.eu/ArbiTrack_Basics#Stopping_ArbiTrack
where it says to call these three things:
// Stop ArbiTrack
arbiTrack.stop();
// Display target node
arbiTrack.getTargetNode().setVisible(true);
//Change enum and label to reflect ArbiTrack state
arbitrack_state = ARBITRACK_STATE.ARBI_PLACEMENT;
I have found one way to do this, though I'm not sure it's ideal.
Looking in the TrackingMethodMarkerless.cs script, it seems that StopTracking() does not work: it stops the tracking from updating but doesn't actually disable the detection instance. Taking a note from it, I added an if statement to the ProcessFrame() function:
if (disableMarkerless == false)
    trackable.isDetected = _kudanTracker.ArbiTrackIsTracking();
else
    trackable.isDetected = false;
Now, changing the disableMarkerless bool disables the tracking.
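To drive that from a button, something like the following could work (a sketch; it assumes the disableMarkerless field is exposed publicly on TrackingMethodMarkerless, and that one disabled frame is enough to force a restart):

using System.Collections;
using UnityEngine;

public class ArbiTrackReset : MonoBehaviour
{
    public TrackingMethodMarkerless markerlessTracking; // assign in the Inspector

    // Hook this up to a UI Button's OnClick.
    public void RestartTracking()
    {
        StartCoroutine(ToggleDetection());
    }

    private IEnumerator ToggleDetection()
    {
        markerlessTracking.disableMarkerless = true;  // detection reports "not detected"
        yield return null;                            // let at least one frame process
        markerlessTracking.disableMarkerless = false; // re-enable detection
    }
}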

Swift SpriteKit Modify Action

I am making an iOS game in Swift with SpriteKit. I have a player sprite which I want to rotate and move towards where a touch is on the screen. Currently I get the angle, create the action to turn, run the action, and do the same for the movement. This works well for a single touch; however, I now want to do the same when a touch moves. First I tried removing the action and then running the new one, but the sprite jitters or does not move at all, because the action is being cancelled very soon after being created. I have also tried running it every 100 ms, but I still do not get smooth movement.
So I was wondering: is there any way to modify an action as it is running? Or what is the right way of doing this?
You can override the didEvaluateActions() method in your scene class; it gets called after all actions have been evaluated in an update frame. In that method, remove the old action and start your new one. If you are still seeing jitter, then you need to reevaluate when you are removing actions.
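A minimal sketch of that approach in a SpriteKit scene (the action key, durations, and node names are illustrative):

import SpriteKit

class GameScene: SKScene {
    let player = SKSpriteNode(imageNamed: "player")
    var targetPoint: CGPoint?

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Just record the latest touch position; don't touch actions here.
        targetPoint = touches.first?.location(in: self)
    }

    override func didEvaluateActions() {
        // Called after actions are evaluated each frame: a safe point to swap actions.
        guard let target = targetPoint else { return }
        player.removeAction(forKey: "move")
        let angle = atan2(target.y - player.position.y, target.x - player.position.x)
        let rotate = SKAction.rotate(toAngle: angle, duration: 0.1, shortestUnitArc: true)
        let move = SKAction.move(to: target, duration: 0.5)
        player.run(SKAction.group([rotate, move]), withKey: "move")
        targetPoint = nil
    }
}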

Unity3D build freezes unless mouse is moved

I have a simple Surface 2.0 (PixelSense) application that sends UDP messages to my Unity3D game when there are touch events, since Unity doesn't support .NET 4. When I run the game in the Unity editor everything works fine, but when I run the actual Unity build, the game freezes unless I move the mouse (keyboard input doesn't unfreeze it). Once it unfreezes, all the updates happen at once.
I also tried running a simple Unity build with a sprite that translates along the X axis and starts over once it is no longer being rendered (this build does not receive UDP messages). It runs at the same time as the Surface app, so it is also running in the background, and it also freezes unless I move my mouse, so I don't think it is a networking issue.
Run In Background is checked, and I also set it in a script:
private void Awake()
{
    Application.runInBackground = true;
}
Visible in background is also checked in the Player Settings.
I run the game windowed at full resolution.
Any ideas about this problem?

EventTrigger component only works with UI system?

I'm new to Unity and am using version 4.6.
I have a prefab which is just a sprite, and I instantiate three of them.
I want to receive touch and mouse events from those instances.
So I added an EventTrigger component to them from a C# script, and added a ClickEvent.
I also added IPointerClickHandler and implemented it.
But it never gets any events. What am I missing?
Does EventTrigger only work with the UI system? I did not add any UI Panel, Canvas, etc. (instead I added an empty object and added an EventSystem component to it).
If this is not a good way to do, please give me a direction to start.
Thanks in advance.
Below is for mobile devices.
If you want to do in-game controls without UI, you can use Input.
This is the Unity documentation.
It points to TouchPhase, and all of the touch phases can be found there.
The Unity documentation has good example code; I think you should check it.
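A minimal polling sketch using the legacy Input touch API (the raycast into the 2D scene is an assumption about the setup):

using UnityEngine;

public class TouchDetector : MonoBehaviour
{
    void Update()
    {
        for (int i = 0; i < Input.touchCount; i++)
        {
            Touch touch = Input.GetTouch(i);
            if (touch.phase == TouchPhase.Began)
            {
                // See which 2D collider, if any, sits under the touch.
                Vector2 worldPoint = Camera.main.ScreenToWorldPoint(touch.position);
                Collider2D hit = Physics2D.OverlapPoint(worldPoint);
                if (hit != null)
                    Debug.Log("Touched " + hit.name);
            }
        }
    }
}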
For the GUI system to work, there must also be an EventSystem object in your scene.
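For a plain sprite (no Canvas), the pieces that need to be in place are roughly: an EventSystem in the scene, a Physics2DRaycaster on the camera, and a Collider2D on the sprite. Then a handler like this sketch receives clicks and taps:

using UnityEngine;
using UnityEngine.EventSystems;

public class SpriteClickHandler : MonoBehaviour, IPointerClickHandler
{
    public void OnPointerClick(PointerEventData eventData)
    {
        Debug.Log(name + " was clicked");
    }
}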
You can do it the easy way too: from the editor, add a Collider2D to the sprite.
Then write this in a script attached to the GameObject:
void OnMouseDown()
{
    // do something
}
Unity also converts touch events to mouse events on touch devices, so you may not need to worry about handling simple touch events separately.