Does the EventTrigger component only work with the UI system?

I'm new to Unity and using version 4.6.
I have a prefab which is just a sprite, and I instantiate three of them.
I want to receive touch and mouse events from those instances.
So I added an EventTrigger component to them from a C# script and added a click event.
I also added IPointerClickHandler and implemented it.
But it never gets any event. What am I missing?
Does the event system only work with the UI system? I did not add any UI Panel or Canvas (instead, I added an empty object and attached an EventSystem component to it).
If this is not a good way to do it, please point me in a direction to start.
Thanks in advance.

The following is for mobile devices.
If you want to do in-game controls without the UI system, you can use the Input class.
The Unity documentation covers this; it points to TouchPhase, and all of the touch phases can be found there.
The Unity documentation also has good example code; I think you should check it out.
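For example, a minimal sketch of polling touches through the Input class (the class name and log messages are just illustrative):

using UnityEngine;

public class TouchLogger : MonoBehaviour
{
    void Update()
    {
        // Iterate over all active touches this frame.
        for (int i = 0; i < Input.touchCount; i++)
        {
            Touch touch = Input.GetTouch(i);

            switch (touch.phase)
            {
                case TouchPhase.Began:
                    Debug.Log("Touch began at " + touch.position);
                    break;
                case TouchPhase.Moved:
                    Debug.Log("Touch moved by " + touch.deltaPosition);
                    break;
                case TouchPhase.Ended:
                    Debug.Log("Touch ended");
                    break;
            }
        }
    }
}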

For the GUI system to work, there must also be an EventSystem object in your scene.
You can also do it the easy way: from the editor, add a Collider2D to the sprite.
Then write this in a script and attach it to the GameObject:
void OnMouseDown() {
    // do something
}
Unity also maps touch input to mouse events on touch devices, so you may not need to worry about handling simple touch events separately.
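For completeness, a rough sketch of the full script (the class name and log message are placeholders):

using UnityEngine;

// Attach to the sprite; requires a Collider2D on the same GameObject.
public class SpriteClickHandler : MonoBehaviour
{
    void OnMouseDown()
    {
        // Fires on mouse click and, via Unity's mouse simulation, on a tap.
        Debug.Log(name + " was clicked");
    }
}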

Related

Unity MRTK with HoloLens 2: How to detect whether a hand touches a GameObject in code?

I have been working with Unity 2020.3 LTS, the Windows XR Plugin, and the amazing MRTK 2.7.0 to port an existing application to HoloLens 2.
In this application I have a scene with several GameObjects in it, and I need to detect whether a hand touches a GameObject (either with the index-finger-tip near interaction or the pinch-gesture far interaction). The important part here is that this detection needs to happen in a central script in the scene (i.e. maybe have the hand as an object in the code) and not from the view of the touched GameObject itself.
I have successfully implemented the latter using this example with the two code examples below on that page, but the touched GameObject itself firing events via a listener does not work well with my use case. I need to detect the touch from the hand's perspective, so to speak.
I have searched the web and the Microsoft MRTK documentation several times for this and unfortunately I could not find anything remotely helpful. For head-gaze the documentation has a super simple code example that works beautifully: Head-gaze in Unity. I need the exact same thing for detecting when a hand touches a GameObject.
Eventually I will also need the same thing for eye-tracking when looking at a GameObject, but I have not looked into this yet and right now the hand interaction is giving me headaches. I hope someone can help me with this. Thanks in advance :).
but the touched GameObject itself firing events via a listener does not work well with my use case.
Why does the event not work? Could you provide more detail about it?
In addition to NearInteractionTouchable, have you tried the Interactable component? It is usually attached to the touched GameObject and fires its event receivers when it catches input actions. In the event receiver (in the component UI), you can add any function attached to any object as the listener, such as a central script in the scene. It should be an effortless way to meet your request. For more information, please see: Event
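For reference, a rough sketch of wiring this up from code, based on the MRTK 2.x Interactable API (the class name and log messages are placeholders; verify the calls against your MRTK version):

using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

public class TouchableButton : MonoBehaviour
{
    void Start()
    {
        var interactable = gameObject.AddComponent<Interactable>();

        // Forward focus changes to any listener, e.g. a central script.
        var focusReceiver = interactable.AddReceiver<InteractableOnFocusReceiver>();
        focusReceiver.OnFocusOn.AddListener(() => Debug.Log(name + " focused"));
        focusReceiver.OnFocusOff.AddListener(() => Debug.Log(name + " unfocused"));

        // OnClick fires for both near touches and far (pinch) selections.
        interactable.OnClick.AddListener(() => Debug.Log(name + " selected"));
    }
}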
After some additional fiddling around, I was able to get it to work the way I want/need with the Touch Code Example. The solution was to create an empty GameObject variable in the central script that is continuously checked for null. The touch handler on the GameObject itself binds the object to that variable for as long as it is touched and sets it back to null once it is no longer touched. This allows the central script to work with the touched GameObject for as long as it is touched.
void Start()
{
    centralScript = GameObject.Find("Scripts").GetComponent<CentralScript>();

    // Make the whole collider volume receive pointer events.
    NearInteractionTouchableVolume touchable = gameObject.AddComponent<NearInteractionTouchableVolume>();
    touchable.EventsToReceive = TouchableEventType.Pointer;

    pointerHandler = gameObject.AddComponent<PointerHandler>();

    // Bind this object to the central script while it is being touched.
    pointerHandler.OnPointerDown.AddListener((e) =>
    {
        centralScript.handTouchGameObject = gameObject;
    });
    pointerHandler.OnPointerUp.AddListener((e) =>
    {
        centralScript.handTouchGameObject = null;
    });
}
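The central-script side of this pattern might then look roughly like this (the class and field names follow the snippet above; what you do with the touched object is up to you):

using UnityEngine;

public class CentralScript : MonoBehaviour
{
    // Set by the touched object's PointerHandler while it is being touched.
    public GameObject handTouchGameObject;

    void Update()
    {
        if (handTouchGameObject != null)
        {
            // A hand is currently touching this object; react to it here.
            Debug.Log("Hand is touching " + handTouchGameObject.name);
        }
    }
}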

How do I implement a raycast / laser pointer on an Oculus controller?

I want to implement a graphic raycaster / laser pointer on the left Oculus controller so I can interact with UI buttons in Unity.
I have seen a lot of tutorials, etc., but nothing has helped.
I want a laser beam or laser pointer / graphic raycast to shoot out from the Oculus controller when a button is pressed on the controller. I need the laser beam to interact with UI buttons in Unity.
You can use a normal raycast.
I recommend the following:
Create a script on your hand object, and add a Line Renderer component to it.
In the script, grab a reference to the LineRenderer component.
Perform a simple raycast.
Get the position of the hit object.
Set the line's first position to the hand itself and the second to the hit object, like this:
lineRenderer.SetPosition(0, transform.position);
lineRenderer.SetPosition(1, hitObject.transform.position);
This draws a line from your hand to the hit object. Remember to tweak the LineRenderer parameters to make the line look nice.
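Putting those steps together, a minimal sketch might look like this (attach it to the hand/controller object; the maximum distance and the use of hit.point instead of the hit object's transform are illustrative choices):

using UnityEngine;

[RequireComponent(typeof(LineRenderer))]
public class LaserPointer : MonoBehaviour
{
    public float maxDistance = 10f;
    private LineRenderer lineRenderer;

    void Start()
    {
        lineRenderer = GetComponent<LineRenderer>();
    }

    void Update()
    {
        // The beam always starts at the hand/controller.
        lineRenderer.SetPosition(0, transform.position);

        // Cast forward from the controller and end the beam where it hits.
        if (Physics.Raycast(transform.position, transform.forward, out RaycastHit hit, maxDistance))
        {
            lineRenderer.SetPosition(1, hit.point);
        }
        else
        {
            lineRenderer.SetPosition(1, transform.position + transform.forward * maxDistance);
        }
    }
}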
Hope it helps.
I created a project that uses Unity's event system. It is a laser pointer with which you can interact with Unity's UI and 3D objects in the scene. If you want to check it out, here's the link: https://github.com/balataca/oculus-laser-pointer

Unity OnMouseEnter() or OnMouseOver() not working

I want to make an inventory system in Unity, so I tried to follow this tutorial, but the OnMouseEnter and OnMouseOver functions are not working.
I tried everything: 3D colliders with a z-value of 100, with trigger and without, and I also checked that Physics.queriesHitTriggers is true, but nothing works. Do you have any easy tips?
Not with raycasting, please... I'm quite new and don't understand that.
Try adding a Box Collider or Box Collider 2D.
Try adding a Rigidbody alongside the collider; you can make it kinematic.
Try adding an EventSystem if you are using canvas UI.
Also verify that there is no object, such as a canvas, blocking the camera's raycast.
Good luck.
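As a quick test, a minimal script for the OnMouse callbacks, assuming the GameObject has a Collider or Collider2D on it (the log messages are placeholders):

using UnityEngine;

public class HoverTest : MonoBehaviour
{
    void OnMouseEnter()
    {
        // Called once when the cursor first moves over the collider.
        Debug.Log("Mouse entered " + name);
    }

    void OnMouseOver()
    {
        // Called every frame while the cursor stays over the collider.
        Debug.Log("Mouse over " + name);
    }
}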
The GameObject you're trying to use the mouse with needs to have at least one component that is a raycast target; an Image or SpriteRenderer should do it.
One thing I found out is that OnMouseOver only works in the Game tab, not in the Scene tab. :)
It's much easier and cleaner to use the Unity UI system to make an inventory: just create a Canvas with an Image inside it, add an EventTrigger component to that Image object, and modify its events in the inspector; you can add any functions you want on any event it supports. It is a much cleaner solution. Try this tutorial: https://www.youtube.com/watch?v=HZpq46W4xo4
You can do whatever you want with this system; it just takes a little bit of thinking and planning your own inventory requirements...
Best of luck!
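If you prefer wiring the EventTrigger up from code instead of the inspector, a rough sketch (the class name and log message are placeholders):

using UnityEngine;
using UnityEngine.EventSystems;

// Attach to a UI Image under a Canvas; the scene also needs an EventSystem.
public class SlotClickHandler : MonoBehaviour
{
    void Start()
    {
        var trigger = gameObject.AddComponent<EventTrigger>();

        var entry = new EventTrigger.Entry { eventID = EventTriggerType.PointerClick };
        entry.callback.AddListener((data) => Debug.Log("Slot clicked"));

        trigger.triggers.Add(entry);
    }
}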

Cannot Animate Interactable Gameobject using Mixed Reality Toolkit

I am a bit of a novice with the Unity Engine and Mixed Reality App development so please bear with me.
I have been working with the Microsoft Mixed Reality Toolkit for Unity to try and animate a game object and move it to the side. A simple action, very similar to an example scene provided by Microsoft with the toolkit, called "InteractableObject" (information links provided below):
Interactable Object - Mixed Reality (Microsoft Docs)
Mixed Reality Toolkit-Unity Interactable Objects and Receivers (Github)
This example scene in Unity has multiple objects to be used as "buttons". With the Mixed Reality Toolkit, even an object that you want the user to interact with to perform some action when selected is considered a button, at least according to the documentation I have been able to find on the subject. This is a series of screenshots depicting the inspector panels for my GameObject and the container for my object:
GameObject Inspector Panel
GameObject Container Inspector Panel (Part 1)
GameObject Container Inspector Panel (Part 2)
I am trying to make a single game object move to the side when I place the standard cursor on it. The same action is done with a balloon object in the example scene I mentioned. I have created the animator and the state machine the same way they did in their example, and set up my game object in an almost identical format. The only real difference is that they created a balloon object themselves, while I am using a different set of custom models from my company.
When I attempt to play the app in the Unity Editor, the state does not change when I place the cursor on the object. I can force the state to change using the editor, and the required animation plays, but it will not change state on its own. I configured my state machine the same as the Microsoft example and set up my state variable the same way as well. It should move from an "Observation" state to a "Targeted" or "ObservationTargeted" state when the cursor moves onto the object. A screenshot of the GameObject state machine and the inspector panel of the specific transition in question are provided below:
GameObject Animator State Machine Setup
Observation to ObservationTargeted Transition Inspector Panel
I went through and verified that all the components added by the Mixed Reality Toolkit are the same, and they are. This includes the DefaultCursor, InputManager, MixedRealityCameraParent, and Directional Light. I also checked that all the scripts are coded the same, and they are. I am running out of places to look. I attached the Visual Studio debugger to the project in Unity and verified that the state just isn't changing on its own, but I cannot figure out why. I believe the problem has something to do with the setup of the transition, but I haven't been able to find the issue. All of the other mentioned components are provided by Microsoft and are not changed by me, nor are they changed in the sample scene.
If anyone else has had a similar problem or knows where I can look to find the problem, please let me know. I haven't even built the project into a UWP application yet.
I know it's been a few months, but are you still looking for a solution?
With the newest version of the Mixed Reality Toolkit you can make any GameObject act as a button. Simply read this documentation. I have some cubes as buttons in my Unity project, and the only extra component I added to make them work was Interactable, which comes from the Mixed Reality Toolkit.
If you want to trigger an animation when you place the cursor on the object (or look at it, if you're going to use it with HoloLens), you can add it on the Interactable object by adding a new event (for example, an OnFocus() event).
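As an alternative to the Interactable route, you can also implement MRTK's IMixedRealityFocusHandler on the object and drive the Animator from there. A rough sketch (the "Targeted" animator parameter is a placeholder matching the state machine described in the question):

using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class FocusAnimation : MonoBehaviour, IMixedRealityFocusHandler
{
    private Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    public void OnFocusEnter(FocusEventData eventData)
    {
        // The cursor moved onto the object: play the targeted animation.
        animator.SetBool("Targeted", true);
    }

    public void OnFocusExit(FocusEventData eventData)
    {
        animator.SetBool("Targeted", false);
    }
}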
Hope this helps in any way.

Unity - select a 2D game object with touch on mobile

I want to know the best way to select a 2D object in Unity on mobile.
For example, I want to select one object and have a menu show up for upgrades.
I already searched but didn't find anything good.
If you could provide both code and good references, that would be nice.
You can add a Button component to that object and add an OnClick() function call to it:
See the sample in the Unity docs.
Or you can use touches to have control over the different phases:
See the sample in the Unity tutorials.
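For the touch route, a rough sketch of picking a 2D object under the finger (requires a Collider2D on the selectable objects; the menu call is left as a placeholder):

using UnityEngine;

public class TouchSelector2D : MonoBehaviour
{
    void Update()
    {
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            // Convert the screen touch to a world point and test 2D colliders there.
            Vector2 worldPoint = Camera.main.ScreenToWorldPoint(Input.GetTouch(0).position);
            Collider2D hit = Physics2D.OverlapPoint(worldPoint);

            if (hit != null)
            {
                Debug.Log("Selected " + hit.gameObject.name);
                // e.g. show your upgrade menu for hit.gameObject here
            }
        }
    }
}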
Alternatively, you can add a Button in place of your GameObject, set the button's sprite to your GameObject's sprite, and then write a function on the button that shows your menu.