I have an EventSystem with two input modules (gaze, for Cardboard, and touch). The gaze input is above the touch input, so Unity uses it as the main input module. Now I have one object that I want to trigger on touch input, but that is not working because of the gaze input. So my question is: is it possible to disable the gaze input for just this one GameObject?
EDIT: the object is a menu button, located in the bottom-right corner. It moves with the camera.
thanks
Use a raycast. When your camera is looking at the GameObject on which you want to use touch input, detect via the raycast that the user is looking at that object.
Attach a script (to the camera or an empty GameObject) that holds references to both input modules.
When the player looks at the object (detected through the raycast), simply disable the gaze input.
And when the player looks away from that GameObject, enable the gaze input again; a sketch of this is below.
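Here is a minimal sketch of that idea, assuming the gaze module derives from BaseInputModule and the menu button has a collider the raycast can hit; the class and field names are illustrative:

using UnityEngine;
using UnityEngine.EventSystems;

public class GazeInputToggler : MonoBehaviour
{
    public BaseInputModule gazeInputModule; // drag the gaze input module here
    public Camera cam;                      // the Cardboard camera
    public GameObject touchOnlyObject;      // the menu button

    void Update()
    {
        bool lookingAtObject = false;
        RaycastHit hit;
        if (Physics.Raycast(cam.transform.position, cam.transform.forward, out hit))
            lookingAtObject = hit.collider.gameObject == touchOnlyObject;

        // While the module is disabled, the EventSystem falls through
        // to the next module in the list, i.e. the touch input.
        gazeInputModule.enabled = !lookingAtObject;
    }
}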
In the project I have added player movement with the joystick for both the right and the left controller.
There are objects with the XRGrabInteractable script attached so that they are grabbable when the ray is pointed at them and the trigger is pressed.
When I use the trigger button and pick up one of the objects, it is grabbed.
But when I use the joystick to move the player, the grabbed object in the hand also moves and rotates.
Is it possible to disable the movement of the grabbed object when the joystick is used for player movement?
Unity Version: 2021.3.1f
Device: Pico Neo 3
XR Interaction Toolkit Version: 2.2.0
I am checking whether there is any default setting to disable movement of the grabbed object with the joystick.
Thanks in advance
Yes, you have to uncheck the Anchor Control checkbox on the XR Ray Interactor component.
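If you prefer to do it from code, the same setting is exposed (in XRI 2.x, as far as I know) as the allowAnchorControl property on the XR Ray Interactor; a minimal sketch:

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class DisableAnchorControl : MonoBehaviour
{
    void Start()
    {
        // Same effect as unchecking "Anchor Control" in the Inspector:
        // the joystick no longer translates/rotates the grabbed object.
        GetComponent<XRRayInteractor>().allowAnchorControl = false;
    }
}

Attach it to the GameObject that has the XR Ray Interactor (usually the controller).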
I want to implement a graphic raycaster/laser pointer on the left Oculus controller, so I can interact with UI buttons in Unity.
I have seen a lot of tutorials, but nothing has helped.
I want a laser beam or laser pointer/graphic raycast to shoot out from the Oculus controller when a button is pressed on the controller. I need the laser beam to interact with UI buttons in Unity.
You can use a normal raycast.
I recommend you do this:
Create a script on your hand, and add a LineRenderer component.
In the script, grab a reference to the LineRenderer component.
Do a simple raycast and get a hit.
Get the position of the hit object.
Set the first position to the actual hand and the second to the hit object, like this:
lineRenderer.SetPosition(0, transform.position);
lineRenderer.SetPosition(1, hitObject.transform.position);
That draws a line from your hand to the hit object. Remember to tweak the LineRenderer's parameters to make a beautiful line. A fuller sketch is below.
Hope it helps.
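Putting the steps together, a self-contained sketch (the maxDistance field and the hit.point endpoint are my additions, not from the steps above):

using UnityEngine;

[RequireComponent(typeof(LineRenderer))]
public class LaserPointer : MonoBehaviour
{
    public float maxDistance = 10f;
    private LineRenderer lineRenderer;

    void Start()
    {
        lineRenderer = GetComponent<LineRenderer>();
        lineRenderer.positionCount = 2; // one segment: hand -> hit
    }

    void Update()
    {
        // Default endpoint if the ray hits nothing.
        Vector3 end = transform.position + transform.forward * maxDistance;

        RaycastHit hit;
        if (Physics.Raycast(transform.position, transform.forward, out hit, maxDistance))
            end = hit.point; // stop the beam where it hits

        lineRenderer.SetPosition(0, transform.position); // start at the hand
        lineRenderer.SetPosition(1, end);                // end at the hit point
    }
}

Note that a plain Physics.Raycast only hits colliders; for clicking UI buttons you still need something on the canvas side (e.g. a Graphic Raycaster, or a world-space canvas the pointer can reach).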
I created a project that uses Unity's event system. It is a laser pointer with which you can interact with Unity's UI and 3D objects in the scene. If you want to check it out, here's the link: https://github.com/balataca/oculus-laser-pointer
I'm new to Unity and C#.
When I air-tap on the left sphere, the plane and the right sphere should appear.
I figured that with an OnPointerClick event on the left sphere, the animation of the plane and the right sphere could be triggered (with an if/else statement).
Is this the way to go?
If so, how do I do that?
Or is there an easier method?
That's the script of the left sphere:
Read this: IPointerClickHandler.OnPointerClick
You need to implement the interface IPointerClickHandler in your MonoBehaviour class. When you click on the object in game, a click event is sent to the EventSystem, which triggers the body of your OnPointerClick method.
As the docs mention, also make sure you have an EventSystem in the scene (an empty GameObject with the script attached, or attach the component to your main camera).
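A minimal sketch for the left sphere, assuming the plane and right sphere are assigned in the Inspector and start inactive; the sphere also needs a collider and the camera a Physics Raycaster for clicks on a 3D object to register:

using UnityEngine;
using UnityEngine.EventSystems;

public class LeftSphereClick : MonoBehaviour, IPointerClickHandler
{
    public GameObject plane;
    public GameObject rightSphere;

    public void OnPointerClick(PointerEventData eventData)
    {
        // Reveal the other objects when the sphere is air-tapped/clicked.
        plane.SetActive(true);
        rightSphere.SetActive(true);
    }
}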
I made a button that is clickable during gameplay. As I want the button to have a fixed position relative to the main camera, I made both the main camera and the button children of the player GameObject, so that the camera follows the character while it jumps or moves. Everything works fine, but there are also border colliders which prevent the character from moving out of the playing area. The collider on the button, which was added to make the button clickable, also collides with the border, and that stops the character from moving forward. If I make the button's collider a trigger, the button seems to be triggered wherever I click the mouse on the screen, which is not what I want.
I know I could prevent this by checking whether the collided object is the button or the character, but is there a better way to do that? Thanks.
For a 2D platform game, I would add a 2D user interface in a canvas over the "map" (the layer where you have the GameObjects like the character, platforms, enemies...). That way the button will always be in the same place on the screen and will never collide with any GameObject in the game.
You may take some ideas from here: https://unity3d.com/learn/tutorials/topics/user-interface-ui/ui-events-and-event-triggers
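For example, a Button on a Screen Space - Overlay canvas stays fixed on screen, needs no collider at all, and only fires when the pointer is actually over it. A small sketch (the JumpButton name and the logged action are placeholders):

using UnityEngine;
using UnityEngine.UI;

public class JumpButton : MonoBehaviour
{
    public Button button; // the UI Button on the canvas

    void Start()
    {
        button.onClick.AddListener(OnJumpPressed);
    }

    void OnJumpPressed()
    {
        Debug.Log("Jump pressed"); // replace with your jump/action logic
    }
}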
Try using the new UI in Unity; it may fix your issue:
https://unity3d.com/learn/tutorials/topics/user-interface-ui/ui-button
I have been trying to add touch events to a 3D GameObject in Unity.
Previously I was not using any Canvas or Panel, so by using an Event Trigger and an Event System I was able to add touch events to the GameObject, but then I wanted to use a UI for the application and implemented this hierarchy:
Camera
GameObject
Canvas
  Panel (Transparent)
    - Buttons
  Panel2
EventSystem
So if I tap a part, it does not respond.
The camera does have a Physics Raycaster.
The GameObjects have colliders and mesh renderers.
I want touch input from the mobile device.
Thanks
If two buttons overlap each other, the one rendered on top by the camera is the one that will be detected.
So try changing the hierarchy order (remember, the last one in the hierarchy is always rendered on top); that way you can control which one receives the click.
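Separately, a common cause of 3D objects not responding behind a UI (not part of the answer above, just a complementary suggestion): if the transparent panel's Image is intercepting the taps, disabling its Raycast Target lets the Physics Raycaster reach the objects behind it. A sketch:

using UnityEngine;
using UnityEngine.UI;

public class PassThroughPanel : MonoBehaviour
{
    void Awake()
    {
        // The panel is still drawn, but no longer blocks pointer raycasts.
        GetComponent<Image>().raycastTarget = false;
    }
}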