Unity AR: How can I trigger a button in the scene?

I have this little AR Unity app for Android, and I'm trying to play a video with a button. But the button should not be on the screen (canvas); it should be in the scene itself (understandable?). So I made a 3D plane that simply starts the video on a pointer-down event, but it won't work. I know it's possible to do this, but I have no idea how, so some help would be appreciated.

Use a world-space canvas:
Create a canvas and set its Render Mode to "World Space".
You can put your button in that canvas and give it a 3D position, just as you would with your plane, but you keep normal UI functionality such as buttons instead of implementing click detection yourself.
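As a minimal sketch of that idea (assuming the video is played through a VideoPlayer component; the field names here are illustrative, not part of the question's project):

```csharp
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

// Attach to any object and assign the references in the inspector.
// The VideoPlayer reference is an assumption -- swap in whatever
// component actually plays your video.
public class PlayVideoButton : MonoBehaviour
{
    public Button playButton;   // the Button on the world-space canvas
    public VideoPlayer video;   // hypothetical video component to start

    void Start()
    {
        // Wire the click handler in code instead of the inspector.
        playButton.onClick.AddListener(() => video.Play());
    }
}
```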

If you want to trigger a Pointer Down event, just follow the steps below.
STEPS
Add a Physics Raycaster to your camera as shown below.
Add an EventSystem object to your scene:
Right-click in the Hierarchy window -> UI -> Event System
Attach an Event Trigger component to the required plane:
Select Plane -> Add Component -> Event Trigger -> Add Event of type Pointer Down
Now call the required method via the Pointer Down event (in my case, Plane.OnSelected()).
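The same Event Trigger entry can also be built from code rather than the inspector; here is a hedged sketch (the OnSelected name mirrors the steps above and is illustrative):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Builds at runtime the same Pointer Down entry that the inspector
// steps create. Attach to the plane, which must have a Collider.
public class PlaneEventWiring : MonoBehaviour
{
    void Start()
    {
        var trigger = gameObject.AddComponent<EventTrigger>();
        var entry = new EventTrigger.Entry { eventID = EventTriggerType.PointerDown };
        entry.callback.AddListener(_ => OnSelected());
        trigger.triggers.Add(entry);
    }

    void OnSelected()
    {
        Debug.Log("Plane selected");  // e.g. start the video here
    }
}
```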
ALTERNATIVE TO EVENT TRIGGER
Follow steps 1 and 2 as above.
Attach the script below to that plane:
using UnityEngine;
using UnityEngine.EventSystems;

public class Plane : MonoBehaviour, IPointerDownHandler
{
    // Called by the EventSystem when the plane's collider is pressed.
    public void OnPointerDown(PointerEventData eventData)
    {
        Debug.Log("CALLED");
    }
}
This method will be called on pointer down on the required plane.
As seen in the code above, we are using the IPointerDownHandler interface from UnityEngine.EventSystems.
I am sure this will solve your problem.

Related

IPointerClickHandler doesn't work on a Unity game object

I have a Unity 2D project, and I want to detect a left-click event on the player game object.
I added a Physics2DRaycaster to the MainCamera (see the first screen below).
I added two colliders to the game object (the circle collider is for detecting collisions; the box collider is set as a trigger and used for click detection through IPointer) (see the second screen below).
I implemented the interfaces IPointerClickHandler, IPointerEnterHandler, and IPointerDownHandler in the game object's script (see the code below).
I have an EventSystem object in the scene (see the third and fourth screens below).
I checked in a simple project that this combination works.
But in my project, click events aren't detected at all, and I have no idea why. I think the raycast itself doesn't work, because when I look at the EventSystem logs in the inspector, I see that the player object isn't detected.
UPDATE: I noticed that if I switch off the canvas, the IPointer events work. But the canvas elements are placed only around the player gameObject (a menu button), and the game object isn't hidden by the canvas.
UPDATE 2: OK, I figured out that even when the canvas panel is transparent, it still blocks the game objects behind it. So my question is: how can I detect a click on a game object if a canvas panel is in front of it?
This is the script of the game object:
public class Player : MovingCharacter, IPointerClickHandler, IPointerEnterHandler, IPointerDownHandler
{
    public void OnPointerDown(PointerEventData eventData)
    {
        Debug.Log("Down");
    }

    public void OnPointerClick(PointerEventData eventData)
    {
        Debug.Log("Clicked");
    }

    public void OnPointerEnter(PointerEventData eventData)
    {
        Debug.Log("Entered");
    }

    //other code......
}
This is the MainCamera:
This is my Game Object (Player):
This is the EventSystem:
And this is the project structure:
Use layers!
If you want to be able to hit the object, put your menu on a dedicated layer, e.g. UI, and make sure the Physics2DRaycaster ignores that layer.
The UI will be interactable anyway, since the Canvas's own raycaster already takes care of UI elements.
In general, I would also make sure to mark only those elements as Raycast Target that are actually interactable (e.g. buttons). That way an invisible background panel won't block input either.
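A quick sketch of that layer setup in code (assuming the menu objects sit on a layer named "UI"; you can equally set the mask in the inspector):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Attach next to the Physics2DRaycaster on the camera. Excludes the
// "UI" layer from the raycaster's event mask so clicks pass through
// the menu and reach the 2D colliders behind it.
public class IgnoreUiLayer : MonoBehaviour
{
    void Awake()
    {
        var raycaster = GetComponent<Physics2DRaycaster>();
        raycaster.eventMask = ~LayerMask.GetMask("UI");
    }
}
```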

How to animate grabbing in Unity VR?

I've been googling this for 10 hours, and I'm running out of ideas. I found several video and written tutorials, but none of them really works, or they're overkill.
I have a VR Unity (2020.3.16f) project designed to run on a Quest 2. I'm not using OpenXR. I have already created a hand, created one simple grabbing animation, added the animation to an Animator, created transitions, and set an "IsGrabbed" parameter. Now I'm looking for a simple way to change "IsGrabbed" to true/false whenever I grab or release anything. I'm expecting something like this:
public class grabber : MonoBehaviour
{
    Animator animator;
    ???

    // Start is called before the first frame update
    void Start()
    {
        ???
    }

    // Update is called once per frame
    void Update()
    {
        if (???)
        {
            animator.SetBool("IsGrabbing", true);
        }
        else if (???)
        {
            animator.SetBool("IsGrabbing", false);
        }
    }
}
Please help. We're talking about VR here, so I'm sure a grabbing animation is the very basics of the very basics. It can't be that difficult.
Best regards
First of all, I highly recommend watching this video by Justin P Barnett to get a much better overview of how this all works.
If you don't want to go that route for some reason, there are other options available to you. One such option is the "Player Input" component, which can act as a bridge between your input devices and your code. Most XR packages these days use the new Input System package, and it makes life easier, so I will assume you have that installed.
First, you will need to create an Input Actions asset, which can be done in the project pane: right-click -> Create -> Input Actions. There are many tutorials which explain this asset in detail, but here is a simple setup to get you started. Double click on the new asset to open the editing window, and create a new Action Map. In the "Actions" list, create a new action with action type Value, Control Type Axis, and in the dropdown arrow on your new action set the path to the input source. As an example source path, I will use XR Controller -> XR Controller -> XR Controller (Left Hand) -> Optional Controls -> grip. Make sure to click Save Asset before closing the window.
Create a script similar to this:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.InputSystem;

public class ControllerInputReceiver : MonoBehaviour
{
    // Reads the grip axis value (0..1) and scales the object with it.
    public void FloatToScale(InputAction.CallbackContext context)
    {
        float val = 0.1f + 0.1f * context.ReadValue<float>();
        transform.localScale = new Vector3(val, val, val);
    }
}
Create a cube somewhere visible in your scene, and add the Input Action Manager component to it, and drag your created Input Actions asset to its list of Action Assets. Then add the ControllerInputReceiver script. Also on this cube, create a Player Input component and drag your Input Actions asset to its Actions element. Choose your map as the default map and change behavior to Invoke Unity Events. Under the events drop down, you should see an element for the Action you created earlier. Drop your Controller Input Receiver component into this Action and select the FloatToScale function.
In theory it should work at this point. Build the game to your device and see if pulling the grip causes the cube to resize. If it does, you can replace your Update function with:
void SetGrabbing(InputAction.CallbackContext context)
{
    // cutoff is a float threshold field you define yourself, e.g. 0.5f
    animator.SetBool("IsGrabbing", context.ReadValue<float>() > cutoff);
}
If you are still having issues at this point, I really recommend checking out these youtube channels. I only started VR a couple of months ago and learned everything I know so far from these people. JustinPBarnett, VRwithAndrew, ValemVR, LevelUp2020. (Links removed because it kept screwing up my post)
Note, the new input system has button options instead of value/axis options for VR devices. These may be closer to what you want, but I had no luck getting them to work today.
Also note, depending on how you organize your code, you may or may not need the "Input Action Manager" component somewhere in your scene with your input actions asset in its list. It enables your actions for you, without you needing to do this programmatically.
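Putting those pieces together, here is a hedged sketch of a full grabber script, assuming the grip action is wired to SetGrabbing through the Player Input component's Unity Events as described above (the cutoff value is an illustrative threshold, not from the question):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Replaces the Update-polling skeleton from the question: the Player
// Input component calls SetGrabbing whenever the grip action fires.
public class Grabber : MonoBehaviour
{
    [SerializeField] float cutoff = 0.5f;  // assumed grip threshold, tune to taste
    Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    // Hook this up under the Player Input component's events.
    public void SetGrabbing(InputAction.CallbackContext context)
    {
        animator.SetBool("IsGrabbing", context.ReadValue<float>() > cutoff);
    }
}
```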
Another solution would be:
Using the OVR plugin and the Hands prefab (for the left and right hand), check whether the rotation of each of the fingers on a specific axis (e.g. the Z-axis) falls within a specific range, meaning the fingers are grasping/flexed.
By referencing, for example, the Transform of b_l_index1, which is the corresponding part of the index finger within the model, you can check its local rotation every frame and trigger the grasping event when all fingers are rotated to a specific angle, subsequently triggering the animation you need.
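A rough sketch of that bone-angle check (the bone array and the 40-degree threshold are illustrative assumptions; b_l_index1 is the OVR hand-model bone mentioned above):

```csharp
using UnityEngine;

// Polls finger-bone local rotations each frame and flips the animation
// flag when every tracked finger is flexed past a threshold angle.
public class FingerCurlDetector : MonoBehaviour
{
    public Transform[] fingerBones;  // e.g. b_l_index1 and the other proximal bones
    public Animator animator;
    public float flexAngle = 40f;    // assumed threshold, tune per model

    void Update()
    {
        bool allFlexed = true;
        foreach (var bone in fingerBones)
        {
            // Compare the bone's local Z rotation against the threshold.
            float z = bone.localEulerAngles.z;
            if (z > 180f) z -= 360f;             // map 0..360 to -180..180
            if (Mathf.Abs(z) < flexAngle) allFlexed = false;
        }
        animator.SetBool("IsGrabbing", allFlexed);
    }
}
```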

How do I make a UI button follow a 2d game object?

I'm new to Unity and have stumbled into a problem. I am unsure how to make a UI button on a canvas follow a rigidbody game object in 2D screen space. I want the button's x and y position to match the x and y of the RB, so that in theory, if I clicked on the RB, it would activate the button. However, I am unsure how to implement such a thing in my project.
I tried using transform.position and just setting the UI button's position equal to the RB's position, but the button remained static. However, when I scripted the RB to move to a designated x point, only then did the button move as well, despite the RB having moved beforehand.
This is literally the example in the documentation for Camera.WorldToScreenPoint(). Attach this script to your button:
public class ObjectFollow : MonoBehaviour
{
    public Transform Follow;
    private Camera MainCamera;

    // Start is called before the first frame update
    void Start()
    {
        MainCamera = Camera.main;
    }

    // Update is called once per frame
    void Update()
    {
        var screenPos = MainCamera.WorldToScreenPoint(Follow.position);
        transform.position = screenPos;
    }
}
I have an idea for how to implement that: use a world-space canvas as a child of the object you mentioned that has the rigidbody. When the object moves, the canvas and its contents move with it. Watch this to
understand the Canvas.
Right-click your object in the Hierarchy window, then UI, then Button.
This will create a new canvas with a button attached; it will look like this:
hierarchy
Now select the Canvas, reset its width and height, and set its scale to a small value (0.01, 0.01, 0.01, for example).
Then change its Render Mode to World Space and assign the camera.
Make sure you have an EventSystem; one should be created automatically, but events will never work if there isn't one in the Hierarchy window.
The canvas will look like this:
Canvas Image
Now you can select this button and add a function to call via onClick, or you can pass the button to another script and add the function dynamically, like this:
Add Event to UI Button From Script
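A small sketch of adding the onClick listener from a script (the button field is an assumed inspector reference):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Wires a click handler to the world-space button at runtime
// instead of through the inspector's onClick list.
public class ButtonWiring : MonoBehaviour
{
    public Button button;   // drag the world-space canvas button here

    void Start()
    {
        button.onClick.AddListener(OnButtonClicked);
    }

    void OnButtonClicked()
    {
        Debug.Log("Button clicked");
    }
}
```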

What's the best way to trigger an animation from the input of another object?

I'm new to Unity and C#.
When I air-tap on the left sphere, the plane and the right sphere should appear.
I figured that with an OnPointerClicked event on the left sphere, the animation of the plane and right sphere could be triggered (with an if/else statement).
Is this the way to go?
If so, how do I do that?
Or is there an easier method?
This is the script of the left sphere:
Read this: IPointerClickHandler.OnPointerClick
You need to implement the interface, i.e. IPointerClickHandler, in your MonoBehaviour class. When you click on the object in game, a click event is sent to the EventSystem, which triggers the function body you have in your OnPointerClick method.
As the doc mentions, also make sure you have an EventSystem (an empty game object with the script attached, or attach the component to your main camera).

Unity3D with Vuforia: showing a 2D image when a target is detected

I have a question about how to show a simple 2D image on top of a detected marker. I have followed a tutorial to show a 3D model, and it works fine; there is no problem with 3D.
The problem starts when I want to add a normal 2D object -> sprite. When I add a simple sprite, I can't add a texture, and when I insert a UI image, it is added together with a canvas and is not shown when the target is detected. The original image in the editor is placed so far away that it's difficult to find.
I would be grateful if somebody could point me in the right direction.
I need to make this image touch-sensitive, like a button. Clicking on it must show a new scene (I have that, but under GUI.Button). The best approach would be to replace the original marker, but I can also make the new sprite bigger to hide the marker under it.
To help understand the answer, here's a quick rundown of how Vuforia handles marker detection. If you take a look at the DefaultTrackableEventHandler script attached to the ImageTarget prefab, you'll see that there are events that fire when the tracking system finds or loses an image.
These are OnTrackingFound (line 67) & OnTrackingLost (line 88) in DefaultTrackableEventHandler.cs
If you want to show a Sprite when tracking, all you need to do is put the Image Target prefab (or any other) and make the Sprite a child of the prefab. The enabling and disabling should happen automatically.
However, in case you want to do something more, here's some edited code.
DefaultTrackableEventHandler.cs
//Assign this in the inspector. This is the GameObject that
//has a SpriteRenderer and Collider2D component attached to it
public GameObject spriteGameObject;
Add the below lines to OnTrackingFound
//Enable both the Sprite Renderer, and the Collider for the sprite
//upon Tracking Found. Note that you can change the type of
//collider to be more specific (such as BoxCollider2D)
spriteGameObject.GetComponent<SpriteRenderer>().enabled = true;
spriteGameObject.GetComponent<Collider2D>().enabled = true;
//EDIT 1
//Have the sprite inherit the position and rotation of the Image
spriteGameObject.transform.position = transform.position;
spriteGameObject.transform.rotation = transform.rotation;
And the below to OnTrackingLost
//Disable both the Sprite Renderer, and the Collider for the sprite
//upon Tracking Lost.
spriteGameObject.GetComponent<SpriteRenderer>().enabled = false;
spriteGameObject.GetComponent<Collider2D>().enabled = false;
Next, your question about detecting clicks on this sprite. Unity's MonoBehaviour fires events for a lot of mouse interactions, such as OnMouseUp, OnMouseDown, etc.
Link to MonoBehaviour in Unity's API docs
What you will need is an event called OnMouseUpAsButton.
Create a new script called HandleClicks.cs and add the below code to it. Attach this script as a component to the spriteGameObject that you assigned for the above.
public class HandleClicks : MonoBehaviour
{
    //Event fired when a collider receives a mouse down
    //and then a mouse up over the same collider, like the
    //interaction with a button
    void OnMouseUpAsButton()
    {
        //Do whatever you want to. Note: Application.LoadLevel is
        //deprecated in newer Unity versions; use
        //SceneManager.LoadScene instead.
        Application.LoadLevel("myOtherLevel");
    }
}
}