In my project I have added joystick player movement for both the right and left controllers.
There are objects with the XRGrabInteractable script attached so that they are grabbable when the ray points at them and the trigger is pressed.
When I press the trigger button and pick up one of the objects, it is grabbed.
But when I use the joystick to move the player, the grabbed object in the hand also moves and rotates.
Is it possible to disable the movement of the grabbed object when the joystick is used for player movement?
Unity Version: 2021.3.1f
Device: Pico Neo 3
XR Interaction Toolkit Version: 2.2.0
I am checking whether there is any default setting to disable movement of the grabbed object with the joystick.
Thanks in advance
Yes, you have to uncheck the Anchor Control checkbox. See the picture below.
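If you prefer to do this from code, here is a minimal sketch. It assumes the setting in the picture is the XR Ray Interactor's Anchor Control option, which XRI 2.x exposes as the allowAnchorControl property:

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Turns off joystick-driven anchor manipulation on every ray interactor,
// so a grabbed object no longer moves or rotates during locomotion.
public class DisableAnchorControl : MonoBehaviour
{
    void Start()
    {
        foreach (var rayInteractor in FindObjectsOfType<XRRayInteractor>())
        {
            // Same effect as unchecking "Anchor Control" in the Inspector.
            rayInteractor.allowAnchorControl = false;
        }
    }
}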
Is there a way to share each player's raycast pointer/cursor with the other players in real time? I'm using PUN2, MRTK2, and HoloLens 2.
Approaches I've tried so far:
I tried a naïve approach of modifying the MRTK-provided ShellHandRayPointer to contain a PhotonView and then using that on the MixedRealityToolkit object within the scene, but that seems to have no effect.
I've also tried creating a cursor prefab based on MRTK's CursorFocus, to which I add a Surface Magnetism component (tracking the hands), and then instantiating this prefab for each player in PUN2's OnJoinedRoom callback. After the instantiation call, I add the object to a non-rendered layer, with the goal of hiding it for the local player while letting it show up for other players. This hides the object as expected when only one player is in the room, but when a second player joins, the first player sees a cursor appear that tracks with their own hand movement, which seems unexpected to me (of note: I'm using one HoloLens 2 headset with a computer acting as the second player). Perhaps this "crossed" behavior is due to the Surface Magnetism component?
Thanks!
Step-by-step images of how I modified the ShellHandRayPointer with a PhotonView and then reattached it to the MRTK system:
scene:
MRTK system:
MRTK system part 2: reference to cloned ShellHandRayPointer:
my cloned ShellHandRayPointer part 1:
PhotonView components expanded on the cloned ShellHandRayPointer:
Regarding how to share objects in Photon in real time: as far as I know, objects shared in real time in Photon need to be instantiated by PhotonNetwork.Instantiate, but the ShellHandRayPointer in MRTK is instantiated by the input system.
You can create a copy of the ShellHandRayPointer, map the position and rotation of the ShellHandRayPointer to the copy at runtime, and share this copy in Photon in real time.
The position and rotation of the ShellHandRayPointer can be obtained from MixedRealityToolkit.InputSystem.DetectedInputSources, or you can use Unity's methods to get the GameObject directly.
For the cursor, you can use the same method to create a copy of the cursor and map its position and rotation.
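A minimal sketch of such a proxy, assuming a hypothetical "PointerProxy" prefab in a Resources folder that carries a PhotonView and a PhotonTransformView, might look like this:

using Photon.Pun;
using UnityEngine;

// Driven only by the owning client; Photon replicates the transform to
// the other players via the PhotonTransformView on the same prefab.
public class PointerProxySync : MonoBehaviourPun
{
    public Transform localPointer; // set to the local ShellHandRayPointer transform

    void LateUpdate()
    {
        if (photonView.IsMine && localPointer != null)
        {
            transform.SetPositionAndRotation(localPointer.position, localPointer.rotation);
        }
    }
}

You would instantiate one per player, e.g. in OnJoinedRoom: PhotonNetwork.Instantiate("PointerProxy", Vector3.zero, Quaternion.identity);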
I want to implement a graphic raycaster/laser pointer on the left Oculus controller so I can interact with UI buttons in Unity.
I have seen a lot of tutorials etc., but nothing has helped.
I want a laser beam or laser pointer/graphic raycast to shoot out from the Oculus controller when a button is pressed on the controller. I need the laser beam to interact with UI buttons in Unity.
You can use a normal raycast.
I recommend you do this:
Create a script on your hand, and add a Line Renderer component.
In the script, get a reference to the LineRenderer component.
Perform a simple raycast hit.
Get the position of the hit object.
Set the first position to the hand itself and the second to the hit object, like this:
lineRenderer.SetPosition(0, transform.position);
lineRenderer.SetPosition(1, hitObject.transform.position);
This draws a line from your hand to the hit object. Remember to tweak the LineRenderer parameters to make the line look nice.
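Putting those steps together, a minimal sketch (the maxDistance field and the miss fallback are illustrative choices, not part of the original answer):

using UnityEngine;

// Minimal laser-pointer sketch: casts a ray forward from the hand and
// draws a line to whatever it hits, or to a point at maxDistance on a miss.
[RequireComponent(typeof(LineRenderer))]
public class SimpleLaserPointer : MonoBehaviour
{
    public float maxDistance = 10f; // illustrative beam length

    private LineRenderer lineRenderer;

    void Awake()
    {
        lineRenderer = GetComponent<LineRenderer>();
        lineRenderer.positionCount = 2;
    }

    void Update()
    {
        lineRenderer.SetPosition(0, transform.position);

        if (Physics.Raycast(transform.position, transform.forward, out RaycastHit hit, maxDistance))
        {
            lineRenderer.SetPosition(1, hit.point);
        }
        else
        {
            lineRenderer.SetPosition(1, transform.position + transform.forward * maxDistance);
        }
    }
}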
Hope it helps!
I created a project that uses Unity's event system. It is a laser pointer with which you can interact with Unity's UI and 3D objects in the scene. If you want to check it out, here's the link: https://github.com/balataca/oculus-laser-pointer
I am working on an Oculus project: the player character for my simulation in Unity. It has a first-person controller; I created a player game object with an FPCamera and the character's body as children.
Issue: When I attach my Oculus camera, it detaches from the body, and with the Oculus headset movement the FPCamera acts as a view separate from the body: the body does not rotate and remains static even though the FPCamera moves according to the headset. However, it works fine if I disable Oculus and move the character with the mouse; I can see my body and move left and right, with all animations.
I used the following link for the Oculus controller integration in my project:
https://assetstore.unity.com/packages/tools/integration/oculus-integration-82022 (Oculus integration)
Here is a link to what I want to achieve in my project; my first person should be like this in Oculus. You can see that the movement accurately follows the headset movements:
https://www.youtube.com/watch?v=7GpxsI-Tag
Note: I am using Unity 2017; there is no crash report in the project.
First of all, please post the correct link for the video.
When I create a new scene and want to add the Oculus player, cameras, and hands, I do this:
Find the "OVRPlayerController" prefab and drag it into the scene.
Find "CustomHandLeft" and "CustomHandRight" and drag them into the scene.
Go to the child object OVRPlayerController > OVRCameraRig > TrackingSpace.
Then select the two hands.
And drag the TrackingSpace object to the "Parent Transform" property of the OVRGrabber script on both hands.
Hope it helps you.
You can use "OVRPlayerController".
I have an EventSystem with two input modules (gaze, for Cardboard, and touch). The gaze input is above the touch input, so Unity uses it as the main input module. Now I have one object that I want to trigger on touch input, but that is not working because of the gaze input. So my question is whether it is possible to disable the gaze input for just this one game object.
EDIT: the object is a menu button, located in the bottom-right corner. It moves with the camera.
thanks
Use a raycast: detect when the user is looking at the game object on which you want to use touch input.
Attach a script (to the camera or an empty game object) that holds references to both input modules.
When the player looks at the object (detected through the raycast) on which you want to use touch input, simply disable the gaze input.
And when the player looks away from that game object, enable the gaze input again.
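A minimal sketch of that idea, assuming the button has a collider so a physics raycast can hit it, and with gazeInputModule standing in for whichever gaze module your project uses:

using UnityEngine;
using UnityEngine.EventSystems;

// Disables the gaze module while the camera looks at the touch-only
// object; the EventSystem then falls back to the touch module below it.
public class GazeInputToggle : MonoBehaviour
{
    public BaseInputModule gazeInputModule; // the gaze module on the EventSystem
    public Transform touchOnlyObject;       // the menu button
    public float maxDistance = 100f;

    void Update()
    {
        Transform cam = Camera.main.transform;
        bool lookingAtButton = Physics.Raycast(cam.position, cam.forward, out RaycastHit hit, maxDistance)
                               && hit.transform == touchOnlyObject;

        gazeInputModule.enabled = !lookingAtButton;
    }
}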
When I add an OVRPlayerController to a Unity3D scene and build and run the scene for the GearVR, the built-in touchpad spins the camera around the vertical axis, which is redundant with head tracking. What do I need to change so that the touchpad instead moves the camera forward and backward, as if walking? Is there a thorough tutorial?
The Oculus SDK 0.4.3 comes with support for the GearVR Samsung gamepad.
All you need to do:
Import the SDK.
Overwrite the ProjectSettings folder of your project with the one that comes with the SDK.
Add the OVRPlayerController to your scene.
Add a game object below the OVRPlayerController, e.g. a plane or quad; this will act as the ground (keeping the player from falling).
Add a collider to that game object, e.g. a mesh collider.
Then, once you run it, you will see that you can move around using the gamepad as well as turn the camera around the vertical axis.
Basically, any first-person shooter tutorial for Unity3D applies, and because the Oculus SDK comes with gamepad support you can do this quickly.
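If you want the built-in touchpad itself (rather than the gamepad) to walk the player, here is a minimal sketch. It assumes the GearVR touchpad is reported to Unity as mouse button 0 and that the player object has a CharacterController, as the OVRPlayerController does:

using UnityEngine;

// Walks the player in the direction the head is facing (ignoring pitch)
// while the touchpad is held down.
[RequireComponent(typeof(CharacterController))]
public class TouchpadWalk : MonoBehaviour
{
    public float speed = 1.5f; // illustrative walking speed in m/s

    private CharacterController controller;

    void Awake()
    {
        controller = GetComponent<CharacterController>();
    }

    void Update()
    {
        // GearVR reports touchpad presses as mouse button 0.
        if (Input.GetMouseButton(0))
        {
            Vector3 forward = Camera.main.transform.forward;
            forward.y = 0f;
            controller.SimpleMove(forward.normalized * speed);
        }
    }
}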
This link might help:
https://www.youtube.com/watch?v=mbm9lPB5GPw