Handling GearVR Touchpad input? - unity3d

When I add an OVRPlayerController into a Unity3d scene and build and run the scene for the GearVR the built-in touchpad spins the camera around the vertical axis, which is redundant with head tracking. What do I need to change so that the touchpad instead allows the camera to move forward and backward, as if walking? Is there a thorough tutorial?

The Oculus SDK 0.4.3 comes with support for the Samsung Gear VR gamepad.
All you need to do:
import the SDK.
overwrite the ProjectSettings folder of your project with the one that comes with the SDK.
add the OVRPlayerController to your scene.
add a GameObject below the OVRPlayerController, e.g. a plane or quad; this will act as the ground (it keeps the player from falling).
add a collider to that GameObject, e.g. a mesh collider.
Once you run it, you will see that you can move around using the gamepad as well as turn the camera around the vertical axis.
Basically, you can follow any first-person shooter tutorial for Unity3D; because the Oculus SDK comes with gamepad support you can set this up quickly (see the movement sketch after the link below).
This link might help:
https://www.youtube.com/watch?v=mbm9lPB5GPw
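To address the original question of walking forward and back with the touchpad instead of spinning: below is a minimal sketch, assuming a later Oculus Utilities release that exposes OVRInput (the 0.4.3 SDK mentioned above predates it) and assuming the script is added to the OVRPlayerController, which already carries a CharacterController. The head reference and the speed value are illustrative choices, not SDK defaults.

```csharp
using UnityEngine;

// Illustrative only: walks the player forward/back from the Gear VR touchpad's
// vertical axis instead of using it to spin the camera.
[RequireComponent(typeof(CharacterController))]
public class TouchpadWalk : MonoBehaviour
{
    public Transform head;        // drag in the CenterEyeAnchor (or main camera)
    public float speed = 1.5f;    // metres per second, arbitrary value

    private CharacterController controller;

    void Start()
    {
        controller = GetComponent<CharacterController>();
        if (head == null && Camera.main != null)
            head = Camera.main.transform;
    }

    void Update()
    {
        // OVRInput exposes the Gear VR touchpad as Axis2D.PrimaryTouchpad;
        // y is the forward/back position of the finger on the pad
        Vector2 pad = OVRInput.Get(OVRInput.Axis2D.PrimaryTouchpad);

        // walk along the gaze direction, flattened onto the ground plane
        Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        controller.SimpleMove(forward * pad.y * speed);
    }
}
```

You would also want to disable or tone down the OVRPlayerController's own touchpad-driven rotation so the two inputs don't fight each other.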

Related

How to make Unity OVR hand collide with objects?

I want to make the OVRCameraRig hands from the Oculus Integration SDK sample collide with objects. I want them to use realistic physics and have the hand stopped by a cube or any GameObject, without passing through it, so it feels much more realistic.
I tried the built-in "Hand Physics Capsules" option for the right and left hand, which puts colliders and rigidbodies on the hands, but it does not work the way I want. It just pushes the cube away and does not stop the VR hands from going through it, and I don't want the cube to move at all.
Thanks in advance for any help.
Unity version: 2021.3.15f1
Oculus Integration SDK: 46.0
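There is no accepted answer here, but one common approach (not something shipped in the Oculus Integration itself, so treat this strictly as a sketch) is to drive a separate physics hand with Rigidbody velocities instead of moving the tracked hand transform directly; a solid object with a plain collider and no Rigidbody can then block the hand without being pushed. All names below, such as trackedHand, are placeholders.

```csharp
using UnityEngine;

// Illustrative only: this Rigidbody is driven toward the tracked hand with
// velocities each physics step, so colliders (e.g. a static cube) can block it.
[RequireComponent(typeof(Rigidbody))]
public class PhysicsHandFollower : MonoBehaviour
{
    public Transform trackedHand;   // placeholder: the tracked hand anchor under the OVRCameraRig

    private Rigidbody rb;

    void Start()
    {
        rb = GetComponent<Rigidbody>();
        rb.useGravity = false;                                   // hands should not fall
        rb.collisionDetectionMode = CollisionDetectionMode.Continuous;
    }

    void FixedUpdate()
    {
        // linear velocity that closes the position gap in one physics step
        rb.velocity = (trackedHand.position - rb.position) / Time.fixedDeltaTime;

        // angular velocity that closes the rotation gap in one physics step
        Quaternion delta = trackedHand.rotation * Quaternion.Inverse(rb.rotation);
        delta.ToAngleAxis(out float angle, out Vector3 axis);
        if (angle > 180f) angle -= 360f;
        rb.angularVelocity = Mathf.Approximately(angle, 0f)
            ? Vector3.zero
            : axis * (angle * Mathf.Deg2Rad / Time.fixedDeltaTime);
    }
}
```

The visible hand mesh and its colliders would sit on this physics object, while the cube only needs a collider and no Rigidbody (or a kinematic one) so it never moves.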

Why does the camera in Unity keep falling and losing track of the player as if it had gravity enabled?

Hello, I was trying to build my first 2D game by following a video I saw on YouTube. The problem is that the camera keeps moving downwards and away from the game character.
Create a new camera and make sure the view is on your player.
Go to Window -> Package Manager and search for Cinemachine.
Import the Cinemachine package.
Create a new Cinemachine 2D camera and set its Follow to your player GameObject.
More information would also help.
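If it helps, here is a small sketch of that last step done from code, assuming Cinemachine 2.x is installed; the field names are placeholders you would wire up in the Inspector.

```csharp
using UnityEngine;
using Cinemachine;

// Illustrative only: assigns the virtual camera's follow target from code,
// the same thing the step above does in the Inspector.
public class AssignFollowTarget : MonoBehaviour
{
    public CinemachineVirtualCamera vcam;   // the Cinemachine 2D/virtual camera
    public Transform player;                // the player GameObject's transform

    void Start()
    {
        // the virtual camera now tracks the player; the main Camera (which has a
        // CinemachineBrain) follows the virtual camera, so it no longer drifts away
        vcam.Follow = player;
    }
}
```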

Oculus headset: camera detaches from the body while the Oculus is attached to the PC

I am working on an Oculus project: the player character for my simulation in Unity. It has a first-person controller; I have created a player GameObject that contains the FPCamera and the character's body as children.
Issue: When I attach my Oculus, the camera detaches from the body, and with the headset movement the FPCamera acts as a separate view from the body. The body does not rotate and remains static even though the FPCamera moves according to the headset. However, it works fine if I disable the Oculus and move the character with the mouse: I can see my body and move left and right, with all animations.
I have used the following link for the Oculus controller integration in my project:
https://assetstore.unity.com/packages/tools/integration/oculus-integration-82022 (Oculus integration)
Here is a link showing what I need to achieve for my project; my first-person view should look like this in the Oculus. You can see that the movement accurately follows the headset movements:
https://www.youtube.com/watch?v=7GpxsI-Tag
Note: I am using Unity 2017; there is no crash report in the project.
First of all, please send the correct link for the video.
When I create a new scene and want to set up the Oculus player, cameras and hands, I do this:
Find the "OVRPlayerController" prefab and drag it into the scene
Find "CustomHandLeft" and "CustomHandRight" and drag them into the scene
Go to the child object OVRPlayerController > OVRCameraRig > TrackingSpace
Then select the two hands
And drag the TrackingSpace object onto the "Parent Transform" property of the OVRGrabber script on both hands
Hope it helps you
You can use "OVRPlayercontroler"

Unity UI canvas not working with VR

I have been trying to get a very simple demo of a native Unity UI canvas working with VR.
I have read the Oculus blog post here: https://developer3.oculus.com/blog/unitys-ui-system-in-vr/ but I need to use the native Unity UI as I want to redistribute the code without license worries. I followed this tutorial https://unity3d.com/learn/tutorials/topics/virtual-reality/interaction-vr?playlist=22946 and downloaded the Unity VR Samples project from the Asset Store. In it they provide some scripts to place on the camera (VRInput and VREyeRaycaster) and some scripts to place on the target object (VRInteractiveItem and ExampleInteractiveItem).
When I apply the target scripts to a regular GameObject in the scene (e.g. a cube), the raycast works fine and the appropriate calls are made when Fire1 is activated. When I try to do this for a canvas object (e.g. a button), no hit is detected. I have tried placing the two target scripts (VRInteractiveItem and ExampleInteractiveItem) on the canvas, on the image containing the button, and on the button itself, and none of them work. What am I doing wrong? Why would it work on a regular GameObject and not on a UI canvas? I have made sure all my canvas elements have their Raycast Target property ticked.
EDIT:
It seems to work when I attach a Box Collider to the UI element. Is this required? I thought it should just work with a Graphic Raycaster attached, but the configuration below doesn't work (the Box Collider is disabled and the Graphic Raycaster is enabled).
This is what is on my player's camera:
I don't have a problem using box colliders if I have to, but I wanted to take advantage of the UI button's highlighted and pressed color transitions.
In Unity, physics raycasting works only with GameObjects that have colliders. Physics.Raycast returns true when it hits a collider; without a collider there is nothing for the ray to hit.
Unity Physics.Raycast documentation
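To make that explanation concrete, here is a rough sketch of what a gaze raycaster like the sample's VREyeRaycaster boils down to; it is not the sample project's actual code, just an illustration of why a Collider on the UI element is needed before Physics.Raycast will report a hit.

```csharp
using UnityEngine;

// Illustrative only: a gaze ray cast from the camera. Physics.Raycast can only
// report objects that carry a Collider, so a Canvas button with just a
// Graphic Raycaster is never hit until a BoxCollider is added to it.
public class GazeRaycastExample : MonoBehaviour
{
    public float maxDistance = 100f;

    void Update()
    {
        Ray gaze = new Ray(transform.position, transform.forward);

        if (Physics.Raycast(gaze, out RaycastHit hit, maxDistance))
        {
            Debug.Log("Gaze hit: " + hit.collider.name);
        }
    }
}
```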
I believe, for anyone seeing this for the first time, a potential reason it is not working is that the canvas in the picture above is using a "Graphic Raycaster" component and not an "OVR Raycaster" component. The OVRRaycaster is meant to replace the Graphic Raycaster to connect Oculus to the Unity UI.
If you want to use Unity's UI in VR, you might want to take a look at this asset: VRTK
There are some examples of VR UI using controllers or camera targeting.
Go to your canvas; you should have an option called "Plane Distance" that is set to 100. I changed it to 0.5 and it works quite well.

FPS is fixed at a single position only

I am making a VR game in Unity. The problem is that after generating the APK and installing it on my phone, when I look through the Cardboard my first-person character is fixed at a single position.
When I look in different directions, the FPS arms remain at the same position; they don't rotate according to the direction I am facing.
I am using the Unity Cardboard asset and I am working in Unity 5.
I've had a similar problem before; make sure that your model is a child of the Head object, so that your model stays fixed to the head as it rotates.
EDIT
From the image you supplied in your question, you are using the Unity Standard Assets FPS controller. It turns with mouse movement, which of course you cannot do on a phone. Because your arms are a child of the FPS Controller, they will only move when the mouse moves. Therefore you need to make your arms a child of the Head object, like so:
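The original answer ends with a hierarchy screenshot that is not reproduced here. As a small sketch of the same re-parenting done from code (in practice you would simply drag the arms under the Head object in the Hierarchy window); the field names are placeholders:

```csharp
using UnityEngine;

// Illustrative only: attaches the FPS arms to the Cardboard/VR Head object at
// runtime so they rotate with the player's view.
public class AttachArmsToHead : MonoBehaviour
{
    public Transform head;   // the Cardboard/VR Head object that receives rotation
    public Transform arms;   // the FPS arms model

    void Start()
    {
        // keep the arms' current world position/rotation while changing their parent
        arms.SetParent(head, worldPositionStays: true);
    }
}
```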