Label text on a controller in Unity VR - unity3d

I want to add a text label to my HTC Vive controller in Unity (like the one shown), but despite a lot of searching online I can't find out how to do it.
Can someone help me?
Thank you!

Steps to create text on the controller:
1. Drag the camera rig prefab into your scene.
2. Attach the Player script from the SteamVR plugin to the camera rig.
3. Attach the Hand script to both controllers.
4. Create an empty GameObject as a child of each controller and set its local position to (0, 0, 0). Add the ControllerButtonHints script to both. Drag the ControllerButtonHints material into the controller material slot, set any flash color you like, and drag in the ControllerTextHint prefab from the SteamVR prefabs.
5. In each Hand script, drag in the other hand (on the left controller drag the right controller, and vice versa), set Starting Hand Type to Any for both controllers, and drag the controller prefab into the Blank Controller Prefab slot of SteamVR.
6. In the Player script, drag the camera rig into Tracking Origin Transform, add Camera (eye) as the HMD Transform, add both controllers to the Hands array, leave the other fields as None, and untick Allow Toggle To 2D.
7. The last step is to comment out lines 279-288, as they are not required for our current purpose.
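With that wiring in place, a hint can also be triggered from your own script. Below is a minimal sketch, assuming the SteamVR 2.x Interaction System (Hand, ControllerButtonHints, and a generated default action set); the exact ShowTextHint signature differs between plugin versions, and the class and field names here (ShowControllerLabel, label) are only illustrative.

using UnityEngine;
using Valve.VR;
using Valve.VR.InteractionSystem;

public class ShowControllerLabel : MonoBehaviour
{
    public Hand hand;                              // drag the left or right Hand here
    public string label = "Pull trigger to shoot"; // text to display on the controller

    private bool shown;

    private void Update()
    {
        if (shown || hand == null)
            return;

        // Show the text hint on the controller once.
        // default_InteractUI comes from the generated SteamVR_Actions class (assumed here).
        ControllerButtonHints.ShowTextHint(hand, SteamVR_Actions.default_InteractUI, label);
        shown = true;
    }
}

Attach this to any object in the scene and drag one of the Hand objects into the hand slot.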
Although my English is not very good, maybe this helps you.
Thanks,

Related

Using a Rig Builder and Nav Mesh Agent in Unity

I am creating the enemy AI for my game, and I'm trying to set up a Rig Builder to help position things like the hands/head to grab the weapon and face the player. The problem is that this seems to make the NavMeshAgent stop working, which means my enemy can't move even though it is applying the animation (root motion turned off).
Does anyone know a fix? Please help.
Thanks!!

How do I implement a raycast / laserpointer to Oculus controller?

I want to implement a graphic raycaster / laser pointer on the left Oculus controller so I can interact with UI buttons in Unity.
I have seen a lot of tutorials, but nothing has helped.
I want a laser beam / graphic raycast to shoot out from the Oculus controller when a button is pressed on the controller, and I need the beam to interact with UI buttons in Unity.
You can use a normal raycast.
I recommend you do the following:
Create a script on your hand and add a LineRenderer component.
In the script, get a reference to the LineRenderer component.
Do a simple raycast.
Get the position of the hit object.
Set the first position to the hand and the second to the hit object, like this:
lineRenderer.SetPosition(0, transform.position);
lineRenderer.SetPosition(1, hitObject.transform.position);
This draws a line from your hand to the hit object. Remember to tweak the LineRenderer parameters to make the line look nice.
Hope it helps.
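Putting those steps together, here is a minimal sketch of the line-renderer approach. It only draws the beam, so interacting with UI buttons still needs an event system / graphic raycaster on top of it; the class name HandLaserPointer and the maxDistance field are just placeholders.

using UnityEngine;

[RequireComponent(typeof(LineRenderer))]
public class HandLaserPointer : MonoBehaviour
{
    public float maxDistance = 10f;   // how far the beam reaches when nothing is hit

    private LineRenderer lineRenderer;

    private void Start()
    {
        lineRenderer = GetComponent<LineRenderer>();
        lineRenderer.positionCount = 2;
    }

    private void Update()
    {
        Vector3 endPoint = transform.position + transform.forward * maxDistance;

        // Simple physics raycast from the hand; stop the beam at whatever it hits.
        RaycastHit hit;
        if (Physics.Raycast(transform.position, transform.forward, out hit, maxDistance))
        {
            endPoint = hit.point;
        }

        lineRenderer.SetPosition(0, transform.position); // beam starts at the hand
        lineRenderer.SetPosition(1, endPoint);           // beam ends at the hit point (or max distance)
    }
}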
I created a project that uses Unity's event system. It is a laser pointer that can interact with Unity's UI and 3D objects in the scene. If you want to check it out, here's the link: https://github.com/balataca/oculus-laser-pointer

Oculus headset camera detaches from the body while the Oculus is attached to the PC

I am working on an Oculus project: the player character for my simulation in Unity, which uses a first-person controller. I created a player GameObject with an FPCamera and the character's body as children.
Issue: When I attach my Oculus camera, it detaches from the body, and with the Oculus headset movement the FPCamera acts as a view separate from the body. The body does not rotate and remains static even though the FPCamera moves according to the headset. However, it works fine if I disable the Oculus and move the character with the mouse: I can see my body and move left and right with all animations.
I used the following link for the Oculus controller integration in my project:
https://assetstore.unity.com/packages/tools/integration/oculus-integration-82022 (Oculus Integration)
Here is a link to what I have to achieve in my project; my first-person view should be like this in Oculus. You can see that the movement accurately follows the headset movements:
https://www.youtube.com/watch?v=7GpxsI-Tag
Note: I am using Unity 2017, and there is no crash report in the project.
First of all, please post the correct link for the video.
When I create a new scene and want to set up the Oculus player, cameras, and hands, I do this:
Find the "OVRPlayerController" prefab and drag it into the scene.
Find "CustomHandLeft" and "CustomHandRight" and drag them into the scene.
Go to the child object OVRPlayerController > OVRCameraRig > TrackingSpace.
Then select the two hands.
Drag the TrackingSpace object into the "Parent Transform" property of the OVRGrabber script on both hands.
Hope it helps.
You can use "OVRPlayerController".
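If the body still stays static while the headset turns after that setup, one common workaround is to copy the HMD's yaw onto the body each frame. Here is a minimal sketch, assuming the Oculus Integration's OVRCameraRig (and its centerEyeAnchor transform); the class name BodyFollowsHeadYaw and the body field are hypothetical names for your own setup.

using UnityEngine;

public class BodyFollowsHeadYaw : MonoBehaviour
{
    public OVRCameraRig cameraRig;   // the rig inside OVRPlayerController
    public Transform body;           // the character body root that should turn with the headset

    private void LateUpdate()
    {
        if (cameraRig == null || body == null)
            return;

        // Take only the yaw of the headset so the body does not pitch or roll.
        float headYaw = cameraRig.centerEyeAnchor.rotation.eulerAngles.y;
        body.rotation = Quaternion.Euler(0f, headYaw, 0f);
    }
}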

Unity how to find Vive controller game objects?

I'm new to VR development and relatively new to Unity. Does anyone know a good / best-practice way to find the GameObjects for the left and right controllers?
You can find them in the Vive SDK package for Unity.
It comes with some sample scenes, and controller models are included.
If you want to know their path, just pause the game once SteamVR is up and running, select the left or right controller, and check the model's Mesh Renderer component; it will point you directly to the place in the project where the mesh is saved.
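If you are using the SteamVR Interaction System, you can also grab the controller GameObjects at runtime instead of digging through the project. A minimal sketch, assuming a Player prefab from the SteamVR 2.x plugin is in the scene (the class name FindControllers is just illustrative):

using UnityEngine;
using Valve.VR.InteractionSystem;

public class FindControllers : MonoBehaviour
{
    private void Start()
    {
        Player player = Player.instance;   // singleton on the SteamVR Player prefab
        if (player == null)
            return;

        // Each Hand sits on a controller object once SteamVR has assigned it.
        GameObject leftController = player.leftHand != null ? player.leftHand.gameObject : null;
        GameObject rightController = player.rightHand != null ? player.rightHand.gameObject : null;

        Debug.Log("Left controller: " + leftController + ", right controller: " + rightController);
    }
}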

SmoothFollow camera in VR

In a test scene with a locomotion character, when I attach a SmoothFollow script to the character it works as it should, but when I use the Oculus Rift to view the scene in VR, it no longer follows the character as it walks...
I am aware that the camera transform is overridden by the head-tracked pose, and that if I want to move the camera I must attach it as a child of another GameObject and move that root GameObject, but doing so still does not let me follow the character in VR.
Am I missing something, or is it not possible in the Oculus Rift to just make the character walk and have the camera automatically follow it?
It sounds like you are close. Did you attach the SmoothFollow script to the new GameObject (the parent of the camera) rather than to the camera itself?
Also, you may want to comment out the part of SmoothFollow where it sets the rotation of the camera; that can be very disorienting in VR.
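As a concrete illustration of that setup, here is a minimal sketch of a follow script for the camera's parent rig; it smooths the position only and never touches rotation, leaving that to the head tracking. The names VRRigFollow, target, and smoothTime are placeholders, not the Standard Assets SmoothFollow API.

using UnityEngine;

public class VRRigFollow : MonoBehaviour
{
    public Transform target;        // the walking character to follow
    public float smoothTime = 0.3f; // how quickly the rig catches up

    private Vector3 velocity;

    private void LateUpdate()
    {
        if (target == null)
            return;

        // Move only this rig (the parent of the VR camera). Rotation is left
        // alone so it does not fight the headset tracking.
        transform.position = Vector3.SmoothDamp(transform.position, target.position, ref velocity, smoothTime);
    }
}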