I have been following Justin P Barnett's Unity VR tutorial using an Oculus Quest 2 (https://www.youtube.com/watch?v=yxMzAw2Sg5w) and have reached timestamp 14:32. Although I can move the headset around and view my world, I am unable to control the red lines that should be raycasting from my controllers (as it appears in the video). The lines are instead just coming straight out of the floor. How can I fix this? Thanks so much for your help!
I'm working in Unity (2018) and building for the HTC Vive VR headset. I had an idea to use the small camera on the front of the headset to make an AR system: run the video from the headset's camera into the headset view, and then overlay things from a Unity environment on top of it. Unfortunately, I can't seem to find any examples of others doing this (other than the Tron-style blue outline system that the Vive comes with), though perhaps I'm not searching with the right keywords.
If anyone has seen something like this or knows whether it can be done, I'd greatly appreciate it.
It is registered as a standard webcam, so you should be able to use Unity's WebCamTexture.
But the resolution of the cameras is very low.
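Here is a minimal sketch of that approach, assuming you display the feed on a quad in front of the VR camera; the script and field names are my own, and you'll need to check WebCamTexture.devices to find the Vive camera's actual name:

```csharp
using UnityEngine;

// Minimal sketch: streams the headset's front camera (exposed as a standard webcam)
// onto a renderer so it can be viewed inside the headset. Script and field names are
// illustrative, not part of any SDK.
public class ViveFrontCameraFeed : MonoBehaviour
{
    public Renderer targetRenderer;   // e.g. a quad parented to the VR camera
    private WebCamTexture camTexture;

    void Start()
    {
        // Log the available devices so you can pick the Vive's front camera by name.
        foreach (WebCamDevice device in WebCamTexture.devices)
            Debug.Log("Webcam found: " + device.name);

        // No name = default device; pass the Vive camera's name explicitly if it isn't the default.
        camTexture = new WebCamTexture();
        targetRenderer.material.mainTexture = camTexture;
        camTexture.Play();
    }

    void OnDisable()
    {
        if (camTexture != null)
            camTexture.Stop();
    }
}
```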
I've just started coding/working in Unity.
I have no big trouble setting up camera movement or triggering animations on the standalone platform.
But when I try the Oculus Go ... I want to get the same "on click" behaviour I have on the standalone platform. Ideally I would make a timed gaze pointer, which avoids the joystick.
First: do you know how to display what you see in the Oculus inside Unity (https://www.youtube.com/watch?v=TektJroMwxY at 9:40, for example)?
And then, how could I simply implement a gaze pointer click? I tried the UI scene from the Oculus Unity Integration (https://developer.oculus.com/downloads/package/unity-integration/), but there is too much code to understand everything quickly.
Many thanks for your help,
Axel
First of all: interesting stuff is not quick to learn.
Anyway, you need to know that the Oculus gaze pointer does not work like a typical mouse click. You need to raycast from your camera at objects (with colliders attached, or build some system of your own) to know whether you are clicking on them.
Look closely at the VrInteractiveItem class, as that is mainly how Oculus handles this.
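Here's a minimal sketch of a timed gaze click, assuming your clickable objects have colliders; the class and field names (TimedGazePointer, GazeTarget, dwellSeconds) are illustrative, not part of the Oculus integration:

```csharp
using UnityEngine;
using UnityEngine.Events;

// Attach to the VR camera: raycasts along the view direction and "clicks" whatever
// the user keeps looking at for dwellSeconds. Illustrative names, not SDK classes.
public class TimedGazePointer : MonoBehaviour
{
    public float dwellSeconds = 2f;   // how long the user must keep looking to trigger a click
    public float maxDistance = 10f;

    private GazeTarget currentTarget;
    private float gazeTimer;

    void Update()
    {
        Ray gazeRay = new Ray(transform.position, transform.forward);
        RaycastHit hit;
        GazeTarget target = null;

        if (Physics.Raycast(gazeRay, out hit, maxDistance))
            target = hit.collider.GetComponent<GazeTarget>();

        if (target == null)
        {
            currentTarget = null;
            gazeTimer = 0f;
            return;
        }

        if (target != currentTarget)
        {
            currentTarget = target;   // started looking at a new object: restart the timer
            gazeTimer = 0f;
        }

        gazeTimer += Time.deltaTime;
        if (gazeTimer >= dwellSeconds)
        {
            currentTarget.onGazeClick.Invoke();
            gazeTimer = float.NegativeInfinity;   // fire only once until the gaze leaves the object
        }
    }
}

// Put this (plus a collider) on anything that should react to a gaze click.
// In a real project it should live in its own file.
public class GazeTarget : MonoBehaviour
{
    public UnityEvent onGazeClick;
}
```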
I have a game that uses the ARCamera from Vuforia. When running in Unity the game works fine. After I deploy the game to UWP, however, it seems that everything has flipped, almost as if I were looking at the game from the back; even the writing is reversed. When deploying the game to Android devices this does not happen. Does anyone have an idea how to solve this?
Thanks in advance
There is a ReflectionMode which flips the view horizontally. It is used when a front-facing camera is in use.
If you look at the 'Vuforia Core Samples Example' from the asset store, they have a CameraManager script that shows how to handle camera configuration.
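I'm not sure of the exact calls for your Vuforia version, so treat the sketch below as an assumption and cross-check it against the CameraManager script from the samples; the idea is simply to force the video background reflection off:

```csharp
using UnityEngine;
using Vuforia;

// Rough sketch only: these class and field names follow the older VuforiaRenderer API
// and may differ in your SDK version; verify against the samples' CameraManager script.
public class ForceReflectionOff : MonoBehaviour
{
    void Start()
    {
        // Read the current video background configuration, disable reflection, write it back.
        VuforiaRenderer.VideoBGCfgData config = VuforiaRenderer.Instance.GetVideoBackgroundConfig();
        config.reflection = VuforiaRenderer.VideoBackgroundReflection.OFF;
        VuforiaRenderer.Instance.SetVideoBackgroundConfig(config);
    }
}
```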
I have built a Unity3D + Google Tango based game on the NVidia Dev. device. Everything seems to work fine, but now I would like to play this game in stereoscopic view (For Dive Goggles). I looked at the ExperimentalVirtualReality example (https://github.com/googlesamples/tango-examples-unity/tree/master/UnityExamples/Assets/TangoExamples/ExperimentalVirtualReality) and was successfully able to port all the prefabs into my game, but for some reason the experience is not satisfactory.
The stereoscopic views of my game tend to overlap with each other when I look through the Dive goggles. The experience is quite off.
I noticed that there are some public parameters on the TangoVR Player object in the Unity project for 'IPD in MM', 'Screen Width in MM', 'Eye Offset in MM', etc. Do I have to play around with any of these? What do these values even represent?
Any help or pointers will be greatly helpful and appreciated.
IPD would be the Inter-Pupillary Distance, while the offset is the distance from your eye to the 'point of articulation' around which your head moves.
This describes it (with pictures!): http://gamasutra.com/blogs/NickWhiting/20130611/194007/Integrating_the_Oculus_Rift_into_Unreal_Engine_4.php
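To make those numbers concrete, here's an illustrative sketch (not the Tango prefab's actual code, and the field names are hypothetical) of how an IPD and eye offset would typically be applied to a pair of eye cameras:

```csharp
using UnityEngine;

// Illustrative only: positions two eye cameras relative to a head/neck pivot.
// The real TangoVR prefab applies these values internally; names here are made up.
public class SimpleStereoRig : MonoBehaviour
{
    public Camera leftEye;
    public Camera rightEye;
    public float ipdMM = 63f;        // inter-pupillary distance: gap between the two eye cameras
    public float eyeOffsetMM = 80f;  // distance from the head pivot forward to the eyes (simplified)

    void LateUpdate()
    {
        float halfIpd = ipdMM * 0.001f * 0.5f;   // mm -> m, half per eye
        float offset  = eyeOffsetMM * 0.001f;

        // Each eye sits half the IPD to either side, pushed forward from the pivot.
        leftEye.transform.localPosition  = new Vector3(-halfIpd, 0f, offset);
        rightEye.transform.localPosition = new Vector3( halfIpd, 0f, offset);
    }
}
```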
I've found that when trying to use Cardboard lenses on devices with displays wider than the FOV of the lenses, you get an unsatisfactory experience.
This has to do with the lenses not being centred on the frame when focused on the display.
To work around this on larger devices you can push in the margins of the stereoscopic views. For the Tango, testing with standard Cardboard lenses, I found that things work nicely if they are pushed in about an inch. The apps on the Play Store, Tango Mini Town and Tango Mini Village, do a nice job of demonstrating this workaround.
The ideal way to get this working would be with Google Cardboard and a proper Tango tablet 7-inch view controller, but currently the Cardboard app is incompatible with the Tango. Fingers crossed for Cardboard support.
As far as simply playing around with optimal viewpoints in Unity goes, you can modify the viewport rect in the stereo cameras' inspector to get the ideal experience for a specific device with whatever controller you choose; a rough scripted version of the same idea is sketched below.
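Something along these lines, assuming a rig with separate left/right eye cameras (the script and field names are my own, and the margin value is something you'd tune per device):

```csharp
using UnityEngine;

// Sketch: narrows each eye's viewport and keeps it flush against the screen centre,
// so the rendered images sit closer together under cardboard-style lenses on a wide display.
public class StereoViewportMargins : MonoBehaviour
{
    public Camera leftEye;
    public Camera rightEye;
    [Range(0f, 0.25f)]
    public float margin = 0.1f;   // fraction of screen width to trim from each outer edge

    void Start()
    {
        // Default stereo split: left eye covers x = [0, 0.5], right eye x = [0.5, 1].
        // Trimming the outer edge of each half pulls both images toward the centre.
        leftEye.rect  = new Rect(margin, 0f, 0.5f - margin, 1f);
        rightEye.rect = new Rect(0.5f,   0f, 0.5f - margin, 1f);
    }
}
```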
Thanks to all those who helped answer this. Many of my concepts definitely got cleared up, but nothing got me close to an actual solution. After researching a lot, I finally found this article (http://www.talkingquickly.co.uk/2014/11/google-cardboard-unity-tutorial/) super useful. It basically tells me to implement the Durovis SDK (https://www.durovis.com/sdk.html) via its Unity package.
Everything was pretty straightforward, and the experience I got from it was by far the best.
When I add an OVRPlayerController into a Unity3d scene and build and run the scene for the GearVR the built-in touchpad spins the camera around the vertical axis, which is redundant with head tracking. What do I need to change so that the touchpad instead allows the camera to move forward and backward, as if walking? Is there a thorough tutorial?
The Oculus SDK 0.4.3 comes with support for the GearVR Samsung gamepad.
All you need to do:
import the SDK.
overwrite the ProjectSettings folder of your project with the one that comes with the SDK.
add the OVRPlayerController to your scene.
add a GameObject below the OVRPlayerController, e.g. a plane or quad; this will act as the ground (and keep the player from falling).
add a collider to that GameObject, e.g. a mesh collider.
Then once you run it you will see that you can move around using the gamepad as well as turn the camera around the vertical axis...
Basically, use any first-person shooter tutorial for Unity3D; because the Oculus SDK comes with gamepad support, you can get this working quickly. A hedged sketch of driving movement from the touchpad instead is below.
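Something like this, assuming the Gear VR touchpad is reported to Unity as mouse input (taps as mouse button 0, swipes as the "Mouse X"/"Mouse Y" axes); verify that mapping for your SDK version. Attach it next to the CharacterController on the player and disable whatever script currently rotates the camera from the touchpad:

```csharp
using UnityEngine;

// Hedged sketch: walks the player forward/backward from vertical touchpad swipes
// instead of letting the touchpad spin the view. Names and the input mapping are assumptions.
[RequireComponent(typeof(CharacterController))]
public class TouchpadWalk : MonoBehaviour
{
    public float speed = 2f;    // metres per second
    public Transform head;      // the tracked camera; walking follows the gaze direction

    private CharacterController controller;

    void Start()
    {
        controller = GetComponent<CharacterController>();
    }

    void Update()
    {
        // Vertical swipe on the touchpad: positive = walk forward, negative = walk backward.
        float swipe = Input.GetAxis("Mouse Y");

        if (Mathf.Abs(swipe) > 0.01f)
        {
            // Move along the look direction, flattened onto the ground plane.
            Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
            controller.SimpleMove(forward * speed * Mathf.Sign(swipe));
        }
        else
        {
            // SimpleMove applies gravity, so keep calling it to let the player settle on the ground.
            controller.SimpleMove(Vector3.zero);
        }
    }
}
```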
this link might help
https://www.youtube.com/watch?v=mbm9lPB5GPw