Can you use the headset camera on the HTC Vive in Unity?

I'm working in Unity (2018) and building for the HTC Vive VR headset. I had an idea to use the small camera on the front of the headset to make an AR system: pipe the video from the headset's camera into the headset view, then overlay objects from a Unity environment on top of it. Unfortunately, I can't seem to find any examples of others doing this (other than the Tron-style blue outline system that ships with the Vive), though perhaps I'm not searching with the right keywords.
If anyone has seen something like this, or knows whether it can be done, I'd greatly appreciate it.

The camera registers as a standard webcam, so you should be able to use Unity's WebCamTexture.
But be aware that the camera's resolution is very low.
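A minimal sketch of that, assuming the Vive's camera is the first entry in WebCamTexture.devices (check the logged names on your machine) and that the camera is enabled in the SteamVR settings; the requested resolution here is a guess:

```csharp
using UnityEngine;

// Minimal sketch: grab the headset's front camera as a standard webcam
// and display the feed on this object's material.
public class ViveCameraFeed : MonoBehaviour
{
    WebCamTexture camTexture;

    void Start()
    {
        // Log every camera Unity can see so you can pick out the Vive's.
        foreach (var device in WebCamTexture.devices)
            Debug.Log("Found camera: " + device.name);

        if (WebCamTexture.devices.Length == 0)
        {
            Debug.LogWarning("No webcam found - is the camera enabled in SteamVR?");
            return;
        }

        // Assumption: the Vive camera is device 0. The requested
        // resolution is a guess; the camera's native resolution is low.
        camTexture = new WebCamTexture(WebCamTexture.devices[0].name, 640, 480);
        GetComponent<Renderer>().material.mainTexture = camTexture;
        camTexture.Play();
    }

    void OnDestroy()
    {
        if (camTexture != null)
            camTexture.Stop();
    }
}
```

For the AR idea, you could put that texture on a quad parented in front of the in-game headset camera and let your Unity objects render over it.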

Related

Augmented Reality with Unity ARFoundation on mobile device front-facing camera

Does anyone know if the Unity library ARFoundation works with the front-facing camera on a mobile device? I'm having trouble finding recent answers online.
Thanks
You can go to the AR Camera -> AR Camera Manager component and change the Camera Facing Direction to User.
The device will then use the front camera, provided the features you request work with it. If not, ARFoundation will enable the back camera instead.
Caveat: lots of features, such as 2D Image Tracking and Body Tracking, can't work with the front camera :(
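If you'd rather make the same switch from a script, here is a sketch, assuming a recent ARFoundation version (the component sits on the AR Camera):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: request the user-facing (selfie) camera at runtime.
public class UseFrontCamera : MonoBehaviour
{
    void Start()
    {
        var cameraManager = GetComponent<ARCameraManager>();
        // ARFoundation treats this as a request: if the session's
        // enabled features aren't supported on the front camera,
        // it falls back to the world-facing one.
        cameraManager.requestedFacingDirection = CameraFacingDirection.User;
    }
}
```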

Implementing Android AR with a UVC / external camera (Unity)

I'm working on an AR project (Unity) and I want to use an external camera instead of my Android device's built-in one. I saw that Vuforia has such a feature, but it claims that by using it, Ground Plane detection won't work at all and Model Targets performance takes a hit.
I also saw that EasyAR has a CustomCamera feature, and that ARCore has the Camera2 library.
My question is: what's the best way to approach this? Has anyone used an external camera, and with which AR solution (ARFoundation / Vuforia / EasyAR...)?
Second question: what should I look for when buying such a UVC camera? Any examples of one?
I'd also like to hear about experiences with the various AR solutions regardless of the external-camera question.
Thanks in advance!
Unfortunately, this is unlikely to work with an external camera.
A key part of AR is having a precise calibration of the camera's optics. Without that, it's not possible to accurately analyze the world in order to draw new objects into it or apply other AR effects.
A UVC webcam doesn't come with any such calibration information, so it would have to be calibrated somehow and the calibration data handed to Unity's AR engine. I don't know whether that's possible with Unity in some way.
Note that not all internal cameras on Android devices are calibrated well enough for AR either; the ARCore team certifies devices that have sufficient calibration in place.

Unity game: HTC Vive and mouse?

How do you make a Unity3D game that supports both the HTC Vive headset and mouse control?
I'm developing a small VR demo for an event with a team, targeting the HTC Vive (SteamVR).
The issue is that we have only one headset for eleven people.
A solution would be to use the mouse instead of head tracking.
GoogleVR allows this (press Alt in dev mode), but it doesn't work with the Vive.
Any idea how to do this, and why it doesn't work?
Is there a plugin that supports both the Vive and mouse/WASD?
Well, you can implement mouse control like in first-person games: track the mouse movement and rotate your 'VR' camera accordingly.
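A minimal sketch of that, assuming the script sits on the camera object and should only take over while no headset is active; the sensitivity is an arbitrary value to tune:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: first-person-style mouse look as a fallback for people
// without a headset. While an HMD is active, Unity overrides the
// camera rotation with head tracking, so we skip the mouse then.
public class MouseLookFallback : MonoBehaviour
{
    public float sensitivity = 2f; // arbitrary, tune to taste
    float pitch;
    float yaw;

    void Update()
    {
        if (XRSettings.isDeviceActive)
            return;

        yaw += Input.GetAxis("Mouse X") * sensitivity;
        pitch = Mathf.Clamp(pitch - Input.GetAxis("Mouse Y") * sensitivity, -89f, 89f);
        transform.localRotation = Quaternion.Euler(pitch, yaw, 0f);
    }
}
```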

How to detect a circle using SteamVR SDK?

I am involved in a virtual-reality project using the HTC Vive, Unity, and the SteamVR SDK to communicate with the Vive.
Using the controllers, the end user must draw shapes (for example a circle); the movement begins when they press a controller button.
From all the generated data (the controllers' output), how could I detect a circle?
Do you have any documentation on this?
Please correct me if I've misunderstood your concern here:
you use the controllers to draw shapes such as circles in apps like SteamVR Home,
and you want to detect what you have drawn in software, perhaps showing the result on screen in real time or saving it to a file.
That means you need the ability to capture the rendered images and detect their content using algorithms such as deep learning.
The HTC Vive is compatible with the OpenVR SDK:
https://github.com/ValveSoftware/openvr
You can use the OpenVR SDK to build your own SteamVR driver and get images in real time using the direct-mode component of the SDK. That is a lot of work even before you add the detection algorithm, because you need a SteamVR driver that SteamVR can actually run.
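If capturing rendered images is more than you need, a much lighter geometric check may do, assuming you can sample the controller's tracked position each frame while the button is held. This sketch (a different approach from the driver route above; the sample count and 10% tolerance are arbitrary assumptions) tests whether the stroke keeps a roughly constant distance from its centroid and closes on itself:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: rough circle test over a recorded stroke of controller positions.
public static class CircleDetector
{
    public static bool IsCircle(List<Vector3> points, float tolerance = 0.1f)
    {
        if (points.Count < 8) // too few samples to call it a shape
            return false;

        // Centroid of the stroke.
        Vector3 center = Vector3.zero;
        foreach (var p in points) center += p;
        center /= points.Count;

        // Mean distance from the centroid = candidate radius.
        float meanRadius = 0f;
        foreach (var p in points) meanRadius += Vector3.Distance(p, center);
        meanRadius /= points.Count;

        // Every sample should sit near that radius.
        foreach (var p in points)
            if (Mathf.Abs(Vector3.Distance(p, center) - meanRadius) > meanRadius * tolerance)
                return false;

        // The stroke should also end near where it started.
        // Note: this doesn't check planarity, so a helix would pass too;
        // project the points onto a best-fit plane first if that matters.
        return Vector3.Distance(points[0], points[points.Count - 1]) < meanRadius * 2f * tolerance;
    }
}
```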

Making my Unity Game with Stereoscopic View (VR)

I have built a Unity3D + Google Tango based game on the NVidia Dev. device. Everything seems to work fine, but now I would like to play this game in stereoscopic view (For Dive Goggles). I looked at the ExperimentalVirtualReality example (https://github.com/googlesamples/tango-examples-unity/tree/master/UnityExamples/Assets/TangoExamples/ExperimentalVirtualReality) and was successfully able to port all the prefabs into my game, but for some reason the experience is not satisfactory.
The stereoscopic views of my game tend to overlap each other when I look through the Dive goggles. The experience is quite off.
I noticed that there are some public parameters on the TangoVR Player object in the Unity project for 'IPD in MM', 'Screen Width in MM', 'Eye Offset in MM', etc. Do I have to play around with any of these? What do these values even represent?
Any help or pointers will be greatly helpful and appreciated.
IPD would be Inter-Pupillary Distance, while offset is the distance from your eye to the 'point of articulation' when you move your head.
This describes it (with pictures!): http://gamasutra.com/blogs/NickWhiting/20130611/194007/Integrating_the_Oculus_Rift_into_Unreal_Engine_4.php
I've found that when you use Cardboard lenses on devices whose displays are wider than the lenses' field of view, you get an unsatisfactory experience.
This has to do with the lenses not being centered on their half of the frame when focused on the display.
To work around this on larger devices, you can push in the margins of the stereoscopic views. For the Tango, testing with standard Cardboard lenses, I found that things worked nicely when they were pushed in about an inch. The Play Store apps Tango Mini Town and Tango Mini Village do a nice job of demonstrating this workaround.
The ideal way to get this working would be Google Cardboard with a proper 7-inch Tango tablet view controller, but currently the Cardboard app is incompatible with the Tango. Fingers crossed for Cardboard support.
As far as simply finding an optimal view in Unity goes, you can modify the Viewport Rect in the stereo camera's inspector to get the ideal experience for a specific device with whatever controller you choose; a sketch of doing the same from code follows.
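This sketch assumes one camera per eye; the camera references and the margin value are placeholders you'd tune per device:

```csharp
using UnityEngine;

// Sketch: shrink each eye's viewport toward the screen center so the
// images line up with the lens centers on a wide display.
public class StereoViewportMargin : MonoBehaviour
{
    public Camera leftEye;
    public Camera rightEye;
    [Range(0f, 0.2f)] public float margin = 0.05f; // normalized units, placeholder

    void Start()
    {
        // Left eye: push its outer (left) edge inward.
        leftEye.rect = new Rect(margin, 0f, 0.5f - margin, 1f);
        // Right eye: keep its inner edge at the center, trim the outer edge.
        rightEye.rect = new Rect(0.5f, 0f, 0.5f - margin, 1f);
    }
}
```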
Thanks to all those who helped answer this. Many of my concepts definitely got cleared up, but nothing got me close to an actual solution. After a lot of research, I finally found this article (http://www.talkingquickly.co.uk/2014/11/google-cardboard-unity-tutorial/) super useful. It basically tells you to implement the Durovis SDK (https://www.durovis.com/sdk.html) with its Unity package.
Everything was pretty straightforward, and the experience I got from it was the best so far.