Align HTC Vive Coordinates with the OptiTrack System in Unity

I am working on aligning an HTC Vive controller (for example, the right controller) with a rigid-body marker that is tracked by OptiTrack. Since the two systems use different coordinate systems, how can I align them? I am trying to make the rigid-body marker move like the right-hand controller of the HTC Vive.
Scenario:
I have a Unity environment that is viewed through the HTC Vive, and I now want to add a rigid-body marker that is tracked by OptiTrack; the two systems have to stay properly aligned while I move the marker around the environment.
Any suggestions would be very helpful.
Thank you.

I am currently facing the same problem and have no solution yet, but here are some ideas:
1st idea: match the centers
When you calibrate OptiTrack, you set the center of your OptiTrack space.
Maybe you can track the OptiTrack center point with the Vive controller and then shift the Vive's coordinate space appropriately.
I don't know how to solve the rotation mismatch yet; maybe you have an idea?
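For the translation part, a minimal sketch could look like this (optiTrackRoot is a hypothetical parent under which all OptiTrack-driven objects live, with local coordinates equal to OptiTrack coordinates; shifting it instead of the Vive rig is equivalent and easier in Unity):

using UnityEngine;

// Rough sketch of the centre-matching idea: call Align() while the Vive
// controller is physically held at the OptiTrack origin.
public class CenterAligner : MonoBehaviour
{
    public Transform viveController; // e.g. the right-hand controller
    public Transform optiTrackRoot;  // hypothetical parent of all OptiTrack-driven objects

    public void Align()
    {
        // Move the OptiTrack origin (the root itself) onto the sampled
        // controller position. The rotation mismatch is still unsolved here.
        optiTrackRoot.position = viveController.position;
    }
}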
2nd idea: track some reference points with both systems
If you can track four reference points in both systems, you should be able to derive a transformation matrix from one vector space to the other.
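A simplified sketch of that fit, assuming both systems share the same up axis (Y) and scale, so two reference points and a yaw-plus-translation alignment are enough (viveA/viveB and optiA/optiB are placeholders you would fill by sampling the same two physical points with each system):

using UnityEngine;

// Aligns a hypothetical optiTrackRoot (parent of all OptiTrack-driven objects)
// to the Vive space using two shared reference points.
public class SpaceAligner : MonoBehaviour
{
    public Vector3 viveA, viveB; // points measured in Vive space
    public Vector3 optiA, optiB; // the same physical points in OptiTrack space
    public Transform optiTrackRoot;

    public void Align()
    {
        // Project the A->B direction onto the ground plane in each space.
        Vector3 viveDir = Vector3.ProjectOnPlane(viveB - viveA, Vector3.up);
        Vector3 optiDir = Vector3.ProjectOnPlane(optiB - optiA, Vector3.up);

        // Yaw rotation that turns the OptiTrack direction onto the Vive direction.
        optiTrackRoot.rotation = Quaternion.FromToRotation(optiDir.normalized, viveDir.normalized) * optiTrackRoot.rotation;

        // Then translate so that point A coincides in both spaces.
        optiTrackRoot.position += viveA - optiTrackRoot.TransformPoint(optiA);
    }
}

For a full six-degree-of-freedom fit of four or more point pairs, a least-squares method such as the Kabsch algorithm would be the general solution.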
I have not tried these ideas yet, but I will soon.
Have you found a solution?

Related

HoloLens companion map

I am implementing a "companion map" for a HoloLens application using Unity and Visual Studio. My vision is for a small rectangular map to be affixed to the bottom right of the HoloLens view, and to follow the HoloLens user as they move about, much like the display of a video game.
At the moment my "map" is a .jpeg made into a material and put on an upright plane. Is there a way for me to affix the plane such that it is always in the bottom right of the user's view, as opposed to being fixed in the 3D space that the user moves through?
The Orbital solver in MRTK can implement this idea without your writing any code. It can lock the map to a specified position and offset it from the player.
To use it, you need to:
Add the Orbital script component to your companion map.
Modify the Local Offset and World Offset properties to keep the map in the bottom right of the user's view.
Set the Orientation Type to Face Tracked Object.
In addition, the SolverExamples scene provided by the MRTK v2 SDK is an excellent starting point for becoming familiar with Solver components.
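If you would rather set this up from code, here is a rough sketch (the namespaces and property names below are from MRTK v2 as I remember them, so double-check them against your SDK version):

using UnityEngine;
using Microsoft.MixedReality.Toolkit.Utilities;
using Microsoft.MixedReality.Toolkit.Utilities.Solvers;

public class CompanionMapSetup : MonoBehaviour
{
    void Start()
    {
        // SolverHandler supplies the tracked target (the user's head by default).
        var handler = gameObject.AddComponent<SolverHandler>();
        handler.TrackedTargetType = TrackedObjectType.Head;

        // Orbital keeps the map at a fixed offset from that target.
        var orbital = gameObject.AddComponent<Orbital>();
        orbital.LocalOffset = new Vector3(0.3f, -0.2f, 1f); // example values: right, down, ahead
        orbital.OrientationType = SolverOrientationType.FaceTrackedObject;
    }
}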

Get pixel-count in FoV inside VR-Sphere

Recently I made an application for HTC Vive users to view 360-degree videos. As a point of reference, let's assume the video has a resolution of Full HD (1920x1080). See the picture of a 3D model below for illustration:
The field of view of an HTC Vive is 110° vertically and 100° horizontally.
It would be okay to simplify it to a round FoV of 100°.
My question is: how can I determine the amount of video information inside my FoV?
Here is what I know so far:
You can model the sphere on paper and calculate the visible surface area using the formulas for spherical caps: https://en.wikipedia.org/wiki/Spherical_cap
There also seems to be a formula for the UV mapping that Unity performs (the video is rendered in Unity): https://en.wikipedia.org/wiki/UV_mapping
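To make the spherical-cap idea concrete: a round 100° FoV is a cap with a 50° half-angle, which covers (1 - cos 50°)/2 ≈ 17.9% of the sphere's surface. Since an equirectangular frame does not sample the sphere uniformly (the poles get more pixels per solid angle), the stored-pixel count also depends on the view direction. Here is a brute-force sketch I would try, which simply counts the pixels of a 1920x1080 equirectangular frame falling inside a forward-looking cone on the equator:

using System;

class FovPixelCount
{
    static void Main()
    {
        const int width = 1920, height = 1080;
        double halfAngle = 50.0 * Math.PI / 180.0; // round 100-degree FoV
        double cosLimit = Math.Cos(halfAngle);

        long inside = 0;
        for (int y = 0; y < height; y++)
        {
            // Equirectangular row -> latitude in [-pi/2, pi/2].
            double lat = ((y + 0.5) / height - 0.5) * Math.PI;
            for (int x = 0; x < width; x++)
            {
                // Column -> longitude in [-pi, pi].
                double lon = ((x + 0.5) / width - 0.5) * 2.0 * Math.PI;
                // Dot product of the pixel's direction with the forward axis
                // (latitude 0, longitude 0) reduces to cos(lat) * cos(lon).
                if (Math.Cos(lat) * Math.Cos(lon) >= cosLimit) inside++;
            }
        }
        Console.WriteLine($"{inside} of {(long)width * height} pixels inside the FoV");
    }
}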
Any suggestions are welcome!

Kinematic-based world being messed up by marker movement

Hello, I have been developing a humble AR-based game in Unity3D. Until this point, I have been using Vuforia to deploy my scene on a (multi)marker. However, I have been doing tests with Kudan, and I'm quite happy with its tracking performance when using a marker.
http://i.imgur.com/nTHs6cM.png
My engine is based on raycast collisions rather than UnityEngine.Physics (almost everything is kinematic). I have stumbled into a problem: when I deploy my 3D environment on a marker using the Kudan engine, my whole physics gets messed up. If the marker is moved, the elements move with it and the axes seem to change with the marker, but my physics still responds to the old axis orientation. My characters always stand upright along the world Y axis (not the local one inside the marker). Another issue is that my player's 3D asset keeps switching between "standing" and "falling" states and eventually clips and falls through the floor (this is probably due to jitter in the camera detection).
http://i.imgur.com/ROn4uEz.png
One solution that comes to mind is to use a local coordinate system, but I hope there is an alternative, since I did not have to make any such corrections when I was using Vuforia.
Any links or feedback are appreciated.
You could use transform.InverseTransformPoint and transform.InverseTransformDirection combined with Quaternion.LookRotation to get the position and rotation of the Kudan camera relative to the MarkerTransformDriver object. This will allow you to position a camera in world space and keep whatever content you want to augment static at the Unity world origin.
// Camera pose expressed in the marker's local space:
Vector3 cameraPos = markerTransform.InverseTransformPoint(kudanCamera.position);
Quaternion cameraRot = Quaternion.LookRotation(markerTransform.InverseTransformDirection(kudanCamera.forward));
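Fleshed out a little, a hypothetical component built on those two calls might look like this (kudanCamera, markerTransform, and worldCamera are placeholder references you would assign in the inspector):

using UnityEngine;

public class MarkerRelativeCamera : MonoBehaviour
{
    public Transform kudanCamera;     // camera transform driven by Kudan tracking
    public Transform markerTransform; // the MarkerTransformDriver object
    public Transform worldCamera;     // camera rendering the static world-space scene

    void LateUpdate()
    {
        // Express the tracked camera's pose in the marker's local space...
        Vector3 localPos = markerTransform.InverseTransformPoint(kudanCamera.position);
        Quaternion localRot = Quaternion.LookRotation(
            markerTransform.InverseTransformDirection(kudanCamera.forward),
            markerTransform.InverseTransformDirection(kudanCamera.up));

        // ...and apply it to a world-space camera, so the scene (and the world
        // Y axis your kinematic physics relies on) stays fixed at the origin.
        worldCamera.SetPositionAndRotation(localPos, localRot);
    }
}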

Normalize Vector3.Distance based on rotation

I am trying to measure the distance between multiple positions, but I do not want rotation to affect the measurement. Conceptually, I want to record the starting transform and, on each update, track the distance traveled regardless of any change in rotation. I am using an HTC Vive controller; people tend to rotate their hands, and I want to control for this.
I've tried resetting the Euler angles, but this doesn't seem to work.
Here is an analogy that should help.
Think of it like drawing and measuring a line with a pencil: the tracked position is at the eraser, and I can hold the pencil in any number of ways, even changing my grip in the middle of drawing the line, but the line will remain straight and the measurement will remain accurate.
Any help is appreciated.
I believe the problem lies in the position you are tracking. It sounds like you are tracking the transform.position of one of the child elements of the Vive controller model, which leads to the situation you describe with the pencil-eraser analogy.
Depending on where your script is attached, you could either move it to the top-level element of the Vive controller, or alter it to track transform.parent.position instead, which shouldn't be affected by the rotation of someone's hand.
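As a minimal sketch, a script on the child element could accumulate the traveled distance from the parent's position like this (assuming the parent is the top-level controller transform):

using UnityEngine;

public class DistanceTracker : MonoBehaviour
{
    public float distanceTraveled; // running total in meters
    private Vector3 lastPosition;

    void Start()
    {
        lastPosition = transform.parent.position;
    }

    void Update()
    {
        // The parent's position ignores the hand's rotation about the grip,
        // so only actual translation adds to the total.
        Vector3 current = transform.parent.position;
        distanceTraveled += Vector3.Distance(current, lastPosition);
        lastPosition = current;
    }
}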

HTC Vive, Unity VR, change acceleration of controllers

I am doing some programming with the HTC Vive in Unity using the SteamVR plugin, and I have a simple program where I knock boxes around. I now want to add another functionality, but after searching for an hour I am no closer to figuring out how to approach this.
I want to change the acceleration of the controllers on a specific axis. For example, if I increased acceleration along the positive Y axis and flapped my arms up and down, the controllers would eventually end up at head level even though my arms are at my sides. How do I do this? Is there a way to change the acceleration of the controllers?
If I can't actually change the acceleration of the controllers, is there a way to offset the position of a controller permanently so it doesn't snap back? If so, I could achieve the same effect.
Thanks in advance.
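As far as I know, SteamVR will not let you alter the tracked pose itself, but the "permanent offset" idea from the question can be approximated by rendering a separate proxy object instead of the tracked controller model. A hypothetical sketch (trackedController and upwardGain are placeholders; only upward motion is amplified, so repeated flaps ratchet the proxy upward):

using UnityEngine;

public class FlapProxy : MonoBehaviour
{
    public Transform trackedController; // the real SteamVR controller transform
    public float upwardGain = 2f;       // example value: upward motion counts double

    private Vector3 lastTrackedPos;
    private float accumulatedLift;

    void Start()
    {
        lastTrackedPos = trackedController.position;
    }

    void Update()
    {
        Vector3 delta = trackedController.position - lastTrackedPos;
        lastTrackedPos = trackedController.position;

        // Amplify only the upward component; downward motion is tracked 1:1,
        // so an up-down flap leaves a net upward offset that never snaps back.
        if (delta.y > 0f)
            accumulatedLift += delta.y * (upwardGain - 1f);

        // Place the visible proxy at the tracked position plus the extra lift.
        transform.position = trackedController.position + Vector3.up * accumulatedLift;
    }
}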