Kinematic-based world physics messed up by marker movement - unity3d

Hello, I have been developing a humble AR-based game in Unity3d. Until now, I have been using Vuforia to deploy my scene on a (multi)tracker. However, I have been doing tests with Kudan, and I'm quite happy with its tracking performance when using a tracker.
http://i.imgur.com/nTHs6cM.png
My engine handles collisions with raycasts rather than "UnityEngine.Physics" rigidbody dynamics (almost everything is kinematic). I have stumbled into a problem: when I deploy my 3D environment on a tracker using the Kudan engine, my physics get messed up. If the marker is moved, the elements move with it and the axes change with the marker, but my physics still respond to the old axis orientation; my characters always stand upright along the world Y axis (not the local Y axis inside the tracker). Another issue is that my player's 3D asset keeps switching between "standing" and "falling" states and eventually clips and falls through the floor (probably due to jitter in the camera detection).
http://i.imgur.com/ROn4uEz.png
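To illustrate the kind of world-axis assumption that breaks here, a minimal sketch (names like groundMask are assumptions, not the actual project code):

// World-down raycast: breaks once the tracker tilts the scene's local axes.
bool IsGroundedWorld(Transform character, LayerMask groundMask)
{
    return Physics.Raycast(character.position, Vector3.down, 0.1f, groundMask);
}

// Marker-relative version: cast along the marker's local down instead.
bool IsGroundedLocal(Transform character, Transform marker, LayerMask groundMask)
{
    return Physics.Raycast(character.position, -marker.up, 0.1f, groundMask);
}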
One solution that comes to mind is to work in a local coordinate system, but I hope there is an alternative, since when I was using Vuforia I did not have to do any such corrections.
Any links or feedback are appreciated.

You could use transform.InverseTransformPoint and transform.InverseTransformDirection combined with Quaternion.LookRotation to get the position and rotation of the Kudan camera relative to the MarkerTransformDriver object. This lets you position a camera in world space and keep whatever content you want to augment static at the Unity world origin.
Vector3 cameraPos = markerTransform.InverseTransformPoint(kudanCamera.position);
Quaternion cameraRot = Quaternion.LookRotation(markerTransform.InverseTransformDirection(kudanCamera.forward));
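A minimal per-frame sketch of that idea, assuming kudanCamera is the tracker-driven transform and this component sits on the Unity camera that actually renders (all field names are assumptions):

using UnityEngine;

public class MarkerRelativeCamera : MonoBehaviour
{
    public Transform kudanCamera;      // transform driven by the Kudan tracker
    public Transform markerTransform;  // the MarkerTransformDriver object

    void LateUpdate()
    {
        // Express the tracked camera pose in the marker's local frame so the
        // augmented content can stay static at the Unity world origin.
        Vector3 cameraPos = markerTransform.InverseTransformPoint(kudanCamera.position);
        Quaternion cameraRot = Quaternion.LookRotation(
            markerTransform.InverseTransformDirection(kudanCamera.forward),
            markerTransform.InverseTransformDirection(kudanCamera.up));
        transform.SetPositionAndRotation(cameraPos, cameraRot);
    }
}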

Related

Is there a way to add dynamic clipping to cameras besides the scene view camera?

I have recently begun to develop an XR experience within Unity. Everything looked fine in the scene view, but when I ran it on the headset there was horrendous z-fighting. I learned that a fix for this is raising the camera's minimum clipping plane to 0.3-0.7 and dropping the maximum to about 1000. However, this has the side effect of clipping into objects that come within about half a meter of the camera. I noticed the scene camera has no z-fighting at all, and I wonder whether that is due to its dynamic clipping ability and how I can re-create this effect within my project.
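As far as I know, Unity does not expose the scene view's dynamic clipping at runtime, but a rough approximation is to adapt the near plane to the closest geometry in front of the camera. A sketch under that assumption (the 0.3-0.7 range comes from the fix described above; the scaling heuristic is mine):

using UnityEngine;

[RequireComponent(typeof(Camera))]
public class DynamicNearClip : MonoBehaviour
{
    public float minNear = 0.05f; // how far in the near plane may be pulled
    public float maxNear = 0.5f;  // default near plane when nothing is close

    void LateUpdate()
    {
        var cam = GetComponent<Camera>();
        // Pull the near plane in when geometry is close, push it out otherwise
        // to keep depth precision high and z-fighting low.
        if (Physics.Raycast(transform.position, transform.forward, out RaycastHit hit, maxNear))
            cam.nearClipPlane = Mathf.Clamp(hit.distance * 0.5f, minNear, maxNear);
        else
            cam.nearClipPlane = maxNear;
    }
}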

Can Vuforia track spatial location when using targetless device tracking?

I am trying to wrap my head around Vuforia's capabilities. I want to make an app which lets me place a 3D object into a camera view and have that 3D object stick to the world. I've been learning how to use Vuforia in Unity3D, and Vuforia seems to be slightly capable of this, but is severely limited by its craving for "Targets". It doesn't seem to be able to do much if I don't give it some sort of target.
One workaround I've found is to set the ARCamera's World Center Mode to DEVICE_TRACKING. This seems to let me place a 3D object into the world (in Unity) and have it overlaid on the camera feed, almost making it seem anchored to the real world. This doesn't work perfectly, though: it tracks properly when I angle the device up/down/left/right (rotation), but it does not seem to track the device's translational motion; that is, when I move the device forward/back/left/right, the overlaid object doesn't get closer/farther, nor does it rotate as I move around it.
Is it possible to get this sort of tracking out of Vuforia, or am I better off switching to something like Google Tango?
The difficulty with setting World Center Mode to CAMERA in Vuforia is that 3D objects apparently rotate around the camera based on its accelerometer/gyroscope changes. This doesn't allow objects to be anchored to the environment; instead they follow the camera.
Kudan is a good markerless tracking option.

Camera-Offset | Project Tango

I am developing an augmented reality app for Project Tango using Unity3d.
Since I want virtual objects to interact with the real world, I used the Meshing with Physics scene from the examples as my basis and placed the Tango AR Camera prefab inside the Tango Delta Camera (at relative position (0,0,0)).
I found out that I have to rotate the AR Camera up by about 17 degrees so the Dynamic Mesh matches the room, but there is still a significant offset from the camera's live preview.
I was wondering if anyone who has dealt with this before could share their solution for aligning the Dynamic Mesh with the real world.
How can I align the virtual world with the camera image?
I'm having similar issues. It looks like this is related to a couple of previously answered questions:
Point cloud rendered only partially
Point Cloud Unity example only renders points for the upper half of display
You need to take into account the color camera's offset from the device origin, which requires you to get the color camera pose relative to the device. You can't query this directly, but you can get the device pose in the IMU frame and the color camera pose in the IMU frame, and from those work out the color camera pose in the device frame. The links above show example code.
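In Unity matrix terms the composition is just the following (a sketch; obtaining the two IMU-frame poses is what the linked examples show):

// device_T_camera = inverse(imu_T_device) * imu_T_camera
Matrix4x4 ColorCameraInDeviceFrame(Matrix4x4 imuT_device, Matrix4x4 imuT_camera)
{
    return imuT_device.inverse * imuT_camera;
}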
You should be looking at something like (in Unity coordinates) a (0.061, 0.004, -0.001) offset and a 13-degree rotation up around the x axis.
When I try to use the examples, I get broken rotations, so take these numbers with a pinch of salt. I'm also seeing small rotations around y and z, which don't match what I'd expect.

How to control the movement of the main Camera in Unity3d

I am trying to make a mobile application with an AR (Augmented Reality) mode using Unity3D. I have connected my mobile device with my Unity3d program, and the camera works fine. But when I move the mobile device, the main camera inside the Unity program does not move along the same path that the mobile device moves. Does anyone know how to change or control the path of the main Camera in Unity3d?
This could be happening for a number of reasons: non-centered pivots or mismatched coordinate systems, for example.
Could you please specify which AR system you are using? As a side note, at work we recently had a project involving Unity3d and Metaio, and it was a nightmare to bend the system to do what we needed, especially when we had to do a lot of object positioning based on the local coordinate system.
When you refer to the orbit of the camera, I imagine the pivot of the camera is somehow offset and the camera is rotating around that offset. Or maybe the camera is a child of the actual Game Object that is controlled by the AR system, in which case this parent node acts as a pivot for the camera.
In the picture below you can see that the camera is away from the center point, and when it rotates it does so around that center point; in other words, the camera always tries to look at that center point, which gives that feeling of "orbiting" when it moves.
Here's the link to the image (I can't post pictures yet on this forum -.- )
http://i.stack.imgur.com/fIcY2.png
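If that offset-pivot theory is right, a quick test (a sketch, assuming the camera is tagged MainCamera) is to zero the camera's local offset under its AR-driven parent so rotation happens around the camera itself:

// Remove any local offset between the camera and its AR-controlled parent.
Transform cam = Camera.main.transform;
cam.localPosition = Vector3.zero;
cam.localRotation = Quaternion.identity;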

AR Overlay Accuracy in Google Project Tango

I am experimenting with overlaying augmented reality objects over a pass-through image from the rear camera in Unity.
Has anyone experimented with overlaying objects with accurate tracking? I've tweaked the movement scale to get somewhat decent results but rotation is still not accurate and drift is a big issue.
I've had good luck with the augmented reality sample that ships with the latest Tango. In my experience it does work the way you speculated: if you add items to the Unity scene, they are synced to motion detected by the device.
I believe the tracking and syncing functions have improved since you originally asked this question, because I've noticed an improvement since I got my Tango devkit a month or so ago. There was an update a week or so later, with an immediate improvement.
I have found that some scenes track better than others; it seems to help for there to be additional scenery to track. In my workspace, a fairly cluttered apartment, it tracks well, but in the neighboring identical apartment unit, which is currently vacant and empty, it does not track as well. That could also be a product of the blinds hanging in my unit that are not hanging in the vacant unit, filtering out additional infrared.
I'm experimenting with placing 3D objects over the real time input from the Tango color camera.
One problem here is that the hardware color camera points in a (strange) direction. I haven't been able to get the direction vector from the API so far. Your virtual camera for rendering the scene needs this rotation to render 3D objects properly.
There are augmented reality examples in Tango's Unity plugin:
https://developers.google.com/tango/apis/unity/unity-simple-ar
They solve this problem with a matrix that rotates the 3D camera.
It can be found in the Unity script "TangoARPoseController" (C#), which, when attached to a Unity camera, rotates it so that it looks at the scene in the right direction. The matrix is obtained in that script's "SetCameraExtrinsics" method.
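Applying such an extrinsic matrix to a camera might look roughly like this (a sketch in the spirit of SetCameraExtrinsics, not the plugin's actual code):

// Extract the rotation from a 4x4 extrinsic (device-to-color-camera) matrix
// and apply it to the camera that renders the virtual scene.
void ApplyExtrinsicRotation(Camera cam, Matrix4x4 deviceT_colorCamera)
{
    Quaternion rot = Quaternion.LookRotation(
        deviceT_colorCamera.GetColumn(2),  // forward (z) axis of the matrix
        deviceT_colorCamera.GetColumn(1)); // up (y) axis of the matrix
    cam.transform.localRotation *= rot;
}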
Unfortunately, when I apply the matrix to my Unity scene it does not produce a perfect overlay (actually it's quite bad). But I have other sources of position input which may be the problem here.
However, I'm not yet sure whether the matrix used in the examples is good enough for accurate AR overlays. Maybe it is only suitable for demonstration purposes, but it should be a good starting point for further investigation.
Are we talking about displaying the 'webcam' in the background as opposed to a skybox?
Take a look at my GhostHunter repo. It includes a shader and a script for displaying the rear-facing camera 'behind' the gameplay objects (like a skybox). It should be usable with Tango, and it is better than the 'display on a mesh' technique I've seen others use.
https://github.com/NVentimiglia/Augmented-Reality-Ghost-Hunter