How to get uniform readings from the gyroscope sensor across different devices? - unity3d

I'm developing an app based on AR and the orientation of the device. The problem is that different devices give me different data from the gyroscope, probably because of hardware differences. Is there a way to make the results uniform?
I also have the constraint of working in Unity rather than Android Studio or Xcode, so the methods I can use are limited.
I'm using Input.gyro.attitude to get the data, and the results are not the same across devices. Any suggestions?
Quaternion direction = Input.gyro.attitude;
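A common workaround (a community convention, not an official Unity guarantee) is to note that Input.gyro.attitude is a right-handed, device-frame quaternion, while Unity uses a left-handed, Y-up frame, so the raw attitude has to be remapped before devices become comparable. A minimal sketch, assuming the widely shared remapping:

using UnityEngine;

// Sketch: remap Input.gyro.attitude into Unity's coordinate system so that
// different devices report comparable orientations. The 90-degree X offset
// and the sign flips are the commonly used community remapping; verify the
// result on each target device.
public class GyroOrientation : MonoBehaviour
{
    void Start()
    {
        Input.gyro.enabled = true; // the gyroscope is off by default on mobile
    }

    void Update()
    {
        Quaternion att = Input.gyro.attitude; // right-handed, device frame
        // Flip z and w to change handedness, then rotate so that "device
        // held upright" maps to "camera looking forward".
        transform.rotation = Quaternion.Euler(90f, 0f, 0f)
                             * new Quaternion(att.x, att.y, -att.z, -att.w);
    }
}

If devices still disagree after the remapping, the usual cause is per-device sensor calibration rather than anything fixable in script.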

Related

Unity: Move player when the mobile moves (Android VR)

I'm developing a VR app using the Google Cardboard SDK.
I want to move through the virtual environment when I walk in the real world, like this: https://www.youtube.com/watch?v=sZG5__Z9pzs&feature=youtu.be&t=48
Is it possible to make a VR application like that for Android? Maybe using the accelerometer sensor? How can I implement this in Unity?
I tried recording the accelerometer while walking with the smartphone; here are the results: https://www.youtube.com/watch?v=ltPwS7-3nOI (the accelerometer values look almost random).
Actually, it is not possible with the phone alone:
You're up against a fundamental limitation of the humble IMU (the primary motion sensor in a smartphone).
I won't go into detail, but basically you need an external reference frame when trying to extract positional data from acceleration data: position comes from integrating acceleration twice, so any small bias or noise grows quadratically over time (the sketch at the end of this answer shows why). This is the topic of a lot of research right now, and it's why VR headsets that track position, like the Oculus Rift, have external tracking cameras.
Unfortunately, what you're trying to do is impossible without using the camera on your phone to track visual features in the scene and use those as the external reference point, which is a hell of a task better suited to a lab full of computer vision experts.
Another possible but difficult approach:
It may be possible if you connect the device to the internet and track its position via satellite positioning (Google Maps or something like that), but that is a very hard thing to do.
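To make the failure mode concrete, here is a deliberately naive Unity sketch that double-integrates the user acceleration into a position. A constant accelerometer bias b becomes a position error of 0.5 * b * t^2, so the tracked position runs away within seconds (the unit conversion below is an assumption, since Unity reports acceleration in units of g):

using UnityEngine;

// Deliberately naive dead reckoning - for illustration only, do not ship.
// Bias and noise in the accelerometer are integrated twice, so the
// position error grows quadratically with time.
public class NaiveDeadReckoning : MonoBehaviour
{
    Vector3 velocity = Vector3.zero;
    Vector3 position = Vector3.zero;

    void Start()
    {
        Input.gyro.enabled = true; // userAcceleration needs the gyro enabled
    }

    void Update()
    {
        float dt = Time.deltaTime;
        // Gravity-compensated acceleration; still contains bias and noise.
        Vector3 accel = Input.gyro.userAcceleration * 9.81f; // assumed g -> m/s^2
        velocity += accel * dt;    // first integration: noise becomes velocity drift
        position += velocity * dt; // second integration: drift grows quadratically
        transform.localPosition = position;
    }
}

Running this while the phone sits still on a table is enough to watch the position wander off, which is exactly the external-reference problem described above.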

Do VR headsets that use smartphones have their own sensors?

As far as I know, the Samsung Gear VR is the only VR headset that has its own head-tracking sensors and sends sensor data to its mounted smartphone (Galaxy series).
Am I getting this right?
The technical specifications of the ZEISS VR One say:
Tracking sensors : Internal tracking by smartphone sensors
Does this mean it has no built-in sensors?
Just like Google Cardboard, all the other low-cost VR headsets depend fully on smartphone sensors, right?
Cardboard-like headsets are just a way for your eyes to look at your phone in stereo mode. They don't provide any sensors (and very often not even an interaction button like the original Google Cardboard); these devices typically use your phone's gyroscope to track your orientation.
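For what it's worth, that phone-side head tracking amounts to very little code. A minimal Unity sketch, reusing the same frame remapping as the snippet in the first question above (the smoothing factor is an assumed tuning knob, not part of any SDK):

using UnityEngine;

// Sketch: Cardboard-style head tracking is essentially the phone's
// gyroscope driving the camera, with a little smoothing to hide jitter.
// Attach to the main camera.
public class CardboardStyleHeadTracking : MonoBehaviour
{
    [SerializeField] float smoothing = 0.2f; // 0 = raw gyro, close to 1 = sluggish

    void Start()
    {
        Input.gyro.enabled = true;
    }

    void Update()
    {
        Quaternion att = Input.gyro.attitude;
        Quaternion target = Quaternion.Euler(90f, 0f, 0f)
                            * new Quaternion(att.x, att.y, -att.z, -att.w);
        transform.rotation = Quaternion.Slerp(transform.rotation, target, 1f - smoothing);
    }
}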

How to access the iPhone camera using MATLAB Mobile

I have been searching for a way to access the iPhone camera from MATLAB, and I found that it can be reached through an app called IP Cam over a local network. However, the IP Cam app from the App Store doesn't work well for my application: I'm trying to build a real-time image-capture program using the iPhone's camera and MATLAB Mobile, with processing afterwards, and the IP Cam approach keeps MATLAB busy displaying the scene, while I want MATLAB, not IP Cam, to run in the foreground.
So far I've downloaded MATLAB Mobile and the connector and connected the iPhone to MATLAB on my laptop. Does anyone know how to access the iPhone's camera from MATLAB Mobile and capture an image so that it can be stored in the MATLAB workspace for later processing? Or can anyone suggest tutorials or other material to help me through this? I'd appreciate your answers very much; thank you in advance.
P.S.: A solution for Android devices would also work for me.
I am not aware of iPhone-based solutions, but here is an option for Android-based systems. I suggest you use Sensor EX. I have used it for some time to acquire accelerometer and gyroscope data along with live images. This tool has bindings for MATLAB, among other programming environments. Feel free to ask questions if you cannot figure out how this system works.

iOS 3D indoor navigation application

What are the steps needed to create an indoor 3D navigation application? I have some AutoCAD files for a building, and it would not be a problem to create a 3D model using 3ds Max. Inertial sensors will be used for localization, but after getting the model, how can I integrate it into iOS and create the visualization?
Depending on your complete requirements, I believe you will need OpenGL programming to create that 3D environment. For navigation, I would suggest using GPS to determine your location rather than inertial sensors, or perhaps a mix of both to reduce your errors. I am guessing you want to be able to locate yourself in a building where GPS, Wi-Fi, or 3G signals are not available; relying on inertial sensors alone would definitely be error-prone.

Improving iPhone AR (Tool)Kit by using the Gyroscope

I'm using iPhone AR Kit and its fork, iPhone AR Toolkit, but I'm trying to improve the user experience by using the gyroscope when it's available.
For those of you who have used the kits, do you have any ideas on how to do this? My first thought was to use the gyroscope yaw to get a more precise azimuth value.
So I have two questions:
Has anyone used the AR Kit linked above, and do you have thoughts on including the gyroscope in it?
Is it a good idea to mix gyroscope and compass data to get a more precise azimuth value?
Gyroscopes measure rotational velocity, so the gyro output will be the change in yaw per second (e.g. rad/s) rather than an absolute yaw. There are various methods for using gyros for "dead reckoning" of orientation, but in practice, while they're very accurate over the short term, integrating gyro read-outs to determine orientation "drifts" significantly, so you have to keep recalibrating against some absolute measure.
It would be fairly easy to use the gyro to interpolate between compass readings, or to calculate the bearing from the gyro alone during short, fast motions while the compass catches up, but properly fusing the compass and gyro isn't trivial. There's a talk here on sensor integration for Android that might be a good start. The standard method of fusing sensors is the Kalman filter; there's an introduction here. Kalman filters are fairly involved tools; you need a good model of your sensor errors, for example. A lighter-weight complementary filter, sketched below, is often a reasonable first step.
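The interpolate-between-compass-readings idea is essentially a complementary filter: trust the integrated gyro over short time scales and pull the estimate toward the compass over long ones. A minimal sketch, written as a Unity script since that's the environment used elsewhere on this page (the question itself targets the iPhone AR Toolkit, so treat this as pseudocode for the idea; the 0.98 gain and the choice of gyro axis are assumptions to tune per device):

using UnityEngine;

// Complementary filter fusing gyro yaw rate with the compass heading.
// Short-term: integrate the gyro (smooth, fast, but drifts).
// Long-term: pull toward the compass (noisy, slow, but absolute).
public class GyroCompassFusion : MonoBehaviour
{
    const float GyroWeight = 0.98f; // assumed tuning value
    float fusedHeading;             // degrees from magnetic north

    void Start()
    {
        Input.gyro.enabled = true;
        Input.compass.enabled = true;
        fusedHeading = Input.compass.magneticHeading;
    }

    void Update()
    {
        float dt = Time.deltaTime;
        // Yaw rate in deg/s. Which gyro axis corresponds to yaw depends on
        // how the device is held; z (the screen axis) is assumed here.
        float yawRate = -Input.gyro.rotationRateUnbiased.z * Mathf.Rad2Deg;

        float predicted = fusedHeading + yawRate * dt;
        // Shortest signed angle from the prediction to the compass reading.
        float error = Mathf.DeltaAngle(predicted, Input.compass.magneticHeading);
        fusedHeading = Mathf.Repeat(predicted + (1f - GyroWeight) * error, 360f);

        transform.rotation = Quaternion.Euler(0f, fusedHeading, 0f);
    }
}

A Kalman filter replaces the fixed GyroWeight with a gain recomputed each step from explicit noise models, which is where the extra complexity (and the better accuracy) comes from.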