I use Unity Remote 5 to connect my phone to the Editor. I am reading the values of Input.gyro.attitude.eulerAngles and am somewhat confused, because no matter how I orient my device, all three angles change. I was hoping to get a specific angle that reflects the camera's yaw rotation: an angle that changes only if I rotate my device as in the image below and stays constant if I rotate it any other way.
P.S. When I watch Debug.Log(gyroInp.attitude.eulerAngles), the Z angle keeps growing even though the phone is lying still on the table: it went from 276 to 350 and keeps climbing. Is it because the earth is turning? :)
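For reference, a minimal sketch of the kind of reading described (assuming a plain Input.gyro setup; note the gyro must be enabled before attitude values update at all):

```csharp
using UnityEngine;

// Minimal repro of the reading described above.
public class GyroLogger : MonoBehaviour
{
    void Start()
    {
        Input.gyro.enabled = true; // attitude stays frozen unless the gyro is enabled
    }

    void Update()
    {
        // All three Euler angles typically change under a single physical
        // rotation, because the Euler decomposition mixes the axes.
        Debug.Log(Input.gyro.attitude.eulerAngles);
    }
}
```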
Unity Remote 5 (UR5) seems to be broken for reading the gyro, as indicated by this (more specific) issue in Unity's issue tracker: https://issuetracker.unity3d.com/issues/ios-13-dot-0-rotations-around-device-y-axis-does-not-work-when-rotating-a-device-and-using-remote
If I use UR5 to read the gyro attitude/acceleration in the Editor, it gives an initial value that is never updated, only repeated on every subsequent read. An actual build on my (iOS 14) device works as expected.
So Unity Remote 5 is unfortunately broken for all things gyro; it may be an iOS-specific issue.
Firstly, I've built for the Quest and checked the Unity profiler - everything seems good.
The weird thing is, when I move my head from side to side I don't see any framerate drops in the profiler but I see a noticeable stutter/lag in the headset. When my head is still or even just rotating, everything looks great, there's no stutter at all.
This stutter only occurs when I'm close to my UI and moving my head. When my head is static or rotating, it's all good. If I move back from the UI, there's no stutter either.
I don't think that it's an issue with too complex geometry or materials, as surely this would show up as a framerate drop in the profiler. I'm guessing that the camera movement is causing some kind of frame mismatch which (for some weird reason) isn't showing up in the profiler.
Does anybody have any ideas as to what might be causing this?
Well, I found the culprit after narrowing the issue down to a specific set of GameObjects. It turned out to be a custom 'Dissolve' shader I was using. After setting the shaders back to 'Standard', the problem went away! Weird...
This happens if your game FPS is lower than the device refresh rate.
For example if the headset is displaying 90 frames per second and your game is only able to render 70 frames per second, the headset needs to reuse a rendered frame every few frames.
When the game doesn't deliver a rendered frame in time, the device takes the last rendered image and re-projects it to the new headset position and rotation. If only the rotation changed, you won't notice, because the headset can simply rotate the last image. But if the headset also moved, a flat image can't reproduce the parallax between close and far objects. So an object close to the camera (e.g. your hand, or a nearby UI panel) stays in the same place for two frames every few frames, which makes it look like it's stuttering.
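A minimal diagnostic sketch of this idea, assuming a 72 Hz Quest display (query your XR SDK for the actual refresh rate): log any frame whose delta time blows the per-frame budget, since those are the frames the headset has to cover with a re-projected image.

```csharp
using UnityEngine;

// Hypothetical frame-budget watcher: flags frames that arrive too late,
// i.e. the frames the headset will replace with a re-projected image.
public class FrameTimeWatcher : MonoBehaviour
{
    const float refreshRate = 72f; // assumed; Quest headsets run at 72-120 Hz

    void Update()
    {
        float budget = 1f / refreshRate;
        if (Time.unscaledDeltaTime > budget * 1.1f) // 10% tolerance
        {
            Debug.LogWarning($"Missed frame budget: {Time.unscaledDeltaTime * 1000f:F1} ms " +
                             $"(budget {budget * 1000f:F1} ms)");
        }
    }
}
```

A Profiler view averaged over many frames can hide occasional misses like these, which would match the symptoms described above.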
I am working on a roulette (casino-style) game project in Unity3D.
I am rotating a ball around a wheel, and the wheel is also rotating on its own axis, both in FixedUpdate.
I am using the transform.RotateAround function to rotate the ball around the wheel, and I am also decreasing the ball's speed in FixedUpdate. I assign the ball a random initial speed within a range so that it stops at a different position each time; see the sketch below.
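A minimal sketch of that setup (wheelCenter, the speed range and the deceleration value are hypothetical placeholders):

```csharp
using UnityEngine;

// Sketch of the ball described above: RotateAround in FixedUpdate with a
// random initial speed that decays to zero.
public class RouletteBall : MonoBehaviour
{
    public Transform wheelCenter;
    public float minSpeed = 300f;     // degrees per second, hypothetical range
    public float maxSpeed = 500f;
    public float deceleration = 40f;  // degrees per second squared

    float speed;

    void Start()
    {
        speed = Random.Range(minSpeed, maxSpeed);
    }

    void FixedUpdate()
    {
        if (speed <= 0f) return;
        transform.RotateAround(wheelCenter.position, Vector3.up, speed * Time.fixedDeltaTime);
        speed = Mathf.Max(0f, speed - deceleration * Time.fixedDeltaTime);
    }
}
```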
For testing purposes I kept the initial ball speed constant and verified in the Unity Editor that the ball stops on the same number on every spin.
I built this project for Android and for PC. The ball stops on the same number each time within each build, but the result differs between the two.
For example: the ball stops on number 8 every time on Android and on number 20 every time on PC.
Can somebody please suggest ways to obtain the same result on different devices?
Why is this happening? Does Unity's physics behave differently on different processors?
And please explain how to fix it.
Unity has a fixed time step, so that isn't the cause of the differences, as one might otherwise expect. Physics simulations are incredibly complex, so I won't pretend to know exactly why you're seeing differences; however, I would imagine it has to do with floating-point precision differences between your computer and a much smaller phone processor.
One way to test this would be to run the simulation on another computer and compare the results with those from both of your current devices.
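One hedged way to set up that comparison, assuming your ball script exposes where it settled: fix the random seed so every run starts identically, then log the result together with the hardware so the builds can be compared side by side.

```csharp
using UnityEngine;

// Hypothetical repeatability harness for cross-device comparison.
public class DeterminismProbe : MonoBehaviour
{
    void Awake()
    {
        Random.InitState(12345); // same seed => same "random" initial speed on every device
    }

    // Call from the ball script once it has stopped (hypothetical hook).
    public static void ReportResult(int pocketNumber)
    {
        Debug.Log($"Ball settled on {pocketNumber} " +
                  $"({SystemInfo.deviceModel}, {SystemInfo.processorType})");
    }
}
```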
I'm playing around with an app that overlays an image on top of an AV view, a sort-of-but-not-really AR app. You use it by holding the camera up in front of you like you're taking a picture of friends in front of you. Currently, it just displays a compass rose on a SceneKit object.
It basically works, but on my 5S and 6 I'm finding it is:
1. laggy on startup, with the pointing vector being off by as much as 180 degrees
2. not terribly accurate, with errors of something like 10 degrees even after it's been running
3. the error in (2) changes, so its idea of "south" moves around over time
Ultimately I'm going to need the accuracy to be on the order of 5 degrees. Can anyone comment on whether this sort of accuracy is possible, whether there's a way to test it, and perhaps on platform issues: is it better on the iPhone 7, for instance?
I remember a WWDC talk showing a teapot in OpenGL ES that rotated with the movement of the device, so that the teapot appeared to stand still in space.
When the app launched, the teapot started in a specific position. Then, when the device was rotated, the teapot rotated too, so as to stand still in space.
In that talk, they mentioned that we must capture a "reference frame", e.g. upon app launch, which tells us how the user initially held the device.
For instance here's the accelerometer axis:
I want to know rotation around Y axis, but relative to how the user holds the device.
So when the user holds it upright and rotates around Y, I need to know that rotation value.
I think the key is removing gravity from the readings? Also, I'm targeting the iPhone 4 / 4S, which have gyros, but I think Core Motion would sensor-fuse them automatically.
How could I figure out by how much the user rotated the device around the Y-axis?
From your other question, Why is this CMDeviceMotionHandler never called by CoreMotion?, I know that you're working on iOS 4; things have changed slightly in iOS 5. In general, gyro data, or better yet the sensor fusion of accelerometer and gyro data as done in DeviceMotion, is the best approach for getting proper results.
Once you have this up and running, you will need CMAttitude's multiplyByInverseOfAttitude method to get all CMDeviceMotion results relative to your reference frame. Just store the very first CMAttitude in a class member and call multiplyByInverseOfAttitude with it on all subsequent readings. All members of CMDeviceMotion.attitude will then refer to that reference frame.
For getting the rotation around the Y axis, a first approach is to take the Euler angles, i.e. CMAttitude.roll. If you just need to track small motions, this may be fine. If the motions are more extensive, you will run into trouble with gimbal lock; then you need more advanced techniques like quaternion operations to get stable results, but that sounds like a question of its own.
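Since this thread sits alongside the Unity questions above, here is the same reference-frame trick sketched in Unity C# (Input.gyro wraps the same Core Motion sensor fusion); in Objective-C you would call multiplyByInverseOfAttitude on each new CMAttitude instead of using Quaternion.Inverse. Which Euler component maps to "around Y" depends on your frame conventions, so treat the log line as illustrative.

```csharp
using UnityEngine;

// Reference-frame sketch: store the first attitude, then express every later
// attitude relative to it. The Quaternion.Inverse(...) * current product is
// the C# analogue of CMAttitude's multiplyByInverseOfAttitude.
public class RelativeRotation : MonoBehaviour
{
    Quaternion reference;
    bool hasReference;

    void Start()
    {
        Input.gyro.enabled = true;
    }

    void Update()
    {
        if (!hasReference)
        {
            reference = Input.gyro.attitude; // how the user initially holds the device
            hasReference = true;
            return;
        }

        Quaternion relative = Quaternion.Inverse(reference) * Input.gyro.attitude;

        // First approach from the answer: read an Euler angle. Fine for small
        // motions, but subject to gimbal lock for larger ones.
        Debug.Log($"Rotation since launch (Euler): {relative.eulerAngles}");
    }
}
```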
I don't have an iPhone 4 with me right now, and I am trying to find documentation that shows the ranges of yaw, pitch and roll and the corresponding positions of the device.
I know that the accelerometer varies from -1 to +1, but my tests yesterday on my iPhone showed that roll varies from -M_PI to +M_PI, while the yaw and pitch ranges are half of that. Is this correct?
Where do I find documentation about these ranges? I don't see it in Apple's rather vague docs.
Thanks.
This is not a full answer, but in the interest of starting the ball rolling:
I'm assuming you are talking about the device attitude rather than the raw gyro data.
Anecdotally (I have an iPod touch 4th gen sitting in front of me displaying these values):
pitch: looks to be a range of -(M_PI/2) -> +(M_PI/2) although mine caps at ~ +1.55 / -1.51
roll: -M_PI -> +M_PI
yaw: -M_PI -> +M_PI
Just a note: at least on my device, pitch doesn't differentiate between tilting "forward" and "backward"; it just gives the angle of the device relative to the direction of gravity. To figure out whether the screen is pointed down or up, you can of course check gravity.z.
If you're using CMDeviceMotion, it has a property called gravity; just grab gravity.z. It will be negative if the device's display is tilting upward (away from gravity) and positive if the display is facing down (toward gravity).
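For completeness, the same check sketched in Unity C# (Input.gyro.gravity mirrors CMDeviceMotion.gravity; the sign convention below follows the answer above and should be verified on a device):

```csharp
using UnityEngine;

// Face-up vs. face-down check via the z component of the gravity vector.
public class FaceDownDetector : MonoBehaviour
{
    void Start()
    {
        Input.gyro.enabled = true;
    }

    void Update()
    {
        float gz = Input.gyro.gravity.z;
        // Assumption: negative z => display tilted up (away from gravity),
        // positive z => display facing down (toward gravity), as stated above.
        Debug.Log(gz > 0f ? "Display facing down" : "Display facing up");
    }
}
```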
Note that the algorithms used by CMDeviceMotion are pretty good at separating gravity from user acceleration, but under certain kinds of motion there may be some lag before the values settle, so I would love to hear from someone with a better solution.
I recently faced the same problem with an iOS app that counts the number of flips the phone does. Apple rejected it, so I have published it on GitHub; it may be useful for you:
Flip Your Phone! - https://github.com/apascual/flip-your-phone
I never thought of the solution using the gravity Z variable; I will try it soon and come back with some updates.