How can I detect whether an object in the space is currently stable or not, and also get its position?
Which parameter gives us that information, or is there a function for it?
If its position is constant through time, then it's stable. If not - it's not stable.
Just check the body's velocity and, if needed, its acceleration. If you want to be super accurate in checking whether your body is static, use:
if (cpveql(body->v, cpvzero))
    theyAreEqualDoSomethingFunction();
However, as the documentation warns: "Be careful when comparing floating point numbers!"
So you might be better off checking whether the absolute values of body->v.x and body->v.y are smaller than some small precision value.
As mentioned earlier, to be super precise you should also check acceleration.
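A minimal sketch of that threshold check, assuming the older Chipmunk API where body->v is a public cpVect field; EPSILON is an arbitrary tolerance you tune for your game:

#include <math.h>
#include <chipmunk/chipmunk.h>

#define EPSILON 1e-3

/* Treat the body as "at rest" when both velocity components are below
   the tolerance, instead of requiring them to be exactly zero. */
static int bodyIsAtRest(const cpBody *body) {
    return fabs(body->v.x) < EPSILON && fabs(body->v.y) < EPSILON;
}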
Given a set of non-rotated AABB bounds, I'm hoping to create a simpler set of bounds from the original set that allows for a specified amount of inaccuracy.
I'm working with this in Unity with Bounds, but it's just basic AABB comparison stuff, nothing Unity-specific. I figure someone must have worked out a system for this at some point in the past, but I had no luck searching around. Encapsulating bounds are easy but this is harder, since you can't just iterate through each bounds one by one. Sometimes a simpler solution can only be seen by looking at the whole thing.
Fast performance isn't critical but would be nice. Inaccuracy is OK in both directions (i.e. the bounds may cover a little less than the actual size or a little more). If it helps, I can expect all bounds in the original set to be connected somewhere - no free-floating pieces in a separate group.
I don't expect anyone to write up a whole system to solve this, I'm more hoping that it's already been solved or that maybe there's an obvious process to achieve it that I haven't thought of yet.
This sounds like something that could be handled with the Surface Area Heuristic (SAH). SAH is commonly used in ray tracing to build better tree-like structures in which the triangles are stored. There are multiple sources discussing it in more depth; a good one is chapter 7.3 of Wald's thesis.
The basic idea of the SAH build is to start with the whole space and divide it recursively. The division position is decided by sweeping through all reasonable positions and calculating the surface area of both child nodes. The reasonable positions are those where any triangle has its upper or lower bound. After sweeping through all the candidates, the division with the smallest total surface area in the children is used.
If SAH is not a good fit for your application, you could use a similar sweep through all candidates but score each split differently, for example by the amount of extra empty space inside the AABBs.
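A minimal sketch of that sweep for a flat set of 3D AABBs, with struct and helper names of my own invention: every box edge on one axis is tried as a split position, the boxes are partitioned by their centers, and the split with the smallest summed surface area of the two enclosing boxes wins.

#include <float.h>

typedef struct { float min[3], max[3]; } AABB;

static float surface_area(AABB b) {
    float d[3] = { b.max[0]-b.min[0], b.max[1]-b.min[1], b.max[2]-b.min[2] };
    return 2.0f * (d[0]*d[1] + d[1]*d[2] + d[2]*d[0]);
}

static AABB merge(AABB a, AABB b) {
    AABB r;
    for (int i = 0; i < 3; ++i) {
        r.min[i] = a.min[i] < b.min[i] ? a.min[i] : b.min[i];
        r.max[i] = a.max[i] > b.max[i] ? a.max[i] : b.max[i];
    }
    return r;
}

/* Sweep every box edge on 'axis' as a candidate split position; return the
   position whose two enclosing child boxes have the smallest total surface
   area. Recurse on the two halves to build the simplified set. */
static float best_split(const AABB *boxes, int n, int axis, float *outCost) {
    float bestPos = 0.0f, bestCost = FLT_MAX;
    for (int i = 0; i < n; ++i) {
        for (int side = 0; side < 2; ++side) {
            float pos = side ? boxes[i].max[axis] : boxes[i].min[axis];
            AABB left, right;
            int nl = 0, nr = 0;
            for (int j = 0; j < n; ++j) {
                float c = 0.5f * (boxes[j].min[axis] + boxes[j].max[axis]);
                if (c < pos) { left  = nl++ ? merge(left,  boxes[j]) : boxes[j]; }
                else         { right = nr++ ? merge(right, boxes[j]) : boxes[j]; }
            }
            if (nl == 0 || nr == 0) continue;   /* degenerate split, skip */
            float cost = surface_area(left) + surface_area(right);
            if (cost < bestCost) { bestCost = cost; bestPos = pos; }
        }
    }
    if (outCost) *outCost = bestCost;
    return bestPos;
}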
I understand that most textures are normalized except GL_TEXTURE_RECTANGLE.
However, I can't find information on GL_TEXTURE_EXTERNAL_OES. Are the coordinates normalized or in the range of [0, imageWidth], [0, imageHeight]?
I would also appreciate it if you could mention where you got the information from. I couldn't find it on the Khronos website.
They use normalized texture coordinates. You can address them with texture coordinates in the range [0.0, 1.0]. While it might have been nice to point that out in the extension spec, they probably thought it was not necessary because it's just like all other textures in OpenGL ES.
Source: Tried it on a Kindle Fire HDX 7" tablet.
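As a quick illustration of what that means in practice (a sketch; the texture id and array names here are mine), a quad covering the whole external image uses coordinates from 0.0 to 1.0, regardless of the image's size in pixels:

glBindTexture(GL_TEXTURE_EXTERNAL_OES, externalTex);

/* Normalized coordinates: 1.0 maps to the full image width/height. */
static const GLfloat fullImageTexCoords[] = {
    0.0f, 0.0f,
    1.0f, 0.0f,
    0.0f, 1.0f,
    1.0f, 1.0f,
};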
Like you I frustratingly couldn't quickly find a definitive statement. However...
The extension documentation for OES_EGL_image_external mentions both that:
Their default min filter is LINEAR. It is an INVALID_ENUM error to set the min filter value to anything other than LINEAR or NEAREST.
And:
The default s and t wrap modes are CLAMP_TO_EDGE and it is an INVALID_ENUM error to set the wrap mode to any other value.
Which are pretty clear clues that coordinates aren't normalised if you're used to dealing with non-power-of-two textures. Indeed the whole tenor of the extension — that one to three hardware sampling units may be used, that some varyings may be lost and that only a single level-of-detail is permitted — strongly reserves the right for an implementation to do the exact same thing as if you'd sampled Y, U and V separately from non-power-of-two sources and combined them arithmetically yourself.
But in terms of providing a thorough finger-on-paper answer: CLAMP_TO_EDGE is defined by the appropriate man page as:
GL_CLAMP_TO_EDGE causes coordinates to be clamped to the range [1/2N, 1 - 1/2N], where N is the size of the texture in the direction of clamping.
... which, again, makes little sense if coordinates were normalised (though it wouldn't actually be undefined).
So I'm willing to gamble strongly that they're not normalised.
And the reverse. It would not be difficult to write my own. But I'd rather use Apple's if there is one.
There is nothing that does this; they are inherently different mathematical ideas. A size is an absolute measure of some quantifiable quality of an object, while a point is an exact location in a coordinate space relative to a particular frame of reference.
I'm trying to use the accelerometer to move a UIImage. It works well, but my problem is that with my code
self.character.center = CGPointMake(160+acceleration.x*175, 230-acceleration.y*175);
my picture moves even on a stable surface because of the precision of the acceleration.x value. So I decided to use a workaround: multiply it by a large value, cast it to an int, then divide it and cast it back to a float (i.e. I just drop some of the digits after the decimal point).
self.character.center = CGPointMake(160+(float)((int)((acceleration.x*100000))/100000)*175, 230-(float)((int)((acceleration.y*100000))/100000)*175);
But after using this code, my little picture isn't moving anymore.
So my question is: do you know why it doesn't work anymore?
Is there a proper way to drop digits after the decimal point in a float?
Thanks a lot
Fred.
(The reason your picture stops moving is that (int)(acceleration.x*100000)/100000 is an integer division; since acceleration.x usually stays between -1 and 1 when the device is only tilted, the result is always 0 and the center never changes.) Instead of trying to drop digits after the decimal point, you would do better to use a low-pass filter. A low-pass filter only passes changes to your acceleration that happen below a certain cutoff frequency, so it keeps steady changes to the acceleration but removes fluctuations and jitter at very high frequencies.
Wikipedia has a good explanation of how a simple RC low-pass filter works and shows a possible implementation. Apple shows a similar implementation in the AccelerometerGraph sample code.
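A minimal sketch of such a filter (the constant, variable, and function names are mine, not Apple's):

/* Simple exponential low-pass filter. kFilterFactor is a tuning constant:
   smaller values give smoother but slower-reacting output. */
#define kFilterFactor 0.1f

static float filteredX = 0.0f, filteredY = 0.0f;

void onAccelerometerSample(float rawX, float rawY) {
    filteredX = rawX * kFilterFactor + filteredX * (1.0f - kFilterFactor);
    filteredY = rawY * kFilterFactor + filteredY * (1.0f - kFilterFactor);
    /* Position the image from the filtered values, e.g.
       160 + filteredX * 175, 230 - filteredY * 175 */
}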
I want to create an application that can detect the number of spins when the user rotates the iPhone. Currently I am using the Compass API to get the angle and have tried several ways to detect a spin. Below is the list of solutions I've tried:
1/ Place two angle "traps" (fixed headings on the circle) and check whether the angle we get from the compass has passed them.
2/ Sum the angular distance between successive compass updates (in the updateHeading function), then divide the total by 360 to get the number of spins (a sketch of this accumulation is shown below).
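A minimal sketch of approach 2 in plain C (the callback and variable names are mine; the heading is assumed to arrive in degrees in [0, 360)):

#include <math.h>

static double totalDegrees = 0.0;
static double lastHeading  = -1.0;   /* -1 means "no sample yet" */

/* Call this from the compass update callback with each new heading. */
void onHeadingUpdate(double heading) {
    if (lastHeading >= 0.0) {
        double delta = heading - lastHeading;
        /* Wrap the delta into (-180, 180] so crossing 0/360 is handled. */
        if (delta >  180.0) delta -= 360.0;
        if (delta < -180.0) delta += 360.0;
        totalDegrees += delta;
    }
    lastHeading = heading;
}

int spinCount(void) {
    return (int)(fabs(totalDegrees) / 360.0);
}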
The problem is: when the phone is rotated too fast, the compass cannot keep up with the speed of the rotation, and it only reports the angle at the latest update time (not continuously, as in the real rotation).
We also tried to use the accelerometer to detect spins. However, this approach cannot work when you rotate the phone on a flat plane.
If you have any solution or experience on this issue, please help me.
Thanks so much.
The iPhone 4 contains a MEMS gyroscope, so that's the most direct route.
As you've noticed, the magnetometer has sluggish response. This can be reduced by using an anticipatory algorithm that uses the sluggishness to make an educated guess about what the current direction really is.
First, you need to determine the actual performance of the sensor. To do this, you need to rotate it at a precise rate at each of several rotational speeds, and record the compass behavior. The rotational platform should have a way to read the instantaneous position.
At slower speeds, you will see a varying degree of fixed lag. As the speed increases, the lag will grow until it approaches 180 degrees, at which point the compass will suddenly flip. At higher speeds, all you will see is flipping, though it may appear to not flip when the flips repeat at the same value. At some of these higher speeds, the compass may appear to rotate backwards, opposite to the direction of rotation.
Getting a rotational table can be a hassle, and ensuring it doesn't affect the local magnetic field (making the compass useless) is a challenge. The ideal table will be made of aluminum, and if you need to use a steel table (most common), you will need to mount the phone on a non-magnetic platform to get it as far away from the steel as possible.
A local machine shop will be a good place to start: CNC machines are easily capable of doing what is needed.
Once you get the compass performance data, you will need to build a model of the observed readings vs. the actual orientation and rotational rate. Invert the model and apply it to the readings to obtain a guess of the actual readings.
A simple implementation would be to keep a history of the readings and a list of the differences between sequential readings. Since we know there is compass lag, a non-zero difference value tells us the current value has some degree of inaccuracy due to lag.
The next step is to create a list of 'corrected' readings, where the known lag of the prior actual values is used to generate an updated value; that value is added to the last entry in the 'corrected' list and stored as the newest value.
When the cumulative correction (the difference between the latest values in the actual and corrected lists) exceeds 360 degrees, it means we basically don't know where the compass is pointing. Hopefully that point won't be reached, since most rotational motion should be of fairly short duration.
However, since your goal is only to count rotations, you will be off by less than a full rotation until the accumulated error reaches a substantially higher value. I'm not sure what this value will be, since it depends on both the actual compass lag and the actual rate of rotation. But if you care only about a small number of rotations (5 or so), you should be able to obtain usable results.
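A very rough sketch of that history/correction idea (all names are mine, and lag_correction() stands in for the inverted lag model you would have to fit to your own calibration data):

#define HISTORY 256

static double rawHeading[HISTORY];
static double corrected[HISTORY];
static int    samples = 0;

/* Placeholder lag model: assumes the compass under-reports fast changes by
   a fixed fraction. Replace with a model fitted from calibration runs. */
static double lag_correction(double delta) {
    return 0.25 * delta;
}

void addReading(double heading) {
    if (samples >= HISTORY) return;          /* keep the sketch simple */
    rawHeading[samples] = heading;
    if (samples == 0) {
        corrected[0] = heading;
    } else {
        double delta = heading - rawHeading[samples - 1];
        /* A non-zero delta means the reading lags the true motion, so extend
           the corrected series by the delta plus the estimated lag. */
        corrected[samples] = corrected[samples - 1] + delta + lag_correction(delta);
    }
    samples++;
}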
You could use the rate of change of the acceleration to estimate how fast the phone is spinning and use that to fill in the blanks until the phone has stopped, at which point you could query the compass again.
If you're using an iPhone 4, the problem has been solved and you can use Core Motion to get rotational data.
For earlier devices, I think an interesting approach would be to try to detect wobbling as the device rotates, using UIAccelerometer on a very fine reporting interval. You might be able to get some reasonable patterns detected from the motion at right angles to the plane of rotation.
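For the Core Motion route mentioned above, here is a minimal sketch (the Core Motion class and property names are Apple's; the integration logic and variable names are mine, and the motion manager should normally be kept in a strong property so it isn't deallocated):

#import <CoreMotion/CoreMotion.h>

// Requires a gyro-equipped device (iPhone 4 or later) and iOS 4+.
CMMotionManager *motion = [[CMMotionManager alloc] init];
double interval = 1.0 / 60.0;
motion.gyroUpdateInterval = interval;
__block double totalRadians = 0.0;
[motion startGyroUpdatesToQueue:[NSOperationQueue mainQueue]
                    withHandler:^(CMGyroData *gyroData, NSError *error) {
    if (error != nil || gyroData == nil) return;
    // rotationRate.z is in radians per second about the screen normal.
    totalRadians += gyroData.rotationRate.z * interval;
    int spins = (int)(fabs(totalRadians) / (2.0 * M_PI));
    NSLog(@"full rotations: %d", spins);
}];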