How to count the total number of current touches?

Is it possible to know the total number of touches currently active globally in an app? What I do now is a workaround: I keep an integer that is incremented on every "onTouchBegan" and decremented on every "onTouchEnded". The problem is that this method is not exact: if you touch the screen in a chaotic manner, the counter misses some touch ends...

Why can't you use onTouchesBegan? Its argument is a vector of the touches currently on the screen (in the multitouch case).

In a multitouch-enabled cocos2d-x app, the size of the "std::vector<Touch*>& touches" argument is the number of touches.
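A more robust alternative to the asker's bare counter is to track the set of live touch IDs, since removing an ID that was never added (or was already removed) is a no-op and cannot drive the count negative. This is a minimal sketch with plain ints standing in for cocos2d-x touch IDs (the class and method names here are illustrative, not cocos2d-x API):

```cpp
#include <cstddef>
#include <set>

// Track live touch IDs instead of a raw counter that can drift when
// touch-end events are missed or duplicated. cocos2d-x provides a
// unique ID per touch (Touch::getID()); ints stand in for those here.
class TouchTracker {
public:
    void touchesBegan(const std::set<int>& ids) {
        for (int id : ids) live_.insert(id);
    }
    void touchesEnded(const std::set<int>& ids) {
        // erase() is a no-op for unknown IDs, so a duplicate or
        // missed "ended" event cannot make the count go negative.
        for (int id : ids) live_.erase(id);
    }
    std::size_t currentTouchCount() const { return live_.size(); }
private:
    std::set<int> live_;
};
```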

How to make UISlider stepless

I am wondering if there is a way to make a UISlider move smoothly like the one in the Music app, without steps or jumps; it should just glide to the end. I am currently using CADisplayLink to update the slider; the only problem is that the slider jumps to the next value even when animated is set to true. This looks bad with values under 2 min.
I am a bit confused by your question, but if your slider is erratic and jumps to values the user did not set, I would recommend increasing the size of the element on your view controller.
If that doesn't work, you could scale your minimumValue and maximumValue by a factor of ten and then divide by ten when actually doing calculations with your UISlider. What I mean by that is:
Multiply the min and max values by 10
Keep a variable holding the actual value you want to use
Save your value and divide by 10 whenever the user sets a new value on the UISlider
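The scale-by-ten trick above can be sketched as a pair of tiny conversion helpers. The function names are made up for illustration; the idea is simply that the slider's range is widened by a constant factor and readings are divided back down:

```cpp
// Widen the slider's range by a constant factor so the thumb passes
// through ten times as many raw positions per real unit, then divide
// back down when reading the value for calculations.
const float kScale = 10.0f;

// value you actually care about -> value to assign to the slider range
float toSliderScale(float actual) { return actual * kScale; }

// raw slider value -> value to use in your calculations
float fromSliderScale(float raw) { return raw / kScale; }
```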

How to obtain iPhone's heading when it is rotated fast

I am programming an iPhone app in iOS 6 where I obtain the heading information and then, using simple calculations, convert that information to words, such as "You are facing west". My problem is: when I rotate my phone from (say) north to east, the phone announces 1. "You are facing north-east" (3-4 times), and then, when the phone finally settles, 2. "You are facing east". Since I am converting all the commands to speech, this turns out to be confusing and disturbing.
One way is to increase the value of the heading filter, but I can't do that since I need a precision of 20 degrees. I tried to use the timestamps of the headings, but to no avail. I tried usleep(), but that is no use either since it stops the entire program. Is there any way I could tell the phone to take information from the location manager at 1.5-second intervals, or can the location manager itself be programmed to check the heading only after the phone's rotation has stabilized?
Update: how I failed at using timestamps.
Suppose I move my phone quickly from north to east. The differences between timestamps look like this (all values in seconds; the first value can be arbitrarily large or even undefined since I am moving the phone for the first time. In this case it is 10 seconds, which means I rotated the phone from west to north 10 seconds earlier):
10, 0.0012, 0.012, 0.005 (the phone settles and the location manager is not called again). I do not know how to detect that I should take the value corresponding to 0.005.
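One way to detect the "settled" reading the asker is after is to report a heading only once it has stopped changing for a few consecutive updates. This is a hedged sketch (the class and thresholds are invented for illustration, and it ignores the 0/360 wraparound for brevity):

```cpp
#include <cmath>

// Report a heading only after it has stayed within `tolerance` degrees
// for `quietCount` consecutive updates; return -1 while still moving.
// Ignores the 0/360 degree wraparound for brevity.
class HeadingDebouncer {
public:
    HeadingDebouncer(double tolerance, int quietCount)
        : tolerance_(tolerance), quietCount_(quietCount) {}

    double update(double heading) {
        if (std::fabs(heading - last_) <= tolerance_) {
            ++quiet_;       // reading is holding steady
        } else {
            quiet_ = 0;     // still rotating; restart the quiet run
        }
        last_ = heading;
        return (quiet_ >= quietCount_) ? heading : -1.0;
    }
private:
    double tolerance_;
    int quietCount_;
    double last_ = 0.0;
    int quiet_ = 0;
};
```

Feeding the location manager's heading updates through a filter like this would suppress the intermediate "north-east" announcements and speak only the final settled direction.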

cocos2d in iphone z order

I am working on a game with many sprites in a layer; once you touch a sprite, it will do something.
My problem is a z-order issue: most of my sprites overlap, and when you touch overlapping sprites, the one behind (I think the one with the lowest z order) reacts instead of the one in front.
I need to understand more about z order in cocos2d.
How do I change the z order at runtime?
Thanks!
Z Order and the touch handling system are separate. Z Order controls only the visual layout of your layers. The touch handling system, however, relies on the priority assigned when you register with the touch dispatcher. If you register two layers that have the same priority with the touch dispatcher, the second layer will get the touches first, regardless of the Z ordering of the layers.
Here's the part that really confused me when I had the same issue. Whereas higher Z Order numbers sit on top visually, it's exactly the opposite with touches: LOWER priority numbers get the touches first. To keep my own sanity, I refactored my code so that, whenever possible, I added layers in the same order as the Z index anyway, so the touches on the top layers would behave intuitively.
When this isn't possible, I use the touch priority system, and I define constants so that I don't get confused later. To register for touches using the priority system, use the following:
-(void) registerWithTouchDispatcher {
    [[TouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:DEFAULT_TOUCH_PRIORITY swallowsTouches:YES];
}
Are you using the Touch Dispatcher added in 0.8?
It is a singleton that dispatches touches either like in v0.7.x (Standard) or one touch at a time (Targeted). The benefit of using Targeted touches is that you only receive the events (begin, move, end, cancel) of the touch that you claim. Using Targeted touches simplifies the touch-handling code in both multi-touch and single-touch games. Unless you need complex touch-handling code, using Targeted touches is recommended.
Another benefit of the new TouchDispatcher is that it has a priority queue, so you can alter the order of the touch consumers.
You can use the priority queue to accomplish what you want: set the priority on each Sprite to define which Sprites should respond to a touch first, and whether they should pass the touch on or swallow it, so that after one sprite handles the touch nothing else receives the event.
The Touches Test example in the cocos2d project is probably the best place to look: http://code.google.com/p/cocos2d-iphone/source/browse/trunk/tests/TouchesTest (especially the Paddle.m class)
In the onEnter method in your sprite class you can set the priority of the touch dispatcher:
- (void)onEnter
{
    [[TouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:YES];
    [super onEnter];
}
Delegates with lower priority values will respond before ones with higher values.
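The dispatch rule described above can be modeled in a few lines: delegates are tried in ascending priority order, and a swallowing delegate stops the walk. This is an illustrative sketch, not cocos2d's internals:

```cpp
#include <algorithm>
#include <string>
#include <vector>

// Illustrative model of targeted-touch dispatch: delegates are asked
// in ascending priority order (lower value = asked first), and a
// delegate that swallows the touch stops propagation.
struct Delegate {
    std::string name;
    int priority;   // lower value = asked first
    bool swallows;  // claims the touch, stopping propagation
};

std::vector<std::string> dispatchOrder(std::vector<Delegate> delegates) {
    std::stable_sort(delegates.begin(), delegates.end(),
                     [](const Delegate& a, const Delegate& b) {
                         return a.priority < b.priority;
                     });
    std::vector<std::string> touched;
    for (const auto& d : delegates) {
        touched.push_back(d.name);
        if (d.swallows) break;  // later delegates never see the touch
    }
    return touched;
}
```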
Think of it as layers of paper: 1 will be on top of 0, and 0 is somewhere above -5.
16 is above 14. That's all it is: higher numbers are drawn on top of lower numbers, whether you choose to use negative or positive numbers.

iPhone detecting touches on a map image

I have a static map image with a bunch of circles and squares on it that depict cities. I have loaded the image into an imageView that sits inside a scrollView, so that I can capture user touches and zoom/scroll across the map. My challenge is that I want to pop up a label whenever a user touches one of these circles/squares, telling them which city it is, and possibly load a detail view for the city. I figured I could pre-load all the relative CGPoints for the cities on the imageView map into a dictionary so I can reference them during a "touchesBegan" event, but I'm quickly getting in over my head and possibly going about this the wrong way.
So far everything is working and I can capture the CGPoint x and y coordinates of touches. The biggest issue I have is determining the proximity of the user touches to a discrete point I may have in the dictionary. In other words if the dictionary has "Boston = NSPoint: {235, 118};" how can I tell when a user is close to that point without making them repeat the touch until it is exact? Is there an easy way to determine if a user touch is "close" to a pre-existing point? Am I going about this the right way?
Any advice or slaps in the back of the head are welcome.
Thanks, Mike
You could use UIButtons to represent the cities. Then you'll get the standard touch, highlight, etc, behaviors with less effort. Adding the buttons as subviews on your map should cause them to scale and scroll along with the map.
If I understand correctly, you want to know whether the point at which the user tapped is "close" enough to a point marked as a city.
You would have to quantify "close", i.e. set a threshold distance: taps beyond it are too far, taps within it count as close.
Once you do that, calculate the Cartesian distance sqrt((x1-x2)^2 + (y1-y2)^2) for each element (read: dictionary entry with the x,y values for a city), store the results in another array, and take the minimum. The index of that minimum is the city closest to the tap, provided the distance is less than the threshold.
You can either use an R-tree, or calculate the proximity of the touch to each visible point in the current view. To calculate the proximity you would normally use the Pythagorean theorem, but in this case you can skip the square root because you are only comparing relative sizes. You can also declare a distance cutoff if you like, say 50 pixels, squared to 2500. So you would put each result into an object containing the distance and a reference to the point, add the objects to an NSMutableArray (skipping results beyond your cutoff), and select the minimum result.
So if you have a touched point pT, then for each point pN, you'd calculate:
d=(pT.x-pN.x)*(pT.x-pN.x) + (pT.y-pN.y)*(pT.y-pN.y); //d is the squared distance
The point pN with the minimum d is the point that was closest to pT. And like I said if you want only touches within 10 pixels to count, you can test that d <= 10*10;
The method of testing for touches within a 20x20 square area works too, except if two points are within 20 pixels of each other, then you need to know which is the closest touched point.
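The nearest-point search both answers describe can be sketched as follows. The struct and function names are invented for illustration; the logic is the squared-distance comparison with a cutoff radius from the answers above:

```cpp
#include <cstddef>
#include <string>
#include <vector>

struct City {
    std::string name;
    float x, y;  // position on the map image, in pixels
};

// Return the index of the city closest to the touch and within
// `radius` pixels of it, or -1 if none qualifies. Squared distances
// are compared directly, so no sqrt is needed for ranking.
int closestCity(const std::vector<City>& cities,
                float touchX, float touchY, float radius) {
    float best = radius * radius;  // cutoff, pre-squared
    int bestIdx = -1;
    for (std::size_t i = 0; i < cities.size(); ++i) {
        float dx = touchX - cities[i].x;
        float dy = touchY - cities[i].y;
        float d = dx * dx + dy * dy;  // squared distance to this city
        if (d <= best) {
            best = d;
            bestIdx = static_cast<int>(i);
        }
    }
    return bestIdx;
}
```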

Detect the iPhone rotation spin?

I want to create an application that detects the number of spins when the user rotates the iPhone. Currently I am using the Compass API to get the angle and have tried many ways to detect a spin. Here is the list of solutions I've tried:
1/ Create 2 angle traps (pieces of the full circle) and detect whether the angle we get from the compass passes them or not.
2/ Sum all the angular distances between compass updates (in the updateHeading function); dividing the summed angle by 360 gives the number of spins.
The problem is: when the phone is rotated too fast, the compass cannot keep up with the speed of the phone, and it returns only the angle at the latest time (not continuously, as in the real rotation).
We also tried to use the accelerometer to detect spins. However, that cannot work when you rotate the phone on a flat plane.
If you have any solution or experience with this issue, please help me.
Thanks so much.
The iPhone 4 contains a MEMS gyroscope, so that's the most direct route.
As you've noticed, the magnetometer has sluggish response. This can be reduced by using an anticipatory algorithm that uses the sluggishness to make an educated guess about what the current direction really is.
First, you need to determine the actual performance of the sensor. To do this, you need to rotate it at a precise rate at each of several rotational speeds, and record the compass behavior. The rotational platform should have a way to read the instantaneous position.
At slower speeds, you will see a varying degree of fixed lag. As the speed increases, the lag will grow until it approaches 180 degrees, at which point the compass will suddenly flip. At higher speeds, all you will see is flipping, though it may appear to not flip when the flips repeat at the same value. At some of these higher speeds, the compass may appear to rotate backwards, opposite to the direction of rotation.
Getting a rotational table can be a hassle, and ensuring it doesn't affect the local magnetic field (making the compass useless) is a challenge. The ideal table will be made of aluminum, and if you need to use a steel table (most common), you will need to mount the phone on a non-magnetic platform to get it as far away from the steel as possible.
A local machine shop will be a good place to start: CNC machines are easily capable of doing what is needed.
Once you get the compass performance data, you will need to build a model of the observed readings vs. the actual orientation and rotational rate. Invert the model and apply it to the readings to obtain a guess of the actual readings.
A simple algorithm implementation would keep a history of the readings, plus a list of the differences between sequential readings. Since we know there is compass lag, whenever a difference value is non-zero, we know the current value has some degree of inaccuracy due to lag.
The next step is to create a list of 'corrected' readings, where the known lag of the prior actual values is used to generate a corrected delta; that delta is added to the last value in the 'corrected' list and stored as the newest value.
When the cumulative correction (the difference between the latest values in the actual and corrected lists) exceeds 360 degrees, we basically don't know where the compass is pointing. Hopefully that point won't be reached, since most rotational motion should be of fairly short duration.
However, since your goal is only to count rotations, you will be off by less than a full rotation until the accumulated error reaches a substantially higher value. I'm not sure what this value will be, since it depends on both the actual compass lag and the actual rate of rotation. But if you care only about a small number of rotations (5 or so), you should be able to obtain usable results.
You could use the accelerometer data to estimate how fast the phone is spinning, use that to fill in the blanks until the phone has stopped, and then query the compass again.
If you're using an iPhone 4, the problem has been solved and you can use Core Motion to get rotational data.
For earlier devices, I think an interesting approach would be to try to detect wobbling as the device rotates, using UIAccelerometer on a very fine reporting interval. You might be able to get some reasonable patterns detected from the motion at right angles to the plane of rotation.