Sort an NSSet of UITouches by the y component of their touch point coordinates

I have an NSSet of UITouches.
I can get the coordinates of a touch as follows:
CGPoint touchPosition = [myTouch locationInView:myTouch.window];
My set contains 4 touches, and I need to sort them by the y component of their touchPosition.
What would be the best approach for doing this?
thanks
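One approach, as a hedged sketch: copy the set into an array and sort it with a comparator block. This assumes `touches` is your NSSet of UITouches and uses each touch's window as the reference view, as in the question:

```objc
// Sketch: sort the touches in ascending order of their y coordinate.
NSArray *sortedTouches = [[touches allObjects] sortedArrayUsingComparator:
    ^NSComparisonResult(UITouch *a, UITouch *b) {
        CGFloat ya = [a locationInView:a.window].y;
        CGFloat yb = [b locationInView:b.window].y;
        if (ya < yb) return NSOrderedAscending;
        if (ya > yb) return NSOrderedDescending;
        return NSOrderedSame;
    }];
```

`sortedTouches` then holds the four touches ordered top-to-bottom in the window's coordinate system.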

Related

applyImpulse towards CGPoint SpriteKit

I'm kind of a newbie to SpriteKit and game dev; as a matter of fact I'm just learning. I've got to a point where I want to move a bunch of nodes towards the user's tap location. So far I thought I might calculate a virtual right triangle and get the sine and cosine of its angles based on the sides. Unfortunately that left me with a very strong impulse that doesn't really take the user's tap location into account.
Any ideas?
Thanks in advance.
Look up the shooting projectiles section in the tutorial by Ray Wenderlich here:
http://www.raywenderlich.com/42699/spritekit-tutorial-for-beginners
Change the code from the tutorial as follows:
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    // 1 - Choose one of the touches to work with
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInNode:self];
    // 2 - Set up initial location of projectile
    SKSpriteNode *projectile = [self childNodeWithName:@"desirednode"];
    // Make projectile point to your desired node.
    // 3 - Determine offset of location to projectile
    CGPoint offset = rwSub(location, projectile.position);
    // 4 - Bail out if you are shooting down or backwards. You can ignore this if required.
    if (offset.x <= 0) return;
    // 5 - The tutorial adds a freshly created projectile here. Since this node
    // was fetched with childNodeWithName:, it is already in the scene, so
    // adding it again would raise an exception; skip this step in that case.
    // [self addChild:projectile];
    // 6 - Get the direction of where to shoot
    CGPoint direction = rwNormalize(offset);
    // 7 - Make it shoot far enough to be guaranteed off screen
    float forceValue = 200; // Edit this value to get the desired force.
    CGPoint shootAmount = rwMult(direction, forceValue);
    // 8 - Convert the point to a vector
    CGVector impulseVector = CGVectorMake(shootAmount.x, shootAmount.y);
    // This vector is the impulse you are looking for.
    // 9 - Apply impulse to node.
    [projectile.physicsBody applyImpulse:impulseVector];
}
The projectile object in the code represents your node. Also, you will need to edit the forceValue to get the desired impulse.
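The rw* helpers used above come from the same tutorial; they are small inline vector functions, roughly like this (names and signatures as in the tutorial, reproduced here as a sketch):

```objc
#import <math.h>

static inline CGPoint rwSub(CGPoint a, CGPoint b) {
    // Component-wise difference of two points.
    return CGPointMake(a.x - b.x, a.y - b.y);
}

static inline CGPoint rwMult(CGPoint a, float b) {
    // Scale a point by a scalar.
    return CGPointMake(a.x * b, a.y * b);
}

static inline float rwLength(CGPoint a) {
    // Euclidean length of the vector (a.x, a.y).
    return sqrtf(a.x * a.x + a.y * a.y);
}

static inline CGPoint rwNormalize(CGPoint a) {
    // Unit vector in the direction of a.
    float length = rwLength(a);
    return CGPointMake(a.x / length, a.y / length);
}
```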

Drawing Circle at CGPoint touchPoint

I am trying to draw a circle (small like a dot) at the exact point of a UITouch.
The problem is, when I get the touchPoint from the user touch, I have to then draw the Rect in which to place the circle.
To work out an origin for the rect, placing it such that a circle drawn within it has its centre at the original touch point, I have used Pythagoras' theorem.
However, this approach is not elegant and is not exact.
Is there a better way to do this?
UITouch *touch = obj;
CGPoint touchPoint = [touch locationInView:self.view];
CGContextAddEllipseInRect(context, CGRectMake(touchPoint.x - 5.7, touchPoint.y - 5.7, 9.0, 9.0));
You can do it like below:
CGFloat radius = 10;
UITouch *touch = obj;
CGPoint touchPoint = [touch locationInView:self.view];
CGContextAddEllipseInRect(context, CGRectMake(touchPoint.x - radius/2, touchPoint.y - radius/2, radius, radius));
Note that radius here is really the circle's diameter; subtracting half of it from each coordinate centres the rect on the touch point.
I think your approach is fine. What could be more elegant than Pythagoras? As for accuracy, add a few more digits to the constants and you can be more accurate than both the user's sense of their own finger position and the system calculations giving you the centroid of the touch.
What's more, I can't think of a good alternative. (Maybe one: do this with an image view, where the image is a circle, and set the view's anchor point so that it positions in the middle of the touch).
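For reference, no Pythagoras is needed: centring reduces to offsetting the rect's origin by the circle's radius in each direction. A minimal sketch, assuming a valid `context` and `touchPoint` as in the question:

```objc
// Centre a dot of the given radius exactly on the touch point.
CGFloat radius = 4.5;
CGRect dotRect = CGRectMake(touchPoint.x - radius,
                            touchPoint.y - radius,
                            radius * 2.0,   // width  = diameter
                            radius * 2.0);  // height = diameter
CGContextAddEllipseInRect(context, dotRect);
CGContextFillPath(context);
```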

touches began with multiple touches gives wrong position

I am currently using this code to get the locations of all the touches:
NSSet *allTouches = [event allTouches];
NSArray *allObjects=[allTouches allObjects];
for (int i = 0; i < [allObjects count]; i++)
{
    UITouch *touch = [allObjects objectAtIndex:i];
    CGPoint location = [touch locationInView:[touch view]];
    // Add to array....
}
While testing it on the simulator (don't have an iPad now to test it), it works perfectly with single-touch. But when trying with multiple-touches, the first iteration is correct while the second iteration doesn't give the correct position.
i.e.
First touch: (536,163) correct
Second touch: (198,608) but should be somewhere around (148,345)
I have a feeling that I should change something with [touch locationInView: [touch view]]; to give the right location but I don't know what to change.
Any help is appreciated.
It's difficult to understand the issue from a set of coordinates without seeing your views, any subviews and where you're touching in that, but you should know that:
[touch locationInView:aView] gives the coordinates of the touch in the coordinate system of aView. That is, the coordinates you see are relative to the top left of aView which may not be what you're expecting. Try [touch locationInView:self], which is more common.
Have a look at the "Events and Touches" section of the Event Handling Guide for iOS. You generally don't need to iterate over that collection if you're trying to track multiple touches. iOS handles all that for you. You can use properties tapCount and phase to get information about whether touches have moved, how many fingers are down and so on.
Does any of that help?
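To see the coordinate-system point concretely, here is a hedged sketch (assuming this sits in a view controller, so `self.view` is the root view) that logs each touch in two different coordinate systems; with touches landing in different subviews, the two values can differ even for the same finger:

```objc
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in [event allTouches]) {
        // Coordinates relative to the subview the touch landed in.
        CGPoint inOwnView  = [touch locationInView:touch.view];
        // Coordinates relative to one common view, comparable across touches.
        CGPoint inRootView = [touch locationInView:self.view];
        NSLog(@"own view: %@  root view: %@",
              NSStringFromCGPoint(inOwnView),
              NSStringFromCGPoint(inRootView));
    }
}
```

Converting every touch into the same view's coordinate system is what makes multi-touch positions directly comparable.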

How can I convert a value to a CGPoint?

I've got a problem with my Xcode project and hope you can help me!
I just began coding, so my problem should be very easy to solve.
I want to make a Pong game with two paddles and a ball. I have a value between 0 and 1; if the value is high, the paddle should also go up. The paddle position can be changed with a CGPoint, but how can I convert my value to a point?
Please help me.
Thanks and greets from Germany :)
CGPoint is a 2-dimensional point.
struct CGPoint {
    CGFloat x;
    CGFloat y;
};
typedef struct CGPoint CGPoint;
You can create a CGPoint with CGPointMake(x,y).
When your value is between 0 and 1 you may want to scale either x or y by multiplying with a constant factor.
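A hedged sketch of that scaling, assuming a hypothetical `paddle` view and mapping the 0..1 value onto the view's height (with 1 meaning "top", hence the inversion, since UIKit's y axis grows downwards):

```objc
// Map a normalized value in [0, 1] to a paddle y position.
CGFloat value = 0.75; // example input in [0, 1]
CGFloat maxY  = CGRectGetHeight(self.view.bounds);
// value == 1 puts the paddle at the top (y == 0), value == 0 at the bottom.
paddle.center = CGPointMake(paddle.center.x, (1.0 - value) * maxY);
```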
I have a better idea for the paddle control:
You can use the user's touch input to control the paddle:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint pointToMove = [touch locationInView:self.view];
    yourImage.center = CGPointMake(yourImage.center.x, pointToMove.y);
}
When the user touches the screen, the paddle moves to the y position of the touch but stays at its default x coordinate!
That way, the controls are more fluid. Just try it out!
Have fun programming :D
This is the exact code from my app to do this job:
for (id obj in points) {
    [self moveToPoint:[((NSValue *) obj) CGPointValue]];
}

iPhone touch screen events

In the iPhone SDK there are touchesBegan, touchesEnded and touchesMoved methods that are called when a touch event occurs. My problem is that when I put a finger on the screen and don't move it, there is no way to know whether there is a finger on the screen or not. Is there a way to get the current touch screen state?
Inside any of the touchesBegan, touchesEnded and touchesMoved methods I can use the following code to get the state of every touch, but outside them I can't:
NSSet *allTouches = [event allTouches];
for (NSUInteger index = 0; index < [allTouches count]; ++index)
{
    UITouch *touch = [[allTouches allObjects] objectAtIndex:index];
    if ([touch phase] != UITouchPhaseCancelled)
    {
        CGPoint touchPoint = [touch locationInView:self];
        // Do something with touchPoint, which has state [touch phase]
    }
}
You should keep a list of all points where the touches currently are. Make sure you don't just retain the UIEvent or UITouch objects (they're not guaranteed to live long); instead, create your own data structure. Odds are all you need is to keep track of the points where touches are currently down.
If you don't want to rework the design, wouldn't keeping a running list of active touches solve your problem?
NSMutableSet *touches;
In touchesBegan you add them, in touchesEnded and touchesCancelled you remove them...
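A minimal sketch of that bookkeeping, assuming a hypothetical UIView subclass that owns the set:

```objc
#import <UIKit/UIKit.h>

@interface TouchTrackingView : UIView
// Holds the UITouch objects currently down; query it from anywhere.
@property (nonatomic, strong) NSMutableSet *activeTouches;
@end

@implementation TouchTrackingView
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!self.activeTouches) self.activeTouches = [NSMutableSet set];
    [self.activeTouches unionSet:touches];   // add new touches
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.activeTouches minusSet:touches];   // remove lifted touches
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.activeTouches minusSet:touches];   // remove cancelled touches
}
@end
```

At any point, `[view.activeTouches count]` tells you how many fingers are down, even between touch callbacks.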
You might use a "global" object (i.e. a singleton) to track those and synchronize data access. I cannot see how querying this set would be different from asking UIKit directly for this information.
If you need information on touches on standard UIKit objects (i.e. UIButton), you probably need to subclass those - and I am not sure if you can get the desired effect with all classes (as they - or their subviews - could handle touches in any way they want).