Jittery movement when using CGAffineTransformMakeRotation - iPhone

I'm having a problem with jumping movement when using CGAffineTransformMakeRotation. It seems to work fine until I get to the top of quadrants 1 and 2. Below are the code I am using and a YouTube link that shows what happens. Any insight would be really appreciated.
My goal is to rotate a UIWebView with an svg inside. Since I can't easily detect touch on the UIWebView alone, I'm putting a blank UIImageView over it. This allows me to detect the touch and prevent the copy dialog from popping up.
http://www.youtube.com/watch?v=x_OmS0MPdEE&feature=youtu.be
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    NSArray *allTouches = [touches allObjects];
    CGPoint location = [[allTouches objectAtIndex:0] locationInView:self.view];
    if (selected == 1) {
        CGFloat rads = atan2f(location.y - (grid1.frame.size.height / 2),
                              location.x - (grid1.frame.size.width / 2));
        grid1.transform = grid1Btn.transform = CGAffineTransformMakeRotation(rads);
    }
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    NSArray *allTouches = [touches allObjects];
    CGPoint location = [[allTouches objectAtIndex:0] locationInView:self.view];
    if (CGRectContainsPoint(grid1Btn.frame, location)) {
        selected = 1;
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    selected = -1;
}

You use some frame.size.* values in your computations. Note that those values (unlike bounds.size.*) are affected by the transform. Use grid1.center instead; it is also more logical, since you rotate around the center.
The next thing to fix is that the view should not jump to a new angle the moment a touch begins :)
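In code, both fixes might look like this (a sketch, untested, assuming a new CGFloat startAngle ivar alongside selected):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    CGPoint location = [[touches anyObject] locationInView:self.view];
    if (CGRectContainsPoint(grid1Btn.frame, location)) {
        selected = 1;
        // Angle of the initial touch around the center, minus the rotation
        // the view already has (for a pure rotation, a = cos, b = sin).
        startAngle = atan2f(location.y - grid1.center.y,
                            location.x - grid1.center.x)
                   - atan2f(grid1.transform.b, grid1.transform.a);
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    CGPoint location = [[touches anyObject] locationInView:self.view];
    if (selected == 1) {
        CGFloat angle = atan2f(location.y - grid1.center.y,
                               location.x - grid1.center.x);
        // Apply only the change in angle since the touch began, so
        // nothing snaps when the finger lands.
        grid1.transform = grid1Btn.transform =
            CGAffineTransformMakeRotation(angle - startAngle);
    }
}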

The jerkiness is caused by the inherent inaccuracy of touch event locations.
Touch events that are located sufficiently close to the center of the frame have a high likelihood of straying to the diagonally opposite quadrant of the frame whilst a circle is being traced about the center of the frame.
This will result in a sudden jump of 90 degrees as observed in your video.
One way to avoid the problem is to introduce a dead zone about the center of the frame. Any touch that is located within a given radius of the center of the frame would not trigger the rotation to be recalculated.
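A sketch of that guard inside touchesMoved:, using grid1.center as the pivot (per the other answer); the constant's name and value are mine and would need tuning:
#define kDeadZoneRadius 30.0f // assumed value; tune to taste

CGFloat dx = location.x - grid1.center.x;
CGFloat dy = location.y - grid1.center.y;
// Ignore touches near the pivot, where small positional errors
// become large angular errors.
if (sqrtf(dx * dx + dy * dy) < kDeadZoneRadius) {
    return;
}
grid1.transform = grid1Btn.transform =
    CGAffineTransformMakeRotation(atan2f(dy, dx));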
Hope this helps.

Related

TouchesMoved With 2 Fingers

I am trying to make a simple landscape "split screen" app where two players can play at once (each player gets one half of the screen), but I am having trouble tracking both touches at the same time. This is the code I am trying to use:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        CGPoint point = [touch locationInView:self.view];
        if (point.x < 240) {
            [player1 updatePoint:point];
        } else {
            [player2 updatePoint:point];
        }
    }
}
But I am obviously doing something wrong. Although this code compiles and runs fine, it only tracks one finger and moves the player on whichever side of the screen that finger is on. What is my code lacking? Is this task more difficult to pull off than I think it is?
Did you set UIView's multipleTouchEnabled to YES?
multipleTouchEnabled should be set to YES for the UIView.
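For example (a minimal sketch; where you set it depends on which view actually handles the touches):
// Without this, the view reports at most one touch at a time, so the
// second finger never reaches touchesMoved:.
self.view.multipleTouchEnabled = YES;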

iOS UIScrollView with 2 finger pan for paging and one finger pan for "fingerpointer"

I've spent quite some time trying to figure out how to achieve this, but haven't found a proper solution yet. I have a UIScrollView whose panGestureRecognizer I changed from one- to two-finger recognition, so paging only works when two fingers are used. Now I want to add an additional pan gesture recognizer that shows a cursor when I pan with one finger.
I tried simply adding a second panGestureRecognizer to the UIScrollView, but then the app crashes immediately. I also thought of adding a transparent subview positioned above the UIScrollView and delegating the two-finger gestures to the UIScrollView with something like resigning first responder, or of overriding the UIScrollView's panGestureRecognizer and having it add a subview containing my "fingerpointer" (a little dot centered where I'm touching the screen).
I'm totally clueless which way to go and how to implement it. Your help is greatly appreciated! Thanks a lot!
Timo
OK, this is the second time editing my response; this might do the trick for you.
If you subclass UIScrollView, you can override these methods like this:
// In your .h file
BOOL cursorShown;

// In your .m file
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([touches count] == 1) {
        cursorShown = YES;
        CGPoint touchLocation = [[touches anyObject] locationInView:self.superview];
        // Add your cursor to the parent view here and set its location to touchLocation
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (cursorShown == YES) {
        CGPoint touchLocation = [[touches anyObject] locationInView:self.superview];
        // Move your cursor's location to touchLocation
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (cursorShown == YES) {
        cursorShown = NO;
        // Destroy your cursor
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (cursorShown == YES) {
        cursorShown = NO;
        // Destroy your cursor
    }
}
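A side note on the two-finger paging half of the setup: on iOS 5 and later, UIScrollView exposes its built-in pan recognizer directly, so the restriction the question describes can be expressed like this (a sketch, assuming a scrollView reference; not part of the answer above):
// Page only when two fingers pan; one-finger touches then fall through
// to the touches* overrides above for the cursor.
scrollView.panGestureRecognizer.minimumNumberOfTouches = 2;
scrollView.panGestureRecognizer.maximumNumberOfTouches = 2;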

iPhone hitTest broken after rotation

I have a UIView that contains a number of CALayer subclasses. I am using the following code to detect which layer a touch event corresponds to:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];
    NSLog(@"%@,%@", NSStringFromCGPoint(point), [self.layer hitTest:point].name);
}
This works fine until the device is rotated. When the device is rotated all current layers are removed from the superlayer, and new CALayers are created to fit the new orientation. The new layers are correctly inserted and viewable in the correct orientation.
After the rotation, the hitTest method consistently returns nil for the layer. I have noticed that when rotated 180 degrees, the returned layer is whatever was in that location before the rotation, i.e. touching the top-left layer gives the layer that was in the bottom right. The coordinates of the hit test are printed as expected, with (0,0) in the top left. I redraw the layers on every rotation, but for some reason they seem to be mapped as if the device were the "correct" way up, with the home button at the bottom. Am I missing a function call or something after handling the rotation?
Cheers,
Adam
Okay, I've found that the following code works in all orientations without any issues (in my case there are no overlapping views, so this is appropriate for me):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *t = [touches anyObject];
    CGPoint point = [t locationInView:self];
    for (CALayer *layer in self.layer.sublayers) {
        CGPoint convertedPoint = [self.layer convertPoint:point toLayer:layer];
        if ([layer containsPoint:convertedPoint]) {
            NSLog(@"%@", layer.name);
        }
    }
}
While this manual point conversion works correctly, the question still remains as to why the original method call did not. Can anybody enlighten me?
Adam

How can I implement a circular scrolling mechanism?

I have some images from which the user should choose one. I don't want to just offer a flat scrolling area with a boring grid. Instead, I'd like to show a wheel that contains those images, with a marker at the top indicating the selection. Something similar to the Pickers.
The problem is not the rotation math; I'd use some geometric functions for that. But I have no idea how to actually capture the scrolling gestures on that wheel. Where should I start?
BTW: By circular I don't mean something like the Pickers. I mean a real wheel that has a center axis and can be rolled, like a very old telephone dial, like a bike wheel, or a Picker turned by 90° with its axis facing you (along the Z-coordinate).
If you're talking about capturing gestures, then here is the example given in the docs.
Though I could have sworn I heard Alan Cannistraro say in one of the first CS193P lectures that you don't have to do this, that you can just trap the swipe event, but I can't find it.
Could someone who actually knows what they're doing please correct me and I'll remove this post, but for now I know this will work:
#define HORIZ_SWIPE_DRAG_MIN 12
#define VERT_SWIPE_DRAG_MAX 4

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    startTouchPosition = [touch locationInView:self]; // CGPoint ivar
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentTouchPosition = [touch locationInView:self];
    // If the swipe tracks correctly.
    if (fabsf(startTouchPosition.x - currentTouchPosition.x) >= HORIZ_SWIPE_DRAG_MIN &&
        fabsf(startTouchPosition.y - currentTouchPosition.y) <= VERT_SWIPE_DRAG_MAX)
    {
        // It appears to be a swipe.
        if (startTouchPosition.x < currentTouchPosition.x)
            [self myProcessRightSwipe:touches withEvent:event];
        else
            [self myProcessLeftSwipe:touches withEvent:event];
    }
    else
    {
        // Process a non-swipe event.
    }
}
Just how similar to a picker view are you thinking? You can load up a picker view with your own custom subviews, which could be image views. That'd get you an actual picker view with your images, which might or might not be what you're actually aiming for.
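If the picker route is acceptable, the hook for custom subviews is the UIPickerViewDelegate method below; a minimal sketch, assuming an images NSArray of UIImage objects (pre-ARC, hence the autorelease):
- (UIView *)pickerView:(UIPickerView *)pickerView
            viewForRow:(NSInteger)row
          forComponent:(NSInteger)component
           reusingView:(UIView *)view
{
    UIImageView *imageView = (UIImageView *)view;
    if (imageView == nil) {
        imageView = [[[UIImageView alloc]
            initWithFrame:CGRectMake(0, 0, 80, 80)] autorelease];
        imageView.contentMode = UIViewContentModeScaleAspectFit;
    }
    imageView.image = [images objectAtIndex:row]; // images is an assumed ivar
    return imageView;
}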

How can I detect touch in cocos2d?

I am developing a 2D game for iPhone using cocos2d.
I use many small sprites (images) in my game. I want to touch two similar sprites (images) and have both of them hidden.
How can I detect a touch on a specific sprite (image)?
A better way to do this is to use the bounding box of the sprite itself (which is a CGRect). In this sample code, I put all my sprites in an NSMutableArray and simply check whether the touch falls inside a sprite's bounding box. Make sure you turn on touch detection in init. Notice that I also accept/reject touches on the layer by returning YES (if I use the touch) or NO (if I don't):
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    CGPoint location = [self convertTouchToNodeSpace:touch];
    for (CCSprite *station in _objectList)
    {
        if (CGRectContainsPoint(station.boundingBox, location))
        {
            DLog(@"Found sprite"); // DLog is a custom NSLog-style macro
            return YES; // swallow the touch
        }
    }
    return NO; // let the touch fall through
}
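For completeness, "turn on touch detection in init" here means registering as a targeted touch delegate, since ccTouchBegan: only fires for registered delegates. In cocos2d-iphone around 0.99 that looked roughly like this (class names shifted between versions, so treat it as a sketch):
// Called automatically by CCLayer when isTouchEnabled is YES.
- (void)registerWithTouchDispatcher
{
    [[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self
                                                     priority:0
                                              swallowsTouches:YES];
}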
Following Jonas's instructions, and adding onto them a bit more...
- (BOOL)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [[Director sharedDirector] convertCoordinate:touch.location];
    CGRect particularSpriteRect = CGMakeRect(particularSprite.position.x,
                                             particularSprite.position.y,
                                             particularSprite.contentSize.width,
                                             particularSprite.contentSize.height);
    if (CGRectContainsPoint(particularSpriteRect, location)) {
        // particularSprite touched
        return kEventHandled;
    }
    return kEventIgnored;
}
You may need to adjust the x/y a little to account for the "centered positioning" in Cocos.
In your layer that contains your sprite, you need to say:
self.isTouchEnabled = YES;
then you can use the same events that you would use in a UIView, but they're named a little differently:
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // In your ccTouchesEnded event, you would want to check that you touched
    // down and then up inside the same place, and do your logic there.
}
@david, your code has some typos for cocos2d 0.7.3 and 2.2.1: specifically, it should be CGRectMake instead of CGMakeRect, and [touch location] is now [touch locationInView:touch.view].
Here's what I did:
- (BOOL)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [[Director sharedDirector] convertCoordinate:[touch locationInView:touch.view]];
    CGRect myRect = CGRectMake(sprite.position.x, sprite.position.y,
                               sprite.contentSize.width, sprite.contentSize.height);
    if (CGRectContainsPoint(myRect, location)) {
        // particularSprite touched
        return kEventHandled;
    }
    return kEventIgnored; // let other handlers claim the touch
}
@Genericrich: CGRectContainsPoint works in CocosLand because of the call two lines above:
[[Director sharedDirector] convertCoordinate:]
The Cocos2D objects use the OpenGL coordinate system, where (0,0) is the lower left, while UIKit coordinates (like where the touch happened) put (0,0) at the upper left. convertCoordinate: makes the flip from UIKit to OpenGL for you.
Here's how it worked for me, where spriteSize is obviously the sprite's size... :P
- (BOOL)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [[Director sharedDirector] convertCoordinate:[touch locationInView:touch.view]];
    // Offset by half the sprite's size, since Cocos positions sprites
    // by their center.
    CGRect myRect = CGRectMake(sprite.position.x - spriteSize / 2,
                               sprite.position.y - spriteSize / 2,
                               spriteSize, spriteSize);
    if (CGRectContainsPoint(myRect, location)) {
        // particularSprite touched
        return kEventHandled;
    }
    return kEventIgnored;
}
This is a good tutorial explaining the basic touch system:
http://ganbarugames.com/2010/12/detecting-touch-events-in-cocos2d-iphone/
First, write
self.isTouchEnabled = YES;
Then you need to implement the functions ccTouchesBegan, ccTouchesEnded, etc.
From what I understood, you want to be able to "match" two sprites that can be at different coordinates on the screen.
One method for doing this (I'm sure there are many others): keep two global variables.
Every time a touch hits a sprite, use the CGRectContainsPoint technique mentioned several times above to find which sprite was touched, then save that sprite's tag in one of the global variables. Do the same for the second touch, and then compare the two variables.
You should be able to figure out the rest, but comment if you have problems. A rough sketch of the idea follows.
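This builds on the bounding-box loop from the first answer; firstPick is an assumed ivar (a CCSprite * that is nil when nothing is selected), and sprites are assumed to carry a tag identifying their type:
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    CGPoint location = [self convertTouchToNodeSpace:touch];
    for (CCSprite *sprite in _objectList) {
        if (!sprite.visible ||
            !CGRectContainsPoint(sprite.boundingBox, location)) {
            continue;
        }
        if (firstPick == nil) {
            firstPick = sprite;            // remember the first selection
        } else if (firstPick != sprite && firstPick.tag == sprite.tag) {
            firstPick.visible = NO;        // matched pair: hide both
            sprite.visible = NO;
            firstPick = nil;
        } else {
            firstPick = sprite;            // no match: restart from this sprite
        }
        return YES;
    }
    return NO;
}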