Anchor Point in CALayer - iPhone

Looking at the Touches example from Apple's documentation, there is this method:
// scale and rotation transforms are applied relative to the layer's anchor point
// this method moves a gesture recognizer's view's anchor point between the user's fingers
- (void)adjustAnchorPointForGestureRecognizer:(UIGestureRecognizer *)gestureRecognizer {
    if (gestureRecognizer.state == UIGestureRecognizerStateBegan) {
        UIView *piece = gestureRecognizer.view;
        CGPoint locationInView = [gestureRecognizer locationInView:piece];
        CGPoint locationInSuperview = [gestureRecognizer locationInView:piece.superview];

        piece.layer.anchorPoint = CGPointMake(locationInView.x / piece.bounds.size.width,
                                              locationInView.y / piece.bounds.size.height);
        piece.center = locationInSuperview;
    }
}
First question, can someone explain the logic of setting the anchor point in the subview, and changing the center of the superview (like why this is done)?
Lastly, how does the math work for the anchorPoint statement? If you have a view that has a bounds of 500, 500, and say you touch at 100, 100 with one finger, 500, 500 with the other. In this box your normal anchor point is (250, 250). Now it's ???? (have no clue)
Thanks!

The center property of a view is a mere reflection of the position property of its backing layer. Surprisingly, this means that the center need not be at the center of your view. Where position sits within the layer's bounds is determined by the anchorPoint, which takes values anywhere between (0,0) and (1,1). Think of it as a normalized indicator of where the position lies within the bounds. If you change the anchorPoint, the frame readjusts itself around the fixed position rather than the position shifting w.r.t. its superlayer/superview. So, to move the frame back so that the view doesn't appear to shift, you can manipulate the center.
piece.layer.anchorPoint = CGPointMake(locationInView.x / piece.bounds.size.width, locationInView.y / piece.bounds.size.height);
Imagine the original layout, where O is the touch point and X is the layer's position:
+++++++++++
+ O       +        +++++++++++
+    X    +  -->   + X       +
+         +        +         +
+++++++++++        +         +
                   +++++++++++
Now we want this X to be at the point where the user has touched. We do this because all scaling and rotations are done based on the position/anchorPoint. To adjust the frame back to its original position, we set the "center" of the view to the touch location.
piece.center = locationInSuperview;
The view then readjusts its frame back:
                   +++++++++++
+++++++++++        + X       +
+ X       +  -->   +         +
+         +        +         +
+         +        +++++++++++
+++++++++++
Now when the user rotates or scales, it will happen as if the axis were at the touch point rather than the true center of the view.
In your example, the location of the gesture in the view ends up being the average, i.e. (300, 300), which means the anchorPoint becomes (0.6, 0.6), and in response the frame shifts up and to the left. To readjust, setting the center to the touch location moves the frame back.
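If you ever need that compensation outside of a gesture handler, here is a minimal sketch of the same idea as a standalone helper (the method name setAnchorPoint:forView: is my own, not from the Apple sample): it moves the anchor point while keeping the frame visually in place.
- (void)setAnchorPoint:(CGPoint)anchorPoint forView:(UIView *)view {
    // Where the new and old anchor points fall inside the bounds, in points.
    CGPoint newPoint = CGPointMake(view.bounds.size.width * anchorPoint.x,
                                   view.bounds.size.height * anchorPoint.y);
    CGPoint oldPoint = CGPointMake(view.bounds.size.width * view.layer.anchorPoint.x,
                                   view.bounds.size.height * view.layer.anchorPoint.y);

    // Account for any transform already applied to the view.
    newPoint = CGPointApplyAffineTransform(newPoint, view.transform);
    oldPoint = CGPointApplyAffineTransform(oldPoint, view.transform);

    // Shift the layer's position by the same amount the anchor moved,
    // so the frame stays where it was.
    CGPoint position = view.layer.position;
    position.x += newPoint.x - oldPoint.x;
    position.y += newPoint.y - oldPoint.y;

    view.layer.anchorPoint = anchorPoint;
    view.layer.position = position;
}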

First question, can someone explain the logic of setting the anchor point in the subview, and changing the center of the superview (like why this is done)?
This code isn't changing the center of the superview. It's changing the center of the gesture recognizer's view to be the location of the gesture (coordinates specified in the superview's frame). That statement is simply moving the view around in its superview while following the location of the gesture. Setting center can be thought of as a shorthand way of setting frame.
As for the anchor point, it affects how scale and rotation transforms are applied to the layer. For example, a layer will rotate using that anchor point as its axis of rotation. When scaling, all points are offset around the anchor point, which doesn't move itself.
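As a quick hedged illustration (box is a hypothetical UIView, not something from the question), the same rotation pivots around whatever point the anchorPoint names:
// Note: changing anchorPoint also shifts the frame unless you compensate,
// as discussed above; this fragment only illustrates the pivot behavior.
box.layer.anchorPoint = CGPointMake(0.0, 0.0);           // pivot at the top-left corner
box.transform = CGAffineTransformMakeRotation(M_PI_4);   // 45 degrees about that corner
// With the default anchorPoint of (0.5, 0.5), the same call pivots about the center.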
Lastly, how does the math work for the anchorPoint statement? If you have a view that has a bounds of 500, 500, and say you touch at 100, 100 with one finger, 500, 500 with the other. In this box your normal anchor point is (250, 250). Now it's ???? (have no clue)
The key concept to note about the anchorPoint property is that its values range from [0, 1] in each dimension, no matter what the actual size of the layer is. So, if you have a view with bounds (500, 500) and you touch at (100, 100) and (500, 500), the location in the view of the gesture as a whole will be (300, 300), and the anchor point will be (300/500, 300/500) = (0.6, 0.6).

Related

Main view rotates around its center point through 360 degrees but inner views are also being rotated

I am rotating the main view through 360 degrees, and I have subviews added inside the main view. Everything works correctly, but with one issue.
What I want is that when I rotate the main view, the inner views should not lose their frames/positions. Right now, when I rotate the main view with an infinite repeat count and dynamically add a subview inside it, the subview goes into the proper position, but it does not retain its frame.
For example, I am implementing an orbit: I use an entirely transparent view as the orbit and rotate it around its center point through 360 degrees infinitely, and the user can add as many planets as he wants onto the orbit. When planets are added onto the orbit, they do not retain their frames. Can you suggest any idea?
Thanks in advance.
Well it sounds like you need to add a rotating animation for every subview that you add in your main view. If the main view rotates clockwise your subviews will need to rotate around their center in a counter-clockwise direction.
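For example, here is a rough sketch of that counter-rotation with plain Core Animation (orbitView and planetView are hypothetical names, not from the question):
// Spin the orbit clockwise and each planet counter-clockwise at the same
// rate, so the planets keep their on-screen orientation while they orbit.
CABasicAnimation *spin = [CABasicAnimation animationWithKeyPath:@"transform.rotation.z"];
spin.toValue = @(2.0 * M_PI);
spin.duration = 10.0;
spin.repeatCount = HUGE_VALF;
[orbitView.layer addAnimation:spin forKey:@"orbit"];

CABasicAnimation *counterSpin = [CABasicAnimation animationWithKeyPath:@"transform.rotation.z"];
counterSpin.toValue = @(-2.0 * M_PI);        // opposite direction, same duration
counterSpin.duration = 10.0;
counterSpin.repeatCount = HUGE_VALF;
[planetView.layer addAnimation:counterSpin forKey:@"counterSpin"];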
I guess you're trying to keep the subviews' orientations while rotating.
If I were you, I'd use CAAnimation instead of using a view to rotate.
You may add the animation to every subview, try this:
// (Fragment of a method that builds and returns the animation.)
// Animate the image view's "position" along a circular arc around the view's center.
CAKeyframeAnimation *animation = [CAKeyframeAnimation animationWithKeyPath:@"position"];
CGMutablePathRef path = CGPathCreateMutable();
CGPathMoveToPoint(path, NULL, imgview.layer.position.x, imgview.layer.position.y);

int p = [self getblank:tag];
float f = 2.0 * M_PI - 2.0 * M_PI * p / PHOTONUM;   // start angle for this slot
float h = f + 2.0 * M_PI * num / PHOTONUM;          // end angle after moving 'num' slots

// Final resting position on the circle of radius RADIUS around the view's center.
float centery = self.view.center.y;
float centerx = self.view.center.x;
float tmpy = centery + RADIUS * cos(h);
float tmpx = centerx - RADIUS * sin(h);
imgview.center = CGPointMake(tmpx, tmpy);

// Arc from the start angle to the end angle around the view's center.
CGPathAddArc(path, NULL, self.view.center.x, self.view.center.y, RADIUS,
             f + M_PI / 2, f + M_PI / 2 + 2.0 * M_PI * num / PHOTONUM, 0);
animation.path = path;
CGPathRelease(path);

animation.duration = TIME;
animation.repeatCount = 1;
animation.calculationMode = kCAAnimationPaced;      // constant speed along the path
return animation;
I assume you have a stored variable that represents the rotation of the "world", correct? If not, you should.
Then, for each image you add, also store a variable with it that represents its rotation to the world.
For example, if your world is rotated 180°, and you added a cup (which you want to appear right-side up when the world is upside-down) the cup's "offset" to the world rotation would be -180°.
Then, if the world is at 180° and you rotate your world by adding 90°, the cup's new rotation value would be cup_rotate_offset + world_rotation, or -180° + 270°, which is the same as saying 90°; the top of the world would be facing left and the cup's top would be facing right.
You have to independently track the offset values for each added object.
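A minimal sketch of that bookkeeping (worldRotation, cupOffset and cupView are made-up names, just to show the arithmetic):
// One rotation for the world, plus a per-object offset recorded when the
// object is added.
CGFloat worldRotation = M_PI;       // world currently at 180 degrees
CGFloat cupOffset = -M_PI;          // the cup should appear right-side up

// Later, the world rotates by another 90 degrees; recompute the cup.
worldRotation += M_PI_2;            // world now at 270 degrees
cupView.transform = CGAffineTransformMakeRotation(worldRotation + cupOffset);   // 90 degrees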

Translating a view and then rotating it problem

I have a custom UIImageView that I can drag around the screen by applying a translation (xDif and yDif are the amounts the fingers moved):
CGAffineTransform translate = CGAffineTransformMakeTranslation(xDif, yDif);
[self setTransform: CGAffineTransformConcat([self transform], translate)];
Let's say I moved the ImageView by 50px in both the x and y directions. I then try to rotate the ImageView (via a gesture recognizer) with:
CGAffineTransform transform = CGAffineTransformMakeRotation([recognizer rotation]);
myImageView.transform = transform;
What happens is that the ImageView suddenly jumps back to where it was originally located (before the translation), instead of rotating in place at the moved position (+50px in both directions).
(It seems that no matter how I translate the view, the self.center of the ImageView subclass stays the same - where it was originally laid out in IB.)
Another problem is that if I rotate the ImageView by 30 degrees and then try to rotate it a bit more, it again starts from the original angle (0) and goes from there. Why doesn't it start from 30 degrees instead of 0?
You are overwriting the earlier transform. To add to the current transform, you should do this –
myImageView.transform = CGAffineTransformRotate(myImageView.transform, recognizer.rotation);
Since you're changing the transform property incrementally, you should use CGAffineTransformRotate, CGAffineTransformTranslate and CGAffineTransformScale so that you build on the existing transform rather than creating a new one.
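A short sketch of how that accumulation looks in gesture handlers (the handler names are my own; resetting the recognizer after applying its delta is the usual pattern):
- (void)handlePan:(UIPanGestureRecognizer *)recognizer {
    // Append the new translation to whatever transform is already in place.
    CGPoint t = [recognizer translationInView:recognizer.view.superview];
    recognizer.view.transform = CGAffineTransformConcat(recognizer.view.transform,
                                                        CGAffineTransformMakeTranslation(t.x, t.y));
    [recognizer setTranslation:CGPointZero inView:recognizer.view.superview];
}

- (void)handleRotation:(UIRotationGestureRecognizer *)recognizer {
    // Rotate on top of the existing transform instead of replacing it.
    recognizer.view.transform = CGAffineTransformRotate(recognizer.view.transform, recognizer.rotation);
    recognizer.rotation = 0.0;
}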

Cocos2D Rotation and Anchor point

The problem I have is that whenever I change the anchor point, the sprite automatically rotates with respect to the new anchor point, and I don't want that to happen.
The steps that I followed
create a sprite with anchor point (0.5, 0.5)
Changed the anchor point to (0,1)
Rotated the sprite to 90 degree. (Using CCRotateBy. Sprite rotated correctly)
Changed the anchor point to (0.5, 0.5). (Everything is fine till now, and this is the position that I need to keep.) Now sprite.rotation is 90.
I changed the anchor point to (1,0). (The sprite automatically pivots by its 90-degree rotation around the new anchor point - I need to stop this behavior.)
Is there any way to reset the rotation of the sprite to 0 without actually rotating the texture (i.e., keeping the texture in its current form, with the actual texture rotated 90 degrees), and to change the anchor point or position along with step 4, so that I can continue from step 5?
As Lukman says, the anchor point will always affect rotation. Since your goal is to specify the sprite position with a different anchor point from the one used for rotation, I would suggest making an empty CCNode the parent of your sprite.
This way, you can set the position on sprite to be relative to this parent node to compensate for your anchor point change and then keep the anchor point for rotation on the sprite but use the parent node for position.
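Roughly like this (a sketch only, using the Objective-C cocos2d API; the file name and coordinates are made up):
// The empty parent node owns the position; the sprite keeps whatever
// anchorPoint the rotation needs and is positioned relative to the parent.
CCNode *holder = [CCNode node];
holder.position = ccp(200, 300);              // where the piece sits in the layer
[self addChild:holder];

CCSprite *sprite = [CCSprite spriteWithFile:@"piece.png"];
sprite.anchorPoint = ccp(0, 1);               // anchor used only for rotation
sprite.position = ccp(0, 0);                  // offset relative to the holder
[holder addChild:sprite];

// Rotate the sprite about its own anchor; move the holder to reposition.
[sprite runAction:[CCRotateBy actionWithDuration:1.0 angle:90]];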
anchorPoint affects both position and rotation. You cannot stop it from affecting either one of them.
But from reading your question, since you want to prevent anchorPoint from affecting the rotation, I'm assuming the reason you change the anchorPoint is for positioning; for example, you set it to ccp(1, 0) because you want the sprite's bottom-right corner, instead of its center, to be where you set the position.
My suggestion is: don't change the anchorPoint at all, but change the way you set the sprite position. You can use this small function to adjust the position:
// Return the center-anchored position that puts the point named by 'anchor'
// at 'position', without ever changing the sprite's real anchorPoint.
CGPoint adjustedPosition(const CGPoint position, const CGPoint anchor, const CGSize size) {
    return CGPointMake(position.x - (anchor.x - 0.5) * size.width,
                       position.y - (anchor.y - 0.5) * size.height);
}
Now, assuming you wanted to use anchorPoint of (1,0) when doing the positioning, instead of sprite.position = ccp(200, 300), you just need to do:
sprite.position = adjustedPosition(ccp(200, 300), ccp(1.0, 0.0), sprite.contentSize);
If you want, I'll post the logic behind the math later. Otherwise, I hope this will help.
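As a quick sanity check with made-up numbers: for a 100x100 sprite and a desired anchor of (1, 0), adjustedPosition(ccp(200, 300), ccp(1, 0), size) gives (200 - 0.5*100, 300 + 0.5*100) = (150, 350). Placing the default-anchored sprite at (150, 350) puts its bottom-right corner at (200, 300), which is exactly where an anchorPoint of (1, 0) would have put the position.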
It may also help to set the sprite's anchor point from a screen coordinate, like this:
void SetAnchorPosition(CCSprite *sprite, const CCPoint &point)
{
    static CCSize winSize = CCDirector::sharedDirector()->getWinSize();
    double x = ((double)1 / (double)winSize.width) * (double)point.x;
    double y = ((double)1 / (double)winSize.height) * (double)point.y;
    sprite->setAnchorPoint(ccp(x, y));
    sprite->setPosition(point);
}
You can add a line in the touchEnded method as a forceful alternative:
- (void)touchEnded:(UITouch *)touch withEvent:(UIEvent *)event {
    _yourSprite.rotation = 90;
}

Find the center point of a UIScrollView while zooming

I'm having difficulties getting a tiled UIScrollView to zoom in and out correctly with pinch zooming. The issue is that when a pinch-zoom occurs, the resulting view is usually not centered in the same region.
Details: The app starts with a tiled image that is 500x500. If a user zooms in, it will snap to 1000x1000 and the tiles will redraw. For all the zoom effects, etc., I am just letting the UIScrollView do its thing. When scrollViewDidEndZooming:withView:atScale: is called, I redraw the tiles (like you can see in many examples and other questions here).
I think that I've drilled the problem down to calculating the center of the view correctly when I get to scrollViewDidEndZooming:withView:atScale: (I can center on a known point fine after I redraw).
What I'm currently using:
- (void)scrollViewDidEndZooming:(UIScrollView *)scrollView withView:(UIView *)view atScale:(float)scale {
    // as an example, create the "target" content size
    CGSize newZoomSize = CGSizeMake(1000, 1000);

    // get the center point of the visible region
    CGPoint center = [scrollView contentOffset];
    center.x += scrollView.frame.size.width / 2;
    center.y += scrollView.frame.size.height / 2;

    // since pinch zoom changes the contentSize of the scroll view, translate this point to
    // the "target" size (from the current size)
    center = [self translatePoint:center currentSize:[scrollView contentSize] newSize:newZoomSize];

    // redraw...
}
/*
 Translate the point from one size to another
 */
- (CGPoint)translatePoint:(CGPoint)origin currentSize:(CGSize)currentSize newSize:(CGSize)newSize {
    // shortcut if they are equal
    if (currentSize.width == newSize.width && currentSize.height == newSize.height) { return origin; }

    // translate
    origin.x = newSize.width * (origin.x / currentSize.width);
    origin.y = newSize.height * (origin.y / currentSize.height);
    return origin;
}
Does this seem correct? Is there a better way? Thanks!
The way I have solved this so far is to store the initial center point of the view when the zoom starts. I initially save this value when the scrollViewDidScroll method is called (and the scroll view is zooming). When scrollViewDidEndZooming:withView:atScale: is called, I use that center point (and reset the saved value).
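Roughly like this (my own sketch; the _zoomCenter ivar is made up):
// Remember the visible center while the user is pinch-zooming, then reuse
// it after the zoom ends to recenter the redrawn tiles.
- (void)scrollViewDidScroll:(UIScrollView *)scrollView {
    if (scrollView.zooming && CGPointEqualToPoint(_zoomCenter, CGPointZero)) {
        _zoomCenter = CGPointMake(scrollView.contentOffset.x + scrollView.frame.size.width / 2,
                                  scrollView.contentOffset.y + scrollView.frame.size.height / 2);
    }
}

- (void)scrollViewDidEndZooming:(UIScrollView *)scrollView withView:(UIView *)view atScale:(float)scale {
    // ...redraw the tiles, then scroll so _zoomCenter is centered again...
    _zoomCenter = CGPointZero;   // reset for the next zoom
}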
The center of the scroll view can be found by adding its center property and its contentOffset property:
aView.center = CGPointMake(self.scrollView.center.x + self.scrollView.contentOffset.x,
                           self.scrollView.center.y + self.scrollView.contentOffset.y);

How to rotate image around center point automatically with finger touch

On the iPhone, how do you implement rotating an image around its center point using finger touches?
Just like a wheel: if you put a finger on the screen and flick, the image starts spinning around its center point like a wheel, gradually slows down, and finally stops.
Can anyone share some code (Objective-C) or suggestions?
I was working on a "spin the bottle" app yesterday. On the window I have an ImageView with a bottle that's supposed to respond to touches and rotate the way the user swipes their finger. I struggled to get my ImageView to rotate during the touch events (TouchesBegan, TouchesMoved, TouchesEnded). I used this code in TouchesMoved to find the angle by which to rotate the image.
public override void TouchesMoved (NSSet touches, UIEvent evt)
{
    PointF pt = (touches.AnyObject as UITouch).LocationInView(this);
    float x = pt.X - this.Center.X;
    float y = pt.Y - this.Center.Y;
    double ang = Math.Atan2(x, y);
    // yada yada, rotate image using this.Transform
}
THIS IS IMPORTANT! When the ImageView rotates, even the x and y coordinates change, so touching the same area all the time would give me different values in the pt and prePt points. After some thinking, googling and reading, I came up with a simple solution to the problem: the Superview property of the ImageView.
PointF pt = (touches.AnyObject as UITouch).LocationInView(this.Superview);
Having that small change in place made it a lot easier; now I can use the UITouch methods LocationInView and PreviousLocationInView and get the right x and y coordinates. Here is part of my code.
float deltaAngle;

public override void TouchesMoved (NSSet touches, UIEvent evt)
{
    PointF pt = (touches.AnyObject as UITouch).LocationInView(this.Superview);
    float x = pt.X - this.Center.X;
    float y = pt.Y - this.Center.Y;
    float ang = (float)Math.Atan2(x, y);

    // do the rotation
    if (deltaAngle == 0.0f) {
        deltaAngle = ang;
    }
    else
    {
        float angleDif = deltaAngle - ang;
        this.Transform = CGAffineTransform.MakeRotation(angleDif);
    }
}
Hope that helped someone from spending hours on how to figure out how to freaking rotate a bottle! :)
I would use the affine transformations - you can assign a transformation to any layer or UI element using the transform property.
You can create a rotation transform using CGAffineTransformMakeRotation(CGFloat angle), which returns a transformation that rotates an element. The default rotation is around the center point.
Be aware that the rotation is limited to 360 degrees, so if you want to rotate something further than that (say through 720 degrees), you have to break the rotation into several sequences.
You may find this SO article useful as well.
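To illustrate the "break it into sequences" point, here is one hedged way to chain smaller steps (the helper name and the 120-degree step size are my own choices, not from the linked article):
// Rotate a view through a large angle by chaining smaller animated steps,
// since a single transform animation only takes the shortest path.
- (void)spinView:(UIView *)view stepsRemaining:(NSInteger)steps {
    if (steps <= 0) { return; }
    [UIView animateWithDuration:0.25
                          delay:0
                        options:UIViewAnimationOptionCurveLinear
                     animations:^{
                         view.transform = CGAffineTransformRotate(view.transform, 2.0 * M_PI / 3.0);
                     }
                     completion:^(BOOL finished) {
                         [self spinView:view stepsRemaining:steps - 1];
                     }];
}

// 720 degrees = six 120-degree steps:
// [self spinView:someView stepsRemaining:6];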
The transform property of a view or layer can be used to rotate the image displayed within. As far as the spinning part goes, you just track the location and movement of touches in your view with touchesBegan, touchesMoved, and touchesEnded.
Use the distance and time between the touches updates to calculate a speed, and use that to set a rotational velocity. Once you start the image spinning, update the position periodically (with an NSTimer, maybe), and reduce the rotational velocity by some constant.
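One hedged way to sketch that decay loop (the timer interval, friction constant, and the angularVelocity/wheelView names are all assumptions of mine):
// An NSTimer ticks roughly 60 times a second, advances the rotation by the
// current angular velocity, and bleeds velocity off until the wheel stops.
- (void)tick:(NSTimer *)timer {
    self.wheelView.transform = CGAffineTransformRotate(self.wheelView.transform,
                                                       self.angularVelocity / 60.0);
    self.angularVelocity *= 0.98;                 // friction
    if (fabs(self.angularVelocity) < 0.01) {      // slow enough: stop
        [timer invalidate];
    }
}

// Started from touchesEnded once a flick speed has been measured:
// self.angularVelocity = measuredRadiansPerSecond;
// [NSTimer scheduledTimerWithTimeInterval:1.0/60.0 target:self
//                                selector:@selector(tick:) userInfo:nil repeats:YES];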