Rotation based on touch problem - iPhone

I'm making a simple dial that rotates as you drag your finger across it. It rotates fine, but it also rotates when I touch anywhere else on the screen and drag my finger.
How can I restrict the first touches to only those inside my image view? Or where am I going wrong?
This is the code that's giving me trouble:
- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        // Add the wheel image as a subview of this view.
        UIImage *image1 = [UIImage imageNamed:@"nav@2x.png"];
        wheelfrom = [[UIImageView alloc] initWithImage:image1];
        wheelfrom.frame = CGRectMake(10, -130, 300, 300);
        [self addSubview:wheelfrom];
    }
    return self;
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[[event allTouches] allObjects] lastObject];
    firstLoc = [touch locationInView:self];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[[event allTouches] allObjects] lastObject];
    CGPoint curLoc = [touch locationInView:self];

    // Angles of the starting touch and of the current touch,
    // both measured from the wheel's center.
    float fromAngle = atan2(firstLoc.y - wheelfrom.center.y,
                            firstLoc.x - wheelfrom.center.x);
    float toAngle = atan2(curLoc.y - wheelfrom.center.y,
                          curLoc.x - wheelfrom.center.x);

    float newAngle = angle + (toAngle - fromAngle);
    CGAffineTransform cgaRotate = CGAffineTransformMakeRotation(newAngle);
    wheelfrom.transform = cgaRotate;
    angle = newAngle;
}
Thanks for your help!

You can try something like this:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    if (CGRectContainsPoint(wheelfrom.frame, location))
    {
        // do your things
    }
}

You can check whether the point of touch is within the frame of the image view, and only do your work if it is.

Inside -(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event, check whether firstLoc is within your range, as in the sketch below.
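Combining the two suggestions above, a minimal sketch against the question's own code (wheelfrom and firstLoc come from the question; trackingWheel is an assumed extra BOOL ivar, not part of the original code) might look like:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[[event allTouches] allObjects] lastObject];
    firstLoc = [touch locationInView:self];
    // trackingWheel is an assumed extra BOOL ivar: only start tracking
    // if the first touch landed inside the wheel's frame.
    trackingWheel = CGRectContainsPoint(wheelfrom.frame, firstLoc);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!trackingWheel) {
        return; // the drag did not start on the wheel, so ignore it
    }
    // ... existing rotation code from the question ...
}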

Related

Determine whether an iPhone screen tap is in a region or not?

I want to determine whether the tapped location is inside a region or not. I have 4 CGPoints, and I know this can be done using UITouch. I get the tapped location with:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *myTouch = [[touches allObjects] objectAtIndex:0];
    CGPoint currentPos = [myTouch locationInView:self.view];
}
And for example my 4 CGPoints are
self.firstPoint = CGPointMake(50.0f, 50.0f);
self.secondPoint = CGPointMake(200.0, 50.0);
self.thirdPoint = CGPointMake(200.0, 200.0);
self.fourthPoint = CGPointMake(50.0, 120.0);
Thanks in advance
You should use a CGRect to represent the region instead of four CGPoints, and then use CGRectContainsPoint() to check whether it contains the point.
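For instance, if the four points describe an axis-aligned region, a bounding CGRect can be built from their minimum and maximum coordinates. This is only a sketch using the point properties and the currentPos variable from the question's touchesBegan above:
// Build a bounding rect from the four points (a sketch, assuming an
// axis-aligned region; the point properties come from the question).
CGFloat minX = MIN(MIN(self.firstPoint.x, self.secondPoint.x),
                   MIN(self.thirdPoint.x, self.fourthPoint.x));
CGFloat maxX = MAX(MAX(self.firstPoint.x, self.secondPoint.x),
                   MAX(self.thirdPoint.x, self.fourthPoint.x));
CGFloat minY = MIN(MIN(self.firstPoint.y, self.secondPoint.y),
                   MIN(self.thirdPoint.y, self.fourthPoint.y));
CGFloat maxY = MAX(MAX(self.firstPoint.y, self.secondPoint.y),
                   MAX(self.thirdPoint.y, self.fourthPoint.y));
CGRect region = CGRectMake(minX, minY, maxX - minX, maxY - minY);

if (CGRectContainsPoint(region, currentPos)) {
    // the tap landed inside the region
}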
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];
    image = [UIImage imageNamed:@"anyImage.gif"];
    newView = [[UIImageView alloc] initWithImage:image];
    // Only act when the touch's y coordinate lies between 50 and 480.
    if (location.y < 480 && location.y > 50)
    {
        // write your code
    }
}

How can I move CGRect with UITouches?

How can I move a CGRect with UITouches? If anyone has an idea, please explain.
if (areaSelected)
    return;
UITouch *mytouch = [[touches allObjects] objectAtIndex:0];
[myPath addLineToPoint:[mytouch locationInView:self]];
[self setNeedsDisplay];
As I understand it, you need to move the cropped image from one place to another, so you need to handle this in the touch methods:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];
    cloud.center = location;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self touchesBegan:touches withEvent:event];
}
Here cloud is the UIImageView; you can give it the cropped image:
cloud.image = croppedImage; // you must assign this first
I hope it helps you.
You need to reset the frame of the rectangle:
[cropRect setFrame:CGRectMake(<new x coordinate>, <new y coordinate>, width, height)]
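As a rough sketch of how that might look while dragging (assuming cropRect is a UIView marking the crop area and that this code lives in the containing view, so locationInView:self is valid):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self];
    // Keep the crop rect's size, but recenter it on the touch point.
    CGRect frame = cropRect.frame;
    frame.origin.x = location.x - frame.size.width / 2.0;
    frame.origin.y = location.y - frame.size.height / 2.0;
    [cropRect setFrame:frame];
}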

touchesBegan, touchesMoved, touchesEnded issue

For various reasons, I've moved these methods from a UIView subclass to my view controller, and I finally got it working, except for one thing. Not only can I drag the UIImageViews I've programmatically created, but the view controller's own view is draggable too, creating a very undesired effect. I guess it's because touches report anyObject, and the background itself is an object. I'm just not sure how to exclude the background. I would think it would need "userInteractionEnabled", but I guess not? I only want UIImageViews to be draggable. Please forgive my noobness; I'm still learning.
I have all the image views I'd want "touchable" in an NSMutableDictionary called "letterDictionary". Would it be possible to only have touch apply to what's in the dictionary?
http://imgur.com/W08dI
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    touchPoint = [touch locationInView:self.view];
    movingLetter = [touch view];
    CGPoint pointInside = [touch locationInView:[touch view]];
    if ([movingLetter pointInside:pointInside withEvent:event]) touchedInside = YES;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (touchedInside) {
        UITouch *touch = [touches anyObject];
        CGPoint newPoint = [touch locationInView:self.view]; // get the new touch location
        movingLetter.center = CGPointMake(movingLetter.center.x + newPoint.x - touchPoint.x,
                                          movingLetter.center.y + newPoint.y - touchPoint.y);
        touchPoint = newPoint;
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (touchedInside) {
        UITouch *touch = [touches anyObject];
        CGPoint newPoint = [touch locationInView:self.view];
        movingLetter.center = CGPointMake(movingLetter.center.x + newPoint.x - touchPoint.x,
                                          movingLetter.center.y + newPoint.y - touchPoint.y);
        if (CGRectIntersectsRect([movingLetter frame], [placeHolder frame]))
        {
            movingLetter.center = placeHolder.center;
        }
    }
    touchedInside = NO;
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    touchedInside = NO;
}
You already have the view that was touched:
UITouch *touch = [touches anyObject];
touchPoint = [touch locationInView:self.view];
movingLetter = [touch view];
Just test whether it is the class you are looking for (e.g. a UIImageView) and return early if it is not:
UITouch *touch = [touches anyObject];
if (![[touch view] isKindOfClass:[UIImageView class]])
{
    return;
}
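If you want to go one step further and only allow dragging of the views stored in the question's letterDictionary, a possible sketch (assuming the image views are the dictionary's values) would be:
// A sketch only: restrict dragging to views stored in letterDictionary
// (the dictionary name comes from the question; this assumes the image
// views themselves are the dictionary's values).
UITouch *touch = [touches anyObject];
if (![[letterDictionary allValues] containsObject:[touch view]]) {
    return; // the touched view is not one of the draggable letters
}
movingLetter = (UIImageView *)[touch view];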
In the touchesBegan method:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self.view endEditing:YES];
    UITouch *touch = [touches anyObject];
    _previousPoint1 = [touch previousLocationInView:self.main_uiview];
    _previousPoint2 = [touch previousLocationInView:self.main_uiview];
    _currentPoint = [touch locationInView:self.main_uiview];
    [self touchesMoved:touches withEvent:event];
    self.bezierPath = [UIBezierPath bezierPath];
    [self.bezierPath moveToPoint:_currentPoint];
}
And in the touchesMoved method:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    _previousPoint2 = _previousPoint1;
    _previousPoint1 = [touch previousLocationInView:self.main_uiview];
    _currentPoint = [touch locationInView:self.main_uiview];
    lastPoint = _currentPoint;
    [_bezierPath addLineToPoint:lastPoint];

    // calculate mid points (midPoint4 is the author's helper that returns
    // the midpoint of two CGPoints)
    CGPoint mid1 = midPoint4(_previousPoint1, _previousPoint2);
    CGPoint mid2 = midPoint4(_currentPoint, _previousPoint1);

    UIGraphicsBeginImageContextWithOptions(self.bg_imageview.frame.size, NO, 0.0);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, brush);
    CGContextSetLineCap(context, kCGLineCapRound);
    CGContextSetLineJoin(context, kCGLineJoinRound);
    [self.bg_imageview.image drawInRect:CGRectMake(0, 0, self.bg_imageview.frame.size.width, self.bg_imageview.frame.size.height)];
    CGContextMoveToPoint(context, mid1.x, mid1.y);
    // Using a quad curve is the key to smooth strokes
    CGContextAddQuadCurveToPoint(context, _previousPoint1.x, _previousPoint1.y, mid2.x, mid2.y);
    CGContextSetLineCap(context, kCGLineCapRound);
    CGContextSetStrokeColorWithColor(context, [UIColor blackColor].CGColor);
    CGContextSetLineWidth(context, 3.0);
    CGContextStrokePath(context);
    self.bg_imageview.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}

Move a UIImageView in the Y axis

I am working on an app and I want to move a UIImageView along the Y axis only, not the X axis. Here's what I have done so far:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    stretchPoint = [[touches anyObject] locationInView:self.view];
    arrowImage.center = stretchPoint;
}
The above code moves arrowImage along both axes.
stretchPoint is a CGPoint, arrowImage is an instance of UIImageView, and my app is in portrait mode. Can anyone give me a basic idea of how to do this?
This should work fine:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];
    // Keep the x coordinate fixed; only follow the touch's y coordinate.
    currentPoint.x = arrowImage.center.x;
    arrowImage.center = currentPoint;
}
Subclass UIImageView and implement the following code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:[self superview]];
    CGPoint newCenter = CGPointMake(self.center.x, currentPoint.y);
    self.center = newCenter;
}
I am not sure, as I have not tested it, but you can implement something like this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];
    currentPoint.y = 20; // note: currentPoint is never applied to the image view here
}
Something like:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    stretchPoint = [[touches anyObject] locationInView:self.view];
    // Keep the x origin and the size, only update the y origin.
    arrowImage.frame = CGRectMake(arrowImage.frame.origin.x,
                                  stretchPoint.y,
                                  arrowImage.frame.size.width,
                                  arrowImage.frame.size.height);
}

iPhone - Wheel tracking finger jumps to same starting position?

I am trying to implement a selection wheel in an iPhone application. I have got the wheel to track the user's finger, but now, for some reason, when the user touches the screen the wheel jumps so that the same section of the wheel tracks the finger each time.
How would I get the wheel to rotate as if the user is dragging it from any point?
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    NSSet *allTouches = [event allTouches];
    int tInput = [allTouches count] - 1;
    UITouch *touch = [[allTouches allObjects] objectAtIndex:tInput];
    CGPoint location = [touch locationInView:self.view];

    float theAngle = atan2(location.y - imageX.center.y, location.x - imageX.center.x);
    CGAffineTransform cgaRotate = CGAffineTransformMakeRotation(theAngle);
    imageX.transform = cgaRotate;
}
Any suggestion?
Right - that makes perfect sense - this is indeed what your code does.
What you need to add is the initial value where you started your drag as a relative rotation -- or track your last rotation.
A very elaborate way is shown below - but this should help get the point across.
@interface UntitledViewController : UIViewController {
    CGPoint firstLoc;
    UILabel *fred;
    double angle;
}
@property (assign) CGPoint firstLoc;
@property (retain) UILabel *fred;
@end

@implementation UntitledViewController
@synthesize fred, firstLoc;

- (void)viewDidLoad {
    [super viewDidLoad];
    self.fred = [[UILabel alloc] initWithFrame:CGRectMake(100, 100, 100, 100)];
    fred.text = @"Fred!"; fred.textAlignment = UITextAlignmentCenter;
    [self.view addSubview:fred];
    angle = 0; // we haven't rotated just yet...
}

// make sure we get them drag events.
- (BOOL)isFirstResponder { return YES; }

- (void)handleObject:(NSSet *)touches
           withEvent:(UIEvent *)event
              isLast:(BOOL)lst
{
    UITouch *touch = [[[event allTouches] allObjects] lastObject];
    CGPoint curLoc = [touch locationInView:self.view];

    float fromAngle = atan2(firstLoc.y - fred.center.y,
                            firstLoc.x - fred.center.x);
    float toAngle = atan2(curLoc.y - fred.center.y,
                          curLoc.x - fred.center.x);

    // So the angle to rotate to is relative to our current angle and the
    // angle through which our finger moved (to-from)
    float newAngle = angle + (toAngle - fromAngle);

    CGAffineTransform cgaRotate = CGAffineTransformMakeRotation(newAngle);
    fred.transform = cgaRotate;

    // we only 'save' the current angle when we're done with the drag.
    if (lst)
        angle = newAngle;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[[event allTouches] allObjects] lastObject];
    // capture where we started - so we can later work out the
    // rotation relative to this point.
    firstLoc = [touch locationInView:self.view];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self handleObject:touches withEvent:event isLast:NO];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [self handleObject:touches withEvent:event isLast:YES];
}
@end
Obviously you can do this a lot more elegantly, and the above misses a bit of 0..2*PI wrapping that you will need.
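For completeness, a minimal sketch of that wrapping (this helper is an assumption, not part of the original answer) could be:
#import <math.h>

// Normalize an angle into the range [0, 2*PI). A small helper sketch;
// call it before storing `angle` (e.g. angle = normalizeAngle(newAngle);
// in the isLast branch) if you want to keep the stored value bounded.
static double normalizeAngle(double a) {
    a = fmod(a, 2.0 * M_PI);
    if (a < 0)
        a += 2.0 * M_PI;
    return a;
}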
Because the view itself is still the old one. You may want to fetch the example zip file from:
http://bynomial.com/blog/?p=77
and in PuttyView.m change:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    ....
    else {
        self.center = CGPointMake(self.center.x + touchPoint.x - touchStart.x,
                                  self.center.y + touchPoint.y - touchStart.y);
into
    else {
        NSSet *allTouches = [event allTouches];
        int tInput = [allTouches count] - 1;
        UITouch *touch = [[allTouches allObjects] objectAtIndex:tInput];
        CGPoint location = [touch locationInView:self];

        float theAngle = atan2(location.y - self.center.y, location.x - self.center.x);
        CGAffineTransform cgaRotate = CGAffineTransformMakeRotation(theAngle);
        self.transform = cgaRotate;
    }
so that you stay within the view and its relative angle. Alternatively, in your own code, keep track of the relative angle itself, for example as in the sketch below.
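A rough sketch of that alternative, applied to the imageX wheel from the question (startTouchAngle and currentRotation are assumed extra ivars, not in the original code):
// Assumed ivars: float startTouchAngle; float currentRotation;
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    // Remember the angle at which the drag started, measured from the wheel's center.
    startTouchAngle = atan2(location.y - imageX.center.y, location.x - imageX.center.x);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    float touchAngle = atan2(location.y - imageX.center.y, location.x - imageX.center.x);
    // Rotate by how far the finger has moved since the drag began,
    // on top of whatever rotation the wheel already had.
    imageX.transform = CGAffineTransformMakeRotation(currentRotation + (touchAngle - startTouchAngle));
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    float touchAngle = atan2(location.y - imageX.center.y, location.x - imageX.center.x);
    // Fold the finished drag into the wheel's stored rotation.
    currentRotation += (touchAngle - startTouchAngle);
}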