Drawing bezier curves with my finger in iOS?

Hey, I'm trying to figure out how to generate bezier curves in iOS based on user input. Are there any existing classes for this? Can someone give me a general summary of what would be required? I just need help getting started on the right foot.

If you want to stay in Objective-C, you can use UIBezierPath's addCurveToPoint:controlPoint1:controlPoint2: method. You can also use a similarly named function with CGPaths. A cubic bezier curve needs 4 points: a starting point, an ending point, and two control points that shape the curve.
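For example, here is a minimal sketch of both approaches (the point values are just placeholders, and both snippets assume they run inside a UIView's drawRect:):
// UIBezierPath version
UIBezierPath *path = [UIBezierPath bezierPath];
[path moveToPoint:CGPointMake(20.0, 100.0)];
[path addCurveToPoint:CGPointMake(280.0, 100.0) controlPoint1:CGPointMake(100.0, 20.0) controlPoint2:CGPointMake(200.0, 180.0)];
[path stroke];
// Core Graphics equivalent
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGContextMoveToPoint(ctx, 20.0, 100.0);
CGContextAddCurveToPoint(ctx, 100.0, 20.0, 200.0, 180.0, 280.0, 100.0);
CGContextStrokePath(ctx);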
One way to define this is to have the user drag a finger to define the start and end points, then tap the screen at the control points. Here is an example view to handle this.
BezierView.h
enum {
BezierStateNone = 0,
BezierStateDefiningLine,
BezierStateDefiningCP1,
BezierStateDefiningCP2
};
@interface BezierView : UIView {
CGPoint startPt, endPt, cPt1, cPt2;
UInt8 state;
UIBezierPath *curvePath;
@private
UITouch *currentTouch;
}
@property (nonatomic, retain) UIBezierPath *curvePath;
@end
BezierView.m
@implementation BezierView
@dynamic curvePath;
- (UIBezierPath *)curvePath {
return [[curvePath retain] autorelease];
}
- (void)setCurvePath:(UIBezierPath *)newPath {
id tmp = curvePath;
curvePath = [newPath retain];
[tmp release];
state = BezierStateNone;
[self setNeedsDisplay];
}
- (void)_updateCurve {
UIBezierPath *path = [UIBezierPath bezierPath];
[path moveToPoint:startPt];
[path addCurveToPoint:endPt controlPoint1:cPt1 controlPoint2:cPt2];
// store the new path directly in the ivar so the setter does not reset the state machine
[curvePath release];
curvePath = [path retain];
[self setNeedsDisplay];
}
- (void)_calcDefaultControls {
if(ABS(startPt.x - endPt.x) > ABS(startPt.y - endPt.y)) {
cPt1 = (CGPoint){(startPt.x + endPt.x) / 2, startPt.y};
cPt2 = (CGPoint){cPt1.x, endPt.y};
} else {
cPt1 = (CGPoint){startPt.x, (startPt.y + endPt.y) / 2};
cPt2 = (CGPoint){endPt.x, cPt1.y};
}
}
- (void)drawRect:(CGRect)rect {
UIBezierPath *path = self.curvePath;
if(path) [path stroke];
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
if(currentTouch) return;
if(state == BezierStateNone) {
state = BezierStateDefiningLine;
currentTouch = [touches anyObject];
startPt = [currentTouch locationInView:self];
} else if(state == BezierStateDefiningCP1) {
currentTouch = [touches anyObject];
cPt1 = [currentTouch locationInView:self];
[self _updateCurve];
} else if(state == BezierStateDefiningCP2) {
currentTouch = [touches anyObject];
cPt2 = [currentTouch locationInView:self];
[self _updateCurve];
}
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
if(!currentTouch) return;
if(state == BezierStateDefiningLine) {
endPt = [currentTouch locationInView:self];
[self _calcDefaultControls];
[self _updateCurve];
} else if(state == BezierStateDefiningCP1) {
cPt1 = [currentTouch locationInView:self];
[self _updateCurve];
} else if(state == BezierStateDefiningCP2) {
cPt2 = [currentTouch locationInView:self];
[self _updateCurve];
}
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
if(!currentTouch) return;
if(state == BezierStateDefiningLine) {
state = BezierStateDefiningCP1;
} else if(state == BezierStateDefiningCP1) {
state = BezierStateDefiningCP2;
} else if(state == BezierStateDefiningCP2) {
state = BezierStateNone;
}
currentTouch = nil;
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
if(state == BezierStateDefiningLine) {
self.curvePath = nil;
state = BezierStateNone;
}
currentTouch = nil;
}
@end

Okay, the easiest way to do that is probably to subclass UIView and use Core Graphics for drawing. Check out Apple's QuartzDemo sample code.
Implement the drawRect: method for your custom view class, and detect the user's touches with touchesBegan:, touchesMoved:, etc.
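As a rough skeleton (the class and ivar names here are placeholders, not taken from QuartzDemo), the idea is to collect points in the touch callbacks and trigger a redraw, so that all actual drawing happens in drawRect::
@interface DrawingView : UIView {
    NSMutableArray *points; // collected touch locations
}
@end
@implementation DrawingView
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!points) points = [[NSMutableArray alloc] init];
    CGPoint p = [[touches anyObject] locationInView:self];
    [points addObject:[NSValue valueWithCGPoint:p]];
    [self setNeedsDisplay]; // asks UIKit to call drawRect: again
}
- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    // build and stroke a path from the collected points here
}
@end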
Here is some example code (taken from QuartzDemo) for drawing a bezier curve:
// Drawing with a white stroke color
CGContextSetRGBStrokeColor(context, 1.0, 1.0, 1.0, 1.0);
// Draw them with a 2.0 stroke width so they are a bit more visible.
CGContextSetLineWidth(context, 2.0);
// Draw a bezier curve with end points s,e and control points cp1,cp2
CGPoint s = CGPointMake(30.0, 120.0);
CGPoint e = CGPointMake(300.0, 120.0);
CGPoint cp1 = CGPointMake(120.0, 30.0);
CGPoint cp2 = CGPointMake(210.0, 210.0);
CGContextMoveToPoint(context, s.x, s.y);
CGContextAddCurveToPoint(context, cp1.x, cp1.y, cp2.x, cp2.y, e.x, e.y);
CGContextStrokePath(context);
Hope that helps you get started ;)

Related

How to detect in touchesEnded which nodes are still being pressed

I've been stuck on this problem for days. What I have is multiple SKSpriteNodes: one for a left arrow, one for a right arrow, and one for an up arrow. When I hold down the right arrow I want my character to keep moving right while it's being held down; on the other hand, pressing the up arrow should make the character jump only once, regardless of how long it's held down.
My problem, for example, is that when I hold the right arrow and then press the up arrow, touchesEnded is called and it stops my character from moving right, even though I still have my finger on the right arrow.
-(void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
[super touchesBegan:touches withEvent:event];
for (UITouch *touch in touches){
CGPoint location = [touch locationInNode:self];
if (CGRectContainsPoint(rightArrow.frame, location)){
[wizard setTexture:[SKTexture textureWithImageNamed:@"wizardRight"]];
didTouchRightArrow = YES;
isLookingRight = YES;
isLookingLeft = NO;
rightArrow.alpha = 0.5;
NSLog(@"Touching right");
}
if (CGRectContainsPoint(leftArrow.frame, location)){
[wizard setTexture:[SKTexture textureWithImageNamed:@"wizardLeft"]];
isLookingRight = NO;
isLookingLeft = YES;
didTouchLeftArrow = YES;
leftArrow.alpha = 0.5;
NSLog(@"Touching left");
}
if (CGRectContainsPoint(upArrow.frame, location)){
didTouchUpArrow = YES;
upArrow.alpha = 0.5;
NSLog(@"Touching up");
}
}
}
-(void) touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
[super touchesEnded:touches withEvent:event];
if (rightArrow.alpha != 1.0){
rightArrow.alpha = 1.0;
}
if (leftArrow.alpha != 1.0){
leftArrow.alpha = 1.0;
}
if (upArrow.alpha != 1.0){
upArrow.alpha = 1.0;
}
for (UITouch *touch in touches){
CGPoint location = [touch locationInNode:self];
if (CGRectContainsPoint(rightArrow.frame, location)){
NSLog(@"Touching right");
didTouchRightArrow = YES;
} else {
NSLog(@"Not touching right");
didTouchRightArrow = NO;
}
if (CGRectContainsPoint(leftArrow.frame, location)){
NSLog(@"Touching Left");
didTouchLeftArrow = YES;
} else {
NSLog(@"not touching left");
didTouchLeftArrow = NO;
}
didTouchUpArrow = NO;
}
}
This may not be the right way to approach the problem, but in touchesEnded I am trying to see if the touch is still in the desired Rect.
You need a way to identify the different nodes which are registering a touch. There is more than one way to do this but I have always found using the name property of a node to be the simplest and easiest to work with.
You already have the right idea by using the BOOLs to register the touch states.
I wrote some code to handle what you are trying to accomplish:
#import "GameScene.h"
@implementation GameScene {
SKSpriteNode *node0;
SKSpriteNode *node1;
BOOL node0touch;
BOOL node1touch;
}
-(void)didMoveToView:(SKView *)view {
self.backgroundColor = [SKColor blackColor];
node0 = [SKSpriteNode spriteNodeWithColor:[SKColor redColor] size:CGSizeMake(100, 100)];
node0.name = @"node0";
node0.position = CGPointMake(100, 300);
[self addChild:node0];
node1 = [SKSpriteNode spriteNodeWithColor:[SKColor blueColor] size:CGSizeMake(100, 100)];
node1.name = @"node1";
node1.position = CGPointMake(400, 300);
[self addChild:node1];
node0touch = false;
node1touch = false;
}
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
for (UITouch *touch in touches) {
CGPoint touchLocation = [touch locationInNode:self];
SKNode *node = [self nodeAtPoint:touchLocation];
if([node.name isEqualToString:@"node0"])
node0touch = true;
if([node.name isEqualToString:@"node1"])
node1touch = true;
}
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
for (UITouch *touch in touches) {
CGPoint touchLocation = [touch locationInNode:self];
SKNode *node = [self nodeAtPoint:touchLocation];
if([node.name isEqualToString:@"node0"])
node0touch = false;
if([node.name isEqualToString:@"node1"])
node1touch = false;
}
}
-(void)update:(CFTimeInterval)currentTime {
if(node0touch)
NSLog(@"node0 touch");
if(node1touch)
NSLog(@"node1 touch");
}
@end

How can I fill the transparent area of the image when my finger moves over it?

I want to draw over the transparent area of the image with a brush, but my code doesn't work very well. I hope someone can help me here. My code:
// Handles the start of a touch
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
CGRect bounds = [self bounds];
UITouch *touch = [[event touchesForView:self] anyObject];
if (![image isPointTransparent:[touch locationInView:self]]
|| ![image isPointTransparent:[touch previousLocationInView:self]])
{
return;
}
firstTouch = YES;
// Convert touch point from UIView referential to OpenGL one (upside-down flip)
location = [touch locationInView:self];
location.y = bounds.size.height - location.y;
}
// Handles the continuation of a touch.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
CGRect bounds = [self bounds];
UITouch *touch = [[event touchesForView:self] anyObject];
if (![image isPointTransparent:[touch locationInView:self]]
|| ![image isPointTransparent:[touch previousLocationInView:self]])
{
return;
}
// Convert touch point from UIView referential to OpenGL one (upside-down flip)
if (firstTouch)
{
firstTouch = NO;
previousLocation = [touch previousLocationInView:self];
previousLocation.y = bounds.size.height - previousLocation.y;
}
else
{
location = [touch locationInView:self];
location.y = bounds.size.height - location.y;
previousLocation = [touch previousLocationInView:self];
previousLocation.y = bounds.size.height - previousLocation.y;
}
// Render the stroke
[self renderLineFromPoint:previousLocation toPoint:location];
}
// Handles the end of a touch event when the touch is a tap.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
CGRect bounds = [self bounds];
UITouch *touch = [[event touchesForView:self] anyObject];
if (![image isPointTransparent:[touch locationInView:self]] || ![image isPointTransparent:[touch previousLocationInView:self]])
{
return;
}
if (firstTouch)
{
firstTouch = NO;
previousLocation = [touch previousLocationInView:self];
previousLocation.y = bounds.size.height - previousLocation.y;
[self renderLineFromPoint:previousLocation toPoint:location];
}
}
The important thing to know is that you should do your actual drawing in the drawRect: of your UIView. So the renderLineFromPoint:toPoint: method in your code should just be building up an array of lines and telling the view to redraw each time, something like this:
- (void)renderLineFromPoint:(CGPoint)from toPoint:(CGPoint)to
{
[lines addObject:[Line lineFrom:from to:to]];
[self setNeedsDisplay];
}
This assumes that you have a Line class which has 2 CGPoint properties. Your drawRect: may look something like this:
- (void)drawRect:(CGRect)rect
{
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextSetRGBStrokeColor(context, 0.0f, 0.0f, 0.0f, 1.0f);
for (Line *line in lines) {
CGContextMoveToPoint(context, line.from.x, line.from.y);
CGContextAddLineToPoint(context, line.to.x, line.to.y);
CGContextStrokePath(context);
}
}
If you do it like this (without OpenGL) there is no need to flip the y-axis.
The only thing left is to implement the isPointTransparent: method. It's difficult to know from your code how this should work.
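For completeness, a minimal sketch of the assumed Line class (the lineFrom:to: name simply matches the convenience constructor used above, and this keeps manual retain/release; the lines ivar is assumed to be an NSMutableArray created in the view's initializer):
@interface Line : NSObject
@property (nonatomic, assign) CGPoint from;
@property (nonatomic, assign) CGPoint to;
+ (Line *)lineFrom:(CGPoint)from to:(CGPoint)to;
@end

@implementation Line
@synthesize from, to;
+ (Line *)lineFrom:(CGPoint)from to:(CGPoint)to {
    Line *line = [[[Line alloc] init] autorelease];
    line.from = from;
    line.to = to;
    return line;
}
@end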

How to check whether a CGPoint is inside a UIImageView?

In touchesBegan:
CGPoint touch_point = [[touches anyObject] locationInView:self.view];
There are tens of UIImageViews around, stored in an NSMutableArray called images. Is there a built-in function to check whether a CGPoint (touch_point) is inside one of the image views? E.g.:
for (UIImageView *image in images) {
// how to test if touch_point is tapped on a image?
}
Thanks
Follow up:
For some unknown reason, pointInside: never returns true. Here is the full code.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
touch_point = [touch locationInView:self.view];
for (UIImageView *image in piece_images) {
if ([image pointInside:touch_point withEvent:event]) {
image.hidden = YES;
} else {
image.hidden = NO;
}
NSLog(#"image %.0f %.0f touch %.0f %.0f", image.center.x, image.center.y, touch_point.x, touch_point.y);
}
}
Although I can see that the two points are sometimes identical in the NSLog output.
I also tried:
if ([image pointInside:touch_point withEvent:nil])
The result is the same: it never returns true.
To rule out a problem with the images themselves, I tried the following:
if (YES or [image pointInside:touch_point withEvent:event])
all images are hidden after the first click on screen.
EDIT 2:
Really weird. Even when I hard-code this:
point.x = image.center.x;
point.y = image.center.y;
the code becomes:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
CGPoint point; // = [touch locationInView:self.view];
for (UIImageView *image in piece_images) {
point.x = image.center.x;
point.y = image.center.y;
if ([image pointInside:point withEvent:event]) {
image.hidden = YES;
NSLog(#"YES");
} else {
image.hidden = NO;
NSLog(#"NO");
}
NSLog(#"image %.0f %.0f touch %.0f %.0f", image.center.x, image.center.y, point.x, point.y);
}
}
pointInside always returns false ...
Sounds to me like the problem is that you need to convert the point from your view's coordinate system to the UIImageView's. You can use UIView's convertPoint:toView: method.
Try replacing
if ([image pointInside:touch_point withEvent:event]) {
with
if ([image pointInside: [self.view convertPoint:touch_point toView: image] withEvent:event]) {
(Note that you are allowed to pass nil instead of event.)
I guess this is a bit late for the questioner, but I hope it is of use to someone.
if ([image pointInside:touch_point withEvent:event]) {
// Point inside
}else {
// Point isn't inside
}
In this case, event is taken from:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
CGPoint point = [touch locationInView:self.view];
for (UIImageView *piece in piece_images) {
CGFloat x_min = piece.center.x - (piece.bounds.size.width / 2);
CGFloat x_max = x_min + piece.bounds.size.width;
CGFloat y_min = piece.center.y - (piece.bounds.size.height / 2);
CGFloat y_max = y_min + piece.bounds.size.height;
if (point.x > x_min && point.x < x_max && point.y > y_min && point.y < y_max ) {
piece.hidden = YES;
} else {
piece.hidden = NO;
}
}
}
Too bad, I had to do it myself...
It's iOS 3.2, Xcode 3.2.2; tried both on the simulator and an iPad.
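For what it's worth, the manual min/max math above is just an unrolled frame test; assuming the image views and the touch point share the same superview coordinate system (and the views are not transformed), the loop body can be shortened to:
for (UIImageView *piece in piece_images) {
    piece.hidden = CGRectContainsPoint(piece.frame, point);
}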

How to create a path using Core Graphics?

I want to create a path: I touch the screen and draw a line in the touch-moved event, and when the line intersects its starting point, I want to fill that path with a colour.
As you can see in the image, I've drawn a line. I just want to detect whether the line intersects its start point again, and then fill that path with my desired color. Also, I'm using Core Graphics to draw the line, but it's very slow on a real device. Could you tell me a way to improve the speed?
Header:
#import <UIKit/UIKit.h>
@interface myView : UIView {
CGMutablePathRef path;
CGPathRef drawingPath;
CGRect start;
BOOL outsideStart;
}
@end
Implementation:
#import "myView.h"
@implementation myView
- (id) init {
if (self = [super init]) {
self.userInteractionEnabled = YES;
self.multipleTouchEnabled = NO;
}
return self;
}
- (void) finishPath {
if (drawingPath) {
CGPathRelease(drawingPath);
}
CGPathCloseSubpath(path);
drawingPath = CGPathCreateCopy(path);
CGPathRelease(path);
path = NULL;
[self setNeedsDisplay];
return;
}
- (void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
path = CGPathCreateMutable();
UITouch *t = [touches anyObject];
CGPoint p = [t locationInView:self];
start = CGRectZero;
start.origin = p;
start = CGRectInset(start,-10, -10);
outsideStart = NO;
CGPathMoveToPoint(path, NULL, p.x, p.y);
}
- (void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
if (!path) {
return;
}
UITouch *t = [touches anyObject];
CGPoint p = [t locationInView:self];
if (CGRectContainsPoint(start,p)) {
if (outsideStart) {
[self finishPath];
}
} else {
outsideStart = YES;
}
CGPathAddLineToPoint(path,NULL,p.x,p.y);
[self setNeedsDisplay];
}
- (void) touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
if (!path) {
return;
}
[self finishPath];
}
- (void) touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
if (!path) {
return;
}
CGPathRelease(path);
path = NULL;
}
- (void) drawRect:(CGRect)rect {
CGContextRef g = UIGraphicsGetCurrentContext();
if (drawingPath) {
CGContextAddPath(g,drawingPath);
[[UIColor redColor] setFill];
[[UIColor blackColor] setStroke];
CGContextDrawPath(g,kCGPathFillStroke);
}
if (path) {
CGContextAddPath(g,path);
[[UIColor blackColor] setStroke];
CGContextDrawPath(g,kCGPathStroke);
}
}
- (void) dealloc {
if (drawingPath) {
CGPathRelease(drawingPath);
}
if (path) {
CGPathRelease(path);
}
[super dealloc];
}
@end
Note that you will probably want to do something so you aren't actually calling setNeedsDisplay every time the path changes. This can get very slow. Suggestions include having an NSTimer that fires every x milliseconds to check if it needs to redisplay and do so if it does, or only redrawing if the touch has moved a significant distance.
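As a rough sketch of the second suggestion (the 4-point threshold is arbitrary, lastPoint is an assumed CGPoint ivar that is not in the code above, and the close-the-loop check against the start rect is omitted for brevity):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!path) return;
    CGPoint p = [[touches anyObject] locationInView:self];
    CGFloat dx = p.x - lastPoint.x;
    CGFloat dy = p.y - lastPoint.y;
    if (dx * dx + dy * dy < 4.0 * 4.0) return; // ignore tiny movements, avoid constant redraws
    lastPoint = p;
    CGPathAddLineToPoint(path, NULL, p.x, p.y);
    [self setNeedsDisplay];
}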
Maybe it can be useful to you link text
You should definitely not be calling [self setNeedsDisplay] every time the path changes; drawing is slow, and that is probably why your code is so slow on the device. If you use OpenGL ES to draw the lines, it will be much quicker, at the expense of being more difficult to understand.

Multi-touch is not working perfectly on UIImageView

I am doing multi-touch on a UIImageView, meaning pinch zoom in and zoom out on the image view. I am using the following code, but it doesn't work very well. Can anyone look at this code?
#import "ZoomingImageView.h"
@implementation ZoomingImageView
@synthesize zoomed;
@synthesize moved;
#define HORIZ_SWIPE_DRAG_MIN 24
#define VERT_SWIPE_DRAG_MAX 24
#define TAP_MIN_DRAG 10
CGPoint startTouchPosition;
CGFloat initialDistance;
- (id)initWithFrame:(CGRect)frame {
if (self = [super initWithFrame:frame]) {
// Initialization code
moved = NO;
zoomed = NO;
}
return self;
}
- (void)drawRect:(CGRect)rect {
// Drawing code
}
- (void)dealloc {
if([timer isValid])
[timer invalidate];
[super dealloc];
}
- (void) setImage: (UIImage*)img
{
zoomed = NO;
moved = NO;
self.transform = CGAffineTransformIdentity;
[super setImage:img];
}
- (CGFloat)distanceBetweenTwoPoints:(CGPoint)fromPoint toPoint:(CGPoint)toPoint {
float x = toPoint.x - fromPoint.x;
float y = toPoint.y - fromPoint.y;
return sqrt(x * x + y * y);
}
- (CGFloat)scaleAmount: (CGFloat)delta {
CGFloat pix = sqrt(self.frame.size.width * self.frame.size.height);
CGFloat scale = 1.0 + (delta / pix);
return scale;
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
if([timer isValid])
[timer invalidate];
moved = NO;
switch ([touches count]) {
case 1:
{
// single touch
UITouch * touch = [touches anyObject];
startTouchPosition = [touch locationInView:self];
initialDistance = -1;
break;
}
default:
{
// multi touch
UITouch *touch1 = [[touches allObjects] objectAtIndex:0];
UITouch *touch2 = [[touches allObjects] objectAtIndex:1];
initialDistance = [self distanceBetweenTwoPoints:[touch1 locationInView:self]
toPoint:[touch2 locationInView:self]];
break;
}
}
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch1 = [[touches allObjects] objectAtIndex:0];
if([timer isValid])
[timer invalidate];
/*if ([touches count] == 1) {
CGPoint pos = [touch1 locationInView:self];
self.transform = CGAffineTransformTranslate(self.transform, pos.x - startTouchPosition.x, pos.y - startTouchPosition.y);
moved = YES;
return;
}****/
if ((initialDistance > 0) && ([touches count] > 1)) {
UITouch *touch2 = [[touches allObjects] objectAtIndex:1];
CGFloat currentDistance = [self distanceBetweenTwoPoints:[touch1 locationInView:self]
toPoint:[touch2 locationInView:self]];
CGFloat movement = currentDistance - initialDistance;
NSLog(#"Touch moved: %f", movement);
CGFloat scale = [self scaleAmount: movement];
self.transform = CGAffineTransformScale(self.transform, scale, scale);
// }
}
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch1 = [[touches allObjects] objectAtIndex:0];
if ([touches count] == 1) {
// double tap to reset to default size
if ([touch1 tapCount] > 1) {
if (zoomed) {
self.transform = CGAffineTransformIdentity;
moved = NO;
zoomed = NO;
}
return;
}
}
else {
// multi-touch
UITouch *touch2 = [[touches allObjects] objectAtIndex:1];
CGFloat finalDistance = [self distanceBetweenTwoPoints:[touch1 locationInView:self]
toPoint:[touch2 locationInView:self]];
CGFloat movement = finalDistance - initialDistance;
NSLog(#"Final Distance: %f, movement=%f",finalDistance,movement);
if (movement != 0) {
CGFloat scale = [self scaleAmount: movement];
self.transform = CGAffineTransformScale(self.transform, scale, scale);
NSLog(#"Scaling: %f", scale);
zoomed = YES;
}
}
}
- (void)singleTap: (NSTimer*)theTimer {
// must override
}
- (void)animateSwipe: (int) direction {
// must override
}
@end
It is not working well on the device. Can anyone tell me where I am going wrong?
When you apply any CGAffineTransform other than the identity, the value of the frame property becomes undefined. In your code you are using frame.size.width and frame.size.height to calculate the change in size, so after the first application of CGAffineTransformScale you no longer get the right scale factor. According to the documentation, bounds is the right property to use for scale calculations.
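A sketch of the adjusted helper, using bounds (which keeps its meaning under a transform) instead of frame:
- (CGFloat)scaleAmount:(CGFloat)delta {
    // bounds is unaffected by the view's transform; frame is undefined once a transform is applied
    CGFloat pix = sqrt(self.bounds.size.width * self.bounds.size.height);
    if (pix == 0.0) return 1.0;
    return 1.0 + (delta / pix);
}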