Implementing eraser functionality for a drawing app - iPhone

I was following this great tutorial on Ray Wenderlich's site about creating a simple drawing app with UIKit:
http://www.raywenderlich.com/18840/how-to-make-a-simple-drawing-app-with-uikit
The tutorial is great and everything works. The problem I have is with the eraser functionality: the solution proposed by Ray is to use the brush with the same color as the background. To me this doesn't seem like a great solution. What if the background is not a solid color, but a gradient or an image, as in so many coloring book apps?
So basically the question is: is there a way to remove color (convert all pixels in that area to transparent, perhaps) from a UIImageView at a given location?
Any help or pointers would be greatly appreciated. Thanks.

I had the same issue in my app; after a long search I found a simple solution for it.
I just used the touch methods of the UIViewController.
Below is my approach.
Code:
#pragma mark touches stuff...

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    lastTouch = [touch locationInView:self.editedImageView];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    currentTouch = [touch locationInView:self.editedImageView];

    // Pick the stroke width depending on whether we are erasing or drawing.
    CGFloat brushSize = isEraser ? eraser : mark;

    CGColorRef strokeColor = [UIColor whiteColor].CGColor;

    UIGraphicsBeginImageContext(self.editedImageView.frame.size);
    CGContextRef context = UIGraphicsGetCurrentContext();

    // Redraw the current image, then stroke the new segment on top of it.
    [self.editedImageView.image drawInRect:CGRectMake(0, 0, self.editedImageView.frame.size.width, self.editedImageView.frame.size.height)];
    CGContextSetLineCap(context, kCGLineCapRound);
    CGContextSetLineWidth(context, brushSize);

    if (isEraser) {
        // Eraser mode: paint the original image back over the stroke (see the note below).
        CGContextSetStrokeColorWithColor(context, [UIColor colorWithPatternImage:self.im].CGColor);
    }
    else {
        // Otherwise stroke with kCGBlendModeClear, which makes the pixels under the
        // stroke transparent (the stroke color is effectively ignored).
        CGContextSetStrokeColorWithColor(context, strokeColor);
        CGContextSetBlendMode(context, kCGBlendModeClear);
    }

    CGContextBeginPath(context);
    CGContextMoveToPoint(context, lastTouch.x, lastTouch.y);
    CGContextAddLineToPoint(context, currentTouch.x, currentTouch.y);
    CGContextStrokePath(context);

    self.editedImageView.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    lastTouch = [touch locationInView:self.editedImageView];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
}
The inputs:
self.editedImageView.image = ...; // your custom image
self.im = ...;                    // your custom image
The simple solution to your problem is:
CGContextSetStrokeColorWithColor(context, [UIColor colorWithPatternImage:self.im].CGColor);
Note: this does not actually clear pixels; it simply draws the image over the stroke again.
Update:
self.im = ...; // your custom image
is set up like this:
-(void)eraserConfigure
{
    UIImageView *resizeImage = [[[UIImageView alloc] initWithImage:editedImage] autorelease];
    // Resize the image to match the edited image view's size.
    self.im = [UIImage imageFromView:resizeImage scaledToSize:CGSizeMake(self.editedImageView.frame.size.width, self.editedImageView.frame.size.height)];
}
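Note that imageFromView:scaledToSize: is not a UIKit method; it is presumably a custom UIImage category defined elsewhere in that project. A minimal sketch of what such a category might look like (the category name and implementation are assumptions):
#import <QuartzCore/QuartzCore.h>

@interface UIImage (ViewSnapshot)
+ (UIImage *)imageFromView:(UIView *)view scaledToSize:(CGSize)size;
@end

@implementation UIImage (ViewSnapshot)

+ (UIImage *)imageFromView:(UIView *)view scaledToSize:(CGSize)size
{
    // Render the view's layer into a bitmap context of the requested size.
    UIGraphicsBeginImageContext(size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextScaleCTM(context,
                      size.width  / view.bounds.size.width,
                      size.height / view.bounds.size.height);
    [view.layer renderInContext:context];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}

@end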

Use the brush but for the color use:
[UIColor clearColor]
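On its own, stroking with [UIColor clearColor] under the default blend mode leaves the destination pixels unchanged, so it is normally combined with kCGBlendModeClear as in the other answers. A minimal erase-stroke sketch, assuming the context, brushSize, lastTouch and currentTouch set up in the touch-handling code above:
CGContextSetLineCap(context, kCGLineCapRound);
CGContextSetLineWidth(context, brushSize);
CGContextSetStrokeColorWithColor(context, [UIColor clearColor].CGColor);
CGContextSetBlendMode(context, kCGBlendModeClear); // source color is ignored; stroked pixels become transparent
CGContextBeginPath(context);
CGContextMoveToPoint(context, lastTouch.x, lastTouch.y);
CGContextAddLineToPoint(context, currentTouch.x, currentTouch.y);
CGContextStrokePath(context);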

I was able to solve the issue using the code below:
CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeClear);

Related

Filling color like effect in iPhone app

I am trying to make a "filling app" that has an image which changes its color up to the location where the user touches it. I implemented a UIPanGestureRecognizer for that and can find the touch, but I was unable to change the color of that particular part of the image.
Try this block of code. If you want to change the color, change strokeColor; if you want to change the size of the brush, see the commented line.
#pragma mark touches stuff...

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    lastTouch = [touch locationInView:imageView];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    currentTouch = [touch locationInView:imageView];

    CGColorRef strokeColor = [UIColor brownColor].CGColor;

    UIGraphicsBeginImageContext(imageView.frame.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [imageView.image drawInRect:CGRectMake(0, 0, imageView.frame.size.width, imageView.frame.size.height)];
    CGContextSetLineCap(context, kCGLineCapRound);
    CGContextSetLineWidth(context, 10);                     // you can change the brush size here
    CGContextSetStrokeColorWithColor(context, strokeColor); // change this to the required color
    CGContextSetBlendMode(context, kCGBlendModeDarken);     // try the various blend modes
    CGContextBeginPath(context);
    CGContextMoveToPoint(context, lastTouch.x, lastTouch.y);
    CGContextAddLineToPoint(context, currentTouch.x, currentTouch.y);
    CGContextStrokePath(context);

    imageView.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    lastTouch = [touch locationInView:imageView];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    lastTouch = [touch locationInView:imageView];
}
I hope this helps.

iPhone SDK Erase A UIImageView from the Screen Using Touches?

I am looking for a way to erase a UIImageView from the screen. When I say erase, I don't mean [imageView removeFromSuperview]; I mean erasing parts of the image by scribbling your finger on the screen. Wherever your finger is, that is the portion of the image that is erased. I just can't find any help with this.
I would imagine it has to do with Quartz? If so, I'm not really good with that. :(
I guess the best example is a lottery ticket. Once you scratch a portion of the ticket, the area beneath it is revealed. Does anyone know how to accomplish this?
Thank you!
Update: The following code is what did the trick. Thank you!
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    lastTouch = [touch locationInView:canvasView];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    currentTouch = [touch locationInView:canvasView];

    CGFloat brushSize = 35;
    CGColorRef strokeColor = [UIColor whiteColor].CGColor;

    UIGraphicsBeginImageContext(scratchView.frame.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [canvasView.image drawInRect:CGRectMake(0, 0, canvasView.frame.size.width, canvasView.frame.size.height)];
    CGContextSetLineCap(context, kCGLineCapRound);
    CGContextSetLineWidth(context, brushSize);
    CGContextSetStrokeColorWithColor(context, strokeColor);
    CGContextSetBlendMode(context, kCGBlendModeClear);
    CGContextBeginPath(context);
    CGContextMoveToPoint(context, lastTouch.x, lastTouch.y);
    CGContextAddLineToPoint(context, currentTouch.x, currentTouch.y);
    CGContextStrokePath(context);

    canvasView.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    lastTouch = [touch locationInView:canvasView];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
}
You can definitely do this with a UIImageView, no custom Quartz layer required. Are you familiar with any form of drawing in iOS? Basically, you just need to keep track of the current and previous touch locations using touchesBegan:, touchesMoved:, and touchesEnded:.
Then you need to draw a 'line' (which in this case erases what's underneath it) between the current touch location and previous touch location using something like the following, which is taken directly from an actual application I developed that did something rather similar:
UIGraphicsBeginImageContext(canvasView.frame.size);
CGContextRef context = UIGraphicsGetCurrentContext();
[canvasView.image drawInRect:CGRectMake(0, 0, canvasView.frame.size.width, canvasView.frame.size.height)];
CGContextSetLineCap(context, lineCapType);
CGContextSetLineWidth(context, brushSize);
CGContextSetStrokeColorWithColor(context, strokeColor);
CGContextSetBlendMode(context, kCGBlendModeClear);
CGContextBeginPath(context);
CGContextMoveToPoint(context, lastTouch.x, lastTouch.y);
CGContextAddLineToPoint(context, currentTouch.x, currentTouch.y);
CGContextStrokePath(context);
canvasView.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
In this code canvasView is a UIImageView. There are lots of tutorials out there for this kind of drawing. The important thing for what you want is to set the blend mode to kCGBlendModeClear. That's this line:
CGContextSetBlendMode(context, kCGBlendModeClear);

I want to draw a rectangle-like shape with draggable corners in iOS

I want to overlay a box shape over a photograph and allow the user to select each corner and drag it to where they want.
I could use 4 invisible buttons (to represent each corner) that respond to drag events to get the x,y points for each corner, but is there some line-drawing functionality available in Xcode without touching any of the game API classes? I guess I want to draw lines onto a UIView.
Many thanks,
-Code
Make a subclass of UIView to represent your view, and add a UIImageView to it; this will hold the image with the user's drawing.
Enable user interaction in the UIView subclass:
self.userInteractionEnabled = YES;
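For reference, a minimal sketch of that setup; the imageView property name is an assumption and would need to be declared in the subclass's header:
- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        self.userInteractionEnabled = YES;  // receive touch events
        // Assumed property: @property (nonatomic, strong) UIImageView *imageView;
        self.imageView = [[UIImageView alloc] initWithFrame:self.bounds];
        [self addSubview:self.imageView];   // holds the image with the user's drawing
    }
    return self;
}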
Detect the starting tap by implementing this method in your subclass:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // We are starting to draw.
    // Get the current touch.
    UITouch *touch = [touches anyObject];
    startingPoint = [touch locationInView:self];
}
Detect the final touch to draw a straight line:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    endingPoint = [touch locationInView:self];

    // Now draw the line and save it to your image.
    UIGraphicsBeginImageContext(self.frame.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, 10);
    CGContextMoveToPoint(context, startingPoint.x, startingPoint.y);
    CGContextAddLineToPoint(context, endingPoint.x, endingPoint.y);
    CGContextSetRGBFillColor(context, 1.0, 1.0, 1.0, 1.0);
    CGContextSetRGBStrokeColor(context, 1.0, 1.0, 1.0, 1.0);
    CGContextStrokePath(context);
    // Note: self.image exists only if this class is (or wraps) a UIImageView; with the
    // plain UIView setup above, assign to the contained image view's image instead.
    self.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}

Drawing paint brush using UIBezierPath works with UIView but not with UIImageView?

I am making a paint brush using UIBezierPath with the following code.
.h file:
@interface MyLineDrawingView : UIView
{
    UIBezierPath *myPath;
    UIColor *brushPattern;
}
@end
.m file:
-(id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self)
    {
        self.backgroundColor = [UIColor colorWithPatternImage:[UIImage imageNamed:@"0010.png"]];
        myPath = [[UIBezierPath alloc] init];
        myPath.lineWidth = 30;
        brushPattern = [UIColor redColor];
    }
    return self;
}
- (void)drawRect:(CGRect)rect
{
    [brushPattern setStroke];
    [myPath strokeWithBlendMode:kCGBlendModeNormal alpha:1.0];
    // [myPath strokeWithBlendMode:kCGBlendModeSaturation alpha:1.0];
    // Drawing code
    //[myPath stroke];
}
#pragma mark - Touch Methods
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *mytouch = [[touches allObjects] objectAtIndex:0];
    [myPath moveToPoint:[mytouch locationInView:self]];
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *mytouch = [[touches allObjects] objectAtIndex:0];
    [myPath addLineToPoint:[mytouch locationInView:self]];
    [self setNeedsDisplay];
}
It works well with UIView; as you can see, the custom class above inherits from UIView. But when I subclass UIImageView instead of UIView, I am not able to draw anything. The touch methods are called, but nothing is drawn on the screen. Does anyone know what is wrong with this, or how I can resolve it?
The reason I want to change from UIView to UIImageView is that I want to set the image and change the color. When I use UIView and use
self.backgroundColor = [UIColor colorWithPatternImage:[UIImage imageNamed:@"0010.png"]];
the image does not fit the view; only part of the whole image is visible. Even if I change the content mode to UIViewContentModeScaleToFill, the whole image is still not visible.
From the UIImageView documentation:
The UIImageView class is optimized to draw its images to the display. UIImageView does not call the drawRect: method of its subclasses. If your subclass needs custom drawing code, it is recommended that you use UIView as the base class.
In short, drawRect: can't be used in UIImageView subclasses. Add an image view as a subview, or draw the image as part of your drawRect: instead.
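For the second option, a minimal sketch of drawing the image inside drawRect: of the UIView subclass above; the backgroundImage UIImage property is an assumption:
- (void)drawRect:(CGRect)rect
{
    // Draw the image first (scaled to the view), then stroke the path over it.
    [self.backgroundImage drawInRect:self.bounds];
    [brushPattern setStroke];
    [myPath strokeWithBlendMode:kCGBlendModeNormal alpha:1.0];
}
Drawing with drawInRect:self.bounds also scales the image to fill the view, which avoids the tiling behavior of colorWithPatternImage: mentioned in the question.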
It can be done using the following code:
@interface Canvas : UIImageView {
    CGPoint location;
}
@property CGPoint location;
@end
.m file:
@synthesize location;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    self.location = [touch locationInView:self];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentLocation = [touch locationInView:self];

    UIGraphicsBeginImageContext(self.frame.size);
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    //CGContextSetBlendMode(ctx, kCGBlendModeOverlay);
    [self.image drawInRect:CGRectMake(0, 0, self.frame.size.width, self.frame.size.height)];
    CGContextSetLineCap(ctx, kCGLineCapRound);
    CGContextSetBlendMode(ctx, kCGBlendModeMultiply);
    CGContextSetLineWidth(ctx, 5.0);
    CGContextSetRGBStrokeColor(ctx, 1.0, 0.0, 0.0, 1.0);
    //CGContextSetBlendMode(ctx, kCGBlendModeOverlay);
    CGContextBeginPath(ctx);
    CGContextMoveToPoint(ctx, location.x, location.y);
    CGContextAddLineToPoint(ctx, currentLocation.x, currentLocation.y);
    CGContextStrokePath(ctx);
    self.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    location = currentLocation;
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentLocation = [touch locationInView:self];

    UIGraphicsBeginImageContext(self.frame.size);
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    // CGContextSetBlendMode(ctx, kCGBlendModeOverlay);
    [self.image drawInRect:CGRectMake(0, 0, self.frame.size.width, self.frame.size.height)];
    CGContextSetBlendMode(ctx, kCGBlendModeMultiply);
    CGContextSetLineCap(ctx, kCGLineCapRound);
    CGContextSetLineWidth(ctx, 5.0);
    CGContextSetRGBStrokeColor(ctx, 1.0, 0.0, 0.0, 1.0);
    CGContextBeginPath(ctx);
    CGContextMoveToPoint(ctx, location.x, location.y);
    CGContextAddLineToPoint(ctx, currentLocation.x, currentLocation.y);
    CGContextStrokePath(ctx);
    self.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    location = currentLocation;
}

In subclassed UIImageView, touchesBegan fires, touchesMoved fires a couple times, but then stops for no reason

I have subclassed a UIImageView and have implemented touchesBegan/Moved/Ended like this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"Began");
    mouseSwiped = NO;
    UITouch *touch = [touches anyObject];
    if ([touch tapCount] == 2) {
        self.image = nil;
        return;
    }
    lastPoint = [touch locationInView:self];
    lastPoint.y -= 20;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    mouseSwiped = YES;
    NSLog(@"Moved");
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self];
    currentPoint.y -= 20;

    UIGraphicsBeginImageContext(self.bounds.size);
    [self.image drawInRect:CGRectMake(0, 0, self.bounds.size.width, self.bounds.size.height)];
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 18.0);
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0.0, 0.5, 1.0, 1.0);
    CGContextBeginPath(UIGraphicsGetCurrentContext());
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    self.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    lastPoint = currentPoint;
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"Ended");
    UITouch *touch = [touches anyObject];
    if ([touch tapCount] == 2) {
        self.image = nil;
        return;
    }
    if (!mouseSwiped) {
        NSLog(@"here?");
        UIGraphicsBeginImageContext(self.bounds.size);
        [self.image drawInRect:CGRectMake(0, 0, self.bounds.size.width, self.bounds.size.height)];
        CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
        CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 18.0);
        CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0.0, 0.5, 1.0, 1.0);
        CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
        CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
        CGContextStrokePath(UIGraphicsGetCurrentContext());
        CGContextFlush(UIGraphicsGetCurrentContext());
        self.image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
    }
}
And it appears to be working for the first little bit. touchesBegan: fires every time; then when I start moving, touchesMoved: fires sometimes 5 times, sometimes 3 times, sometimes 7 times, but then it just stops. touchesEnded: is never fired, and I just don't see what is going on!
I've been staring at this for a while now; does anyone see something I am missing?
Override and implement: - (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
See if that is being called.
If so, something is cancelling your touch event. Could be low memory.
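A quick diagnostic sketch of that override:
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"Cancelled");   // if this logs, the system is cancelling the touch sequence
    [super touchesCancelled:touches withEvent:event];
}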
For anyone reading this in the future, the solution is to wrap the UIImageView inside a container UIView. The way I do it for my drawing apps is to dedicate a single UIView to one purpose: containing the UIImageView on which I draw.
Create a class named "ImageContainer" as a subclass of UIView:
ImageContainer *view = [[ImageContainer alloc] initWithFrame:CGRectMake(0, 0, 100, 100)];
Of course, the frame can be whatever you want it to be.
Then, inside the "ImageContainer" class (the header file), add a property as follows:
@property UIImageView *imageView;
Synthesize that property inside the implementation file (ImageContainer.m) and add it as a subview of the ImageContainer in the - (id)initWithFrame:(CGRect)frame method.
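A minimal sketch of what ImageContainer.m might look like; the details are assumptions rather than the original poster's exact code:
@implementation ImageContainer

@synthesize imageView;

- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // The image view fills the container and holds the image that is drawn on.
        imageView = [[UIImageView alloc] initWithFrame:self.bounds];
        [self addSubview:imageView];
    }
    return self;
}

@end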
Then, in your draw code, instead of referencing:
self.image
Reference the following:
[[[self imageView] image] drawInRect:self.bounds];
[[self imageView] setImage:...];
Hope this helps someone out there!