iPhone SDK: Erase a UIImageView from the Screen Using Touches?

I am looking for a way of being able to erase a UIImageView from the screen. When I say erase, I don't mean [imageView removeFromSuperview];, I mean erasing parts of the image by scribbling your finger on the screen. Wherever your finger is, that is the portion of the image that is erased. I just can't find any help with this.
I would imagine it has to do with Quartz? If so, I'm not really good with that. :(
I guess the best example is a lottery ticket. Once you scratch a portion of the ticket, the area beneath it is revealed. Does anyone know how to accomplish this?
Thank you!
Update: The following code is what did the trick. Thank you!
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    lastTouch = [touch locationInView:canvasView];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    currentTouch = [touch locationInView:canvasView];

    CGFloat brushSize = 35;
    CGColorRef strokeColor = [UIColor whiteColor].CGColor;

    // Redraw the current image, then stroke the latest segment with the clear
    // blend mode so that segment becomes transparent.
    UIGraphicsBeginImageContext(canvasView.frame.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [canvasView.image drawInRect:CGRectMake(0, 0, canvasView.frame.size.width, canvasView.frame.size.height)];
    CGContextSetLineCap(context, kCGLineCapRound);
    CGContextSetLineWidth(context, brushSize);
    CGContextSetStrokeColorWithColor(context, strokeColor);
    CGContextSetBlendMode(context, kCGBlendModeClear);
    CGContextBeginPath(context);
    CGContextMoveToPoint(context, lastTouch.x, lastTouch.y);
    CGContextAddLineToPoint(context, currentTouch.x, currentTouch.y);
    CGContextStrokePath(context);
    canvasView.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    lastTouch = [touch locationInView:canvasView];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
}
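For reference, the snippet above assumes canvasView is a UIImageView outlet and that lastTouch and currentTouch are CGPoint instance variables. A minimal declaration sketch (the class name is hypothetical; only the member names come from the code above):
// Sketch of the interface the touch-handling code assumes (ScratchViewController is an illustrative name).
@interface ScratchViewController : UIViewController {
    CGPoint lastTouch;     // where the previous touch event landed
    CGPoint currentTouch;  // where the latest touch event landed
}
@property (nonatomic, retain) IBOutlet UIImageView *canvasView; // image being erased
@end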

You can definitely do this with a UIImageView; no custom Quartz layer is required. Are you familiar with any form of drawing in iOS? Basically, you just need to keep track of the current and previous touch locations using touchesBegan:, touchesMoved:, and touchesEnded:.
Then you need to draw a 'line' (which in this case erases what's underneath it) between the current touch location and the previous touch location, using something like the following, which is taken directly from an actual application I developed that did something rather similar:
UIGraphicsBeginImageContext(canvasView.frame.size);
CGContextRef context = UIGraphicsGetCurrentContext();
[canvasView.image drawInRect:CGRectMake(0, 0, canvasView.frame.size.width, canvasView.frame.size.height)];
CGContextSetLineCap(context, lineCapType);
CGContextSetLineWidth(context, brushSize);
CGContextSetStrokeColorWithColor(context, strokeColor);
CGContextSetBlendMode(context, kCGBlendModeClear);
CGContextBeginPath(context);
CGContextMoveToPoint(context, lastTouch.x, lastTouch.y);
CGContextAddLineToPoint(context, currentTouch.x, currentTouch.y);
CGContextStrokePath(context);
canvasView.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
In this code canvasView is a UIImageView. There are lots of tutorials out there for this kind of drawing. The important thing for what you want is to set the blend mode to kCGBlendModeClear. That's this line:
CGContextSetBlendMode(context, kCGBlendModeClear);
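One refinement worth noting (my addition, not part of the original answer): UIGraphicsBeginImageContext always creates a 1x context, so erased edges can look soft on Retina screens. If you can target iOS 4 or later, UIGraphicsBeginImageContextWithOptions lets you pass 0 for the scale so the context matches the device's screen:
// Sketch: scale-aware variant of the context setup above (0 = use the screen's scale).
UIGraphicsBeginImageContextWithOptions(canvasView.frame.size, NO, 0);
// ...same CGContext drawing calls as above...
canvasView.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();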

Related

I have used Quartz 2D to draw a line; now how to UNDO the drawn line?

Well, I'm using Quartz 2D to draw things. I have followed this tutorial and it's working fine, but I need to implement an UNDO option.
I have an undo button; when I press it, it must undo the drawn line.
I'm using the code below to draw. Does anyone know the solution for this?
Thanks in advance.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    mouseSwiped = NO;
    UITouch *touch = [touches anyObject];
    lastPoint = [touch locationInView:self.view];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    mouseSwiped = YES;
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];

    UIGraphicsBeginImageContext(self.view.frame.size);
    [self.tempDrawImage.image drawInRect:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)];
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), brush);
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), red, green, blue, 1.0);
    CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeNormal);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    self.tempDrawImage.image = UIGraphicsGetImageFromCurrentImageContext();
    [self.tempDrawImage setAlpha:opacity];
    UIGraphicsEndImageContext();

    lastPoint = currentPoint;
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!mouseSwiped) {
        // The touch never moved, so draw a single dot at the touch location.
        UIGraphicsBeginImageContext(self.view.frame.size);
        [self.tempDrawImage.image drawInRect:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)];
        CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
        CGContextSetLineWidth(UIGraphicsGetCurrentContext(), brush);
        CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), red, green, blue, opacity);
        CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
        CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
        CGContextStrokePath(UIGraphicsGetCurrentContext());
        CGContextFlush(UIGraphicsGetCurrentContext());
        self.tempDrawImage.image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
    }

    // Merge the temporary drawing layer into the main image, then clear it.
    UIGraphicsBeginImageContext(self.mainImage.frame.size);
    [self.mainImage.image drawInRect:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height) blendMode:kCGBlendModeNormal alpha:1.0];
    [self.tempDrawImage.image drawInRect:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height) blendMode:kCGBlendModeNormal alpha:opacity];
    self.mainImage.image = UIGraphicsGetImageFromCurrentImageContext();
    self.tempDrawImage.image = nil;
    UIGraphicsEndImageContext();
}
You cannot undo a specific drawing operation. You have to save all the touches and store them in an appropriate way (your "step" object). You store these "step" objects in a drawing session (an array), and when you want to undo, you delete the last object in the array and then redraw the whole image from the steps remaining in the (session) array.
The simplest way to implement undo is to capture the rectangle that is about to change and store the contents of that rectangle in memory or in a file. When you want to undo, simply draw the saved rectangle back at the correct coordinates using kCGBlendModeCopy.
For multiple levels of undo, store these rectangles in an array and treat your position in it as a cursor: to undo, move back one position; to redo, move forward one position.
If you are modifying your image in real time (i.e. the user is drawing with their finger), you can't grab the rectangle beforehand. Instead you'll need a second buffer that contains a copy of your image, and you can use it to extract the undo rectangle after the draw operation finishes. Once it finishes, copy the image into the undo buffer.
Good luck!
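To illustrate the first suggestion (save the strokes, then redraw), here is a rough sketch; the Stroke class, the strokes array, and their property names are mine, not from the original code:
// Hypothetical model object for one drawn segment (illustrative names only).
@interface Stroke : NSObject
@property (nonatomic, assign) CGPoint from;
@property (nonatomic, assign) CGPoint to;
@property (nonatomic, assign) CGFloat width;
@property (nonatomic, retain) UIColor *color;
@end

// In the view controller: append a Stroke for every segment you draw, then rebuild
// the image from scratch whenever the user taps undo.
- (void)undo {
    if ([self.strokes count] == 0) return;
    [self.strokes removeLastObject];                 // drop the most recent stroke
    UIGraphicsBeginImageContext(self.mainImage.frame.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineCap(context, kCGLineCapRound);
    for (Stroke *s in self.strokes) {                // replay everything that is left
        CGContextSetLineWidth(context, s.width);
        CGContextSetStrokeColorWithColor(context, s.color.CGColor);
        CGContextMoveToPoint(context, s.from.x, s.from.y);
        CGContextAddLineToPoint(context, s.to.x, s.to.y);
        CGContextStrokePath(context);
    }
    self.mainImage.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}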

Implementing Eraser functionality for a drawing app

I was following this great tutorial on Ray Wenderlich's site about creating a simple drawing app with UIKit:
http://www.raywenderlich.com/18840/how-to-make-a-simple-drawing-app-with-uikit
The tutorial is great and everything is working. The problem I have is with the eraser: the solution proposed by Ray is to use the brush with the same color as the background. To me this doesn't seem like a great solution. What if the background is not a solid color, but a gradient or an image, like in so many coloring book apps?
So basically the question is: is there a way to remove color (convert all pixels in that area to transparent, maybe) from a UIImageView at a given location?
Any help or pointers would be greatly appreciated. Thanks.
I had the same issue in my app. After a long search I found a simple solution for it:
I just used the touch methods of the UIViewController.
Below is my approach.
Code:
#pragma mark touches stuff...

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    lastTouch = [touch locationInView:self.editedImageView];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    currentTouch = [touch locationInView:self.editedImageView];

    CGFloat brushSize;
    if (isEraser) {
        brushSize = eraser;
    } else {
        brushSize = mark;
    }

    CGColorRef strokeColor = [UIColor whiteColor].CGColor;

    UIGraphicsBeginImageContext(self.editedImageView.frame.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [self.editedImageView.image drawInRect:CGRectMake(0, 0, self.editedImageView.frame.size.width, self.editedImageView.frame.size.height)];
    CGContextSetLineCap(context, kCGLineCapRound);
    CGContextSetLineWidth(context, brushSize);
    if (isEraser) {
        // "Erase" by painting the original image back over the stroke.
        CGContextSetStrokeColorWithColor(context, [UIColor colorWithPatternImage:self.im].CGColor);
    } else {
        // Normal mode: stroke with the clear blend mode to cut transparent pixels.
        CGContextSetStrokeColorWithColor(context, strokeColor);
        CGContextSetBlendMode(context, kCGBlendModeClear);
    }
    CGContextBeginPath(context);
    CGContextMoveToPoint(context, lastTouch.x, lastTouch.y);
    CGContextAddLineToPoint(context, currentTouch.x, currentTouch.y);
    CGContextStrokePath(context);
    self.editedImageView.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    lastTouch = [touch locationInView:self.editedImageView];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
}
The inputs:
self.editedImageView.image = "your custom image";
self.im = "your custom image";
The simple solution to your problem is:
CGContextSetStrokeColorWithColor(UIGraphicsGetCurrentContext(), [UIColor colorWithPatternImage:self.im].CGColor);
Note:
This does not un-erase; it just draws the image over the stroke again.
Update:
self.im = "your custom image";
is set up like this:
- (void)eraserConfigure {
    UIImageView *resizeImage = [[[UIImageView alloc] initWithImage:editedImage] autorelease];
    self.im = [UIImage imageFromView:resizeImage scaledToSize:CGSizeMake(self.editedImageView.frame.size.width, self.editedImageView.frame.size.height)];
}
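imageFromView:scaledToSize: is not a UIKit method, so it presumably comes from a custom UIImage category. A guess at what such a helper might look like (this is my sketch, not the poster's code):
#import <QuartzCore/QuartzCore.h>   // for -[CALayer renderInContext:]

// Hypothetical UIImage category matching the call above: render a view's layer into
// an image context of the requested size and return the result.
@implementation UIImage (ViewSnapshot)
+ (UIImage *)imageFromView:(UIView *)view scaledToSize:(CGSize)size {
    UIGraphicsBeginImageContext(size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    // Scale the context so the whole view fits the requested size.
    CGContextScaleCTM(context, size.width / view.bounds.size.width,
                               size.height / view.bounds.size.height);
    [view.layer renderInContext:context];
    UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return snapshot;
}
@end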
Use the brush but for the color use:
[UIColor clearColor]
I was able to solve the issue using the code below:
CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeClear);
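Putting those two answers together, a minimal brush/eraser toggle inside touchesMoved: could look like this (a sketch based on the earlier code; isEraser and strokeColor are the same names used above):
// Sketch: choose between drawing and erasing before stroking the segment.
if (isEraser) {
    // The clear blend mode punches transparent pixels into the image,
    // regardless of the stroke color.
    CGContextSetBlendMode(context, kCGBlendModeClear);
    CGContextSetStrokeColorWithColor(context, [UIColor clearColor].CGColor);
} else {
    CGContextSetBlendMode(context, kCGBlendModeNormal);
    CGContextSetStrokeColorWithColor(context, strokeColor);
}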

How to draw continuous lines with QuartzCore in iOS

I have a question about Quartz on iOS.
When I use Quartz to draw lines, the following happens and I can't find the reason:
when I draw the second line, the first line disappears;
when I draw the third line, the second disappears, and the 1st and 3rd show;
...
when I draw the (2n+1)th line, lines 1, 3, 5, ..., 2n-1 show and lines 2, 4, 6, 8, ..., 2n disappear.
See the code below; I don't save contexts or paths.
As I understand it, one of two things should happen:
all the lines I have drawn are displayed, or
only the last line I drew is displayed and the previous lines disappear.
But neither of these happens.
- (void)drawInContext:(CGContextRef)context {
    // Eraser
    //CGContextClearRect(context, CGRectMake(0, 0, 320, 480));
    CGContextSetLineWidth(context, 4.0);
    CGContextMoveToPoint(context, previousPoint.x, previousPoint.y);
    CGContextSetRGBStrokeColor(context, 0, 1.0, 1.0, 1.0);
    CGContextAddLineToPoint(context, nextPoint.x, nextPoint.y);
    CGContextStrokePath(context);
    //NIF_TRACE(@"began : %@ moved : %@", NSStringFromCGPoint(previousPoint), NSStringFromCGPoint(nextPoint));
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    previousPoint = [touch locationInView:self];
    nextPoint = [touch locationInView:self];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    previousPoint = nextPoint;
    nextPoint = [touch locationInView:self];
    [self setNeedsDisplay];
}
I saw a demo that uses an NSMutableArray to save all the UIBezierPaths it has drawn; when the view redraws, it takes the paths saved in the array and restores them in drawRect:.
UIBezierPath is an Objective-C wrapper, and it only works on iOS 3.2+.
I need to make this work on iOS 3.0+.
I think there must be a better method for saving contexts and paths (colors, paths, stroke widths).
Does anybody have ideas?
Every time you drawInContext: you are clearing the drawing area and placing the new line. This clearing code:
CGContextClearRect(context, CGRectMake(0, 0, 320, 480));
needs to be somewhere else in your code, somewhere where it will only run once (or when you need to clear the whole drawing, perhaps in an "erase all" function).
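As a rough illustration of that suggestion (clearCanvas and needsClear are my names, not from the question), you could guard the clear so it only runs when an explicit "erase all" is requested:
// Sketch only: wipe the canvas on demand instead of on every draw pass.
- (void)clearCanvas {
    needsClear = YES;            // BOOL ivar, assumed to exist
    [self setNeedsDisplay];
}

- (void)drawInContext:(CGContextRef)context {
    if (needsClear) {
        CGContextClearRect(context, CGRectMake(0, 0, 320, 480));
        needsClear = NO;
    }
    // ...existing line-drawing code from the question...
}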
You need to save the image already drawn on the canvas:
CGImageRef imageRef;

- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    if (imageRef) {
        // Restore the screen that was previously saved.
        // The CTM flip keeps the cached image from being drawn upside down.
        CGContextTranslateCTM(context, 0, rect.size.height);
        CGContextScaleCTM(context, 1.0, -1.0);
        CGContextDrawImage(context, rect, imageRef);
        CGImageRelease(imageRef);
        CGContextTranslateCTM(context, 0, rect.size.height);
        CGContextScaleCTM(context, 1.0, -1.0);
    }
    // Your drawing code here...

    // Capture everything drawn so far so the next pass can restore it.
    imageRef = CGBitmapContextCreateImage(context);
}

Draw a line from a stationary point to a moving point on iPhone

How can I draw a line from one point (the center of a UIView) to a point that moves (the touch location), so that the line's second endpoint follows the touch as it moves?
In your custom view:
in touchesMoved:withEvent:, store the current point in a variable and call [self setNeedsDisplay] so that the view redraws;
implement the line drawing in drawRect:, using Core Graphics to draw the line.
Let's say you store the touched point in a property self.touchedPoint; then the drawing might look like this:
@property (nonatomic, assign) CGPoint touchedPoint;

- (void)drawRect:(CGRect)rect
{
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSaveGState(context);
    // Flip the coordinate system so the stroke uses Quartz's bottom-left origin.
    CGContextTranslateCTM(context, 0.0, rect.size.height);
    CGContextScaleCTM(context, 1.0, -1.0);
    CGContextSetShouldAntialias(context, YES);
    CGContextSetLineWidth(context, 1.0f);
    CGContextSetRGBStrokeColor(context, 0.7, 0.7, 0.7, 1.0);
    // Line from the center of the view to the last touched point.
    CGContextMoveToPoint(context, rect.size.width/2, rect.size.height/2);
    CGContextAddLineToPoint(context, self.touchedPoint.x, self.touchedPoint.y);
    CGContextDrawPath(context, kCGPathStroke);
    CGContextRestoreGState(context);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.touchedPoint = [[touches anyObject] locationInView:self];
    [self setNeedsDisplay];
}
I voted Michal's answer up. But I would also suggest looking at the Touches sample project. It is easy to get it running, which may be helpful if you are still just putting your project together.

Use a CoreGraphic Stroke as Alpha Mask in iPhone App

I'm basically looking to create something akin to a very simple version of iSteam/iFog, albeit for a different purpose. In effect there will be two images, one of the subject matter and the other an image of condensation or some such. The user can then wipe their finger over the screen and it will "cut" that from the top layer to reveal the lower layer. So far I've been able to draw a basic line on the screen using Core Graphics and strokes, but I can't find a way to then use this as an alpha mask for the steam layer.
If anyone could give me advice on what to use, or even better some sample code, I'd be very grateful, as right now I'm pulling my hair out. Here is what I have so far:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    mouseSwiped = YES;
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];

    UIGraphicsBeginImageContext(self.view.frame.size);
    CGRect rect = CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height);
    [drawImage.image drawInRect:rect];
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineCap(context, kCGLineCapRound);
    CGContextSetLineWidth(context, 36.0);
    CGContextSetRGBStrokeColor(context, 1.0, 1.0, 1.0, 1.0);
    CGContextBeginPath(context);
    CGContextMoveToPoint(context, lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(context, currentPoint.x, currentPoint.y);
    CGContextStrokePath(context);
    drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();   // balance the image context opened above
    lastPoint = currentPoint;      // remember where this segment ended
}
The method you want to call is CGContextClipToMask. Just draw your "steam" image and then the stroke on another CGImage. Then, clip the steam to the stroke. Something like this:
- (void)somewhereElse {
    UIImage *steam = [[UIImage imageNamed:@"steamImage.png"] retain];
    steamRef = steam.CGImage;
    //...
}

- (void)drawRect:(CGRect)rect
{
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextDrawImage(context, self.bounds, steamRef); // draw the main image
    CGContextClipToMask(context, self.bounds, maskRef); // respect the alpha mask
    // where maskRef is a CGImage of your stroked path
}
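The answer doesn't show where maskRef comes from. One way to build it (my sketch, not the answerer's code) is to stroke the finger path into an offscreen grayscale bitmap, since a plain image passed to CGContextClipToMask is expected to be in the DeviceGray color space without alpha; the white stroke is then the part that stays visible:
// Sketch: render the stroked path into a grayscale bitmap and keep it as maskRef.
// Black areas end up clipped away; the white stroke is where drawing shows through.
CGColorSpaceRef gray = CGColorSpaceCreateDeviceGray();
CGContextRef maskContext = CGBitmapContextCreate(NULL,
                                                 (size_t)self.bounds.size.width,
                                                 (size_t)self.bounds.size.height,
                                                 8, 0, gray, kCGImageAlphaNone);
CGColorSpaceRelease(gray);
// Quartz bitmap contexts use a bottom-left origin; flip so UIKit touch points line up.
CGContextTranslateCTM(maskContext, 0, self.bounds.size.height);
CGContextScaleCTM(maskContext, 1.0, -1.0);
// Start fully black (everything clipped), then stroke the finger path in white.
CGContextSetGrayFillColor(maskContext, 0.0, 1.0);
CGContextFillRect(maskContext, self.bounds);
CGContextSetLineCap(maskContext, kCGLineCapRound);
CGContextSetLineWidth(maskContext, 36.0);
CGContextSetGrayStrokeColor(maskContext, 1.0, 1.0);
CGContextMoveToPoint(maskContext, lastPoint.x, lastPoint.y);
CGContextAddLineToPoint(maskContext, currentPoint.x, currentPoint.y);
CGContextStrokePath(maskContext);
CGImageRelease(maskRef);                        // drop any previous mask
maskRef = CGBitmapContextCreateImage(maskContext);
CGContextRelease(maskContext);
Two things to keep in mind: clipping only affects drawing performed after the CGContextClipToMask call, so you would normally clip before drawing the steam image rather than after; and in a real app you would accumulate all of the user's strokes into one mask instead of rebuilding it from a single segment, but the idea is the same.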