Erase part of a UIImageView (like an eraser does) - iPhone

I'm building an iPhone view controller that displays two stacked images (one on top of the other).
What I need to do is erase parts of the image on top (making it transparent as I move my finger over it).
I'm fine with all the application logic (dragging, saving), but I need to know how I should implement that feature: CALayer, UIView?
Thanks in advance

This is thoroughly untested and I have only recently really started grokking CoreGraphics drawing, so I may be completely wrong. In other words, please let me know if this does indeed work...
So, my thought is you draw your image in your view's drawLayer:withContext: using:
CGContextDrawImage(context, CGRectMake(0, 0, image.size.width, image.size.height), [image CGImage]);
Then, in that same drawLayer:withContext: method, set a transparent color and draw your touches after that. Hopefully that will replace your image pixels with transparent pixels on the layer.
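Sketching that out (untested, in the same spirit as the answer): assume the layer's delegate keeps the image in an image property and the finger positions in a touchPoints array of NSValues; both names are mine, not from the answer. One catch: drawing with a transparent color under the default blend mode paints nothing, so this sketch uses kCGBlendModeClear, which actually replaces pixels with transparency:
- (void)drawLayer:(CALayer *)layer inContext:(CGContextRef)context {
    // Draw the top image first.
    CGContextDrawImage(context,
                       CGRectMake(0, 0, self.image.size.width, self.image.size.height),
                       [self.image CGImage]);

    // kCGBlendModeClear replaces the pixels under each stroke with
    // transparent pixels instead of compositing over them.
    CGContextSetBlendMode(context, kCGBlendModeClear);
    CGContextSetLineCap(context, kCGLineCapRound);
    CGContextSetLineWidth(context, 20.0); // eraser width, chosen arbitrarily

    BOOL first = YES;
    for (NSValue *value in self.touchPoints) {
        CGPoint p = [value CGPointValue];
        if (first) {
            CGContextMoveToPoint(context, p.x, p.y);
            first = NO;
        } else {
            CGContextAddLineToPoint(context, p.x, p.y);
        }
    }
    CGContextStrokePath(context);
}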

Related

How to create a blurry to non-blurry animation

I want to create a magnifying glass type effect in my iPhone app, where the text goes from blurry to not blurry in an animation. Can anyone think of a way to do that? Thanks.
In iOS 6 we finally have the ability to make an image literally blurry, using a CIFilter. So you could, if you really wanted to, make an image of the area to be blurred, blur it with CIFilter, and superimpose that blurred image. Then you could use a timer or CADisplayLink to ask for successive "frames" of the animation, and each time you would do the same thing, only creating a less and less blurred image and showing it.
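For reference, producing one such frame with CIFilter might look like this (a sketch; the helper name and the idea of shrinking the radius on each tick are mine, not from the answer):
#import <CoreImage/CoreImage.h>

// Returns a blurred copy of `source`; call with a decreasing radius
// from your timer/CADisplayLink callback to animate toward sharpness.
UIImage *BlurredImage(UIImage *source, CGFloat radius) {
    CIImage *input = [CIImage imageWithCGImage:source.CGImage];
    CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blur setValue:input forKey:kCIInputImageKey];
    [blur setValue:@(radius) forKey:kCIInputRadiusKey];

    // Crop back to the original extent; a Gaussian blur grows the image.
    // In real code, create the CIContext once and reuse it between frames.
    CIContext *ciContext = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [ciContext createCGImage:blur.outputImage
                                         fromRect:input.extent];
    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return result;
}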
This is the loupe effect.
- (void)drawRect:(CGRect)rect {
    // Here we're just doing some transforms on the view we're magnifying,
    // and rendering that view directly into this view,
    // rather than the previous method of copying an image.
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextTranslateCTM(context, self.frame.size.width * 0.5, self.frame.size.height * 0.5);
    CGContextScaleCTM(context, 1.5, 1.5);
    CGContextTranslateCTM(context, -touchPoint.x, -touchPoint.y);
    [self.viewToMagnify.layer renderInContext:context];
}
Reference: http://coffeeshopped.com/2010/03/a-simpler-magnifying-glass-loupe-view-for-the-iphone

Hide / reveal UIImage based on finger path?

I hope that makes sense; I'll try to explain it.
I have a UIImageView on screen, and am wondering how I can take the area after "drawing" on it with a finger and either remove that section from the UIImage or create a separate UIImage from the selection.
I'm not looking for code (unless you have it =] ), just an idea of how to go about doing it. If you have tips, I'd be very grateful, thanks.
If I understand your question, I think I would add a transparent UIView as a subview over the top of the UIImageView and draw on that. Then you can remove/hide the subview when you're done, as in the sketch below.
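A minimal sketch of that overlay idea (imageView is an assumed name):
UIView *overlay = [[UIView alloc] initWithFrame:imageView.bounds];
overlay.backgroundColor = [UIColor clearColor]; // lets the image show through
imageView.userInteractionEnabled = YES;         // UIImageView defaults to NO
[imageView addSubview:overlay];
// ... track touches and draw on the overlay; when done:
[overlay removeFromSuperview];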
You need to create a UIGestureRecognizer with a target and an action like -imageIsPressed; in that -imageIsPressed method you can call something to make the image disappear. I would suggest placing the UIImage into a UIImageView and calling imageview.hidden = YES; to hide the image, then setting it back to NO once it's no longer held by the finger.
You'd need to implement something that captures the area the user 'selected' (maybe by creating a CGPath). Then you create a CALayer the size of the image view. In it you draw and fill the captured path with some arbitrary color while leaving the rest transparent. Finally you apply your generated CALayer as a mask to the UIImageView:
imageView.layer.mask = maskLayer;
Hope that gets you started.
For more info on how to draw that custom CALayer, please refer to the Quartz 2D Programming Guide.
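For illustration, here is one way to build such a mask using a CAShapeLayer rather than drawing a CALayer by hand (CAShapeLayer is my substitution; capturedPath stands in for whatever CGPath you collected from the touches):
CAShapeLayer *maskLayer = [CAShapeLayer layer];
maskLayer.frame = imageView.bounds;
maskLayer.path = capturedPath; // CGPathRef built from the user's strokes
// A mask keeps pixels where the mask itself is opaque; the fill
// color's alpha is what matters, not its hue.
maskLayer.fillColor = [UIColor blackColor].CGColor;
imageView.layer.mask = maskLayer;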
So basically you want to implement freehand erasing of an image? You will need to use Core Graphics and the various CGContext methods (with the blend mode set to clear) to achieve this. There are two approaches, but both start with drawing your image as the first part of drawRect:, and then:
1) Store your strokes in an array, and stroke all of them over the top of the image each time.
2) Stroke one stroke over the image and then store the resulting image into a UIImage. Use this UIImage as the next image that you draw in drawRect:. This approach makes undo/redo functionality difficult.
I recently implemented this myself and made the source available here. Basically I used the same methods described here, with this change when setting up the graphics context:
CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeClear);
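Putting those pieces together, a rough sketch of approach 1 might look like this. self.image and self.strokes (an array of UIBezierPath objects) are assumed names, and for the cleared pixels to actually read as transparent, the view's backgroundColor should be clearColor with opaque set to NO:
- (void)drawRect:(CGRect)rect {
    // Redraw the untouched image first.
    [self.image drawInRect:self.bounds];

    // Then punch the recorded strokes through to transparency.
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetBlendMode(context, kCGBlendModeClear);
    CGContextSetLineWidth(context, 20.0); // eraser width, chosen arbitrarily
    CGContextSetLineCap(context, kCGLineCapRound);
    for (UIBezierPath *stroke in self.strokes) {
        CGContextAddPath(context, stroke.CGPath);
        CGContextStrokePath(context);
    }
}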

UIImage Not Transparent

I am trying to make a view that lets a user draw with their finger. I have a PNG of the brush made in Pixelmator. My drawRect: method looks like this:
- (void)drawRect:(CGRect)rect
{
    NSAssert(brush != nil, @"");
    for (UITouch *touch in touchesNeedingDrawing)
    {
        // Draw the touch (CGRectWithPoints is a custom helper).
        [brush drawInRect:CGRectWithPoints([touch locationInView:self], [touch previousLocationInView:self]) blendMode:0 alpha:1];
    }
    [touchesNeedingDrawing removeAllObjects];
}
The brush image is a png with transparency, but when I run the app, there is no transparency. Does anyone know why, and how to fix it?
Edit:
I discovered that the image is transparent, but when I call drawInRect, the image draws the transparent pixels as the background color of the view. Is there a CGBlendMode I can use to fix this?
The solution is very simple (tested in my own project). In the class with your custom - (void)drawRect:(CGRect)rect, set the background color for self in -(id)init:
[self setBackgroundColor:[UIColor clearColor]];
And that's all.
It seems like it could be taking the current fill color of the context.
Try setting the fill color for the context with CGContextSetFillColorWithColor() to [UIColor clearColor].CGColor
If that doesn't work, the only other solution that is simple and shouldn't have a performance hit is to have 2 views:
Background View - the view that has the proper background color or image.
Overlay View - the view that detects the touches and draws all of the brush strokes on top. Its background color can be [UIColor clearColor], so when you draw the brush images the transparent areas stay clear. Basically the same view you have now, just with a clear background.
Note: You shouldn't need to mess with the blend mode to make this work. You should be able to use the default drawInRect: method.
Is the brush PNG loaded into an image view? That is, is the variable brush a UIImageView?
If so, perhaps a simple
brush.backgroundColor = [UIColor clearColor];
will help.
I think you should try the destination-out blend mode: kCGBlendModeDestinationOut.
You could also draw at a point instead of drawing in a rect:
[brush drawAtPoint:[touch locationInView:self] blendMode:kCGBlendModeDestinationOut alpha:1];
A possible issue is that you are setting the blendMode to 0. I suggest using the -drawInRect: method without a blend mode. The issue may also be that your view has a black background, but that is doubtful. I would also suggest displaying the UIImage in a UIImageView as a test. The issue may be related to the way that Pixelmator exports images.
Your problem is a fundamental misconception of how drawRect: actually works. Each time drawRect: is called, the graphics context is cleared first, so everything drawn previously is gone (only the backgroundColor remains).
Since you're only drawing the current touch(es) (touchesNeedingDrawing is emptied), there's nothing under the rectangle you're filling that could show the transparency of the image you're drawing, so the background color shows through.
You basically have two options to resolve this:
1) Keep all touches that contribute to the drawing around (don't empty the touchesNeedingDrawing array) and redraw all of them every time – this is going to be easy but quite slow.
2) Draw into a buffer of some kind (e.g. a UIImage, using UIGraphicsBeginImageContext etc.). Every time your drawing changes, create a new buffer, draw the old buffer into it, and draw the images for the new stroke on top, as in the sketch below.
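A rough sketch of option 2; buffer, brush, and touch are assumed names standing in for the questioner's state:
// Render the previous buffer plus the newest stroke into a fresh image.
UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0.0);
[buffer drawInRect:self.bounds];                  // everything drawn so far
[brush drawAtPoint:[touch locationInView:self]];  // the new stroke on top
buffer = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[self setNeedsDisplay]; // drawRect: now only has to draw `buffer`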

CGContext transparency problem

I have a UIView which has its background color set to white.
I have set the blending mode of the UIView's CGContext to kCGBlendModeCopy.
Then:
1. Draw a UIImage in that CGContext.
2. Draw a path with alpha 0 in that context.
The transparent area covered by the path appears black, whereas my expected output was the UIView's background color (i.e. white).
Does anyone know what the problem is here?
Thanks in advance,
Regards,
Deepa
There is no problem here, and hence no solution. Since we are drawing on the UIView's context with transparency, we can see through to the screen, which is black. The drawing hierarchy is like this:
1. Black screen.
2. A transparent window is kept on top of this; through the window we can see the screen.
3. A partially transparent view is kept on the window; through this view we can see the window (and since the window is transparent, the screen).
(The view is partially transparent because I am drawing part of the view with a transparent path.)
Hope this helps you, Naren
Maybe this article can help you:
http://losingfight.com/blog/2007/08/18/how-to-implement-a-basic-bitmap-brush/
I understood the problem after putting in some more effort:
CGContext is not a separate drawing layer on top of the UIView; the UIView is a Cocoa wrapper around CGContext drawing. Since I am drawing the path with transparency, the screen behind the UIView is visible.
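To make that concrete, here is a minimal sketch of the behaviour described above (image and path are assumed property names). Under kCGBlendModeCopy a zero-alpha fill replaces the pixels already in the context instead of compositing over them:
- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    [self.image drawInRect:self.bounds]; // step 1: the image

    // Copy mode: source pixels *replace* destination pixels, alpha included.
    CGContextSetBlendMode(context, kCGBlendModeCopy);
    CGContextSetFillColorWithColor(context,
        [UIColor colorWithWhite:0.0 alpha:0.0].CGColor);
    CGContextAddPath(context, self.path);
    CGContextFillPath(context); // step 2: this region is now truly transparent

    // The view's white backgroundColor was painted into this same context
    // before drawRect: ran, so the copy-mode fill erases it too; what shows
    // through is whatever lies behind the view (here, the black screen).
}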

Zoom Loupe on UIImageView

Does anyone know of a way to implement the zoom loupe functionality of a UITextField on the iPhone in a UIImageView?
Part of the app I'm building allows a user to draw a line on a UIImage, a process that might involve precise positioning of the points. In order to help the user, I want to provide the zoom loupe seen when positioning the cursor in a UITextField. Does anyone have any idea how to do this? Any pointers to relevant docs?
Cheers!
This tutorial is well written and includes sample code:
http://www.craftymind.com/2009/02/10/creating-the-loupe-or-magnifying-glass-effect-on-the-iphone/
You need to comment out the line that imports CustomView.h and then it compiles and runs fine.
I do not know of any relevant docs, but what I would do to implement something like this is: show a UIImage representation of your view over the view, then clip the overlay UIImage to just the portion of the view you want to enlarge. You can enlarge that portion using CGAffineTransformScale.
You can render an existing view into a UIImage using CALayer's renderInContext: method:
UIGraphicsBeginImageContext(self.view.frame.size);
CGContextRef context = UIGraphicsGetCurrentContext();
[self.view.layer renderInContext:context];
UIImage *imageRepresentation = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext(); // balance the Begin call