How to create a blurry to non-blurry animation - iPhone

I want to create a magnifying glass type effect in my iPhone app, where the text goes from blurry to not blurry in an animation. Can anyone think of a way to do that? Thanks.

In iOS 6 we finally have the ability to make an image literally blurry, using a CIFilter. So you could, if you really wanted to, make an image of the area to be blurred, blur it with CIFilter, and superimpose that blurred image. Then you could use a timer or CADisplayLink to ask for successive "frames" of the animation, and each time you would do the same thing, only creating a less and less blurred image and showing it.
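For example, something along these lines might work (an untested sketch: the blurRadius, displayLink, imageView and originalImage properties and the starting radius are my own, not from any framework):

- (void)startUnblurAnimation {
    self.blurRadius = 10.0; // arbitrary starting blur
    self.displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(stepBlur:)];
    [self.displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSDefaultRunLoopMode];
}

- (void)stepBlur:(CADisplayLink *)link {
    self.blurRadius -= 0.5;
    if (self.blurRadius <= 0) {
        self.imageView.image = self.originalImage; // finish on the sharp original
        [link invalidate];
        return;
    }
    // Re-blur the original at the smaller radius. In real code you would
    // cache the CIContext; creating one per frame is expensive.
    CIImage *input = [CIImage imageWithCGImage:self.originalImage.CGImage];
    CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blur setValue:input forKey:kCIInputImageKey];
    [blur setValue:@(self.blurRadius) forKey:kCIInputRadiusKey];
    CIContext *ciContext = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [ciContext createCGImage:blur.outputImage fromRect:input.extent];
    self.imageView.image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
}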

This is the loupe (magnifying glass) effect:
- (void)drawRect:(CGRect)rect {
    // Here we're just doing some transforms on the view we're magnifying,
    // and rendering that view directly into this view,
    // rather than the previous method of copying an image.
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextTranslateCTM(context, self.frame.size.width * 0.5, self.frame.size.height * 0.5);
    CGContextScaleCTM(context, 1.5, 1.5);
    CGContextTranslateCTM(context, -touchPoint.x, -touchPoint.y);
    [self.viewToMagnify.layer renderInContext:context];
}
Reference: http://coffeeshopped.com/2010/03/a-simpler-magnifying-glass-loupe-view-for-the-iphone

Related

Hide / reveal UIImage based on finger path?

I hope that makes sense. I'll try to explain it.
I have a UIImageView on screen, and am wondering how I can take the area after "drawing" on it with a finger, and remove that section from the UIImage, or create a separate UIImage from the selection.
I'm not looking for code (unless you have it =] ), just an idea of how to go about doing it. If you have tips, I'd be very grateful, thanks.
If I understand your question, I think I would add a transparent UIView as a subview over the top of the UIImageView and draw on that. Then you can remove or hide the subview when you're done.
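Something like this, where DrawingOverlayView is a hypothetical UIView subclass that handles the touch drawing:

DrawingOverlayView *overlay = [[DrawingOverlayView alloc] initWithFrame:imageView.frame];
overlay.backgroundColor = [UIColor clearColor]; // let the image show through
[imageView.superview addSubview:overlay];
// ...and later, when the user is done:
[overlay removeFromSuperview];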
You need to create a UIGestureRecognizer with a target and an action like -imageIsPressed:. In that method you can make the image disappear. I would suggest placing the UIImage into a UIImageView and setting imageView.hidden = YES; to hide the image, then setting it back to NO once it is no longer held by the finger.
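For example (untested; assumes an imageView property, and the selector name just follows this answer):

- (void)viewDidLoad {
    [super viewDidLoad];
    UILongPressGestureRecognizer *press =
        [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                      action:@selector(imageIsPressed:)];
    self.imageView.userInteractionEnabled = YES; // UIImageView defaults to NO
    [self.imageView addGestureRecognizer:press];
}

- (void)imageIsPressed:(UILongPressGestureRecognizer *)gesture {
    if (gesture.state == UIGestureRecognizerStateBegan) {
        self.imageView.hidden = YES;  // hide while the finger is down
    } else if (gesture.state == UIGestureRecognizerStateEnded ||
               gesture.state == UIGestureRecognizerStateCancelled) {
        self.imageView.hidden = NO;   // reveal once the finger lifts
    }
}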
You'd need to implement something that captures the area the user 'selected' (maybe by creating a CGPath). Then you create a CALayer the size of the image view. In it you draw and fill the captured path with some arbitrary color while leaving the rest transparent. Finally you apply your generated CALayer as a mask to the UIImageView:
imageView.layer.mask = maskLayer;
Hope that gets you started.
For more info on how to draw that custom CALayer, please refer to the Quartz 2D Programming Guide.
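A minimal sketch of the masking step, using a CAShapeLayer to do the path filling for you (the answer describes a generic CALayer; capturedPath is assumed to be the CGPath built from the user's strokes):

CAShapeLayer *maskLayer = [CAShapeLayer layer];
maskLayer.frame = imageView.bounds;
maskLayer.path = capturedPath;                      // the user's selection
maskLayer.fillColor = [UIColor blackColor].CGColor; // any opaque color works
imageView.layer.mask = maskLayer;                   // only the path area stays visible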
So basically you want to implement freehand erasing of an image? You will need to use Core Graphics and the various CGContext methods (with the blend mode set to clear) to achieve this. There are two approaches, but both start with drawing your image as the first part of drawRect:, and then:
1) Store your strokes in an array, and stroke all of them over top of the image.
2) Stroke one stroke over the image and then store the resulting image into a UIImage. Use this UIImage as the next image that you draw in drawRect:. This approach makes undo/redo functionality difficult.
I recently implemented this myself and made the source available here. Basically I used the same methods described here with this change when setting up the graphics context:
CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeClear);
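A rough sketch of approach 1 with that blend mode (self.image and self.strokes, an array of UIBezierPaths, are my own names; the view itself also needs a clear, non-opaque background so the erased areas can show through):

- (void)drawRect:(CGRect)rect {
    [self.image drawInRect:self.bounds];               // the image first
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetBlendMode(context, kCGBlendModeClear); // strokes now erase
    CGContextSetLineWidth(context, 20.0);
    CGContextSetLineCap(context, kCGLineCapRound);
    for (UIBezierPath *stroke in self.strokes) {
        CGContextAddPath(context, stroke.CGPath);
        CGContextStrokePath(context);
    }
}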

UIImage Not Transparent

I am trying to make a view that lets a user draw with their finger. I have a PNG of the brush made in Pixelmator. My drawRect: method looks like this:
- (void)drawRect:(CGRect)rect
{
    NSAssert(brush != nil, @"");
    for (UITouch *touch in touchesNeedingDrawing)
    {
        // Draw the touch
        [brush drawInRect:CGRectWithPoints([touch locationInView:self], [touch previousLocationInView:self]) blendMode:0 alpha:1];
    }
    [touchesNeedingDrawing removeAllObjects];
}
The brush image is a png with transparency, but when I run the app, there is no transparency. Does anyone know why, and how to fix it?
Edit:
I discovered that the image is transparent, but when I call drawInRect, the image draws the transparent pixels as the background color of the view. Is there a CGBlendMode I can use to fix this?
The solution is very simple; I tested it in my own project. In the class where you implement the custom - (void)drawRect:(CGRect)rect, set the background color for self in -(id)init:
[self setBackgroundColor:[UIColor clearColor]];
And that's all.
It seems like it could be taking the current fill color of the context.
Try setting the fill color for the context to [UIColor clearColor].CGColor with CGContextSetFillColorWithColor().
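That is, at the top of -drawRect::

CGContextSetFillColorWithColor(UIGraphicsGetCurrentContext(), [UIColor clearColor].CGColor);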
If that doesn't work, the only other solution that is simple and shouldn't have a performance hit is to have 2 views:
Background View - This will be view that has the proper background color or image
Overlay View - This view will detect the touches etc and then draw all of the brush strokes on top. The background color of this view can then be [UIColor clearColor] so when you draw the brush images, the alpha will be filled with [UIColor clearColor]. Basically the same view you have now, just with a clear background.
Note: You shouldn't need to mess with the blend mode to make this work. You should be able to use the default drawInRect: method.
Is the brush PNG loaded into an image view? That is, is the brush variable a UIImageView?
If so, perhaps a simple
brush.backgroundColor = [UIColor clearColor];
will help.
I think you should try the destination-out blend mode: kCGBlendModeDestinationOut.
You could also draw at a point instead of drawing in a rect:
[brush drawAtPoint:[touch locationInView:self] blendMode:kCGBlendModeDestinationOut alpha:1];
A possible issue is that you are setting the blendMode to 0. I suggest using the -drawInRect: method without a blend mode. The issue may also be that your view has a black background, but that is doubtful. I would also suggest displaying the UIImage in a UIImageView as a test. The issue may be related to the way that Pixelmator exports images.
Your problem is a fundamental misconception about how drawRect: actually works. Every time drawRect: is called, the graphics context is cleared before you draw, so that only the backgroundColor remains from previous passes.
Since you're only drawing the current touch(es) (touchesNeedingDrawing is emptied), there's nothing under the rectangle you're filling that could show the transparency of the image you're drawing, so the background color shows through.
You basically have two options to resolve this:
1) Keep all touches that contribute to the drawing around (don't empty the touchesNeedingDrawing array) and redraw all of them every time – this is going to be easy but quite slow.
2) Draw into a buffer of some kind (e.g. a UIImage, using UIGraphicsBeginImageContext etc.). Every time your drawing changes, create a new buffer. Draw the old buffer into the new buffer and the images for the new stroke on top of it.
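A sketch of option 2, reusing the brush image from the question (the buffer property and the method name are my own):

- (void)appendBrushStampInRect:(CGRect)stampRect {
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0);
    [self.buffer drawInRect:self.bounds]; // previous drawing, if any
    [brush drawInRect:stampRect];         // the new stroke on top
    self.buffer = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    [self setNeedsDisplay];               // drawRect: now just draws the buffer
}

- (void)drawRect:(CGRect)rect {
    [self.buffer drawInRect:self.bounds];
}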

Create a glow around a UIView

I would like to draw a glow-ish border around a UIView which is roughly 5px from the actual UIView itself.
Please could you tell me how I could achieve this?
Probably the easiest way is to create a shadow, but use a light color instead of a dark one. Shadow details can be found here: How do I draw a shadow under a UIView?
Something like this should get the ball rolling:
- (void)drawRect:(CGRect)rect
{
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSaveGState(context);
    CGContextSetShadowWithColor(context, CGSizeMake(0, 0), 10,
                                [UIColor whiteColor].CGColor);
    [super drawRect:rect];
    CGContextRestoreGState(context);
}
Update: I just tried this out. You will have to use this code on the superview of the glowing view for it to work properly.
The first thing I would try is to embed the UIView within a UIView which has the glow image. If the glow effect is just an image, then you create a UIView containing the glow image that is 10 px taller and wider than the UIView being surrounded. This will allow for 5 px of extension on all 4 sides. You can do all this quickly and easily using Interface Builder.
If you want the glow effect to look really cool, consider creating a collection of glow images that, when viewed as a sequence, will show a sort of moving glow effect. You can then use this collection of images in a UIImageView and turn on animation; UIImageView has frame animation support built in.
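If you go that image-sequence route, UIImageView's frame animation will cycle the images for you. A sketch, assuming hypothetical images glow1.png through glow4.png and a targetView to surround:

UIImageView *glowView =
    [[UIImageView alloc] initWithFrame:CGRectInset(targetView.frame, -5, -5)];
glowView.animationImages = @[[UIImage imageNamed:@"glow1"],
                             [UIImage imageNamed:@"glow2"],
                             [UIImage imageNamed:@"glow3"],
                             [UIImage imageNamed:@"glow4"]];
glowView.animationDuration = 0.8; // one pass through the sequence, repeats forever
[targetView.superview insertSubview:glowView belowSubview:targetView];
[glowView startAnimating];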
Hope this helped. Good Luck.

Erase some part of a UIImageView (like an eraser does)

I'm building an iPhone view controller that displays two stacked images (one on top of the other).
What I need to do is erase some parts of the image on top (making it transparent while I'm moving my finger over it).
I'll be fine with all the application logic (dragging, saving), but I need to know how I should implement that feature: CALayer, UIView?
Thanks in advance
This is thoroughly untested and I have only recently really started grokking Core Graphics drawing, so I may be completely wrong. In other words, please let me know if this does indeed work...
So, my thought is you draw your image in your view's drawLayer:withContext: using:
CGContextDrawImage(context, CGRectMake(0, 0, image.size.width, image.size.height), [image CGImage]);
Then, in that same drawLayer:withContext: method, set a transparent color and draw your touches after that. Hopefully that will replace your image pixels with transparent pixels on the layer.
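Fleshing that out a little (equally untested; image and erasePoints are assumed names, and I've used kCGBlendModeClear rather than literally a transparent fill color, since a transparent fill would be a no-op under normal blending):

- (void)drawLayer:(CALayer *)layer inContext:(CGContextRef)context {
    // Note: CGContextDrawImage may render the image flipped; that is a
    // separate, well-known Core Graphics gotcha.
    CGContextDrawImage(context, layer.bounds, [image CGImage]);
    CGContextSetBlendMode(context, kCGBlendModeClear); // erase instead of paint
    for (NSValue *value in erasePoints) {              // NSValues wrapping CGPoints
        CGPoint p = [value CGPointValue];
        CGContextFillEllipseInRect(context, CGRectMake(p.x - 10, p.y - 10, 20, 20));
    }
}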

composite colors: CALayer and blend mode on iPhone

I'm trying to use Core Image on the iPhone. I'm able to composite my colors using Quartz to draw a UIView, but I want to separate each component into a CALayer (UIViews consume more resources).
So I have a white mask I want to use to filter a background bitmap, and I want to try different blending modes. Unfortunately, the layers are only "adding" their colors.
Here is my code:
@implementation WhiteLayerHelper

- (void)drawLayer:(CALayer *)theLayer
        inContext:(CGContextRef)myContext
{
    // draw a white overlay, with special blending and alpha values, so that the saturation can be animated
    CGContextSetBlendMode(myContext, kCGBlendModeSaturation);
    CGContextSetRGBFillColor(myContext, 1.0, 1.0, 1.0, 0.9);
    CGContextFillRect(myContext, [UIScreen mainScreen].bounds);
}

@end
And here is the main view's drawRect: code, where I use my CALayer:
- (void)drawRect:(CGRect)rect {
    // get the drawing context
    CGContextRef myContext = UIGraphicsGetCurrentContext();
    // draw the background
    [self fillContext:myContext withBounds:m_overlayRect withImage:m_currentImage];
    [whiteLayer renderInContext:myContext];
}
Is there something wrong?
I managed to get the effect of compositing multiple CALayers by drawing them directly into a UIView's graphics context.
- (void)drawRect:(CGRect)rect {
    CGContextRef c = UIGraphicsGetCurrentContext();
    CGContextSetBlendMode(c, kCGBlendModeDifference);
    [myLayer drawInContext:c];
}
BTW, I did not add the layers as sublayers of the view's layer (that is, I never called [myView.layer addSublayer:myLayer]).
This does not seem to be a flaw of Core Animation: the layers are prerendered into image contexts, and Core Image is what does the real-time filtering (during animation and whatnot) of those images against background layers and their images. CALayer's compositing properties depend on that ability, and they are not available on iPhone/iOS (yet) because they require Core Image.
OpenGL can do this for us in our situation, however =)
Edit (addition): setting the blend mode with CGContext in -drawInContext: and -drawLayer:inContext: does, of course, still take effect against whatever was already rendered or present in that context's image. (When it is set before anything has been rendered into the context's image, it blends against either full black or full white; I am not sure which.)
Not at all... I think it is easier to use OpenGL to achieve this, because it seems not to be implemented in CA yet.