Blur effect while drawing with Quartz 2D - iPhone

I'd like to have a blur effect while drawing, like the line on the right in this picture:
Currently I'm drawing with the following code, but this only gives me the picture on the left:
CGContextSetLineWidth(currentContext, thickness);
CGContextSetLineCap(currentContext, kCGLineCapRound);
CGContextBeginPath(currentContext);
CGContextMoveToPoint(currentContext, x, y);
CGContextAddLineToPoint(currentContext, x, y);
CGContextStrokePath(currentContext);
Any ideas for me please?
Regards,
Alexandre

CIContext *context = [CIContext contextWithOptions:nil];
CIImage *inputImage = [[CIImage alloc] initWithImage:[UIImage imageNamed:@"your image"]];
CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
[filter setValue:inputImage forKey:kCIInputImageKey];
[filter setValue:[NSNumber numberWithFloat:9.0f] forKey:@"inputRadius"];
CIImage *result = [filter valueForKey:kCIOutputImageKey];
CGImageRef cgImage = [context createCGImage:result fromRect:[result extent]];
UIImage *blurredImage = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
Use this code; it will give you a blur effect.
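If your strokes are drawn with Quartz as in the question, you first need them as a UIImage before Core Image can blur them. A rough sketch, assuming the drawing happens in a view and using an offscreen image context (this is just one way to capture the strokes):

// Render the strokes into an offscreen bitmap context, then blur the result.
UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0.0f);
CGContextRef currentContext = UIGraphicsGetCurrentContext();

// ... your existing CGContextMoveToPoint / CGContextAddLineToPoint /
// CGContextStrokePath calls go here ...

UIImage *drawnImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// drawnImage can now be wrapped in a CIImage via -initWithImage: and run
// through CIGaussianBlur as shown above.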

Related

Adding Filters to imageview

I am new to iOS development and trying to use image filters...
imgAnimation = [[UIImageView alloc] initWithFrame:frame];
imgAnimation.animationImages = _arrimg;
//animationImages=animationImages;
imgAnimation.contentMode = UIViewContentModeScaleAspectFit;
imgAnimation.animationDuration = 2.0f;
imgAnimation.animationRepeatCount = 0;
[imgAnimation startAnimating];
[self.view addSubview:imgAnimation];
My animation is working properly, but how can I apply filters like sepia or greyscale? I have found many tutorials, but they are for single images. Kindly help.
Sample Code :
+ (UIImage *)applyFilterToImage:(UIImage *)inputImage
{
    CIImage *beginImage = [CIImage imageWithCGImage:[inputImage CGImage]];
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone" keysAndValues:kCIInputImageKey, beginImage, @"inputIntensity", [NSNumber numberWithFloat:0.8], nil];
    CIImage *outputImage = [filter outputImage];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *newImg = [UIImage imageWithCGImage:cgimg];
    CGImageRelease(cgimg);
    return newImg;
}
Basically, you have to pre-load all the images with the filter applied, put them into an array, and set that array as the animationImages property.
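A rough sketch of that, reusing the applyFilterToImage: helper above and the _arrimg array from the question (this assumes the helper lives in the same class; adjust the receiver otherwise):

// Pre-render every frame through the filter, then animate the filtered copies.
NSMutableArray *filteredFrames = [NSMutableArray arrayWithCapacity:_arrimg.count];
for (UIImage *frame in _arrimg) {
    [filteredFrames addObject:[[self class] applyFilterToImage:frame]];
}
imgAnimation.animationImages = filteredFrames;
[imgAnimation startAnimating];

Note that applyFilterToImage: creates a new CIContext on every call; if you have many frames, it is worth reusing a single context.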
For the various other filters, you can refer to the GPUImage open-source framework for image effects on GitHub.
Hope it helps you :)

How can I draw a square or circle around eyes and mouth in a CIImage on iOS

I am new to iOS programming, and I have searched a lot of Core Image links, answers, and tutorials, but I still don't understand exactly. I have a test app with buttons, easy stuff: I have an image, and when I press a filter button it applies the filter and shows the result.
CIImage *inputImage = [[CIImage alloc] initWithImage:[UIImage imageNamed:@"pic1.JPG"]];
CIFilter *colorControls = [CIFilter filterWithName:@"CIColorControls"];
[colorControls setValue:inputImage forKey:@"inputImage"];
[colorControls setValue:[NSNumber numberWithFloat:0.5f] forKey:@"inputSaturation"];
[colorControls setValue:[NSNumber numberWithFloat:0.8f] forKey:@"inputContrast"];
[colorControls setValue:[NSNumber numberWithFloat:0.4f] forKey:@"inputBrightness"];
CIImage *outputImage = [colorControls valueForKey:@"outputImage"];
CIContext *context = [CIContext contextWithOptions:nil];
theImageView.image = [UIImage imageWithCGImage:[context createCGImage:outputImage fromRect:outputImage.extent]];
Now I would like to have a button that detects the eyes and mouth and draws rectangles or circles around them; it doesn't matter which shape or which color. I have come as far as:
- (IBAction)detectFace:(id)sender
{
    CIImage *inputImage = [[CIImage alloc] initWithImage:[UIImage imageNamed:@"pic1.JPG"]];
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                               context:nil
                                               options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyHigh forKey:CIDetectorAccuracy]];
    NSArray *features = [detector featuresInImage:inputImage];
    for (CIFaceFeature *faceFeature in features) {}
}
And now I am stuck (and I am not sure this is the right start). Please help; I would like to know what to do next. Thank you in advance.
How about (haven't compiled it, but should be close):
for (CIFaceFeature *faceFeature in features) {
    // Bounds of the detected face, in the image's (Core Image) coordinate space.
    CGRect faceRect = faceFeature.bounds;

    // Stroke a rectangle around it in the current graphics context.
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, 1.0);
    CGContextAddRect(context, faceRect);
    CGContextDrawPath(context, kCGPathStroke);
}
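Keep in mind that UIGraphicsGetCurrentContext() is only non-nil inside drawRect: or an open image context, and that CIFaceFeature bounds use Core Image coordinates (origin at the bottom-left). A rough sketch of stamping the rectangles onto the image offscreen (names like annotatedImage are illustrative, and it assumes a 1x image):

UIImage *sourceImage = [UIImage imageNamed:@"pic1.JPG"];

UIGraphicsBeginImageContextWithOptions(sourceImage.size, NO, sourceImage.scale);
[sourceImage drawAtPoint:CGPointZero];

CGContextRef context = UIGraphicsGetCurrentContext();
CGContextSetLineWidth(context, 2.0f);
CGContextSetStrokeColorWithColor(context, [UIColor redColor].CGColor);

for (CIFaceFeature *faceFeature in features) {
    // faceFeature.bounds has its origin at the bottom-left; flip it vertically
    // so it lines up with the UIKit drawing above (assumes scale == 1).
    CGRect faceRect = faceFeature.bounds;
    faceRect.origin.y = sourceImage.size.height - faceRect.origin.y - faceRect.size.height;
    CGContextStrokeRect(context, faceRect);
}

UIImage *annotatedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
theImageView.image = annotatedImage;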

CIGaussianBlur and CIAffineClamp on iOS 6

I am trying to blur an image using Core Image on iOS 6 without a noticeable black border. Apple's documentation states that using a CIAffineClamp filter can achieve this, but I'm not able to get an output image from the filter. Here's what I tried; unfortunately, an empty image is created when I access [clampFilter outputImage]. If I only perform the blur, an image is produced, but with the dark inset border.
CIImage *inputImage = [[CIImage alloc] initWithCGImage:self.CGImage];
CIContext *context = [CIContext contextWithOptions:nil];
CGAffineTransform transform = CGAffineTransformIdentity;
CIFilter *clampFilter = [CIFilter filterWithName:@"CIAffineClamp"];
[clampFilter setValue:inputImage forKey:kCIInputImageKey];
[clampFilter setValue:[NSValue valueWithBytes:&transform objCType:@encode(CGAffineTransform)] forKey:@"inputTransform"];
CIImage *outputImage = [clampFilter outputImage];
CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"
                                  keysAndValues:kCIInputImageKey, outputImage, @"inputRadius", [NSNumber numberWithFloat:radius], nil];
outputImage = [blurFilter outputImage];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *blurredImage = [UIImage imageWithCGImage:cgimg];
CGImageRelease(cgimg);
The CIAffineClamp filter sets the output extent to infinite, which then confounds your context. Try saving off the pre-clamp extent CGRect and supplying that to createCGImage:fromRect: instead of the now-infinite [outputImage extent].
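A rough sketch of that fix applied to the code above; the only real change is the rect handed to createCGImage:fromRect::

// Remember the finite extent of the original image before clamping.
CGRect originalExtent = [inputImage extent];

// ... clamp and blur exactly as above ...

// Crop the render back to the original extent instead of the infinite
// extent reported by the clamped/blurred output image.
CGImageRef cgimg = [context createCGImage:outputImage fromRect:originalExtent];
UIImage *blurredImage = [UIImage imageWithCGImage:cgimg];
CGImageRelease(cgimg);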

CISoftLightBlendMode not centered on Image

Hey, I'm having a problem with CISoftLightBlendMode: when I apply the filter, the overlay image is not centered over the other image; it just stays in the bottom-left corner, so only a very small portion of the image is covered by the overlay texture.
Here's my code:
UIImage *bg = [UIImage imageNamed:@"Texture.png"];
CIImage *beginImage = [CIImage imageWithCGImage:[image_view.image CGImage]];
CIImage *bgImage = [CIImage imageWithCGImage:bg.CGImage];
context = [CIContext contextWithOptions:nil];
filter = [CIFilter filterWithName:@"CISoftLightBlendMode" keysAndValues:kCIInputImageKey, beginImage, @"inputBackgroundImage", bgImage, nil];
CIImage *outputImage = [filter outputImage];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *newImg = [UIImage imageWithCGImage:cgimg];
[Image_View setImage:newImg];

CIFilter integration only works with CISepiaTone

The following code adds a nice sepia effect to an image, but when I choose another filter, for example CIBloom, I see no image at all. Can you help?
- (void)drawRect:(CGRect)rect
{
    UIImage *megan = [UIImage imageNamed:@"megan.png"];
    CIImage *cimage = [[CIImage alloc] initWithImage:megan];
    CIFilter *myFilter = [CIFilter filterWithName:@"CISepiaTone"];
    [myFilter setDefaults];
    [myFilter setValue:cimage forKey:@"inputImage"];
    [myFilter setValue:[NSNumber numberWithFloat:0.8f] forKey:@"inputIntensity"];
    CIImage *image = [myFilter outputImage];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:image fromRect:image.extent];
    UIImage *resultUIImage = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    CGRect imageRecht = [[UIScreen mainScreen] bounds];
    [resultUIImage drawInRect:imageRecht];
}
From my understanding, I should be able to change the filter just by editing the following lines:
CIFilter *myFilter = [CIFilter filterWithName:@"CIBloom"];
[myFilter setValue:[NSNumber numberWithFloat:10.0f] forKey:@"inputIntensity"];
but when I do this I see no image at all when I start the app.
CISepiaTone is available in Mac OS X v10.4 and later and in iOS 5.0 and later, while CIBloom is only available in Mac OS X v10.4 and later, i.e. not on iOS.
https://developer.apple.com/library/mac/#documentation/graphicsimaging/reference/CoreImageFilterReference/Reference/reference.html
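If you want to guard against this at runtime, filterWithName: simply returns nil for a filter the current OS doesn't provide, and you can also list what is available. A rough sketch:

// filterWithName: returns nil if the filter does not exist on this OS version,
// so check before building the rest of the pipeline.
CIFilter *myFilter = [CIFilter filterWithName:@"CIBloom"];
if (myFilter == nil) {
    NSLog(@"CIBloom is not available on this OS version");
    return;
}

// Alternatively, list every built-in filter the current OS supports:
NSArray *available = [CIFilter filterNamesInCategory:kCICategoryBuiltIn];
NSLog(@"Available filters: %@", available);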