Adding filters to UIImageView - iPhone

I am new to iOS development and am trying to apply Core Image filters...
imgAnimation = [[UIImageView alloc] initWithFrame:frame];
imgAnimation.animationImages = _arrimg;
imgAnimation.contentMode = UIViewContentModeScaleAspectFit;
imgAnimation.animationDuration = 2.0f;
imgAnimation.animationRepeatCount = 0; // 0 = repeat indefinitely
[imgAnimation startAnimating];
[self.view addSubview:imgAnimation];
My animation is working properly, but how can I apply filters like sepia or greyscale? I have found many tutorials, but they are for single images. Kindly help.

Sample Code :
+ (UIImage *)applyFilterToImage:(UIImage *)inputImage
{
    CIImage *beginImage = [CIImage imageWithCGImage:[inputImage CGImage]];
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"
                                  keysAndValues:kCIInputImageKey, beginImage,
                                                @"inputIntensity", [NSNumber numberWithFloat:0.8f], nil];
    CIImage *outputImage = [filter outputImage];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *newImg = [UIImage imageWithCGImage:cgimg];
    CGImageRelease(cgimg);
    return newImg;
}

Basically, you have to pre-apply the filter to each of the images, load the filtered images into an array, and set that array as the animationImages property.
For more filter effects, you can refer to the GPUImage open-source image-effects framework on GitHub.
Hope it helps :)
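A minimal sketch of that approach, reusing the applyFilterToImage: helper from the sample above (the class name FilterHelper is an assumption; _arrimg is the array of original frames from the question):

```objc
// Pre-filter every frame once, then hand the filtered array to the image view.
NSMutableArray *filteredFrames = [NSMutableArray arrayWithCapacity:_arrimg.count];
for (UIImage *frame in _arrimg) {
    [filteredFrames addObject:[FilterHelper applyFilterToImage:frame]];
}
imgAnimation.animationImages = filteredFrames; // filtered frames animate just like the originals
[imgAnimation startAnimating];
```

Filtering happens once up front, so the animation itself costs no more than the unfiltered version.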

Related

UIGraphicsContext memory leak

Hi, in my app I have a function that takes an image of the current view, turns it into a blurred image, and then adds it to the current view. Although I remove the view using removeFromSuperview, the memory still stays high. I am using Core Graphics and set all of the UIImages to nil.
I do get a memory leak warning:
- (void)blurImage
{
//Get a screen capture from the current view.
UIGraphicsBeginImageContext(CGSizeMake(320, 450));
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImg = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
//Blur the image
CIImage *blurImg = [CIImage imageWithCGImage:viewImg.CGImage];
CGAffineTransform transform = CGAffineTransformIdentity;
CIFilter *clampFilter = [CIFilter filterWithName:@"CIAffineClamp"];
[clampFilter setValue:blurImg forKey:@"inputImage"];
[clampFilter setValue:[NSValue valueWithBytes:&transform objCType:@encode(CGAffineTransform)] forKey:@"inputTransform"];
CIFilter *gaussianBlurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[gaussianBlurFilter setValue:clampFilter.outputImage forKey:@"inputImage"];
[gaussianBlurFilter setValue:[NSNumber numberWithFloat:22.0f] forKey:@"inputRadius"];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImg = [context createCGImage:gaussianBlurFilter.outputImage fromRect:[blurImg extent]];
UIImage *outputImg = [UIImage imageWithCGImage:cgImg];
//Add UIImageView to current view.
UIImageView *imgView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 450)];
[imgView setTag:1109];
imgView.image = outputImg;
[imgView setTag:1108];
gaussianBlurFilter = nil;
outputImg = nil;
blurImg = nil;
viewImg = nil;
[self.view addSubview:imgView];
UIGraphicsEndImageContext();
}
The static analyzer ("Analyze" on the Xcode "Product" menu) is informing you that you are missing a needed CGImageRelease(cgImg) at the end of your method. If a Core Foundation object is returned to you from a method/function with "Create" or "Copy" in its name, you are responsible for releasing it.
By the way, if you tap on the icon (once in the margin, and again on the version that appears in the error message), it will show you more information:
That can be helpful for tracking back to where the problem originated, in this case the call to createCGImage. If you look at the documentation for createCGImage, it confirms this diagnosis, reporting:
Return Value
A Quartz 2D image. You are responsible for releasing the returned image when you no longer need it.
For general counsel about releasing Core Foundation objects, see the Create Rule in the Memory Management Programming Guide for Core Foundation.
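Sketching that fix with the question's own variable names, the end of the method would release the CGImage once the UIImage wraps it:

```objc
CGImageRef cgImg = [context createCGImage:gaussianBlurFilter.outputImage fromRect:[blurImg extent]];
UIImage *outputImg = [UIImage imageWithCGImage:cgImg];
CGImageRelease(cgImg); // balances the "create" call, per the Create Rule
```

UIImage retains the pixel data it needs, so releasing the CGImageRef here is safe.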

Applying CIFilter to UIImageView

In a UIViewController I am displaying an image in a UIImageView. I want to display it with some special effects using the Core Image framework.
UIImageView *myImage = [[UIImageView alloc]
    initWithImage:[UIImage imageNamed:@"piano.png"]];
myImage.frame = CGRectMake(10, 50, 300, 360);
CIContext *context = [CIContext contextWithOptions:nil];
CIFilter *filter = [CIFilter filterWithName:@"CIVignette"];
CIImage *inputImage = [[CIImage alloc] initWithImage:[UIImage imageNamed:@"piano.png"]];
[filter setValue:inputImage forKey:@"inputImage"];
[filter setValue:[NSNumber numberWithFloat:18] forKey:@"inputIntensity"];
[filter setValue:[NSNumber numberWithFloat:0] forKey:@"inputRadius"];
[baseView addSubview:inputImage];
but it looks like I'm missing something or doing something wrong.
As the other post indicates, a CIImage is not a view so it can't be added as one. CIImage is really only used for doing image processing, to display the filtered image you'll need to convert it back to a UIImage. To do this, you need to get the output CIImage from the filter (not the input image). If you have multiple filters chained, use the last filter in the chain. Then you'll need to convert the output CIImage to a CGImage, and from there to a UIImage. This code accomplishes these things:
CIImage *result = [filter valueForKey:kCIOutputImageKey]; // Get the processed image from the filter
CGImageRef cgImage = [context createCGImage:result fromRect:[result extent]]; // Create a CGImage from the output CIImage
UIImage *outputImage = [UIImage imageWithCGImage:cgImage]; // Create a UIImage from the CGImage
Also remember that the UIImage will have to go into a UIImageView, as it's not a view itself!
For more information, see the Core Image programming guide: https://developer.apple.com/library/ios/#documentation/GraphicsImaging/Conceptual/CoreImaging/ci_intro/ci_intro.html
CIImage can not be added as subview because it is not a view (UIView subclass). You need a UIImageView with a UIImage attached to its 'image' property (and this UIImage you can create from the CIImage, I believe).
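Putting the pieces together with the question's variable names (a sketch; note that the created CGImage should also be released):

```objc
CIImage *result = [filter valueForKey:kCIOutputImageKey];
CGImageRef cgImage = [context createCGImage:result fromRect:[result extent]];
myImage.image = [UIImage imageWithCGImage:cgImage]; // display through the UIImageView, not as a subview
CGImageRelease(cgImage); // balance the "create" call
[baseView addSubview:myImage];
```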

How to remove red eye from image in iPhone?

I want to remove the red-eye effect from a photo, but I have not found any sample. Can anyone help me with working demo code or a code snippet?
Thanks.
Use the below category on UIImage:
@interface UIImage (Utilities)
- (UIImage *)redEyeCorrection;
@end
@implementation UIImage (Utilities)
-(UIImage*)redEyeCorrection
{
CIImage *ciImage = [[CIImage alloc] initWithCGImage:self.CGImage];
// Get the filters and apply them to the image
NSArray* filters = [ciImage autoAdjustmentFiltersWithOptions:[NSDictionary dictionaryWithObject:[NSNumber numberWithBool:NO] forKey:kCIImageAutoAdjustEnhance]];
for (CIFilter* filter in filters)
{
[filter setValue:ciImage forKey:kCIInputImageKey];
ciImage = filter.outputImage;
}
// Create the corrected image
CIContext* ctx = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [ctx createCGImage:ciImage fromRect:[ciImage extent]];
UIImage* final = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
return final;
}
@end
Usage: Sample code given below
UIImage *redEyeImage = [UIImage imageNamed:@"redEye.jpg"];
if (redEyeImage) {
UIImage *newRemovedRedEyeImage = [redEyeImage redEyeCorrection];
if (newRemovedRedEyeImage) {
imgView.image = newRemovedRedEyeImage;
}
}
Refer to the NYXImagesKit UIImage+Enhancing category.

CIGaussianBlur and CIAffineClamp on iOS 6

I am trying to blur an image using Core Image on iOS 6 without getting a noticeable black border. Apple's documentation states that using a CIAffineClamp filter can achieve this, but I'm not able to get an output image from the filter. Here's what I tried; unfortunately, an empty image is created when I access [clampFilter outputImage]. If I only perform the blur, an image is produced, but with the dark inset border.
CIImage *inputImage = [[CIImage alloc] initWithCGImage:self.CGImage];
CIContext *context = [CIContext contextWithOptions:nil];
CGAffineTransform transform = CGAffineTransformIdentity;
CIFilter *clampFilter = [CIFilter filterWithName:@"CIAffineClamp"];
[clampFilter setValue:inputImage forKey:kCIInputImageKey];
[clampFilter setValue:[NSValue valueWithBytes:&transform objCType:@encode(CGAffineTransform)] forKey:@"inputTransform"];
CIImage *outputImage = [clampFilter outputImage];
CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"
                                  keysAndValues:kCIInputImageKey, outputImage, @"inputRadius", [NSNumber numberWithFloat:radius], nil];
outputImage = [blurFilter outputImage];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *blurredImage = [UIImage imageWithCGImage:cgimg];
CGImageRelease(cgimg);
The CIAffineClamp filter sets the extent of its output image to infinite, which then confounds your context. Try saving the pre-clamp extent CGRect, and then supply that rect to the context's createCGImage:fromRect: call instead of [outputImage extent].
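A sketch of that suggestion, using the question's variable names: capture the finite extent of the input image before clamping, then render with it:

```objc
CGRect preClampRect = [inputImage extent]; // finite extent, captured before the clamp makes it infinite
// ... apply the clamp and blur filters as above ...
CGImageRef cgimg = [context createCGImage:outputImage fromRect:preClampRect];
UIImage *blurredImage = [UIImage imageWithCGImage:cgimg];
CGImageRelease(cgimg);
```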

CIFilter integration only works with CISepiaTone

The following code adds a nice sepia effect to an image, but when I choose another filter, for example CIBloom, I see no image at all. Can you help?
- (void)drawRect:(CGRect)rect
{
UIImage *megan = [UIImage imageNamed:@"megan.png"];
CIImage *cimage = [[CIImage alloc] initWithImage:megan];
CIFilter *myFilter = [CIFilter filterWithName:@"CISepiaTone"];
[myFilter setDefaults];
[myFilter setValue:cimage forKey:@"inputImage"];
[myFilter setValue:[NSNumber numberWithFloat:0.8f] forKey:@"inputIntensity"];
CIImage *image = [myFilter outputImage];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:image fromRect:image.extent];
UIImage *resultUIImage = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
CGRect imageRect = [[UIScreen mainScreen] bounds];
[resultUIImage drawInRect:imageRect];
}
From my understanding, I should be able to just edit the following lines to change the filter:
CIFilter *myFilter = [CIFilter filterWithName:@"CIBloom"];
[myFilter setValue:[NSNumber numberWithFloat:10.0f] forKey:@"inputIntensity"];
but when I do this I see no image at all when I start the app.
CISepiaTone is available in Mac OS X v10.4 and later and in iOS 5.0 and later, while CIBloom is only listed as available in Mac OS X v10.4 and later.
https://developer.apple.com/library/mac/#documentation/graphicsimaging/reference/CoreImageFilterReference/Reference/reference.html
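To check at runtime which filters the current OS actually provides, you can ask Core Image for the list of built-in filter names (a sketch; kCICategoryBuiltIn is a standard Core Image category constant):

```objc
NSArray *names = [CIFilter filterNamesInCategory:kCICategoryBuiltIn];
if ([names containsObject:@"CIBloom"]) {
    // Safe to create and use the filter on this OS version.
    CIFilter *bloom = [CIFilter filterWithName:@"CIBloom"];
    [bloom setDefaults];
} else {
    NSLog(@"CIBloom is not available on this device");
}
```

This avoids silently getting a nil filter (and hence no image) when a filter name is unsupported.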