Hey, I'm having a problem with CISoftLightBlendMode. When I apply the filter, the overlay image is not centered over the other image; it just stays in the bottom-left corner, so only a very small portion of the image is covered by the overlay texture.
Here's my code:
UIImage *bg = [UIImage imageNamed:@"Texture.png"];
CIImage *beginImage = [CIImage imageWithCGImage:[image_view.image CGImage]];
CIImage *bgImage = [CIImage imageWithCGImage:bg.CGImage];
context = [CIContext contextWithOptions:nil];
filter = [CIFilter filterWithName:@"CISoftLightBlendMode" keysAndValues:kCIInputImageKey, beginImage, @"inputBackgroundImage", bgImage, nil];
CIImage *outputImage = [filter outputImage];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *newImg = [UIImage imageWithCGImage:cgimg];
[image_view setImage:newImg];
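One way to fix this is to translate the texture so that its extent is centered over beginImage before blending, since Core Image anchors both inputs at their extents' origins in the same working space, which leaves a smaller texture sitting in the bottom-left corner. A rough sketch (the variable names centering and centeredBg are just illustrative):
CGRect fgRect = [beginImage extent];
CGRect bgRect = [bgImage extent];
// Shift the texture so its center matches the center of the input image.
CGAffineTransform centering = CGAffineTransformMakeTranslation(CGRectGetMidX(fgRect) - CGRectGetMidX(bgRect),
                                                               CGRectGetMidY(fgRect) - CGRectGetMidY(bgRect));
CIImage *centeredBg = [bgImage imageByApplyingTransform:centering];
// Then pass centeredBg as @"inputBackgroundImage" instead of bgImage when building the filter.
Alternatively, scale the texture up so its extent matches [beginImage extent] if you want it to cover the whole image rather than just sit centered over it.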
I'd like to have a blur effect while drawing, like the right line in this picture:
Currently, I'm drawing with the following code, but this only draws the picture on the left:
CGContextSetLineWidth(currentContext, thickness);
CGContextSetLineCap(currentContext, kCGLineCapRound);
CGContextBeginPath(currentContext);
CGContextMoveToPoint(currentContext, x, y);
CGContextAddLineToPoint(currentContext, x, y);
CGContextMoveToPoint(currentContext, x, y);
CGContextStrokePath(currentContext);
Any ideas for me please?
Regards,
Alexandre
CIContext *context = [CIContext contextWithOptions:nil];
CIImage *inputImage = [[CIImage alloc] initWithImage:yourImage]; // yourImage is the UIImage you want to blur
CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
[filter setValue:inputImage forKey:kCIInputImageKey];
[filter setValue:[NSNumber numberWithFloat:9.0f] forKey:@"inputRadius"];
CIImage *result = [filter valueForKey:kCIOutputImageKey];
CGImageRef cgImage = [context createCGImage:result fromRect:[result extent]];
UIImage *blurredImage = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
Use this code; it will give you the blur effect.
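Another option, if you want the stroke itself to look soft while you draw with Core Graphics instead of blurring the finished image with Core Image, is to stroke the path with a shadow directly under the line. A rough sketch, reusing currentContext and thickness from the code above (the start/end coordinates are placeholders):
// Draw the stroke with a zero-offset shadow so a soft halo appears under the line.
CGContextSaveGState(currentContext);
CGContextSetShadowWithColor(currentContext,
                            CGSizeZero,   // no offset: the blur sits directly under the stroke
                            5.0,          // blur radius; tune to taste
                            [UIColor blackColor].CGColor);
CGContextSetLineWidth(currentContext, thickness);
CGContextSetLineCap(currentContext, kCGLineCapRound);
CGContextBeginPath(currentContext);
CGContextMoveToPoint(currentContext, startX, startY);   // startX/startY, endX/endY are placeholder coordinates
CGContextAddLineToPoint(currentContext, endX, endY);
CGContextStrokePath(currentContext);
CGContextRestoreGState(currentContext);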
I am new to iOS development and trying to use Core Image filters...
imgAnimation=[[UIImageView alloc]initWithFrame:frame];
imgAnimation.animationImages=_arrimg;
//animationImages=animationImages;
imgAnimation.contentMode=UIViewContentModeScaleAspectFit;
imgAnimation.animationDuration = 2.0f;
imgAnimation.animationRepeatCount = 0;
[imgAnimation startAnimating];
[self.view addSubview:imgAnimation];
My animation is working properly, but how can I apply filters like sepia or greyscale? I have found many tutorials, but they are for single images. Kindly help.
Sample code:
+ (UIImage *)applyFilterToImage:(UIImage *)inputImage
{
    CIImage *beginImage = [CIImage imageWithCGImage:[inputImage CGImage]];
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone" keysAndValues:kCIInputImageKey, beginImage, @"inputIntensity", [NSNumber numberWithFloat:0.8], nil];
    CIImage *outputImage = [filter outputImage];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *newImg = [UIImage imageWithCGImage:cgimg];
    CGImageRelease(cgimg);
    return newImg;
}
Basically, you have to pre-process all the images with the filter, load them into an array, and set that array as the animationImages property, as sketched below.
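A rough sketch of that approach, assuming the applyFilterToImage: method above lives in a helper class called ImageFilterHelper (an illustrative name) and that _arrimg holds the original frames:
NSMutableArray *filteredFrames = [NSMutableArray arrayWithCapacity:_arrimg.count];
for (UIImage *frame in _arrimg) {
    // Apply the sepia (or any other) filter to each frame up front.
    [filteredFrames addObject:[ImageFilterHelper applyFilterToImage:frame]];
}
imgAnimation.animationImages = filteredFrames;
Note that applyFilterToImage: creates a new CIContext on every call, which is expensive; for a long frame array you may want to create a single context and reuse it.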
For the various filters, you can refer to this project on GitHub:
GPUImage, an open-source framework for image effects.
Download the source code from here:
Hope it helps :)
I want to remove the red-eye effect from a photo, but I haven't found any sample. Can anyone help me with working demo code or a code snippet?
Thanks.
Use the UIImage category below:
@interface UIImage (Utilities)
- (UIImage *)redEyeCorrection;
@end

@implementation UIImage (Utilities)
- (UIImage *)redEyeCorrection
{
    CIImage *ciImage = [[CIImage alloc] initWithCGImage:self.CGImage];

    // Get the auto-adjustment filters (with enhancement disabled, so only red-eye correction remains) and apply them to the image
    NSArray *filters = [ciImage autoAdjustmentFiltersWithOptions:[NSDictionary dictionaryWithObject:[NSNumber numberWithBool:NO] forKey:kCIImageAutoAdjustEnhance]];
    for (CIFilter *filter in filters)
    {
        [filter setValue:ciImage forKey:kCIInputImageKey];
        ciImage = filter.outputImage;
    }

    // Create the corrected image
    CIContext *ctx = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [ctx createCGImage:ciImage fromRect:[ciImage extent]];
    UIImage *final = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return final;
}
@end
Usage: sample code given below.
UIImage *redEyeImage = [UIImage imageNamed:@"redEye.jpg"];
if (redEyeImage) {
    UIImage *newRemovedRedEyeImage = [redEyeImage redEyeCorrection];
    if (newRemovedRedEyeImage) {
        imgView.image = newRemovedRedEyeImage;
    }
}
Also refer to the NYXImagesKit UIImage Enhancing link.
I am trying to blur an image using Core Image on iOS 6 without getting a noticeable black border. The Apple documentation states that a CIAffineClamp filter can achieve this, but I'm not able to get an output image from the filter. Here's what I tried; unfortunately, an empty image is created when I access [clampFilter outputImage]. If I only perform the blur, an image is produced, but with the dark inset border.
CIImage *inputImage = [[CIImage alloc] initWithCGImage:self.CGImage];
CIContext *context = [CIContext contextWithOptions:nil];
CGAffineTransform transform = CGAffineTransformIdentity;
CIFilter *clampFilter = [CIFilter filterWithName:@"CIAffineClamp"];
[clampFilter setValue:inputImage forKey:kCIInputImageKey];
[clampFilter setValue:[NSValue valueWithBytes:&transform objCType:@encode(CGAffineTransform)] forKey:@"inputTransform"];
CIImage *outputImage = [clampFilter outputImage];
CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"
                                   keysAndValues:kCIInputImageKey, outputImage, @"inputRadius", [NSNumber numberWithFloat:radius], nil];
outputImage = [blurFilter outputImage];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *blurredImage = [UIImage imageWithCGImage:cgimg];
CGImageRelease(cgimg);
The CIAffineClamp filter sets the output image's extent to infinite, which then confounds your context. Try saving off the pre-clamp extent CGRect and supplying that rect to createCGImage:fromRect: instead of the now-infinite [outputImage extent], as sketched below.
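A rough sketch of that approach, written as a UIImage category method like the question's code; blurredImageWithRadius: is just an illustrative name:
- (UIImage *)blurredImageWithRadius:(CGFloat)radius
{
    CIImage *inputImage = [[CIImage alloc] initWithCGImage:self.CGImage];
    CGRect originalExtent = [inputImage extent]; // finite; save it before clamping makes the extent infinite

    CIFilter *clampFilter = [CIFilter filterWithName:@"CIAffineClamp"];
    [clampFilter setValue:inputImage forKey:kCIInputImageKey];
    [clampFilter setValue:[NSValue valueWithCGAffineTransform:CGAffineTransformIdentity] forKey:@"inputTransform"];

    CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blurFilter setValue:[clampFilter outputImage] forKey:kCIInputImageKey];
    [blurFilter setValue:[NSNumber numberWithFloat:radius] forKey:@"inputRadius"];

    // Render only the original rect; the clamped/blurred image's own extent is infinite.
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgimg = [context createCGImage:[blurFilter outputImage] fromRect:originalExtent];
    UIImage *result = [UIImage imageWithCGImage:cgimg];
    CGImageRelease(cgimg);
    return result;
}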
I'm having trouble with Core Image. I'm getting an image from a UIImageView and then using some code I found in tutorials (I'm new to Core Image), but when I try to put the sepia image back into the same UIImageView, it just disappears. I have tested whether the image view contains an image, and it does, but it is not visible. Any suggestions on what to do?
EDIT:
Okay, I got the sepia filter to work, so then I tried posterize and had the same problem: the image just disappears. Here is the code:
CIImage *beginImage = [CIImage imageWithCGImage:[image_view.image CGImage]];
context = [CIContext contextWithOptions:nil];
filter = [CIFilter filterWithName:@"CIColorPosterize" keysAndValues:kCIInputImageKey, beginImage, @"inputLevels", [NSNumber numberWithFloat:0.8], nil];
CIImage *outputImage = [filter outputImage];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *newImg = [UIImage imageWithCGImage:cgimg];
[image_view setImage:newImg];
CGImageRelease(cgimg);
Try something like this (let's say you have a UIImageView *myImageView):
CIImage *beginImage = [CIImage imageWithCGImage:[myImageView.image CGImage]];
CIContext *context = [CIContext contextWithOptions:nil];
CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone" keysAndValues:kCIInputImageKey, beginImage, @"inputIntensity", [NSNumber numberWithFloat:0.8], nil];
CIImage *outputImage = [filter outputImage];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *newImg = [UIImage imageWithCGImage:cgimg];
[myImageView setImage:newImg];
CGImageRelease(cgimg);
The same code in Swift 3.0 on Xcode 8.2 (my image view in the project is @IBOutlet var photoImageView: UIImageView!).
Checked successfully on an iPhone 7 Plus device running iOS 10.
guard let myImage = self.photoImageView.image else
{
    return
}
guard let cg = myImage.cgImage else
{
    return
}
let beginImage = CIImage(cgImage: cg)
let context = CIContext(options: nil)
let inputParams: [String: Any] = [kCIInputImageKey: beginImage,
                                  "inputIntensity": NSNumber(value: 0.8)]
guard let filter = CIFilter(name: "CISepiaTone", withInputParameters: inputParams) else
{
    return
}
if let outputImage = filter.outputImage,
   let cgImage = context.createCGImage(outputImage, from: outputImage.extent)
{
    photoImageView.image = UIImage(cgImage: cgImage)
}