Apply CIFilter GaussianBlur - swift

As the title says, I need to apply a Gaussian blur to a UIImage. I searched for a tutorial but I am still not able to implement it. I tried this:
var imageToBlur = CIImage(image: coro.logo)
var blurfilter = CIFilter(name: "CIGaussianBlur")
blurfilter.setValue(imageToBlur, forKey: "inputImage")
blurfilter.setValue(2, forKey: "inputImage")
var resultImage = blurfilter.valueForKey("outputImage") as! CIImage
var blurredImage = UIImage(CIImage: resultImage)
self.immagineCoro.image = blurredImage
I am importing the CoreImage framework, but Xcode shows me an error ("NSInvalidArgumentException") at line 5. Can anyone help me implement Gaussian blur and CIFilter in general?
Edit: thanks to you both, but I have another question: I need to apply the blur only to a small part of the image, like this

I just tried your code, and here's the modification I suggest, this works:
let fileURL = NSBundle.mainBundle().URLForResource("th", withExtension: "png")
let beginImage = CIImage(contentsOfURL: fileURL)
var blurfilter = CIFilter(name: "CIGaussianBlur")
blurfilter.setValue(beginImage, forKey: "inputImage")
//blurfilter.setValue(2, forKey: "inputImage")
var resultImage = blurfilter.valueForKey("outputImage") as! CIImage
var blurredImage = UIImage(CIImage: resultImage)
self.profileImageView.image = blurredImage
So, commenting out the line shown above did the trick, and I get a blurred image as expected. I'm using a file path, but this shouldn't make a difference from what you have.

You've used inputImage twice. The second time is probably meant to be inputRadius.
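In current Swift syntax, a corrected version of the snippet from the question (keeping `coro.logo` and `immagineCoro` from the question; the radius of 2 is the value the original code tried to set) would look something like:

```swift
import CoreImage
import UIKit

let imageToBlur = CIImage(image: coro.logo)
let blurFilter = CIFilter(name: "CIGaussianBlur")!
blurFilter.setValue(imageToBlur, forKey: kCIInputImageKey)
blurFilter.setValue(2, forKey: kCIInputRadiusKey)   // radius, not a second inputImage
if let resultImage = blurFilter.outputImage {
    self.immagineCoro.image = UIImage(ciImage: resultImage)
}
```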

You might want to create a CIImage greyscale mask image with the shape you want, a blurred CIImage (using CIGaussianBlur), and then use CIBlendWithMask to blend them together.
The inputs of CIBlendWithMask are the input image (the blurred image), the input background image (the unblurred image), and the mask image (the shape you want). The output is the image you desire.
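The pipeline described above can be sketched roughly as follows. The function name, the radial mask geometry, and the blur radius are all illustrative assumptions, not from the question:

```swift
import CoreImage
import UIKit

// Sketch: blur only a circular region of an image using CIBlendWithMask.
func blurRegion(of sourceImage: UIImage, maskCenter: CGPoint, maskRadius: CGFloat) -> UIImage? {
    guard let input = CIImage(image: sourceImage) else { return nil }

    // 1. Blurred version of the whole image.
    let blur = CIFilter(name: "CIGaussianBlur")!
    blur.setValue(input, forKey: kCIInputImageKey)
    blur.setValue(8.0, forKey: kCIInputRadiusKey)
    guard let blurred = blur.outputImage else { return nil }

    // 2. Greyscale mask: a white circle fading to black (CIRadialGradient).
    let gradient = CIFilter(name: "CIRadialGradient")!
    gradient.setValue(CIVector(cgPoint: maskCenter), forKey: "inputCenter")
    gradient.setValue(maskRadius, forKey: "inputRadius0")
    gradient.setValue(maskRadius + 20, forKey: "inputRadius1")
    gradient.setValue(CIColor.white, forKey: "inputColor0")
    gradient.setValue(CIColor.black, forKey: "inputColor1")
    guard let mask = gradient.outputImage else { return nil }

    // 3. Blend: blurred where the mask is white, original where it is black.
    let blend = CIFilter(name: "CIBlendWithMask")!
    blend.setValue(blurred, forKey: kCIInputImageKey)
    blend.setValue(input, forKey: kCIInputBackgroundImageKey)
    blend.setValue(mask, forKey: kCIInputMaskImageKey)
    guard let output = blend.outputImage else { return nil }

    let context = CIContext()
    guard let cgImage = context.createCGImage(output, from: input.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}
```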

Related

CIGaussianBlur shrinks UIImageView

Using CIGaussianBlur causes UIImageView to apply the blur from the border in, making the image appear to shrink (right image). Using .blur on a SwiftUI view does the opposite; the blur is applied from the border outwards (left image). This is the effect I’m trying to achieve in UIKit. How can I go about this?
I've seen a few posts about using CIAffineClamp, but that causes the blur to stop at the image border, which is not what I want.
private let context = CIContext()
private let filter = CIFilter(name: "CIGaussianBlur")!

private func createBlurredImage(using image: UIImage, value: CGFloat) -> UIImage? {
    let beginImage = CIImage(image: image)
    filter.setValue(beginImage, forKey: kCIInputImageKey)
    filter.setValue(value, forKey: kCIInputRadiusKey)
    guard
        let outputImage = filter.outputImage,
        let cgImage = context.createCGImage(outputImage, from: outputImage.extent)
    else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
When I used CIGaussianBlur I wanted my output image to be contained inside the image frame, so I used CIAffineClamp on the image before applying the blur, as you describe.
You might need to render your source image into a larger frame, clamp to that larger frame using CIAffineClamp, apply your blur filter, then load the resulting blurred output image. Core Image is a bit of a pain to set up and figure out, so I don’t have a full solution ready for you, but that’s what I would suggest.
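One way to sketch the outward blur the question asks for: skip the clamp entirely, so the blur feathers the edges into transparency, then render a rect larger than the original image so the feathered edges survive the crop. The function name and padding value are illustrative assumptions:

```swift
import CoreImage
import UIKit

// Sketch: let the blur spill outside the original image bounds.
// Without CIAffineClamp, CIGaussianBlur's output extent grows beyond
// the input extent and the edges fade into transparency.
func blurBeyondBorders(_ image: UIImage, radius: CGFloat, padding: CGFloat = 40) -> UIImage? {
    guard let input = CIImage(image: image) else { return nil }

    let blur = CIFilter(name: "CIGaussianBlur")!
    blur.setValue(input, forKey: kCIInputImageKey)   // no clamp on purpose
    blur.setValue(radius, forKey: kCIInputRadiusKey)
    guard let blurred = blur.outputImage else { return nil }

    // Render a rect larger than the original so the outward blur is kept.
    let paddedRect = input.extent.insetBy(dx: -padding, dy: -padding)
    guard let cgImage = CIContext().createCGImage(blurred, from: paddedRect) else { return nil }
    return UIImage(cgImage: cgImage)
}
```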

Add dark and highlight effect on image like Apple's native app

Hi everyone. I am facing an issue implementing a dark and highlight effect using Core Image's built-in filters. I have worked with other filters, but this one gives me the wrong result, or maybe I am not using the right filter. I am using the CIFilter.highlightShadowAdjust filter. Here is my implementation:
let context = CIContext(options: nil)
let filter = CIFilter.highlightShadowAdjust()
let aCIImage = CIImage(image: self.image)
filter.setValue(aCIImage, forKey: kCIInputImageKey)
filter.setValue(highlitValue, forKey: "inputHighlightAmount")
filter.setValue(shadowValue, forKey: "inputShadowAmount")
filter.setValue(radiousValue, forKey: "inputRadius")
guard let outputImage = filter.outputImage else {return UIImage()}
guard let cgimg = context.createCGImage(outputImage, from: outputImage.extent) else {return UIImage()}
return UIImage(cgImage: cgimg)
For clarification I also uploaded a video of what I want to achieve.
Any suggestion or guidance will be greatly appreciated.
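One thing worth checking: CIFilter.highlightShadowAdjust() returns a typed filter from CoreImage.CIFilterBuiltins, so its typed properties can be used instead of string keys, which removes one source of silent key mismatches. A hedged sketch of the same code in that style (the function name and parameter ranges in comments are illustrative):

```swift
import CoreImage
import CoreImage.CIFilterBuiltins
import UIKit

// Sketch: same filter as in the question, but using the typed
// CIFilterBuiltins properties instead of setValue(_:forKey:).
func adjustHighlightsAndShadows(_ image: UIImage,
                                highlight: Float,
                                shadow: Float,
                                radius: Float) -> UIImage? {
    guard let input = CIImage(image: image) else { return nil }

    let filter = CIFilter.highlightShadowAdjust()
    filter.inputImage = input
    filter.highlightAmount = highlight
    filter.shadowAmount = shadow
    filter.radius = radius

    guard let output = filter.outputImage else { return nil }
    let context = CIContext(options: nil)
    guard let cgImage = context.createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}
```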

UIView invert mask MaskView

I want to apply an inverted mask to my UIView. I set the mask to a UIImageView with a transparent image. However the output with
view.mask = imageView
is not the desired result. How can I achieve the desired result as I illustrated below? The desired result uses the mask cutout as transparency. When I check the mask of the View, it isn't a CAShapeLayer so I can't invert it that way.
Seems like you could do a few things. You could use the image you have but mask a white view and place a blue view behind it. Or you could adjust the image asset you're using by reversing the transparency. Or you could use Core Image to do that in code. For example:
func invertMask(_ image: UIImage) -> UIImage? {
    guard let inputMaskImage = CIImage(image: image),
          let backgroundImageFilter = CIFilter(name: "CIConstantColorGenerator",
                                               withInputParameters: [kCIInputColorKey: CIColor.black]),
          let inputColorFilter = CIFilter(name: "CIConstantColorGenerator",
                                          withInputParameters: [kCIInputColorKey: CIColor.clear]),
          let inputImage = inputColorFilter.outputImage,
          let backgroundImage = backgroundImageFilter.outputImage,
          let filter = CIFilter(name: "CIBlendWithAlphaMask",
                                withInputParameters: [kCIInputImageKey: inputImage,
                                                      kCIInputBackgroundImageKey: backgroundImage,
                                                      kCIInputMaskImageKey: inputMaskImage]),
          let filterOutput = filter.outputImage,
          let outputImage = CIContext().createCGImage(filterOutput, from: inputMaskImage.extent)
    else { return nil }
    return UIImage(cgImage: outputImage)
}

cast SKSpriteNode to UIImage

I want to convert an SKSpriteNode into a UIImage.
let testImage = SKSpriteNode(imageNamed: "PlainProject") as! UIImage
but I get a crash on the line above. Is there another way to do this?
You should get the cgImage of the sprite's texture first, and then create a UIImage from it:
let testImage = SKSpriteNode(imageNamed: "someImage.png")
let image = UIImage(cgImage: (testImage.texture?.cgImage())!)
or a better version without force-unwrapping a cgImage that might be nil:
var image = UIImage()
if let cgImage = testImage.texture?.cgImage() {
    image = UIImage(cgImage: cgImage)
}
Result (in playground):
That is exactly what my image looks like. Hope this helps!
You cannot do that directly.
First you need to get the texture from the SKSpriteNode.
After that you can create a UIImage with textureOfNode!.cgImage(), as shown in the example below:
let testNode = SKSpriteNode(imageNamed: "Spaceship")
let textureOfNode = testNode.texture
let imageFromTexture = UIImage.init(cgImage: textureOfNode!.cgImage())
print(imageFromTexture) //<UIImage: 0x61000008afa0>, {394, 347}
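Note that the texture-based approach above only captures the sprite's own texture. To capture a node together with its children as rendered, SKView.texture(from:) can be used instead. A small sketch (the function name is illustrative, and `skView` is assumed to be the view presenting your scene):

```swift
import SpriteKit
import UIKit

// Sketch: render any SKNode (including its children) to a UIImage
// by asking the presenting SKView for a snapshot texture.
func image(of node: SKNode, in skView: SKView) -> UIImage? {
    guard let texture = skView.texture(from: node) else { return nil }
    return UIImage(cgImage: texture.cgImage())
}
```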

Applying a CIFilter with SKEffectNode to a SKSpriteNode

I am trying to apply a CIFilter to an SKSpriteNode using an SKEffectNode.
I can't get this to work and I'm not sure why. I got it working with a blur, but I cannot get the mono photo effect to work.
Does anyone have any insight? Thanks again.
// Mono effect (not working)
let filter = CIFilter(name: "CIPhotoEffectMono")
filter?.setDefaults()
effectsNode.filter = filter
self.addChild(effectsNode)
effectsNode.addChild(spriteNode)

// Blur effect (working)
let filter = CIFilter(name: "CIGaussianBlur")
let blurAmount = 4.0
filter?.setValue(blurAmount, forKey: kCIInputRadiusKey)
effectsNode.filter = filter
effectsNode.blendMode = .alpha
self.addChild(effectsNode)
effectsNode.addChild(spriteNode)
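For reference, a minimal self-contained scene setup for the mono effect might look like the sketch below. The class name, texture name, and structure are illustrative assumptions; CIPhotoEffectMono takes no parameters besides the input image, which the effect node supplies from its rendered children:

```swift
import SpriteKit

// Sketch: applying CIPhotoEffectMono through an SKEffectNode.
class MonoScene: SKScene {
    override func didMove(to view: SKView) {
        let effectsNode = SKEffectNode()
        effectsNode.filter = CIFilter(name: "CIPhotoEffectMono")
        effectsNode.shouldEnableEffects = true

        let sprite = SKSpriteNode(imageNamed: "Spaceship")
        effectsNode.addChild(sprite)   // children are rendered, then filtered
        addChild(effectsNode)
    }
}
```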