Applying a CIFilter with SKEffectNode to an SKSpriteNode

I am trying to apply a CIFilter to an SKSpriteNode using an SKEffectNode.
I can't get this to work and I'm not sure why. I got it working with a blur, but I cannot get the mono photo effect to work.
Does anyone have any insight? Thanks again.
// Mono effect (not working)
let filter = CIFilter(name: "CIPhotoEffectMono")
filter?.setDefaults()
effectsNode.filter = filter
self.addChild(effectsNode)
effectsNode.addChild(spriteNode)
// Blur effect (working)
let filter = CIFilter(name: "CIGaussianBlur")
let blurAmount = 4.0
filter?.setValue(blurAmount, forKey: kCIInputRadiusKey)
effectsNode.filter = filter
effectsNode.blendMode = .alpha
self.addChild(effectsNode)
effectsNode.addChild(spriteNode)
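For comparison, here is a minimal, self-contained sketch of the mono setup inside an SKScene. The asset name is a placeholder, and `shouldEnableEffects` is set explicitly in case it was toggled off elsewhere (assigning a filter normally enables it):

```swift
import SpriteKit

// Inside an SKScene subclass, e.g. in didMove(to:)
let effectsNode = SKEffectNode()
effectsNode.filter = CIFilter(name: "CIPhotoEffectMono")
effectsNode.shouldEnableEffects = true  // usually set automatically when a filter is assigned

let spriteNode = SKSpriteNode(imageNamed: "texture")  // hypothetical asset name
effectsNode.addChild(spriteNode)
addChild(effectsNode)
```

Note that the child sprite must be added to the effect node (not directly to the scene) for the filter to apply to it.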


Multiple CIImages into a single CIFilter

I'm trying to add an image on top of a video, and then I found this link.
I've got two CIImages but I'm not sure how to combine them using CIFilter in Swift. I guess the article is written in Objective-C (is that correct?). I'm struggling to find which Swift API corresponds to CIFilter's filterWithName:. Could anyone tell me which method I should use with CIFilter?
In Swift, you can use the newish (since iOS 13) protocol-based interface for CIFilters:
import CoreImage.CIFilterBuiltins
let filter = CIFilter.blendWithMask()
filter.inputImage = image
filter.backgroundImage = otherImage
filter.maskImage = mask
let output = filter.outputImage
Alternatively, you can use the string-based API (the one used in the linked Objective-C code) in Swift as well:
let filter = CIFilter(name: "CIBlendWithMask")
filter?.setValue(image, forKey: kCIInputImageKey)
filter?.setValue(otherImage, forKey: kCIInputBackgroundImageKey)
filter?.setValue(mask, forKey: kCIInputMaskImageKey)
let output = filter?.outputImage

Add Dark and Highlight effects to an image, like Apple's native Photos app

Hi everyone. I'm having trouble implementing Dark and Highlight effects using Core Image's built-in filters. I've worked with other filters, but this one gives me the wrong result, or maybe I'm not using the right filter. I'm using the CIFilter.highlightShadowAdjust filter. Here is my implementation:
let context = CIContext(options: nil)
let filter = CIFilter.highlightShadowAdjust()
let aCIImage = CIImage(image: self.image)
filter.setValue(aCIImage, forKey: kCIInputImageKey)
filter.setValue(highlitValue, forKey: "inputHighlightAmount")
filter.setValue(shadowValue, forKey: "inputShadowAmount")
filter.setValue(radiousValue, forKey: "inputRadius")
guard let outputImage = filter.outputImage else {return UIImage()}
guard let cgimg = context.createCGImage(outputImage, from: outputImage.extent) else {return UIImage()}
return UIImage(cgImage: cgimg)
For clarification, I also uploaded a video of what I want to achieve.
Any suggestion or guidance would be greatly appreciated.
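As an aside, since the code above already uses the protocol-based CIFilter.highlightShadowAdjust(), it can set the typed properties directly instead of string keys. A sketch, where `highlight`, `shadow`, and `radius` stand in for the slider values in the question:

```swift
import UIKit
import CoreImage.CIFilterBuiltins

func adjustedImage(from image: UIImage,
                   highlight: Float,
                   shadow: Float,
                   radius: Float) -> UIImage? {
    guard let input = CIImage(image: image) else { return nil }
    let filter = CIFilter.highlightShadowAdjust()
    filter.inputImage = input
    filter.highlightAmount = highlight  // 1 is the default (no change)
    filter.shadowAmount = shadow        // 0 is the default (no change)
    filter.radius = radius              // radius of the edge-aware adjustment
    guard let output = filter.outputImage,
          let cgImage = CIContext().createCGImage(output, from: output.extent)
    else { return nil }
    return UIImage(cgImage: cgImage)
}
```

With the defaults noted in the comments, start from `highlightAmount = 1` and `shadowAmount = 0` and move each slider away from its default to see which direction matches the native app's Dark and Highlight controls.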

Why does the SceneKit material look different, even when the image is the same?

The material content supports many options for loading; two of these are NSImage (or UIImage) and SKTexture.
I noticed that when loading the same image file (.png) with different loaders, the material is rendered differently.
I'm fairly sure it is an extra property applied by a SpriteKit transformation, but I don't know what it is.
This is the rendered example:
About the code:
let plane = SCNPlane(width: 1, height: 1)
plane.firstMaterial?.diffuse.contents = NSColor.green
let plane = SCNPlane(width: 1, height: 1)
plane.firstMaterial?.diffuse.contents = NSImage(named: "texture")
let plane = SCNPlane(width: 1, height: 1)
plane.firstMaterial?.diffuse.contents = SKTexture(imageNamed: "texture")
The complete example is here: https://github.com/Maetschl/SceneKitExamples/tree/master/MaterialTests
I think this has something to do with color spaces/gamma correction. My guess is that textures loaded via the SKTexture(imageNamed:) initializer aren't properly gamma corrected. You would think this would be documented somewhere, or other people would have noticed, but I can't seem to find anything.
Here's some code to swap with the last image in your linked sample project. I've force unwrapped as much as possible for brevity:
// Create the texture using the SKTexture(cgImage:) init
// to prove it has the same output image as SKTexture(imageNamed:)
let originalDogNSImage = NSImage(named: "dog")!
var originalDogRect = CGRect(x: 0, y: 0, width: originalDogNSImage.size.width, height: originalDogNSImage.size.height)
let originalDogCGImage = originalDogNSImage.cgImage(forProposedRect: &originalDogRect, context: nil, hints: nil)!
let originalDogTexture = SKTexture(cgImage: originalDogCGImage)
// Create the ciImage of the original image to use as the input for the CIFilter
let imageData = originalDogNSImage.tiffRepresentation!
let ciImage = CIImage(data: imageData)
// Create the gamma adjustment Core Image filter
let gammaFilter = CIFilter(name: "CIGammaAdjust")!
gammaFilter.setValue(ciImage, forKey: kCIInputImageKey)
// 0.75 is the default. 2.2 makes the dog image mostly match the NSImage(named:) intializer
gammaFilter.setValue(2.2, forKey: "inputPower")
// Create a SKTexture using the output of the CIFilter
let gammaCorrectedDogCIImage = gammaFilter.outputImage!
let gammaCorrectedDogCGImage = CIContext().createCGImage(gammaCorrectedDogCIImage, from: gammaCorrectedDogCIImage.extent)!
let gammaCorrectedDogTexture = SKTexture(cgImage: gammaCorrectedDogCGImage)
// Looks bad, like in StackOverflow question image.
// let planeWithSKTextureDog = planeWith(diffuseContent: originalDogTexture)
// Looks correct
let planeWithSKTextureDog = planeWith(diffuseContent: gammaCorrectedDogTexture)
Using a CIGammaAdjust filter with an inputPower of 2.2 makes the SKTexture almost (though perhaps not exactly) match the NSImage(named:) init. I've included the original image being loaded through SKTexture(cgImage:) to rule out any changes caused by that initializer versus the SKTexture(imageNamed:) you asked about.

Inverting a UIView mask (maskView)

I want to apply an inverted mask to my UIView. I set the mask to a UIImageView with a transparent image. However, the output with
view.mask = imageView
is not the desired result. How can I achieve the result I illustrated below? The desired result uses the mask cutout as transparency. When I inspect the view's mask, it isn't a CAShapeLayer, so I can't invert it that way.
It seems like you could do a few things. You could use the image you have but mask a white view and place a blue view behind it. Or you could adjust the image asset you're using by reversing the transparency. Or you could use Core Image to do that in code. For example:
func invertMask(_ image: UIImage) -> UIImage? {
    guard let inputMaskImage = CIImage(image: image),
          let backgroundImageFilter = CIFilter(name: "CIConstantColorGenerator", withInputParameters: [kCIInputColorKey: CIColor.black]),
          let inputColorFilter = CIFilter(name: "CIConstantColorGenerator", withInputParameters: [kCIInputColorKey: CIColor.clear]),
          let inputImage = inputColorFilter.outputImage,
          let backgroundImage = backgroundImageFilter.outputImage,
          let filter = CIFilter(name: "CIBlendWithAlphaMask", withInputParameters: [kCIInputImageKey: inputImage, kCIInputBackgroundImageKey: backgroundImage, kCIInputMaskImageKey: inputMaskImage]),
          let filterOutput = filter.outputImage,
          let outputImage = CIContext().createCGImage(filterOutput, from: inputMaskImage.extent) else { return nil }
    let finalOutputImage = UIImage(cgImage: outputImage)
    return finalOutputImage
}

Apply CIFilter GaussianBlur

As the title says, I need to apply a Gaussian blur to a UIImage; I tried to search for a tutorial but I am still not able to implement it. I tried this:
var imageToBlur = CIImage(image: coro.logo)
var blurfilter = CIFilter(name: "CIGaussianBlur")
blurfilter.setValue(imageToBlur, forKey: "inputImage")
blurfilter.setValue(2, forKey: "inputImage")
var resultImage = blurfilter.valueForKey("outputImage") as! CIImage
var blurredImage = UIImage(CIImage: resultImage)
self.immagineCoro.image = blurredImage
importing the CoreImage framework, but Xcode shows me an error ("NSInvalidArgumentException") at line 5. Can anyone help me implement Gaussian blur and CIFilter in general?
Edit: thanks to you both, but I have another question; I need to apply the blur to only a small part of the image, like this:
I just tried your code, and here's the modification I suggest, this works:
let fileURL = NSBundle.mainBundle().URLForResource("th", withExtension: "png")
let beginImage = CIImage(contentsOfURL: fileURL)
var blurfilter = CIFilter(name: "CIGaussianBlur")
blurfilter.setValue(beginImage, forKey: "inputImage")
//blurfilter.setValue(2, forKey: "inputImage")
var resultImage = blurfilter.valueForKey("outputImage") as! CIImage
var blurredImage = UIImage(CIImage: resultImage)
self.profileImageView.image = blurredImage
So, commenting out the line you see above did the trick, and I get a blurred image as expected. I'm using a file path, but this shouldn't make a difference from what you have.
You've used inputImage twice. The second time is probably meant to be inputRadius.
You might want to create a CIImage greyscale mask image with the shape you want, a blurred CIImage (using CIGaussianBlur), and then use CIBlendWithMask to blend them together.
The inputs of CIBlendWithMask are the input image (the blurred image), the input background image (the unblurred image), and the mask image (the shape you want). The output is the image you desire.
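The mask-and-blend approach above can be sketched as follows (assuming a source UIImage and a grayscale mask image that is white where the blur should appear; function and parameter names are illustrative):

```swift
import UIKit
import CoreImage.CIFilterBuiltins

func partiallyBlurred(_ source: UIImage, mask maskImage: UIImage, radius: Float = 8) -> UIImage? {
    guard let input = CIImage(image: source),
          let mask = CIImage(image: maskImage) else { return nil }

    // 1. Blur the whole image.
    let blur = CIFilter.gaussianBlur()
    blur.inputImage = input
    blur.radius = radius
    // Crop back to the original extent; the blur expands the image's edges.
    guard let blurred = blur.outputImage?.cropped(to: input.extent) else { return nil }

    // 2. Blend: blurred where the mask is white, original elsewhere.
    let blend = CIFilter.blendWithMask()
    blend.inputImage = blurred
    blend.backgroundImage = input
    blend.maskImage = mask
    guard let output = blend.outputImage,
          let cgImage = CIContext().createCGImage(output, from: input.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}
```

This keeps the unblurred image as the background and uses the mask to reveal the blurred version only inside the shape you want.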