cast SKSpriteNode to UIImage - swift

I want to convert an SKSpriteNode into a UIImage:
let testImage = SKSpriteNode(imageNamed: "PlainProject") as! UIImage
but that cast crashes at runtime. Is there another way to do this?

You should get the cgImage of the sprite's texture first, and then create a UIImage from it:
let testImage = SKSpriteNode(imageNamed: "someImage.png")
let image = UIImage(cgImage: (testImage.texture?.cgImage())!)
or a better version that doesn't force-unwrap a cgImage that might be nil:
var image = UIImage()
if let cgImage = testImage.texture?.cgImage() {
    image = UIImage(cgImage: cgImage)
}
Result (in a playground): the output renders exactly like the original image. Hope this helps!

You cannot do that directly.
First you need to get the texture from the SKSpriteNode.
After that you can create a UIImage from textureOfNode!.cgImage(), as shown in the example below:
let testNode = SKSpriteNode(imageNamed: "Spaceship")
let textureOfNode = testNode.texture
let imageFromTexture = UIImage(cgImage: textureOfNode!.cgImage())
print(imageFromTexture) //<UIImage: 0x61000008afa0>, {394, 347}
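Note that the texture-based conversion only captures the sprite's own texture. If you need the node rendered as it appears on screen (including children and scaling), a minimal sketch using SKView.texture(from:) could look like this, assuming you have an SKView available (the helper name is hypothetical):
import SpriteKit
import UIKit

// Renders a node (including its children) into a UIImage via an SKView.
func image(from node: SKNode, in view: SKView) -> UIImage? {
    guard let texture = view.texture(from: node) else { return nil }
    return UIImage(cgImage: texture.cgImage())
}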

Related

Why does the SceneKit material look different, even when the image is the same?

A material's contents property supports many kinds of input; two of these are NSImage (or UIImage) and SKTexture.
I noticed that when loading the same image file (.png) with these different loaders, the material is rendered differently.
I'm fairly sure SpriteKit applies some extra transformation when it loads the texture, but I don't know what it is.
This is the rendered example:
About the code:
// 1. Plain color
let plane = SCNPlane(width: 1, height: 1)
plane.firstMaterial?.diffuse.contents = NSColor.green

// 2. NSImage
let plane = SCNPlane(width: 1, height: 1)
plane.firstMaterial?.diffuse.contents = NSImage(named: "texture")

// 3. SKTexture
let plane = SCNPlane(width: 1, height: 1)
plane.firstMaterial?.diffuse.contents = SKTexture(imageNamed: "texture")
The complete example is here: https://github.com/Maetschl/SceneKitExamples/tree/master/MaterialTests
I think this has something to do with color spaces/gamma correction. My guess is that textures loaded via the SKTexture(imageNamed:) initializer aren't properly gamma corrected. You would think this would be documented somewhere, or other people would have noticed, but I can't seem to find anything.
Here's some code to swap with the last image in your linked sample project. I've force unwrapped as much as possible for brevity:
// Create the texture using the SKTexture(cgImage:) init
// to prove it has the same output image as SKTexture(imageNamed:)
let originalDogNSImage = NSImage(named: "dog")!
var originalDogRect = CGRect(x: 0, y: 0, width: originalDogNSImage.size.width, height: originalDogNSImage.size.height)
let originalDogCGImage = originalDogNSImage.cgImage(forProposedRect: &originalDogRect, context: nil, hints: nil)!
let originalDogTexture = SKTexture(cgImage: originalDogCGImage)
// Create the ciImage of the original image to use as the input for the CIFilter
let imageData = originalDogNSImage.tiffRepresentation!
let ciImage = CIImage(data: imageData)
// Create the gamma adjustment Core Image filter
let gammaFilter = CIFilter(name: "CIGammaAdjust")!
gammaFilter.setValue(ciImage, forKey: kCIInputImageKey)
// 0.75 is the default. 2.2 makes the dog image mostly match the NSImage(named:) initializer
gammaFilter.setValue(2.2, forKey: "inputPower")
// Create a SKTexture using the output of the CIFilter
let gammaCorrectedDogCIImage = gammaFilter.outputImage!
let gammaCorrectedDogCGImage = CIContext().createCGImage(gammaCorrectedDogCIImage, from: gammaCorrectedDogCIImage.extent)!
let gammaCorrectedDogTexture = SKTexture(cgImage: gammaCorrectedDogCGImage)
// Looks bad, like in StackOverflow question image.
// let planeWithSKTextureDog = planeWith(diffuseContent: originalDogTexture)
// Looks correct
let planeWithSKTextureDog = planeWith(diffuseContent: gammaCorrectedDogTexture)
Using a CIGammaAdjust filter with an inputPower of 2.2 makes the SKTexture almost, if not exactly, match the NSImage(named:) init. I've included the original image being loaded through SKTexture(cgImage:) to rule out any changes caused by using that initializer versus the SKTexture(imageNamed:) you asked about.
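The same workaround can be folded into a reusable helper. This is a minimal sketch, assuming macOS (NSImage) as in the linked sample project; the function name is hypothetical:
import AppKit
import SpriteKit
import CoreImage

// Loads an image by name and returns an SKTexture with a gamma adjustment
// applied, so it renders like the NSImage(named:) material contents.
func gammaCorrectedTexture(imageNamed name: String, power: Double = 2.2) -> SKTexture? {
    guard let nsImage = NSImage(named: name),
          let tiff = nsImage.tiffRepresentation,
          let ciImage = CIImage(data: tiff),
          let gammaFilter = CIFilter(name: "CIGammaAdjust") else { return nil }

    gammaFilter.setValue(ciImage, forKey: kCIInputImageKey)
    gammaFilter.setValue(power, forKey: "inputPower")

    guard let output = gammaFilter.outputImage,
          let cgImage = CIContext().createCGImage(output, from: output.extent) else { return nil }
    return SKTexture(cgImage: cgImage)
}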

UIView invert mask MaskView

I want to apply an inverted mask to my UIView. I set the mask to a UIImageView containing a transparent image. However, the output of
view.mask = imageView
is not the desired result. How can I achieve the result I illustrated below, which uses the mask's cutout as transparency? When I check the view's mask, it isn't a CAShapeLayer, so I can't invert it that way.
Seems like you could do a few things. You could use the image you have but mask a white view and place a blue view behind it. Or you could adjust the image asset you're using by reversing its transparency. Or you could use Core Image to do that in code. For example:
func invertMask(_ image: UIImage) -> UIImage? {
    guard let inputMaskImage = CIImage(image: image),
        let backgroundImageFilter = CIFilter(name: "CIConstantColorGenerator", withInputParameters: [kCIInputColorKey: CIColor.black]),
        let inputColorFilter = CIFilter(name: "CIConstantColorGenerator", withInputParameters: [kCIInputColorKey: CIColor.clear]),
        let inputImage = inputColorFilter.outputImage,
        let backgroundImage = backgroundImageFilter.outputImage,
        let filter = CIFilter(name: "CIBlendWithAlphaMask", withInputParameters: [kCIInputImageKey: inputImage, kCIInputBackgroundImageKey: backgroundImage, kCIInputMaskImageKey: inputMaskImage]),
        let filterOutput = filter.outputImage,
        let outputImage = CIContext().createCGImage(filterOutput, from: inputMaskImage.extent) else { return nil }
    return UIImage(cgImage: outputImage)
}
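A minimal usage sketch, assuming maskImage is your original mask asset and viewToMask is the view you want to cut out (both names are hypothetical):
// Invert the mask image, then install it as the view's mask.
if let inverted = invertMask(maskImage) {
    let maskView = UIImageView(image: inverted)
    maskView.frame = viewToMask.bounds
    viewToMask.mask = maskView
}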

Swift playground - cannot convert a filtered UIImage to a CGImage

In a Swift playground, I am loading a JPEG, converting it to a UIImage, and filtering it to monochrome.
I then convert the resulting filtered image back to a UIImage.
The input and the filtered images both display correctly.
I then convert both images to the CGImage type.
This works for the input image, but the conversion returns nil for the filtered image:
// Get an input image
let imageFilename = "yosemite.jpg"
let inputImage = UIImage(named: imageFilename)
let inputCIImage = CIImage(image: inputImage!)
// Filter the input image - make it monochrome
let filter = CIFilter(name: "CIPhotoEffectMono")
filter!.setDefaults()
filter!.setValue(inputCIImage, forKey: kCIInputImageKey)
let CIout = filter!.outputImage
let filteredImage = UIImage(ciImage: CIout!)
// Convert the input image to a CGImage
let inputCGImageRef = inputImage!.cgImage // Result: <CGImage 0x7fdd095023d0>
// THE LINE ABOVE WORKS
// Try to convert the filtered image to a CGImage
let filteredCGImageRef = filteredImage.cgImage // Result: nil
// THE LINE ABOVE DOES NOT WORK
// Note that the compiler objects to 'filteredImage!.cgImage'
What's wrong?
A UIImage created from a CIImage as you've done isn't backed by a CGImage. You need to explicitly create one:
let context = CIContext()
let filteredCGImageRef = context.createCGImage(CIout!, from: CIout!.extent)
If you need a UIImage, create it from the CGImage rendered by the CIContext (note that createCGImage returns an optional):
UIImage(cgImage: filteredCGImageRef!)
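The same pattern can be wrapped in a small helper. This is a minimal sketch, not an official API; the function name is hypothetical:
import UIKit
import CoreImage

// Renders a CIImage through a CIContext so the resulting UIImage is
// backed by a real CGImage.
func renderToUIImage(_ ciImage: CIImage, context: CIContext = CIContext()) -> UIImage? {
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}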

Black & white effect/filter

I have a UIImage and I would like to convert it to a black & white picture; if you know how to apply other filters, I would also appreciate that, for example a chrome filter (with nice colours).
Most of the code I have found is in Objective-C, which I don't understand well.
Right now I am using this code to apply some effects:
func applyFilter() {
    let inputImage = CIImage(image: tempImageView.image!)
    let randomColor = [kCIInputAngleKey: (Double(arc4random_uniform(314)) / 100)]
    let filteredImage = inputImage!.applyingFilter("CIHueAdjust", parameters: randomColor)
    let renderedImage = context.createCGImage(filteredImage, from: filteredImage.extent) // `context` is a CIContext defined elsewhere
    tempImageView.image = UIImage(cgImage: renderedImage!)
}
It works, but the effects are terrible.
Thanks.
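One way to get the effects you describe is Core Image's preset photo filters: "CIPhotoEffectMono" produces a black & white image and "CIPhotoEffectChrome" gives the chrome look. A minimal sketch (the helper name is hypothetical):
import UIKit
import CoreImage

// Applies a named Core Image photo effect and renders the result back
// into a CGImage-backed UIImage.
func applyPhotoEffect(_ name: String, to image: UIImage) -> UIImage? {
    guard let ciImage = CIImage(image: image) else { return nil }
    let filtered = ciImage.applyingFilter(name, parameters: [:])
    let context = CIContext()
    guard let cgImage = context.createCGImage(filtered, from: filtered.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}

// Usage:
// tempImageView.image = applyPhotoEffect("CIPhotoEffectMono", to: tempImageView.image!)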

Apply CIFilter GaussianBlur

As the title says, I need to apply a Gaussian blur to a UIImage; I searched for a tutorial but I am still not able to implement it. I tried this
var imageToBlur = CIImage(image: coro.logo)
var blurfilter = CIFilter(name: "CIGaussianBlur")
blurfilter.setValue(imageToBlur, forKey: "inputImage")
blurfilter.setValue(2, forKey: "inputImage")
var resultImage = blurfilter.valueForKey("outputImage") as! CIImage
var blurredImage = UIImage(CIImage: resultImage)
self.immagineCoro.image = blurredImage
importing the CoreImage framework, but Xcode shows me an error ("NSInvalidArgumentException") at line 5. Can anyone help me implement Gaussian blur and CIFilter in general?
Edit: thanks to you both, but I have another question; I need to apply the blur to only a small part of the image, like this
I just tried your code; here's the modification I suggest, which works:
let fileURL = Bundle.main.url(forResource: "th", withExtension: "png")
let beginImage = CIImage(contentsOf: fileURL!)
let blurfilter = CIFilter(name: "CIGaussianBlur")!
blurfilter.setValue(beginImage, forKey: "inputImage")
//blurfilter.setValue(2, forKey: "inputImage") // this duplicate "inputImage" key was the source of the crash
let resultImage = blurfilter.outputImage!
let blurredImage = UIImage(ciImage: resultImage)
self.profileImageView.image = blurredImage
So, commenting out the line you see above did the trick, and I get a blurred image as expected. I'm loading from a file URL, but this shouldn't make a difference from what you have.
You've used inputImage twice. The second time was probably meant to be inputRadius.
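In other words, the corrected pair of calls would look like this (kCIInputRadiusKey is the constant for "inputRadius"):
blurfilter.setValue(imageToBlur, forKey: kCIInputImageKey)
blurfilter.setValue(2, forKey: kCIInputRadiusKey) // radius, not a second image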
You might want to create a greyscale CIImage mask with the shape you want and a blurred CIImage (using CIGaussianBlur), and then use CIBlendWithMask to blend them together.
The inputs of CIBlendWithMask are the input image (the blurred image), the background image (the unblurred image), and the mask image (the shape you want). The output is the image you desire.
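A minimal sketch of that mask-and-blend approach, assuming sourceImage is the UIImage to blur and maskImage is a greyscale UIImage whose white areas mark the region to blur (both names, and the helper itself, are hypothetical):
import UIKit
import CoreImage

// Blurs only the region of `sourceImage` selected by the white areas of `maskImage`.
func blur(_ sourceImage: UIImage, inRegionMaskedBy maskImage: UIImage, radius: Double = 8) -> UIImage? {
    guard let input = CIImage(image: sourceImage),
          let mask = CIImage(image: maskImage) else { return nil }

    // Blur the whole image first.
    let blurred = input.applyingFilter("CIGaussianBlur", parameters: [kCIInputRadiusKey: radius])

    // Blend: blurred image where the mask is white, the original elsewhere.
    let blended = blurred.applyingFilter("CIBlendWithMask", parameters: [
        kCIInputBackgroundImageKey: input,
        kCIInputMaskImageKey: mask
    ])

    // Crop back to the original extent (the blur expands the image's edges).
    guard let cgImage = CIContext().createCGImage(blended, from: input.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}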