How to colorize an SKTexture that is created with imageFromFile? - swift

I have created PNG images of several shapes, such as hexagons and stars, and they are all white.
I want to colorize these SKSpriteNodes or SKTextures so that I can assign any color, rather than creating a shape image for each color I need and loading all those textures into memory. I need the images to be high resolution, so keeping several images in memory is not a good option, since I receive memory warnings from Xcode. Loading them into memory as they are needed also does not work, because doing so makes my game lag for about 0.25 seconds.
So, is there any way I can create 1 SKTexture and colorize the image texture of an SKSpriteNode whenever I need it to change colors?

You can change the color of a white texture like this (assuming that you are using pure white textures):
// Colorize to red
let yourNode = SKSpriteNode(imageNamed: "textureName")
yourNode.colorBlendFactor = 1.0
yourNode.color = UIColor.red
About colorBlendFactor:
The value must be a number between 0.0 and 1.0, inclusive. The default
value (0.0) indicates the color property is ignored and that the
texture’s values should be used unmodified. For values greater than
0.0, the texture is blended with the color before being drawn to the scene.
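If the node needs to change colors while the game is running, the same blend can be animated; a minimal sketch using SKAction.colorize (the target color and duration are placeholders):
let colorize = SKAction.colorize(with: UIColor.green, colorBlendFactor: 1.0, duration: 0.2)
yourNode.run(colorize)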
Hope this helps.

Related

Swift SceneKit Diffuse color doesn't change anything

I have a 3D model in .obj form and a corresponding .mtl file.
I dragged them both into Xcode and converted the .obj model to a .scn file.
Xcode loads the file correctly and also applies the material, so there is a color in the diffuse slot. (See the image below.) The problem is that the model stays white unless I add an emission color. (See the two attached images, one with and one without an emission color; the model is visible in one and not in the other.)
How do I apply the material color correctly?
Assuming the first image shows the result you don't want and the second image is more or less the result you want, I can tell you this: in the first image the emission color is set to pure white, which means the emission is at its maximum, and that makes the model almost invisible on a white background. The second image has only some emission (a gray color), so the geometry is more clearly visible.
If your geometry is neither a lamp nor a light source, set the emission color to UIColor.black, and your object should be just fine.
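For reference, a minimal sketch of that fix (modelNode and the diffuse color are placeholder names):
if let material = modelNode.geometry?.firstMaterial {
    material.diffuse.contents = UIColor.red    // whatever color your .mtl specifies
    material.emission.contents = UIColor.black // no self-illumination
}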

ARKit – Semi-transparent texture pixels cause clipping in multiply blend mode

I am using the basic ARKit example, and if I use two face meshes, one with the wireframe (back) and one with white boxes (front), I get odd results when using the material blend mode multiply. As you can see in the image, the semi-transparent pixels make the back mesh lose alpha in those areas (red arrows). The fully white and fully transparent pixels blend as expected.
I tried this same test in the default SceneKit example too, and there the back object shows black at the semi-transparent pixels. The code for the materials is pretty basic...
let m = SCNMaterial()
m.lightingModel = .physicallyBased
m.blendMode = .multiply
m.metalness.contents = 0.0
m.roughness.contents = 0.7
m.diffuse.contents = #imageLiteral(resourceName: "atest")
p.firstMaterial = m // `p` is the face mesh geometry from the example
I would expect that in multiply blend mode all white and white-ish pixels disappear (as they would in Photoshop). If anyone can confirm this behavior and expectation, that would be great. Thank you.
(ARKit project)
https://drive.google.com/file/d/1TEPzBMTz83PdV_XrwjzKXRVnIQ4Xov72/view?usp=sharing
(SceneKit project)
https://drive.google.com/file/d/1lGButxh9SfHg-0akniqKDKlL5qMCrk1P/view?usp=sharing

Animate an SKSpriteNode with textures that have a size different from the original

I want to animate an SKSpriteNode using textures from an SKTextureAtlas, using SKAction.animateWithTextures(textures, timePerFrame, resize, restore). However, the textures in the atlas have a size that is slightly larger than the original texture (it's basically a character moving). When the action is run, the textures are either compressed to fit the original size of the sprite, or recentered when I set resize to false, which changes the position of the character. What I want, though, is for the textures to be anchored at the lower-left corner (or lower-right, depending on the direction) so that the position of the character doesn't change apart from the extra part of the texture.
I've tried changing the anchor point of the sprite prior to running the action, but obviously that applies to the original texture as well. Also, I guess changing the size of the original texture would have an impact on the physics behaviour, which I want to avoid.
Does anyone have a suggestion about how to do this?
Thanks!
David
This would work:
1. Edit all the textures to match the size of the largest texture: give the smaller textures some padding, using an alpha channel so the added background is transparent.
(Example image from CartoonSmart.com, showing a first texture with lots of transparent negative space.)
2. Create the physics body with a certain size in mind. For example, you can load the texture without the padding and get its size, then position the body as needed on the new, padded texture. After you create the sprite as normal with the resized textures, you can then:
// Load a texture without padding to act as a size template.
let imageTextureSizeTemplate = SKTexture(imageNamed: textureWithoutPadding)
let bodySize = imageTextureSizeTemplate.size()
// Offset (in points, from the node's origin) that positions the body over the unpadded region; adjust for your padding.
let bodyCenter = CGPoint(x: 0.5, y: 0.5)
// Create the physics body.
let body = SKPhysicsBody(rectangleOf: bodySize, center: bodyCenter)
self.physicsBody = body
3. Set resize: false when you animate the textures.
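Putting it together, a sketch of the animation call with resizing disabled (the atlas and frame names are placeholders):
let atlas = SKTextureAtlas(named: "character")
let frames = ["walk1", "walk2", "walk3"].map { atlas.textureNamed($0) }
// resize: false keeps the sprite's current size; restore: true puts the original texture back afterwards.
let walk = SKAction.animate(with: frames, timePerFrame: 0.1, resize: false, restore: true)
sprite.run(SKAction.repeatForever(walk))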

iOS Sprite Kit - SKSpriteNode - blend two sprites

Actually, I'm migrating a game from another platform, and I need to generate a sprite with two images.
The first image will be something like a form, a pattern or stamp, and the second is just a rectangle that gives color to the first. If the color were plain, it would be easy: I could use sprite.color and sprite.colorBlendFactor to play with it. But there are levels where the second image is a rectangle with two colors (red and green, for example).
Is there any way to implement these with Sprite Kit?
I mean, something like using Core image filter, and CIBlendWithAlphaMask, but only with Image and Mask image. (https://developer.apple.com/library/ios/documentation/graphicsimaging/Reference/CoreImageFilterReference/Reference/reference.html#//apple_ref/doc/uid/TP40004346) -> CIBlendWithAlphaMask.
Thanks.
Look into the SKCropNode class; it allows you to set a mask for the nodes underneath it.
In short, you would create two SKSpriteNodes - one with your stamp, the other with your coloured rectangle. Then:
SKCropNode *myCropNode = [SKCropNode node];
[myCropNode addChild:colouredRectangle]; // the colour to be rendered by the form/pattern
myCropNode.maskNode = stampNode; // the pattern sprite node
[self addChild:myCropNode];
Note that the results will probably be closer to CIBlendWithMask than to CIBlendWithAlphaMask: the crop node masks out any pixel whose alpha is below 5% and fully renders every pixel above that level, so the edges will be jagged rather than smoothly faded. Just don't use any semi-transparent areas in your mask and you'll be fine.
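The same setup in Swift, if you are not working in Objective-C (node names as above):
let myCropNode = SKCropNode()
myCropNode.addChild(colouredRectangle) // the colour to be rendered by the form/pattern
myCropNode.maskNode = stampNode        // the pattern sprite node
addChild(myCropNode)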

Change texture opacity in OpenGL

This is hopefully a simple question: I have an OpenGL texture and would like to be able to change its opacity, how do I do that? The texture already has an alpha channel and blending works fine, but I want to be able to decrease the opacity of the whole texture, to fade it into the background. I have fiddled with glBlendFunc, but with no luck – it seems that I would need something like GL_SRC_ALPHA_MINUS_CONSTANT, which is not available. I am working on iPhone, with OpenGL ES.
I have no idea about OpenGL ES, but in standard OpenGL you would set the opacity by declaring a colour for the texture before you use it:
// R, G, B, A
glColor4f(1.0, 1.0, 1.0, 0.5);
The example would give you 50% alpha without affecting the colour of your texture. By adjusting the other values you can shift the texture colour too.
Use a texture combiner. Set the texture stage to do a GL_MODULATE operation between a texture and constant color. Then change the constant color from your code (glTexEnv, GL_TEXTURE_ENV_COLOR).
This should come as "free" in terms of performance. On most (if not all) graphics chips combiner operations take the same number of GPU cycles (usually 1), so just using a texture versus doing a modulate operation (or any other operation) is exactly the same cost.
Basically you have two options: use glTexEnv for your texture with GL_MODULATE and specify the color using glColor4*, with a non-opaque alpha value. Note that glTexEnv only needs to be issued once, when you first load the texture. This scenario will not work if you specify colors in your vertex attributes, though; those override any glColor4* color you set. In that case you can resort to one of two options: texture combiners (an advanced topic, not pleasant to use in the fixed pipeline), or "manually" changing the color attribute of each individual vertex (undesirable for larger meshes).
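A minimal sketch of the fixed-function approach, written in Swift (assuming an OpenGL ES 1.1 context with the texture already bound; the 0.5 opacity is a placeholder):
import OpenGLES.ES1

// One-time setup: multiply the texture by the current color.
glTexEnvi(GLenum(GL_TEXTURE_ENV), GLenum(GL_TEXTURE_ENV_MODE), GL_MODULATE)

// Per frame, before drawing the textured quad:
glEnable(GLenum(GL_BLEND))
glBlendFunc(GLenum(GL_SRC_ALPHA), GLenum(GL_ONE_MINUS_SRC_ALPHA))
glColor4f(1.0, 1.0, 1.0, 0.5) // 50% opacity without tinting the texture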
If you are using modern OpenGL, you can do this in the fragment shader:
in vec2 TexCoord;
out vec4 color;
uniform sampler2D u_Texture;
uniform float OPACITY;
void main()
{
    color = vec4(1.0, 1.0, 1.0, OPACITY) * texture(u_Texture, TexCoord);
}
This allows you to apply an opacity value to a texture without disrupting the blending.
Thank you all for the ideas. I've played with both glColor4f and glTexEnv, and at last forced myself to read the glTexEnv man page carefully. It says that in the GL_MODULATE texturing mode the resulting color is computed by multiplying the incoming fragment color by the texture color (C = Cf × Ct), and the same goes for alpha. I tried glColor4f(1, 1, 1, opacity) and that did not work, but passing the desired opacity into all four arguments of the call did the trick. (Still not sure why, though.)
The most straightforward way is to change the texture's alpha values on the fly. Since you hand the bitmap to OpenGL at some point, you have it in memory, so you can modify the alpha and rebind the texture to the same texture id. In case you no longer have it in memory (due to space constraints, since you are on ES), you can read the texture back into a buffer with glGetTexImage() on desktop OpenGL; note that OpenGL ES does not provide that call. That's the clean solution.
Saving and retrieving operations are a bit costly, though, so you might want another solution. Thinking about it, you might be able to work with geometry behind the geometry displaying your texture, or simply work on the material/color of the geometry that holds the texture. You will probably want some additive blending of the back geometry. Using a glBlendFunc of
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_DST_ALPHA),
you might be able to "easily", and more importantly cheaply, achieve the desired effect.
Most likely you are using Core Graphics (CG) to get your image into a texture. When you load an image through Core Graphics, the alpha is premultiplied into the color channels, which is why you have to pass the alpha into all four RGBA arguments of glColor4f.
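In code, that premultiplied-alpha fade looks like this (a minimal sketch, assuming the fixed-function GL_MODULATE setup shown above):
let opacity: GLfloat = 0.5
// Premultiplied alpha: scale the RGB channels together with A.
glColor4f(opacity, opacity, opacity, opacity)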
I suspect that you had a black background, so by decreasing the amount of every color channel you were effectively fading the color to black.