How to paint/erase an SKSpriteNode - sprite-kit

I'm a little bit stuck on how to paint/draw an alpha-channel-style erasing effect onto an SKSpriteNode. I've started off by setting up the images I need (P.S. if there is another way to do this in Sprite Kit, I'd love to know how to paint the masks):
1) The hidden picture - SKSpriteNode *hiddenimageNode
2) The overlay that gets scratched away - SKSpriteNode *myOverlay
3) And finally a mask node built from:
UIImage *image;
SKTexture *maskTexture = [SKTexture textureWithImage:image];
maskNode = [SKSpriteNode spriteNodeWithTexture:maskTexture];
All of these are placed inside an SKCropNode created with [SKCropNode node].
This works, but it behaves more like a static image - a circle that moves to the touch location - which is not quite the look I'm after; I'm hoping to be able to scratch away the entire overlay.
Pictures: dragging a finger from pos1 to pos2 while erasing the purple layer to reveal a smiley face.
Is there a way to make it look like I'm erasing the sprite?
- a newbie coder
Update:
Since then I have tried using this code:
https://github.com/oyiptong/CGScratch
I added it to my SKScene by creating a subview and placing the UIView (the ScratchView) into it. The erasing works, but it is not happening where the touches occur - any ideas why this might be?

If you are doing this in iOS 8, your best bet is to use SKSpriteNodes as your masking nodes; there is a weird bug with other kinds of nodes that causes distortion.
If you are doing this with iOS 9+, SKShapeNodes are fixed.
I am going to explain the concept for iOS 9. Getting this to work in iOS 8 is a real pain, since subtract blending does not subtract alpha there.
Now, a mask node only gives you two options per pixel - drawn or not drawn - based on the alpha level of the pixels in your mask image. So what you want to do is incorporate subtractive alpha blending to create your desired effect.
let bigcircle = SKShapeNode(circleOfRadius: 80)
bigcircle.fillColor = .whiteColor()
let littlecircle = SKShapeNode(circleOfRadius: 40)
littlecircle.position = CGPoint(x: 10, y: 10)
littlecircle.fillColor = .whiteColor()
littlecircle.blendMode = .Subtract
bigcircle.addChild(littlecircle)
cropNode.maskNode = bigcircle
What this code does is make a big white circle with a radius of 80 points and draw a smaller white circle of radius 40 inside it. Since we are using subtract blending, the new colour is subtracted from the old (in our case white(1,1,1,1) - white(1,1,1,1) = transparent(0,0,0,0)), giving us a nice hole in our mask that ends up being cropped out of the layer over your smiley face.
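Building on that, here is a minimal sketch of the scratch-to-reveal version the question asks for (untested; the class name, the image names "smiley" and "purpleOverlay", and the 20-point brush radius are placeholders, and it assumes iOS 9+ as described above). It is written in current Swift syntax rather than the Swift 2 style used above:

import SpriteKit

class ScratchScene: SKScene {
    let cropNode = SKCropNode()

    override func didMove(to view: SKView) {
        let hidden = SKSpriteNode(imageNamed: "smiley")          // the hidden picture
        let overlay = SKSpriteNode(imageNamed: "purpleOverlay")  // the layer to scratch away
        addChild(hidden)

        // Start with a fully opaque white mask the size of the overlay,
        // so the whole overlay is visible at first.
        cropNode.maskNode = SKSpriteNode(color: .white, size: overlay.size)
        cropNode.addChild(overlay)
        addChild(cropNode)
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, let mask = cropNode.maskNode else { return }

        // The mask sprite sits at (0,0) inside the crop node, so a location in the
        // crop node's coordinates can be used directly as the hole's position.
        // A white circle drawn with .subtract blending punches a transparent
        // hole into the mask, which the crop node then removes from the overlay.
        let hole = SKShapeNode(circleOfRadius: 20)
        hole.position = touch.location(in: cropNode)
        hole.fillColor = .white
        hole.strokeColor = .white
        hole.blendMode = .subtract
        mask.addChild(hole)
    }
}

If the number of accumulated hole nodes ever becomes a concern, the mask could periodically be flattened back into a single texture with SKView's texture(from:), but that is an optimisation detail.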

Related

Creating a physicsBody from a texture out of a texture atlas results in weird proportions

I use physicsBodies that are made from an image. I put an image file in Assets.xcassets, create the spriteNode using the image name as the imageNamed argument, then use the spriteNode's texture as the texture argument of the physicsBody and the same size value from the spriteNode as the size argument. This works like a charm below 13.0 (13.0 has some issues with this). However, if I try the same thing with the same image files but in a texture atlas, the spriteNode still looks perfectly fine while the physicsBody is massively out of proportion. I could get them to line up by dividing the width and height of the physicsBody by different values, but the fact that I'd have to brute-force something means I've probably done something wrong.
Image of the result when using the sprite atlas (keep in mind the wings aren't part of the spriteNode, and the rectangle around the eagle is a different body).
How I create physicsBodies:
func createPigeon(size: CGSize, position: CGPoint) {
    // Without a texture atlas:
    sprite = SKSpriteNode(imageNamed: "pigeonHitBox")
    // With a texture atlas:
    sprite = SKSpriteNode(texture: gameScene.spriteAtlas.textureNamed("pigeonHitBox"))
    sprite!.size = size
    sprite!.position = position
    sprite!.name = "Pigeon"
    sprite!.physicsBody = SKPhysicsBody(texture: sprite!.texture!, size: size)
    sprite!.zPosition = 1
    sprite!.physicsBody!.isDynamic = true
    defaultCollisions()
    gameScene.addChild(sprite!)
}
So I figured out that the problem was due to the way SpriteKit trims away transparent (alpha) areas and overlaps images in order to squeeze more of them into one atlas. The only fix I could find was putting the images I use for hitboxes into Assets.xcassets and making sure the animation action had resize set to false.
I get why this is done, but it has meant I can't put a good 13 images inside a texture atlas; there should be an option not to trim the alpha.
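A minimal sketch of that workaround, assuming the hitbox image "pigeonHitBox" lives in Assets.xcassets while the animation frames stay in the atlas (the frame names "pigeonFly1"..."pigeonFly3" are made up for the example):

import SpriteKit

func createPigeon(in scene: SKScene, atlas: SKTextureAtlas, size: CGSize, position: CGPoint) {
    // Texture from Assets.xcassets, so it is not trimmed/packed by the atlas.
    let sprite = SKSpriteNode(imageNamed: "pigeonHitBox")
    sprite.size = size
    sprite.position = position

    // Build the physics body from the untrimmed texture so its
    // proportions match the visible sprite.
    sprite.physicsBody = SKPhysicsBody(texture: sprite.texture!, size: size)
    sprite.physicsBody?.isDynamic = true

    // Animation frames can still come from the atlas; resize: false keeps the
    // sprite at the size set above instead of snapping to each frame's
    // (possibly trimmed) texture size.
    let frames = ["pigeonFly1", "pigeonFly2", "pigeonFly3"].map { atlas.textureNamed($0) }
    let fly = SKAction.animate(with: frames, timePerFrame: 0.1, resize: false, restore: false)
    sprite.run(SKAction.repeatForever(fly))

    scene.addChild(sprite)
}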

Detect touches on SKSpriteNode with oddly shaped image(s) running animations

Using SpriteKit for an iOS 9.0 app/game (and no physics body).
I have a few SpriteNodes in a scene. The SpriteNodes have oddly shaped images and run through multiple frames of animation.
What is the best way to detect touches on a SpriteNode's image content, but not on the transparent areas of the image rectangle?
I see a lot of posts on using SKCropNode / a mask image. In my scenario, I have multiple images/frames for each SpriteNode for the animation.
Please advise on an approach or point me in the right direction. Thanks.
Based on additional info from comments above:
With only 16 shapes, from the 16 frames, one of the more efficient ways would be to draw primitive shapes that roughly approximate the outlines of your character at each frame, and convert these to CGPaths. These can then be swapped in and out for each frame or (better) plucked out appropriately when there's a touch for testing based on the frame currently being shown at time of touch.
CGPaths are very lightweight structs, so there shouldn't be any performance problems with either approach.
CGPathContainsPoint is the old name of this test, which is now a modernised API for Swift 3 and onwards:
Troubles using CGPathContainsPoint SWIFT
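A rough sketch of what the per-frame path test could look like in Swift 3+ (the outline paths and the currentFrame index are placeholders; CGPath.contains(_:) replaces the old CGPathContainsPoint):

import SpriteKit

// One rough outline path per animation frame, in the sprite's own coordinates.
// These paths are placeholders - build them from your real character outlines.
let framePaths: [CGPath] = [
    CGPath(ellipseIn: CGRect(x: -40, y: -40, width: 80, height: 80), transform: nil),
    CGPath(rect: CGRect(x: -30, y: -50, width: 60, height: 100), transform: nil)
]

func spriteWasHit(_ sprite: SKSpriteNode, currentFrame: Int, touchLocationInScene: CGPoint) -> Bool {
    // Convert the touch into the sprite's coordinate space, then test it against
    // the outline path for whichever frame is showing at the moment of the touch.
    guard let scene = sprite.scene else { return false }
    let local = sprite.convert(touchLocationInScene, from: scene)
    return framePaths[currentFrame].contains(local)
}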
There is an app called PaintCode, that's about $100 USD, which translates vectors into CGPaths, amongst other things, so you could use this to import your shapes. You can copy/paste vectors into it, which I suggest, because you might want to draw in a friendly drawing program. PaintCode doesn't have the world's best drawing experience.
Additionally, here's a free, online tool for doing the creation of a polygon path from textures. Probably more painful than using a drawing app and PaintCode, but it's FREE!
Alternative: Physics Bodies from Texture, and Contact with Touch
You can automagically create physics body shapes from a texture, like so (adapted from the docs on this page):
let texturedSpaceShip = SKSpriteNode(texture: spaceShipTexture)
texturedSpaceShip.physicsBody = SKPhysicsBody(texture: spaceShipTexture,
                                              size: CGSize(width: texturedSpaceShip.size.width,
                                                           height: texturedSpaceShip.size.height))
There have been reports of problems with determining if a point is within a given physics body, so it might be better to create a small physics object, place it where the touch is, and then determine if it's in contact with the physics body shape appropriate for the current frame of the current game entity.
Caveat:
There's no telling (nor way to control, that I know of) how complex the resultant CGPaths are for these automagically created physics shapes.
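As a lighter-weight variant of the "small physics object at the touch point" idea, the physics world can simply be queried with a tiny rectangle around the touch. This is a sketch only, and it assumes the body appropriate for the current frame is already attached to the sprite:

import SpriteKit

// Any body overlapping a small rect around the touch counts as touched.
func touchHits(_ sprite: SKSpriteNode, at sceneLocation: CGPoint, in scene: SKScene) -> Bool {
    let probeRect = CGRect(x: sceneLocation.x - 2, y: sceneLocation.y - 2, width: 4, height: 4)
    var hit = false
    scene.physicsWorld.enumerateBodies(in: probeRect) { body, stop in
        if body === sprite.physicsBody {
            hit = true
            stop.pointee = true  // stop enumerating once our sprite's body is found
        }
    }
    return hit
}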
I personally would just check the pixel of the current texture to see if it is transparent or not. You can do this to get the pixel data:
(Note this code is hypothetical to get you started, I will try to work out a real example later on for you if you can't figure it out.)
(Also note, if you scale sprites, you need to handle that with converting touch location to texture)
let image = sprite.texture!.cgImage()
if let dataProvider = image.dataProvider, let cfData = dataProvider.data
{
    let data = cfData as Data
    let screenScale = UIScreen.main.scale // let's handle retina graphics (may need work)
    // touchedLocation is the location on the sprite with a bottom-left origin,
    // so touching the bottom-left corner of the sprite should give (0,0).
    // CGImage rows run top to bottom, so flip y.
    let x = Int(touchedLocation.x * screenScale)
    let y = image.height - 1 - Int(touchedLocation.y * screenScale)
    let bpp = image.bitsPerPixel / 8 // usually 4 bytes per pixel
    let alphaChannel = 3             // alpha is usually the highest byte
    let index = y * image.bytesPerRow + x * bpp + alphaChannel
    let alpha = data[index]
    print("Alpha: \(alpha)")
}
else
{
    // we have no data
}
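For completeness, touchedLocation in the snippet above could be derived from a touch roughly like this (again a sketch: sprite is assumed to be a non-optional property of the scene, unscaled, with the default (0.5, 0.5) anchor point):

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first else { return }

    // Location in the sprite's coordinate space; the origin sits at the
    // sprite's position, so shift by anchorPoint * size to make (0,0)
    // the bottom-left corner of the texture, as the snippet above expects.
    let p = touch.location(in: sprite)
    let touchedLocation = CGPoint(x: p.x + sprite.size.width * sprite.anchorPoint.x,
                                  y: p.y + sprite.size.height * sprite.anchorPoint.y)
    // ...feed touchedLocation into the alpha check above...
}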

Changing parent node in SpriteKit

I'm new to SpriteKit. When I drag and drop a "Color Sprite" from the object library to make a new node, it gets the correct size (as you can see in the image below). Here the parent is the main scene, not the blue one.
But when I change the parent node from the main scene to any other node (like the blue one in my case), the new node appears larger even though its width and height remain the same (shown below).
EDIT:
I'm running Xcode 7 beta 3.
I'd suggest doing everything programmatically in Sprite Kit. I don't fully understand what is going on there, but if you want to make a shape like this, you should use SKShapeNode. SKSpriteNode is generally for making a node with a texture and animating it, etc.
//for example
let node = SKShapeNode(rectOfSize:CGSize(width:100,height:100))
node.position = CGPoint(x:500,y:350)
node.fillColor = UIColor.redColor()
self.addChild(node)
If you still have problems with the size, you can play with:
// for example;
node.xScale = 469
node.yScale = 300

iOS Sprite Kit - SKSpriteNode - blend two sprites

Actually, I'm migrating a game from another platform, and I need to generate a sprite from two images.
The first image is something like the form, a pattern or stamp, and the second is just a rectangle that gives colour to the first. If the colour were plain, it would be easy - I could use sprite.color and sprite.colorBlendFactor to play with it - but there are levels where the second image is a rectangle with two colours (red and green, for example).
Is there any way to implement these with Sprite Kit?
I mean something like using a Core Image filter, CIBlendWithAlphaMask, but with only an image and a mask image. (https://developer.apple.com/library/ios/documentation/graphicsimaging/Reference/CoreImageFilterReference/Reference/reference.html#//apple_ref/doc/uid/TP40004346) -> CIBlendWithAlphaMask.
Thanks.
Look into the SKCropNode class (documentation here) - it allows you to set a mask for an image underneath it.
In short, you would create two SKSpriteNodes - one with your stamp, the other with your coloured rectangle. Then:
SKCropNode *myCropNode = [SKCropNode node];
[myCropNode addChild:colouredRectangle]; // the colour to be rendered by the form/pattern
myCropNode.maskNode = stampNode; // the pattern sprite node
[self addChild:myCropNode];
Note that the results will probably be more similar to CIBlendWithMask than CIBlendWithAlphaMask, since the crop node will mask out any pixels with less than 5% alpha and render all pixels above that level, so the edges will be jagged rather than smoothly faded. Just don't use any semi-transparent areas in your mask and you'll be fine.

SKScene scale + anchorPoint = strange behavior

I have an empty SKScene which needs to be horizontally centred, so I set its anchorPoint:
self.anchorPoint = CGPointMake(0.5f, 0);
I also need to scale it, per SpriteKit - Set Scale and Physics, so I run an action on it like this:
[self runAction:[SKAction scaleTo:0.5f duration:0.0f]];
My problem is that when I detect touches, I get locations that seem strangely off.
I detect touches using the usual (locationInNode:self) method, and I'm currently adding a little blue square where the touches are. When I touch the 4 corners of my device, I see a frame that is a quarter of my screen (correctly), but it is shifted to the left by a seemingly arbitrary amount.
Here are some things I've already checked:
the scene is initialized in viewWillLayoutSubviews, so I know it has the correct initial dimensions
the scene's scaleMode is set to SKSceneScaleModeAspectFill, but I've tried all of the modes to no avail
I was struggling with the same issue for a while, but I think I finally figured it out. You're not supposed to scale the scene (as it hints if you try setScale); you're supposed to resize it.
Try this:
myScene.scaleMode = SKSceneScaleModeAspectFill;
And then while zooming:
myScene.size = CGSizeMake(newX, newY);
Set the anchor point as:
self.anchorPoint = CGPointMake(0.5f, 0);
And set the scene's scale mode to aspect fit, not aspect fill:
SKSceneScaleModeAspectFit
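A small sketch of the resize-instead-of-scale approach in Swift (the zoom handling and base size are placeholders; the first answer uses aspect fill, the second prefers aspect fit):

import SpriteKit

class ZoomableScene: SKScene {
    // Whatever size the scene was created with.
    private lazy var baseSize: CGSize = self.size

    override func didMove(to view: SKView) {
        anchorPoint = CGPoint(x: 0.5, y: 0)  // horizontally centred, bottom-anchored
        scaleMode = .aspectFill              // or .aspectFit, per the second answer
    }

    func setZoom(_ zoom: CGFloat) {
        // Resizing the scene instead of scaling it keeps
        // touch.location(in: self) consistent with what is drawn on screen.
        size = CGSize(width: baseSize.width / zoom, height: baseSize.height / zoom)
    }
}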