Changing parent node in SpriteKit - swift

I'm new to SpriteKit. When I drag and drop a "Color Sprite" from the object library to make a new node, it gets the correct size (as you can see in the image below). Here the parent is the main scene, not the blue one.
But when I change the parent node from the main scene to any other node (like the blue one in my case), the new node gets larger, even though its width and height remain the same (shown below).
EDIT:
I'm running Xcode 7 beta 3.

I prefer to do everything programmatically in SpriteKit. I'm not sure exactly what is going on there, but if you want to make a shape like this, you should use SKShapeNode. SKSpriteNode is generally for making a node from a texture, animating it, and so on.
// for example
let node = SKShapeNode(rectOfSize: CGSize(width: 100, height: 100))
node.position = CGPoint(x: 500, y: 350)
node.fillColor = UIColor.redColor()
self.addChild(node)
If you still have problems with the size, you can play with the node's scale (note that xScale and yScale are multipliers of the node's size, not point values):
// for example
node.xScale = 1.5
node.yScale = 2.0
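As for why the node appears larger after reparenting: a node is drawn in its parent's coordinate space, so any scale on the parent multiplies the child's rendered size while the child's own width and height properties stay the same. A minimal sketch (current Swift syntax, node names made up) illustrating this:
// A 200x200 sprite in an unscaled parent renders at 200x200 points.
let box = SKSpriteNode(color: .red, size: CGSize(width: 200, height: 200))

// The "blue" parent has been scaled up at some point.
let container = SKNode()
container.setScale(2.0)
addChild(container)

// box.size still reports 200x200, but on screen it now covers 400x400 points,
// because the child inherits its parent's scale.
container.addChild(box)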

Related

Creating a physicsBody from a texture out of a texture atlas results in weird proportions

I use physicsBodies that are made from an image. I put an image file in Assets.xcassets, create the spriteNode using the image name as the imageNamed argument, then use the spriteNode's texture as the texture argument of the physicsBody and the same size value from the spriteNode as the size argument. This works like a charm below iOS 13.0 (13.0 has some issues with this). However, if I try to do the same thing with the same image files but in a texture atlas, the spriteNode still looks perfectly fine but the physicsBody is massively out of proportion. I could get them to line up by dividing the width and height of the physicsBody by different values, but the fact that I'd have to brute-force something means I've probably done something wrong.
Image of the result when using the sprite atlas (keep in mind the wings aren't part of the spriteNode and the rectangle around the eagle is a different body):
How I create physicsBodies:
func createPigeon(size: CGSize, position: CGPoint) {
    // Without a texture atlas:
    sprite = SKSpriteNode(imageNamed: "pigeonHitBox")
    // With a texture atlas:
    sprite = SKSpriteNode(texture: gameScene.spriteAtlas.textureNamed("pigeonHitBox"))

    sprite!.size = size
    sprite!.position = position
    sprite!.name = "Pigeon"
    sprite!.physicsBody = SKPhysicsBody(texture: sprite!.texture!, size: size)
    sprite!.zPosition = 1
    sprite!.physicsBody!.isDynamic = true
    defaultCollisions()
    gameScene.addChild(sprite!)
}
So I figured out that the problem was due to the way SpriteKit trims and overlaps the transparent (alpha) areas of images in order to squeeze more of them into one atlas. The only fix I could find for this was putting the images I use for hitboxes into Assets.xcassets and making sure the animation action had resize set to false.
I get why this is done, but it has resulted in me not being able to put a good 13 images inside the texture atlas; there should be an option not to strip the alpha.
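Based on that workaround, the body-creation code might end up looking roughly like this; the atlas and frame names here are placeholders, and only the hitbox image lives in Assets.xcassets:
// The display texture comes from the atlas; the hitbox texture comes from Assets.xcassets,
// so its transparent border hasn't been trimmed away by atlas packing.
let atlas = SKTextureAtlas(named: "Pigeon")
let size = CGSize(width: 60, height: 40)

let sprite = SKSpriteNode(texture: atlas.textureNamed("pigeonFlap1"))
sprite.size = size
sprite.physicsBody = SKPhysicsBody(texture: SKTexture(imageNamed: "pigeonHitBox"), size: size)

// resize: false keeps the node's size (and therefore the physics body) stable
// while the flap frames animate.
let frames = ["pigeonFlap1", "pigeonFlap2"].map { atlas.textureNamed($0) }
sprite.run(.repeatForever(.animate(with: frames, timePerFrame: 0.1, resize: false, restore: false)))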

How to create a Portal effect in ARKit just using the SceneKit editor?

I would like to create a prototype like this one, just using the Xcode SceneKit editor. I found an answer where the room is created programmatically with simple SCNPlane objects, playing with the rendering order.
However, I would like to put together something more elaborate, like downloading a 3D model of a room and making it accessible only through the portal. I'm trying to achieve the same effect directly in Xcode's SceneKit editor by converting this part:
//a. Create The Left Wall And Set Its Rendering Order To 200 Meaning It Will Be Rendered After Our Masks
let leftWallNode = SCNNode()
let leftWallMainGeometry = SCNPlane(width: 0.5, height: 1)
leftWallNode.geometry = leftWallMainGeometry
leftWallMainGeometry.firstMaterial?.diffuse.contents = UIColor.red
leftWallMainGeometry.firstMaterial?.isDoubleSided = true
leftWallNode.renderingOrder = 200
//b. Create The Left Wall Mask And Set Its Rendering Order To 10 Meaning It Will Be Rendered Before Our Walls
let leftWallMaskNode = SCNNode()
let leftWallMaskGeometry = SCNPlane(width: 0.5, height: 1)
leftWallMaskNode.geometry = leftWallMaskGeometry
leftWallMaskGeometry.firstMaterial?.diffuse.contents = UIColor.blue
leftWallMaskGeometry.firstMaterial?.isDoubleSided = true
leftWallMaskGeometry.firstMaterial?.transparency = 0.0000001
leftWallMaskNode.renderingOrder = 10
leftWallMaskNode.position = SCNVector3(0, 0, 0.001)
into two planes in the editor:
I took care of setting isDoubleSided and renderingOrder for both of them, and I made the second one transparent (using the alpha on the Diffuse color).
Unfortunately, when displaying in AR mode, it doesn't work. The .scn file is available here.
The virtual world in your example is hidden behind a wall. In order to get a portal like the one in the presented movie, you need a wall with an opening (where the entrance is), not a plane blocking your 3D objects. The alpha channel of the portal's entrance should look like the right part of the following image:
Also, look at my answers to the SO posts ARKit hide objects behind walls and ARKit – Rendering a 3D object under an invisible plane to see how to set up an invisible material.
The code might be like this one:
portalPlane.geometry?.materials.first?.colorBufferWriteMask = []
portalPlane.geometry?.materials.first?.readsFromDepthBuffer = true
portalPlane.geometry?.materials.first?.writesToDepthBuffer = true
portalPlane.renderingOrder = -1
And, of course, you can set the same properties in the Material Inspector:
For the portal plane the properties are the following: Writes Depth is true, Reads Depth is true, Write to Color is empty, and Rendering Order (in the Node Inspector) is -1.
For the 3D objects inside the portal, Rendering Order (in the Node Inspector) is greater than 0.
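For reference, the programmatic equivalent of those Node Inspector settings would be roughly the following (node names are placeholders; the invisible material itself is set up as shown in the code above). Note that renderingOrder applies per node, so every geometry-bearing node inside the room needs a value greater than 0:
portalPlane.renderingOrder = -1   // the occluding wall is drawn first
roomNode.renderingOrder = 100     // the content inside the portal is drawn after the occluder,
                                  // so the wall's depth values hide it everywhere except the opening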
You can observe the hiding effect right in the Xcode viewport.
Now the hidden wall masks the bigger part of the 3D content to show the real street, and you see your 3D environment through the portal (the wrong result is on the left, the right result is on the right of this picture).
And the next picture shows what the 3D wall (in my case an extruded plane) looks like:
But for the exit of the portal you just need a 3D object like a door (not a wall opening), and this exit should look like the left side of the pictures above. The normals of the door must point inward, and the normals of the wall must point outward. The material for both objects is single-sided.
Hope this helps.

SetScale unintentionally affecting SKSpriteNode position

I'm making my first shooter game using Swift and SpriteKit, and I've recently been running into problems with setScale(). I have a Laser class whose instances are added as children of a Ship class. I now realize that any down-scaling of the parent node scales its children as well. In my update method in GameScene I check the position of each child named "laser" to determine whether it should be removed from its parent when offscreen. I recently updated some of my sprites, including their relative sizes, and noticed that the position of each laser is way off: they are removed far before they reach the end of the screen, and upon debugging their x positions are in fact far larger than where they are displayed onscreen. I also noticed that the starting x of each Laser instance is relative to its original position within the Ship instance, so the ship may be halfway across the screen but the laser's x position still starts at 0, and if the ship manages to pass the laser, the laser's position becomes negative. Is there a way to get its position on the screen rather than relative to its starting point within its parent?
This is the code I'm doing the check with:
self.aShip.gun.enumerateChildNodesWithName("laser", usingBlock: { node, _ in
    if let aLaser = node as? Laser {
        if aLaser.position.x > self.size.width {
            aLaser.removeFromParent()
        }
    }
})
It seems like scaling has a ton more baggage than I would normally assume, so any insight into this problem or how to manage the relation between code and sprites would be awesome!!
EDIT:
I found that adding let positionInScene = self.convertPoint(aLaser.position, fromNode: self.aShip.gun) under if let aLaser = node as? Laser { and then using positionInScene rather than aLaser's position works. However, I still don't fully understand the scaling effect going on with aLaser and its position, and I don't know whether it's efficient to have to convert positions like this at the rate of update (60 times a second).
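Put together, that change looks roughly like this in current Swift syntax (the Swift 2 equivalent is convertPoint(_:fromNode:), as in the EDIT above):
self.aShip.gun.enumerateChildNodes(withName: "laser") { node, _ in
    guard let aLaser = node as? Laser else { return }
    // Convert from the gun's (scaled) coordinate space into scene coordinates
    let positionInScene = self.convert(aLaser.position, from: self.aShip.gun)
    if positionInScene.x > self.size.width {
        aLaser.removeFromParent()
    }
}
The conversion is just an affine transform, so doing it 60 times a second in update should not be a performance concern.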

how to paint/erase SKSpriteNode

I'm a little bit stuck on how to paint/draw an effect like an alpha channel onto an SKSpriteNode. I've started off by setting up the two images I need (PS: if there is another way to do this in Sprite Kit, I'd love to know how to paint the masks).
1) The hidden picture - SKSpriteNode *hiddenimageNode
2) The overlay that gets scratched away - SKSpriteNode *myOverlay
3) And finally a mask node comprising:
UIImage *image;
SKTexture *maskTexture = [SKTexture textureWithImage:image];
maskNode = [SKSpriteNode spriteNodeWithTexture:maskTexture];
All of these are placed inside a crop node, [SKCropNode node].
This behaves more like a static image (a circle moving to the touch location), which is not quite what I'm after; I'm hoping to be able to scratch away the entire image.
It works fine, but it's not quite the look I'm after.
Pictures: dragging a finger from pos 1 to pos 2 while erasing the purple layer to reveal a smiley face.
Is there a way to make it look like I'm erasing the sprite?
newbie coder
//Updating project...
So since then I have tried using this code:
https://github.com/oyiptong/CGScratch
and have added it to my SKScene by creating a subview and placing the UIView (ScratchView) into it. The erasing is working, however it is not happening where the touches are occurring. Any ideas why this might be happening?
If you are doing this in iOS 8, then your best bet is to just use SKSpriteNodes as your masking nodes; there is a weird bug with other kinds of nodes that causes distortion.
If you are doing this with iOS 9+, then SKShapeNodes are fixed.
I am going to explain the concept for iOS 9. Getting this to work in iOS 8 is a real pain, since subtraction does not subtract alpha in iOS 8.
Now, for your mask nodes you only have two options for drawing, on and off, based on the alpha level of the pixels in your mask image. So what you want to do is incorporate subtractive alpha blending to create your desired effect.
let bigcircle = SKShapeNode(circleOfRadius: 80)
bigcircle.fillColor = .whiteColor()

let littlecircle = SKShapeNode(circleOfRadius: 40)
littlecircle.position = CGPoint(x: 10, y: 10)
littlecircle.fillColor = .whiteColor()
littlecircle.blendMode = .Subtract
bigcircle.addChild(littlecircle)

maskNode = bigcircle
What this code does is make a big white circle with a radius of 80 points and draw a smaller white circle with a radius of 40 points inside it. Since we are using subtraction blending, the renderer takes the new color and subtracts it from the old (in our case white(1,1,1,1) - white(1,1,1,1) = transparent(0,0,0,0)), which gives us a nice hole in our mask that ends up being cropped out of the layer over your smiley face.
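To tie that back to the crop-node setup from the question, the mask would be plugged in roughly like this (image names are placeholders):
let cropNode = SKCropNode()
cropNode.maskNode = bigcircle                                  // only pixels where the mask is opaque are drawn
cropNode.addChild(SKSpriteNode(imageNamed: "purpleOverlay"))   // the layer being scratched away

// The smiley face sits underneath the crop node, so it shows through wherever
// the subtracted circles have punched holes in the mask.
addChild(SKSpriteNode(imageNamed: "smileyFace"))
addChild(cropNode)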

SpriteKit- How to zoom-in and zoom-out of an SKScene?

If I'm making a game in SpriteKit that has a large "world", and I need the user to have the option of zooming in and out of the SKScene, how would I go about this? Or, to make things simpler: in the didMoveToView function, how can I present more of the world on the user's screen (without using world.runAction(SKAction.scaleTo(0.5)) or something)?
There's a SKCameraNode that's built specifically for this. The SKCameraNode defines the viewport into your scene. You create a camera node and assign it to the camera property of your scene.
let cameraNode = SKCameraNode()
cameraNode.position = CGPoint(x: scene.size.width / 2, y: scene.size.height / 2)
scene.addChild(cameraNode)
scene.camera = cameraNode
You can then create actions and run those actions on the camera. So to zoom in on the scene, you'd do this.
let zoomInAction = SKAction.scale(to: 0.5, duration: 1)
cameraNode.run(zoomInAction)
The camera node is basically a rectangular node in the scene that, I think, takes the proportions of the view by default (there's no size initializer), so when you make it smaller, the scene looks like it gets zoomed in. To zoom out, you'd run an action that increases the scale. Basically, imagine a rectangle over your entire scene: whatever is inside the cameraNode's rectangle is exactly what shows on your iPhone screen. You can also add move actions, sequence actions, and set timingModes on the actions, the same as you would with a normal sprite node (see the sketch after the WWDC link below).
Here's the WWDC session where an Apple engineer shows what I've just described; the cameraNode bit is around 3 minutes before the end.
https://developer.apple.com/videos/play/wwdc2015-604/
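For example, keeping the camera on a player node and then zooming out could be sketched like this (assuming a player node exists; current Swift syntax):
// Keep the camera centred on the player, then zoom out to show more of the world.
let follow = SKAction.move(to: player.position, duration: 0.3)
follow.timingMode = .easeOut

let zoomOut = SKAction.scale(to: 2.0, duration: 1)   // a scale above 1 shows more of the scene
cameraNode.run(.sequence([follow, zoomOut]))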
So, the best solution I could find goes something like this: in the didMoveToView function, create an SKSpriteNode called world and make it whatever size you want your world to be. Then write world.setScale(0.5) if you want a 50% zoom-out. However, if you have a player node that needs to stay centered on the screen, you'll need to add the following to your update function.
override func update(currentTime: CFTimeInterval) {
    // Offset the world by the player's scaled position so the player stays centered
    world.position.x = -player.position.x * 0.5
    world.position.y = -player.position.y * 0.5
}