I have an empty SKScene which needs to be horizontally centered, so I set its anchorPoint:
self.anchorPoint = CGPointMake(0.5f, 0);
I also need to scale it, per SpriteKit - Set Scale and Physics, so I run an action on it like this:
[self runAction:[SKAction scaleTo:0.5f duration:0.0f]];
My problem is that when I detect touches, I get locations that seem strangely off.
I detect touches using the usual (locationInNode:self) method, and I'm currently adding a little blue square where the touches land. But when I touch the 4 corners of my device, I see a frame that is a quarter of my screen (correct, given the 0.5 scale), shifted to the left by a seemingly arbitrary amount.
Here are some things I've already checked:
The scene is initialized in viewWillLayoutSubviews, so I know it has the correct initial dimensions.
The scene's scaleMode is set to SKSceneScaleModeAspectFill, but I've tried all of the scale modes to no avail.
I was struggling with the same issue for a while, but I think I finally got it figured out. You're not supposed to scale the scene (as it hints if you try setScale); you're supposed to resize it.
Try this:
myScene.scaleMode = SKSceneScaleModeAspectFill;
And then while zooming:
myScene.size = CGSizeMake(newX, newY);
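Here's a small Swift sketch of the same resize-instead-of-scale idea (the names applyZoom, zoomFactor, and baseSize are mine, not from the answer); growing the scene's size shows more of the world, shrinking it shows less:

import SpriteKit

// Resizes the scene to simulate zooming; a factor > 1 zooms out, < 1 zooms in.
func applyZoom(_ zoomFactor: CGFloat, to scene: SKScene, baseSize: CGSize) {
    scene.scaleMode = .aspectFill
    scene.size = CGSize(width: baseSize.width * zoomFactor,
                        height: baseSize.height * zoomFactor)
}

// e.g. applyZoom(2.0, to: myScene, baseSize: CGSize(width: 1024, height: 768)) // show twice as much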
Set the anchor point as
self.anchorPoint = CGPointMake(0.5f, 0);
And set the scene's scale mode to aspect fit, not aspect fill:
SKSceneScaleModeAspectFit
I am creating my first game in SpriteKit. I followed a few tutorials, so I am able to make the game work at a single size.
The game has objects spawn from the top of the screen and fall towards the player, which is at the bottom.
The issue I am having is that both the player's and the objects' coordinates are relative to the scene, which by default is the iPad Pro 9.7" size.
Now when I run the game on an iPhone 8, objects are spawned outside the view as well, and the player can also move a bit past the left/right edges (I am using aspectFill as the scaling mode, so the sides get cut off).
What is the proper way to position both the player and objects as children of the current view, so that they are properly scaled and their coordinates are relative to it?
I was simply using this to give the player a starting position (that's a placeholder of course):
player = Player(color: .red, size: CGSize(width: 40, height: 80))
player.position = CGPoint(x: frame.midX, y: frame.size.height/5)
player.zPosition = ZPosition.player
addChild(player)
Edit: to clarify, the problem is that midX is 384, but the midX of the view is 160 on an iPhone 8, hence the coordinate mismatch.
Here is an image to clarify https://imgur.com/a/0I8i5CX
The player is only supposed to spawn in the center, so that is not a concern; the problem is that the touch coordinate system is different from the player's coordinate system, making the player impossible to tap.
Also, the center of my GameScene.sks is (0, 0).
You need to factor in the aspect ratio when you are developing in SpriteKit. If you are truly using a "single size," then no matter what device you are on, your center should be the same: (0, 0).
From what you are telling me with
the problem is that midX is 384, but the midX of the view is 160 on an iPhone 8
it sounds like you are reading from your view, not your scene.
This is bad, because your view and your scene are going to be two different sizes.
The first thing you need to do is define your playing area. If you want your paddle to hit the sides on all devices, then you need to develop at an 18:39 aspect ratio (or 9:16 if you plan on using the safe areas on the iPhone X).
This means that on iPads the object will be cut off at the top of the screen, because the clipping will happen at the top of the screen instead of the sides.
Now if you want the paddles to hit the sides of the screen and the object to spawn at the top of the screen, then you will need to use the .resizeFill scale mode and normalize all of your units (usually between 0 and 1).
This means you are going to end up creating a different game experience across devices with different aspect ratios, as opposed to a different viewing experience from just clipping.
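Here is a rough Swift sketch of that normalized-units idea (the helper name point(forNormalized:) is mine, not a SpriteKit API); positions are stored as fractions of the scene size so they land in the same relative spot on any aspect ratio:

import SpriteKit

extension SKScene {
    // Converts a normalized point (0...1 on each axis) into scene coordinates,
    // taking the scene's anchorPoint into account.
    func point(forNormalized n: CGPoint) -> CGPoint {
        return CGPoint(x: (n.x - anchorPoint.x) * size.width,
                       y: (n.y - anchorPoint.y) * size.height)
    }
}

// Usage, e.g. in didMove(to:), with scene.scaleMode = .resizeFill set before presenting:
// player.position = point(forNormalized: CGPoint(x: 0.5, y: 0.2))   // bottom middle on every device
// spawnPosition = point(forNormalized: CGPoint(x: 0.5, y: 1.0))     // top middle on every device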
Please share an image so that I can understand the problem properly. For now, it looks like you are facing a constraints problem and want your player's position to always be in the middle of every screen, which is quite simple: you just have to make the x value 0.
player.position = CGPoint(x: 0, y: frame.size.height/5)
I solved it by adding margins and using them as variables:
guard let viewWidth = view?.frame.width else { return }

if viewWidth < frame.width { // scaled device
    leftMargin = (frame.width - viewWidth) / 2
    rightMargin = leftMargin - viewWidth
}
I'm a little bit stuck on how to paint/draw something like an alpha channel onto an SKSpriteNode. I've started off by setting up the two images I need (P.S. if there is another way to do this in SpriteKit, I'd love to know how to paint the masks):
1) The hidden picture: SKSpriteNode *hiddenimageNode
2) The overlay that gets scratched away: SKSpriteNode *myOverlay
3) And finally a mask node, comprising:
UIImage *image;
SKTexture *maskTexture= [SKTexture textureWithImage:image];
maskNode = [SKSpriteNode spriteNodeWithTexture:maskTexture];
All of these are placed inside a crop node created with [SKCropNode node].
This behaves more like a static image (a circle moving to the touch location), which is not quite what I'm after; I'm hoping to be able to scratch away the entire image.
It works fine, but it's not quite the look I'm after.
Pictures: dragging a finger from pos1 to pos02 while "erasing" the purple layer to reveal a smiley face.
Is there a way to make it look like I'm erasing the sprite?
newbie coder
//Updating project...
So since then I have tried using this code
https://github.com/oyiptong/CGScratch
and have added it to my SKScene by creating a subview and placing the UIView (ScratchView) into it. The erasing is working; however, it is not happening where the touches occur. Any ideas why this might be happening?
If you are doing this in iOS 8, then your best bet is to just use SKSpriteNodes as your masking nodes; there is a weird bug with other kinds of nodes that causes distortion.
If you are doing this with iOS 9+, then SKShapeNodes are fixed.
I am going to explain the concept for iOS 9. To get this to work in iOS 8 is a real pain, since subtraction does not subtract alpha in iOS 8.
Now for your mask nodes, you only have two options for drawing, on and off, based on the alpha level of the pixels in your mask image. So what you want to do is incorporate subtractive alpha blending to create your desired effect:
let bigcircle = SKShapeNode(circleOfRadius: 80)
bigcircle.fillColor = .whiteColor()
let littlecircle = SKShapeNode(circleOfRadius: 40)
littlecircle.position = CGPoint(x: 10, y: 10)
littlecircle.fillColor = .whiteColor()
littlecircle.blendMode = .Subtract
bigcircle.addChild(littlecircle)
maskNode = bigcircle
What this code is doing is making a big white circle with a radius of 80 points, and drawing a smaller white circle of radius 40 points inside of it. Since we are using subtraction blending, the new color is subtracted from the old (in our case white (1, 1, 1, 1) - white (1, 1, 1, 1) = transparent (0, 0, 0, 0)), which gives us a nice hole in our mask that ends up being cropped out of the layer over your smiley face.
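To tie this back to the scratch-off question, here is a minimal Swift sketch of how the pieces could fit together (modern Swift syntax; the node names hiddenImage, overlay, and scratchMask are my own, not from the question): each touch adds a subtract-blended circle to the mask, punching a hole in the overlay so the picture underneath shows through.

import SpriteKit

class ScratchScene: SKScene {
    let cropNode = SKCropNode()
    let scratchMask = SKSpriteNode(color: .white, size: CGSize(width: 300, height: 300))

    override func didMove(to view: SKView) {
        // The picture to reveal sits underneath the crop node.
        let hiddenImage = SKSpriteNode(imageNamed: "smiley")
        hiddenImage.position = CGPoint(x: frame.midX, y: frame.midY)
        addChild(hiddenImage)

        // The overlay lives inside the crop node; the mask decides where it stays visible.
        let overlay = SKSpriteNode(color: .purple, size: scratchMask.size)
        cropNode.addChild(overlay)
        cropNode.maskNode = scratchMask
        cropNode.position = hiddenImage.position
        addChild(cropNode)
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        // Punch a transparent hole in the mask wherever the finger moves.
        // The mask is rendered in the crop node's coordinate space and sits at (0, 0),
        // so the crop node's touch location can be used directly.
        let hole = SKShapeNode(circleOfRadius: 20)
        hole.fillColor = .white
        hole.lineWidth = 0
        hole.blendMode = .subtract
        hole.position = touch.location(in: cropNode)
        scratchMask.addChild(hole)
    }
}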
I have an SKSpriteNode that has a texture assigned like this:
node.texture = SKTexture(imageNamed: "Oval")
Users can select this object and drag it around. I use the following to identify when it is being selected in the touchesMoved function.
var touchedNode = allObjects.nodeAtPoint(location)
The problem is that almost half the surface area of this image file is transparent. However, nodeAtPoint responds to touches on the transparency.
Does anyone know of a way to ignore the transparency?
In your touchesMoved you can get the color of the pixel at the touch location and then, using an if statement, ignore the node if the color is transparent (has an alpha of 0). Check here to see how to get the color:
SpriteKit: How can I get the pixel color from a point in SKSpriteNode?
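As a rough sketch of that approach (the helper name alpha(at:in:) is mine, and it assumes an unrotated sprite whose texture fills the node): render the texture pixel under the touch into a 1x1 bitmap context and read its alpha byte.

import SpriteKit

// Returns the texture's alpha (0...1) at a point given in the sprite's own coordinate space.
func alpha(at location: CGPoint, in sprite: SKSpriteNode) -> CGFloat {
    guard let texture = sprite.texture else { return 1 }
    let image = texture.cgImage()

    // Map node coordinates (origin at anchorPoint, y up) to the image's
    // bottom-left-origin coordinate space used by Core Graphics.
    let x = (location.x / sprite.size.width + sprite.anchorPoint.x) * CGFloat(image.width)
    let y = (location.y / sprite.size.height + sprite.anchorPoint.y) * CGFloat(image.height)

    var pixel = [UInt8](repeating: 0, count: 4)
    let drawn = pixel.withUnsafeMutableBytes { buffer -> Bool in
        guard let context = CGContext(data: buffer.baseAddress, width: 1, height: 1,
                                      bitsPerComponent: 8, bytesPerRow: 4,
                                      space: CGColorSpaceCreateDeviceRGB(),
                                      bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
        else { return false }
        // Shift the image so the pixel of interest lands on the 1x1 context.
        context.translateBy(x: -x, y: -y)
        context.draw(image, in: CGRect(x: 0, y: 0,
                                       width: CGFloat(image.width),
                                       height: CGFloat(image.height)))
        return true
    }
    return drawn ? CGFloat(pixel[3]) / 255 : 1
}

// In touchesMoved, ignore hits on transparent pixels, e.g.:
// let p = touch.location(in: ovalNode)
// if alpha(at: p, in: ovalNode) > 0.05 { /* treat as a real hit */ }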
If I'm making a game in SpriteKit that has a large "world", and I need the user to have the option of zooming in and out of the SKScene, how would I go about this? Or, to make things simpler, in the didMoveToView function, how can I present more of the world to the user's device's screen (without using world.runAction(SKAction.scaleTo(0.5)) or something)?
There's an SKCameraNode that's built specifically for this. The SKCameraNode defines the viewport into your scene. You create a camera node and assign it to the camera property of your scene.
let cameraNode = SKCameraNode()
cameraNode.position = CGPoint(x: scene.size.width / 2, y: scene.size.height / 2)
scene.addChild(cameraNode)
scene.camera = cameraNode
You can then create actions and run those actions on the camera. So to zoom in on the scene, you'd do this.
let zoomInAction = SKAction.scale(to: 0.5, duration: 1)
cameraNode.run(zoomInAction)
The cameraNode is basically a rectangular node in the scene which, I think, takes the proportions of the view by default, since there's no size initializer. So when you make its scale smaller, the scene looks like it gets zoomed in; to zoom out you'd run an action that increases the scale. Basically, imagine a rectangle laid over your scene: whatever is inside the cameraNode's rectangle is exactly what shows on your iPhone screen. You can also add moveTo actions, sequence actions, and set timingModes on the actions, the same as if it were a normal sprite node.
Here's the WWDC session where the Apple engineer shows what I've just described; the camera node bit is around 3 minutes before the end:
https://developer.apple.com/videos/play/wwdc2015-604/
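For continuous zooming (rather than a one-shot action), a pinch gesture can drive the camera's scale. Here's a hedged Swift sketch (ZoomScene and handlePinch are my own names, not from the answer):

import SpriteKit

class ZoomScene: SKScene {
    let cameraNode = SKCameraNode()

    override func didMove(to view: SKView) {
        cameraNode.position = CGPoint(x: size.width / 2, y: size.height / 2)
        addChild(cameraNode)
        camera = cameraNode

        let pinch = UIPinchGestureRecognizer(target: self, action: #selector(handlePinch(_:)))
        view.addGestureRecognizer(pinch)
    }

    @objc func handlePinch(_ recognizer: UIPinchGestureRecognizer) {
        guard recognizer.state == .changed else { return }
        // A smaller camera scale shows less of the scene (zoom in), a larger one shows
        // more (zoom out), so divide by the gesture's scale and reset it each change.
        cameraNode.setScale(cameraNode.xScale / recognizer.scale)
        recognizer.scale = 1
    }
}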
So, the best solution I could find goes something like this. In the didMoveToView function, create an SKSpriteNode called world and make it whatever size you want your world to be. Then write world.setScale(0.5) if you want a 50% zoom-out. However, if you have a player node that needs to always be centered on the screen, you'll need to add the following to your update function.
override func update(currentTime: CFTimeInterval) {
    // Offset the world so the player stays centered; 0.5 is the world's scale factor.
    world.position.x = -player.position.x * 0.5
    world.position.y = -player.position.y * 0.5
}
I'm implementing a basic speedometer by rotating an image. However, when I set the initial rotation (to something like 240 degrees, converted to radians), it rotates the image but also makes it much smaller than it otherwise would be. Some values (like M_PI_4) make the image disappear entirely.
The slider goes from 0 to 360 for testing.
The following code is called in viewDidLoad and when the slider value is changed.
- (void)updatePointer
{
    double progress = testSlider.value;
    progress += pointerStart;
    // Convert degrees to radians for CGAffineTransformMakeRotation.
    CGAffineTransform rotate = CGAffineTransformMakeRotation((progress * M_PI) / 180);
    [pointerImageView setTransform:rotate];
}
EDIT: It's probably important to note that once the transform gets set the first time, the scale remains the same. So if I were to set pointerStart to 240, the image would shrink, but moving the slider wouldn't change the scale (and it would rotate as you'd expect). Replacing "progress" with 240 in the transformation does the same thing (shrinks it).
For anybody who stumbles across this question: I was able to resolve the issue. Apparently the image view is not fully loaded/measured when viewDidLoad is called, so the matrix transform applied by CGAffineTransform actually altered the size of the image. Moving the update code to viewDidAppear fixed the problem.
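As a sketch of that fix in Swift (pointerImageView and pointerStart mirror the names used in the question, and 240 is just the example value): apply the initial rotation only once layout has happened, in viewDidAppear.

import UIKit

class SpeedometerViewController: UIViewController {
    @IBOutlet weak var pointerImageView: UIImageView!
    let pointerStart: Double = 240

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // By now Auto Layout has sized the image view, so the transform no longer
        // fights the initial layout pass and the image keeps its size.
        let radians = CGFloat(pointerStart) * .pi / 180
        pointerImageView.transform = CGAffineTransform(rotationAngle: radians)
    }
}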
Take the transform state of the view which you want to rotate and then apply the rotation transform to it.
CGAffineTransform trans = pointerImageView.transform;
pointerImageView.transform = CGAffineTransformRotate(trans, 240 * M_PI / 180); // the rotation angle is in radians