What is the significance of the anchorPoint property in SpriteKit?
I used it to position a specific node above the lower border of the screen. However, is that the only way to make sure that objects don't cross the lower border of the screen in a world where gravity is enabled?
What should the anchor point be set to to ensure that a specific node doesn't go out of horizontal/vertical bounds in case an impulse is applied?
Check out Apple's docs on anchorPoint. Basically, the anchorPoint defines the center of an object: when you set the position of a node, you are setting the position of the node's anchorPoint in the scene. From there, the anchorPoint tells the node (to use Apple's spaceship example) how the image should be shifted so that the anchorPoint sits wherever you defined it. The anchorPoint lives in a normalized 1.0 by 1.0 coordinate space, with (0.0, 0.0) being the bottom left of the sprite and (1.0, 1.0) being the top right, and it scales to however large your sprite is. When you adjust the zRotation of your sprite, it rotates around the anchorPoint.
So if, say, I have a sprite with an anchorPoint of (0.5, 1.0) and I set its position to the exact center of the screen, the sprite will actually hang down, since the anchorPoint has been moved up to the top of the node.
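The relationship between position, anchorPoint, and where the sprite's image ends up can be sketched as simple arithmetic. This is an illustrative Python sketch of the geometry (the function name is made up, not SpriteKit API):

```python
def frame_origin(position, anchor_point, size):
    """Bottom-left corner of a sprite's frame, given its position, its
    normalized anchor point, and its size (bottom-left-origin coordinates,
    as in SpriteKit)."""
    px, py = position
    ax, ay = anchor_point
    w, h = size
    # The anchor point is pinned to `position`; the image extends
    # around it according to the normalized anchor.
    return (px - ax * w, py - ay * h)

# Default anchor (0.5, 0.5): a 100x100 sprite placed at (200, 200)
# has its bottom-left corner at (150, 150) -- centered on its position.
print(frame_origin((200, 200), (0.5, 0.5), (100, 100)))

# Anchor (0.5, 1.0): the same sprite "hangs down" from its position;
# its bottom-left corner drops to (150, 100).
print(frame_origin((200, 200), (0.5, 1.0), (100, 100)))
```

The same formula explains rotation behavior: since the anchor point is the one spot pinned to `position`, it is also the pivot when zRotation changes.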
Anyway, it should not have an effect on physics. I would recommend using position to set the position of the node instead. If you want to make sure a physics body won't pass through a wall or another physics body, set usesPreciseCollisionDetection = YES (or your language's equivalent of true) on your node's physicsBody.
I understand the following.
1) ViewPort: It is like a window through which one can see the outside world. Basically, it is a 2D plane onto which 3D objects get projected.
2) Field Of View (FOV): FOV is a cone projection from the eye pupil or virtual camera. One can see everything inside that cone projection without turning the head.
Monocular FOV is what a single eye can see
Binocular FOV is what both eyes put together can see clearly
Peripheral FOV is the region outside the binocular FOV but within monocular FOV.
My question is,
1) What is ViewPort and FOV in a virtual reality (VR) Headset device? What is the difference between them in that context?
ViewPort is what you describe. It's usually defined as bounds between 0.0 and 1.0 for both width and height, where 0.0 is the left-most edge of the screen. Less intuitively, 0.0 is also the bottom of the screen (though this may vary between systems). Sometimes it's defined in screen pixels instead. Also, rather than absolute bounds, the second pair of parameters may be defined as an extent instead.
Example:
Left viewport definition, bounds (0.0, 0.0, 0.5, 1.0)
Means going from the left/bottom corner to center/top.
Right viewport definition, bounds (0.5, 0.0, 1.0, 1.0)
Means going from center/bottom to right/top.
Right viewport definition, extent (0.5, 0.0, 0.5, 1.0)
Means going from center/bottom and extending half a screen horizontally, a full screen vertically.
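The two conventions carry the same information and converting between them is trivial. A minimal sketch in Python (names are illustrative, not any particular graphics API):

```python
def bounds_to_extent(bounds):
    """Convert a viewport given as absolute bounds (x_min, y_min, x_max, y_max)
    to the extent form (x, y, width, height)."""
    x0, y0, x1, y1 = bounds
    return (x0, y0, x1 - x0, y1 - y0)

# The left-eye viewport happens to look the same in both forms,
# because it starts at the origin.
left = bounds_to_extent((0.0, 0.0, 0.5, 1.0))    # (0.0, 0.0, 0.5, 1.0)

# The right-eye viewport does not: bounds (0.5, 0.0, 1.0, 1.0)
# becomes extent (0.5, 0.0, 0.5, 1.0).
right = bounds_to_extent((0.5, 0.0, 1.0, 1.0))
print(left, right)
```

This is why the two "right viewport" definitions above describe the same region even though the numbers differ in the third slot.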
FOV is usually mentioned as the total field the user can see (counting both viewports in the case of VR). It's usually measured as an angle in degrees.
I have some simple code that adds a block sprite at the leftmost part of a tile like this:
block.position = CGPointMake((-self.size.width * 0.5), 0);
[self addChild:block];
The anchor point is in the middle of the block sprite, and self refers to the actual tile.
This works fine and indeed adds the block on the left side of the tile sprite.
Now I also have a player sprite that can collide with that block if it tries to go through it. That also works just fine.
The problem happened when I tried to get the block sprite to show in the exact same spot using another anchor point (I need a new anchor point for a shrink effect I wanted to create, which appears to work fine, by the way).
The new code becomes :
block.position = CGPointMake(-(self.size.width * 0.5), -(self.size.width * 0.5));
block.anchorPoint = CGPointZero;
[self addChild:block];
The new block appears in a position similar to the first case (though not totally identical).
I am not sure why the position is not identical, but I can fix that by adding/subtracting 1 or 2 from the x/y points.
The weird problem is that if my player sprite now tries to go below that block onto the tile below (which is an open tile without any blocks), I get a contact between the player and the block.
I have even added debug paths with SKShapeNode to make sure that the player and block sprites do not actually collide. And they don't! But I still get a collision event.
The player scale is (0.8, 0.9), but I don't think this plays much of a role.
I really don't get why this could be happening. Any ideas, guys?
Changing the sprite's anchor point has no effect on the physics body.
When talking about CGRect, the rect origin is at point {0, 0}.
So what is happening is that you now have a sprite whose image is centred around anchor point {0, 0}, but with a physics body that starts at {0, 0} and is the size of the sprite, meaning it is still centred around {0.5, 0.5}.
So even though the player doesn't collide with the image of the sprite, it does collide with its physics body.
Before, the physics body had the same centre point as the sprite: the sprite's anchor point was in the middle, so the image 'fit' into the physics body.
Now the sprite's anchor point is {0, 0}, which makes the physics body's centre point coincide with the bottom-left point of the sprite's image.
To resolve this, you need to offset your physics body so it is centred around the new anchor point.
For example:
CGPoint newCenter = CGPointMake(block.size.width / 2, block.size.height / 2);
block.physicsBody = [SKPhysicsBody bodyWithRectangleOfSize:block.size center:newCenter];
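The geometry of the required offset can be sketched as plain arithmetic. This is an illustrative Python sketch assuming a rectangular body the size of the sprite (the function name is made up, not SpriteKit API):

```python
def body_center_for_anchor(size, anchor_point):
    """Center, relative to the node's position, at which a rectangular
    physics body must sit so that it covers the sprite's image for a
    given normalized anchor point."""
    w, h = size
    ax, ay = anchor_point
    # The image spans [-ax*w, (1-ax)*w] x [-ay*h, (1-ay)*h] around the
    # node's position; the midpoint of that span is the required center.
    return ((0.5 - ax) * w, (0.5 - ay) * h)

# Default anchor (0.5, 0.5): no offset needed, body and image coincide.
print(body_center_for_anchor((40, 40), (0.5, 0.5)))   # (0.0, 0.0)

# Anchor (0.0, 0.0): the image sits up and to the right of the position,
# so the body center must move half the size in each direction.
print(body_center_for_anchor((40, 40), (0.0, 0.0)))   # (20.0, 20.0)
```

In other words, for an anchor point of {0, 0} the body's center should be at half the sprite's width and half its height, not the full width and height.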
Good luck mate.
EDIT: On second thought, I think I got confused with the offset directions.
An edit has been made to correct it.
I noticed that in Sprite Kit the coordinate system is flipped.
For example, here is a SKSpriteNode:
SKSpriteNode *car = [SKSpriteNode spriteNodeWithImageNamed:@"car"];
car.position = CGPointMake(0, 0);
[road addChild:car];
The car is positioned in the center of its parent.
When I set position to:
car.position = CGPointMake(-50, 0);
then it is positioned more to the left.
But when I want to move it down and therefore increase Y, it moves UP!
car.position = CGPointMake(-50, 20);
In UIKit increasing Y always means something moves down. It feels more logical to me. Is there a way to flip the coordinate system in Sprite Kit?
You can set your scene's yScale property to -1. Everything will render upside down, but you can also set the yScale property for each node to -1, which will cause them to render right side up.
You could also make a class category and create a property called invertedPosition with a custom getter/setter that inverts the default position property.
Sprite Kit uses a coordinate orientation that starts from the bottom left corner of the scene (0,0), and the values increase as you move to the right (increasing x) and up (increasing y).
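If you only need to translate individual coordinates rather than flip the whole scene, the conversion between the two conventions is a single subtraction. A minimal sketch, assuming you know the height of the containing view or scene (illustrative Python, not UIKit/SpriteKit API):

```python
def uikit_to_spritekit_y(y_uikit, container_height):
    """Convert a y coordinate from a top-left-origin system (UIKit, where
    increasing y moves down) to a bottom-left-origin system (SpriteKit,
    where increasing y moves up) within the same container."""
    return container_height - y_uikit

# In a 480-point-tall scene:
print(uikit_to_spritekit_y(0, 480))    # UIKit top    -> SpriteKit 480
print(uikit_to_spritekit_y(480, 480))  # UIKit bottom -> SpriteKit 0
```

Note the conversion is its own inverse: applying it twice returns the original coordinate, so the same function maps SpriteKit y back to UIKit y.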
I have a problem with the anchor point.
I have a sprite, and I need to calculate the anchor point for this sprite around the screen center.
How can I calculate the anchor point?
You can access the anchor point values using:
mySprite.anchorPoint.x
and
mySprite.anchorPoint.y
These are both floating point values so be aware of that.
EDIT
To set them you just do:
mySprite.anchorPoint = ccp(1.0f, 1.0f);
An anchor point of (1.0, 1.0) would be the upper right corner of your image, whereas the default anchor point is in the middle, (0.5, 0.5). To solidify the idea: the bottom left would be (0.0, 0.0). You can derive any other anchor point within the image from these.
I am trying to set the anchorPoint property in order to rotate a view around a well-defined axis.
But:
myView.layer.anchorPoint = CGPointMake(myView.layer.anchorPoint.x - 1.0, myView.layer.anchorPoint.y);
When I shift it by -1.0, it does not just move 1 unit to the left. Instead, my whole view moves to the right by the width of the view.
What kind of coordinate system is that? It seems inverted, and the units don't match those of, for example, myView.frame.size.width.
anchorPoint is a normalized position within your layer. That is, (0.0, 0.0) is the upper-left corner of your layer (in the iPhone's flipped UIView layer coordinate system) and (1.0, 1.0) is the lower-right corner. (0.5, 0.5) is the middle of your layer. These positions are size-independent, and thus merely relative locations.
This makes it possible to scale about that anchorPoint, which would be confusing if anchorPoint were an absolute coordinate.
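The "moves by the whole width" behavior can be sketched numerically: the layer's position stays fixed when anchorPoint changes, so the frame shifts in the opposite direction by anchorPoint delta times the bounds size. An illustrative Python sketch of the math (not Core Animation API):

```python
def layer_frame_origin(position, anchor_point, size):
    """Top-left origin of a layer's frame: the layer's position minus the
    anchor's offset within the bounds (UIKit's flipped, top-left-origin
    coordinates)."""
    px, py = position
    ax, ay = anchor_point
    w, h = size
    return (px - ax * w, py - ay * h)

# A 100x50 layer positioned at (160, 240) with the default anchor (0.5, 0.5):
before = layer_frame_origin((160, 240), (0.5, 0.5), (100, 50))   # (110.0, 215.0)

# Shift anchorPoint.x by -1.0, giving (-0.5, 0.5): position is unchanged,
# but the anchor now sits one full bounds-width to the LEFT of the layer,
# so the frame moves RIGHT by the full width (100 points).
after = layer_frame_origin((160, 240), (-0.5, 0.5), (100, 50))   # (210.0, 215.0)
print(before, after)
```

This also shows why the units feel mismatched: anchorPoint is normalized (a delta of 1.0 means "one full width/height"), while frame and position are in points.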