Trying to get Physics Shape That Matches the Graphical Representation - swift

What I'm trying to achieve is for the node to have the same shape as its physics body/texture (the fire has a complicated shape), and to make only the fireImage touchable. So far, when I touch outside of the fireImage on the screen, it still plays the sound. It seems that I'm touching the rectangular node, but I want to respond only to touches on the sprite/texture.
Would appreciate your help.
The code is below:
import SpriteKit
import AVFoundation

private var backgroundMusicPlayer: AVAudioPlayer!

class GameScene2: SKScene {
    var wait1 = SKAction.waitForDuration(1)

    override func didMoveToView(view: SKView) {
        setUpScenery()
    }

    private func setUpScenery() {
        // I'm creating a Fire object here and trying to give its node the physicsBody/texture shape:
        let fireLayerTexture = SKTexture(imageNamed: fireImage)
        let fireLayer = SKSpriteNode(texture: fireLayerTexture)
        fireLayer.anchorPoint = CGPointMake(1, 0)
        fireLayer.position = CGPointMake(size.width, 0)
        fireLayer.zPosition = Layer.Z4st

        var firedown = SKAction.moveToY(-200, duration: 0)
        var fireup1 = SKAction.moveToY(10, duration: 0.8)
        var fireup2 = SKAction.moveToY(0, duration: 0.2)

        fireLayer.physicsBody = SKPhysicsBody(texture: fireLayerTexture, size: fireLayer.texture!.size())
        fireLayer.physicsBody!.affectedByGravity = false
        fireLayer.name = "fireNode"
        fireLayer.runAction(SKAction.sequence([firedown, wait1, fireup1, fireup2]))
        addChild(fireLayer)
    }
    // Here I'm finding the node I want to touch. I assume that it has the shape of the physicsBody/texture:
    override func touchesEnded(touches: NSSet, withEvent event: UIEvent) {
        for touch: AnyObject in touches {
            let touch = touch as! UITouch
            let touchLocation = touch.locationInNode(self)
            let node: SKNode = nodeAtPoint(touchLocation)
            if node.name == "fireNode" {
                let playguitar = SKAction.playSoundFileNamed("fire.wav", waitForCompletion: true)
                node.runAction(playguitar)
            }
        }
    }
}

Physics bodies and their shapes are for physics — that is, for simulating things like collisions between sprites in your scene.
Touch handling (i.e. nodeAtPoint) doesn't go through physics. That's probably the way it should be — if you had a sprite with a very small collision surface, you might not necessarily want it to be difficult to touch, and you also might want to be able to touch nodes that don't have physics. So your physics body doesn't affect the behavior of nodeAtPoint.
An API that lets you define a hit-testing area for a node — that's independent of the node's contents or physics — might not be a bad idea, but such a thing doesn't currently exist in SpriteKit. They do take feature requests, though.
In the meantime, fine-grained hit testing is something you'd have to do yourself. There are at least a couple of ways to do it:
If you can define the touch area as a path (CGPath or UIBezierPath/NSBezierPath), like you would for creating an SKShapeNode or a physics body with bodyWithPolygonFromPath:, you can hit-test against the path: use CGPathContainsPoint for a CGPath, or containsPoint: for a UIBezierPath/NSBezierPath (and be sure to convert the touch into the right node's local coordinate space first). A short sketch follows after these options.
If you want to hit-test against individual pixels... that would probably be really slow, but in theory the new SKMutableTexture class in iOS 8 / OS X v10.10 could help you. There's no rule saying you have to use the modifyPixelDataWithBlock: callback to actually change the texture data; you could use it once during setup to get your own copy of the texture data, then write your own hit-testing code that decides which RGBA values count as a "hit".
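Here's a minimal sketch of the first option, written in current Swift syntax (the question's code is older); fireOutlinePath is a hypothetical CGPath that traces the flame outline, built the same way you'd build the path for an SKShapeNode or a polygon body:
override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first,
          let fireLayer = childNode(withName: "fireNode") else { return }
    // Convert the touch into the fire node's own coordinate space before testing the path.
    let locationInFire = touch.location(in: fireLayer)
    if fireOutlinePath.contains(locationInFire) {
        fireLayer.run(SKAction.playSoundFileNamed("fire.wav", waitForCompletion: true))
    }
}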

Related

how to detect touch on node

I have an app that spawns a ball on the screen every second. Now I want the user to be able to touch those balls to make them disappear (removeFromParent()). As I understand it, I have to handle the touches via touchesBegan, and I do so. Here is my code:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    for touch in touches {
        let positionOfTouch = touch.location(in: self)
        enumerateChildNodes(withName: "BALL") { (node, _) in
            if positionOfTouch == node.position {
                print("just touched the ball")
            } else {
                print("error")
            }
        }
    }
}
The problem is that when I touch the screen or a ball, the console prints error instead of just touched the ball, which means my code doesn't work. Moreover, the console prints the error message once for every ball in my view. I don't really understand what I am doing wrong or how to set up this function properly.
Here is my createBall function, which uses my BallNode class (a subclass of SKShapeNode):
func createBall() {
    let ball = BallNode(radius: 65)
    print(ball.Name)
    print(ball._subName!)
    ball.position.y = ((frame.size.height) - 200)
    // give the ball a random x position along the top of the screen
    let ballXPosition = arc4random_uniform(UInt32(frame.size.width))
    ball.position.x = CGFloat(ballXPosition)
    ball.physicsBody?.categoryBitMask = PhysicsCategory.ball         // ball's category bitMask
    ball.physicsBody?.collisionBitMask = PhysicsCategory.ball        // prevent objects from intersecting
    ball.physicsBody?.contactTestBitMask = PhysicsCategory.topBorder // when we need to know whether two objects touch each other
    addChild(ball)
}
Can you help me with that? Because I am quite new to Swift, I would also like some explanation of this touch detection (and touches in general; the Apple docs are sparse).
Every time you touch the screen, you are cycling through all the balls to see whether you're touching one of them. If you have 50 balls on the screen, it goes through all 50 to check; that's not an efficient way to figure out whether you are touching one.
There are many ways you can do this, but what I would do is handle the touches inside the BallNode class. That way you don't have to figure out whether you are touching a ball, or which one it might be.
Explanation of protocols (to the best of my ability): this may seem like a lot right now, but the faster you learn and understand protocols, the better off you will be (IMO).
In this example we will use a protocol to set up a delegate of the BallNode class. A protocol is a set of user-defined "rules" that must be followed by any class you designate as compliant with that protocol. In my example, I state that for a class to be compliant with the BallNodeDelegate protocol it must contain the didClick func. When you add BallNodeDelegate after GameScene, you are stating that this class will comply with that protocol, so if GameScene did not have the didClick func, that would cause an error. All of this is put in place so that you have an easy way to communicate between your BallNode instances and your GameScene class (without having to pass references to your GameScene around to each BallNode). Each BallNode then has a delegate (GameScene) to which it can pass back information.
Inside your BallNode class, make sure you have isUserInteractionEnabled = true.
Outside of your BallNode class, create a protocol that will send the touch info back to the GameScene:
protocol BallNodeDelegate: class {
    func didClick(ball: BallNode)
}
create a delegate variable in your BallNode class
weak var delegate: BallNodeDelegate!
Move touchesBegan to your BallNode class:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    self.delegate?.didClick(ball: self)
}
In GameScene, add compliance to the BallNodeDelegate protocol:
class GameScene: SKScene, BallNodeDelegate
In GameScene, when you create a Ball, make sure you set its delegate:
let ball = BallNode()
ball.delegate = self
In GameScene, add the needed func to handle the clicks:
func didClick(ball: BallNode) {
    print("clicked ball")
}
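Putting those pieces together, here is a minimal sketch of what the BallNode side might look like. The radius initializer and the circle path are assumptions based on the question's own code, not something from the original answer:
class BallNode: SKShapeNode {
    weak var delegate: BallNodeDelegate!

    init(radius: CGFloat) {
        super.init()
        // Build a circular path and matching physics body for the given radius.
        path = CGPath(ellipseIn: CGRect(x: -radius, y: -radius,
                                        width: radius * 2, height: radius * 2),
                      transform: nil)
        physicsBody = SKPhysicsBody(circleOfRadius: radius)
        isUserInteractionEnabled = true   // without this, the node never sees its own touches
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        delegate?.didClick(ball: self)
    }
}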
You are comparing the exact touch point with the exact position of the node, which are very unlikely to ever be the same.
if positionOfTouch == node.position {
Instead, you'll need to test to see if the user's touch is close enough to the position of the ball.
One option is to use SKNode's contains function, which will handle this for you.
if node.contains(positionOfTouch) {
Side note: You'll probably want to use SKSpriteNode instead of SKShapeNode, as SKShapeNode has poor performance in SpriteKit.
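For completeness, here is a hedged rewrite of the question's loop using contains(_:). It assumes the balls are direct children of the scene (as in createBall above), because contains(_:) expects a point in the node's parent's coordinate system:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    for touch in touches {
        let positionOfTouch = touch.location(in: self)
        enumerateChildNodes(withName: "BALL") { node, _ in
            if node.contains(positionOfTouch) {
                print("just touched the ball")
                node.removeFromParent()
            }
        }
    }
}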
Take a look at nodes(at:), defined on SKNode, to retrieve a list of the nodes at the touched position. You'll need to convert between view coordinates and scene coordinates, though, using the scene's convertPoint(fromView:).
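A rough sketch of that variant, where viewPoint is an assumed CGPoint in view coordinates (for example from a UITapGestureRecognizer) that has to be converted into the scene before hit-testing:
let scenePoint = convertPoint(fromView: viewPoint)
for node in nodes(at: scenePoint) where node.name == "BALL" {
    node.removeFromParent()
}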

How do I programmatically move an ARAnchor?

I'm trying out the new ARKit to replace another similar solution I have. It's pretty great! But I can't seem to figure out how to move an ARAnchor programmatically. I want to slowly move the anchor to the left of the user.
Creating the anchor to be 2 meters in front of the user:
var translation = matrix_identity_float4x4
translation.columns.3.z = -2.0
let transform = simd_mul(currentFrame.camera.transform, translation)
let anchor = ARAnchor(transform: transform)
sceneView.session.add(anchor: anchor)
later, moving the object to the left/right of the user (x-axis)...
anchor.transform.columns.3.x = anchor.transform.columns.3.x + 0.1
repeated every 50 milliseconds (or whatever).
The above does not work because transform is a get-only property.
I need a way to change the position of an AR object in space relative to the user in a way that keeps the AR experience intact - meaning, if you move your device, the AR object will be moving but also won't be "stuck" to the camera like it's simply painted on, but moves like you would see a person move while you were walking by - they are moving and you are moving and it looks natural.
Please note the scope of this question relates only to how to move an object in space in relation to the user (ARAnchor), not in relation to a plane (ARPlaneAnchor) or to another detected surface (ARHitTestResult).
Thanks!
You don't need to move anchors. (hand wave) That's not the API you're looking for...
Adding ARAnchor objects to a session is effectively about "labeling" a point in real-world space so that you can refer to it later. The point (1,1,1) (for example) is always the point (1,1,1) — you can't move it someplace else because then it's not the point (1,1,1) anymore.
To make a 2D analogy: anchors are reference points sort of like the bounds of a view. The system (or another piece of your code) tells the view where its boundaries are, and the view draws its content relative to those boundaries. Anchors in AR give you reference points you can use for drawing content in 3D.
What you're asking is really about moving (and animating the movement of) virtual content between two points. And ARKit itself really isn't about displaying or animating virtual content — there are plenty of great graphics engines out there, so ARKit doesn't need to reinvent that wheel. What ARKit does is provide a real-world frame of reference for you to display or animate content using an existing graphics technology like SceneKit or SpriteKit (or Unity or Unreal, or a custom engine built with Metal or GL).
Since you mentioned trying to do this with SpriteKit... beware, it gets messy. SpriteKit is a 2D engine, and while ARSKView provides some ways to shoehorn a third dimension in there, those ways have their limits.
ARSKView automatically updates the xScale, yScale, and zRotation of each sprite associated with an ARAnchor, providing the illusion of 3D perspective. But that applies only to nodes attached to anchors, and as noted above, anchors are static.
You can, however, add other nodes to your scene, and use those same properties to make those nodes match the ARSKView-managed nodes. Here's some code you can add/replace in the ARKit/SpriteKit Xcode template project to do that. We'll start with some basic logic to run a bouncing animation on the third tap (after using the first two taps to place anchors).
var anchors: [ARAnchor] = []

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    // Start bouncing on touch after placing 2 anchors (don't allow more)
    if anchors.count > 1 {
        startBouncing(time: 1)
        return
    }
    // Create anchor using the camera's current position
    guard let sceneView = self.view as? ARSKView else { return }
    if let currentFrame = sceneView.session.currentFrame {
        // Create a transform with a translation of 30 cm in front of the camera
        var translation = matrix_identity_float4x4
        translation.columns.3.z = -0.3
        let transform = simd_mul(currentFrame.camera.transform, translation)
        // Add a new anchor to the session
        let anchor = ARAnchor(transform: transform)
        sceneView.session.add(anchor: anchor)
        anchors.append(anchor)
    }
}
Then, some SpriteKit fun for making that animation happen:
var ballNode: SKLabelNode = {
    let labelNode = SKLabelNode(text: "🏀")
    labelNode.horizontalAlignmentMode = .center
    labelNode.verticalAlignmentMode = .center
    return labelNode
}()

func startBouncing(time: TimeInterval) {
    guard
        let sceneView = self.view as? ARSKView,
        let first = anchors.first, let start = sceneView.node(for: first),
        let last = anchors.last, let end = sceneView.node(for: last)
        else { return }

    if ballNode.parent == nil {
        addChild(ballNode)
    }
    ballNode.setScale(start.xScale)
    ballNode.zRotation = start.zRotation
    ballNode.position = start.position

    let scale = SKAction.scale(to: end.xScale, duration: time)
    let rotate = SKAction.rotate(toAngle: end.zRotation, duration: time)
    let move = SKAction.move(to: end.position, duration: time)

    let scaleBack = SKAction.scale(to: start.xScale, duration: time)
    let rotateBack = SKAction.rotate(toAngle: start.zRotation, duration: time)
    let moveBack = SKAction.move(to: start.position, duration: time)

    let action = SKAction.repeatForever(.sequence([
        .group([scale, rotate, move]),
        .group([scaleBack, rotateBack, moveBack])
    ]))
    ballNode.removeAllActions()
    ballNode.run(action)
}
If you run this, you'll notice that the illusion only holds as long as you don't move the camera; not so great for AR. When using SKAction, we can't adjust the start/end states of the animation while it's running, so the ball keeps bouncing back and forth between its original (screen-space) positions/rotations/scales.
You could do better by animating the ball directly, but it's a lot of work. You'd need to, on every frame (or every view(_:didUpdate:for:) delegate callback):
Save off the updated position, rotation, and scale values for the anchor-based nodes at each end of the animation. You'll need to do this twice per didUpdate callback, because you'll get one callback for each node.
Work out position, rotation, and scale values for the node being animated, by interpolating between the two endpoint values based on the current time.
Set the new attributes on the node. (Or maybe animate it to those attributes over a very short duration, so it doesn't jump too much in one frame?)
That's kind of a lot of work to shoehorn a fake 3D illusion into a 2D graphics toolkit — hence my comments about SpriteKit not being a great first step into ARKit.
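For the record, here is a rough sketch of that per-frame approach. It is not from the original answer: it skips the didUpdate bookkeeping and simply re-reads the anchor-backed nodes in the scene's update(_:), and bounceDuration is an assumed constant defined alongside the other properties above:
let bounceDuration: TimeInterval = 1

override func update(_ currentTime: TimeInterval) {
    guard let sceneView = self.view as? ARSKView,
          let first = anchors.first, let start = sceneView.node(for: first),
          let last = anchors.last, let end = sceneView.node(for: last)
          else { return }
    // Triangle wave in 0...1 so the ball moves back and forth between the two anchor nodes.
    let phase = fmod(currentTime, bounceDuration * 2) / bounceDuration
    let t = CGFloat(phase > 1 ? 2 - phase : phase)
    // Interpolate position, scale, and rotation between the two endpoint nodes.
    ballNode.position = CGPoint(x: start.position.x + (end.position.x - start.position.x) * t,
                                y: start.position.y + (end.position.y - start.position.y) * t)
    ballNode.setScale(start.xScale + (end.xScale - start.xScale) * t)
    ballNode.zRotation = start.zRotation + (end.zRotation - start.zRotation) * t
}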
If you want 3D positioning and animation for your AR overlays, it's a lot easier to use a 3D graphics toolkit. Here's a repeat of the previous example, but using SceneKit instead. Start with the ARKit/SceneKit Xcode template, take the spaceship out, and paste the same touchesBegan function from above into the ViewController. (Change the as ARSKView casts to as ARSCNView, too.)
Then, some quick code for placing 2D billboarded sprites, matching via SceneKit the behavior of the ARKit/SpriteKit template:
// in global scope
func makeBillboardNode(image: UIImage) -> SCNNode {
    let plane = SCNPlane(width: 0.1, height: 0.1)
    plane.firstMaterial!.diffuse.contents = image
    let node = SCNNode(geometry: plane)
    node.constraints = [SCNBillboardConstraint()]
    return node
}
// inside ViewController
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    // emoji to image based on https://stackoverflow.com/a/41021662/957768
    let billboard = makeBillboardNode(image: "⛹️".image())
    node.addChildNode(billboard)
}
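The "⛹️".image() call above assumes a small String-to-UIImage helper; here is one hedged way to write it, sketched along the lines of the linked answer rather than copied from it:
import UIKit

extension String {
    func image(pointSize: CGFloat = 50) -> UIImage {
        // Measure the string (emoji) at the requested font size, then render it into an image.
        let attributes: [NSAttributedString.Key: Any] = [.font: UIFont.systemFont(ofSize: pointSize)]
        let size = (self as NSString).size(withAttributes: attributes)
        let renderer = UIGraphicsImageRenderer(size: size)
        return renderer.image { _ in
            (self as NSString).draw(at: .zero, withAttributes: attributes)
        }
    }
}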
Finally, adding the animation for the bouncing ball:
let ballNode = makeBillboardNode(image: "🏀".image())

func startBouncing(time: TimeInterval) {
    guard
        let sceneView = self.view as? ARSCNView,
        let first = anchors.first, let start = sceneView.node(for: first),
        let last = anchors.last, let end = sceneView.node(for: last)
        else { return }

    if ballNode.parent == nil {
        sceneView.scene.rootNode.addChildNode(ballNode)
    }

    let animation = CABasicAnimation(keyPath: #keyPath(SCNNode.transform))
    animation.fromValue = start.transform
    animation.toValue = end.transform
    animation.duration = time
    animation.autoreverses = true
    animation.repeatCount = .infinity
    ballNode.removeAllAnimations()
    ballNode.addAnimation(animation, forKey: nil)
}
This time the animation code is a lot shorter than the SpriteKit version.
Because we're working in 3D to start with, we're actually animating between two 3D positions — unlike in the SpriteKit version, the animation stays where it's supposed to. (And without the extra work for directly interpolating and animating attributes.)

Swift-Setting a physics body velocity by angle

I was wondering if it is at all possible to make an SKNode move forward in a particular direction, but with only one factor. I'm aware of both applying an impulse and setting the velocity of a physics body, but they're both determined by two factors: dx and dy. I also know of rotating to an angle with SKActions. But is it possible to make an object simply "move forward" once it has been set at an angle? Or set its velocity with just one factor?
Thanks in advance.
Yes, is the answer to your question.
What I think you're looking for = THRUST... right?
What you want is for the "ship" to be able to rotate in any direction, and the thrust to be applied correctly, out of the "arse" of the ship, moving it forward, in ship terms.
This is absolutely possible, but does require a little "dummy" trick.
But I'm confusing you.
The local space of a SKPhysicsBody is relative to its parent. I presume.
And there's the speculative part. I'm guessing. I haven't tried this.
But... most physicsBodys are the child of an SKNode that's parented to the scene.
If you parent your ship to a dummy node for the purposes of rotation, and then rotate the dummy node, you should be able to make your spaceship fly in circles without ever changing the thrust vector, by simply rotating the dummy node.
Theoretically.
Something like this horrible pseudo code might help to start... maybe.
let dummy = SKNode()
let ship = SKSpriteNode()
dummy.addChild(ship)
ship.physicsBody = (add how you want here...)
ship.physicsBody.applyForce(a vector that's only x, for example)
rotate dummy with an action over time...
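For what it's worth, here is a literal, runnable rendering of that pseudo code, with the author's own caveat intact: the dummy-node idea is untested, and SpriteKit applies forces in scene coordinates, so rotating the parent may not redirect the thrust the way you'd hope. "ship.png" is a placeholder texture name.
// Assumes this runs inside an SKScene, e.g. in didMove(to:).
let dummy = SKNode()
let ship = SKSpriteNode(imageNamed: "ship.png")
ship.physicsBody = SKPhysicsBody(rectangleOf: ship.size)
ship.physicsBody?.affectedByGravity = false
dummy.addChild(ship)
addChild(dummy)

// Thrust along x only, then spin the dummy node with an action.
ship.physicsBody?.applyForce(CGVector(dx: 10, dy: 0))
dummy.run(SKAction.repeatForever(SKAction.rotate(byAngle: .pi, duration: 2)))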
Sure, I think what you're talking about is something like this. Let's say you have an SKSpriteNode called player that eventually has a physicsBody set up:
var player: SKSpriteNode!
You can just set the dx property of its velocity. So let's say you wanted to move the player horizontally toward the location where the user tapped on the right-hand side of the screen; you'd detect the position of the touch with touchesBegan(_:):
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first else { return }
    // Replace with the name of the node in which to detect the touch
    let touchLocation = touch.location(in: <NAME_OF_NODE_PROPERTY>)
    // Verify it's in front of the player and not behind
    if (touchLocation.x - player.position.x) > 0 {
        movePlayerHorizontally(toward: touchLocation)
    }
}
func movePlayerHorizontally(toward location: CGPoint) {
    let dx: CGFloat = location.x - player.position.x
    player.physicsBody!.velocity.dx = dx
}
EDIT:
Since you said you just want to move your player horizontally without knowing the destination, you could do something like this. It just moves the player forward along the x-axis by 50 pts every second and repeats forever; obviously you would want to tweak it to your liking.
let move = SKAction.moveBy(x: 50, y: 0, duration: 1)
let repeatAction = SKAction.repeatForever(move)
player.run(repeatAction)

touchesMoved not functioning properly

I am trying to implement an overridden touchesMoved func to enable SKSpriteNodes to be moved around by the user. However, I have to move the node very slowly for it to follow my touches when I drag. In addition, there are three SKSpriteNodes in the background (which, as you will see, I explicitly set to userInteractionEnabled = false), and these nodes will occasionally respond to the touches. Any help would be greatly appreciated. Let me know if there are any other parts of the code you need.
override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?) {
    var positionInScene = CGPoint(x: 375.0, y: 400.0) // sets a default position
    titleLabel.userInteractionEnabled = false
    drawingBoard.userInteractionEnabled = false
    sideBar.userInteractionEnabled = false
    for touch in touches {
        positionInScene = touch.locationInNode(self)
        if self.nodeAtPoint(positionInScene) is SKSpriteNode {
            // movableNodeName is the name assigned to all SKSpriteNodes that should be draggable
            if (self.nodeAtPoint(positionInScene)).name == movableNodeName {
                // I know this might be a strange way of doing it
                (self.nodeAtPoint(touch.previousLocationInNode(self))).position = positionInScene
            }
        }
    }
}
I've managed to fix this issue by just setting the coordinates of the node that shouldn't move back to what they were originally whenever it is moved. This seems to work as I haven't been able to replicate the bug again in testing. If anyone could come up with a better solution though, I would love to hear it.
Your problem is that you are using touch.locationInNode(self), where self is your SKSpriteNode. This means that your touch-move code will only respond to what is going on inside of your SKSpriteNode. What you need to do is use either the parent of the sprite or the scene, based on how you want to apply this logic.
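A minimal sketch of that advice, assuming the override lives in the draggable SKSpriteNode subclass itself (Swift 2 era API to match the question); the sprite also needs userInteractionEnabled = true to receive its own touches:
override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?) {
    guard let touch = touches.first, let parent = self.parent else { return }
    // A node's position is expressed in its parent's coordinate system, so convert
    // the touch into the parent's space rather than into the sprite itself.
    position = touch.locationInNode(parent)
}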

Drag and drop children of the same sprite

I'm making a game using Sprite Kit and I want to be able to drag and drop boxes as they travel down the screen.
Here's the gist of the code: I spawn the boxes on a timer and they move down the screen.
override func didMoveToView(view: SKView) {
    let timer = NSTimer.scheduledTimerWithTimeInterval(2.0, target: self, selector: Selector("spawnBox"), userInfo: nil, repeats: true)
}

func spawnBox() {
    /* Set up the box */
    addChild(box)
    let boxMoveDown = SKAction.moveToY(-100, duration: 5.0)
    let actionDone = SKAction.removeFromParent()
    box.runAction(SKAction.sequence([boxMoveDown, actionDone]))
}
But the problem is: how can I move the specific child I am touching without affecting all the other children? I understand that at the moment, every time I spawn a box it's exactly the same, so I can't be specific when I set an individual child's position.
Here's what's inside my touchesBegan and touchesMoved functions
if let touch = touches.first {
    let location = touch.locationInNode(self)
    let objects = nodesAtPoint(location) as [SKNode]
    if objects.contains(nodeAtPoint(location)) && nodeAtPoint(location).name == "box" {
        box.position = location
        box.removeAllActions()
    }
}
The box.position = location line is what needs changing.
Hopefully you understand my idea. I've tried to keep the included code to what's necessary. I'm quite new to Sprite Kit, as you can probably tell.
If I were you, I would handle it this way:
Create a custom class for your box nodes that extends SKSpriteNode.
In this custom class, override the touch-handling methods (touchesBegan/touchesMoved).
Then set the position inside those methods based on the touch location.
Now all you need to worry about is your zPosition; whatever child has the highest zPosition will be the one that gets the touch.
You no longer need to worry about nodesAtPoint or the like; the API will handle all of that for you.
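Here is a hedged sketch of that approach (Swift 2 era API, matching the question): a Box subclass that drags itself, so the scene never has to work out which child was touched. The color and size passed to the initializer are illustrative.
import SpriteKit

class Box: SKSpriteNode {
    convenience init(boxSize: CGSize) {
        self.init(texture: nil, color: UIColor.redColor(), size: boxSize)
        name = "box"
        userInteractionEnabled = true   // required so the node receives its own touches
    }

    override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
        removeAllActions()   // stop the move-down action once the user grabs this box
    }

    override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?) {
        guard let touch = touches.first, let parent = parent else { return }
        // A node's position is expressed in its parent's coordinates, so convert there.
        position = touch.locationInNode(parent)
    }
}
Spawning then becomes something like let box = Box(boxSize: CGSize(width: 50, height: 50)) inside spawnBox(), and the scene's touchesBegan/touchesMoved code for the boxes is no longer needed.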