How to add normal Buttons in a SpriteKit Game? - swift

So far I have made two games, and I have always used the touchesBegan function and SKSpriteNodes to create a menu, like this:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    for touch in touches {
        let location = touch.location(in: self)
        if atPoint(location).name == "playButton" {
            startGame()
        }
    }
}
But this way is pretty ugly, because the action is called the moment you touch the button and you can't cancel it. With a normal UIButton, the action is only called after you lift your finger off the button, and you can even cancel a tap by moving your finger away from the button before releasing. The problem is that I don't know how to add a UIButton to my MenuScene.swift file. I also don't use the Storyboard, because it's just confusing to me and I don't understand how the .sks, .swift and .storyboard files are linked together; it makes no sense to me. The .sks and .storyboard files are both for building a GUI, yet in an .sks file you can't add a UIButton... why? Is there any convenient way to add a button?

I wouldn't use UIButton in SpriteKit code; you are better off creating your own Button class in SpriteKit and using that. The .sks file is not linked to a storyboard, it is linked to whatever class you designate (GameScene, MenuScene); it can even be linked to smaller objects that have their own class (ScoreHUD, Castle, etc.).
Honestly, you are just overthinking the button thing. Think about the touch events: touchesBegan fires when the object is touched; if you want it to fire on finger up, use touchesEnded.
Here is a very simple button class I wrote for this example. There is a lot more you could do with it, but this covers the basics: it decides whether the button fires on touch down or touch up, and it will not fire if you move your finger off the button. It uses a protocol to pass the click event back to the parent (probably your scene).
You can also add this to an .sks file by dragging a Color Sprite or image onto the scene editor and setting its custom class to "Button" in the Custom Class inspector.
the Button Class...
import SpriteKit

protocol ButtonDelegate: AnyObject {
    func buttonClicked(button: Button)
}

class Button: SKSpriteNode {

    enum TouchType: Int {
        case down, up
    }

    weak var buttonDelegate: ButtonDelegate!

    private var type: TouchType = .down
    private var isPressed = false

    init(texture: SKTexture, type: TouchType) {
        self.type = type
        super.init(texture: texture, color: .clear, size: texture.size())
        // Required so this node receives its own touch events.
        isUserInteractionEnabled = true
    }

    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        isUserInteractionEnabled = true
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        isPressed = true
        if type == .down {
            buttonDelegate.buttonClicked(button: self)
        }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        let touchLocation = touch.location(in: parent!)
        // Cancel the press if the finger slides off the button.
        if !frame.contains(touchLocation) {
            isPressed = false
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard isPressed else { return }
        if type == .up {
            buttonDelegate.buttonClicked(button: self)
        }
        isPressed = false
    }
}
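If the button is added through an .sks file as described above, here is a minimal sketch of how a scene could pick it up; it assumes the node was named "playButton" in the scene editor and given the custom class "Button":
class MenuScene: SKScene, ButtonDelegate {

    override func didMove(to view: SKView) {
        // Grab the sprite placed in the scene editor and route its clicks back here.
        if let playButton = childNode(withName: "playButton") as? Button {
            playButton.buttonDelegate = self
        }
    }

    func buttonClicked(button: Button) {
        print("clicked \(button.name ?? "button")")
    }
}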
In your GameScene file
class GameScene: SKScene, ButtonDelegate {

    private var someButton: Button!
    private var anotherButton: Button!

    func createButtons() {
        someButton = Button(texture: SKTexture(imageNamed: "blueButton"), type: .down)
        someButton.position = CGPoint(x: 100, y: 100)
        someButton.zPosition = 1
        someButton.name = "blueButton"
        someButton.buttonDelegate = self
        addChild(someButton)

        anotherButton = Button(texture: SKTexture(imageNamed: "redButton"), type: .up)
        anotherButton.position = CGPoint(x: 300, y: 100)
        anotherButton.zPosition = 1
        anotherButton.name = "redButton"
        anotherButton.buttonDelegate = self
        addChild(anotherButton)
    }

    func buttonClicked(button: Button) {
        print("button clicked named \(button.name!)")
    }
}

Related

removeFromParent() strange behavior

I have really strange behavior with the removeFromParent() function:
lazy var buttonAds: SKSpriteNode = {
    let n = SKSpriteNode(imageNamed: "ButtonAds")
    n.position = CGPoint(x: size.width / 2, y: 600)
    n.zPosition = 100
    n.setScale(1.4)
    return n
}()
In didMove(...) I add this button with addChild(buttonAds), and later in touchesBegan:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    let touch = touches.first!
    if buttonAds.contains(touch.location(in: self)) {
        // ...
        doAds()
        buttonAds.removeFromParent()
    }
}
If you tap on the ads button, it is removed, but if you tap on that spot again, the doAds() function is called again... it's strange, since buttonAds no longer exists on the scene.
(Screenshots show the scene initially and after the tap.)
Thanks
What you want to do is check whether the node you actually touched is the button itself, not just whether the touch falls inside the button's stored frame. Change your code to this:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first else { return }
    if atPoint(touch.location(in: self)) == buttonAds {
        doAds()
        buttonAds.removeFromParent()
    }
}
This should do the trick!
Edit: as to why this works: you're removing the node from the scene, but it is still an object in memory (otherwise you wouldn't be able to call buttonAds.contains(...) on it), so it also still has its position stored.
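An alternative sketch, keeping the original contains-based check: also verify that the button is still attached to the scene before reacting (doAds() is the asker's own function):
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first else { return }
    // Only react if the button is still in the node tree and the touch lands inside it.
    if buttonAds.parent != nil, buttonAds.contains(touch.location(in: self)) {
        doAds()
        buttonAds.removeFromParent()
    }
}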

Cannot disable, then reenable touch, after an SKAction animation

I am working on an interactive, animated scene. I want all touches on the scene to be disabled on entry. Then, once the objects (which are subclassed nodes) in the scene finish rotating/moving, I want to re-enable all touches on the screen to allow interaction. I have disabled user interaction using this code:
override func didMove(to view: SKView) {
    setupNodes()
    view.isUserInteractionEnabled = false
    spinLocations()
}
This is the code, within the scene file, for spinLocations:
func spinLocations() {
    var allLocationArrays = [[String: CGPoint]]()
    var previousArray = hiddenLocationPositions
    for _ in 0...SearchConstant.numSpins {
        let freshArray = generateNewLocationArray(previous: previousArray)
        allLocationArrays.append(freshArray)
        previousArray = freshArray
    }
    for (item, _) in hiddenLocationPositions {
        let node = fgNode.childNode(withName: item) as! LocationNode
        node.spin(position: allLocationArrays) // this is the function below
    }
    hiddenLocationPositions = previousArray
}
This is the code for the animations in the node class:
func spin(position: [[String: CGPoint]]) {
    var allActions = [SKAction]()
    for array in position {
        let action = SKAction.move(to: array[self.name!]!, duration: 2.0)
        allActions.append(action)
    }
    let allActionsSeq = SKAction.sequence(allActions)
    self.run(SKAction.sequence([SKAction.wait(forDuration: 5.0),
                                allActionsSeq,
                                SKAction.run {
                                    self.position = position[position.count - 1][self.name!]!
                                }]))
}
This is the code for passing back the touches to the main scene from this class:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let parent = self.parent else { return }
}
As you can see, touch is not disabled here.
I do not want to add a "waitForDuration" SKAction to the runBlock to change the view status after the previous action; I want the program to determine when the animations are finished executing and then re-enable touches.
In order to do this, I theorised that a completion handler might work, but it only re-enables touches immediately (e.g. passing a handler to spin causes touches to be detected again right away). Previously, I also tried to toggle the view's interaction in the runBlock, but of course that runs instantaneously. How do I ensure that touches are re-detected only after the animation finishes, without using "waitForDuration"?
So, this is a simple example that shows how you can:
1) Disable touches completely
2) Spin a node
3) Re-enable touches when the node is done spinning
Here is the code (you can copy/paste it to try how it works):
class Object: SKSpriteNode {

    func spin(times: Int, completion: @escaping () -> ()) {
        let duration = 3.0
        let angle = CGFloat.pi * 2.0
        let oneRevolution = SKAction.rotate(byAngle: angle, duration: duration)
        let spin = SKAction.repeat(oneRevolution, count: times)
        let sequence = SKAction.sequence([spin, SKAction.run(completion)])
        run(sequence, withKey: "spinning")
    }
}

class WelcomeScene: SKScene {

    override func didMove(to view: SKView) {
        view.isUserInteractionEnabled = false
        print("Touches Disabled")

        let object = Object(texture: nil, color: .purple, size: CGSize(width: 200, height: 200))
        addChild(object)

        object.spin(times: 3, completion: { [weak self] in
            self?.view?.isUserInteractionEnabled = true
            print("Touches Enabled")
        })
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        print("touch detected")
    }

    deinit {
        print("Welcome scene deinited")
    }
}
Here, you disable touches when the scene is loaded, start spinning the object, and pass a completion block to it... That block of code is used here:
let sequence = SKAction.sequence([spin,SKAction.run(completion)])
So after the spinning, that block will be executed. Now, there are different ways to do this... Personally, I would use delegation, but I thought this might be less confusing. I can write an example for delegation too if needed, but basically what you would do is set the scene as a delegate of your custom node and notify it when spinning is done, so the scene can tell the view to re-enable touches.
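A rough sketch of that delegation variant, with the protocol and method names made up for illustration:
protocol SpinDelegate: AnyObject {
    func spinDidFinish(_ node: Object)
}

class Object: SKSpriteNode {

    weak var spinDelegate: SpinDelegate?

    func spin(times: Int) {
        let oneRevolution = SKAction.rotate(byAngle: .pi * 2, duration: 3.0)
        let spin = SKAction.repeat(oneRevolution, count: times)
        // Notify the delegate once the spinning sequence is over.
        let notify = SKAction.run { [weak self] in
            guard let self = self else { return }
            self.spinDelegate?.spinDidFinish(self)
        }
        run(SKAction.sequence([spin, notify]), withKey: "spinning")
    }
}

class WelcomeScene: SKScene, SpinDelegate {

    override func didMove(to view: SKView) {
        view.isUserInteractionEnabled = false
        let object = Object(texture: nil, color: .purple, size: CGSize(width: 200, height: 200))
        object.spinDelegate = self
        addChild(object)
        object.spin(times: 3)
    }

    func spinDidFinish(_ node: Object) {
        view?.isUserInteractionEnabled = true
        print("Touches Enabled")
    }
}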

Sprite subclass detecting touches despite actually touching empty space

I'm subclassing nodes to use for touch detection. I have a box parent which has a line child directly next to it, with the gray space being just empty space:
The problem is when I click the gray space, it registers as a touch on the box, which is quite far away.
Here's the code where I show the problem, and my crappy workaround... I make two sets of boxes, the first being the one showing the problem, and the second being the one with the workaround:
import SpriteKit

class GameScene: SKScene {

    enum sizez {
        static let
            box = CGSize(width: 50, height: 35),
            line = CGSize(width: 200, height: 10)
    }

    class Box: SKSpriteNode {
        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            print("touched box")
        }
    }

    class Line: SKSpriteNode {
        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            print("touched line")
        }
    }

    override func didMove(to view: SKView) {

        // The problem sprites that register touch despite empty space:
        let box = Box(color: .red, size: sizez.box)
        box.isUserInteractionEnabled = true
        addChild(box)

        let line = Line(color: .purple, size: sizez.line)
        line.isUserInteractionEnabled = true
        line.anchorPoint = CGPoint.zero
        line.position = CGPoint(x: box.frame.maxX + 10, y: box.frame.minY)
        ///////////////////
        box.addChild(line)
        ///////////////////

        // These sprites detect touches properly (no detection in empty space):
        let box2 = Box(color: .red, size: sizez.box)
        box2.isUserInteractionEnabled = true
        box2.position.y -= 100
        addChild(box2)

        let line2 = Line(color: .purple, size: sizez.line)
        line2.isUserInteractionEnabled = true
        line2.anchorPoint = CGPoint.zero
        line2.position = CGPoint(x: box2.frame.maxX + 10, y: box2.frame.minY)
        ////////////////
        addChild(line2)
        ////////////////
    }
}
When you click directly above (or even farther out from) the top line, the touch is still reported as hitting the box; when you do the same for the bottom line, it is not.
It will be a huge hassle to forgo SK's built in parent / child system, and then for me to keep track of them on my own manually... as well as it being a big performance hit for my app.
Any reasonable workaround or solution that lets me click in the gray space while using code similar to the first box would be greatly appreciated.
UPDATE:
By making an invisible background node and setting its zPosition 1 less, I can now click in the gray space and it registers as the background node, not the box.
let bkgSize = CGSize(width: 1000, height: 1000)
let bkg = Bkg(color: .gray, size: bkgSize)
bkg.isUserInteractionEnabled = true
bkg.zPosition -= 1
addChild(bkg)
But still, why is this empty space touch being registered as a box touch in the absence of a background node?
You're absolutely correct: with a background node it works as expected; without a background sprite it gives the results described by @Confused. I was able to get it to work as expected without a background by just fine-tuning the touchesBegan function like so...
class Box: SKSpriteNode {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        let touchLocation = touch.location(in: parent!)
        // Only report a touch that actually lands inside the box's own frame.
        if frame.contains(touchLocation) {
            print("touched box")
        }
    }
}

class Line: SKSpriteNode {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        let touchLocation = touch.location(in: parent!)
        if frame.contains(touchLocation) {
            print("touched line")
        }
    }
}
From my understanding of how bounds work for parented objects, I think this is what's going on...
Addendum
This is slightly related, and may help in understanding why this is happening, and why Apple treats a parent and its children as sharing an accumulated (combined) rectangle/quad for touch response.
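You can inspect that combined rectangle directly with calculateAccumulatedFrame(); a quick sketch, reusing the box/line sizes from the question, shows that the parent's accumulated frame stretches out over the child line, which is the larger area that ends up catching the "empty space" touches:
let box = SKSpriteNode(color: .red, size: CGSize(width: 50, height: 35))
let line = SKSpriteNode(color: .purple, size: CGSize(width: 200, height: 10))
line.anchorPoint = .zero
line.position = CGPoint(x: box.frame.maxX + 10, y: box.frame.minY)
box.addChild(line)

// The node's own frame is just the 50x35 box...
print(box.frame)
// ...but the accumulated frame also covers the child line to the right.
print(box.calculateAccumulatedFrame())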

How to make button work in SpriteKit?

I am creating a game where I have created a simple UI, and I want the play button to run an action whenever the player hits it.
Here is an image:
Here is my code as well.
//I JUST DID THIS.
let bgImage = SKSpriteNode(imageNamed: "bg.png")
bgImage.position = CGPoint(x: self.size.width/2, y: self.size.height/2)
bgImage.size = self.frame.size
bgImage.name = "button1"
self.addChild(bgImage)
By default isUserInteractionEnabled is false, so a touch on a scene child like your bgImage is simply handed up to the main (or parent) class: the object is there and gets touched, but unless you implement some action for it, nothing happens.
If you set the isUserInteractionEnabled property to true on an SKSpriteNode subclass, then the touch delegate methods are called inside that specific class, so you can handle the touch for the sprite within its own class. But you don't need that here; it's not your case, since you haven't subclassed your bgImage.
You should simply do this in your scene:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    for touch in touches {
        let location = touch.location(in: self)
        let node: SKNode = self.atPoint(location)
        if node.name == "button1" {
            print("Tapped")
        }
    }
}
Looking at your image, I suspect that your sprite bg.png is composed of both the background and the button image: this is very awkward. You should use an image for just the button and, if you want, make another sprite to show your background; otherwise you touch everything (background and button, obviously), not only the button as you intended.
So you should separate the images: one sprite for the background and a separate, smaller one for just the button.
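A minimal sketch of that separation, assuming placeholder image names and that this runs inside the scene's didMove(to:):
// Background: fills the scene and deliberately gets no button name.
let background = SKSpriteNode(imageNamed: "background.png")
background.position = CGPoint(x: size.width / 2, y: size.height / 2)
background.size = frame.size
background.zPosition = -1
addChild(background)

// Button: its own small sprite, so atPoint(_:) only matches it when the button itself is hit.
let playButton = SKSpriteNode(imageNamed: "playButton.png")
playButton.name = "button1"
playButton.position = CGPoint(x: size.width / 2, y: size.height / 3)
playButton.zPosition = 1
addChild(playButton)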
Did you try to add: bgImage.isUserInteractionEnabled = true ?
let bgImage = SKSpriteNode(imageNamed: "bg.png")
bgImage.position = CGPoint(x: self.size.width/2, y: self.size.height/2)
bgImage.size = self.frame.size
bgImage.name = "button1"
bgImage.isUserInteractionEnabled = true
self.addChild(bgImage)
Then:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    for touch in touches {
        let location = touch.location(in: self)
        let node: SKNode = self.atPoint(location)
        if node.name == "button1" {
            print("Tapped")
        }
    }
}

Swift Spritekit Adding Button programmatically

How do I programmatically add a button that will run an action when it's clicked? What code would be used?
I am used to just adding a button in the storyboard and running an IBAction from there.
Adding a button in SpriteKit and responding to taps on it is not quite as easy as it is in UIKit. You basically need to create an SKNode of some sort which will draw your button and then check to see if touches registered in your scene are within that node's bounds.
A really simple scene with just a single red rectangle in the center acting as a button would look something like this:
import UIKit
import SpriteKit

class ButtonTestScene: SKScene {

    var button: SKNode! = nil

    override func didMove(to view: SKView) {
        // Create a simple red rectangle that's 100x44
        button = SKSpriteNode(color: .red, size: CGSize(width: 100, height: 44))
        // Put it in the center of the scene
        button.position = CGPoint(x: self.frame.midX, y: self.frame.midY)
        self.addChild(button)
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Loop over all the touches in this event
        for touch in touches {
            // Get the location of the touch in this scene
            let location = touch.location(in: self)
            // Check if the location of the touch is within the button's bounds
            if button.contains(location) {
                print("tapped!")
            }
        }
    }
}
If you need a button that looks and animates like the ones in UIKit, you'll need to implement that yourself; there's nothing built in to SpriteKit.
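One possible approach for that pressed look (a sketch with a hypothetical HighlightButton subclass; durations and blend values are arbitrary) is to darken and shrink the sprite while it is pressed and restore it on release:
class HighlightButton: SKSpriteNode {

    override init(texture: SKTexture?, color: UIColor, size: CGSize) {
        super.init(texture: texture, color: color, size: size)
        isUserInteractionEnabled = true   // required so the node receives its own touches
    }

    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        isUserInteractionEnabled = true
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Darken and shrink slightly while pressed.
        run(SKAction.group([SKAction.scale(to: 0.95, duration: 0.08),
                            SKAction.colorize(with: .black, colorBlendFactor: 0.3, duration: 0.08)]))
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Restore the normal appearance on release.
        run(SKAction.group([SKAction.scale(to: 1.0, duration: 0.08),
                            SKAction.colorize(withColorBlendFactor: 0.0, duration: 0.08)]))
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        touchesEnded(touches, with: event)
    }
}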
I have created a class SgButton (https://github.com/nguyenpham/sgbutton) in Swift to create buttons for SpriteKit. You can create buttons with images, textures (from a sprite sheet), text only, or text with background images/textures. For example, to create a button with an image:
SgButton(normalImageNamed: "back.png")
Create button with textures:
SgButton(normalTexture: buttonSheet.buy(), highlightedTexture: buttonSheet.buy_d(), buttonFunc: tappedButton)
Create a rounded-corner text button:
SgButton(normalString: "Tap me", normalStringColor: UIColor.blue, size: CGSize(width: 200, height: 40), cornerRadius: 10.0, buttonFunc: tappedButton)
Mike S's answer, updated for Swift 5.2:
var button: SKSpriteNode!

override func didMove(to view: SKView) {
    createButton()
}

func createButton() {
    // Create a simple red rectangle that's 100x44
    button = SKSpriteNode(color: SKColor.red, size: CGSize(width: 100, height: 44))
    // Put it in the center of the scene
    button.position = CGPoint(x: self.frame.midX, y: self.frame.midY)
    self.addChild(button)
}

override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first else { return }
    let touchLocation = touch.location(in: self)
    // Check if the location of the touch is within the button's bounds
    if button.contains(touchLocation) {
        print("tapped!")
    }
}
You can use OOButtonNode.
Text/Image buttons, Swift 4.
@objc func tappedButton(theButton: UIButton!) {
    print("button tapped")
}
The above code prints out button tapped when the button is tapped.
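The setup that would normally precede a callback like this is not shown above; a rough sketch of one way it is typically wired up, assuming a UIButton added as a subview of the SKView (the title and frame are placeholders, and the callback needs @objc to be usable as a selector):
override func didMove(to view: SKView) {
    // A plain UIKit button laid over the SKView; it is not part of the SpriteKit node tree.
    let button = UIButton(type: .system)
    button.setTitle("Tap me", for: .normal)
    button.frame = CGRect(x: 20, y: 20, width: 120, height: 44)
    // .touchUpInside fires on finger-up and can be cancelled by dragging off the button.
    button.addTarget(self, action: #selector(tappedButton(theButton:)), for: .touchUpInside)
    view.addSubview(button)
}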
P.S. the swift ebook is a really good guide for the new programming language.