I have several buttons that move a main character (left, right, jump, etc.). But when I touch more than one button at a time, the previous touch is ended. My question is: how do I keep both touches alive for the duration of their touches? As an example, this would allow the character to move forward and jump at the same time. I have set multipleTouchEnabled to true. I've read that using a dictionary to track touches would help, but I can't seem to wrap my head around the implementation.
override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
    for touch in touches {
        let location = touch.locationInView(nil)
        if location.x < self.size.width / 2 && location.y > self.size.height / 2 {
            movingLeft = true
        }
        if location.x > self.size.width / 2 && location.y > self.size.height / 2 {
            movingRight = true
        }
        if location.x < self.size.width / 2 && location.y < self.size.height / 2 {
            jump()
        }
        if location.x > self.size.width / 2 && location.y < self.size.height / 2 {
            jump()
        }
    }
}

override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) {
    movingLeft = false
    movingRight = false
}

func jump() {
    mainCharacter.physicsBody?.velocity = CGVector(dx: 0, dy: 0)
    mainCharacter.physicsBody?.applyImpulse(CGVector(dx: 0, dy: 400))
}
The logic
You should create a dictionary of active touches.
private var activeTouches = [UITouch:String]()
Every time a touch begins, you save it into the dictionary and assign a label to it.
activeTouches[touch] = "left"
So when the touch ends, you can look it up in your dictionary and find the related label. Now you know which button has been released by the user.
let button = activeTouches[touch]
if button == "left" { ... }
And don't forget to remove it from the dictionary.
activeTouches[touch] = nil
The implementation
class GameScene: SKScene {
    private var activeTouches = [UITouch:String]()

    override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
        for touch in touches {
            let button = findButtonName(from: touch)
            activeTouches[touch] = button
            tapBegin(on: button)
        }
    }

    override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) {
        for touch in touches {
            guard let button = activeTouches[touch] else { fatalError("Touch just ended but was not found in activeTouches") }
            activeTouches[touch] = nil
            tapEnd(on: button)
        }
    }

    private func tapBegin(on button: String) {
        print("Begin press \(button)")
        // your custom logic goes here
    }

    private func tapEnd(on button: String) {
        print("End press \(button)")
        // your custom logic goes here
    }

    private func findButtonName(from touch: UITouch) -> String {
        // replace this with your custom logic to detect a button location
        let location = touch.locationInView(self.view)
        if location.x > self.view?.frame.midX {
            return "right"
        } else {
            return "left"
        }
    }
}
In the code above, you should put your own code into:
tapBegin: this method receives the label of a button and starts some action, e.g. start running.
tapEnd: this method receives the label of a button and stops some action, e.g. stop running.
findButtonName: this method receives a UITouch and returns the label of the button pressed by the user.
Test
I tested the previous code on my iPhone. I performed the following actions.
started pressing the right of the screen
started pressing the left of the screen
removed finger from the right of the screen
removed finger from the left of the screen
As you can see in the following log, the code is capable of recognizing the separate touches:
Begin press right
Begin press left
End press right
End press left
Conclusion
I hope I made myself clear. Let me know if something is not.
Issue
For me it was a bit silly: there was an option I had missed enabling in my storyboard in order to handle multiple touches in my GameScene of type SKScene.
When one of the overridden functions is called, you should have more than one touch contained in touches.
Those functions are indeed:
touchesBegan
touchesMoved
touchesEnded
touchesCancelled
... and the way to retrieve the touches is indeed to iterate through them. However, if you are only able to retrieve a single touch, follow the steps below.
Debugging
Here's what you can do to see if you have multiple touches in your set of touches:
override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    for (i, t) in touches.enumerated() {
        print(i, t.location(in: self))
        self.touchMoved(toPoint: t.location(in: self))
    }
}
Output:
There you should see index 0 for the first touch, and index 1 appears when you touch with two fingers... and so on.
Solution
In case you encounter the same problem as me and only have the touch t at index 0 but no other touch, you should check higher up in the view hierarchy whether all views are enabled for multi-touch. In my case it was the view containing the GameScene of type SKScene that did not have the Multiple Touch option enabled.
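If you prefer to set this in code rather than in the storyboard, a minimal sketch (assuming the scene is presented by an SKView) would be:
import SpriteKit

class GameScene: SKScene {
    override func didMove(to view: SKView) {
        // Enable multi-touch on the SKView hosting the scene; without this,
        // the touches set will only ever contain a single touch.
        view.isMultipleTouchEnabled = true
    }
}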
Related
I'm working on a keyboard extension for iOS. However, I'm having some weird issues with animations/layers not appearing instantly on the far left of the screen. I use layers/animations to show a "tool tip" when the user presses a key. For all keys except A and Q the tool tips are displayed instantly, but for these two keys there seems to be a slight delay before the layer and animation appear. This only happens on touch down; if I slide into the Q or A hit area the tool tips get rendered instantly. My debugging shows that the code executes exactly the same for all keys, but for these two keys it has no immediate effect.
Any ideas on if there's anything special with the left edge of the screen that might cause this behaviour? Or am I doing something stupid that might be the cause of this?
This is part of my touch handling code that triggers the tool tip rendering:
override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
    if !shouldIgnoreTouches() {
        for touch in touches {
            let location = touch.locationInView(self.inputView)
            // pass coordinates to offset service to find candidate keys
            let keyArray = keyOffsetService.getKeys(_keyboardLayout!, location: location)
            let primaryKey = keyArray[0]
            if primaryKey.alphaNumericKey != nil {
                let layers = findLayers(touch)
                if layers.keyLayer != nil {
                    graphicsService.animateKeyDown(layers.keyLayer as! CATextLayer, shieldLayer: layers.shieldLayer)
                    _shieldsUp.append((textLayer: layers.keyLayer, shieldLayer: layers.shieldLayer))
                }
            }
        }
    }
}
animation code:
func animateKeyDown(layer: CATextLayer, shieldLayer: CALayer?) {
    if let sLayer = shieldLayer {
        keyDownShields(layer, shieldLayer: sLayer)
        CATransaction.begin()
        CATransaction.setDisableActions(true)

        let fontSizeAnim = CABasicAnimation(keyPath: "fontSize")
        fontSizeAnim.removedOnCompletion = true
        fontSizeAnim.fromValue = layer.fontSize
        fontSizeAnim.toValue = layer.fontSize * 0.9
        layer.fontSize = layer.fontSize * 0.9

        let animation = CABasicAnimation(keyPath: "opacity")
        animation.removedOnCompletion = true
        animation.fromValue = layer.opacity
        animation.toValue = 0.3
        layer.opacity = 0.3

        let animGroup = CAAnimationGroup()
        animGroup.animations = [fontSizeAnim, animation]
        animGroup.duration = 0.01
        layer.addAnimation(animGroup, forKey: "down")

        CATransaction.commit()
    }
}
unhide tooltip layer:
private func keyDownShields(layer: CATextLayer, shieldLayer: CALayer) {
    shieldLayer.hidden = false
    shieldLayer.setValue(true, forKey: "isUp")
    shieldLayer.zPosition = 1
    shieldLayer.removeAllAnimations()
    layer.setValue(true, forKey: "isUp")
}
This is caused by a feature in iOS 9 which allows the user to switch apps by force pressing the left edge of the screen while swiping right.
You can turn this off by disabling 3D Touch, but this is hardly a solution.
I am not aware of any API that allows you to override this behavior.
The official solution is overriding preferredScreenEdgesDeferringSystemGestures of your UIInputViewController.
https://developer.apple.com/documentation/uikit/uiviewcontroller/2887512-preferredscreenedgesdeferringsys
However, it doesn't seem to work on iOS 13, at least. As far as I understand, that happens because preferredScreenEdgesDeferringSystemGestures does not work properly when overridden inside a UIInputViewController.
When you override this property in a regular view controller, it works as expected:
override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
    return [.left, .bottom, .right]
}
That's not the case for UIInputViewController, though.
UPD: It appears gesture recognizers still get the .began state update without the delay. So, instead of following the rather messy solution below, you can add a custom gesture recognizer to handle touch events.
You can quickly test this by adding a UILongPressGestureRecognizer with minimumPressDuration = 0 to your control view.
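For example, a minimal sketch of that idea (KeyView and the showToolTip()/hideToolTip() helpers are placeholders, not part of the original answer):
import UIKit

final class KeyView: UIView {

    override init(frame: CGRect) {
        super.init(frame: frame)
        // minimumPressDuration = 0 makes the recognizer report .began on touch down,
        // even while the system defers raw touch delivery near the screen edges.
        let press = UILongPressGestureRecognizer(target: self, action: #selector(handlePress(_:)))
        press.minimumPressDuration = 0
        addGestureRecognizer(press)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    @objc private func handlePress(_ recognizer: UILongPressGestureRecognizer) {
        switch recognizer.state {
        case .began:
            showToolTip()   // hypothetical touch-down effect
        case .ended, .cancelled:
            hideToolTip()   // hypothetical touch-up effect
        default:
            break
        }
    }

    private func showToolTip() { /* render the tool tip layer */ }
    private func hideToolTip() { /* remove the tool tip layer */ }
}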
Another solution:
My original workaround was calling touch down effects inside hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView?, which is called even when the touches are delayed for the view.
You have to ignore the "real" touch down event, which fires about 0.4 s later or simultaneously with the touch up inside event. Also, it's probably better to apply this hack only when the tested point is inside the ~20 pt lateral margins.
So, for example, for a view whose width equals the screen width, the implementation may look like this:
let edgeProtectedZoneWidth: CGFloat = 20

override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
    let result = super.hitTest(point, with: event)
    guard result == self else {
        return result
    }
    if point.x < edgeProtectedZoneWidth || point.x > bounds.width - edgeProtectedZoneWidth {
        if !alreadyTriggeredFocus {
            isHighlighted = true
        }
        triggerFocus()
    }
    return result
}

private var alreadyTriggeredFocus: Bool = false

@objc override func triggerFocus() {
    guard !alreadyTriggeredFocus else { return }
    super.triggerFocus()
    alreadyTriggeredFocus = true
}

override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesCancelled(touches, with: event)
    alreadyTriggeredFocus = false
}

override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesEnded(touches, with: event)
    alreadyTriggeredFocus = false
}
...where triggerFocus() is the method you call on the touch down event. Alternatively, you may override touchesBegan(_:with:).
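For the alternative, a minimal sketch (assuming the same triggerFocus() touch-down handler as above):
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesBegan(touches, with: event)
    // Run the touch-down effect from the touch event instead of hitTest.
    triggerFocus()
}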
I'm trying to create a "button touch" effect for one of my sprites. It works well, but when I tap with 2 or more fingers at the same time, I get really weird results. Here is my code:
let buttonPressAction = SKAction.scaleBy(0.8, duration: 0)
var button = SKNode()

override func didMoveToView(view: SKView) {
    // assign sprite to node
    button = self.childNodeWithName("button") as! SKSpriteNode
}

override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
    for touch: AnyObject in touches {
        let location = touch.locationInNode(self)
        if button.containsPoint(location) {
            button.runAction(buttonPressAction)
        }
    }
}

override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) {
    button.runAction(buttonPressAction.reversedAction())
}
Try changing the SK scale action from
...scaleBy
to
...scaleTo
to ensure it will always scale to the same size. With scaleBy it will scale it by 0.8, not to 0.8. That most likely causes the weird results on multiple touches because you are scaling by 0.8 for each finger/tap.
I have never used reversedAction before, so I am not sure if that might cause issues. If it does, just reset the button by scaling it back to 1:
...scaleTo(1, duration: 0)
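For example, a sketch of that change using the question's button node and Swift 2 era SKAction names (buttonReleaseAction is just a name introduced here):
// Scale to an absolute size so repeated taps don't compound the effect.
let buttonPressAction = SKAction.scaleTo(0.8, duration: 0)
let buttonReleaseAction = SKAction.scaleTo(1.0, duration: 0)

// touch down:
button.runAction(buttonPressAction)
// touch up:
button.runAction(buttonReleaseAction)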
As a side note, you can just write
for touch in touches
instead of
for touch: AnyObject in touches
Hi, I'm having difficulty finding the correct code for my sprite to allow a small jump when the screen is only tapped and a higher jump when the finger is on the screen longer. (Please find my current code below.)
override func touchesBegan(touches: Set<NSObject>, withEvent event: UIEvent) {
    /* Called when a touch begins */
    if (gameOver == 0) {
        // Player begins jumping.
        player.physicsBody?.applyImpulse(CGVectorMake(0, 200))
        player.runAction(SKAction.playSoundFileNamed("sounds/Jump.caf", waitForCompletion: true))
    }
}

override func touchesEnded(touches: Set<NSObject>, withEvent event: UIEvent) {
    if (gameOver == 0) {
        // Player ends jump.
        player.physicsBody?.applyImpulse(CGVectorMake(0, -120))
    }
}
You can use the update method. In your touchesBegan method you set a boolean (or something like that) to tell the update method that you are still pressing on the screen. For example:
// touchesBegan
touching = true

// update method
if touching {
    player.physicsBody?.applyImpulse(CGVectorMake(0, 1))
}

// touchesEnded
touching = false
You will have to adjust the applyImpulse values so that they fit your needs.
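Putting it together, a minimal sketch of that pattern might look like the following (Swift 2 era syntax; the player node, physics setup, and impulse values are placeholders to tune for your game):
import SpriteKit

class GameScene: SKScene {
    let player = SKSpriteNode(color: SKColor.whiteColor(), size: CGSizeMake(32, 32))
    var touching = false

    override func didMoveToView(view: SKView) {
        player.physicsBody = SKPhysicsBody(rectangleOfSize: player.size)
        player.position = CGPointMake(CGRectGetMidX(frame), CGRectGetMidY(frame))
        addChild(player)
    }

    override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
        touching = true
        // Small initial hop so a quick tap still produces a short jump.
        player.physicsBody?.applyImpulse(CGVectorMake(0, 50))
    }

    override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) {
        touching = false
    }

    override func update(currentTime: CFTimeInterval) {
        // While the finger stays down, keep adding a small upward impulse,
        // so a longer press produces a higher jump. Tune the values to taste.
        if touching {
            player.physicsBody?.applyImpulse(CGVectorMake(0, 1))
        }
    }
}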
It's actually really simple: whenever the player lets go of the screen he will fall, and if he holds he will get the full height of the 200 impulse; otherwise he falls sooner.
override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
    sprite.physicsBody?.applyImpulse(CGVector(dx: 0, dy: 200))
}

override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) {
    sprite.physicsBody?.applyImpulse(CGVector(dx: 0, dy: -57))
}
So I have a SpriteNode that jumps up and to the right when tapped. That's all great, but when it jumps it lands to the right. If I keep tapping it, it will jump off the screen.
I need the sprite to jump up and to the right and return to the place where it started. A kind of diagonal jump rather than an arc. How do I do this?
My code:
override func touchesBegan(touches: NSSet, withEvent event: UIEvent) {
    /* Called when a touch begins */
    for touch: AnyObject in touches {
        let location = touch.locationInNode(self)
        savior.physicsBody?.velocity = CGVectorMake(0, 0)      // pt it returns to?
        savior.physicsBody?.applyImpulse(CGVectorMake(5, 30))  // direction/height that it jumps
        if UIDevice().iPhone6plus {
            savior.physicsBody?.velocity = CGVectorMake(0, 0)      // pt it returns to?
            savior.physicsBody?.applyImpulse(CGVectorMake(0, 12))  // direction/height that it jumps
        }
        if UIDevice().iPad2 {
            savior.physicsBody?.velocity = CGVectorMake(0, 0)       // pt it returns to?
            savior.physicsBody?.applyImpulse(CGVectorMake(0, 140))  // direction/height that it jumps
        }
    }
}
How can I recognize continuous user touch in Swift code? By continuous I mean that the user has her finger on the screen. I would like to move a Sprite Kit node in the direction of the user's touch for as long as the user is touching the screen.
The basic steps
Store the location of the touch events (touchesBegan/touchesMoved)
Move sprite node toward that location (update)
Stop moving the node when touch is no longer detected (touchesEnded)
Here's an example of how to do that
Xcode 8
let sprite = SKSpriteNode(color: SKColor.white, size: CGSize(width: 32, height: 32))
var touched: Bool = false
var location = CGPoint.zero

override func didMove(to view: SKView) {
    /* Add a sprite to the scene */
    sprite.position = CGPoint(x: 0, y: 0)
    self.addChild(sprite)
}

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    touched = true
    for touch in touches {
        location = touch.location(in: self)
    }
}

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    for touch in touches {
        location = touch.location(in: self)
    }
}

override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    // Stop node from moving to touch
    touched = false
}

override func update(_ currentTime: TimeInterval) {
    // Called before each frame is rendered
    if touched {
        moveNodeToLocation()
    }
}

// Move the node to the location of the touch
func moveNodeToLocation() {
    // Compute vector components in direction of the touch
    var dx = location.x - sprite.position.x
    var dy = location.y - sprite.position.y
    // How fast to move the node. Adjust this as needed
    let speed: CGFloat = 0.25
    // Scale vector
    dx = dx * speed
    dy = dy * speed
    sprite.position = CGPoint(x: sprite.position.x + dx, y: sprite.position.y + dy)
}
Xcode 7
let sprite = SKSpriteNode(color: SKColor.whiteColor(), size: CGSizeMake(32, 32))
var touched: Bool = false
var location = CGPointMake(0, 0)

override func didMoveToView(view: SKView) {
    self.scaleMode = .ResizeFill
    /* Add a sprite to the scene */
    sprite.position = CGPointMake(100, 100)
    self.addChild(sprite)
}

override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
    /* Start moving node to touch location */
    touched = true
    for touch in touches {
        location = touch.locationInNode(self)
    }
}

override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?) {
    /* Update to new touch location */
    for touch in touches {
        location = touch.locationInNode(self)
    }
}

override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) {
    // Stop node from moving to touch
    touched = false
}

override func update(currentTime: CFTimeInterval) {
    /* Called before each frame is rendered */
    if touched {
        moveNodeToLocation()
    }
}

// Move the node to the location of the touch
func moveNodeToLocation() {
    // How fast to move the node
    let speed: CGFloat = 0.25
    // Compute vector components in direction of the touch
    var dx = location.x - sprite.position.x
    var dy = location.y - sprite.position.y
    // Scale vector
    dx = dx * speed
    dy = dy * speed
    sprite.position = CGPointMake(sprite.position.x + dx, sprite.position.y + dy)
}
The most difficult thing about this process is tracking single touches within a multitouch environment. The issue with the "simple" solution to this (i.e., turn "istouched" on in touchesBegan and turn it off in touchesEnded) is that if the user touches another finger on the screen and then lifts it, it will cancel the first touch's actions.
To make this bulletproof, you need to track individual touches over their lifetime. When the first touch occurs, you save the location of that touch and move the object towards that location. Any further touches should be compared to the first touch, and should be ignored if they aren't the first touch. This approach also allows you to handle multitouch, where the object could be made to move towards any finger currently on the screen, and then move to the next finger if the first one is lifted, and so on.
It's important to note that UITouch objects are constant across touchesBegan, touchesMoved, and touchesEnded. You can think of a UITouch object as being created in touchesBegan, altered in touchesMoved, and destroyed in touchesEnded. You can track the phase of a touch over the course of its life by saving a reference to the touch object to a dictionary or an array as it is created in touchesBegan, then in touchesMoved you can check the new location of any existing touches and alter the object's course if the user moves their finger (you can apply tolerances to prevent jitter, e.g., if the x/y distance is less than some tolerance, don't alter the course). In touchesEnded you can check if the touch in focus is the one that ended, and cancel the object's movement, or set it to move towards any other touch that is still occurring. This is important, as if you just check for any old touch object ending, this will cancel other touches as well, which can produce unexpected results.
This article is in Obj-C, but the code is easily ported to Swift and shows you what you need to do, just check out the stuff under "Handling a Complex Multitouch Sequence": https://developer.apple.com/library/ios/documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/multitouch_background/multitouch_background.html
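As a rough sketch of that approach (Swift 3 syntax; trackedTouch and targetLocation are names introduced here for illustration, not from the article):
import SpriteKit

class GameScene: SKScene {
    private var trackedTouch: UITouch?         // the single touch driving the movement
    private var targetLocation = CGPoint.zero  // where the node should move toward

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Only adopt a touch if we aren't already tracking one.
        if trackedTouch == nil, let touch = touches.first {
            trackedTouch = touch
            targetLocation = touch.location(in: self)
        }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Ignore any touch that isn't the one we're tracking.
        guard let tracked = trackedTouch, touches.contains(tracked) else { return }
        targetLocation = tracked.location(in: self)
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Only stop when the tracked touch itself ends; other fingers lifting are ignored.
        guard let tracked = trackedTouch, touches.contains(tracked) else { return }
        trackedTouch = nil
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        touchesEnded(touches, with: event)
    }
}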
Below is the code to drag the node around on the X position (left and right); it is very easy to add the Y position and do the same thing.
let item = SKSpriteNode(imageNamed: "xx")
var itemXposition = 50

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    // updates the itemXposition variable on every touch
    for touch in touches {
        let location = touch.location(in: self)
        itemXposition = Int(location.x)
    }
}

// this function is called for each frame render and updates the position on screen
override func update(_ currentTime: TimeInterval) {
    item.position = CGPoint(x: self.itemXposition, y: 50)
}