Error: Argument labels '(_:, _:)' do not match any available overloads - iphone

When I put the following code in my project
self.physicsWorld.gravity = CGVector(CGFloat((data?.acceleration.x)!) * 10, CGFloat((data?.acceleration.y)!) * 10)
I get an error
Argument labels '(_:, _:)' do not match any available overloads

Whenever you want to know more about a method or property, go to the documentation. For the initializer of CGVector, here are the docs.
Look at the initializer's declaration:
init(dx: CGFloat, dy: CGFloat)
Note the argument labels dx and dy. This means that when you call the initializer, you have to include those labels in the call.
self.physicsWorld.gravity = CGVector(
dx: CGFloat((data?.acceleration.x)!) * 10,
dy: CGFloat((data?.acceleration.y)!) * 10)

You aren't calling CGVector's initializer correctly; it takes labeled arguments:
CGVector(dx: CGFloat, dy: CGFloat)
self.physicsWorld.gravity = CGVector(
dx: CGFloat((data?.acceleration.x)!) * 10,
dy: CGFloat((data?.acceleration.y)!) * 10)
This should work.
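As a side note, the force unwraps will crash whenever the accelerometer has no reading yet. A minimal sketch that unwraps first (assuming data is an optional CMAccelerometerData from Core Motion, which the accessors suggest):
if let acceleration = data?.acceleration {
    // CMAcceleration's components are Doubles; convert them for CGVector
    self.physicsWorld.gravity = CGVector(dx: CGFloat(acceleration.x) * 10,
                                         dy: CGFloat(acceleration.y) * 10)
}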

CGPoint init - No exact matches in call to initializer error
What's wrong with my code?
let ctx: CGContext = UIGraphicsGetCurrentContext()!
ctx.setLineWidth(1/UIScreen.main.scale);
ctx.setLineCap(.square)
ctx.setLineJoin(.round)
ctx.setStrokeColor(UIColor.black.cgColor)
for i in 0...3 {
    ctx.move(to: CGPoint(x: 0, y: 40*i))
    ctx.addLine(to: CGPoint(x: UIScreen.main.bounds.size.width, y: 40*i))
}
In the addLine call you pass a CGFloat to x and an Int to y, so Swift is looking in vain for a CGPoint(x: CGFloat, y: Int) initializer.
CGPoint(x: UIScreen.main.bounds.width, y: CGFloat(40 * i))
This should work.
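For completeness, a sketch of the whole loop with consistent CGFloat types and a strokePath() call, without which nothing is drawn (this assumes the code runs inside a UIView's draw(_:), so UIGraphicsGetCurrentContext() is non-nil):
let ctx: CGContext = UIGraphicsGetCurrentContext()!
ctx.setLineWidth(1 / UIScreen.main.scale)
ctx.setLineCap(.square)
ctx.setLineJoin(.round)
ctx.setStrokeColor(UIColor.black.cgColor)
for i in 0...3 {
    let y = CGFloat(40 * i)   // compute the offset once, as a CGFloat
    ctx.move(to: CGPoint(x: 0, y: y))
    ctx.addLine(to: CGPoint(x: UIScreen.main.bounds.width, y: y))
}
ctx.strokePath()              // actually render the lines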

Unable to create 108 dots in a circle

I am trying to create 108 dots in a circle and I am getting an "Index out of range" error. The dots are created using an array of UILabels.
I have the following code:
func createMala() {
    let malaFrame = UIView()
    malaFrame.frame = CGRect(x: 0, y: 0, width: view.frame.width - 20, height: view.frame.width - 20)
    malaFrame.center = CGPoint(x: view.frame.width / 2.0, y: (malaFrame.frame.height / 2.0) + 20)
    var malaBeadLabel = [RoundLabel]()
    let malaRadius: Double = 100.0
    let angleInRadians: Double = 3.3333 * .pi / 180.0
    for i in 1...108 {
        malaBeadLabel[i].frame = CGRect(x: (malaRadius * sin(angleInRadians) * Double(i)),
                                        y: (malaRadius * cos(angleInRadians) * Double(i)),
                                        width: 2.0, height: 2.0)
        malaBeadLabel[i].layer.cornerRadius = 1.0
        malaBeadLabel[i].layer.borderWidth = 0.25
        malaBeadLabel.append(malaBeadLabel[i])
        malaFrame.addSubview(malaBeadLabel[i])
    }
}
I cannot figure out why the index is out of range.
You start with an empty array named malaBeadLabel.
Then during the first iteration of the loop, when i is 1, you try malaBeadLabel[i]. This of course causes the error because malaBeadLabel is empty and there is nothing at index 1 (or 0, or any other index).
On top of that, you never actually attempt to create any instances of RoundLabel.
Change your loop so it doesn't try to read from the array at all: create each label instance, then append it to the array.
for i in 1...108 {
    let label = RoundLabel(frame: CGRect(x: (malaRadius * cos(angleInRadians * Double(i))),
                                         y: (malaRadius * sin(angleInRadians * Double(i))),
                                         width: 2.0, height: 2.0))
    label.layer.cornerRadius = 1.0
    label.layer.borderWidth = 0.25
    malaBeadLabel.append(label)
    malaFrame.addSubview(label)
}
BTW - why isn't the code to set the label's corner radius and border width inside the RoundLabel class?
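Something like this, as a sketch (RoundLabel is the asker's own class, so its exact contents are an assumption):
class RoundLabel: UILabel {
    override init(frame: CGRect) {
        super.init(frame: frame)
        // Keep the appearance setup in one place instead of repeating it in the loop
        layer.cornerRadius = 1.0
        layer.borderWidth = 0.25
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}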
I was able to figure this out. I added the UIBezierPath's radius to both the x and y coordinates of the UILabels around the circle, and it worked.
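The exact change isn't shown, but it presumably amounts to offsetting each dot by the circle's center so the ring sits inside malaFrame rather than around its origin. A sketch under that assumption:
let center = CGPoint(x: malaFrame.bounds.midX, y: malaFrame.bounds.midY)
for i in 1...108 {
    let angle = angleInRadians * Double(i)
    // Shift each dot by the center point so the whole ring is visible
    let x = Double(center.x) + malaRadius * cos(angle)
    let y = Double(center.y) + malaRadius * sin(angle)
    let label = RoundLabel(frame: CGRect(x: x, y: y, width: 2.0, height: 2.0))
    malaBeadLabel.append(label)
    malaFrame.addSubview(label)
}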

CALayer bounds don't change after setting them to a certain value

I am trying to display an image as the content of a CALayer slightly zoomed in by changing its bounds to a bigger size. (This is so that I can pan over it later.)
For some reason, however, setting the bounds neither changes them visibly nor triggers an animation to do so.
This is the code I use to change the bounds:
self.imageLayer.bounds = CGRect(x: 0, y: 0, width: 10, height: 10)
I have a function to compute the CGRect, but even this dummy value leads to the same result: the size doesn't change.
I have also determined that, while I can't see the size change, if I check the layer's bounds right after setting them, they do have the value I set.
The following code is executed after setting the bounds. I couldn't find anything in it that changes them back.
self.imageLayer.add(self.generatePanAnimation(), forKey: "pan")
func generatePanAnimation() -> CAAnimation {
    var positionA = CGPoint(x: (self.bounds.width / 2), y: self.bounds.height / 2)
    var positionB = CGPoint(x: (self.bounds.width / 2), y: self.bounds.height / 2)
    positionA = self.generateZoomedPosition()
    positionB = self.generateZoomedPosition()

    let panAnimation = CABasicAnimation(keyPath: "position")
    if self.direction == .AtoB {
        panAnimation.fromValue = positionA
        panAnimation.toValue = positionB
    } else {
        panAnimation.fromValue = positionB
        panAnimation.toValue = positionA
    }
    panAnimation.duration = self.panAndZoomDuration
    self.panAnimation = panAnimation
    return panAnimation
}

func generateZoomedPosition() -> CGPoint {
    let maxRight = self.zoomedImageLayerBounds.width / 2
    let maxLeft = self.bounds.width - (self.zoomedImageLayerBounds.height / 2)
    let maxUp = self.zoomedImageLayerBounds.height / 2
    let maxDown = self.bounds.height - (self.zoomedImageLayerBounds.height / 2)
    let horizontalFactor = CGFloat(arc4random()) / CGFloat(UINT32_MAX)
    let verticalFactor = CGFloat(arc4random()) / CGFloat(UINT32_MAX)
    let randomX = maxLeft + horizontalFactor * (maxRight - maxLeft)
    let randomY = maxDown + verticalFactor * (maxUp - maxDown)
    return CGPoint(x: randomX, y: randomY)
}
I even tried setting the bounds as shown below, but it didn't help.
CATransaction.begin()
CATransaction.setValue(true, forKey: kCATransactionDisableActions)
self.imageLayer.bounds = CGRect(x: 0, y: 0, width: 10, height: 10)
CATransaction.commit()
I really hope someone has an idea. Thanks a lot!
The way to change the apparent drawing size of a layer is not to change its bounds but to change its transform. To make the layer look larger, including its drawing, apply a scale transform.
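For example, a minimal sketch (the 1.2 scale factor is just an illustrative value):
// Scale the layer up around its anchor point; its drawn content scales with it
self.imageLayer.transform = CATransform3DMakeScale(1.2, 1.2, 1.0)
// ...and to return to the normal size later:
// self.imageLayer.transform = CATransform3DIdentity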

Need help incorporating randoms in Swift

I am trying to place my sprites at random positions on the screen, but I get "CGFloat is not convertible to Double".
var randomNumber = arc4random()
bat1 = SKSpriteNode(imageNamed: "rsz_silverbat.png")
bat1.position = CGPoint(x: self.frame.size.width * 0.1, y: self.frame.size.height * randomNumber)
arc4random() returns a UInt32, and Swift won't implicitly multiply a CGFloat by a UInt32, so convert the operands explicitly:
bat1.position = CGPoint(x: self.view.frame.size.width * CGFloat(0.1), y: self.view.frame.size.height * CGFloat(randomNumber))
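Note that CGFloat(randomNumber) can be anywhere up to roughly 4.29 billion, so the sprite will usually end up far off screen. A sketch that keeps the y position inside the scene (using arc4random_uniform, which is just one way to do it):
let randomY = CGFloat(arc4random_uniform(UInt32(self.frame.size.height)))
bat1.position = CGPoint(x: self.frame.size.width * 0.1, y: randomY)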

Float is not convertible to CGFloat and CGFloat is not convertible to Float

I have a problem with Swift on Xcode 6 beta 4 that is driving me nuts. I'm developing an iOS 8 game and have followed a tutorial, but I get "Float is not convertible to CGFloat", and then when I recast as CGFloat I get the other error. Here is the code:
override func didMoveToView(view: SKView) {
    /* Setup your scene here */
    //physics
    self.physicsWorld.gravity = CGVectorMake(0.0, -5.0)
    // Bird
    var birdTexture = SKTexture(imageNamed: "kirby")
    birdTexture.filteringMode = SKTextureFilteringMode.Nearest
    bird.setScale(0.5)
    bird.position = CGPoint(x: self.frame.size.width * 0.35, y: self.frame.size.height * 0.6)
    bird.physicsBody = SKPhysicsBody(circleOfRadius: bird.size.height/2)
    bird.physicsBody.dynamic = true
    bird.physicsBody.allowsRotation = false
    self.addChild(bird)
    //ground
    ground.setScale(2.0)
    ground.position = CGPointMake(self.size.width/2, ground.size.height/2)
    ground.physicsBody = SKPhysicsBody(rectangleOfSize: CGSizeMake(self.frame.size.width, ground.size.height))
    ground.physicsBody.dynamic = false
    self.addChild(ground)
    //pipes
    //create pipes
    //^
    //movement of the pipes
    let distanceToMove = CGFloat(self.frame.size.width + 2.0 * pipeUp.texture.size().width)
    let movePipe = SKAction.moveByX(-distanceToMove, y: CGFloat(0.0), duration: NSTimeInterval(0.01) * distanceToMove) // <-- this line is the problem
}
What is going on? With the line written as:
CGFloat(self.frame.size.width + 2.0 * pipeUp.texture.size().width)
I get the second error; then a recast:
Float(CGFloat(self.frame.size.width + 2.0 * pipeUp.texture.size().width))
or
Float(self.frame.size.width + 2.0 * pipeUp.texture.size().width)
gives me the first. So what does Swift want here? It feels like a catch-22. Any help?
EDIT: I'm using a 13-inch Mac (late 2009, Intel Core 2 Duo) on Mavericks OS X 10.9.4 (13E28), in case that helps. I saw something about build options possibly affecting the float types, but I don't know where they are.
Because you cast first and then multiply, the duration expression is interpreted as a CGFloat and not as an NSTimeInterval:
let movePipe = SKAction.moveByX(-distanceToMove, y: 0.0, duration: NSTimeInterval(0.01) * distanceToMove)
Try multiplying first and then casting the result to NSTimeInterval:
let movePipe = SKAction.moveByX(-distanceToMove, y: 0.0, duration: NSTimeInterval(0.01 * distanceToMove))
Also, as suggested already, you don't need to cast your distance. The following will work just fine:
let distanceToMove = self.frame.size.width + 2.0 * pipeUp.texture.size().width
Just make sure every operand has the same type. Literals (like 2.0) always have their type inferred. In your case, I see no problem because
let distanceToMove: CGFloat = self.frame.size.width + 2.0 * pipeUp.texture.size().width
has all the operands of type CGFloat so there is no need to cast at all.
On the next line, again make sure to use the correct types. You don't need to cast the literals:
SKAction.moveByX(-distanceToMove, y: 0.0, duration: 0.01 * NSTimeInterval(distanceToMove))
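Putting the two corrected lines together (a sketch in the same Swift 1.x-era syntax as the question, with pipeUp.texture assumed non-optional as written):
let distanceToMove = self.frame.size.width + 2.0 * pipeUp.texture.size().width
let movePipe = SKAction.moveByX(-distanceToMove, y: 0.0, duration: 0.01 * NSTimeInterval(distanceToMove))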