CGFloat * 2 compiles, CGFloat * Variable where variable is an integer fails - swift

I have to wrap my ints in CGFloat() to compile in Swift 2.3
If I just did * 2, then it would compile. Why does this happen?
Is this fixed in Swift 3?
var multiplier = CGFloat(3)
let y = collectionView.frame.origin.y + (cellSize() * multiplier)

Swift does not directly support mixed-type arithmetic. Check what the type of 2 is in that expression; it's probably not what you assume. Your use of CGFloat() converts the value so that both operands of * have the same type.
HTH

Type inference. When you write * 2 without explicitly declaring 2 as an Int, Swift infers it to be a CGFloat. However, once you've declared your variable to be an Int, you can't multiply a CGFloat by an Int.
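A minimal sketch of the two usual fixes, assuming cellSize() returns a CGFloat and collectionView is the question's collection view:
var multiplier: CGFloat = 3    // declare the multiplier as a CGFloat up front
let y1 = collectionView.frame.origin.y + (cellSize() * multiplier)
// ...or keep an Int and convert it at the point of use:
var count = 3
let y2 = collectionView.frame.origin.y + (cellSize() * CGFloat(count))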

Related

Why Int and Float literals are allowed to be added, but Int and Float variables are not allowed to do the same in Swift?

I tried adding an Int and a Float literal in Swift and it compiled without any error:
var sum = 4 + 5.0 // sum is assigned with value 9.0 and type Double
But, when I tried to do the same with Int and Float variables, I got a compile-time error and I had to type-cast any one operand to the other one's type for it to work:
var i: Int = 4
var f: Float = 5.0
var sum = i + f // Binary operator '+' cannot be applied to operands of type 'Int' and 'Float'
Why is this happening? Is it related to type safety in any way?
If you want Double result:
let i: Int = 4
let f: Float = 5.0
let sum = Double(i) + Double(f)
print("This is the sum:", sum)
If you want Int result:
let i: Int = 4
let f: Float = 5.0
let sum = i + Int(f)
print("This is the sum:", sum)
In the case of var sum = 4 + 5.0, the compiler automatically treats the literal 4 as a Double, because that is what is required to perform the operation.
The same happens if you write var x: Float = 4: the literal 4 is automatically treated as a Float.
In the second case, since you have explicitly declared the types of the variables, the compiler no longer has the freedom to change them to fit the expression.
For a solution, see Fabio's answer.
The document on Swift.org says:
Type inference is particularly useful when you declare a constant or variable with an initial value. This is often done by assigning a literal value (or literal) to the constant or variable at the point that you declare it. (A literal value is a value that appears directly in your source code, such as 42 and 3.14159 in the examples below.)
For example, if you assign a literal value of 42 to a new constant without saying what type it is, Swift infers that you want the constant to be an Int, because you have initialized it with a number that looks like an integer:
let meaningOfLife = 42 // meaningOfLife is inferred to be of type Int
Likewise, if you don't specify a type for a floating-point literal, Swift infers that you want to create a Double:
let pi = 3.14159 // pi is inferred to be of type Double
Swift always chooses Double (rather than Float) when inferring the type of floating-point numbers.
If you combine integer and floating-point literals in an expression, a type of Double will be inferred from the context:
let anotherPi = 3 + 0.14159 // anotherPi is also inferred to be of type Double
The literal value of 3 has no explicit type in and of itself, and so an appropriate output type of Double is inferred from the presence of a floating-point literal as part of the addition.
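A quick Playground illustration of the difference between literals and typed variables (nothing assumed beyond the standard library):
let literalSum = 4 + 5.0        // OK: both literals are inferred as Double, literalSum is 9.0
let i = 4                       // i is inferred as Int
// let mixed = i + 5.0          // error: binary operator '+' cannot be applied to Int and Double
let mixed = Double(i) + 5.0     // explicit conversion compiles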

swift 3 stride cannot invoke

var arr: [Double] = Array(stride(from: 0, through: 11, by: 1.0))
This code is OK, but if I write the following, a "cannot invoke" error appears:
var s = 11
var arr: [Double] = Array(stride(from: 0, through: s, by: 1.0))
In order for your stride statement to produce Double, the values passed to from, through and by must be Doubles.
In the first case, Swift infers the literals 0 and 11 to be Doubles since 1.0 is a Double and that is the only way they can match. This works because Double conforms to the ExpressibleByIntegerLiteral protocol which just means that you can initialize a Double with an integer literal and an integer literal can be inferred to be a Double if necessary.
In the second case, you have assigned 11 to s and Swift assigns s the type Int. So when you try to use that in the stride statement, the types don't match.
You can fix this in a number of ways, as sketched after this list:
Declare s to be a Double with var s: Double = 11. In this case, you've explicitly assigned the type of s, so Swift uses the ExpressibleByIntegerLiteral conformance of Double to initialize s.
Declare that 11 is a Double with var s = 11 as Double. Here you have told Swift that 11 is a Double which works because Double conforms to ExpressibleByIntegerLiteral. Swift then infers the type of s to also be Double.
Convert 11 to a Double with var s = Double(11). This uses an initializer of Double that takes an Int as input. Swift then infers the type of s to be Double from the value assigned to it.
Convert s when you use it with Double(s). Here you are explicitly using the Double initializer to create a Double with the Int s.
Declare s with a Double literal with var s = 11.0. Here, Swift infers the literal 11.0 to be of type Double and then infers the type of s to also be a Double since it is initialized by a Double.
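For example, either of these variants compiles and fills arr with Doubles (a minimal sketch of the options above):
var s: Double = 11        // explicit annotation; any of the other spellings above works too
var arr: [Double] = Array(stride(from: 0, through: s, by: 1.0))
// ...or keep s as an Int and convert where it is used:
var t = 11
var arr2: [Double] = Array(stride(from: 0, through: Double(t), by: 1.0))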

Swift - Binary operator cannot be applied to operands when converting degrees to radians

I'm aware of some relatively similar questions on this site, but if they do apply to my problem (which I'm not certain they do) then I certainly don't understand them. Here's my problem;
var degrees = UInt32()
var radians = Double()
degrees = arc4random_uniform(360)
radians = degrees * (M_PI / 180)
This returns an error on the multiplication sign, reading: "Binary operator '*' cannot be applied to operands of type 'UInt32' and 'Double'".
I'm fairly sure I need to have the degrees variable be of type UInt32 to randomise it, and also that the pi constant cannot be made to be of UInt32, or at least I don't know how, as I'm relatively new to Xcode and Swift in general.
I'd be very grateful if anyone had a solution to my problem.
Thanks in advance.
let degree = arc4random_uniform(360)
let radian = Double(degree) * .pi/180
You need to convert the degrees value to a Double before the multiplication.
from apple swift book:
Integer and Floating-Point Conversion
Conversions between integer and floating-point numeric types must be made explicit:
let three = 3
let pointOneFourOneFiveNine = 0.14159
let pi = Double(three) + pointOneFourOneFiveNine
// pi equals 3.14159, and is inferred to be of type Double
Here, the value of the constant three is used to create a new value of type Double, so that both sides of
the addition are of the same type. Without this conversion in place, the addition would not be allowed.
Floating-point to integer conversion must also be made explicit. An integer type can be initialized
with a Double or Float value:
let integerPi = Int(pi)
// integerPi equals 3, and is inferred to be of type Int
Floating-point values are always truncated when used to initialize a new integer value in this way.
This means that 4.75 becomes 4, and -3.9 becomes -3.
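A quick check of that truncation behaviour (nothing assumed beyond the standard library):
let a = Int(4.75)    // a is 4: the fractional part is dropped, not rounded
let b = Int(-3.9)    // b is -3: truncation is toward zero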

binary operator * cannot be applied to operands of type Int and Double

I'm trying to build a simple Swift app to calculate VAT (Value Added taxes = 20%).
func taxesFree(number: Int) -> Double {
    var textfield = self.inputTextField.text.toInt()!
    let VAT = 0.2
    var result = textfield * VAT
    return result
}
For some reason I keep getting
Binary operator * cannot be applied to operands of type Int and Double
on the line
var result = textfield * VAT
You should convert one type to the other so that both operands of * have the same type:
var result: Double = Double(textfield) * VAT
It's because you're trying to multiply an Int (textfield) by a Double (VAT). Since an implicit conversion could silently lose precision, Swift doesn't convert one type to the other for you, so you need to explicitly convert the Int to a Double ...
var result = Double(textfield) * VAT
The problem here is that the error message is literally true: Swift is strongly typed and doesn't coerce implicitly. I just had a similar case myself with "binary operator '-' cannot be applied to operands of type 'Date' and 'Int'".
If you write:
var result = 10 * 0.2
...that's fine, but if you write:
var number = 10
var result = number * 0.2
...that's not fine. This is because untyped literal values have an appropriate type selected by the compiler, so in fact the first line is taken as being var result = Double(10) * Double(0.2). After all, as a human being you might mean 10 to be floating-point or an integer - you normally wouldn't say which and would expect that to be clear from context. It might be a bit of a pain, but the idea of strong types is that after the code is parsed it can only have one valid compiled expression.
In general you would build a new value using the constructor, so var result = Double(textfield) * VAT in your case. This is different from casting (textfield as Double) because Int is not a subclass of Double; what you are doing instead is asking for a completely new Double value to be built at runtime, losing some accuracy if the value is very high or low. This is what loosely typed languages do implicitly with pretty much all immediate values, at a small but significant time cost.
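For illustration (a hypothetical snippet; the names are made up):
let n: Int = 42
// let d = n as Double    // error: Int cannot be cast to Double; they are unrelated types
let d = Double(n)         // builds a brand-new Double value, 42.0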
In your specific case, it wasn't valuable to have an Int in the first place (even if no fraction part is possible) so what you needed was:
func taxesFree(number: Int) -> Double {
    var textfield = Double(self.inputTextField.text!)!
    let VAT = 0.2
    var result = textfield * VAT
    return result
}
In my case it was just converting to CGFloat:
self.cnsMainFaqsViewHight.constant = CGFloat(self.mainFaqs.count) * 44.0
You can convert it like this:
var result: Double = Double(textfield)
I was misunderstanding the Closed Range Operator in Swift.
You should not wrap the range in an array: [0...10]
for i in [0...10] {
    // error: binary operator '+' cannot be applied to operands of type 'CountableClosedRange<Int>' and 'Int'
    let i = i + 1
}

for i in 0...10 {
    // ok!
    let i = i + 1
}
The range is a collection that can itself be iterated. No need to wrap it in an array, as perhaps you would have in Objective-C.
0...3 -> [0, 1, 2, 3]
[0...3] -> [[0, 1, 2, 3]]
Once you realize your object is a nested collection, rather than an array of Ints, it's easy to see why you cannot use numeric operators on the object.
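You can see the nesting directly in a Playground (a small sketch):
for element in [0...3] {
    print(element)      // prints "0...3" once: the single element is a range, not an Int
}
for n in 0...3 {
    print(n)            // prints 0, 1, 2, 3 on separate lines
}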
This worked for me when I got the same error message in Playground:
func getMilk(howManyCartons: Int) {
    print("Buy \(howManyCartons) cartons of milk")
    let priceToPay: Float = Float(howManyCartons) * 2.35
    print("Pay $\(priceToPay)")
}
getMilk(howManyCartons: 2)

Could not find an overload for 'init' that accepts the supplied arguments

These two lines of code are giving me the
Could not find an overload for 'init' that accepts the supplied arguments
error:
var w = Int(self.bounds.size.width / Float(worldSize.width))
var h = Int(self.bounds.size.height / Float(worldSize.height))
The error message is misleading. This should work:
var w = Int(self.bounds.size.width / CGFloat(worldSize.width))
var h = Int(self.bounds.size.height / CGFloat(worldSize.height))
The width and height elements of CGSize are declared as CGFloat. On the 64-bit platform, CGFloat is the same as Double and is 64 bits wide, whereas Float is only 32 bits. So the problem is the division operator, which requires two operands of the same type. In contrast to (Objective-)C, Swift never implicitly converts values to a different type.
If worldSize is also a CGSize then you do not need a cast at all:
var w = Int(self.bounds.size.width / worldSize.width)
var h = Int(self.bounds.size.height / worldSize.height)
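A self-contained sketch of the fixed version, assuming worldSize is a CGSize (the concrete numbers are made up):
import CoreGraphics

let bounds = CGRect(x: 0, y: 0, width: 320, height: 480)
let worldSize = CGSize(width: 32, height: 32)
let w = Int(bounds.size.width / worldSize.width)      // 10, both operands are CGFloat
let h = Int(bounds.size.height / worldSize.height)    // 15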