I thought I had a good understanding of Doubles and Ints until I accidentally ran into the following code. To my surprise, it works just fine:
let amounts = [50, 5.0, 10]
var total = 0.0
for i in 0..<amounts.count {
    total += amounts[i]
    print("Total: \(total)")
}
... but it stops working if I change the 5.0 to 10, to 50, or even to 5, and it generates the following error:
error: binary operator '+=' cannot be applied to operands of type 'Double' and 'Int'
Can someone tell me why it is that the code doesn't break when mixing 50, 5.0 and 10? I was under the impression that this wouldn't work.
As you know, Swift is very strict with types, but there's one area where it's not so strict - literals. Double conforms to ExpressibleByIntegerLiteral, so you could do:
let x: Double = 1 // "1" is "magically" converted to a Double!?
and have it compile. The same with arrays - the compiler thinks that the array literal that you have:
[50, 5.0, 10]
is a [Double], because it can convert both 50 and 10 to Double. It can't be an [Int], because 5.0 can't be converted to an Int (Int does not conform to ExpressibleByFloatLiteral).
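A quick check in a playground confirms the inference (the variable name is illustrative):
let mixed = [50, 5.0, 10]
print(type(of: mixed))  // Array<Double>; 50 and 10 were converted to 50.0 and 10.0
// let ints: [Int] = [50, 5.0, 10]  // error: 5.0 cannot be converted to Int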
The line:
total += amounts[i]
only works when both sides are of the same type. Note that here, the compiler will not try to convert from Int to Double because the expressions involved (total and amounts[i]) are not literals!
If you change the array literal to [50, 10, 10], all elements are Int, so the compiler infers the array to be [Int], and amounts[i] becomes an Int, causing the line to fail compilation.
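The distinction is easy to reproduce; a minimal sketch:
let a: Double = 1          // fine: the literal 1 is converted to a Double
let b = 1                  // b is inferred to be an Int
// let c: Double = b       // error: b is an expression, not a literal, so no conversion
let c: Double = Double(b)  // an explicit conversion is required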
Arrays in Swift hold elements of a single type, so when you mix 50, 5.0 and 10, the compiler infers the array to be of type [Double].
In the working code the array is a [Double]; when you change 5.0 to 10, the array becomes an [Int] because of Swift's type inference. Thus:
Binary operator '+=' cannot be applied to operands of type 'Double' and 'Int'
The Swift compiler doesn't define this operation between two different types such as Int and Double. You can make it work by converting the Int values to Double, as in this code:
let amounts = [50, 5, 10]
var total = 0.0
for i in 0..<amounts.count {
    total = total + Double(amounts[i])
    print("Total: \(total)")
}
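As a side note, the same total can be computed without indexing; a sketch using reduce on the same array:
let amounts = [50, 5, 10]
let total = amounts.reduce(0.0) { $0 + Double($1) }  // 65.0
print("Total: \(total)")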
Related
I tried adding an Int and a Float literal in Swift and it compiled without any error:
var sum = 4 + 5.0 // sum is assigned with value 9.0 and type Double
But when I tried to do the same with Int and Float variables, I got a compile-time error, and I had to convert one operand to the other's type for it to work:
var i: Int = 4
var f: Float = 5.0
var sum = i + f // Binary operator '+' cannot be applied to operands of type 'Int' and 'Float'
Why is this happening? Is it related to type safety in any way?
If you want Double result:
let i: Int = 4
let f: Float = 5.0
let sum = Double(i) + Double(f)
print("This is the sum:", sum)
If you want Int result:
let i: Int = 4
let f: Float = 5.0
let sum = i + Int(f)
print("This is the sum:", sum)
In the case of var sum = 4 + 5.0, the compiler automatically treats 4 as a Double, because that is what is required to perform the operation. The same happens if you write var x: Float = 4: the 4 is automatically converted to a Float.
In the second case, since you have explicitly defined the type of each variable, the compiler does not have the freedom to change it as per the requirement.
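A minimal sketch of the difference (the names are illustrative):
let sum = 4 + 5.0         // OK: the literal 4 is inferred as Double; sum == 9.0
let i: Int = 4
// let bad = i + 5.0      // error: i already has the fixed type Int
let ok = Double(i) + 5.0  // compiles once the conversion is explicit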
For a solution, look at @Fabio's answer.
The document on Swift.org says:
Type inference is particularly useful when you declare a constant or variable with an initial value. This is often done by assigning a literal value (or literal) to the constant or variable at the point that you declare it. (A literal value is a value that appears directly in your source code, such as 42 and 3.14159 in the examples below.)
For example, if you assign a literal value of 42 to a new constant without saying what type it is, Swift infers that you want the constant to be an Int, because you have initialized it with a number that looks like an integer:
let meaningOfLife = 42 // meaningOfLife is inferred to be of type Int
Likewise, if you don’t specify a type for a floating-point literal, Swift infers that you want to create a Double:
let pi = 3.14159 // pi is inferred to be of type Double
Swift always chooses Double (rather than Float) when inferring the type of floating-point numbers.
If you combine integer and floating-point literals in an expression, a type of Double will be inferred from the context:
let anotherPi = 3 + 0.14159 // anotherPi is also inferred to be of type Double
The literal value of 3 has no explicit type in and of itself, and so an appropriate output type of Double is inferred from the presence of a floating-point literal as part of the addition.
I was trying to implement a small iteration which prints the square of each number in a range, which should be the equivalent of this Python script:
for i in range(n):
    print(i*i)
In Swift I tried
first attempt
let numbers = [1..<10]
for i in numbers {
    print(i*i)
}
and
second attempt
let numbers = [1..<10]
for i in numbers {
    var j: Int = i
    print(j*j)
}
but then the compiler says
Cannot convert value of type 'Range<Int>' to specified type 'Int'
I understand from my Python experience that this is due to different types in Swift. Thus my questions are:
How can I fix this (i.e. implement the same thing I did in Python)?
What are the problems with my first and second attempts?
Why are there so many types of <Int> in Swift?
Thanks in advance!
Your code doesn't compile because you have used [] around the range, which creates an array. [1..<10] is an array of ranges. The for loop is then iterating over that array, which has only one element - the range 1..<10.
This is why i is of type Range<Int>. It is the range, not the numbers in the range.
Just remove the [] and both versions of your code will work. You can iterate over ranges directly (in fact, over anything that conforms to the Sequence protocol), not just arrays. You can even write the range inline with the loop:
for i in 0..<10 {
    print(i * i)
}
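If you want something closer to the Python one-liner, you can also map over the range directly; a small sketch:
let n = 10
let squares = (0..<n).map { $0 * $0 }  // [0, 1, 4, ..., 81]
print(squares)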
Why are there so many types of <Int> in Swift?
You are looking at this the wrong way. The words Range and ClosedRange in the types Range<Int> and ClosedRange<Int> are not words that modify Int, as if they are different "flavours" of Int. It's the opposite - Range<Bound> and ClosedRange<Bound> are generic types, and Range<Int> can be considered the specific "type" of Range that has Int as its bounds. You can also have Range<Float> or Range<UInt8>, for example.
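For example:
let half: Range<Int> = 1..<10          // half-open: 1 through 9
let closed: ClosedRange<Int> = 1...10  // closed: 1 through 10
let floats: Range<Double> = 0.0..<1.0  // same generic Range, different Bound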
I'm trying to write code that will update my array and give the total pay based on the daily pay. I'm getting an error about binary operators, so how do I fix this line of code so that doesn't happen?
for day in stride(from: 1, to: 31, by: 1) {
    dailyPay[day] = [Int(pay)]
    pay *= 2
    if day == 1 {
        totalPay[day] = Int(pay)
    } else {
        totalPay[day] = totalPay[day-1] + dailyPay[day] // The problem is here
        print("\(heade) \(day) \(head) \(dailyPay[day]) \(total) \(totalPay[day])")
    }
}
You don't show the declarations of your variables, but it appears that totalPay is an array of Ints, whereas dailyPay is a two-dimensional array of arrays of Int. So, totalPay[day-1] will be an Int, whereas dailyPay[day] will be an [Int], or an array of Ints. The error you're getting therefore means exactly what it says; you can't use + to add an Int and an array.
From your code, it appears that dailyPay is probably meant to be a plain old array of integers, like totalPay. So you could fix this by changing the declaration, wherever it is, from:
var dailyPay: [[Int]]
to:
var dailyPay: [Int]
Then, change the assignment to:
dailyPay[day] = Int(pay)
and things should work.
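Putting it together, a minimal runnable sketch; the declarations of pay, dailyPay, and totalPay aren't shown in the question, so the sizes and starting values here are assumptions, and the print is omitted because its variables aren't shown either:
var pay = 1.0
var dailyPay = [Int](repeating: 0, count: 32)  // assumed size; day indexes 1 through 30
var totalPay = [Int](repeating: 0, count: 32)
for day in stride(from: 1, to: 31, by: 1) {
    dailyPay[day] = Int(pay)
    pay *= 2
    if day == 1 {
        totalPay[day] = Int(pay)
    } else {
        totalPay[day] = totalPay[day-1] + dailyPay[day]  // Int + Int now compiles
    }
}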
Sidenote: Your for loop is needlessly complex. There's no need for stride when you can just write:
for day in 1..<31
var arr: [Double] = Array(stride(from: 0, through: 11, by: 1.0))
This code is OK, but if I write the following, a "cannot invoke" error appears:
var s = 11
var arr: [Double] = Array(stride(from: 0, through: s, by: 1.0))
In order for your stride statement to produce Double, the values passed to from, through and by must be Doubles.
In the first case, Swift infers the literals 0 and 11 to be Doubles since 1.0 is a Double and that is the only way they can match. This works because Double conforms to the ExpressibleByIntegerLiteral protocol which just means that you can initialize a Double with an integer literal and an integer literal can be inferred to be a Double if necessary.
In the second case, you have assigned 11 to s and Swift assigns s the type Int. So when you try to use that in the stride statement, the types don't match.
You can fix this in a number of ways (a combined sketch follows this list):
Declare s to be a Double with var s: Double = 11. In this case, you've explicitly assigned the type of s, so Swift uses the ExpressibleByIntegerLiteral conformance of Double to initialize s.
Declare that 11 is a Double with var s = 11 as Double. Here you have told Swift that 11 is a Double which works because Double conforms to ExpressibleByIntegerLiteral. Swift then infers the type of s to also be Double.
Convert 11 to a Double with var s = Double(11). This uses an initializer of Double that takes an Int as input. Swift then infers the type of s to be Double from the value assigned to it.
Convert s when you use it with Double(s). Here you are explicitly using the Double initializer to create a Double with the Int s.
Declare s with a Double literal with var s = 11.0. Here, Swift infers the literal 11.0 to be of type Double and then infers the type of s to also be a Double since it is initialized by a Double.
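A combined sketch of all five variants; each one yields the Double that stride needs:
var s1: Double = 11    // 1. explicit type annotation
var s2 = 11 as Double  // 2. literal coerced with as
var s3 = Double(11)    // 3. Double initializer taking an Int
let s = 11             // an Int...
let d = Double(s)      // 4. ...converted at the point of use
var s5 = 11.0          // 5. Double literal
let arr: [Double] = Array(stride(from: 0, through: d, by: 1.0))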
I'm trying to build a simple Swift app to calculate VAT (Value Added Tax, 20%).
func taxesFree(number: Int) -> Double {
    var textfield = self.inputTextField.text.toInt()!
    let VAT = 0.2
    var result = textfield * VAT
    return result
}
For some reason I keep getting
Binary operator * cannot be applied to operands of type Int and Double
on the line
var result = textfield * VAT
You should convert one type to the other so that both operands have the same type:
var result: Double = Double(textfield) * VAT
It's because you're trying to multiply an Int (textfield) with a Double (VAT). Because such an implicit conversion could lose precision, Swift doesn't convert one to the other for you, so you need to explicitly convert the Int to a Double...
var result = Double(textfield) * VAT
The problem here is that the statement given is literally true, because Swift is strongly typed and doesn't coerce implicitly. Just had a similar case myself with "binary operator '-' cannot be applied to operands of type 'Date' and 'Int'".
If you write:
var result = 10 * 0.2
...that's fine, but if you write:
var number = 10
var result = number * 0.2
...that's not fine. This is because untyped explicit values have an appropriate type selected by the compiler, so in fact the first line is taken as being var result = Double(10) * Double(0.2). After all, as a human being you might mean 10 to be floating-point or an integer - you normally wouldn't say which and would expect that to be clear from context. It might be a bit of a pain, but the idea of strong types is that after the code is parsed it can only have one valid compiled expression.
In general you would build a new value using the constructor, so var result = Double(textfield) * VAT in your case. This is different from casting (textfield as Double) because Int is not a subclass of Double; what you are doing instead is asking for a completely new Double value to be built at runtime, losing some accuracy if the value is very high or low. This is what loosely typed languages do implicitly with pretty much all immediate values, at a small but significant time cost.
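A tiny illustration of the constructor-versus-cast distinction:
let n = 42
let d = Double(n)         // builds a brand-new Double value at runtime
// let bad = n as Double  // error: Int is not a subclass of Double, so there is no cast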
In your specific case, it wasn't valuable to have an Int in the first place (even if no fraction part is possible) so what you needed was:
func taxesFree(number: Int) -> Double {
    let textfield = Double(self.inputTextField.text!)!
    let VAT = 0.2
    let result = textfield * VAT
    return result
}
In my case it was just casting to CGFloat:
self.cnsMainFaqsViewHight.constant = CGFloat(self.mainFaqs.count) * 44.0
You can convert it like this:
var result: Double = Double(textfield)
I was misunderstanding the Closed Range Operator in Swift.
You should not wrap the range in an array: [0...10]
for i in [0...10] {
    // error: binary operator '+' cannot be applied to operands of type 'CountableClosedRange<Int>' and 'Int'
    let i = i + 1
}

for i in 0...10 {
    // ok!
    let i = i + 1
}
The range is a collection that can itself be iterated. No need to wrap it in an array, as perhaps you would have in Objective-C.
0...3 -> [0, 1, 2, 3]
[0...3] -> [[0, 1, 2, 3]]
Once you realize your object is a nested collection, rather than an array of Ints, it's easy to see why you cannot use numeric operators on the object.
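A quick way to see the nesting in a playground:
for element in [0...3] {      // the array has exactly one element: the range 0...3
    print(type(of: element))  // ClosedRange<Int>
    for i in element {        // iterating the range itself yields Ints
        print(i * i)
    }
}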
This worked for me when I got the same error message in Playground:
func getMilk(howManyCartons: Int) {
    print("Buy \(howManyCartons) cartons of milk")
    let priceToPay: Float = Float(howManyCartons) * 2.35
    print("Pay $\(priceToPay)")
}
getMilk(howManyCartons: 2)