var arr: [Double] = Array(stride(from: 0, through: 11, by: 1.0))
This code is OK, but if I write the following instead, a "cannot invoke" error appears:
var s = 11
var arr: [Double] = Array(stride(from: 0, through: s, by: 1.0))
In order for your stride statement to produce Doubles, the values passed to from:, through:, and by: must be Doubles.
In the first case, Swift infers the literals 0 and 11 to be Doubles, since 1.0 is a Double and that is the only way the types can match. This works because Double conforms to the ExpressibleByIntegerLiteral protocol, which just means that a Double can be initialized with an integer literal, so an integer literal can be inferred to be a Double when necessary.
In the second case, you have assigned 11 to s and Swift assigns s the type Int. So when you try to use that in the stride statement, the types don't match.
You can fix this in a number of ways:
Declare s to be a Double with var s: Double = 11. In this case, you've explicitly assigned the type of s, so Swift uses the ExpressibleByIntegerLiteral conformance of Double to initialize s.
Declare that 11 is a Double with var s = 11 as Double. Here you have told Swift that 11 is a Double which works because Double conforms to ExpressibleByIntegerLiteral. Swift then infers the type of s to also be Double.
Convert 11 to a Double with var s = Double(11). This uses an initializer of Double that takes an Int as input. Swift then infers the type of s to be Double from the value assigned to it.
Convert s when you use it with Double(s). Here you are explicitly using the Double initializer to create a Double with the Int s.
Declare s with a Double literal with var s = 11.0. Here, Swift infers the literal 11.0 to be of type Double and then infers the type of s to also be a Double since it is initialized by a Double.
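All five fixes, collected in one compilable sketch (variable names are illustrative):

var s1: Double = 11       // explicit type annotation; ExpressibleByIntegerLiteral does the rest
var s2 = 11 as Double     // the literal itself is declared to be a Double
var s3 = Double(11)       // the Double(Int) initializer builds a new value
var s4 = 11.0             // a Double literal to begin with
var arr: [Double] = Array(stride(from: 0, through: s1, by: 1.0))

var s5 = 11               // or keep s an Int...
var arr2: [Double] = Array(stride(from: 0, through: Double(s5), by: 1.0)) // ...and convert at the point of use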
Related
I tried adding an Int and a Float literal in Swift and it compiled without any error:
var sum = 4 + 5.0 // sum is assigned with value 9.0 and type Double
But, when I tried to do the same with Int and Float variables, I got a compile-time error and I had to type-cast any one operand to the other one's type for it to work:
var i: Int = 4
var f: Float = 5.0
var sum = i + f // Binary operator '+' cannot be applied to operands of type 'Int' and 'Float'
Why is it happening so ? Is it related to type safety in any way ?
If you want Double result:
let i: Int = 4
let f: Float = 5.0
let sum = Double(i) + Double(f)
print("This is the sum:", sum)
If you want Int result:
let i: Int = 4
let f: Float = 5.0
let sum = i + Int(f)
print("This is the sum:", sum)
In the case of var sum = 4 + 5.0, the compiler automatically infers 4 to be a Double, as that is what is required to perform the operation.
The same happens if you write var x: Float = 4. The 4 is automatically converted to a Float.
In the second case, since you have explicitly declared the type of the variable, the compiler does not have the freedom to change it as required.
For a solution, look at @Fabio's answer.
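A minimal sketch of the difference (names are illustrative):

var sum = 4 + 5.0          // both literals are inferred as Double; sum == 9.0
let i: Int = 4             // i's type is now fixed as Int
// var sum2 = i + 5.0      // error: the compiler won't silently widen the variable i
var sum2 = Double(i) + 5.0 // explicit conversion compiles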
The document on Swift.org says:
Type inference is particularly useful when you declare a constant or variable with an initial value. This is often done by assigning a literal value (or literal) to the constant or variable at the point that you declare it. (A literal value is a value that appears directly in your source code, such as 42 and 3.14159 in the examples below.)
For example, if you assign a literal value of 42 to a new constant
without saying what type it is, Swift infers that you want the
constant to be an Int, because you have initialized it with a number
that looks like an integer:
let meaningOfLife = 42 // meaningOfLife is inferred to be of type Int
Likewise, if you don’t specify a type for a floating-point literal,
Swift infers that you want to create a Double:
let pi = 3.14159 // pi is inferred to be of type Double

Swift always chooses Double (rather than Float) when inferring the type of floating-point numbers.
If you combine integer and floating-point literals in an expression, a
type of Double will be inferred from the context:
let anotherPi = 3 + 0.14159 // anotherPi is also inferred to be of type Double

The literal value of 3 has no explicit type in and of itself, and so an appropriate output type of Double is inferred from the presence of a floating-point literal as part of the addition.
I thought I had a good understanding about Doubles and Ints until I accidentally ran into the following code.
To my surprise the following code works just fine.
let amounts = [50, 5.0, 10]
var total = 0.0
for i in 0..<amounts.count {
    total += amounts[i]
    print("Total: \(total)")
}
... but it stops working if I change the 5.0 to 10, or 50, or even 5, and it generates the following error.
error: binary operator '+=' cannot be applied to operands of type 'Double' and 'Int'
Can someone tell me why the code doesn't break when mixing 50, 5.0, and 10? I was under the impression that this wouldn't work.
As you know, Swift is very strict with types, but there's one area where it's not so strict - literals. Double conforms to ExpressibleByIntegerLiteral, so you could do:
let x: Double = 1 // "1" is "magically" converted to a Double!?
and have it compile. The same with arrays - the compiler thinks that the array literal that you have:
[50, 5.0, 10]
is a [Double], because it can convert both 50 and 10 to Double. It can't be an [Int], because 5.0 can't be converted to an Int (Int does not conform to ExpressibleByFloatLiteral).
The line:
total += amounts[i]
only works when both sides are of the same type. Note that here, the compiler will not try to convert from Int to Double because the expressions involved (total and amounts[i]) are not literals!
If you change the array literal to [50, 10, 10], all elements are Int, so the compiler infers the array to be [Int], and amounts[i] becomes an Int, causing the line to fail compilation.
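A short sketch of that inference at work (Swift 3+ syntax; names are illustrative):

let mixed = [50, 5.0, 10]  // inferred as [Double]: 50 and 10 convert, 5.0 can't become an Int
print(type(of: mixed))     // Array<Double>

var total = 0.0
total += mixed[0]          // OK: Double += Double

let ints = [50, 10, 10]    // inferred as [Int]
// total += ints[0]        // error: Double += Int, and no literals are involved
total += Double(ints[0])   // explicit conversion compiles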
Arrays in Swift hold elements of a single type, so when you mix 50, 5.0, and 10, the compiler infers the array to be of type [Double].
In the working code the array is a [Double]; when you change the 5.0 to 10, the array becomes an [Int] because of Swift's type inference. Thus:
Binary operator '+=' cannot be applied to operands of type 'Double' and 'Int'
The Swift compiler does not define operations between two different data types like Int and Double, so you can achieve this by converting from Int to Double with this code:
let amounts = [50, 5, 10]
var total = 0.0
for i in 0..<amounts.count {
    total = total + Double(amounts[i])
    print("Total: \(total)")
}
I have to wrap my ints in CGFloat() to compile in Swift 2.3
If I just did * 2, then it would compile. Why does this happen?
Is this fixed in Swift 3?
var multiplier = CGFloat(3)
let y = collectionView.frame.origin.y + (cellSize() * multiplier)
Swift does not directly support mixed-type arithmetic. Check what the type of 2 is in Swift; it's probably not what you assume. Your use of CGFloat() converts the value so the operands of * have the same type.
HTH
Type inference. When you write * 2 without explicitly declaring the value an Int, Swift infers it to be a CGFloat. However, once you've already declared a value to be an Int, you can't multiply a CGFloat by an Int.
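A minimal sketch of both cases (names are illustrative):

import CoreGraphics

let cellWidth: CGFloat = 44.0
let fromLiteral = cellWidth * 2            // OK: the literal 2 is inferred as CGFloat

let multiplier = 3                         // inferred as Int, and now fixed
// let bad = cellWidth * multiplier        // error: CGFloat * Int
let good = cellWidth * CGFloat(multiplier) // convert the Int explicitly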
import Foundation
var x = 17.0
var y = 1.0
var z = 0.5
var isSq : Bool = true
y = ((sqrt(x)) - Int(sqrt(x)))
I am working in Xcode 6.4
Last line produces error: 'could not find an overload for '-' that accepts the supplied arguments'.
It would be nice to understand what is happening here. Also, is there a function which returns just the decimal part of a double variable (the complement of Int())?
Many thanks
sqrt(x) has the type Double, and Int(sqrt(x)) has the type Int. There is no minus operator in Swift that takes a Double as left operand and an Int as right operand, and Swift does not implicitly convert between types.
Therefore you have to convert the Int to Double again:
let y = sqrt(x) - Double(Int(sqrt(x)))
You can extract the fractional part also with the fmod() function:
let y = fmod(sqrt(x), 1.0)
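A few equivalent ways to extract the fractional part, as a sketch (modf is assumed here to return the parts as a tuple, as in the Swift overlay; the truncatingRemainder spelling needs Swift 3 or later):

import Foundation

let x = 17.0
let f1 = sqrt(x) - Double(Int(sqrt(x)))  // subtract the truncated whole part
let f2 = fmod(sqrt(x), 1.0)              // remainder after dividing by 1
let (whole, f3) = modf(sqrt(x))          // splits a Double into whole and fractional parts
// Swift 3+: let f4 = sqrt(x).truncatingRemainder(dividingBy: 1.0)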
I'm trying to build a simple Swift app to calculate VAT (Value Added Tax = 20%).
func taxesFree(number: Int) -> Double {
    var textfield = self.inputTextField.text.toInt()!
    let VAT = 0.2
    var result = textfield * VAT
    return result
}
For some reason I keep getting
Binary operator * cannot be applied to operands of type Int and Double
on the line
var result = textfield * VAT
You should convert one type to the other, so that both operands have the same type:
var result: Double = Double(textfield) * VAT
It's because you're trying to multiply an Int (textfield) with a Double (VAT). Because with such an operation you could lose the precision of the double, Swift doesn't allow implicit conversion between them, so you need to explicitly convert the Int to a Double ...
var result = Double(textfield) * VAT
The problem here is that the statement given is literally true, because Swift is strongly typed and doesn't coerce implicitly. Just had a similar case myself with "binary operator '-' cannot be applied to operands of type 'Date' and 'Int'".
If you write:
var result = 10 * 0.2
...that's fine, but if you write:
var number = 10
var result = number * 0.2
...that's not fine. This is because untyped explicit values have an appropriate type selected by the compiler, so in fact the first line is taken as being var result = Double(10) * Double(0.2). After all, as a human being you might mean 10 to be floating-point or an integer - you normally wouldn't say which and would expect that to be clear from context. It might be a bit of a pain, but the idea of strong types is that after the code is parsed it can only have one valid compiled expression.
In general you would build a new value using the constructor, so var result = Double(textfield) * VAT in your case. This is different from casting (textfield as Double) because Int is not a subclass of Double; what you are doing instead is asking for a completely new Double value to be built at runtime, losing some accuracy if the value is very high or low. This is what loosely typed languages do implicitly with pretty much all immediate values, at a small but significant time cost.
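A small sketch of the distinction (names are illustrative):

let n: Int = 10
// let d = n as Double  // error: 'as' expresses type relationships, and Int isn't a subclass of Double
let d = Double(n)       // asks for a brand-new Double value to be built at runtime
let result = d * 0.2    // both operands are now Double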
In your specific case, it wasn't valuable to have an Int in the first place (even if no fraction part is possible) so what you needed was:
func taxesFree(number: Int) -> Double {
    let textfield = Double(self.inputTextField.text!)!
    let VAT = 0.2
    let result = textfield * VAT
    return result
}
In my case it was just a matter of converting to CGFloat:
self.cnsMainFaqsViewHight.constant = CGFloat(self.mainFaqs.count) * 44.0
You can convert it like this:
var result: Double = Double(textfield)
I was misunderstanding the Closed Range Operator in Swift.
You should not wrap the range in an array: [0...10]
for i in [0...10] {
    // error: binary operator '+' cannot be applied to operands of type 'CountableClosedRange<Int>' and 'Int'
    let i = i + 1
}

for i in 0...10 {
    // ok!
    let i = i + 1
}
The range is a collection that can itself be iterated. No need to wrap it in an array, as perhaps you would have in Objective-C.
0...3 -> [0, 1, 2, 3]
[0...3] -> [[0, 1, 2, 3]]
Once you realize your object is a nested collection, rather than an array of Ints, it's easy to see why you cannot use numeric operators on the object.
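A sketch making the nesting visible (the exact range type name varies by Swift version):

let range = 0...3     // ClosedRange<Int>, itself a collection of four Ints
let wrapped = [0...3] // an array containing ONE element: the range
print(range.count)    // 4
print(wrapped.count)  // 1
for r in wrapped {    // r is the whole range, not an Int
    for i in r {      // iterate the inner range to reach the Ints
        print(i + 1)  // now '+' applies: i is an Int
    }
}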
This worked for me when I got the same error message in Playground:
func getMilk(howManyCartons: Int) {
    print("Buy \(howManyCartons) cartons of milk")
    let priceToPay: Float = Float(howManyCartons) * 2.35
    print("Pay $\(priceToPay)")
}
getMilk(howManyCartons: 2)