It appears that Swift applies floating-point contagion (as it is called in other languages) to literal Int operands in an expression containing a Double variable before evaluating the expression. Is there an explicit statement about this somewhere? I wasn't able to find a specific description of what to expect.
For example, suppose I have let b = 0.14. Then the following all yield the same result. (I am working with Swift 5.0.1.)
19> 5.0 * b / 6.0
$R12: Double = 0.11666666666666668
20> 5 * b / 6
$R13: Double = 0.11666666666666668
21> 5 / 6 * b
$R14: Double = 0.11666666666666668
22> b * 5 / 6
$R15: Double = 0.11666666666666668
23> (5 / 6) * b
$R16: Double = 0.11666666666666668
24> b * (5 / 6)
$R17: Double = 0.11666666666666668
That's great: it makes results easier to predict, since the outcome appears to be insensitive to the order of operations. (Incidentally, that behavior differs from some other languages, I'm fairly sure.) However, I wasn't able to find any explicit statement about what to expect when literal Int operands are mixed with a Double variable; I looked at these pages in the hope of finding something: Expressions, Basic Operators, Advanced Operators. Can anyone point to a spec that describes what to expect in such cases?
It appears that Swift applies floating point contagion
It doesn't, actually. It appears that way, because Double conforms to ExpressibleByIntegerLiteral, which explains why this is possible:
let double: Double = 1
but not:
let i: Int = 1
print(i * 1.23) // error: binary operator '*' cannot be applied to operands of type 'Int' and 'Double'
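To see the mechanism at work, here is a minimal sketch (the Celsius type is hypothetical, not from the standard library) of a custom type opting into the same behavior by conforming to ExpressibleByIntegerLiteral:
// Hypothetical type for illustration: conforming to ExpressibleByIntegerLiteral
// lets an integer literal initialize it directly, just as with Double.
struct Celsius: ExpressibleByIntegerLiteral {
    var value: Double
    // The single protocol requirement: build the value from an integer literal.
    init(integerLiteral value: Int) {
        self.value = Double(value)
    }
}
let freezing: Celsius = 0  // the literal 0 becomes a Celsius, not an Int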
Related
I tried adding an Int and a Float literal in Swift and it compiled without any error:
var sum = 4 + 5.0 // sum is assigned with value 9.0 and type Double
But, when I tried to do the same with Int and Float variables, I got a compile-time error and I had to type-cast any one operand to the other one's type for it to work:
var i: Int = 4
var f: Float = 5.0
var sum = i + f // Binary operator '+' cannot be applied to operands of type 'Int' and 'Float'
Why is this happening? Is it related to type safety in any way?
If you want Double result:
let i: Int = 4
let f: Float = 5.0
let sum = Double(i) + Double(f)
print("This is the sum:", sum)
If you want Int result:
let i: Int = 4
let f: Float = 5.0
let sum = i + Int(f)
print("This is the sum:", sum)
In the case of var sum = 4 + 5.0, the compiler infers the literal 4 as a Double, since that is the type required to perform the operation.
The same happens if you write var x: Float = 4: the literal 4 is inferred as a Float.
In the second case, since you have explicitly defined the types of the variables, the compiler does not have the freedom to change them as required.
For a solution, look at Fabio's answer.
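A small sketch of both situations side by side (the names are illustrative):
let sum = 4 + 5.0           // OK: the literal 4 is inferred as Double
let x: Float = 4            // OK: the literal 4 becomes a Float
let i = 4                   // i is now fixed as Int
// let bad = i + 5.0        // error: 'Int' and 'Double' cannot be mixed
let good = Double(i) + 5.0  // explicit conversion compiles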
The document on Swift.org says:
Type inference is particularly useful when you declare a constant or variable with an initial value. This is often done by assigning a literal value (or literal) to the constant or variable at the point that you declare it. (A literal value is a value that appears directly in your source code, such as 42 and 3.14159 in the examples below.)
For example, if you assign a literal value of 42 to a new constant
without saying what type it is, Swift infers that you want the
constant to be an Int, because you have initialized it with a number
that looks like an integer:
let meaningOfLife = 42 // meaningOfLife is inferred to be of type Int
Likewise, if you don’t specify a type for a floating-point literal,
Swift infers that you want to create a Double:
let pi = 3.14159 // pi is inferred to be of type Double
Swift always chooses Double (rather than Float) when inferring the type of floating-point numbers.
If you combine integer and floating-point literals in an expression, a
type of Double will be inferred from the context:
let anotherPi = 3 + 0.14159 // anotherPi is also inferred to be of type Double
The literal value of 3 has no explicit type in and of itself, and so an appropriate output type of Double is inferred from the presence of a floating-point literal as part of the addition.
Extending the example in “The Swift Programming Language” (Swift 5.5), section “Integer and Floating-Point Conversion”:
3 + 0.14 // allowed
let three = 3
let rest = 0.14
3 + rest // allowed
0.14 + three // compile error
three + 0.14 // compile error
I don’t understand why the last two lines are compile errors. Can anyone help explain? Thanks.
There are two basic rules:
A numeric literal without type annotation can be converted implicitly if possible.
A constant or variable is initialized with a fixed type which cannot change. A floating point literal becomes Double and an integer literal becomes Int.
So three is Int and 0.14 is Double.
3 + rest works because 3 can be inferred as Double.
But 0.14 cannot be inferred as Int so the last two lines fail to compile.
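As a sketch, an explicit conversion makes the failing lines compile, per the rules above:
let three = 3                 // fixed as Int
let rest = 0.14               // fixed as Double
let a = 3 + rest              // OK: the literal 3 is inferred as Double
// let b = three + 0.14       // error: Int and Double cannot be mixed
let b = Double(three) + 0.14  // explicit conversion resolves the mismatch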
When I read the Swift documentation:
https://docs.swift.org/swift-book/LanguageGuide/TheBasics.html
let anotherPi = 3 + 0.14159
// anotherPi is also inferred to be of type Double
This works. However, in the following case, if I delete the Double(...) conversion, it does not compile:
let three = 3
let pointOneFourOneFiveNine = 0.14159
let pi = Double(three) + pointOneFourOneFiveNine
May I ask why we need to explicitly convert the type in the second case? What is the difference?
It is because of Swift's automatic type inference.
let three = 3 // Here three is inferred as Int
let pointOneFourOneFiveNine = 0.14159 // Here pointOneFourOneFiveNine is inferred as Double
let pi = Double(three) + pointOneFourOneFiveNine
Since you cannot add a Double to an Int, you have to wrap three as Double(three).
let anotherPi = 3 + 0.14159
// anotherPi is also inferred to be of type Double
The above code works because the compiler finds that both operands of the addition can be represented as Double, so it assigns the type Double to anotherPi. Hope this clears your doubt.
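Alternatively, as a small sketch, annotating the constant up front avoids the later conversion entirely:
let three: Double = 3  // the literal 3 is inferred as Double here
let pointOneFourOneFiveNine = 0.14159
let pi = three + pointOneFourOneFiveNine  // both Double, no cast needed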
I'm aware of some relatively similar questions on this site, but if they do apply to my problem (which I'm not certain they do) then I certainly don't understand them. Here's my problem:
let degrees: UInt32 = arc4random_uniform(360)
let radians = degrees * (M_PI / 180)
This returns an error, pointing at the multiplication operator, reading: "Binary operator '*' cannot be applied to operands of type 'UInt32' and 'Double'".
I'm fairly sure I need to have the degrees variable be of type UInt32 to randomise it, and also that the pi constant cannot be made to be of UInt32, or at least I don't know how, as I'm relatively new to Xcode and Swift in general.
I'd be very grateful if anyone had a solution to my problem.
Thanks in advance.
let degree = arc4random_uniform(360)
let radian = Double(degree) * .pi / 180
You need to convert the degree value to a Double before the multiplication.
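If you do this conversion often, a small helper may read better. A sketch, assuming a hypothetical degreesToRadians function (not part of the standard library):
import Foundation  // for arc4random_uniform

// Hypothetical helper for illustration.
func degreesToRadians(_ degrees: UInt32) -> Double {
    return Double(degrees) * .pi / 180
}

let angle = arc4random_uniform(360)    // random value in 0..<360
let radians = degreesToRadians(angle)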
From the Apple Swift book:
Integer and Floating-Point Conversion
Conversions between integer and floating-point numeric types must be made explicit:
let three = 3
let pointOneFourOneFiveNine = 0.14159
let pi = Double(three) + pointOneFourOneFiveNine
// pi equals 3.14159, and is inferred to be of type Double
Here, the value of the constant three is used to create a new value of type Double, so that both sides of
the addition are of the same type. Without this conversion in place, the addition would not be allowed.
Floating-point to integer conversion must also be made explicit. An integer type can be initialized
with a Double or Float value:
let integerPi = Int(pi)
// integerPi equals 3, and is inferred to be of type Int
Floating-point values are always truncated when used to initialize a new integer value in this way.
This means that 4.75 becomes 4, and -3.9 becomes -3.
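A quick sketch confirming the truncation behavior described in the quote:
let integerPi = Int(3.14159)  // 3: the fractional part is dropped
let down = Int(4.75)          // 4, not 5: truncation, not rounding
let up = Int(-3.9)            // -3: truncation is toward zero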
Hello, I'm brand new to Swift and to programming in general. Going through an exercise, the code given is exactly:
//: Playground - noun: a place where people can play
import UIKit
let height = 12
let width = 10
let area = height * width
let areaInMeters = area / 10.762
But I get the error, "binary operator / cannot be applied to operands of type Int and Double".
After some digging around I found you can't operate on both an Integer and a Double. So I changed the last line to:
let areaInMeters = (Double)area / 10.762
Then I get the error "Consecutive statements on a line must be separated by a ';'", and it wants me to put the ; after area. None of this makes any sense to me.
Using El Capitan beta and Xcode 7 beta.
height and width will both be inferred to be of type Int. Therefore area is also of type Int, whilst 10.762 is a Double.
In Swift, type safety is paramount, so both operands must be of the same type.
The solution (as Eric D. suggested) is to convert area to a Double:
let areaInMeters = Double(area) / 10.762
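Putting it together, a minimal sketch of the corrected playground:
let height = 12
let width = 10
let area = height * width                 // area is an Int
let areaInMeters = Double(area) / 10.762  // explicit conversion to Double
print(areaInMeters)                       // ≈ 11.15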
Try instead adding a decimal point and a zero to the end of your height and width.
Like so:
let height = 12.0
let width = 10.0
That way, you won't have to worry about dealing with an Integer at all.
Hope this helps. Happy Coding!