Extending the example in “The Swift Programming Language” (Swift 5.5), section “Integer and Floating-Point Conversion”:
3 + 0.14 // allowed
let three = 3
let rest = 0.14
3 + rest // allowed
0.14 + three // compile error
three + 0.14 // compile error
I don’t understand why the last two lines are taken as compile error. Can anyone help to explain a bit? Thanks.
There are two basic rules:
A numeric literal without type annotation can be converted implicitly if possible.
A constant or variable gets a fixed type at initialization, and that type cannot change afterwards. By default, a floating-point literal becomes Double and an integer literal becomes Int.
So three is Int and 0.14 is Double.
3 + rest works because 3 can be inferred as Double.
But 0.14 cannot be inferred as Int, and three is already fixed as Int, so the last two lines fail to compile.
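A minimal sketch of the rules above (the constant names mirror the question):
let three = 3                  // fixed as Int
let rest = 0.14                // fixed as Double
let ok = 3 + rest              // fine: the literal 3 is inferred as Double
let sum = Double(three) + 0.14 // fine: explicit conversion, both operands are Double
// let bad = three + 0.14      // error: Int and Double cannot be mixed
print(sum)                     // 3.14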
It appears that Swift applies floating point contagion (as it is called in other languages) to literal Int operands in an expression containing a Double variable before evaluating the expression. Is there an explicit statement about that somewhere? I wasn't able to find a specific description about what to expect.
For example, suppose I have let b = 0.14. Then the following all yield the same result. (I am working with Swift 5.0.1.)
19> 5.0 * b / 6.0
$R12: Double = 0.11666666666666668
20> 5 * b / 6
$R13: Double = 0.11666666666666668
21> 5 / 6 * b
$R14: Double = 0.11666666666666668
22> b * 5 / 6
$R15: Double = 0.11666666666666668
23> (5 / 6) * b
$R16: Double = 0.11666666666666668
24> b * (5 / 6)
$R17: Double = 0.11666666666666668
That's great: it seems to make the result easier to predict, since it appears to be insensitive to the order of operations. (Incidentally, that behavior differs from some other languages, I'm pretty sure.) However, I wasn't able to find any explicit statement about what to expect when literal Int operands are mixed with a Double variable; I looked at these pages in the hope of finding something: Expressions, Basic Operators, Advanced Operators. Can anyone point to a spec which describes what to expect in such cases?
It appears that Swift applies floating point contagion
It doesn't, actually. It appears that way, because Double conforms to ExpressibleByIntegerLiteral, which explains why this is possible:
let double: Double = 1
but not:
let i: Int = 1
print(i * 1.23) // error: binary operator '*' cannot be applied to operands of type 'Int' and 'Double'
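To see that this is literal conversion rather than contagion, here is a minimal sketch of a custom type adopting ExpressibleByIntegerLiteral (the Celsius type is made up for illustration):
struct Celsius: ExpressibleByIntegerLiteral {
    let degrees: Double
    init(integerLiteral value: Int) {
        degrees = Double(value)
    }
}

let freezing: Celsius = 0 // the literal 0 is built as a Celsius, not converted from Int
print(freezing.degrees)   // 0.0
let zero = 0              // here the same literal defaults to Int
// let c: Celsius = zero  // error: a typed Int is no longer a literal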
On the official API doc, it says:
Returns the value of this number as an Int, which may involve rounding or truncation.
I want truncation, but I'm not sure. Can anyone explain the exact meaning of "may involve rounding or truncation"?
P.S.: In my unit test, (1.7).toInt() was 1, which suggests truncation.
The KDoc of Double.toInt() is simply inherited from Number.toInt(), and for that function the exact meaning is: each concrete Number implementation defines how it is converted to Int.
In Kotlin, Double operations follow the IEEE 754 standard, and Double.toInt() has the same semantics as casting a double to int in Java, i.e. normal numbers are rounded toward zero, dropping the fractional part:
println(1.1.toInt()) // 1
println(1.7.toInt()) // 1
println(-2.3.toInt()) // -2
println(-2.9.toInt()) // -2
First of all, this documentation is straight up copied from Java's documentation.
As far as I know it only truncates the fractional part, e.g. 3.14 becomes 3, 12.345 becomes 12, and 9.999 becomes 9.
Reading this answer and the comments under it suggests that there is no actual rounding; the "rounding" is really truncation. It differs from Math.floor in that negative values are rounded toward 0 rather than toward negative infinity.
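For comparison, the same truncate-versus-floor distinction in Swift (a sketch; the behavior mirrors the Kotlin/Java semantics described above):
import Foundation

print(Int(-2.9))        // -2: truncating conversion rounds toward zero
print(floor(-2.9))      // -3.0: floor rounds toward negative infinity
print((-2.9).rounded()) // -3.0: rounds to the nearest integer value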
Use roundToInt() in Kotlin if you want actual rounding to the nearest integer:
import kotlin.math.roundToInt

fun main() {
    val r = 3.1416
    val c: Int = r.roundToInt()
    println(c) // 3
}
Use the function toInt(), and assign the result to a new variable declared as Int:
val x: Int = variable_name.toInt()
I'm aware of some relatively similar questions on this site, but if they do apply to my problem (and I'm not certain they do) then I certainly don't understand them. Here's my problem:
var degrees = UInt32()
var radians = Double()
degrees = arc4random_uniform(360)
radians = degrees * (M_PI / 180)
This returns an error, focused on the multiplication sign, reading: "Binary operator '*' cannot be applied to operands of type 'UInt32' and 'Double'".
I'm fairly sure the degrees variable needs to be of type UInt32 to randomise it, and also that the pi constant cannot be made UInt32, or at least I don't know how; I'm relatively new to Xcode and Swift in general.
I'd be very grateful if anyone had a solution to my problem.
Thanks in advance.
let degree = arc4random_uniform(360)
let radian = Double(degree) * .pi/180
You need to convert the degrees value to Double before the multiplication.
From the Apple Swift book:
Integer and Floating-Point Conversion
Conversions between integer and floating-point numeric types must be made explicit:
let three = 3
let pointOneFourOneFiveNine = 0.14159
let pi = Double(three) + pointOneFourOneFiveNine
// pi equals 3.14159, and is inferred to be of type Double
Here, the value of the constant three is used to create a new value of type Double, so that both sides of the addition are of the same type. Without this conversion in place, the addition would not be allowed.
Floating-point to integer conversion must also be made explicit. An integer type can be initialized with a Double or Float value:
let integerPi = Int(pi)
// integerPi equals 3, and is inferred to be of type Int
Floating-point values are always truncated when used to initialize a new integer value in this way.
This means that 4.75 becomes 4, and -3.9 becomes -3.
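A quick playground sketch of both rules from the excerpt, applied to the original random-angle code (arc4random_uniform comes from Foundation on Apple platforms):
import Foundation

// Integer -> floating-point must be explicit:
let degrees = arc4random_uniform(360)     // UInt32 in 0..<360
let radians = Double(degrees) * .pi / 180 // convert before multiplying
// Floating-point -> integer truncates toward zero:
print(Int(4.75)) // 4
print(Int(-3.9)) // -3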
I have a very odd error here. I've searched all around and tried all the suggestions; none work.
scrollView.contentSize.height = 325 * globals.defaults.integer(forKey: "numCards")
Binary operator '*' cannot be applied to two 'Int' operands
WTF Swift! Why not? I multiply Ints all the time. These ARE two Ints. globals.defaults is just an instance of UserDefaults.standard. I have tried the following with the same error each time.
325 * Int(globals.defaults.integer(forKey: "numCards")) //NOPE
Int(325) * Int(globals.defaults.integer(forKey: "numCards")) //NOPE
if let h = globals.defaults.integer(forKey: "numCards") {
    325 * h //NOPE, and 'Initializer for conditional binding must have optional type, not Int'
}
let h = globals.defaults.integer(forKey: "numCards") as! Int
325 * h //NOPE, and warning: 'Forced cast of 'Int' to same type has no effect'
325 * 2 //YES! But no shit...
All of those attempts seemed like a waste of time, as I know for a fact both of these are Ints... and I was correct. Please advise. Thanks!
The error is misleading. The problem is actually the attempt to assign an Int value to a CGFloat variable.
This will work:
scrollView.contentSize.height = CGFloat(325 * globals.defaults.integer(forKey: "numCards"))
The cause of the misleading error (thanks to Daniel Hall in the comments below) is that the compiler, needing a CGFloat return value, chooses the * overload that returns CGFloat. That overload expects two CGFloat parameters, so when the two arguments provided are Int instead of CGFloat, the compiler emits the misleading error:
Binary operator '*' cannot be applied to two 'Int' operands
It would be nice if the error was more like:
Binary operator '*' cannot be applied to two 'Int' operands. Expecting two 'CGFloat' operands.
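A stripped-down reproduction of the same diagnostic, assuming only CoreGraphics (the names count and height are made up for illustration):
import CoreGraphics

let count = 4                 // an Int, like the value from UserDefaults
var height: CGFloat = 0
// height = 325 * count      // error: binary operator '*' cannot be applied to two 'Int' operands
height = CGFloat(325 * count) // multiply as Int, convert once at the end
print(height)                 // 1300.0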
I am certain I am missing something very, very obvious, but can anyone tell me why I am having trouble multiplying two Integers? The following code:
let twenty: Integer = 20
let ten: Integer = 10
let result: Integer = twenty * ten
presents the error Could not find an overload for '*' that accepts the supplied arguments.
Other questions on SO with the same error are caused by trying to multiply different types together, but surely these are both Integer types?
(PS: The actual code I am trying to run is var value = self.value * 10 but I have expanded it to the sample while debugging to make absolutely sure that the correct types are being used)
Use Int instead. Integer is a protocol.
Integer is a protocol not a type. Use Int instead.
As already stated, Integer is a protocol, not a type.
In your situation you don't even need to make the type explicit, because type inference takes care of it.
This is enough:
let twenty = 20
let ten = 10
let result = twenty * ten
NSLog("%d", result)
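For context, the Integer protocol belongs to early Swift; in current Swift the closest equivalent is BinaryInteger, which is used as a generic constraint rather than as a variable's type. A minimal sketch:
func doubled<T: BinaryInteger>(_ value: T) -> T {
    return value * 2 // BinaryInteger provides * and integer literals
}

print(doubled(21))       // 42, T inferred as Int
print(doubled(UInt8(3))) // 6, T inferred as UInt8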