This question already has answers here:
Division not working properly in Swift
(3 answers)
Closed 6 years ago.
print(String(Float(2 * (10 / 9))))
Why does this code print "2.0"?
Using a calculator, "2 * (10 / 9)" would equal 2.222222.....
You are calculating with integers: 10 / 9 is an integer division that yields 1, so 2 * (10 / 9) is 2, and you only cast that (integer) result to Float.
Do your calculation with floating point types (Double) instead:
print(String(Float(2.0 * (10.0 / 9.0))))
No need to cast though:
print(2.0 * (10.0 / 9.0))
2.0 * (10.0 / 9.0) would give you the expected result.
In your case, Swift does the calculation on integers first (10 / 9 = 1, then 2 * 1 = 2), then converts this to a Float (result = 2.0) and this into a String (result = "2.0").
To get the correct result, it should read:
print(String(Float(2.0 * (10.0 / 9.0))))
You then could leave out the two type conversions:
print(2.0 * (10.0 / 9.0))
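To see where the truncation happens, here is a short sketch based on the numbers above, printing the intermediate values:
let intermediate = 10 / 9            // integer division: 1
let doubled = 2 * intermediate       // 2
print(Float(doubled))                // 2.0
let floatingPoint = 2.0 * (10.0 / 9.0)
print(floatingPoint)                 // 2.2222222222222223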
This question already has an answer here:
Strange Swift numbers type casting
(1 answer)
Closed 3 years ago.
It appears that Swift applies floating point contagion (as it is called in other languages) to literal Int operands in an expression containing a Double variable before evaluating the expression. Is there an explicit statement about that somewhere? I wasn't able to find a specific description about what to expect.
For example, suppose I have let b = 0.14. Then the following all yield the same result. (I am working with Swift 5.0.1.)
19> 5.0 * b / 6.0
$R12: Double = 0.11666666666666668
20> 5 * b / 6
$R13: Double = 0.11666666666666668
21> 5 / 6 * b
$R14: Double = 0.11666666666666668
22> b * 5 / 6
$R15: Double = 0.11666666666666668
23> (5 / 6) * b
$R16: Double = 0.11666666666666668
24> b * (5 / 6)
$R17: Double = 0.11666666666666668
That's great, it seems to make it easier to predict what the result will be, since it appears to be insensitive to the order of operations. (Incidentally that behavior differs from some other languages, I'm pretty sure.) However, I wasn't able to find any explicit statement about what should be expected in the case of literal Int operands mixed with a Double variable; I looked at these pages in hope of finding something: Expressions, Basic Operators, Advanced Operators. Can anyone point to a spec which describes what to expect in such cases?
It appears that Swift applies floating point contagion
It doesn't, actually. It appears that way, because Double conforms to ExpressibleByIntegerLiteral, which explains why this is possible:
let double: Double = 1
but not:
let i: Int = 1
print(i * 1.23) // error: binary operator '*' cannot be applied to operands of type 'Int' and 'Double'
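In other words, a literal such as 5 has no fixed type on its own; the type checker infers it as Double whenever the other operand is a Double, so no Int-to-Double conversion ever takes place. A short sketch illustrating the difference (the variable names are only for illustration):
let b = 0.14
let viaLiteral = 5 * b / 6            // the literal 5 is inferred as Double: 0.11666666666666668
let i = 5                             // i is an actual Int
let viaVariable = Double(i) * b / 6   // here an explicit conversion is required
print(viaLiteral, viaVariable)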
This question already has answers here:
Math divison in Swift
(4 answers)
Closed 3 years ago.
I am trying to figure out the percentages like so:
let percentages:CGFloat = CGFloat((result.count / 100) * 100)
But this always returns 0. What am I doing wrong?
result.count is 2.
The problem is that you're probably performing an integer division, so you need to convert that count to something else first:
let percentages = CGFloat(result.count) / 100 * 100
Notice, however, that you're dividing and multiplying by the same value (100), so the result is just result.count again; you might need to tweak that to achieve the desired result.
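For example, if the goal is the share of result.count out of some total, a sketch could look like this (total is a made-up stand-in, not something from the original question):
import CoreGraphics

let count = 2                  // result.count in the question
let total = 50                 // hypothetical total the percentage refers to

// Convert before dividing so the division is not truncated to 0.
let percentage = CGFloat(count) / CGFloat(total) * 100
print(percentage)              // 4.0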
This is an example equation that I want to solve:
let equation = (5-2) * (10-5) / (4-2) * (10-5)
print (equation)
//35
The result which is printed is 35, but the right result would be 1.5. What's wrong?
Your expression is incorrect; I assume you want the result 1.5.
Place the parentheses correctly: * and / have the same precedence and are executed left to right, but anything inside () is evaluated first.
let equation = ((5-2) * (10-5)) / ((4-2) * (10-5))
print (equation)
If you only add the outer parentheses as above, you will get the result 1, because both sides of the division are integers, so it is an integer division (15 / 10 = 1). Convert them to Double instead:
let equation = Double((5 - 2) * (10 - 5)) / Double((4 - 2) * (10 - 5))
print(equation)
This code will print 1.5.
Just look up operator precedence in your programming language.
This should work:
let numerator: Double = (5 - 2) * (10 - 5)
let denominator: Double = (4 - 2) * (10 - 5)
First you calculate the numerator and the denominator, and finally the result:
let result: Double = numerator / denominator
print(result)
// 1.5
As @araf has answered, you should look out for the operator precedence in the programming language, which follows the simple BODMAS rule, evaluated in the following order:
Brackets
Orders
Division and Multiplication (left to right)
Addition and Subtraction (left to right)
In your scenario:
let equation = (5-2) * (10-5) / (4-2) * (10-5)
the expression is evaluated left to right as follows:
3 * 5 / 2 * 5 = 15 / 2 * 5 = 7 * 5 = 35
(15 / 2 is an integer division, which gives 7.)
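A small sketch that makes the left-to-right evaluation and the integer division visible:
let step1 = (5 - 2) * (10 - 5)   // 3 * 5  = 15
let step2 = step1 / (4 - 2)      // 15 / 2 = 7 (integer division truncates)
let step3 = step2 * (10 - 5)     // 7 * 5  = 35
print(step1, step2, step3)       // 15 7 35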
@L.Stephan has suggested a better approach: calculate the numerator and the denominator separately and then perform the division.
To know more you can check this link:
https://en.wikipedia.org/wiki/Order_of_operations
This question already has answers here:
Rounding in Swift with round()
(8 answers)
Closed 6 years ago.
I am making a calculation to find a value of a double (below)
let tipAmt = Double(billAmt! * tipPer)
However, I want to take this value and round it up to the closest integer. How would I do that? Is there a round call?
There is, literally, a round() function that works on Double:
let billAmt: Double? = 23.75
let tipPer: Double = 0.15
let tipAmt = Double(billAmt! * tipPer)
print("tipAmt: \(tipAmt)") // 3.5625
var rounded = round(tipAmt)
print("rounded to nearest dollar: \(rounded)") // 4
rounded = round(tipAmt * 100) / 100
print("rounded to nearest cent: \(rounded)") // 3.56
I've searched for this but I didn't find anything; I hope this is not a duplicate question.
I'm writing a formula in T-SQL like this:
@Temp = SQRT((((@Base1 - 1) * (@StDev1 * @StDev1))
    + ((@AvgBase - 1) * (@AvgStDev * @AvgStDev)))
    * ((1 / @Base1) + (1 / @AvgBase))
    / (@Base1 + @AvgBase - 2))
But it always returns 0.
@Base1 and @AvgBase are int; the rest of the parameters are float, but I've also tried decimal(15,15).
I also tried replacing the self-multiplication with the POWER() function, but the one part I can't solve is this: (1 / @Base1) + (1 / @AvgBase). Because @Base1 and @AvgBase are so big, the result of that calculation is 0.0001... and some more digits. How can I force the engine not to round the result to 0? Thanks
EDIT: I solved it by changing @AvgBase and @Base1 to the float type. I guess that 1/@param, with @param being an int, gives you the truncated result, and when you cast it afterwards you are already working on a truncated result anyway.
Have you tried creating @INVBase1 = (1 / @Base1)? Will this also be rounded to 0? What happens when you play around with the data type of this new variable?
Alternatively, have you tried
/ ((@Base1) + (@AvgBase))
instead of
* ((1 / @Base1) + (1 / @AvgBase))
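If changing @Base1 and @AvgBase to float is not an option, casting one operand of each division to float also avoids the truncation; a minimal sketch with made-up values for the two variables:
DECLARE @Base1 int = 250;      -- made-up value
DECLARE @AvgBase int = 400;    -- made-up value

SELECT
    1 / @Base1                                               AS int_division,    -- 0 (truncated)
    CAST(1 AS float) / @Base1                                AS float_division,  -- 0.004
    CAST(1 AS float) / @Base1 + CAST(1 AS float) / @AvgBase  AS sum_of_inverses; -- 0.0065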