Dividing UInt64 always returns 0 [duplicate] - swift

I'm trying to print the result of a division, for example:
let division = (4/6)
print(division)
In this case the printout is 0.
How can I print the result of the division without losing the fractional part, and without converting the output to a string?

You are performing integer division. You need to perform floating point division.
In your code, division is an Int value. 4 / 6 is zero in integer division.
You need:
let division = 4.0/6.0
print(division)

Or, if the values are stored in integer variables, convert them before dividing:
let ans = Double(no1) / Double(no2)
return ans

If you want the result to be correct, try:
let division = Float(v1) / Float(v2)
print(division)
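Since the title mentions UInt64, here is a minimal sketch assuming the operands are stored as UInt64 (the names numerator and denominator are placeholders); converting both to Double before dividing keeps the fractional part:
let numerator: UInt64 = 4
let denominator: UInt64 = 6
let division = Double(numerator) / Double(denominator)
print(division) // 0.6666666666666666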


Why is x / 100 == 0 when x != 0 in swift? [duplicate]

I have created a for loop in which I calculate a few values.
for i in 1...100 {
    let xValue = i/100
    print(xValue) // returns 0 every time except when i == 100
}
This is a recreation of a part of that for loop. Why is it that I do not get the right value for 'xValue'?
For info I have also tried the following:
let xValue: Float = Float(i/100)
And that doesn't work either, despite me being very specific. I must have forgotten something basic about these arithmetic operators in Swift.
When you divide an Int by an Int, the result is truncated towards zero (the fractional part is discarded). Use a floating point type, like Double or Float, for more precision.
for i in 1...100 {
    let xValue = Float(i)/100
    print(xValue)
}
To address your attempted solution - when you do:
let xValue: Float = Float(i/100)
The integer division i/100 is computed first (truncating to 0), and only then is the result converted to a Float.
Therefore, we cast i to a Float before the division so the result is computed as a Float.
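A quick way to see the difference, as a minimal sketch with a sample value of i:
let i = 50
print(Float(i/100)) // 0.0, because the Int division truncates before the conversion
print(Float(i)/100) // 0.5, because i is converted first, so the division happens in Float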
Since i and 100 are both integer values, / will do integer division and the result will be truncated to 0.
Even when you do let xValue: Float = Float(i/100), the result of division inside the parentheses is already truncated to 0 before the value can be converted to a Float.
Convert i to a floating-point value before dividing to prevent the result from being truncated.
for i in 1...100 {
    let xValue = Float(i)/100
    print(xValue)
}

Swift returning 0 on division

I'm trying to do basic division and it always returns 0 as an answer.
let mathStuff = Double((stepCount / Level.expRequired())) * 100
print ("\(totalSteps) / \(Level.expRequired()) * 100 = \(mathStuff)")
My print returns
2117 / 2500 * 100 = 0.0
I've tried using NSDecimal instead of a Double and have also tried not using Double or NSDecimal and having it just do the math, which comes back as 0 instead of 0.0.
I'm really confused on what I'm doing wrong here, this seems like basic math and I'm not sure why I'm always given 0 as an answer.
Your problem probably lies here: 2117 / 2500. Both 2117 and 2500 are Ints.
If they were Doubles, it would work: 2117.0 / 2500.0 produces a non-zero result.
Try converting those values to Double first; you don't need to wrap the whole result:
Double(stepCount) / Double(Level.expRequired()) * 100
Note that you do need to convert both operands: Swift will not mix Int and Double in a division, so Double(stepCount) / Level.expRequired() would not compile if expRequired() returns an Int. The literal 100 is fine as-is, because integer literals can be inferred as Double.
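Putting that together, a minimal sketch of the calculation, assuming stepCount and Level.expRequired() are both Int (the values 2117 and 2500 are taken from the printed output, and expRequired() below is a stand-in for Level.expRequired()):
let stepCount = 2117
func expRequired() -> Int { return 2500 } // stand-in for Level.expRequired()

let mathStuff = Double(stepCount) / Double(expRequired()) * 100
print(mathStuff) // prints approximately 84.68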

How can I get the other part after a division with a modulo operator

When I divide 13 by 3 using integer numbers, the result is 4.
With mod(13,3) I get the remainder 1. But how can I get the 4 in MATLAB? It is not possible to just switch to integer numbers for this calculation, is it?
You can use the floor function:
result = floor(13/3)
This function always rounds down to the next lower integer.
You can explicitly use integers:
result = uint32(13)/uint32(3);
You can also use hex numbers:
result = 0xDu32 / 0x3u32;
Note that result will be of type uint32.
Use idivide (note that its arguments must be of an integer type):
result = idivide(int32(13), int32(3));
You can specify the rounding method with a third argument, the default being 'fix' (rounding towards zero). For example, this rounds towards negative infinity:
result = idivide(int32(13), int32(3), 'floor');

Decimal Value Set as 0 in Swift

I am trying to get a random decimal from 0.75 to 1.25
let incomeCalc = Decimal((arc4random_uniform(50)+75)/100)
print("incomeCalc")
print(incomeCalc)
Why does this print 0?
arc4random_uniform returns an integer type, so you are doing integer math. You need to be doing floating point math.
let incomeCalc = Decimal(Double((arc4random_uniform(50)+75))/100)
By casting the value before you do the division, you get a Double result which is passed to your Decimal initializer.
Or you can do:
let incomeCalc = Decimal((arc4random_uniform(50)+75))/100
which creates the Decimal before the division is done.
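A minimal sketch of that second approach (assuming an Apple platform where arc4random_uniform is available through Foundation):
import Foundation

let raw = arc4random_uniform(50) + 75 // UInt32 in the range 75...124
let incomeCalc = Decimal(raw) / 100   // the division is done in Decimal, e.g. 0.97
print(incomeCalc)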
You can also use the code below, which gets a random whole number between 75 and 124 and then divides it by 100:
let incomeCalc = Decimal((arc4random_uniform(50)+75)) / 100
print("incomeCalc")
print(incomeCalc)
