Working with Swift. I am converting a value from String to Double:
let value: String = "8"
var size: Double
size = Double(value)! // Double(value) returns Double?, so it must be unwrapped
print(size) // 8.0
The result should be 8, not 8.0, unless the string value is "8.0".
Double only stores a numeric magnitude. "8" and "8.0" have the same magnitude, so they get represented by the same Double value. Whether you show trailing 0s or not is a consequence of how you choose to format and present your values.
print does it one way for debugging, but NumberFormatter gives you more control to format numbers for real, non-debugging purposes.
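For example, a minimal sketch using NumberFormatter (the specific formatter settings here are illustrative, not the only choice):

```swift
import Foundation

let size = Double("8") ?? 0  // stores only a magnitude: 8.0

let formatter = NumberFormatter()
formatter.minimumFractionDigits = 0  // drop trailing zeros: 8.0 -> "8"
formatter.maximumFractionDigits = 2
let shown = formatter.string(from: NSNumber(value: size)) ?? ""
print(shown)  // "8"
```

Setting minimumFractionDigits to a nonzero value would instead force "8.00"-style output; the Double itself never changes, only its presentation.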
Related
I found that using Decimal operands can reduce numeric error in Swift:
0.1 + 0.2 = 0.30000000000000004
Decimal(0.1) + Decimal(0.2) = 0.3
So I tried to write a function that evaluates a calculation String in Decimal, like this:
func calculate(expression: String) -> Decimal {
    let expression = NSExpression(format: expression)
    let value = expression.expressionValue(with: nil, context: nil) as? Decimal
    return value ?? 0.0
}
But the value property keeps coming back nil, so the function always returns 0.0. Can I get some help with this?
Thanks
The Decimal type does not reduce numeric error. It just computes values in decimal. That can increase or decrease error, depending on the calculation. 1/10 happens to be bad in binary for the same reason that 1/3 is bad in decimal. Your code doesn't actually compute anything in Decimal. It's just trying to convert the final Double value to Decimal at the end, which introduces binary-to-decimal rounding (making it less accurate).
That said, expressionValue(with:context:) returns an NSNumber. You can't convert that to Decimal with as?. You need to use .decimalValue:
let number = expression.expressionValue(with: nil, context: nil) as? NSNumber
let value = number?.decimalValue
This will compute the value in Double and then round it to a Decimal.
But if you want to do calculations in Decimal, I don't believe that NSExpression can do that.
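If exact decimal arithmetic is the goal, one option is to skip NSExpression and operate on Decimal values directly, building the operands from strings so no intermediate Double rounding is involved. A minimal sketch (parsing a full expression string is left out):

```swift
import Foundation

// Operands built from strings avoid any intermediate Double rounding
let a = Decimal(string: "0.1")!
let b = Decimal(string: "0.2")!
let sum = a + b
print(sum)  // 0.3
```

You would still need your own parser (or a fixed set of operations) to evaluate arbitrary expression strings this way.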
This question already has an answer here:
Swift JSONEncoder number rounding
I'm trying to create a hash of a given object after converting it to a string in Swift, but the values encoded in the string come out different.
print(myObjectValues.v) // 33.48
let mydata = try JSONEncoder().encode(myObjectValues)
let string = String(data: mydata, encoding: .utf8)!
print(string) // 33.47999999999999488
Here myObjectValues contains a Decimal value of 33.48. If I encode mydata to a string, the value returned is 33.47999999999999488. I've tried rounding the decimal value to 2 places, but the final string keeps showing this number. I've also tried saving it as a String and converting back to Decimal, but the value in the encoded string is still 33.479999999ish.
I can't use the string to compute and compare the hash, because the hash value returned from the server is for 33.48, which will never equal what I get on my end with this long value.
Decimal values created with underlying Double values will always create these issues.
Decimal values created with underlying String values won't create these issues.
What you can try to do is:
- Have a private String value as backing storage that exists just to safely encode and decode this decimal value.
- Expose a computed Decimal property that uses this underlying String value.
import Foundation

class Test: Codable {
    // Not exposed: only for encoding & decoding
    private var decimalString: String = "33.48"

    // Work with this in your app
    var decimal: Decimal {
        get {
            Decimal(string: decimalString) ?? .zero
        }
        set {
            decimalString = "\(newValue)"
        }
    }
}
do {
    let encoded = try JSONEncoder().encode(Test())
    print(String(data: encoded, encoding: .utf8))
    // Optional("{\"decimalString\":\"33.48\"}")

    let decoded = try JSONDecoder().decode(Test.self, from: encoded)
    print(decoded.decimal) // 33.48
    print(decoded.decimal.nextUp) // 33.49
    print(decoded.decimal.nextDown) // 33.47
} catch {
    print(error)
}
I'm trying to create a hash of a given object after converting it to a string in Swift, but the values encoded in the string come out different.
Don't do this. Just don't. It's not sensible.
I'll explain it by analogy: Imagine if you represented numbers with six decimal digits of precision. You have to use some amount of precision, right?
Now, 1/3 would be represented as 0.333333. But 2/3 would be represented by 0.666667. So now, if you multiply 1/3 by two, you will not get 2/3. And if you add 1/3 to 1/3 to 1/3, you will not get 1.
So the hash of 1 will be different depending on how you got that 1! If you add 1/3 to 1/3 to 1/3, you will get a different hash than if you added 2/3 to 1/3.
This is simply not the right data type to hash. Don't use doubles for this purpose. Rounding will work until it doesn't.
And you are using 33 + 48/100 -- a value that cannot be represented exactly in base two just as 1/3 cannot be represented exactly in base ten.
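The instability is easy to see in Swift (a sketch; hash seeds change between runs, but the point is that equal-looking results come from unequal values):

```swift
let computed = 0.1 + 0.2   // actually 0.30000000000000004
let literal = 0.3

print(computed == literal)  // false
// Unequal Doubles hash differently, so any hash of the value
// (or of its string form) depends on how the number was produced.
print(computed.hashValue == literal.hashValue)
```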
I'm reading accelerometer data from an iBeacon that arrives in the following string format:
x hex string value: "0160"
y hex string value: "ff14"
z hex string value: "0114"
I'm expecting these values as doubles ranging from 0g to 1g. How would you convert these hex strings into doubles in Swift?
Get the integer value from the hex string with Int(_:radix:):
let string = "ff14"
let hexValue = Int(string, radix: 16)!
and divide by 65535 (the 16-bit maximum) to get values between 0.0 and 1.0:
let result = Double(hexValue) / 65535
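Putting the two steps together (normalizedValue is just a hypothetical helper name; note this treats the bytes as unsigned, ignoring any two's-complement sign the beacon may intend):

```swift
// Hypothetical helper combining the two steps above
func normalizedValue(fromHex hex: String) -> Double? {
    guard let intValue = Int(hex, radix: 16) else { return nil }
    return Double(intValue) / 65535  // scale 16-bit range to 0.0...1.0
}

print(normalizedValue(fromHex: "0160") ?? 0)  // ~0.005
print(normalizedValue(fromHex: "ff14") ?? 0)  // ~0.996
```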
let dic : [Double : Double] = [1.1 : 2.3, 2.3 : 1.1, 1.2 : 2.3]
print(dic)// [2.2999999999999998: 1.1000000000000001, 1.2: 2.2999999999999998, 1.1000000000000001: 2.2999999999999998]
let double : Double = 2.3
let anotherdouble : Double = 1.1
print(double) // 2.3
print(anotherdouble) // 1.1
I don't understand why the compiler prints values from dictionaries differently.
I'm on Swift 3, Xcode 8. Is this a bug, or some weird optimization, or something else?
EDIT
What's even weirder is that some values go over, some go under, and some stay as they are! 1.1 prints as the larger 1.1000000000000001, 2.3 prints as the smaller 2.2999999999999998, and 1.2 prints as just 1.2.
As already mentioned in the comments, a Double cannot store
the value 1.1 exactly. Swift uses (like many other languages)
binary floating point numbers according to the IEEE 754
standard.
The closest number to 1.1 that can be represented as a Double is
1.100000000000000088817841970012523233890533447265625
and the closest number to 2.3 that can be represented as a Double is
2.29999999999999982236431605997495353221893310546875
Printing that number means that it is converted to a string with
a decimal representation again, and that is done with different
precision, depending on how you print the number.
From the source code at HashedCollections.swift.gyb one can see that the description method of
Dictionary uses debugPrint() for both keys and values,
and debugPrint(x) prints the value of x.debugDescription
(if x conforms to CustomDebugStringConvertible).
On the other hand, print(x) calls x.description if x conforms
to CustomStringConvertible.
So what you see is the different output of description
and debugDescription of Double:
print(1.1.description) // 1.1
print(1.1.debugDescription) // 1.1000000000000001
From the Swift source code one can see
that both use the swift_floatingPointToString()
function in Stubs.cpp, with the Debug parameter set to false and true, respectively.
This parameter controls the precision of the number to string conversion:
int Precision = std::numeric_limits<T>::digits10;
if (Debug) {
  Precision = std::numeric_limits<T>::max_digits10;
}
For the meaning of those constants, see std::numeric_limits:
digits10 – number of decimal digits that can be represented without change,
max_digits10 – number of decimal digits necessary to differentiate all values of this type.
So description creates a string with less decimal digits. That
string can be converted to a Double and back to a string giving
the same result.
debugDescription creates a string with more decimal digits, so that
any two different floating point values will produce a different output.
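The round-trip property of description can be checked directly (a sketch):

```swift
let x = 1.1

// description uses just enough digits to read back the same Double
let roundTripped = Double(x.description)!
print(roundTripped == x)  // true
```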
Yes, Swift uses binary floating-point numbers when storing values in a dictionary.
Declare the dictionary as [Double: Any], or use Float if your number is 32-bit, then upcast to AnyObject.
See the example below:
let strDecimalNumber = "8.37"
var myDictionary : [String: Any] = [:]
myDictionary["key1"] = Float(strDecimalNumber) as AnyObject // 8.369999999999999
myDictionary["key2"] = Double(strDecimalNumber) as AnyObject //8.369999999999999
myDictionary["key3"] = Double(8.37) as AnyObject //8.369999999999999
myDictionary["key4"] = Float(8.37) as AnyObject //8.37
myDictionary["key5"] = 8.37 // 8.3699999999999992
myDictionary["key6"] = strDecimalNumber // "8.37" it is String
myDictionary["key7"] = strDecimalNumber.description // "8.37" it is String
myDictionary["key8"] = Float(10000000.01) // 10000000.0
myDictionary["key9"] = Float(100000000.01) // 100000000.0
myDictionary["key10"] = Float(1000000000.01) // 1e+09
myDictionary["key11"] = Double(1000000000.01) // 1000000000.01
print(myDictionary)
myDictionary will be printed as
["key1": 8.37 , "key2": 8.369999999999999, "key3": 8.369999999999999,
"key4": 8.37, "key5": 8.3699999999999992, "key6": "8.37", "key7": "8.37" , "key8":
10000000.0, "key9": 100000000.0, "key10": 1e+09 ,"key11": 1000000000.01]
As mentioned by Martin R in the answer above, using .description produces a String, not the actual Float value.
I can add the date/time value for the X axis to a TimeSeries, and if I use getPadding() it returns a double. How can I convert this double back to a date?
Example returned double value: 1.40669949E12 (I also tried converting from a string, but that didn't work.)
Now I want to convert this value to a formatted date. Is that possible?
The value you are getting needs to be rounded to a long using Math.round(); then, using this long, you can build a Date that you can format any way you need.
String format = "h:mma"; // for example
Double d = 1.40669949E12;
String formattedStr = new SimpleDateFormat(format).format(new Date(Math.round(d)));