Confusion converting a decimal to hex in Swift 5 - swift

Many versions of this question are posted. My question is slightly different, as I'm getting conflicting results.
If I run the following in a playground, it works fine:
let myNumber = 12345
if let myHex = Double(String(myNumber, radix: 16)) {
    print(myHex)
} else {
    print("Bad input as hexadecimal: \(myNumber)")
}
This returns 3039.
However, if I change myNumber to 1234, I get the Bad Input message. Can anyone see what I'm doing wrong, or point me to a similar question? (I have looked)

You are taking a number, 1234, and converting it to its hex string representation ("4d2"). You're then asking Double to try to interpret that alphanumeric hex string, which it obviously cannot do.
If you want the hex string representation, it is simply:
let myNumber = 1234
let myHex = String(myNumber, radix: 16)
print(myHex)
Your value of 12345 happened to produce a hex string with no a-f characters in it (it was 3039), so the Double conversion did not fail. But it didn't return the right value either: you got the Double 3039.0, which is just the hex digits reinterpreted as a decimal number.
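If you later want to go the other way and parse a hex string back into an integer, that is a job for Int's radix-aware initializer rather than Double; a quick sketch:
let hexString = "4d2"
if let value = Int(hexString, radix: 16) {
    print(value) // 1234
} else {
    print("Bad input as hexadecimal: \(hexString)")
}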

Related

String encoding returns wrong values. 33.48 becomes 33.47999999999999488 [duplicate]

This question already has an answer here:
Swift JSONEncoder number rounding
(1 answer)
Closed 1 year ago.
I'm trying to create a hash of a given object after converting it to a string in Swift, but the values returned in the encoded string are different.
print(myObjectValues.v) // 33.48
let mydata = try JSONEncoder().encode(myObjectValues)
let string = String(data: mydata, encoding: .utf8)!
print(string) // 33.47999999999999488
Here myObjectValues contains a Decimal value of 33.48. If I try to encode mydata to a string, the value returned is 33.47999999999999488. I've tried rounding the decimal value to 2 places, but the final string keeps showing this number. I've also tried saving it as a String and converting back to Decimal, but the value in the encoded string is still this 33.479999999ish number.
I can't use the string to calculate and compare the hash, because the hash value returned from the server is for 33.48, which will never equal what I get on my end with this long value.
Decimal values created with underlying Double values will always create these issues.
Decimal values created with underlying String values won't create these issues.
What you can try to do instead:
Have a private String value as a backing storage that's there just for safely encoding and decoding this decimal value.
Expose another computed Decimal value that uses this underlying String value.
import Foundation

class Test: Codable {
    // Not exposed: only for encoding & decoding
    private var decimalString: String = "33.48"

    // Work with this in your app
    var decimal: Decimal {
        get {
            Decimal(string: decimalString) ?? .zero
        }
        set {
            decimalString = "\(newValue)"
        }
    }
}
do {
    let encoded = try JSONEncoder().encode(Test())
    print(String(data: encoded, encoding: .utf8))
    // Optional("{\"decimalString\":\"33.48\"}")
    let decoded = try JSONDecoder().decode(Test.self, from: encoded)
    print(decoded.decimal)          // 33.48
    print(decoded.decimal.nextUp)   // 33.49
    print(decoded.decimal.nextDown) // 33.47
} catch {
    print(error)
}
I'm trying to create a hash of a given object after converting it to a string in Swift, but the values returned in the encoded string are different.
Don't do this. Just don't. It's not sensible.
I'll explain it by analogy: Imagine if you represented numbers with six decimal digits of precision. You have to use some amount of precision, right?
Now, 1/3 would be represented as 0.333333. But 2/3 would be represented by 0.666667. So now, if you multiply 1/3 by two, you will not get 2/3. And if you add 1/3 to 1/3 to 1/3, you will not get 1.
So the hash of 1 will be different depending on how you got that 1! If you add 1/3 to 1/3 to 1/3, you will get a different hash than if you added 2/3 to 1/3.
This is simply not the right data type to hash. Don't use doubles for this purpose. Rounding will work until it doesn't.
And you are using 33 + 48/100 -- a value that cannot be represented exactly in base two just as 1/3 cannot be represented exactly in base ten.
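A short sketch of the same effect in binary (the values 0.1, 0.2 and 0.3 are purely illustrative, not from the question): two computations that are mathematically equal yield different Doubles, so their string forms, and hence any hash of those strings, differ.
let a = 0.1 + 0.2
let b = 0.3
print(a == b)    // false: the two binary roundings differ
print(String(a)) // 0.30000000000000004 on current Swift
print(String(b)) // 0.3
// Hashing these two strings would give two different digests,
// even though both "ought to" represent 0.3.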

Why are doubles printed differently in dictionaries?

let dic : [Double : Double] = [1.1 : 2.3, 2.3 : 1.1, 1.2 : 2.3]
print(dic)// [2.2999999999999998: 1.1000000000000001, 1.2: 2.2999999999999998, 1.1000000000000001: 2.2999999999999998]
let double : Double = 2.3
let anotherdouble : Double = 1.1
print(double) // 2.3
print(anotherdouble) // 1.1
I don't get why the values from the dictionary are printed differently.
I'm on Swift 3, Xcode 8. Is this a bug, or some weird optimization, or something else?
EDIT
What's even weirder is this:
Some values go over, some go below, and some stay as they are! 1.1 is less than 1.1000000000000001, while 2.3 is more than 2.2999999999999998, and 1.2 is just 1.2.
As already mentioned in the comments, a Double cannot store the value 1.1 exactly. Swift uses (like many other languages) binary floating point numbers according to the IEEE 754 standard.
The closest number to 1.1 that can be represented as a Double is
1.100000000000000088817841970012523233890533447265625
and the closest number to 2.3 that can be represented as a Double is
2.29999999999999982236431605997495353221893310546875
Printing that number means that it is converted to a string with a decimal representation again, and that is done with different precision, depending on how you print the number.
From the source code at HashedCollections.swift.gyb one can see that the description method of Dictionary uses debugPrint() for both keys and values, and debugPrint(x) prints the value of x.debugDescription (if x conforms to CustomDebugStringConvertible).
On the other hand, print(x) calls x.description if x conforms to CustomStringConvertible.
So what you see is the different output of description and debugDescription of Double:
print(1.1.description) // 1.1
print(1.1.debugDescription) // 1.1000000000000001
From the Swift source code one can see that both use the swift_floatingPointToString() function in Stubs.cpp, with the Debug parameter set to false and true, respectively. This parameter controls the precision of the number-to-string conversion:
int Precision = std::numeric_limits<T>::digits10;
if (Debug) {
  Precision = std::numeric_limits<T>::max_digits10;
}
For the meaning of those constants, see std::numeric_limits:
digits10 – number of decimal digits that can be represented without change,
max_digits10 – number of decimal digits necessary to differentiate all values of this type.
So description creates a string with fewer decimal digits. That string can be converted to a Double and back to a string giving the same result.
debugDescription creates a string with more decimal digits, so that any two different floating point values will produce a different output.
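A small sketch of that round-trip property (the exact strings shown are from the Swift 3 era discussed here; newer Swift versions print the shortest string that round-trips for both properties, so debugDescription may look shorter there):
let x = 1.1
print(x.description)      // "1.1"
print(x.debugDescription) // "1.1000000000000001" on Swift 3
print(Double(x.description)! == x)      // true: the short string still round-trips
print(Double(x.debugDescription)! == x) // true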
Yes, Swift uses binary floating point numbers when storing values in a dictionary (and everywhere else).
Declare the dictionary as [Double: Any], use Float if your number fits in 32 bits, and then upcast to AnyObject.
See the example below:
let strDecimalNumber = "8.37"
var myDictionary : [String: Any] = [:]
myDictionary["key1"] = Float(strDecimalNumber) as AnyObject // 8.369999999999999
myDictionary["key2"] = Double(strDecimalNumber) as AnyObject //8.369999999999999
myDictionary["key3"] = Double(8.37) as AnyObject //8.369999999999999
myDictionary["key4"] = Float(8.37) as AnyObject //8.37
myDictionary["key5"] = 8.37 // 8.3699999999999992
myDictionary["key6"] = strDecimalNumber // "8.37" it is String
myDictionary["key7"] = strDecimalNumber.description // "8.37" it is String
myDictionary["key8"] = Float(10000000.01) // 10000000.0
myDictionary["key9"] = Float(100000000.01) // 100000000.0
myDictionary["key10"] = Float(1000000000.01) // 1e+09
myDictionary["key11"] = Double(1000000000.01) // 1000000000.01
print(myDictionary)
myDictionary will be printed as:
["key1": 8.37, "key2": 8.369999999999999, "key3": 8.369999999999999, "key4": 8.37, "key5": 8.3699999999999992, "key6": "8.37", "key7": "8.37", "key8": 10000000.0, "key9": 100000000.0, "key10": 1e+09, "key11": 1000000000.01]
As mentioned by Martin R in the answer above, using .description produces a String, not an actual Float.

Display certain number of letters

I have a word that is being displayed in a label. Could I program it so that it only shows the last 2 characters of the word, or only the first 3? How can I do this?
Swift's string APIs can be a little confusing. You get access to the characters of a string via its characters property, on which you can then use prefix() or suffix() to get the substring you want. That subset of characters needs to be converted back to a String:
let str = "Hello, world!"
// first three characters:
let prefixSubstring = String(str.characters.prefix(3))
// last two characters:
let suffixSubstring = String(str.characters.suffix(2))
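For reference, in current Swift (4 and later) the characters view has been removed and prefix/suffix are available on String directly; a minimal sketch:
let greeting = "Hello, world!"
let firstThree = String(greeting.prefix(3)) // "Hel"
let lastTwo = String(greeting.suffix(2))    // "d!"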
I agree that String indexing in Swift is definitely confusing, and the APIs changed a bit from Swift 1 to 2, which makes googling a challenge, but it can actually be quite simple once you get the hang of the methods. You basically need to make it a two-step process:
1) Find the index you need
2) Advance from there
For example:
let sampleString = "HelloWorld"
let lastThreeindex = sampleString.endIndex.advancedBy(-3)
sampleString.substringFromIndex(lastThreeindex) //prints rld
let secondIndex = sampleString.startIndex.advancedBy(2)
sampleString.substringToIndex(secondIndex) //prints He
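The same two-step approach with current (Swift 4+) APIs, where advancedBy and the substring methods were replaced by index(_:offsetBy:) and range subscripts, might look like this:
let word = "HelloWorld"
let lastThreeIndex = word.index(word.endIndex, offsetBy: -3)
print(word[lastThreeIndex...]) // rld
let secondIndex = word.index(word.startIndex, offsetBy: 2)
print(word[..<secondIndex])    // He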

How do I convert a 50 digit string into the appropriate integer type in Swift?

I need to convert this 50 digit string 53503534226472524250874054075591789781264330331690 into the appropriate number type. I tried this:
let str = "53503534226472524250874054075591789781264330331690"
let num = str.toInt(); // Returns nil
let num = Int64(str.toInt()); // Errors out
The maximum value of an Int64 (which is a signed type) is 9,223,372,036,854,775,807, so you cannot convert a 50-digit number just like that.
You need something like the BigInt class found in other languages. Check this other question where they answer with alternatives about BigInt in Swift:
BigInteger equivalent in Swift?
In summary, there are third-party libraries out there for arbitrary-precision integers. The only alternative from Apple is NSDecimalNumber, but its limit is 38 digits of mantissa, whereas your number has 50.
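As a sketch of the library route, assuming the third-party attaswift/BigInt package (added via Swift Package Manager; not part of Apple's SDK) and its failable string initializer:
import BigInt // https://github.com/attaswift/BigInt
let str = "53503534226472524250874054075591789781264330331690"
if let num = BigInt(str) { // nil if the string is not a valid integer
    print(num)     // the full 50-digit value
    print(num + 1) // arbitrary-precision arithmetic works as usual
} else {
    print("Not a valid integer")
}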

NSDecimalNumber and popping digits off of the end

I'm not quite sure what to call it, but I have a text field to hold a currency value, so I'm storing that as an NSDecimalNumber. I don't want to use the numbers & symbols keyboard, so I'm using a number pad and inferring the location of the decimal place like ATMs do. It works fine for entering numbers: type 1234 and it displays $12.34. But now I need to implement backspace. So, assuming $12.34 is entered, hitting backspace would show $1.23. I'm not quite sure how to do this with a decimal number. With an int you would just divide by 10 to remove the right-most digit, but that obviously doesn't work here. I could do it by some messy converting to int / 10 then back to decimal, but that just sounds horrific... Any suggestions?
Call - (NSDecimalNumber *)decimalNumberByDividingBy:(NSDecimalNumber *)decimalNumber withBehavior:(id<NSDecimalNumberBehaviors>)behavior on it.
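A minimal Swift sketch of that idea (the helper name is mine, not from the answer): divide by 10 with an NSDecimalNumberHandler that rounds down to two fraction digits.
import Foundation
// Illustrative helper: drops the right-most entered digit of an ATM-style amount.
func removeLastDigit(of amount: NSDecimalNumber) -> NSDecimalNumber {
    // Rounding down at scale 2 turns 12.34 / 10 = 1.234 into 1.23.
    let behavior = NSDecimalNumberHandler(roundingMode: .down,
                                          scale: 2,
                                          raiseOnExactness: false,
                                          raiseOnOverflow: false,
                                          raiseOnUnderflow: false,
                                          raiseOnDivideByZero: false)
    return amount.dividing(by: NSDecimalNumber(string: "10"), withBehavior: behavior)
}
print(removeLastDigit(of: NSDecimalNumber(string: "12.34"))) // 1.23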
How about using stringValue?
1) NSDecimalNumber to String
2) Drop the last character
3) String back to NSDecimalNumber
Below is an example for Swift 3:
func popLastNumber(of number: NSDecimalNumber) -> NSDecimalNumber {
    let stringFromNumber = number.stringValue // NSNumber property
    let lastIndex = stringFromNumber.endIndex
    let targetIndex = stringFromNumber.index(before: lastIndex)
    let removed = stringFromNumber.substring(to: targetIndex)
    return NSDecimalNumber(string: removed)
}
If your input number is a single digit, this returns NaN; you could replace that with NSDecimalNumber.zero if you need to. It works like the delete button on a calculator.
It hasn't been tested much, so if someone finds another NaN case, please report it in a reply.