How do I convert a 50-digit string into the appropriate integer type in Swift?

I need to convert this 50-digit string, 53503534226472524250874054075591789781264330331690, into the appropriate number type. I tried this:
let str = "53503534226472524250874054075591789781264330331690"
let num = str.toInt(); // Returns nil
let num = Int64(str.toInt()); // Errors out

The maximum value of a signed Int64 is 9,223,372,036,854,775,807, which is only 19 digits, so you cannot convert it just like that.
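You can check this directly; the failable String initializers on the integer types return nil on overflow:
let str = "53503534226472524250874054075591789781264330331690"
print(Int64.max)            // 9223372036854775807 (19 digits)
print(Int64(str) ?? "nil")  // nil: the value overflows Int64
print(UInt64(str) ?? "nil") // nil: even UInt64.max has only 20 digits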
You need something like the BigInt class found in other languages. Check this other question, where the answers cover BigInt alternatives for Swift:
BigInteger equivalent in Swift?
In summary, there are third-party libraries for arbitrarily long integers. The only alternative from Apple is NSDecimalNumber, but its mantissa is limited to 38 digits, whereas your number has 50.
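For example, with the third-party attaswift/BigInt package (one of the libraries discussed in the linked question; a sketch assuming you've added the package via Swift Package Manager):
import BigInt

let str = "53503534226472524250874054075591789781264330331690"
if let num = BigInt(str) {  // failable: nil if the string isn't a valid number
    print(num)              // the full 50-digit value, no overflow
    print(num * num)        // arbitrary-precision arithmetic just works
}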

Related

String encoding returns wrong values. 33.48 becomes 33.47999999999999488 [duplicate]

This question already has an answer here:
Swift JSONEncoder number rounding
(1 answer)
I'm trying to create a hash of a given object after converting it to a string in Swift, but the values encoded in the string come out different.
print(myObjectValues.v) // 33.48
let mydata = try JSONEncoder().encode(myObjectValues)
let string = String(data: mydata, encoding: .utf8)!
print(string) // 33.47999999999999488
Here myObjectValues contains a Decimal value of 33.48. If I encode mydata to a string, the value returned is 33.47999999999999488. I've tried rounding the decimal value to 2 places, but the final string keeps showing the long number. I've also tried saving it as a String and converting back to Decimal, but the encoded string still comes out as this 33.479999999ish value.
I can't use the string to calculate and compare the hash, because the hash value returned from the server is for 33.48, which will never equal what I get on my end with this long value.
Decimal values created from underlying Double values will always have these issues.
Decimal values created from underlying String values won't.
What you can try instead:
- Keep a private String value as backing storage that exists just for safely encoding and decoding the decimal value.
- Expose a computed Decimal property that reads from and writes to this underlying String.
import Foundation

class Test: Codable {
    // Not exposed: only for encoding & decoding
    private var decimalString: String = "33.48"

    // Work with this in your app
    var decimal: Decimal {
        get {
            Decimal(string: decimalString) ?? .zero
        }
        set {
            // Update only the backing string; re-assigning `decimal`
            // here would recurse into this setter forever.
            decimalString = "\(newValue)"
        }
    }
}

do {
    let encoded = try JSONEncoder().encode(Test())
    print(String(data: encoded, encoding: .utf8))
    // Optional("{\"decimalString\":\"33.48\"}")
    let decoded = try JSONDecoder().decode(Test.self, from: encoded)
    print(decoded.decimal)          // 33.48
    print(decoded.decimal.nextUp)   // 33.49
    print(decoded.decimal.nextDown) // 33.47
} catch {
    print(error)
}
"I'm trying to create a hash of a given object after converting it to a string in Swift, but the values encoded in the string come out different."
Don't do this. Just don't. It's not sensible.
I'll explain it by analogy: Imagine if you represented numbers with six decimal digits of precision. You have to use some amount of precision, right?
Now, 1/3 would be represented as 0.333333. But 2/3 would be represented by 0.666667. So now, if you multiply 1/3 by two, you will not get 2/3. And if you add 1/3 to 1/3 to 1/3, you will not get 1.
So the hash of 1 will be different depending on how you got that 1! If you add 1/3 to 1/3 to 1/3, you will get a different hash than if you added 2/3 to 1/3.
This is simply not the right data type to hash. Don't use doubles for this purpose. Rounding will work until it doesn't.
And you are using 33 + 48/100, a value that cannot be represented exactly in base two, just as 1/3 cannot be represented exactly in base ten.
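You can see the mismatch directly for this very value (a small sketch; 33.47999999999999488 is the value from the question):
import Foundation

let fromDouble = Decimal(33.48)             // routed through a binary Double
let fromString = Decimal(string: "33.48")!  // the exact decimal 33.48
print(fromDouble == fromString)             // false: the Double carries rounding error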

How can I declare and initialize a constant bigger than UInt64 in Swift?

I'd like to know how I can declare and initialize a constant bigger than UInt64 in Swift.
Swift's type inference can't handle the number below. How should I solve this issue?
let number = 11111111222222233333333344444445555555987654321 // Error: overflow
print(number, type(of: number))
Decimal is the numeric type capable of holding the largest value in Swift. However, you can't write a Decimal literal, since integer literals are inferred as Int and floating-point literals as Double, so you need to initialize the Decimal from a String literal.
let number = Decimal(string: "321321321155564654646546546546554653334334")!
From the documentation of NSDecimalNumber (whose Swift equivalent is Decimal, so their numeric ranges are the same):
An instance can represent any number that can be expressed as mantissa x 10^exponent where mantissa is a decimal integer up to 38 digits long, and exponent is an integer from –128 through 127.
If you need to represent arbitrary-length numbers in Swift, you need to use a third-party library (or create one yourself); there's no built-in type that can handle this.
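To see the 38-digit limit from the quote above in practice (a sketch; the exact rounded output may vary by platform):
import Foundation

// 38 significant digits fit in the mantissa exactly:
let exact = Decimal(string: String(repeating: "9", count: 38))!
// A 42-digit value exceeds the mantissa and is silently rounded
// to 38 significant digits, so the round trip is lossy:
let rounded = Decimal(string: "321321321155564654646546546546554653334334")!
print(exact)
print(rounded)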

How to convert a negative int to a positive int in Flutter

For example, if I have a list like
List list = [-12,3,-24,58,12];
I have already tried this, but I didn't understand it: https://stackoverflow.com/questions/55374813/why-abs-function-in-dart-return-negative-number-when-not-wrapped-in-parenthesi\
I want to convert all the elements to positive values, or, if I had just one negative integer, to convert it to a positive integer.
If you need more information please let me know, and thanks for the help.
You can use the abs method on numbers to get the absolute value:
int number = -5;
print(number.abs()); // prints: 5
https://dart.dev/guides/language/language-tour#numbers

How to store a price in two integers in swift

I have read that I shouldn't treat prices as double numbers and that I should store them as two integers.
If, for example, I have an input like 246.464 in a UITextField, how should I turn it into two integers?
Forget two integers; store it as an NSDecimalNumber. That's the class created specifically for this task.
Unlike Float or Double, NSDecimalNumber uses base-10 (decimal) arithmetic, so no precision is lost converting between decimal and binary representations and vice versa.
let text = "246.464" // textField.text
let number = NSDecimalNumber(string: text)
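And arithmetic stays exact in base ten, for example:
import Foundation

let price = NSDecimalNumber(string: "246.464")
let quantity = NSDecimalNumber(value: 3)
print(price.multiplying(by: quantity)) // 739.392, with no binary rounding error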

Multiplying integers

I am certain I am missing something very, very obvious, but can anyone tell me why I am having trouble multiplying two Integers? The following code:
let twenty: Integer = 20
let ten: Integer = 10
let result: Integer = twenty * ten
presents the error Could not find an overload for '*' that accepts the supplied arguments.
Other questions on SO with the same error are caused by trying to multiply different types together, but surely these are both Integer types?
(PS: The actual code I am trying to run is var value = self.value * 10 but I have expanded it to the sample while debugging to make absolutely sure that the correct types are being used)
Use Int instead. Integer is a protocol.
Integer is a protocol, not a type. Use Int instead.
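Keeping the explicit annotations from the question, the fixed version is just:
let twenty: Int = 20
let ten: Int = 10
let result: Int = twenty * ten // 200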
As already stated, Integer is a protocol, not a type.
In your situation, you don't need to make the type explicit, because type inference handles it.
This would be enough:
let twenty = 20
let ten = 10
let result = twenty * ten
NSLog("%d", result)