Binary in Swift

I'm trying to convert binary values and the like in Swift, and this is my code:
let hexa = String(Int(a, radix: 2)!, radix: 16) // converting binary to hexadecimal
I am getting the error
Cannot convert value of type 'Int' to expected argument type 'String'

You're misunderstanding how integers are stored.
There is no notion of a "decimal" Int, a "hexadecimal" Int, etc. When you have an Int in memory, it's always binary (radix 2). It's stored as a series of 64 or 32 bits.
When you try to assign to the Int a value like 10 (decimal), 0xA (hex), 0b1010 (binary), the compiler does the necessary parsing to convert your source code's string representation of that Int, into a series of bits that can be stored in the Int's 64 or 32 bits of memory.
When you try to use the Int, for example with print(a), there is conversion behind the scenes to take that Int's binary representation in memory, and convert it into a String whose symbols represent an Int in base 10, using the symbols we're used to (0-9).
On a more fundamental level, it helps to understand that the notion of a radix is a construct devised purely for our convenience when working with numbers. Abstractly, a number has a magnitude that is a distinct entity, uncoupled from any radix. A magnitude can be represented concretely using a textual representation and a radix.
This part, Int(a, radix: 2), doesn't make sense. Even if such an initializer (Int.init?(Int, radix: Int)) existed, it wouldn't do anything! If a = 5, then a is stored as binary 0b101. Parsing that from binary into an Int would give you... 0b101, the same 5 you started with.
On the other hand, Strings can have a notion of a radix, because they can be a textual representation of a decimal Int, a hex Int, etc. To convert from a String that contains a number, you use Int.init?(String, radix: Int). The key here is that it takes a String parameter.
let a = 10 // decimal 10 is stored in memory as binary 1010
let hexa = String(a, radix: 16) // the Int is converted to its hex string representation, "a"
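
For the conversion the question was actually after (binary text to hexadecimal text), the two steps simply chain together. A minimal sketch, assuming the input is a String of binary digits:

let binaryText = "1010"
if let value = Int(binaryText, radix: 2) {   // parse the binary text into an Int
    print(String(value, radix: 16))          // render that Int as hex text: "a"
}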


String encoding returns wrong values: 33.48 becomes 33.47999999999999488 [duplicate]

This question already has an answer here: Swift JSONEncoder number rounding (1 answer). Closed 1 year ago.
I'm trying to create a hash of a given object after converting it to a string in Swift, but the values encoded in the string are different.
print(myObjectValues.v) // 33.48
let mydata = try JSONEncoder().encode(myObjectValues)
let string = String(data: mydata, encoding: .utf8)!
print(string) // 33.47999999999999488
Here myObjectValues contains a Decimal value of 33.48. If I try to encode mydata to a string, the value returned is 33.47999999999999488. I've tried rounding the decimal value to 2 places, but the final string keeps showing this number. I've also tried saving it as a String and converting back to Decimal, but the value in the encoded string is still this 33.479999999-ish number.
I can't use the string to calculate and compare the hash, because the hash value returned from the server is for 33.48, which will never equal what I get on my end with this long value.
Decimal values created from an underlying Double will always have these issues.
Decimal values created from an underlying String won't.
What you can try instead:
Keep a private String property as backing storage, there only so the decimal value can be encoded and decoded safely.
Expose a computed Decimal property that reads from and writes to that underlying String.
import Foundation

class Test: Codable {
    // Not exposed: used only for encoding & decoding
    private var decimalString: String = "33.48"

    // Work with this in your app
    var decimal: Decimal {
        get {
            Decimal(string: decimalString) ?? .zero
        }
        set {
            decimalString = "\(newValue)"
        }
    }
}
do {
    let encoded = try JSONEncoder().encode(Test())
    print(String(data: encoded, encoding: .utf8))
    // Optional("{\"decimalString\":\"33.48\"}")
    let decoded = try JSONDecoder().decode(Test.self, from: encoded)
    print(decoded.decimal)          // 33.48
    print(decoded.decimal.nextUp)   // 33.49
    print(decoded.decimal.nextDown) // 33.47
} catch {
    print(error)
}
I'm trying to create a hash of a given object after converting it to a string in Swift, but the values encoded in the string are different.
Don't do this. Just don't. It's not sensible.
I'll explain it by analogy: Imagine if you represented numbers with six decimal digits of precision. You have to use some amount of precision, right?
Now, 1/3 would be represented as 0.333333. But 2/3 would be represented by 0.666667. So now, if you multiply 1/3 by two, you will not get 2/3. And if you add 1/3 to 1/3 to 1/3, you will not get 1.
So the hash of 1 will be different depending on how you got that 1! If you add 1/3 to 1/3 to 1/3, you will get a different hash than if you added 2/3 to 1/3.
This is simply not the right data type to hash. Don't use doubles for this purpose. Rounding will work until it doesn't.
And you are using 33 + 48/100, a value that cannot be represented exactly in base two, just as 1/3 cannot be represented exactly in base ten.
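
A quick way to see this in a playground (a small sketch; the exact digits printed may vary slightly by platform):

import Foundation

let fromDouble = Decimal(33.48)            // built from a Double, inherits its binary rounding error
let fromString = Decimal(string: "33.48")! // built from a String, exact
print(fromDouble)                          // something like 33.47999999999999488
print(fromString)                          // 33.48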

How can I declare and initialize a constant bigger than UInt64 in Swift?

I'd like to know how I can declare and initialize a constant bigger than UInt64 in Swift.
Swift's type inference doesn't seem able to handle a number this large. How should I solve this issue?
let number = 11111111222222233333333344444445555555987654321 // Error: overflow
print(number, type(of: number))
Decimal is the numeric type capable of holding the largest value in Swift. However, you can't write a Decimal literal, since integer literals are inferred as Int and floating-point literals as Double, so you need to initialise the Decimal from a String literal.
let number = Decimal(string: "321321321155564654646546546546554653334334")!
From the documentation of NSDecimalNumber (whose Swift counterpart is Decimal, so their numeric ranges are equivalent):
An instance can represent any number that can be expressed as mantissa x 10^exponent where mantissa is a decimal integer up to 38 digits long, and exponent is an integer from –128 through 127.
If you need to be able to represent arbitrary-length numbers in Swift, you need to use a 3rd party library (or create one yourself), there's no built-in type that could handle this in Swift.
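
If you only need one or two operations, you can also sketch the arithmetic yourself on top of digit arrays. A toy illustration of addition for non-negative decimal strings (illustration only, not a production big-integer type):

func addDecimalStrings(_ a: String, _ b: String) -> String {
    // Least significant digit first makes carrying straightforward.
    let x = a.reversed().compactMap { $0.wholeNumberValue }
    let y = b.reversed().compactMap { $0.wholeNumberValue }
    var digits: [Int] = []
    var carry = 0
    for i in 0..<max(x.count, y.count) {
        let sum = (i < x.count ? x[i] : 0) + (i < y.count ? y[i] : 0) + carry
        digits.append(sum % 10)
        carry = sum / 10
    }
    if carry > 0 { digits.append(carry) }
    return String(digits.reversed().map { Character(String($0)) })
}

print(addDecimalStrings("999999999999999999999999", "1"))
// 1000000000000000000000000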

Swift Binary string to Int [duplicate]

I'm currently trying to use the function
Int(binaryString, radix: 2)
to convert a string of binary to an Int. However, this function seems to always be converting the binary string into an unsigned integer. For example,
Int("1111111110011100", radix: 2)
returns 65436, whereas I'd expect to get -100 from it if it were doing a signed int conversion. I haven't really worked with binary much, so I was wondering what I should do here? Is there a code-efficient way built into Swift 3 that does this for signed ints? I had initially expected this to work because it's an Int initializer (not UInt).
Playing around, you can get the desired result as follows:
let binaryString = "1111111110011100"
print(Int(binaryString, radix: 2)!)
print(UInt16(binaryString, radix: 2)!)
print(Int16(bitPattern: UInt16(binaryString, radix: 2)!))
Output:
65436
65436
-100
The desired result comes from creating a signed Int16 using the bit pattern of a UInt16.
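If the bit width isn't always 16, one possible generalization is a small helper that sign-extends based on the string's length. This is a sketch, not a standard-library API; it assumes the leading character is the sign bit and the width is at most 64:

func signedValue(fromBinary bits: String) -> Int64? {
    guard !bits.isEmpty, bits.count <= 64, let magnitude = UInt64(bits, radix: 2) else { return nil }
    if bits.first == "1" && bits.count < 64 {
        // The leading bit is a sign bit: subtract 2^width (wrapping) to sign-extend.
        return Int64(bitPattern: magnitude &- (UInt64(1) << bits.count))
    }
    return Int64(bitPattern: magnitude)
}

print(signedValue(fromBinary: "1111111110011100")!) // -100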

Swift: Why does UInt.max == -1

As the title states, lldb reports the value of UInt.max to be a UInt of -1, which seems highly illogical. Considering that let uint: UInt = -1 doesn't even compile, how is this even possible? I don't see any way to have a negative value of UInt at runtime because the initializer will crash if given a negative value. I want to know the actual maximum value of UInt.
The Int value of -1 and the UInt value UInt.max have the same bit representation in memory.
You can see that if you do:
let i = Int(bitPattern: UInt.max) // i == -1
and in the opposite direction:
if UInt(bitPattern: Int(-1)) == UInt.max {
print("same")
}
Output:
same
The debugger is incorrectly displaying UInt.max as a signed Int. They have the same bit representation in memory (0xffffffffffffffff on a 64-bit system such as iPhone 6 and 0xffffffff on a 32-bit system such as iPhone 5), and the debugger apparently chooses to show that value as an Int.
You can see the same issue if you do:
print(String(format: "%d", UInt.max)) // prints "-1"
It doesn't mean UInt.max is -1, just that both have the same representation in memory.
To see the maximum value of UInt, do the following in an app or on a Swift Playground:
print(UInt.max)
This will print 18446744073709551615 on a 64-bit system (such as a Macintosh or iPhone 6) and 4294967295 on a 32-bit system (such as an iPhone 5).
In lldb:
(lldb) p String(UInt.max)
(String) $R0 = "18446744073709551615"
(lldb)
This sounds like a case of the same bit pattern being interpreted as a two's-complement value in one place and as an unsigned value in another.
In the unsigned world, a binary number is just a binary number: the more bits that are 1, the bigger the number, since there is no need to encode the sign. To represent the largest range of signed values, the two's-complement encoding scheme stores positive values as normal provided the most significant bit is not 1. If the most significant bit is 1, the bits are reinterpreted as described at https://en.wikipedia.org/wiki/Two%27s_complement.
As shown on Wikipedia, the Two's Complement representation of -1 has all bits set to 1, or the maximal unsigned value.
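You can watch this happen with a small fixed-width type, where the patterns are easy to read (a quick playground sketch):

let minusOne: Int8 = -1
print(String(UInt8(bitPattern: minusOne), radix: 2)) // 11111111, all bits set
print(UInt8.max)                                     // 255, the same bit pattern read as unsigned
print(Int8.max, Int8.min)                            // 127 -128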

How do I convert a 50 digit string into the appropriate integer type in Swift?

I need to convert this 50 digit string 53503534226472524250874054075591789781264330331690 into the appropriate number type. I tried this:
let str = "53503534226472524250874054075591789781264330331690"
let num = str.toInt(); // Returns nil
let num = Int64(str.toInt()); // Errors out
The maximum value of a signed Int64 is 9,223,372,036,854,775,807, so you cannot convert it just like that.
You need something like the BigInt type found in other languages. Check this other question, where the answers cover alternatives for BigInt in Swift:
BigInteger equivalent in Swift?
In summary, there are third-party libraries out there for arbitrarily long integers. The only alternative from Apple is NSDecimalNumber (Decimal in Swift), but its limit is 38 digits of mantissa, whereas your number has 50.
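
You can see the 38-digit ceiling directly if you try to push the 50-digit value through Decimal (a quick check; the exact rounded output may vary):

import Foundation

let str = "53503534226472524250874054075591789781264330331690"
let approx = Decimal(string: str)
print(approx ?? "nil") // parses, but only about 38 significant digits survive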