Swift Binary string to Int [duplicate]

I'm currently trying to use the function
Int(binaryString, radix: 2)
to convert a string of binary to an Int. However, this function seems to always be converting the binary string into an unsigned integer. For example,
Int("1111111110011100", radix: 2)
returns 65436 when I'd expect to get -100 if it were doing a signed int conversion. I haven't really worked with binary much, so I was wondering what I should do here? Is there a code-efficient way built into Swift 3 that does this for signed ints? I had initially expected this to work because it's an Int constructor (not UInt).

Playing around, you can get the desired result as follows:
let binaryString = "1111111110011100"
print(Int(binaryString, radix: 2)!)                       // parses as a non-negative Int
print(UInt16(binaryString, radix: 2)!)                    // same bit pattern, as an unsigned 16-bit value
print(Int16(bitPattern: UInt16(binaryString, radix: 2)!)) // reinterpret those 16 bits as a signed Int16
Output:
65436
65436
-100
The desired result comes from creating a signed Int16 using the bit pattern of a UInt16.
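If you need this in more than one place, the two-step conversion can be wrapped in a small helper. This is only a sketch, assuming the input is meant to be a 16-bit pattern; the function name is made up for illustration:
// Hypothetical helper: parse a binary string as a signed 16-bit value.
func parseSignedBinary16(_ s: String) -> Int16? {
    guard let bits = UInt16(s, radix: 2) else { return nil } // reject non-binary or oversized input
    return Int16(bitPattern: bits)                           // reinterpret the 16 bits as signed
}
print(parseSignedBinary16("1111111110011100")!) // -100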

Related

String encoding returns wrong values. 33.48 becomes 33.47999999999999488 [duplicate]

This question already has an answer here: Swift JSONEncoder number rounding
I'm trying to create a hash of a given object after converting it to a string in Swift, but the values encoded in the string are different.
print(myObjectValues.v) // 33.48
let mydata = try JSONEncoder().encode(myObjectValues)
let string = String(data: mydata, encoding: .utf8)!
print(string) // 33.47999999999999488
Here myObjectValues contains a Decimal value of 33.48. If I try to encode mydata to a string, the value returned is 33.47999999999999488. I've tried rounding the decimal value to 2 places, but the final string keeps showing this long number. I've also tried saving it in a String and converting back to Decimal, but the value returned in the encoded string is still this 33.479999999-ish value.
I can't use the string to calculate and compare the hash, because the hash value returned from the server is for 33.48, which will never be equal to what I'll get on my end with this long value.
Decimal values created with underlying Double values will always create these issues.
Decimal values created with underlying String values won't create these issues.
What you can try to do is:
Have a private String value as a backing storage that's there just for safely encoding and decoding this decimal value.
Expose another computed Decimal value that uses this underlying String value.
import Foundation
class Test: Codable {
    // Not exposed: used only for encoding & decoding
    private var decimalString: String = "33.48"
    // Work with this in your app
    var decimal: Decimal {
        get {
            Decimal(string: decimalString) ?? .zero
        }
        set {
            // Only the backing string is updated here; assigning to `decimal`
            // inside its own setter would recurse forever.
            decimalString = "\(newValue)"
        }
    }
}
do {
    let encoded = try JSONEncoder().encode(Test())
    print(String(data: encoded, encoding: .utf8))
    // Optional("{\"decimalString\":\"33.48\"}")
    let decoded = try JSONDecoder().decode(Test.self, from: encoded)
    print(decoded.decimal)          // 33.48
    print(decoded.decimal.nextUp)   // 33.49
    print(decoded.decimal.nextDown) // 33.47
} catch {
    print(error)
}
I'm trying to create a hash of a given object after converting it to a string in Swift, but the values encoded in the string are different.
Don't do this. Just don't. It's not sensible.
I'll explain it by analogy: Imagine if you represented numbers with six decimal digits of precision. You have to use some amount of precision, right?
Now, 1/3 would be represented as 0.333333. But 2/3 would be represented by 0.666667. So now, if you multiply 1/3 by two, you will not get 2/3. And if you add 1/3 to 1/3 to 1/3, you will not get 1.
So the hash of 1 will be different depending on how you got that 1! If you add 1/3 to 1/3 to 1/3, you will get a different hash than if you added 2/3 to 1/3.
This is simply not the right data type to hash. Don't use doubles for this purpose. Rounding will work until it doesn't.
And you are using 33 + 48/100 -- a value that cannot be represented exactly in base two just as 1/3 cannot be represented exactly in base ten.
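To see the mismatch directly, compare a Decimal built from a Double literal with one built from a String. A minimal sketch; the exact digits printed for the Double-based value are illustrative and can vary:
import Foundation
let fromDouble = Decimal(33.48)            // goes through a binary Double first
let fromString = Decimal(string: "33.48")! // parsed directly in base ten
print(fromDouble)               // a long tail, e.g. 33.48000000000000512
print(fromString)               // 33.48
print(fromDouble == fromString) // false, so their hashes differ too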

In Swift I can't create a negative number in binary

I'm trying to assign -5 to signedInt but getting an error:
Integer literal '133' overflows when stored into 'Int8'
let signedInt: Int8 = 0b10000101
print(signedInt)
print(String(signedInt, radix: 2))
Your value is not -5, but -123.
You can't get there with a direct assignment because the literal is interpreted as 133, which overflows a signed Int8.
To assign a negative value, use Int8(bitPattern:) to convert the value from a UInt8 to an Int8:
let signedInt = Int8(bitPattern: 0b10000101)
print(signedInt)
-123
-5 is 0b11111011 which is the 2's complement of 0b00000101.
To form the 2's complement, start with the binary pattern for 5:
0b00000101
invert all of the bits:
0b11111010
and add 1:
0b11111011
You can use UInt8(bitPattern:) to find the representation of the number:
let signedInt: Int8 = -5
print(String(UInt8(bitPattern: signedInt), radix: 2))
11111011
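The same recipe can be written out with Swift's bitwise operators; a small sketch of the invert-and-add-one steps above:
let five: UInt8 = 0b00000101
let inverted = ~five               // 0b11111010, all bits flipped
let twosComplement = inverted &+ 1 // 0b11111011, add 1 with wrapping addition
print(String(twosComplement, radix: 2)) // 11111011
print(Int8(bitPattern: twosComplement)) // -5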

Difference between Int and UInt8 in Swift

What are the differences between the data types Int & UInt8 in Swift?
It looks like UInt8 is used for binary data, and I need to convert UInt8 to Int. Is this possible?
The U in UInt stands for unsigned int.
It is not used just for binary data. UInt holds non-negative numbers only, like the natural numbers.
I recommend getting to know how negative numbers are represented by a computer.
Int8 is an Integer type which can store positive and negative values.
UInt8 is an unsigned integer which can store only positive values.
You can convert between UInt8 and Int8 as long as the value fits in the destination type: an Int8 must be non-negative to become a UInt8, and a UInt8 must be 127 or less to become an Int8 (otherwise use the bitPattern: initializers).
UInt8 is an 8-bit store, while the size of Int is not fixed; it is defined by the platform and compiler:
https://developer.apple.com/library/content/documentation/Swift/Conceptual/Swift_Programming_Language/TheBasics.html
Int could be 32 or 64 bits
Updated for Swift:
Type     Range                                                  Bytes per Element
UInt8    0 to 255                                               1
Int      -9223372036854775808 to 9223372036854775807 (64-bit)   4 or 8, depending on the platform
If you want to find the max and min range of Int or UInt8:
let maxIntValue = Int.max
let maxUInt8Value = UInt8.max
let minIntValue = Int.min
let minUInt8Value = UInt8.min
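You can also check the storage sizes at runtime; a small illustrative sketch:
print(MemoryLayout<UInt8>.size) // 1 byte
print(MemoryLayout<Int>.size)   // 8 on 64-bit platforms, 4 on 32-bit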
If you want to convert a UInt to an Int, use the simple function below (note that it takes a UInt; for a UInt8 a plain Int(_:) conversion already suffices, as shown after the function):
func convertToInt(unsigned: UInt) -> Int {
    // Values up to Int.max convert directly; larger values are wrapped
    // into the negative range, i.e. the bit pattern is reinterpreted.
    let signed = (unsigned <= UInt(Int.max)) ?
        Int(unsigned) :
        Int(unsigned - UInt(Int.max) - 1) + Int.min
    return signed
}
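For UInt8 specifically, every value (0...255) fits in an Int, so a plain conversion is enough:
let byte: UInt8 = 200
let asInt = Int(byte) // 200, never traps because UInt8.max < Int.max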

Binary in Swift

I'm trying to convert binary and so on with Swift, and this is my code:
let hexa = String(Int(a, radix: 2)!, radix: 16)// Converting binary to hexadecimal
I am getting the error
Cannot convert value of type 'Int' to expected argument type 'String'
You're misunderstanding how integers are stored.
There is no notion of a "decimal" Int, a "hexadecimal" Int, etc. When you have an Int in memory, it's always binary (radix 2). It's stored as a series of 64 or 32 bits.
When you try to assign to the Int a value like 10 (decimal), 0xA (hex), 0b1010 (binary), the compiler does the necessary parsing to convert your source code's string representation of that Int, into a series of bits that can be stored in the Int's 64 or 32 bits of memory.
When you try to use the Int, for example with print(a), there is conversion behind the scenes to take that Int's binary representation in memory, and convert it into a String whose symbols represent an Int in base 10, using the symbols we're used to (0-9).
On a more fundamental level, it helps to understand that the notion of a radix is a construct devised purely for our convenience when working with numbers. Abstractly, a number has a magnitude that is a distinct entity, uncoupled from any radix. A magnitude can be represented concretely using a textual representation and a radix.
This part, Int(a, radix: 2), doesn't make sense. Even supposing such an initializer (Int.init?(Int, radix: Int)) existed, it wouldn't do anything! If a = 5, then a is stored as binary 0b101. This would then be parsed from binary into an Int, giving you... 0b101, or the same 5 you started with.
On the other hand, Strings can have a notion of a radix, because they can be a textual representation of a decimal Int, a hex Int, etc. To convert from a String that contains a number, you use Int.init?(String, radix: Int). The key here is that it takes a String parameter.
let a = 10 //decimal 10 is stored as binary in memory 1010
let hexa = String(a, radix: 16) //the Int is converted to a string, 0xA
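Putting this together for the original goal (binary text to hexadecimal text), and assuming a was meant to be a binary String rather than an Int, a minimal sketch:
let binary = "1010"
if let value = Int(binary, radix: 2) {  // parse the binary text into an Int
    let hexa = String(value, radix: 16) // render that Int as hexadecimal text
    print(hexa)                         // "a"
}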

How do I convert a 50 digit string into the appropriate integer type in Swift?

I need to convert this 50 digit string 53503534226472524250874054075591789781264330331690 into the appropriate number type. I tried this:
let str = "53503534226472524250874054075591789781264330331690"
let num = str.toInt(); // Returns nil
let num = Int64(str.toInt()); // Errors out
The maximum value of an Int64 is 9,223,372,036,854,775,807 when it is signed, so you cannot convert the string just like that.
You need something like the BigInt type found in other languages. Check this other question, where alternatives for BigInt in Swift are discussed:
BigInteger equivalent in Swift?
In summary, there are third-party libraries out there for arbitrary long integers. The only alternative from Apple is NSDecimalNumber but its limit is 38 digits, whereas your number has 50.
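For illustration, here is a sketch using the third-party attaswift/BigInt package; the BigInt type and its string initializer come from that library rather than from Apple's SDK, so treat the exact API as an assumption:
import BigInt // e.g. https://github.com/attaswift/BigInt via Swift Package Manager
let str = "53503534226472524250874054075591789781264330331690"
print(Int64(str) as Any) // nil, far too large for 64 bits
if let big = BigInt(str, radix: 10) {
    print(big) // the full 50-digit value
}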