I'm reading accelerometer data from an iBeacon that arrives in the following string format:
x hex string value: "0160"
y hex string value: "ff14"
z hex string value: "0114"
I'm expecting to see these values as Double values ranging from 0g to 1g. How would you convert these hex strings into Doubles in Swift?
Get the integer value from the hex string with Int(_:radix:):
let string = "ff14"
let hexValue = Int(string, radix: 16)!
and divide by 65535 (the maximum 16-bit value) to get values between 0.0 and 1.0:
let result = Double(hexValue) / 65535
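Putting both steps together, here is a minimal sketch for all three axes (the axisValue helper name is mine, and it assumes the raw 16-bit values really do map linearly onto 0.0...1.0):
func axisValue(fromHex hex: String) -> Double? {
    guard let raw = Int(hex, radix: 16) else { return nil }
    return Double(raw) / 65535
}

print(axisValue(fromHex: "0160") ?? "invalid") // ~0.0054
print(axisValue(fromHex: "ff14") ?? "invalid") // ~0.9964
print(axisValue(fromHex: "0114") ?? "invalid") // ~0.0042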
Working with Swift, I am converting a value from String to Double.
let value: String = "8"
var size: Double
size = Double(value)!  // Double(value) returns an optional, so unwrap it
print(size)            // prints 8.0
The result should be 8, not 8.0, unless the string value is "8.0".
Double only stores a numeric magnitude. "8" and "8.0" have the same magnitude, so they get represented by the same Double value. Whether you show trailing 0s or not is a consequence of how you choose to format and present your values.
print does it one way for debugging, but NumberFormatter gives you more control to format numbers for real, non-debugging purposes.
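For example, a small sketch of the difference (the formatter settings here are illustrative, not prescriptive):
import Foundation

let size = Double("8")!  // the stored magnitude is just 8

print(size)  // "8.0" (print's default debug formatting)

let formatter = NumberFormatter()
formatter.minimumFractionDigits = 0
formatter.maximumFractionDigits = 2
print(formatter.string(from: NSNumber(value: size)) ?? "")  // "8"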
I'm trying to create a hash of a given object after converting it to a string in Swift, but the values encoded in the string come out different.
print(myObjectValues.v) // 33.48
let mydata = try JSONEncoder().encode(myObjectValues)
let string = String(data: mydata, encoding: .utf8)!
print(string) // 33.47999999999999488
Here myObjectValues contains a Decimal value of 33.48. If I encode mydata to a string, the value returned is 33.47999999999999488. I've tried rounding the decimal value to 2 places, but the final string keeps producing this number. I've also tried saving it to a String and converting back to Decimal, but the value in the encoded string is still 33.479999999-ish.
I can't use the string to calculate and compare the hash, as the hash value returned from the server is for 33.48, which will never equal what I get on my end with this long value.
Decimal values created with underlying Double values will always create these issues.
Decimal values created with underlying String values won't create these issues.
What you can try instead:
Have a private String value as backing storage that exists just for safely encoding and decoding this decimal value.
Expose another computed Decimal value that uses this underlying String value.
import Foundation

class Test: Codable {
    // Not exposed: only for encoding & decoding
    private var decimalString: String = "33.48"

    // Work with this in your app
    var decimal: Decimal {
        get {
            Decimal(string: decimalString) ?? .zero
        }
        set {
            // Only update the backing string here; assigning to
            // `decimal` itself would recurse into this setter.
            decimalString = "\(newValue)"
        }
    }
}
do {
    let encoded = try JSONEncoder().encode(Test())
    print(String(data: encoded, encoding: .utf8))
    // Optional("{\"decimalString\":\"33.48\"}")

    let decoded = try JSONDecoder().decode(Test.self, from: encoded)
    print(decoded.decimal)          // 33.48
    print(decoded.decimal.nextUp)   // 33.49
    print(decoded.decimal.nextDown) // 33.47
} catch {
    print(error)
}
I'm trying to create a hash of a given object after converting it to a string in Swift, but the values encoded in the string come out different.
Don't do this. Just don't. It's not sensible.
I'll explain it by analogy: Imagine if you represented numbers with six decimal digits of precision. You have to use some amount of precision, right?
Now, 1/3 would be represented as 0.333333. But 2/3 would be represented by 0.666667. So now, if you multiply 1/3 by two, you will not get 2/3. And if you add 1/3 to 1/3 to 1/3, you will not get 1.
So the hash of 1 will be different depending on how you got that 1! If you add 1/3 to 1/3 to 1/3, you will get a different hash than if you added 2/3 to 1/3.
This is simply not the right data type to hash. Don't use doubles for this purpose. Rounding will work until it doesn't.
And you are using 33 + 48/100, a value that cannot be represented exactly in base two, just as 1/3 cannot be represented exactly in base ten.
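A quick Swift illustration of the point (a sketch; the exact digits printed can vary):
import Foundation

let asDouble = 33.48
print(asDouble)  // 33.48 (print rounds the underlying binary value for display)

let fromDouble = Decimal(33.48)             // built from a binary Double
let fromString = Decimal(string: "33.48")!  // built from the decimal digits
print(fromDouble)  // 33.47999999999999488 (or similar)
print(fromString)  // 33.48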
I'm trying to assign -5 to signedInt but getting an error:
Integer literal '133' overflows when stored into 'Int8'
let signedInt: Int8 = 0b10000101
print(signedInt)
print(String(signedInt, radix: 2))
Your value is not -5, but -123.
You can't get there with a direct assignment because an integer literal is interpreted as the positive value 133, which overflows Int8 (whose maximum is 127).
To assign a negative value, use Int8(bitPattern:) to reinterpret the bits of a UInt8 as an Int8:
let signedInt = Int8(bitPattern: 0b10000101)
print(signedInt)
-123
-5 is 0b11111011 which is the 2's complement of 0b00000101.
To form the 2's complement, start with the binary pattern for 5:
0b00000101
invert all of the bits:
0b11111010
and add 1:
0b11111011
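The same steps in Swift (a sketch using bitwise NOT and wrapping addition):
let five: UInt8 = 0b00000101
let inverted = ~five               // 0b11111010
let twosComplement = inverted &+ 1 // 0b11111011 (wrapping add)

print(String(twosComplement, radix: 2))  // 11111011
print(Int8(bitPattern: twosComplement))  // -5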
You can use UInt8(bitPattern:) to find the representation of the number:
let signedInt: Int8 = -5
print(String(UInt8(bitPattern: signedInt), radix: 2))
11111011
What are the differences between the data types Int and UInt8 in Swift?
It looks like UInt8 is used for binary data. I need to convert UInt8 to Int; is this possible?
The U in UInt stands for unsigned integer.
It is not used just for binary data; UInt is used for non-negative numbers only, like the natural numbers.
I recommend getting to know how negative numbers are represented by a computer (two's complement).
Int8 is an integer type which can store positive and negative values.
UInt8 is an unsigned integer type which can store only non-negative values.
You can convert in either direction, but make sure the value fits the destination's range: Int8 can't hold UInt8 values above 127, and UInt8 can't hold negative values.
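If you want the conversion to fail gracefully instead of trapping, here is a sketch using the failable init(exactly:):
let small: UInt8 = 100
let big: UInt8 = 200
let negative: Int8 = -5

print(Int8(exactly: small) as Any)     // Optional(100): fits
print(Int8(exactly: big) as Any)       // nil: 200 exceeds Int8.max (127)
print(UInt8(exactly: negative) as Any) // nil: negative values don't fit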
UInt8 is always 8 bits, while the size of Int is platform-dependent, as documented in The Swift Programming Language:
https://developer.apple.com/library/content/documentation/Swift/Conceptual/Swift_Programming_Language/TheBasics.html
Int is 32 bits on 32-bit platforms and 64 bits on 64-bit platforms.
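You can verify the size on your platform (a small sketch):
print(MemoryLayout<Int>.size)   // 8 on 64-bit platforms, 4 on 32-bit
print(MemoryLayout<UInt8>.size) // always 1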
Updated for Swift:
Type     Range                                          Bytes per element
UInt8    0 to 255                                       1
Int      -9223372036854775808 to 9223372036854775807    4 or 8 (platform-dependent)
If you want to find the max and min values of Int or UInt8:
let maxIntValue = Int.max
let maxUInt8Value = UInt8.max
let minIntValue = Int.min
let minUInt8Value = UInt8.min
If you want to convert an unsigned integer to Int, you can use the simple function below (for UInt8 specifically, Int(value) always succeeds, since every UInt8 value fits in an Int):
func convertToInt(unsigned: UInt) -> Int {
    // Reinterpret the bit pattern: values above Int.max
    // wrap around into the negative range.
    let signed = (unsigned <= UInt(Int.max)) ?
        Int(unsigned) :
        Int(unsigned - UInt(Int.max) - 1) + Int.min
    return signed
}
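Usage, for example (note that the standard library's Int(bitPattern:) performs the same reinterpretation):
print(convertToInt(unsigned: 42))       // 42
print(convertToInt(unsigned: UInt.max)) // -1

print(Int(bitPattern: UInt.max))        // -1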
Sorry if this seems really simple; I just can't find it anywhere online.
I have a UInt8 in hex, and I need to get it to a decimal. How do I achieve this in Swift?
For example:
"ff"
Thanks
If you have a string representation, "ff", you can use UInt8(_:radix:):
let string = "ff"
if let value = UInt8(string, radix: 16) {
    print(value) // 255
}
Try this code; it works for me.
// Hex to decimal
let h2 = "ff"
let d4 = Int(h2, radix: 16)!
print(d4)
Hope this helps someone.
You can use the C function strtoul to convert your hex string to decimal:
let result = UInt8(strtoul("ff", nil, 16)) // 255
If your hex value is not a String but a literal you want to convert at compile time, you can also write:
let integer: UInt8 = 0xff
So prefixing it with 0x will do the job.