How to convert byte position in swift - swift

How would I convert this properly? The value I need is in bytes 0-1, has the format uint16, and its units are degrees.
print("derived : \(characteristic.value!)")
print(String(bytes: characteristic.value!, encoding: .utf16))
derived : 20 bytes
Optional("\0{Ͽ⌜ƀ")

You just need to get the first two bytes of the Data as UInt16.
var result: UInt16 = 0
_ = withUnsafeMutableBytes(of: &result) {characteristic.value!.copyBytes(to: $0, from: 0...1)}
print("\(result) degrees")
This assumes your uint16 value is little-endian, like iOS; little-endian is the byte order most often found in BLE characteristics.
(When you read the value from the start of the Data, from: 0...1 is not needed, but you may want data at some other offset.)
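An endianness-independent alternative is to assemble the UInt16 from the two bytes arithmetically, which avoids the unsafe pointer entirely. A minimal sketch, using a made-up 20-byte value whose first two bytes encode 300 in little-endian:

```swift
import Foundation

// Hypothetical characteristic value: bytes 0-1 hold 300 (0x012C) little-endian.
let value = Data([0x2C, 0x01] + [UInt8](repeating: 0, count: 18))

// Assemble the UInt16 from the low and high bytes explicitly.
// This reads little-endian data correctly on any host platform.
let degrees = UInt16(value[0]) | (UInt16(value[1]) << 8)
print("\(degrees) degrees") // 300 degrees
```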

Related

String encoding returns wrong values. 33.48 becomes 33.47999999999999488 [duplicate]

This question already has an answer here: Swift JSONEncoder number rounding
I'm trying to create a hash of a given object after converting it to a string in Swift, but the encoded values returned in the string are different.
print(myObjectValues.v) // 33.48
let mydata = try JSONEncoder().encode(myObjectValues)
let string = String(data: mydata, encoding: .utf8)!
print(string) // 33.47999999999999488
Here myObjectValues contains a Decimal value of 33.48. If I try to encode mydata to a string, the value returned is 33.47999999999999488. I've tried rounding the decimal value to 2 places, but the final string keeps producing this number. I've tried saving it to a String and converting back to Decimal, but the value returned in the encoded string is still this 33.479999999ish number.
I can't use the string to calculate and compare the hash, because the hash returned from the server is for 33.48, which will never equal what I get on my end with this long value.
Decimal values created with underlying Double values will always create these issues.
Decimal values created with underlying String values won't create these issues.
What you can try to do is:
Have a private String value as a backing storage that's there just for safely encoding and decoding this decimal value.
Expose another computed Decimal value that uses this underlying String value.
import Foundation

class Test: Codable {
    // Not exposed: only for encoding & decoding
    private var decimalString: String = "33.48"

    // Work with this in your app
    var decimal: Decimal {
        get {
            Decimal(string: decimalString) ?? .zero
        }
        set {
            // Only update the backing string; assigning to `decimal`
            // here would recurse into this setter forever.
            decimalString = "\(newValue)"
        }
    }
}

do {
    let encoded = try JSONEncoder().encode(Test())
    print(String(data: encoded, encoding: .utf8))
    // Optional("{\"decimalString\":\"33.48\"}")
    let decoded = try JSONDecoder().decode(Test.self, from: encoded)
    print(decoded.decimal) // 33.48
    print(decoded.decimal.nextUp) // 33.49
    print(decoded.decimal.nextDown) // 33.47
} catch {
    print(error)
}
I'm trying to create a hash of a given object after converting it to a string in Swift, but the encoded values returned in the string are different.
Don't do this. Just don't. It's not sensible.
I'll explain it by analogy: Imagine if you represented numbers with six decimal digits of precision. You have to use some amount of precision, right?
Now, 1/3 would be represented as 0.333333. But 2/3 would be represented by 0.666667. So now, if you multiply 1/3 by two, you will not get 2/3. And if you add 1/3 to 1/3 to 1/3, you will not get 1.
So the hash of 1 will be different depending on how you got that 1! If you add 1/3 to 1/3 to 1/3, you will get a different hash than if you added 2/3 to 1/3.
This is simply not the right data type to hash. Don't use doubles for this purpose. Rounding will work until it doesn't.
And you are using 33 + 48/100, a value that cannot be represented exactly in base two, just as 1/3 cannot be represented exactly in base ten.
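The difference is easy to see by comparing a Decimal built from a String with one built from a Double. A small sketch (33.48 is the value from the question):

```swift
import Foundation

let d = 33.48 // the nearest representable Double, not exactly 33.48

// A Decimal built from a String stores the value exactly in base ten.
let exact = Decimal(string: "33.48")!

// A Decimal built from a Double inherits the binary rounding error.
let inexact = Decimal(d)

print(exact)            // 33.48
print(exact == inexact) // false
```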

How to convert from array of UInt8 to Data? [duplicate]

This question already has answers here: NSData from UInt8 and round trip Swift number types to/from Data
There are many questions asking about converting Data to an [UInt8]. I would like to do the opposite: convert from [UInt8] to Data.
I looked at the initializers, and these seem to be the most promising:
init(elements: Sequence)
init(bytes: UnsafeRawPointer, count: Int)
init(repeating: UInt8, count: Int)
The problem with #1, is that it takes in any sequence, not just [UInt8]. Therefore, it doesn't give me much confidence that it'll encode my data exactly as I want.
The problem with #2, is that I'm not sure how to convert from [UInt8] to UnsafeRawPointer. Also the unsafe part makes me think this is not the correct approach.
The problem with #3, is that it only allows me to repeat the same exact byte, multiple times. My data contains different bytes.
How do I convert from [UInt8] to Data?
Use init(elements: Sequence), and treat it as though it was init(elements: [UInt8]).
You'll see this in the Data struct which explains why you can do so:
#inlinable public init<S>(_ elements: S) where S : Sequence, S.Element == UInt8
Original:
Even though the parameter is declared as a generic Sequence, the constraint S.Element == UInt8 means it behaves exactly like [UInt8]. If you attempt to pass a sequence whose elements aren't UInt8, the compiler will complain.
To verify that this all works as expected, I tried using the repeating overload and verifying that I get the same results:
let data = Data(repeating: 65, count: 2)
let data2 = Data([UInt8(65), UInt8(65)])
let data3 = Data([65, 65])
String(data: data, encoding: String.Encoding.ascii) // Output: "AA"
String(data: data2, encoding: String.Encoding.ascii) // Output: "AA"
String(data: data3, encoding: String.Encoding.ascii) // Output: "AA"
//let data4 = Data([16705]) // Does not compile: Integer literal '16705' overflows when stored into 'UInt8'
//let i: Int = { 16705 }()
//let data5 = Data([i]) // Does not compile: Cannot convert value of type 'Int' to expected element type 'UInt8'
This gives me confidence that it will encode the data correctly.
I'm not sure why the method shows elements as type Sequence instead of [UInt8] especially since doing the latter would make it much more obvious that this is how it behaves (and prevent the need for a question like this in the first place).
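Another quick sanity check is a full round trip from [UInt8] to Data and back; this sketch assumes only Foundation:

```swift
import Foundation

let bytes: [UInt8] = [72, 105, 33] // "Hi!" in ASCII
let data = Data(bytes)             // init<S: Sequence>(_:) where S.Element == UInt8
let back = [UInt8](data)           // Data is itself a Sequence of UInt8

print(back == bytes)                         // true
print(String(data: data, encoding: .ascii)!) // Hi!
```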

Optional UInt8? gives 2 byte memory size

A UInt8 has a memory size of 1 byte, but when I make it an optional value, it reports a size of 2 bytes.
var serail : UInt8? = 255
print(MemoryLayout.size(ofValue: serail)) // it gives 2 byte size.
var serail : UInt8 = 255
print(MemoryLayout.size(ofValue: serail)) // it gives 1 byte size.
How do I get exactly 1 byte of memory for an integer value?
Under the hood, an optional is an enum and looks something like this:
enum Optional<Wrapped> {
    case some(Wrapped) // not nil
    case none          // nil
}
The symbols ? and ! are just shorthands for referring to this Optional enum. The extra byte holds the case tag (which case the value is), which is why UInt8? comes out 2 bytes large.
However, if you unwrap the optional, the returned value is the wrapped value itself, so it becomes 1 byte.
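The enum explanation can be checked directly: a hand-rolled optional-like enum with a UInt8 payload occupies the same 2 bytes, because UInt8 uses all 256 of its bit patterns and the case tag therefore needs its own byte. A small sketch:

```swift
// Minimal stand-in for Optional<UInt8>, for layout comparison only.
enum MaybeByte {
    case some(UInt8)
    case none
}

print(MemoryLayout<UInt8>.size)     // 1
print(MemoryLayout<UInt8?>.size)    // 2
print(MemoryLayout<MaybeByte>.size) // 2
```

(Types with spare bit patterns, such as pointers, get their Optional for free; UInt8 has none to spare.)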

How to convert UInt8 to bytes Swift 4

How do I convert 4 to 0x04 in Swift 4?
var value: UInt8 = 4 // want to input 4
let bytes = NSData(bytes: &value, length: sizeof(UInt8())) // bytes must = 0x04
If "bytes must = 0x04", it sounds like you want a Data with a single byte whose value is 4, right? The code you have written won't compile in Swift 4: sizeof was removed from the language, so you would need MemoryLayout<UInt8>.size for the length.
If you just want a single byte in your data, you can say Data([value]).
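Putting that together in Swift 4 syntax (Data instead of NSData):

```swift
import Foundation

let value: UInt8 = 4
let bytes = Data([value]) // a Data containing the single byte 0x04

print(bytes.count)      // 1
print(bytes[0] == 0x04) // true
```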

Difference between Int and Uint8 swift

What are the differences between the data types Int and UInt8 in Swift?
It looks like UInt8 is used for binary data. I need to convert UInt8 to Int; is this possible?
The U in UInt stands for unsigned integer.
It is not used just for binary data: UInt holds only non-negative whole numbers, like the natural numbers.
I recommend getting to know how negative numbers are represented by a computer (two's complement).
Int8 is an Integer type which can store positive and negative values.
UInt8 is an unsigned integer which can store only positive values.
You can convert UInt8 to Int8 only when the value fits in -128...127, and Int8 to UInt8 only when the value is non-negative.
UInt8 is always exactly 8 bits wide, while the width of Int is defined by the platform:
https://developer.apple.com/library/content/documentation/Swift/Conceptual/Swift_Programming_Language/TheBasics.html
Int can be 32 or 64 bits.
Updated for Swift:

Type    Range                                          Bytes per Element
UInt8   0 to 255                                       1
Int     -9223372036854775808 to 9223372036854775807    4 or 8
If you want to find the max and min range of Int or UInt8:
let maxIntValue = Int.max
let maxUInt8Value = UInt8.max
let minIntValue = Int.min
let minUInt8Value = UInt8.min
If you want to convert a UInt to Int, use the simple function below:
func convertToInt(unsigned: UInt) -> Int {
    let signed = (unsigned <= UInt(Int.max)) ?
        Int(unsigned) :
        Int(unsigned - UInt(Int.max) - 1) + Int.min
    return signed
}
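For the UInt8 case in the question, no helper is needed at all: every UInt8 value fits in Int, so Int(_:) can never trap. The bit-pattern trick only matters for full-width UInt values above Int.max. A short sketch:

```swift
// Every UInt8 value (0...255) fits in Int, so this conversion cannot trap.
let small: UInt8 = 200
print(Int(small)) // 200

// For full-width UInt values above Int.max, Int(_:) would trap;
// Int(bitPattern:) reinterprets the bits instead.
let big: UInt = UInt(Int.max) + 1
print(Int(bitPattern: big) == Int.min) // true
```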