How to convert from array of UInt8 to Data? [duplicate]

This question already has answers here:
NSData from UInt8
(3 answers)
round trip Swift number types to/from Data
(3 answers)
Closed 2 years ago.
There are many questions asking about converting Data to an [UInt8]. I would like to do the opposite: convert from [UInt8] to Data.
I looked at the initializers, and these seem to be the most promising:
init(elements: Sequence)
init(bytes: UnsafeRawPointer, count: Int)
init(repeating: UInt8, count: Int)
The problem with #1 is that it takes any sequence, not just [UInt8]. Therefore, it doesn't give me much confidence that it will encode my data exactly as I want.
The problem with #2 is that I'm not sure how to convert from [UInt8] to UnsafeRawPointer (a sketch of this is shown right after this list). Also, the unsafe part makes me think this is not the correct approach.
The problem with #3 is that it only allows me to repeat the exact same byte multiple times. My data contains different bytes.
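For what it's worth, #2 can be fed an [UInt8] without manual pointer management, although the accepted answer below shows a simpler route. A rough sketch of that approach:
import Foundation

let bytes: [UInt8] = [72, 105]
let data = bytes.withUnsafeBytes { rawBuffer in
    // rawBuffer is an UnsafeRawBufferPointer over the array's contiguous storage
    Data(bytes: rawBuffer.baseAddress!, count: rawBuffer.count)
}
String(data: data, encoding: .ascii) // "Hi"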
How do I convert from [UInt8] to Data?

Use init(elements: Sequence) and treat it as though it were init(elements: [UInt8]).
You'll find this declaration in the Data struct, which explains why you can do so:
#inlinable public init<S>(_ elements: S) where S : Sequence, S.Element == UInt8
Original:
Even though the declared parameter type is a generic Sequence, the effective element type is UInt8. If you attempt to pass elements that aren't UInt8, the compiler will complain.
To verify that this works as expected, I used the repeating initializer and checked that I get the same results:
let data = Data(repeating: 65, count: 2)
let data2 = Data([UInt8(65), UInt8(65)])
let data3 = Data([65, 65])
String(data: data, encoding: String.Encoding.ascii) // Output: "AA"
String(data: data2, encoding: String.Encoding.ascii) // Output: "AA"
String(data: data3, encoding: String.Encoding.ascii) // Output: "AA"
//let data4 = Data([16705]) // Does not compile: Integer literal '16705' overflows when stored into 'UInt8'
//let i: Int = { 16705 }()
//let data5 = Data([i]) // Does not compile: Cannot convert value of type 'Int' to expected element type 'UInt8'
This gives me confidence that it will encode the data correctly.
I'm not sure why the method shows elements as type Sequence instead of [UInt8], especially since the latter would make it much more obvious how it behaves (and would prevent the need for a question like this in the first place).
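For what it's worth, here is a quick sketch (assuming a playground with Foundation imported) showing that any sequence whose elements are UInt8 is accepted, not just [UInt8]:
let letters: [UInt8] = [65, 66, 67]
let fromArray = Data(letters)                 // [UInt8]
let fromSlice = Data(letters[1...])           // ArraySlice<UInt8>
let fromRange = Data(UInt8(65)...UInt8(67))   // ClosedRange<UInt8>
String(data: fromArray, encoding: .ascii)     // "ABC"
String(data: fromSlice, encoding: .ascii)     // "BC"
String(data: fromRange, encoding: .ascii)     // "ABC"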

Related

Swift: Cannot convert value of type 'Range<Int>' to specified type 'Int'

I was trying to implement a small loop that prints the square of each number in a range, which should be the equivalent of this Python script:
for i in range(n):
    print(i*i)
In Swift I tried:
First attempt:
let numbers = [1..<10]
for i in numbers {
    print(i*i)
}
and a second attempt:
let numbers = [1..<10]
for i in numbers {
    var j: Int = i
    print(j*j)
}
but then the compiler says
Cannot convert value of type 'Range<Int>' to specified type 'Int'
I understand from my Python experience that this is due to different types in Swift. Thus my questions are:
How can I fix this? (i.e. implement the same thing I did in Python)
What are the problems with my first and second attempts?
Why are there so many types of <Int> in Swift?
Thanks in advance!
Your code doesn't compile because you have used [] around the range, which creates an array. [1..<10] is an array of ranges. The for loop is then iterating over that array, which has only one element - the range 1..<10.
This is why i is of type Range<Int>. It is the range, not the numbers in the range.
Just remove the [] and both of your attempts will work. You can iterate over ranges directly (in fact, over anything that conforms to the Sequence protocol), not just arrays. You can even write the range inline with the loop:
for i in 0..<10 {
    print(i * i)
}
Why are there so many types of <Int> in Swift?
You are looking at this the wrong way. The words Range and ClosedRange in the types Range<Int> and ClosedRange<Int> are not words that modify Int, as if they were different "flavours" of Int. It's the opposite: Range<Bound> and ClosedRange<Bound> are generic types, and Range<Int> can be considered the specific kind of Range that has Int as its bounds. You can also have Range<Float> or Range<UInt8>, for example.
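To make that concrete, a small illustration (values chosen arbitrarily):
let intRange: Range<Int> = 1..<10        // upper bound excluded
let closed: ClosedRange<Int> = 1...9     // upper bound included
let floatRange: Range<Float> = 0.0..<1.0 // same generic type, different Bound
print(Array(intRange) == Array(closed))  // true: both contain 1 through 9
print(floatRange.contains(0.5))          // true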

String encoding returns wrong values. 33.48 becomes 33.47999999999999488 [duplicate]

This question already has an answer here:
Swift JSONEncoder number rounding
(1 answer)
Closed 1 year ago.
I'm trying to create a hash of a given object after converting it to a string in Swift, but the encoded values returned in the string are different.
print(myObjectValues.v) // 33.48
let mydata = try JSONEncoder().encode(myObjectValues)
let string = String(data: mydata, encoding: .utf8)!
print(string) // 33.47999999999999488
Here myObjectValues contains a Decimal value of 33.48. If I encode mydata to a string, the value returned is 33.47999999999999488. I've tried rounding the decimal value to 2 places, but the final string keeps showing this number. I've also tried saving it to a String and converting back to Decimal, but the value returned in the encoded string is still this 33.479999999-ish number.
I can't use the string to calculate and compare the hash, as the hash value returned from the server is for 33.48, which will never be equal to what I get on my end with this long value.
Decimal values created from an underlying Double will always have these issues.
Decimal values created from an underlying String won't.
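A quick way to see the difference (a minimal check, assuming a playground):
import Foundation

let fromDouble = Decimal(33.48)            // goes through Double
let fromString = Decimal(string: "33.48")! // parsed exactly
print(fromDouble) // something like 33.47999999999999488, not 33.48
print(fromString) // 33.48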
What you can try to do is:
Have a private String value as backing storage that exists just for safely encoding and decoding this decimal value.
Expose a computed Decimal property that uses this underlying String value.
import Foundation

class Test: Codable {
    // Not exposed: only used for encoding & decoding
    private var decimalString: String = "33.48"

    // Work with this in your app
    var decimal: Decimal {
        get {
            Decimal(string: decimalString) ?? .zero
        }
        set {
            // Only update the backing string; assigning to `decimal` here
            // would call this setter recursively.
            decimalString = "\(newValue)"
        }
    }
}

do {
    let encoded = try JSONEncoder().encode(Test())
    print(String(data: encoded, encoding: .utf8))
    // Optional("{\"decimalString\":\"33.48\"}")

    let decoded = try JSONDecoder().decode(Test.self, from: encoded)
    print(decoded.decimal)          // 33.48
    print(decoded.decimal.nextUp)   // 33.49
    print(decoded.decimal.nextDown) // 33.47
} catch {
    print(error)
}
I'm trying to create a hash of a given object after converting it to a string in Swift, but the encoded values returned in the string are different.
Don't do this. Just don't. It's not sensible.
I'll explain it by analogy: Imagine if you represented numbers with six decimal digits of precision. You have to use some amount of precision, right?
Now, 1/3 would be represented as 0.333333. But 2/3 would be represented by 0.666667. So now, if you multiply 1/3 by two, you will not get 2/3. And if you add 1/3 to 1/3 to 1/3, you will not get 1.
So the hash of 1 will be different depending on how you got that 1! If you add 1/3 to 1/3 to 1/3, you will get a different hash than if you added 2/3 to 1/3.
This is simply not the right data type to hash. Don't use doubles for this purpose. Rounding will work until it doesn't.
And you are using 33 + 48/100 -- a value that cannot be represented exactly in base two just as 1/3 cannot be represented exactly in base ten.
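A minimal illustration of the same effect in binary floating point (assuming a playground):
let a: Double = 0.1 + 0.2
let b: Double = 0.3
print(a)      // 0.30000000000000004
print(b)      // 0.3
print(a == b) // false, so hashes of a and b will (almost certainly) differ too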

Confusion converting a decimal to hex in Swift 5

Many versions of this question are posted. My question is slightly different, as I'm getting conflicting results.
If I run the following in a playground, it works fine:
let myNumber = 12345
if let myHex = Double(String(myNumber, radix: 16)) {
print(myHex)
} else {
print("Bad input as hexadecimal: \(myNumber)")
}
This returns 3039.
However, if I change myNumber to 1234, I get the Bad Input message. Can anyone see what I'm doing wrong, or point me to a similar question? (I have looked)
You are taking a number, 1234, and converting it to a hex string ("4d2"). You're then asking Double to try to interpret that alphanumeric hex string, which it obviously cannot do.
If you want the hex string representation, it is simply:
let myNumber = 1234
let myHex = String(myNumber, radix: 16)
print(myHex)
Your value of 12345 resulted in a hex string that did not happen to contain any a-f characters (it was 3039), so the Double conversion did not fail. (But it didn't return the right value either; Double parsed "3039" as the decimal number 3039, not as a hex value.)
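If the goal was instead to turn a hex string back into a number, the counterpart is the failable Int(_:radix:) initializer; a small sketch, assuming that intent:
let myNumber = 1234
let myHex = String(myNumber, radix: 16)   // "4d2"
if let roundTripped = Int(myHex, radix: 16) {
    print(roundTripped)                   // 1234
} else {
    print("Bad input as hexadecimal: \(myHex)")
}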

How to convert byte position in Swift

How would I convert this properly? The value I need is in bytes 0-1, has a format of uint16, and its units are degrees.
print("derived : \(characteristic.value!)")
print(String(bytes: characteristic.value!, encoding: .utf16))
derived : 20 bytes
Optional("\0{Ͽ⌜ƀ")
You just need to get the first two bytes of the Data as UInt16.
var result: UInt16 = 0
_ = withUnsafeMutableBytes(of: &result) {characteristic.value!.copyBytes(to: $0, from: 0...1)}
print("\(result) degrees")
This assumes your uint16 is little-endian, the same byte order as iOS, which is the layout most often found in BLE characteristics.
(When the value you want is at the very start of the Data, from: 0...1 is not strictly needed, but it is useful when the data is at another position.)
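An alternative sketch without pointer APIs, assuming characteristic.value is non-nil, holds at least two bytes, and the first two bytes are the little-endian uint16:
let value = characteristic.value!
let low = UInt16(value[value.startIndex])       // byte 0
let high = UInt16(value[value.startIndex + 1])  // byte 1
let degrees = low | (high << 8)                 // assemble the little-endian UInt16
print("\(degrees) degrees")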

Converting UnsafeBufferPointer to Data in Swift 3

I'm trying to initialize Data from an UnsafeBufferPointer, but it's throwing an EXC_BAD_ACCESS when it hits the third line. Help appreciated.
let pointer = UnsafePointer<UInt8>(UnsafePointer<UInt8>(bitPattern: 15)!)
let buffer = UnsafeBufferPointer<UInt8>(start: pointer, count: 1)
let data = Data(bytes: Array(buffer)) // EXC_BAD_ACCESS
My end goal is to convert bits of data to some human-readable format (e.g., convert a bit pattern of 15 to "F"). I was hoping to initialize a String from the data object as a hex value. I'm open to better and correct ways of going about this.
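The EXC_BAD_ACCESS happens because UnsafePointer<UInt8>(bitPattern: 15) treats 15 as a memory address and then reads from it. If the goal is just a hex string for known byte values, a minimal sketch (assuming that intent) is:
import Foundation

let bytes: [UInt8] = [15]                     // 15 as a byte value, not an address
let data = Data(bytes)
let hex = data.map { String(format: "%02X", $0) }.joined()
print(hex)                                    // prints 0F
print(String(15, radix: 16, uppercase: true)) // prints F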