Swift: Byte array to decimal value

In my project, I communicate with a Bluetooth device. The device sends me a timestamp in seconds, which I receive as bytes:
[2,6,239]
When I convert it to a string:
let payloadString = payload.map {
    String(format: "%02x", $0)
}
Output:
["02", "06","ef"]
When I convert it on a website, 0206ef = 132847 seconds.
How can I directly convert my array [2,6,239] to seconds (= 132847 seconds)?
And if that is complicated, how can I convert my array ["02", "06", "ef"] to seconds (= 132847 seconds)?

The payload contains the bytes of the binary representation of the value.
You convert it back to the value by shifting each byte into its corresponding position:
let payload: [UInt8] = [2, 6, 239]
let value = Int(payload[0]) << 16 + Int(payload[1]) << 8 + Int(payload[2])
print(value) // 132847
The important point is to convert the bytes to integers before shifting, otherwise an overflow error would occur. Alternatively,
with multiplication:
let value = (Int(payload[0]) * 256 + Int(payload[1])) * 256 + Int(payload[2])
or
let value = payload.reduce(0) { $0 * 256 + Int($1) }
The last approach works with an arbitrary number of bytes, as long as the result fits into an Int. For 4...8 bytes it is better to choose UInt64 to avoid overflow errors:
let value = payload.reduce(0) { $0 * 256 + UInt64($1) }
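The same reduce pattern scales to longer payloads. As a quick illustration (the bytes here are made up, not from the question):
let longPayload: [UInt8] = [0x01, 0x02, 0x06, 0xEF, 0x10]
let bigValue = longPayload.reduce(UInt64(0)) { $0 * 256 + UInt64($1) }
print(bigValue) // 4328976144 (= 0x010206EF10)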

The payloadString array can be joined into a single hex string and then converted to a decimal value:
let payload = [2, 6, 239]
let payloadString = payload.map {
    String(format: "%02x", $0)
}
// let hexStr = payloadString.reduce("") { $0 + $1 }
let hexStr = payloadString.joined()
if let value = UInt64(hexStr, radix: 16) {
    print(value) // 132847
}

Related

CheckSum8 Modulo 256 Swift

I have an array of UInt8 and I want to calculate its CheckSum8 Modulo 256.
If the sum of the bytes is less than 255, the checkSum function returns the correct value, e.g.:
let bytes1: [UInt8] = [1, 0xa1]
let validCheck = checkSum(data: bytes1) // 162 = 0xa2
let bytes: [UInt8] = [6, 0xB1, 27, 0xc5, 0xf5, 0x9d]
let invalidCheck = checkSum(data: bytes) // 41
The function below returns 41, but the expected checksum is 35.
func checkSum(data: [UInt8]) -> UInt8 {
    var sum = 0
    for i in 0..<data.count {
        sum += Int(data[i])
    }
    let retVal = sum & 0xff
    return UInt8(retVal)
}
Your checkSum method is largely right. If you want, you could simplify it to:
func checkSum(_ values: [UInt8]) -> UInt8 {
    let result = values.reduce(0) { ($0 + UInt32($1)) & 0xff }
    return UInt8(result)
}
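For example, applying this simplified version to the first test case from the question gives the same expected result:
let bytes1: [UInt8] = [1, 0xa1]
let validCheck = checkSum(bytes1) // 162 = 0xa2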
You point out a website that reports the checksum8 for 06B127c5f59d is 35.
The problem is that your array has 27, not 0x27. If you have hexadecimal values, you always need the 0x prefix for each value in your array literal (or, technically, at least when the value is larger than 9).
So, consider:
let values: [UInt8] = [0x06, 0xB1, 0x27, 0xc5, 0xf5, 0x9d]
let result = checkSum(values)
That’s 53. If you want to see that in hexadecimal (like that site you referred to):
let hex = String(result, radix: 16)
That shows us that the checksum is 0x35 in hexadecimal.

Splitting a UInt16 into 2 UInt8 bytes and getting the hex string of both. Swift

I need 16383 to be converted to 7F7F, but I can only get it converted to 3fff or 77377.
I can convert 8192 to the hexadecimal string 4000, which is essentially the same thing.
If I use let firstHexa = String(format: "%02X", a), it stops at 3fff hexadecimal for the first number and 2000 hexadecimal for the second number. Here is my code:
public func intToHexString(_ int: Int16) -> String {
    var encodedHexa: String = ""
    if int >= -8192 && int <= 8191 {
        let int16 = int + 8192
        // convert to two unsigned Int8 bytes
        let a = UInt8(int16 >> 8 & 0x00ff)
        let b = UInt8(int16 & 0x00ff)
        // convert the 2 bytes to hexadecimals
        let first1Hexa = String(a, radix: 8)
        let second2Hexa = String(b, radix: 8)
        let firstHexa = String(format: "%02X", a)
        let secondHexa = String(format: "%02X", b)
        // combine the 2 hexas into 1 string with 4 characters... adding 0 to the beginning if only 1 character.
        if firstHexa.count == 1 {
            let appendedFHexa = "0" + firstHexa
            encodedHexa = appendedFHexa + secondHexa
        } else if secondHexa.count == 1 {
            let appendedSHexa = "0" + secondHexa
            encodedHexa = firstHexa + appendedSHexa
        } else {
            encodedHexa = firstHexa + secondHexa
        }
    }
    return encodedHexa
}
Please help ma'ams and sirs! Thanks.
From your test cases, it seems like your values are 7 bits per byte.
You want 8192 to convert to 4000.
You want 16383 to convert to 7F7F.
Note that:
(0x7f << 7) + 0x7f == 16383
Given that:
let a = UInt8((int16 >> 7) & 0x7f)
let b = UInt8(int16 & 0x7f)
let result = String(format: "%02X%02X", a, b)
This gives:
"4000" for 8192
"7F7F" for 16383
To reverse the process:
let str = "7F7F"
let value = Int(str, radix: 16)!
let result = ((value >> 8) & 0x7f) << 7 + (value & 0x7f)
print(result) // 16383
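Putting both directions together, here is a minimal sketch; the names encode7Bit and decode7Bit are just illustrative and not from the original answer:
func encode7Bit(_ value: UInt16) -> String {
    // pack the value into two 7-bit bytes and format them as hex
    let a = UInt8((value >> 7) & 0x7f)
    let b = UInt8(value & 0x7f)
    return String(format: "%02X%02X", a, b)
}
func decode7Bit(_ hex: String) -> Int? {
    // parse the hex string, then recombine the two 7-bit bytes
    guard let value = Int(hex, radix: 16) else { return nil }
    return ((value >> 8) & 0x7f) << 7 + (value & 0x7f)
}
encode7Bit(16383)  // "7F7F"
decode7Bit("7F7F") // 16383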

Swift 4 - efficient conversion of Double to big-endian

I need to convert a Double to big-endian in order to write it to a file, using an oil-industry binary file standard, that was originally defined for IBM half-inch 9 track tapes in the 1970s!
I need really efficient Swift 4 code, because this conversion is inside two nested-loops and will be executed upwards of 100,000 times.
You can create a UInt64 containing the big-endian representation of the Double with:
let value = 1.0
var n = value.bitPattern.bigEndian
In order to write that to a file you might need to convert it
to Data:
let data = Data(buffer: UnsafeBufferPointer(start: &n, count: 1))
print(data as NSData) // <3ff00000 00000000>
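For completeness, here is a minimal sketch of the reverse direction (not part of the original answer): folding the eight big-endian bytes of the Data back into a Double:
let bitPattern = data.reduce(UInt64(0)) { $0 << 8 | UInt64($1) }
let restored = Double(bitPattern: bitPattern)
print(restored) // 1.0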
If many contiguous floating-point values are written to the file, then it would be more efficient to create a [UInt64] array with the big-endian representations and convert that to Data, for example:
let values = [1.0, 2.0, 3.0, 4.0]
let array = values.map { $0.bitPattern.bigEndian }
let data = array.withUnsafeBufferPointer { Data(buffer: $0) }
(All the above compiles with Swift 3 and 4.)
I successfully implemented Martin's array suggestion. I decided I should use some "interesting" test values and one thing led to another! Here's my test playground. I hope it is of interest:
//: Playground - noun: a place where people can play
import UIKit
func convert(doubleArray: [Double]) {
    let littleEndianArray = doubleArray.map { $0.bitPattern }
    var data = littleEndianArray.withUnsafeBufferPointer { Data(buffer: $0) }
    print("Little-endian : ", data as NSData)
    // Convert and display the big-endian bytes
    let bigEndianArray = doubleArray.map { $0.bitPattern.bigEndian }
    data = bigEndianArray.withUnsafeBufferPointer { Data(buffer: $0) }
    print("Big-endian : ", data as NSData)
}
// Values below are from:
// https://en.wikipedia.org/wiki/Double-precision_floating-point_format
let nan = Double.nan
let plusInfinity = +1.0 / 0.0
let maxDouble = +1.7976931348623157E308
let smallestNumberGreaterThanOne = +1.0000000000000002
let plusOne = +1.0
let maxSubnormalPositiveDouble = +2.2250738585072009E-308
let minSubnormalPositiveDouble = +4.9E-324
let plusZero = +0.0
let minusZero = -0.0
let maxSubnormalNegativeDouble = -4.9E-324
let minSubnormalNegativeDouble = -2.2250738585072009E-308
let minusOne = -1.0
let largestNumberLessThanOne = -1.0000000000000002
let minDouble = -1.7976931348623157E308
let minusInfinity = -1.0 / 0.0
let smallestNumber = "+1.0000000000000002"
let largestNumber = "-1.0000000000000002"
print("\n\nPrint little-endian and big-endian Doubles")
print("\n\nDisplay: NaN and +0.0 to +1.0")
print(" Min. Subnormal Max. Subnormal")
print(" Not a Number Plus Zero Positive Double Positive Double Plus One")
print(String(format: "Decimal : NaN %+8.6e %+8.6e %+8.6e %+8.6e", plusZero, minSubnormalPositiveDouble, maxSubnormalPositiveDouble, plusOne))
var doubleArray = [nan, plusZero, minSubnormalPositiveDouble, maxSubnormalPositiveDouble, plusOne]
convert(doubleArray: doubleArray)
print("\n\nDisplay: +1.0 to +Infinity")
print(" Smallest Number ")
print(" Plus One Greater Than 1.0 Max. Double +Infinity")
print(String(format: "Decimal : %+8.6e \(smallestNumber) %+8.6e%+8.6e", plusOne, maxDouble, plusInfinity))
doubleArray = [plusOne, smallestNumberGreaterThanOne, maxDouble, plusInfinity]
convert(doubleArray: doubleArray)
print("\n\nDisplay: NaN and -0.0 to -1.0")
print(" Min. Subnormal Max. Subnormal")
print(" Not a Number Minus Zero Negative Double Negative Double Minus One")
print(String(format: "Decimal : NaN %+8.6e %+8.6e %+8.6e %+8.6e", minusZero, maxSubnormalNegativeDouble, minSubnormalNegativeDouble, minusOne))
doubleArray = [nan, minusZero, maxSubnormalNegativeDouble, minSubnormalNegativeDouble, minusOne]
convert(doubleArray: doubleArray)
print("\n\nDisplay: -1.0 to -Infinity")
print(" Smallest Number ")
print(" Minus One Less Than -1.0 Min. Double -Infinity")
print(String(format: "Decimal : %+8.6e \(largestNumber) %+8.6e%+8.6e", minusOne, minDouble, minusInfinity))
doubleArray = [minusOne, largestNumberLessThanOne, minDouble, minusInfinity]
convert(doubleArray: doubleArray)

How to convert a byte to a String in Swift?

After searching for a while, I can't figure out how to get this simple result:
let byte : UInt8 = 0xF3 //Should become "F3"
I have tried this method, but it won't compile when passing in either a byte or a byte array.
Two ways:
let s1 = String(byte, radix: 16, uppercase: true) // does not do 0-padding but works with
// all radices between 2 and 36
let s2 = String(format: "%02X", byte) // very similar to C
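The difference in zero padding shows up with single-digit bytes, for example with a hypothetical byte 0x0A:
let small: UInt8 = 0x0A
let a1 = String(small, radix: 16, uppercase: true) // "A"  (no padding)
let a2 = String(format: "%02X", small)             // "0A" (padded to two digits)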

Swift 3 : Negative Int to hexadecimal

Hi everyone,
I need to transform an Int to its hexadecimal value.
Example: -40 => D8
I have a working method for positive (or unsigned) Int values, but it doesn't work as expected with negatives. Here's my code:
class func encodeHex(data: [Int]) -> String {
    let hexadecimal = data.reduce("") { (string, element) in
        var append = String(element, radix: 16, uppercase: false)
        if append.characters.count == 1 {
            append = "0" + append
        }
        return string + append
    }
    return hexadecimal
}
If I pass -40 I get -28.
Can anyone help ? Thanks :)
I assume from your existing code that all integers are in the range
-128 ... 127. Then this would work:
func encodeHex(data: [Int]) -> String {
    return data.map { String(format: "%02hhX", $0) }.joined()
}
The "%02hhX" format prints the least significant byte of the
given integer in base 16 with 2 digits.
Example:
print(encodeHex(data: [40, -40, 127, -128]))
// 28D87F80
D8 is the last byte of the binary representation of -40. The remaining bytes are all FF.
If you are looking for a string that represents only the last byte, you can obtain it by first converting your number to an unsigned 8-bit integer and then converting that to hex, like this:
let data = -40
let x = UInt8(bitPattern: Int8(data))
let res = String(format: "%02X", x) // "D8"