Splitting a UInt16 into 2 UInt8 bytes and getting the hex string of both - Swift

I need 16383 to be converted to 7F7F, but I can only get it converted to 3fff or 77377.
I can convert 8192 to the hexadecimal string 4000, which is essentially the same thing.
If I use let firstHexa = String(format:"%02X", a), it stops at hexadecimal 3fff for the first number and 2000 for the second number. Here is my code:
public func intToHexString(_ int: Int16) -> String {
    var encodedHexa: String = ""
    if int >= -8192 && int <= 8191 {
        let int16 = int + 8192
        // convert to two unsigned Int8 bytes
        let a = UInt8(int16 >> 8 & 0x00ff)
        let b = UInt8(int16 & 0x00ff)
        // convert the 2 bytes to hexadecimals
        let first1Hexa = String(a, radix: 8)
        let second2Hexa = String(b, radix: 8)
        let firstHexa = String(format: "%02X", a)
        let secondHexa = String(format: "%02X", b)
        // combine the 2 hexas into 1 string with 4 characters, adding a leading 0 if there is only 1 character
        if firstHexa.count == 1 {
            let appendedFHexa = "0" + firstHexa
            encodedHexa = appendedFHexa + secondHexa
        } else if secondHexa.count == 1 {
            let appendedSHexa = "0" + secondHexa
            encodedHexa = firstHexa + appendedSHexa
        } else {
            encodedHexa = firstHexa + secondHexa
        }
    }
    return encodedHexa
}
Please help ma'ams and sirs! Thanks.

From your test cases, it seems like your values are 7 bits per byte.
You want 8192 to convert to 4000.
You want 16383 to convert to 7F7F.
Note that:
(0x7f << 7) + 0x7f == 16383
Given that:
let a = UInt8((int16 >> 7) & 0x7f)
let b = UInt8(int16 & 0x7f)
let result = String(format: "%02X%02X", a , b)
This gives:
"4000" for 8128
"7F7F" for 16383
To reverse the process:
let str = "7F7F"
let value = Int(str, radix: 16)!
let result = ((value >> 8) & 0x7f) << 7 + (value & 0x7f)
print(result) // 16383
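Putting the pieces together, here is a minimal sketch (not a drop-in answer) that keeps your original -8192...8191 range and the +8192 offset; the decode helper and its name hexStringToInt are just for illustration:
import Foundation

// Sketch only: combines the 7-bit-per-byte idea above with the
// +8192 offset from the question's intToHexString.
func intToHexString(_ int: Int16) -> String? {
    guard (-8192...8191).contains(int) else { return nil }
    let shifted = UInt16(int + 8192)          // now in 0...16383
    let a = UInt8((shifted >> 7) & 0x7f)      // high 7 bits
    let b = UInt8(shifted & 0x7f)             // low 7 bits
    return String(format: "%02X%02X", a, b)   // 16383 -> "7F7F"
}

func hexStringToInt(_ hex: String) -> Int16? {
    guard let value = Int(hex, radix: 16) else { return nil }
    let combined = ((value >> 8) & 0x7f) << 7 + (value & 0x7f)
    return Int16(combined - 8192)             // undo the offset
}

intToHexString(8191)   // "7F7F"
hexStringToInt("7F7F") // 8191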

Related

Swift: Byte array to decimal value

In my project I communicate with a Bluetooth device. The device sends me a timestamp in seconds, which I receive as bytes:
[2, 6, 239]
When I convert them to a string:
let payloadString = payload.map {
    String(format: "%02x", $0)
}
Output:
["02", "06", "ef"]
Converted on a website, 0206ef = 132847 seconds.
How can I directly convert my array [2, 6, 239] to seconds (= 132847 seconds)?
And if that is complicated, how can I convert my array ["02", "06", "ef"] to seconds instead?
The payload contains the bytes of the binary representation of the value.
You convert it back to the value by shifting each byte into its corresponding position:
let payload: [UInt8] = [2, 6, 239]
let value = Int(payload[0]) << 16 + Int(payload[1]) << 8 + Int(payload[2])
print(value) // 132847
The important point is to convert the bytes to integers before shifting, otherwise an overflow error would occur. Alternatively, with multiplication:
let value = (Int(payload[0]) * 256 + Int(payload[1])) * 256 + Int(payload[2])
or
let value = payload.reduce(0) { $0 * 256 + Int($1) }
The last approach works with an arbitrary number of bytes, as long as the result fits into an Int. For 4...8 bytes it is better to choose UInt64 to avoid overflow errors:
let value = payload.reduce(0) { $0 * 256 + UInt64($1) }
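For illustration (not part of the original answer), a quick check of the reduce variants with the payload from the question and a made-up 8-byte payload that shows where UInt64 matters; the names sensorPayload and longPayload are only for this example:
let sensorPayload: [UInt8] = [2, 6, 239]
let seconds = sensorPayload.reduce(0) { $0 * 256 + Int($1) }
print(seconds) // 132847

// Hypothetical 8-byte payload: this would overflow Int on 32-bit platforms,
// but fits exactly into UInt64.
let longPayload: [UInt8] = [255, 255, 255, 255, 255, 255, 255, 255]
let big: UInt64 = longPayload.reduce(0) { $0 * 256 + UInt64($1) }
print(big) // 18446744073709551615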
Alternatively, payloadString can be joined into a single hex string and then converted to decimal:
let payload: [UInt8] = [2, 6, 239]
let payloadString = payload.map {
    String(format: "%02x", $0)
}
// or: let hexStr = payloadString.reduce("") { $0 + $1 }
let hexStr = payloadString.joined()
if let value = UInt64(hexStr, radix: 16) {
    print(value) // 132847
}

Change negative number to binary in Swift

I have tried converting the integer value -1 to binary using
String(-1, radix: 2)
and it was supposed to print out 0b1000, but it printed out -1.
I have tried other numbers and looked for other tutorials, but they don't cover negative numbers.
Is there any best practice for this? Thank you.
func printBinary4(x: Int) {
    let numBit = 4
    var i = x
    if i < 0 {
        i = (0b1 << numBit) + i
    }
    var str = String(i, radix: 2)
    if str.count < numBit {
        str = String(repeating: "0", count: numBit - str.count) + str
    }
    print(str)
}
printBinary4(x: -1)
printBinary4(x: -2)
printBinary4(x: -3)
printBinary4(x: 0)
printBinary4(x: 4)
1111
1110
1101
0000
0100
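As a side note (not from the original answer), if the value fits a fixed-width integer type, you can get the two's-complement bits directly by reinterpreting the bit pattern; a minimal sketch assuming the value fits in Int8:
// Sketch: two's-complement binary via bitPattern, padded to 8 digits.
let x: Int8 = -1
let bits = String(UInt8(bitPattern: x), radix: 2)
let padded = String(repeating: "0", count: 8 - bits.count) + bits
print(padded) // 11111111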

Swift concatenating Int16 and Int32 vars into a multi-byte entity

I am writing code to read and write a file using a rigidly defined binary format. I have a mixture of Int16 and Int32 values, which together are defined as a 240-byte structure. How do I concatenate these values into this single structure?
Here is an example. You can customize your own init method, and use a loop if there are too many repeated lengths.
public struct HistoryRecord {
    public let sequence: UInt8
    let impFlag: UInt8
    let mode: UInt8
    let power: UInt16
    let utc: UInt32
    let weight: UInt16
    let impedance: UInt16

    private let sequenceLength = 1
    private let modeLength = 1
    private let powerLength = 2
    private let utcLength = 4
    private let weightLength = 2
    private let impedanceLength = 2

    public init() {
        sequence = 0
        impFlag = 0
        mode = 0
        power = 0
        utc = 0
        weight = 0
        impedance = 0
    }

    public init?(_ data: NSData) {
        let needLength = sequenceLength + modeLength + powerLength + utcLength + weightLength + impedanceLength
        guard data.length == needLength else {
            return nil
        }
        var bufferUInt8: UInt8 = 0
        var bufferUInt16: UInt16 = 0
        var bufferUInt32: UInt32 = 0

        data.getBytes(&bufferUInt8, range: NSMakeRange(0, sequenceLength))
        sequence = bufferUInt8

        bufferUInt8 = 0
        data.getBytes(&bufferUInt8, range: NSMakeRange(sequenceLength, modeLength))
        impFlag = (bufferUInt8 >> 4) & 0x00ff
        mode = bufferUInt8 & 0x00ff

        bufferUInt16 = 0
        data.getBytes(&bufferUInt16, range: NSMakeRange(sequenceLength + modeLength, powerLength))
        power = bufferUInt16.bigEndian

        bufferUInt32 = 0
        data.getBytes(&bufferUInt32, range: NSMakeRange(sequenceLength + modeLength + powerLength, utcLength))
        utc = bufferUInt32.bigEndian

        bufferUInt16 = 0
        data.getBytes(&bufferUInt16, range: NSMakeRange(sequenceLength + modeLength + powerLength + utcLength, weightLength))
        weight = bufferUInt16.bigEndian

        bufferUInt16 = 0
        data.getBytes(&bufferUInt16, range: NSMakeRange(sequenceLength + modeLength + powerLength + utcLength + weightLength, impedanceLength))
        impedance = bufferUInt16.bigEndian
    }
}
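The question also asks about writing. As a rough sketch only (not part of the original answer, assuming impFlag sits in the high nibble and mode in the low nibble of the second byte, with all multi-byte fields big-endian), the reverse direction could look like this:
import Foundation

// Sketch: serialize a HistoryRecord back into bytes.
func encode(_ record: HistoryRecord) -> Data {
    var data = Data()
    data.append(record.sequence)
    data.append((record.impFlag << 4) | (record.mode & 0x0f))
    withUnsafeBytes(of: record.power.bigEndian)     { data.append(contentsOf: $0) }
    withUnsafeBytes(of: record.utc.bigEndian)       { data.append(contentsOf: $0) }
    withUnsafeBytes(of: record.weight.bigEndian)    { data.append(contentsOf: $0) }
    withUnsafeBytes(of: record.impedance.bigEndian) { data.append(contentsOf: $0) }
    return data // 12 bytes for this layout
}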

How in swift to convert Int16 to two UInt8 Bytes

I have some binary data that encodes a two byte value as a signed integer.
bytes[1] = 255 // 0xFF
bytes[2] = 251 // 0xFB
Decoding
This is fairly easy - I can extract an Int16 value from these bytes with:
Int16(bytes[1]) << 8 | Int16(bytes[2])
Encoding
This is where I'm running into issues. Most of my data spec calls for UInt, which is easy, but I'm having trouble extracting the two bytes that make up an Int16.
let nv : Int16 = -15
UInt8(nv >> 8) // fail
UInt8(nv) // fail
Question
How would I extract the two bytes that make up an Int16 value?
You should work with unsigned integers:
let bytes: [UInt8] = [255, 251]
let uInt16Value = UInt16(bytes[0]) << 8 | UInt16(bytes[1])
let uInt8Value0 = UInt8(uInt16Value >> 8)
let uInt8Value1 = UInt8(uInt16Value & 0x00ff)
If you want to convert a UInt16 to the bit-equivalent Int16, you can do it with a specific initializer:
let int16Value: Int16 = -15
let uInt16Value = UInt16(bitPattern: int16Value)
And vice versa:
let uInt16Value: UInt16 = 65000
let int16Value = Int16(bitPattern: uInt16Value)
In your case:
let nv: Int16 = -15
let uNv = UInt16(bitPattern: nv)
UInt8(uNv >> 8)
UInt8(uNv & 0x00ff)
You could use the init(truncatingBitPattern:) initializer (renamed init(truncatingIfNeeded:) in Swift 4):
let nv: Int16 = -15
UInt8(truncatingBitPattern: nv >> 8) // -> 255
UInt8(truncatingBitPattern: nv) // -> 241
I would just do this:
let a = UInt8(nv >> 8 & 0x00ff) // 255
let b = UInt8(nv & 0x00ff) // 241
extension Int16 {
    var twoBytes: [UInt8] {
        let unsignedSelf = UInt16(bitPattern: self)
        return [UInt8(truncatingIfNeeded: unsignedSelf >> 8),
                UInt8(truncatingIfNeeded: unsignedSelf)]
    }
}
var test: Int16 = -15
test.twoBytes // [255, 241]

Swift - Turn Int to binary representations

I receive an Int from my server which I'd like to explode into an array of bit masks. So for example, if my server gives me the number 3, I should get two values: a binary 1 and a binary 2.
How do I do this in Swift?
You could use:
let number = 3
//radix: 2 is binary, if you wanted hex you could do radix: 16
let str = String(number, radix: 2)
print(str)
prints "11"
let number = 79
//radix: 2 is binary, if you wanted hex you could do radix: 16
let str = String(number, radix: 16)
print(str)
prints "4f"
I am not aware of any nice built-in way, but you could use this:
let i = 3
let a = 0..<8
let b = a.map { i & (1 << $0) }
// b = [1, 2, 0, 0, 0, 0, 0, 0]
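If you only want the set bits, filtering the zeros out of b above gives the masks the question asks for (a small addition, not from the original answer):
// Keep only the non-zero entries; these are the actual bit masks.
let masks = b.filter { $0 != 0 }
print(masks) // [1, 2]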
Here is a straightforward implementation:
func intToMasks(_ n: Int) -> [Int] {
    var n = n
    var masks = [Int]()
    var mask = 1
    while n > 0 {
        if n & mask > 0 {
            masks.append(mask)
            n -= mask
        }
        mask <<= 1
    }
    return masks
}
print(intToMasks(3))    // prints "[1, 2]"
print(intToMasks(1000)) // prints "[8, 32, 64, 128, 256, 512]"
public extension UnsignedInteger {
    /// The digits that make up this number.
    /// - Parameter radix: The base the result will use.
    func digits(radix: Self = 10) -> [Self] {
        sequence(state: self) { quotient in
            guard quotient > 0
            else { return nil }
            let division = quotient.quotientAndRemainder(dividingBy: radix)
            quotient = division.quotient
            return division.remainder
        }
        .reversed()
    }
}
let digits = (6 as UInt).digits(radix: 0b10) // [1, 1, 0]
digits.reversed().enumerated().map { $1 << $0 } // [0, 2, 4]
Reverse the result too, if you need it.