I am trying to get a two-character hex value from an integer:
let hex = String(format:"%2X", 0)
print("hex = \(hex)")
hex = " 0"
How can I format the string so that the result always has two characters? In this case I would want
hex = "00"
You can add the 0 padding flag before the width in the format specifier:
let hex = String(format:"%02X", 0)
Result:
let hex = String(format:"%02X", 0) // 00
let hex = String(format:"%02X", 15) // 0F
let hex = String(format:"%02X", 16) // 10
I'm declaring the following constants and printing their values, but they always print in decimal form. How do I get them to print in their originally defined form?
let binaryInteger = 0b10001
let octalInteger = 0o21
let hexadecimalInteger = 0x11
print("Binary Integer: \(binaryInteger) and in binary \(binaryInteger)")
print("Octal Integer: \(octalInteger) and in octal \(octalInteger)")
print("Hexadecimal Integer: \(hexadecimalInteger) and in hexadecimal \(hexadecimalInteger)")
The output I'd like to see for the first print statement would be:
Binary Integer: 17 and in binary 10001.
What do I need to put in front of the \( to get it to print in binary/octal/hex?
You have to convert the integers to the desired String representation manually:
print("Binary Integer: \(binaryInteger) and in binary \(String(binaryInteger, radix: 2))")
print("Octal Integer: \(octalInteger) and in octal \(String(octalInteger, radix: 8))")
print("Hexadecimal Integer: \(hexadecimalInteger) and in hexadecimal \(String(hexadecimalInteger, radix: 16))")
Of course, you might also create a custom String interpolation to be able to use something like the following:
print("Hexadecimal Integer: \(hexadecimalInteger) and in hexadecimal \(hexadecimalInteger, radix: 16)")
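For reference, a minimal sketch of such an extension (Swift 5 string interpolation; the radix: parameter label is this sketch's own choice, not a standard API):
extension String.StringInterpolation {
    // Allows \(value, radix: n) inside string literals.
    mutating func appendInterpolation<T: BinaryInteger>(_ value: T, radix: Int) {
        appendLiteral(String(value, radix: radix))
    }
}
print("Hexadecimal Integer: \(hexadecimalInteger) and in hexadecimal \(hexadecimalInteger, radix: 16)")
// Hexadecimal Integer: 17 and in hexadecimal 11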
The basic answer is given by Silthan; I would also wrap the number in a structure, which helps to preserve the original format:
enum Radix: Int {
case binary = 2
case octal = 8
case hexa = 16
var prefix: String {
switch self {
case .binary:
return "0b"
case .octal:
return "0o"
case .hexa:
return "0x"
}
}
}
struct PresentableNumber<T> where T: BinaryInteger & CustomStringConvertible {
let number: T
let radix: Radix
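// Reconstructs the literal as it was originally written, e.g. "0b10001" or "0x11".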
var originalPresentation: String {
return radix.prefix + String(number, radix: radix.rawValue)
}
}
extension PresentableNumber: CustomStringConvertible {
var description: String {
return number.description
}
}
This way you can define and print your numbers like this:
let binaryInteger = PresentableNumber(number: 0b10001, radix: .binary)
let octalInteger = PresentableNumber(number: 0o21, radix: .octal)
let hexadecimalInteger = PresentableNumber(number: 0x11, radix: .hexa)
print("Binary Integer: \(binaryInteger) and in binary \(binaryInteger.originalPresentation)")
print("Octal Integer: \(octalInteger) and in octal \(octalInteger.originalPresentation)")
print("Hexadecimal Integer: \(hexadecimalInteger) and in hexadecimal \(hexadecimalInteger.originalPresentation)")
The output:
Binary Integer: 17 and in binary 0b10001
Octal Integer: 17 and in octal 0o21
Hexadecimal Integer: 17 and in hexadecimal 0x11
I have an array of UInt8 and I want to calculate CheckSum8 Modulo 256.
If the sum of the bytes is less than 255, the checkSum function returns the correct value, e.g.
let bytes1: [UInt8] = [1, 0xa1]
let validCheck = checkSum(data: bytes1) // 162 = 0xa2
let bytes: [UInt8] = [6, 0xB1, 27, 0xc5, 0xf5, 0x9d]
let invalidCheck = checkSum(data: bytes) // 41
The function below returns 41, but the expected checksum is 35.
func checkSum(data: [UInt8]) -> UInt8 {
var sum = 0
for i in 0..<data.count {
sum += Int(data[i])
}
let retVal = sum & 0xff
return UInt8(retVal)
}
Your checkSum method is largely right. If you want, you could simplify it to:
func checkSum(_ values: [UInt8]) -> UInt8 {
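// Widen each byte to UInt32 before adding so the running sum cannot
// overflow, and mask to the low byte (modulo 256) at each step.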
let result = values.reduce(0) { ($0 + UInt32($1)) & 0xff }
return UInt8(result)
}
You point to a web site that reports the checksum8 of 06B127c5f59d as 35. The problem is that your array has 27, not 0x27. When writing hexadecimal values, you always need the 0x prefix for each value in your array literal (or, technically, at least when the value is larger than 9).
So, consider:
let values: [UInt8] = [0x06, 0xB1, 0x27, 0xc5, 0xf5, 0x9d]
let result = checkSum(values)
That’s 53 in decimal. If you want to see it in hexadecimal (like the site you referred to):
let hex = String(result, radix: 16)
That shows us that the checksum is 0x35 in hexadecimal.
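Note that String(_:radix:) does not zero-pad, so a one-digit checksum such as 0x05 would come out as "5". If you always want two digits, the %02X format from the first answer applies here as well:
let hex = String(format: "%02X", result) // "35"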
In my project I communicate with a Bluetooth device. The device sends me a timestamp in seconds, which I receive as bytes:
[2,6,239]
When I convert it to a string:
let payloadString = payload.map {
String(format: "%02x", $0)
}
Output:
["02", "06","ef"]
Converting 0206ef on a website gives 132847 seconds.
How can I directly convert my array [2, 6, 239] to seconds (= 132847 seconds)?
Or, if that is complicated, convert my array ["02", "06", "ef"] to seconds instead?
The payload contains the bytes of the binary representation of the value.
You convert it back to the value by shifting each byte into its corresponding position:
let payload: [UInt8] = [2, 6, 239]
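// payload[0] is the most significant byte (big-endian byte order).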
let value = Int(payload[0]) << 16 + Int(payload[1]) << 8 + Int(payload[2])
print(value) // 132847
The important point is to convert the bytes to integers before shifting, otherwise an overflow error would occur. Alternatively,
with multiplication:
let value = (Int(payload[0]) * 256 + Int(payload[1])) * 256 + Int(payload[2])
or
let value = payload.reduce(0) { $0 * 256 + Int($1) }
The last approach works with an arbitrary number of bytes, as long as the result fits into an Int. For 4...8 bytes it is better to choose UInt64 to avoid overflow errors:
let value = payload.reduce(0) { $0 * 256 + UInt64($1) }
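The same reduce pattern can also be made generic over the result type. A sketch, where value(fromBigEndianBytes:) is a helper name invented here and the byte count is assumed to fit the chosen integer type:
func value<T: FixedWidthInteger & UnsignedInteger>(fromBigEndianBytes bytes: [UInt8]) -> T {
    // Fold the bytes from most significant to least significant.
    return bytes.reduce(0) { ($0 << 8) | T($1) }
}
let seconds: UInt64 = value(fromBigEndianBytes: [2, 6, 239])
print(seconds) // 132847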
Alternatively, the payloadString array can be joined into a single hex string and then converted to a number:
let payload = [2, 6, 239]
let payloadString = payload.map {
    String(format: "%02x", $0)
}
let hexStr = payloadString.joined()
if let value = UInt64(hexStr, radix: 16) {
    print(value) // 132847
}
I need 16383 to be converted to 7F7F, but I can only get it to convert to 3fff or 77377.
I can convert 8192 to the hexadecimal string 4000, which is essentially the same thing.
If I use let firstHexa = String(format:"%02X", a), it stops at 3fff for the first number and 2000 for the second number. Here is my code:
public func intToHexString(_ int: Int16) -> String {
var encodedHexa: String = ""
if int >= -8192 && int <= 8191 {
let int16 = int + 8192
//convert to two unsigned Int8 bytes
let a = UInt8(int16 >> 8 & 0x00ff)
let b = UInt8(int16 & 0x00ff)
//convert the 2 bytes to hexadecimals
let first1Hexa = String(a, radix: 8 )
let second2Hexa = String(b, radix: 8)
let firstHexa = String(format:"%02X", a)
let secondHexa = String(format:"%02X", b)
//combine the 2 hexas into 1 string with 4 characters... adding 0 to the beginning if only 1 character.
if firstHexa.count == 1 {
let appendedFHexa = "0" + firstHexa
encodedHexa = appendedFHexa + secondHexa
} else if secondHexa.count == 1 {
let appendedSHexa = "0" + secondHexa
encodedHexa = firstHexa + appendedSHexa
} else {
encodedHexa = firstHexa + secondHexa
}
}
return encodedHexa
}
Please help ma'ams and sirs! Thanks.
From your test cases, it seems like your values are 7 bits per byte.
You want 8192 to convert to 4000.
You want 16383 to convert to 7F7F.
Note that:
(0x7f << 7) + 0x7f == 16383
Given that:
let a = UInt8((int16 >> 7) & 0x7f)
let b = UInt8(int16 & 0x7f)
let result = String(format: "%02X%02X", a , b)
This gives:
"4000" for 8128
"7F7F" for 16383
To reverse the process:
let str = "7F7F"
let value = Int(str, radix: 16)!
let result = ((value >> 8) & 0x7f) << 7 + (value & 0x7f)
print(result) // 16383
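Putting it together, here is one possible rewrite of the original intToHexString under this 7-bits-per-byte assumption (a sketch, not the asker's exact requirements):
import Foundation // for String(format:)

func intToHexString(_ int: Int16) -> String {
    // Same range check and offset as the original code.
    guard int >= -8192 && int <= 8191 else { return "" }
    let shifted = int + 8192 // now in 0...16383
    // Split into two 7-bit groups and print each as two hex digits.
    let a = UInt8((shifted >> 7) & 0x7f)
    let b = UInt8(shifted & 0x7f)
    return String(format: "%02X%02X", a, b)
}
intToHexString(0)    // "4000"
intToHexString(8191) // "7F7F"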
Hi everyone,
I need to transform an Int to its hexadecimal value.
Example: -40 => D8
I have a working method for positive (or unsigned) Int but it doesn't work as expected with negatives. Here's my code.
class func encodeHex(data:[Int]) -> String {
let hexadecimal = data.reduce("") { (string , element) in
var append = String(element, radix:16 , uppercase : false)
if append.count == 1 {
append = "0" + append
}
return string + append
}
return hexadecimal
}
If I pass -40 I get -28.
Can anyone help ? Thanks :)
I assume from your existing code that all integers are in the range
-128 ... 127. Then this would work:
func encodeHex(data:[Int]) -> String {
return data.map { String(format: "%02hhX", $0) }.joined()
}
The "%02hhX" format prints the least significant byte of the
given integer in base 16 with 2 digits.
Example:
print(encodeHex(data: [40, -40, 127, -128]))
// 28D87F80
D8 is the last byte of the binary representation of -40. The remaining bytes (three for a 32-bit integer, seven for a 64-bit Int) are all FF.
If you are looking for a string that represents only the last byte, you can obtain it by first converting your number to an unsigned 8-bit integer and then converting that to hex, like this:
let x = UInt8(bitPattern: Int8(-40)) // 216
let res = String(format: "%02X", x)  // "D8"
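Applied to the whole array, one possible variant of encodeHex built on this bit-pattern conversion (a sketch, not the original answer's code):
func encodeHex(data: [Int]) -> String {
    return data.map { value -> String in
        // Keep only the low byte, reinterpreted as unsigned.
        let byte = UInt8(bitPattern: Int8(truncatingIfNeeded: value))
        let hex = String(byte, radix: 16)
        return hex.count == 1 ? "0" + hex : hex
    }.joined()
}
print(encodeHex(data: [40, -40, 127, -128])) // 28d87f80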