Binary to hexadecimal in Swift

I have a string in binary (for example "00100100"), and I want it in hexadecimal (like "24").
Is there a method written to convert Binary to Hexadecimal in Swift?

A possible solution:
func binToHex(bin: String) -> String {
    // binary to integer:
    let num = bin.withCString { strtoul($0, nil, 2) }
    // integer to hex:
    let hex = String(num, radix: 16, uppercase: true) // (or false)
    return hex
}
This works as long as the numbers fit into the range of UInt (32-bit or 64-bit,
depending on the platform). It uses the BSD library function strtoul(), which converts a string to an integer according to a given base.
For larger numbers you have to process the input in chunks. You might also add validation of the input string, as sketched below.
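As a rough sketch of the chunked approach (my own illustration, using the Swift 3+ initializers shown in the update below): every group of 4 binary digits maps to exactly one hex digit, so you can left-pad the input to a multiple of 4 and convert nibble by nibble, which also validates the input:
func binToHexChunked(_ bin: String) -> String? {
    guard !bin.isEmpty else { return nil }
    // Left-pad with zeros so the length is a multiple of 4.
    let padded = String(repeating: "0", count: (4 - bin.count % 4) % 4) + bin
    var hex = ""
    var index = padded.startIndex
    while index < padded.endIndex {
        let next = padded.index(index, offsetBy: 4)
        // UInt8(_:radix: 2) fails for anything other than binary digits,
        // which doubles as validation of the input string.
        guard let nibble = UInt8(padded[index..<next], radix: 2) else { return nil }
        hex += String(nibble, radix: 16, uppercase: true)
        index = next
    }
    return hex
}
binToHexChunked("00100100") // "24"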
Update for Swift 3/4: The strtoul function is no longer needed; the integer types now have a failable initializer that parses a string in a given radix. This version returns nil for invalid input:
func binToHex(_ bin: String) -> String? {
    // binary to integer:
    guard let num = UInt64(bin, radix: 2) else { return nil }
    // integer to hex:
    let hex = String(num, radix: 16, uppercase: true) // (or false)
    return hex
}
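Using the question's example input:
if let hex = binToHex("00100100") {
    print(hex) // 24
}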

let binaryInteger = 0b1 // your binary number
let hexadecimalNum = String(binaryInteger, radix: 16) // convert to a string in whatever base you want
For more information, these are the integer literal prefixes in Swift:
let decimalInteger = 15 // prefix NONE
let binaryInteger = 0b10001 // prefix 0b
let octalInteger = 0o21 // prefix 0o
let hexadecimalInteger = 0x11 // prefix 0x
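For illustration (my own example, not from the original answer), the binary, octal, and hexadecimal literals above all denote the same value 17, and String(_:radix:) / Int(_:radix:) convert between the representations:
let n = 0b10001          // 17, same value as 0o21 and 0x11
String(n, radix: 2)      // "10001"
String(n, radix: 8)      // "21"
String(n, radix: 16)     // "11"
Int("11", radix: 16)     // Optional(17)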

Related

How to convert my bytes data to a hex string, and then to a signed 32-bit integer (two's complement) from that?

I have data like below:
let data = Data(bytes: [206, 66, 49, 62])
Then I used this extension (from How to convert Data to hex string in swift) to convert to a hex string:
extension Data {
    struct HexEncodingOptions: OptionSet {
        let rawValue: Int
        static let upperCase = HexEncodingOptions(rawValue: 1 << 0)
    }
    func hexEncodedString(options: HexEncodingOptions = []) -> String {
        let hexDigits = Array((options.contains(.upperCase) ? "0123456789ABCDEF" : "0123456789abcdef").utf16)
        var chars: [unichar] = []
        chars.reserveCapacity(2 * count)
        for byte in self {
            chars.append(hexDigits[Int(byte / 16)])
            chars.append(hexDigits[Int(byte % 16)])
        }
        return String(utf16CodeUnits: chars, count: chars.count)
    }
}
And then it gives "ce42313e" as the hex string. Now I am trying to convert this to a signed 32-bit integer (two's complement). I tried a couple of ways but didn't find anything that works.
When I enter "ce42313e" under hexadecimal in the link below, the value is -834522818:
http://www.binaryconvert.com/convert_signed_int.html
Below is one of the things I tried to convert "ce42313e" to an int; it gives me 3460444478 instead of -834522818.
let str = value
let number = Int(str, radix: 16)
Please help me out to get that value.
Int(str, radix: 16) interprets the string as the hexadecimal
representation of an unsigned number. You could convert it to
Int32 with
let data = Data(bytes: [206, 66, 49, 62])
let str = data.hexEncodedString()
print(str) // ce42313e
let number = Int32(truncatingBitPattern: Int(str, radix: 16)!)
print(number) // -834522818
But actually you don't need the hex representation for that purpose.
Your data is the big-endian representation of a signed 32-bit integer,
and this is how you can get the number from the data directly:
let data = Data(bytes: [206, 66, 49, 62])
let number = Int32(bigEndian: data.withUnsafeBytes { $0.pointee })
print(number) // -834522818
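Note that later Swift versions spell this slightly differently: Int32(truncatingBitPattern:) became Int32(truncatingIfNeeded:) in Swift 4, and in Swift 5 withUnsafeBytes passes an UnsafeRawBufferPointer, so you load the value instead of reading pointee. A hedged adaptation of the same two snippets:
let data = Data([206, 66, 49, 62])
// Hex-string route (Swift 4+):
let number1 = Int32(truncatingIfNeeded: Int("ce42313e", radix: 16)!)
print(number1) // -834522818
// Direct route (Swift 5), same big-endian interpretation as above:
let number2 = Int32(bigEndian: data.withUnsafeBytes { $0.load(as: Int32.self) })
print(number2) // -834522818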

Swift 4 hex string to binary string

I noted that the old method to convert a hex string to a binary string has been removed from Swift, i.e.: String(hex, radix: 2) -> binary string.
What is an alternative in Swift 4?
You first need to convert your hex string to an array of bytes, [UInt8]. Then you can use String(_:radix:) to convert the bytes to binary. Note that if you would like to return a single String instead of an array of strings [String], you need to add leading zeros to make your binary strings a consistent length (8 characters):
extension String {
    typealias Byte = UInt8
    var hexaToBytes: [Byte] {
        var start = startIndex
        return stride(from: 0, to: count, by: 2).compactMap { _ in // use flatMap for older Swift versions
            let end = index(after: start)
            defer { start = index(after: end) }
            return Byte(self[start...end], radix: 16)
        }
    }
    var hexaToBinary: String {
        return hexaToBytes.map {
            let binary = String($0, radix: 2)
            return repeatElement("0", count: 8 - binary.count) + binary
        }.joined()
    }
}
let hexString = "00ff01fe"
hexString.hexaToBinary // "00000000111111110000000111111110"
I don't recall any function that would convert a hex string to another string of arbitrary radix. Perhaps you are thinking about the initializer functions that convert between strings and integer values (and vice versa) using an arbitrary radix:
let hex = "00ff01fe"
let value = UInt64(hex, radix: 16)!
let binary = String(value, radix: 2)
let paddedBinary = repeatElement("0", count: 64 - binary.count) + binary
That only applies when the hex string represents a value that fits in 64 bits, but it illustrates the basic idea: convert to some integer type, and then convert back to binary, padding it out with zeros.
If you have a hex string that is longer than that, you cannot use the above. But you can map the individual characters of your hex string to numeric values, build binary representation of each, zero pad them, and use joined to concatenate them together:
let hex = "ffeeddccbbaa99887766554433221100"
let result = hex.compactMap { c -> String? in
    guard let value = Int(String(c), radix: 16) else { return nil }
    let string = String(value, radix: 2)
    return repeatElement("0", count: 4 - string.count) + string
}.joined()

Dealing with Octal numbers in swift

I can easily turn a decimal number into an octal but I'm trying to do the reverse and I've got stuck.
let decimal = 11_224_393
let octString = String(decimal, radix: 8, uppercase: false)
let octal = octString.toInt()
Question
I want a function that, given an Int whose digits form an octal number, reads it as octal and converts it to decimal,
such as:
// oct2dec(777) = 511
// oct2dec(10) = 8
func oct2dec(octal: Int) -> Int {
    // what goes here?
}
Just use Swift native octal literals and initializers (String from integer | Int from String).
let octalInt = 0o1234 // 668
let octalString = "1234" // "1234"
let decimalIntFromOctalString = Int(octalString, radix: 0o10) // 668
let octalStringFromInt = String(octalInt, radix: 0o10) // "1234"
For your specific use-case:
let decimal = 11_224_393
let octString = String(decimal, radix: 0o10, uppercase: false)
guard let octal = Int(octString, radix: 0o10) else {
    print("octString was not a valid octal string:", octString)
    fatalError() // a guard's else branch must exit the current scope
}
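Putting that together into the oct2dec shape the question asks for (a small sketch based on the same String/Int initializers; it returns an optional because digits 8 or 9 make the input invalid as octal):
func oct2dec(_ octal: Int) -> Int? {
    return Int(String(octal), radix: 0o10)
}
oct2dec(777) // Optional(511)
oct2dec(10)  // Optional(8)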
Using string conversion functions is pretty horrible in my opinion. How about something like this instead:
import Foundation // for pow

func octalToDecimal(_ octal: Int) -> Int {
    var octal = octal
    var decimal = 0, i = 0
    while octal != 0 {
        let remainder = octal % 10
        octal /= 10
        decimal += remainder * Int(pow(8, Double(i)))
        i += 1
    }
    return decimal
}
var decimal = octalToDecimal(777) // decimal is 511

Convert emoji to hex value using Swift

I'm trying to convert emojis to hex values. I found some code online to do it, but it only works using Objective-C. How can I do the same with Swift?
This is a "pure Swift" method, without using Foundation:
let smiley = "😊"
let uni = smiley.unicodeScalars // Unicode scalar values of the string
let unicode = uni[uni.startIndex].value // First element as an UInt32
print(String(unicode, radix: 16, uppercase: true))
// Output: 1F60A
Note that a Swift Character represents a "Unicode grapheme cluster"
(compare Strings in Swift 2 from the Swift blog) which can
consist of several "Unicode scalar values". Taking the example
from #TomSawyer's comment below:
let zero = "0️⃣"
let uni = zero.unicodeScalars // Unicode scalar values of the string
let unicodes = uni.map { $0.value }
print(unicodes.map { String($0, radix: 16, uppercase: true) } )
// Output: ["30", "FE0F", "20E3"]
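If you want the hex values of all scalars joined into one string (my own small addition building on the snippet above):
let hexCodes = zero.unicodeScalars
    .map { String($0.value, radix: 16, uppercase: true) }
    .joined(separator: " ")
print(hexCodes) // 30 FE0F 20E3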
If someone is trying to find a way to convert an emoji to a Unicode escape string (and back):
extension String {
    func decode() -> String {
        let data = self.data(using: .utf8)!
        return String(data: data, encoding: .nonLossyASCII) ?? self
    }
    func encode() -> String {
        let data = self.data(using: .nonLossyASCII, allowLossyConversion: true)!
        return String(data: data, encoding: .utf8)!
    }
}
Example:
"😍".encode()
RESULT: \ud83d\ude0d
"\ud83d\ude0d".decode()
RESULT: 😍
It works similarly but pay attention when you're printing it:
import Foundation
var smiley = "😊"
var data: NSData = smiley.dataUsingEncoding(NSUTF32LittleEndianStringEncoding, allowLossyConversion: false)!
var unicode:UInt32 = UInt32()
data.getBytes(&unicode)
// println(unicode) // Prints the decimal value
println(NSString(format:"%2X", unicode)) // Print the hex value of the smiley
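That snippet uses pre-Swift-3 syntax (NSData, dataUsingEncoding, println). A rough present-day equivalent of the same idea, under the same UTF-32 little-endian assumption:
import Foundation

let smiley = "😊"
if let data = smiley.data(using: .utf32LittleEndian) {
    // Like the original, this reads only the first scalar (the first 4 bytes).
    let unicode = data.withUnsafeBytes { $0.load(as: UInt32.self) }
    print(String(format: "%2X", unicode)) // 1F60A
}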

How can I create a String from UTF8 in Swift?

We know we can print each character of a String as UTF-8 code units.
Then, if we have the code units of those characters, how can we create a String from them?
With Swift 5, you can choose one of the following ways in order to convert a collection of UTF-8 code units into a string.
#1. Using String's init(_:) initializer
If you have a String.UTF8View instance (i.e. a collection of UTF-8 code units) and want to convert it to a string, you can use init(_:) initializer. init(_:) has the following declaration:
init(_ utf8: String.UTF8View)
Creates a string corresponding to the given sequence of UTF-8 code units.
The Playground sample code below shows how to use init(_:):
let string = "Café 🇫🇷"
let utf8View: String.UTF8View = string.utf8
let newString = String(utf8View)
print(newString) // prints: Café 🇫🇷
#2. Using Swift's init(decoding:as:) initializer
init(decoding:as:) creates a string from the given Unicode code units collection in the specified encoding:
let string = "Café 🇫🇷"
let codeUnits: [Unicode.UTF8.CodeUnit] = Array(string.utf8)
let newString = String(decoding: codeUnits, as: UTF8.self)
print(newString) // prints: Café 🇫🇷
Note that init(decoding:as:) also works with String.UTF8View parameter:
let string = "Café 🇫🇷"
let utf8View: String.UTF8View = string.utf8
let newString = String(decoding: utf8View, as: UTF8.self)
print(newString) // prints: Café 🇫🇷
#3. Using transcode(_:from:to:stoppingOnError:into:) function
The following example transcodes the UTF-8 representation of an initial string into Unicode scalar values (UTF-32 code units) that can be used to build a new string:
let string = "Café 🇫🇷"
let bytes = Array(string.utf8)
var newString = ""
_ = transcode(bytes.makeIterator(), from: UTF8.self, to: UTF32.self, stoppingOnError: true, into: {
    newString.append(String(Unicode.Scalar($0)!))
})
print(newString) // prints: Café 🇫🇷
#4. Using Array's withUnsafeBufferPointer(_:) method and String's init(cString:) initializer
init(cString:) has the following declaration:
init(cString: UnsafePointer<CChar>)
Creates a new string by copying the null-terminated UTF-8 data referenced by the given pointer.
The following example shows how to use init(cString:) with a pointer to the content of a CChar array (i.e. a well-formed UTF-8 code unit sequence) in order to create a string from it:
let bytes: [CChar] = [67, 97, 102, -61, -87, 32, -16, -97, -121, -85, -16, -97, -121, -73, 0]
let newString = bytes.withUnsafeBufferPointer({ (bufferPointer: UnsafeBufferPointer<CChar>) in
    return String(cString: bufferPointer.baseAddress!)
})
print(newString) // prints: Café 🇫🇷
#5. Using Unicode.UTF8's decode(_:) method
To decode a code unit sequence, call decode(_:) repeatedly until it returns UnicodeDecodingResult.emptyInput:
let string = "Café 🇫🇷"
let codeUnits = Array(string.utf8)
var codeUnitIterator = codeUnits.makeIterator()
var utf8Decoder = Unicode.UTF8()
var newString = ""
Decode: while true {
    switch utf8Decoder.decode(&codeUnitIterator) {
    case .scalarValue(let value):
        newString.append(Character(Unicode.Scalar(value)))
    case .emptyInput:
        break Decode
    case .error:
        print("Decoding error")
        break Decode
    }
}
print(newString) // prints: Café 🇫🇷
#6. Using String's init(bytes:encoding:) initializer
Foundation gives String a init(bytes:encoding:) initializer that you can use as indicated in the Playground sample code below:
import Foundation
let string = "Café 🇫🇷"
let bytes: [Unicode.UTF8.CodeUnit] = Array(string.utf8)
let newString = String(bytes: bytes, encoding: String.Encoding.utf8)
print(String(describing: newString)) // prints: Optional("Café 🇫🇷")
It's possible to convert UTF8 code points to a Swift String idiomatically using the UTF8 Swift class. Although it's much easier to convert from String to UTF8!
import Foundation

public class UTF8Encoding {
    public static func encode(bytes: Array<UInt8>) -> String {
        var encodedString = ""
        var decoder = UTF8()
        var generator = bytes.generate()
        var finished: Bool = false
        do {
            let decodingResult = decoder.decode(&generator)
            switch decodingResult {
            case .Result(let char):
                encodedString.append(char)
            case .EmptyInput:
                finished = true
            /* ignore errors and unexpected values */
            case .Error:
                finished = true
            default:
                finished = true
            }
        } while (!finished)
        return encodedString
    }
    public static func decode(str: String) -> Array<UInt8> {
        var decodedBytes = Array<UInt8>()
        for b in str.utf8 {
            decodedBytes.append(b)
        }
        return decodedBytes
    }
}

func testUTF8Encoding() {
    let testString = "A UTF8 String With Special Characters: 😀🍎"
    let decodedArray = UTF8Encoding.decode(testString)
    let encodedString = UTF8Encoding.encode(decodedArray)
    XCTAssert(encodedString == testString, "UTF8Encoding is lossless: \(encodedString) != \(testString)")
}
Of the other alternatives suggested:
Using NSString invokes the Objective-C bridge;
Using UnicodeScalar is error-prone because it converts UnicodeScalars directly to Characters, ignoring complex grapheme clusters; and
Using String.fromCString is potentially unsafe as it uses pointers.
To improve on Martin R's answer:
import AppKit
let utf8 : CChar[] = [65, 66, 67, 0]
let str = NSString(bytes: utf8, length: utf8.count, encoding: NSUTF8StringEncoding)
println(str) // Output: ABC
import AppKit
let utf8 : UInt8[] = [0xE2, 0x82, 0xAC, 0]
let str = NSString(bytes: utf8, length: utf8.count, encoding: NSUTF8StringEncoding)
println(str) // Output: €
What happens is that the Array can be automatically converted to CConstVoidPointer, which can be used to create the string with NSString(bytes: CConstVoidPointer, length: Int, encoding: UInt).
Swift 3
let s = String(bytes: arr, encoding: .utf8)
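For example (my own illustration), with an explicit byte array:
import Foundation

let arr: [UInt8] = [72, 101, 108, 108, 111] // "Hello" in UTF-8
let s = String(bytes: arr, encoding: .utf8) // Optional("Hello")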
I've been looking for a comprehensive answer regarding string manipulation in Swift myself. Relying on cast to and from NSString and other unsafe pointer magic just wasn't doing it for me. Here's a safe alternative:
First, we'll want to extend UInt8. This is the primitive type behind CodeUnit.
extension UInt8 {
    var character: Character {
        return Character(UnicodeScalar(self))
    }
}
This will allow us to do something like this:
let codeUnits: [UInt8] = [
72, 69, 76, 76, 79
]
let characters = codeUnits.map { $0.character }
let string = String(characters)
// string prints "HELLO"
Equipped with this extension, we can now begin modifying strings.
let string = "ABCDEFGHIJKLMONP"
var modifiedCharacters = [Character]()
for (index, utf8unit) in string.utf8.enumerate() {
    // Insert a "-" every 4 characters
    if index > 0 && index % 4 == 0 {
        let separator: UInt8 = 45 // "-" in ASCII
        modifiedCharacters.append(separator.character)
    }
    modifiedCharacters.append(utf8unit.character)
}
let modifiedString = String(modifiedCharacters)
// modified string == "ABCD-EFGH-IJKL-MONP"
// Swift4
var units = [UTF8.CodeUnit]()
//
// update units
//
let str = String(decoding: units, as: UTF8.self)
I would do something like this. It may not be as elegant as working with 'pointers', but it does the job well. It's basically a bunch of new += operators for String, like:
#infix func += (inout lhs: String, rhs: (unit1: UInt8)) {
    lhs += Character(UnicodeScalar(UInt32(rhs.unit1)))
}
#infix func += (inout lhs: String, rhs: (unit1: UInt8, unit2: UInt8)) {
    lhs += Character(UnicodeScalar(UInt32(rhs.unit1) << 8 | UInt32(rhs.unit2)))
}
#infix func += (inout lhs: String, rhs: (unit1: UInt8, unit2: UInt8, unit3: UInt8, unit4: UInt8)) {
    lhs += Character(UnicodeScalar(UInt32(rhs.unit1) << 24 | UInt32(rhs.unit2) << 16 | UInt32(rhs.unit3) << 8 | UInt32(rhs.unit4)))
}
NOTE: you can extend the list of supported operators by overloading the + operator as well, defining a set of fully commutative operators for String.
And now you are able to append a Unicode (UTF-8, UTF-16 or UTF-32) character to a String, e.g.:
var string: String = "signs of the Zodiac: "
string += (0x0, 0x0, 0x26, 0x4b)
string += (38)
string += (0x26, 76)
This is a possible solution (now updated for Swift 2):
let utf8 : [CChar] = [65, 66, 67, 0]
if let str = utf8.withUnsafeBufferPointer({ String.fromCString($0.baseAddress) }) {
    print(str) // Output: ABC
} else {
    print("Not a valid UTF-8 string")
}
Within the closure, $0 is a UnsafeBufferPointer<CChar> pointing to the array's contiguous storage. From that a Swift String can be created.
Alternatively, if you prefer the input as unsigned bytes:
let utf8 : [UInt8] = [0xE2, 0x82, 0xAC, 0]
if let str = utf8.withUnsafeBufferPointer({ String.fromCString(UnsafePointer($0.baseAddress)) }) {
    print(str) // Output: €
} else {
    print("Not a valid UTF-8 string")
}
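In Swift 3 and later, String.fromCString is gone; a hedged equivalent of the same two examples is String(decoding:as:), which takes the bytes directly (no trailing 0 needed, and invalid UTF-8 is replaced with U+FFFD instead of failing):
let asciiBytes: [UInt8] = [65, 66, 67]
print(String(decoding: asciiBytes, as: UTF8.self)) // ABC

let euroBytes: [UInt8] = [0xE2, 0x82, 0xAC]
print(String(decoding: euroBytes, as: UTF8.self)) // €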
If you're starting with a raw buffer, such as from the Data object returned from a file handle (in this case, taken from a Pipe object):
let data = pipe.fileHandleForReading.readDataToEndOfFile()
var unsafePointer = UnsafeMutablePointer<UInt8>.allocate(capacity: data.count)
data.copyBytes(to: unsafePointer, count: data.count)
let output = String(cString: unsafePointer)
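Note that in the snippet above the allocated pointer is never deallocated and String(cString:) expects a trailing null byte, so for a Data value it is usually simpler and safer to decode it directly (a sketch assuming the same pipe):
let data = pipe.fileHandleForReading.readDataToEndOfFile()
let output = String(decoding: data, as: UTF8.self)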
Here is a Swift 3.0 version of Martin R's answer:
public class UTF8Encoding {
    public static func encode(bytes: Array<UInt8>) -> String {
        var encodedString = ""
        var decoder = UTF8()
        var generator = bytes.makeIterator()
        var finished: Bool = false
        repeat {
            let decodingResult = decoder.decode(&generator)
            switch decodingResult {
            case .scalarValue(let char):
                encodedString += "\(char)"
            case .emptyInput:
                finished = true
            case .error:
                finished = true
            }
        } while (!finished)
        return encodedString
    }
    public static func decode(str: String) -> Array<UInt8> {
        var decodedBytes = Array<UInt8>()
        for b in str.utf8 {
            decodedBytes.append(b)
        }
        return decodedBytes
    }
}
If you want to show an emoji from a Unicode code-point string, just use the convertEmojiCodesToString method below. It works properly for strings like "U+1F52B" (emoji) or "U+1F1E6 U+1F1F1" (country flag emoji).
class EmojiConverter {
    static func convertEmojiCodesToString(_ emojiCodesString: String) -> String {
        let emojies = emojiCodesString.components(separatedBy: " ")
        var resultString = ""
        for emoji in emojies {
            var formattedCode = emoji
            // slice(from:to:) and length are custom String helpers (not shown here)
            // that drop the leading "U+" from each code.
            formattedCode.slice(from: 2, to: emoji.length)
            formattedCode = formattedCode.lowercased()
            if let charCode = UInt32(formattedCode, radix: 16),
               let unicode = UnicodeScalar(charCode) {
                let str = String(unicode)
                resultString += "\(str)"
            }
        }
        return resultString
    }
}
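Since slice(from:to:) and length above are custom String helpers that are not shown, here is a hedged variant of the same idea (the function name is my own) using only standard library and Foundation APIs:
import Foundation

func emojiString(fromCodes emojiCodesString: String) -> String {
    var result = ""
    for code in emojiCodesString.components(separatedBy: " ") {
        let hexPart = code.dropFirst(2) // drop the "U+" prefix
        if let value = UInt32(hexPart, radix: 16),
           let scalar = UnicodeScalar(value) {
            result.append(Character(scalar))
        }
    }
    return result
}

emojiString(fromCodes: "U+1F1E6 U+1F1F1") // "🇦🇱"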