Calling map on NSMutableData in Swift

I'm seeing some strange behavior with NSMutableData that I can't explain. I have a method that converts a string to a null-terminated UTF-8 array of bytes. However, if I then use "data.map(...)" to print it out, the first byte is right and the rest look like random memory. What's weird is that if I make a copy with "let copy = data.copy() as! Data" and then use "copy.map(...)", it works just fine. I'm converting to NSMutableData instead of Data because that's the format the API I'm using takes.
Here's code to convert a string to a UTF-8 bytes array in an NSMutableData:
public func getUtf8Bytes(of str: String) -> NSMutableData {
    // Convert to a null-terminated UTF-8 NSMutableData
    let utf8CStringInts: [UInt8] = str.utf8CString.map { UInt8($0) }
    let count = utf8CStringInts.count
    let data = NSMutableData(length: count)!
    data.resetBytes(in: NSRange(location: 0, length: count))
    // Copy into NSMutableData
    let pointer = data.mutableBytes
    var index = 0
    for byte in utf8CStringInts {
        pointer.storeBytes(of: byte, toByteOffset: index, as: UInt8.self)
        index += 1
    }
    return data
}
The following will correctly print "UTF-8 Bytes: 0x31 0x32 0x33 0x00":
let utf8Data = getUtf8Bytes(of: "123")
let debugString = (utf8Data.copy() as! Data).map { String(format: "0x%02x ", $0) }.joined()
print("UTF-8 Bytes: " + debugString)
However, if I take out the copy as follows, it incorrectly prints "0x31 0x00 0x00 0x00":
let utf8Data = getUtf8Bytes(of: "123")
let debugString = utf8Data.map { String(format: "0x%02x ", $0) }.joined()
print("UTF-8 Bytes: " + debugString)
Can someone explain why the results are printed correctly after copying it to a Data?

Interesting... So after some sniffing around, here's what I found.
Copying the NSMutableData is not the solution; rather, bridging it to Data is. This will work as well:
let utf8Data = getUtf8Bytes(of: "123")
let debugString = (utf8Data as Data).map { String(format: "0x%02x ", $0) }.joined()
print("UTF-8 Bytes: \(debugString)")
But why? The problem appears to stem from NSData's conformance to DataProtocol (and, through it, its implicit conformance to the Collection protocol). It's this chain of conformances that permits the (mis)use of the generic Collection methods (e.g. subscript access, map, forEach, etc.), all of which appear to be "broken" here.
Furthermore, we can verify that the byte contents of the NSMutableData itself are correct:
print((0..<utf8Data.length)
    .map({ String(format: "0x%02x ", utf8Data.bytes.load(fromByteOffset: $0, as: UInt8.self)) })
    .joined())
// Prints "0x31 0x32 0x33 0x00"
Also, there's a swift-ier way to implement getUtf8Bytes(of:):
public func getUtf8Bytes(of str: String) -> NSMutableData {
    // Note: You may want to handle the force unwrapping here in a safer way...
    return NSMutableData(data: (str + "\0").data(using: .utf8)!)
}
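For a quick sanity check (my own sketch, assuming the getUtf8Bytes(of:) above), the shorter implementation produces the same null-terminated bytes once bridged to Data:
let checkData = getUtf8Bytes(of: "123") as Data
print(checkData.map { String(format: "0x%02x ", $0) }.joined())
// Prints "0x31 0x32 0x33 0x00", the same output as the manual version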

Related

How to translate Python HMAC Request into Swift

I've been at this for about 10 hours now, and no matter what HMAC combination I use in Swift, I cannot get it to match the signature generated by the Python code.
Python Code:
signature = hmac.new(secret.decode('hex'), msg=datastring, digestmod=hashlib.sha256).hexdigest()
Swift Code:
let key = SymmetricKey(data: self.secret.data(using: .utf8)!)
let hexData = HMAC<SHA256>.authenticationCode(for: datastring.data(using: .utf8)!, using: key)
let signature = Data(hexData).map { String(format: "%02hhx", $0) }.joined()
Any help with what I'm doing wrong (or missing) in Swift would be greatly appreciated.
Based on the assumption that self.secret is a String containing the hex representation of the secret key, the difference between the two comes down to your use of:
self.secret.data(using: .utf8)!
which simply converts the characters of the string to their UTF-8 bytes (e.g. the two characters "0a" become 0x30 0x61, not the single byte 0x0a), instead of converting each character pair into the corresponding byte, as:
secret.decode('hex')
does in Python 2.
From what I can tell, there isn't a function to do this conversion in the Swift standard library, but you could do it with something like:
func bytes(fromHex input: String) -> Data {
    var result = Data()
    var byte: UInt8 = 0
    for (index, character) in input.enumerated() {
        // Map the character's first UTF-8 code unit to its nibble value
        let codeUnit = character.utf8[character.utf8.startIndex]
        var nibble: UInt8 = 0
        switch codeUnit {
        case 0x30..<0x3a:   // "0"..."9"
            nibble = codeUnit - 0x30
        case 0x61..<0x67:   // "a"..."f" (note: lowercase hex digits only)
            nibble = codeUnit - 0x57
        default:
            break
        }
        if index % 2 == 0 {
            byte |= (nibble << 4)   // high nibble
        } else {
            byte |= nibble          // low nibble, the byte is now complete
            result.append(contentsOf: [byte])
            byte = 0
        }
    }
    return result
}
and then your code would become:
let key = SymmetricKey(data: bytes(fromHex: self.secret))
let hexData = HMAC<SHA256>.authenticationCode(for: datastring.data(using: .utf8)!, using: key)
let signature = Data(hexData).map { String(format: "%02hhx", $0) }.joined()
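As a quick check of the helper (a sketch with a made-up hex string, not from the original question), each character pair should come back as a single byte:
let testHex = "0a1bff"   // hypothetical hex-encoded secret
let decoded = bytes(fromHex: testHex)
print(decoded.map { String(format: "%02hhx", $0) }.joined())
// Prints "0a1bff": three bytes (0x0a, 0x1b, 0xff), not six ASCII characters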

Decode NSData to String Array

I want to decode my NSData to a String array. This is the code I have right now:
func nsDataToStringArray(data: NSData) -> [String] {
    var decodedStrings = [String]()
    var stringTerminatorPositions = [Int]()
    var currentPosition = 0
    data.enumerateBytes() { buffer, range, stop in
        let bytes = UnsafePointer<UInt8>(buffer)
        for i in 0 ..< range.length {
            if bytes[i] == 0 {
                stringTerminatorPositions.append(currentPosition)
            }
            currentPosition += 1
        }
    }
    var stringStartPosition = 0
    for stringTerminatorPosition in stringTerminatorPositions {
        let encodedString = data.subdata(with: NSMakeRange(stringStartPosition, stringTerminatorPosition - stringStartPosition))
        let decodedString = NSString(data: encodedString, encoding: String.Encoding.utf8.rawValue)! as String
        decodedStrings.append(decodedString)
        stringStartPosition = stringTerminatorPosition + 1
    }
    return decodedStrings
}
But I get an error on this line: let bytes = UnsafePointer<UInt8>(buffer)
Cannot invoke initializer for type 'UnsafePointer' with an
argument list of type '(UnsafeRawPointer)'
Do I need to convert the buffer to a UnsafePointer? If so, how can I do that?
buffer in the enumerateBytes() closure is an UnsafeRawPointer,
and you have to "rebind" it to a UInt8 pointer in Swift 3:
// let bytes = UnsafePointer<UInt8>(buffer)
let bytes = buffer.assumingMemoryBound(to: UInt8.self)
But why so complicated? You can achieve the same result with
func nsDataToStringArray(nsData: NSData) -> [String] {
    let data = nsData as Data
    return data.split(separator: 0).flatMap { String(bytes: $0, encoding: .utf8) }
}
How does this work?
Data is a Sequence of UInt8, therefore
split(separator: 0) can be called on it, returning an array of
"data slices" (which are views into the source data, not copies).
Each "data slice" is again a Sequence of UInt8, from which a
String can be created with String(bytes: $0, encoding: .utf8).
This is a failable initializer (because the data may be invalid UTF-8).
flatMap { ... } returns an array with all non-nil results,
i.e. an array with all strings which could be created from
valid UTF-8 code sequences between zero bytes.
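A quick usage sketch (with made-up input) of the short version:
// Two null-terminated strings packed into one NSData
let packedBytes: [UInt8] = Array("foo".utf8) + [0] + Array("bar".utf8) + [0]
let packedData = NSData(bytes: packedBytes, length: packedBytes.count)
print(nsDataToStringArray(nsData: packedData))
// Prints ["foo", "bar"]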

Convert hex-encoded String to String

I want to convert the following hex-encoded String in Swift 3:
dcb04a9e103a5cd8b53763051cef09bc66abe029fdebae5e1d417e2ffc2a07a4
to its equivalent String:
Ü°J:\ص7cï ¼f«à)ýë®^A~/ü*¤
The following websites do the job just fine:
http://codebeautify.org/hex-string-converter
http://string-functions.com/hex-string.aspx
But I am unable to do the same in Swift 3. The following code doesn't do the job either:
func convertHexStringToNormalString(hexString: String) -> String! {
    if let data = hexString.data(using: .utf8) {
        return String(data: data, encoding: .utf8)
    } else {
        return nil
    }
}
Your code doesn't do what you think it does. This line:
if let data = hexString.data(using: .utf8){
means "encode these characters into UTF-8." That means that "01" doesn't encode to 0x01 (1), it encodes to 0x30 0x31 ("0" "1"). There's no "hex" in there anywhere.
This line:
return String.init(data:data, encoding: .utf8)
just takes the encoded UTF-8 data, interprets it as UTF-8, and returns it. These two methods are symmetrical, so you should expect this whole function to return whatever it was handed.
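To see the symmetry concretely, here is a minimal sketch:
let encoded = "01".data(using: .utf8)!   // the bytes 0x30 0x31, not 0x01
print(encoded.map { String(format: "0x%02x", $0) }.joined(separator: " "))
// Prints "0x30 0x31"
print(String(data: encoded, encoding: .utf8)!)
// Prints "01", i.e. you just get the original string back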
Pulling together Martin and Larme's comments into one place here. This appears to be encoded in Latin-1. (This is a really awkward way to encode this data, but if it's what you're looking for, I think that's the encoding.)
import Foundation

extension Data {
    // From http://stackoverflow.com/a/40278391:
    init?(fromHexEncodedString string: String) {
        // Convert 0 ... 9, a ... f, A ... F to their decimal value,
        // return nil for all other input characters
        func decodeNibble(u: UInt16) -> UInt8? {
            switch(u) {
            case 0x30 ... 0x39:
                return UInt8(u - 0x30)
            case 0x41 ... 0x46:
                return UInt8(u - 0x41 + 10)
            case 0x61 ... 0x66:
                return UInt8(u - 0x61 + 10)
            default:
                return nil
            }
        }

        self.init(capacity: string.utf16.count / 2)
        var even = true
        var byte: UInt8 = 0
        for c in string.utf16 {
            guard let val = decodeNibble(u: c) else { return nil }
            if even {
                byte = val << 4
            } else {
                byte += val
                self.append(byte)
            }
            even = !even
        }
        guard even else { return nil }
    }
}
let d = Data(fromHexEncodedString: "dcb04a9e103a5cd8b53763051cef09bc66abe029fdebae5e1d417e2ffc2a07a4")!
let s = String(data: d, encoding: .isoLatin1)
You want to use the hex encoded data as an AES key, but the
data is not a valid UTF-8 sequence. You could interpret
it as a string in ISO Latin encoding, but the AES(key: String, ...)
initializer converts the string back to its UTF-8 representation,
i.e. you'll get different key data from what you started with.
Therefore you should not convert it to a string at all. Use the
extension Data {
    init?(fromHexEncodedString string: String)
}
method from hex/binary string conversion in Swift
to convert the hex encoded string to Data and then pass that
as an array to the AES(key: Array<UInt8>, ...) initializer:
let hexkey = "dcb04a9e103a5cd8b53763051cef09bc66abe029fdebae5e1d417e2ffc2a07a4"
let key = Array(Data(fromHexEncodedString: hexkey)!)
let encrypted = try AES(key: key, ....)
There is still a way to convert the key from hex to a readable string by adding the extension below:
extension String {
    func hexToString() -> String {
        var finalString = ""
        let chars = Array(self)
        for count in stride(from: 0, to: chars.count - 1, by: 2) {
            let firstDigit = Int.init("\(chars[count])", radix: 16) ?? 0
            let lastDigit = Int.init("\(chars[count + 1])", radix: 16) ?? 0
            let decimal = firstDigit * 16 + lastDigit
            let decimalString = String(format: "%c", decimal) as String
            finalString.append(Character.init(decimalString))
        }
        return finalString
    }

    func base64Decoded() -> String? {
        guard let data = Data(base64Encoded: self) else { return nil }
        return String(data: data, encoding: .init(rawValue: 0))
    }
}
Example of use:
let hexToString = secretKey.hexToString()
let base64ReadableKey = hexToString.base64Decoded() ?? ""

Encoding and Decoding Strings using UnsafeMutablePointer in Swift

I'm having trouble converting strings to and from UnsafeMutablePointers. The following code doesn't work, returning the wrong string.
// func rfcommChannelData(rfcommChannel: IOBluetoothRFCOMMChannel!, data dataPointer: UnsafeMutablePointer<Void>, length dataLength: Int)
func receivingData(data dataPointer: UnsafeMutablePointer<Void>, length dataLength: Int) {
    let data = NSData(bytes: dataPointer, length: dataLength)
    println("str = \(NSString(data: data, encoding: NSASCIIStringEncoding))")
}

// - (IOReturn)writeSync:(void *)data length:(UInt16)length;
func sendingData(data: UnsafeMutablePointer<Void>, length: UInt16) {
    receivingData(data: data, length: Int(length))
}

var str: NSString = "Hello, playground"
var data = str.dataUsingEncoding(NSASCIIStringEncoding)!
var bytes = data.bytes
sendingData(&bytes, UInt16(data.length))
A link to the playground file is here. If anyone has experience using UnsafeMutablePointers in Swift for strings, I would very much appreciate some guidance as I have made no progress in the last few days. Thanks again!
With
var bytes = data.bytes
sendingData(&bytes, UInt16(data.length))
you pass the address of the variable bytes itself to the function, so what you see is the bytes that are used to represent that pointer.
What you probably want is
let str = "Hello, playground"
let data = str.dataUsingEncoding(NSASCIIStringEncoding)!
sendingData(UnsafeMutablePointer(data.bytes), UInt16(data.length))
to pass the pointer to the data bytes.
You should also consider using NSUTF8StringEncoding instead, because a conversion to NSASCIIStringEncoding can fail.
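For reference, a minimal sketch of the same round trip in current Swift (assuming the receiving side keeps the pointer-plus-length shape from the question); Data.withUnsafeBytes exposes a valid pointer for the duration of the closure:
import Foundation

func receivingData(data pointer: UnsafeRawPointer, length: Int) {
    let data = Data(bytes: pointer, count: length)
    print("str = \(String(data: data, encoding: .utf8) ?? "<invalid UTF-8>")")
}

let str = "Hello, playground"
let strData = str.data(using: .utf8)!
strData.withUnsafeBytes { (buffer: UnsafeRawBufferPointer) in
    receivingData(data: buffer.baseAddress!, length: buffer.count)
}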

Swift - converting from UnsafePointer<UInt8> with length to String

I've looked at a lot of similar questions, but I still can't get the compiler to accept this.
Socket Mobile API (in Objective-C) passes ISktScanDecodedData into a delegate method in Swift (the data may be binary, which I suppose is why it's not provided as string):
func onDecodedData(device: DeviceInfo?, DecodedData d: ISktScanDecodedData?) {
    let symbology: String = d!.Name()
    let rawData: UnsafePointer<UInt8> = d!.getData()
    let rawDataSize: UInt32 = d!.getDataSize()
    // want a String (UTF8 is OK) or Swifty byte array...
}
In C#, this code converts the raw data into a string:
string s = Marshal.PtrToStringAuto(d.GetData(), d.GetDataSize());
In Swift, I can get as far as UnsafeArray, but then I'm stuck:
let rawArray = UnsafeArray<UInt8>(start: rawData, length: Int(rawDataSize))
Alternatively I see String.fromCString and NSString.stringWithCharacters, but neither will accept the types of arguments at hand. If I could convert from UnsafePointer<UInt8> to UnsafePointer<()>, for example, then this would be available (though I'm not sure if it would even be safe):
NSData(bytesNoCopy: UnsafePointer<()>, length: Int, freeWhenDone: Bool)
Is there an obvious way to get a string out of all this?
This should work:
let data = NSData(bytes: rawData, length: Int(rawDataSize))
let str = String(data: data, encoding: NSUTF8StringEncoding)
Update for Swift 3:
let data = Data(bytes: rawData, count: Int(rawDataSize))
let str = String(data: data, encoding: String.Encoding.utf8)
The resulting string is nil if the data does not represent
a valid UTF-8 sequence.
How about this, 'pure' Swift 2.2 instead of using NSData:
public extension String {
    static func fromCString(cs: UnsafePointer<CChar>, length: Int!) -> String? {
        if length == .None { // no length given, use \0 standard variant
            return String.fromCString(cs)
        }
        let buflen = length + 1
        var buf = UnsafeMutablePointer<CChar>.alloc(buflen)
        memcpy(buf, cs, length)
        buf[length] = 0 // zero terminate
        let s = String.fromCString(buf)
        buf.dealloc(buflen)
        return s
    }
}
and Swift 3:
public extension String {
    static func fromCString(cs: UnsafePointer<CChar>, length: Int!) -> String? {
        if length == nil { // no length given, use \0 standard variant
            return String(cString: cs)
        }
        let buflen = length + 1
        let buf = UnsafeMutablePointer<CChar>.allocate(capacity: buflen)
        memcpy(buf, cs, length)
        buf[length] = 0 // zero terminate
        let s = String(cString: buf)
        buf.deallocate(capacity: buflen)
        return s
    }
}
Admittedly it's a bit stupid to alloc a buffer and copy the data just to add the zero terminator.
Obviously, as mentioned by Zaph, you need to make sure your assumptions about the string encoding are going to be right.
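If allocating and copying just to add a terminator bothers you, one possible alternative (a sketch reusing rawData and rawDataSize from the question) is to decode the buffer directly with the standard library's String(decoding:as:), which never fails and replaces invalid UTF-8 with U+FFFD instead:
let buffer = UnsafeBufferPointer(start: rawData, count: Int(rawDataSize))
let str = String(decoding: buffer, as: UTF8.self)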