Converting a String to UnsafeMutablePointer<UInt16> - swift

I'm trying to use a library written in C. I've imported the .a and .h files into an Xcode project and checked that everything works properly. I already have it working from Objective-C, and now I'm porting to Swift.
The problem I've run into is the functions' arguments. There's a function that requires an argument of type widechar (defined as typedef unsigned short int in the library), which is imported into Swift as UnsafeMutablePointer<UInt16>. The function translates the input and returns the result.
So I need to convert a String to UnsafeMutablePointer<UInt16>. I tried to find the right way to convert it, but I only found ways to convert to UnsafeMutablePointer<UInt8>. I couldn't find any answer or information about converting a String to UnsafeMutablePointer<UInt16>.
Here's the source code I've written.
extension String {
    var utf8CString: UnsafePointer<Int8> {
        return UnsafePointer((self as NSString).utf8String!)
    }
}

func translate(toBraille: String, withTable: String) -> [String]? {
    let filteredString = toBraille.onlyAlphabet
    let table = withTable.utf8CString
    var inputLength = CInt(filteredString.count)
    var outputLength = CInt(maxBufferSize)
    let inputValue = UnsafeMutablePointer<widechar>.allocate(capacity: Int(outputLength))
    let outputValue = UnsafeMutablePointer<widechar>.allocate(capacity: Int(outputLength))
    lou_translateString(table, inputValue, &inputLength, outputValue, &outputLength, nil, nil, 0)
    // This is the function that I need to call.
    let result: [String] = []
    return result
}

You have to create an array with the UTF-16 representation of the Swift
string that you can pass to the function, and on return create
a Swift string from the UTF-16 array result.
Let's assume for simplicity that the C function is imported into Swift as
func translateString(_ source: UnsafeMutablePointer<UInt16>, _ sourceLen: UnsafeMutablePointer<CInt>,
                     _ dest: UnsafeMutablePointer<UInt16>, _ destLen: UnsafeMutablePointer<CInt>)
Then the following should work (explanations inline):
// Create array with UTF-16 representation of source string:
let sourceString = "Hello world"
var sourceUTF16 = Array(sourceString.utf16)
var sourceLength = CInt(sourceUTF16.count)
// Allocate array for UTF-16 representation of destination string:
let maxBufferSize = 1000
var destUTF16 = Array<UInt16>(repeating: 0, count: maxBufferSize)
var destLength = CInt(destUTF16.count)
// Call translation function:
translateString(&sourceUTF16, &sourceLength, &destUTF16, &destLength)
// Create Swift string from UTF-16 representation in destination buffer:
let destString = String(utf16CodeUnits: destUTF16, count: Int(destLength))
I have assumed that the C function updates destLength to reflect
the actual length of the translated string on return.
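For the question's liblouis call specifically, the same pattern applies. Here is a hedged sketch, assuming the lou_translateString signature used in the question (table string, input/output buffers with in-out lengths, typeform, spacing, mode), that widechar is imported as UInt16, and that a nonzero return indicates success:
func translate(toBraille source: String, withTable table: String) -> String? {
    let maxBufferSize = 1000
    // UTF-16 code units of the input string:
    var input = Array(source.utf16)
    var inputLength = CInt(input.count)
    var output = [UInt16](repeating: 0, count: maxBufferSize)
    var outputLength = CInt(output.count)
    // The Swift String `table` bridges implicitly to the C string parameter.
    guard lou_translateString(table, &input, &inputLength,
                              &output, &outputLength, nil, nil, 0) != 0 else {
        return nil
    }
    return String(utf16CodeUnits: output, count: Int(outputLength))
}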

Related

Why does Swift return an unexpected pointer when converting an optional String into an UnsafePointer?

I noticed some unusual behaviour when working with a C library which took strings in as const char * (which is converted to Swift as UnsafePointer<Int8>!); passing a String worked as expected, but a String? seemed to corrupt the input. Consider the test I wrote:
func test(_ input: UnsafePointer<UInt8>?) {
    if let string = input {
        print(string[0], string[1], string[2], string[3], string[4], string[5])
    } else {
        print("nil")
    }
}
let input: String = "Hello"
test(input)
This works as expected, printing a null-terminated list of UTF-8 bytes for the input string: 72 101 108 108 111 0
However, if I change the input to an optional string, so that it becomes:
let input: String? = "Hello"
I get a completely different set of values in the result (176 39 78 23 1 0), even though I would expect it to be the same. Passing in nil works as expected.
The C library's function allows NULL in place of a string, and I sometimes want to pass that in from Swift as well, so it makes sense for the input to be an optional String.
Is this a bug in Swift, or was Swift not designed to handle this case? Either way, what's the best way to handle this case?
Edit
It appears to have something to do with multiple arguments. The C function:
void multiString(const char *arg0, const char *arg1, const char *arg2, const char *arg3) {
    printf("%p: %c %c %c\n", arg0, arg0[0], arg0[1], arg0[2]);
    printf("%p: %c %c %c\n", arg1, arg1[0], arg1[1], arg1[2]);
    printf("%p: %c %c %c\n", arg2, arg2[0], arg2[1], arg2[2]);
    printf("%p: %c %c %c\n", arg3, arg3[0], arg3[1], arg3[2]);
}
Swift:
let input0: String? = "Zero"
let input1: String? = "One"
let input2: String? = "Two"
let input3: String? = "Three"
multiString(input0, input1, input2, input3)
Results in:
0x101003170: T h r
0x101003170: T h r
0x101003170: T h r
0x101003170: T h r
It appears that there's a bug in how Swift handles multiple optional String arguments.
I didn't find anything definitive on whether this is intended behaviour or just a bug.
The pragmatic solution would probably be to use a proxy function like the one below, but you have probably done something similar already.
func proxy(_ str: String?, _ functionToProxy: (UnsafePointer<UInt8>?) -> ()) {
    if let str = str {
        functionToProxy(str)
    } else {
        functionToProxy(nil)
    }
}
proxy(input, test)
Did you test whether it worked in Swift 2? They changed something possibly related in Swift 3:
https://github.com/apple/swift-evolution/blob/master/proposals/0055-optional-unsafe-pointers.md
Just to be clear, there is a workaround until Apple fixes this. Unwrap your optional Strings before passing them and everything will work fine.
var anOptional: String?
var anotherOptional: String?

func mySwiftFunc() {
    let unwrappedA = anOptional!
    let unwrappedB = anotherOptional!
    myCStringFunc(unwrappedA, unwrappedB)
}
As mentioned in the comments, this is a clear bug in Swift.
Here's a workaround I'm using. If you can't trust Swift to convert the strings to pointers for you, then you've got to do it yourself.
Assuming C function defined as:
void multiString(const char *arg0, const char *arg1, const char *arg2);
Swift code:
func callCFunction(arg0: String?, arg1: String?, arg2: String?) {
    // The NSData objects keep the bytes alive for the duration of this function:
    let dArg0 = arg0?.data(using: .utf8) as NSData?
    let pArg0 = dArg0?.bytes.assumingMemoryBound(to: Int8.self)
    let dArg1 = arg1?.data(using: .utf8) as NSData?
    let pArg1 = dArg1?.bytes.assumingMemoryBound(to: Int8.self)
    let dArg2 = arg2?.data(using: .utf8) as NSData?
    let pArg2 = dArg2?.bytes.assumingMemoryBound(to: Int8.self)
    multiString(pArg0, pArg1, pArg2)
}
Warning:
Don't be tempted to put this in a function like:
/* DO NOT USE -- BAD CODE */
func ocstr(_ str: String?) -> UnsafePointer<Int8>? {
    guard let str = str else {
        return nil
    }
    let nsd = str.data(using: .utf8)! as NSData
    // This pointer is invalid on return:
    return nsd.bytes.assumingMemoryBound(to: Int8.self)
}
which would remove repeated code. This doesn't work because the data object nsd gets deallocated at the end of the function. The pointer is therefore not valid on return.
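An alternative that avoids manual lifetime management is withCString, which keeps the pointer valid for the duration of a closure. A sketch under that approach (the helper name withOptionalCString is illustrative; multiString is the C function above):
func withOptionalCString<R>(_ str: String?, _ body: (UnsafePointer<Int8>?) -> R) -> R {
    guard let str = str else { return body(nil) }
    return str.withCString { body($0) }
}

func callCFunction(arg0: String?, arg1: String?, arg2: String?) {
    withOptionalCString(arg0) { p0 in
        withOptionalCString(arg1) { p1 in
            withOptionalCString(arg2) { p2 in
                // All three pointers are valid for the duration of this call:
                multiString(p0, p1, p2)
            }
        }
    }
}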

Swift3: Proper way to convert string to null-terminated C-string

I am interfacing with libxml2 in Swift, and the C API bindings (still) produce UnsafePointer<Int8>! for C strings, whereas Swift APIs normally deal in UnsafePointer<UInt8>!.
So my question is: am I converting the String to a null-terminated C string the proper way?
let cfilePath = unsafeBitCast(myStringString.nulTerminatedUTF8.withUnsafeBufferPointer { $0.baseAddress }, to: UnsafePointer<Int8>.self)
Should I prefer some other method instead of just bypassing Swift type checking by reinterpreting UInt8 bytes as Int8 bytes?
I'm not sure this solves your problem exactly, but for a project where I am sending strings over Bluetooth, this did the trick:
extension String {
    var nullTerminated: Data? {
        if var data = self.data(using: String.Encoding.utf8) {
            data.append(0)
            return data
        }
        return nil
    }
}
Use it like this:
let data = "asfasf".nullTerminated
I can't find the function the other answers are referencing: nulTerminatedUTF8. Maybe it already does this.
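If the goal is to hand those bytes to a C API, a minimal usage sketch (someCFunction is a hypothetical C call taking const char *):
if let data = "asfasf".nullTerminated {
    data.withUnsafeBytes { (raw: UnsafeRawBufferPointer) in
        if let ptr = raw.bindMemory(to: Int8.self).baseAddress {
            // The pointer is only valid inside this closure:
            someCFunction(ptr)
        }
    }
}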
don't use unsafeBitCast for that!!
let cstr = "alpha".nulTerminatedUTF8
let int8arr = cstr.map{ Int8(bitPattern: $0) }
let uint8arr = Array(cstr)
print(int8arr.dynamicType, uint8arr.dynamicType)
// Array<Int8> Array<UInt8>
update
let uint8: UInt8 = 200
let int8 = Int8(bitPattern: uint8)
print(uint8, int8)
// 200 -56

Convert UInt8 Array to String

I have decrypted data using AES (CryptoSwift) and am left with a [UInt8] array. What's the best approach to convert the UInt8 array into an appropriate String? Casting the array only gives back a string that looks exactly like the array. (In Java, casting a byte array to String yields a new, readable string.)
I'm not sure if this is new to Swift 2, but at least the following works for me:
let chars: [UInt8] = [ 49, 50, 51 ]
var str = String(bytes: chars, encoding: NSUTF8StringEncoding)
In addition, if the array is formatted as a C string (trailing 0), these work:
str = String.fromCString(UnsafePointer(chars)) // UTF-8 is implicit
// or:
str = String(CString: UnsafePointer(chars), encoding: NSUTF8StringEncoding)
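On current Swift versions, the same conversions can be written as follows (a sketch; String(bytes:encoding:) comes from Foundation):
import Foundation

let chars: [UInt8] = [49, 50, 51]
let a = String(bytes: chars, encoding: .utf8)   // Optional("123")
let b = String(decoding: chars, as: UTF8.self)  // "123", never nil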
I don't know anything about CryptoSwift. But I can read the README:
For your convenience, CryptoSwift provides two functions to easily convert an array of bytes to NSData and the other way around:
let data = NSData.withBytes([0x01,0x02,0x03])
let bytes:[UInt8] = data.arrayOfBytes()
So my guess would be: call NSData.withBytes to get an NSData. Now you can presumably call NSString(data:encoding:) to get a string.
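Putting that together, a hedged sketch (NSData.withBytes is the older CryptoSwift helper quoted above):
let bytes: [UInt8] = [72, 101, 108, 108, 111]
let data = NSData.withBytes(bytes)
if let str = NSString(data: data as Data, encoding: String.Encoding.utf8.rawValue) {
    print(str) // Hello
}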
Swift 3.1
Try this:
let decData = NSData(bytes: enc, length: Int(enc.count))
let base64String = decData.base64EncodedString(options: .lineLength64Characters)
base64String is the string output.
Extensions allow you to easily modify the framework to fit your needs, essentially building your own version of Swift (my favorite part; I love to customize). Try this one out: put it at the end of your view controller and call it in viewDidLoad():
func stringToUInt8Extension() {
    var cache: [UInt8] = []
    for byte: UInt8 in 97..<97+26 {
        cache.append(byte)
        print(byte)
    }
    print("The letters of the alphabet are \(String(cache))")
}

extension String {
    init(_ bytes: [UInt8]) {
        self.init()
        for b in bytes {
            // Note: UnicodeScalar(b) interprets each byte as Latin-1, not UTF-8.
            self.append(UnicodeScalar(b))
        }
    }
}

User input in swift toInt() returning nil

I'm trying to create a simple user input, but the only function I have found in Swift is this one:
func input() -> String {
    let keyboard = NSFileHandle.fileHandleWithStandardInput()
    let inputData = keyboard.availableData
    return NSString(data: inputData, encoding: NSUTF8StringEncoding) as String!
}
Then I'm trying to convert input() to an Int for a mathematical operation (I'm using 1 as the input) with this:
var inputToInt = input().toInt()!
At this point I only get nil, and I don't know what to do.
Swift 2.0 has a function called readLine() - using it is a much better idea than rolling your own.
http://swiftdoc.org/swift-2/func/readLine/
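For example, a minimal sketch: readLine() strips the trailing newline by default, so the Int conversion succeeds:
if let line = readLine(), let number = Int(line) {
    print(number + 1) // prints 2 for input "1"
}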
Your conversion to Int fails because the string contains a trailing newline. You can use this to clean it up:
let s = NSString(data: inputData, encoding: NSUTF8StringEncoding) as String!
if s.hasSuffix("\n") {
    return s.substringToIndex(s.endIndex.predecessor())
} else {
    return s
}
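On newer Swift versions the same cleanup can be written with Foundation's trimming API (a sketch):
import Foundation

let raw = "1\n"
let cleaned = raw.trimmingCharacters(in: .newlines)
if let value = Int(cleaned) {
    print(value) // 1
}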

Swift: How to convert a String to UInt8 array?

How do you convert a String to UInt8 array?
var str = "test"
var ar : [UInt8]
ar = str
Lots of different ways, depending on how you want to handle non-ASCII characters.
But the simplest code would be to use the utf8 view:
let string = "hello"
let array: [UInt8] = Array(string.utf8)
Note, this will result in multi-byte characters being represented as multiple entries in the array, i.e.:
let string = "é"
print(Array(string.utf8))
prints out [195, 169]
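For the round trip, Swift 5's String(decoding:as:) turns the bytes back into a string (a sketch):
let decoded = String(decoding: [195, 169], as: UTF8.self)
print(decoded) // "é"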
There’s also .nulTerminatedUTF8, which does the same thing but adds a nul character to the end, in case your plan is to pass this somewhere as a C string (though if you’re doing that, you can probably also use .withCString or just use the implicit conversion for bridged C functions).
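In Swift 3 and later, the closest equivalent of nulTerminatedUTF8 is utf8CString, a null-terminated ContiguousArray<CChar> (a sketch):
let cString = "test".utf8CString
print(Array(cString)) // [116, 101, 115, 116, 0]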
let str = "test"
let byteArray = [UInt8](str.utf8)
Swift 4
func stringToUInt8Array() {
    let str: String = "Swift 4"
    let strToUInt8: [UInt8] = [UInt8](str.utf8)
    print(strToUInt8)
}
I came to this question looking for how to convert to a Int8 array. This is how I'm doing it, but surely there's a less loopy way:
Method on an Extension for String
public func int8Array() -> [Int8] {
    var retVal: [Int8] = []
    for thing in self.utf16 {
        retVal.append(Int8(thing))
    }
    return retVal
}
Note: storing a UTF-16 code unit (2 bytes) in an Int8 (1 byte) loses information; in fact, Int8(thing) will trap at runtime on any code unit above 127.
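A shorter alternative sketch that avoids the trap on non-ASCII input is to reinterpret each UTF-8 byte's bit pattern:
let int8Array = "test".utf8.map { Int8(bitPattern: $0) }
print(int8Array) // [116, 101, 115, 116]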