Write a String init in Swift

In my code, an [Int] array of size 3 has a special meaning. I want to get its string representation.
The most idiomatic approach in Swift seems to me to be writing a new String initializer.
Something like this:
extension String {
public init(point: [Int]) {
assert(condition: point.count == 3)
let r = "x=\(point[0]) y=\(point[1]) z=(point[2])"
self.init(stringLiteral: r) // what should I write here ?? This feels clumsy ?
}
}
What should go at the end of this init ? I can't assign to self, and there's no other obvious init that I should call.

First of all, there is a backslash missing in the string interpolation line: z=(point[2]) should be z=\(point[2]).
Just call self.init with r as the parameter. Technically it's a convenience (delegating) initializer.
extension String {
public init(point: [Int]) {
assert(point.count == 3)
let r = "x=\(point[0]) y=\(point[1]) z=\(point[2])"
self.init(r)
}
}
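With that in place, a quick usage check (the output follows from the format string above):
let p = String(point: [1, 2, 3])
print(p) // x=1 y=2 z=3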


Why aren't these two ways of expressing a map function equivalent?

I got a surprise today while looking at another SO question:
let s = "1,a"
let arr = s.split(separator: ",")
let result = arr.compactMap{Int($0)} // ok
let result2 = arr.compactMap(Int.init) // error
Why is line 3 legal but line 4 is not? I would have thought these two ways of saying "coerce the incoming parameter to Int if possible" would be completely equivalent.
I understand that line 4 is choking on the Subsequence, and I see how to get out of the difficulty:
let result2 = arr.map(String.init).compactMap(Int.init) // ok
What I don't understand is why they both don't choke in the same way.
Looks like the Int.init overload that accepts a Substring has the following signature:
public init?<S>(_ text: S, radix: Int = 10) where S : StringProtocol
So, Int($0) works because it uses the default radix, but there isn't an Int.init(_:) that accepts a Substring - there's only Int.init(_:radix:) that does - and so it fails.
But if there was one:
extension Int {
public init?<S>(_ text: S) where S : StringProtocol {
self.init(text, radix: 10)
}
}
then this would work:
let result1 = arr.compactMap(Int.init)
In fact the first version (Int($0)) calls this initializer, which has two parameters (one of them has a default value):
@inlinable public init?<S>(_ text: S, radix: Int = 10) where S : StringProtocol
If I define a custom single-parameter initializer like so, then the second example works too.
extension Int {
init?<S>(_ string: S) where S: StringProtocol {
self.init(string, radix: 10) // delegate to the two-parameter initializer
}
}
let result2 = arr.compactMap(Int.init)
It seems to me that when I write Int.init in the compactMap, the reference can only match an initializer (or function) with that exact shape; the default value of the second parameter cannot be applied through the reference.
Another example:
func test1<S>(param1: S) -> String where S: StringProtocol {
return ""
}
func test2<S>(param1: S, defaultParam: String = "") -> String where S: StringProtocol {
return ""
}
extension Sequence {
func customCompactMap<ElementOfResult>(_ transform: (Element) -> ElementOfResult?) -> [ElementOfResult] {
compactMap(transform)
}
}
arr.customCompactMap(test1)
arr.customCompactMap(test2) // error
I think function references cannot carry default values. Unfortunately I couldn't find any official reference for this, but it seems interesting.
As proof, one last example:
func test3(param1: String, defaultParam: String = "") { }
let functionReference = test3
functionReference("", "")
functionReference("") // error
Here the functionReference's type is (String, String) -> (), even though the test3 function has a default value for the second parameter. As you can see functionReference cannot be called with only one value.
I tried to find the Swift forum post where someone on the core team explained this, but I couldn't locate it; you can ask there for clarification on this point.
Default arguments don't actually produce overloads.
Instead, using default arguments at the call site is syntactic sugar for supplying all arguments. The compiler inserts the defaults for the ones you don't write.
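In other words (a minimal sketch of the desugaring; greet is a made-up function, and this is not literal compiler output):
func greet(name: String = "world") { print("Hello, \(name)!") }
greet()              // what you write
greet(name: "world") // what the compiler effectively emits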
A few consequences of that…
You cannot use functions with default arguments as closures with simplified signatures. You have to wrap them in new closures, as you demonstrated in your question.
func ƒ(_: Int = 0) { }
let intToVoid: (Int) -> Void = ƒ // compiles
// Cannot convert value of type '(Int) -> ()' to specified type '() -> Void'
let voidToVoid: () -> Void = ƒ
Methods with different default-argument patterns that look the same at the call site are not considered overrides.
class Base {
func ƒ(_: Any? = nil) -> String { "Base" }
}
final class Derived: Base {
// No `override` required.
func ƒ() -> String { "Derived" }
}
Base().ƒ() // "Base"
Derived().ƒ() // "Derived"
(Derived().ƒ as (Any?) -> String)("argument") // "Base"
Default arguments do not allow for satisfaction of protocol requirements.
protocol Protocol {
func ƒ() -> String
}
// Type 'Base' does not conform to protocol 'Protocol'
extension Base: Protocol { }
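To satisfy the requirement you would have to add an explicit zero-argument overload yourself. A sketch:
extension Base: Protocol {
func ƒ() -> String { ƒ(nil) } // forwards to ƒ(_: Any?)
}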

Calling a method on an Optional Double to convert to String

I've written this in order to easily format floating point numbers as strings with various levels of precision.
import Foundation // String(format:) comes from Foundation

extension FloatingPoint {
func str(_ precision: Int) -> String {
return String(format: "%." + String(precision) + "f", self as! CVarArg)
}
}
It works great for non-optional variables with floating point types:
var myDouble: Double = 3.1415
var text = myDouble.str(2) // sets text = "3.14"
Is there a way of getting something like this to work for an optional Double?
var myNilDouble: Double? = nil
var text = myNilDouble.str(2) // I'd like it to set text = ""
I'd like the implementation to support nil and non-nil conversion to string.
You're not calling the method on a Double, you're calling it on an Optional<Double>, which is a completely different type. So you need the method to exist on Optional:
extension Optional where Wrapped : FloatingPoint {
func string(usingPrecision precision: Int) -> String {
guard let double = self else { return "" }
return double.str(precision)
}
}
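Usage, assuming the str(_:) method from the question is still defined:
let myNilDouble: Double? = nil
myNilDouble.string(usingPrecision: 2) // ""
let pi: Double? = 3.14159
pi.string(usingPrecision: 2) // "3.14"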
I don't like this interface, though; I don't think it's Optional's job to decide to emit an empty string. Instead I would suggest an initializer on String itself that takes an Optional floating point:
import Foundation
extension String {
init<N : FloatingPoint>(_ value: N?, precision: Int) {
guard let value = value else { self.init(); return }
self.init(format: "%.\(precision)f", value as! CVarArg)
}
}
Now you write
let value: Double? = nil
let text = String(value, precision: 2) // ""
Another option is a free function, again taking an Optional. (You can also of course make it the caller's choice as to what nil resolves to, as Sulthan said.)
The simplest solution would be to use optional chaining and nil coalescing:
var text = myNilDouble?.str(2) ?? ""
Although you might end up with many repetitions of this pattern, the advantage is that you have better control over the nil scenario; in some situations you may want to use "(null)" as the default value.
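For example (labelText is just an illustrative name):
let labelText = myNilDouble?.str(2) ?? "(null)" // "(null)" when nil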

A swiftier way to convert String to UnsafePointer<xmlChar> in Swift 3 (libxml2)

I'm working on a Swift 3 wrapper for the libxml2 C-library.
There are two convenience methods to convert String to UnsafePointer<xmlChar> and vice versa. In libxml2 xmlChar is declared as unsigned char.
UnsafePointer<xmlChar> to String is uncomplicated
func stringFrom(xmlchar: UnsafePointer<xmlChar>) -> String {
let string = xmlchar.withMemoryRebound(to: CChar.self, capacity: 1) {
return String(validatingUTF8: $0)
}
return string ?? ""
}
For String to UnsafePointer<xmlChar> I tried many things, for example
let bytes = string.utf8CString.map{ xmlChar($0) }
return UnsafePointer<xmlChar>(bytes)
but this doesn't work, the only working solution I figured out is
func xmlCharFrom(string: String) -> UnsafePointer<xmlChar> {
let pointer = (string as NSString).utf8String
return unsafeBitCast(pointer, to: UnsafePointer<xmlChar>.self)
}
Is there a better, swiftier way without the bridge cast to NSString and unsafeBitCast?
Swiftiest way I can think of is to just use the bitPattern: initializer:
let xmlstr = str.utf8CString.map { xmlChar(bitPattern: $0) }
This will give you an Array of xmlChars. Hang onto that, and use Array's withUnsafeBufferPointer method when you need to pass an UnsafePointer to something:
xmlstr.withUnsafeBufferPointer { someAPIThatWantsAPointer($0.baseAddress!) }
Don't let the UnsafePointer escape from the closure, as it won't be valid outside it.
EDIT: How's this for a compromise? Instead of having your function return a pointer, have it take a closure.
func withXmlString<T>(from string: String, handler: (UnsafePointer<xmlChar>) throws -> T) rethrows -> T {
let xmlstr = string.utf8CString.map { xmlChar(bitPattern: $0) }
return try xmlstr.withUnsafeBufferPointer { try handler($0.baseAddress!) }
}
Or, as an extension on String:
extension String {
func withXmlString<T>(handler: (UnsafePointer<xmlChar>) throws -> T) rethrows -> T {
let xmlstr = self.utf8CString.map { xmlChar(bitPattern: $0) }
return try xmlstr.withUnsafeBufferPointer { try handler($0.baseAddress!) }
}
}
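Usage might then look like this (assuming the libxml2 module is imported; xmlParseDoc is one of the libxml2 functions that takes an UnsafePointer<xmlChar>):
let doc = "<root/>".withXmlString { xmlParseDoc($0) }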
I'm working on a Swift 3 wrapper for the libxml2 C-library.
Condolences.
[...] String to UnsafePointer [is complicated]
Agree. It is complicated because it is unclear who owns the xmlChar array.
[...] the only working solution I figured out is
let pointer = (string as NSString).utf8String
This works because of the ownership semantics of -[NSString utf8String]:
Apple docs:
This C string is a pointer to a structure inside the string object, which may have a lifetime shorter than the string object and will certainly not have a longer lifetime.
So the lifetime is probably something like the current autorelease pool or even shorter, depending on the compiler's ARC optimisations and the implementation of utf8String. Definitely not safe to keep around.
Is there a better, swiftier way [...]?
Well, that depends on the use case. There's no way to handle this without thinking about the ownership of the created xmlChar buffer.
It should be clear from the API how the functions are using the passed string (even though I know that libxml2's documentation is terrible).
For situations where a string is just used during a function call it might be nice to have a scoped access function:
extension String {
func withXmlChar(block: (UnsafePointer<xmlChar>) -> ()) { ... }
}
If the function keeps the pointer around, you must guarantee the lifetime of the pointee, probably with something like a container object that keeps the data and pointer around for some ARC-maintained lifetime...
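A minimal sketch of such a container in current Swift (the name XmlCharBuffer is made up):
/// Owns a NUL-terminated xmlChar buffer; `pointer` stays valid
/// for as long as ARC keeps this object alive.
final class XmlCharBuffer {
let pointer: UnsafeMutablePointer<xmlChar>
private let count: Int
init(_ string: String) {
let bytes = Array(string.utf8) + [0] // trailing NUL terminator
count = bytes.count
pointer = UnsafeMutablePointer<xmlChar>.allocate(capacity: count)
pointer.initialize(from: bytes, count: count)
}
deinit {
pointer.deinitialize(count: count)
pointer.deallocate()
}
}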
It might be worthwhile to go through one of Mike Ash's articles about managing the ownership of objects beyond ARC.
String has a
public init(cString: UnsafePointer<UInt8>)
initializer, therefore the conversion from an XML string to a Swift string can be simplified to
let xmlString: UnsafePointer<xmlChar> = ...
let s = String(cString: xmlString)
Ill-formed UTF-8 sequences are replaced by the Unicode replacement character U+FFFD.
For the conversion from a Swift string to an XML string I would suggest a similar approach to Charles Srstka's, but using the existing String.withCString method instead of creating an intermediate array. withCString provides an UnsafePointer<CChar> (i.e. Int8), which is rebound to UInt8 for the xmlChar API:
extension String {
func withXmlString<T>(handler: (UnsafePointer<xmlChar>) throws -> T) rethrows -> T {
return try self.withCString { try handler(UnsafeRawPointer($0).assumingMemoryBound(to: UInt8.self)) }
}
}
If the throwing option is not needed, it simplifies to
extension String {
func withXmlString<T>(handler: (UnsafePointer<xmlChar>) -> T) -> T {
return self.withCString { handler(UnsafeRawPointer($0).assumingMemoryBound(to: UInt8.self)) }
}
}

Extension for Double to accept String as initializer in Swift

Is there a way in Swift to define an extension for the Double type that accepts a String in an initializer? In a nutshell, just to figure out feasibility, I need this to work:
var double:Double = "one"
println(double) // Outputs "1.0"
I am guessing it should be made to conform to StringLiteralConvertible, but I am not sure about the details.
So, you want to natural-language-parse a string, and generate a floating-point number from it?
Well, the extension is the easy part. Just create a failable initializer for it:
let digits = [
"zero", "one", "two", "three",
"four", "five", "six", "seven",
"eight", "nine",
]
extension Double {
init?(fromEnglishString s: String) {
if let digit = find(digits, s) {
self.init(Double(digit))
}
else {
return nil
}
}
}
let d = Double(fromEnglishString: "one")
// d is {Some 1.0}
The hard part is going to be finding a good parser for all the ways you can express numbers in English (especially floating-point numbers). That's much more tricky. You might find this more language-agnostic answer interesting.
You could also write a StringLiteralConvertible extension for it. However, this is only for when you are initializing your value directly from a string literal at compile time, which would be a bit pointless: do you really need word-based number literals in your source code? The other problem is that literal-convertible initializers can't be failable, so you'll be stuck with returning a default value (maybe NaN?) if the string can't be parsed.
Nevertheless, if you really want one:
extension Double: StringLiteralConvertible {
public typealias StringLiteralType = String
public typealias UnicodeScalarLiteralType = String
public typealias ExtendedGraphemeClusterLiteralType = String
public init(unicodeScalarLiteral value: UnicodeScalarLiteralType) {
self.init(stringLiteral: value)
}
public init(extendedGraphemeClusterLiteral value: ExtendedGraphemeClusterLiteralType) {
self.init(stringLiteral: value)
}
public init(stringLiteral value: String) {
if let d = Double(fromEnglishString: value) {
self = d
} else {
self = 0.0
}
}
}
let doubleFromLiteral: Double = "three"
// doubleFromLiteral is 3.0
If you want to do exactly what your code example does, write an extension that implements the StringLiteralConvertible protocol. There's a decent write-up on the literal convertibles at NSHipster.
It'd probably look something like this:
extension Double: StringLiteralConvertible {
public init(stringLiteral value: String) {
if value == "one" {
self = 1.0
/*
Add more English-to-number conversion magic here
*/
} else {
self = Double.NaN // literal inits can't fail, so fall back to a sentinel
}
}
}
There's a bit more to it than that: StringLiteralConvertible extends a couple of other protocols whose requirements you have to meet (see the previous answer), and then there's the whole business of translating English to numbers. You may have a feasibility problem there, but making Doubles from strings is technically possible.
On top of all that, there are more questions as to whether this is a good idea.
Literal initializers can't fail, so you have to return a sentinel value for strings you can't parse a number from. Not very swifty.
Do you actually want to convert string literals, or strings passed in at runtime? The former doesn't seem super useful. The latter requires different syntax at the call site, but lets you make clear that you're defining/using a conversion function.

Swift: how can String.join() work with custom types?

for example:
var a = [1, 2, 3] // Ints
var s = ",".join(a) // EXC_BAD_ACCESS
Is it possible to make the join function return "1,2,3"?
Extend Int (or other custom types) to conform to some protocols?
As of Xcode 7.0 beta 6, in Swift 2 you should use [String].joinWithSeparator(",").
In your case you still need to convert the Ints to Strings, which is why I added map().
var a = [1, 2, 3] // [1, 2, 3]
var s2 = a.map { String($0) }.joinWithSeparator(",") // "1,2,3"
From Xcode 8.0 beta 1, in Swift 3 the code changes slightly to [String].joined(separator: ",").
var s3 = a.map { String($0) }.joined(separator: ",") // "1,2,3"
Try this:
var a = [1, 2, 3] // Ints
var s = ",".join(a.map { $0.description })
or add this extension
extension String {
func join<S : SequenceType where S.Generator.Element : Printable>(elements: S) -> String {
return self.join(map(elements){ $0.description })
}
// use this if you don't want it constrain to Printable
//func join<S : SequenceType>(elements: S) -> String {
// return self.join(map(elements){ "\($0)" })
//}
}
var a = [1, 2, 3] // Ints
var s = ",".join(a) // works with new overload of join
join is defined as
extension String {
func join<S : SequenceType where String == S.Generator.Element>(elements: S) -> String
}
which means it takes a sequence of Strings; you can't pass a sequence of Ints to it.
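So you have to map the elements to String first, as in the other answers:
var s = ",".join(a.map { "\($0)" }) // "1,2,3"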
And just to make your life more complete: starting from Xcode 8.0 beta 1, in Swift 3 you should now use [String].joined(separator: ",").
This is the new "ed/ing" naming rule for Swift APIs:
Name functions and methods according to their side-effects
Those without side-effects should read as noun phrases, e.g. x.distance(to: y), i.successor().
Those with side-effects should read as imperative verb phrases, e.g., print(x), x.sort(), x.append(y).
Name mutating/nonmutating method pairs consistently. A mutating method will often have a nonmutating variant with similar semantics, but that returns a new value rather than updating an instance in-place.
Swift: API Design Guidelines
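The standard library's sort methods are the canonical example of this pair:
var nums = [3, 1, 2]
let sortedCopy = nums.sorted() // nonmutating: returns a new sorted array
nums.sort()                    // mutating, imperative: sorts in place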
The simplest way is a variation of @BryanChen's answer:
",".join(a.map { String($0) } )
Even if you can't make join work for custom types, there's an easy workaround.
All you have to do is define a method on your class (or extend a built-in class) to return a string, and then map that into the join.
So, for example, we could have:
extension Int {
func toString() -> String {
return "\(self)" // trivial example here, but yours could be more complex
}
}
Then you can do:
let xs = [1, 2, 3]
let s = join(",", xs.map { $0.toString() })
I wouldn't recommend using .description for this purpose, as by default it will call .debugDescription, which is not particularly useful in production code.
In any case, it would be better to provide an explicit method for transforming into a string suitable for joining, rather than relying on a generic 'description' method which you may change at a later date.
A Swift 3 solution
public extension Sequence where Iterator.Element: CustomStringConvertible {
func joined(separator: String) -> String {
return self.map({ (val) -> String in
"\(val)"
}).joined(separator: separator)
}
}
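With this extension in place, the original example works without an explicit map:
let a = [1, 2, 3]
let s = a.joined(separator: ",") // "1,2,3"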