Array to String in Swift, stringInterpolationSegment

I am trying to convert an array of enums to a string in Swift. My enum is Printable and has a description property.
I thought this would work:
", ".join(a.map { String($0) })
but the compiler complains
Missing argument label 'stringInterpolationSegment:' in call
So, I follow the suggestion,
", ".join(a.map { String(stringInterpolationSegment: $0) })
But I do not understand:
Why is the argument label needed?
What is the type of stringInterpolationSegment?

You can't call a String initializer with your enum type because there isn't an initializer that takes that type.
There are a number of initializers for String that have the stringInterpolationSegment argument and they each implement it for a different type. The types include Bool, Float, Int, and Character among others. When all else fails, there is a generic fallback:
/// Create an instance containing `expr`'s `print` representation
init<T>(stringInterpolationSegment expr: T)
This is the version that is being called for your enum since it isn't one of the supported types.
Note, you can also do the following which is more succinct:
", ".join(a.map { toString($0) })
and you can skip the closure expression (thanks for pointing that out, @Airspeed Velocity):
", ".join(a.map(toString))

As @vacawama points out, the error message is a bit of a red herring, and you can use map with toString to do the conversion.
What's nice is that if you've already implemented Printable, the array's own Printable implementation will use it too, so you can simply write toString(a) to get similar output.
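For context, here is a minimal end-to-end sketch under the same assumptions (the Suit enum is made up for illustration; join and toString are the Swift 1.x APIs used in the question):
enum Suit: Printable {
    case Hearts, Spades
    // Printable (the predecessor of CustomStringConvertible) requires this property
    var description: String {
        switch self {
        case .Hearts: return "Hearts"
        case .Spades: return "Spades"
        }
    }
}

let a: [Suit] = [.Hearts, .Spades]
", ".join(a.map(toString))   // "Hearts, Spades"
toString(a)                  // something like "[Hearts, Spades]", via the array's Printable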

Related

How to solve a generic type issue once we introduce one possible conformance to it in Swift?

I am working on this custom function:
func customPrint<T: CustomStringConvertible>(_ value: T...) {
    var string: String = ""
    value.forEach { item in
        string += " " + String(describing: item)
    }
    print(string)
}
My issue arises when I mix types. For example, this works fine:
customPrint("hello", "100")
But this would not:
customPrint("hello", 100)
Xcode issue:
Cannot convert value of type 'Int' to expected argument type 'String'
The error itself is understandable, but my goal is to be able to pass String and Int values to the function together.
So how can I make the generic work for all types that conform to the constraint? String and Int both conform to CustomStringConvertible, yet they cannot be used together in this function, and that is the issue I am trying to solve.
What I am considering:
I am thinking of using Any, as in Any..., where the Any values would conform to CustomStringConvertible.
Since generics constrain the arguments to a single, uniform type, I would suggest doing something like this:
func customPrint(_ value: CustomStringConvertible...) {
    var string: String = ""
    value.forEach { item in
        string += " " + String(describing: item)
    }
    print(string)
}
Edit:
Thanks to vacawama, Any would also work with String(describing:):
func customPrint(_ value: Any...) {
    print(value.map { String(describing: $0) }.joined(separator: " "))
}
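With either of these, mixed arguments compile, because each element is taken as its own value rather than being forced into a single generic type (hypothetical calls; outputs assumed from the code above):
customPrint("hello", 100)        // forEach version prints " hello 100" (note its leading space),
                                 // the map/joined version prints "hello 100"
customPrint("Swift", 5.7, true)  // mixing String, Double and Bool also compiles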
When you define the generic type T, you are fixing one concrete type for all the parameters you pass. If the first argument is a String, the function expects every parameter to be a String; if the first is an Int, it expects every parameter to be an Int. You should not use a generic for this function; NoeOnJupiter's answer is the one to use in your case.
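A quick illustration of that inference rule against the original generic customPrint:
customPrint(1, 2, 3)        // OK: T is inferred as Int for every argument
customPrint("a", "b", "c")  // OK: T is inferred as String
customPrint("a", 1)         // error: a single T cannot be both String and Int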

Using type as a value, why is the "self" keyword required here?

I'm currently learning type as a value in functions and wrote this sample code to play around:
import Foundation
class Animal {
    func sound() {
        print("Generic animal noises")
    }
}

func foo(_ t: Animal) {
    print("Hi")
}

foo(Animal) // Cannot convert value of type 'Animal.Type' to expected argument type 'Animal'
I'm not surprised by this result. Obviously you can't pass the type itself as an argument where an instance of that type is expected. But notice that the compiler says the argument I passed is of type Animal.Type. So if I did this, it should compile, right?
func foo(_ t: Animal.Type) {
    print("Hi")
}

foo(Animal) // Expected member name or constructor call after type name
This is what really confuses me: the compiler told me the argument was of type Animal.Type, but after making this change it once again shows an error.
Of course, I listened to the fix Swift suggests and did:
foo(Animal.self) // Works correctly
But my biggest question is: WHY? Isn't Animal itself the type? Why does the compiler require me to use Animal.self to get the type? This really confuses me; I would appreciate some guidance.
Self-answering: with the help of the comments, I was able to find out the reason.
Using .self after a type name is called a postfix self expression:
A postfix self expression consists of an expression or the name of a type, immediately followed by .self. It has the following forms:
expression.self
type.self
The first form evaluates to the value of the expression. For example, x.self evaluates to x.
The second form evaluates to the value of the type. Use this form to access a type as a value. For example, because SomeClass.self evaluates to the SomeClass type itself, you can pass it to a function or method that accepts a type-level argument.
Thus, the .self suffix is required to treat the type as a value that can be passed as an argument to functions.
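To make that concrete, here is a small sketch (the describe function is made up) that passes the metatype around as an ordinary value:
func describe(_ t: Animal.Type) {
    print("Received the type \(t)")
}

describe(Animal.self)                 // the type itself, passed as a value
let animalType: Animal.Type = Animal.self
describe(animalType)                  // a metatype can be stored in a variable too
describe(type(of: Animal()))          // type(of:) also yields an Animal.Type value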

Why does string.contains("text") not work with strings in Swift 5?

let string = "hello Swift"
if string.contains("Swift") {
    print("exists")
}
Cannot convert value of type 'String' to expected argument type 'String.Element' (aka 'Character')
Why does Swift 5 give this error, and what should I do?
When you call contains() and pass it a String, Swift looks for an overload that takes some kind of string, such as contains(_ other: StringProtocol), but that overload is not part of the pure Swift String. Instead it finds contains(_ element: Character), which cannot accept a String argument and only accepts a 'String.Element' (aka 'Character'). See the documentation for contains.
The overload you are looking for is defined in an extension of StringProtocol, a protocol that String conforms to, and it lives inside Foundation.
So if you need it, make sure you import Foundation or a higher-level framework such as UIKit.
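A short sketch of both resolutions (assuming a plain Swift 5 file):
import Foundation                        // brings in the StringProtocol-based contains(_:) for String

let string = "hello Swift"
print(string.contains("Swift"))          // true: resolves to the StringProtocol overload
print(string.contains(Character("S")))   // true: the pure standard-library Character overload
// Removing the import makes the first call reproduce the original error.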

How do you provide a default argument for a generic function with type constraints?

The following function definition is legal Swift:
func doSomething<T: StringProtocol>(value: T = "abc") {
    // ...
}
The compiler is able to determine that the default argument "abc" is a String, and String conforms to StringProtocol.
This code however does not compile:
func doSomething<T: Collection>(value: T = "abc") where T.Element == Character {
    // ...
}
The compiler error:
Default argument value of type 'String' cannot be converted to type 'T'
It seems as though the compiler would have just as much information as in the first case to determine that String is indeed convertible to T. Furthermore, if I remove the default argument and call the function with the same value, it works:
doSomething(value: "abc")
Can this function be written differently so that I can provide the default String argument? Is this a limitation of Swift, or simply a limitation of my mental model?
The significant constraint is T: ExpressibleByStringLiteral. That's what allows something to be initialized from a string literal.
func doSomething<T: Collection>(value: T = "abc")
    where T.Element == Character, T: ExpressibleByStringLiteral {
    // ...
}
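With that extra constraint the default compiles, and callers can still pass any character collection that is also expressible by a string literal (illustrative calls):
doSomething(value: "xyz")        // T inferred as String
doSomething(value: "xyz"[...])   // T inferred as Substring, also a Collection of Character
// doSomething()                 // still fails to infer T, as noted below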
As Leo Dabus notes, T.Element == Character is technically not necessary, but removing it changes the meaning. Just because something is a collection and can be initialized by a string literal does not mean that its elements are characters.
It's also worth noting that while all of this is possible, it generally is poor Swift IMO. Swift does not have any way to express what the default type is, so doSomething() in all of these cases causes "Generic parameter 'T' could not be inferred".
The correct solution IMO is an overload, which avoids all of these problems:
func doSomething<T: StringProtocol>(value: T) {
}

func doSomething() {
    doSomething(value: "abc")
}
This allows you to make the default parameter not just "something that can be initialized with the literal 'abc'", but what you really mean: the default value is the String "abc".
As a rule, default parameters are just conveniences for overloads, so you can generally replace any default parameter with an explicit overload that lacks that parameter.
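For illustration, here is the overload pair in use (illustrative calls; the bodies above produce no visible output):
doSomething()                    // resolves to the parameterless overload, which forwards the String "abc"
doSomething(value: "xyz")        // T inferred as String
doSomething(value: "xyz"[...])   // T inferred as Substring, which is also a StringProtocol type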

Using init() in map()

TL;DR
Why doesn't this work?
"abcdefg".characters.map(String.init) // error: type of expression is ambiguous without more context
Details
One really cool thing I like in Swift is the ability to convert a collection of one thing to another by passing in an init method (assuming an init() for that type exists).
Here's an example converting a list of tuples to instances of ClosedInterval.
[(1,3), (3,4), (4,5)].map(ClosedInterval.init)
That example also takes advantage of the fact that we can pass a tuple of arguments as a single argument as long as the tuple matches the function's argument list.
Here another example, this time converting a list of numbers to string instances.
(1...100).map(String.init)
Unfortunately, the next example does not work. Here I am trying to split up a string into a list of single-character strings.
"abcdefg".characters.map(String.init) // error: type of expression is ambiguous without more context
map() should be operating on a list of Character (and indeed I was able to verify in a playground that Swift correctly infers [Character] as the type being passed into map).
String definitely can be instantiated from a Character.
let a: Character = "a"
String(a) // this works
And interestingly, this works if the characters are each in their own array.
"abcdefg".characters.map { [$0] }.map(String.init)
Or the equivalent:
let cx2: [[Character]] = [["a"], ["b"], ["c"], ["d"]]
cx2.map(String.init)
I know that I could do this:
"abcdefg".characters.map { String($0) }
But I am specifically trying to understand why "abcdefg".characters.map(String.init) does not work (IMO this syntax is also more readable and elegant).
Simplified repro:
String.init as Character -> String
// error: type of expression is ambiguous without more context
This is because String has two initializers that accept one Character:
init(_ c: Character)
init(stringInterpolationSegment expr: Character)
As far as I know, there is no way to disambiguate them when using the initializer as a value.
As for (1...100).map(String.init), String.init is resolved as Int -> String, even though there are two initializers that accept one Int:
init(stringInterpolationSegment expr: Int)
init<T : _SignedIntegerType>(_ v: T)
A generic parameter is a weaker match than an explicit type, so the compiler chooses the stringInterpolationSegment: initializer in this case. You can confirm that by Command-clicking on .init.
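For completeness, a sketch of the workaround in that era's syntax (Swift 2-style function types, as in the question): since the two Character-taking initializers cannot be disambiguated when used as a value, an explicitly typed closure does the job:
// String.init as Character -> String                   // ambiguous: both initializers above match
let charToString: Character -> String = { String($0) }  // the call form unambiguously picks init(_:)
"abcdefg".characters.map(charToString)                   // ["a", "b", "c", ...] as Strings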