Understanding why this Swift tuple assignment isn't allowed

The following code is fine:
func intTest() -> Int? {
    var res: Int
    res = 5
    return res
}
There is no problem returning a non-optional Int from a method with a return type of optional Int.
Now let's see what happens when the return value is a tuple of optional values:
func tupleTest() -> (Int?, Int?) {
    var res: (Int, Int)
    res = (5, 5)
    return res
}
The return res line results in the error:
error: cannot express tuple conversion '(Int, Int)' to '(Int?, Int?)' (aka '(Optional<Int>, Optional<Int>)')
Why does Swift not allow this?
I certainly understand why this would be an error in the other direction (optional to non-optional) but why can't a tuple accept a non-optional value in the place of an optional when a non-tuple works fine?
I've seen Swift tuple to Optional assignment but that doesn't cover why you can't do the assignment, it just covers how to solve it.

That's happening because of a type mismatch. Consider the following case:
let myClosure: (Int) -> ()
let myOtherClosure: (Int?) -> ()
The type of myClosure is (Int) -> () while myOtherClosure is of type (Int?) -> () which makes them fundamentally different types, despite the similarity of the parameters. When Swift is looking at these two constants, it's evaluating the type as a whole, not the individual pieces. The same is happening with your tuple; it's looking at the tuple type signature as a whole unit, not breaking down the parameters and recognizing that they're non-optional versions of the same type.
Converting from Int to Int? works because Int? is just the optional wrapper around the same underlying type, and the compiler promotes the value for you. Going from (Int, Int) to (Int?, Int?) doesn't, because the element types of the tuple differ, which makes the overall tuple types different, hence your error.
Taking a look at one of your examples:
func tupleTest() -> (Int?, Int?) {
    let first: Int = 1
    let second: Int = 2
    let res: (Int?, Int?) = (first, second)
    return res
}
This works because even though the values are non-optional integers, the tuple type is marked as (Int?, Int?) whereas:
func tupleTest() -> (Int?, Int?) {
    let first: Int = 1
    let second: Int = 2
    let res = (first, second)
    return res
}
doesn't compile because now res is of type (Int, Int). I'm not an expert on the Swift compiler, but my guess is that the type system doesn't deconstruct each individual element of a function or tuple type and recognize that the corresponding element is the same type, just wrapped in an optional.
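If you do end up holding a non-optional (Int, Int) and need to return it as (Int?, Int?), one minimal workaround (a sketch of my own, not from the linked question) is to rebuild the tuple element by element, so each Int gets promoted to Int? on its own:
func tupleTest() -> (Int?, Int?) {
    let res: (Int, Int) = (5, 5)
    // Each element is promoted to Int? individually when the tuple is rebuilt.
    return (res.0, res.1)
}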

Related

Is there any overhead in wrapping a function in a closure to pass it as an argument?

In Swift, you can pass functions as arguments to functions that accept closures. This is particularly useful to avoid syntactically polluting your code when using operators. For instance, you can write a sum as follows:
let values = 0 ..< 10
let sum = values.reduce(0, +)
Unfortunately, overloaded functions can lead to ambiguous situations when Swift’s inference is unable to determine the type of the expected closure from other arguments. Consider the code below for instance. The last line does not compile because Swift cannot decide what “version” of + I am referring to.
func castAndCombine<T, U>(_ pair: (Any, Any), with fn: (T, T) -> U) -> U? {
    guard let first = pair.0 as? T, let second = pair.1 as? T
    else { return nil }
    return fn(first, second)
}
// The following line cannot compile.
let x = castAndCombine((1, 2), with: +)
Unfortunately, there isn’t (or at least I am not aware of) any way to specify which + I mean. Nonetheless, I came up with two solutions to this problem:
Add a parameter to the function to disambiguate the situation:
func castAndCombine<T, U>(_ pair: (Any, Any), toType: T.Type, with fn: (T, T) -> U) -> U? {
    // ...
}
let x = castAndCombine((1, 2), toType: Int.self, with: +)
Leave the function’s signature unchanged and use a closure with explicit type annotations:
func castAndCombine<T, U>(_ pair: (Any, Any), with fn: (T, T) -> U) -> U? {
    // ...
}
let x = castAndCombine((1, 2), with: { (a: Int, b: Int) in a + b })
I personally dislike the first solution, as I find it inelegant and unnatural to use. However, I wonder whether the second one adds any performance overhead, due to the creation of a closure that essentially wraps a single function without adding any behavior.
Does anyone know whether this overhead actually exists and/or is significant to any extent?
There should not be any overhead if you compile with optimizations, as the compiler will most likely inline your closure.
You can verify this assumption with your first solution (as it supports both styles) by comparing the LLVM IR the Swift compiler emits. LLVM IR is an intermediate representation the compiler produces right before generating actual machine code.
Write one file using the operator directly, i.e.:
let x = castAndCombine((1, 2), toType: Int.self, with: +)
Write a second file using the closure, i.e.:
let x = castAndCombine((1, 2), toType: Int.self, with: { (a: Int, b: Int) in a + b })
Now compile both with optimizations, asking Swift's compiler to produce the LLVM IR. Assuming your files are named main1.swift and main2.swift, you can run the following:
swift -O -emit-ir main1.swift 1>main1.ll
swift -O -emit-ir main2.swift 1>main2.ll
Both produced files should be identical.
diff main1.ll main2.ll
# No output
Note that the solutions suggested in the comments do not add any performance overhead either, as statically guaranteed casts do not cost any operation.
Instead of creating a closure to disambiguate the type, you can cast + to the desired type:
func castAndCombine<T, U>(_ pair: (Any, Any), with fn: (T, T) -> U) -> U? {
    guard let first = pair.0 as? T, let second = pair.1 as? T
    else { return nil }
    return fn(first, second)
}
// Add two Ints by concatenating them as Strings
func +(_ lhs: Int, _ rhs: Int) -> String {
    return "\(lhs)\(rhs)"
}
if let x = castAndCombine((1, 2), with: (+) as (Int, Int) -> String) {
    print(x)
}
// Prints "12"
if let x = castAndCombine((1, 2), with: (+) as (Int, Int) -> Int) {
    print(x)
}
// Prints "3"

Reduce higher order function in Swift 3.0 with Int enum

I am learning the Swift higher order functions associated with Collections. I have the following query with reduce:
enum Coin: Int {
    case Penny = 1
    case Nickel = 5
    case Dime = 10
    case Quarter = 25
}
let coinArray: [Coin] = [.Dime, .Quarter, .Penny, .Penny, .Nickel, .Nickel]
coinArray.reduce(0, { (x: Coin, y: Coin) -> Int in
    return x.rawValue + y.rawValue
})
I am getting the following error:
Declared closure result Int is incompatible with contextual type _
Let's see how reduce is declared:
public func reduce<Result>(_ initialResult: Result, _ nextPartialResult: (Result, Element) throws -> Result) rethrows -> Result
See the type of nextPartialResult? It is (Result, Element) -> Result. What is the type of Result, in your case? It is Int, because you want to reduce the whole thing to an integer.
Therefore, passing a (Coin, Coin) -> Int does not really work here, does it?
You should pass in an (Int, Coin) -> Int instead.
coinArray.reduce(0, { (x: Int, y: Coin) -> Int in
    return x + y.rawValue
})
Or simply:
coinArray.reduce(0) { $0 + $1.rawValue }
Once you apply reduce to coinArray, the Element type is Coin; specializing the declaration quoted above gives the following signature:
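public func reduce<Result>(_ initialResult: Result, _ nextPartialResult: (Result, Coin) throws -> Result) rethrows -> Result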
Ask yourself: what is the type of the generic Result? Is it Coin or Int? And what are the parameter types of nextPartialResult?
The answer is: Result is Int, and nextPartialResult is a closure that takes one parameter of type Result (here Int) and another parameter of type Coin, and eventually returns an Int.
So the correct way of writing it is:
coinArray.reduce(0, { (x, y) -> Int in
    return x + y.rawValue
})
Or, with more meaningful names, you could have written:
coinArray.reduce(0, { (currentResult, coin) -> Int in
    return currentResult + coin.rawValue
})
Also, coinArray isn't a good name. Just write coins. The plural makes it more readable than coinArray / arrayOfCoins!
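Putting both answers' advice together, a quick sketch (the plural name is just the suggestion above, not part of the original code):
let coins: [Coin] = [.Dime, .Quarter, .Penny, .Penny, .Nickel, .Nickel]
let total = coins.reduce(0) { $0 + $1.rawValue }   // 10 + 25 + 1 + 1 + 5 + 5 = 47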

Why does the use of closure shorthand argument names have to be exhaustive in a single return expression in Swift?

The following pieces of code are erroneous in Swift:
func foo(closure: (Int, Int) -> Int) -> Int {
    return closure(1, 2)
}
print(foo(closure: { $0 }))

func foo(closure: (Int, Int) -> Int) -> Int {
    return closure(1, 2)
}
print(foo(closure: { return $0 }))
The error given by the Xcode playground is: Cannot convert value of type '(Int, Int)' to closure result type 'Int'.
The following pieces of code, meanwhile, are completely fine:
func foo(closure: (Int, Int) -> Int) -> Int {
    return closure(1, 2)
}
print(foo(closure: { $0 + $1 }))

func foo(closure: (Int, Int) -> Int) -> Int {
    return closure(1, 2)
}
print(foo(closure: { $1; return $0 }))

func foo(closure: (Int, Int) -> Int) -> Int {
    return closure(1, 2)
}
print(foo(closure: { a, b in a }))
It seems that when the arguments to a closure are referred to by shorthand argument names, they must be used exhaustively if the body of the closure consists only of the return expression. Why?
If you just use $0, the closure arguments are assumed to be a tuple instead of multiple variables $0, $1 etc. So you should be able to work around this by extracting the first value of that tuple:
print(foo(closure: {$0.0}))
Your "why" is like asking "why is an American football field 100 yards long?" It's because those are the rules. An anonymous function body that takes parameters must explicitly acknowledge all parameters. It can do this in any of three ways:
Represent them using $0, $1, ... notation.
Represent them using parameter names in an "in" line.
Explicitly discard them by using _ in an "in" line.
So, let's take a much simpler example than yours:
func f(_ ff:(Int)->(Void)) {}
As you can see, the function f takes one parameter, which is a function taking one parameter.
Well then, let's try handing some anonymous functions to f.
This is legal because we name the parameter in an "in" line:
f {
    myParam in
}
And this is legal because we accept the parameter using $0 notation:
f {
    $0
}
And this is legal because we explicitly throw away the parameter using _ in the "in" line:
f {
    _ in
}
But this is not legal:
f {
    1 // error: contextual type for closure argument list expects 1 argument,
      // which cannot be implicitly ignored
}
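Applied back to the question's two-parameter foo, here is a sketch of my own (not from either answer) that compiles because it acknowledges both parameters:
func foo(closure: (Int, Int) -> Int) -> Int {
    return closure(1, 2)
}
print(foo(closure: { a, _ in a }))   // names the first parameter, discards the second; prints 1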

Comparing type of literal in Swift fails?

This code works in Swift 3:
let a = 1
type(of: a) == Int.self // true
However, remarkably this code fails:
// error: binary operator '==' cannot be applied to two 'Int.Type' operands
type(of: 1) == Int.self
What is the syntax to make the second comparison work, if any?
Thanks a lot.
I think the error message was misleading. The real issue was how to interpret the literal 1 in the second call. Swift defaults to an Int when you define a variable:
let a = 1 // a is an Int
But the compiler can read it as Double, UInt32, CChar, etc. depending on the context:
func takeADouble(value: Double) { ... }
func takeAUInt(value: UInt) { ... }
takeADouble(value: 1) // now it's a Double
takeAUInt(value: 1) // now it's a UInt
type(of:) is defined as a generic function:
func type<Type, Metatype>(of: Type) -> Metatype
The compiler has no clue how to interpret the Type generic parameter: should it be an Int, a UInt, a UInt16, etc.? Here's the error I got from the IBM Swift Sandbox:
Overloads for '==' exist with these partially matching parameter lists
(Any.Type?, Any.Type?), (UInt8, UInt8), (Int8, Int8),
(UInt16, UInt16), (Int16, Int16), (UInt32, UInt32), ...
You can give the compiler some help by telling it what type it is:
type(of: 1 as Int) == Int.self
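Any contextual annotation that pins the literal's type should work just as well; a couple of equivalent sketches of my own, reusing the question's setup:
type(of: Int(1)) == Int.self        // true
let b: Int = 1
type(of: b) == Int.self             // true, same as the question's first example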

Swift higher order function (Church pair aka cons) with generic parameter types not accepting input parameter types

I was messing around with functional programming in Swift 2.1, trying to implement the Church-encoded pair/cons function (cons = λx.λy.λf.f x y in the untyped lambda calculus), which I had read couldn't be done in earlier versions of Swift.
With generics it looks like this:
func cons<S, T, U>(x: S, _ y: T) -> ((S, T) -> U) -> U {
    return { (f: (S, T) -> U) -> U in return f(x, y) }
}
cons(1,2)
//error: cannot invoke 'cons' with an argument list of type '(Int, Int)'
//note: expected an argument list of type '(S, T)'
which doesn't work, and gives an error I cannot understand (surely an argument list of type (Int, Int) can match the generic type variables (S, T)?).
If you get rid of the generic types, and declare them all Ints, the function works, but of course we want to be able to cons together lists longer than 2; consing a list of length 3 is consing an Int with an (Int,Int) -> Int, for example.
Another option is to type everything as Any (see Type Casting for Any and AnyObject), but I couldn't make that work either.
Do you have any ideas? Is this possible in Swift yet? I'm sure there are simpler ways to implement cons/car/cdr, but I'm specifically interested in the Church encoding, where the list elements are arguments to anonymous functions (lambdas).
func cons<S, T, U>(x: S, _ y: T) -> ((S, T) -> U) -> U {
    return { (f: (S, T) -> U) -> U in return f(x, y) }
}
let i: ((Int, Int) -> Int) -> Int = cons(1, 2)
let d: ((Int, Int) -> Double) -> Double = cons(2, 3)
let e: ((Double, Int) -> String) -> String = cons(2.2, 1)
let f: ((Double, Int) -> Double) -> Double = cons(2.2, 1)
Still, one of the types is an 'extra' type that cannot be inferred by the compiler. If you spell the types out, you can see that not every combination is valid. Just specify the output type and the compiler will be happy:
func cons<S, T, U>(x: S, _ y: T, outputAs: U.Type) -> ((S, T) -> U) -> U {
    return { (f: (S, T) -> U) -> U in return f(x, y) }
}
let i = cons(1.2 ,"A", outptAs: Int.self)
let j = cons("alfa","beta", outptAs: Double.self)