Type coercion issue in Swift

EDIT: This works absolutely fine in Swift 3, which we should all be using by now :)
If I have two protocols, X and Y where Y implements X, why can't I assign an array of Y to a variable with the type [X]?
Even more curiously, I can convert it one by one into an array of X, and that compiles fine.
protocol X { }
protocol Y: X { }
// Make a test array of Y
extension String: Y { }
let a: [Y] = [ "Hello" ]
// Make it into an array of X - this works absolutely fine
let b: [X] = [ a[0] as X ]
// Why won't this cast work?
let c: [X] = a as [X]
I thought that given Y does everything X can do, this assignment should be fine (yes, I lose some type information, but it should at least compile!)
Any ideas?
EDIT: The current answer points out that it's a dangerous thing to do if you are using mutable arrays - which I didn't get at first but do now :) However, if my arrays are all immutable, why won't Swift let this happen? And even if c is a mutable array (i.e. var c = a as [X]), why won't Swift just copy it, leaving a intact?

This doesn't work because it could create a few problems. For example:
var squares: Array<Square> = Array<Square>()
(squares as [Shape]).append(Circle())
Now we have a Circle in our array of Squares. We don't want that, so the compiler doesn't let us do it. Other languages, such as Scala, let you specify generics as covariant, contravariant, or invariant.
If Swift let you use covariant generics, an Array<Square> would be a subtype of Array<Shape>.
With contravariance, an Array<Square> would be a supertype(!) of Array<Shape>.
With invariant generics, neither is a subtype of the other.
The easiest way to do it in your case would probably be this:
let c = a.map{$0 as X}
Now c is of type [X].
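To see why this element-wise copy is harmless, here is a self-contained sketch reusing the question's declarations, plus a purely illustrative Int conformance that is not in the original code:

protocol X { }
protocol Y: X { }
extension String: Y { }
extension Int: X { }          // hypothetical extra conformance, just for this demo

let a: [Y] = ["Hello"]
var c: [X] = a.map { $0 as X }
c.append(42)                  // fine: c is an independent [X], and Int conforms to X
// a is unchanged and still a [Y]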
For more information on type variance and why it can be problematic, you can visit this wiki page.
EDIT: After further back and forth, it seems the real problem is that protocols allow default implementations, and these defaults can cause trouble here. Your code compiles flawlessly when using classes. Here's some code that could potentially lead to problems with protocols:
protocol Shape {
    func surfaceArea() -> Double
}

extension Shape {
    func surfaceArea() -> Double {
        return 0.0
    }
}

protocol Circle: Shape {
    func getRadius() -> Double
}

extension Circle {
    func surfaceArea() -> Double {
        return getRadius() * getRadius() * 3.14
    }
}
Now, when we upcast between these two protocols, surfaceArea() can end up resolving to a different default implementation and returning a different value, and that's why Swift doesn't allow the direct conversion.
Try the same thing with classes instead of protocols and Swift will compile your current code without complaint. To me this is a slightly odd decision by the Swift team, but I still think the best way to work around it is simply:
let c = a.map{$0 as X}
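For comparison, here is a minimal sketch (with made-up class names, not from the question) showing that the equivalent array upcast compiles when classes are involved, since Swift accepts [Subclass] wherever [Superclass] is expected:

class Animal { }
class Dog: Animal { }

let dogs: [Dog] = [Dog(), Dog()]
let animals: [Animal] = dogs   // compiles: [Dog] is accepted where [Animal] is expected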

Related

Swift casting one custom type to another

I have a function that returns "Any?", but in my particular case I know that it should return a custom "Tuple2" object. Since Tuple2 is part of a separate Swift plugin, I cannot access the struct directly. I have a duplicate of that struct in my main Swift code and I'm trying to cast the "Any?" result to my duplicate struct. But I cannot convince the compiler that such a cast is allowed.
Any ideas on how to get around this?
An example that outlines the problem in a single file is as follows.
typealias Scalar = Double

//~~~~~~ THIS HAPPENS IN THE PLUGIN ~~~~~~
struct Tuple2 {
    public var x0: Scalar
    public var x1: Scalar
    public init(_ x0: Scalar, _ x1: Scalar) {
        self.x0 = x0
        self.x1 = x1
    }
}

let A = Tuple2(1.0, 2.0)
let B = A as! Any?

//~~~~~~ THIS HAPPENS IN MY CODE ~~~~~~
struct TupleTwo {
    public var x0: Scalar
    public var x1: Scalar
    public init(_ x0: Scalar, _ x1: Scalar) {
        self.x0 = x0
        self.x1 = x1
    }
}

let C = B as! TupleTwo // Error thrown here
print("x0:\(C.x0) x1:\(C.x1)")
But I cannot convince the compiler that such a cast is allowed.
Good! Because it’s not.
Swift (mostly) uses nominal typing, not structural typing. That is, two types are equal only if their (fully qualified) names are equal. Two nominal types with different names, even having the exact same content, are never equivalent.
Your TupleTwo in this module introduces a new type that is completely unrelated to the Tuple2 defined in the plugin module you're using.
To solve your problem you'll need to import the plugin module and use its Tuple2 type in your cast.
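In code that would look roughly like this; SwiftPlugin is a placeholder for whatever the plugin module is actually called:

import SwiftPlugin   // placeholder name for the plugin module

// B comes back from the plugin as Any?
if let c = B as? SwiftPlugin.Tuple2 {
    print("x0:\(c.x0) x1:\(c.x1)")
}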

How to do Type erasure in Swift using if-let?

I see many articles on type-erasure. But most of their examples focus on putting different types into an array.
Is there any way I can get this code to work?
protocol A {
    associatedtype Data
    func printThis(value: Data)
}

class B {
}

let x = B()

if let y = x as? A { // I get error on this line
    // Do nothing
}
Xcode error states
Protocol 'A' can only be used as a generic constraint because it has Self or associated type requirements
This example code is just for demonstration purposes.
As of Swift 4, protocols that have associated type requirements can only be used as generic constraints in function declarations, as in:
func foo<T: A>(t: T) where T.Data: Whatever { ... }
Unless you remove the associated type from the protocol, you cannot just type variables to it; you can only use it to define a generic type.
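For example, a concrete conformance can be handed to such a generic function directly, without any existential type being involved (ConcreteA and printWith are hypothetical names, not part of the question):

struct ConcreteA: A {
    func printThis(value: Int) {   // Data is inferred as Int
        print(value)
    }
}

func printWith<T: A>(_ a: T, _ value: T.Data) {
    a.printThis(value: value)
}

printWith(ConcreteA(), 42)   // prints 42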
If Swift ever gains the ability to have generalized existentials in the future, then this may change. But for the time being, this just isn't possible in Swift.
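If you really need a value whose static type hides the concrete conformer, the usual workaround (and what the type-erasure articles are describing) is a hand-rolled wrapper that fixes Data to a concrete type. A minimal sketch, with AnyA as a hypothetical name:

struct AnyA<Data>: A {
    private let _printThis: (Data) -> Void

    // Wrap any conforming Base whose Data matches this wrapper's Data.
    init<Base: A>(_ base: Base) where Base.Data == Data {
        _printThis = base.printThis
    }

    func printThis(value: Data) {
        _printThis(value)
    }
}

// let erased = AnyA(ConcreteA())   // e.g. wrapping the ConcreteA sketched above

Note that this still doesn't make if let y = x as? A compile; it only gives you a concrete type (AnyA<Int>, say) that you can store in variables and collections.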

Generic Where Clause Ambiguity with Associated Types in Swift

I was writing some example code in a playground and wanted a function that returns the distance between two values, both of which conform to the Strideable protocol in Swift so that I could use the distance(to other: Self) -> Self.Stride function. My implementation was as follows:
func distanceFrom<T: Strideable, U>(_ a: T, to b: T) -> U where T.Stride == U {
    return a.distance(to: b)
}
After observing this function for a while, I realized that I wasn't sure which Stride was being used in the where clause, the one from a or from b. From what I understand it would be possible for a and b to define different associated types for Stride. Also I haven't made any statements to ensure that a.Stride == b.Stride, although I understand that I could expand my where clause to do so.
So, which one would get used to check equivalence to U? To be clear, the question isn't about this particular code block, but rather any situation in which this ambiguity would exist.
a and b are the same type. If you wanted them to be different Strideable types you would add another generic parameter conforming to Strideable such that the function signature appears as follows:
func bar<T: Strideable, V: Strideable, U>(_ a: T, to b: V) -> U where T.Stride == U, V.Stride == U {
    return a.distance(to: a) // Trivial return statement (see explanation below)
}
Although the code above compiles, return a.distance(to: b) would not, because a and b are different types and the Swift 3 definition of distance is public func distance(to other: Self) -> Self.Stride (note the use of Self, which restricts other to the same type as the Strideable the function is called on). In conclusion, although you could make a and b different types, for your application it would not make sense to do so.
As further evidence that your originally posted code cannot be called with two different types, please see the attached Playground screenshot, which shows an error when mixing types.
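Since the screenshot isn't reproduced here, the following is roughly the situation it showed (a reconstruction, not the exact compiler output): mixing argument types fails to compile because both parameters are declared as the same T.

let doubleValue: Double = 4.5
let intValue: Int = 4
// let g = distanceFrom(doubleValue, to: intValue)
// does not compile: a and b must both be the same Strideable type T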
However, this works fine in the playground.
func distanceFrom<T: Strideable, U>(_ a: T, to b: T) -> U where T.Stride == U {
    return a.distance(to: b)
}
let doubleFoo: Double = 4.5
let intFoo: Double = 4 // note: despite the name, this is a Double, so both arguments are the same type
let g = distanceFrom(doubleFoo, to: intFoo) // gives me a double of -0.5
I hope this helps.

Swift not finding the correct type

I am trying to use SwiftHamcrest
I have a function
func equalToArray<T, S>(_ vector: Array<S>) -> Matcher<T> {
    let v: Matcher<T> = Hamcrest.hasCount(16)
    return v
}
This gives an error
Error:(16, 31) 'hasCount' produces 'Matcher<T>', not the expected contextual result type 'Matcher<T>'
SwiftHamcrest has two hasCount functions
public func hasCount<T: Collection>(_ matcher: Matcher<T.IndexDistance>) -> Matcher<T>
public func hasCount<T: Collection>(_ expectedCount: T.IndexDistance) -> Matcher<T>
Why is my code complaining? Isn't it returning the same type that is needed?
As a note, and possibly a separate question: I had to add the Hamcrest. prefix before the hasCount call, as otherwise it tried to match the first function.
What am I missing with types?
Your method equalToArray<T, S> does not know that T is a Collection, so the result of the generic hasCount(...) functions above cannot be assigned to v in your method (those overloads return Matcher<T> instances whose T is constrained to Collection). In other words, v is of type Matcher<T> for an unconstrained T, so in the eyes of the compiler there is, for example, no T.IndexDistance for the T in v's type.
If you add a Collection type constraint to the T of your method, the assignment from hasCount(...) result to v should compile:
func equalToArray<T: Collection, S>(_ vector: Array<S>) -> Matcher<T> {
    let v: Matcher<T> = Hamcrest.hasCount(16)
    return v
}
In a perfect world, the compiler could've given us a more telling error message, say along the lines of
Error:(16, 31) 'hasCount' produces 'Matcher<T>' where 'T: Collection',
not the expected contextual result type 'Matcher<T>'
Now, I don't know what you're intending to test here, but as @Hamish points out, you might actually want to return a Matcher<[S]> and drop the T placeholder, e.g. using the count property of the supplied vector parameter as the argument to hasCount(...):
func equalToArray<S>(_ vector: Array<S>) -> Matcher<[S]> {
    return hasCount(vector.count)
}
Not having used Hamcrest myself, I might be mistaken, but based on a quick skim of the SwiftHamcrest docs, I believe equalToArray(_:) as defined above would construct a matcher for "vector equality" (going by the function's name) based only on the counts of the two vectors, in which case the following assert would succeed:
let arr1 = ["foo", "bar"]
let arr2 = ["bar", "baz"]
assertThat(arr1, equalToArray(arr2)) // success! ...
But this is just a side note, as you haven't shown us the context in which you intend to apply your equalToArray(_:) matcher; maybe you're only showing us a minimal example, and the actual body of your custom matcher is more true to the method's name.

Why do I have to force a downcast of an array object to a subclass in Swift

I have the following -- simplified -- code:
class A {
    var x: Int = 9
}

class B: A {
    var y: Int = 8
}

class S {
    var myList = [A]()
}

//class T: S {
//    override var myList = [B]()
//}

class V: S {
    func foo() {
        let bar = myList[2] as! B
        print(bar.y)
    }
}
In Swift 2.1 I have to use the as! or I get an error message saying that A is not convertible to B. It seems wrong that I should have to force a conversion to the subclass, but maybe I'm missing something.
The commented-out part was my first attempt, but that does not compile either, giving the message myList with type '[B]' cannot override a property with type '[A]'.
In both cases I don't understand the behavior since B is clearly a subclass of A.
Can someone explain why I cannot override the declaration?
Can someone explain why I have to force the downcast?
Thanks!
For point (1), you cannot override stored properties at all, only computed properties. Moreover, even when you override a computed property, you can't give it a different type, as you attempt to do here. And since you can't change the type, it actually makes no sense to override a stored property - what would you override it to? If it was an Int in the superclass and an Int in the subclass, or an [A] in the superclass and an [A] in the subclass, then you haven't actually changed anything by overriding it. You can change behavior by overriding a computed property, but obviously not with a stored property, so it makes little sense to do so.
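As a small illustration of the computed-property case (the class names here are made up): overriding is allowed, but only with the same type.

class Base {
    var label: String { return "base" }
}

class Derived: Base {
    override var label: String { return "derived" }
}

// override var label: Int { ... } in Derived would not compile:
// the overriding property must keep the type of the overridden one.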
For point (2), you have to force the downcast because you are attempting to access a property - namely, y - that isn't present in the superclass. If you want to access a subclass through a superclass reference, you have to restrict yourself to the interface provided by that superclass. A doesn't have a y property, so trying to access bar.y when bar is of type A makes no sense - you have to downcast it to type B to do that.
In OOP generally, converting upwards is usually easy, since we know from the inheritance structure that B is an A, so converting it that way is simple. But the same is not true in the opposite direction - when B inherits from A, then a B is an A, but an A is not a B. Therefore, when you convert downwards from A to B, you have to do so explicitly, because bad things can happen if the thing you are converting turns out not to be a B at all.
Incidentally, you don't have to force the downcast with as! - you can safely attempt it with as?, for instance:
if let bar = myList[2] as? B {
    print(bar.y)
} else {
    print("not a B!")
}
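And to round out the up/down asymmetry described above, using the question's A and B:

let b = B()
let a: A = b          // upcast: implicit and always safe, every B is an A
// let b2: B = a      // does not compile: not every A is a B
let b3 = a as? B      // downcast: must be explicit, yields an optional B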