Can a condition be used to determine the type of a generic? (Swift)

I will first explain what I'm trying to do and how I got to where I got stuck before getting to the question.
As a learning exercise for myself, I took some problems that I had already solved in Objective-C to see how I can solve them differently with Swift. The specific case that I got stuck on is a small piece that captures a value before and after it changes and interpolates between the two to create keyframes for an animation.
For this I had an object Capture with properties for the object, the key path, and two id properties for the values before and after. Later, when interpolating the captured values, I made sure that they could be interpolated by wrapping each of them in a Value class that used a class cluster to return an appropriate subclass depending on the type of value it wrapped, or nil for types that weren't supported.
This works, and I am able to make it work in Swift as well following the same pattern, but it doesn't feel Swift-like.
What worked
Instead of wrapping the captured values as a way of enabling interpolation, I created a Mixable protocol that the types could conform to and used a protocol extension for when the type supported the necessary basic arithmetic:
protocol SimpleArithmeticType {
    func +(lhs: Self, rhs: Self) -> Self
    func *(lhs: Self, amount: Double) -> Self
}

protocol Mixable {
    func mix(with other: Self, by amount: Double) -> Self
}

extension Mixable where Self: SimpleArithmeticType {
    func mix(with other: Self, by amount: Double) -> Self {
        return self * (1.0 - amount) + other * amount
    }
}
This part worked really well and enforced homogeneous mixing (that a type could only be mixed with its own type), which wasn't enforced in the Objective-C implementation.
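Conforming a concrete type is then a one-liner; for example (a minimal sketch, relying on Double's built-in + and * operators to satisfy SimpleArithmeticType):

extension Double: SimpleArithmeticType, Mixable {}

let mixed = 2.0.mix(with: 4.0, by: 0.5)
// mixed == 3.0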
Where I got stuck
The next logical step, and this is where I got stuck, seemed to be to make each Capture instance (now a struct) hold two variables of the same mixable type instead of two AnyObject. I also changed the initializer argument from an object and a key path to a closure that returns a value, () -> T:
struct Capture<T: Mixable> {
    typealias Evaluation = () -> T

    let eval: Evaluation
    let before: T
    var after: T {
        return eval()
    }

    init(eval: Evaluation) {
        self.eval = eval
        self.before = eval()
    }
}
This works when the type can be inferred, for example:
let captureInt = Capture {
    return 3.0
}
// > Capture<Double>
but not with key-value coding, which returns AnyObject:
let captureAnyObject = Capture {
    return myObject.valueForKeyPath("opacity")!
}
error: cannot invoke initializer for type 'Capture' with an argument list of type '(() -> _)'
AnyObject does not conform to the Mixable protocol, so I can understand why this doesn't work. But I can check what type the object really is, and since I'm only covering a handful of mixable types, I thought I could cover all the cases and return the correct type of Capture. To see if this could even work, I made an even simpler example.
A simpler example
struct Foo<T> {
    let x: T
    init(eval: () -> T) {
        x = eval()
    }
}
which works when type inference is guaranteed:
let fooInt = Foo {
    return 3
}
// > Foo<Int>

let fooDouble = Foo {
    return 3.0
}
// > Foo<Double>
But not when the closure can return different types:
let condition = true
let foo = Foo {
    if condition {
        return 3
    } else {
        return 3.0
    }
}
error: cannot invoke initializer for type 'Foo' with an argument list of type '(() -> _)'
I'm not even able to define such a closure on its own.
let condition = true // as simple as it could be
let evaluation = {
    if condition {
        return 3
    } else {
        return 3.0
    }
}
error: unable to infer closure type in the current context
My Question
Is this something that can be done at all? Can a condition be used to determine the type of a generic? Or is there another way to hold two variables of the same type, where the type was decided based on a condition?
Edit
What I really want is to:
1) capture the values before and after a change and save the pair (old + new) for later (a heterogeneous collection of homogeneous pairs),
2) go through all the collected values and get rid of the ones that can't be interpolated (unless this step could be integrated with the collection step), and
3) interpolate each homogeneous pair individually (mixing old + new).
But it seems like this direction is a dead end when it comes to solving that problem. I'll have to take a couple of steps back and try a different approach (and probably ask a different question if I get stuck again).

As discussed on Twitter, the type must be known at compile time. Nevertheless, for the simple example at the end of the question you could just explicitly type
let evaluation: () -> Double = { ... }
and it would work.
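Written out in full for the closure from the question, the annotation pins down both literals:

let condition = true
let evaluation: () -> Double = {
    if condition {
        return 3 // the annotation makes this a Double literal
    } else {
        return 3.0
    }
}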
So in the case of Capture and valueForKeyPath: IMHO you should cast (either safely or with a forced cast) the value to the Mixable type you expect the value to be, and it should work fine. After all, I'm not sure valueForKeyPath: is supposed to return different types depending on a condition.
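For example, something like this (a sketch, assuming the "opacity" value is known to bridge to Double):

let captureOpacity = Capture {
    return myObject.valueForKeyPath("opacity") as! Double
}
// > Capture<Double>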
What is the exact case where you would like to return two totally different types (that can't be implicitly converted, as in the simple case of Int and Double above) in the same evaluation closure?

In my full example I also have cases for CGPoint, CGSize, CGRect, and CATransform3D.
The limitations are just as you have stated, because of Swift's strict typing. All types must be definitely known at compile time, and each thing can be of only one type - even a generic (it is resolved by the way it is called at compile time). Thus, the only thing you can do is turn your type into an umbrella type that is much more like Objective-C itself:
let condition = true
let evaluation = {
    () -> NSObject in // *
    if condition {
        return 3
    } else {
        return NSValue(CGPoint: CGPointMake(0, 1))
    }
}
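Pulling a value back out then requires conditional casts, much as it would in Objective-C (a sketch in the same Swift 2 idiom as the answer):

let result = evaluation()
if let number = result as? NSNumber {
    print(number.doubleValue)
} else if let value = result as? NSValue {
    print(value.CGPointValue())
}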

Related

Swift method needs to sort a polymorphic collection of UInt32

Confused (and a little frustrated) with Swift 5 at the moment.
I have a method:
func oidsMany(_ oids: Array<UInt32>) -> MCCommandBuilder {
    let sorted_oids: [UInt32] = oids.sorted()
    ...
}
Discovered I have a case where I want to pass a Set to this method just as well. Either way, I'm going to sort an Array or a Set to an Array right away.
Waded through the many, many protocols that both Set and Array conform to, and noticed that they both conform to Sequence and that Sequence responds to sorted. Perfect.
But when I change the above to:
func oidsMany(_ oids: Sequence<UInt32>) -> MCCommandBuilder {
    let sorted_oids: [UInt32] = oids.sorted()
    ...
}
I get the following error hints:
Cannot specialize non-generic type 'Sequence'
Member 'sorted' cannot be used on value of protocol type 'Sequence'; use a generic constraint instead
What's the right way to approach this? I could just add a second oidsMany(_ Set...) overload that converts its argument to an Array and calls the first one. But I feel like I'm missing something fundamental here. My experience from other languages is not mapping over well.
You can, as the error message suggests, use it as a generic constraint instead:
func oidsMany2<Sortable: Sequence>(_ oids: Sortable) -> MCCommandBuilder where Sortable.Element: Comparable {
    let sorted_oids: [Sortable.Element] = oids.sorted()
    //...
}
If you only want to accept sequences whose elements are UInt32, you can change the where clause to:
where Sortable.Element == UInt32
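Put together, a minimal sketch (returning the sorted array directly, since the question doesn't show MCCommandBuilder's definition):

func sortedOids<S: Sequence>(_ oids: S) -> [UInt32] where S.Element == UInt32 {
    return oids.sorted()
}

sortedOids([3, 1, 2])        // [1, 2, 3] - works with an Array
sortedOids(Set([3, 1, 2]))   // [1, 2, 3] - and with a Set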

Why can we not cast to protocol types with associated types but achieve the same effect using generics?

Consider this code:
extension Collection {
    func foo() -> Int {
        if self.first is Collection {
            return (self.first as! Collection).underestimatedCount // ERROR
        } else {
            return self.underestimatedCount
        }
    }
}
We get the dreaded and apparently widely puzzling:
protocol 'Collection' can only be used as a generic constraint because it has Self or associated type requirements.
However, this happily compiles:
func foo<C: Collection>(_ c: C) -> Int where C.Iterator.Element: Collection {
    if let first = c.first {
        return first.underestimatedCount // *
    } else {
        return c.underestimatedCount
    }
}
Why?!
In particular, the compiler does not know at line * how the associated types of (the type of) first have been realized; it only gets the promise that they have been (because any object of type Collection has to realize them). This same guarantee is there in the first example! So why does the compiler complain about one but not the other?
My question is: at line *, what does the compiler know that it does not at line ERROR?
Protocol-typed values are represented using an 'existential container' (see the great WWDC talk on them, also available on YouTube), which consists of a value buffer of fixed size in order to store the value (if the value size exceeds this, it'll heap allocate), a pointer to the protocol witness table in order to look up method implementations, and a pointer to the value witness table in order to manage the lifetime of the value.
Unspecialised generics use pretty much the same format (I go into this in slightly more depth in this Q&A) – when they're called, pointers to the protocol and value witness tables are passed to the function, and the value itself is stored locally inside the function using a value-buffer, which will heap allocate for values larger than that buffer.
Therefore, because of the sheer similarity in how these are implemented, we can draw the conclusion that not being able to talk in terms of protocols with associated types or Self constraints outside of generics is just a current limitation of the language. There's no real technical reason why it's not possible, it just hasn't been implemented (yet).
Here's an excerpt from the Generics Manifesto on "Generalized existentials", which discusses how this could work in practice:
The restrictions on existential types came from an implementation limitation, but it is reasonable to allow a value of protocol type even when the protocol has Self constraints or associated types. For example, consider IteratorProtocol again and how it could be used as an existential:
protocol IteratorProtocol {
    associatedtype Element
    mutating func next() -> Element?
}

let it: IteratorProtocol = ...
it.next() // if this is permitted, it could return an "Any?", i.e., the existential that wraps the actual element
Additionally, it is reasonable to want to constrain the associated types of an existential, e.g., "a Sequence whose element type is String" could be expressed by putting a where clause into protocol<...> or Any<...> (per "Renaming protocol<...> to Any<...>"):
let strings: Any<Sequence where .Iterator.Element == String> = ["a", "b", "c"]
The leading . indicates that we're talking about the dynamic type, i.e., the Self type that's conforming to the Sequence protocol. There's no reason why we cannot support arbitrary where clauses within the Any<...>.
And from being able to type a value as a protocol with an associated type, it's but a short step to allow for type-casting to that given type, and thus allow something like your first extension to compile.
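(For what it's worth, Swift later moved in exactly this direction: since Swift 5.7, protocols with primary associated types can be used as constrained existentials, so something very close to the Manifesto example now compiles:)

let strings: any Sequence<String> = ["a", "b", "c"]
for s in strings {
    print(s) // each element comes back typed as String
}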

Swift Generic "Numeric" Protocol that Validates Numeric Values

Swift 3.
Ultimately my functions need to receive UInt8 data types, but I'm never sure if the arguments I will receive from callers will be Int, Int64, UInt, Float, etc. I know they will be numeric types, I just don't know which flavor.
I could do:
func foo(value: Int) { }
func foo(value: Float) { }
func foo(value: UInt) { }
But that's crazy. So I thought I could do something like create a protocol
protocol ValidEncodable {
}
And then pass in types that conform:
func foo(value: ValidEncodable) { }
And then in that function I could get my values into the correct format
func foo(value: ValidEncodable) -> UInt8 {
    let correctedValue = min(max(floor(value), 0), 100)
    return UInt8(correctedValue)
}
I'm really struggling to figure out:
1) How to create this ValidEncodable protocol that contains all the numeric types
2) How to do things like floor(value) when the value I get is an Int, without writing a case for every possible numeric type (floor(x) is only available on floating-point types)
Ultimately I need the values to be UInt8 in the range of 0-100. The whole reason for this madness is that I'm parsing XML files to my own internal data structures and I want to bake in some validation to my values.
This can be done without a protocol, by making use of compiler checks, which greatly reduces the chances of bugs.
My recommendation is to use a partial function - i.e. a function that, instead of taking an int, takes an already-validated value. Check this article for a more in-depth description of why partial functions are great.
You can build an Int0to100 struct, with either a failable or throwing initializer (depending on taste):
struct Int0to100 {
    let value: UInt8

    init?(_ anInt: Int) {
        guard anInt >= 0 && anInt <= 100 else { return nil }
        value = UInt8(anInt)
    }

    init?(_ aFloat: Float) {
        let flooredValue = floor(aFloat)
        guard flooredValue >= 0 && flooredValue <= 100 else { return nil }
        value = UInt8(flooredValue)
    }

    // ... other initializers can be added the same way
}
and change foo to accept this type instead:
func foo(value: Int0to100) {
    // do your stuff here; you know for sure, at compile time,
    // that the function can be called with a valid value only
}
You move the responsibility of validating the integer value to the caller; however, the validation resolves to checking an optional, which is easy, and it allows you to handle an invalid number with minimal effort.
Another important aspect is that you explicitly declare the domain of the foo function, which improves the overall design of the code.
And not least, constraints enforced at compile time greatly reduce the potential for runtime issues.
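Call sites then look something like this (a short sketch using the Int0to100 type above):

// Validation happens at the call site; an invalid value simply yields nil.
if let validValue = Int0to100(42) {
    foo(value: validValue)
} else {
    // handle the out-of-range input here
}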
If you know your incoming values will lie in 0..<256, then you can just construct a UInt8 and pass it to the function.
func foo(value: UInt8) -> UInt8 { return value }
let number = arbitraryNumber()
print(foo(value: UInt8(number)))
This will trap at runtime if the value is too large to fit in a byte, but otherwise it will work. You could protect against this type of error by doing some bounds checking between the second and third lines.
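One way to do that bounds check is the failable exact-conversion initializer (a sketch; UInt8(exactly:) is available in Swift 3.1 and later, and arbitraryNumber is the answer's placeholder):

let number = arbitraryNumber()
// UInt8(exactly:) returns nil instead of trapping when the value doesn't fit
if let byte = UInt8(exactly: number) {
    print(foo(value: byte))
} else {
    // handle the out-of-range value here
}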

How do I store a value of type Class<ClassImplementingProtocol> in a Dictionary of type [String:Class<Protocol>] in Swift?

I want to store a more specialized type in a Dictionary of type [String:SomeClass]. Here is some sample code illustrating my problem (also available to play with at https://swiftlang.ng.bluemix.net/#/repl/579756cf9966ba6275fc794a):
class Thing<T> {}
protocol Flavor {}
class Vanilla: Flavor {}
var dict = [String:Thing<Flavor>]()
dict["foo"] = Thing<Vanilla>()
It produces the error ERROR at line 9, col 28: cannot assign value of type 'Thing<Vanilla>' to type 'Thing<Any>?'.
I've tried casting Thing<Vanilla>() as Thing<Flavor> but that produces the error cannot convert value of type 'Thing<Vanilla>' to type 'Thing<Flavor>' in coercion.
I've also tried to define the Dictionary as type [String:Thing<Any>] but that doesn't change anything either.
How do I create a collection of different Things without resorting to plain [String:AnyObject]?
I should also mention that the class Thing is not defined by me (in fact it's BoltsSwift's Task), so the solution of creating a base class of Thing without a type parameter doesn't work.
A Thing<Vanilla> is not a Thing<Flavor>. Thing is not covariant. There is no way in Swift to express that Thing is covariant. There are good reasons for this. If what you were asking for were allowed without careful rules around it, I would be allowed to write the following code:
func addElement(array: inout [Any], object: Any) {
    array.append(object)
}

var intArray: [Int] = [1]
addElement(array: &intArray, object: "Stuff")
Int is a subtype of Any, so if [Int] were a subtype of [Any], I could use this function to append strings to an int array. That breaks the type system. Don't do that.
Depending on your exact situation, there are two solutions. If it is a value type, then repackage it:
let thing = Thing<Vanilla>(value: Vanilla())
dict["foo"] = Thing(value: thing.value)
If it is a reference type, box it with a type eraser. For example:
// struct unless you have to make this a class to fit into the system,
// but then it may be a bit more complicated
struct AnyThing {
    let _value: () -> Flavor
    var value: Flavor { return _value() }

    init<T: Flavor>(thing: Thing<T>) {
        _value = { return thing.value }
    }
}

var dict = [String: AnyThing]()
dict["foo"] = AnyThing(thing: Thing<Vanilla>(value: Vanilla()))
The specifics of the type eraser may be different depending on your underlying type.
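Reading a value back out then goes through the eraser's value property (continuing the sketch above, where Thing is assumed to expose a value):

if let erased = dict["foo"] {
    print(erased.value) // a Vanilla instance, typed as Flavor
}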
BTW: The diagnostics around this have gotten pretty good. If you try to call my addElement above in Xcode 9, you get this:
Cannot pass immutable value as inout argument: implicit conversion from '[Int]' to '[Any]' requires a temporary
What this is telling you is that Swift is willing to pass [Int] where you ask for [Any] as a special-case for Arrays (though this special treatment isn't extended to other generic types). But it will only allow it by making a temporary (immutable) copy of the array. (This is another example where it can be hard to reason about Swift performance. In situations that look like "casting" in other languages, Swift might make a copy. Or it might not. It's hard to be certain.)
One way to solve this is adding an initialiser to Thing and creating a Thing<Flavor> that will hold a Vanilla object.
It will look something like:
class Thing<T> {
    init(thing: T) {
    }
}

protocol Flavor {}
class Vanilla: Flavor {}

var dict = [String: Thing<Flavor>]()
dict["foo"] = Thing<Flavor>(thing: Vanilla())

Type inference fails when using nil-coalescing operator with two optionals

We are trying to figure out whether this is a bug in Swift or us misusing generics, optionals, type inference, and/or the nil-coalescing operator.
Our framework contains some code for parsing dictionaries into models and we've hit a problem with optional properties with default values.
We have a protocol SomeProtocol and two generic functions defined in a protocol extension:
mapped<T>(...) -> T?
mapped<T : SomeProtocol>(...) -> T?
Our structs and classes adhere to this protocol and then parse their properties inside an init function required by the protocol.
Inside the init(...) function we try to set a value of the property someNumber like this:
someNumber = self.mapped(dictionary, key: "someNumber") ?? someNumber
The dictionary of course contains the actual value for key someNumber. However, this will always fail and the actual value will never get returned from the mapped() function.
Either commenting out the second generic function or force-downcasting the value on the right-hand side of the assignment fixes the issue, but we think this should work as currently written.
Below is a complete code snippet demonstrating the issue, along with two options that (temporarily) fix the issue labeled OPTION 1 and OPTION 2 in the code:
import Foundation

// Some protocol
protocol SomeProtocol {
    init(dictionary: NSDictionary?)
}

extension SomeProtocol {
    func mapped<T>(dictionary: NSDictionary?, key: String) -> T? {
        guard let dictionary = dictionary else {
            return nil
        }

        let source = dictionary[key]
        switch source {
        case is T:
            return source as? T
        default:
            break
        }

        return nil
    }

    // ---
    // OPTION 1: Commenting out this makes it work
    // ---
    func mapped<T where T: SomeProtocol>(dictionary: NSDictionary?, key: String) -> T? {
        return nil
    }
}
// Some struct
struct SomeStruct {
    var someNumber: Double? = 0.0
}

extension SomeStruct: SomeProtocol {
    init(dictionary: NSDictionary?) {
        someNumber = self.mapped(dictionary, key: "someNumber") ?? someNumber

        // OPTION 2: Writing this makes it work
        // someNumber = self.mapped(dictionary, key: "someNumber") ?? someNumber!
    }
}
// Test code
let test = SomeStruct(dictionary: NSDictionary(object: 1234.4567, forKey: "someNumber"))
if test.someNumber == 1234.4567 {
    print("success \(test.someNumber!)")
} else {
    print("failure \(test.someNumber)")
}
Please note that this example omits the actual implementations of the mapped functions, but the outcome is identical, and for the sake of this question the code should be sufficient.
EDIT: I had reported this issue a while back and now it was marked as fixed, so hopefully this shouldn't happen anymore in Swift 3.
https://bugs.swift.org/browse/SR-574
You've given the compiler too many options, and it's picking the wrong one (at least not the one you wanted). The problem is that every T can be trivially elevated to T?, including T? (elevated to T??).
someNumber = self.mapped(dictionary, key: "someNumber") ?? someNumber
Wow. Such types. So Optional. :D
So how does Swift begin to figure this out? Well, someNumber is Double?, so it tries to turn this into:
Double? = Double?? ?? Double?
Does that work? Let's look for a generic mapped, starting at the most specific.
func mapped<T where T:SomeProtocol>(dictionary: NSDictionary?, key: String) -> T? {
To make this work, T has to be Double?. Is Double?:SomeProtocol? Nope. Moving on.
func mapped<T>(dictionary: NSDictionary?, key: String) -> T? {
Does this work? Sure! T can be Double?. We return Double?? and everything resolves.
So why does this one work?
someNumber = self.mapped(dictionary, key: "someNumber") ?? someNumber!
This resolves to:
Double? = Optional(Double? ?? Double)
And then things work the way you think they're supposed to.
Be careful with so many Optionals. Does someNumber really have to be Optional? Should any of these things throw? (I'm not suggesting throw is a general work-around for Optional problems, but at least this problem gives you a moment to consider if this is really an error condition.)
It is almost always a bad idea to type-parameterize exclusively on the return value in Swift the way mapped does. This tends to be a real mess in Swift (or any generic language that has lots of type inference, but it really blows up in Swift when there are Optionals involved). Type parameters should generally appear in the arguments. You'll see the problem if you try something like:
let x = test.mapped(...)
It won't be able to infer the type of x. This isn't always an anti-pattern, and sometimes the hassle is worth it (and in fairness, the problem you're solving may be one of those cases), but avoid it if you can.
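Pinning down the return type explicitly resolves the inference (a sketch; someDictionary stands in for whatever NSDictionary you are parsing):

let x: Double? = test.mapped(someDictionary, key: "someNumber")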
But it's the Optionals that are killing you.
EDIT: Dominik asks a very good question about why this behaves differently when the constrained version of mapped is removed. I don't know. Obviously the type matching engine checks for valid types in a little different order depending on how many ways mapped is generic. You can see this by adding print(T.self) to mapped<T>. That might be considered a bug in the compiler.