Ambiguous use of operator '??' - swift

The article I link to claims that ?? is very time-consuming, and my test confirmed it. So I want to optimize it like this:
#if DEBUG
public func ?? <T>(left: T?, right: T) -> T {
    guard let value = left else {
        return right
    }
    return value
}
#endif
BUT
string = string ?? ""
ERROR: Ambiguous use of operator '??'

The article you link to does not describe the nil coalescing operator as "time consuming" at runtime; it states the simple fact that a compound expression chaining two nil coalescing operator calls along with other evaluations takes significantly longer to build than the same logic broken down into smaller parts, especially if the compound expression contains lazy evaluations at some point (which is the case, e.g., for the evaluation of the rhs of the ?? operator). By breaking down complex expressions we help the compiler out; Swift has been known to have difficulty compiling complex compound statements, so this is expected. It should, however, generally not affect runtime performance, provided you build your implementation in release mode.
Overloading the ?? operator during debug with a non-lazy evaluation of the rhs, while still using long compound statements, won't fully fix the issue described above. If compile time is really an issue for you, use the same approach as the author of the article: break down your complex expressions into several smaller ones, doing some of the compiler's work yourself. Refactoring with the sole purpose of decreasing compile times (which we might believe is the job of the compiler, not us) can be frustrating at times, but is an issue we currently have to live with when working with Swift. See e.g. the following excellent article covering the Swift compiler's issues with long/complex expressions:
Exponential time complexity in the Swift type checker
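To illustrate what such a breakdown looks like (a minimal sketch with made-up values, not taken from the article):

// One compound statement forces the type checker to solve
// the whole expression at once:
// let label = (a ?? b ?? "none") + " / " + (c ?? "none")

let a: String? = nil
let b: String? = "b"
let c: String? = nil

// The same logic as smaller statements, each cheap to type-check:
let first = a ?? (b ?? "none")
let second = c ?? "none"
let label = first + " / " + second
print(label) // prints: b / none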
W.r.t. the ambiguity error: for reference, the official implementation of the nil coalescing operator can be found here:
@_transparent
public func ?? <T>(optional: T?, defaultValue: @autoclosure () throws -> T)
    rethrows -> T {
    switch optional {
    case .some(let value):
        return value
    case .none:
        return try defaultValue()
    }
}
And we may note that it has the same specificity as your own generic implementation (compare the signatures!), meaning the ambiguity error is to be expected. E.g., compare with:
// due to the '@autoclosure', these two have the
// same precedence/specificity in the overload
// resolution of 'foo(_:)'
func foo<T>(_ obj: T) {}
func foo<T>(_ closure: @autoclosure () -> T) {}

foo("foo") // ERROR: ambiguous use of 'foo'
When you implement your ?? overload as in your answer, typing out Any explicitly, the implementation becomes more specific than the generic one, which means it takes precedence in the overload resolution of ??. Using Any in such contexts, however, is generally a bad idea: it attempts to mimic dynamic typing rather than relying on Swift's renowned static typing.

If I overload ??, I can't use <T>...
so then
#if DEBUG
public func ?? (left: Any?, right: Any) -> Any {
    guard let value = left else {
        return right
    }
    return value
}
#endif
compiles OK!
BUT now I can't get the type of the return value back... 😭
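That loss is inherent to the Any overload: Any erases the static type, so the only way to get a typed value back out is a conditional cast at the call site. A minimal sketch (hypothetical values):

let stored: Any? = "hello"
let anyResult: Any = stored ?? "fallback"   // result is typed Any, not String

if let text = anyResult as? String {        // recover the concrete type manually
    print(text)                             // prints: hello
}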


Why does the nil-coalescing operator return an optional?

I use Swift's nil-coalescing operator in a chain to print one of three values. The first two are optional, the last one is non-optional:
let optionalString: String? = nil
let optionalError: Error? = MyError.anError
let defaultString: String = "default"
print(optionalString ?? (optionalError ?? defaultString))
I was surprised to see that this code prints
Optional(MyError.anError)
to the console. I was expecting a non-optional:
MyError.anError
Why is this an optional?
The error is defined as follows:
enum MyError: Error {
    case anError
}
I know that there are two implementations of the nil-coalescing operator:
func ?? <T>(optional: T?, defaultValue: @autoclosure () throws -> T) rethrows -> T
and
func ?? <T>(optional: T?, defaultValue: @autoclosure () throws -> T?) rethrows -> T?
According to this, ?? should return an optional (T?) only if the defaultValue is an optional as well (T?). But this is not the case:
The default value for the first ?? operator is (optionalError ?? defaultString). To evaluate this value's type, we need to evaluate the result of the second ?? operator. As defaultString is of type String and thus non-optional, function (1) is used for the second nil-coalescing operator. Hence, the result of (optionalError ?? defaultString) must be non-optional as well, and it is the default value for the first ?? operator. So again, function (1) is used and thus optionalString ?? (optionalError ?? defaultString) should return a non-optional.
Apparently, there is a flaw in my logic, but I don't see it. Any explanation for this?
Disclaimer: I wasn't sure if I should add this as a supplement to my question or if I should post it as an answer. I decided for the latter. It's not a bullet-proof explanation, but I think it's "close enough" for many who might run into the same problem.
What's going wrong: The Big Picture
The two comments stating that the code provided doesn't compile for the commentators pushed me in the right direction:
I copied the code in a blank playground and ran it. For me, it did compile, but it also showed a warning which it didn't show before (because for some reason, Auto completion and compiler warnings were broken in the original project).
The warning
Expression implicitly coerced from 'Any?' to 'Any'
tells me that the expression (optionalError ?? defaultString) is apparently evaluated as type Any?, which is an optional. The compiler then chooses the ?? operator function (1) to evaluate the entire expression which requires that its default value (optionalError ?? defaultString) is non-optional. Thus, this value is implicitly coerced from Any? to Any. And that's why Optional(MyError.anError) is printed.
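The effect is easy to reproduce in isolation (a minimal sketch reusing the MyError enum from the question): coercing an Any? to Any does not unwrap it, so the Optional wrapper survives and shows up when printed.

let value: Any? = MyError.anError
let coerced: Any = value as Any  // explicit form of the implicit coercion from the warning
print(coerced)                   // prints: Optional(MyError.anError)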
Why is this going wrong?
There are two questions remaining:
1. Why does the compiler choose operator function (1)?
2. Why does (optionalError ?? defaultString) evaluate to Any??
I don't have an answer for question 1, but my guess would be that the compiler simply assigns a higher precedence to function (1) when the parameter types don't match and it's unclear which function to use. (I'll be happy to be corrected.)
Regarding question 2:
optionalError is of type MyError? and defaultString is of type String. The signature of the ?? operator functions read:
func ?? <T>(optional: T?, defaultValue: @autoclosure () throws -> T) rethrows -> T
func ?? <T>(optional: T?, defaultValue: @autoclosure () throws -> T?) rethrows -> T?
So both parameters, optional and defaultValue must have the "same" generic type (only that in the first case, one parameter is an optional of that particular type and the other one is not). MyError and String are obviously not the same type. So in order for this to match the criteria of the function signature, the compiler must coerce both types to Any.
So MyError? is coerced to Any?, and apparently String is also coerced to Any?, and thus function (2) is used, which evaluates (optionalError ?? defaultString) to Any?.
That again raises the question why function (2) is used in this case, which somewhat contradicts my assumption for question (1); but my main take-away here is that it's simply a bad idea to use two different types with the nil-coalescing operator.
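One way to act on that take-away (a hedged sketch, reusing the declarations from the question): map the error to a String first, so the whole chain coalesces over a single type and no coercion to Any? is needed.

// optionalError.map { ... } has type String?, matching optionalString:
let message = optionalString
    ?? optionalError.map { String(describing: $0) }
    ?? defaultString

print(message) // prints: anError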
Background:
I got this code example from a textbook on RxSwift which is slightly different as it uses a generic type that is required to conform to CustomStringConvertible:
func print<T: CustomStringConvertible>(label: String, event: Event<T>) {
    print(label, event.element ?? event.error ?? event)
}
What got me experimenting with this in the first place was that when I ran the code from the book it printed an optional whereas the book claimed that it would print a non-optional.

Swift short syntax of execution

I am looking for a way to write short syntax. For instance, in JS, PHP, etc.:
var a = 1;
function Foo() {}
a && Foo();
If a exists (is truthy), Foo runs. a and Foo themselves already express existence or not, and the syntax looks way better...
However, in Swift, the type checking is kind of tough:
var a = 1
func Foo() -> Void {}
a && Foo()
generates an error that neither operand returns Bool.
a != nil && Foo()
resolves the variable condition, but is there a better bypass for the function condition? I just don't want to write something like
if a != nil { Foo() }
And what is the better syntax for "does not exist"?
if !a or just !a // is easy and looks better...
I found no similar thing in Swift...
if a == nil // throws an error when a is not an Optional type
guard var b = xxx else {} // simply checks existence, and is very long syntax
Thank you for your advice!
As mentioned by other contributors, Swift emphasizes readability and thus, explicit syntax. It would be sacrilege for the Swift standard library to support Python-style truth value testing.
That being said, Swift’s extensibility allows us to implement such functionality ourselves—if we really want to.
prefix func !<T>(value: T) -> Bool {
    switch T.self {
    case is Bool.Type:
        return value as! Bool
    default:
        guard Double(String(describing: value)) != 0
        else { return false }
        return true
    }
}

prefix func !<T>(value: T?) -> Bool {
    guard let unwrappedValue = value
    else { return false }
    return !unwrappedValue
}

var a = 1
func foo() -> Void { }

!a && !foo()
Or even define our own custom operator:
prefix operator ✋

prefix func ✋<T>(value: T) -> Bool {
    /// Same body as the previous example.
}

prefix func ✋<T>(value: T?) -> Bool {
    guard let unwrappedValue = value
    else { return false }
    return ✋unwrappedValue
}

var a = 1
func foo() -> Void { }

✋a && ✋foo()
The expectations you've developed from dynamic languages like PHP and JS (and Ruby, Python for that matter) are almost universally inapplicable to static languages like Swift.
Swift is a statically compiled language. If you reference a variable that doesn't exist, it's not legal Swift code, and the compiler will fail your build. Given that, the question of "how do I check if a variable is undefined?" is completely moot in Swift. If you have a successfully compiling program that references a variable a, then a exists. There's absolutely no reason for a check, and so a mechanism for it doesn't even exist.
Static vs Dynamic typing
Static type systems are like mathematical proof systems. They produce rigorous proofs that certain aspects of your program are valid. This has trade-offs. The rigidity buys you many guarantees. For example, you'll never have a valid Swift program where you accidentally pass an Int where a Bool is expected. The static type system makes that class of error literally impossible, so it's not something you have to remember to check for yourself.
On the other hand, many truths are easier to intuit than to prove. Thus, there's great utility in scripting and dynamic languages, because they don't demand the rigorous proofs of your claims that static languages require. On the down side, their type systems "do" much less. For example, JS happily lets you reference an undefined variable. To remedy this, JS provides a way for you to do a run-time check to see whether a variable is defined or not. But this isn't a problem Swift has, so the "solution" is absent.
When static typing is too hard
Swift actually takes a middle ground position. If you find yourself with a statement that's obviously true, but hard to prove to the compiler, various "escape hatches" exist that allow you to leave the safety of the type system and go into dynamic land. For example, if you look at an IBOutlet and see that it's connected to an element in a storyboard, you can intuitively be sure that the IBOutlet is not nil. But that's not something you can prove to the compiler, and hence why you see implicitly unwrapped optionals being used for IBOutlets.
Implicitly unwrapped optionals are one such "escape hatch". The Any type is another, as is unsafeBitcast(_:to:), withoutActuallyEscaping(_:), as!, try!, etc.
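For instance (a hypothetical sketch; the class and outlet names are made up), the conventional IBOutlet pattern leans on exactly that escape hatch:

import UIKit

final class ProfileViewController: UIViewController {
    // Connected in the storyboard; we "know" it's non-nil after loading,
    // but we can't prove that to the compiler, so we opt out with `!`.
    @IBOutlet weak var nameLabel: UILabel!

    override func viewDidLoad() {
        super.viewDidLoad()
        nameLabel.text = "Hello" // traps at runtime if the outlet isn't connected
    }
}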
Swift takes type safety very seriously. Unlike C or JS, we cannot use anything that doesn't resolve to a Bool value in an if statement in Swift, so there is no shorthand for that (at least none that I know of). Regarding the code below:
if a == nil // throws an error when a is not an Optional type
Swift doesn't allow you to assign nil to non-optional types, so there is no need to check for nil. By the way, both Obj-C and Swift use verbose syntax; we need to get used to that.
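A quick illustration of that point (a minimal sketch):

var count: Int = 10
// count = nil       // does not compile: 'nil' cannot be assigned to type 'Int'

var maybeCount: Int? = 10
maybeCount = nil     // fine: only Optionals can hold nil, so only they need the check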
In this case you are trying to force Swift to work in a way that you are used to with other languages like JavaScript or PHP, as you say in your comment. There are a few reasons why your code won't compile, but it mainly falls on the fact that Swift doesn't do the same truthy and falsy stuff JS does.
var a = 1
if a {
    print("won't compile")
}
// error: 'Int' is not convertible to 'Bool'
In Swift it's better to use an actual Bool value if that's what it's supposed to be; or, if it's truly supposed to be an Int, you're just going to have to check the value:
var a = true
if a {
    print("this compiles")
}
or
var a = 1
if a > 0 {
    print("this compiles too")
}
Swift really isn't meant to be as loose as JS, so you should just embrace that and take advantage of the safety and readability.
Here is one way most similar to what you designed.
You may have to set the type of a to Int?:
var a: Int? = 1
func foo() -> Void {}

a.map { _ in foo() }
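For completeness (a small sketch, not from the original answer): map on an Optional runs the closure only when the value is non-nil, which is exactly the a && Foo() behavior asked for. Assigning to _ silences the unused-result warning.

var a: Int? = 1
func foo() { print("called") }

_ = a.map { _ in foo() }  // prints "called" because a is non-nil
a = nil
_ = a.map { _ in foo() }  // does nothing: map on nil is a no-op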

Type inference fails when using nil-coalescing operator with two optionals

We are trying to figure out whether this is a bug in Swift or us misusing generics, optionals, type inference and/or the nil coalescing operator.
Our framework contains some code for parsing dictionaries into models and we've hit a problem with optional properties with default values.
We have a protocol SomeProtocol and two generic functions defined in a protocol extension:
mapped<T>(...) -> T?
mapped<T : SomeProtocol>(...) -> T?
Our structs and classes adhere to this protocol and then parse their properties inside an init function required by the protocol.
Inside the init(...) function we try to set a value of the property someNumber like this:
someNumber = self.mapped(dictionary, key: "someNumber") ?? someNumber
The dictionary of course contains the actual value for key someNumber. However, this will always fail and the actual value will never get returned from the mapped() function.
Either commenting out the second generic function or force downcasting the value on the rhs of the assignment will fix this issue, but we think this should work the way it currently is written.
Below is a complete code snippet demonstrating the issue, along with two options that (temporarily) fix the issue labeled OPTION 1 and OPTION 2 in the code:
import Foundation

// Some protocol
protocol SomeProtocol {
    init(dictionary: NSDictionary?)
}

extension SomeProtocol {
    func mapped<T>(dictionary: NSDictionary?, key: String) -> T? {
        guard let dictionary = dictionary else {
            return nil
        }
        let source = dictionary[key]
        switch source {
        case is T:
            return source as? T
        default:
            break
        }
        return nil
    }

    // ---
    // OPTION 1: Commenting out this makes it work
    // ---
    func mapped<T where T: SomeProtocol>(dictionary: NSDictionary?, key: String) -> T? {
        return nil
    }
}

// Some struct
struct SomeStruct {
    var someNumber: Double? = 0.0
}

extension SomeStruct: SomeProtocol {
    init(dictionary: NSDictionary?) {
        someNumber = self.mapped(dictionary, key: "someNumber") ?? someNumber
        // OPTION 2: Writing this makes it work
        // someNumber = self.mapped(dictionary, key: "someNumber") ?? someNumber!
    }
}

// Test code
let test = SomeStruct(dictionary: NSDictionary(object: 1234.4567, forKey: "someNumber"))
if test.someNumber == 1234.4567 {
    print("success \(test.someNumber!)")
} else {
    print("failure \(test.someNumber)")
}
Please note, that this is an example which misses the actual implementations of the mapped functions, but the outcome is identical and for the sake of this question the code should be sufficient.
EDIT: I had reported this issue a while back and now it was marked as fixed, so hopefully this shouldn't happen anymore in Swift 3.
https://bugs.swift.org/browse/SR-574
You've given the compiler too many options, and it's picking the wrong one (at least not the one you wanted). The problem is that every T can be trivially elevated to T?, including T? (elevated to T??).
someNumber = self.mapped(dictionary, key: "someNumber") ?? someNumber
Wow. Such types. So Optional. :D
So how does Swift begin to figure this thing out? Well, someNumber is Double?, so it tries to turn this into:
Double? = Double?? ?? Double?
Does that work? Let's look for a generic mapped, starting at the most specific.
func mapped<T where T:SomeProtocol>(dictionary: NSDictionary?, key: String) -> T? {
To make this work, T has to be Double?. Is Double?:SomeProtocol? Nope. Moving on.
func mapped<T>(dictionary: NSDictionary?, key: String) -> T? {
Does this work? Sure! T can be Double?, so we return Double?? and everything resolves.
So why does this one work?
someNumber = self.mapped(dictionary, key: "someNumber") ?? someNumber!
This resolves to:
Double? = Optional(Double? ?? Double)
And then things work the way you think they're supposed to.
Be careful with so many Optionals. Does someNumber really have to be Optional? Should any of these things throw? (I'm not suggesting throw is a general work-around for Optional problems, but at least this problem gives you a moment to consider if this is really an error condition.)
It is almost always a bad idea to type-parameterize exclusively on the return value in Swift the way mapped does. This tends to be a real mess in Swift (or any generic language that has lots of type inference, but it really blows up in Swift when there are Optionals involved). Type parameters should generally appear in the arguments. You'll see the problem if you try something like:
let x = test.mapped(...)
It won't be able to infer the type of x. This isn't an anti-pattern, and sometimes the hassle is worth it (and in fairness, the problem you're solving may be one of those cases), but avoid it if you can.
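A small sketch of the usual workaround at such a call site (hypothetical dictionary, reusing the question's code): annotate the result so the compiler doesn't have to infer T from nothing.

let dictionary = NSDictionary(object: 1234.4567, forKey: "someNumber")

// let x = test.mapped(dictionary, key: "someNumber")
//     error: the compiler cannot infer T here

// Annotating the result pins T down explicitly:
let x: Double? = test.mapped(dictionary, key: "someNumber")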
But it's the Optionals that are killing you.
EDIT: Dominik asks a very good question about why this behaves differently when the constrained version of mapped is removed. I don't know. Obviously the type matching engine checks for valid types in a little different order depending on how many ways mapped is generic. You can see this by adding print(T.self) to mapped<T>. That might be considered a bug in the compiler.

Can a condition be used to determine the type of a generic?

I will first explain what I'm trying to do and how I got to where I got stuck before getting to the question.
As a learning exercise for myself, I took some problems that I had already solved in Objective-C to see how I can solve them differently with Swift. The specific case that I got stuck on is a small piece that captures a value before and after it changes and interpolates between the two to create keyframes for an animation.
For this I had an object Capture with properties for the object, the key path and two id properties for the values before and after. Later, when interpolating the captured values I made sure that they could be interpolated by wrapping each of them in a Value class that used a class cluster to return an appropriate class depending on the type of value it wrapped, or nil for types that wasn't supported.
This works, and I am able to make it work in Swift as well following the same pattern, but it doesn't feel Swift like.
What worked
Instead of wrapping the captured values as a way of enabling interpolation, I created a Mixable protocol that the types could conform to and used a protocol extension for when the type supported the necessary basic arithmetic:
protocol SimpleArithmeticType {
    func +(lhs: Self, right: Self) -> Self
    func *(lhs: Self, amount: Double) -> Self
}

protocol Mixable {
    func mix(with other: Self, by amount: Double) -> Self
}

extension Mixable where Self: SimpleArithmeticType {
    func mix(with other: Self, by amount: Double) -> Self {
        return self * (1.0 - amount) + other * amount
    }
}
This part worked really well and enforced homogeneous mixing (that a type could only be mixed with its own type), which wasn't enforced in the Objective-C implementation.
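For instance (a hedged sketch, not from the original post), Double already provides the two required operators, so conforming it is just a matter of declaring the conformances:

extension Double: SimpleArithmeticType {} // + and * (by a Double amount) already exist
extension Double: Mixable {}              // mix(with:by:) comes from the protocol extension

let halfway = 0.0.mix(with: 10.0, by: 0.5) // 5.0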
Where I got stuck
The next logical step, and this is where I got stuck, seemed to be to make each Capture instance (now a struct) hold two variables of the same mixable type instead of two AnyObject. I also changed the initializer argument from an object and a key path to a closure that returns an object: () -> T
struct Capture<T: Mixable> {
    typealias Evaluation = () -> T

    let eval: Evaluation
    let before: T
    var after: T {
        return eval()
    }

    init(eval: Evaluation) {
        self.eval = eval
        self.before = eval()
    }
}
This works when the type can be inferred, for example:
let captureInt = Capture {
    return 3.0
}
// > Capture<Double>
but not with key-value coding, which returns AnyObject:
let captureAnyObject = Capture {
    return myObject.valueForKeyPath("opacity")!
}
// error: cannot invoke initializer for type 'Capture' with an argument list of type '(() -> _)'
AnyObject does not conform to the Mixable protocol, so I can understand why this doesn't work. But I can check what type the object really is, and since I'm only covering a handful of mixable types, I thought I could cover all the cases and return the correct type of Capture. To see if this could even work, I made an even simpler example.
A simpler example
struct Foo<T> {
    let x: T
    init(eval: () -> T) {
        x = eval()
    }
}
which works when type inference is guaranteed:
let fooInt = Foo {
    return 3
}
// > Foo<Int>

let fooDouble = Foo {
    return 3.0
}
// > Foo<Double>
But not when the closure can return different types
let condition = true
let foo = Foo {
    if condition {
        return 3
    } else {
        return 3.0
    }
}
// error: cannot invoke initializer for type 'Foo' with an argument list of type '(() -> _)'
I'm not even able to define such a closure on its own.
let condition = true // as simple as it could be
let evaluation = {
    if condition {
        return 3
    } else {
        return 3.0
    }
}
// error: unable to infer closure type in the current context
My Question
Is this something that can be done at all? Can a condition be used to determine the type of a generic? Or is there another way to hold two variables of the same type, where the type was decided based on a condition?
Edit
What I really want is to:
capture the values before and after a change and save the pair (old + new) for later (a heterogeneous collection of homogeneous pairs).
go through all the collected values and get rid of the ones that can't be interpolated (unless this step could be integrated with the collection step)
interpolate each homogeneous pair individually (mixing old + new).
But it seems like this direction is a dead end when it comes to solving that problem. I'll have to take a couple of steps back and try a different approach (and probably ask a different question if I get stuck again).
As discussed on Twitter, the type must be known at compile time. Nevertheless, for the simple example at the end of the question you could just explicitly type the closure
let evaluation: () -> Double = { ... }
and it would work, since both literals can then be inferred as Double.
So in the case of Capture and valueForKeyPath: IMHO you should cast (either safely or with a forced cast) the value to the Mixable type you expect the value to be, and it should work fine. After all, I'm not sure valueForKeyPath: is supposed to return different types depending on a condition.
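Something along these lines (a hedged sketch; myObject and the key path are hypothetical, and Double is assumed to conform to Mixable as in the question):

let captureOpacity = Capture { () -> Double in
    // Force the KVC result into the Mixable type we expect it to be:
    myObject.valueForKeyPath("opacity") as! Double
}
// > Capture<Double>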
What is the exact case where you would like to return 2 totally different types (that can't be implicitly casted as in the simple case of Int and Double above) in the same evaluation closure?
in my full example I also have cases for CGPoint, CGSize, CGRect, CATransform3D
The limitations are just as you have stated, because of Swift's strict typing. All types must be definitely known at compile time, and each thing can be of only one type - even a generic (it is resolved by the way it is called at compile time). Thus, the only thing you can do is turn your type into an umbrella type that is much more like Objective-C itself:
let condition = true
let evaluation = {
    () -> NSObject in // *
    if condition {
        return 3
    } else {
        return NSValue(CGPoint: CGPointMake(0, 1))
    }
}

Implementing an enum ForwardIndexType

I have been struggling to properly implement the ForwardIndexType protocol for an enum, in particular the handling of the end case (i.e. for the last item, which has no successor). This protocol is not really covered in the Swift Language book.
Here is a simple example
enum ThreeWords: Int, ForwardIndexType {
    case one = 1, two, three

    func successor() -> ThreeWords {
        return ThreeWords(rawValue: self.rawValue + 1)!
    }
}
The successor() function will return the next enumerator value, except for the last element, where it will fail with an exception, because there is no value after .three
The ForwardIndexType protocol does not allow successor() to return an optional, so there seems to be no way of signalling that there is no successor.
Now using this in a for loop to iterate over the closed range of all the possible values of an enum, one runs into a problem for the end case:
for word in ThreeWords.one...ThreeWords.three {
    print(" \(word.rawValue)")
}
println()

// Crashes with the error:
// fatal error: unexpectedly found nil while unwrapping an Optional value
Swift inexplicably calls the successor() function of the end value of the range, before executing the statements in the for loop. If the range is left half open ThreeWords.one..<ThreeWords.three then the code executes correctly, printing 1 2
If I modify the successor function so that it does not try to create a value larger than .three like this
func successor() -> ThreeWords {
    if self == .three {
        return .three
    } else {
        return ThreeWords(rawValue: self.rawValue + 1)!
    }
}
Then the for loop does not crash, but it also misses the last iteration, printing the same as if the range was half open 1 2
My conclusion is that there is a bug in swift's for loop iteration; it should not call successor() on the end value of a closed range. Secondly, the ForwardIndexType ought to be able to return an optional, to be able to signal that there is no successor for a certain value.
Has anyone had more success with this protocol ?
Indeed, it seems that successor will be called on the last value.
You may wish to file a bug, but to work around this you could simply add a sentinel value to act as a successor.
It seems the ... operator
func ...<Pos : ForwardIndexType>(minimum: Pos, maximum: Pos) -> Range<Pos>
calls maximum.successor(). It constructs Range<T> like
Range(start: minimum, end: maximum.successor())
So, if you want to use an enum as a Range.Index, you have to define a successor for the last value.
enum ThreeWords: Int, ForwardIndexType {
    case one = 1, two, three
    case EXHAUST

    func successor() -> ThreeWords {
        return ThreeWords(rawValue: self.rawValue + 1) ?? ThreeWords.EXHAUST
    }
}
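With the sentinel in place, the closed range from the question iterates as expected (a small sketch based on the code above):

// one...three builds Range(start: .one, end: .three.successor()),
// i.e. Range(.one, .EXHAUST), so .three is included and .EXHAUST is never visited
for word in ThreeWords.one...ThreeWords.three {
    print(" \(word.rawValue)") // prints 1 2 3
}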
This is an old question, but I would like to sum up some things and post another possible solution.
As @jtbandes and @rintaro already stated, a closed range created with the start...end operator is internally created with start..<end.successor().
AFAIK this is an intentional behavior in Swift.
In many cases you can also use an Interval where you thought about using a Range or where Swift declared a Range by default. The point here is that intervals aren't collections.
So this is not possible with Intervals
for word in ThreeWords.one...ThreeWords.three {...}
================
For the following I assume the snippet above was just a debug case to cross-check the values.
To declare an Interval you need to explicitly specify the type: either a HalfOpenInterval (..<) or a ClosedInterval (...).
var interval: ClosedInterval = ThreeWords.one...ThreeWords.four
This requires you to make your enumeration Comparable. Although Int is Comparable already, you still need to add it to the inheritance list
enum ThreeWords: Int, ForwardIndexType, Comparable {
    case one = 1, two, three, four

    func successor() -> ThreeWords {
        return ThreeWords(rawValue: self.rawValue + 1)!
    }
}
And finally the enumeration needs to conform to Comparable by implementing the < operator. This is a generic approach, since your enumeration also conforms to the protocol RawRepresentable:
func < <T: RawRepresentable where T.RawValue: Comparable>(lhs: T, rhs: T) -> Bool {
    return lhs.rawValue < rhs.rawValue
}
Like I wrote, you can't iterate over an Interval in a loop anymore, but you can have a quick cross-check using a switch:
var interval: ClosedInterval = ThreeWords.one...ThreeWords.four

switch ThreeWords.four {
case ThreeWords.one...ThreeWords.two:
    print("contains one or two")
case let word where interval ~= word:
    print("contains: \(word) with raw value: \(word.rawValue)")
default:
    print("no case")
}
prints "contains: four with raw value: 4"