Binary operator '===' cannot be applied to two 'String' operands

Why can't the === operator be used with Strings in Swift? I am unable to compile the following:
let string1 = "Bob"
let string2 = "Fred"
if string1 === string2 {
    ...
}
and get the following error (on the if line):
Binary operator '===' cannot be applied to two 'String' operands
What I want to be able to do in my unit tests is, having performed a copyWithZone:, verify that the two objects are indeed different objects with different pointers, even if their values are the same. The following code doesn't work:
XCTAssertFalse(object1.someString === object2.someString)
If anyone knows of an alternative way, please advise.

string1 and string2 are not NSString, but String. Since String is a value type, not a reference type, there is no reference that could be compared with ===.

Swift's === operator, by default, is only defined for classes.
Swift's String type is not a class but a struct. It does not inherit from AnyObject and therefore cannot be compared by reference.
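To illustrate the distinction (a quick sketch of my own, not from the answer):
class Box {}
let a = Box()
let b = a
a === b  // true: both names refer to the same class instance

let s1 = "Bob"
let s2 = "Bob"
// s1 === s2  -- error: String is a struct, so there is no identity to compare
s1 == s2 // true: Strings support value equality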
You could of course implement an === operator for String in Swift, but I'm not sure how it would be any different from the implementation of == for Swift's String type.
func ===(lhs: String, rhs: String) -> Bool {
    return lhs == rhs
}
If, on the other hand, you really wanted to compare the references, I suppose you could do something like this:
func ===(lhs: String, rhs: String) -> Bool {
    // Note: unsafeAddressOf takes an AnyObject, so each String is bridged
    // to a (possibly temporary) NSString first, which makes this unreliable.
    return unsafeAddressOf(lhs) == unsafeAddressOf(rhs)
}
However, for the sake of tests, rather than using the == or === operators, you should use the appropriate assertions:
XCTAssertEqual(foo, bar)
XCTAssertNotEqual(foo, bar)

The === operator is the identity operator. It checks whether two variables or constants refer to the same instance of a class. Strings are not classes (they are structs), so the === operator does not apply to them.
If you want to check if two strings are the same, use the equality operator == instead.
Read all about the identity operator in the Swift documentation.
You can just check two objects for identity directly, instead of checking a property of type String.
XCTAssertFalse(object1 === object2)

Swift Strings are value types, not reference types, so there's no need for that: a copy will always be an independent value.
You should just compare by value with ==.

If you try really hard, you can force things to happen, but I'm not sure what that buys you.
class MyClass: NSObject, NSCopying {
    var someString: NSString = ""

    required override init() {
        super.init()
    }

    func copyWithZone(zone: NSZone) -> AnyObject {
        let copy = self.dynamicType.init()
        copy.someString = someString.copy() as? NSString ?? ""
        return copy
    }
}
let object1 = MyClass()
object1.someString = NSString(format: "%d", arc4random())
let object2 = object1.copy()
if object1.someString === object2.someString {
    print("identical")
} else {
    print("different")
}
This prints identical; the system is really good at conserving strings.
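That result follows from Cocoa's copying conventions (a sketch of my own, in the same Swift 2-era style as the code above):
import Foundation

let original: NSString = NSString(format: "%d", arc4random())
let copied = original.copy() as! NSString

// Immutable objects conventionally implement copy() as "return self",
// so the "copy" is the very same instance as the original.
print(original === copied) // prints "true"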


What's the equivalent of Obj.C's NSMutableDictionary<Class, Class> in Swift?
I tried:
var dictionary = [AnyClass: AnyClass]()
However, this throws the error: Type 'AnyClass' (aka 'AnyObject.Type') does not conform to protocol 'Hashable'
Since there can only be one class per name, we know a given class reference refers to a unique class; there is only one "String" in a given namespace. So why is this not hashable?
I also tried:
var dictionary = NSMutableDictionary<AnyClass, AnyClass>
However, this also fails with: Cannot specialize non-generic type 'NSMutableDictionary'
I thought Swift was supposed to be type-safe, but here the compiler is encouraging me to just throw anything into this NSMutableDictionary without type-checking it to make sure it's an AnyClass!
Also, DO NOT lecture me about "You shouldn't be doing that in the first place," because I am not doing it; it's already like that in some Objective-C code that I am required to translate into Swift. I am simply trying to do it in the best possible way. If it means I must resort to a non-type-safe NSMutableDictionary, then so be it; however, this seems ridiculous.
Surely I'm missing something here... what am I missing?
The closest you can get in Swift, I discovered, is to do this:
import Foundation

var classToClassMapping = Dictionary<ObjectIdentifier, AnyClass>()

extension Dictionary where Key == ObjectIdentifier, Value == AnyClass {
    subscript<T>(keyType: T.Type) -> AnyClass? {
        get {
            let id = ObjectIdentifier(keyType)
            return self[id]
        }
        set {
            let id = ObjectIdentifier(keyType)
            self[id] = newValue
        }
    }
}

class Yay {} // stand-in for whatever class you use as a key

classToClassMapping[Yay.self] = NSString.self

if let stringClass = classToClassMapping[Yay.self] as? NSString.Type {
    print(stringClass.init(string: "hell yeah"))
}
// Prints "hell yeah"

// Alternative:
switch classToClassMapping[Yay.self] {
case let val as NSString.Type:
    print(val.init(string: "yaiirrr boy"))
default:
    print("woops")
}
// prints "yaiirrr boy"
Works perfectly for my needs! (Using Swift 5.1 here)

When filtering an array literal in Swift, why does the result contain optionals?

A contrived example to be sure, but why is the result an array of optionals?
let r = [1,2,3].filter { sourceElement in
    return !["1", "2"].contains { removeElement in
        sourceElement == Int(removeElement)
    }
}
print(r.dynamicType)
Either type casting the source array or assigning it to a variable returns an array of Ints.
let seq = [1,2,3]
let r2 = seq.filter { sourceElement in
    return !["1", "2"].contains { removeElement in
        sourceElement == Int(removeElement)
    }
}
print(r2.dynamicType) // "Array<Int>\n"
Shouldn't both results be of the same type?
I don’t think it’s necessarily a bug, though it is confusing. It’s a question of where the promotion to optional happens in order to make the whole statement compile. A shorter repro with the same behavior would be:
let i: Int? = 1
// x will be [Int?]
let x = [1,2,3].filter { $0 == i }
Bear in mind that when you write nonOptional == someOptional, the type of the lhs must be implicitly promoted to optional for it to work, because the == being used is this one, in which both sides must be optional:
public func ==<T>(lhs: T?, rhs: T?) -> Bool
The compiler needs to promote something in this entire statement to be an optional, and what it chose was the integer literals inside [1,2,3]. You were instead expecting the promotion to happen at the point of the ==, so you could compare the non-optional sourceElement with the optional result of Int(_: String), but this isn’t necessarily guaranteed (I’m not sure to what extent the ordering/precedence of these promotions is specced versus just being the way the compiler was coded).
The reason this doesn’t happen in the two-line version is that when you write let seq = [1,2,3] on its own line, the type of seq is decided right there. Then on the next line, the compiler doesn’t have as much latitude, so it must promote sourceElement to Int? so that it can be compared with Int(removeElement) using ==.
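Pinning down the literal's element type inline shows the same thing (a sketch of my own):
// with the literal fixed as [Int], nothing is left to promote
let r3 = ([1, 2, 3] as [Int]).filter { sourceElement in
    return !["1", "2"].contains { removeElement in
        sourceElement == Int(removeElement)
    }
}
print(r3.dynamicType) // Array<Int>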
Another way of making the code perform the conversion at the point you expect would be:
let r = [1,2,3].filter { sourceElement in
    return !["1", "2"].contains { (removeElement: String) -> Bool in
        // force the optional upgrade to happen here rather than
        // on the [1,2,3] literal...
        Optional(sourceElement) == Int(removeElement)
    }
}

Can a condition be used to determine the type of a generic?

I will first explain what I'm trying to do and how I got to where I got stuck before getting to the question.
As a learning exercise for myself, I took some problems that I had already solved in Objective-C to see how I can solve them differently with Swift. The specific case that I got stuck on is a small piece that captures a value before and after it changes and interpolates between the two to create keyframes for an animation.
For this I had an object Capture with properties for the object, the key path, and two id properties for the values before and after. Later, when interpolating the captured values, I made sure that they could be interpolated by wrapping each of them in a Value class that used a class cluster to return an appropriate class depending on the type of value it wrapped, or nil for types that weren't supported.
This works, and I am able to make it work in Swift as well following the same pattern, but it doesn't feel Swift-like.
What worked
Instead of wrapping the captured values as a way of enabling interpolation, I created a Mixable protocol that the types could conform to and used a protocol extension for when the type supported the necessary basic arithmetic:
protocol SimpleArithmeticType {
    func +(lhs: Self, rhs: Self) -> Self
    func *(lhs: Self, amount: Double) -> Self
}

protocol Mixable {
    func mix(with other: Self, by amount: Double) -> Self
}

extension Mixable where Self: SimpleArithmeticType {
    func mix(with other: Self, by amount: Double) -> Self {
        return self * (1.0 - amount) + other * amount
    }
}
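As an illustration of my own (not from the question), a type like Double picks up mixing for free, since the required operators already exist:
// Double already provides + and *(Double, Double), so the
// conformances are empty and mix() comes from the protocol extension
extension Double: SimpleArithmeticType {}
extension Double: Mixable {}

let mixed = 2.0.mix(with: 10.0, by: 0.25) // 2.0 * 0.75 + 10.0 * 0.25 = 4.0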
This part worked really well and enforced homogeneous mixing (that a type could only be mixed with its own type), which wasn't enforced in the Objective-C implementation.
Where I got stuck
The next logical step, and this is where I got stuck, seemed to be to make each Capture instance (now a struct) hold two variables of the same mixable type instead of two AnyObject. I also changed the initializer argument from being an object and a key path to being a closure that returns a value, () -> T:
struct Capture<T: Mixable> {
    typealias Evaluation = () -> T
    let eval: Evaluation
    let before: T

    var after: T {
        return eval()
    }

    init(eval: Evaluation) {
        self.eval = eval
        self.before = eval()
    }
}
This works when the type can be inferred, for example:
let captureInt = Capture {
    return 3.0
}
// > Capture<Double>
but not with key-value coding, which returns AnyObject:
let captureAnyObject = Capture {
    return myObject.valueForKeyPath("opacity")!
}
error: cannot invoke initializer for type 'Capture' with an argument list of type '(() -> _)'
AnyObject does not conform to the Mixable protocol, so I can understand why this doesn't work. But I can check what type the object really is, and since I'm only covering a handful of mixable types, I thought I could cover all the cases and return the correct type of Capture. To see if this could even work, I made an even simpler example.
A simpler example
struct Foo<T> {
    let x: T
    init(eval: () -> T) {
        x = eval()
    }
}
which works when type inference is guaranteed:
let fooInt = Foo {
    return 3
}
// > Foo<Int>

let fooDouble = Foo {
    return 3.0
}
// > Foo<Double>
But not when the closure can return different types:
let condition = true
let foo = Foo {
    if condition {
        return 3
    } else {
        return 3.0
    }
}
error: cannot invoke initializer for type 'Foo' with an argument list of type '(() -> _)'
I'm not even able to define such a closure on its own.
let condition = true // as simple as it could be
let evaluation = {
    if condition {
        return 3
    } else {
        return 3.0
    }
}
error: unable to infer closure type in the current context
My Question
Is this something that can be done at all? Can a condition be used to determine the type of a generic? Or is there another way to hold two variables of the same type, where the type was decided based on a condition?
Edit
What I really want is to:
1. capture the values before and after a change and save the pair (old + new) for later (a heterogeneous collection of homogeneous pairs);
2. go through all the collected values and get rid of the ones that can't be interpolated (unless this step could be integrated with the collection step);
3. interpolate each homogeneous pair individually (mixing old + new).
But it seems like this direction is a dead end when it comes to solving that problem. I'll have to take a couple of steps back and try a different approach (and probably ask a different question if I get stuck again).
As discussed on Twitter, the type must be known at compile time. Nevertheless, for the simple example at the end of the question you could just explicitly type
let evaluation: () -> Double = { ... }
and it would work.
So in the case of Capture and valueForKeyPath: IMHO you should cast (either safely or with a forced cast) the value to the Mixable type you expect and it should work fine. After all, I'm not sure valueForKeyPath: is supposed to return different types depending on a condition.
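A sketch of that suggestion (my own, assuming "opacity" is backed by a Double and that Double has been made Mixable as earlier in the question):
let captureOpacity = Capture { () -> Double in
    // forced cast: this traps if "opacity" is not actually a Double
    myObject.valueForKeyPath("opacity") as! Double
}
// > Capture<Double>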
What is the exact case where you would like to return 2 totally different types (that can't be implicitly casted as in the simple case of Int and Double above) in the same evaluation closure?
In my full example I also have cases for CGPoint, CGSize, CGRect, and CATransform3D.
The limitations are just as you have stated, because of Swift's strict typing. All types must be definitely known at compile time, and each thing can be of only one type; even a generic is resolved by the way it is called at compile time. Thus, the only thing you can do is turn your type into an umbrella type that is much more like Objective-C itself:
let condition = true
let evaluation = { () -> NSObject in // the explicit return type is the key
    if condition {
        return 3
    } else {
        return NSValue(CGPoint: CGPointMake(0, 1))
    }
}
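Consuming such an umbrella value then means recovering the concrete type at the point of use (my sketch, continuing the answer's example):
switch evaluation() {
case let number as NSNumber:
    print("got a number: \(number)")
case let value as NSValue:
    print("got a wrapped struct: \(value)")
default:
    print("unsupported type")
}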

How to handle initial nil value for reduce functions

I would like to learn and use more functional programming in Swift, so I've been trying various things in a playground. I don't understand reduce, though. The basic textbook examples work, but I can't get my head around this problem.
I have an array of strings called "toDoItems". I would like to get the longest string in this array. What is the best practice for handling the initial nil value in such cases? I think this probably happens often. I thought of writing a custom function and using it.
func optionalMax(maxSofar: Int?, newElement: Int) -> Int {
    if let definiteMaxSofar = maxSofar {
        return max(definiteMaxSofar, newElement)
    }
    return newElement
}
// Just testing - nums is an array of Ints. Works.
var maxValueOfInts = nums.reduce(0) { optionalMax($0, $1) }

// ERROR: cannot invoke 'reduce' with an argument list of type '(nil, (_,_)->_)'
var longestOfStrings = toDoItems.reduce(nil) { optionalMax(count($0), count($1)) }
It might just be that Swift does not automatically infer the type of your initial value. Try making it clear by explicitly declaring it:
var longestOfStrings = toDoItems.reduce(nil as Int?) { optionalMax($0, count($1)) }
By the way, notice that I do not call count on $0 (your accumulator), since it is not a String but an optional Int (Int?).
Generally, to avoid confusion when reading the code later, I explicitly label the accumulator as a and the element coming in from the sequence as x:
var longestOfStrings = toDoItems.reduce(nil as Int?) { a, x in optionalMax(a, count(x)) }
This reads more clearly than $0 and $1 wherever the accumulator or the single element is used.
Hope this helps
Initialise it with an empty string "" rather than nil. Or you could even initialise it with the first element of the array, but an empty string seems better.
Second go at this after writing some wrong code: this will return the longest string, if you are happy with an empty string being returned for an empty array:
toDoItems.reduce("") { count($0) > count($1) ? $0 : $1 }
Or, if you want nil for an empty array, use
toDoItems.reduce(nil as String?) { $0 == nil || count($0!) < count($1) ? $1 : $0 }
(the nil check ensures the accumulator is never force-unwrapped while it is still nil).
The problem is that the compiler cannot infer the types you are using for your seed and accumulator closure if you seed with nil, and you also need to get the optional type correct when using the optional string as $0.
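A sketch of the other idea mentioned above, seeding the reduce with the first element (my illustration, in the same Swift 1.x style as the answers; comparing the seed against itself once is harmless):
// nil for an empty array, the longest string otherwise
let longest = toDoItems.first.map { first in
    toDoItems.reduce(first) { count($0) > count($1) ? $0 : $1 }
}
// longest is String?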

Convert or cast object to string

How can I convert any object type to a string?
let single_result = results[i]
var result = ""
result = single_result.valueForKey("Level")
Now I get the error: cannot assign a value of type 'AnyObject?' to a value of type 'String'.
And if I cast it:
result = single_result.valueForKey("Level") as! String
I get the error:
Could not cast value of type '__NSCFNumber' (0x103215cf0) to 'NSString' (0x1036a68e0).
How can I solve this issue?
You can't cast any random value to a string. A force cast (as!) will fail if the object can't be cast to a string.
If you know it will always contain an NSNumber then you need to add code that converts the NSNumber to a string. This code should work:
if let result_number = single_result.valueForKey("Level") as? NSNumber {
    let result_string = "\(result_number)"
}
If the object returned for the "Level" key can be different object types then you'll need to write more flexible code to deal with those other possible types.
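For instance (a sketch of my own, assuming the "Level" key can hold either a number or a string):
var result = ""
// match whichever concrete type the key actually contains
switch single_result.valueForKey("Level") {
case let number as NSNumber:
    result = number.stringValue
case let string as String:
    result = string
default:
    result = "" // unsupported type
}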
Swift arrays and dictionaries are normally typed, which makes this kind of thing cleaner.
I'd say that @AirSpeedVelocity's answer (European or African?) is the best. Use the built-in toString function. It sounds like it works on ANY Swift type.
EDIT:
In Swift 3, the answer appears to have changed. Now, you want to use the String initializer
init(describing:)
Or, to use the code from the question:
result = single_result.valueForKey("Level")
let resultString = String(describing: result)
Note that usually you don't want valueForKey. That is a KVC (key-value coding) method that will only work on NSObjects. Assuming single_result is a Dictionary, you probably want this syntax instead:
result = single_result["Level"]
See the documentation for this String initializer for details.
let s = String(describing: <AnyObject>)
Nothing else is needed. This works for a diverse range of objects.
The toString function accepts any type and will always produce a string.
If it’s a Swift type that implements the Printable protocol, or has overridden NSObject’s description property, you’ll get whatever the .description property returns. In the case of NSNumber, you’ll get a string representation of the number.
If it hasn’t, you’ll get a fairly unhelpful string of the class name plus the memory address. But most standard classes, including NSNumber, will produce something sensible.
import Foundation

class X: NSObject {
    override var description: String {
        return "Blah"
    }
}

let x: AnyObject = X()
toString(x) // returns "Blah"
"\(x)"      // does the same thing, but IMO is less clear

struct S: Printable {
    var description: String {
        return "asdf"
    }
}

// doesn't matter if it's an Any or AnyObject
let s: Any = S()
toString(s) // returns "asdf"

let n = NSNumber(double: 123.45)
toString(n)   // returns "123.45"
n.stringValue // also works, but is specific to NSNumber
(p.s. always use toString rather than testing for Printable. For one thing, String doesn’t conform to Printable...)
toString() doesn't seem to exist in Swift 3 anymore.
Looks like there's a failable initializer that will return the passed-in value's description.
init?(_ description: String)
Docs here https://developer.apple.com/reference/swift/string/1540435-init