Comparing type of literal in swift fails? - swift

This code works in Swift 3:
let a = 1
type(of: a) == Int.self // true
However, remarkably this code fails:
// error: binary operator '==' cannot be applied to two 'Int.Type' operands
type(of: 1) == Int.self
What is the syntax to make the second comparison work, if any?
Thanks a lot.

I think the error message is misleading. The real issue is how to interpret the literal 1 in the second call. Swift defaults to Int when you define a variable:
let a = 1 // a is an Int
But the compiler can read it as Double, UInt32, CChar, etc. depending on the context:
func takeADouble(value: Double) { ... }
func takeAUInt(value: UInt) { ... }
takeADouble(value: 1) // now it's a Double
takeAUInt(value: 1) // now it's a UInt
type(of:) is defined as a generic function:
func type<Type, Metatype>(of: Type) -> Metatype
The compiler has no clue how to interpret the Type generic parameter: should it be an Int, UInt, UInt16, etc.? Here's the error I got from the IBM Swift Sandbox:
Overloads for '==' exist with these partially matching parameter lists
(Any.Type?, Any.Type?), (UInt8, UInt8), (Int8, Int8),
(UInt16, UInt16), (Int16, Int16), (UInt32, UInt32), ...
You can give the compiler some help by telling it what type it is:
type(of: 1 as Int) == Int.self
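Any context that pins down the literal's type works, not just an inline coercion. A small sketch (Swift 3 or later assumed):

```swift
// Coerce the literal inline:
type(of: 1 as Int) == Int.self        // true

// Or bind it to a typed constant first:
let n: Int = 1
type(of: n) == Int.self               // true

// A different coercion gives a different metatype:
type(of: 1 as Double) == Double.self  // true
type(of: 1 as Double) == Int.self     // false
```

The `==` here is the `(Any.Type?, Any.Type?)` overload, which is why it can compare metatypes of different types and return false instead of failing.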

Related

Getting error: String is not convertible to 'T'

This code compiles and works fine:
class Numbers {
func operateOn<T>(_ num1: T, _ num2: T, do task: (T, T) -> ()) {
task(num1, num2)
}
}
let n = Numbers()
n.operateOn(1,2) {
print(($0 + $1) * 10)
}
n.operateOn("l","ll") {
print(($0 + $1))
}
Yet the following code does not compile:
func process<T> (add: String, completion: (T) -> () ) {
completion("k") // ERROR
}
I get the following error:
'String' is not convertible to 'T'
I tried passing an Int, but I just got another error:
'Int' is not convertible to 'T'
Can't an Int or a String satisfy a generic requirement that doesn't have any constraints?!
The problem is that your code needs to work for any T. E.g. if T is Int, then completion has type (Int) -> (), and it's completely legitimate to call
process(add: "") { (x: Int) in print(x + 1) }
and completion("k") would have to do "k" + 1 which doesn't make sense.
This is going to be the same in basically any language with generics (C++ is different because it uses templates for the same purpose instead).
Can't an Int or a String satisfy a generic requirement that
doesn't have any constraints?!
Sure it can. But that's not the reason the compiler is giving you an error.
Think about it, what happens if you constrain a generic function/parameter within the function body itself?! It will no longer be a generic function!
Imagine if you had written your operateOn function like this:
class Numbers {
func operateOn<T>(_ num1: T, _ num2: T, do task: (T, T) -> ()) {
task("k", num2) // add two strings
}
}
Would you say that T is generic, or that it's of type String? If it's String, can num2 still be any type the caller wants? It can't!
If T is pinned to String, the function is no longer generic. Since the compiler can't allow that, it throws that error.
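One common fix (a sketch; the parameter name `value` is my own) is to have the caller supply the `T`, so the body never has to invent one and stays fully generic:

```swift
// The caller provides the generic value; the body just forwards it.
func process<T>(add: String, value: T, completion: (T) -> ()) {
    completion(value)
}

process(add: "k", value: 42) { print($0 + 1) }     // prints 43
process(add: "k", value: "hi") { print($0 + "!") } // prints hi!
```

Now T is inferred per call site from `value`, and the body works for every possible T, which is exactly the guarantee the compiler was enforcing.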

Understanding why this Swift tuple assignment isn't allowed

The following code is fine:
func intTest() -> Int? {
var res : Int
res = 5
return res
}
There is no problem returning a non-optional Int from a method with a return type of optional Int.
Now let's see what happens when the return value is a tuple of optional values:
func tupleTest() -> (Int?, Int?) {
var res : (Int, Int)
res = (5, 5)
return res
}
The return res line results in the error:
error: cannot express tuple conversion '(Int, Int)' to '(Int?, Int?)' (aka '(Optional<Int>, Optional<Int>)')
Why does Swift not allow this?
I certainly understand why this would be an error in the other direction (optional to non-optional) but why can't a tuple accept a non-optional value in the place of an optional when a non-tuple works fine?
I've seen Swift tuple to Optional assignment but that doesn't cover why you can't do the assignment, it just covers how to solve it.
The reason that's happening is because of a type mismatch. Consider the following case:
let myClosure: (Int) -> ()
let myOtherClosure: (Int?) -> ()
The type of myClosure is (Int) -> () while myOtherClosure is of type (Int?) -> () which makes them fundamentally different types, despite the similarity of the parameters. When Swift is looking at these two constants, it's evaluating the type as a whole, not the individual pieces. The same is happening with your tuple; it's looking at the tuple type signature as a whole unit, not breaking down the parameters and recognizing that they're non-optional versions of the same type.
Converting from Int to Int? works because they're the same type, one is just an optional. Going from (Int, Int) to (Int?, Int?) doesn't because the parameters of the tuple are different therefore causing the overall type to be different, hence your error.
Taking a look at one of your examples:
func tupleTest() -> (Int?, Int?) {
let first: Int = 1
let second: Int = 2
let res: (Int?, Int?) = (first, second)
return res
}
This works because even though the values are non-optional integers, the tuple type is marked as (Int?, Int?) whereas:
func tupleTest() -> (Int?, Int?) {
let first: Int = 1
let second: Int = 2
let res = (first, second)
return res
}
doesn't compile, because now res is of type (Int, Int). I'm not an expert on the Swift compiler, but my guess is that the type system doesn't deconstruct each individual part of a function or tuple type and recognize that each corresponding component is the same type, just wrapped in an optional.
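For the compilers this question targets, the workaround is to convert element-wise: each `Int` promotes to `Int?` on its own even though the tuple as a whole doesn't. (Newer Swift versions accept the direct tuple conversion, so this sketch may no longer be necessary.)

```swift
func tupleTest() -> (Int?, Int?) {
    var res: (Int, Int)
    res = (5, 5)
    // Rebuilding the tuple converts each element Int -> Int? individually.
    return (res.0, res.1)
}
```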

Swift 2.0 Generics and Type Safety Issues

I was doing this tutorial http://blog.teamtreehouse.com/introduction-learn-power-swift-generics and I came upon this code;
func someFunction<T, U>(a: T, b: U) {}
The problem is when I call the function using
someFunction<String, Int>(1, "Test")
I get an error saying "cannot explicitly specialize a generic function".
I then change it to
someFunction(1,b: "Test")
and now there is no error. The problem is that there is now no type safety. (Is there something wrong with the code? It was written before Swift 2.0.) What is the best way to re-introduce type safety?
The declaration is completely generic and is specifying that any two types can be used.
func someFunction<T, U>(a: T, b: U) {}
It is not that there is no type safety in Swift, this is how you express a generic without any type constraints.
You get what you ask for.
If you wanted to constrain the function to String and Int, you would have written it as
func someFunction(a:String, b:Int)
Generics are more often used with collections, protocols and classes; basic types rarely need them:
func someFunction<T: Comparable>(a: T, b: T) -> Bool
{ return (a < b) || (a > b) }
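To see a constraint doing its job, here's a hypothetical helper (the name `differ` is mine): the compiler infers one `T` per call and rejects calls that mix types:

```swift
// True exactly when the two values are not equal.
func differ<T: Comparable>(_ a: T, _ b: T) -> Bool {
    return (a < b) || (a > b)
}

differ(1, 2)        // true
differ("x", "x")    // false
// differ(1, "x")   // error: can't infer a single T from Int and String
```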
OK, see this self-explanatory example; try it in a playground and play with it a little.
func foo<T>(t: T)->T {
return t
}
// here the resulting type is inferred by the compiler
let i = foo(1) // 1
let j: Int = foo(1) // 1
let t = foo("alpha") // "alpha"
// if you declare the type yourself ..
//let s: String = foo(1) // error: cannot convert value of type 'Int' to expected argument type 'String'
/* this fails to compile!
func bar<T>(t:T)->Int {
return t.count
}
*/
/* this fails to compile too !!
func bar0<T:CollectionType>(t:T)->Int {
return t.count
}
*/
func bar<T:CollectionType>(t:T)->T.Index.Distance {
return t.count
}
let arr = [1,2,3]
let k:Int = bar(arr) // 3
print(k.dynamicType) // Int
// and even more interesting
let l = bar(arr)
print(l.dynamicType) // Int

Why can't Swift find a Float extension method when called on an integer literal?

I have defined a method on Float called printme, and when I try to call it with an integer literal, Swift fails to find the method:
extension Float {
func printme() {
print("value: \(self)")
}
}
12.printme() // error: value of type 'Int' has no member 'printme'
If I use an explicit cast it works:
(12 as Float).printme() // prints "value: 12.0"
Why, if Float conforms to the IntegerLiteralConvertible protocol, does 12.printme() fail to find the method on Float? It works if the type
is Double, but fails for Int32, UInt, and other types. Why does it work for Double, but not for Float?
Note that the following does work:
func printit(f: Float) {
print("value: \(f)")
}
printit(10) // prints "value: 10.0"
So, it fails when the method is called on the integer literal but not when the integer literal is a parameter to a function.
In Xcode 6.4 it fails in a different way:
12.printme() // error: cannot invoke 'printme' with no arguments
When you don't have an explicit type, Swift assumes either Int or Double. From the Swift book:
For example, if you assign a literal value of 42 to a new constant without saying what type it is, Swift infers that you want the constant to be an Int, because you have initialized it with a number that looks like an integer ... Likewise, if you don’t specify a type for a floating-point literal, Swift infers that you want to create a Double.
Float is not on the inferred type list for literals. If you change your extension to Double, it works (Xcode 7.1):
extension Double {
func printme() {
print("I'm a Double")
}
}
12.printme()
12.0.printme()
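If you really want unannotated literals to land on `Float`, there is an obscure hook (treat this as a sketch, and use it with care): the default type of an integer literal is the typealias `IntegerLiteralType`, which your own module can shadow:

```swift
// Shadowing the standard library's IntegerLiteralType changes the
// default type of unannotated integer literals in this module.
typealias IntegerLiteralType = Float

extension Float {
    func printme() {
        print("value: \(self)")
    }
}

12.printme() // the literal now defaults to Float, so this resolves
```

This changes every unannotated integer literal in the module, which can break unrelated code, so the explicit `12 as Float` spelling is usually the safer choice.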
For these kinds of extensions I look for protocols that all the types I want to affect conform to, instead of counting on the compiler to play nice and convert Double to Float and the other way around.
IntegerLiteralConvertible works for all, but then you have no access to the numeric value. If you add a constraint to either IntegerType or FloatingPointType you have access to toIntMax() and self.advancedBy(0)
extension IntegerLiteralConvertible where Self : IntegerType {
func printMe() {
print(self.toIntMax())
}
}
extension IntegerLiteralConvertible where Self : FloatingPointType {
func printMe() {
print(self.advancedBy(0))
}
}
let float = Float(10.234234)
float.printMe()
let double = Double(234.234234)
double.printMe()
let int = Int(234234)
int.printMe()
let int16 = Int16(234)
int16.printMe()

Swift operators == vs ===

I have a stupid question; I'm almost certain that I'm doing something wrong, but I can't figure out what it is. I have a simple playground where I play with Swift operators, and I came upon a case with the following code:
1 != 1 // false
1 !== "1".toInt() //false
1 === 1 // true
1 == "1".toInt() // true
Which should be perfectly fine, but the playground compiler displays the following error:
What am I doing wrong? What exactly does this issue mean?
Update
When I delete line #2, the error disappears:
Xcode Version 6.1 (6A1052d)
Update 2
When I compare 1 === 1.toInt() I got another error:
=== is the identity operator and can only be applied to instances of classes,
it is declared as
func ===(lhs: AnyObject?, rhs: AnyObject?) -> Bool
Now 1 === 1 works because the compiler creates NSNumber instances here
automatically. NSNumber conforms to the IntegerLiteralConvertible
protocol:
extension NSNumber : FloatLiteralConvertible, IntegerLiteralConvertible, BooleanLiteralConvertible {
/// Create an instance initialized to `value`.
required convenience init(integerLiteral value: Int)
/// Create an instance initialized to `value`.
required convenience init(floatLiteral value: Double)
/// Create an instance initialized to `value`.
required convenience init(booleanLiteral value: Bool)
}
This can also be seen from the assembly code generated with
xcrun -sdk macosx swiftc -emit-assembly main.swift
which shows two calls of
callq __TFE10FoundationCSo8NSNumberCfMS0_FT14integerLiteralSi_S0_
and demangling this function name with
xcrun swift-demangle __TFE10FoundationCSo8NSNumberCfMS0_FT14integerLiteralSi_S0_
gives
ext.Foundation.ObjectiveC.NSNumber.init (ObjectiveC.NSNumber.Type)(integerLiteral : Swift.Int) -> ObjectiveC.NSNumber
So in 1 === 1 two instances of NSNumber are compared (which are objects).
Note that this works only if the Foundation framework is included (i.e., NSNumber
is available). Otherwise 1 === 1 fails to compile with
type 'AnyObject?' does not conform to protocol 'IntegerLiteralConvertible'
Both
1 !== "1".toInt() // value of optional type 'Int?' not unwrapped
1 !== "1".toInt()! // type 'AnyObject?' does not conform to protocol 'IntegerLiteralConvertible'
do not compile, because the right-hand side is not an object and not a literal that
the compiler converts into an object automatically. For the same reason,
let i = 1
1 !== i // type 'Int' does not conform to protocol 'AnyObject'
does not compile.
==, on the other hand, is the equality operator and compares the contents of its
operands. It is defined for optionals if the underlying type is equatable:
func ==<T : Equatable>(lhs: T?, rhs: T?) -> Bool
Therefore in 1 == "1".toInt(), the lhs is converted to Int? and then compared
with the rhs.
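To see the distinction on types where `===` is the natural choice, here's a minimal class example (the class name `Box` is made up):

```swift
class Box {
    var value: Int
    init(_ value: Int) { self.value = value }
}

let a = Box(1)
let b = a      // second reference to the same instance
let c = Box(1) // distinct instance with equal contents

a === b // true:  both names refer to one object
a === c // false: two different objects, even with equal contents
```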
Mind you, I have never programmed in or even heard of Swift before, but from the documentation I found, and from my experience in C#, I'll try to give an answer I think is correct. If not, I'll remove the answer.
Swift seems to have (like C#) "optional types", denoted with a question mark after the type identifier, like this: Int?. toInt() seems to return an "optional Int", which is not identical to an Int (as shown by the error).
To turn that optional Int into a "normal" Int, use an exclamation mark:
1 == "1".toInt()!
The "===" operator is to test whether two object references both refer to the same object instance
In other languages, i know it is used to test if it is the same type and value.
But you called the toInt() function on a string literal, when the code was expecting an integer literal.
I'd do it like this:
let a = "1"
let b = a.toInt()
if 1 !== b { /* do something */ }