Could not cast value of type 'Swift.UInt32' to 'Swift.Int' - swift

In my tests project I've got extensions with some test helper functions, like this:
extension Employee {
    static func mockDict() -> Dictionary<String, Any>! {
        return ["ID": arc4random() % 1000,
                "FirstName": "Employee First Name",
                ...]
    }
}
(I've stripped the unnecessary code.) I have a problem accessing "ID" from this dictionary for some as-yet-unknown reason. I get a SIGABRT 6 when casting
employeeDict["ID"] as! Int
The Xcode debugger console also doesn't like this particular integer:
Strings work fine. Has anyone encountered this problem? Any ideas?
EDIT: Just in case anyone else encounters this problem too: CASTING FROM UInt32/Int32 TO Int FAILS BY DESIGN, even if the object was cast to Any or AnyObject in between.
Even though
@available(*, message: "Converting UInt32 to Int will always succeed.")
public init?(exactly value: UInt32)
in Int's declaration
public struct Int : SignedInteger, Comparable, Equatable {
...
}
and
public struct Int32 : SignedInteger, Comparable, Equatable {
...
}
EDIT 2, for those who might encounter this behaviour in JSON serialization: yes, serialization fails with the error NSInvalidArgumentException Invalid type in JSON write (_SwiftValue) if asked to serialize a UInt32, an Int64, or any Integer-protocol instance other than Int.
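A minimal sketch of a serialization-safe pattern (values here are hypothetical): convert any non-Int integer to Int before handing the dictionary to JSONSerialization, which then works regardless of how your Swift version bridges the other integer types.

```swift
import Foundation

// Hypothetical value standing in for arc4random() % 1000:
let unsafeID: UInt32 = 7

// Convert to Int up front so JSONSerialization never sees a UInt32 box.
let safe: [String: Any] = ["ID": Int(unsafeID)]
let data = try JSONSerialization.data(withJSONObject: safe)
let json = String(data: data, encoding: .utf8)!  // {"ID":7}
```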

Try this:
let a = employeeDict["ID"] as! UInt32
let number = Int(a)
Now you can use number to perform any action.

This works for me:
Int("\(employeeDict["ID"]!)")

Swift "primitive" numeric types are not interchangeable and cannot be cast to each other.
You need to use an initializer.
Since arc4random() returns UInt32 and you want to use the value as Int, convert it right away in the dictionary declaration:
["ID": Int(arc4random() % 1000), ...
PS: Do not declare a clearly non-optional return value as an implicitly unwrapped optional; that defeats Swift's strong type system.
static func mockDict() -> Dictionary<String, Any>
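To make the two safe routes concrete, here is a small sketch with a hypothetical dictionary. A UInt32 boxed into Any keeps its concrete type, which is why `as! Int` trapped in Swift 3; the fix is to cast back to the stored type and then use Int's initializer (or store an Int in the first place).

```swift
// Hypothetical stand-in for the question's mock dictionary:
let employeeDict: [String: Any] = ["ID": UInt32(42) % 1000,
                                   "FirstName": "Employee First Name"]

// let bad = employeeDict["ID"] as! Int   // trapped with SIGABRT in Swift 3

// Cast back to the type actually stored, then convert explicitly:
let id = Int(employeeDict["ID"] as! UInt32)
```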

Related

How do I store a value of type Class<ClassImplementingProtocol> in a Dictionary of type [String:Class<Protocol>] in Swift?

I want to store a more specialized type in a Dictionary of type [String:SomeClass]. Here is some sample code illustrating my problem (also available to play with at https://swiftlang.ng.bluemix.net/#/repl/579756cf9966ba6275fc794a):
class Thing<T> {}
protocol Flavor {}
class Vanilla: Flavor {}
var dict = [String:Thing<Flavor>]()
dict["foo"] = Thing<Vanilla>()
It produces the error ERROR at line 9, col 28: cannot assign value of type 'Thing<Vanilla>' to type 'Thing<Any>?'.
I've tried casting Thing<Vanilla>() as Thing<Flavor> but that produces the error cannot convert value of type 'Thing<Vanilla>' to type 'Thing<Flavor>' in coercion.
I've also tried to define the Dictionary as type [String:Thing<Any>] but that doesn't change anything either.
How do I create a collection of different Things without resorting to plain [String:AnyObject]?
I should also mention that the class Thing is not defined by me (in fact it's about BoltsSwift Tasks), so the solution to create a base class of Thing without a type parameter doesn't work.
A Thing<Vanilla> is not a Thing<Flavor>. Thing is not covariant. There is no way in Swift to express that Thing is covariant. There are good reasons for this. If what you were asking for were allowed without careful rules around it, I would be allowed to write the following code:
func addElement(array: inout [Any], object: Any) {
    array.append(object)
}

var intArray: [Int] = [1]
addElement(array: &intArray, object: "Stuff")
Int is a subtype of Any, so if [Int] were a subtype of [Any], I could use this function to append strings to an int array. That breaks the type system. Don't do that.
Depending on your exact situation, there are two solutions. If it is a value type, then repackage it:
let thing = Thing<Vanilla>(value: Vanilla())
dict["foo"] = Thing(value: thing.value)
If it is a reference type, box it with a type eraser. For example:
// struct unless you have to make this a class to fit into the system,
// but then it may be a bit more complicated
struct AnyThing {
    let _value: () -> Flavor
    var value: Flavor { return _value() }

    init<T: Flavor>(thing: Thing<T>) {
        _value = { return thing.value }
    }
}

var dict = [String:AnyThing]()
dict["foo"] = AnyThing(thing: Thing<Vanilla>(value: Vanilla()))
The specifics of the type eraser may be different depending on your underlying type.
BTW: The diagnostics around this have gotten pretty good. If you try to call my addElement above in Xcode 9, you get this:
Cannot pass immutable value as inout argument: implicit conversion from '[Int]' to '[Any]' requires a temporary
What this is telling you is that Swift is willing to pass [Int] where you ask for [Any] as a special-case for Arrays (though this special treatment isn't extended to other generic types). But it will only allow it by making a temporary (immutable) copy of the array. (This is another example where it can be hard to reason about Swift performance. In situations that look like "casting" in other languages, Swift might make a copy. Or it might not. It's hard to be certain.)
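A short sketch of that special case, reusing the question's (hypothetical) types: the compiler implicitly converts [Int] to [Any] by copying, but extends no such covariance to generics you define yourself.

```swift
class Thing<T> {}
protocol Flavor {}
class Vanilla: Flavor {}

let ints: [Int] = [1, 2, 3]
let anys: [Any] = ints   // allowed: Swift copies into a new [Any]

// User-defined generics stay invariant:
// let things: [Thing<Flavor>] = [Thing<Vanilla>()]   // compile error
```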
One way to solve this is to add an initialiser to Thing and create a Thing<Flavor> that will hold a Vanilla object.
It will look something like:
class Thing<T> {
    init(thing : T) {
    }
}

protocol Flavor {}
class Vanilla: Flavor {}

var dict = [String:Thing<Flavor>]()
dict["foo"] = Thing<Flavor>(thing: Vanilla())

Type inference fails when using nil-coalescing operator with two optionals

We are trying to figure out whether this is a bug in Swift or whether we are misusing generics, optionals, type inference and/or the nil-coalescing operator.
Our framework contains some code for parsing dictionaries into models and we've hit a problem with optional properties with default values.
We have a protocol SomeProtocol and two generic functions defined in a protocol extension:
mapped<T>(...) -> T?
mapped<T : SomeProtocol>(...) -> T?
Our structs and classes adhere to this protocol and then parse their properties inside an init function required by the protocol.
Inside the init(...) function we try to set a value of the property someNumber like this:
someNumber = self.mapped(dictionary, key: "someNumber") ?? someNumber
The dictionary of course contains the actual value for key someNumber. However, this will always fail and the actual value will never get returned from the mapped() function.
Either commenting out the second generic function or force downcasting the value on the rhs of the assignment will fix this issue, but we think this should work the way it currently is written.
Below is a complete code snippet demonstrating the issue, along with two options that (temporarily) fix the issue labeled OPTION 1 and OPTION 2 in the code:
import Foundation

// Some protocol
protocol SomeProtocol {
    init(dictionary: NSDictionary?)
}

extension SomeProtocol {
    func mapped<T>(dictionary: NSDictionary?, key: String) -> T? {
        guard let dictionary = dictionary else {
            return nil
        }
        let source = dictionary[key]
        switch source {
        case is T:
            return source as? T
        default:
            break
        }
        return nil
    }

    // ---
    // OPTION 1: Commenting out this makes it work
    // ---
    func mapped<T where T: SomeProtocol>(dictionary: NSDictionary?, key: String) -> T? {
        return nil
    }
}

// Some struct
struct SomeStruct {
    var someNumber: Double? = 0.0
}

extension SomeStruct: SomeProtocol {
    init(dictionary: NSDictionary?) {
        someNumber = self.mapped(dictionary, key: "someNumber") ?? someNumber
        // OPTION 2: Writing this makes it work
        // someNumber = self.mapped(dictionary, key: "someNumber") ?? someNumber!
    }
}

// Test code
let test = SomeStruct(dictionary: NSDictionary(object: 1234.4567, forKey: "someNumber"))
if test.someNumber == 1234.4567 {
    print("success \(test.someNumber!)")
} else {
    print("failure \(test.someNumber)")
}
Please note that this example omits the actual implementations of the mapped functions, but the outcome is identical, and for the purposes of this question the code should be sufficient.
EDIT: I had reported this issue a while back and now it was marked as fixed, so hopefully this shouldn't happen anymore in Swift 3.
https://bugs.swift.org/browse/SR-574
You've given the compiler too many options, and it's picking the wrong one (at least not the one you wanted). The problem is that every T can be trivially elevated to T?, including T? (elevated to T??).
someNumber = self.mapped(dictionary, key: "someNumber") ?? someNumber
Wow. Such types. So Optional. :D
So how does Swift begin to figure this out? Well, someNumber is Double?, so it tries to turn this into:
Double? = Double?? ?? Double?
Does that work? Let's look for a generic mapped, starting at the most specific.
func mapped<T where T:SomeProtocol>(dictionary: NSDictionary?, key: String) -> T? {
To make this work, T has to be Double?. Does Double? conform to SomeProtocol? Nope. Moving on.
func mapped<T>(dictionary: NSDictionary?, key: String) -> T? {
Does this work? Sure! T can be Double?. We return Double?? and everything resolves.
So why does this one work?
someNumber = self.mapped(dictionary, key: "someNumber") ?? someNumber!
This resolves to:
Double? = Optional(Double? ?? Double)
And then things work the way you think they're supposed to.
Be careful with so many Optionals. Does someNumber really have to be Optional? Should any of these things throw? (I'm not suggesting throw is a general work-around for Optional problems, but at least this problem gives you a moment to consider if this is really an error condition.)
It is almost always a bad idea to type-parameterize exclusively on the return value in Swift, the way mapped does. This tends to be a real mess in Swift (or any generic language with lots of type inference, but it really blows up in Swift when Optionals are involved). Type parameters should generally appear in the arguments. You'll see the problem if you try something like:
let x = test.mapped(...)
It won't be able to infer the type of x. This isn't always an anti-pattern, and sometimes the hassle is worth it (and in fairness, the problem you're solving may be one of those cases), but avoid it if you can.
But it's the Optionals that are killing you.
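A stripped-down sketch of the inference problem (the helper here is hypothetical, not the question's full mapped): with the type parameter only in the return position, the call site must annotate, or the compiler has nothing to infer T from.

```swift
// Type parameter appears only in the return type:
func mapped<T>(_ value: Any) -> T? { return value as? T }

let d: Double? = mapped(3.14)   // T inferred from the annotation
// let x = mapped(3.14)         // error: generic parameter 'T'
//                              // could not be inferred
```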
EDIT: Dominik asks a very good question about why this behaves differently when the constrained version of mapped is removed. I don't know. Obviously the type matching engine checks for valid types in a little different order depending on how many ways mapped is generic. You can see this by adding print(T.self) to mapped<T>. That might be considered a bug in the compiler.

Defining Nested Dictionaries in Swift

In the code below, I get the error [String : Double] does not conform to Hashable. How do I get around this?
I see the problem of non-conformance to the Hashable protocol, but I'm wondering why this is the case when the other way around works. Is only the 'Key' in a dictionary required to conform to Hashable? Some explanation would help.
enum someEnumType {
    case First(String, (Int, Int) -> Int)
    case Second(String, Int)
}

// var operations = [someEnumType : [String : Double]]()   <--- This syntax works
var operations = [[String : Double] : someEnumType]()      // <--- But this does not work; ideally I want this
Dictionaries are also called† hash tables; they work by hashing the key. So, yes, it does need to be Hashable. The value doesn't since the point is to look up values by key.
† Well, strictly speaking one could implement a dictionary without hashing, but in practice a data structure called a dictionary in programming languages is usually understood to be a hash map. In Swift as well, the Dictionary documentation specifies it as a “hash-based mapping”.
You are correct, only a Dictionary's key must conform to the Hashable protocol.
"How do I get around this?"
Probably the most direct way of using a Dictionary the way you want to is to define your own key type. A struct makes sense here; it offers the same value semantics that your [String: Double] would-be key offers, and it is easy to define:
struct MyKey {
    let myString: String
    let myDouble: Double
}
Of course, it must be hashable to be used as a Dictionary key, so we add Hashable conformance:
struct MyKey: Hashable {
    let myString: String
    let myDouble: Double

    var hashValue: Int {
        return self.myString.hashValue ^ self.myDouble.hashValue
    }
}
I did a cute trick there to calculate a "unique-ish" hash value for this key type: just an XOR of the string's and double's hash values. I won't guarantee uniqueness, but it should be good enough for most cases. Calculating a better hash value is an exercise I'll leave up to you if you want (but it will work just fine like this).
Finally, to conform to Hashable, one must also conform to Equatable. In our case we'll just check to see if each of a key's properties match the other's to determine if keys are equal. The full implementation:
struct MyKey: Hashable {
    let myString: String
    let myDouble: Double

    var hashValue: Int {
        return self.myString.hashValue ^ self.myDouble.hashValue
    }
}

func ==(lhs: MyKey, rhs: MyKey) -> Bool {
    return lhs.myString == rhs.myString && lhs.myDouble == rhs.myDouble
}
Now, you may define your Dictionary like this:
var operations = [MyKey : someEnumType]()
And add an entry like this:
let myFirstKey = MyKey(myString: "Hello", myDouble: 1.0)
operations[myFirstKey] = someEnumType.Second("Hello", 1)
Yes, the key needs to be hashable.
You should be able to work around this by using NSString rather than String.
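As a side note, on newer toolchains the manual hashValue and == boilerplate above can often be dropped: since Swift 4.1 the compiler synthesizes Hashable and Equatable for structs whose stored properties are themselves Hashable. A sketch using the names from this question (modernized casing):

```swift
// Hashable/Equatable are synthesized automatically for this struct:
struct MyKey: Hashable {
    let myString: String
    let myDouble: Double
}

enum SomeEnumType { case second(String, Int) }

var operations = [MyKey: SomeEnumType]()
operations[MyKey(myString: "Hello", myDouble: 1.0)] = .second("add", 2)
```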

Convert or cast object to string

How can I convert any object type to a string?
let single_result = results[i]
var result = ""
result = single_result.valueForKey("Level")
Now I get the error: could not assign a value of type 'AnyObject' to a value of type 'String'.
And if I cast it:
result = single_result.valueForKey("Level") as! String
I get the error:
Could not cast value of type '__NSCFNumber' (0x103215cf0) to 'NSString' (0x1036a68e0).
How can I solve this issue?
You can't cast any random value to a string. A force cast (as!) will fail if the object can't be cast to a string.
If you know it will always contain an NSNumber then you need to add code that converts the NSNumber to a string. This code should work:
if let result_number = single_result.valueForKey("Level") as? NSNumber {
    let result_string = "\(result_number)"
}
If the object returned for the "Level" key can be different object types then you'll need to write more flexible code to deal with those other possible types.
Swift arrays and dictionaries are normally typed, which makes this kind of thing cleaner.
I'd say that @AirSpeedVelocity's answer (European or African?) is the best. Use the built-in toString function. It sounds like it works on ANY Swift type.
EDIT:
In Swift 3, the answer appears to have changed. Now, you want to use the String initializer
init(describing:)
Or, to use the code from the question:
result = single_result.valueForKey("Level")
let resultString = String(describing: result)
Note that usually you don't want valueForKey. That is a KVC method that will only work on NSObjects. Assuming single_result is a Dictionary, you probably want this syntax instead:
result = single_result["Level"]
This is the documentation for the String initializer provided here.
let s = String(describing: <AnyObject>)
Nothing else is needed. This works for a diverse range of objects.
The toString function accepts any type and will always produce a string.
If it’s a Swift type that implements the Printable protocol, or has overridden NSObject’s description property, you’ll get whatever the .description property returns. In the case of NSNumber, you’ll get a string representation of the number.
If it hasn’t, you’ll get a fairly unhelpful string of the class name plus the memory address. But most standard classes, including NSNumber, will produce something sensible.
import Foundation

class X: NSObject {
    override var description: String {
        return "Blah"
    }
}

let x: AnyObject = X()
toString(x)  // returns "Blah"
"\(x)"       // does the same thing but IMO is less clear

struct S: Printable {
    var description: String {
        return "asdf"
    }
}

// doesn't matter if it's an Any or AnyObject
let s: Any = S()
toString(s)  // returns "asdf"

let n = NSNumber(double: 123.45)
toString(n)    // returns "123.45"
n.stringValue  // also works, but is specific to NSNumber
(p.s. always use toString rather than testing for Printable. For one thing, String doesn’t conform to Printable...)
toString() doesn't seem to exist in Swift 3 anymore.
Looks like there's a failable initializer that will return the passed-in value's description.
init?(_ description: String)
Docs here https://developer.apple.com/reference/swift/string/1540435-init
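Pulling the Swift 3+ answers together, a small sketch of the replacement for toString(_:) (the struct here is hypothetical; Printable is now named CustomStringConvertible):

```swift
import Foundation

// String(describing:) covers what the old toString(_:) used to do.
let n = NSNumber(value: 42)
let s = String(describing: n)     // "42"

struct S: CustomStringConvertible {
    var description: String { return "asdf" }
}
let s2 = String(describing: S())  // "asdf"
```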

Swift: Casting collections, and creating custom convertible protocols

Consider this Person class, which simply implements StringLiteralConvertible and assigns the string literal to name:
class Person : StringLiteralConvertible {
    var name : String?

    typealias StringLiteralType = String
    required init(stringLiteral value: StringLiteralType) {
        println("stringLiteral \(value)")
        name = value
    }

    typealias ExtendedGraphemeClusterLiteralType = String
    required init(extendedGraphemeClusterLiteral value: ExtendedGraphemeClusterLiteralType) {
        println("extendedGraphemeClusterLiteral \(value)")
        name = value
    }

    typealias UnicodeScalarLiteralType = Character
    required init(unicodeScalarLiteral value: UnicodeScalarLiteralType) {
        println("unicodeScalarLiteral \(value)")
        name = "\(value)"
    }
}
This allows me to create a Person instance using a string:
let aaron : Person = "Aaron"
I can even cast an array of Persons from an array of strings:
let names = ["John", "Jane"] as [Person]
However this only works with string literals. If I use a string variable, it fails:
let aaronString = "Aaron"
let aaron : Person = aaronString
// Error: 'NSString' is not a subtype of 'Person'
Similarly, trying to cast an array of non-literal strings fails:
let nameStrings = ["John", "Jane"]
let people : [Person] = nameStrings
// Error: 'String' is not identical to 'Person'
I have three questions:
Is there another protocol I can implement to cast a non-literal string to a Person? I'd like to do this so I can cast entire collections to convert the objects.
If no to #1, is map + an initializer the best way to perform the conversion myself?
let nameStrings = ["John", "Jane"]
let people = nameStrings.map{Person(name: $0)}
If yes to #1, is there a similar approach I can use to specify an approach to convert two objects which are unrelated in hierarchy? That is, can I work around this error without an initializer?
let rikerPerson : Person = "Riker"
let rikerEmployee = rikerPerson as Employee
// Error: 'Person' is not convertible to 'Employee'
What you are describing as “casting” isn’t really casting (in the way that, say, s = “fred”; ns = s as NSString is, or that casts in C++ are).
let names = ["John", "Jane"] as [Person]
is just another a way of writing:
let names: [Person] = ["John", "Jane"]
that is, a way of telling Swift which of the many possible versions of StringLiteralConvertible to use (and not the one for String, which is the default).
Put it another way – your as is fulfilling a similar function to the as in this snippet that disambiguates two overloaded functions that differ only by return type:
func f() -> String { return "foo" }
func f() -> Int { return 42 }
let i = f() as Int // i will be 42
let s = f() as String // s will be “foo"
No “conversion” is going on here – the as is just being used to disambiguate which f Swift calls. It’s the same with which init(stringLiteral:) is chosen.
Definitely (but only if you put a space between map and the { } ;-).
If you’re concerned about the waste of converting it all to an array just to do some other thing with it, check out lazy(a).map
Nope. In the betas, there used to be a __conversion() -> T method you could implement to do "casts" like this on your own classes – or, more importantly, it allowed you to pass your Person class into a function that took an Employee argument and have it converted implicitly. But that was removed. Generally, that kind of implicit conversion is antithetical to Swift's style, except in rare cases (Obj-C and C interop, and implicit wrapping in optionals, being the main ones). You have to write an init for Employee that takes a Person (or some class or protocol that Person conforms to), and then call it.
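A minimal sketch of that explicit-init approach (these Person/Employee definitions are hypothetical stand-ins for the question's types):

```swift
class Person {
    var name: String?
    init(name: String) { self.name = name }
}

class Employee {
    var name: String?
    // The explicit conversion lives in an initializer, not a cast:
    init(person: Person) { self.name = person.name }
}

let rikerEmployee = Employee(person: Person(name: "Riker"))
```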