Why does return <expr> work when let var = <expr>; return var gives an error that it can't convert the type - swift

I have the following code, which compiles & works fine:
import RealmSwift
struct Bucket: Codable, Identifiable {
var id: UUID
var title: String
init() {
self.id = UUID()
self.title = "new bucket"
}
init(title: String) {
self.id = UUID()
self.title = title
}
init(id: UUID, title: String) {
self.id = id
self.title = title
}
}
class RealmBucket : Object {
@Persisted var id : UUID
@Persisted var title : String
convenience init(_ id: UUID, _ title: String) {
self.init()
self.id = id
self.title = title
}
}
func loadBuckets() -> [Bucket] {
let realm = try! Realm()
let realmBuckets = realm.objects(RealmBucket.self)
return realmBuckets.map { Bucket(id: $0.id, title: $0.title) }
}
but if I change the loadBuckets() function to:
func loadBuckets() -> [Bucket] {
let realm = try! Realm()
let realmBuckets = realm.objects(RealmBucket.self)
let result = realmBuckets.map { Bucket(id: $0.id, title: $0.title) }
return result
}
(just the last line changed)
I get the error:
Cannot convert return expression of type 'LazyMapSequence<Results<RealmBucket>, Bucket>' to return type '[Bucket]'
If I change the let line to be:
let result : [Bucket] = realmBuckets.map { Bucket(id: $0.id, title: $0.title) }
then it works again.
I can think of a couple possible explanations for the behavior, but they all seem to point to a compiler bug, or language deficiency, and I'm guessing that perhaps there is some language feature I'm unaware of.
Does anyone know why the compiler is able to automatically convert the LazyMapSequence when the expression is returned directly, but not when the return value is a variable, even though it clearly knows the type of the variable given the error it is giving me? I'm relatively new to Swift, and hoping to learn something from this.
Based on the current answers, my assumption is that it is just a slightly different case in the compiler code to convert the variable versus a method call, so it's really just a compiler deficiency, and likely to not exist down the road. In any case, it's easy enough to work around, it just struck me as odd.

You have to explicitly define the return type of the map function when you use the shorthand closure syntax; that is probably the reason.

A “return” statement knows the type it has to return, so it applies that knowledge when the right-hand side is evaluated, and can often resolve that expression to the correct type.
A let statement with no type annotation gives the compiler no expected type to work toward, so the right-hand side can be evaluated differently.
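To see the same effect without Realm, here is a minimal sketch using the standard library's lazy map (the function names are made up for illustration). Which map overload the compiler picks depends on whether an expected type is available:
func doubledDirect(_ numbers: [Int]) -> [Int] {
    // The return type [Int] is the contextual type here, so the eager
    // Sequence.map (returning an Array) is chosen and this compiles.
    return numbers.lazy.map { $0 * 2 }
}
func doubledViaVariable(_ numbers: [Int]) -> [Int] {
    // No annotation: the more specialized lazy overload wins, so `result`
    // is a LazyMapSequence, not an Array.
    let result = numbers.lazy.map { $0 * 2 }
    // return result   // error: cannot convert 'LazyMapSequence<...>' to '[Int]'
    return Array(result)   // or annotate: let result: [Int] = ...
}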

Is there a way to compile with a generic type?

This is something that has vexed a number of developers including myself. Let say we have a protocol that defines a subscript which we apply to a simple class.
protocol Cache {
subscript<Value>(_: String) -> Value? { get set }
}
class InMemoryCache: Cache {
private var cache: [String: Any] = [:]
subscript<Value>(key: String) -> Value? {
get {
cache[key] as? Value
}
set {
if let value = newValue {
cache[key] = value
} else {
cache.removeValue(forKey: key)
}
}
}
}
This works fine as long as we know the types:
cache["abc"] = 5
let x: Int? = cache["abc"]
but the developers want to do this:
cache["abc"] = nil
This won't compile because the compiler cannot determine the Value generic type. This, however, does work:
cache["abc"] = nil as String?
I've tried a number of things but they all have drawbacks. Things like adding a second subscript with the Any type. Nothing seems to work well even though it would seem like a simple problem.
Has anyone found a solution that handles cache["abc"] = nil?
You can do this by changing your protocol requirements somewhat.
Have the protocol require a subscript that does not use generics, and returns an Any?.
protocol Cache {
subscript(key: String) -> Any? { get set }
}
This subscript will let you do the following:
cache["abc"] = 5
cache["abc"] = nil
let value = cache["abc"] // value is an `Any?`
but it will not let you do this:
let number: Int? = cache["abc"] // error
So, let's fix that by adding another subscript to Cache. This subscript is equivalent to your original subscript requirement, except it doesn't need a setter and will call the other subscript (the one required by the protocol):
extension Cache {
subscript<Value>(key: String) -> Value? {
self[key] as? Value
}
}
(If you're worried that this subscript calls itself, don't be. self[key] here actually calls the other subscript, not this one. You can confirm this in Xcode by command-clicking on the [ or the ] in self[key] to jump to the definition of the other subscript.)
Then, implement the required subscript in your class:
class InMemoryCache: Cache {
private var cache: [String: Any] = [:]
subscript(key: String) -> Any? {
get { cache[key] }
set { cache[key] = newValue }
}
}
This will allow all of the following to compile:
let cache = InMemoryCache()
cache["abc"] = 5
let x: Int? = cache["abc"]
cache["abc"] = nil
There is a workaround to get the output you want.
Because the underlying storage is a dictionary, you can assign nil to it directly in your InMemoryCache:
class InMemoryCache: Cache {
private var cache: [String: Any] = [:]
subscript<Value>(key: String) -> Value? {
get {
cache[key] as? Value
}
set {
if let value = newValue {
cache[key] = value
} else {
cache[key] = nil // make nil directly here
}
}
}
}
Here, because Value is a generic type, you cannot assign nil directly; the nil must have a specific type.
Instead you can do it like this:
let nilValue : Int? = nil // any type nil you want
cache["abc"] = nilValue
or directly cast nil to an optional of any type before assigning it to the dictionary:
cache["abc"] = (nil as String?)
It will clear whatever value is stored under that key.
Example
// value
let nilValue : Int? = nil
var number : Int? = nil
var string : String? = nil
cache["abc"] = 5
number = cache["abc"] // Optional.some(5)
cache["abc"] = "abc"
number = cache["abc"] // nil
string = cache["abc"] // Optional.some("abc")
cache["abc"] = nilValue
number = cache["abc"] // nil
string = cache["abc"] // nil
The reason why you are having a hard time with this is because
cache["abc"] = nil
cannot be compiled. There is not enough information to infer the generic type of the subscript - or of the optional value. The compiler sees something like
cache<?>["abc"] = Optional<?>.none
How is it supposed to figure out what to put in place of the question marks?
There's another ambiguity. Your cache can contain any type, even Optional. When you are assigning nil to the subscript, how does anybody know if you want to remove the element or store an instance of Optional<Something>.none at the subscript?
When I find myself fighting the language in this way, I usually try to take a step back and ask if I am perhaps doing something fundamentally bad. I think, in this case, the answer is yes. You are trying to pretend something is more strictly typed than it really is.
I think your getter/setter should explicitly take a value that is of type Any. It works better and it has the advantage that it explicitly documents for the user that a Cache conforming type can store anything in it.
For this reason, I would say TylerP's solution is the best. However, I would not create a subscript in the extension; I would define a function:
extension Cache
{
func value<Value>(at key: String) -> Value?
{
self[key] as? Value
}
}
The reason for this is that the compiler can get confused when you have multiple subscripts with similar signatures. With the extension above, I can conform Dictionary<String, Any> to the protocol and not need a new class.
extension Dictionary: Cache where Key == String, Value == Any {}
var dict: [String : Any] = [:]
dict["abc"] = 5
let y: Int? = dict.value(at: "abc")
dict["abc"] = nil
Obviously, the above won't be useful to you if you need reference semantics for your cache.
TylerP's solution was pretty much bang on the money. For completeness though, here's what the code now looks like:
protocol Cache {
/// Handles when we need a value of a specific type.
subscript<Value>(_: String) -> Value? { get }
/// Handles getting and setting any value.
/// The getter is rarely used because the generic getter above
/// is used. Setting a value compiles because we don't care what
/// type is it. Setting a `nil` also compiles for the same reason.
subscript(_: String) -> Any? { get set }
}
class InMemoryCache: Cache {
private var cache: [String: Any] = [:]
subscript(key: String) -> Any? {
get { cache[key] }
set {
if let value = newValue {
cache[key] = value
} else {
cache.removeValue(forKey: key)
}
}
}
subscript<Value>(key: String) -> Value? {
cache[key] as? Value
}
}
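As a quick check that the combined design behaves as intended (the values here are just illustrative), all of the following now compile against InMemoryCache:
let cache = InMemoryCache()
cache["abc"] = 5                  // goes through the Any? setter
let n: Int? = cache["abc"]        // typed getter, Optional(5)
cache["abc"] = nil                // compiles, and removes the stored value
let gone: Int? = cache["abc"]     // nil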

How to subscript a dictionary that is stored inside a dictionary in swift?

let current_data: [String : Any] = ["table": "tasks", "data": ["title": title_of_task, "completed_by": date_of_task]]
if (!description_of_task.isEmpty){
current_data["data"]["description"] = description_of_task
}
I get the following error:
Value of type 'Any?' has no subscripts
As matt notes, you need to redesign current_data. It is possible to do what you're describing, but the code is extremely ugly, complicated, and fragile because all interactions with Any are ugly, complicated, and fragile. It seems unlikely that you mean "Any" here. Would it be reasonable for the value to be a UIViewController, or a CBPeripheral, or an array of NSTimers? If it would be a problem to pass any of those types, you don't mean "literally any type." You almost never mean that, so you should almost never use Any.
To answer the question as asked, the code would be:
if (!description_of_task.isEmpty) {
if var data = current_data["data"] as? [String: Any] {
data["description"] = description_of_task
current_data["data"] = data
}
}
Yes, that's horrible, and if there are any mistakes, it will quietly do nothing without giving you any errors.
You could, however, redesign this data type using structs:
struct Task {
var title: String
var completedBy: String
var description: String
}
struct Row {
var table: String
var data: Task
}
With that, the code is trivial:
var row = Row(table: "tasks",
data: Task(title: "title_of_task",
completedBy: "date_of_task",
description: ""))
// ...
if !descriptionOfTask.isEmpty {
row.data.description = descriptionOfTask
}

Is there a shorter way of declaring CodingKeys?

Say you have a struct for a model of your API response. Let's say it has 50 members. However, 5-7 members use non-standard casing (you could have AUsernAme or _BTmember), while the rest are all snake case, like credit_score or status_code.
Rather than writing all members like this:
struct MyStruct {
let aUserName: String
// +50 more...
private enum CodingKeys: String, CodingKey {
case aUserName = "AUsernAme"
// +50 more...
}
}
Is there a way that we can write it like this?
struct MyStruct {
#CodingKey("AUsernAme") let aUserName: String
let creditScore: Int
// +50 more ...
}
Edit: I guess this is not possible with the current Swift version, but does anyone know if this would somehow be included in the future versions of Swift?
The solution which Sweeper provided is a great solution to your problem, but IMO it may add a lot of complexity for you and for the next developers who will read this code.
If I were you, I would just stick to writing all the CodingKeys for simplicity. If your worry is writing a lot of lines of cases, you can write all the cases that don't need custom keys on one line and just add the keys with unusual/non-standard casing on new lines:
case property1, property2, property3, property4, property5...
case property50 = "_property50"
And since you mentioned that the rest are in snake case (in case you don't know about it yet), there is JSONDecoder.KeyDecodingStrategy.convertFromSnakeCase.
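A minimal sketch combining the two suggestions (the property names here are illustrative): the snake_case keys are handled by the decoding strategy, so only the non-standard names need explicit raw values, and the rest can share a line.
struct MyStruct: Decodable {
    let aUserName: String
    let creditScore: Int
    let statusCode: Int

    private enum CodingKeys: String, CodingKey {
        case creditScore, statusCode     // matched after convertFromSnakeCase
        case aUserName = "AUsernAme"     // non-standard casing spelled out
    }
}
let decoder = JSONDecoder()
decoder.keyDecodingStrategy = .convertFromSnakeCase   // credit_score -> creditScore, etc.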
Hope this helps! :)
How about setting a custom keyDecodingStrategy just before you decode instead?
struct AnyCodingKey: CodingKey, Hashable {
var stringValue: String
init(stringValue: String) {
self.stringValue = stringValue
}
var intValue: Int?
init(intValue: Int) {
self.intValue = intValue
self.stringValue = "\(intValue)"
}
}
let mapping = [
"AUsernAme": "aUserName",
// other mappings...
]
let decoder = JSONDecoder()
decoder.keyDecodingStrategy = .custom({ codingPath in
let key = codingPath[0].stringValue
guard let mapped = mapping[key] else { return codingPath.last! }
return AnyCodingKey(stringValue: mapped)
})
This assumes your JSON has a single level flat structure. You can make this into an extension:
extension JSONDecoder.KeyDecodingStrategy {
static func mappingRootKeys(_ dict: [String: String]) -> JSONDecoder.KeyDecodingStrategy {
return .custom { codingPath in
let key = codingPath[0].stringValue
guard let mapped = dict[key] else { return codingPath.last! }
return AnyCodingKey(stringValue: mapped)
}
}
}
let decoder = JSONDecoder()
decoder.keyDecodingStrategy = .mappingRootKeys(mapping)
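For example, here is a hypothetical round trip with the strategy above (the struct is trimmed to one property and the JSON payload is made up):
struct MyStruct: Decodable {
    let aUserName: String
}
let json = Data(#"{"AUsernAme":"alice"}"#.utf8)
let decoded = try decoder.decode(MyStruct.self, from: json)
print(decoded.aUserName)   // "alice"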
If your JSON has more levels, you can change the type of the dictionary to [JSONPath: String], where JSONPath is a type that you can create that represents a key in a nested JSON. Then add a bit of code that converts the coding path, which is just an array of coding keys, to JSONPath. This should not be hard to write on your own.
A simple way is to just use [AnyCodingKey] as JSONPath, but there are many other ways too, and I encourage you to experiment and find the one you like the best.
typealias JSONPath = [AnyCodingKey]
extension AnyCodingKey {
init(codingKey: CodingKey) {
if let int = codingKey.intValue {
self.init(intValue: int)
} else {
self.init(stringValue: codingKey.stringValue)
}
}
}
extension JSONDecoder.KeyDecodingStrategy {
static func mappingRootKeys(_ dict: [JSONPath: String]) -> JSONDecoder.KeyDecodingStrategy {
return .custom { codingPath in
guard let mapped = dict[codingPath.map(AnyCodingKey.init(codingKey:))] else { return codingPath.last! }
return AnyCodingKey(stringValue: mapped)
}
}
}
let mapping = [
[AnyCodingKey(stringValue: "AUsernAme")]: "aUserName"
]
It is not possible to use a property wrapper for this. Your property wrapper @CodingKey("AUsernAme") let aUserName: String will be compiled to something like this:
private var _aUserName: CodingKey<String> = CodingKey("AUsernAme")
var aUserName: String {
get { _aUserName.wrappedValue }
set { _aUserName.wrappedValue = newValue }
}
There are two main problems with this:
Assuming you don't want to write init(from:) for all the 50+ properties in MyStruct, code will be synthesised to decode it, assigning to its _aUserName property. You only have control over the init(from:) initialiser of the CodingKey property wrapper, and you cannot do anything about how MyStruct is decoded in there. If MyStruct is contained in another struct:
struct AnotherStruct: Decodable {
let myStruct: MyStruct
}
Then you can indeed control the coding keys used to decode myStruct by marking it with a property wrapper. You can do whatever you want in the decoding process by implementing the property wrapper's init(from:), which brings us to the second problem:
The coding key you pass to the CodingKey property wrapper is passed via an initialiser of the form init(_ key: String). But you control the decoding via the initialiser init(from decoder: Decoder) because that is what will be called when the struct is decoded. In other words, there is no way for you to send the key mappings to the property wrapper.

Extending a constrained protocol for an array argument is not possible

I'm going to explain it with an example. We have a protocol that forces having a firstName and a lastName, like:
protocol ProfileRepresentable {
var firstName: String { get }
var lastName: String { get }
}
the type we are going to use has these two, but in optional form:
struct Profile {
var firstName: String?
var lastName: String?
}
so after conforming to ProfileRepresentable, we extend ProfileRepresentable to return the stored value, with a default for the nil case:
extension Profile: ProfileRepresentable { }
extension ProfileRepresentable where Self == Profile {
var firstName: String { self.firstName ?? "NoFirstName" }
var lastName: String { self.lastName ?? "NoLastName" }
}
So far so good
Now there is a similar flow for a list of Profiles.
protocol ProfilerRepresentable {
var profiles: [ProfileRepresentable] { get }
}
struct Profiler {
var profiles: [Profile]
}
First issue
conforming to ProfilerRepresentable does NOT automatically provide the implementation as expected (since Profile already conforms to ProfileRepresentable)
extension Profiler: ProfilerRepresentable { }
Second Issue
Following the previous pattern, extending ProfilerRepresentable does not work as expected and raises a warning:
⚠️ All paths through this function will call itself
extension ProfilerRepresentable where Self == Profiler {
var profiles: [ProfileRepresentable] { self.profiles }
}
How can I achieve the goal for arrays?
Here is a possible solution. Tested with Xcode 12 / Swift 5.3.
protocol ProfilerRepresentable {
associatedtype T:ProfileRepresentable
var profiles: [T] { get }
}
extension Profiler: ProfilerRepresentable { }
struct Profiler {
var profiles: [Profile]
}
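Just to sketch how this version is consumed (the function name is made up, and it assumes the constrained-extension conformance from the question compiles as shown): instead of storing values typed as the protocol, algorithms take a generic parameter constrained to ProfilerRepresentable.
func fullNames<P: ProfilerRepresentable>(_ profiler: P) -> [String] {
    // firstName/lastName come from the ProfileRepresentable requirements
    profiler.profiles.map { "\($0.firstName) \($0.lastName)" }
}
let names = fullNames(Profiler(profiles: [Profile(firstName: "Ada", lastName: nil)]))
// names == ["Ada NoLastName"]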
[Profile] is not a subtype of [ProfileRepresentable]. (See Swift Generics & Upcasting for a related but distinct version of this question.) It can be converted through a compiler-provided copying step when passed as a parameter or assigned to a variable, but this is provided as a special-case for those very common uses. It doesn't apply generally.
How you should address this depends on what precisely you want to do with this type.
If you have an algorithm that relies on ProfilerRepresentable, then Asperi's solution is ideal and what I recommend. But going that way won't allow you to create a variable of type ProfilerRepresentable or put ProfilerRepresentable in an Array.
If you need variables or arrays of ProfilerRepresentable, then you should ask yourself what these protocols are really doing. What algorithms rely on these protocols, and what other reasonable implementations of ProfileRepresentable really make sense? In many cases, ProfileRepresentable should just be replaced with a simple Profile struct, and then have different init methods for creating it in different contexts. (This is what I recommend if your real problem looks a lot like your example, and Asperi's answer doesn't work for you.)
Ultimately you can create type erasers (AnyProfile), but I suggest exploring all other options (particularly redesigning how you do composition) first. Type erasers are perfect if your goal is to erase a complicated or private type (AnyPublisher), but that generally isn't what people mean when they reach for them.
But designing this requires knowing a more concrete goal. There is no general answer that universally applies.
Looking at your comments, there is no problem with having multiple types for the same entity if they represent different things. Structs are values. It's fine to have both Double and Float types, even though every Float can also be represented as a Double. So in your case it looks like you just want Profile and PartialProfile structs, and an init that lets you convert one to the other.
struct Profile {
var firstName: String
var lastName: String
}
struct PartialProfile {
var firstName: String?
var lastName: String?
}
extension Profile {
init(_ partial: PartialProfile) {
self.firstName = partial.firstName ?? "NoFirstName"
self.lastName = partial.lastName ?? "NoLastName"
}
}
extension PartialProfile {
init(_ profile: Profile) {
self.firstName = profile.firstName
self.lastName = profile.lastName
}
}
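A quick illustration of the two converting inits above (the values are made up):
let partial = PartialProfile(firstName: "Ada", lastName: nil)
let full = Profile(partial)            // Profile(firstName: "Ada", lastName: "NoLastName")
let roundTrip = PartialProfile(full)   // both fields populated again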
It's possible that you have a lot of these, so this could get a bit tedious. There are many ways to deal with that depending on exactly the problem you're solving. (I recommend starting by writing concrete code, even if it causes a lot of duplication, and then seeing how to remove that duplication.)
One tool that could be useful would be Partial<Wrapped> (inspired by TypeScript) that would create an "optional" version of any non-optional struct:
@dynamicMemberLookup
struct Partial<Wrapped> {
private var storage: [PartialKeyPath<Wrapped>: Any] = [:]
subscript<T>(dynamicMember member: KeyPath<Wrapped, T>) -> T? {
get { storage[member] as! T? }
set { storage[member] = newValue }
}
}
struct Profile {
var firstName: String
var lastName: String
var age: Int
}
var p = Partial<Profile>()
p.firstName = "Bob"
p.firstName // "Bob"
p.age // nil
And a similar converter:
extension Profile {
init(_ partial: Partial<Profile>) {
self.firstName = partial.firstName ?? "NoFirstName"
self.lastName = partial.lastName ?? "NoLastName"
self.age = partial.age ?? 0
}
}
Now moving on to your Array problem, switching between these is just a map.
var partials: [Partial<Profile>] = ...
let profiles = partials.map(Profile.init)
(Of course you could create an Array extension to make this a method like .asWrapped() if it were convenient.)
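A sketch of that convenience extension, using the asWrapped() name from the parenthetical and the Profile init defined above:
extension Array where Element == Partial<Profile> {
    func asWrapped() -> [Profile] { map(Profile.init) }
}
let wrapped = partials.asWrapped()   // same as partials.map(Profile.init)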
The other direction is slightly tedious in the simplest approach:
extension Partial where Wrapped == Profile {
init(_ profile: Profile) {
self.init()
self.firstName = profile.firstName
self.lastName = profile.lastName
self.age = profile.age
}
}
If there were a lot of types, it might be worth it to make Partial a little more complicated so you could avoid this. Here's one approach that allows Partial to still be mutable (which I expect would be valuable) while also allowing it to be trivially mapped from the wrapped instances.
@dynamicMemberLookup
struct Partial<Wrapped> {
private var storage: [PartialKeyPath<Wrapped>: Any] = [:]
private var wrapped: Wrapped?
subscript<T>(dynamicMember member: KeyPath<Wrapped, T>) -> T? {
get { storage[member] as! T? ?? wrapped?[keyPath: member] }
set { storage[member] = newValue }
}
}
extension Partial {
init(_ wrapped: Wrapped) {
self.init()
self.wrapped = wrapped
}
}
I don't love this solution; it has a weird quirk where partial.key = nil doesn't work to clear a value. But I don't have a nice fix until we get KeyPathIterable. But there are some other routes you could take depending on your precise problem. And of course things can be simpler if Partial isn't mutable.
The point is that there's no need for protocols here. Just values and structs, and convert between them when you need to. Dig into #dynamicMemberLookup. If your problems are very dynamic, then you may just want more dynamic types.

Reference Types/Subclassing, and Changes to Swift 4 Codable & encoder/decoders

I'm struggling to understand class/reference type behavior and how this relates to changes as I try to upgrade and reduce code using Codable in Swift 4.
I have two classes – a SuperClass with all of the data that will be persistent and that I save to UserDefaults (a place name & string with coordinates), and a SubClass that contains additional, temporary info that I don't need (weather data for the SuperClass coordinates).
In Swift 3 I used to save data like this:
func saveUserDefaults() {
var superClassArray = [SuperClass]()
// subClassArray is of type [SubClass] and contains more data per element.
superClassArray = subClassArray
let superClassData = NSKeyedArchiver.archivedData(withRootObject: superClassArray)
UserDefaults.standard.set(superClassData, forKey: "superClassData")
}
SuperClass conformed to NSObject & NSCoding
It also included the required init decoder & the encode function.
It all worked fine.
In trying to switch to Swift 4 & codable I've modified SuperClass to conform to Codable.
SuperClass now only has one basic initializer and none of the encoder/decoder stuff from Swift 3. There is no KeyedArchiving happening with this new approach (below). SubClass remains unchanged. Unfortunately I crash on the line where I try? encoder.encode [giving a Thread 1: EXC_BAD_ACCESS (code=1, address=0x10)]. My assumption is that the encoder is getting confused with identical reference types where one is SuperClass and one SubClass (subClassArray[0] === superClassArray[0] is true).
I thought this might work:
func saveUserDefaults() {
var superClassArray = [SuperClass]()
superClassArray = subClassArray
// assumption was that the subclass would only contain parts of the superclass & wouldn't produce an error when being encoded
let encoder = JSONEncoder()
if let encoded = try? encoder.encode(superClassArray){
UserDefaults.standard.set(encoded, forKey: "superClassArray")
} else {
print("Save didn't work!")
}
}
Then, instead of creating an empty superClassArray, then using:
superClassArray = subClassArray, as shown above, I replace this with the single line:
let superClassArray: [SuperClass] = subClassArray.map{SuperClass(name: $0.name, coordinates: $0.coordinates)}
This works. Again, my assumption is that this is because I'm passing in the values held inside the reference type, and haven't made superClassArray = subClassArray. Also, as expected, subClassArray[0] === superClassArray[0] is false.
So why did the "old stuff" in Swift 3 work, even though I used the line superClassArray = subClassArray before calling NSKeyedArchiver.archivedData(withRootObject: superClassArray)? Am I essentially achieving, by building a new array in Swift 4, the same result that was happening with the old Swift 3 encoder/decoder? Is the looping / recreation of the array the right way to do this?
Thanks!
Polymorphic persistence appears to be broken by design.
The bug report SR-5331 quotes the response they got on their Radar.
Unlike the existing NSCoding API (NSKeyedArchiver), the new Swift 4 Codable implementations do not write out type information about encoded types into generated archives, for both flexibility and security. As such, at decode time, the API can only use the concrete type you provide in order to decode the values (in your case, the superclass type).
This is by design — if you need the dynamism required to do this, we recommend that you adopt NSSecureCoding and use NSKeyedArchiver/NSKeyedUnarchiver
I am unimpressed, having thought from all the glowing articles that Codable was the answer to some of my prayers. A parallel set of Codable structs that act as object factories is one workaround I'm considering, to preserve type information.
Update I have written a sample using a single struct that manages recreating polymorphic classes. Available on GitHub.
I was not able to get it to work easily with subclassing. However, classes that conform to a base protocol can apply Codable for default encoding. The repo contains both keyed and unkeyed approaches. The simpler is the unkeyed one, copied below:
// Demo of a polymorphic hierarchy of different classes implementing a protocol
// and still being Codable
// This variant uses unkeyed containers so less data is pushed into the encoded form.
import Foundation
protocol BaseBeast {
func move() -> String
func type() -> Int
var name: String { get }
}
class DumbBeast : BaseBeast, Codable {
static let polyType = 0
func type() -> Int { return DumbBeast.polyType }
var name:String
init(name:String) { self.name = name }
func move() -> String { return "\(name) Sits there looking stupid" }
}
class Flyer : BaseBeast, Codable {
static let polyType = 1
func type() -> Int { return Flyer.polyType }
var name:String
let maxAltitude:Int
init(name:String, maxAltitude:Int) {
self.maxAltitude = maxAltitude
self.name = name
}
func move() -> String { return "\(name) Flies up to \(maxAltitude)"}
}
class Walker : BaseBeast, Codable {
static let polyType = 2
func type() -> Int { return Walker.polyType }
var name:String
let numLegs: Int
let hasTail: Bool
init(name:String, legs:Int=4, hasTail:Bool=true) {
self.numLegs = legs
self.hasTail = hasTail
self.name = name
}
func move() -> String {
if numLegs == 0 {
return "\(name) Wriggles on its belly"
}
let maybeWaggle = hasTail ? "wagging its tail" : ""
return "\(name) Runs on \(numLegs) legs \(maybeWaggle)"
}
}
// Uses an explicit index we decode first, to select factory function used to decode polymorphic type
// This is in contrast to the current "traditional" method where decoding is attempted and fails for each type
// This pattern of "leading type code" can be used in more general encoding situations, not just with Codable
//: **WARNING** there is one vulnerable practice here - we rely on the BaseBeast types having a typeCode which
//: is a valid index into the arrays `encoders` and `factories`
struct CodableRef : Codable {
let refTo:BaseBeast //In C++ would use an operator to transparently cast CodableRef to BaseBeast
typealias EncContainer = UnkeyedEncodingContainer
typealias DecContainer = UnkeyedDecodingContainer
typealias BeastEnc = (inout EncContainer, BaseBeast) throws -> ()
typealias BeastDec = (inout DecContainer) throws -> BaseBeast
static var encoders:[BeastEnc] = [
{(e, b) in try e.encode(b as! DumbBeast)},
{(e, b) in try e.encode(b as! Flyer)},
{(e, b) in try e.encode(b as! Walker)}
]
static var factories:[BeastDec] = [
{(d) in try d.decode(DumbBeast.self)},
{(d) in try d.decode(Flyer.self)},
{(d) in try d.decode(Walker.self)}
]
init(refTo:BaseBeast) {
self.refTo = refTo
}
init(from decoder: Decoder) throws {
var container = try decoder.unkeyedContainer()
let typeCode = try container.decode(Int.self)
self.refTo = try CodableRef.factories[typeCode](&container)
}
func encode(to encoder: Encoder) throws {
var container = encoder.unkeyedContainer()
let typeCode = self.refTo.type()
try container.encode(typeCode)
try CodableRef.encoders[typeCode](&container, refTo)
}
}
struct Zoo : Codable {
var creatures = [CodableRef]()
init(creatures:[BaseBeast]) {
self.creatures = creatures.map {CodableRef(refTo:$0)}
}
func dump() {
creatures.forEach { print($0.refTo.move()) }
}
}
//: ---- Demo of encoding and decoding working ----
let startZoo = Zoo(creatures: [
DumbBeast(name:"Rock"),
Flyer(name:"Kookaburra", maxAltitude:5000),
Walker(name:"Snake", legs:0),
Walker(name:"Doggie", legs:4),
Walker(name:"Geek", legs:2, hasTail:false)
])
startZoo.dump()
print("---------\ntesting JSON\n")
let encoder = JSONEncoder()
encoder.outputFormatting = .prettyPrinted
let encData = try encoder.encode(startZoo)
print(String(data:encData, encoding:.utf8)!)
let decodedZoo = try JSONDecoder().decode(Zoo.self, from: encData)
print ("\n------------\nAfter decoding")
decodedZoo.dump()
Update 2020-04 experience
This approach continues to be more flexible and superior to using Codable, at the cost of a bit more programmer time. It is used very heavily in the Touchgram app which provides rich, interactive documents inside iMessage.
There, I need to encode multiple polymorphic hierarchies, including different Sensors and Actions. By storing signatures of decoders, it not only copes with subclassing but also allows me to keep older decoders in the code base so old messages are still compatible.
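For what it's worth, here is a sketch of how a new class would slot into this pattern (Swimmer is invented for illustration): it takes the next polyType index and registers matching encode/decode closures, and, as the WARNING comment above notes, that index must stay in sync with its position in the encoders and factories arrays.
class Swimmer : BaseBeast, Codable {
    static let polyType = 3
    func type() -> Int { return Swimmer.polyType }
    var name: String
    let maxDepth: Int
    init(name: String, maxDepth: Int) {
        self.maxDepth = maxDepth
        self.name = name
    }
    func move() -> String { return "\(name) Dives down to \(maxDepth) metres" }
}
// Register at index 3, e.g. at startup, so CodableRef can round-trip it:
CodableRef.encoders.append({ (e, b) in try e.encode(b as! Swimmer) })
CodableRef.factories.append({ (d) in try d.decode(Swimmer.self) })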