Generic [T.Generator.Element] initializer in Swift

I'm trying to learn Swift and working through the examples in Ch. 1 of Apple's book. The last exercise is giving me a lot of headaches: I am trying to construct a function that returns the common elements of two sequences passed as parameters. Here is the code I tried:
func anyCommonElements<T, U where T: SequenceType, U: SequenceType,
    T.Generator.Element: Equatable, T.Generator.Element == U.Generator.Element>
    (lhs: T, rhs: U) -> [T.Generator.Element] {
    var result: [T.Generator.Element] // how to default-initialize it?
    for lhsItem in lhs {
        for rhsItem in rhs {
            if lhsItem == rhsItem {
                result.append(lhsItem)
            }
        }
    }
    return result
}
My only problem is that I don't know how to initialize result (of type [T.Generator.Element]), and I therefore cannot use it later to append the common elements. I tried the obvious
var result : [T.Generator.Element] = [T.Generator.Element]()
however I'm getting the error Could not find member Element, and if I try
var result : [T.Generator.Element] = [T.Generator.Element()]()
the compiler spits T.Generator.Element cannot be constructed because it has no accessible initializers
Any idea how to initialize such an array? Or is there any other obvious solution (which I'm not seeing now)?

You can create an empty array (or dictionary) without calling the initializer of its element type:
var result : [T.Generator.Element] = []
(For an empty dictionary, use [:].)
However, it seems like your first try ([T.Generator.Element]()) should be acceptable to the compiler -- that doesn't call the element type's initializer. I'd recommend filing a bug and asking on the developer forums to see what the official Apple answer is.
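As an aside (not part of the original answer): on modern toolchains the same function is written with Sequence and Element instead of SequenceType and Generator.Element. A minimal sketch with the empty-literal fix applied:
// Modern-Swift (3+) rewrite of the question's function, keeping its
// nested-loop logic; `result` starts as an empty array literal.
func anyCommonElements<T: Sequence, U: Sequence>(_ lhs: T, _ rhs: U) -> [T.Element]
    where T.Element: Equatable, T.Element == U.Element {
    var result: [T.Element] = [] // the fix: an empty array literal
    for lhsItem in lhs {
        for rhsItem in rhs {
            if lhsItem == rhsItem {
                result.append(lhsItem)
            }
        }
    }
    return result
}

print(anyCommonElements([1, 2, 3], [3, 4])) // [3]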

Related

Swift method needs to sort a polymorphic collection of UInt32

Confused (and a little frustrated) with Swift 5 at the moment.
I have a method:
func oidsMany(_ oids: Array<UInt32>) -> MCCommandBuilder {
    let sorted_oids: [UInt32] = oids.sorted()
    ...
}
Discovered I have a case where I want to pass a Set to this method just as well. Either way, I'm going to sort an Array or a Set to an Array right away.
Waded through the many many many protocols that both Set and Array conform to, noticed that they both conform to Sequence and that Sequence responds to sorted. Perfect.
But when I change the above to:
func oidsMany(_ oids: Sequence<UInt32>) -> MCCommandBuilder {
    let sorted_oids: [UInt32] = oids.sorted()
    ...
}
I get the following error hints:
Cannot specialize non-generic type 'Sequence'
Member 'sorted' cannot be used on value of protocol type 'Sequence'; use a generic constraint instead
What's the right way to approach this? I could just add a second oidsMany(_ Set...) overload that converts its argument to an array and calls back into the first one, but I feel like I'm missing something fundamental here. My experience from other languages is not mapping over well here.
You can, as the error message suggests, use a generic constraint instead:
func oidsMany2<Sortable: Sequence>(_ oids: Sortable) -> MCCommandBuilder where Sortable.Element: Comparable {
    let sorted_oids: [Sortable.Element] = oids.sorted()
    //...
}
If you only want to accept sequences whose elements are UInt32, you can change the where clause to
where Sortable.Element == UInt32
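To show the constrained version in action, here is a hedged, self-contained sketch (MCCommandBuilder is stubbed out as an empty struct purely so the example compiles):
// Hypothetical stand-in for the real MCCommandBuilder.
struct MCCommandBuilder {}

func oidsMany<S: Sequence>(_ oids: S) -> MCCommandBuilder where S.Element == UInt32 {
    let sortedOids: [UInt32] = oids.sorted() // sorted() is available since UInt32 is Comparable
    print(sortedOids)
    return MCCommandBuilder()
}

// Both calls resolve to the single generic overload:
_ = oidsMany([3, 1, 2] as [UInt32])  // Array<UInt32> -> [1, 2, 3]
_ = oidsMany(Set<UInt32>([9, 7, 8])) // Set<UInt32>   -> [7, 8, 9]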

Forcing an Encoder's UnkeyedEncodingContainer to only contain one type of value

As part of a custom Encoder, I am coding an UnkeyedEncodingContainer. However, the specific format I am making it for requires that all elements of an array be of the same type. Specifically, arrays can contain:
Integers, all of the same size
Floats or Doubles
Other arrays (not necessarily all containing the same kinds of elements)
Objects
Here is the type of answer I need: the basis of an UnkeyedEncodingContainer implementation that conforms to the protocol and enforces that all elements be of the same type among those specified above.
As requested, here are examples of things that should or should not be encodable:
var valid1 = []
var valid2 = [3, 3, 5, 9]
var valid3 = ["string", "array"]
var invalid1 = [3, "test"]
var invalid2 = [5, []]
var invalid3 = [[3, 5], {"hello" : 3}]
// These may not all even be valid Swift arrays, they are only
// intended as examples
As an example, here is the best I have come up with, which does not work:
The UnkeyedEncodingContainer contains a function, checkCanEncode, and a computed property, elementType:
var elementType: ElementType {
    if self.count == 0 {
        return .None
    } else {
        return self.storage[0].containedType
    }
}
func checkCanEncode(_ value: Any?, compatibleElementTypes: [ElementType]) throws {
    guard compatibleElementTypes.contains(self.elementType) || self.elementType == .None else {
        let context = EncodingError.Context(
            codingPath: self.nestedCodingPath,
            debugDescription: "Cannot encode value to an array of \(self.elementType)s"
        )
        throw EncodingError.invalidValue(value as Any, context)
    }
}
// I know the .None is weird and could be replaced by an optional,
// but it is useful as its rawValue is 0. The Encoder has to encode
// the rawValue of the ElementType at some point, so using an optional
// would actually be more complicated
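The ElementType enum itself is not shown in the question; here is a plausible reconstruction, inferred purely from how it is used above (the exact cases are assumptions):
// Hypothetical sketch of the ElementType referenced above. A raw value of
// 0 for .None matches the remark about encoding the rawValue.
enum ElementType: UInt8 {
    case None = 0
    case Integer
    case Float
    case Array
    case Dictionary
}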
Everything is then encoded as a contained singleValueContainer:
func encode<T>(_ value: T) throws where T: Encodable {
    let container = self.nestedSingleValueContainer()
    try container.encode(value)
    try checkCanEncode(value, compatibleElementTypes: [container.containedType])
}
// containedType is an instance variable of SingleValueContainer that is set
// when a value is encoded into it
But this causes an issue when it comes to nestedContainer and nestedUnkeyedContainer (used for nested dictionaries and arrays, respectively):
// This violates the protocol: this function should not be able to throw
func nestedContainer<NestedKey>(keyedBy keyType: NestedKey.Type) throws -> KeyedEncodingContainer<NestedKey> where NestedKey: CodingKey {
    let container = KeyedContainer<NestedKey>(
        codingPath: self.nestedCodingPath,
        userInfo: self.userInfo
    )
    try checkCanEncode(container, compatibleElementTypes: [.Dictionary])
    self.storage.append(container)
    return KeyedEncodingContainer(container)
}
As you can see, since I need checkCanEncode to know whether it is even possible to create a NestedContainer in the first place (because if the array already has stuff inside that aren't dictionaries, then adding dictionaries to it is invalid), I have to make the function throw. But this breaks the UnkeyedEncodingContainer protocol which demands non-throwing versions.
But I can't just handle the error inside the function! If something tries to put an array inside an array of integers, it must fail. Therefore this is an invalid solution.
Additional remarks:
Checking after having encoded the values already feels sketchy, but checking only when producing the final encoded payload is definitely a violation of the "No Zombies" principle (fail as soon as the program enters an invalid state) which I would rather avoid. However if no better solution is possible I may accept it as a last resort.
One other solution I have thought about is encoding the array as a dictionary with numbered keys, since dictionaries in this format may contain mixed types. However this is likely to pose decoding issues, so once again, it is a last resort.
Unless anyone has a better idea, here is the best I could come up with:
Do not enforce that all elements be of the same type inside the UnkeyedEncodingContainer
If all elements are the same type, encode it as an array
If elements have varying types, encode it as a dictionary with integers as keys
This is completely fine as far as the encoding format goes, has minimal costs, only slightly complicates decoding (check whether the keys are integers), and greatly widens how many different Swift objects will be compatible with the format.
Note: Remember that the "real" encoding step where the data is generated is not actually part of the protocol. That is where I am proposing the shenanigans should take place 😈
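A minimal sketch of that final decision step, reusing the hypothetical ElementType from earlier (the Payload enum and every name here are assumptions, not the question's API):
// Hypothetical: each stored element carries its detected type and its
// already-encoded payload; emit an array when all types match, otherwise
// a dictionary keyed by the element's index.
enum Payload {
    case array([Payload])
    case dictionary([String: Payload])
    case int(Int)
    case string(String)
}

func finalPayload(for items: [(containedType: ElementType, payload: Payload)]) -> Payload {
    let types = items.map { $0.containedType }
    // An empty array is trivially homogeneous.
    if types.allSatisfy({ $0 == types.first }) {
        return .array(items.map { $0.payload }) // homogeneous: a real array
    }
    let keyed = Dictionary(uniqueKeysWithValues:
        items.enumerated().map { (String($0.offset), $0.element.payload) })
    return .dictionary(keyed) // mixed: integer-keyed dictionary
}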
I don't know whether this helps you, but in Swift you can overload functions. This means that you can declare functions with the same name but with different parameter types or constraints. The compiler will always pick the right one. It's much more efficient than a type check at runtime.
The first method is called if the element type of the array conforms to Encodable:
func encode<T: Encodable>(object: [T]) throws -> Data {
    return try JSONEncoder().encode(object)
}
Otherwise the second method is called. If the array cannot even be encoded with JSONSerialization, you can add custom encoding logic:
func encode<T>(object: [T]) throws -> Data {
    do {
        return try JSONSerialization.data(withJSONObject: object)
    } catch {
        // do custom encoding and return Data
        return try myCustomEncoding(object)
    }
}
This example
let array = [["1":1, "2":2]]
try encode(object: array)
calls the first method.
On the other hand this, even though the actual content is not heterogeneous, calls the second method:
let array : [[String:Any]] = [["1":1, "2":2]]
try encode(object: array)

Swift - Ambiguous reference to member '==' error

I've done some searching and can't seem to find a solution for my issue below. I'm fairly new to Swift, so I'm working through some issues. I have the function below, which is just a stub containing the signature and a return value, just to get it to compile.
The test code is what I've been given; I did not write it and unfortunately cannot alter it.
My issue is that when I run it, the test that calls this function fails with an "ambiguous reference to member '=='" error. I cannot alter the test code. Whatever the issue is must be in my function signature, I'm assuming, but I could use some help.
The function I am writing (which I assume contains the issue):
func contains(target: StringOrInt, compare: Character) -> Bool {
    return false
} // contains
The test that calls the function (I'm not allowed to edit this and did not write it):
func test_contains_cons_2_strings_3() {
    let list1 = MyList.cons("foo", MyList.cons("bar", MyList.empty))
    assertEquals(testName: "test_contains_cons_2_strings_3",
                 expected: true,
                 received: list1.contains(target: "bar", compare: ==)) // Error is on this line: '=='
} // test_contains_cons_2_strings_3
Error:
main.swift:807:67: error: ambiguous reference to member '=='
received: list1.contains(target: "foo", compare: ==))
Also note that "StringOrInt" is a protocol that I've defined as an extension on both Int and String. This is done because the test code (which I did not write and cannot edit) passes both strings and ints to this same variable and function.
Thanks in advance for any help, I really appreciate it!
I think you want to compare two strings by passing the == operator and return a Boolean value. If so, you can use the method below:
func myList(_ compare: ((String, String) -> Bool)) -> Bool {
    return compare("foo", "bar")
}
myList(==)
I was able to figure this out using the approach below. I implemented a generic enum and then an extension on that enum. I then used this generic type within the contains function to represent multiple types, instead of StringOrInt. Thanks everyone for the comments and the help.
The enum implemented:
indirect enum my_list<A> {
    case cons(A, my_list<A>)
    case empty
}
I then implemented an extension for this enum (extension my_list) and placed the contains function within it.
The contains function signature/stub:
func contains(target: A, compare: ((A, A) -> Bool)) -> Bool {
    return true
} // contains
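For completeness, here is a hedged, self-contained sketch that fills in the stub with a working recursive walk (the recursive body is my addition, not the asker's; the enum is renamed MyList to match the test code):
// Hypothetical working version of the stub above.
indirect enum MyList<A> {
    case cons(A, MyList<A>)
    case empty

    // Walk the list, applying the supplied comparator to each element.
    func contains(target: A, compare: (A, A) -> Bool) -> Bool {
        switch self {
        case .empty:
            return false
        case let .cons(head, tail):
            return compare(head, target) || tail.contains(target: target, compare: compare)
        }
    }
}

let list1 = MyList.cons("foo", MyList.cons("bar", MyList.empty))
print(list1.contains(target: "bar", compare: ==)) // true; '==' now resolves, since A is String here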

Can a condition be used to determine the type of a generic?

I will first explain what I'm trying to do and how I got to where I got stuck before getting to the question.
As a learning exercise for myself, I took some problems that I had already solved in Objective-C to see how I can solve them differently with Swift. The specific case that I got stuck on is a small piece that captures a value before and after it changes and interpolates between the two to create keyframes for an animation.
For this I had an object Capture with properties for the object, the key path, and two id properties for the values before and after. Later, when interpolating the captured values, I made sure that they could be interpolated by wrapping each of them in a Value class that used a class cluster to return an appropriate class depending on the type of value it wrapped, or nil for types that weren't supported.
This works, and I am able to make it work in Swift as well following the same pattern, but it doesn't feel Swift like.
What worked
Instead of wrapping the captured values as a way of enabling interpolation, I created a Mixable protocol that the types could conform to and used a protocol extension for when the type supported the necessary basic arithmetic:
protocol SimpleArithmeticType {
    func +(lhs: Self, right: Self) -> Self
    func *(lhs: Self, amount: Double) -> Self
}

protocol Mixable {
    func mix(with other: Self, by amount: Double) -> Self
}

extension Mixable where Self: SimpleArithmeticType {
    func mix(with other: Self, by amount: Double) -> Self {
        return self * (1.0 - amount) + other * amount
    }
}
This part worked really well and enforced homogeneous mixing (that a type could only be mixed with its own type), which wasn't enforced in the Objective-C implementation.
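As an illustration (my addition, not part of the original question), adopting these protocols for Double takes nothing more than empty conformances in this Swift 2-era dialect, since the global + and * operators already have the required shapes:
// Hypothetical adoption of the question's protocols for Double.
extension Double: SimpleArithmeticType {}
extension Double: Mixable {}

let mixed = 2.0.mix(with: 10.0, by: 0.5) // 6.0: halfway between the two values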
Where I got stuck
The next logical step, and this is where I got stuck, seemed to be to make each Capture instance (now a struct) hold two variables of the same mixable type instead of two AnyObject. I also changed the initializer argument from being an object and a key path to being a closure that returns an object, () -> T:
struct Capture<T: Mixable> {
    typealias Evaluation = () -> T
    let eval: Evaluation
    let before: T
    var after: T {
        return eval()
    }
    init(eval: Evaluation) {
        self.eval = eval
        self.before = eval()
    }
}
This works when the type can be inferred, for example:
let captureInt = Capture {
    return 3.0
}
// > Capture<Double>
but not with key-value coding, which returns AnyObject:
let captureAnyObject = Capture {
    return myObject.valueForKeyPath("opacity")!
}
error: cannot invoke initializer for type 'Capture' with an argument list of type '(() -> _)'
AnyObject does not conform to the Mixable protocol, so I can understand why this doesn't work. But I can check what type the object really is, and since I'm only covering a handful of mixable types, I thought I could cover all the cases and return the correct type of Capture. To see if this could even work, I made an even simpler example.
A simpler example
struct Foo<T> {
    let x: T
    init(eval: () -> T) {
        x = eval()
    }
}
which works when type inference is guaranteed:
let fooInt = Foo {
    return 3
}
// > Foo<Int>

let fooDouble = Foo {
    return 3.0
}
// > Foo<Double>
But not when the closure can return different types:
let condition = true
let foo = Foo {
    if condition {
        return 3
    } else {
        return 3.0
    }
}
error: cannot invoke initializer for type 'Foo' with an argument list of type '(() -> _)'
I'm not even able to define such a closure on its own.
let condition = true // as simple as it could be
let evaluation = {
    if condition {
        return 3
    } else {
        return 3.0
    }
}
error: unable to infer closure type in the current context
My Question
Is this something that can be done at all? Can a condition be used to determine the type of a generic? Or is there another way to hold two variables of the same type, where the type was decided based on a condition?
Edit
What I really want is to:
capture the values before and after a change and save the pair (old + new) for later (a heterogeneous collection of homogeneous pairs).
go through all the collected values and get rid of the ones that can't be interpolated (unless this step could be integrated with the collection step)
interpolate each homogeneous pair individually (mixing old + new).
But it seems like this direction is a dead end when it comes to solving that problem. I'll have to take a couple of steps back and try a different approach (and probably ask a different question if I get stuck again).
As discussed on Twitter, the type must be known at compile time. Nevertheless, for the simple example at the end of the question you could just explicitly type
let evaluation: () -> Double = { ... }
and it would work.
So in the case of Capture and valueForKeyPath: IMHO you should cast (either safely or with a forced cast) the value to the Mixable type you expect the value to be, and it should work fine. After all, I'm not sure valueForKeyPath: is supposed to return different types depending on a condition.
What is the exact case where you would like to return 2 totally different types (that can't be implicitly cast, as in the simple case of Int and Double above) in the same evaluation closure?
(Comment from the asker: in my full example I also have cases for CGPoint, CGSize, CGRect, and CATransform3D.)
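A hedged sketch of the suggested cast (assuming Double has been made Mixable, as in the earlier illustration; myObject and the key path are from the question):
// Hypothetical: force-cast the KVC result to the Mixable type we expect.
// A conditional cast (as?) plus a guard would be the safer variant.
let captureOpacity = Capture {
    return myObject.valueForKeyPath("opacity") as! Double
}
// > Capture<Double>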
The limitations are just as you have stated, because of Swift's strict typing. All types must be definitely known at compile time, and each thing can be of only one type; even a generic is resolved by the way it is called at compile time. Thus, the only thing you can do is turn your type into an umbrella type that is much more like Objective-C itself:
let condition = true
let evaluation = {
    () -> NSObject in // *
    if condition {
        return 3
    } else {
        return NSValue(CGPoint: CGPointMake(0, 1))
    }
}
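Recovering the concrete type afterwards is then a runtime decision; a small hedged sketch of what that dispatch could look like (my addition, not part of the original answer):
// Hypothetical follow-up: downcast the umbrella type at runtime.
// NSNumber is matched before NSValue because NSNumber is a subclass of NSValue.
let value = evaluation()
switch value {
case let number as NSNumber:
    print("numeric value:", number.doubleValue)
case let wrapped as NSValue:
    print("wrapped struct:", wrapped)
default:
    print("unsupported type")
}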

How to handle initial nil value for reduce functions

I would like to learn and use more functional programming in Swift. So, I've been trying various things in playground. I don't understand Reduce, though. The basic textbook examples work, but I can't get my head around this problem.
I have an array of strings called "toDoItems". I would like to get the longest string in this array. What is the best practice for handling the initial nil value in such cases? I think this probably happens often. I thought of writing a custom function and using it.
func optionalMax(maxSofar: Int?, newElement: Int) -> Int {
    if let definiteMaxSofar = maxSofar {
        return max(definiteMaxSofar, newElement)
    }
    return newElement
}
// Just testing - nums is an array of Ints. Works.
var maxValueOfInts = nums.reduce(0) { optionalMax($0, $1) }

// ERROR: cannot invoke 'reduce' with an argument list of type '(nil, (_,_)->_)'
var longestOfStrings = toDoItems.reduce(nil) { optionalMax(count($0), count($1)) }
It might just be that Swift does not automatically infer the type of your initial value. Try making it clear by explicitly declaring it:
var longestOfStrings = toDoItems.reduce(nil as Int?) { optionalMax($0, count($1)) }
By the way, notice that I do not call count on $0 (your accumulator), since it is not a String but an optional Int (Int?).
Generally, to avoid confusion when reading the code later, I explicitly label the accumulator as a and the element coming in from the sequence as x:
var longestOfStrings = toDoItems.reduce(nil as Int?) { a, x in optionalMax(a, count(x)) }
This reads more clearly than $0 and $1 when both the accumulator and the single element are used.
Hope this helps
Initialise it with an empty string "" rather than nil. Or you could even initialise it with the first element of the array, but an empty string seems better.
Second go at this after writing some wrong code: this will return the longest string, if you are happy with an empty string being returned for an empty array:
toDoItems.reduce("") { count($0) > count($1) ? $0 : $1 }
Or if you want nil, use
toDoItems.reduce(nil as String?) { count($0!) > count($1) ? $0 : $1 }
The problem is that the compiler cannot infer the types you are using for your seed and accumulator closure if you seed with nil, and you also need to get the optional type correct when using the optional string as $0.
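As an aside (not from the original answers): in Swift 3 and later, Sequence gained max(by:), which returns an optional directly and sidesteps the nil-seed problem; the free function count(x) also became the .count property:
// Modern alternative using the standard library's max(by:).
let toDoItems = ["buy milk", "walk the dog", "file taxes"]
let longest = toDoItems.max { $0.count < $1.count }
print(longest as Any) // Optional("walk the dog")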