I'm building an extension that adds some of the convenience functionality of NSArray/NSMutableArray to the Swift Array type, and I'm trying to add this function:
func indexOfObject(object:AnyObject) -> Int? {
if self.count > 0 {
for (idx, objectToCompare) in enumerate(self) {
if object == objectToCompare {
return idx
}
}
}
return nil
}
But unfortunately, this line:
if object == objectToCompare {
is giving the error:
could not find an overload for '==' that accepts the supplied arguments
Question
What am I doing wrong to cause this error?
Example
extension Array {
func indexOfObject(object:AnyObject) -> Int? {
if self.count > 0 {
for (idx, objectToCompare) in enumerate(self) {
if object == objectToCompare {
return idx
}
}
}
return nil
}
}
Actually there is no need to implement indexOfObject:; there is a global function find(array, element) already.
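For example (Swift 1 syntax, where find returns an optional index):
let fruits = ["apple", "banana", "cherry"]
if let index = find(fruits, "banana") {
    println("banana is at index \(index)")   // banana is at index 1
}
find(fruits, "kiwi")   // nil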
You can always create an extension that uses NSArray's indexOfObject:, e.g.:
extension Array {
    func indexOfObject(object: AnyObject) -> Int? {
        // NSArray returns NSNotFound rather than nil when the object is missing,
        // so map that sentinel value to nil.
        let index = (self as NSArray).indexOfObject(object)
        return index == NSNotFound ? nil : index
    }
}
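For example (assuming the element values bridge to Objective-C objects, as Strings and numbers do):
let names: [AnyObject] = ["Alice", "Bob"]
names.indexOfObject("Bob")     // {Some 1}
names.indexOfObject("Carol")   // nil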
You can require that the array items are comparable by adding a <T : Equatable> constraint to the method, then cast each element to T and compare it with the argument, e.g.:
extension Array {
func indexOfObject<T : Equatable>(o:T) -> Int? {
if self.count > 0 {
for (idx, objectToCompare) in enumerate(self) {
let to = objectToCompare as T
if o == to {
return idx
}
}
}
return nil
}
}
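Usage then looks like this (note that the argument's type must match the element type, otherwise the as T cast will trap at runtime):
let animals = ["cat", "dog", "bird"]
animals.indexOfObject("dog")    // {Some 1}
animals.indexOfObject("fish")   // nil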
My guess is that you have to do something like this:
func indexOfObject<T: Equatable>(object: T) -> Int? {
and so on.
Here's a relevant example from Apple's "The Swift Programming Language" in the "Generics" section:
func findIndex<T: Equatable>(array: T[], valueToFind: T) -> Int? {
for (index, value) in enumerate(array) {
if value == valueToFind {
return index
}
}
return nil
}
The key idea here is that both value and valueToFind must be of a type that is guaranteed to have the == operator implemented/overloaded. The <T: Equatable> constraint is a generic that allows only objects of a type that are, well, equatable.
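For instance, calling findIndex with an array of Strings (which are Equatable) could look like this:
let strings = ["cat", "dog", "llama"]
if let index = findIndex(strings, "llama") {
    println("llama is at index \(index)")   // llama is at index 2
}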
In your case, we would need to ensure that the array itself is composed only of objects that are equatable. The Array is declared as a struct with a generic <T> that does not require it to be equatable, however. I don't know whether it is possible to use extensions to change what kind of types an array can be composed of. I've tried some variations on the syntax and haven't found a way.
You can extract the compare part to another helper function, for example
extension Array {
func indexOfObject(object: T, equal: (T, T) -> Bool) -> Int? {
if self.count > 0 {
for (idx, objectToCompare) in enumerate(self) {
if equal(object, objectToCompare) {
return idx
}
}
}
return nil
}
}
let arr = [1, 2, 3]
arr.indexOfObject(3, equal: ==)   // returns {Some 2}
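One nice property of this approach is that it also works for element types that are not Equatable, because you supply the comparison yourself. A hypothetical example:
struct Point { let x: Int; let y: Int }   // deliberately not Equatable

let points = [Point(x: 0, y: 0), Point(x: 1, y: 2)]
points.indexOfObject(Point(x: 1, y: 2)) { $0.x == $1.x && $0.y == $1.y }   // {Some 1}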
You were close. Here's a working extension:
extension Array {
func indexOfObject<T: Equatable>(object:T) -> Int? {
if self.count > 0 {
for (idx, objectToCompare) in enumerate(self) {
if object == objectToCompare as T {
return idx
}
}
}
return nil
}
}
Swift had no way of knowing whether object or objectToCompare was Equatable. By adding that generic constraint to the method, we're then in business.
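For example:
let numbers = [10, 20, 30]
numbers.indexOfObject(20)   // {Some 1}
numbers.indexOfObject(99)   // nil
Note that the T declared on the method shadows the array's own element type, so the as T cast will trap at runtime if the argument's type doesn't match the element type.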
Related
I'm creating an adjacency list in Swift, storing an array of nodes. However, when adding an edge from an existing node I need to check whether the from key exists in any of the children, and if it does, whether the to value already exists there. What I have seems to be a mess:
func addEdge(from: String, to: String) {
//the full code for addEdge is incomplete here
if (children.contains{ $0.nodes[from] != nil}) {
for child in children {
if (child.nodes[from] != nil) {
if (!(child.nodes[from]?.contains{$0 == to})!){
child.nodes[from]?.append(to)
}
}
}
}
}
Here children is declared as
var children = [Node]()
and Node is
class Node: Hashable {
var nodes = [String:[String]]()
var hashValue: Int{ return nodes.hashValue }
static func == (lhs: Node, rhs: Node) -> Bool {
return lhs.nodes.keys == rhs.nodes.keys
}
}
Now it works, but seems really ugly. There must be a better way in Swift, but what is it?
Assuming that you do not wish to restructure the code above but only want to improve readability, you can use if let and optional chaining to make it cleaner:
func addEdge(from: String, to: String) {
//the full code for addEdge is incomplete here
    for child in children {
        // Only append when the from key exists and to is not already present.
        // fromNode is a copy of the array, so append through the dictionary itself.
        if let fromNode = child.nodes[from], !fromNode.contains(to) {
            child.nodes[from]?.append(to)
        }
    }
}
Swift Optional Chaining
Try something like:
children
    .filter { $0.nodes[from] != nil }
    .filter { !($0.nodes[from]!.contains(to)) }
    .forEach { $0.nodes[from]?.append(to) }
I've declared the following for my tuple type:
public typealias Http2HeaderTableEntry = (field: String, value: String?)
func ==(lhs: [Http2HeaderTableEntry], rhs: [Http2HeaderTableEntry]) -> Bool {
guard lhs.count == rhs.count else { return false }
for (idx, element) in lhs.enumerated() {
guard element.field == rhs[idx].field && element.value == rhs[idx].value else {
return false
}
}
return true
}
But now when I try to use it in an assertion, like XCTAssertEqual(var1, var2), I'm getting a compiler error:
Cannot invoke 'XCTAssertEqual' with an argument list of type '([Http2HeaderTableEntry], [Http2HeaderTableEntry])'
What do I need to do in order to be able to compare two arrays of tuples in Swift 4?
Since tuples can't conform to protocols (such as Equatable) yet, XCTAssertEqual, which requires Equatable arguments, can't pick up your free == function, so there's not really a way to implement this in any general way at this point.
Reference here
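As a workaround, you can lean on the == you already defined and assert on the boolean result instead (a sketch; var1 and var2 stand in for your actual arrays):
// XCTAssertEqual requires Equatable arguments, which [Http2HeaderTableEntry] can't provide,
// but XCTAssertTrue only needs a Bool and happily uses the free == defined above.
XCTAssertTrue(var1 == var2, "header tables should be equal")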
I saw the line of code below in the Swift GitHub repository:
associatedtype Indices : _RandomAccessIndexable, BidirectionalCollection
= DefaultRandomAccessIndices<Self>
I know that an associatedtype is a type alias for protocols, and I know how to interpret it in simple cases. But can someone please explain this line of code from the Swift GitHub repository?
That means that the associated type Indices must conform to
_RandomAccessIndexable and BidirectionalCollection, and by default is DefaultRandomAccessIndices<Self> unless declared (or inferred) otherwise (where Self is the actual type adopting the protocol).
Example:
struct MyIndex : Comparable {
var value : Int16
static func ==(lhs : MyIndex, rhs : MyIndex) -> Bool {
return lhs.value == rhs.value
}
static func <(lhs : MyIndex, rhs : MyIndex) -> Bool {
return lhs.value < rhs.value
}
}
struct MyCollectionType : RandomAccessCollection {
var startIndex : MyIndex { return MyIndex(value: 0) }
var endIndex : MyIndex { return MyIndex(value: 3) }
subscript(position : MyIndex) -> String {
return "I am element #\(position.value)"
}
func index(after i: MyIndex) -> MyIndex {
guard i != endIndex else { fatalError("Cannot increment endIndex") }
return MyIndex(value: i.value + 1)
}
func index(before i: MyIndex) -> MyIndex {
guard i != startIndex else { fatalError("Cannot decrement startIndex") }
return MyIndex(value: i.value - 1)
}
}
let coll = MyCollectionType()
let i = coll.indices
print(type(of: i)) // DefaultRandomAccessIndices<MyCollectionType>
MyCollectionType is a (minimal) implementation of a RandomAccessCollection, using a custom index type MyIndex. It does not define its own indices property or Indices type, so Indices becomes the default associated type, and indices is provided by a default protocol extension of RandomAccessCollection.
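To see the default indices in action:
for index in coll.indices {
    print(coll[index])
}
// I am element #0
// I am element #1
// I am element #2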
I'm writing a graphics library to display data in a graph. Since most of the projects I do tend to have a large learning component in them, I decided to create a generically typed struct to manage my data set DataSet<T: Plottable> (note here that Plottable is also Comparable).
In trying to conform to MutableCollectionType, I've run across an error. I'd like to use the default implementation of sort(), but the compiler is giving the following error when trying to use the sorting function.
Ambiguous reference to member 'sort()'
Here's a code example:
var data = DataSet<Int>(elements: [1,2,3,4])
data.sort() //Ambiguous reference to member 'sort()'
The compiler suggests two candidates, but will not actually display them to me. Note that the compiler error goes away if I explicitly implement sort() on my struct.
But the bigger question remains for me. What am I not seeing that I expect the default implementation to be providing? Or am I running across a bug in Swift 3 (this rarely is the case... usually I have overlooked something).
Here's the balance of the struct:
struct DataSet<T: Plottable>: MutableCollection, BidirectionalCollection {
typealias Element = T
typealias Iterator = DataSetIterator<T>
typealias Index = Int
/**
The list of elements in the data set. Private.
*/
private var elements: [Element] = []
/**
Initialize the data set with an array of data.
*/
init(elements data: [T] = []) {
self.elements = data
}
//MARK: Sequence Protocol
func makeIterator() -> DataSetIterator<T> {
return DataSetIterator(self)
}
//MARK: Collection Protocol
subscript(_ index:DataSet<T>.Index) -> DataSet<T>.Iterator.Element {
set {
elements[index] = newValue
}
get {
return elements[index]
}
}
subscript(_ inRange:Range<DataSet<T>.Index>) -> DataSet<T> {
set {
elements.replaceSubrange(inRange, with: newValue)
}
get {
return DataSet<T>(elements: Array(elements[inRange]))
}
}
//required index for MutableCollection and BidirectionalCollection
var endIndex: Int {
return elements.count
}
var startIndex: Int {
return 0
}
func index(after i: Int) -> Int {
return i+1
}
func index(before i: Int) -> Int {
return i-1
}
mutating func append(_ newElement: T) {
elements.append(newElement)
}
// /**
// Sorts the elements of the DataSet from lowest value to highest value.
// Commented because I'd like to use the default implementation.
// - note: This is equivalent to calling `sort(by: { $0 < $1 })`
// */
// mutating func sort() {
// self.sort(by: { $0 < $1 })
// }
//
// /**
// Sorts the elements of the DataSet by an arbitrary block.
// */
// mutating func sort(by areInIncreasingOrder: (T, T) -> Bool) {
// self.elements = self.elements.sorted(by: areInIncreasingOrder)
// }
/**
Returns a `DataSet<T>` with the elements sorted by a provided block.
This is the default implementation `sort()` modified to return `DataSet<T>` rather than `Array<T>`.
- returns: A sorted `DataSet<T>` by the provided block.
*/
func sorted(by areInIncreasingOrder: (T, T) -> Bool) -> DataSet<T> {
return DataSet<T>(elements: self.elements.sorted(by: areInIncreasingOrder))
}
func sorted() -> DataSet<T> {
return self.sorted(by: { $0 < $1 })
}
}
Your DataSet is a BidirectionalCollection. The sort() you're trying to use requires a RandomAccessCollection. Besides switching the conformance to RandomAccessCollection, the most important thing you need to add is an Indices typealias.
typealias Indices = Array<Element>.Indices
Here's my version of your type:
protocol Plottable: Comparable {}
extension Int: Plottable {}
struct DataSet<Element: Plottable>: MutableCollection, RandomAccessCollection {
private var elements: [Element] = []
typealias Indices = Array<Element>.Indices
init(elements data: [Element] = []) {
self.elements = data
}
var startIndex: Int {
return elements.startIndex
}
var endIndex: Int {
return elements.endIndex
}
func index(after i: Int) -> Int {
return elements.index(after: i)
}
func index(before i: Int) -> Int {
return elements.index(before: i)
}
subscript(position: Int) -> Element {
get {
return elements[position]
}
set {
elements[position] = newValue
}
}
subscript(bounds: Range<Int>) -> DataSet<Element> {
get {
return DataSet(elements: Array(elements[bounds]))
}
set {
elements[bounds] = ArraySlice(newValue.elements)
}
}
}
var data = DataSet(elements: [4,2,3,1])
data.sort()
print(Array(data)) // [1, 2, 3, 4]
You don't actually need a custom Iterator if you don't want one. Swift gives you a default IndexingIterator automatically when you implement Collection.
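For example, iteration over the DataSet above already works through that default IndexingIterator:
for value in data {
    print(value)   // prints 1, 2, 3, 4 (one value per line, after the sort)
}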
As of Swift 2.0 it seems we can get closer to constrained extensions of generic types.
Although we still can't do this:
protocol Idable {
var id : String { get }
}
extension Array where T : Idable {
...
}
...we can now do this:
extension Array {
func filterWithId<T where T : Idable>(id : String) -> [T] {
...
}
}
...and Swift grammatically accepts it. However, for the life of me I cannot figure out how to make the compiler happy when I fill in the contents of the example function. Suppose I were to be as explicit as possible:
extension Array {
func filterWithId<T where T : Idable>(id : String) -> [T] {
return self.filter { (item : T) -> Bool in
return item.id == id
}
}
}
...the compiler will not accept the closure provided to filter, complaining
Cannot invoke 'filter' with an argument list of type '((T) -> Bool)'
The same error occurs if item is specified as Idable. Has anyone had any luck here?
extension Array {
func filterWithId<T where T : Idable>(id : String) -> [T] {
...
}
}
defines a generic method filterWithId() whose generic placeholder T is restricted to be Idable. But that definition introduces a local placeholder T which is completely unrelated to the array's element type T (and hides it within the scope of the method). So you have not specified that the array elements must conform to Idable, and that is why you cannot call self.filter { ... } with a closure which expects the elements to be Idable.
As of Swift 2 / Xcode 7 beta 2, you can define extension methods on a generic type that place additional constraints on its generic parameters (compare Array extension to remove object by value for a very similar issue):
extension Array where Element : Idable {
func filterWithId(id : String) -> [Element] {
return self.filter { (item) -> Bool in
return item.id == id
}
}
}
Alternatively, you can define a protocol extension method:
extension SequenceType where Generator.Element : Idable {
func filterWithId(id : String) -> [Generator.Element] {
return self.filter { (item) -> Bool in
return item.id == id
}
}
}
Then filterWithId() is available to all types conforming
to SequenceType (in particular to Array) if the sequence element
type conforms to Idable.
In Swift 3 this would be
extension Sequence where Iterator.Element : Idable {
func filterWithId(id : String) -> [Iterator.Element] {
return self.filter { (item) -> Bool in
return item.id == id
}
}
}
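Usage could then look like this, with a hypothetical Idable type (User is made up for illustration):
struct User: Idable {
    let id: String
    let name: String
}

let users = [User(id: "a1", name: "Ann"), User(id: "b2", name: "Bob")]
let matches = users.filterWithId(id: "b2")
print(matches.map { $0.name })   // ["Bob"]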