Swift subscript with generic data type - swift

I am trying to program a two-dimensional data storage struct for various data types. However, I am struggling with the subscript for setting the data due to the error 'Cannot assign value of type 'T' to subscript of type 'T''. Any help is much appreciated!
struct dataMatrix<T>: Sequence, IteratorProtocol {
    var rows: Int, columns: Int
    var data: [T]
    var position = 0

    init(rows: Int, columns: Int) {
        self.rows = rows
        self.columns = columns
        data = Array<T>()
    }

    func valueAt(column: Int, row: Int) -> T? {
        guard column >= 0 && row >= 0 && column < columns else {
            return nil
        }
        let indexcolumn = column + row * columns
        guard indexcolumn < data.count else {
            return nil
        }
        return data[indexcolumn]
    }

    subscript<T>(column: Int, row: Int) -> T? {
        get {
            return valueAt(column: column, row: row) as? T
        }
        set {
            data[(column * row) + column] = (newValue as! T) // does not compile
        }
    }

    // Sequence / IteratorProtocol methods
    mutating func next() -> String? {
        if position <= data.count {
            print(position)
            defer { position += 1 }
            return "\(position)"
        } else {
            defer { position = 0 }
            return nil
        }
    }
}

subscript<T>(column: Int, row: Int) -> T? {
defines a generic subscript with a new type placeholder T which is unrelated to (and shadows) the generic type T of struct dataMatrix<T>. The solution is simple: remove the type placeholder:
subscript(column: Int, row: Int) -> T? {
    // ...
}
That also makes the type casts inside the getter and setter unnecessary. You only have to decide what the setter should do when called with a nil argument (e.g. nothing):
subscript(column: Int, row: Int) -> T? {
    get {
        return valueAt(column: column, row: row)
    }
    set {
        if let value = newValue {
            data[column + row * columns] = value // same index formula as valueAt
        }
    }
}
Another option is to make the return type of the subscript method non-optional, and treat invalid indices as a fatal error (which is how the Swift Array handles it):
subscript(column: Int, row: Int) -> T {
    get {
        guard let value = valueAt(column: column, row: row) else {
            fatalError("index out of bounds")
        }
        return value
    }
    set {
        data[column + row * columns] = newValue
    }
}
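Note that in both variants the setter can only store into a slot that already exists in data, while the init in the question creates an empty array, so every write would trap. A minimal working sketch (assumptions: the type is renamed DataMatrix per Swift naming conventions, the Sequence conformance is omitted, and the storage is pre-filled with a caller-supplied default value):

```swift
struct DataMatrix<T> {
    var rows: Int, columns: Int
    var data: [T]

    // Pre-fill the storage so that every valid (column, row) slot
    // can be both read and written.
    init(rows: Int, columns: Int, repeating value: T) {
        self.rows = rows
        self.columns = columns
        data = Array(repeating: value, count: rows * columns)
    }

    subscript(column: Int, row: Int) -> T {
        get {
            precondition(column >= 0 && column < columns && row >= 0 && row < rows,
                         "index out of bounds")
            return data[column + row * columns]
        }
        set {
            precondition(column >= 0 && column < columns && row >= 0 && row < rows,
                         "index out of bounds")
            data[column + row * columns] = newValue
        }
    }
}

var m = DataMatrix(rows: 2, columns: 3, repeating: 0)
m[2, 1] = 42 // m[2, 1] reads back 42; untouched slots stay 0
```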

Related

Why does using a parameter name other than index result in Ambiguous use of 'subscript(_:)'?

Take the two snippets below, the top one works fine but the bottom one results in
Ambiguous use of 'subscript(_:)'
using index ✅
extension Array {
    subscript(i index: Int) -> (String, String)? {
        guard let value = self[index] as? Int else {
            return nil
        }
        switch (value >= 0, abs(value % 2)) {
        case (true, 0): return ("positive", "even")
        case (true, 1): return ("positive", "odd")
        case (false, 0): return ("negative", "even")
        case (false, 1): return ("negative", "odd")
        default: return nil
        }
    }
}
Without using index ❌
extension Array {
    subscript(i: Int) -> (String, String)? {
        guard let value = self[i] as? Int else {
            return nil
        }
        switch (value >= 0, abs(value % 2)) {
        case (true, 0): return ("positive", "even")
        case (true, 1): return ("positive", "odd")
        case (false, 0): return ("negative", "even")
        case (false, 1): return ("negative", "odd")
        default: return nil
        }
    }
}
First, the name index is irrelevant to the problem; it could be any name.
The actual problem is that Array already has a subscript that takes an unlabeled Int.
Your first overload does not have the same input signature. Instead, it requires an argument label:
[1][i: 0] // ("positive", "odd")
You can still use an overload without a label…
extension Array where Element: BinaryInteger {
    subscript(🫵: Int) -> (String, String) {
        let value: Element = self[🫵]
        return (
            value >= 0 ? "positive" : "negative",
            value.isMultiple(of: 2) ? "even" : "odd"
        )
    }
}
…but then you'll always need to explicitly type the result in order to reach the standard-library overload (just as is already necessary within the subscript body itself), because whatever you have in your own module takes precedence.
[1][0] // ("positive", "odd")
[1][0] as Int // 1
So, I recommend either using a subscript with a label, or a method.*
* What I would like to recommend is a named subscript. But Swift doesn't have them. You can emulate them with more types, however. Like this:
extension Array where Element: BinaryInteger {
    struct InfoStrings {
        fileprivate let array: Array
    }
    var infoStrings: InfoStrings { .init(array: self) }
}

extension Array.InfoStrings {
    subscript(index: Int) -> (String, String) {
        let value = array[index]
        return (
            value >= 0 ? "positive" : "negative",
            value.isMultiple(of: 2) ? "even" : "odd"
        )
    }
}
[1].infoStrings[0]

Generic way to do math on protocol extensions

Goal
I want to extend basic types like Int, Double, Float... with more flexible properties and make them presentable in a chart in my app. For example, I made a chart view that is suitable only for displaying Int but cannot really display Float. I want to make sure that when I pass arguments to this view it will display them correctly.
Solution
So I made a protocol (for this example made it like this):
protocol SimplyChartable {
    static func max(_ dataSet: [SimplyChartable]) -> SimplyChartable
}
And then make an extension for some types:
extension Int: SimplyChartable { }
extension Double: SimplyChartable { }
extension Float: SimplyChartable { }
and so on ...
Problem
These will all be numeric types, and whenever I pass them to a function I need to implement it in every extension like this:
public static func max(_ dataSet: [SimplyChartable]) -> SimplyChartable {
    return (dataSet as? [Int])?.max() ?? 0
}
But for Double the function would be identical. So for min I will end up with a similar function, and the same goes for divide, add, and other math operations... Is there a way to write it once and reuse it for every type that adopts this protocol?
I found out that:
let dataType = type(of: maxValue) /* where `maxValue` is SimplyChartable */
will return the original type. But the output of type(of:) is a metatype, so I cannot return it from a function and then add two values of that type. So, for example, this code will not work:
let val1 = SimplyChartable(4)
let val2 = SimplyChartable(2)
let sum = val1 + val2
And how to make it work not ending up with 3 functions like this:
let val1 = SimplyChartable(4)
let val2 = SimplyChartable(2)
let sum = (val1 as! Int) + (val2 as! Int)
Since they are all numeric types, why don't you use Comparable? (Note that the generic methods below cannot witness the protocol's static max(_:) requirement, so this assumes that requirement is dropped from the protocol.)
extension SimplyChartable {
    static func max<T: Comparable>(dataSet: [T]) -> T? {
        return dataSet.max()
    }
    static func min<T: Comparable>(dataSet: [T]) -> T? {
        return dataSet.min()
    }
}

extension Int: SimplyChartable { }
extension Double: SimplyChartable { }

Double.max(dataSet: [1.2, 1.1, 1.3]) // Optional(1.3)
Int.min(dataSet: [12, 11, 13])       // Optional(11)
Just my two cents worth...
This isn't exactly what you've asked for, since it doesn't let you call a static function directly from a protocol metatype. But since that, AFAIK, isn't possible in Swift currently, perhaps this would be the next best thing?
extension Sequence where Element == SimplyChartable {
    func max() -> SimplyChartable {
        // put your implementation here
    }
}
You can then call this by just:
let arr: [SimplyChartable] = ...
let theMax = arr.max()
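For that extension to compile, the protocol needs some way to compare two SimplyChartable values. One possible sketch (assuming a hypothetical chartableValue() requirement that maps every conforming type to Double, similar in spirit to the free-function answer further down, and returning an optional so an empty sequence yields nil rather than trapping):

```swift
protocol SimplyChartable {
    // Hypothetical requirement: a common representation to compare by.
    func chartableValue() -> Double
}

extension Int: SimplyChartable {
    func chartableValue() -> Double { Double(self) }
}
extension Double: SimplyChartable {
    func chartableValue() -> Double { self }
}

extension Sequence where Element == SimplyChartable {
    // Delegates to the standard library's max(by:); returns nil
    // for an empty sequence instead of trapping.
    func max() -> SimplyChartable? {
        self.max(by: { $0.chartableValue() < $1.chartableValue() })
    }
}

let arr: [SimplyChartable] = [Int(3), Double(1.5), Int(2)]
let theMax = arr.max() // wraps the Int 3
```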
For your situation, it's much better to use an Array extension than a protocol method taking an array parameter.
To handle each possible type of array, i.e. [Int], [Double] or [Float], create a wrapper enum with associated values as follows:
public enum SimplyChartableType {
    case int(Int)
    case float(Float)
    case double(Double)

    func getValue() -> NSNumber {
        switch self {
        case .int(let int):
            return NSNumber(value: int)
        case .float(let float):
            return NSNumber(value: float)
        case .double(let double):
            return NSNumber(value: double)
        }
    }

    init(int: Int) {
        self = SimplyChartableType.int(int)
    }
    init(float: Float) {
        self = SimplyChartableType.float(float)
    }
    init(double: Double) {
        self = SimplyChartableType.double(double)
    }
}
You can extend Array as follows:
extension Array where Element == SimplyChartableType {
    func max() -> SimplyChartableType {
        switch self[0] {
        case .int(_):
            let arr = self.map({ $0.getValue().intValue })
            return SimplyChartableType(int: arr.max()!)
        case .double(_):
            let arr = self.map({ $0.getValue().doubleValue })
            return SimplyChartableType(double: arr.max()!)
        case .float(_):
            let arr = self.map({ $0.getValue().floatValue })
            return SimplyChartableType(float: arr.max()!)
        }
    }
}
Example usage is:
var array = [SimplyChartableType.double(3), SimplyChartableType.double(2), SimplyChartableType.double(4)]
var max = array.max()
And now it's a lot easier to operate on Int, Double or Float together with:
extension SimplyChartableType: SimplyChartable {
    // insert functions here
    static func randomFunction() -> SimplyChartableType {
        // perform logic here
    }
}
The above snippet is good if you need a different functionality which operates on non-Collection types.
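The same switch-on-the-first-element pattern extends to other reductions. A sketch of a min() counterpart (repeating the enum so the snippet is self-contained; like the original max(), it traps on an empty array):

```swift
import Foundation

public enum SimplyChartableType {
    case int(Int)
    case float(Float)
    case double(Double)

    func getValue() -> NSNumber {
        switch self {
        case .int(let int): return NSNumber(value: int)
        case .float(let float): return NSNumber(value: float)
        case .double(let double): return NSNumber(value: double)
        }
    }
}

extension Array where Element == SimplyChartableType {
    // Dispatch on the first element's case, then reduce
    // over the matching raw values.
    func min() -> SimplyChartableType {
        switch self[0] {
        case .int:
            return .int(self.map { $0.getValue().intValue }.min()!)
        case .float:
            return .float(self.map { $0.getValue().floatValue }.min()!)
        case .double:
            return .double(self.map { $0.getValue().doubleValue }.min()!)
        }
    }
}

let values: [SimplyChartableType] = [.double(3), .double(2), .double(4)]
// values.min() is .double(2.0)
```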
This doesn't answer your specific question, unfortunately. Perhaps a workaround is to use a free function and casting.
import UIKit

protocol SimplyChartable {
    func chartableValue() -> Double
}

extension Int: SimplyChartable {
    func chartableValue() -> Double {
        return Double(self) // Double(Int) is non-failable, no nil-coalescing needed
    }
}

extension Double: SimplyChartable {
    func chartableValue() -> Double {
        return self
    }
}

extension Float: SimplyChartable {
    func chartableValue() -> Double {
        return Double(self)
    }
}

func maxOfSimplyChartables(_ dataSet: [SimplyChartable]) -> SimplyChartable {
    return dataSet.max(by: { (lhs, rhs) -> Bool in
        return lhs.chartableValue() < rhs.chartableValue()
    }) ?? (0 as Int) // a bare literal cannot infer an existential type
}

let chartableItem1: SimplyChartable = 1255555.4 as Double
let chartableItem2: SimplyChartable = 24422 as Int
let chartableItem3: SimplyChartable = 35555 as Int
let simplyChartableValues = [chartableItem1, chartableItem2, chartableItem3]
maxOfSimplyChartables(simplyChartableValues)

Swift Error: 'Sequence' requires the types 'T' and 'ArraySlice<T>' be equivalent

I'm trying to update a math library to be compatible with Swift 3, but I'm running into an error:
'Sequence' requires the types 'T' and 'ArraySlice<T>' be equivalent
Apple's documentation on Sequence recommends that the makeIterator() method return an iterator, which it does. And it seems that the iterator is returning elements of the grid variable, which are of type T. I'm not quite sure what I'm missing here. Any advice would be helpful.
public struct Matrix<T> where T: FloatingPoint, T: ExpressibleByFloatLiteral {
    public typealias Element = T

    let rows: Int
    let columns: Int
    var grid: [Element]

    public init(rows: Int, columns: Int, repeatedValue: Element) {
        self.rows = rows
        self.columns = columns
        self.grid = [Element](repeating: repeatedValue, count: rows * columns)
    }
    ...
}
extension Matrix: Sequence { // <-- getting error here
    public func makeIterator() -> AnyIterator<ArraySlice<Element>> {
        let endIndex = rows * columns
        var nextRowStartIndex = 0
        return AnyIterator {
            if nextRowStartIndex == endIndex {
                return nil
            }
            let currentRowStartIndex = nextRowStartIndex
            nextRowStartIndex += self.columns
            return self.grid[currentRowStartIndex..<nextRowStartIndex]
        }
    }
}
Your code compiles fine as Swift 3.1 (Xcode 8.3.3). The error
'Sequence' requires the types 'T' and 'ArraySlice<T>' be equivalent
occurs when compiling as Swift 4 (Xcode 9, currently beta), because then
the Sequence protocol already defines the
associatedtype Element where Self.Element == Self.Iterator.Element
which conflicts with your definition. You can either choose a different
name for your type alias, or just remove it (and use T instead):
public struct Matrix<T> where T: FloatingPoint, T: ExpressibleByFloatLiteral {
    let rows: Int
    let columns: Int
    var grid: [T]

    public init(rows: Int, columns: Int, repeatedValue: T) {
        self.rows = rows
        self.columns = columns
        self.grid = [T](repeating: repeatedValue, count: rows * columns)
    }
}
extension Matrix: Sequence {
    public func makeIterator() -> AnyIterator<ArraySlice<T>> {
        let endIndex = rows * columns
        var nextRowStartIndex = 0
        return AnyIterator {
            if nextRowStartIndex == endIndex {
                return nil
            }
            let currentRowStartIndex = nextRowStartIndex
            nextRowStartIndex += self.columns
            return self.grid[currentRowStartIndex..<nextRowStartIndex]
        }
    }
}
This compiles and runs with both Swift 3 and 4.
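For reference, here is how the fixed Sequence conformance behaves in use; each iteration step yields one row as an ArraySlice (a self-contained sketch repeating the code above):

```swift
public struct Matrix<T> where T: FloatingPoint, T: ExpressibleByFloatLiteral {
    let rows: Int
    let columns: Int
    var grid: [T]

    public init(rows: Int, columns: Int, repeatedValue: T) {
        self.rows = rows
        self.columns = columns
        self.grid = [T](repeating: repeatedValue, count: rows * columns)
    }
}

extension Matrix: Sequence {
    public func makeIterator() -> AnyIterator<ArraySlice<T>> {
        let endIndex = rows * columns
        var nextRowStartIndex = 0
        return AnyIterator {
            if nextRowStartIndex == endIndex { return nil }
            let currentRowStartIndex = nextRowStartIndex
            nextRowStartIndex += self.columns
            return self.grid[currentRowStartIndex..<nextRowStartIndex]
        }
    }
}

let m = Matrix(rows: 2, columns: 3, repeatedValue: 1.0)
for row in m {
    print(Array(row)) // prints [1.0, 1.0, 1.0] twice
}
```

Because Matrix now conforms to Sequence, all Sequence methods (map, filter, enumerated, ...) come for free on top of makeIterator().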

Why do I get an error when attempting to invoke indexOf on a generic ArraySlice?

The following function finds the second index of a given item in Array of Int:
func secondIndexOf(item: Int, inArray array: Array<Int>) -> Int? {
    if let firstIndex: Int = array.indexOf(item) {
        let slice: ArraySlice<Int> = array.suffixFrom(firstIndex + 1)
        return slice.indexOf(item)
    }
    return nil
}
However, when I attempt to create a generic version of this function to find the second Equatable item, I get an error:
func secondIndexOf<T: Equatable>(item: T, inArray array: Array<T>) -> T? {
    if let firstIndex: Int = array.indexOf(item) {
        let slice: ArraySlice<T> = array.suffixFrom(firstIndex + 1)
        return slice.indexOf(item) // Cannot invoke 'indexOf' with an argument list of type '(T)'
    }
    return nil
}
Why is this not valid Swift code, and what is the expected argument list if not (T)? Xcode autocomplete shows indexOf(element: Comparable) with which T should be compatible.
The compiler is giving you a confusing error message here; it isn't actually concerned about the argument. The return value is the source of the problem: you aren't returning a value of type T, but an index into the array. You just need to change the return type to Int?:
func secondIndexOf<T: Equatable>(item: T, inArray array: Array<T>) -> Int? {
    if let firstIndex: Int = array.indexOf(item) {
        let slice: ArraySlice<T> = array.suffixFrom(firstIndex + 1)
        return slice.indexOf(item)
    }
    return nil
}
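In current Swift the same logic reads as follows (a sketch; indexOf became firstIndex(of:) and suffixFrom became a partial-range subscript; note that an ArraySlice shares its parent array's indices, so the returned index already refers to the original array):

```swift
// Find the index of the second occurrence of item, or nil if
// it occurs fewer than two times.
func secondIndex<T: Equatable>(of item: T, in array: [T]) -> Int? {
    guard let first = array.firstIndex(of: item) else { return nil }
    // The slice keeps the parent array's indices, so no offset
    // arithmetic is needed on the result.
    return array[(first + 1)...].firstIndex(of: item)
}

secondIndex(of: 2, in: [1, 2, 3, 2, 1]) // 3
secondIndex(of: 5, in: [1, 2, 3])       // nil
```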

Swift 2D Array optional Type and subscripting (Beta 3)

I have a 2D array that worked in Beta 2. However, in Beta 3 I'm getting '#lvalue $T15 is not identical to T?' when setting via subscript.
class Array2D<T> {
    let columns: Int
    let rows: Int
    let array: [T?]

    init(columns: Int, rows: Int) {
        self.columns = columns
        self.rows = rows
        array = [T?](count: rows * columns, repeatedValue: nil)
    }

    subscript(column: Int, row: Int) -> T? {
        get {
            return array[row * columns + column]
        }
        set {
            array[row * columns + column] = newValue // Error here
        }
    }
}
Any thoughts on how to resolve this?
In Beta 3, constant arrays are completely immutable, while variable arrays are entirely mutable. Change let array: [T?] to var array: [T?] and your code should work.
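With that one-word change, and updated to current Swift syntax (where the count:repeatedValue: initializer became repeating:count:), the class looks like this (a sketch):

```swift
class Array2D<T> {
    let columns: Int
    let rows: Int
    var array: [T?]   // var, not let, so the subscript setter can mutate it

    init(columns: Int, rows: Int) {
        self.columns = columns
        self.rows = rows
        array = [T?](repeating: nil, count: rows * columns)
    }

    subscript(column: Int, row: Int) -> T? {
        get { array[row * columns + column] }
        set { array[row * columns + column] = newValue }
    }
}

let grid = Array2D<Int>(columns: 3, rows: 2)
grid[2, 1] = 7 // grid[2, 1] reads back 7; unset slots stay nil
```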