Getting an error when multiplying Swift Ints - swift

New Swift enthusiast here! I'm following Ray Wenderlich's Candy Crush tutorial and get an error when multiplying two Int values. I know Swift is strictly typed, so could that be the reason? Am I not allowed to do this in Swift? Note the error comments where I'm having trouble. Any help in the right direction is greatly appreciated!
class Array2D<T> {
    let columns: Int
    let rows: Int
    let array: Array<T>

    init(columns: Int, rows: Int) {
        self.columns = columns
        self.rows = rows
        array = Array<T?>(count: rows*columns, repeatedValue: nil) // ERROR: could not find an overload for '*' that accepts the supplied arguments
    }

    subscript(column: Int, row: Int) -> T? {
        get {
            return array[row*columns + column]
        }
        set {
            array[row*columns + column] = newValue // ERROR: could not find an overload for '*' that accepts the supplied arguments
        }
    }
}

Change your array to be of type Array<T?>.
In the first case, you are trying to assign an Array<T?> to a property declared as Array<T>. In the second, you are trying to assign newValue, of type T?, to an element of an array whose elements are of type T. The compiler blames the '*' overload in both places, but the real mismatch is the optionality of the element type.
Changing the declared type of the array fixes both of these things.
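For reference, here is the class with that fix applied (a sketch using the same beta-era initializer as the question; the storage is also made var so the subscript setter still compiles on later betas, as discussed further down):

class Array2D<T> {
    let columns: Int
    let rows: Int
    var array: Array<T?>   // optional element type, matching the nil placeholders

    init(columns: Int, rows: Int) {
        self.columns = columns
        self.rows = rows
        array = Array<T?>(count: rows * columns, repeatedValue: nil)
    }

    subscript(column: Int, row: Int) -> T? {
        get {
            return array[row * columns + column]
        }
        set {
            array[row * columns + column] = newValue
        }
    }
}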

Cannot convert value of type 'Range<Int>' to expected argument type 'Range<_>'

Using Swift 4.2, I get the title as an error in this function:
func jitter(range: Int) -> Int {
    return Int.random(in: 0..<range, using: SystemRandomNumberGenerator())
}
Questions:
What precisely does Range<_> mean?
Is there a better way to get this? I simply want a small random number inside an animation loop.
The Swift compiler is giving you a bad error message here. Range<_> is just how it prints a Range whose Bound type it could not infer; the underscore is a placeholder for the unresolved type. The actual problem is that the second argument to Int.random(in:using:) must be passed inout (i.e. with a & prefix). This works:
func jitter(range: Int) -> Int {
    var rng = SystemRandomNumberGenerator()
    return Int.random(in: 0..<range, using: &rng)
}
Even easier, omit the using: parameter altogether (SystemRandomNumberGenerator is the default RNG anyway):
func jitter(range: Int) -> Int {
    return Int.random(in: 0..<range)
}
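As a quick sanity check, hypothetical usage (the animation loop itself isn't shown in the question):

for _ in 0..<5 {
    print(jitter(range: 10))   // prints five random values in 0..<10
}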

Parsing array recursively in Swift 4

I'm trying to implement Quick Sort in Swift and having issues recursively passing the array into the quick_sort function. I'm receiving the error:
error: ambiguous subscript with base type '[String]' and index type 'CountableRange<Int>'
The function is:
func quick_sort(_ array: inout [String]) {
    if array.count > 0 {
        let pivot = array[0]
        var (left, right) = partition(&array, pivot)
        quick_sort(&array[0..<left])
    }
}
The error is occurring on the line quick_sort(&array[0..<left]).
It may have to do with it potentially being an ArraySlice?
When you slice an Array, you get an ArraySlice. When you slice an ArraySlice, you get another ArraySlice. This property, that T.SubSequence == T, is what makes a recursive algorithm like this workable, because you only ever have to deal with a single type.
You need your recursive function to work with ArraySlice, but you can make a wrapper function that takes an Array and does the necessary conversion.
func quickSort(_ array: inout [String]) {
    func quickSort(_ slice: inout ArraySlice<String>) {
        if let pivot = slice.first {
            var (left, right) = partition(&slice, pivot)
            quickSort(&slice[0..<left]) // This part of the algorithm will break...
        }
    }
    var slice = ArraySlice(array)
    quickSort(&slice)
    array = Array(slice)
}
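For what it's worth, here is a self-contained sketch of the slice-based approach. The original partition(_:_:) isn't shown in the question, so a simple Lomuto-style partition is used as a stand-in:

func quickSort(_ array: inout [String]) {
    func sort(_ slice: inout ArraySlice<String>) {
        guard slice.count > 1 else { return }
        let pivot = slice[slice.startIndex]
        var boundary = slice.startIndex
        // Move every element smaller than the pivot in front of `boundary`.
        for i in slice.indices.dropFirst() where slice[i] < pivot {
            boundary = slice.index(after: boundary)
            slice.swapAt(boundary, i)
        }
        slice.swapAt(slice.startIndex, boundary)
        // Recurse on both halves. Slice indices are NOT zero-based,
        // so use startIndex/endIndex rather than 0.
        sort(&slice[slice.startIndex..<boundary])
        sort(&slice[slice.index(after: boundary)..<slice.endIndex])
    }
    var slice = ArraySlice(array)
    sort(&slice)
    array = Array(slice)
}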

Why do I get an error when attempting to invoke indexOf on a generic ArraySlice?

The following function finds the second index of a given item in Array of Int:
func secondIndexOf(item: Int, inArray array: Array<Int>) -> Int? {
    if let firstIndex: Int = array.indexOf(item) {
        let slice: ArraySlice<Int> = array.suffixFrom(firstIndex + 1)
        return slice.indexOf(item)
    }
    return nil
}
However, when I attempt to create a generic version of this function to find the second Equatable item, I get an error:
func secondIndexOf<T: Equatable>(item: T, inArray array: Array<T>) -> T? {
    if let firstIndex: Int = array.indexOf(item) {
        let slice: ArraySlice<T> = array.suffixFrom(firstIndex + 1)
        return slice.indexOf(item) // Cannot invoke 'indexOf' with an argument list of type '(T)'
    }
    return nil
}
Why is this not valid Swift code, and what is the expected argument list if not (T)? Xcode autocomplete shows indexOf(element: Comparable) with which T should be compatible.
The compiler is giving you a confusing error message here; it isn't actually concerned about the argument. The return value is the source of the problem, since you aren't returning a value of type T, but an index of the array. You just need to change your return type to Int?:
func secondIndexOf<T: Equatable>(item: T, inArray array: Array<T>) -> Int? {
    if let firstIndex: Int = array.indexOf(item) {
        let slice: ArraySlice<T> = array.suffixFrom(firstIndex + 1)
        return slice.indexOf(item)
    }
    return nil
}
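Hypothetical usage of the fixed function (Swift 2-era API, as in the question). Because an ArraySlice shares its parent array's indices, the returned index is already relative to the original array:

let letters = ["a", "b", "a", "c"]
secondIndexOf("a", inArray: letters)   // Optional(2)
secondIndexOf("c", inArray: letters)   // nil, "c" appears only once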

Swift 2D Array optional Type and subscripting (Beta 3)

I have a 2D array that worked in Beta 2. However, in Beta 3 I'm getting '#lvalue $T15 is not identical to T?' when setting via subscript.
class Array2D<T> {
    let columns: Int
    let rows: Int
    let array: [T?]

    init(columns: Int, rows: Int) {
        self.columns = columns
        self.rows = rows
        array = [T?](count: rows*columns, repeatedValue: nil)
    }

    subscript(column: Int, row: Int) -> T? {
        get {
            return array[row*columns + column]
        }
        set {
            array[row*columns + column] = newValue // Error here
        }
    }
}
Any thoughts on how to resolve this?
In Beta3 constant arrays are completely immutable while variable arrays are entirely mutable. Change let array: [T?] to var array: [T?] and your code should work.
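In other words, the fix is a one-word change to the declaration above:

// Beta 2: element assignment through the subscript setter compiled even with `let`
let array: [T?]

// Beta 3 and later: the storage must be `var` for the setter to mutate it
var array: [T?]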

Swift generics: requiring addition and multiplication abilities of a type

I'm trying out some examples from the Swift book, namely the matrix example they have which introduces subscript options. This is the code I have:
struct Matrix<T> {
    let rows: Int, columns: Int
    var grid: T[]

    var description: String {
        return "\(grid)"
    }

    init(rows: Int, columns: Int, initialValue: T) {
        self.rows = rows
        self.columns = columns
        grid = Array(count: rows * columns, repeatedValue: initialValue)
    }

    func indexIsValidForRow(row: Int, column: Int) -> Bool {
        return row >= 0 && row < rows && column >= 0 && column < columns
    }

    subscript(row: Int, column: Int) -> T {
        get {
            assert(indexIsValidForRow(row, column: column), "Index out of range")
            return grid[(row * columns) + column]
        }
        set {
            assert(indexIsValidForRow(row, column: column), "Index out of range")
            grid[(row * columns) + column] = newValue
        }
    }
}
This is mostly copied from the book. A major difference is in this line here:
struct Matrix<T>
As far as I can tell, this says to the compiler that my Matrix class can hold values of type T, specified by the code using this class. Now, I'd like to make sure that the type T can be compared, so I can write this:
struct Matrix<T: Equatable>
This might be useful in case I want to compare 2 matrices, which would mean comparing their values. I also want to provide the ability to sum two matrices, so I should also add to this line a protocol requiring that the type 'T' given by the user of the matrix can be added:
struct Matrix<T: Equatable, "Summable">
Likewise, I'd also like to say:
struct Matrix<T: Equatable, "Summable", "Multipliable">
Question 1: What protocol name can I use? How can I achieve this?
On a related note, to add addition abilities using the '+' operator, I should declare a function like this (this applies also to multiplication):
#infix func + (m1: Matrix<T>, m2: Matrix<T>) -> Matrix<T> {
    // perform addition here and return a new matrix
    return result
}
However, this code is not accepted by Xcode. More specifically, this ) -> Matrix<T> { produces the error: Use of undeclared type 'T'. What I mean by that <T> is that the result will be a matrix holding the same type as the two input matrices, but I'm probably messing up the syntax completely.
Question 2: How can I provide type information to the result of the addition?
Here's the answer to your second question (but you really should ask two separate questions):
#infix func + <T> (m1: Matrix<T>, m2: Matrix<T>) -> Matrix<T> { ... }
For your first question: before solving it, here's the syntax for defining multiple constraints on a type parameter:
struct Matrix<T where T: Equatable, T: Summable, T: Multipliable> {...}
or, as GoZoner writes in the comments:
struct Matrix<T: protocol<Equatable, Summable, Multipliable>> {...}
But we're not going to need it. First, define a new protocol and list the operations that you need. You can even make it extend Equatable:
protocol SummableMultipliable: Equatable {
    func +(lhs: Self, rhs: Self) -> Self
    func *(lhs: Self, rhs: Self) -> Self
}
Then, provide extensions for the types that you want to conform. Here, for Int and Double, the extensions are even empty, as the implementation of the needed ops is built-in:
extension Int: SummableMultipliable {}
extension Double: SummableMultipliable {}
Then, declare your type constraint on the type parameter:
struct Matrix<T: SummableMultipliable> { ... }
Finally, you can write stuff like this:
let intMat = Matrix<Int>(rows: 3, columns: 3, initialValue: 0)
let doubleMat = Matrix<Double>(rows: 3, columns: 3, initialValue: 0)
let i: Int = intMat[0,0]
let d: Double = doubleMat[0,0]
The last thing you'll need is to insert the type constraint in the definition of your operator:
#infix func + <T: SummableMultipliable> (m1: Matrix<T>, m2: Matrix<T>) -> Matrix<T> { ... }
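To make that last piece concrete, here is a hedged sketch of what the elided operator body could look like, written in the modern spelling (no #infix attribute) with an element-wise loop that the answer leaves out:

func + <T: SummableMultipliable>(m1: Matrix<T>, m2: Matrix<T>) -> Matrix<T> {
    assert(m1.rows == m2.rows && m1.columns == m2.columns, "Matrix dimensions must match")
    var result = m1   // Matrix is a struct, so this is a copy
    for row in 0..<m1.rows {
        for column in 0..<m1.columns {
            result[row, column] = m1[row, column] + m2[row, column]
        }
    }
    return result
}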
For Question 1, start by defining a protocol:
protocol Summable { func ignore () }
It has a throwaway method. Then add it as an extension to the types that you want to be summable.
extension Int: Summable { func ignore () {} }
[Note: I tried the above without a throwaway method but got a failure; I suspect Swift needed something, anything, in the protocol.]
Now a test
35> protocol Summable { func ignore () }
36> extension Int: Summable { func ignore () {} }
37> func testing<T: Summable> (x: T) -> T { return x }
38> testing(1)
$R16: (Int) = 1
39> testing(1.2)
<REPL>:39:1: error: cannot convert the expression's type '$T1' to type 'Summable'
testing(1.2)
^~~~~~~~~~~~
For Question 2 [edit]: use the following
#infix func +<T: Summable> (m1: Matrix<T>, m2: Matrix<T>) -> Matrix<T> { ... }
[Note: I tried the above in the REPL, which didn't work. But it works in a file; a file probably defines a 'global environment' that the REPL doesn't.]