What's wrong with this solution for neighbor cell counting of a matrix? (Swift)

I was given this problem to solve inside a Swift playground. I was told my answer is correct but not good enough (that's right, pretty vague).
/*
Write a Swift playground that takes an n x n grid of integers. Each integer can be either 1 or 0.
The playground then outputs an n x n grid where each block indicates the number of 1's around that block (excluding the block itself). For block 0 on row 0, the surrounding blocks are (0,1), (1,0) and (1,1). Similarly, for block (1,1) all the blocks around it are to be counted as surrounding blocks.
Requirements:
Make sure your solution works for any size grid.
Spend an hour and a half coding the logic and another hour cleaning your code (adding comments, cleaning variable and function names).
Optimize your functions to be O(n^2).
Your output lines should not have any trailing or leading whitespaces.
Please use hard coded example input constants below.
Examples:
Input Grid:
let sampleGrid = [[0,1,0], [0,0,0], [1,0,0]]
Console Output:
1 0 1
2 2 1
0 1 0
/////////////////
Input Grid:
let sampleGrid = [[0,1,0,0], [0,0,0,1], [1,0,0,1],[0,1,0,1]]
Console Output:
1 0 2 1
2 2 3 1
1 2 4 2
2 1 3 1
*/
/// An *Error* type
struct ComputationError: Error {
    /// The message of the error
    let message: String
}
/// This function computes a matrix result from the input *matrix*
/// where an integer value represents the number of adjacent cells
/// in the input *matrix* having a 1.
///
/// The algorithm is O(n^2), if n is the side length of the matrix:
///
/// for each row in rows of matrix
/// for each column in columns of matrix
/// for each matrix[row][column] cell if equal to 1
/// add a 1 to adjacent (valid) cell in result matrix
///
/// - Parameter matrix: input square matrix with values of 0 or 1 only
/// - Returns: a matrix of equal size to input matrix with computed adjacency values
/// - Throws: throws a ComputationError if the matrix is of invalid size or has invalid values (something other than 1 or 0)
func compute(matrix: [[Int]]) throws -> [[Int]] {
    // The number of rows in matrix, which should equal the number of columns, i.e. side length, or n
    let side = matrix.count
    // The resulting matrix to return
    var result: [[Int]] = []
    // Initialize the result matrix
    for _ in 0..<side {
        result.append(Array<Int>(repeating: 0, count: side))
    }
    // A convenience constant to refer to the last element in a row or column
    let last = side - 1
    // Iterate over rows in matrix
    for row in 0..<side {
        if matrix[row].count < side {
            throw ComputationError(message: "Invalid number of columns (\(matrix[row].count)), should match number of rows (\(side))")
        }
        // Iterate over columns in matrix
        for column in 0..<side {
            // Consider this cell if it is 1, otherwise skip.
            // If it is 1, then add a 1 to all valid adjacent cells
            // in the result matrix.
            if matrix[row][column] == 1 {
                if 0 < row {
                    if 0 < column {
                        result[row-1][column-1] += 1
                    }
                    result[row-1][column] += 1
                    if column < last {
                        result[row-1][column+1] += 1
                    }
                }
                if 0 < column {
                    result[row][column-1] += 1
                }
                if column < last {
                    result[row][column+1] += 1
                }
                if row < last {
                    if 0 < column {
                        result[row+1][column-1] += 1
                    }
                    result[row+1][column] += 1
                    if column < last {
                        result[row+1][column+1] += 1
                    }
                }
            }
            else if matrix[row][column] == 0 {
                // ok
            }
            // If the value is neither 0 nor 1, throw an error
            else {
                throw ComputationError(message: "Invalid value (\(matrix[row][column])) encountered at row \(row) and column \(column)")
            }
        }
    }
    return result
}
/// Print *matrix* to console by iterating over each row in the
/// matrix and each column in the row, regardless of the count
/// of columns in the row. The output is row-wise and a single
/// space delimits columns, with no leading or trailing whitespace.
///
/// - Parameter matrix: an array of array of Int
func print(matrix: [[Int]]) {
    let rows = matrix.count
    for row in 0..<rows {
        let columns = matrix[row].count
        var line = ""
        for column in 0..<columns {
            if 0 < column {
                line += " "
            }
            line += "\(matrix[row][column])"
        }
        print(line)
    }
}
do {
    print(matrix: try compute(matrix: [[0,1,0], [0,0,0], [1,0,0]]))
}
catch let error {
    print(error)
}
do {
    print()
    print(matrix: try compute(matrix: [[0,1,0,0], [0,0,0,1], [1,0,0,1], [0,1,0,1]]))
}
catch let error {
    print(error)
}
// test for exception
do {
    print()
    print(matrix: try compute(matrix: [[0,1], [0,0,0], [1,0,0]]))
}
catch let error {
    print(error)
}
// test for exception
do {
    print()
    print(matrix: try compute(matrix: [[0,1,0], [3,0,0], [1,0,0]]))
}
catch let error {
    print(error)
}

They were likely looking for something in a functional programming style, and you probably needed to demonstrate that your solution is O(n^2).
For example:
// let sampleGrid = [[0,1,0], [0,0,0], [1,0,0]]
let sampleGrid = [[0,1,0,0], [0,0,0,1], [1,0,0,1],[0,1,0,1]]
func printMatrix(_ m: [[Any]]) {
    print(m.map { $0.map { "\($0)" }.joined(separator: " ") }.joined(separator: "\n"))
}
let emptyLine = [Array(repeating:0, count:sampleGrid.first!.count)]
let left = sampleGrid.map{ [0] + $0.dropLast() } // O(n)
let right = sampleGrid.map{ $0.dropFirst() + [0] } // O(n)
let up = emptyLine + sampleGrid.dropLast() // O(n)
let down = sampleGrid.dropFirst() + emptyLine // O(n)
let leftRight = zip(left,right).map{zip($0,$1).map{$0+$1}} // O(n^2)
let upDown = zip(up,down).map{zip($0,$1).map{$0+$1}} // O(n^2)
let cornersUp = emptyLine + leftRight.dropLast() // O(n)
let cornersDown = leftRight.dropFirst() + emptyLine // O(n)
let sides = zip(leftRight,upDown).map{zip($0,$1).map{$0+$1}} // O(n^2)
let corners = zip(cornersUp,cornersDown).map{zip($0,$1).map{$0+$1}} // O(n^2)
// 6 x O(n) + 5 x O(n^2) ==> O(n^2)
let neighbourCounts = zip(sides,corners).map{zip($0,$1).map{$0+$1}} // O(n^2)
print("SampleGrid:")
printMatrix(sampleGrid)
print("\nNeighbour counts:")
printMatrix(neighbourCounts)
...
SampleGrid:
0 1 0 0
0 0 0 1
1 0 0 1
0 1 0 1
Neighbour counts:
1 0 2 1
2 2 3 1
1 2 4 2
2 1 3 1
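For comparison, here is a sketch of a more compact imperative version (not from the answer above; the countNeighbours(in:) name is just illustrative): every cell sums its clamped 3x3 neighbourhood and subtracts the cell itself, which is still O(n^2) because each cell does a constant amount of work.

// A sketch, assuming a square grid of 0/1 values (no validation shown).
func countNeighbours(in grid: [[Int]]) -> [[Int]] {
    let n = grid.count
    return (0..<n).map { row in
        (0..<n).map { column -> Int in
            var count = -grid[row][column]   // exclude the cell itself
            for r in max(row - 1, 0)...min(row + 1, n - 1) {
                for c in max(column - 1, 0)...min(column + 1, n - 1) {
                    count += grid[r][c]
                }
            }
            return count
        }
    }
}
printMatrix(countNeighbours(in: sampleGrid))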

Related

Swift - Find number that is a multiple of all numbers 1...20

I am working on Euler problem 5 which is:
2520 is the smallest number that can be divided by each of the numbers from 1 to 10 without any remainder.
What is the smallest positive number that is evenly divisible by all of the numbers from 1 to 20?
I am having trouble with my nested loops. I have a for loop inside a while loop. My logic is that I check a number (currentNumber) against 1-20 (i); if currentNumber is not a multiple of i (checked using modular arithmetic), it breaks out of that loop and tries the next larger number.
My issue is that I cannot figure out how to jump out of only my inner loop and not my outer loop. Here is my code:
class Five {
    init() {
        var currentNumber = 1
        while true {
            for i in 1...20 {
                if currentNumber % i != 0 {
                    currentNumber += 1
                    continue
                }
            }
            break
        }
        print("the smallest positive number that is evenly divisible " +
              "by all of the numbers from 1 to 20 is \(currentNumber)")
    }
}
You already got a good and correct answer. Just as an add-on, for the
sake of completeness:
An alternative to labeled continue statements is to move the inner loop into a separate function from which you can “early return”:
func isDivisibleBy1To20(_ number: Int) -> Bool {
    for j in 2...20 {
        if number % j != 0 {
            return false
        }
    }
    return true
}
var currentNumber = 1
while !isDivisibleBy1To20(currentNumber) {
    currentNumber += 1
}
print("solution:", currentNumber)
Using functional methods this can be simplified to
func isDivisibleBy1To20(_ number: Int) -> Bool {
    return !(2...20).contains(where: { number % $0 != 0 })
}
let solution = (1...).first(where: isDivisibleBy1To20)!
print("solution:", solution)
(Remark: There are other, much faster methods to solve this problem.)
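One such faster method (a sketch, not part of the original answer; the gcd/lcm helper names are illustrative) computes the least common multiple of 1...20 directly with the Euclidean algorithm instead of searching:

// Greatest common divisor via the Euclidean algorithm.
func gcd(_ a: Int, _ b: Int) -> Int {
    var (a, b) = (a, b)
    while b != 0 {
        (a, b) = (b, a % b)
    }
    return a
}
// lcm(a, b) = a / gcd(a, b) * b  (divide first to limit overflow)
func lcm(_ a: Int, _ b: Int) -> Int {
    return a / gcd(a, b) * b
}
let answer = (1...20).reduce(1, lcm)
print("solution:", answer)   // 232792560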

Sum of Printed For Loop in Swift

For a project, I'm trying to find the sum of the multiples of both 3 and 5 under 10,000 using Swift. Insert NoobJokes.
Printing the multiples of both 3 and 5 was fairly easy using a for loop, but I'm wondering how I can... "sum" all of the items that I printed.
for i in 0...10000 {
if i % 3 == 0 || i % 5 == 0 {
print(i)
}
}
(4,668 individual numbers printed; how can they be summed?)
Just a little walkthrough of the process. First you need a variable to hold the running sum as the loop executes; you can give it a default value of 0, as in the first line below. On every iteration, i (which is a multiple of 3 or 5) is added to totalSum, and after the last iteration you have your result.
var totalSum = 0
for i in 0...10000 {
    if i % 3 == 0 || i % 5 == 0 {
        print(i)
        totalSum = totalSum + i
    }
}
print(totalSum)
In Swift you can do it without an explicit loop:
let numberOfDivisiblesBy3And5 = (0...10000).filter{ $0 % 3 == 0 || $0 % 5 == 0 }.count
Or to get the sum of the items:
let sumOfDivisiblesBy3And5 = (0...10000).filter{ $0 % 3 == 0 || $0 % 5 == 0 }.reduce(0, {$0 + $1})
The range specifies the numbers the operation acts on. Here we use the filter method to keep the numbers that are multiples of 3 or 5, and then sum the filtered values (reduce(0, +) does the job).
let sum = (3...n).filter({ ($0 % 3) * ($0 % 5) == 0 }).reduce(0, +) // n is the upper limit, e.g. 10_000
You just need to sum the resulting i values, like below:
var sum = 0
for i in 0...10000 {
    if i % 3 == 0 || i % 5 == 0 {
        sum = sum + i
        print(i)
    }
}
Now sum contains the sum of the values.
Try this:
var sum = 0
for i in 0...10000 {
    if i % 3 == 0 || i % 5 == 0 {
        sum = sum + i
        print(i)
    }
}
print(sum)
Bottom line, this should work:
var sum = 0
for i in 0...10000 {
    if i % 3 == 0 || i % 5 == 0 {
        sum += i
        print(i)
    }
}
print(sum)
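As an aside, for this particular problem the sum can also be computed without a loop at all, using the arithmetic-series formula plus inclusion–exclusion (a sketch, not taken from any of the answers above; the names are illustrative):

// Sum of all non-negative multiples of k up to and including limit:
// k + 2k + ... + mk = k * m * (m + 1) / 2, where m = limit / k.
func sumOfMultiples(of k: Int, upTo limit: Int) -> Int {
    let m = limit / k
    return k * m * (m + 1) / 2
}
let limit = 10_000
// Multiples of 15 are counted by both terms, so subtract them once.
let total = sumOfMultiples(of: 3, upTo: limit) +
            sumOfMultiples(of: 5, upTo: limit) -
            sumOfMultiples(of: 15, upTo: limit)
print(total)   // same value the loop-based answers print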

How to find all numbers divisible by another number in swift?

How do I find all the numbers divisible by another number in Swift, i.e. those with a remainder of 0? This is a FizzBuzz-related question.
Lets say that...
let number = 150
And I want to do something like...
print("Fizz") // for all the numbers where the remainder of number % 3 == 0.
So if number was 15, it would print "Fizz" 5 times.
This will work
let number = 150
for num in 1...number {
    if num % 3 == 0 {
        print("Fizz :\(num)")
    }
}
You can just loop up to the number and check against your desired divisor; if the remainder is 0, print Fizz:
let number = 15
for i in 0..<number {
    if i % 3 == 0 {
        print("\(i) Fizz")
    }
}
It will print Fizz 5 times, together with the i value, so you can see which numbers are Fizz.
Simply try this code. (You can replace num with any Int, and divider with any other Int value, which is used to divide all numbers up to num.)
override func viewDidLoad() {
    let num: Int = 15
    let divider: Int = 3
    var counter: Int = divider
    while counter <= num {
        print("Fizz")
        counter += divider
    }
}
func fizzbuzz(number: Int) -> String {
    if number % 3 == 0 && number % 5 == 0 {
        return "Fizz Buzz"
    } else if number % 3 == 0 {
        return "Fizz"
    } else if number % 5 == 0 {
        return "Buzz"
    } else {
        return String(number)
    }
}
https://www.hackingwithswift.com/guide/ios-classic/1/3/challenge
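A usage sketch (not part of the answer or the linked challenge) that drives the fizzbuzz(number:) helper above with a simple loop:

let number = 15
for i in 1...number {
    print(fizzbuzz(number: i))   // "Fizz" for multiples of 3, "Buzz" for 5, "Fizz Buzz" for both
}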

for in loop with where clause in Swift

I have tried to update a little function to Swift 2.1. The original working code was:
import func Darwin.sqrt
func sqrt(x:Int) -> Int { return Int(sqrt(Double(x))) }
func sigma(n: Int) -> Int {
    // adding up proper divisors from 1 to sqrt(n) by trial division
    if n == 1 { return 0 } // definition of aliquot sum
    var result = 1
    let root = sqrt(n)
    for var div = 2; div <= root; ++div {
        if n % div == 0 {
            result += div + n/div
        }
    }
    if root*root == n { result -= root }
    return (result)
}
print(sigma(10))
print(sigma(3))
After updating the for loop I get a runtime error for the last line. Any idea why that happens?
import func Darwin.sqrt
func sqrt(x:Int) -> Int { return Int(sqrt(Double(x))) }
func sigma(n: Int) -> Int {
    // adding up proper divisors from 1 to sqrt(n) by trial division
    if n == 1 { return 0 } // definition of aliquot sum
    var result = 1
    let root = sqrt(n)
    for div in 2...root where n % div == 0 {
        result += div + n/div
    }
    if root*root == n { result -= root }
    return (result)
}
print(sigma(10))
print(sigma(3)) //<- run time error with for in loop
When you pass 3 to sigma, your range 2...root becomes invalid, because the upper bound, root, is less than the lower bound, 2.
The closed range operator (a...b) defines a range that runs from a to b, and includes the values a and b. The value of a must not be greater than b.
root is assigned sqrt(n), which means that in order for the 2...root range to remain valid, n must be at least 2² (i.e. 4).
You can fix this by supplying a lower limit for the right side, i.e.
for div in 2...max(root, 2) where n % div == 0 {
    ...
}
However, at this point your solution with the regular for loop is more readable.
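As a further alternative (a sketch in current Swift syntax, not from the original answer), stride(from:through:by:) simply yields nothing when the upper bound is below the lower bound, so no max(...) workaround is needed:

import func Darwin.sqrt
func sqrt(_ x: Int) -> Int { return Int(sqrt(Double(x))) }
func sigma(_ n: Int) -> Int {
    // adding up proper divisors from 1 to sqrt(n) by trial division
    if n == 1 { return 0 }
    var result = 1
    let root = sqrt(n)
    // stride produces an empty sequence when root < 2, so sigma(3) no longer traps
    for div in stride(from: 2, through: root, by: 1) where n % div == 0 {
        result += div + n / div
    }
    if root * root == n { result -= root }
    return result
}
print(sigma(10)) // 8 (proper divisors 1 + 2 + 5)
print(sigma(3))  // 1 (no runtime error)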

ios how to check if division remainder is integer

Does any of you know how I can check whether the division remainder is an integer or zero?
if ( integer ( 3/2))
You should use the modulo operator like this
// a, b are Ints
if a % b == 0 {
    // remainder 0
} else {
    // b does not divide a evenly
}
It sounds like what you are looking for is the modulo operator %, which will give you the remainder of an operation.
3 % 2 // yields 1
3 % 1 // yields 0
3 % 4 // yields 3
However, if you want to actually perform the division first, you may need something a bit more complex, such as the following:
// Perform the division first, then take the remainder modulo 1, which
// yields the fractional part; compare it to 0 to determine whether the
// quotient is an integer. (a and b must be floating-point values here;
// with Int operands the division itself already truncates.)
if (a / b) % 1 > 0 {
    // All non-integer values go here
} else {
    // All integer values go here
}
Walkthrough
(3.0 / 2.0) // yields 1.5
1.5 % 1 // yields 0.5
0.5 > 0 // true
Swift 3:
// a and b are floating-point values
if a.truncatingRemainder(dividingBy: b) == 0 {
    // All integer values go here
} else {
    // All non-integer values go here
}
You can use the code below to find out which type the resulting value is.
var val = 3/2
var integerType = Mirror(reflecting: val)
if integerType.subjectType == Int.self {
    print("Yes, the value is an integer")
} else {
    print("No, the value is not an integer")
}
Let me know if the above was useful.
Swift 5
if numberOne.isMultiple(of: numberTwo) { ... }
Swift 4 or less
if numberOne % numberTwo == 0 { ... }
Swift 2.0
print(Int(Float(9) % Float(4))) // result 1
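For reference, a small self-contained sketch combining the modern checks above (the variable names are illustrative):

let dividend = 9
let divisor = 4
// Swift 5+: isMultiple(of:) reads more clearly than a manual % comparison.
if dividend.isMultiple(of: divisor) {
    print("\(dividend) is evenly divisible by \(divisor)")
} else {
    print("remainder is \(dividend % divisor)")   // prints "remainder is 1"
}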