We came across this odd behaviour when using loops in a didSet. The idea is that we have a data type with a tree structure, and in each element we want to store the level that item is on. So in the didSet of the level attribute we also set the level attribute of the children. However, we realised that this only works when using forEach, not when using for ... in. Here is a short example:
class Item {
    var subItems: [Item] = []

    var depthA: Int = 0 {
        didSet {
            for item in subItems {
                item.depthA = depthA + 1
            }
        }
    }

    var depthB: Int = 0 {
        didSet {
            subItems.forEach({ $0.depthB = depthB + 1 })
        }
    }

    init(depth: Int) {
        self.depthA = 0
        if depth > 0 {
            for _ in 0 ..< 2 {
                subItems.append(Item(depth: depth - 1))
            }
        }
    }

    func printDepths() {
        print("\(depthA) - \(depthB)")
        subItems.forEach({ $0.printDepths() })
    }
}
let item = Item(depth: 3)
item.depthA = 0
item.depthB = 0
item.printDepths()
When I run this I get the following output:
0 - 0
1 - 1
0 - 2
0 - 3
0 - 3
0 - 2
0 - 3
0 - 3
1 - 1
0 - 2
0 - 3
0 - 3
0 - 2
0 - 3
0 - 3
It seems that the didSet of the sub-items' attribute is not called when it is set from a for ... in loop. Does anyone know why this is the case?
UPDATE:
The problem is not that didSet is not called from the init. We change the attribute afterwards (see the last four lines of code), and only one of the two depth attributes propagates the new value to the children.
If you use defer to update any optional properties, or to further update non-optional properties that you have already initialized (and after you have called any super init methods), then your willSet, didSet, etc. will be called.
for item in subItems {
    defer {
        item.depthA = depthA + 1
    }
}
When you use forEach, it makes a kind of "contract" with the elements, and because it is an instance method (unlike a for ... in loop) it triggers the didSet of the variable. In the case above, where we use the loop, we have to trigger the didSet manually.
This solves the problem, I think. Hope it helps!
It seems that in the init, didSet is not called.
I tried this line in the Swift REPL:
class A { var a: Int { didSet { print("A") } }; init() { a = 5 } }
Then I called A() — didSet was NOT called.
But with
A().a = 7
I found that didSet is called.
So the solution is to make a function (preferably final) that does the work you would otherwise put in didSet, and call it both from didSet and from init. You can put something like this in your class:
final func updateA() {
    // Do your "didSet" work here
}

var a: Int {
    didSet {
        updateA()
    }
}

init() {
    a = 5
    updateA()
}
So in your case:
func updateDepthA() {
    for item in subItems {
        item.depthA = depthA + 1
    }
}

var depthA: Int = 0 {
    didSet {
        updateDepthA()
    }
}

...

init(depth: Int) {
    self.depthA = 0
    updateDepthA()
    ...
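Putting it together, a minimal sketch of the Item class using that pattern (only depthA shown for brevity; note that the updateDepthA() call in init only has a visible effect once subItems has been populated):

class Item {
    var subItems: [Item] = []

    var depthA: Int = 0 {
        didSet {
            updateDepthA()
        }
    }

    // Shared by didSet and init, because didSet is not triggered while initializing.
    final func updateDepthA() {
        for item in subItems {
            item.depthA = depthA + 1
        }
    }

    init(depth: Int) {
        if depth > 0 {
            for _ in 0 ..< 2 {
                subItems.append(Item(depth: depth - 1))
            }
        }
        updateDepthA()
    }
}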
Related
I'm looking for a solution in Swift.
I have a counter variable "counter" which increases by 1 every time the func is called. Now, with every new call of the func, another variable should be used, like:
func questions() {
    counter += 1
    labelQuestion.text = question"counter"   // i.e. question0 -> question1 -> ...
}
How can I do that?
Thanks in advance for any answers!
Maybe use a didSet property observer on counter:
var counter: Int = 0 {
    didSet {
        labelQuestion.text = "the new question" // look up the question that matches counter here
    }
}
func nextQuestion() {
    counter += 1
}
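One way to make "the new question" concrete is to keep the questions in an array instead of separate question0, question1, ... variables. A minimal sketch (the QuizViewController name and the questions contents are placeholders; labelQuestion is assumed to be your UILabel outlet):

import UIKit

class QuizViewController: UIViewController {
    @IBOutlet weak var labelQuestion: UILabel!

    // Hypothetical question texts standing in for question0, question1, ...
    let questions = ["Question 0", "Question 1", "Question 2"]

    var counter: Int = 0 {
        didSet {
            // Update the label whenever the counter changes,
            // guarding against running past the last question.
            if counter < questions.count {
                labelQuestion.text = questions[counter]
            }
        }
    }

    func nextQuestion() {
        counter += 1
    }
}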
https://leetcode.com/problems/maximum-depth-of-n-ary-tree/
I've already solved this in other ways; I'm just trying to solve it with this code and figure out what's incorrect. It currently returns incorrect results:
class Solution {
    func maxDepth(_ root: Node?) -> Int {
        guard let node = root else { return 0 }
        return node.children.map(maxDepth).max() ?? 0 + 1
    }
}
Helper class if you want to test this in Xcode:
class Node {
    var value: Int
    var children: [Node] = []
    weak var parent: Node?

    init(value: Int) {
        self.value = value
    }

    func add(child: Node) {
        children.append(child)
        child.parent = self
    }
}
Example:
let one = Node(value: 1)
let two = Node(value: 2)
let three = Node(value: 3)
one.add(child: two)
two.add(child: three)
print("res", maxDepth(one)) // returns: 2. Expected: 3
I'm always returning 2 actually. Not sure why...
Shout out to Martin for helping me figure this out.
Pro tip: for such LeetCode-style questions, the dumbest/simplest tests are the best.
The line below has 2 mistakes:
return node.children.map(maxDepth).max() ?? 1 + 1
Because + binds tighter than ??, the fallback is the whole 1 + 1 expression, and the + 1 never applies to the max() result. Wrap the ?? expression in parentheses.
The default should actually be 0, not 1.
So just do:
return (node.children.map(maxDepth).max() ?? 0) + 1
I made that mistake because I almost never have any arithmetic operations after the ?? 🤦♂️
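A quick way to see the precedence issue: ?? binds looser than +, so the + 1 attaches to the fallback value rather than to the result of the nil-coalescing. A tiny illustration (hypothetical array, just to show the parse):

let depths = [3]

// Parsed as depths.max() ?? (0 + 1): the + 1 only applies to the fallback value.
let wrong = depths.max() ?? 0 + 1      // 3

// Parentheses make the + 1 apply to the whole nil-coalescing result.
let right = (depths.max() ?? 0) + 1    // 4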
Here is my problem:
struct NewStack<S> {
    var stacks = [S]()

    mutating func addS(_ s: S) {
        stacks.append(s)
    }

    mutating func removeLastS() {
        stacks.removeLast()
    }
}
var newStacks = NewStack<String>()
newStacks.addS("Eins")
newStacks.addS("Zwei")
newStacks.addS("Drei")
for stack in newStacks.stacks {
    var stackCount = 1
    repeat {
        stackCount + 1
    } while stackCount <= newStacks.stacks.count {
        print("Stack \(stackCount) = \(stack)")
    }
}
At the line with the while statement it throws the warning: Cannot call value of non-function type 'Int'.
It would be very helpful if someone could tell me what I have to do, thanks.
There are several syntax issues with your code.
First, stackCount + 1 doesn't increment stackCount; to do that, you need stackCount += 1. Secondly, you cannot have a code block executed after the while condition; you need to move the print inside the repeat block.
for stack in newStacks.stacks {
    var stackCount = 1
    repeat {
        stackCount += 1
        print("Stack \(stackCount) = \(stack)")
    } while stackCount <= newStacks.stacks.count
}
There is really no need for an inner loop; you can use a simple for loop like this:
for stack in newStacks.stacks {
    print("Stack \(stack)")
}
Or, if you want the index to be printed as well (I added + 1 so the printed index starts at 1):
for (index, stack) in newStacks.stacks.enumerated() {
    print("Stack \(index + 1) = \(stack)")
}
Hi, I'm stuck trying to solve this:
Create a class Classy to represent how classy someone or something is. If you add fancy-looking items, "classiness" increases!
Create a method addItem() in Classy that takes a string as input, adds it to the items array, and updates the classiness total.
Add another method, getClassiness(), that returns the classiness value based on the items.
The following items have classiness points associated with them:
"tophat" = 2
"bowtie" = 4
"monocle" = 5
Everything else has 0 points.
The sum is not being computed correctly.
The first problem is that when it falls into the default case, everything becomes 0. I've tried this in the default case:
default:
    self.classiness += 0
and I got 2 for every case.
I've also tried summing a total inside each case and returning that total, but got the same result.
This is my latest version:
class Classy {
    var items: [String]
    var classiness: Int

    init() {
        self.items = []
        self.classiness = 0
    }

    func addItem(_ item: String) {
        var total = 0
        self.items.append(item)
        total += classiness
    }

    func getClassiness() -> Int {
        switch items {
        case ["tophat"]:
            self.classiness = 2
        case ["bowtie"]:
            self.classiness = 4
        case ["monocle"]:
            self.classiness = 5
        default:
            self.classiness = 0
        }
        return self.classiness
    }
}
let me = Classy()
print(me.getClassiness())
me.addItem("tophat")
print(me.getClassiness())
me.addItem("bowtie")
me.addItem("jacket")
me.addItem("monocle")
print(me.getClassiness()) // This should be 11
Your switch needs updating: it should loop over the items, and each case should match a String, not an Array.
func getClassiness() -> Int {
    var total = 0
    for item in items {
        switch item {
        case "tophat":
            total += 2
        case "bowtie":
            total += 4
        case "monocle":
            total += 5
        default:
            total += 0
        }
    }
    self.classiness = total
    return self.classiness
}
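As a variation on the same idea, the point values can also live in a dictionary, which reduces getClassiness() to a single pass (just a sketch with the same behaviour as the switch above):

func getClassiness() -> Int {
    // Points per known item; anything not listed is worth 0.
    let points = ["tophat": 2, "bowtie": 4, "monocle": 5]
    classiness = items.reduce(0) { $0 + (points[$1] ?? 0) }
    return classiness
}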
Hello, I practice on HackerRank using Swift and now I have a problem. My code works great in a Swift playground and returns the expected result, but on HackerRank I get a runtime error (~ no response on stdout ~). I've tried resetting the code and refreshing the page. What could be the problem?
func diagonalDifference(arr: [[Int]]) -> Int {
    // Write your code here
    let rowNumber = arr[0][0]
    var leftD = 0
    var rightD = 0
    for i in 1...rowNumber {
        leftD += arr[i][i - 1]
    }
    var increasedNum = 0
    for i in (1...rowNumber).reversed() {
        rightD += arr[i][increasedNum]
        increasedNum += 1
    }
    var absoluteDifference = leftD - rightD
    if absoluteDifference < 0 {
        absoluteDifference = absoluteDifference * -1
    }
    return absoluteDifference
}
Here is the challenge page:
https://www.hackerrank.com/challenges/diagonal-difference/problem
Your problem is a misunderstanding of what is passed to your diagonalDifference() function. The code which calls that function uses the first line of input to size the array correctly, but that value is not passed to your function in arr[0][0]. Instead, you should use arr.count to determine the dimensions of the array, and you should index the array with 0..<arr.count.
To fix your code
change:
let rowNumber = arr[0][0]
to:
let rowNumber = arr.count
change:
leftD += arr[i][i - 1]
to:
leftD += arr[i][i]
And change both instances of
1...rowNumber
to:
0..<rowNumber
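Applying those three changes, the original function would look roughly like this (logic otherwise unchanged):

func diagonalDifference(arr: [[Int]]) -> Int {
    let rowNumber = arr.count
    var leftD = 0
    var rightD = 0
    // Primary diagonal: arr[0][0], arr[1][1], ...
    for i in 0..<rowNumber {
        leftD += arr[i][i]
    }
    // Secondary diagonal, read from the bottom row up.
    var increasedNum = 0
    for i in (0..<rowNumber).reversed() {
        rightD += arr[i][increasedNum]
        increasedNum += 1
    }
    var absoluteDifference = leftD - rightD
    if absoluteDifference < 0 {
        absoluteDifference = absoluteDifference * -1
    }
    return absoluteDifference
}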
Alternatively, the whole function can be written more compactly by summing both diagonals in a single loop:

func diagonalDifference(arr: [[Int]]) -> Int {
    var difference = 0
    for i in 0..<arr.count {
        // arr[i][i] is on the primary diagonal, arr[i][arr.count - 1 - i] on the secondary one.
        difference += (arr[i][i] - arr[i][arr.count - 1 - i])
    }
    return abs(difference)
}
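For a quick sanity check, using the sample matrix from the problem statement:

let sample = [
    [11, 2, 4],
    [4, 5, 6],
    [10, 8, -12]
]
print(diagonalDifference(arr: sample)) // 15  (|4 - 19|)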