Initialize local variable inside closure - swift

Is it possible to initialize a variable in a closure? Specifically, the following code gives an error:
func doSomething(todo: (Void -> Void)) -> Void {
    todo()
}

var i: Int
doSomething({ i = 3 })
print(i)
because i is captured before being initialized. Of course, I can always give the variable a default value, and skipping that is usually a micro-optimization, but I'm still wondering.
Edit:
In the end I've gone with an implicitly unwrapped optional, var i: Int!,
thanks to @Laffen and @dfri for pointing me in the right direction. Using an optional should be the best way in the majority of cases.

If no default value is set at initialization, you should append a ? to make the variable an optional.
var i: Int?

func someFunc() {
    i = 1
}
Note: to make this answer easier to read alongside the comment made by @dfri, I've included the comment here:
Perhaps worth mentioning that i is now an implicitly unwrapped optional, so it's possible for i to take the value nil, after which accessing the nil-valued i will not produce a compile-time warning, but will raise a runtime exception, e.g. i = nil ... print(i). For this case, I find it safer to let i be a "regular" optional, in which case the compiler will prompt you to unwrap it (and you can do this in a safe manner rather than implicitly using forced unwrapping ! by default: e.g. var i: Int? ... print(i ?? "nil"), which safely prints "nil" if there is no value).
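For illustration, here is a minimal sketch of my own (in current Swift syntax, which writes () -> Void rather than Void -> Void) contrasting the two approaches:

func doSomething(_ todo: () -> Void) {
    todo()
}

var a: Int?                // regular optional: the compiler forces an unwrap at each use
doSomething { a = 3 }
if let a = a { print(a) }  // "3"

var b: Int!                // implicitly unwrapped: reads trap at runtime while still nil
doSomething { b = 3 }
print(b!)                  // "3"; had the closure never run, this line would crash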

If you move the print(i) inside doSomething, then it works.
func doSomething(todo: (Void -> Void)) -> Void {
    todo()
    print(i)
}

var i: Int
doSomething({ i = 3 })
If the value is not mutated outside the closure, Swift will make a copy of it, and so the i referred to outside doSomething is different from the one inside doSomething.
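For completeness, a sketch of my own (current syntax) showing that the original error also disappears if i simply has a value before the closure captures it:

func doSomething(_ todo: () -> Void) {
    todo()
}

var i = 0              // initialized before capture, so no error
doSomething { i = 3 }
print(i)               // "3": the closure mutated the outer variable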

Related

Is it possible to have parameters that aren't used as arguments?

For example, I have func test(paramA: String){}.
Sometimes I want to pass a string as an argument, but other times I don't want to pass an argument at all and would rather call: test()
Is it possible to call both test() and test("hello") or does it need to be different functions?
Also I'm not sure if this is called something in particular. SO defined optional parameters as:
An optional parameter is one that a caller can include in a call to a function or method, but doesn't have to. When omitted, a default value is used instead. Optional parameters are useful when the default value is used in most cases, but still needs to be specified on occasion.
To me, in Swift, "optional" means using ?, in reference to a value that might be nil.
EDIT
Thanks for the response. It's evident that I should use default parameters. However, doing so in a closure results in the following error:
"Default argument not permitted in a tuple type"
func characterIsMoving(i: Int, j: Int, completion: @escaping (_ moveWasWithinLimit: Bool, _ test: Bool = false) -> Void) { ... }
Here's the full function, if that's helpful:
func characterIsMoving(i: Int, j: Int, completion: @escaping (_ moveWasWithinLimit: Bool, _ test: Bool = false) -> Void) {
    if !gameBoardArray[i][j].isAccessibilityElement {
        print("Not Accessible")
        currentSelectedTile = character.getCurrentPosition()
        return
    } else {
        print("Moving")
        var moveWasWithinLimit: Bool
        if characterMoveLimit.contains(currentSelectedTile) {
            print("Within Limit")
            previousSelectedTile.fillColor = SKColor.gray
            currentSelectedTile.fillColor = SKColor.brown
            placeCharacter(row: i, col: j)
            buttonIsAvailable = false
            for moveRadius in characterMoveLimit {
                moveRadius.fillColor = SKColor.gray
            }
            characterMoveLimit.removeAll()
            moveLimit(limitWho: "CHARACTER", characterOrEnemy: character, i: i, j: j)
            moveWasWithinLimit = true
            completion(moveWasWithinLimit)
        } else {
            print("Outside Move Limit")
            currentSelectedTile = previousSelectedTile
            moveWasWithinLimit = false
            completion(moveWasWithinLimit)
        }
    }
}
You (everyone, really) would really benefit from reading the Swift book, cover to cover.
What you're looking for is called a default value.
func test(paramA: String = "Your default value here") {}
or
func test(paramA: String? = nil) {}
The former is simpler, but more limited. For example, you can't distinguish whether the default value "Your default value here" was used, or whether the caller passed in their own value which happens to be "Your default value here". In my experience, the distinction is seldom required, but it's good to call out just in case.
In the latter case, you have the flexibility to handle the optional in many more ways. You could substitute a default value with ??, do conditional binding, map it, etc.
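As a quick usage sketch of the two styles (hypothetical names and values):

func greet(name: String = "world") {
    print("Hello, \(name)!")
}

greet()                  // "Hello, world!" (default used)
greet(name: "Swift")     // "Hello, Swift!"

func greetOptional(name: String? = nil) {
    // nil means "not supplied", distinguishable from any real value
    print("Hello, \(name ?? "world")!")
}

greetOptional()              // "Hello, world!"
greetOptional(name: "Swift") // "Hello, Swift!"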
First, you have to be clear about your requirements.
You should avoid methods that depend directly on instance variables, as setting their values will be a problem for your unit test cases.
As I understand it, the correct approach is to define the method with a parameter that has a default value.
That way you can call the method with or without the argument, as your requirements dictate.
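Regarding the edit above: Swift function types cannot declare default argument values, which is exactly what "Default argument not permitted in a tuple type" is saying. A hedged sketch of one workaround (the game logic is elided; characterIsMoving and its parameters come from the question): keep the closure's signature fixed and add a convenience overload that fills in the extra flag.

// Function types can't carry defaults, so the completion gets a fixed signature.
func characterIsMoving(i: Int, j: Int,
                       completion: @escaping (_ moveWasWithinLimit: Bool, _ test: Bool) -> Void) {
    // ... game logic elided ...
    completion(true, false)   // supply the would-be default explicitly
}

// Convenience overload for callers that don't care about the second flag.
func characterIsMoving(i: Int, j: Int,
                       completion: @escaping (_ moveWasWithinLimit: Bool) -> Void) {
    characterIsMoving(i: i, j: j) { moveWasWithinLimit, _ in
        completion(moveWasWithinLimit)
    }
}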

UnsafeMutablePointer.pointee and didSet properties

I got some unexpected behavior using UnsafeMutablePointer on an observed property in a struct I created (on Xcode 10.1, Swift 4.2). See the following playground code:
struct NormalThing {
    var anInt = 0
}

struct IntObservingThing {
    var anInt: Int = 0 {
        didSet {
            print("I was just set to \(anInt)")
        }
    }
}

var normalThing = NormalThing(anInt: 0)
var ptr = UnsafeMutablePointer(&normalThing.anInt)
ptr.pointee = 20
print(normalThing.anInt) // "20\n"

var intObservingThing = IntObservingThing(anInt: 0)
var otherPtr = UnsafeMutablePointer(&intObservingThing.anInt)
// "I was just set to 0."
otherPtr.pointee = 20
print(intObservingThing.anInt) // "0\n"
Seemingly, modifying the pointee on an UnsafeMutablePointer to an observed property doesn't actually modify the value of the property. Also, the act of assigning the pointer to the property fires the didSet action. What am I missing here?
Any time you see a construct like UnsafeMutablePointer(&intObservingThing.anInt), you should be extremely wary about whether it'll exhibit undefined behaviour. In the vast majority of cases, it will.
First, let's break down exactly what's happening here. UnsafeMutablePointer doesn't have any initialisers that take inout parameters, so what initialiser is this calling? Well, the compiler has a special conversion that allows a & prefixed argument to be converted to a mutable pointer to the 'storage' referred to by the expression. This is called an inout-to-pointer conversion.
For example:
func foo(_ ptr: UnsafeMutablePointer<Int>) {
    ptr.pointee += 1
}

var i = 0
foo(&i)
print(i) // 1
The compiler inserts a conversion that turns &i into a mutable pointer to i's storage. Okay, but what happens when i doesn't have any storage? For example, what if it's computed?
func foo(_ ptr: UnsafeMutablePointer<Int>) {
    ptr.pointee += 1
}

var i: Int {
    get { return 0 }
    set { print("newValue = \(newValue)") }
}

foo(&i)
// prints: newValue = 1
This still works, so what storage is being pointed to by the pointer? To solve this problem, the compiler:
1. Calls i's getter, and places the resultant value into a temporary variable.
2. Gets a pointer to that temporary variable, and passes that to the call to foo.
3. Calls i's setter with the new value from the temporary.
Effectively doing the following:
var j = i // calling `i`'s getter
foo(&j)
i = j // calling `i`'s setter
It should hopefully be clear from this example that this imposes an important constraint on the lifetime of the pointer passed to foo – it can only be used to mutate the value of i during the call to foo. Attempting to escape the pointer and using it after the call to foo will result in a modification of only the temporary variable's value, and not i.
For example:
func foo(_ ptr: UnsafeMutablePointer<Int>) -> UnsafeMutablePointer<Int> {
    return ptr
}

var i: Int {
    get { return 0 }
    set { print("newValue = \(newValue)") }
}
let ptr = foo(&i)
// prints: newValue = 0
ptr.pointee += 1
ptr.pointee += 1 takes place after i's setter has been called with the temporary variable's new value, therefore it has no effect.
Worse than that, it exhibits undefined behaviour, as the compiler doesn't guarantee that the temporary variable will remain valid after the call to foo has ended. For example, the optimiser could de-initialise it immediately after the call.
Okay, but as long as we only get pointers to variables that aren't computed, we should be able to use the pointer outside of the call it was passed to, right? Unfortunately not; it turns out there are lots of other ways to shoot yourself in the foot when escaping inout-to-pointer conversions!
To name just a few (there are many more!):
A local variable is problematic for a similar reason to our temporary variable from earlier – the compiler doesn't guarantee that it will remain initialised until the end of the scope it's declared in. The optimiser is free to de-initialise it earlier.
For example:
func bar() {
    var i = 0
    let ptr = foo(&i)
    // Optimiser could de-initialise `i` here...
    // ...making this undefined behaviour!
    ptr.pointee += 1
}
A stored variable with observers is problematic because under the hood it's actually implemented as a computed variable that calls its observers in its setter.
For example:
var i: Int = 0 {
    willSet(newValue) {
        print("willSet to \(newValue), oldValue was \(i)")
    }
    didSet(oldValue) {
        print("didSet to \(i), oldValue was \(oldValue)")
    }
}
is essentially syntactic sugar for:
var _i: Int = 0

func willSetI(newValue: Int) {
    print("willSet to \(newValue), oldValue was \(i)")
}

func didSetI(oldValue: Int) {
    print("didSet to \(i), oldValue was \(oldValue)")
}

var i: Int {
    get {
        return _i
    }
    set {
        willSetI(newValue: newValue)
        let oldValue = _i
        _i = newValue
        didSetI(oldValue: oldValue)
    }
}
A non-final stored property on classes is problematic as it can be overridden by a computed property.
And this isn't even considering cases that rely on implementation details within the compiler.
For this reason, the compiler only guarantees stable and unique pointer values from inout-to-pointer conversions on stored global and static stored variables without observers. In any other case, attempting to escape and use a pointer from an inout-to-pointer conversion after the call it was passed to will lead to undefined behaviour.
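To make the guaranteed case concrete, here is a small sketch of my own (not from the original answer) escaping an inout-to-pointer conversion on a stored global without observers, which is the one case described as well-defined above:

func escape(_ ptr: UnsafeMutablePointer<Int>) -> UnsafeMutablePointer<Int> {
    return ptr
}

var global = 0          // stored global variable, no observers

let p = escape(&global)
p.pointee += 1          // well-defined per the guarantee above
print(global)           // 1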
Okay, but how does my example with the function foo relate to your example of calling an UnsafeMutablePointer initialiser? Well, UnsafeMutablePointer has an initialiser that takes an UnsafeMutablePointer argument (as a result of conforming to the underscored _Pointer protocol which most standard library pointer types conform to).
This initialiser is effectively the same as the foo function: it takes an UnsafeMutablePointer argument and returns it. Therefore when you do UnsafeMutablePointer(&intObservingThing.anInt), you're escaping the pointer produced from the inout-to-pointer conversion, which, as we've discussed, is only valid if it's used on a stored global or static variable without observers.
So, to wrap things up:
var intObservingThing = IntObservingThing(anInt: 0)
var otherPtr = UnsafeMutablePointer(&intObservingThing.anInt)
// "I was just set to 0."
otherPtr.pointee = 20
is undefined behaviour. The pointer produced from the inout-to-pointer conversion is only valid for the duration of the call to UnsafeMutablePointer's initialiser. Attempting to use it afterwards results in undefined behaviour. As matt demonstrates, if you want scoped pointer access to intObservingThing.anInt, you want to use withUnsafeMutablePointer(to:).
I'm actually currently working on implementing a warning (which will hopefully transition to an error) that will be emitted on such unsound inout-to-pointer conversions. Unfortunately I haven't had much time lately to work on it, but all things going well, I'm aiming to start pushing it forwards in the new year, and hopefully get it into a Swift 5.x release.
In addition, it's worth noting that the compiler doesn't currently guarantee well-defined behaviour for:
var normalThing = NormalThing(anInt: 0)
var ptr = UnsafeMutablePointer(&normalThing.anInt)
ptr.pointee = 20
However, from the discussion on #20467, it looks like this will likely be something the compiler does guarantee well-defined behaviour for in a future release, due to the fact that the base (normalThing) is a fragile stored global variable of a struct without observers, and anInt is a fragile stored property without observers.
I'm pretty sure the problem is that what you're doing is illegal. You can't just declare an unsafe pointer and claim that it points at the address of a struct property. (In fact, I don't even understand why your code compiles in the first place; what initializer does the compiler think this is?) The correct way, which gives the expected results, is to ask for a pointer that does point at that address, like this:
struct IntObservingThing {
    var anInt: Int = 0 {
        didSet {
            print("I was just set to \(anInt)")
        }
    }
}

var intObservingThing = IntObservingThing(anInt: 0)
withUnsafeMutablePointer(to: &intObservingThing.anInt) { ptr -> Void in
    ptr.pointee = 20 // I was just set to 20
}
print(intObservingThing.anInt) // 20

Optional chaining with Swift strings

With optional chaining, if I have a Swift variable
var s: String?
s might contain nil, or a String wrapped in an Optional. So, I tried this to get its length:
let count = s?.characters?.count ?? 0
However, the compiler wants this:
let count = s?.characters.count ?? 0
My understanding of optional chaining is that, once you start using ?. in a dotted expression, the rest of the properties are made optional and are typically accessed by ?., not ..
So, I dug a little further and tried this in the playground:
var s: String? = "Foo"
print(s?.characters)
// Output: Optional(Swift.String.CharacterView(_core: Swift._StringCore(_baseAddress: 0x00000001145e893f, _countAndFlags: 3, _owner: nil)))
The result indicates that s?.characters is indeed an Optional instance, indicating that s?.characters.count should be illegal.
Can someone help me understand this state of affairs?
When you say:
My understanding of optional chaining is that, once you start using ?. in a dotted expression, the rest of the properties are made optional and are typically accessed by ?., not ..
I would say that you are almost there.
It’s not that all the properties are made optional, it’s that the original call is optional, so it looks like the other properties are optional.
characters is not an optional property, and neither is count, but the value that you are calling it on is optional. If there is a value, then the characters and count properties will return a value; otherwise, nil is returned. It is because of this that the result of s?.characters.count returns an Int?.
If either of the properties were optional, then you would need to add ? to it; but, in your case, they aren't, so you don't.
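For contrast, a sketch (hypothetical Profile type, in current Swift syntax where count replaces characters.count) of a chain in which an intermediate property is itself optional, so it needs its own ?:

struct Profile {
    var nickname: String?   // genuinely optional property
}

var profile: Profile? = Profile(nickname: "Foo")

// nickname is itself optional, so it gets its own `?` in the chain:
let count = profile?.nickname?.count ?? 0
print(count)   // 3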
Edited following comment
From the comment:
I still find it strange that both s?.characters.count and (s?.characters)?.count compile, but (s?.characters).count doesn't. Why is there a difference between the first and the last expression?
I’ll try and answer it here, where there is more room than in the comment field:
s?.characters.count
If s is nil, the whole expression returns nil, otherwise an Int. So the return type is Int?.
(s?.characters).count // Won’t compile
Breaking this down: if s is nil, then (s?.characters) is nil, so we can’t call count on it.
In order to call the count property on (s?.characters), the expression needs to be optionally unwrapped, i.e. written as:
(s?.characters)?.count
Edited to add further
The best I can get to explaining this is with this bit of playground code:
let s: String? = "hello"
s?.characters.count
(s?.characters)?.count
(s)?.characters.count
((s)?.characters)?.count
// s?.characters.count
func method1(s: String?) -> Int? {
guard let s = s else { return nil }
return s.characters.count
}
// (s?.characters).count
func method2(s: String?) -> Int? {
guard let c = s?.characters else { return nil }
return c.count
}
method1(s)
method2(s)
On the Swift-users mailing list, Ingo Maier was kind enough to point me to the section on optional chaining expressions in the Swift language spec, which states:
If a postfix expression that contains an optional-chaining expression is nested inside other postfix expressions, only the outermost expression returns an optional type.
It continues with the example:
var c: SomeClass?
var result: Bool? = c?.property.performAction()
This explains why the compiler wants s?.characters.count in my example above, and I consider that it answers the original question. However, as @Martin R observed in a comment, there is still a mystery as to why these two expressions are treated differently by the compiler:
s?.characters.count
(s?.characters).count
If I am reading the spec properly, the subexpression
(s?.characters)
is "nested inside" the overall postfix expression
(s?.characters).count
and thus should be treated the same as the non-parenthesized version. But that's a separate issue.
Thanks to all for the contributions!

String(x) puts "optional" in the output

I'm writing a little program that reads and writes text files into an NSTableView. I had the reading working fine, and then moved onto the writing. And I got...
FR Optional(0) Optional(0) Optional(0) Optional(0) Optional(46.29) Optional(0)
I understand why this is happening: the values are NSNumbers in a dictionary, so they are, by definition, optional. But obviously this is not useful output.
Is there an easy way to output the value without the Optional and any similar bumpf?
You see Optional(n) because you're printing the optional without unwrapping it.
I suggest you re-read the optionals chapter in the Apple book to get a better grasp at why this is happening to you.
The short version is, an Optional is a type; in fact, if you look at the Swift source code, you will find that it's just an enum!
public enum Optional<Wrapped> : _Reflectable, NilLiteralConvertible {
    case None
    case Some(Wrapped)

    /// Construct a `nil` instance.
    public init()

    /// Construct a non-`nil` instance that stores `some`.
    public init(_ some: Wrapped)

    /// If `self == nil`, returns `nil`. Otherwise, returns `f(self!)`.
    @warn_unused_result
    public func map<U>(@noescape f: (Wrapped) throws -> U) rethrows -> U?

    /// Returns `nil` if `self` is nil, `f(self!)` otherwise.
    @warn_unused_result
    public func flatMap<U>(@noescape f: (Wrapped) throws -> U?) rethrows -> U?

    /// Create an instance initialized with `nil`.
    public init(nilLiteral: ())
}
So if your values are Optionals, you have to unwrap them to see their values.
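Because Optional is just an enum, you can even pattern-match it directly. A small sketch (.some/.none are the current spellings of the .Some/.None cases shown above):

let maybe: Int? = 42

switch maybe {
case .some(let value):
    print("value: \(value)")   // "value: 42"
case .none:
    print("nil")
}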
Take a look at this code and try to guess the output:
var code: String? = "hello"

if let code = code where code == "hello" {
    print(code)
}

var upperCase = code?.uppercaseString
print(upperCase)
Output:
Did you figure it out?
It looks like this:
hello
Optional("HELLO")
Why does the first hello print ok?
Because it's being unwrapped in the if let statement.
But the second one, upperCase, is never unwrapped, so an Optional will remain an Optional (unless unwrapped). code?.uppercaseString returns a new Optional; by printing it directly, we get what you see.
If you want to extract the value the optional is holding, you have two operators: ? and !.
The first one is usually preferred, but if you're sure that the value is there, you can force the unwrap by using !, so you could do
print(upperCase!)
Careful though, because if upperCase happens to be nil, you'd get a runtime crash.
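Applied to output like the asker's, a sketch (the row dictionary is a hypothetical stand-in for their data):

import Foundation

// Hypothetical stand-in for the asker's dictionary of NSNumbers.
let row: [String: NSNumber] = ["FR": 46.29]

// A dictionary subscript returns an optional; unwrap before formatting.
let text = row["FR"].map { "\($0)" } ?? "0"
print(text)   // "46.29", with no Optional(...) wrapper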
If the values you write are held in an (optional) array, then .flatMap, prior to printing, will do the trick for you
// myOptionalDoubles : [Double?]
myOptionalDoublesForWriting = myOptionalDoubles.flatMap { $0 }
Using .flatMap like this, you unwrap any non-nil values in the array myOptionalDoubles and return an array myOptionalDoublesForWriting of (non-optional) type [Double]. Any nil entries in myOptionalDoubles will not carry over to myOptionalDoublesForWriting.
If you don't want to lose information about optional entries, you can use .map instead:
// myOptionalDoubles : [Double?]
var myOptionalDoublesForWriting = myOptionalDoubles.map { $0 ?? 0.0 }
In this case, you make use of the nil coalescing operator to either unwrap each (non-nil) array entry, or, if it's nil, give it value 0.0. As above, the mapped to array myOptionalDoublesForWriting will be of non-optional type [Double].
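A quick usage sketch of the two variants (note that in current Swift the optional-filtering flatMap has been renamed compactMap):

let myOptionalDoubles: [Double?] = [0, nil, 46.29, nil]

// flatMap at the time of this answer; renamed compactMap in Swift 4.1
print(myOptionalDoubles.compactMap { $0 })   // [0.0, 46.29] (nils dropped)
print(myOptionalDoubles.map { $0 ?? 0.0 })   // [0.0, 0.0, 46.29, 0.0] (nils replaced)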

How to handle initial nil value for reduce functions

I would like to learn and use more functional programming in Swift, so I've been trying various things in a playground. I don't understand reduce, though. The basic textbook examples work, but I can't get my head around this problem.
I have an array of strings called toDoItems. I would like to get the longest string in this array. What is the best practice for handling the initial nil value in such cases? I think this situation comes up often. I thought of writing a custom function and using it.
func optionalMax(maxSofar: Int?, newElement: Int) -> Int {
    if let definiteMaxSofar = maxSofar {
        return max(definiteMaxSofar, newElement)
    }
    return newElement
}
// Just testing - nums is an array of Ints. Works.
var maxValueOfInts = nums.reduce(0) { optionalMax($0, $1) }
// ERROR: cannot invoke 'reduce' with an argument list of type ‘(nil, (_,_)->_)'
var longestOfStrings = toDoItems.reduce(nil) { optionalMax(count($0), count($1)) }
It might just be that Swift does not automatically infer the type of your initial value. Try making it clear by explicitly declaring it:
var longestOfStrings = toDoItems.reduce(nil as Int?) { optionalMax($0, count($1)) }
By the way, notice that I do not apply count to $0 (your accumulator), since it is not a String but an optional Int (Int?).
Generally, to avoid confusion when reading the code later, I explicitly label the accumulator as a and the element coming in from the sequence as x:
var longestOfStrings = toDoItems.reduce(nil as Int?) { a, x in optionalMax(a, count(x)) }
This should be clearer than $0 and $1 when both the accumulator and the single element are used in the code.
Hope this helps
Initialise it with an empty string "" rather than nil. Or you could even initialise it with the first element of the array, but an empty string seems better.
Second go at this after writing some wrong code: this will return the longest string, if you are happy with an empty string being returned for an empty array:
toDoItems.reduce("") { count($0) > count($1) ? $0 : $1 }
Or if you want nil, use
toDoItems.reduce(nil as String?) { $0 == nil || count($1) > count($0!) ? $1 : $0 }
The problem is that the compiler cannot infer the types you are using for your seed and accumulator closure if you seed with nil, and you also need to get the optional handling correct when using the optional string as $0 (hence the nil check before force-unwrapping above).
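For completeness, a sketch of how this task reads in current Swift, where Sequence's max(by:) sidesteps the optional seed entirely (count is now a property on String):

let toDoItems = ["buy milk", "walk the dog", "write Swift"]

// max(by:) returns nil for an empty array, modelling "no longest string"
// without a custom reduce seed.
let longest = toDoItems.max { $0.count < $1.count }
print(longest ?? "no items")   // "walk the dog"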