Swift optional inout parameters and nil

Is it possible to have an Optional inout parameter to a function in Swift? I am trying to do this:
func testFunc(inout optionalParam: MyClass?) {
    if optionalParam {
        ...
    }
}
...but when I try to call it and pass nil, it is giving me a strange compile error:
Type 'inout MyClass?' does not conform to protocol 'NilLiteralConvertible'
I don't see why my class should have to conform to some special protocol when it's already declared as an optional.

It won't compile because the function expects a reference, but you passed nil. The problem has nothing to do with optionals.
Declaring a parameter as inout means that the function may assign a value to it inside its body. How could it assign a value to nil?
You need to call it like
var a : MyClass? = nil
testFunc(&a) // value of a can be changed inside the function
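For readers on Swift 3 or later, here is a minimal sketch of the same idea using the current parameter syntax, where inout follows the colon instead of preceding the parameter name (MyClass is just the placeholder type from the question):
class MyClass {}

// Swift 3+ spelling: `inout` goes after the colon, not before the parameter name.
func testFunc(_ optionalParam: inout MyClass?) {
    if optionalParam == nil {
        optionalParam = MyClass() // the caller's variable is replaced
    }
}

var a: MyClass? = nil
testFunc(&a)     // fine: `a` is a variable; it now holds a MyClass instance
// testFunc(nil) // still rejected: only a variable, prefixed with &, can be passed as inout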
If you know C++, this is the C++ version of your code, without the optional:
struct MyClass {};
void testFunc(MyClass &p) {}
int main () { testFunc(nullptr); }
and you get this error message:
main.cpp:6:6: note: candidate function not viable: no known conversion from 'nullptr_t' to 'MyClass &' for 1st argument
which is kind of equivalent to the one you got (but easier to understand).

Actually, what @devios1 needs is an "optional pointer".
But inout MyClass? means "a pointer to an optional".
The following should work in Swift 4
class MyClass {
    // func foo() {}
}

func testFunc(_ optionalParam: UnsafeMutablePointer<MyClass>?) {
    if let optionalParam = optionalParam {
        // optionalParam.pointee.foo()
        // optionalParam.pointee = MyClass()
    }
}
testFunc(nil)
var myClass = MyClass()
testFunc(&myClass)

This is possible, and it has everything to do with closures.
The optional can hold a nil value, but the function is not expecting the literal nil. So it would work if you simply did not pass the argument at all, or passed a variable of type MyClass? whose value is nil, as Bryan states.
When you write the function, the parameter has to have a default value for it to be omittable at the call site, like so:
func testFunc(inout optionalParam: MyClass? = { var nilRef: MyClass?; return &nilRef }()) {
    if optionalParam != nil {
        ...
    }
}
Notice if optionalParam {...} is changed to if optionalParam != nil {...}
You can NOT unwrap optionalParam without checking it, i.e. if optionalParam? != nil, as unwrapping (optionalParam?) fails when optionalParam == nil. Note: var a: MyClass? has a value of nil until it is assigned.
The call is either testFunc(optionalParam: &a) or testFunc(), but never testFunc(nil).
I think you may be getting two separate concepts intertwined, as they have similar names: optional parameters and optionals. Optionals are variables with an extended nil condition. Optional parameters are function parameters that may be omitted at the call site.
Further reading here. Apple is trying to change the verbiage from 'optional parameter' to 'parameters with default values'. It is unfortunate they didn't incorporate optional-parameter behavior into these new optional thingies. What is the point of passing nil optionals as optionals if they will fail when unwrapped? Maybe it gets at the heart of what this optional thing is if it can be passed before unwrapping... Schrödinger's cat.
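To keep the two ideas apart, here is a small illustrative sketch (the function names are made up for this example):
// A parameter with a default value: the caller may omit the argument entirely.
func greet(name: String = "World") -> String {
    return "Hello, \(name)"
}
greet()             // "Hello, World"
greet(name: "Ada")  // "Hello, Ada"

// An optional parameter: the caller must pass something, but that something may be nil.
func describe(name: String?) -> String {
    return name ?? "no name given"
}
describe(name: nil)   // "no name given"
describe(name: "Ada") // "Ada"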

The exact passage from the docs is:
You can only pass a variable as the argument for an in-out parameter.
You cannot pass a constant or a literal value as the argument, because
constants and literals cannot be modified. You place an ampersand (&)
directly before a variable’s name when you pass it as an argument to
an inout parameter, to indicate that it can be modified by the
function.
Note, however, that per @gnasher's comment, you only need to pass a class instance as inout (i.e., by reference) if you intend to allow the instance itself to be replaced by another instance, not just have the original modified. The passage in the documentation that covers this is:
In-Out Parameters
Variable parameters, as described above, can only be changed within
the function itself. If you want a function to modify a parameter’s
value, and you want those changes to persist after the function call
has ended, define that parameter as an in-out parameter instead.
Here are three tests that cover the usage of var and inout.
class Bar : Printable {
    var value = 1
    init(_ value: Int) { self.value = value }
    var description: String { return "Bar is: \(value)" }
}
let bar = Bar(1)
func changeBarWithoutInoutSinceBarIsAClassYaySwift(b:Bar) { b.value = 2 }
changeBarWithoutInoutSinceBarIsAClassYaySwift(bar)
println("after: \(bar)") // 2
var bar2 = Bar(0)
func tryToReplaceLocalBarWithoutInout(var b:Bar) { b = Bar(99) }
tryToReplaceLocalBarWithoutInout(bar2)
println("after: \(bar2)") // 0
var bar3 = Bar(0)
func replaceClassInstanceViaInout(inout b:Bar) { b = Bar(99) }
replaceClassInstanceViaInout(&bar3)
println("after: \(bar3)") // 99

Related

Constraining method or class to accept only optionals

I need to constrain API so that users can call it only with types that are explicitly optional. How can I make this happen?
class Foo<T> {
    let value: T?
    init(_ value: T?) {
        self.value = value
    }
}
let optionalValue: Bool? = true
Foo(optionalValue) // Should work
let nonOptionalValue: Bool = true
Foo(nonOptionalValue) // Should fail, ideally at compile time
Foo(true) // Should work or acceptable to replace it with
Foo(.some(true))
The problem is that if you pass a non-Optional where an Optional is expected, the non-Optional is not rejected; instead, it is implicitly wrapped in an Optional. That is why this line of code is legal:
let optionalValue: Bool? = true
And for the same reason, this line of code is legal:
let nonOptionalValue: Bool = true
Foo(nonOptionalValue) // even though Foo's parameter is typed as Optional
You cannot turn off this feature. It is baked into the language. And indeed, as the first example shows, you yourself have come to rely upon it! Thus, for that very reason, you can never prevent a non-Optional from being passed here. That's the price we pay for the convenience of implicit Optional wrapping.
One option is to declare the init with an inout parameter.
init(_ value: inout T?) {
    self.value = value
}
Now the line below gives the compiler error "Inout argument could be set to a value with a type other than 'Bool'; use a value declared as type 'Bool?' instead":
Foo(&nonOptionalValue) // Should fail, ideally at compile time
Constraints:
User of class Foo may not want to pass by reference.
User will not be able to write Foo(true)
But these can be worked around by creating a local var before calling the init, as sketched below.
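For example, a sketch of that workaround, reusing Foo and nonOptionalValue from the question and assuming the inout initializer above:
// The caller must first widen the value into an optional variable...
var wrapped: Bool? = nonOptionalValue
Foo(&wrapped)   // compiles: the argument is now a `Bool?` variable

// ...and literals have to be wrapped the same way:
var literal: Bool? = true
Foo(&literal)   // stands in for the old `Foo(true)`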

Is it safe to capture properties of `self` without capturing `self`?

You can copy this playground verbatim:
var closures = [() -> Void]()

class Thing {
    let name: String
    init(_ name: String) { self.name = name }
}

class RefThingWrapper {
    let thing: Thing

    init(_ t: Thing) {
        self.thing = t
        closures.append { [thing, weak self] in // Even though `thing` captured strongly, `self` could still get deallocated.
            print((self == nil) ? "`self` deallocated" : "`self` not deallocated")
            print(thing.name) // Decided to use `thing` within closure to eliminate any chance of optimizations affecting the test results, which I found when I captured `self` strongly without using it.
        }
    }
}

struct ValThingWrapper {
    let thing: Thing

    init(_ t: Thing) {
        self.thing = t
        closures.append { [weak thing] in
            print((thing == nil) ? "`thing` deallocated" : "`thing` not deallocated")
        }
    }
}

var wrapper = Optional(RefThingWrapper(Thing("Thing 1"))) // Test with reference type.
//var wrapper = Optional(ValThingWrapper(Thing("Thing 1"))) // Test with value type.
closures[0]()
wrapper = nil
closures[0]()
It demonstrates how a property of self — whether self is a reference or value type — can be captured within a closure independently of self. Running the program as is demonstrates a captured property existing after self has been deallocated. Testing with the value type wrapper demonstrates that, if weakly captured, the instance will be deallocated once the referencing value instance is deallocated.
I wasn't sure this was possible because when creating the closure at first, I forgot to initialize the property I was capturing. So the compiler complained — in the capture list — 'self' used before all stored properties are initialized. So I figured self was being captured implicitly, and only after digging deeper discovered otherwise.
Is this documented somewhere? I found this post by Joe Groff where he proposes:
For 'let' properties of classes, it'd be reasonable to propose having
closures capture the property directly by default in this way
instead of capturing 'self' (and possibly allowing referencing them
without 'self.', since 'self' wouldn't be involved in any cycle formed
this way).
This was back in 2015, and I didn't find any implemented proposals that arose from the discussion. Is there any authoritative source that communicates this behavior?
If you're just asking for documentation on capture lists and reference types, see The Swift Programming Language: Resolving Strong Reference Cycles for Closures. Also see the Language Reference: Capture Lists.
If your capture list includes a value type, you're getting a copy of the value.
var foo = 1
let closure = { [foo] in
    print(foo)
}
foo = 42
closure() // 1; capturing value of `foo` as it was when the closure was declared
If your capture list includes a reference type, you’re getting a copy of the reference to that current instance.
class Bar {
    var value: Int
    init(value: Int) { self.value = value }
}
var bar = Bar(value: 1)
let closure = { [bar] in
    print(bar.value)
}
bar.value = 2
bar = Bar(value: 3)
closure() // 2; capturing reference to first instance that was subsequently updated
These captured references are strong by default, but can optionally be marked as weak or unowned, as needed, too.
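For instance, a weak capture of the same bar variable (a small sketch; the instance stays alive here only because bar still references it strongly):
// A weak capture of the reference; this closure does not keep the instance alive.
let weakClosure = { [weak bar] in
    if let bar = bar {
        print(bar.value)
    } else {
        print("bar was deallocated")
    }
}
weakClosure() // 3, as long as some strong reference to the instance still exists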
That Capture Lists document outlines the way that you can capture a property without capturing self:
You can also bind an arbitrary expression to a named value in a capture list. The expression is evaluated when the closure is created, and the value is captured with the specified strength. For example:
// Weak capture of "self.parent" as "parent"
myFunction { [weak parent = self.parent] in print(parent!.title) }
I’m not crazy about their code sample, but it illustrates the capture of a property without capturing self, nonetheless.
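Here is a self-contained variant of the same pattern that avoids the force unwrap (all names are made up for illustration):
class Parent {
    let title = "Parent title"
}

class Child {
    let parent = Parent()
    var onEvent: (() -> Void)?

    func setUp() {
        // Capture `self.parent` directly; `self` itself is not retained by the closure.
        onEvent = { [weak parent = self.parent] in
            if let parent = parent {
                print(parent.title)
            } else {
                print("parent was deallocated")
            }
        }
    }
}

let child = Child()
child.setUp()
child.onEvent?() // "Parent title"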

The strange behaviour of Swift's AnyObject

In messing around with Swift today I came across a strange thing. Here's the unit test I developed which shows some unexpected behaviours when using Swift's AnyObject.
import XCTest

class SwiftLanguageTests: XCTestCase {

    class TestClass {
        var name: String?
        var xyz: String?
    }

    func testAccessingPropertiesOfAnyObjectInstancesReturnsNils() {
        let instance = TestClass()
        instance.xyz = "xyz"
        instance.name = "name"
        let typeAnyObject = instance as AnyObject
        // Correct: Won't compile because 'xyz' is an unknown property in any class.
        XCTAssertEqual("xyz", typeAnyObject.xyz)
        // Unexpected: Will compile because 'name' is a property of NSException.
        // Strange: But returns a nil when accessed.
        XCTAssertEqual("name", typeAnyObject.name)
    }
}
This code is a simplification of some other code where there is a Swift function that can only return a AnyObject.
As expected, after creating an instance of TestClass, casting it to AnyObject and setting another variable, accessing the property xyz won't compile because AnyObject does not have such a property.
But surprisingly a property called name is accepted by the compiler because there is a property by that name on NSException. It appears that Swift is quite happy to accept any property name as long as it exists somewhere in the runtime.
The next unexpected behaviour, and the thing that got all this started, is that attempting to access the name property returns nil. Watching the various variables in the debugger, I can see that typeAnyObject is pointing at the original TestClass instance and its name property has a value of "name".
Swift doesn't throw an error when accessing typeAnyObject.name so I would expect it to find and return "name". But instead I get nil.
I would be interested if anyone can shed some light on what is going on here?
My main concern is that I would expect Swift to either throw an error when accessing a property that does not exist on AnyObject, or find and return the correct value. Currently neither is happening.
Similarly to Objective-C, where you can send arbitrary messages to id, arbitrary properties and methods can be called on an instance of AnyObject in Swift. The details are different, however, and this is documented in Interacting with Objective-C APIs in the "Using Swift with Cocoa and Objective-C" book.
Swift includes an AnyObject type that represents some kind of object. This is similar to Objective-C’s id type. Swift imports id as AnyObject, which allows you to write type-safe Swift code while maintaining the flexibility of an untyped object.
...
You can call any Objective-C method and access any property on an AnyObject value without casting to a more specific class type. This includes Objective-C compatible methods and properties marked with the @objc attribute.
...
When you call a method on a value of AnyObject type, that method call behaves like an implicitly unwrapped optional. You can use the same optional chaining syntax you would use for optional methods in protocols to optionally invoke a method on AnyObject.
Here is an example:
func tryToGetTimeInterval(obj : AnyObject) {
    let ti = obj.timeIntervalSinceReferenceDate // NSTimeInterval!
    if let theTi = ti {
        print(theTi)
    } else {
        print("does not respond to `timeIntervalSinceReferenceDate`")
    }
}
tryToGetTimeInterval(NSDate(timeIntervalSinceReferenceDate: 1234))
// 1234.0
tryToGetTimeInterval(NSString(string: "abc"))
// does not respond to `timeIntervalSinceReferenceDate`
obj.timeIntervalSinceReferenceDate is an implicitly unwrapped optional
and nil if the object does not have that property.
Here is an example of checking for and calling a method:
func tryToGetFirstCharacter(obj : AnyObject) {
    let fc = obj.characterAtIndex // ((Int) -> unichar)!
    if let theFc = fc {
        print(theFc(0))
    } else {
        print("does not respond to `characterAtIndex`")
    }
}
tryToGetFirstCharacter(NSDate(timeIntervalSinceReferenceDate: 1234))
// does not respond to `characterAtIndex`
tryToGetFirstCharacter(NSString(string: "abc"))
// 97
obj.characterAtIndex is an implicitly unwrapped optional closure. That code
can be simplified using optional chaining:
func tryToGetFirstCharacter(obj : AnyObject) {
    if let c = obj.characterAtIndex?(0) {
        print(c)
    } else {
        print("does not respond to `characterAtIndex`")
    }
}
In your case, TestClass does not have any @objc properties.
let xyz = typeAnyObject.xyz // error: value of type 'AnyObject' has no member 'xyz'
does not compile because the xyz property is unknown to the compiler.
let name = typeAnyObject.name // String!
does compile because – as you noticed – NSException has a name property.
The value however is nil because TestClass does not have an
Objective-C compatible name method. As above, you should use optional
binding to safely unwrap the value (or test against nil).
If your class is derived from NSObject
class TestClass : NSObject {
    var name : String?
    var xyz : String?
}
then
let xyz = typeAnyObject.xyz // String?!
does compile. (Alternatively, mark the class or the properties with @objc.)
But now
let name = typeAnyObject.name // error: Ambiguous use of `name`
does not compile anymore. The reason is that both TestClass and NSException
have a name property, but with different types (String? vs String),
so the type of that expression is ambiguous. This ambiguity can only be
resolved by (optionally) casting the AnyObject back to TestClass:
if let name = (typeAnyObject as? TestClass)?.name {
    print(name)
}
Conclusion:
You can call any method/property on an instance of AnyObject if that
method/property is Objective-C compatible.
You have to test the implicitly unwrapped optional against nil or
use optional binding to check that the instance actually has that
method/property.
Ambiguities arise if more than one class has (Objective-C) compatible
methods with the same name but different types.
In particular because of the last point, I would try to avoid this
mechanism if possible, and optionally cast to a known class instead
(as in the last example).
It has nothing to do with NSException!
From the Apple documentation:
protocol AnyObject { ... }
The protocol to which all classes implicitly conform.
When used as a concrete type, all known @objc methods and properties are available, as implicitly-unwrapped-optional methods and properties respectively, on each instance of AnyObject.
name is an @objc property; xyz is not.
try this :-)
let typeAnyObject = instance as Any
or
@objc class TestClass: NSObject {
    var name: String?
    var xyz: String?
}

let instance = TestClass()
instance.xyz = "xyz"
instance.name = "name"

let typeAnyObject = instance as AnyObject
typeAnyObject.name // will not compile now

Why can't I use 'object == nil' in Swift?

I have a simple method (hope the syntax is right):
func foo() -> WmGroupItemSample? {
    var sample: WmGroupItemSample?
    if a > 0 { // some condition
        sample = mGroupItemSampleList[0]
    } else {
        // do nothing
    }
    return sample
}
As I understand it, the foo method might return nil if the condition a > 0 is not met. Therefore the return type is WmGroupItemSample?.
My doubts are about the place where I call this method:
var sample: WmGroupItemSample? = foo()
if let temp = sample {
    return temp.getStartDate()
} else {
    return -1
}
Is this the proper way to validate that it is not nil? I use Beta 6.
What is the difference if I call it without ? or with !, like:
var sample:WmGroupItemSample = foo()
or:
var sample:WmGroupItemSample! = foo()
Why can't I validate sample like this:
if sample == nil {
    /* do one */
} else {
}
To compare a value with nil, it has to be allowed to be nil, via the ? mark. This means that the value can (possibly) be nil. So by default:
var a : String
cannot be nil. To allow a string to be nil you have to write:
var a : String?
Accessing a variable with ! force-unwraps the optional value:
"Trying to use ! to access a non-existent optional value triggers a
runtime error. Always make sure that an optional contains a non-nil
value before using ! to force-unwrap its value." Excerpt From: Apple
Inc. "The Swift Programming Language."
iBooks. https://itun.es/pl/jEUH0.l
For more information I would suggest:
basics
more advanced
reading. Also that SO question with answer might be helpful.
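Tying this back to the question's code, a minimal sketch (assuming foo() and WmGroupItemSample from the question, and that getStartDate() returns an Int):
func startDateOrDefault() -> Int {
    // foo() returns WmGroupItemSample?, so the result must live in an optional.
    let sample: WmGroupItemSample? = foo()

    // Comparing an optional against nil is always allowed:
    if sample == nil {
        return -1
    }

    // Optional binding unwraps the value for safe use:
    if let unwrapped = sample {
        return unwrapped.getStartDate()
    }
    return -1
}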
You may have to implement either the Equatable or Comparable protocols.
Implementing the Equatable protocol using a global function:
func ==<T>(lhs: T, rhs: T?) -> Bool {
    return lhs == nil
}

// In your class you can now extend the Equatable protocol in the class definition, i.e.:
class ClassName: Equatable {
    // properties and functions
}

Optional chaining and Array in Swift

Let's take these two simple classes to show my problem:
class Object {
    var name: String?
    // keep it simple... and useless
}

class TestClass {
    var objects: AnyObject[]?

    func initializeObjects() {
        objects?.insert(Object(), atIndex: 0) // Error
        objects?.insert(Object(), atIndex: 1) // Error
        objects?.insert(Object(), atIndex: 2) // Error
    }
}
With this implementation I get three Could not find member 'insert' errors where I try to add objects to the objects array.
Now, if I remove the optional from the objects definition and the optional chaining in initializeObjects, it works with no problem (here is the working code):
class Object {
    var name: String?
}

class TestClass {
    var objects: AnyObject[] = AnyObject[]() // REMOVE optional and initialize an empty array

    func initializeObjects() {
        objects.insert(Object(), atIndex: 0) // Remove Opt chaining
        objects.insert(Object(), atIndex: 1) // Remove Opt chaining
        objects.insert(Object(), atIndex: 2) // Remove Opt chaining
    }
}
I can't understand what is wrong in the first implementation.
I thought objects? checks whether objects is not nil, and at that point adds an element using insert:atIndex:. But I'm probably wrong -.-
Arrays in Swift are structs and structs are value types.
Optionals in Swift are actually enums (Optional<T> or ImplicitlyUnwrappedOptional<T>).
When you are unwrapping an optional (implicitly or explicitly) of a value type, what you get is actually a constant copy of the struct. And you can't call mutating methods on a constant struct.
Executing objects?.insert(Object(), atIndex:0) basically means this:
if let tmp = objects {
    tmp.insert(Object(), atIndex: 0)
}
As a workaround, you need to assign the unwrapped value to a variable and then assign the variable back to your optional property. That's how value types work.
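Applied to the original example, that workaround looks roughly like this (a sketch written with current array and insert(_:at:) syntax; note that in recent Swift versions objects?.insert(...) on a mutable optional property compiles directly, so this dance is mainly needed on the old compiler the question targets):
class Object {
    var name: String?
}

class TestClass {
    var objects: [AnyObject]?

    func initializeObjects() {
        if var tmp = objects {         // unwrap into a mutable local copy
            tmp.insert(Object(), at: 0)
            tmp.insert(Object(), at: 1)
            tmp.insert(Object(), at: 2)
            objects = tmp              // write the modified copy back
        }
    }
}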
This is reproducible for any struct, not only Arrays:
struct S {
    var value: Int = 0
}

var varS: S = S()
varS.value = 10       // can be called

let constS: S = S()
constS.value = 10     // cannot be called - constant!

var optionalS: S? = S()
optionalS?.value = 10 // cannot be called, unwrapping makes a constant copy!

// workaround
if optionalS {
    var tmpS = optionalS!
    tmpS.value = 10
    optionalS = tmpS
}
Some relevant discussion here: https://devforums.apple.com/thread/233111?tstart=60